Body

'Combo' nanoplatforms for chemotherapy

In a paper to be published in a forthcoming issue of NANO, researchers from Harbin Institute of Technology, China, systematically discuss the recent progress, current challenges and future perspectives of smart graphene-based nanoplatforms for synergistic tumor therapy and bio-imaging.

Owing to metastasis and drug resistance, malignant tumors remain difficult to cure. Chemotherapy, alone or combined with radiotherapy, is still the standard treatment after resection. However, patients receiving chemotherapy drugs often suffer undesired side effects as the drugs damage normal cells and tissues. Encapsulating chemotherapy drugs in nanocarriers can remedy many of the problems of conventional "free" drugs, such as limited stability, poor solubility, and rapid clearance.

Starting from simple drug delivery systems, graphene oxide (GO) and reduced graphene oxide (rGO) materials have been developed into "combo" nanoplatforms containing multiple therapeutic modalities, thanks to their distinctive physical/chemical and optical properties, including excellent biocompatibility, modifiable active groups, ultra-large surface area, and an intense photothermal effect. Graphene-based nanoplatforms have also served as stimuli-responsive nanocarriers, showing excellent therapeutic effects activated by endogenous stimuli (low pH, overexpressed enzymes, biomolecules, and elevated glutathione) and by exogenous stimuli (light, magnetic/electric fields, and ultrasound).

The article highlights recent advances in the manufacture of functionalized GO and rGO systems. The main emphasis is on their biomedical applications, including surface modification, endogenous/exogenous drug delivery, chemotherapy-based synergistic therapy, and various imaging techniques.

Credit: 
World Scientific

BU finds some child development milestones may be set too early

CDC guidelines say most children reach a milestone by a certain age, but new data shows that "most" may mean over 99% or barely half.

A new Boston University School of Public Health (BUSPH) study published in the journal Pediatrics provides more specific data on what ages young children reach different developmental milestones. Guidelines from the CDC say "most children" reach each milestone by a certain age, but do not define "most" and do not say how often or well a child should be demonstrating an ability.

The new BUSPH study finds that, for example, 93% of nine-month-olds copy sounds, but only 44% demonstrate this ability "very much." While the CDC says to "act early" if a nine-month-old doesn't play games like peek-a-boo, the study finds that as many as one in ten children that age haven't reached that milestone, and only 49% demonstrate the behavior "very much."
"The CDC guidelines are just that: guidelines," says study lead author Dr. R. Christopher Sheldrick, research associate professor of health law, policy & management at BUSPH. "Parents should know that medical guidelines of all kinds are frequently updated based on new information. In the meantime, parents should consider advice from a range of sources and ask their pediatric providers if they have concerns."

The researchers used data from 41,465 parent responses to the Survey of Wellbeing of Young Children (SWYC) in Massachusetts, Rhode Island, and Minnesota. SWYC was created by Dr. Sheldrick and study co-author Dr. Ellen Perrin of the Tufts University School of Medicine and Medical Center in 2010, and includes 54 questions about milestones over a child's first five years. Unlike the CDC milestones, which only refer to a child doing or not doing something, SWYC asks if a child displays a behavior "not yet," "somewhat," or "very much."

The researchers found that a very high percentage (generally over 90%) of children "somewhat" or "very much" demonstrated behaviors by the ages that the CDC says, and an even higher percentage by the age at which the CDC says parents should "act early." On the other hand, if only demonstrating a behavior "very much" counted as a pass, the researchers found that less than half of children reached several of the milestones at the ages that the CDC says "most" do or at the ages to "act early."

Credit: 
Boston University School of Medicine

Study shows lower mortality from induction of labor at 41 weeks

image: This is Henrik Hagberg, senior clinical physician and professor of obstetrics and gynecology at Sahlgrenska Academy, University of Gothenburg.

Image: 
Photo by Malin Arnesson

Inducing labor after 41 instead of 42 full weeks' pregnancy appears to be safer in terms of perinatal survival, new Swedish research shows. The current study is expected to provide a key piece of evidence for upcoming decisions in maternity care.

In Sweden, the risk of a baby dying before, during or shortly after birth ("perinatal death") is generally very low. However, a progressive rise in risk from a low level is known to take place after the 40th week, for as long as the pregnancy continues.

The purpose of the current study, published in The BMJ, was to investigate these risks and compare outcomes of induction after 41 and 42 full gestational weeks. To date, there have been some doubts regarding the best means of protecting mother and baby alike.

The study comprised 2,760 women admitted to 14 maternity hospitals in Sweden in the years 2016-2018. None had an underlying disease, they were all expecting one baby only (a "singleton birth"), and their pregnancies had lasted for 41 full weeks at the time of inclusion in the study.

Half of the women in the study (1,381 individuals) were randomly assigned to receive induction at 41 full weeks. In this group 86% underwent induction, while labor started without assistance among the others.

In the second group (1,379 women), induction was planned to take place at 42 full weeks. This is routine management at most birth centers in Sweden for pregnancies not involving complications. In practice, 33% of this group of women needed induction to start labor.

Regarding the women's state of health after childbirth, there was no difference between the groups. The proportions of cesareans and instrumental deliveries (using a suction cup and/or forceps) were also approximately equal. According to the composite criterion combining perinatal morbidity and death, just over 30 babies were affected in each group.

On the other hand, specifically in terms of the deaths, a significant difference was found. In the group where labor was induced after 41 full gestational weeks, there were no deaths. In the other group, in which spontaneous labor was waited for ("expectant management") and induction carried out at 42 weeks, six cases of perinatal death were registered: five of the babies were stillborn and one died immediately after birth.

Because of these deaths, the study was discontinued early. In the event, the number of women included was therefore 2,760 instead of the planned 10,000. Sweden's birth centers were informed of the outcome and, after external review of the material, the study -- headed by Sahlgrenska Academy at the University of Gothenburg, and Sahlgrenska University Hospital -- is now being published.

Ulla-Britt Wennerholm, senior clinical physician and associate professor of obstetrics and gynecology at Sahlgrenska Academy, University of Gothenburg, is one of the two lead authors.

"The study shows that there's a difference in the number of deaths when induction is done at 41 and 42 weeks. All the same, we should be a bit cautious in interpreting the results, since we had to break of the study early. The outcome might have been slightly different in a larger study, but the pattern would probably have been the same," she says.

Henrik Hagberg, senior clinical physician and professor of obstetrics and gynecology at Sahlgrenska Academy, University of Gothenburg, is one of the senior authors.

"If you put together all the data from our own and previous studies, you can see that mortality is lower if labor is induced at 41 weeks than if we wait and start labor at 42 weeks. The state of current knowledge is now being reviewed, and it's not unreasonable to expect women to be offered induction at 41 weeks," he says.

"What we've also shown in this study is that there don't seem to be any medical disadvantages of induction at 41 weeks instead of 42. Many people predicted that it would increase the risk of cesarean and instrumental deliveries, but neither rate increased," Henrik Hagberg states.

Credit: 
University of Gothenburg

Building better bacteriophage to combat antibiotic-resistant bacteria

image: PHAGE: Therapy, Applications, and Research is the only peer-reviewed journal dedicated to fundamental bacteriophage research and its applications in medicine, agriculture, aquaculture, veterinary applications, animal production, food safety, and food production.

Image: 
Mary Ann Liebert, Inc., publishers

New Rochelle, NY, November 21, 2019 -- Researchers are pursuing engineered bacteriophage as alternatives to antibiotics to infect and kill multi-drug resistant bacteria. The potential for an innovative synthetic biology approach to enhance phage therapeutics, and the role a biofoundry can play in making this approach feasible and effective, are discussed in an article in PHAGE: Therapy, Applications, and Research, a new peer-reviewed journal from Mary Ann Liebert, Inc., publishers, launching in early 2020. The full-text article is freely available on the PHAGE website through December 21, 2019.

The article entitled "Building Better Bacteriophage with Biofoundries to Combat Antibiotic-Resistant Bacteria" was coauthored by Karen Weynberg, PhD, The University of Queensland (St. Lucia) and CSIRO Future Science Platform (Brisbane) and Paul Jaschke, PhD, Macquarie University (Sydney), Australia. The authors discuss the promise of phage therapy as a radical alternative to antibiotics, and the use of synthetic biology to engineer novel phage with desirable characteristics. They also describe the emerging use of cutting-edge facilities called biofoundries, in which automated, high-throughput laboratory processes can accelerate the bioengineering, modification, and selection of bacteriophage, making their development more effective and cost-efficient.

"This article nicely summarizes the state-of-the-art in terms of using molecular biology to create 'next-generation phages,'" says Martha Clokie, PhD, Editor-in-Chief of PHAGE and Professor of Microbiology, University of Leicester (U.K.). "Once we have understood the biology of these organisms, the sky is possibly the limit in terms of how we can engineer them to make them even more attuned to specific purposes."

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

Fighting opioids with an unlikely supplemental painkiller: Anti-itch medicine

image: Shane Kaski, a graduate student in the WVU School of Medicine's M.D./Ph.D. program, researches the pain-relieving effects of morphine given in combination with nalfurafine, a newer opioid that interacts with nerve cells in a distinct way. His preclinical study showed that--in animal models--a nalfurafine/morphine combination alleviated pain more effectively than either drug alone. It also seemed to trigger less severe side effects than the older drugs in nalfurafine's class often do.

Image: 
Aira Burkhart/West Virginia University

Pain is a mosaic. From a distance, it looks like one big "ouch." But if you step closer, many types of pain--like tiles of different shades--emerge. A headache doesn't feel like a papercut. The sting of denim against a sunburn is nothing like the jolt of ice against a sensitive tooth.

Pain relievers are equally diverse. By interacting with different parts of the nervous system, they treat some pains better than others. West Virginia University researcher Shane Kaski is investigating whether an anti-itch medication that targets a specific part of our nerve cells can make morphine--which targets a different part--more effective. His findings suggest it can.

If he's right, doctors may be able to prescribe lower doses of morphine by supplementing it with the drug, called nalfurafine, and still soothe their patients' pain.

Morphine is a classic, widely used opioid. Using less of it could mean fewer morphine-related side effects--such as constipation and nausea--and a lower risk of addiction.

"Right now there's a lot of work looking for replacements for opioids, for obvious reasons," said Kaski, a graduate student in the School of Medicine's M.D./Ph.D. program. "Maybe nalfurafine is not so great as a replacement on its own, but maybe it does enough that we could put it together with other opioids and get this dose-sparing effect."

In his study, which the National Institute on Drug Abuse funded, Kaski used animal models to test how well morphine treated pain on its own and in combination with nalfurafine. He administered the drugs in different amounts to determine which relieved the most pain at the lowest dose. Then he compared each regimen's effectiveness as a pain reliever.

He discovered that using a small supplement of nalfurafine alongside a lower dose of morphine reduced pain as dramatically as using a large dose of morphine alone. His findings appeared in the Journal of Pharmacology and Experimental Therapeutics.

"It's possible that you just need a tiny smidgen of nalfurafine with a smidgen of this other addictive drug to get the equivalent pain relief from a larger dose of your addictive drug," he said. "That's what we're seeing in our early work. That's the promise that we saw."

If future studies--including eventual clinical trials--affirm Kaski's results, then doctors may be able to combat the opioid epidemic by prescribing nalfurafine as a supplemental painkiller. That's especially significant for West Virginia, which leads the nation in opioid-related deaths, according to NIDA.

"Shane's demonstration of the anti-addictive and dose-sparing effects of nalfurafine will end up rescuing opioid painkillers from the dustbin of pharmacology, turning the tide against avoiding opioid prescriptions that greatly relieve pain but led to the present opioid crisis in West Virginia and the Appalachian region," said David Siderovski, professor of pharmacology in the School of Medicine.

Opioids: Unlikely weapons in the fight against opioid addiction

Like morphine, nalfurafine is an opioid.

"That might sound weird," Kaski said. "Why are you going to use another opioid on top of morphine? It comes down to the specifics of the biology of opioid receptors."

Opioid receptors are like assigned parking spaces on the surface of a nerve cell. Certain molecules--but not others--can "park" in them. From there, the molecules can amplify or inhibit the nerve cell's activity.

"The biggest three opioid receptors you'll hear about are mu, kappa and delta," Kaski said.

Like other traditional opioids, morphine parks in the mu receptor, where it quiets the nerve's pain signal. It also activates the brain's reward circuit.

On a simple level, "if something feels good, it increases dopamine in the right circuits and motivates you to do that thing more often," he said. "The mu opioid receptor does that."

The problem is, those surges of dopamine make classic opioids--like morphine, oxycodone and codeine--addictive.

Nalfurafine is unusual. Even though it's an opioid, it doesn't have a permit to park in mu. Instead it parks in the kappa receptor. Once it does, it alleviates pain while also "putting the brakes on" the reward circuit that can lead to addiction, Kaski said.

But muting those dopamine swells can cause an unpleasant side effect: dysphoria, the opposite of euphoria.

"Opioid pain relievers, at least in the short term, produce euphoria in many--but not all--people. However, with long-term use, opioid pain medications can change the levels of important brain molecules, including those that regulate mood," said Vincent Setola, an assistant professor of neuroscience, physiology and pharmacology, and behavioral medicine and psychiatry. He and Siderovski are mentors to Kaski.

Old painkiller, new tricks

For decades, physicians and researchers alike believed drugs that targeted the kappa opioid receptor weren't worthwhile because they could extinguish happiness. Yes, they relieved pain, but they were distressing, causing anxiety and even symptoms of psychosis at high doses.

"The reason we thought this could still be something worth trying is nalfurafine itself," Kaski said. "This drug is apparently one that doesn't cause dysphoria as much. It's actually used in humans in Japan to treat itching related to renal failure. It's used for itch because it's very good at alleviating that type of pain, but it's also good at relieving other kinds of pain."

Kaski wanted to pin down the effect that the nalfurafine/morphine combination had on the animal models' disposition. When nalfurafine was given on its own, the drug did seem to trigger dysphoria in the animals, but when it was coupled with morphine, the dysphoria disappeared.

In fact, administering nalfurafine alongside morphine had a beneficial emotional effect. It neutralized the rewarding effects of morphine--which, on its own, was very rewarding to the animals. This effect might make morphine less addictive.

And according to Setola, ongoing studies show that nalfurafine has similar dose-sparing and anti-rewarding effects on oxycodone, "another commonly used--and addiction-fraught--opioid pain medication."

"You have to go several layers deep before you realize this isn't as bad of an idea as it looks on the surface," Kaski said. "If you've been around for a while, you're like, 'They've tried that a bunch. Why are you doing that?' But we're making incremental progress. We tried this new drug. There's so much work in pharmaceuticals that we went for it, and it looks like there might be something there."

Credit: 
West Virginia University

How an AI solution can design new tuberculosis drug regimens

ANN ARBOR--With a shortage of new tuberculosis drugs in the pipeline, a software tool from the University of Michigan can predict how current drugs--including unlikely candidates--can be combined in new ways to create more effective treatments.

"This could replace our traditional trial-and-error system for drug development that is comparatively slow and expensive," said Sriram Chandrasekaran, U-M assistant professor of biomedical engineering, who leads the research.

Dubbed INDIGO, short for INferring Drug Interactions using chemoGenomics and Orthology, the software tool has shown that the potency of tuberculosis drugs can be amplified when they are teamed with antipsychotics or antimalarials.

"This tool can accurately predict the activity of drug combinations, including synergy--where the activity of the combination is greater than the sum of the individual drugs," said Shuyi Ma, a research scientist at the University of Washington and a first author of the study. "It also accurately predicts antagonism between drugs, where the activity of the combination is lesser. In addition, it also identifies the genes that control these drug responses."

Among the combinations INDIGO identified as showing a strong likelihood of effectiveness against tuberculosis were:

A five-drug combination of the tuberculosis drugs Bedaquiline, Clofazimine, Rifampicin and Clarithromycin with the antimalarial drug P218.
A four-drug combination of Bedaquiline, Clofazimine, Pretomanid and the antipsychotic drug Thioridazine.
A combination of the antibiotics Moxifloxacin and Spectinomycin--two drugs that are typically antagonistic but can be made highly synergistic by the addition of a third drug, Clofazimine.

All three groupings were in the top 0.01% of synergistic combinations identified by INDIGO.

"Successful combinations identified by INDIGO, when tested in a lab setting, showed synergy 88.8% of the time," Chandrasekaran said.

Tuberculosis kills 1.8 million people each year and is the world's deadliest bacterial infection. There are 28 drugs currently used to treat tuberculosis, and those can be combined into 24,000 three- or four-drug combinations. If a pair of new drugs is added to the mix, that increases potential combinations to 32,000.
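The rounded combination counts quoted above follow from elementary combinatorics. A minimal sketch in Python (assuming the figures count all possible three- and four-drug subsets of the drug pool, which matches the numbers quoted):

```python
from math import comb

def three_and_four_drug_regimens(n_drugs: int) -> int:
    """Count every possible 3-drug and 4-drug subset of a pool of n_drugs."""
    return comb(n_drugs, 3) + comb(n_drugs, 4)

print(three_and_four_drug_regimens(28))  # 23751 -- roughly the 24,000 quoted for 28 drugs
print(three_and_four_drug_regimens(30))  # 31465 -- roughly the 32,000 quoted after adding two drugs
```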

These numbers make developing new treatment regimens time-consuming and expensive, the researchers say. At the same time, multidrug resistant strains are rapidly spreading.

At a time when new drugs are in short supply to deal with old-but-evolving diseases, this tool presents a new way to utilize medicine's current toolbox, they say. Answers may already be out there, and INDIGO's outside-the-box approach represents a faster way of finding them.

INDIGO utilizes a database of previously published research, broken down and quantified by the authors, along with detailed information on the properties of hundreds of drugs.

Credit: 
University of Michigan

What leads to compulsive alcohol use? New experiments into binge drinking provide answers

image: Example of imaging method used in research. Clip highlights normal, non-drinking activity.

Image: 
Vanderbilt University

Occasional binge drinking isn't uncommon, but about 30 percent of all adults exposed to alcohol go on to engage in compulsive drinking behaviors despite negative effects and consequences - a major feature of alcohol use disorder.

For years, researchers have sought answers as to why alcohol produces such radically different outcomes for drinkers - how is it that some individuals can drink for their entire adult life without developing compulsive habits, while others transition quickly to problem drinking?

Now, a new study from neuroscientists at Vanderbilt and The Salk Institute is providing initial answers to those long-standing scientific questions and a new method for researching what causes this transition from moderate to compulsive alcohol consumption.

The paper appears this week in Science.

"In our lab, we're focused on the neuroscience of addiction and understanding how neural activity patterns give rise to compulsive drug and alcohol use," said Cody Siciliano, assistant professor of pharmacology and author on the study. "In this study, we initially sought to understand how the brain is altered by binge drinking to drive compulsive alcohol consumption. In the process, we stumbled across a surprising finding where we were actually able to predict which subjects would become compulsive based on neural activity during the very first time they drank."

Using a behavioral model in mice, the team presents findings showing that even when subjects are given the same opportunity to drink, they split into distinct categories based on characteristics: light, heavy and compulsive binge drinkers (that is, those that continued to drink despite it resulting in a negative outcome).

The team began by recreating a drinking scenario (called a "binge-induced compulsion task") to assess how predisposition interacts with experience to produce compulsive drinking. They tracked compulsive alcohol drinking during these first drinking experiences, and again at later timepoints.

Using cellular-resolution calcium imaging and miniature microscopes, the researchers tracked the activity of neurons, visible as changes in fluorescence, during the very first time the subjects drank alcohol. The brighter and more active the neurons became, the less likely the subject would be to go on to develop compulsive drinking behaviors. In contrast, the neurons in drinkers predisposed for compulsive behavior quieted and decreased activity during drinking events.

Interestingly, the differences in neural activity were observed during the very first drinking experience, well before compulsive behaviors emerged, allowing researchers to predict ahead of time which subjects would go on to display problem drinking behaviors.

As a result, the findings helped construct a novel behavioral model, and the team identified the specific cortical-brainstem circuit that serves as both a biomarker and a cellular platform for the eventual development of compulsive drinking behavior.

According to Siciliano, the biomarker and platform findings have implications not only for the future of alcohol addiction studies but for other substance abuse studies as well.

"We developed this model to study the path to alcohol use disorder, but we plan to apply a similar framework to advance our understanding of compulsive use of other substances."

Credit: 
Vanderbilt University

Cancer linked with a more than doubled risk of dying from stroke

People living with or beyond cancer are more likely to die from stroke than the general public, according to new Penn State research, and certain types of cancer may boost the risk even more.

Researchers at Penn State College of Medicine found that compared to the general population, people who have or have had cancer are more than twice as likely to die of a stroke, and the risk increases with time. Additionally, cancers of the breast, prostate or colorectum were the types most commonly associated with fatal stroke.

Nicholas Zaorsky, assistant professor in radiation oncology and public health sciences, said the results -- recently published in Nature Communications -- may help physicians identify patients at risk for fatal strokes.

"Previous research has shown that most cancer patients aren't going to die of their cancer, they're going to die of something else," Zaorsky said. "A stroke is one possibility. Our findings suggest that patients may benefit from a screening program to help prevent some of these early deaths from stroke, as well as help identify which patients we could target with those preventative efforts."

According to the researchers, cancer is among the leading causes of death in the United States, with stroke being the fifth leading cause. But while institutions like the American Heart Association and the National Comprehensive Cancer Network provide separate guidelines for stroke prevention and advice for people beyond cancer treatment, there is little guidance for preventing strokes in people who have or have had cancer.

Zaorsky, a member of the Penn State Cancer Institute, said he and the other researchers were interested in identifying those at the highest risk of stroke to help future prevention efforts.

The researchers used data gathered from the National Cancer Institute's Surveillance, Epidemiology and End Results (SEER) program. SEER includes data about cancer incidence, survival, treatment and age and year of diagnosis, and covers 28 percent of the U.S. population.

For the current study, the researchers used SEER data on more than 7.2 million patients who had been diagnosed with invasive cancer -- cancer that has spread beyond the tissue in which it originally developed -- between 1992 and 2015.

The researchers found that out of 7,529,481 cancer patients, 80,513 died of a stroke. Males and females had equal chances of dying from a stroke, but those diagnosed with cancer at a younger age had a higher chance of a fatal stroke.

Additionally, they found that among those diagnosed with cancer before they turned 40, most strokes occurred in people treated for brain tumors and lymphomas. In patients diagnosed with cancer above the age of 40, fatal strokes were most commonly associated with cancer of the prostate, breast and colorectum.

Zaorsky said one explanation for the increased risk could be that many people who are diagnosed with cancer are in a "prothrombotic" state, which means they are more likely to form a blood clot.

"That blood clot may then go to the lungs and cause a pulmonary embolism, for example, or cause a stroke if it goes to the brain," Zaorsky said. "In general, it's an underlying theme and risk factor for a lot of cancer patients. And because certain cancers like those of the prostate, breast and colorectum are some of the most common cancers, that could also help explain that high association."

Brad Zacharia, assistant professor of neurosurgery, said another explanation may stem from the effects of certain types of cancer treatment.

"We can speculate that a subset of cancer patients are receiving chemotherapy or radiation treatments that may have a direct effect on the blood vessels to the brain and could increase stroke risk," Zacharia said. "This may be particularly true in patients with brain cancer."

The researchers added that future studies could help pinpoint mechanisms and further establish the relationship between cancer and strokes.

Credit: 
Penn State

CUHK Faculty of Engineering develops novel imaging approach

image: The researchers prepared two-photon microscopy images of a pollen grain by using (a) traditional point-scanning and (b) the new compressive imaging approach. The point-scanning imaging time was 2.2 seconds while the compressive imaging time required only 0.55 seconds.

Image: 
The Chinese University of Hong Kong

The research results were recently published in the journal Optics Letters.

Activities of neurons are generally completed on a time scale of 10 milliseconds, which makes it hard for conventional microscopes to observe these phenomena directly. This new compressive sensing two-photon microscopy can be applied to 3D imaging of the nerve distribution of living things or to monitoring activities from hundreds of neurons simultaneously.

New multi-focus laser scanning method to break the scanning speed limit of two-photon microscopes

Two-photon microscopy works by delivering ultrafast pulses of infrared laser light to the sample, where it interacts with fluorescent labels to create an image. It is extensively used in biological research because of its ability to produce high-resolution 3D images up to a depth of one millimeter in living tissue. These advantages, however, come at the cost of limited imaging speed because of the weak fluorescent signal.

To speed up scanning, the research team developed a multi-focus laser illumination method that uses a digital micromirror device (DMD). The research solves the problem that conventional DMDs cannot work with ultrafast lasers, enabling them to be integrated and used in beam shaping, pulse shaping, and two-photon imaging.

The DMD generates 30 points of focused laser light on randomly selected locations within a specimen. The position and intensity of each point of light are controlled by a binary hologram that is projected onto the device. During each measurement, the DMD reflashes the hologram to change the position of each focus and records the intensity of the two-photon fluorescence with a single-pixel detector. Although, in many ways, the DMD multi-focus scanning is more flexible and faster than traditional mechanical scanning, the speed is still limited by the DMD's refresh rate.

Combining the compressive sensing algorithm to further improve the imaging speed

The researchers further increased the imaging speed in this research by combining multi-focus scanning with compressive sensing. This approach enables image acquisition with fewer measurements. This is because it carries out image measurement and compression in a single step and then uses an algorithm to rebuild the images from the measurement results. For two-photon microscopy, it can reduce the number of measurements by between 70% and 90%.
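As a toy illustration of the compressive sensing principle described above, the sketch below recovers a sparse 1-D "scene" from roughly 70% fewer random measurements than pixels. It is a generic compressed-sensing example with illustrative sizes and a standard greedy solver, not the team's reconstruction algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_measurements, sparsity = 200, 60, 8    # ~70% fewer measurements than pixels

# Ground-truth sparse signal (stand-in for a mostly dark field of view).
x_true = np.zeros(n_pixels)
support = rng.choice(n_pixels, size=sparsity, replace=False)
x_true[support] = rng.normal(size=sparsity)

# Random measurement matrix: measurement and compression happen in one step.
A = rng.normal(size=(n_measurements, n_pixels)) / np.sqrt(n_measurements)
y = A @ x_true

# Greedy reconstruction (orthogonal matching pursuit) rebuilds the scene.
residual, chosen = y.copy(), []
for _ in range(sparsity):
    chosen.append(int(np.argmax(np.abs(A.T @ residual))))
    coeffs, *_ = np.linalg.lstsq(A[:, chosen], y, rcond=None)
    residual = y - A[:, chosen] @ coeffs

x_hat = np.zeros(n_pixels)
x_hat[chosen] = coeffs
print(f"relative reconstruction error: {np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true):.2e}")
```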

After conducting a simulation experiment to demonstrate the new method's performance and parameters, the researchers tested it with two-photon imaging experiments. These experiments demonstrated the technique's ability to produce high-quality 3D images at high imaging speeds from any field of view. For example, they were able to acquire 3D images of a pollen grain in just 0.55 seconds. The same images acquired with traditional point scanning took 2.2 seconds.

Prof. Shih-Chi Chen said, "This method achieved a three to five times enhancement in imaging speed without sacrificing the resolution. We believe this novel approach will lead to new discoveries in biology and medicine, such as optogenetics. The team is now working to further improve the speed of the reconstruction algorithm and image quality. We also plan to use the DMD together with other advanced imaging techniques, which allows imaging in deeper tissues."

Credit: 
The Chinese University of Hong Kong

Study finds associations between rheumatoid arthritis, other diseases before and after diagnosis

ROCHESTER, Minn. -- A Mayo Clinic-led study involving 3,276 patients has found that people with inflammatory bowel disease, Type 1 diabetes or blood clots may be at increased risk of developing rheumatoid arthritis. The study, published in Mayo Clinic Proceedings, also found that people who have rheumatoid arthritis are at increased risk of developing heart disease, blood clots and sleep apnea.

Comorbidities, or other chronic diseases or conditions, have been linked to poorer outcomes for patients with rheumatoid arthritis, including worsened physical disability, functional decline, poorer quality of life and increased mortality. While some research exists on comorbidities and their effects, this study leverages the Mayo Clinic Biobank, which contains data on 74 comorbidities and the age of onset for these comorbidities.

"We found that comorbidities accumulate in an accelerated fashion after diagnosis of rheumatoid arthritis," says Vanessa Kronzer, M.D., a clinician investigator fellow in rheumatology at Mayo Clinic and the study's corresponding author. "We also found that autoimmune diseases and epilepsy may predispose to development of rheumatoid arthritis, while heart disease and other conditions may develop as a result of rheumatoid arthritis."

The findings have important implications for understanding how rheumatoid arthritis develops. It also could lead to earlier detection and screening initiatives for other diseases and conditions.

Rheumatoid arthritis is a chronic inflammatory disorder that can affect not only the joints, but also can damage a wide variety of body systems, including the lungs, heart and blood vessels. Unlike the wear-and-tear damage of osteoarthritis, rheumatoid arthritis affects the lining of joints, causing a painful swelling that can result in bone erosion and joint deformity.

The study identified 821 patients with rheumatoid arthritis who were diagnosed at Mayo Clinic in Minnesota and Florida between January 2009 and February 2018, and enlisted 2,455 control participants, for a total sample of 3,276 participants. Researchers found that 11 comorbidities were associated with rheumatoid arthritis, including epilepsy and pulmonary fibrosis.

Among other new information in the study, blood clots occurred more commonly in rheumatoid arthritis cases before diagnosis, suggesting that systemic inflammation may start before the rheumatoid arthritis symptoms become clinically apparent. The association with Type 1 diabetes prior to diagnosis of rheumatoid arthritis also was strong, highlighting the importance of heightened suspicion of rheumatoid arthritis in patients with autoimmune diseases, and vice versa.

"Our findings suggest that people with certain conditions, such as Type 1 diabetes or inflammatory bowel disease, should be carefully monitored for rheumatoid arthritis," says Dr. Kronzer. "In addition, people who have rheumatoid arthritis, and their health care providers, should have heightened suspicion and a low threshold to screen for cardiovascular disease, blood clots and sleep apnea."

The Mayo Clinic Biobank is a collection of samples, including blood and blood derivatives, and health information donated by Mayo Clinic patients and other volunteers. Among its distinctive values for this research is the depth of self-reported health information gathered, including an extensive list of comorbidities.

Credit: 
Mayo Clinic

Photoinitiators detected in human breast milk

Photoinitiators (PIs) are compounds used in the ink of many types of food packaging. The substances have been shown to migrate into food and, when consumed, show up in human blood serum. Now, for the first time, researchers report they have detected PIs in human breast milk, although they say the levels consumed by breastfeeding infants are unlikely to be a health concern. The report appears in ACS' Environmental Science & Technology Letters.

Photopolymerization is widely considered a "green" technology for the manufacture of light-sensitive materials, such as ultraviolet (UV)-curable inks, coatings and resins. In this process, UV light degrades PIs to free radicals and other active substances that harden, or cure, the ink. However, not all of the PIs are used up during the reaction, and scientists have detected the compounds in food, indoor dust and blood serum. At high enough levels, some PIs have toxic or carcinogenic effects. Runzeng Liu and Scott Mabury wondered whether PIs could pass into human breast milk and, if so, how much of the compounds breastfed infants were likely to ingest. 

The researchers used mass spectrometry to analyze breast milk samples collected from 60 U.S. women. They detected 15 different PIs at a wide range of concentrations: from 0.46 ng/mL to 81.7 ng/mL. Benzophenone (BP) -- a potential carcinogen -- comprised 79% of the total PIs and was detected in 97% of the breast milk samples. The researchers note that BP is a natural product also present in fruits such as grapes, which could have contributed to the levels in milk. Based on infants' average milk consumption at different ages, the team estimated that infants younger than one month have the highest daily intake of PIs. However, the maximum amount of BP ingested as calculated by the researchers would still be about 4 times lower than the safe level set by the European Food Safety Authority, suggesting no or minor health risks to breastfeeding infants. Future studies should explore potential risks caused by simultaneous exposure to several PIs, the researchers say.    
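The intake estimate described above is a body-weight-normalized calculation of the form intake = concentration × daily milk volume per kilogram of body weight. A hedged back-of-the-envelope sketch, using only the total PI concentration range reported above and a hypothetical milk-intake figure for a very young infant (the study's actual consumption assumptions are not given in this summary):

```python
# Illustrative only: the milk-intake value is an assumed figure, not from the study.
milk_intake_ml_per_kg_bw_per_day = 150.0

for label, conc_ng_per_ml in [("lowest total PI level", 0.46),
                              ("highest total PI level", 81.7)]:
    daily_intake = conc_ng_per_ml * milk_intake_ml_per_kg_bw_per_day
    print(f"{label}: {daily_intake:,.0f} ng per kg body weight per day")
```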

Credit: 
American Chemical Society

Successful study of Swedish vaccine candidate against diarrhea

image: Prof. Ann-Mari Svennerholm, Sahlgrenska Academy, University of Gothenburg

Image: 
Photo by Johan Wingborg

The University of Gothenburg reports the first successful results for the oral, inactivated vaccine candidate ETVAX against enterotoxigenic E. coli diarrhea, from a placebo-controlled phase I/II study in infants and children from 6 months to 5 years of age in Bangladesh.

All predefined primary endpoints for the study were achieved, showing that the vaccine candidate was safe and broadly immunogenic, stimulating immune responses to all key vaccine components.

Only a few mild to moderate adverse reactions were observed among participants, while the vaccine induced impressive serum and intestinal immune responses in young children and infants, with 80-100% of children 2-5 years of age and 50-80% of infants 6-11 months of age responding to all key vaccine antigens.

Giving the vaccine together with an adjuvant enhanced the magnitude, breadth and kinetics of the intestinal immune responses in infants. Results are presented in this week's issue of The Lancet Infectious Diseases.

Enterotoxigenic E. coli (ETEC) bacteria are a primary cause of diarrhea, leading to substantial illness and death in children in low- and middle-income countries (LMICs) as well as in travelers to LMICs. Currently there is no ETEC vaccine available on the market for use in either children or travelers to ETEC high-risk areas, and ETEC vaccine development is a World Health Organization priority.

An oral ETEC vaccine candidate, ETVAX, was developed at University of Gothenburg in collaboration with Scandinavian Biopharma, Stockholm. ETVAX consists of inactivated E. coli bacteria expressing high levels of protective antigens and the ETEC-based B subunit protein LCTBA.

This clinical study examined the administration of ETVAX, given alone or together with different doses of an adjuvant, double-mutant heat-labile toxin (dmLT), to assess the vaccine's safety and immunogenicity in 450 children.

Descending age groups of children (2-5 years, 1-2 years and 6-11 months) were given two doses of vaccine, two weeks apart, in a double-blind manner. Each child received one of three fractionated dose levels (1/8, 1/4 or 1/2 of a full adult dose), with or without different doses (2.5-10 μg) of the dmLT adjuvant, or buffer alone (placebo), administered as a drink.

In addition to safety analyses, immune responses were determined by measuring the amount of antibodies produced in the intestine (feces) as well as antibodies secreted by lymphocytes circulating in blood to the intestine.

The study was conducted at icddr,b in Dhaka, Bangladesh, as a collaborative effort between the Sahlgrenska Academy at University of Gothenburg, icddr,b, Scandinavian Biopharma, and the non-profit global health organization, PATH.

The results confirm and extend the promising data previously reported from ETVAX trials in Swedish and Bangladeshi adults. Based on the results of this trial, studies were initiated in September 2019 to further test safety, immune responses and protection of ETVAX (including dmLT) in African children 6-23 months of age. A study evaluating the protective efficacy of ETVAX in Finnish travelers to Africa will be completed by the end of the first quarter of 2020.

Credit: 
University of Gothenburg

The tera-electron-volt from outer space

image: This is the Fermi space telescope under construction.

Image: 
NASA

Gamma-ray bursts are the most energetic phenomena known to humankind. Although short-lived, they outshine stars and even galactic quasars. They usually display energies in the region of tens of giga-electron-volts, but for the first time, researchers discovered a gamma-ray burst in the region of a tera-electron-volt. This level of energy has long been theorized, and this study demonstrates these energies might actually be more common than once thought.

During the height of the Cold War between the United States and the former Soviet Union, there were satellites in orbit around the Earth whose sole purpose was to keep an eye out for the telltale gamma-ray signature of an atomic explosion. From time to time, these satellites would send an alert to their controllers who would no doubt scramble to find out what the eyes above had seen. Only in these cases there was no evidence of a nuclear explosion, and through repeated observations from multiple satellites, it seemed the signals were not coming from Earth but were in fact coming from outer space.

This is how gamma-ray bursts (GRBs) were first discovered. But it took many years for the data to reach the scientific community as it was strictly classified at the time. Fast-forward to the present day and now astronomers have a fairly good understanding of these highly energetic phenomena. GRBs result from the formation of neutron stars or black holes as dying stars collapse. They are triggered by outflows of plasma ejected near the speed of light.

GRBs first give away their presence with brief flashes of gamma rays with energies in the region of mega-electron-volts (MeV), or millions of electron volts. One electron volt is the equivalent energy a single electron gains when accelerated by a voltage of one volt. Hot on the tail of the brief flashes is a long-lasting afterglow of electromagnetic radiation from weaker radio waves to strong gamma rays with energies of giga-electron-volts (GeV) -- billions of electron volts. These were the highest GRB energies observed, until now.
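For scale, the unit ladder works out as follows (standard physics conversions, not figures from the study):

\[
1\,\mathrm{eV} \approx 1.6\times10^{-19}\,\mathrm{J},\qquad
1\,\mathrm{TeV} = 10^{3}\,\mathrm{GeV} = 10^{6}\,\mathrm{MeV} = 10^{12}\,\mathrm{eV} \approx 1.6\times10^{-7}\,\mathrm{J},
\]

so the TeV photons reported here carry roughly a million times the energy of the MeV flashes that first announce a burst.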

"High-energy GRBs with energies in the region of tera-electron-volts (TeV) (trillions of electron volts) were theoretically predicted. Astronomers have searched for such powerful bursts for 15 years," said Professor Masahiro Teshima from the Institute for Cosmic Ray Research at the University of Tokyo. "My international team and I are proud to announce the discovery of the first gamma-ray burst with observed energies up to 1 tera-electron-volt, by far the highest-energy photons ever detected from a GRB."

The GRB in question is dubbed GRB190114C (as it was observed on Jan. 14, 2019) and it was first seen by two scientific satellites in orbit around Earth, NASA's Neil Gehrels Swift Observatory and Fermi Gamma-ray Space Telescope. Although instruments on these are made to see gamma rays, neither satellite could observe a TeV signal directly. However, when they notice any GRB activity, they immediately signal a more versatile ground-based instrument that can, and it's MAGIC.

MAGIC stands for the Major Atmospheric Gamma Imaging Cherenkov telescope and it has the capacity to detect high-energy gamma rays in the region of TeV. After the satellites caught the initial burst, which was nothing out of the ordinary, MAGIC turned its two giant 17-meter reflecting mirrors towards the source in time to see the high-energy gamma-ray afterglow at an unprecedented 0.3-1 TeV. It began a minute after the burst and lasted for about 20 more.

"Although long anticipated, the detection of TeV gamma rays from GRBs had been an extremely challenging endeavor. It was finally realized here with very high significance for the first time, after many years of technical improvements and dedicated efforts," explained Teshima. "Continuing efforts with existing gamma-ray telescopes, as well as the new Cherenkov Telescope Array currently under construction, promise to bring forth new physical insight into the most luminous electromagnetic explosions in the universe."

Credit: 
University of Tokyo

R.I. researchers, policymakers outline new framework for opioid use disorder treatment

PROVIDENCE, R.I. [Brown University] -- Every day, more than 100 Americans lose their lives to the opioid crisis, and researchers from across the nation are racing to find solutions. One of the latest strategies -- a cascade of care model for the State of Rhode Island -- was developed collaboratively by a diverse group of stakeholders, including experts from Brown University, state agency leaders and community advocates.

The research team detailed the model in a paper published in the journal PLOS Medicine on Tuesday, Nov. 19.

"We hope we've created a tool that policymakers and state agencies can use to make data-driven decisions that improve care in our state," said Jesse Yedinak, the study's lead author and a project director at the Centers for Epidemiology and Environmental Health at the Brown University School of Public Health.

To create the model, the team revised an existing framework to define five stages of care for people with opioid use disorder (OUD):

Stage 0: at risk for OUD
Stage 1: diagnosed with OUD
Stage 2: initiated a medication-based treatment plan
Stage 3: continuously engaged with this treatment plan
Stage 4: recovery

Next, the team consulted national surveys and statewide insurance claims databases to estimate the number of Rhode Islanders in each stage. These estimates help to identify gaps in care, the researchers said.

For instance, 47,000 Rhode Islanders were estimated to be at risk for OUD in 2016, meaning that they reported using heroin or taking other opioids for non-medical purposes. However, only about 26,000 of those individuals -- 55 percent -- had received an OUD diagnosis.

"This first gap suggests that we need a lot more screening to identify people who have active opioid use disorder or are significantly high risk of overdose," said Brandon Marshall, an associate professor of epidemiology at Brown and senior author of the paper.

The model also highlights a significant gap between diagnosis and linkage to treatment: Of those estimated 26,000 individuals who had been diagnosed, less than half had initiated medication-based treatment. As a follow-up to this finding, further research is being done to evaluate the factors that make people more likely to seek treatment after an OUD diagnosis.

Stage 3, which contained an estimated 8,300 Rhode Islanders, consisted of individuals who stayed in medication-based treatment for more than 180 days. Stage 4 -- recovery -- contained about 4,200 individuals and was a unique feature of this model.
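One rough way to read the gaps in the cascade is to express each stage as a fraction of the stage before it. A small sketch using only the Rhode Island estimates quoted in this article (the stage 2 count is described only as "less than half" of those diagnosed, so it is omitted rather than guessed):

```python
# 2016 Rhode Island estimates quoted in the article.
cascade = [
    ("Stage 0: at risk for OUD", 47_000),
    ("Stage 1: diagnosed with OUD", 26_000),
    ("Stage 3: engaged in treatment >180 days", 8_300),
    ("Stage 4: recovery", 4_200),
]

for (prev_name, prev_n), (name, n) in zip(cascade, cascade[1:]):
    print(f"{name}: {n:,} ({n / prev_n:.0%} of '{prev_name}')")
```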

"In many of the other opioid use disorder care continuums, the final stage is remission, which is clinically defined as the absence of opioid-related problems," Marshall said. "The committee did not feel this was very inspiring or patient-centered, so they strongly encouraged us to define the final stage as recovery -- which is more positive and moves beyond the absence of OUD-related problems to look at the person as a whole."

During the development process, the team was also conscious of the broader impact the model could have. To that end, they tried to make it adaptable for implementation in other states. The paper includes a specific glossary, for example, and the data sources that the model drew from should be available throughout the nation.

Marshall and Yedinak added that they hope to update the model at least once a year, and they have several long-term goals in mind.

One goal is to use the model's data to aid in the prevention of OUD by reducing the number of people who are classified as at risk. As a longer-term implication, they also hope to start generating population health targets. For example, the United Nations AIDS organization, UNAIDS, set a 90-90-90 target for HIV: By 2020, they aim for 90 percent of all people living with HIV to have received a diagnosis. Of those, 90 percent will receive treatment. And of the 90 percent receiving treatment, 90 percent will have achieved viral suppression.

"We don't have a target like that for opioid use disorder yet," Marshall said. "But now that we have the estimates, we can start thinking: What should we reach toward, given our current resources -- and given more resources, what would be realistic to achieve?"

Credit: 
Brown University

Academics call for targeted healthcare for pregnant women and new mums with depression

Professor Sue Jordan from Swansea University's College of Human and Health Sciences, who led the study, says the findings could be used to help clinicians improve care for women during pregnancy and after they have given birth. The new research, entitled Antidepressants and perinatal outcomes, including breastfeeding, is published today in the journal PLOS ONE.

To investigate the health of babies born to women who had been treated for depression, the study team used data curated by the SAIL Databank, in collaboration with SAIL Databank’s analytical services team, based in Swansea University Medical School.  

Researchers examined data from more than 100,000 babies born between 2000 and 2010. This included 2,043 babies (1.9%) whose mothers were prescribed antidepressants throughout their pregnancy and 4,252 whose prescriptions stopped in the first trimester.

The study gives key insights into the outcomes recorded, such as preterm birth or low birth weight and also shows, for the first time, which babies were being breastfed at 6-8 weeks. 

A key finding of the study was:

Although the analysis does not detail the causes, women prescribed antidepressants, particularly high-dose selective serotonin reuptake inhibitors, were less likely to be breastfeeding at 6-8 weeks and might be more likely to have babies with low birth weights.

Professor Jordan said: “Our study makes for sobering reading: the data show which women are vulnerable to reduced breastfeeding rates, preterm delivery, and giving birth to small babies. The data should be considered alongside our previous reports of increased risks of congenital anomalies following antidepressant prescriptions in early pregnancy.

“Women prescribed antidepressants could and should be identified from primary care prescription records and targeted for additional support before conception. Our analysis makes a very strong case for closer monitoring for women prescribed antidepressants, including scans in the third trimester (or alternative continuous monitoring technology) to check on the baby’s growth and development.”

Journal

PLoS ONE

DOI

10.1371/journal.pone.0225133

Credit: 
Swansea University