Astronomers detect regular rhythm of radio waves, with origins unknown

A team of astronomers, including researchers at MIT, has picked up on a curious, repeating rhythm of fast radio bursts emanating from an unknown source outside our galaxy, 500 million light years away.

Fast radio bursts, or FRBs, are short, intense flashes of radio waves that are thought to be the product of small, distant, extremely dense objects, though exactly what those objects might be is a longstanding mystery in astrophysics. FRBs typically last a few milliseconds, during which time they can outshine entire galaxies.

Since the first FRB was observed in 2007, astronomers have catalogued over 100 fast radio bursts from distant sources scattered across the universe, outside our own galaxy. For the most part, these detections were one-offs, flashing briefly before disappearing entirely. In a handful of instances, astronomers observed fast radio bursts multiple times from the same source, though with no discernible pattern.

This new FRB source, which the team has catalogued as FRB 180916.J0158+65, is the first to produce a periodic, or cyclical, pattern of fast radio bursts. The pattern begins with a noisy, four-day window, during which the source emits random bursts of radio waves, followed by a 12-day period of radio silence.

The astronomers observed that this 16-day pattern of fast radio bursts recurred consistently over 500 days of observations.

"This FRB we're reporting now is like clockwork," says Kiyoshi Masui, assistant professor of physics in MIT's Kavli Institute for Astrophysics and Space Research. "It's the most definitive pattern we've seen from one of these sources. And it's a big clue that we can use to start hunting down the physics of what's causing these bright flashes, which nobody really understands."

Masui is a member of the CHIME/FRB collaboration, a group of more than 50 scientists led by the University of British Columbia, McGill University, the University of Toronto, and the National Research Council of Canada. The collaboration operates and analyzes data from the Canadian Hydrogen Intensity Mapping Experiment, or CHIME, a radio telescope in British Columbia that was the first to pick up signals from the new periodic FRB source.

The CHIME/FRB Collaboration has published the details of the new observation today in the journal Nature.

A radio view

In 2017, CHIME was erected at the Dominion Radio Astrophysical Observatory in British Columbia, where it quickly began detecting fast radio bursts from galaxies across the universe, billions of light years from Earth.

CHIME consists of four large antennas, each about the size and shape of a snowboarding half-pipe, and is designed with no moving parts. Rather than swiveling to focus on different parts of the sky, CHIME stares fixedly at the entire sky, using digital signal processing to pinpoint the region of space where incoming radio waves are originating.

From September 2018 to February 2020, CHIME picked out 38 fast radio bursts from a single source, FRB 180916.J0158+65, which the astronomers traced to a star-churning region on the outskirts of a massive spiral galaxy, 500 million light years from Earth. The source is the most active FRB source that CHIME has yet detected, and until recently it was the closest FRB source to Earth.

As the researchers plotted each of the 38 bursts over time, a pattern began to emerge: One or two bursts would occur over four days, followed by a 12-day period without any bursts, after which the pattern would repeat. This 16-day cycle occurred again and again over the 500 days that they observed the source.
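
The arithmetic behind spotting such a cycle can be sketched in a few lines: fold each burst's arrival time modulo a candidate period and check whether the bursts cluster into a narrow phase window. The sketch below uses synthetic timestamps, not the actual CHIME data, and takes 16.35 days as the trial period for the roughly 16-day cycle described above:

```python
# Phase-folding sketch for detecting a periodic burst window.
# The timestamps are synthetic illustrations, NOT the actual CHIME/FRB data.
import random

random.seed(0)
PERIOD = 16.35  # trial period in days (the ~16-day cycle described above)

# Simulate bursts that occur only in the first 4 days of each cycle.
bursts = [cycle * PERIOD + random.uniform(0.0, 4.0) for cycle in range(30)]

def folded_phases(times, period):
    """Map each arrival time to a phase in [0, 1) of the candidate period."""
    return [(t % period) / period for t in times]

def phase_span(times, period):
    """Fraction of the cycle spanned by the folded burst phases."""
    phases = folded_phases(times, period)
    return max(phases) - min(phases)

# At the true period the bursts fold into a narrow window (about 4/16.35 of
# the cycle); at a wrong trial period they smear across nearly the whole cycle.
print(phase_span(bursts, PERIOD))  # small
print(phase_span(bursts, 11.0))    # much larger, approaching 1
```

A real search would scan many trial periods and pick the one that minimizes the folded span (or maximizes a clustering statistic), but the principle is the same.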

"These periodic bursts are something that we've never seen before, and it's a new phenomenon in astrophysics," Masui says.

Circling scenarios

Exactly what phenomenon is behind this new extragalactic rhythm is a big unknown, although the team explores some ideas in their new paper. One possibility is that the periodic bursts may be coming from a single compact object, such as a neutron star, that is both spinning and wobbling -- an astrophysical phenomenon known as precession. Assuming that the radio waves emanate from a fixed spot on the object, and that the object's wobble points this spot toward Earth for only four out of every 16 days, we would observe the radio waves as periodic bursts.

Another possibility involves a binary system, such as a neutron star orbiting another neutron star or black hole. If the first neutron star emits radio waves, and is on an eccentric orbit that briefly brings it close to the second object, the tides between the two objects could be strong enough to cause the first neutron star to deform and burst briefly before it swings away. This pattern would repeat when the neutron star swings back along its orbit.

The researchers considered a third scenario, involving a radio-emitting source that circles a central star. If the star emits a wind, or cloud of gas, then every time the source passes through the cloud, the gas from the cloud could periodically magnify the source's radio emissions.

"Maybe the source is always giving off these bursts, but we only see them when it's going through these clouds, because the clouds act as a lens," Masui says.

Perhaps the most exciting possibility is the idea that this new FRB, and even those that are neither periodic nor repeating, may originate from magnetars -- a type of neutron star thought to have an extremely powerful magnetic field. The particulars of magnetars are still a bit of a mystery, but astronomers have observed that they do occasionally release massive amounts of radiation across the electromagnetic spectrum, including energy in the radio band.

"People have been working on how to make these magnetars emit fast radio bursts, and this periodicity we've observed has since been worked into these models to figure out how this all fits together," Masui says.

Very recently, the same group made a new observation that supports the idea that magnetars may in fact be a viable source for fast radio bursts. In late April, CHIME picked up a signal that looked like a fast radio burst, coming from a flaring magnetar, some 30,000 light years from Earth. If the signal is confirmed, this would be the first FRB detected within our own galaxy, as well as the most compelling evidence of magnetars as a source of these mysterious cosmic sparks.

Credit: 
Massachusetts Institute of Technology

New family of enzymes reveals the Achilles' heel of fungal pathogens

image: Dr. Lynne Howell (left) and Dr. Don Sheppard (right) solved the structure of a key enzyme responsible for biofilm formation in the fungal pathogen A. fumigatus.

Image: 
University of Toronto (Dr. Lynne Howell), McGill University (Dr. Don Sheppard)

Aspergillus fumigatus is a species of fungus that can cause serious illnesses in immunocompromised individuals such as those who are undergoing transplantation or cancer chemotherapy. Every year, about 500,000 new Aspergillus cases are reported, and even with antifungal agents in place, the mortality rate remains over 50%. Infections caused by A. fumigatus are difficult to treat because during an infection, the fungus aggregates into small communities called "biofilms." These biofilms not only protect the pathogens from antifungal agents, but also help the fungus evade the immune system. Researchers around the world have been trying to understand how biofilms are produced and how they can be disrupted, as this knowledge will be crucial for developing effective therapeutics.

In a recent paper published in Nature Communications, GlycoNet researchers Dr. Lynne Howell from The Hospital for Sick Children and Dr. Don Sheppard from McGill University solved the structure of a key enzyme called Agd3, which is critical for biofilm formation in A. fumigatus.

For over six years, Howell and Sheppard have been working together to find vulnerabilities in the biofilms produced by different pathogens. More specifically, they are investigating a group of carbohydrate polymers produced by different enzymes in the pathogens. These carbohydrate polymers serve as a strong 'glue' that holds the biofilm together.

"We want to know how these carbohydrates are synthesized and which enzymes are making them," says Sheppard. "If we know how it (biofilm) is made, we know how to take it apart."

The team first found that when this enzyme was missing, the biofilm did not form, and the fungus was weakened. After locating the enzyme in the fungal genome, the team took a deeper dive into the 3D-structure of the enzyme to understand the mechanism by which Agd3 functions in biofilm formation.

"With structural studies, we were able to visualize where and how the enzyme binds the carbohydrate polymers and modifies them to help form the biofilm," says Howell. "Furthermore, we found that this enzyme is composed of several different domains. The architecture of these domains and how they piece together to form the enzyme have never been seen before." Howell says the structural analysis also helped the team define a new family of carbohydrate-processing enzymes that has not been previously characterized.

Resistance of A. fumigatus to antifungal reagents continues to be a health threat worldwide. Sheppard believes gaining structural knowledge of Agd3 will be helpful to develop strategies addressing this concern. In fact, the team is already onto the next step.

"We are now designing antibodies that can inhibit the function of Agd3 based on structural information we gathered," says Sheppard. In collaboration with Howell lab, the team hopes to use structural data of how the antibodies binding to Agd3 to further the development of antibody therapeutics for infections caused by A. fumigatus.

To learn more about the story behind the paper from conception to publication, as well as the highs and lows along the way, read the blog post on the Nature Microbiology Community written by first author and former GlycoNet trainee Dr. Natalie Bamford.

Credit: 
Canadian Glycomics Network

Gut bacteria may modify behavior in worms, influencing eating habits

Gut bacteria are tiny but may play an outsized role not only in the host animal's digestive health, but in their overall well-being. According to a new study in Nature, specific gut bacteria in the worm may modify the animal's behavior, directing its dining decisions. The research was funded in part by the National Institutes of Health.

"We keep finding surprising roles for gut bacteria that go beyond the stomach," said Robert Riddle, Ph.D., program director at the NIH's National Institute of Neurological Disorders and Stroke (NINDS), which supported the study. "Here, the gut bacteria are influencing how the animal senses its environment and causing it to move toward an external source of the same bacteria. The gut bacteria are literally making their species tastier to the animal."

Researchers at Brandeis University, Waltham, Massachusetts, led by Michael O'Donnell, Ph.D., postdoctoral fellow and first author of the paper, and Piali Sengupta, Ph.D., professor of biology and senior author of the study, were interested in seeing whether it was possible for gut bacteria to control a host animal's behavior. The group investigated the effects of gut bacteria on how worms, called C. elegans, sniff out and choose their next meal.

Bacteria are the worms' primary food. In this study, the researchers measured how worms fed different strains of bacteria reacted to octanol, a large alcohol molecule secreted by some bacteria, which worms normally avoid when it is present at high concentrations.

Dr. O'Donnell and his colleagues discovered that worms grown on Providencia alcalifaciens (JUb39) were less likely to avoid octanol compared to animals grown on other bacteria. Curiously, they found that live JUb39 bacteria were present in the gut of the worms that moved toward octanol, suggesting that the behavior may be determined in part by a substance produced by these bacteria.

Next, the researchers wanted to know how the bacteria exerted control over the worms.

"We were able to connect the dots, all the way from microbe to behavior, and determine the entire pathway that could be involved in this process," said Dr. O'Donnell.

The brain chemical tyramine may play an important role in this response. In the worms, tyramine is transformed into the chemical octopamine, which targets a receptor on sensory neurons that controls avoidance behavior. The results of this study suggested that tyramine produced by bacteria increased levels of octopamine, which made the worms more tolerant of octanol by suppressing the avoidance of octanol that is driven by these neurons.

Using other behavioral tests, the researchers found that genetically engineering worms so that they did not produce tyramine did not affect suppression of octanol avoidance when the worms were grown on JUb39. This suggests that tyramine made by the bacteria may be able to compensate for the endogenous tyramine missing in those animals.

Additional experiments indicated that worms grown on JUb39 preferred eating that type of bacteria over other bacterial food sources. Tyramine produced by the bacteria was also found to be required for this decision.

"In this way, the bacteria can take control over the host animal's sensory decision-making process, which affects their responses to odors and may influence food choices" said Dr. Sengupta.

Future studies will identify additional brain chemicals produced by bacteria that may be involved in changing other worm behaviors. In addition, it is unknown whether specific combinations of bacterial strains present in the gut will result in different responses to environmental cues. Although worms and mammals share many of the same genes and biochemical processes, it is not known whether similar pathways and outcomes exist in higher order animals.

Credit: 
NIH/National Institute of Neurological Disorders and Stroke

Study finds 82 percent of avocado oil rancid or mixed with other oils

Consumer demand is rising for all things avocado, including oil made from the fruit. Avocado oil is a great source of vitamins, minerals and the type of fats associated with reducing the risk of heart disease, stroke and diabetes. But according to new research from food science experts at the University of California, Davis, the vast majority of avocado oil sold in the U.S. is of poor quality, mislabeled or adulterated with other oils.

In the country's first extensive study of commercial avocado oil quality and purity, UC Davis researchers report that at least 82 percent of test samples were either stale before their expiration date or mixed with other oils. In three cases, bottles labeled as "pure" or "extra virgin" avocado oil contained nearly 100 percent soybean oil, an oil commonly used in processed foods that is much less expensive to produce.

"I was surprised some of the samples didn't contain any avocado oil," said Selina Wang, Cooperative Extension specialist in the Department of Food Science and Technology, who led the study recently published in the journal Food Control. "Most people who buy avocado oil are interested in the health benefits, as well as the mild, fresh flavor, and are willing to pay more for the product. But because there are no standards to determine if an avocado oil is of the quality and purity advertised, no one is regulating false or misleading labels. These findings highlight the urgent need for standards to protect consumers and establish a level playing field to support the continuing growth of the avocado oil industry."

TESTING DOMESTIC AND IMPORTED BRANDS

Wang and Hilary Green, a Ph.D. candidate in Wang's lab, analyzed various chemical parameters of 22 domestic and imported avocado oil samples, which included all the brands they could find in local stores and online. Wang and Green received a $25,000 grant from Dipasa USA, part of the Dipasa Group, a sesame-seed and avocado-oil processor and supplier based in Mexico.

"In addition to testing commercial brands, we also bought avocados and extracted our own oil in the lab, so we would know, chemically, what pure avocado oil looks like," Wang said.

Test samples included oils of various prices, some labeled extra virgin or refined. Virgin oil is supposed to be extracted from fresh fruit using only mechanical means, and refined oil is processed with heat or chemicals to remove any flaws.

Fifteen of the samples were oxidized before the expiration date. Oil loses its flavor and health benefits when it oxidizes, which happens over time and when exposed to too much light, heat or air. Six samples were mixed with large amounts of other oils, including sunflower, safflower and soybean oil.

Only two brands produced samples that were pure and nonoxidized. Those were Chosen Foods and Marianne's Avocado Oil, both refined avocado oils made in Mexico. Among the virgin grades, CalPure produced in California was pure and fresher than the other samples in the same grade.

A PUSH FOR STANDARDS

Ensuring quality is important for consumers, retailers, producers and people throughout the avocado oil industry. Retailers want to sell quality products, shoppers want to get their money's worth and honest producers want to keep fraudulent and low-quality oil out of the marketplace.

But since avocado oil is relatively new on the scene, the Food and Drug Administration has not yet adopted "standards of identity," which are basic food standards designed to protect consumers from being cheated by inferior products or confused by misleading labels. Over the last 80 years, the FDA has issued standards of identity for hundreds of products, like whiskey, chocolate, juices and mayonnaise. Without standards, the FDA has no means to regulate avocado oil quality and authenticity.

Avocado oil isn't the only product without enforceable standards. Honey, spices and ground coffee are other common examples. Foods that fetch a higher price are especially ripe for manipulation, particularly when adulterations can be too subtle to detect outside a lab.

Wang is working to develop faster, better and cheaper chemical methods to detect adulteration so bulk buyers can test avocado oil before selling it. She is also evaluating more samples, performing shelf-life studies to see how time and storage affect quality, and encouraging FDA officials to establish reasonable standards for avocado oil.

Wang has experience collaborating with industry and the FDA. Ten years ago, she analyzed the quality and purity of extra virgin olive oil and discovered that most of what was being sold in the U.S. was actually a much lower grade. Her research sparked a cascade of responses that led California to establish one of the world's most stringent standards for different grades of olive oil. The FDA is working with importers and domestic producers to develop standards of identity for olive oil.

"Consumers seeking the health benefits of avocado oil deserve to get what they think they are buying," Wang said. "Working together with the industry, we can establish standards and make sure customers are getting high-quality, authentic avocado oil and the companies are competing on a level playing field."

TIPS FOR CONSUMERS

-The flavor of virgin avocado oil can differ by varieties and region. In general, authentic, fresh, virgin avocado oil tastes grassy, buttery and a little bit like mushrooms.

-Virgin avocado oil should be green in color, whereas refined avocado oil is light yellow and almost clear due to pigments removed during refining.

-Even good oil becomes rancid with time. It's important to purchase a reasonable size that can be finished before the oil oxidizes. Store the oil away from light and heat. A cool, dark cabinet is a good choice, rather than next to the stove.

-How do you know if the oil is rancid? It starts to smell stale, sort of like play dough.

-When possible, choose an oil that's closest to the harvest/production time to ensure maximum freshness. The "best before date" is not always a reliable indicator of quality.

Credit: 
University of California - Davis

Researchers develop microscopy technique for noninvasive evaluation of wound healing

image: Microscopy images obtained on day 3 of the wound healing study. OCT-A shows the development of blood vessels, SHG shows collagen reorganization around the wound, FAD and FLIM images provide chemical information about the imaged area.

Image: 
Images courtesy of the GSK Center for Optical Molecular Imaging

Researchers at the GSK Center for Optical Molecular Imaging have developed a new microscope that looks at the different parameters that change during wound healing. They hope to use this technique to understand how skin disorders, such as foot ulcers in diabetic patients and psoriasis, can be treated.

“Nobody really understands how topical drugs affect the skin because they can’t see below the skin,” said Marina Marjanovic, an associate professor of bioengineering and associate director of the center, which is located at the Beckman Institute for Advanced Science and Technology at the University of Illinois at Urbana-Champaign.

“We need this technique to understand whether the available treatments are curing the underlying condition or just the symptoms,” said Marjanovic, who also is a member of Beckman’s Biophotonics Imaging Lab.

The new microscope can look at different aspects of wound healing simultaneously. The researchers used it to get images of the wound, track collagen that helps in wound healing, and visualize the blood vessel distribution around the wound. Additionally, they measured various chemicals in the tissue that indicate how much inflammation is occurring.

The paper “Non-invasive monitoring of pharmacodynamics during the skin wound healing process using multimodal optical microscopy” was published in BMJ Open Diabetes Research & Care.

“This is a continuation of a previous study we did where we made a wound on the ears of mice,” said Aneesh Alex, a visiting scholar at the Beckman Institute and a scientist at GlaxoSmithKline. “In this study, we added more visualization capabilities and studied the wound healing process on the backs of the mice, which is a more accurate model.”

The researchers studied the healing process for a month. They looked at four groups of mice: mice with untreated wounds, mice given a placebo treatment, and mice given two different concentrations of the treatment drug.

“One of the main limitations of optical imaging techniques is their shallow penetration depth in biological tissues,” said Eric Chaney, a research scientist in the Biophotonics Imaging Lab and at the center. The limitation is due to the scattering of light, which makes it difficult for the researchers to look at deeper tissue structures.

“The major advantages of this technique are that we do not use any labels in our imaging and it is completely noninvasive,” Marjanovic said. “We have also done follow-up studies in humans and we have been able to look at the changes in the skin of healthy volunteers and patients with skin conditions.”

The researchers hope to further refine the technique so it can be used for routine studies of skin disorders and their treatments.

“Our Center for Optical Molecular Imaging at the Beckman Institute has been a unique and productive academic-industry partnership with GlaxoSmithKline,” said Dr. Stephen Boppart, Abel Bliss Professor of Engineering and a professor of electrical and computer engineering and of bioengineering, who also is a medical doctor. “By combining our state-of-the-art optical imaging technologies, we have the ability to not only visualize and understand the molecular interactions that occur, but also how these processes may affect drug action and efficacy.”

The work was done in collaboration with GlaxoSmithKline. The data analysis was done with the help of Salma Musaad, a research biostatistician at the Interdisciplinary Health Sciences Institute.

The paper “Non-invasive monitoring of pharmacodynamics during the skin wound healing process using multimodal optical microscopy” can be found at http://dx.doi.org/10.1136/bmjdrc-2019-000974.

Journal

BMJ Open Diabetes Research & Care

DOI

10.1136/bmjdrc-2019-000974

Credit: 
Beckman Institute for Advanced Science and Technology

Quantum-inspired approach dramatically lowers light power needed for OCT

image: Researchers used a technology borrowed from quantum optics to perform optical coherence tomography (OCT) with much lower light powers than previously possible. Two views of their optical setup are shown.

Image: 
Andrzej Romański

WASHINGTON -- Researchers have shown that a detection technology borrowed from quantum optics can be used to perform optical coherence tomography (OCT) with much lower light power than previously possible. This could greatly improve the imaging quality available from OCT used for medical imaging applications.

OCT uses light to provide high-resolution 3D images in a non-invasive manner. Although it is commonly used for ophthalmology applications, OCT can also be used to image many other parts of the body such as the skin and inside the ears, mouth, arteries and gastrointestinal tract.

"For clinical applications, being able to perform OCT with low light power is crucial because safety standards limit the light intensity levels that can be used," said research team leader Sylwia Kolenderska from The University of Auckland in New Zealand. "In some cases, these power levels are not high enough to achieve good image quality."

In The Optical Society (OSA) journal Optics Letters, the researchers describe how they replaced standard OCT detectors with superconducting single-photon detectors (SSPDs), a technology used in quantum optics to distinguish individual photons. This setup allowed them to achieve good image quality with power levels up to 1 million times lower than those currently used in OCT instruments.

"In the future, if single-photon detection technology could be made much smaller and less expensive, a line of portable diagnostic machines based on light-based imaging might be created for safe self-diagnosis purposes in the comfort of one's home," said Kolenderska.

Capturing single photons

The researchers came up with the new detection scheme while developing an OCT method based on quantum light for which SSPDs were central. They soon realized that SSPDs could also be used in a standard OCT arrangement to enhance sensitivity.

"Because SSPDs can detect single photons, an OCT instrument using them requires only a tiny amount of light compared to what is currently used in modern OCT machines," said Kolenderska. "Yet, it still produces high-detail images that are comparable with existing OCT systems."

Incorporating SSPDs into a standard OCT system required some changes to the typical optical setup. Modern OCT instruments work by discerning the colors, or wavelengths, of light reflected from an object. This wavelength discrimination can be performed with a single-pixel detector while the light source produces one wavelength at a time, or with a diffraction grating that splits the light into its component wavelengths, much like a prism, paired with a camera that detects them.

The researchers used a fiber instead of a grating to separate different colors, which each travel at different speeds down the fiber. At the fiber's output end, they used the SSPD to capture the different colors as they arrive at different times. This allowed the light spectrum to be acquired for reconstructing OCT images.
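
The fiber works by chromatic dispersion: each wavelength acquires a different group delay, so a photon's arrival time at the detector encodes its color. A minimal sketch of that time-to-wavelength mapping follows; the dispersion coefficient, fiber length and reference wavelength are illustrative assumptions, not values from the paper:

```python
# Time-to-wavelength mapping in a dispersive fiber: each wavelength travels at
# a slightly different group velocity, so a photon's arrival time at the
# single-photon detector encodes its wavelength. The dispersion coefficient,
# fiber length and reference wavelength below are illustrative assumptions.

D_PS_PER_NM_KM = 17.0    # assumed fiber dispersion, picoseconds per nm per km
FIBER_LENGTH_KM = 10.0   # assumed fiber length, km
LAMBDA_REF_NM = 1300.0   # assumed reference wavelength, nm

def arrival_delay_ps(wavelength_nm):
    """Extra group delay (ps) of a photon relative to the reference wavelength."""
    return D_PS_PER_NM_KM * FIBER_LENGTH_KM * (wavelength_nm - LAMBDA_REF_NM)

def wavelength_from_delay(delay_ps):
    """Invert the mapping: recover the photon's wavelength from its delay."""
    return LAMBDA_REF_NM + delay_ps / (D_PS_PER_NM_KM * FIBER_LENGTH_KM)

# A photon 10 nm redder than the reference arrives 1700 ps later, and the
# timing measurement recovers its wavelength.
delay = arrival_delay_ps(1310.0)
print(delay)                         # 1700.0
print(wavelength_from_delay(delay))  # 1310.0
```

Accumulating many such time-tagged photons yields the spectrum from which an OCT depth profile can be reconstructed.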

Low-power light yields high-quality images

To demonstrate the new detection scheme, the researchers acquired OCT images of a stack of three types of glass and a piece of onion, which represented a biological sample. They obtained good-quality images of both samples at light intensity levels at least five orders of magnitude lower than those set by safety standards.

"Our results show that the new detection approach could allow quality OCT imaging of different parts of the body, especially sensitive organs such as the eyes, without worrying about going above the safety levels in terms of light power," said Kolenderska. "In fact, the SSPD would be damaged beyond repair long before even 1% of the safety level is reached."

The researchers did, however, observe artifacts -- elements that do not correspond to the structure of the sample -- in the OCT images they acquired. These appear because the detection system detects all kinds of interactions between photons, not just the ones needed to reconstruct an actual image. They are experimenting to find the best way to prevent these artifacts without compromising imaging speed, which would be important to maintain for clinical applications.

Credit: 
Optica

A proven method for stabilizing efforts to bring fusion power to Earth

image: Physicist Florian Laggner before the DIII-D tokamak with a figure from his paper.

Image: 
Photo by Alessandro Bortolon. Composite by Elle Starkman/PPPL Office of Communications.

All efforts to replicate in tokamak fusion facilities the fusion energy that powers the sun and stars must cope with a constant problem -- transient heat bursts that can halt fusion reactions and damage the doughnut-shaped tokamaks. These bursts, called edge localized modes (ELMs), occur at the edge of hot, charged plasma gas when it kicks into high gear to fuel fusion reactions.

To prevent such bursts researchers at the DIII-D National Fusion Facility, which General Atomics (GA) operates for the U.S. Department of Energy (DOE), previously pioneered an approach that injects small ripples of magnetic fields into the plasma to cause heat to leak out controllably. Now scientists at the DOE's Princeton Plasma Physics Laboratory (PPPL) have developed a control scheme to optimize the levels of these fields for maximum performance without ELMs.

Path to suppressing ELMs

The research, led by PPPL physicist Florian Laggner and funded by the DOE Office of Science, developed the scheme at DIII-D in San Diego. Laggner said the method, put together with researchers from GA and other collaborating institutions, reveals a path to suppressing ELMs and maximizing fusion power on ITER, the international tokamak under construction in France that is designed to demonstrate the practicality of fusion energy. "We show a path forward, a way that it can be done," said Laggner, lead author of a paper reporting the findings in Nuclear Fusion.

Fusion powers the sun and stars by combining light elements in the form of plasma -- the hot, charged state of matter composed of free electrons and atomic nuclei that makes up 99 percent of the visible universe -- to generate massive amounts of energy. Scientists around the world are seeking to harness fusion for a virtually inexhaustible supply of safe and clean power to generate electricity.

The demonstrated technique uses the expanded capacity of the DIII-D plasma control system to address the inherent conflict between optimizing fusion energy and controlling ELMs. The scheme focuses on the "pedestal," the thin, dense layer at the edge of the plasma that increases the plasma's pressure and thus the fusion power. However, if the pedestal grows too high it can create ELM heat bursts by suddenly collapsing.

So the key is controlling the height of the pedestal to maximize fusion power while preventing the layer from becoming so high that it triggers ELMs. The combination calls for real-time control of the process. "You can't just preprogram some constant scheme beforehand, since the plasma and wall conditions may evolve," said Egemen Kolemen, an assistant professor of Mechanical and Aerospace Engineering at Princeton University and a PPPL physicist who oversaw the project. "The control must provide adjustments in real time."
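
The kind of real-time adjustment described above can be illustrated with a toy feedback loop that nudges the ripple amplitude down to recover confinement, and back up when ELMs reappear. The stand-in plasma model and all numeric values below are invented for illustration; this is not DIII-D's actual control law:

```python
# Toy feedback loop for ELM-suppression amplitude, illustrating real-time
# adaptation of a magnetic-ripple amplitude. The "plasma" model and all
# numbers are invented for illustration, not DIII-D's actual control law.

def plasma_elms(amplitude):
    """Stand-in plasma: ELMs occur whenever the ripple amplitude is too low."""
    SUPPRESSION_THRESHOLD = 2.0  # arbitrary units
    return amplitude < SUPPRESSION_THRESHOLD

def control_step(amplitude, elms_detected, step=0.1):
    """Raise amplitude when ELMs appear; otherwise lower it to regain confinement."""
    return amplitude + step if elms_detected else amplitude - step

amplitude = 3.0  # start well above the (unknown-to-the-controller) threshold
history = []
for _ in range(50):
    elms = plasma_elms(amplitude)
    amplitude = control_step(amplitude, elms)
    history.append(amplitude)

# The loop settles into a narrow band around the suppression threshold,
# i.e. near the minimum amplitude that keeps ELMs suppressed.
print(min(history[-10:]), max(history[-10:]))
```

Operating near that minimum matters because excess ripple amplitude degrades plasma confinement, which is the trade-off the DIII-D scheme manages in real time.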

Stable ELM suppression

The developed system created ELM suppression at the minimum amplitude, or size, of the magnetic disturbance. It further reduced the amplitude to allow partial recovery of the confinement lost during the process, thereby achieving both stable ELM suppression and high fusion performance.

"Laggner and colleagues have assembled an impressive suite of control tools to regulate core and edge plasma stability in real-time," said GA physicist Carlos Paz-Soldan, a coauthor of the paper. "Some kind of adaptive control like the techniques pioneered in this work will likely be necessary to regulate the plasma edge stability in ITER."

The international facility will not simply apply the control system developed by PPPL and GA; rather, it must create its own method for coping with ELMs. Indeed, "active control schemes will enable safe operation at maximized [fusion] gain in future devices such as ITER," the authors said. Moreover, they added, implementation of such a scheme on DIII-D provides proof of principle and "guides future development."

PPPL, on Princeton University's Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas -- ultra-hot, charged gases -- and to developing practical solutions for the creation of fusion energy. The Laboratory is managed by the University for the U.S. Department of Energy's Office of Science, which is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit energy.gov/science (link is external).

Credit: 
DOE/Princeton Plasma Physics Laboratory

Centenarian study suggests living environment may be key to longevity

Spokane, Wash. - When it comes to living to the ripe old age of 100, good genes help but don't tell the full story. Where you live has a significant impact on the likelihood that you will reach centenarian age, suggests a new study conducted by scientists at Washington State University's Elson S. Floyd College of Medicine.

Published in the International Journal of Environmental Research and Public Health and based on Washington State mortality data, the research team's findings suggest that Washingtonians who live in highly walkable, mixed-age communities may be more likely to live to their 100th birthday. They also found socioeconomic status to be correlated, and an additional analysis showed that geographic clusters where the probability of reaching centenarian age is high are located in urban areas and smaller towns with higher socioeconomic status, including the Seattle area and the region around Pullman, Wash.

"Our study adds to the growing body of evidence that social and environmental factors contribute significantly to longevity," said study author Rajan Bhardwaj, a second-year WSU medical student who took an interest in the topic after serving as a home care aide to his aging grandfather. Earlier research, he said, has estimated that heritable factors only explain about 20 to 35% of an individual's chances of reaching centenarian age.

"We know from previous research that you can modify, through behavior, your susceptibility to different diseases based on your genetics," explained Ofer Amram, the study's senior author and an assistant professor who runs WSU's Community Health and Spatial Epidemiology (CHaSE) lab.

In other words, when you live in an environment that supports healthy aging, this likely impacts your ability to successfully beat your genetic odds through lifestyle changes. However, there was a gap in knowledge as to the exact environmental and social factors that make for an environment that best supports living to centenarian age, which this study helped to address.

In collaboration with co-authors Solmaz Amiri and Dedra Buchwald, Bhardwaj and Amram looked at state-provided data about the deaths of nearly 145,000 Washingtonians who died at age 75 or older between 2011 and 2015. The data included information on each person's age and place of residence at the time of death, as well as their sex, race, education level and marital status.

Based on where the person lived, the researchers used data from the American Community Survey, Environmental Protection Agency, and other sources to assign a value or score to different environmental variables for their neighborhood. The variables they looked at included poverty level, access to transit and primary care, walkability, percentage of working age population, rural-urban status, air pollution, and green space exposure. Subsequently, they conducted a survival analysis to determine which neighborhood and demographic factors were tied to a lower probability of dying before centenarian age.

They found that neighborhood walkability, higher socioeconomic status, and a high percentage of working age population (a measure of age diversity) were positively correlated with reaching centenarian status.

"These findings indicate that mixed-age communities are very beneficial for everyone involved," said Bhardwaj. "They also support the big push in growing urban centers toward making streets more walkable, which makes exercise more accessible to older adults and makes it easier for them to access medical care and grocery stores."
Amram added that neighborhoods that offer more age diversity tend to be in urban areas, where older adults are likely to experience less isolation and more community support.

Meanwhile, Bhardwaj said their findings also highlight the importance of continuing efforts to address health disparities experienced by racial minorities, such as African Americans and Native Americans. Consistent with previous research findings, for example, the data shows being white is correlated with living to 100. Looking at gender, the researchers also found that women were more likely to reach centenarian age.

Finally, the researchers wanted to see in which areas of the state people had a higher probability of reaching centenarian age. For each neighborhood, they calculated the years of potential life lost, or the average number of years deceased individuals would have had to continue living to reach age 100. Neighborhoods with lower values for years of potential life lost were considered to have a higher likelihood of reaching centenarian age, and vice versa.
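The years-of-potential-life-lost measure described above can be sketched in a few lines of code. This is a minimal illustration of the metric only; the ages and neighborhood names are hypothetical, not data from the study.

```python
# Years of potential life lost (YPLL) relative to a target age of 100:
# for each deceased individual, the shortfall between their age at death
# and the target, averaged over a neighborhood. Ages below are invented.
TARGET_AGE = 100

def ypll(ages_at_death, target=TARGET_AGE):
    """Average years by which deceased individuals fell short of the target age."""
    losses = [max(0, target - age) for age in ages_at_death]
    return sum(losses) / len(losses)

# Two hypothetical neighborhoods: a lower YPLL value indicates a higher
# likelihood of residents reaching centenarian age, and vice versa.
neighborhood_a = [98, 101, 95, 100]   # shortfalls: 2, 0, 5, 0
neighborhood_b = [76, 80, 79, 85]     # shortfalls: 24, 20, 21, 15

print(ypll(neighborhood_a))  # 1.75
print(ypll(neighborhood_b))  # 20.0
```

Note that individuals who died at or beyond age 100 contribute zero loss, which is why mapping this value across neighborhoods highlights areas where reaching centenarian age is most likely.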

When they mapped the years of potential life lost for all neighborhoods across the state, they saw clusters with high likelihood of living to centenarian age in higher socioeconomic areas in urban centers and small towns across the state, including the greater Seattle area and the Pullman region.

While more research is needed to expand upon their findings, the researchers said the study findings could eventually be used to create healthier communities that promote longevity in older adults.

Credit: 
Washington State University

Tick surveillance and control lagging in US, study shows

image: While the prevalence of Lyme disease and other illnesses spread by ticks has steadily increased in the United States over the past 20 years, a new study of the state of American tick surveillance and control reveals an inconsistent and often under-supported patchwork of programs across the country. Such programs are critical in managing the public-health threat posed by ticks such as the blacklegged tick (Ixodes scapularis), shown here in multiple life stages suspended in a vial.

Image: 
Northeast Regional Center for Excellence in Vector-Borne Diseases

Annapolis, MD; June 17, 2020--While the prevalence of Lyme disease and other illnesses spread by ticks has steadily increased in the United States over the past 20 years, a new study of the state of American tick surveillance and control reveals an inconsistent and often under-supported patchwork of programs across the country.

Annually reported cases of tickborne disease more than doubled between 2004 and 2018, according to the U.S. Centers for Disease Control and Prevention (CDC), while seven new tickborne germs were discovered in that same timeframe. But a clear gap exists in our public health infrastructure, say researchers who have conducted the first-ever survey of the nation's tick management programs.

The survey showed that less than half of public health and vector-control agencies engage in active tick surveillance, and only 12 percent directly conduct or otherwise support tick-control efforts. These and other findings from the survey, conducted by university researchers at the CDC's five Vector-Borne Disease Regional Centers of Excellence, are published today in the Journal of Medical Entomology.

"Ticks are responsible for the majority of our vector-borne illnesses in the U.S., and our programming does not adequately meet the need in its current form, for both surveillance and control," says Emily M. Mader, MPH MPP, lead author on the study and program manager at the Northeast Regional Center for Excellence in Vector-Borne Diseases, housed at Cornell University.

Mader and colleagues surveyed 140 vector-borne disease professionals working at state, county, and local agencies in the fall of 2018 to learn about their program objectives and capabilities for tick surveillance and control, testing ticks for disease-causing germs, and barriers to success. Reaching even that many respondents proved challenging, as no central database of tick-management programs or contacts was available.

Highlights from the survey of tick-management programs include:

Less than half of tick-management programs proactively collect ticks in their area. While about two-thirds of respondents (65 percent) said their programs engage in passive tick surveillance, such as accepting tick samples submitted by the public, only 46 percent said their programs engage in routine active tick surveillance, such as focused collection of tick samples within their community.

Only a quarter of tick-management programs test ticks for disease-causing germs. Just 26 percent of survey respondents said their jurisdiction conducts or financially supports testing of tick samples for disease-causing pathogens. And only 7 percent said their programs work to evaluate the presence of such pathogens in the animal hosts (such as mice and other rodents) from which ticks acquire those pathogens in their area.

"Pathogen testing is an essential component of surveillance and is needed in order to understand tickborne disease risk to communities," says Mader. "There appears to be a significant barrier for many tick-surveillance programs across the country to access pathogen-testing services."

Capacity for public tick-control efforts is low. Only 12 percent of respondents said that their jurisdiction conducts or financially supports tick control, with those efforts primarily focused on reducing tick presence on animal hosts (such as deer and rodents).

Mader says limited resources mean tick-management programs need reliable, proven control methods. "They are not going to invest in a strategy unless it has been investigated and shown to make a difference in reducing the burden of ticks and tickborne diseases," she says. "Right now, supporting this research is a major need. These kinds of evaluations often take at least three years to complete and require a significant investment."

Tick surveillance and control happen in a range of sectors. The most common employment sectors among respondents were public health, mosquito control, cooperative extension, and agriculture. More than half of respondents (57 percent) said their programs work with academic partners to conduct tick surveillance.

"The world of ticks reaches entomologists, veterinarians, medical doctors, public health, natural resource managers, farmers, pet owners, scientists, and anyone that enjoys the outdoors," says Nohra Mateus-Pinilla, Ph.D., co-author on the study and director of the Veterinary Epidemiology Laboratory at the University of Illinois's Illinois Natural History Survey. "The partnerships stand out because broad, collaborative networks are paramount to a positive and productive path for the advancement of this field."

Info and data sharing on ticks and public health is lagging. Less than a quarter of respondents said their tick-management programs disseminate information to local health departments (23 percent) or report data to the CDC (14 percent).

Greater support for tick-management programs is critical. To improve tick-management programs, respondents commonly cited the need for stable funding, training for personnel, and standardized, research-based guidance and protocols.

In December 2019, the Kay Hagan Tick Act was signed into federal law, authorizing $150 million to strengthen various aspects of the nation's efforts to combat vector-borne disease, including reauthorization of the CDC's Vector-Borne Disease Regional Centers of Excellence for an additional five years, through 2026. The CDC also issued guidance documents in late 2018 and early 2020 to provide tick-management programs with best practices for surveillance of blacklegged tick species (Ixodes scapularis and Ixodes pacificus) and other hard tick species across the U.S.

These steps address needs revealed in the survey of tick-management programs, and Mader and Mateus-Pinilla say the survey will serve as an important baseline from which to measure future progress and improvement.

"Overall, tick-work demands a long-term commitment. Ticks can take years to complete their life cycle, use different hosts to move around, and take advantage of weather and habitat changes," says Mateus-Pinilla. "As such, research on these vectors requires long-term and sustained commitment to research, surveillance, and partnerships across a broad range of disciplines, health professionals, and the public."

Credit: 
Entomological Society of America

Nanofiber masks can be sterilized multiple times without filter performance deterioration

image: Schematic diagram on spraying and dipping treatments of face mask filters.

Image: 
Copyright ©2020 American Chemical Society

With the global spread of coronavirus infections, personal protective equipment, especially hygiene face masks, is receiving much attention. Masks are essential items for the primary protection of the respiratory tract from viruses and bacteria that are transmitted through the air as droplets.

N95 masks are currently difficult to obtain, so there is an urgent need for a safe method of prolonging their usability through disinfection and reuse with minimal loss of performance and integrity. Particulate filtration and air permeability are the key factors in determining performance when cleaning and disinfecting N95-certified masks, and preserving both is crucial to preventing infections. Shinshu University has a track record of conducting research on production methods and applications of "nanofiber non-woven fabric" since before the coronavirus outbreak.

With this current social backdrop, a research team led by Professor Ick Soo Kim of Shinshu University's Institute for Fiber Engineering (IFES), with Ph.D. students Sana Ullah and Azeem Ullah, and Professor Cha Hyung Joon of POSTECH (specially invited professor of IFES), with Ph.D. students Jaeyun Lee and Yeonsu Jeong, looked into the effectiveness of sterilizing N95 masks. They compared commercially available melt-blown nonwoven fabric N95 masks with nonwoven nanofiber masks carrying N95 filters, examining filtration efficiency, wearer comfort, and filter shape change after washing and disinfecting. The disinfection methods tested were spraying 75% ethanol directly on the mask filter and air drying, and soaking the mask filter in a 75% ethanol solution for 5 minutes to 24 hours before leaving it to air dry naturally.

Filtration efficiency of both of the filters (melt-blown filter and the nanofiber filter) was 95% or more before use, which indicates that the respiratory organs of the wearer can be effectively protected. The tests also clarified that the inside of the filter can be effectively sterilized by spraying ethanol 3 times or more or immersing it in an ethanol solution for more than 5 minutes. However, when the mask was reused after the ethanol disinfection, the filtration efficiency of the melt-blown filter decreased to 64%. On the other hand, the nanofiber filter did not deteriorate in filter performance even after 10 or more uses.

The melt-blown filter captures particulate matter by electrostatic charge; because ethanol spraying or dipping dissipated the charge on its surface, its efficiency decreased significantly. The filtration mechanism of the nanofiber filter, by contrast, is independent of static charge and depends entirely on the pore diameter, pore distribution, and morphology of the nanofibers. Since disinfection did not affect the morphology of the nanofibers, the filter maintained the same optimal filtration performance it had before use.

In addition, the nanofiber filter has higher heat emission and carbon dioxide emission performance than the melt-blown filter, and exhibits excellent breathability. Similarly, it was confirmed that the nanofiber filter had lower cytotoxicity than the melt-blown filter when a safety experiment using human skin and vascular cells was performed.

As stated above, both mask filters have similar filtering performance at the time of first use, but after disinfecting and reusing, the nanofiber filter does not exhibit performance deterioration. In other words, nanofiber filters can be easily sterilized with ethanol at home and reused multiple times.

"This research is an experimental verification of the biological safety of nanofiber masks and the maintenance of filtration efficiency after washing, which has recently become a problem," states Professor Cha Hyung Joon, who co-led the research. Professor Ick Soo Kim hopes that nanofiber masks will serve as a means of prevention in the second and third waves of coronavirus infections.

Credit: 
Shinshu University

Better than cyclodextrins

Molecular containers that remove drugs, toxins, or malodorous substances from the environment are called sequestering agents. Scientists have developed a class of molecular containers that specifically sequester neurotransmitter antagonists. The barrel-shaped molecules called Pillar[n]MaxQ bind neuromuscular blocking chemicals 100,000-fold more tightly than established macrocyclic detoxification agents, the researchers report in the journal Angewandte Chemie.

Molecular containers of the cyclodextrin type sequester their targets by complexation. The ring- or barrel-shaped molecules recognize the molecular features of the target molecules and pull them into the central cavity using hydrophobic forces. Once the target molecule is inside this molecular container, it is neutralized. This host-guest complexation is the mechanism by which cyclodextrins, which are large, ring-shaped sugar molecules, eliminate unpleasant odors.

However, cyclodextrins are not very specific and fail for most alkaloids--a class of nitrogen-containing chemicals, including neurotransmitters and many illicit drugs. For these compounds, a class of molecular containers called pillararenes appear to be useful. They keep the alkaloids tightly bound in their pillararene cavity by wrapping a ring wall of aromatic benzene units around the hydrocarbon-rich molecular body.

Lyle Isaacs and his research team from the University of Maryland have further advanced the structure of the pillararenes to make the host-guest interactions stronger and more specific. "We envisioned to create a higher negative charge density around the mouth of the cavity by introducing acidic sulfate functional groups," the authors wrote. The negatively charged sulfate groups attract and bind quaternary ammonium ions, which are a hallmark of several clinically important neuromuscular blocking agents. The sulfate groups also stiffened the molecular structure of the barrels, the researchers found, so that the drug guest was smoothly pulled into the cavity by hydrophobic forces.

The researchers dubbed the molecular containers Pillar[n]MaxQ, where n indicates a target-size-dependent diameter that is variable. They observed that this class of sequestering agents binds the neuromuscular blockers up to 100,000-fold more tightly than the cyclodextrin container Sugammadex, which is in clinical use. Moreover, the sequestering agent discriminated against acetylcholine, a natural transmitter of nerve impulses within the central and peripheral nervous systems, which should not be sequestered.

The authors measured the host-guest complexation activities of Pillar[n]MaxQ by titration studies involving calorimetry and nuclear magnetic resonance of the guest molecules. As pillararenes have also been shown to reverse the effects of neuromuscular agents in rats, the researchers are aiming to study the new Pillar[n]MaxQ sequestering actions in animal models. Because of the high binding and the specificity of the chemically tailored molecular containers, they are confident that they will observe positive results.

Credit: 
Wiley

Mild thyroid dysfunction affects one in five women with a history of miscarriage or subfertility

WASHINGTON--Mild thyroid abnormalities affect up to one in five women with a history of miscarriage or subfertility, a prolonged period of trying to become pregnant, according to a new study published in the Endocrine Society's Journal of Clinical Endocrinology & Metabolism.

Thyroid disorders are common in women of reproductive age. Although the prevalence of thyroid disorders in pregnancy is well understood, little is known about how common these disorders are in women prior to pregnancy. Detecting thyroid disorders before a woman becomes pregnant is essential because thyroid abnormalities can have negative effects such as reduced fertility, miscarriage and pre-term birth.

"This study has found that mild thyroid abnormalities affect up to one in five women who have a history of miscarriage or subfertility and are trying for a pregnancy," said Rima Dhillon-Smith, M.B.Ch.B., Ph.D., of the University of Birmingham and the Birmingham Women's and Children's NHS Foundation Trust in Birmingham, U.K. "It is important to establish whether treatment of mild thyroid abnormalities can improve pregnancy outcomes, given the high proportion of women who could potentially be affected."

This study was conducted across 49 hospitals in the U.K. over five years. The researchers studied over 19,000 women with a history of miscarriage or subfertility who were tested for thyroid function. They found that up to one in five women had mild thyroid dysfunction, especially those with an elevated BMI and those of Asian ethnicity, but overt thyroid disease was rare. Women who suffered multiple miscarriages were no more likely to have thyroid abnormalities than women who had conceived naturally and had a history of one miscarriage.

Credit: 
The Endocrine Society

Nanosponges could intercept coronavirus infection

video: Nanoparticles cloaked in human lung cell membranes and human immune cell membranes can attract and neutralize the SARS-CoV-2 virus in cell culture, causing the virus to lose its ability to hijack host cells and reproduce. The UC San Diego researchers call their nano-scale particles "nanosponges" because they soak up harmful pathogens and toxins.

Image: 
David Baillot/University of California San Diego

Nanoparticles cloaked in human lung cell membranes and human immune cell membranes can attract and neutralize the SARS-CoV-2 virus in cell culture, causing the virus to lose its ability to hijack host cells and reproduce.

The first data describing this new direction for fighting COVID-19 were published on June 17 in the journal Nano Letters. The "nanosponges" were developed by engineers at the University of California San Diego and tested by researchers at Boston University.

The UC San Diego researchers call their nano-scale particles "nanosponges" because they soak up harmful pathogens and toxins.

In lab experiments, both the lung cell and immune cell types of nanosponges caused the SARS-CoV-2 virus to lose nearly 90% of its "viral infectivity" in a dose-dependent manner. Viral infectivity is a measure of the ability of the virus to enter the host cell and exploit its resources to replicate and produce additional infectious viral particles.

Instead of targeting the virus itself, these nanosponges are designed to protect the healthy cells the virus invades.

"Traditionally, drug developers for infectious diseases dive deep on the details of the pathogen in order to find druggable targets. Our approach is different. We only need to know what the target cells are. And then we aim to protect the targets by creating biomimetic decoys," said Liangfang Zhang, a nanoengineering professor at the UC San Diego Jacobs School of Engineering.

His lab first created this biomimetic nanosponge platform more than a decade ago and has been developing it for a wide range of applications ever since. When the novel coronavirus appeared, the idea of using the nanosponge platform to fight it came to Zhang "almost immediately," he said.

In addition to the encouraging data on neutralizing the virus in cell culture, the researchers note that nanosponges cloaked with fragments of the outer membranes of macrophages could have an added benefit: soaking up inflammatory cytokine proteins, which are implicated in some of the most dangerous aspects of COVID-19 and are driven by immune response to the infection.

Making and testing COVID-19 nanosponges

Each COVID-19 nanosponge--a thousand times smaller than the width of a human hair--consists of a polymer core coated in cell membranes extracted from either lung epithelial type II cells or macrophage cells. The membranes cover the sponges with all the same protein receptors as the cells they impersonate--and this inherently includes whatever receptors SARS-CoV-2 uses to enter cells in the body.

The researchers prepared several different concentrations of nanosponges in solution to test against the novel coronavirus. To test the ability of the nanosponges to block SARS-CoV-2 infectivity, the UC San Diego researchers turned to a team at Boston University's National Emerging Infectious Diseases Laboratories (NEIDL) to perform independent tests. In this BSL-4 lab--the highest biosafety level for a research facility--the researchers, led by Anthony Griffiths, associate professor of microbiology at Boston University School of Medicine, tested the ability of various concentrations of each nanosponge type to reduce the infectivity of live SARS-CoV-2 virus--the same strains that are being tested in other COVID-19 therapeutic and vaccine research.

At a concentration of 5 milligrams per milliliter, the lung cell membrane-cloaked sponges inhibited 93% of the viral infectivity of SARS-CoV-2, while the macrophage-cloaked sponges inhibited 88%.
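The percent-inhibition figures quoted above follow from a simple ratio of viral titers with and without treatment. A minimal sketch of that calculation is below; the titer numbers are hypothetical values chosen only to reproduce the reported percentages, not measurements from the study.

```python
# Percent inhibition of viral infectivity: the fraction of infectivity
# removed by a treatment, relative to an untreated control.
# Titer values here are hypothetical (e.g., plaque-forming units/mL).
def percent_inhibition(control_titer, treated_titer):
    """Return percent reduction in infectivity relative to control."""
    return (1 - treated_titer / control_titer) * 100

# Hypothetical titers illustrating the reported ~93% and ~88% figures:
print(round(percent_inhibition(1_000_000, 70_000)))   # 93
print(round(percent_inhibition(1_000_000, 120_000)))  # 88
```

Measuring at several nanosponge concentrations and seeing the inhibition rise with dose is what the article means by a "dose-dependent" effect.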

"From the perspective of an immunologist and virologist, the nanosponge platform was immediately appealing as a potential antiviral because of its ability to work against viruses of any kind. This means that as opposed to a drug or antibody that might very specifically block SARS-CoV-2 infection or replication, these cell membrane nanosponges might function in a more holistic manner in treating a broad spectrum of viral infectious diseases. I was optimistically skeptical initially that it would work, and then thrilled once I saw the results and it sunk in what this could mean for therapeutic development as a whole," said Anna Honko, a co-first author on the paper and a Research Associate Professor, Microbiology at Boston University's National Emerging Infectious Diseases Laboratories (NEIDL).

In the next few months, the UC San Diego researchers and collaborators will evaluate the nanosponges' efficacy in animal models. The UC San Diego team has already shown short-term safety in the respiratory tracts and lungs of mice. If and when these COVID-19 nanosponges will be tested in humans depends on a variety of factors, but the researchers are moving as fast as possible.

"Another interesting aspect of our approach is that even as SARS-CoV-2 mutates, as long as the virus can still invade the cells we are mimicking, our nanosponge approach should still work. I'm not sure this can be said for some of the vaccines and therapeutics that are currently being developed," said Zhang.

The researchers also expect these nanosponges would work against any new coronavirus or even other respiratory viruses, including whatever virus might trigger the next respiratory pandemic.

Mimicking lung epithelial cells and immune cells

Since the novel coronavirus often infects lung epithelial cells as the first step in COVID-19 infection, Zhang and his colleagues reasoned that it would make sense to cloak a nanoparticle in fragments of the outer membranes of lung epithelial cells to see if the virus could be tricked into latching onto it instead of a lung cell.

Macrophages, which are white blood cells that play a major role in inflammation, also are very active in the lung during the course of a COVID-19 illness, so Zhang and colleagues created a second sponge cloaked in macrophage membrane.

The research team plans to study whether the macrophage sponges also have the ability to quiet cytokine storms in COVID-19 patients.

"We will see if the macrophage nanosponges can neutralize the excessive amount of these cytokines as well as neutralize the virus," said Zhang.

Using macrophage cell fragments as cloaks builds on years of work to develop therapies for sepsis using macrophage nanosponges.

In a paper published in 2017 in Proceedings of the National Academy of Sciences, Zhang and a team of researchers at UC San Diego showed that macrophage nanosponges can safely neutralize both endotoxins and pro-inflammatory cytokines in the bloodstream of mice.
A San Diego biotechnology company co-founded by Zhang called Cellics Therapeutics is working to translate this macrophage nanosponge work into the clinic.

A potential COVID-19 therapeutic
The COVID-19 nanosponge platform has significant testing ahead of it before scientists know whether it would be a safe and effective therapy against the virus in humans, Zhang cautioned. But if the sponges reach the clinical trial stage, there are multiple potential ways of delivering the therapy, including direct delivery into the lung for intubated patients, via an inhaler as for asthmatic patients, or intravenously, especially to treat the complication of cytokine storm.

A therapeutic dose of nanosponges might flood the lung with a trillion or more tiny nanosponges that could draw the virus away from healthy cells. Once the virus binds with a sponge, "it loses its viability and is not infective anymore, and will be taken up by our own immune cells and digested," said Zhang.

"I see potential for a preventive treatment, for a therapeutic that could be given early because once the nanosponges get in the lung, they can stay in the lung for some time," Zhang said. "If a virus comes, it could be blocked if there are nanosponges waiting for it."

Growing momentum for nanosponges

Zhang's lab at UC San Diego created the first membrane-cloaked nanoparticles over a decade ago. The first of these nanosponges were cloaked with fragments of red blood cell membranes. These nanosponges are being developed to treat bacterial pneumonia and have undergone all stages of pre-clinical testing by Cellics Therapeutics, the San Diego startup cofounded by Zhang. The company is currently in the process of submitting the investigational new drug (IND) application to the FDA for its lead candidate: red blood cell nanosponges for the treatment of methicillin-resistant Staphylococcus aureus (MRSA) pneumonia. The company estimates the first patients in a clinical trial will be dosed next year.

The UC San Diego researchers have also shown that nanosponges can deliver drugs to a wound site; sop up bacterial toxins that trigger sepsis; and intercept HIV before it can infect human T cells.

The basic construction for each of these nanosponges is the same: a biodegradable, FDA-approved polymer core is coated in a specific type of cell membrane, so that it might be disguised as a red blood cell, an immune T cell, or a platelet. The cloaking keeps the immune system from spotting and attacking the particles as dangerous invaders.

"I think of the cell membrane fragments as the active ingredients. This is a different way of looking at drug development," said Zhang. "For COVID-19, I hope other teams come up with safe and effective therapies and vaccines as soon as possible. At the same time, we are working and planning as if the world is counting on us."

Credit: 
University of California - San Diego

Cellular nanosponges could soak up SARS-CoV-2

image: In this illustration, a nanosponge coated with a human cell membrane acts as a decoy to prevent a virus from entering cells.

Image: 
Adapted from Nano Letters 2020, DOI: 10.1021/acs.nanolett.0c02278

Scientists are working overtime to find an effective treatment for COVID-19, the illness caused by the new coronavirus, SARS-CoV-2. Many of these efforts target a specific part of the virus, such as the spike protein. Now, researchers reporting in Nano Letters have taken a different approach, using nanosponges coated with human cell membranes -- the natural targets of the virus -- to soak up SARS-CoV-2 and keep it from infecting cells in a petri dish.

To gain entry, SARS-CoV-2 uses its spike protein to bind to two known proteins on human cells, called ACE2 and CD147. Blocking these interactions would keep the virus from infecting cells, so many researchers are trying to identify drugs directed against the spike protein. Anthony Griffiths, Liangfang Zhang and colleagues had a different idea: making a nanoparticle decoy with the virus' natural targets, including ACE2 and CD147, to lure SARS-CoV-2 away from cells. And to test this idea, they conducted experiments with the actual SARS-CoV-2 virus in a biosafety level 4 lab.

The researchers coated a nanoparticle polymer core with cell membranes from either human lung epithelial cells or macrophages -- two cell types infected by SARS-CoV-2. They showed that the nanosponges had ACE2 and CD147, as well as other cell membrane proteins, projecting outward from the polymer core. When administered to mice, the nanosponges did not show any short-term toxicity. Then, the researchers treated cells in a dish with SARS-CoV-2 and the lung epithelial or macrophage nanosponges. Both decoys neutralized SARS-CoV-2 and prevented it from infecting cells to a similar extent. The researchers plan to next test the nanosponges in animals before moving to human clinical trials. In theory, the nanosponge approach would work even if SARS-CoV-2 mutates to resist other therapies, and it could be used against other viruses, as well, the researchers say.

Credit: 
American Chemical Society

Bouillon fortified with a new iron compound could help reduce iron deficiency

image: Iron fortification of bouillon or stock offers good potential for tackling iron deficiency, reaching large populations in Africa, for example. Daily bouillon consumption has been estimated at between 1.9 grams in Cameroon and 8.6 grams in urban Senegal.

Image: 
Yen Strandqvist/Chalmers University of Technology

Iron fortification of food is a cost-effective method of preventing iron deficiency. But finding iron compounds that are easily absorbed by the intestine without compromising food quality is a major challenge. Now, studies from Chalmers University of Technology, ETH Zurich and Nestlé show that a brand-new iron compound, combining phytate, normally an inhibitor of iron uptake, with an uptake-enhancing corn protein hydrolysate, meets the criteria.

Two billion people in the world suffer from iron deficiency. It is mainly prevalent in women of childbearing age, young children and adolescents. Severe iron deficiency can lead to premature birth, increased risk of illness and mortality for mother and child, as well as impaired development of brain function in children.

The situation is most serious in low-income countries where the diet is mainly plant-based. Cereals and legumes are rich in iron, but the iron is not available for absorption by the body. This is mainly because these foods also contain phytate, which inhibits iron absorption by forming insoluble compounds with iron in the gut.

One cost-effective way to prevent iron deficiency, especially in low-income countries, is to iron-fortify foods such as bouillon or stock. The problem is that iron compounds which are easily absorbed by the gut also tend to be chemically reactive; they can affect the colour and taste of the food and shorten its shelf life.

Conversely, stable iron compounds, such as ferric pyrophosphate, which is used today for iron fortification of bouillon and stock, are difficult for the intestine to absorb.

"The major challenge lies in finding a compound that can solve this balancing act. Nestlé and Chalmers began discussing this a few years ago, which led to Nestlé Research producing a new compound containing monoferric phytate (Fe-PA)," says Ann-Sofie Sandberg, Professor of Food Science at Chalmers University of Technology.

To make the compound easier for the intestine to absorb, it is bound to amino acids. Previous studies have shown how this helps make iron compounds more absorbable.

"Nestlé Research tested the compound's stability and effect on taste, colour and odour. Then we at Chalmers examined the iron uptake in human intestinal cells exposed to the bouillon fortified with different variants of the Fe-PA compound," says Ann-Sofie Sandberg.

The result turned out to be very positive. In addition to exposing the intestinal cells to bouillon in which the iron compound is bound to different amino acids, researchers from Nestlé also prepared variants where the amino acids were replaced by hydrolysed protein of corn and soy. The advantage of these proteins is that they cost less to produce. In addition, corn protein is not associated with allergies, so it is particularly suitable for use in food.

"When we compared the rate of iron uptake with the new compound against that of ferrous sulfate, we could see that the intestinal cells exposed to all the different varieties of fortified bouillon had a good iron uptake. Ferrous sulfate is very readily absorbed, but is unsuitable in food because of its high reactivity," says Nathalie Scheers, Associate Professor of Molecular Metal Nutrition, who has led the development of the cell model for studying iron uptake.

In a parallel published human study from the Nestlé Research Center in Lausanne and ETH Zurich, iron absorption from the bouillon fortified with the hydrolysed corn protein compound was shown to be twice that from ferric pyrophosphate, which is often used today for iron fortification of foods outside Europe. When the new compound was tested in foods containing iron absorption inhibitors, such as corn porridge, the absorption was five times as high as with ferric pyrophosphate.

The hope is that the new iron compound could be used in bouillon and stock cubes in low-income countries to reduce the incidence of iron deficiency - and thereby the rate of disease and mortality, especially in women and children.

"Unless side effects which we have not yet foreseen arise, we are hopeful that food fortified with this new ferric phytate compound could be of great interest in helping to reduce human suffering worldwide. But further research is needed here," says Ann-Sofie Sandberg.

More about the research:

In the Chalmers-developed cellular co-culture model for iron uptake, the human intestinal cells were exposed to bouillon enriched with the compounds Fe-PA-Histidine-Glutamine (Fe-PA-Hist-Gln) and Fe-PA-Histidine-Glycine (Fe-PA-Hist-Gly), but also compounds where the amino acids are replaced by hydrolysed soy protein (Fe-PA-HSP) and corn (Fe-PA-HCP).

The iron uptake was measured indirectly with the marker ferritin and was compared to the uptake of ferrous sulfate.

Credit: 
Chalmers University of Technology