Palliative care in emergency departments during the COVID-19 pandemic

What The Study Did: This case series examines the clinical characteristics and outcomes of patients who received intervention from a COVID-19 palliative care response team.

Authors: Shunichi Nakagawa, M.D., of Columbia University Medical Center in New York, is the corresponding author.

To access the embargoed study:  Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamainternmed.2020.2713)

Editor's Note: Please see the article for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

New smart fabrics from bioactive inks monitor body and environment by changing color

video: New smart fabrics can provide a high resolution color-coded map of body response and the environment.

Image: 
Focus Vision Media & Tufts University

MEDFORD/SOMERVILLE, Mass. (June 5, 2020)--Researchers at Tufts University's School of Engineering have developed biomaterial-based inks that respond to and quantify chemicals released from the body (e.g. in sweat and potentially other biofluids) or in the surrounding environment by changing color. The inks can be screen printed onto textiles such as clothes, shoes, or even face masks in complex patterns and at high resolution, providing a detailed map of human response or exposure. The advance in wearable sensing, reported in Advanced Materials, could simultaneously detect and quantify a wide range of biological conditions, molecules and, possibly, pathogens over the surface of the body using conventional garments and uniforms.

"The use of novel bioactive inks with the very common method of screen printing opens up promising opportunities for the mass-production of soft, wearable fabrics with large numbers of sensors that could be applied to detect a range of conditions," said Fiorenzo Omenetto, corresponding author and the Frank C. Doble Professor of Engineering at Tufts' School of Engineering. "The fabrics can end up in uniforms for the workplace, sports clothing, or even on furniture and architectural structures."

Wearable sensing devices have attracted considerable interest in monitoring human performance and health. Many such devices have been invented incorporating electronics in wearable patches, wristbands, and other configurations that monitor either localized or overall physiological information such as heart rate or blood glucose. The research presented by the Tufts team takes a different, complementary approach - non-electronic, colorimetric detection of a theoretically very large number of analytes using sensing garments that can be distributed to cover very large areas: anything from a patch to the entire body, and beyond.

The components that make the sensing garments possible are biologically activated silk-based inks. The soluble silk substrate in these ink formulations can be modified by embedding various "reporter" molecules - such as pH sensitive indicators, or enzymes like lactate oxidase to indicate levels of lactate in sweat. The former could be an indicator of skin health or dehydration, while the latter could indicate levels of fatigue of the wearer. Many other derivatives of the inks can be created due to the versatility of the silk fibroin protein by modifying it with active molecules such as chemically sensitive dyes, enzymes, antibodies and more. While the reporter molecules could be unstable on their own, they can become shelf-stable when embedded within the silk fibroin in the ink formulation.

The inks are formulated for screen printing by combining them with a thickener (sodium alginate) and a plasticizer (glycerol). The screen-printable bio-inks can be used like any ink developed for screen printing, and so can be applied not just to clothing but also to surfaces such as wood, plastics and paper to generate patterns ranging from hundreds of microns to tens of meters. While the color changes themselves provide a visual cue to the presence or absence of an analyte, analyzing camera images of the garments or other materials can yield more precise quantification as well as high-resolution, sub-millimeter mapping.
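As a rough illustration of that camera-based readout (a minimal sketch, not the Tufts team's actual pipeline), the snippet below converts the average color of an imaged sensor patch into an estimated analyte concentration. The synthetic patch and the linear hue-to-concentration calibration are purely hypothetical.

```python
# Rough illustration of camera-based colorimetric readout (not the Tufts pipeline).
# A synthetic RGB array stands in for a photo of a printed sensor patch, and the
# calibration mapping hue to concentration is a made-up example.
import numpy as np

def mean_hue(rgb_patch):
    """Average hue (0-1) of an RGB image region with channel values in [0, 1]."""
    r, g, b = rgb_patch[..., 0], rgb_patch[..., 1], rgb_patch[..., 2]
    mx, mn = rgb_patch.max(-1), rgb_patch.min(-1)
    diff = np.where(mx - mn == 0, 1e-9, mx - mn)
    h = np.where(mx == r, (g - b) / diff % 6,
        np.where(mx == g, (b - r) / diff + 2, (r - g) / diff + 4)) / 6.0
    return float(h.mean())

# Synthetic "photo" of a sensor patch that has shifted from yellow toward red
patch = np.zeros((50, 50, 3))
patch[..., 0] = 0.9                     # red channel
patch[..., 1] = 0.4                     # green channel (less green = redder hue)

# Hypothetical linear calibration: hue 0.15 (yellow) -> 0 mM, hue 0.0 (red) -> 10 mM
hue = mean_hue(patch)
concentration_mM = np.clip((0.15 - hue) / 0.15 * 10.0, 0.0, 10.0)
print(f"mean hue: {hue:.3f} -> estimated analyte concentration ~ {concentration_mM:.1f} mM")
```

In practice, a calibration curve would be established for each reporter molecule, and the same hue computation run per pixel would produce the sub-millimeter maps described above.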

The technology builds upon earlier work by the same researchers developing bioactive silk inks formulated for inkjet-printing to create petri dishes, paper sensors, and laboratory gloves that can indicate bacterial contamination by changing colors.

"The screen printing approach provides the equivalent of having a large, multiplexed arrangement of sensors covering extensive areas of the body, if worn as a garment, or even on large surfaces such as room interiors," said Giusy Matzeu, research assistant professor of biomedical engineering at Tufts School of Engineering and first author of the paper. "Coupled with image analysis, we can obtain a high resolution mapof color reactions over a large area and gain more insight on overall physiological or environmental state. In theory, we could extend this method to track air quality, or support environmental monitoring for epidemiology."

The fact that the method uses common printing techniques also opens up avenues in creative applications - something explored by Laia Mogas-Soldevila, architect and recent PhD graduate at Tufts in Omenetto's SilkLab. Mogas-Soldevila has helped to create beautiful tapestries, displaying them in museums across the United States and Europe. The displays are interactive, allowing visitors to spray different, non-toxic chemicals onto the fabric and watch the patterns transform. "This is really a great example of how art and engineering can gain from and inspire each other," said Mogas-Soldevila. "The engineered inks open up a new dimension in responsive, interactive tapestries and surfaces, while the 1,000-year old art of screen printing has provided a foundation well suited to the need for a modern high resolution, wearable sensing surface."

Credit: 
Tufts University

Increasingly efficient serological tests thanks to a new ECL-based mechanism

A research group led by the University of Bologna has identified a new mechanism that makes it possible to produce serological tests that are quicker, more cost-effective and more reliable than those currently in use. The mechanism is based on the technique of electrochemiluminescence (ECL) and is applicable to serological tests devised to detect antibodies against SARS-CoV-2. This innovative technique is headed toward industrial application, as two leading firms in diagnostics and technology were also involved in the research: the Germany-based Roche Diagnostics and the Japanese Hitachi High-Tech.

The study was published in Nature Communications. Its results show that, thanks to highly efficient reactants, serological tests can be made up to 128% more sensitive than those currently in use.

"The results we obtained mark a new milestone in the state of the art of signal enhancement of ECL-based immunoassays", says Francesco Paolucci leader of the research group and professor at the University of Bologna. "This milestone is the outcome of years and years of international research into electrochemistry and of a close synergy with R&D sectors".

Serological tests are based on the ability to translate the interactions between certain molecules and the antibodies the test intends to quantify into measurable, visible signals. ECL can play a major role here, as it relies on an electrochemical reaction that produces a light signal. In the case of serological tests, electrochemiluminescence "switches on" the antibodies as if they were lamps.

This mechanism has some limitations, however: the molecules able to prompt this reaction are present at very low concentrations in human blood. Highly sensitive techniques are therefore needed to identify the antibodies. The results obtained by this research group seem to go exactly in this direction.

"Our work represents something unprecedented in the field of ECL because it relies on the enhancement of the signal as opposed to the enhancement of the target as it usually happens with enzymatic methods or PCR (Polymerase Chain Reaction)", explains Giovanni Valenti, study coordinator and researcher at the University of Bologna. "These results pave the way for the development of ultra-sensitive serological tests".

The researchers obtained a twofold result with this study. On the one hand, they elucidated the mechanisms governing ECL analyses; on the other, they used these mechanisms to develop new reactants that make far more efficient serological tests possible.

"From these results, we managed to identify highly efficient reactants that are able to enhance the sensitivity of this technique way beyond that currently employed for serological tests", confirms Alessandra Zanut, first author of the study and researcher at the University of Bologna. "With this technique, we obtained an ECL signal enhancement up to 128% compared to current techniques".

Credit: 
Università di Bologna

COVID-19 safety recommendations aim to reduce deaths among elderly in nursing homes

Seeking to address estimates that more than a third of COVID-19 deaths nationally (more than 38,000) have occurred in nursing homes and long-term care facilities, the American Medical Directors Association published recommendations for reducing the spread of the pandemic virus among residents and staff.

Among the recommendations were the creation of COVID-specific units, twice-daily screenings of residents, discontinuing drug delivery modes (e.g. nebulizers) that might spread the virus, and reviews with patients and families of do-not-intubate and do-not-hospitalize advance directives.

"The scope and speed of the COVID-19 pandemic brought continual changes in healthcare protocols as providers learned more about the disease's transmission," said Paula Lester, MD, FACP, CMD, a geriatrician at NYU Winthrop Hospital and the corresponding author of the consensus recommendations, which were recently published online in Journal of American Medical Directors Association (JAMDA).

"The time has come to consolidate our learnings as a field in terms of caring for at-risky elderly and implement uniform, best practices, especially as we prepare for a potential second wave of infections in the coming months, as well as for future pandemics," adds Lester, who along with her co-authors, serves as a skilled nursing facility (SNF) certified medical director.

Recommended protocols for facility staff also include serial COVID testing (three tests, one week apart) to enable identification of newly infected staff. The authors also recommend assigning staff to specific units, to permit easier contact tracing in the event of COVID cases, and ensuring that staff assigned to COVID-19 units do not work elsewhere in the facility.

The report also states that the authors "do not support the mandatory admission of COVID-19 patients from hospitals to nursing homes as it may force unprepared facilities to provide care to COVID patients without the necessary resources or precautions."

The consensus guidelines in the report - titled "Policy Recommendations Regarding Skilled Nursing Facility Management of COVID-19: Lessons From New York State" - are endorsed by the Executive Board of the New York Medical Directors Association and the Board of the Metropolitan Area Geriatrics Society. The authors noted, however, that the suggestions in the report should not take precedence over local Department of Health or Centers for Disease Control recommendations.

Credit: 
NYU Langone Health / NYU Grossman School of Medicine

Psychedelic drug psilocybin tamps down brain's ego center

Perhaps no region of the brain is more fittingly named than the claustrum, taken from the Latin word for "hidden or shut away." The claustrum is an extremely thin sheet of neurons deep within the cortex, yet it reaches out to every other region of the brain. Its true purpose remains "hidden away" as well, with researchers speculating about many functions. For example, Francis Crick of DNA-discovery fame believed that the claustrum is the seat of consciousness, responsible for awareness and sense of self.

What is known is that this region contains a large number of receptors targeted by psychedelic drugs such as LSD or psilocybin - the hallucinogenic chemical found in certain mushrooms. To see what happens in the claustrum when people are on psychedelics, Johns Hopkins Medicine researchers compared the brain scans of people after they took psilocybin with their scans after taking a placebo.

Their findings were published online on May 23, 2020, in the journal NeuroImage.

The scans after psilocybin use showed that the claustrum was less active, meaning the area of the brain believed responsible for setting attention and switching tasks is turned down when on the drug. The researchers say that this ties in with what people report as typical effects of psychedelic drugs, including feelings of being connected to everything and reduced senses of self or ego.

"Our findings move us one step closer to understanding mechanisms underlying how psilocybin works in the brain," says Frederick Barrett, Ph.D., assistant professor of psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine and a member of the school's Center for Psychedelic and Consciousness Research. "This will hopefully enable us to better understand why it's an effective therapy for certain psychiatric disorders, which might help us tailor therapies to help people more."

Because of its deep-rooted location in the brain, the claustrum has been difficult to access and study. Last year, Barrett and his colleagues at the University of Maryland, Baltimore, developed a method to detect brain activity in the claustrum using functional magnetic resonance imaging (fMRI).

For this new study, the researchers used fMRI with 15 people and observed the claustrum brain region after the participants took either psilocybin or a placebo. They found that psilocybin reduced neural activity in the claustrum by 15% to 30%. This lowered activity also appeared to be associated with stronger subjective effects of the drug, such as emotional and mystical experiences. The researchers also found that psilocybin changed the way that the claustrum communicated with brain regions involved in hearing, attention, decision-making and remembering.

With the highly detailed imaging of the claustrum provided by fMRI, the researchers next hope to look at the mysterious brain region in people with certain psychiatric disorders such as depression and substance use disorder. The goal of these experiments will be to see what roles, if any, the claustrum plays in these conditions. The researchers also plan to observe the claustrum's activity when under the influence of other psychedelics, such as salvinorin A, a hallucinogen derived from a Mexican plant.

Credit: 
Johns Hopkins Medicine

Physicists create quantum-inspired optical sensor

image: Apparatus for measuring the position of an object using optical coherence alone.

Image: 
Nikita Kirsanov/MIPT

Researchers from the Moscow Institute of Physics and Technology, joined by a colleague from Argonne National Laboratory, U.S., have implemented an advanced quantum algorithm for measuring physical quantities using simple optical tools. Published in Scientific Reports, their study takes us a step closer to affordable linear optics-based sensors with high performance characteristics. Such tools are sought after in diverse research fields, from astronomy to biology.

Maximizing the sensitivity of measurement tools is crucial for any field of science and technology. Astronomers seek to detect remote cosmic phenomena, biologists need to discern exceedingly tiny organic structures, and engineers have to measure the positions and velocities of objects, to name a few examples.

Until recently, no measurement tool could ensure precision beyond the so-called shot-noise limit, which has to do with the statistical features inherent in classical observations. Quantum technology has provided a way around this, boosting precision to the fundamental Heisenberg limit, stemming from the basic principles of quantum mechanics. The LIGO experiment, which announced the first detection of gravitational waves in 2016, shows it is possible to achieve Heisenberg-limited sensitivity by combining complex optical interference schemes and quantum techniques.

Quantum metrology is a cutting-edge area of physics concerned with the technological and algorithmic tools for making highly precise quantum measurements. In their recent study, the team from MIPT and ANL fused quantum metrology with linear optics.

"We devised and constructed an optical scheme that runs the Fourier transform-based phase estimation procedure," said study co-author Nikita Kirsanov from MIPT. "This procedure lies at the core of many quantum algorithms, including high-precision measurement protocols."

A specific arrangement of a very large number of linear optical elements -- beam splitters, phase shifters, and mirrors -- makes it possible to gain information about the geometric angles, positions, velocities as well as other parameters of physical objects. The measurement involves encoding the quantity of interest in the optical phases, which are then determined directly.
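For intuition only, here is a minimal numerical sketch of the Fourier-transform-based phase estimation idea (a classical toy model, not the authors' optical scheme; the phase value and register size below are arbitrary). A phase imprinted across N probe settings is recovered by locating the peak of a discrete Fourier transform, with resolution improving as 1/N.

```python
# Toy model of Fourier-transform-based phase estimation (not the optical setup).
# An unknown phase phi is imprinted as exp(2*pi*i*phi*k) across k = 0..N-1 probe
# settings; a discrete Fourier transform concentrates the amplitude near index
# phi*N, so the peak position estimates phi with resolution ~1/N.
import numpy as np

phi_true = 0.3217                        # hypothetical unknown phase (fraction of a cycle)
N = 64                                   # number of probe settings / interferometer paths
k = np.arange(N)
amplitudes = np.exp(2j * np.pi * phi_true * k) / np.sqrt(N)

spectrum = np.fft.fft(amplitudes) / np.sqrt(N)   # unitary-normalized DFT
phi_est = np.argmax(np.abs(spectrum)) / N        # peak index -> phase estimate
print(f"estimated phase: {phi_est:.4f}  (true: {phi_true:.4f}, resolution 1/N = {1/N:.4f})")
```

Doubling the number of probe settings halves the achievable uncertainty, which is the scaling advantage the phase estimation procedure is designed to exploit.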

"This research is a follow-up to our work on universal quantum measurement algorithms," commented principal investigator Gordey Lesovik, who heads the MIPT Laboratory of the Physics of Quantum Information Technology. "In an earlier collaboration with a research group from Aalto University in Finland, we experimentally implemented a similar measurement algorithm on transmon qubits."

The experiment showed that despite the large number of optical elements in the scheme, it is nevertheless tunable and controllable. According to the theoretical estimates provided in the paper, linear optics tools are viable for implementing even considerably more complex operations.

"The study has demonstrated that linear optics offers an affordable and effective platform for implementing moderate-scale quantum measurements and computations," said Argonne Distinguished Fellow Valerii Vinokur.

Credit: 
Moscow Institute of Physics and Technology

Protecting the neuronal architecture

image: Stroke leads to a reduction of VEGFD levels, loss of dendrites, brain damage, and impaired motor functions. As research on a mouse model has shown, VEGFD-based therapies can prevent structural disintegration, thereby facilitating functional recovery.

Image: 
Daniela Mauceri

Protecting nerve cells from losing their characteristic extensions, the dendrites, can reduce brain damage after a stroke. Neurobiologists from Heidelberg University have demonstrated this by means of research on a mouse model. The team, led by Prof. Dr Hilmar Bading in cooperation with Junior Professor Dr Daniela Mauceri, is investigating the protection of neuronal architecture to develop new approaches to treating neurodegenerative diseases. The current research findings were published in the journal "Proceedings of the National Academy of Sciences".

Brain nerve cells possess many arborised dendrites, which can make connections with other neurons. The highly complex, ramified structure of neurons is an important precondition for their ability to connect with other nerve cells, in order to enable the brain to function normally. In earlier studies, the Heidelberg researchers identified the signal molecule VEGFD - Vascular Endothelial Growth Factor D - as a central regulator for maintaining and restoring neuronal structures. "Our current research results demonstrate that a stroke as a consequence of an interruption of the blood supply to the brain leads to a reduction of VEGFD levels. That causes the nerve cells to lose part of their dendrites. They shrink and this leads to impairments of the cognitive and motor abilities," explains Prof. Bading.

Based on these findings, the researchers at the Interdisciplinary Centre for Neurosciences explored the question of whether the reduction of neuronal structures after a stroke can be prevented by restoring the VEGFD levels. To that effect, they applied recombinant VEGFD - produced using biotechnological methods - to the brains of mice that had suffered a stroke. "The treatment successfully preserved the dendritic arborisation and, what is important, brain damage was reduced. Furthermore, the motor abilities recovered more quickly," says Prof. Mauceri. In a second step, the researchers administered a modified form of VEGFD as nose drops, in order to simplify the treatment. They achieved the same results with this peptide mimetic, i.e. a simplified but biologically still effective version of VEGFD, which was developed in cooperation with Prof. Dr Christian Klein from Heidelberg University's Institute of Pharmacy and Molecular Biotechnology.

The scientists hope that their research findings to protect the neuronal architecture will lead to new approaches to treating stroke in the long run. "The principle of nasal delivery, in particular, would be a safe and simple form of intervention," says Prof. Bading. The Heidelberg scientists are now working on expanding the treatment trialled in the mouse model with a view to a possible clinical application.

Credit: 
Heidelberg University

Something in the water: Pollutant may be more hazardous than previously thought

image: Perchlorate, a chemical compound used in rocket fuels (such as the Space Shuttle's solid propellant seen here during the program's final launch in 2011) and other materials, may be a more hazardous pollutant than previously thought.

Image: 
M.E. Newman, Johns Hopkins Medicine, using NASA and public domain images

Sometimes toxins, such as hazardous wastes and industrial byproducts, seep into groundwater, the source of our drinking water. One such pollutant is perchlorate, a chemical compound used in rocket fuels, fireworks, fertilizers and other materials. The compound is thought to contribute to health issues in humans such as hypothyroidism, the decreased production of hormones from the thyroid gland, which can impact development.

In a new study published May 25, 2020, in the journal Nature Structural & Molecular Biology, researchers at Johns Hopkins Medicine, Vanderbilt University and the University of California, Irvine, report on the mechanism that perchlorate uses to impact and damage normal functioning of the thyroid gland.

The findings, they say, suggest that an acceptable safe concentration of perchlorate in drinking water is 10 times less than previously thought.

The researchers focused on how perchlorate blocks a main route by which iodide, the negatively charged form of the element iodine, enters thyroid cells. Iodide helps the thyroid make hormones that are essential to the body's regulation of metabolism, temperature and other important functions.

Thyroid cells control the incoming flow of iodide by using a protein channel called the sodium/iodide symporter, also known as the Na+/I- symporter or NIS. Like other cellular transport systems, it relies on a "lock-and-key" approach to move iodide, with NIS acting as the lock and sodium as the key. Sodium fits into NIS at two binding sites to unlock the channel, enabling iodide to pass through and accumulate inside a thyroid cell.

The team, led by L Mario Amzel, Ph.D., professor of biophysics and biophysical chemistry at the Johns Hopkins University School of Medicine, and Vanderbilt University researcher Nancy Carrasco, M.D., determined that perchlorate blocks the channel by latching onto the NIS protein and changing its shape. Less sodium binds to the misshaped channel, thereby significantly lowering the amount of iodide that can be moved inside thyroid cells.

The researchers studied how varying concentrations of perchlorate affect iodide transport by first growing thyroid cells that expressed the gene SLC5A5, which encodes the instructions for building NIS channels. Next, perchlorate and radioactive iodine were placed outside some of the cells, and radioactive iodine alone outside the others. Finally, the researchers tracked how much of the radiolabeled iodide was allowed to enter the cells in both groups. They found that there was much less iodide inside thyroid cells treated with perchlorate than in untreated ones, even at very low concentrations of the chemical.
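For illustration only, concentration-dependent inhibition of uptake like this is typically summarized by fitting an inhibition curve and reporting an IC50. The sketch below is generic, with made-up numbers, and is not necessarily the analysis used in the paper.

```python
# Generic dose-response sketch (not necessarily the authors' analysis):
# fit an inhibition curve to iodide uptake measured at several perchlorate
# concentrations. All numbers below are synthetic placeholders.
import numpy as np
from scipy.optimize import curve_fit

def inhibition(conc, ic50, hill):
    """Fraction of iodide uptake remaining at a given perchlorate concentration."""
    return 1.0 / (1.0 + (conc / ic50) ** hill)

perchlorate_uM = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
uptake_fraction = np.array([0.97, 0.93, 0.80, 0.55, 0.30, 0.12, 0.05])  # synthetic

(ic50, hill), _ = curve_fit(inhibition, perchlorate_uM, uptake_fraction, p0=[0.3, 1.0])
print(f"fitted IC50 ~ {ic50:.2f} uM, Hill coefficient ~ {hill:.2f}")
```

A lower fitted IC50 means less perchlorate is needed to block iodide transport, which is the kind of quantity that feeds into safe drinking-water thresholds.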

In May 2020, the U.S. Environmental Protection Agency (EPA) ruled not to place regulations on the amount of perchlorate that can be allowed in drinking water. The findings from the new study strongly suggest that this environmental pollutant is more hazardous than previously thought, raising serious concern about the decision.

"We hope that these findings will prompt the EPA to change its mind," Amzel says.

Credit: 
Johns Hopkins Medicine

Could the blood of COVID-19 patients be used to predict disease progression?

image: Researchers used mass spectrometry to identify biomarker profiles which can be used to classify disease severity in patients with COVID-19. These biomarker profiles could also be used to predict disease progression.

Image: 
Photo: Arne Sattler/Charité

People respond very differently to infection with the novel coronavirus (SARS-CoV-2). While some patients develop no symptoms at all, others will develop severe disease and may even die. For this reason, there is an urgent need for 'biomarkers', quantifiable biological characteristics which could provide a reliable means of predicting disease progression and severity. A research team led by Prof. Dr. Markus Ralser (Director of Charité's Institute of Biochemistry, holder of an Einstein Professorship and Group Leader at the Francis Crick Institute) used state-of-the-art analytical techniques to rapidly determine the levels of various proteins in the blood plasma. This approach enabled the researchers to identify various protein biomarkers in the blood plasma of patients with COVID-19 which were linked to the severity of their disease.

The researchers developed a precise, high-throughput mass spectrometry platform capable of analyzing the patients' proteomes - the compendium of proteins found in biological material - at a rate of 180 samples per day. Using this technology, the team analyzed blood plasma samples from 31 men and women who were receiving treatment at Charité for COVID-19 of varying degrees of severity. The researchers were able to identify 27 proteins in the blood which varied in quantity depending on disease severity. The researchers then validated these molecular signatures by analyzing samples from another group of 17 COVID-19 patients and 15 healthy people. Protein expression signatures were able to precisely classify patients according to the World Health Organization's coding criteria for COVID-19.
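As a rough illustration of how such a protein panel could be used for classification (a minimal sketch with synthetic data, not the authors' actual analysis pipeline), a standard machine-learning classifier can be trained on per-patient protein intensities and severity grades:

```python
# Illustrative sketch only (not the authors' pipeline): classifying disease
# severity from a panel of plasma protein levels. The data below are synthetic
# placeholders; in the study, the features would be the 27 mass-spectrometry-
# derived protein intensities and the labels the WHO severity grades.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n_patients, n_proteins = 48, 27
X = rng.normal(size=(n_patients, n_proteins))   # protein intensities (synthetic)
y = rng.integers(0, 3, size=n_patients)         # severity grade 0/1/2 (synthetic)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_train)
model = LogisticRegression(max_iter=1000).fit(scaler.transform(X_train), y_train)

# With real protein panels, this report would show how well the signature
# separates severity grades; on random synthetic data it will be near chance.
print(classification_report(y_test, model.predict(scaler.transform(X_test))))
```

The key point is the validation step described above: a signature learned on one patient cohort only becomes useful once it classifies an independent cohort correctly.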

"These results lay the foundations for two very different applications. One possible future use would be for disease prognosis," explains Prof. Ralser, who is also group leader at the Francis Crick Institute in London. "An early blood test would enable the treating physician to predict whether or not a patient with COVID-19 will develop severe symptoms. This could potentially save lives: the sooner physicians know which patients will require intensive care, the faster they can make use of the available treatment options." In order to get closer to this goal, the researchers will now study how the biomarker signatures change over the course of the disease.

"Another possible future use would be as an in-hospital diagnostic test, which could provide clarity regarding a patient's condition - regardless of how they themselves describe it," explains the biochemist. He adds: "In some cases, a patient's symptoms do not appear to provide an accurate picture of their true health status. An objective evaluation, based on their biomarker profile, could be extremely valuable in this regard." The research team now plan to test their new method in a larger number of patients in the hope of getting closer to developing a diagnostic test.

*Messner CB et al. Ultra-high-throughput clinical proteomics reveals classifiers of COVID-19 infection. Cell Systems (2020), doi: 10.1016/j.cels.2020.05.012.

Changes in the protein profile

Some of the 27 proteins which were found to predict the severity of COVID-19 had not previously been linked to an immune response. However, the biomarkers identified by the researchers also included clotting factors and regulators of inflammation. Some of these proteins act on interleukin 6 (IL-6) at the molecular level. IL-6 is a protein which is known to cause inflammation, and which, according to preliminary studies, is associated with severe COVID-19 symptoms. A number of the biomarkers identified as part of this study might therefore be suitable targets for treatment.

Credit: 
Charité - Universitätsmedizin Berlin

Measuring Atlantic bluefin tuna with a drone

Researchers have used an unmanned aerial system (or drone) to gather data on schooling juvenile Atlantic bluefin tuna in the Gulf of Maine.

This pilot study tested whether a drone could keep up with the tuna while also taking photographs that captured physical details of this fast-moving fish. The drone was equipped with a high-resolution digital still image camera. Results show that drones can capture images of both individual fish and schools. They may be a useful tool for remotely monitoring behavior and body conditions of the elusive fish.

Individual fish lengths and widths, and the distance between fish near the sea surface, were measured to less than a centimeter of precision. The researchers used an APH-22, a battery-powered, six-rotor drone. The pilot study was conducted in the Atlantic bluefin tuna's foraging grounds northeast of Cape Cod in the southern Gulf of Maine.

"Multi-rotor unmanned aerial systems won't replace shipboard surveys or the reliance on manned aircraft to cover a large area," said Mike Jech, an acoustics researcher at the Northeast Fisheries Science Center in Woods Hole, Massachusetts and lead author of the study. "They have a limited flight range due to battery power and can only collect data in bursts. Despite some limitations, they will be invaluable for collecting remote high-resolution images that can provide data at the accuracy and precision needed by managers for growth and ecosystem models of Atlantic bluefin tuna."

Results from the APH-22 study were published in March 2020 in the Journal of Unmanned Vehicle Systems. Researchers conducted their work in 2015. They then compared their study results to values in published data collected in the same general area. They also compared it to recreational landings data collected through NOAA Fisheries' Marine Recreational Information Program.

Atlantic bluefin tuna is a commercially and ecologically important fish. The population size in the western Atlantic Ocean is unknown. Fishery managers need biological data about this population, but it is hard to get. Highly migratory species like Atlantic bluefin tuna often move faster than the vessels trying to sample them. The tuna are distributed across large areas, and can be found from the sea surface to hundreds of feet deep.

Sampling with traditional gear -- nets and trawls -- is ineffective. Acoustical methods are useful but limited to sampling directly below a seagoing vessel with echosounders or within range of horizontal sonar.

It is also difficult to estimate the number of tuna in a school from an airplane. Both fish availability and perception biases introduced by observers can affect results. Estimates of abundance and size of individuals within a school are hard to independently verify.

Taking precision measurements of animals that are in constant motion near the surface proved easier with a drone that is lightweight, portable, and agile in flight. It can carry a high-quality digital still camera, and be deployed quickly from a small fishing boat.

Short flight times limit a drone's ability to survey large areas. However, they can provide two-dimensional images of the shape of a fish school and data to count specific individuals just below the ocean surface.

The APH-22 system has been tested and evaluated for measuring other marine animals. It's been used in a number of environments -- from Antarctica to the Pacific Ocean -- prior to its use in the northwest Atlantic Ocean. Previous studies estimated the abundance and size of penguins and leopard seals, and the size and identity of individual killer whales.

"The platform is ideal for accurately measuring fish length, width, and the distance between individuals in a school when you apply calibration settings and performance measures," Jech said. "We were able to locate the hexacopter in three-dimensional space and monitor its orientation to obtain images with a resolution that allowed us to make measurements of individual fish."

As new unmanned aerial systems are developed, their use to remotely survey Atlantic bluefin tuna and other animals at the sea surface will evolve. It may minimize the reliance on manned aircraft or supplement shipboard surveys.

The International Commission for the Conservation of Atlantic Tunas governs tuna fishing. It is entrusted to monitor and manage tuna and tuna-like species in the Atlantic Ocean and adjacent seas. NOAA Fisheries manages the Atlantic bluefin tuna fishery in the United States and sets regulations for the U.S. fishery based on conservation and management recommendations from the international commission.

Credit: 
NOAA Northeast Fisheries Science Center

Volcanic glass spray shows promise in controlling mosquitoes

image: The lower portion of a mosquito's leg after contact with a volcanic rock powder. Statically transferred perlite particles dehydrate mosquitoes, killing them.

Image: 
Photo courtesy of Michael Roe, NC State University.

An indoor residual spray made by combining a type of volcanic glass with water showed effective control of mosquitoes that carry malaria, according to a new study. The findings could be useful in reducing disease-carrying mosquito populations - and the risk of malaria - in Africa.

Malaria, an infectious disease transmitted by mosquitoes, kills some 400,000 people in Africa every year. Insecticide-treated bed nets and indoor residual sprays are the most common and effective methods of reducing mosquito populations in Africa. But mosquitoes are becoming increasingly resistant to commonly used insecticides such as pyrethroids, so there is a pressing need for alternative, safe chemistries for mosquito control.

The volcanic glass material used in this new intervention is perlite, an industrial mineral most frequently used in building materials and in gardens as a soil additive. The tested insecticide created from perlite, called Imergard WP, can be applied to interior walls and ceilings - and perhaps even inside roofs - as an indoor residual spray. The spray contains no additional chemicals, is not toxic to mammals and will be cost effective. Early results show that mosquitoes do not appear to have resistance to the perlite spray.

In the study, North Carolina State University entomologists worked with the Innovative Vector Control Consortium (IVCC) based at the Liverpool School of Tropical Medicine and Imerys Filtration Minerals Inc. to test Imergard WP. Researchers used the spray in experimental huts in the Republic of Benin (West Africa) to test the effects of the spray on both wild and more susceptible strains of Anopheles gambiae mosquitoes, the primary malaria vector in sub-Saharan Africa.

Researchers compared four groups of experimental huts to verify the efficacy of Imergard WP. Control huts had no mosquito-control spray. In the second group, hut walls were coated with a common pyrethroid. In the third group, hut walls were sprayed with Imergard WP, while in the fourth group, hut walls were sprayed with a mixture of Imergard WP and the common pyrethroid.

Huts with walls treated with Imergard WP, with and without the pyrethroid, showed the highest mosquito mortality rates. Mortality rates of mosquitoes alighting on Imergard WP-treated walls were greater than 80% up to five months after treatment, and 78% at six months. The treatments were effective against both susceptible and wild-type mosquitoes.

"The statically transferred perlite particles essentially dehydrate the mosquito," said Mike Roe, William Neal Reynolds Distinguished Professor of Entomology at NC State and the corresponding author of the paper. "Many die within a few hours of contact with the treated surface. Mosquitoes are not repelled from a treated surface because there is no olfactory mechanism to smell rock."

Huts sprayed with only the common pesticide had mosquito mortality rates of around 40 to 45% over five months, with those rates dropping to 25% in month six of the study.

"The processing of perlite as an insecticide is novel," said David Stewart, commercial development manager for Imerys, the company that created Imergard WP, and co-author of the paper. "This material is not a silver bullet but a new tool that can be considered as part of an insect vector management program."

Credit: 
North Carolina State University

Genomic surveillance of antibiotic resistance in the Philippines established

Antibiotic resistance surveillance in the Philippines has moved into the genomic era, enabling better tracking of dangerous bacteria. Researchers at the Centre for Genomic Pathogen Surveillance (CGPS), housed at the Wellcome Sanger Institute and the Big Data Institute (BDI), University of Oxford, together with the Philippine Research Institute for Tropical Medicine (RITM), set up local DNA sequencing and analysis of drug-resistant bacteria in the Philippines. This genomic capacity has enhanced ongoing national infection control, including tracking the spread of resistance to last-line antibiotics and identifying drug-resistant infections in a hospital baby unit, helping control the outbreak.

Reported in Nature Communications this week, this study shows the power of local genomic sequencing within national surveillance networks in low- and middle-income countries, and could be extended to other locations to tackle the global challenge of antimicrobial resistance.

Antimicrobial resistance (AMR) is a global health problem, with resistance to common antibiotics found in all regions of the world. This can make some bacterial infections, such as MRSA, tuberculosis and gonorrhoea, extremely difficult to treat, and it raises the risks of any surgery.

Surveillance of AMR is critical to understand and try to halt its spread, and DNA sequencing can pinpoint resistance mechanisms and uncover transmission patterns. However, genomic surveillance is less common in low- and middle-income countries (LMICs), which are predicted to be the most affected by AMR.

The Philippines has a very well-established Antimicrobial Resistance Surveillance Program within the Philippine Department of Health, which uses laboratory-based methods to track antimicrobial resistance. In 2018 the researchers helped set up a DNA sequencing facility within this program to build local capacity for genomic surveillance in the Philippines*. This has included establishing local capacity in genomics and data interpretation through shared training.

Samples were sequenced from more than 20 sites across the Philippines, focusing on bacteria that are resistant to the last-line antibiotics, and listed by the World Health Organisation as top priority pathogens for the development of new antibiotics**. The teams collectively analysed the data, creating phylogenetic trees that showed how the bacterial strains are related to each other, and uncovered several high-risk clones.

Combining the genetic findings with epidemiological data allowed the researchers to pinpoint strains in particular locations. In one hospital they identified a cluster of the same strain of carbapenem-resistant Klebsiella pneumoniae in a neonatal intensive care unit, and revealed that this was being spread within the hospital. This evidence enabled the hospital to bolster their infection control team, to control potential future outbreaks.

Dr Celia Carlos, joint lead of the project from the Research Institute for Tropical Medicine, Philippines, said: "Here in the Philippines we have more than 30 years of experience developing laboratory methods to track AMR, with our Antimicrobial Resistance Surveillance Program. Now, working with our partners in the UK, we have established local capacity and expertise for whole genome sequencing in the Philippines, adding genomic surveillance to these other methods. This is helping us to identify emerging resistant strains much faster, so we can understand what is happening, prevent transmission of AMR and save lives."

Dr Silvia Argimón, first author on the paper from the Centre for Genomic Pathogen Surveillance said: "The programme not only helped set up the genomic infrastructure in the Philippines, but also enabled close collaboration between the teams in the UK and the Philippines. This included exchange visits between the researchers and training to transfer ownership of the sequencing, analysis and understanding to the team in the Philippines, and ensured that everyone understood the resourcefulness and challenges of the sentinel sites."

Genomic surveillance allows the team to describe drug-resistant bacteria in terms of their strains, which genes enable the resistance, and how those genes are transferred between bacteria. Through genomics, the Philippines now has a sharper view of AMR at the local, national and international scales, allowing analyses that were previously difficult. The data are shared with Philippine public health agencies and with the WHO to inform both local and global understanding of the spread of carbapenem resistance.

Professor David Aanensen, Director of the Centre for Genomic Pathogen Surveillance and joint lead on the project, said: "Understanding national dynamics in antimicrobial resistance is important in every country in the world to prevent spread globally, and new technology and tools that enhance this capacity are required. The work by the Philippines team to establish genomics within a national surveillance network is an exemplar for adoption that could be extended to tackle the global challenge of antimicrobial resistance or other infections. "

Credit: 
Wellcome Trust Sanger Institute

COVID-19 mortality alarmingly high in dialysis patients

Spain, along with the UK and Italy, was one of the European countries hit particularly hard by the coronavirus pandemic in March. Early on, it was not possible to predict the extent of the outbreak. The first case in Spain was a German tourist who brought the virus to La Gomera in the Canary Islands at the end of January; a second case, this time a British tourist, was identified on Mallorca on February 10. Three days later, the first patient in Spain died of COVID-19 (although the diagnosis was only made later, postmortem), followed by a rapidly rising death toll. By the end of May, there were more than 238,000 confirmed cases and 29,000 deaths in Spain, out of a total population of almost 47 million. The UK, with a total population of 66 million, had 38,000 deaths at that time, while Italy, with 60 million, had more than 33,000 deaths.

On March 31, 789 COVID-19 patients were receiving treatment at the Hospital Vall d'Hebron in Barcelona, 168 of them in the ICU. But what were the outcomes among end-stage renal disease (ESRD) patients, i.e. dialysis and transplant patients? At what rate did they fall ill, and what was their prognosis?

At the Opening Conference of the ERA-EDTA Congress, Dr. Maria Jose Soler Romeo presented data gathered at the Hospital Vall d'Hebron. Of 400 dialysis patients for whom the Vall d'Hebron serves as a reference hospital, 21 - just over 5% - had COVID-19. In the whole of Spain, 238,000 out of 47 million people (about 0.5%) had contracted the disease at that time. The figures obtained from the Hospital Vall d'Hebron on the incidence of COVID-19 are not representative, of course, as it is only one center, but they do indicate a significantly higher rate of infection for dialysis patients. Of the 21 dialysis patients who contracted COVID-19, 15 were discharged, one was in the ICU at the time of the survey, and five had died. The mortality rate in this center was 24%. This high death rate among infected dialysis patients was also verified in an analysis of the Spanish COVID-19 Dialysis/Transplantation Registry, which included a total of 1,572 ESRD patients: 998 haemodialysis (HD) patients, 51 peritoneal dialysis (PD) patients and 523 kidney transplant patients. The mortality rate among HD patients was more than 27% for the whole of Spain, and more than 23% for kidney transplant patients. PD patients had a significantly lower mortality rate of 15%, but their number is so small in proportion that it is almost impossible to make statistically valid statements about this patient group.

The high mortality rate among dialysis patients was also verified in a study that monitored the course of disease in 36 HD patients between March 12 and April 10 at the Hospital Gregorio Marañón in Madrid. The death rate here was as high as 30.5%, but what is particularly interesting about this study is that it analyzed predictors of mortality. The conclusion was that, in addition to older age and pneumonia, three factors significantly influence the mortality rate among coronavirus-positive dialysis patients: (1) the number of years on dialysis (dialysis vintage), (2) lymphopenia, a low count of the white blood cells (lymphocytes) that protect the body from infections, and (3) elevated LDH levels, a surrogate for tissue damage.

"What we had to learn from nephrologists in Spain is that dialysis patients are more susceptible to the virus and that the risk of patients dying is very high at a rate of 1:4. These patients need special protection. Many studies have shown that even people without symptoms or with asymptomatic symptoms can carry and pass on the virus. In dialysis units, therefore, we cannot rely on always being able to detect infected patients and to isolate them in time. To protect our highly vulnerable patients, it is essential that all the patients and staff be tested on a regular basis in order to minimize the risk of infection in COVID-19 outbreaks. We must continually remind ourselves that, of four coronavirus-positive dialysis patients, one will not survive. Outbreaks in dialysis units must therefore be prevented at all costs", Dr. Soler concluded.

Credit: 
ERA – European Renal Association

Creating hairy human skin: Not as easy as you think

image: A cultured human skin organoid with developing hair follicles (budding protrusions) and embedded nerves (in red).

Image: 
Jiyoon Lee, Boston Children's Hospital

For more than 40 years, scientists and commercial companies have been recreating human skin in laboratories around the world. Yet all of these products lack important aspects of normal skin--hair, nerves, and fat.

In new research, cultured human skin tissue embedded with fat and nerves and capable of growing hair is now a reality. The achievement represents more than five years of study that started in the laboratory of Karl Koehler, PhD (then at Indiana University School of Medicine) and was completed in Koehler's new laboratory at Boston Children's Hospital, in the departments of otolaryngology and communication enhancement and plastic and oral surgery research. The technique appears in a paper published this week in Nature.

"In this latest work, we discovered a way to grow both layers of human skin together," says Koehler, referring to the top and bottom layers of human skin (the epidermis and dermis, respectively). "Those cells talk to each other in a skin organoid culture - or skin in a dish we created - and sprout hair follicles accompanied by fat cells and neurons."

Taking the discovery a step further, the team transplanted the human hairy skin into mice. The mice eventually sprouted human hair follicles at the site of transplantation. Potential applications include testing cosmetics, drugs, and burn treatments.

Skin in a dish includes mini organs

Skin-in-a-dish models are not new. And like many others, Koehler and his colleagues thought that the challenge of growing fully functional, hairy skin had long been solved. Skin cells were among the first cells to be grown in culture outside the body.

But the skin that people make in a dish never has mini organs or appendages, like hair follicles or sweat glands, embedded in the skin. These mini organs are important for heat regulation, touch sensation, and appearance.

In 2018, the team published a paper showing they could generate hairy skin from mouse stem cells. To create human hairy skin cells, the team started with human induced pluripotent stem cells, which are human adult skin cells that are coaxed back to an embryonic form.

"So we applied a cocktail of growth factors and small molecules, kind of a cooking recipe for human pluripotent stem cells," says Koehler.

The team first noticed co-development of skin epidermis and the dermis. The interaction and signaling between the two tissue layers led to budding of hair follicles at 70 days, which lines up well with the timing of hair development in the human fetus.

In addition to growing hair, the organoids produce fat and muscle-like cells of the skin, as well as nerves similar to those that mediate touch sensation. "The fat is an unsung hero of the skin and recent studies suggest it plays a critical role in wound healing," says Jiyoon Lee, PhD, first author on the paper and research associate in the department of otolaryngology at Boston Children's Hospital.

The organoids also produce Merkel cells, specialized touch responsive cells of the skin that have also been implicated in diseases such as Merkel cell carcinoma. "The inclusion of these other cell types likely expands the potential uses of the skin organoid model to research on sensory disorders and cancer," she adds.

Mice grew pigmented human hairs

To see if the technique worked in a living animal, the team cultured the organoids for over four months and then implanted them on the back of mice specially developed not to reject the grafts. "We noticed that within a month, tiny brown hairs sprang up from the transplant site," explains Lee. "This showed us, amazingly, that pigment cells also developed in the organoids."

They compared the transplanted skin with adult human skin samples, observing several features unique to human skin in the transplants. One is 'rete ridges', valleys in the wavy pattern of the human epidermis that help anchor it to the underlying skin layers. In addition, the transplanted hair developed elaborate sebaceous glands that produced sebum, the natural oil that lubricates human skin.

An unexpected and fortuitous discovery

This new discovery is literally an outgrowth of work Koehler began at Indiana University when working on a system of recapitulating the inner ear. His goal at the time was to create cells that sense auditory stimuli - sound - to model hearing loss disorders and test gene therapies for hearing loss and balance disorders.

There, he manipulated human induced pluripotent stem cells with the same cocktail of chemicals and proteins used during normal embryonic development, guiding them to become inner ear structures.

While developing this technique, the team found that, as the inner ear cells were budding during early development, skin tissue formed as a byproduct.

"This was surprising and we initially tried to get rid of the skin tissue, thinking it was a pesky off-target tissue, like a weed in a garden," recalls Koehler. "Once we saw the scientific value of growing hairy skin in dish, we switched tactics, trying to eliminate the inner ear organoids in favor of growing skin."

In their purification attempts, they discovered that the skin tissue contained both layers of skin, epidermis and dermis. In culture, the skin formed outgrowing hair follicles.

A proof of concept

Translating any mouse study into humans is a long road. "But we think we have developed a proof of concept showing that the cells integrate into skin and form outgrowing hair follicles," says Koehler.

The team hopes that in the long term, they can use the technology to seed wound beds with cultured skin to reconstruct skin, such as in the case of extensive burns or scars.

And while it might be tempting to think of the approach as a "cure" for baldness, Koehler cautions that many challenges lie ahead. "We now have a technique that could generate nearly unlimited hair follicles for transplantation," he says. "But immune rejection is a major hurdle and generating follicles tailored to an individual will be incredibly costly and take a year or more." To meet these challenges, the team is working on ways to accelerate development in a dish, engineer organoids to evade immune detection, or produce similar skin organoids from adult patient-derived cells.

Credit: 
Boston Children's Hospital