Tech

New CRISPR-based test for COVID-19 uses a smartphone camera

image: A new CRISPR-based test for COVID-19 developed by researchers at Gladstone Institutes, UC Berkeley, and UC San Francisco essentially converts a smartphone camera into a microscope to provide quick and accurate results.

Image: 
Gladstone Institutes

SAN FRANCISCO, CA--December 4, 2020--Imagine swabbing your nostrils, putting the swab in a device, and getting a read-out on your phone in 15 to 30 minutes that tells you if you are infected with the COVID-19 virus. This has been the vision for a team of scientists at Gladstone Institutes, University of California, Berkeley (UC Berkeley), and University of California, San Francisco (UCSF). And now, they report a scientific breakthrough that brings them closer to making this vision a reality.

One of the major hurdles to combating the COVID-19 pandemic and fully reopening communities across the country is the availability of mass rapid testing. Knowing who is infected would provide valuable insights about the potential spread and threat of the virus for policymakers and citizens alike.

Yet, people must often wait several days for their results, or even longer when there is a backlog in processing lab tests. And, the situation is worsened by the fact that most infected people have mild or no symptoms, yet still carry and spread the virus.

In a new study published in the scientific journal Cell, the team from Gladstone, UC Berkeley, and UCSF has outlined the technology for a CRISPR-based test for COVID-19 that uses a smartphone camera to provide accurate results in under 30 minutes.

"It has been an urgent task for the scientific community to not only increase testing, but also to provide new testing options," says Melanie Ott, MD, PhD, director of the Gladstone Institute of Virology and one of the leaders of the study. "The assay we developed could provide rapid, low-cost testing to help control the spread of COVID-19."

The technique was designed in collaboration with UC Berkeley bioengineer Daniel Fletcher, PhD, as well as Jennifer Doudna, PhD, who is a senior investigator at Gladstone, a professor at UC Berkeley, president of the Innovative Genomics Institute, and an investigator of the Howard Hughes Medical Institute. Doudna recently won the 2020 Nobel Prize in Chemistry for co-discovering CRISPR-Cas genome editing, the technology that underlies this work.

Not only can their new diagnostic test generate a positive or negative result, it also measures the viral load (or the concentration of SARS-CoV-2, the virus that causes COVID-19) in a given sample.

"When coupled with repeated testing, measuring viral load could help determine whether an infection is increasing or decreasing," says Fletcher, who is also a Chan Zuckerberg Biohub Investigator. "Monitoring the course of a patient's infection could help health care professionals estimate the stage of infection and predict, in real time, how long is likely needed for recovery."

A Simpler Test through Direct Detection

Current COVID-19 tests use a method called quantitative PCR--the gold standard of testing. However, one of the issues with using this technique to test for SARS-CoV-2 is that it requires DNA. The coronavirus is an RNA virus, which means that to use the PCR approach, the viral RNA must first be converted to DNA. In addition, this technique relies on a two-step chemical reaction, including an amplification step to provide enough of the DNA to make it detectable. So, current tests typically need trained users, specialized reagents, and cumbersome lab equipment, which severely limits where testing can occur and causes delays in receiving results.

As an alternative to PCR, scientists are developing testing strategies based on the gene-editing technology CRISPR, which excels at specifically identifying genetic material.

All CRISPR diagnostics to date have required that the viral RNA be converted to DNA and amplified before it can be detected, adding time and complexity. In contrast, the novel approach described in this recent study skips all the conversion and amplification steps, using CRISPR to directly detect the viral RNA.

"One reason we're excited about CRISPR-based diagnostics is the potential for quick, accurate results at the point of need," says Doudna. "This is especially helpful in places with limited access to testing, or when frequent, rapid testing is needed. It could eliminate a lot of the bottlenecks we've seen with COVID-19."

Parinaz Fozouni, a UCSF graduate student working in Ott's lab at Gladstone, had been working on an RNA detection system for HIV for the past few years. But in January 2020, when it became clear that the coronavirus was becoming a bigger issue globally and that testing was a potential pitfall, she and her colleagues decided to shift their focus to COVID-19.

"We knew the assay we were developing would be a logical fit to help the crisis by allowing rapid testing with minimal resources," says Fozouni, who is co-first author of the paper, along with Sungmin Son and María Díaz de León Derby from Fletcher's team at UC Berkeley. "Instead of the well-known CRISPR protein called Cas9, which recognizes and cleaves DNA, we used Cas13, which cleaves RNA."

In the new test, the Cas13 protein is combined with a reporter molecule that becomes fluorescent when cut, and then mixed with a patient sample from a nasal swab. The sample is placed in a device that attaches to a smartphone. If the sample contains RNA from SARS-CoV-2, Cas13 will be activated and will cut the reporter molecule, causing the emission of a fluorescent signal. Then, the smartphone camera, essentially converted into a microscope, can detect the fluorescence and report that a swab tested positive for the virus.

"What really makes this test unique is that it uses a one-step reaction to directly test the viral RNA, as opposed to the two-step process in traditional PCR tests," says Ott, who is also a professor in the Department of Medicine at UCSF. "The simpler chemistry, paired with the smartphone camera, cuts down detection time and doesn't require complex lab equipment. It also allows the test to yield quantitative measurements rather than simply a positive or negative result."

The researchers also say that their assay could be adapted to a variety of mobile phones, making the technology easily accessible.

"We chose to use mobile phones as the basis for our detection device since they have intuitive user interfaces and highly sensitive cameras that we can use to detect fluorescence," explains Fletcher. "Mobile phones are also mass-produced and cost-effective, demonstrating that specialized lab instruments aren't necessary for this assay."

Accurate and Quick Results to Limit the Pandemic

When the scientists tested their device using patient samples, they confirmed that it could provide a very fast turnaround time of results for samples with clinically relevant viral loads. In fact, the device accurately detected a set of positive samples in under 5 minutes. For samples with a low viral load, the device required up to 30 minutes to distinguish them from a negative test.

"Recent models of SARS-CoV-2 suggest that frequent testing with a fast turnaround time is what we need to overcome the current pandemic," says Ott. "We hope that with increased testing, we can avoid lockdowns and protect the most vulnerable populations."

Not only does the new CRISPR-based test offer a promising option for rapid testing, but by using a smartphone and avoiding the need for bulky lab equipment, it has the potential to become portable and eventually be made available for point-of-care or even at-home use. And, it could also be expanded to diagnose other respiratory viruses beyond SARS-CoV-2.

In addition, the high sensitivity of smartphone cameras, together with their connectivity, GPS, and data-processing capabilities, have made them attractive tools for diagnosing disease in low-resource regions.

"We hope to develop our test into a device that could instantly upload results into cloud-based systems while maintaining patient privacy, which would be important for contact tracing and epidemiologic studies," Ott says. "This type of smartphone-based diagnostic test could play a crucial role in controlling the current and future pandemics."

Credit: 
Gladstone Institutes

No strings attached: maximizing wireless charging efficiency with multiple transmitters

image: Wireless charging has already found its way in many consumer electronics and medical applications, and maximizing efficiency when using multiple transmitters could extend its use to industrial robotics and electric vehicles.

Image: 
Unsplash

Wireless power transfer has proven to be quite useful in electronic devices such as medical implants and smartphones. In most cases, this is done by aligning or "coupling" two separate coils of wire (transmitter Tx and receiver Rx). The electrical current circulating in the Tx coil then creates a magnetic field that transfers energy to the Rx coil. Recently, the use of multiple Txs has been explored, which can cover a wide charging area.

However, although methods for transferring power wirelessly with maximum efficiency have been studied in great detail in single-Tx systems, the same is not true for systems with multiple Tx coils. Maximizing efficiency in the multi-Tx problem is challenging because the Rx could be located anywhere over the surface covered by the Txs, leading to stronger coupling with some and negligible coupling with others. To date, there have been no control schemes that can optimize the currents delivered to each Tx in real time--until now.

In a study published in IEEE Transactions on Power Electronics, scientists at Incheon National University, Korea, devised an effective control strategy for maximizing efficiency in multi-Tx wireless charging. They first developed a theoretical framework and found important relationships among key variables in the problem, such as the connection between the degree of coupling of each Tx to the Rx, its "perceived" or "reflected" impedance from the Rx, and the optimal current that should be fed to it.

With this knowledge, the researchers implemented a novel, maximally efficient, and comparatively simple method for multi-Tx wireless charging. "Our strategy breaks away from the more traditional approach of locating the Rx with a position sensor and only turning on the Tx closest to it," explains Professor Dukju Ahn. "Instead, we found that the coupling degree of each Tx can be measured indirectly in real time through its impedance, allowing us to dynamically adjust the output of each Tx coil to achieve maximum efficiency."
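To make the idea concrete, here is a deliberately simplified Python sketch of that control loop. It leans on a textbook relationship for resonant links (the resistance reflected back into a Tx scales with the square of its mutual inductance to the Rx) and on the classical result that efficiency is maximized when each Tx current is driven in proportion to its coupling. The function name and all numbers are illustrative assumptions, not details from the paper.

    import numpy as np

    def tx_current_weights(reflected_resistances, total_current=1.0):
        # For a resonant link, the resistance each Tx "sees" reflected
        # back from the Rx scales as (w * M_i)**2, so the coupling M_i
        # can be inferred, up to a common factor, as sqrt(R_ref,i).
        m_est = np.sqrt(np.asarray(reflected_resistances, dtype=float))
        # Drive each Tx in proportion to its estimated coupling, instead
        # of switching on the nearest coil and turning off the rest.
        return total_current * m_est / m_est.sum()

    # Illustrative reading: the Rx sits near Tx 2; Tx 1 and Tx 3 barely couple
    print(tx_current_weights([0.04, 1.00, 0.09]))  # ohms, made-up values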

Prof. Ahn also noted that although other techniques have been published previously, their performance was assessed with the Rx standing still at different locations. "Wireless charging technology is aimed at applications involving moving receivers. In this sense, our work is the first to verify the efficiency of a multi-Tx control scheme with a receiver that's actually moving in real time," he remarks.

Wireless charging technology will help remove the hurdles of wired power supplies in many applications. With efficient multi-Tx wireless power transfer, we might be able to do away with the large and heavy batteries that current electric vehicles and industrial robots use, making them cheaper and easier to move.

Let us hope this study energizes further research in this field!

Credit: 
Incheon National University

Research reveals how airflow inside a car may affect COVID-19 transmission risk

image: A new study looks at how airflow patterns inside the passenger cabin of a car might affect the transmission of SARS-CoV-2 and other airborne pathogens.

Using computer simulations, the study looked at the risk of aerosol particles being shared between a driver and a passenger in different window configurations. Redder shades indicate more particles. Risk was shown to be higher with windows closed (top left), and decreasing with each window opened. The best case was having all windows open (bottom right).

Image: 
Breuer lab / Brown University

PROVIDENCE, R.I. [Brown University] -- A new study of airflow patterns inside a car's passenger cabin offers some suggestions for potentially reducing the risk of COVID-19 transmission while sharing rides with others.

The study, by a team of Brown University researchers, used computer models to simulate the airflow inside a compact car with various combinations of windows open or closed. The simulations showed that opening windows -- the more windows the better -- created airflow patterns that dramatically reduced the concentration of airborne particles exchanged between a driver and a single passenger. Blasting the car's ventilation system didn't circulate air nearly as well as a few open windows, the researchers found.

"Driving around with the windows up and the air conditioning or heat on is definitely the worst scenario, according to our computer simulations," said Asimanshu Das, a graduate student in Brown's School of Engineering and co-lead author of the research. "The best scenario we found was having all four windows open, but even having one or two open was far better than having them all closed."

Das co-led the research with Varghese Mathai, a former postdoctoral researcher at Brown who is now an assistant professor of physics at the University of Massachusetts, Amherst. The study is published in the journal Science Advances.

The researchers stress that there's no way to eliminate risk completely -- and, of course, current guidance from the U.S. Centers for Disease Control (CDC) notes that postponing travel and staying home is the best way to protect personal and community health. The goal of the study was simply to examine how changes in airflow inside a car may worsen or reduce the risk of pathogen transmission.

The computer models used in the study simulated a car, loosely based on a Toyota Prius, with two people inside -- a driver and a passenger sitting in the back seat on the opposite side from the driver. The researchers chose that seating arrangement because it maximizes the physical distance between the two people (though still less than the 6 feet recommended by the CDC). The models simulated airflow around and inside a car moving at 50 miles per hour, as well as the movement and concentration of aerosols coming from both driver and passenger. Aerosols are tiny particles that can linger in the air for extended periods of time. They are thought to be one way in which the SARS-CoV-2 virus is transmitted, particularly in enclosed spaces.

Part of the reason that opening windows is better in terms of aerosol transmission is because it increases the number of air changes per hour (ACH) inside the car, which helps to reduce the overall concentration of aerosols. But ACH was only part of the story, the researchers say. The study showed that different combinations of open windows created different air currents inside the car that could either increase or decrease exposure to remaining aerosols.
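The ACH part of the story can be captured with a toy "well-mixed cabin" model, sketched below in Python. This is emphatically not the study's fluid dynamics simulation, which resolves the directional currents described next; it only shows why more air changes per hour dilute aerosols. All numbers are invented for illustration.

    # Well-mixed model: dC/dt = S/V - ACH * C, with emission rate S,
    # cabin volume V, and ACH air changes per hour.
    S = 10.0      # particles emitted per hour (arbitrary units)
    V = 3.0       # cabin volume in cubic meters, roughly a compact car
    dt = 1 / 360  # ten-second time step, expressed in hours

    for ach in (2, 20, 100):  # windows closed ... all four windows open
        c = 0.0
        for _ in range(int(0.5 / dt)):   # simulate a 30-minute ride
            c += (S / V - ach * c) * dt  # forward-Euler update
        print(f"ACH={ach:>3}: concentration after 30 min = {c:.3f}")
    # The steady state is S/(V*ACH): doubling the airflow halves exposure.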

Because of the way air flows across the outside of the car, air pressure near the rear windows tends to be higher than pressure at the front windows. As a result, air tends to enter the car through the back windows and exit through the front windows. With all the windows open, this tendency creates two more-or-less independent flows on either side of the cabin. Since the occupants in the simulations were sitting on opposite sides of the cabin, very few particles end up being transferred between the two. The driver in this scenario is at slightly higher risk than the passenger because the average airflow in the car goes from back to front, but both occupants experience a dramatically lower transfer of particles compared to any other scenario.

The simulations for scenarios in which some but not all windows are down yielded some possibly counterintuitive results. For example, one might expect that opening windows directly beside each occupant might be the simplest way to reduce exposure. The simulations found that while this configuration is better than no windows down at all, it carries a higher exposure risk compared to putting down the window opposite each occupant.

"When the windows opposite the occupants are open, you get a flow that enters the car behind the driver, sweeps across the cabin behind the passenger and then goes out the passenger-side front window," said Kenny Breuer, a professor of engineering at Brown and a senior author of the research. "That pattern helps to reduce cross-contamination between the driver and passenger."

It's important to note, the researchers say, that airflow adjustments are no substitute for mask-wearing by both occupants when inside a car. And the findings are limited to potential exposure to lingering aerosols that may contain pathogens. The study did not model larger respiratory droplets or the risk of actually becoming infected by the virus.

Still, the researchers say the study provides valuable new insights into air circulation patterns inside a car's passenger compartment -- something that had received little attention before now.

"This is the first study we're aware of that really looked at the microclimate inside a car," Breuer said. "There had been some studies that looked at how much external pollution gets into a car, or how long cigarette smoke lingers in a car. But this is the first time anyone has looked at airflow patterns in detail."

The research grew out of a COVID-19 research task force established at Brown to gather expertise from across the University to address widely varying aspects of the pandemic. Jeffrey Bailey, an associate professor of pathology and laboratory medicine and a coauthor of the airflow study, leads the group. Bailey was impressed with how quickly the research came together, with Mathai suggesting the use of computer simulations that could be done while laboratory research at Brown was paused for the pandemic.

"This is really a great example of how different disciplines can come together quickly and produce valuable findings," Bailey said. "I talked to Kenny briefly about this idea, and within three or four days his team was already doing some preliminary testing. That's one of the great things about being at a place like Brown, where people are eager to collaborate and work across disciplines."

Credit: 
Brown University

Researchers adapt cell phone camera for SARS-CoV-2 detection

image: A photo of a device attached to an ordinary smartphone that can detect the presence of SARS-CoV-2 in a nasal swab.

Image: 
Daniel Fletcher and Melanie Ott

Researchers have developed an assay that can detect the presence of SARS-CoV-2 in a nasal swab using a device attached to an ordinary smartphone, they report December 4 in the journal Cell. Although more research is needed before such a test can be rolled out, the results are promising and ultimately may be applicable to screening more broadly for other viruses.

"Our study shows that we can do the detection part of this assay very quickly, making the measurement with mass-produced consumer electronics," says Daniel Fletcher, a bioengineer at the University of California in Berkeley and co-senior author on the paper. "We don't need fancy laboratory equipment."

Fletcher and co-senior author Melanie Ott (@TheOttLab), a virologist at Gladstone Institutes and the University of California, San Francisco, began collaborating with Nobel laureate Jennifer Doudna, also a co-author on the study, about two years ago on a rapid, at-home test for HIV. They were looking to address the need for frequent testing that has arisen from drug trials requiring close monitoring of patients' viral loads. When COVID-19 hit the scene in January, they quickly pivoted their research to develop a test that would detect the presence of a different virus--SARS-CoV-2.

The test makes use of CRISPR-Cas technology. Specifically, RNA in the sample can be detected with the Cas13 enzyme, eliminating the need for the reverse transcription of RNA into DNA and the subsequent PCR amplification used in current standard tests. When Cas13 binds to RNA from the virus, it cleaves any surrounding RNA sequences; the researchers added an RNA-based probe to the reaction that gets cleaved and produces fluorescence that can be detected with the camera. The assay provides results within 30 minutes.

In the current study, which was primarily designed to be a test of the amplification-free CRISPR-Cas technology and the detector, the nasal swabs were spiked with SARS-CoV-2 RNA. The investigators are currently working on a solution that would induce a single-step reaction in which the RNA is released from the virus without the need for purification. Because it doesn't require amplification, the assay is able to quantify the amount of virus in the sample.

"It's super exciting to have this quantitative aspect in the assay," Ott says. "PCR is the gold standard, but you have to go through so many steps. There are huge opportunities here for pathogens and for biology in general to make RNA quantification more precise."

The fluorescence detector consists of a laser, to produce illumination and excite the fluorescence, and an added lens to help collect light. The phone is placed on top of it. "One takeaway is that the phone camera is ten times better than the plate reader in the lab," Ott says. "This is directly translatable to it being a better diagnosis reader." Previous research in Fletcher's lab has led to phone-based devices that visually detect parasites in blood and other samples, and the current assay demonstrates how phone cameras can also be useful for molecular detection.

Ultimately, Fletcher and Ott would like to have this type of test be part of a broader system that could be used at home to screen not only for SARS-CoV-2 but other viruses--like those that cause colds and flu. But more immediately, the researchers hope to develop a testing device using this technology that could be rolled out to pharmacies and drop-in clinics. They would like to get the cost of testing cartridges down to about $10. The final device would probably not actually use a phone but have a phone camera built into it.

Ott notes that what they've learned developing this SARS-CoV-2 test can also be applied to their work with HIV tests. "We will need to change the extraction methods because we'll be dealing with blood instead of nasal swabs, but it's really helpful that we've developed the fluorescent detection part," she says. "This is the start of an era when we can give the individual more authority and autonomy" in terms of being able to test themselves.

Credit: 
Cell Press

Research identifies nanoscale effect of water and mineral content on bone

image: Arun Nair, University of Arkansas

Image: 
University of Arkansas

University of Arkansas researchers Marco Fielder and Arun Nair have conducted the first study of the combined nanoscale effects of water and mineral content on the deformation mechanisms and thermal properties of collagen, the essence of bone material.

The researchers also compared the results to the same properties of non-mineralized collagen reinforced with carbon nanotubes, which have shown promise as a reinforcing material for bio-composites. This research aids in the development of synthetic materials to mimic bone.

Using molecular dynamics -- in this case a computer simulation of the physical movements of atoms and molecules -- Nair and Fielder examined the mechanics and thermal properties of collagen-based bio-composites containing different weight percentages of minerals, water and carbon nanotubes when subjected to external loads.

They found that variations of water and mineral content had a strong impact on the mechanical behavior and properties of the bio-composites, the structure of which mimics nanoscale bone composition. With increased hydration, the bio-composites became more vulnerable to stress. Additionally, Nair and Fielder found that the presence of carbon nanotubes in non-mineralized collagen reduced the deformation of the gap regions.

The researchers also tested stiffness, a standard measure of a material's resistance to deformation. Both mineralized and non-mineralized collagen bio-composites demonstrated less stability with greater water content. Composites with 40% mineralization were twice as strong as those without minerals, regardless of water content. Stiffness of composites with carbon nanotubes was comparable to that of the mineralized collagen.
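In molecular dynamics work of this kind, stiffness is typically read off as the slope of the initial, linear portion of the simulated stress-strain curve. The short Python sketch below illustrates that post-processing step; the data, the 2% elastic-limit cutoff, and the numbers are invented for illustration and are not taken from the study.

    import numpy as np

    def youngs_modulus(strain, stress, elastic_limit=0.02):
        # Stiffness = slope of the linear (elastic) region of the curve
        mask = strain <= elastic_limit
        slope, _ = np.polyfit(strain[mask], stress[mask], 1)
        return slope

    # Made-up stress-strain data (stress in GPa) standing in for MD output
    strain = np.linspace(0, 0.05, 50)
    stress = 20.0 * strain + 0.02 * np.random.normal(size=strain.size) * strain
    print(f"Estimated stiffness: {youngs_modulus(strain, stress):.1f} GPa")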

"As the degree of mineralization or carbon nanotube content of the collagenous bio-composites increased, the effect of water to change the magnitude of deformation decreased," Fielder said.

The bio-composites made of collagen and carbon nanotubes were also found to have a higher specific heat than the studied mineralized collagen bio-composites, making them more likely to resist thermal damage that could occur during implantation or functional use of the composite.

Like most biological materials, bone is hierarchical, with different structures at different length scales. At the microscale, bone is made of collagen fibers, which are composed of smaller nanofibers called fibrils, themselves a composite of collagen proteins, mineralized crystals called apatite, and water. Collagen fibrils overlap each other in some areas and are separated by gaps in others.

"Though several studies have characterized the mechanics of fibrils, the effects of variation and distribution of water and mineral content in fibril gap and overlap regions are unexplored," said Nair, who is an associate professor of mechanical engineering. "Exploring these regions builds an understanding of the structure of bone, which is important for uncovering its material properties. If we understand these properties, we can design and build better bio-inspired materials and bio-composites."

Credit: 
University of Arkansas

The same vision for all primates

image: The gray mouse lemur, the smallest species of primates, has excellent vision. More than one fifth of its cerebral cortex is dedicated to visual processing in order to accommodate a sufficient number of pixel processing units.

Image: 
UNIGE/HUBER

Primates process the visual information in front of their eyes much as a digital camera processes pixels: with small computing units located in the visual cortex of the brain. In order to understand the origins of our visual abilities, scientists at the University of Geneva (UNIGE), in collaboration with the Max Planck Institute in Göttingen and the National Museum of Natural History in Paris, have now investigated whether these computational units scale across the large differences in size between primates. The gray mouse lemur (Microcebus murinus) from Madagascar is one of the smallest primates and weighs barely 60 grams.

In a study published in the journal Current Biology, the scientists compared the visual system of the mouse lemur to that of other primates and found that the size of these visual processing units is identical in all primates, independent of body size. As the mouse lemur is a very special species, sharing many traits with the very first primates that evolved 55 million years ago, these results suggest a remarkable preservation of our visual system and highlight the importance of vision in our daily lives, and in those of our ancestors in the distant past.

For more than a century, the visual system of primates has been intensely studied. These studies revealed that, unlike in other mammals such as rodents, visual information is processed by small, dedicated computing units located in the visual cortex. "As the different primate species cover a wide range of sizes, we were led to wonder whether this basic computing unit scales with body or brain size. Is it simplified or miniaturized, for example, in the world's smallest primate, the gray mouse lemur?" asks Daniel Huber, professor in the Department of Fundamental Neurosciences at the UNIGE Faculty of Medicine.

Don't Mind the Size

To answer this question, the visual system of the mouse lemur was studied using an optical brain imaging technique. Geometrical shapes representing lines of various orientations were presented to the lemurs, and the activity of the neurons responding to the visual stimuli was imaged. Repeating such measurements gradually allowed the researchers to determine the size of the minimal units processing form information. "We expected to see a unit of tiny size, proportional to the small size of the lemur, but our data revealed that they measure more than half a millimeter in diameter," says Daniel Huber.

In collaboration with the Max Planck researchers, Huber compared hundreds of these units imaged in the tiny mouse lemur brain with data obtained for the visual circuits of other, much larger primate species. The team made a surprising discovery: the basic processing unit was almost identical in size in the 60-gram mouse lemur, in larger monkeys such as macaques weighing about seven kilograms, and even in far larger primates such as us humans.

They also found that the way the units are arranged across the brain was totally indistinguishable, following the same rules with mathematical precision, and that the number of nerve cells per visual unit was almost identical in all primates studied so far. Göttingen Max Planck physicist Fred Wolf, who pointed out ten years ago that universal mathematical principles may rule visual system evolution, is still amazed by the degree of invariance: "55 million years of separation on different continents is a very long evolutionary path to travel. I would have expected some mix of general similarity and characteristic differences between species in these neural modules. But the fact of the matter simply is: It is practically impossible to tell them apart."

The Visual Circuits Are Powerful and Incompressible

These results thus provide insights into the origins of primate vision. First of all, the fact that this unit is so well preserved suggests that it probably evolved very early in primate history, indicating that when it comes to form vision our primate ancestors had visual abilities similar to ours from the start.

Second, the discovery by UNIGE scientists and their collaborators reveals that this part of the visual system cannot be compressed or miniaturized. A fixed number of neurons seems therefore to be required to ensure its optimal functionality. "For tiny primate species with excellent vision, such as the mouse lemur, the visual system must hence be relatively large, compared to the size of their entire brain, to accommodate a sufficient number of visual processing units," says the Geneva-based neuroscientist. Indeed, more than a fifth of the cerebral cortex of this lemur is dedicated to visual processing. In comparison, the neural circuits related to vision occupy barely 3% of the human brain.
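The arithmetic behind that "incompressible" argument is simple enough to sketch. In the Python lines below, only the roughly half-millimeter unit size comes from the study; the unit count and cortex areas are rough assumptions chosen purely to illustrate the proportions.

    import math

    unit_d_mm = 0.5                              # ~0.5 mm per unit (study)
    unit_area = math.pi * (unit_d_mm / 2) ** 2   # ~0.2 mm^2 each

    # A tiny cortex must commit a large share just to fit a minimal set:
    lemur_cortex_mm2 = 600.0                     # assumed total cortex
    min_units = 600                              # assumed minimal complement
    print(f"lemur: {100 * min_units * unit_area / lemur_cortex_mm2:.0f}% of cortex")

    # A human cortex is so large that even 3% hosts a vast array of units:
    human_cortex_mm2 = 200_000.0                 # assumed total cortex
    print(f"human: 3% holds ~{0.03 * human_cortex_mm2 / unit_area:,.0f} units")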

"This study also highlights the crucial importance of conserving the habitat of primate species such as the mouse lemur, particularly in the forests of Madagascar. These habitats are disappearing at an alarming pace, taking with them precious keys to understanding our own origins," concludes Daniel Huber.

Credit: 
Université de Genève

New COVID surveillance predicts direction, speed and acceleration of virus

CHICAGO --- A new COVID-19 global surveillance system has been developed which can dynamically track not just where the virus is now, but where it is going, how fast it will arrive and whether that speed is accelerating.

The new surveillance system, the first to dynamically track the virus, is being rolled out in 195 countries Dec. 3. It also will dynamically track the virus in individual U.S. states and metropolitan areas and in Canadian provinces.

"Now we can easily identify outbreaks at their beginning," said Lori Post, the lead investigator and director of the Buehler Center for Health Policy and Economics at Northwestern University Feinberg School of Medicine. "You want to know where the pandemic is accelerating, how fast it is moving and how that compares to prior weeks."

Post, James Oehmke of Northwestern, and Charles Moss of the University of Florida worked day and night over the past four months to develop the novel surveillance system, based on path-breaking research led by Oehmke.

"We can inform leaders where the outbreak is occurring before it shows up in overcrowded hospitals and morgues," Post said. "Current systems are static and ours is dynamic."

Northwestern is hosting a dashboard for the new COVID tracking system -- open to anyone -- with the new metrics as well as traditional metrics. Each country's dashboard will be monitored in every U.S. embassy in the world to inform policy leaders around the globe. Users will have metrics of the whole world at their fingertips.

The new system and the first U.S. surveillance report will be published in the Journal of Medical Internet Research on Dec. 3.

The global surveillance app analyzes the virus in the same way the field of economics measures the expansion and contraction of the economy.

"These methods are tried and tested, but this is the first time they are being applied to disease surveillance," Post said. "We had the model and metrics validated for medical surveillance and published. We know they work."

The project is named GASSP (GlobAl SARS-CoV-2 Surveillance Project).

Existing surveillance can't identify pandemic shifts or signal a coming outbreak

Existing surveillance, which hasn't changed much in 50 years, measures the caseload in terms of new and cumulative deaths and infections. These systems don't identify significant shifts in the pandemic or sound the alarm when a concerning acceleration of disease transmission signals an outbreak.

The surveillance system can help U.S. embassies and missions support partnering countries to formulate and implement policies that mitigate COVID-19 or adverse outcomes such as food insecurity, and understand which policies are working best.

These new metrics also can help developed countries and their health systems prepare for swift changes in the pandemic.

"For example, relative to other countries, the Netherlands is a small country and doesn't have the same caseload as some larger countries like Spain," Post said, "But they have alarming signs right now -- increased speed, acceleration and positive jerk and that means potential for explosive growth."

Drilling down to county level to save state economies

In the U.S., if the pulse of one state is bad, the surveillance can drill down to the county level to identify where the problem is originating. This would enable only the troubled areas to be placed under strict quarantine and masking rules, preserving the state economy, Post said.

Among its many findings in the U.S., the surveillance system reports:

Hawaii, Vermont and Maine have the smallest rates of new daily infections per 100,000 population, but because their speed is accelerating and their persistence remains positive, they need to enforce masking, social distancing, crowd control and hygiene or they could potentially escalate into explosive growth.

Wisconsin is a state where the outbreak will likely continue to explode. Wisconsin and California have similar average numbers of new COVID-19 cases per day, even though California is 6.8 times bigger than Wisconsin. Wisconsin is disproportionately affected by the pandemic, and yet it was California that declared an emergency stay-at-home order.

Wyoming has several indicators over the past three weeks indicating their outbreak is going to get much worse.

Speed isn't enough: why acceleration and jerk matter

"Speed itself doesn't tell us enough," Post said. "We have to know the acceleration and how that compares week to week -- jerk -- to be prepared for what's coming in the pandemic."

Jerk is a measure of increasing acceleration and may help predict the stress the pandemic will place on health care systems. "Jerk can help turn a reactive policy response into a proactive policy response," said Oehmke, adjunct professor of emergency medicine at Northwestern. The term is borrowed from physics, Oehmke said, because until now the concept did not exist in the public health nomenclature.

"If you are the governor of New York, it is not helpful to prevent future outbreaks by looking at how many people already were infected by the novel coronavirus," Post said. "You want to know what is going on now and what are likely scenarios in the near future. By looking at speed, acceleration and jerk, we can inform leaders where the outbreak is occurring before it shows up in overcrowded hospitals and morgues."

The system also controls for incomplete data using state-of-the-art statistical methods. Existing surveillance picks up severe cases, Post said, so in the case of COVID, those numbers likely only represent 10% to 20% of caseload.

"We are like air traffic controllers"

"We are picking up the dynamic characteristics of the pandemic," Post said. "Pandemics move around and change. We are like air traffic controllers guiding in an airplane during a thunderstorm. The pilot can't see. They don't know where to go -- they need information. We have to guide that plane on instruments. Analogously, we need to inform public health leaders when there are significant shifts to the pandemic."

Persistence -- an echo effect forward -- is equally important to pandemic surveillance.

"The persistence effect measures the likelihood people newly presenting last week infected others who will present with COVID-19 this week, who are infecting others who will present next week, and so on," Oehmke said. "To flatten the curve and end the pandemic, we need to reduce the persistence effect and eventually bring it to zero -- that has to be a key policy objective."

While persistence effects are known in financial market analysis, their application to pandemics is new.
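A one-line caricature of that echo effect, with invented numbers and not the authors' statistical model: treat this week's presentations as last week's, scaled by the persistence effect, plus new seeding.

    def project_weeks(current_cases, persistence, new_seeds, weeks=8):
        trajectory = [current_cases]
        for _ in range(weeks):
            # each week echoes the last, scaled by the persistence effect
            trajectory.append(persistence * trajectory[-1] + new_seeds)
        return trajectory

    print(project_weeks(1000, 1.2, 50))  # persistence > 1: explosive growth
    print(project_weeks(1000, 0.6, 50))  # persistence < 1: outbreak decays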

The researchers have written papers about the COVID-19 pandemic covering 12 global regions of the world. Two foundational papers published in JMIR analyzed U.S. data; subsequent surveillance papers analyzed sub-Saharan Africa and South Asia, and the third surveillance paper now returns to the U.S.

Credit: 
Northwestern University

How plants compete for underground real estate affects climate change and food production

image: Pepper plants were grown in a greenhouse at the Museo Nacional de Ciencias Naturales (CSIC) in Madrid to investigate how their below-ground behavior differed when planted alone vs. alongside a neighbor.

Image: 
Ciro Cabal, Princeton University

You might have observed plants competing for sunlight -- the way they stretch upwards and outwards to block each other's access to the sun's rays -- but out of sight, another type of competition is happening underground. In the same way that you might change the way you forage for free snacks in the break room when your colleagues are present, plants change their use of underground resources when they're planted alongside other plants.

In a paper published today in Science, an international team of researchers led by Princeton graduate student Ciro Cabal sheds light on the underground life of plants. Their research used a combination of modeling and a greenhouse experiment to discover whether plants invest differently in root structures when planted alone versus when planted alongside a neighbor.

"This study was a lot of fun because it combined several different kinds of mind candy to reconcile seemingly contradictory results in the literature: a clever experiment, a new method for observing root systems in intact soils and simple mathematical theory," said Stephen Pacala, the Frederick D. Petrie Professor in Ecology and Evolutionary Biology (EEB) and the senior author on the paper.

"While the aboveground parts of plants have been extensively studied, including how much carbon they can store, we know much less about how belowground parts -- that is, roots -- store carbon," said Cabal, a Ph.D. student in Pacala's lab. "As about a third of the world's vegetation biomass, hence carbon, is belowground, our model provides a valuable tool to predict root proliferation in global earth-system models."

Plants make two different types of roots: fine roots that absorb water and nutrients from the soil, and coarse transportation roots that transport these substances back to the plant's center. Plant "investment" in roots involves both the total volume of roots produced and the way in which these roots are distributed throughout the soil. A plant could concentrate all of its roots directly beneath its shoots, or it could spread its roots out horizontally to forage in the adjacent soil -- which risks competition with the roots of neighboring plants.

The team's model predicted two potential outcomes for root investment when plants find themselves sharing soil. In the first outcome, the neighboring plants "cooperate" by segregating their root systems to reduce overlap, producing fewer roots overall than they would if they were solitary. In the second outcome, when a plant senses reduced resources on one side due to the presence of a neighbor, it shortens its root system on that side but invests more in roots directly below its stem.

Natural selection predicts this second scenario, because each plant acts to increase its own fitness, regardless of how those actions impact other individuals. If plants are very close together, this increased investment in root volume, despite segregation of those roots, could result in a tragedy of the commons, whereby the resources (in this case, soil moisture and nutrients) are depleted.
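A toy game-theory model reproduces that logic; every function and number below is invented for illustration and is not the paper's model. Two plants each choose a root investment, capture a share of a common resource pool, and pay a cost per unit of root built.

    import numpy as np

    R, c, k = 10.0, 0.5, 1.0  # resource pool, root cost, soil constant
    grid = np.linspace(0.0, 10.0, 2001)

    def payoff(x_self, x_other):
        # Resource captured minus the cost of building the roots
        return R * x_self / (x_self + x_other + k) - c * x_self

    # Selfish (Nash) outcome: iterate best responses until they settle
    x1 = x2 = 1.0
    for _ in range(50):
        x1 = grid[np.argmax(payoff(grid, x2))]
        x2 = grid[np.argmax(payoff(grid, x1))]

    # Cooperative outcome: the symmetric investment maximizing joint payoff
    x_coop = grid[np.argmax(2 * payoff(grid, grid))]
    print(f"selfish roots per plant ~{x1:.2f}, cooperative ~{x_coop:.2f}")
    # Selfish plants overbuild roots relative to the cooperative level,
    # depleting the shared pool -- the predicted tragedy of the commons.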

To test the model's predictions, the researchers grew pepper plants in a greenhouse both individually and in pairs. At the end of the experiment, they dyed the roots of the plants different colors so that they could easily see which roots belonged to which plant. Then, they calculated the total biomass of each plant's root system and the ratio of roots to shoots, to see whether plants changed how much energy and carbon they deposited into belowground and aboveground structures when planted alongside neighbors, and counted the number of seeds produced by each plant as a measure of relative fitness.

The team discovered that the outcome depends on how close a pair of plants are to each other. If planted very close together, plants will be more likely to heavily invest in their root systems to try to outcompete each other for finite underground resources; if they are planted further apart, they will likely invest less in their root systems than a solitary plant would.

Specifically, they found that when planted near others, pepper plants increased investment in roots locally and reduced how far they stretched their roots horizontally, to reduce overlap with neighbors. There was no evidence for a "tragedy of the commons" scenario, since there was no difference in the total root biomass or relative investment in roots compared to aboveground structures (including the number of seeds produced per plant) for solitary versus co-habiting plants.

Plants remove carbon dioxide from the atmosphere and deposit it in their structures -- and a third of this vegetative carbon is stored in roots. Understanding how carbon deposition changes in different scenarios could help us more accurately predict carbon uptake, which in turn could help design strategies to mitigate climate change. This research could also help optimize food production, because in order to maximize crop yield, it's helpful to understand how to optimally use belowground (and aboveground) resources.

Credit: 
Princeton University

Robot fleet dives for climate answers in 'marine snow'

image: CSIRO's RV Investigator

Image: 
CSIRO

A fleet of new-generation, deep-diving ocean robots will be deployed in the Southern Ocean, in a major study of how marine life acts as a handbrake on global warming.

The automated probes will be looking for 'marine snow', which is the name given to the shower of dead algae and carbon-rich organic particles that sinks from upper waters to the deep ocean.

Sailing from Hobart on Friday, twenty researchers aboard CSIRO's RV Investigator hope to capture the most detailed picture yet of how marine life in the Southern Ocean captures and stores carbon from the atmosphere.

Voyage Chief Scientist, Professor Philip Boyd, from AAPP and IMAS, said it would be the first voyage of its kind to combine ship-board observations, deep-diving robots, automated ocean gliders and satellite measurements.

"The microscopic algae in the ocean are responsible for removing carbon dioxide from the atmosphere as much as the forests on land are," said Prof. Boyd.

"When they die, these tiny carbon-rich particles fall slowly to the ocean floor like a scene from a snow globe."

"We are excited about how this combination of new imaging sensors will allow us to get a larger and much clearer picture of how ocean life helps to store carbon."

"It's a bit like an astronomer who has only been able to study one star at a time suddenly being able to observe the galaxy in three-dimensions."

Prof Boyd said the research would improve our understanding of a process scientists call the 'carbon pump', so named because it is responsible for pumping large volumes of carbon from the atmosphere into the ocean.

"We are just beginning to understand how the biological carbon pump works, but we know it helps in the removal of about a quarter of all the carbon dioxide that humans emit by burning fossil fuels. "

"During the voyage, we will deploy a fleet of deep-diving robotic floats and gliders that use new bio-optical sensors to 'photograph' the density of the algae at different depths."

"When they return to the ocean surface, these floats will immediately transmit their data back to us via satellite."

"It is a major step forward in our ability to measure carbon uptake by marine life," said Prof. Boyd.

The Southern Ocean Large Areal Carbon Export (SOLACE) voyage is scheduled to depart on Friday, 4 December at 8:00 AM.

Credit: 
University of Tasmania

Potential means of improving learning and memory in people with mental illnesses

image: Average brain activity during a working memory task in a group of healthy subjects, as measured by fMRI. The colors represent higher brain activity in carriers of the G version of the GCPII gene, whose brains are less efficient at performing the task, compared with carriers of the A version.

Image: 
Bigos laboratory

More than a dozen drugs are known to treat symptoms such as hallucinations, erratic behaviors, disordered thinking and emotional extremes associated with schizophrenia, bipolar disorder and other severe mental illnesses. But, drug treatments specifically able to target the learning, memory and concentration problems that may accompany such disorders remain elusive.

In an effort to find such treatments, Johns Hopkins Medicine researchers report they have identified a genetic variation in the brain tissue of a subset of deceased people -- some with typical mental health and some with schizophrenia or other psychoses -- that may influence cognition and IQ. In the process, they unearthed biochemical details about how the gene operates.

Results of their work, described in the Dec. 1 issue of the American Journal of Psychiatry, could advance the development of drugs that target the enzyme made by this gene, and thus improve cognition in some people with serious mental illnesses or other conditions that cause reduced capacity in learning and memory.

Typical antipsychotic medications that treat schizophrenia symptoms regulate the brain chemical dopamine, a transmitter of nerve impulses associated with the ability to feel pleasure, think and plan, which malfunctions in patients with the disorder. However, previous genetic studies have shown that another brain chemical signal transmitter, glutamate, a so-called "excitatory" chemical associated with learning and memory, plays a role as well. A related neurotransmitter, N-acetyl-aspartyl-glutamate (NAAG), specifically binds to a protein receptor found on brain cells that has been linked to schizophrenia, but how it impacts the disorder is unknown.

The research of clinical pharmacologist Kristin Bigos, Ph.D., assistant professor of medicine at the Johns Hopkins University School of Medicine, sought to explore more deeply the role of NAAG in cognitive impairment with the goal of eventually developing therapies for treating these learning, memory or concentration problems.

Using tissues gathered from a repository of brains from deceased donors belonging to the Lieber Institute for Brain Development, Bigos and her team measured and compared levels of certain genetic products in the brains of 175 people who had schizophrenia and the brains of 237 typical controls.

Bigos and her colleagues specifically looked at the gene that makes an enzyme known as glutamate carboxypeptidase II (GCPII), which breaks down NAAG into its component parts -- NAA and glutamate. In the brains of people with schizophrenia and in the typical controls, they found that carriers of this genetic variant (having one or two copies of the gene variation) had higher levels of the genetic product that makes the GCPII enzyme.

In the gene for the enzyme, the only difference in the versions was a single letter of the genetic code, either G or A (for the nucleotide bases guanine and adenine). If people had the version of the gene with one copy of G, then the tissue at the front of their brain -- the seat of cognition -- had 10.8% higher levels of the enzyme than those who had the version of the gene with A, and if people had two copies, they had 21% higher levels of the enzyme.

To see if this genetic variation in GCPII controlled the levels of NAAG in the brains of living people, the researchers measured levels of NAAG in the brain using magnetic resonance spectroscopy, which uses a combination of strong magnetic field and radio waves to measure the quantity of a chemical in a tissue or organ.

In this experiment, they focused on 65 people without psychosis and 57 patients diagnosed with recent onset of psychosis, meaning many of them were likely to eventually be diagnosed with schizophrenia, at the Johns Hopkins Schizophrenia Center. Participants averaged 24 years of age, and 59% were men. About 64% of participants identified as African American, and the remaining 36% were white.

The researchers found 20% lower levels of NAAG in the left centrum semiovale -- a region of the brain found deep inside the upper left side of the head -- in the white participants both with and without psychosis who had two copies of the G version of the enzyme compared with other white people who had the A version.

To see if having the G or A version of the gene plays a role in cognition, the researchers tested IQ and visual memory in the healthy participants and those with psychosis, both white and African American. They found that people with the most NAAG in their brain (in the top 25%) scored 10% higher on the visual memory test than those in the bottom 25%. They also found that people with two copies of the G version of the GCPII sequence scored 10 points lower on their IQ test on average than the people with the A version of the gene, which the researchers say is a meaningful difference in IQ.

Finally, they showed that healthy carriers of the G version of the GCPII sequence had less efficient brain activity during a working memory task, as measured by functional MRI, by at least 20% compared with those people with the A version of the gene.

"Our results suggest that higher levels of the NAAG are associated with better visual and working memory, and that may eventually lead us to develop therapies that specifically raise these levels in people with mental illness and other disorders related to poor memory to see if that can improve cognition," says Bigos.

Credit: 
Johns Hopkins Medicine

Adaptive Image Receive (AIR) coil from GE shows promise for whole-brain imaging

image: In this prototype design, no elements are placed over the patient's eyes, nose, or mouth, which may improve comfort compared with conventional coils. In this image, identifiable portions of the participant's face have been obscured for publication and privacy purposes.

Image: 
American Roentgen Ray Society (ARRS), American Journal of Roentgenology (AJR)

Leesburg, VA, December 3, 2020--According to an article in ARRS' American Journal of Roentgenology (AJR), a prototype 16-channel head Adaptive Image Receive (AIR) radiofrequency coil from GE Healthcare outperformed a conventional 8-channel head coil for in vivo whole-brain imaging, though it did not perform as well as a conventional 32-channel head coil.

"This study shows the feasibility of the novel AIR coil technology for imaging the brain and provides insight for future coil design improvements," concluded first author Petrice M. Cogswell of the Mayo Clinic in Rochester, Minnesota.

Lightweight and flexible with an open, ski-mask design, the emergent AIR coil technology exhibits electrical characteristics that overcome several of the limitations of traditional rigid coil designs.

Imaging a phantom and 15 healthy adult participants, Cogswell and colleagues used clinically available MRI sequences to compare their 16-channel head AIR coil with conventional 8- and 32-channel head coils. During consensus review, two board-certified neuroradiologists graded the AIR coil against the 8-channel coil and the 32-channel coil on a 5-point ordinal scale in multiple categories.

On average, the signal-to-noise ratio, structural sharpness, and overall image quality scores of the 16-channel AIR coil prototype were better than those of the 8-channel coil but not as good as those of the 32-channel coil.

Noise covariance matrices showed stable performance of the AIR coil across participants. Overall, the median g-factors for the 16-channel AIR coil were less than those of the 8-channel coil but greater than those of the 32-channel coil.

"The advantages of the AIR coil technology for reduction of claustrophobia, improved airway access and monitoring of patients under anesthesia, and overall better user comfort may be investigated in future studies," the authors of this AJR article added.

Credit: 
American Roentgen Ray Society

Wind farm noise and sleep disruption

As wind power generation becomes more important, experts in Australia are examining whether background noise from wind farm turbines can affect the sleep and wellbeing of nearby residents.

In a review of the existing literature on wind turbine noise effects on sleep, the Flinders sleep researchers weighed up the results of five prior studies. While those studies showed no systematic effects on common sleep markers such as time taken to fall asleep and total sleep time, they did reveal some more subtle effects on sleep, such as shifts in sleep stages and less time in deep sleep.

"Comparing wind turbine noise to quiet background noise conditions showed no systematic effects on the most widely used objective markers of sleep, including time taken to fall asleep, total sleep time, time spent awake during the night and time spent asleep relative to overall time in bed," says lead author Tessa Liebich of the new review paper published in the international Journal of Sleep Research.

"However, some more subtle effects on sleep in some objective studies were established including shifts in sleep stages, less time spent in deep sleep and more time spent in light sleep.

Funded by the Australian NHMRC, the Adelaide Institute for Sleep Health at Flinders is studying sleep patterns in more than 70 volunteers in a carefully controlled in-laboratory experimental study to investigate potential wind turbine noise impacts on sleep and daytime outcomes. The final results are expected to be available around mid-2021.

Senior author Dr Gorica Micic says the limited knowledge and data in this area emphasise a need for further well-controlled experimental studies to provide more conclusive evidence regarding wind turbine noise effects on sleep.

"Environmental noises, such as traffic noise, are well known to impact sleep," she says. "Given wind power generation is connected with low frequency noise that can travel long distances and more readily into buildings, it is important to better understand the potential impacts of wind turbine noise on sleep."

This study aimed to comprehensively review published evidence regarding the impact of wind turbine noise on the most widely accepted objective and subjective measures of sleep time and quality.

Subjective sleep outcomes were not sufficiently uniform across studies to combine data or make direct comparisons, the researchers explain.

"Nevertheless, the available self-report data appeared to support that insomnia severity, sleep quality and daytime sleepiness can be impacted by wind turbine noise exposure in comparison to quiet background noise.

"However, firm conclusions were difficult to draw from the available studies given inconsistent study methods, variable outcome measures and limited sample sizes," researchers conclude.

Credit: 
Flinders University

K9 chemistry: A safer way to train detection dogs

video: Scientists at NIST and the Canine Performance Sciences program at the Auburn University College of Veterinary Medicine worked together on a study that tested a new method for training dogs to detect explosives and narcotics. In this video, which shows an experimental setup similar to the one used in the NIST study, chief canine instructor Terrence Fischer sets up the test and canine instructor Jennifer Jankiewicz records the dog's responses. If the dog, a Labrador retriever named Buddy, alerts to the correct sample by sitting down next to it, he will be rewarded with his favorite toy and a little play time with another canine instructor offscreen.

Image: 
Used with permission of the Auburn University College of Veterinary Medicine.

Trained dogs are incredible chemical sensors, far better at detecting explosives, narcotics and other substances than even the most advanced technological device. But one challenge is that dogs have to be trained, and training them with real hazardous substances can be inconvenient and dangerous.

NIST scientists have been working to solve this problem using a jello-like material called polydimethylsiloxane, or PDMS for short. PDMS absorbs odors and releases them slowly over time. Enclose it in a container with an explosive or narcotic for a few weeks until it absorbs the odors, and you can then use it to safely train dogs to detect the real thing.

But a few weeks is a long time, and now NIST researchers have developed a faster way to infuse PDMS with vapors. In the journal Forensic Chemistry, they describe warming compounds found in explosives so that they release vapors more quickly, then capturing those vapors with PDMS kept at a cooler temperature, which allows it to absorb them more readily. This two-temperature method cut the time it took to "charge" PDMS training aids from a few weeks to a few days.

"That time savings can be critical," said NIST research chemist Bill MacCrehan. "If terrorists are using a new type of explosive, you don't want to wait a month for the training aids to be ready."

For this experiment, MacCrehan infused PDMS with vapors from dinitrotoluene (DNT), which is a low-level contaminant present in TNT explosives but the main odorant that dogs respond to when detecting TNT. He also infused PDMS with vapors from a small quantity of TNT. Co-authors at the Auburn University College of Veterinary Medicine then demonstrated that trained detection dogs responded to the DNT-infused PDMS training aids as if they were real TNT.

While this study focused on DNT as a proof of concept, MacCrehan says he believes the two-temperature method will also work with other explosives and with narcotics such as fentanyl. Some forms of fentanyl are so potent that inhaling a small amount can be harmful or fatal to humans and dogs. But by controlling how much vapor the PDMS absorbs, MacCrehan says, it should be possible to create safe training aids for fentanyl.

Other safe training aids already exist. Some are prepared by dissolving explosives and applying the solution to glass beads, for example. "But most have not been widely accepted in the canine detection community because their effectiveness has not been proven," said Paul Waggoner, a co-author and co-director of Auburn's Canine Performance Sciences Program. "If you put an explosive in a solvent, the dogs might actually be detecting the solvent, not the explosive."

To test the two-temperature method, MacCrehan devised a PDMS "charging station" with a hot plate on one side and a cooling plate on the other (so the "hot stays hot and the cool stays cool," as a 1980s commercial jingle put it). He prepared various samples by placing the DNT on the hot side, where the chemical was warmed to temperatures ranging from 30 to 35 degrees Celsius (86 to 95 degrees Fahrenheit) -- well below the temperature that would cause TNT to detonate. The PDMS was kept a relatively cool 20 degrees Celsius, or about room temperature, on the other side of the charging station.
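The gain from even this modest temperature split can be sketched with the Clausius-Clapeyron relation. In the hedged estimate below, the DNT sublimation enthalpy is an assumed ballpark value used purely for illustration, not a figure reported in the paper.

```python
import math

R = 8.314        # gas constant, J/(mol*K)
dH_sub = 100e3   # ASSUMED ballpark sublimation enthalpy for DNT, J/mol
T_cool = 293.15  # 20 C, the PDMS side of the charging station
T_hot = 308.15   # 35 C, upper end of the hot-plate range

# Clausius-Clapeyron: p(T_hot)/p(T_cool) = exp(-dH/R * (1/T_hot - 1/T_cool))
ratio = math.exp(-dH_sub / R * (1.0 / T_hot - 1.0 / T_cool))
print(f"Vapor pressure rises roughly {ratio:.0f}-fold")  # ~7x under these assumptions
```

Under those assumptions, warming the source by just 15 degrees Celsius raises its equilibrium vapor pressure roughly sevenfold, which is the intuition behind the split design: the hot side releases vapor faster while the cool side favors absorption into the PDMS.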

MacCrehan loaded the DNT-infused PDMS samples, which hold their charge for up to a few months, into perforated metal cans. He also loaded several cans with blanks -- PDMS samples to which no vapors were added. He labeled the cans with codes and shipped them to Auburn University.

The researchers at Auburn had trained a team of six Labrador retrievers to detect TNT using real TNT explosives. They then conducted a study to determine if the dogs would alert to the PDMS from NIST samples as if it were real TNT.

This study was "double blind": Neither the dog handlers nor the note-takers who scored the dogs' responses knew which containers underwent which preparation. This is important because dogs are keenly attuned to the body language of their handlers. If the handlers knew which samples were prepared with DNT, they might inadvertently cue the dogs with the direction of their gaze, a subtle shift in body position or some other subconscious gesture. And if the note-takers knew which samples were which, they might over-interpret the dogs' responses.
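As a minimal sketch of what such blinding can look like in practice, the snippet below assigns opaque codes to cans and keeps the key separate. The sample counts and label format are invented for illustration and are not the study's actual coding scheme.

```python
import random

# Invented sample set: which cans hold DNT-infused PDMS vs. blanks.
samples = ["DNT"] * 8 + ["blank"] * 4
random.shuffle(samples)

# Opaque three-digit codes; the mapping (the "key") stays with the
# preparing lab and is revealed only after all trials are scored.
codes = random.sample(range(100, 1000), len(samples))
key = {f"CAN-{c}": contents for c, contents in zip(codes, samples)}

blinded_labels = sorted(key)   # all that handlers and note-takers see
print(blinded_labels)
```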

The dogs alerted to all the DNT-infused PDMS samples. They did not alert to the blanks, meaning that they were responding to the DNT, not to the PDMS itself. "They responded to the samples as if they were the real thing," Waggoner said.

The dogs did not respond as consistently to PDMS that was infused with limited quantities of TNT. However, MacCrehan explains that the very small amounts of TNT he used for this purpose may not have contained sufficient amounts of DNT to fully infuse the samples.

Looking forward, MacCrehan will be experimenting with ways to safely prepare PDMS training aids for the improvised explosives TATP and HMTD. These compounds are extremely unstable and detonate easily, so having safe training aids for them will be especially useful.

MacCrehan is a laboratory chemist, not an animal behavior expert. But despite his technological orientation, he is amazed by dogs. He estimates that they are 10,000 to 100,000 times more sensitive than the most sophisticated analytical instruments. "We are nowhere near having a hand-held gizmo that can do what they do," he said.

Credit: 
National Institute of Standards and Technology (NIST)

Oral drug blocks SARS-CoV-2 transmission, Georgia State biomedical sciences researchers find

image: Dr. Richard Plemper, Distinguished University Professor in the Institute for Biomedical Sciences at Georgia State University

Image: 
Georgia State University

ATLANTA--Treatment of SARS-CoV-2 infection with a new antiviral drug, MK-4482/EIDD-2801 or Molnupiravir, completely suppresses virus transmission within 24 hours, researchers in the Institute for Biomedical Sciences at Georgia State University have discovered.

The group led by Dr. Richard Plemper, Distinguished University Professor at Georgia State, originally discovered that the drug is potent against influenza viruses.

"This is the first demonstration of an orally available drug to rapidly block SARS-CoV-2 transmission," said Plemper. "MK-4482/EIDD-2801 could be game-changing."

Interrupting widespread community transmission of SARS-CoV-2 until mass vaccination is available is paramount to managing COVID-19 and mitigating the catastrophic consequences of the pandemic.

Because the drug can be taken by mouth, treatment can be started early for a potential three-fold benefit: inhibiting patients' progression to severe disease, shortening the infectious phase to ease the emotional and socioeconomic toll of prolonged patient isolation, and rapidly silencing local outbreaks.

"We noted early on that MK-4482/EIDD-2801 has broad-spectrum activity against respiratory RNA viruses and that treating infected animals by mouth with the drug lowers the amount of shed viral particles by several orders of magnitude, dramatically reducing transmission," said Plemper. "These properties made MK-4482/EIDD/2801 a powerful candidate for pharmacologic control of COVID-19."

In the study published in Nature Microbiology, Plemper's team repurposed MK-4482/EIDD-2801 against SARS-CoV-2 and used a ferret model to test the effect of the drug on halting virus spread.

"We believe ferrets are a relevant transmission model because they readily spread SARS-CoV-2, but mostly do not develop severe disease, which closely resembles SARS-CoV-2 spread in young adults," said Dr. Robert Cox, a postdoctoral fellow in the Plemper group and a co-lead author of the study.

The researchers infected ferrets with SARS-CoV-2 and initiated treatment with MK-4482/EIDD-2801 when the animals started to shed virus from the nose.

"When we co-housed those infected and then treated source animals with untreated contact ferrets in the same cage, none of the contacts became infected," said Josef Wolf, a doctoral student in the Plemper lab and co-lead author of the study. By comparison, all contacts of source ferrets that had received placebo became infected.

If these ferret-based data translate to humans, COVID-19 patients treated with the drug could become non-infectious within 24 hours after the beginning of treatment.

MK-4482/EIDD-2801 is in advanced phase II/III clinical trials against SARS-CoV-2 infection.

Credit: 
Georgia State University

Gut microbiome snapshot could reveal chemical exposures in children

Researchers at Duke University have completed the most comprehensive study to date on how a class of persistent pollutants called semi-volatile organic compounds (SVOCs) are associated with the gut microbiome in human children.

The results show that certain SVOCs are correlated with the abundance of bacterial and fungal species living in the human digestive tract and may affect them differently, providing a potential mechanism for measuring exposure to a wide variety of these substances. The study also suggests that exposure to toxic halogenated compounds, chemicals containing carbon and a halogen such as chlorine and bromine, may create a niche for bacteria that feed off of them - bacteria that are not usually found in the human gut.

"We found bacteria that researchers use for soil bioremediation to remove chlorinated solvents, which is not an organism that you would expect to find in somebody's gut," said Claudia Gunsch, the Theodore Kennedy Professor of Civil and Environmental Engineering at Duke. "The reason it's used in soils is to detoxify and remove chlorines, which suggests that maybe that's also exactly why they're in these guts."

The results appear online on October 30 in the journal Environmental Science & Technology Letters.

"We want to understand the impacts of exposure to SVOCs on our gut microbiome and how that translates to positive or negative health outcomes," Gunsch said. "But right now it's a big black box that we don't understand."

SVOCs are a broad class of odorless chemicals that are emitted from building materials and consumer products, often slowly evaporating and settling on dust particles and water droplets. Almost everyone in the developed world is exposed regularly to at least some of these compounds, due to their common use in industrial and consumer products.

The thought that these chemicals might have effects on the human microbiome and impact health is relatively new, and the research to uncover what these may be and why they occur is still in its infancy. One important line of work is aimed at children because they typically have higher exposure rates, due to spending more time playing on dusty floors where SVOCs accumulate, and because their growing bodies are more susceptible to novel environmental stressors.

One way these chemicals could disrupt a growing child's development is by affecting the gut microbiome. Made up of the complex communities of bacteria and fungi living together throughout the human digestive tract, the gut microbiome has been shown to have a clear importance to childhood development as well as adult health. While some studies have already shown that certain SVOCs have an impact on the gut microbiome of children, the chemicals studied are just a tiny fraction of those that people are exposed to.

"In theory, perturbations in the gut microbiome of children might be associated with long-term health impacts," added Courtney Gardner, assistant professor of civil and environmental engineering at Washington State University, who conducted the study while still a member of Gunsch's laboratory. "But before we can really study any of them for clear causations, we need to get a sense of which SVOC classes seem to be the most negatively associated with microbiome communities."

In their first explorative foray into this field, Gunsch, Gardner and their colleagues at Duke measured the levels of dozens of SVOCs circulating in the bodies of almost 80 children between the ages of three and six. They also characterized each child's gut microbiome and then looked for relationships between the differences they found and exposures to SVOCs.

There wasn't any shortage of data to work with, as the researchers found 29 SVOC compounds in more than 95% of the samples taken. They also found relationships between the compounds present in children's blood or urine and the relative amounts of key microbes, including 61 bacteria and 24 fungi. After working through the various biomarkers and relationships, the researchers came away with two interesting insights.

The first was that children with high levels of halogenated SVOCs have some unusual guests in their guts.

The researchers also found that while some SVOCs had a negative effect on bacteria in the gut microbiome, others had a positive effect. With more research into exactly how these various chemicals affect the different species of the gut in their own ways, this work may provide the possibility of using a snapshot of the gut's microbial community as a window into what SVOCs a child has been exposed to.
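In effect, the screen described above is a large SVOC-by-microbe correlation matrix with multiple-testing control. The sketch below shows one plausible way to run such a screen on synthetic data; the Spearman-plus-FDR choice, variable names, and data shapes are illustrative assumptions, not the authors' published pipeline.

```python
import numpy as np
from scipy.stats import spearmanr
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
n_children, n_svocs, n_taxa = 80, 29, 85   # ~61 bacteria + 24 fungi

# Synthetic stand-ins for the real measurements:
svocs = rng.lognormal(size=(n_children, n_svocs))        # blood/urine levels
taxa = rng.dirichlet(np.ones(n_taxa), size=n_children)   # relative abundances

# Correlate every SVOC with every taxon across children.
pvals = []
for i in range(n_svocs):
    for j in range(n_taxa):
        rho, p = spearmanr(svocs[:, i], taxa[:, j])
        pvals.append(p)

# Benjamini-Hochberg false-discovery-rate control across all tests.
reject, _, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(f"{reject.sum()} of {len(pvals)} associations pass FDR < 0.05")
```

Rank-based correlations such as Spearman's are a common choice for this kind of screen because relative abundances and biomarker concentrations are rarely normally distributed, though the study itself may have used different statistics.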

"It's currently really complicated and expensive to measure what chemicals people have been exposed to if you don't already know what you're looking for," said Gunsch. "By contrast, this is pretty simple. If we could get a reliable snapshot of SVOC exposure just by sequencing a microbiome's genetic signature, we could use that to help us understand more about the health impacts these chemicals have on our children and ourselves."

Credit: 
Duke University