
UMD develops technology allowing researchers to image wetland soil activity in real time

In work featured on the cover of the Soil Science Society of America Journal, researchers at the University of Maryland (UMD) and the Spanish National Research Council partnered to create a new camera that images wetland soil activity in real time. This camera gives the classic IRIS (indicator of reduction in soils) technology a major upgrade. IRIS is used universally by researchers and soil assessors to determine whether soils are behaving like wetland soils and should therefore be classified as such. Before this new camera, however, soil assessors couldn't quantify the rate of iron reduction in saturated wetland soils, and researchers had no way to visualize the process in real time. This technology opens up new research avenues in soil science and gives a compelling peek at how biochemically active wetland soils really are.

"The interest in this paper has been really amazing, although it wasn't initially why I created the camera," says Brian Scott, doctoral candidate in Environmental Science and Technology at UMD. "The paper shows that this camera really works, but what interested people was the real time imaging and rates of iron reduction in wetland soils. But to be honest, the real reason I did it wasn't for the practical reason of calculating rates. It was more about trying to explore ways to visualize what's happening in the environment. I study soils, and everything is underground. So I developed this method to look at what is actually happening under the surface, which is really exciting to me."

"There are three major parameters needed to classify an area as a wetland: hydrology or water, the plant community, and soil properties," adds Martin Rabenhorst, esteemed soil scientist, professor in Environmental Science and Technology at UMD, and co-author on this paper. "These are all critical because wetlands are highly regulated and protected ecosystems. The soil is perhaps the most complicated piece of the puzzle because you have to confirm that certain biogeochemical processes are actually happening below ground where they are not easily seen."

Rabenhorst himself invented a greener version of IRIS, a technology used to measure the amount of iron reduction occurring in soils. The technology uses iron-oxide coatings on plastic tubes or films that are pushed into the soil and left for 30 days so the soil can react with the paint. As these reactions occur, the paint is partially dissolved from the tube. If 30 percent or more of the paint is stripped off, the soil is behaving like typical wetland soil.

"This is really because of the biochemistry of microorganisms in the soil," explains Scott. "The organisms I study breathe iron the same way we breathe oxygen. These microorganisms are anaerobic because they thrive in environments without oxygen and need the iron to respire. Oxygen is toxic to them, so they live in wetlands where the soil is often saturated with water and less oxygen rich. These organisms are so prevalent in wetland soils that they are the basis for our testing to see if a soil is hydric. IRIS testing has therefore become a focal point for biogeochemists that study wetlands."

While this technology has the potential to lead scientists down all sorts of new research avenues, it is unclear whether it might lead to improvements down the road for the typical soil assessor using classic IRIS technology.

But as Scott describes it, the real findings of the paper lie in the methods used to create this camera, which he says anyone can now reproduce for about $100. He converted a borescope camera, the kind plumbers and other industry professionals use to image down pipes, and coupled it with a wireless system, powered by just a small solar panel, that sends information in real time so he can see what is happening 24-7. "Some of the things that are the most important for this paper weren't really the findings; it was the process of development that opens up new applications and research avenues which is really exciting," says Scott.
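To give a sense of how such footage could be turned into numbers, the sketch below estimates the percentage of iron-oxide paint removed in a series of flattened IRIS images and fits a rough removal rate. It is only an illustration of the idea, not the published analysis pipeline; the file names, grayscale threshold and daily sampling interval are all assumptions.

```python
# Hypothetical sketch: estimate the fraction of iron-oxide paint removed from a
# series of flattened IRIS images, then fit a simple removal rate over time.
# File names, the grayscale threshold and the daily time step are assumptions.
from PIL import Image
import numpy as np

FRAME_FILES = [f"iris_day{d:02d}.png" for d in range(0, 31)]  # one image per day (hypothetical)
PAINT_THRESHOLD = 120  # grayscale value at or above which a pixel is counted as stripped

def percent_paint_removed(path: str) -> float:
    """Return the percentage of pixels where the iron-oxide coating appears stripped."""
    gray = np.asarray(Image.open(path).convert("L"), dtype=float)
    removed = gray >= PAINT_THRESHOLD  # bare film shows up brighter than the paint
    return 100.0 * removed.mean()

days = np.arange(len(FRAME_FILES))
removal = np.array([percent_paint_removed(f) for f in FRAME_FILES])

# Crude reduction rate: slope of a least-squares line through percent removed vs. time.
rate_per_day, intercept = np.polyfit(days, removal, 1)
print(f"Final removal: {removal[-1]:.1f}%  (>= 30% suggests hydric-soil behaviour)")
print(f"Approximate removal rate: {rate_per_day:.2f}% per day")
```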

The idea came to Scott while he was volunteering in Osvaldo Sala's lab at Arizona State University using a machine called a mini rhizotron that is used to count tree roots with a camera through a hollow tube in the ground. Scott thought, "If we can take pictures of roots, we should be able to take pictures of other things underground." So eventually, when Scott came to Maryland to pursue his PhD and started working with Rabenhorst, things all fell into place. The process, however, was not without its challenges.

"Once somebody has gone through this whole long process of how to make something work, then we can do it again and again easily, but it takes a long time to figure it out," says Scott. "It took a long time to figure out how to make this camera work, and I ran into roadblocks where I almost quit if it weren't for other people's input and ideas."

Scott particularly calls out a few people along the way that helped keep this process moving. An undergraduate assistant, Kristin Webb, helped sketch out the initial designs for the camera. Another undergraduate in Environmental Science and Technology and recent graduate, William Jacob Mast, helped design and print the camera shell using a 3-D printer. And Spanish collaborators at the Spanish National Research Council had a similar idea simultaneously and helped find ways to convert the video imagery into flat images that could be analyzed.

Scott stresses the importance of collaborative science throughout this process, and wants to make this technology available to others so that it can advance science and ultimately environmental health. "I'm not interested in patenting this particular technology because I want the science to benefit everyone," explains Scott. "It's not about money with this, it's about the impact for the environment. I spent my own money actually to help make sure that this could get built along with the support of the department, and I think that if it works, and if it helps another scientist make some even greater discovery, then that's worth it. It's about helping the world we live in."

Scott is pleased to be able to contribute to soil science and focus on the restoration of critical ecosystems like wetlands. "I was an environmental engineer for years, so I have an interest in taking care of the environment, and a lot of what environmental engineers do is clean up messes," says Scott. "Everything I do now is related to ecosystem recovery and restoration. I used to clean up messes, but it's a different animal to actually take ecosystems back to their former glory and restore their ecological functioning."

Credit: 
University of Maryland

Though risk is minuscule, infection after COVID-19 vaccination is possible

In a letter to The New England Journal of Medicine, published online March 23, 2021, a group of investigators from University of California San Diego School of Medicine and the David Geffen School of Medicine at UCLA report COVID-19 infection rates for a cohort of health care workers previously vaccinated for the novel coronavirus.

"Because of the compulsory daily symptom screening of health care personnel, patients, and visitors, and the high testing capacity at both UC San Diego Health and UCLA Health, we were able to identify symptomatic and asymptomatic infections among health care workers at our institutions," said co-author Jocelyn Keehner MD, an infectious disease fellow at UC San Diego School of Medicine.

"Moreover, we were able to describe the infection rates in a real-world scenario, where vaccine roll-out coincided with a surge of infections. We observed a low overall positivity rate among fully immunized health care workers, supporting the high protection rates of these vaccines."

The authors looked at pooled data from UC San Diego and UCLA health care workers who received either the Pfizer or Moderna vaccines between December 16, 2020 and February 9, 2021 (36,659 first doses, 28,184 second doses), a time period that coincided with a significant surge in COVID-19 infections in the region.

Within this group, 379 individuals tested positive for SARS-CoV-2 at least one day following vaccination, with the majority (71 percent) testing positive within the first two weeks after the first dose. Thirty-seven health care workers tested positive after receiving two doses, which is when maximum immune protection is expected to be achieved with both vaccines.

The authors estimated that the absolute risk of testing positive for SARS-CoV-2 following vaccination was 1.19 percent for health care workers at UC San Diego Health and 0.97 percent at UCLA Health, both higher than the risk identified in the Moderna and Pfizer clinical trials, which were not limited to health care workers.
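The arithmetic behind those absolute-risk figures is straightforward; the sketch below reproduces it under the assumption that risk is simply the number of post-vaccination positives divided by the number of people vaccinated. Only the pooled totals are given above, so the pooled estimate is the only one that can be recomputed here.

```python
# Minimal sketch of the absolute-risk calculation, assuming risk = positives / vaccinated.
# Per-site denominators are not restated in the text, so only the pooled figure is computed.
def absolute_risk(positives: int, vaccinated: int) -> float:
    """Absolute risk of testing positive, expressed as a percentage."""
    return 100.0 * positives / vaccinated

pooled = absolute_risk(379, 36_659)  # 379 positives among 36,659 first-dose recipients
print(f"Pooled absolute risk: {pooled:.2f}%")  # ~1.03%, between the site estimates of 0.97% and 1.19%
```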

"There are several possible explanations for this elevated risk," said co-author Lucy E. Horton, MD, MPH, associate professor in the Division of Infectious Diseases and Global Public Health at UC San Diego School of Medicine and medical director of the UC San Diego Health Contact Tracing Unit.

"First, the health care workers surveyed have access to regular asymptomatic and symptomatic testing. Second, there was a regional surge in infections overlapping with vaccination campaigns during this time period. And third, there are differences in the demographics of health care workers compared to participants in the vaccine clinical trials. Health care workers tend to be younger and have a greater overall risk of exposure to SARS-CoV-2 in the community."

Increased rates of infection have been strongly linked to behaviors that heighten risk of exposure, such as attending social gatherings in restaurants and bars without adequate masking and physical distancing, behaviors that are more common among younger age groups.

Additionally, noted co-author Michael A. Pfeffer, MD, assistant vice chancellor and chief information officer at UCLA Health, the Moderna and Pfizer clinical trials stopped collecting data before the December-February surge and there was little to no asymptomatic testing conducted.

The authors found that infection occurring 14 days or more after the second dose, when maximum immunity is expected to be reached, was rare. "It suggests the efficacy of these vaccines is maintained outside of the trial setting," they wrote.

Nonetheless, they also noted that risk is not zero. While both Pfizer and Moderna report efficacy levels in the mid-90s, neither is 100 percent.

"It underscores the critical importance of continued public health mitigation measures (masking, physical distancing, daily symptom screening and regular testing), even in highly vaccinated environments, until herd immunity is reached at large," said corresponding author Francesca Torriani, MD, professor of clinical medicine in the Division of Infectious Diseases and Global Public Health in the UC San Diego School of Medicine and program director of Infection Prevention and Clinical Epidemiology at UC San Diego Health.

Credit: 
University of California - San Diego

Rodenticides in the environment pose threats to birds of prey

image: Goshawk in Berlin, Germany

Image: 
Oliver Krone/Leibniz-IZW

Over the past decades, the increased use of chemicals in many areas has led to environmental pollution of water, soil and wildlife. In addition to plant protection substances and human and veterinary medical drugs, rodenticides have had toxic effects on wildlife. A new scientific investigation by scientists of the Leibniz Institute for Zoo and Wildlife Research (Leibniz-IZW), the Julius Kühn Institute (JKI) and the German Environment Agency (Umweltbundesamt - UBA) demonstrates that these substances are widely found in the liver tissues of birds of prey from Germany. Anticoagulant rodenticides, commonly used to kill rodents in agriculture and forestry, were frequently detected, particularly in birds of prey close to or in urban environments. Northern goshawks in the urban conurbation of Berlin and red kites in all habitats were especially frequently exposed to rodenticides. Evidence of rodenticides in white-tailed sea eagles demonstrated that scavengers occupying habitats more distant from human-modified landscapes are subject to exposure as well. The results, which were supported by WWF Germany, are published in the scientific journal Environmental Research.

Europe's bird populations are currently experiencing a substantial decline. Among the drivers of this decline are continued urbanisation, growing intensification of agriculture, the massive decline of insect populations and the chemical pollution linked to these forms of land use. "Raptors are known to be particularly sensitive to bioaccumulating pollutants", says Oliver Krone, bird of prey specialist at the Leibniz-IZW Department of Wildlife Diseases. Together with doctoral student Alexander Badry from Leibniz-IZW and colleagues Detlef Schenke from JKI and Gabriele Treu from UBA, he analysed in detail which substances are detectable in deceased red kites (Milvus milvus), northern goshawks (Accipiter gentilis), Eurasian sparrowhawks (Accipiter nisus), white-tailed sea eagles (Haliaeetus albicilla) and ospreys (Pandion haliaetus). The team analysed carcasses collected between 1996 and 2018.

"We found rodenticide residues in liver tissues of more than 80 percent of the northern goshawks and red kites which we examined", says lead author Badry. In total, 18 percent of the northern goshawks and 14 percent of the red kites exceeded the threshold level of 200 ng per gram body mass for acute toxic effects. This is expected to contribute to previously reported declines in survival of red kites in Germany. "In white-tailed sea eagles we found rodenticides in almost 40 percent of our samples, at lower concentrations, whereas exposure in sparrowhawks and ospreys was low or zero." Overall, more than 50 percent of the birds had rodenticide levels in their liver tissue, about 30% had combinations of more than one of these substances.

"Rodenticide poisoning represents an important cause of death for birds of prey", Badry and Krone conclude. "Species that facultatively scavenge have shown to be at high risk for rodenticide exposure." The application of these pesticides is not restricted to agricultural contexts, such as barns and stables or for controlling common vole populations on arable land. Anticoagulant rodenticides are also frequently used in large-scale forest plantations and in the sewage systems and canals of towns and cities to control rodent populations. The results of the analyses demonstrated that the closer a dead bird was found to urban landscapes such as industrial areas and the urban conurbation, the more likely it was exposed to rodenticides. "It seems that urban areas pose a great risk for birds of prey in terms of exposure to rodenticides, although the extent of exposure was not linked to the urban gradient", the authors explain. "This means that birds of prey are more likely to be exposed to rodenticides in the vicinity or inside urban areas but it does not automatically mean that more of these substances accumulate." Species-specific traits such as facultative scavenging on small mammals or foraging on birds that have direct access to rodenticide bait boxes seem to be responsible for the extent of exposure rather than urban habitat use as such. Additionally, accumulation takes place through multiple exposures throughout the life of an individual, which is why adults were more likely to be exposed than juvenile birds.

In addition to rodenticides, the scientists also detected medical drugs such as ibuprofen (14.3 percent) and fluoroquinolones (2.3 percent) in the bird of prey carcasses. Among the plant protection products, they detected the insecticide dimethoate, which was approved for use until 2019, and its metabolite omethoate, as well as the neonicotinoid thiacloprid, approved for use until 2021, in four red kites. The scientists assume that the levels of dimethoate they found were a consequence of deliberate poisoning. The traces of thiacloprid - a substance with a very short half-life in bird organs - point to an exposure shortly before death.

The results of these analyses clearly show that especially rodenticides and deliberate poisoning pose a threat to birds of prey, the authors conclude. This is true both for raptors living in or near urban habitats and facultative scavengers. Known sources of these substances need to be re-evaluated in terms of their effects along the food chain, i.e. in terms of secondary poisoning and potential toxicity to birds of prey. Furthermore, the levels of rodenticides found in white-tailed sea eagles, which do not usually feed on the species that the rodenticides target, indicate that further research on the sources is needed.

Credit: 
Leibniz Institute for Zoo and Wildlife Research (IZW)

Union-friendly states enjoy higher economic growth, individual earnings

ITHACA, N.Y. - New research from Mildred Warner, professor of city and regional planning at Cornell University, shows that state laws designed to hinder union activity and favor corporate interests do not enhance economic productivity.

"We find that where state policy is captured by corporate interests, this undermines inclusive growth," Warner said. "These interests see union and city power as a threat, which is why there are groups like the American Legislative Exchange Council, for example, focused on crafting state laws that erode labor protections and enhance corporate interests."

The paper, "Productivity Divergence: State Policy, Corporate Capture and Labor Power," written with co-author Yuanshuo Xu, assistant professor at Zhejiang University, Hangzhou, China, published Jan. 29 in the Cambridge Journal of Regions, Economy and Society.

Warner and Xu assembled models for all counties throughout the U.S. and found labor returns (how much money people earn) are higher in states with higher unionization, and lower in states where legislation is more captured by corporate interests, she said.

"The anti-union political environment in the U.S. is longstanding," Warner said, "especially in the South, as reflected by right-to-work laws by constraining unions' ability to organize and collect dues."

Unionization rates in the U.S. have declined for decades. "Unionization is highest in the public sector, but this has been challenged by state and local austerity since the recession in 2008-09," Warner said.

Warner said that the role of the federal government is to provide funds to states and local governments to support critical public services, such as schools and roads. A good example of this is the recent $1.9 trillion COVID-19 economic stimulus package, signed into law March 11 by President Joe Biden.

"While the federal government can play a redistributive role, as with the recent COVID relief package, this is less likely in states that have more corporate influence in their legislative policymaking," said Warner. "This suggests the key to inclusive growth may rest with more balanced power between corporate and labor interests at the state level."

While the coalition between corporate interests and state legislatures is aimed at taming city-regions and reducing labor's collective bargaining power, Warner said, "In the new political economy of place, the corporate interests undermine the potential for inclusive economic growth."

Credit: 
Cornell University

With drop in LA's vehicular aerosol pollution, vegetation emerges as major source

image: In 2018, organic aerosols made up about 23% of the aerosol pollutants in Los Angeles (blue on pie chart), a large portion of which is due to chemicals emitted by plants. The right chart shows how aerosol concentrations in LA have declined over time, leveling out around 2012.

Image: 
UC Berkeley graphics by Clara Nussbaumer and Ronald Cohen

California's restrictions on vehicle emissions have been so effective that in at least one urban area, Los Angeles, the most concerning source of dangerous aerosol pollution may well be trees and other green plants, according to a new study by University of California, Berkeley, chemists.

Aerosols -- particles of hydrocarbons referred to as PM2.5 because they are smaller than 2.5 microns in diameter and easily lodge in the lungs -- are proven to cause cardiovascular and respiratory problems.

As a result of strict vehicle emissions laws, organic aerosol levels have been significantly reduced throughout the United States, but the drop has been particularly dramatic in Los Angeles, which started out at a higher level.

Based on pollution measurements over the past 20 years, the UC Berkeley scientists found that concentrations of PM2.5 in the Los Angeles basin in 2012 were half what they were in 1999. As a result, from 2016 to 2018, there were almost no PM2.5 violations in the area when temperatures were low, below 68 degrees Fahrenheit. But at warmer temperatures, aerosol concentrations rose -- over the same time period, 70% to 80% of days over 100 F exceeded the National Ambient Air Quality Standard (NAAQS) threshold.

"The positive news is that, where we did understand the source and we took action, that action has been incredibly effective," said Ronald Cohen, an atmospheric chemist and UC Berkeley professor of chemistry. "Twenty years ago, just about every day in LA was in violation of a health-based standard. And now it is only the hot days."

As vehicle organic chemicals -- carcinogenic compounds such as benzene and toluene -- dropped, air quality experts focused on other potential sources of aerosols in those cities with unhealthful levels. Many researchers believe that personal care and household cleaning products -- some seemingly as benign as the citrus scent limonene -- may be the culprit. Given the temperature dependence of aerosol levels in Los Angeles, Cohen doubts that.

"There is a growing consensus that, as cars became unimportant, household chemicals are dominating the source of organics to the atmosphere and, therefore, dominating the source of aerosols," he said. "I am saying that I don't understand how aerosols from these chemicals could be temperature-dependent, and, therefore, I think it is likely something else. And trees are a good candidate."

Plants are known to release more organic chemicals as the temperature rises, and in many forested areas trees are the source of organic chemicals that combine with human-produced nitrogen oxides to form aerosol. President Ronald Reagan was partially correct when he infamously stated in 1981 that "Trees cause more pollution than automobiles do." At the time, scientists were learning about the role of forests surrounding Atlanta in causing that city's air pollution.

Cohen and former Berkeley master's degree student Clara Nussbaumer reviewed organic chemical emissions from various plants known to grow or be cultivated in the Los Angeles area and found that some, such as the city's iconic Mexican fan palms, produce lots of volatile organic compounds. Oak trees are also high emitters of organic chemicals.

They estimated that, on average, 25% of the aerosols in the Los Angeles basin come from vegetation, which includes an estimated 18 million or more trees.

Plant-derived aerosols are likely made of the chemical isoprene -- the building block of rubber -- or plant chemicals such as terpenes, which consist of two or more isoprene building blocks combined to form a more complex molecule. Cohen says that PM2.5 aerosols can be thought of "as little tiny beads of candle wax," with plant-derived aerosols composed of many molecules of isoprene and terpenes, which are found in pine tree resins.

"I am not suggesting that we get rid of plants, but I want people who are thinking about large-scale planting to pick the right trees," he said. "They should pick low-emitting trees instead of high-emitting trees."

The research was described this month in the journal Environmental Science & Technology.

How does global warming affect pollutants?

Cohen, who has studied the temperature dependence of urban ozone levels for insight into the impact climate change will have on pollutants, decided two years ago to investigate the temperature dependence of ozone and aerosol pollution in five counties in the Los Angeles basin: Los Angeles, San Bernardino, Riverside, Orange and Ventura. He and Nussbaumer looked at data from 22 measurement sites across the basin -- eight in LA County, two in Orange County, five in Riverside County, four in San Bernardino County, and three in Ventura County -- to study aerosols, and at four sites -- three in LA, one in San Bernardino -- to study ozone.

The researchers found that at the beginning of the 21st century, the relationship between temperature and aerosol pollution was quite varied: if the temperature went up, sometimes PM2.5 concentrations would increase a lot, sometimes a little. Today, the relationship is more linear: If the temperature goes up a degree, PM2.5 concentrations predictably increase by a set amount.
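A rough sense of what "more linear" means in practice: regress daily PM2.5 against temperature and read off a single slope. The values below are invented for illustration only; the study itself drew on two decades of measurements from 22 monitoring sites in the basin.

```python
# Minimal sketch of a temperature-dependence fit for PM2.5 (hypothetical data,
# not the study's measurements): a least-squares line gives one slope relating
# aerosol concentration to temperature.
import numpy as np

temp_f = np.array([60, 68, 75, 82, 90, 95, 100, 105], dtype=float)  # hypothetical daily highs (F)
pm25 = np.array([7.5, 8.9, 10.8, 12.6, 14.9, 16.2, 17.8, 19.1])     # hypothetical PM2.5 (ug/m^3)

slope, intercept = np.polyfit(temp_f, pm25, 1)
print(f"Sensitivity: {slope:.2f} ug/m^3 of PM2.5 per degree F")
print(f"Predicted PM2.5 at 100 F: {slope * 100 + intercept:.1f} ug/m^3")
```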

Cohen and Nussbaumer focused primarily on secondary organic aerosols (SOA), which form as particles when gaseous pollutants -- primarily nitrogen oxides (NOx) and volatile organic compounds (VOCs) -- react with sunlight. The same conditions produce ozone.

Using a simple atmospheric model, they concluded that both regulated chemicals from vehicle exhaust and cooking -- primary organic aerosols such as benzene, toluene, ethylbenzene and xylene -- and isoprene from plants were precursors of the majority of the organic aerosols observed. Their model suggests that about a quarter of the SOA in the LA Basin are formed by isoprene or other very similar compounds, and that these represent most of the temperature-dependent increase. While there is evidence that some temperature-dependent VOCs have been controlled over time, such as those from evaporation of gasoline, isoprene is not one of them.

Cohen noted that as electric car use increases, the importance of organic aerosols from vegetation will become more dominant, requiring mitigation measures to keep levels within regulatory limits during heat waves.

"Cars are also contributing to ozone, and in the LA basin the ozone level is also high, at high temperatures and for the same reason: There are more organic molecules to drive the chemistry when it is hot ," Cohen said. "We want some strategy for thinking about which plants might emit fewer hydrocarbons as it gets hot or what other emissions we could control that prevent the formation of aerosols."

Cohen hopes to look at data from other urban areas, including the San Francisco Bay Area, to see if the temperature-dependent aerosols now dominate, and whether vegetation is the culprit.

The study was funded in part by a grant (NA18OAR4310117) from the National Oceanic and Atmospheric Administration (NOAA). Cohen and Allen Goldstein, a UC Berkeley professor of environmental science, policy and management and of civil and environmental engineering, have also partnered with NOAA scientists and the state and local air quality agencies on an experiment to observe emissions in Los Angeles at different temperatures. Combining these different observing strategies in the LA Basin, Cohen hopes, "will lead to better ideas for reducing high ozone and aerosol events in the basin, ones that can then be used as a guide in other major cities suffering from poor air quality."

Credit: 
University of California - Berkeley

Researchers hunt for drugs that keep HIV latent

When the human immunodeficiency virus infects cells, it can either exploit the cells to start making more copies of itself or remain dormant--a phenomenon called latency. Keeping these reservoirs latent is a challenge. A new paper, published in the Proceedings of the National Academy of Sciences, has found a way to look for chemicals that can keep the virus suppressed into its dormant state.

"The current drug treatments block healthy cells from becoming infected by the virus," said Yiyang Lu, a PhD student in the Dar lab at the University of Illinois Urbana-Champaign. "The latent reservoir poses a bigger problem because it can start producing the virus at any time. Consequently, patients have to remain on antiretroviral therapy all their lives to prevent a viral rebound."

So far, there are two types of drug treatment strategies: shock-and-kill, in which latently infected cells are reactivated and die as a result of HIV while a second drug cocktail prevents other cells from becoming infected; and block-and-lock, which forces the virus into a deep latent state so that it does not reactivate again. The problem with the first approach is that there are always some leftover reservoirs that do not get activated. The problem with the second approach, which the researchers are trying to solve, is that few such drugs have been discovered.

Since the transition out of latency occurs randomly, measuring the fluctuations in gene expression can provide more coverage than measuring average gene expression alone. "Commercial drug screens usually look at mean gene expression. Instead, we used a drug screen that looks at fluctuations in gene expression. Our screen allowed us to therefore find more compounds that could have been overlooked," Lu said.

"We implemented a time-series drug screening approach that are less commonly used in other labs," said Roy Dar, an assistant professor of bioengineering at Illinois and faculty member of the Carl R. Woese Institute for Genomic Biology. The researchers used a T- cell population, which is a reservoir for HIV, that had been infected by the virus. They imaged the cells in 15-minute intervals for 48 hours and tested over 1800 compounds. They looked at noise maps to identify which drugs can modulate the gene expression.

Using the screen, they were able to find five new latency-promoting chemicals, raising the possibility that similar screens can be successfully adapted to study other systems that exhibit variability in gene expression, such as cancer. They are currently working on understanding how the five novel drugs suppress viral reactivation. "We want to test if these drugs have off-target effects in terms of how many other genes they affect in the host cells," Dar said. "We also want to test these drugs in patient samples to see whether these drugs suppress HIV in them."

Credit: 
Carl R. Woese Institute for Genomic Biology, University of Illinois at Urbana-Champaign

Study: Precautions used to prevent COVID-19 decreased common respiratory illness rates

Boston - Wearing masks and physical distancing - two key infection prevention strategies implemented to stop the spread of COVID-19 - may have led to the dramatic decrease in rates of common respiratory viral infections, such as influenza. A study led by researchers at Boston Medical Center (BMC) showed an approximately 80 percent reduction in cases of influenza and other common viral respiratory infections when compared to similar time periods in previous years, before wearing masks, physical distancing, and school closures were implemented to help stop the spread of COVID-19. Published online in Open Forum Infectious Diseases, these results suggest that public health measures used to prevent COVID-19 transmission could be useful in helping prevent other respiratory viral infections.

"We know viruses that cause the common cold and pneumonia are spread through close contact, aerosols and/or droplets, which is why we decided to look into how the measures implemented to prevent the spread of COVID-19 may have impacted the incidence of other common viral respiratory illnesses," said Manish Sagar, MD, an infectious diseases physician and researcher at BMC and the study's corresponding author.

In this retrospective cohort analysis, the researchers analyzed all (inpatient and outpatient) documented respiratory viral infections at BMC for certain time periods between January 1, 2015 and November 25, 2020. These infections were diagnosed using a comprehensive respiratory panel polymerase chain-reaction test, which screens for 20 common respiratory pathogens, and positive results were recorded. Positive and negative results for SARS-CoV-2 tests were excluded from the study given the focus on other common respiratory illnesses prior to the COVID-19 pandemic.

The year 2020 was divided into two specific periods. The first, referred to as Period 1, represents the time before the implementation of mask wearing, physical distancing and school closures (Weeks 1-10, Jan. 1 - March 10, 2020). The second, referred to as Period 2, represents the time after the implementation of these practices to stop the transmission of COVID-19 (Weeks 11-46, March 11 - Nov. 25, 2020). The researchers analyzed the number of viral infections during periods 1 and 2 for 2015-2019 and compared them with the 2020 results.

In 2020 period 2, after the implementation of measures to stop COVID-19, newly detected respiratory virus infections were approximately 80 percent lower than during the same time period from 2015 to 2019. In contrast, in 2020 period 1, before the COVID-19 prevention measures, there were more respiratory virus infections than in 2015 to 2019. Additionally, the phased re-opening in Boston, which occurred around July 20, 2020, was associated with an increase in the detection of rhinovirus infections.

"Our study results may be particularly helpful for developing prevention strategies in settings where respiratory infections are very harmful, such as congregate settings and for the elderly and immunosuppressed," added Sagar, an associate professor of medicine and microbiology at Boston University School of Medicine.

Credit: 
Boston Medical Center

Curbing COVID-19 on campuses nationwide

While COVID-19 cases may be on the decline, the virus is still prevalent nationwide, and higher education institutions need to prepare for a successful 2021 academic year. New research from Clemson University in The Lancet Child & Adolescent Health, one of the world's premier peer-reviewed general medical journals, indicates how surveillance-based informative testing (SBIT) mitigates the spread of COVID-19 on campus, paving the way for other institutions, even those without the infrastructure or funding for mass-scale testing.

SBIT was implemented during the first two weeks of the Fall semester at Clemson. According to the study, random surveillance testing to identify outbreaks in residence halls, combined with targeted follow-up testing, was twice as likely to detect a positive case as simple random testing. In the absence of SBIT, transmission models developed by the research team show COVID-19 cases would have increased by 154 percent.

"By focusing on residential hotspots, our SBIT strategy identified and contained outbreaks throughout campus. This strategy made efficient use of our resources, detecting positive cases at twice the rate of simple random testing," said Lior Rennert, assistant professor with the College of Behavioral, Social and Health Sciences.

The study is the first to document the implementation, results and relative effectiveness of such a strategy in detecting and containing COVID-19 outbreaks and mitigating spread on university campuses. Most universities implemented voluntary testing, leaving many cases undetected and contributing to an increase of COVID-19 on campuses and in their surrounding communities.

The surveillance-based informative testing was spearheaded by Clemson's public health team. The authors of the manuscript are Lior Rennert, Christopher McMahan, Corey Kalbaugh, Yuan Yang, Brandon Lumsden, Delphine Dean, Lesslie Pekarek and Christopher Colenda, and are the first Clemson team to publish in a Lancet family journal.

Clemson's commitment to COVID-19 safety

The study's research team played an integral role in the University's ability to bring students back to campus in Fall 2020. Towards the tail end of the semester, there was a precipitous decline in student cases, despite significant increases in the case counts in surrounding communities. The declines were due in large part to the public health strategies implemented. At the pinnacle of these strategies was an aggressive plan to test as many students and employees as often as possible. By identifying active cases, the University was able to mitigate the virus's community spread, thus reducing risk to students, faculty, and the larger community.

What Clemson faced was not unique - and it was not the only campus with a high number of reported cases. The difference? Clemson implemented a robust and repetitive testing strategy for students and faculty regardless of symptoms or exposure. In doing so, positive cases were identified and removed from the population, thus limiting the virus's spread both on-campus and in surrounding communities through isolation and quarantine procedures.

Clemson's public health strategies were derived through data-driven means. Over the summer, the University's public health team designed and built models demonstrating pre-semester testing would reduce the spread and minimize peak cases during the semester. Acting on these findings, the University mandated all students and faculty be tested before returning to face-to-face instruction. As a part of these efforts, nearly 3,000 COVID-19 cases were discovered, preventing the return of nearly 3,000 infected students and faculty, who would have unknowingly spread the virus to others - infectivity is thought to be two to four people per confirmed case.

As in-person instruction began, the University also began randomly testing 5 percent of the student body weekly. These surveillance efforts allowed for the identification of "hotspots" in both the on- and off-campus student body. Testing was then redirected to these hotspots. This combination of testing drove the COVID prevalence to below 1 percent.

Since the initial testing efforts, Clemson's positivity rate has dropped weekly - as has the population using isolation and quarantine accommodations. While most of the credit should rightfully lie with students' responsible behavior, the University's public health strategies undoubtedly contributed to the decreasing case count. Beginning in October 2020, the public health team implemented a weekly testing regime, which has kept disease prevalence down ever since.

Credit: 
Clemson University

Food industry lobbying was intense on failed bill to limit marketing to Canadian children

image: Doctoral student Christine Mulligan

Image: 
Christine Mulligan, courtesy of Nutritional Sciences, University of Toronto

Researchers at the University of Toronto have found that food industry interactions with government heavily outnumbered non-industry interactions on Bill S-228, also known as the Child Health Protection Act, which died in the Senate of Canada in 2019.

The researchers looked at more than 3,800 interactions, which included meetings, correspondence and lobbying, in the three years before the bill failed. They found that over 80 per cent were by industry, compared to public health or not-for-profit organizations.

They also found that industry accounted for over 80 per cent of interactions with the highest-ranking government offices, including elected parliamentarians and their staff and unelected civil servants.

"Industry interacted with government much more often, more broadly, and with higher ranking offices than non-industry representatives in discussions of children's marketing and Bill S-228," said principal investigator Mary L'Abbé, a professor of nutritional sciences at U of T and a researcher in the Joannah & Brian Lawson Centre for Child Nutrition.

The journal CMAJ Open published the study today.

The researchers drew data from Health Canada's Meetings and Correspondence on Healthy Eating database, set up in 2016, which details the type and content of interactions between stakeholders and Health Canada on nutrition policies. They also used Canada's Registry of Lobbyists, which tracks the names and registrations of paid lobbyists but provides limited details on the content of the meetings.

"We're fortunate to have access to this information in Canada, as it offers insight into the story of government bills," said Christine Mulligan, a doctoral student in L'Abbé's lab and lead author on the study. "Industry stakeholders bring important viewpoints, but the volume and breadth of their lobbying on this bill was clearly disproportionate, especially compared to public health."

The food industry has a long history of effective lobbying in Canada and other countries, and a growing body of research has documented both that extensive influence and the need for policy makers to be aware of it when creating policy that promotes the health and safety of all citizens.

Health Canada met with industry 56 per cent of the time regarding the 2016 Healthy Eating Strategy, researchers at the University of Ottawa found earlier this year. And during creation of the recent Food Guide, Health Canada restricted industry lobbying -- so effectively that industry persuaded officials at Agriculture and Agri-Food Canada to lobby Health Canada on their behalf, as The Globe and Mail and other organizations reported.

Mulligan says the disparity in interactions with government among stakeholders was even greater for S-228, and that it marks a stark contrast between this bill and interactions on the Healthy Eating Strategy more broadly.

Industry lobbying has also been prominent on a stalled bill to introduce front-of-package labelling that would inform consumers about foods high in salt, sugar and saturated fat, said L'Abbé, who advised Health Canada on both bills and the Healthy Eating Strategy.

L'Abbé said that more transparency on interactions with Agriculture and Agri-Food Canada and other federal departments would help, as would more detail in the Registry of Lobbyists. All stakeholder comments related to proposed regulations are part of a public docket in the U.S., and some groups have called for a similar approach in Canada.

"We desperately need better management of the consultative process on legislative bills, for public health policy in the public good," said L'Abbé.

Credit: 
University of Toronto

USPSTF statement on screening for hearing loss in older adults

Bottom Line: The U.S. Preventive Services Task Force (USPSTF) concludes that current evidence is insufficient to make a recommendation about screening for hearing loss in asymptomatic adults 50 and older. Nearly 16% of U.S. adults 18 and over report difficulty hearing. Hearing loss has been associated with an increased risk of falls, hospitalizations, social isolation and cognitive decline. The USPSTF routinely makes recommendations about the effectiveness of preventive care services and this recommendation is similar to its 2012 statement.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jama.2021.2566)

Editor's Note: Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Note: More information about the U.S. Preventive Services Task Force, its process, and its recommendations can be found on the newsroom page of its website.

Credit: 
JAMA Network

Microchip models of human lungs enable better understanding of disease, immune response

image: Simplified schematic model of the human lung with healthy and virus-infected alveolar sacs along with a viral infection on a lung alveolus platform.

Image: 
Yildiz-Ozturk

WASHINGTON, March 23, 2021 -- According to the National Institutes of Health, respiratory viruses are the most frequent cause of disease and death in humans, a fact highlighted by the COVID-19 pandemic. Despite the potential to cause severe disease, over 70% of viral infections remain asymptomatic.

Animal models have been used widely to understand how these viruses infect the host and how the host responds to prevent infection and onset of diseases. Data based on animal models, however, does not always apply well to humans, given the variability in species and genetics.

In Biomicrofluidics, by AIP Publishing, researchers from Ege University and the University of Nottingham review a range of lung-on-chip technologies that represent the vital properties of lung tissue and are capable of recapitulating the fundamental aspects of various pathologies.

"Lung-on-chip platforms are able to reconstruct the multicellular architecture, the physiochemical microenvironment, and the tissue-tissue interface of the human lung in vitro," said author Ozlem Yesil-Celiktas.

The researchers reviewed various state-of-the-art lung-on-chips and their applications in examining, diagnosing, and treating human viruses, including the coronavirus that causes COVID-19. Different platforms focus on different parts of the lung functions, such as small airway-on-chips and lung alveolus chips.

The knowledge and expertise accumulated through the development of physiologically relevant lung-on-chip models paves the way to use these models to study the interaction of several human respiratory viruses with the airway epithelium and alveolus in an organ-relevant setting.

"The current pandemic, which spread to almost every continent in just a few months, makes us realize how much we need a practical, humanized platform to expedite the trials for potential antiviral drugs and vaccines," said Yesil-Celiktas.

Considering new research showing one of the aftereffects of COVID-19-related pneumonia is pulmonary fibrosis, the lung-on-chip systems, which focus on lung fibrosis, enable a deeper understanding of disease mechanisms and related immune and technological responses.

Despite these advantages, current lung-on-chip platforms lack vascularity and integration with external immune cells, and these components will need to be incorporated into the microchips, not only for studying viral infections but also for other pathologies.

Going forward, the researchers will integrate modular sensing apparatus to provide online monitoring opportunities along with diagnostic outcomes. Such biomimetic systems also enable high-resolution and real-time imaging, as well as in vitro toxicological analysis or measurements of metabolic activities of living cells.

Credit: 
American Institute of Physics

In-person, telehealth care, costs before, during COVID-19 pandemic

What The Study Did: This study of working-age people enrolled in private health plans from March 2019 through June 2020 documented patterns of care at the onset of COVID-19.

Authors: Jonathan P. Weiner, Dr.P.H., Johns Hopkins Bloomberg School of Public Health in Baltimore, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2021.2618)

Editor's Note: The article includes funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Representation of Black Americans in clinical trials of cardiovascular drugs

What The Study Did: Researchers investigated representation of Black Americans in clinical trials of cardiovascular drugs approved by the U.S. Food and Drug Administration between 2006 and 2020.

Authors: Jiarui Li, M.D., of the Chinese Academy of Medical Sciences and Peking Union Medical College in Beijing, China, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2021.2640)

Editor's Note: Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Dementia death risk is higher among the socioeconomically deprived

A large proportion of dementia deaths in England and Wales may be due to socioeconomic deprivation, according to new research led by Queen Mary University of London.

The team also found that socioeconomic deprivation was associated with younger age at death with dementia, and poorer access to accurate diagnosis.

Dementia is the leading cause of death in England and Wales, even during the COVID pandemic, and is the only disease in the top ten causes of death without effective treatment.

The research, published in the Journal of Alzheimer's Disease, examines Office for National Statistics mortality data for England and Wales, and finds that in 2017, 14,837 excess dementia deaths were attributable to deprivation, equating to 21.5 per cent of all dementia deaths that year. The team also found that the effect of this association appears to be increasing over time.
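The attributable-fraction arithmetic implied by those figures can be checked directly; since the total number of dementia deaths in 2017 is not restated above, it is back-calculated here and should be read as approximate.

```python
# Back-of-the-envelope check of the attributable-fraction figures quoted above.
# The 2017 total is not given in the text, so it is inferred from the reported numbers.
excess_deaths = 14_837          # dementia deaths attributable to deprivation in 2017
attributable_fraction = 0.215   # 21.5% of all dementia deaths that year

total_dementia_deaths = excess_deaths / attributable_fraction
print(f"Implied total dementia deaths in 2017: ~{total_dementia_deaths:,.0f}")
print(f"Check: {100 * excess_deaths / total_dementia_deaths:.1f}% attributable to deprivation")
```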

Corresponding author Dr Charles Marshall from Queen Mary University of London, whose work is funded by Barts Charity, said: "Understanding how we might prevent dementia deaths is especially important. Persistent and widening socioeconomic inequality might be having an unrecognised impact on brain health. Addressing this inequality could be an important strategy to help stem the rising tide of dementia."

Various factors have been hypothesised to mediate the relationship between dementia and socioeconomic deprivation, including education, diet, vascular risk factors, stress and access to healthcare.

It is likely that poorer quality of diagnosis in more deprived patients means that they are being disadvantaged in terms of prognosis, counselling, planning of future care, access to appropriate symptomatic treatments and opportunities to participate in research.

The researchers say that although a direct causal relationship between socioeconomic status and dementia has yet to be established, deprivation could be a major target in public health approaches aimed at reducing the population burden of dementia.

The study has limitations: it is observational, meaning that a causal link between deprivation and dementia cannot be confirmed, and the ONS data lack detail on specific dementia subtypes, which is likely to lead to incomplete ascertainment of dementia cases.

Credit: 
Queen Mary University of London

Babies prefer baby talk, whether they're learning one language or two

It can be hard to resist lapsing into an exaggerated, singsong tone when you talk to a cute baby. And that's with good reason. Babies will pay more attention to baby talk than regular speech, regardless of which languages they're used to hearing, according to a study by UCLA's Language Acquisition Lab and 16 other labs around the globe.

The study found that babies who were exposed to two languages had a greater interest in infant-directed speech -- that is, an adult speaking baby talk -- than adult-directed speech. Research has already shown that monolingual babies prefer baby talk.

Some parents worry that teaching two languages could mean an infant won't learn to speak on time, but the new study shows bilingual babies are developmentally right on track. The peer-reviewed study, published today by Advances in Methods and Practices in Psychological Science, found bilingual babies became interested in baby talk at the same age as those learning one language.

"Crucially for parents, we found that development of learning and attention is similar in infants, whether they're learning one or two languages," said Megha Sundara, a UCLA linguistics professor and director of the Language Acquisition Lab. "And, of course, learning a language earlier helps you learn it better, so bilingualism is a win-win."

In the study, which took place at 17 labs on four continents, researchers observed 333 bilingual babies and 384 monolingual babies, ranging in age from 6 to 9 months and 12 to 15 months. UCLA's lab was the only one to provide data on bilingual babies who grew up hearing both English and Spanish. Sundara and Victoria Mateu, a UCLA assistant professor of Spanish and Portuguese, observed babies who were 12 to 15 months old.

Each baby would sit on a parent's lap while recordings of an English-speaking mother, using either infant-directed speech or adult-directed speech, played from speakers on the left or the right. Computer tracking measured how long each baby looked in the direction of each sound.

"The longer they looked, the stronger their preference," Mateu said. "Babies tend to pay more attention to the exaggerated sounds of infant-directed speech."

Infants' interest in English baby talk was very fine-tuned, the study noted. Bilingual parents indicated the percent of time English was spoken at home compared to Spanish. The more English the bilingual babies had been exposed to, the stronger their preference for infant-directed speech compared to adult-directed speech. However, even babies with no exposure to English preferred the English baby talk to the grown-up talk, Mateu said.

Baby talk is found across most languages and cultures, but English has one of the most exaggerated forms, Sundara said.

"Baby talk has a slower rate of speech across all languages, with more variable pitch, and it's more animated and happy," she said. "It varies mainly in how exaggerated it is."

Led by Krista Byers-Heinlein, a psychology professor at Concordia University in Montreal, the study involved labs in the United States, Canada, Europe, Australia and Singapore. The study's global reach strengthened the results, Sundara said.

"When you do language research, you want to know that the results aren't just some quirk of the language you're studying," she said.

According to the study, 6- to 9-month-old babies who had mothers with higher levels of education preferred baby talk more than babies whose mothers had less education.

"We suspect that perhaps the mothers with higher education levels spoke more to the babies and used infant-directed speech more often," Mateu said.

This study is one of the first published by the ManyBabies Consortium, a multi-lab group of researchers. Byers-Heinlein believes the unusual international, multilingual collaboration creates a model for future studies that include a similar breadth of languages and cultures.

"We can really make progress in understanding bilingualism, and especially the variability of bilingualism, thanks to our access to all these different communities," she said.

As the research continues, parents can babble to their babies in one language or two, and rest easy knowing they won't cause any confusion.

Credit: 
University of California - Los Angeles