Culture

First ever 'pioneer' factor found in plants enables cells to change their fate

image: Using an experimental technique whereby flowers can be coaxed to form from plant roots, biologists led by the University of Pennsylvania's Doris Wagner uncovered a protein that enables the initial loosening of chromatin, which can allow new proteins to be made and plants to take on different forms.

Image: 
University of Pennsylvania

Cells don't express all the genes they contain all the time. The portion of our genome that encodes eye color, for example, doesn't need to be turned on in liver cells. In plants, genes encoding the structure of a flower can be turned off in cells that will form a leaf.

These unneeded genes are kept from becoming active by being stowed in dense chromatin, a tightly packed bundle of genetic material laced with proteins.

In a new study in the journal Nature Communications, biologists from the University of Pennsylvania identify a protein that enables plant cells to reach these otherwise inaccessible genes in order to switch between different identities. Called a "pioneer transcription factor," the LEAFY protein gets a foothold in particular portions of the chromatin bundle, loosening the structure and recruiting other proteins that eventually allow genes to first be transcribed into RNA and then translated into proteins.

"The programs that are not needed in a given cell or tissue or condition are effectively shut off by various chromatin modifications that make them very inaccessible," says biologist Doris Wagner of the School of Arts & Sciences, senior author on the work. "The question has always been, How do you go from shut to open? We found that LEAFY, this protein that we already knew was important in reprogramming plant cells, is one of these pioneer transcription factors that get a foot in the door, as it were, to alter the program of cells."

Pioneer transcription factors were first characterized by Penn faculty member Kenneth Zaret of the Perelman School of Medicine, whose own work has examined these regulatory proteins in animals, such as in the context of liver development. Early in her time at Penn, Wagner heard Zaret give a talk about his work in this area and grew curious about looking for similar factors in plants, given that flexible gene expression is so critical to their survival.

Indeed, plants must switch between expressing whole sets of different genes all the time. In rich soils, they may grow more branches to get bigger, while in a drought they may express more genes associated with developing flowers, so they can set seed and reproduce before they succumb.

How plant cells determine their identity and fate has been a focus of Wagner's work since the start of her career, and so has LEAFY. During her postdoc days, Wagner showed that LEAFY could reprogram root cells to produce flowers. "That gave us a good clue that LEAFY might have this 'pioneer' activity, but we had to look more closely to prove it," she says.

To do so, Wagner and colleagues first used isolated protein and strands of genetic material to show that LEAFY, but not other transcription factors, bound to nucleosomes, subunits of chromatin where DNA spools around a cluster of proteins called histones. Specifically, the binding occurred at the gene AP1, which is known to be activated by LEAFY to prompt plants to make flowers.

To confirm that this connection was true in a living organism, the researchers took plant roots and applied a compound that causes them to flower spontaneously. When flowering, they found that not only did LEAFY bind strongly to AP1 but that the binding site was also occupied by a histone. "This tells us that the histones and LEAFY are really occupying the same portion of DNA," Wagner says.

Furthermore, they showed that chromatin structure began to open up at the AP1 region when LEAFY was activated, a key facet of what pioneer transcription factors do. This opening was limited, and full loosening of chromatin took days. What did happen quickly, the researchers found, was that LEAFY displaced a linker histone protein, creating a small local opening that also allowed other transcription factors to nose their way into the DNA.

Though pioneer transcription factors had been proposed to exist in plants, the new work provides the first concrete evidence of one, LEAFY, acting in this way. And Wagner believes there are others. "If necessary, plants can alter their entire body plan or generate an entire plant from a little piece of leaf," she says. "We predict setting this in motion will require pioneer transcription factors. So plants may actually have more of these factors than animals."

In upcoming work, she and her team hope to delve more deeply into the processes that precede and follow this "pioneering" activity of LEAFY: Does anything restrict its activity and how do the other factors that it recruits fully unpack the hidden-away genes? "It would be great to find out both sides of this equation," Wagner says.

The findings have significance in agriculture and breeding, where LEAFY is already manipulated to encourage earlier flowering, for example. And as more is understood about pioneer transcription factors in plants, Wagner can envision a fine tuning of other aspects of plant growth and activity, which could be leveraged to help crops adapt to new environmental conditions, such as those being ushered in by climate change.

Credit: 
University of Pennsylvania

Fields of breeders' dreams: A team effort toward targeted crop improvements

image: Harvesting switchgrass in Texas under field rainout shelters for drought tolerance studies. This image complements a Nature paper announcing the release of a high-quality reference sequence of the complex switchgrass genome. The work was led by researchers at the University of Texas (UT) at Austin, the HudsonAlpha Institute for Biotechnology (HudsonAlpha), and the U.S. Department of Energy (DOE) Joint Genome Institute (JGI), a DOE Office of Science User Facility located at Lawrence Berkeley National Laboratory (Berkeley Lab).

Image: 
David Lowry

Gardeners and farmers around the country recognize that crop varieties grow best in certain regions. Most plant species have adapted to their local environments; for example, crop and ornamental seeds sold for the upper Midwest are often very different than those bred for Texas. Identifying and breeding varieties that have high productivity across a range of environments is becoming increasingly important for food, fuel and other applications, and breeders aren't interested in waiting decades to develop new crops.

One example is an ongoing collaborative effort to improve the emerging bioenergy crop switchgrass (Panicum virgatum), which has established 10 experimental gardens located in eight states spread across 1,100 miles. Switchgrass is a perennial grass that quickly grows in a variety of soils and water conditions, standing taller than basketball star LeBron James. In each garden, switchgrass plants clonally propagated from cuttings represent a diverse collection sourced from half of the United States.

As reported January 27, 2021 in Nature, the team led by researchers at the University of Texas (UT) at Austin, the HudsonAlpha Institute for Biotechnology (HudsonAlpha), and the U.S. Department of Energy (DOE) Joint Genome Institute (JGI), a DOE Office of Science User Facility located at Lawrence Berkeley National Laboratory (Berkeley Lab), has produced a high-quality reference sequence of the complex switchgrass genome using samples collected at these gardens. Building on this work, researchers at all four DOE Bioenergy Research Centers (BRCs)--the Great Lakes Bioenergy Research Center (GLBRC), the Center for Bioenergy Innovation, the Center for Advanced Bioenergy and Bioproducts Innovation, and the Joint BioEnergy Institute--have expanded the network of common gardens and are exploring improvements to switchgrass through more targeted genome editing techniques to customize the crop for additional end products.

The genetic diversity within this set of plants, each with a fully-sequenced genome, and these gardens allow researchers to test what genes affect the plant's adaptability to various environmental conditions. "To accelerate breeding for bioenergy, we need to make connections between the plant's traits and genetic diversity," said John Lovell, an evolutionary biologist at HudsonAlpha and first author of the study. "For that, it's necessary to have the plant's genome as a reference. Additionally, having the gardens as a resource helps breeders find genetic regions of interest." The combination of field data and genetic information has allowed the research team to associate climate adaptations with switchgrass biology, information that could be useful toward the DOE's interest in harnessing the crop as a versatile candidate biomass feedstock for producing sustainable alternative fuels.

Common Gardens Are A Community Effort

The common gardens began nearly a decade ago with a proposal from UT-Austin's Tom Juenger, a longtime JGI collaborator and a senior author on this study. The use of switchgrass as a feedstock for biomass-based fuels was initially fostered by DOE's Bioenergy Research Centers, which initiated the sequencing of the switchgrass genome. DOE's Billion-Ton Report identified potential switchgrass production areas across the U.S., guiding the location of the common gardens. "Gardeners and farmers fully understand that when you move plants outside of their native habitat or cold hardiness zones, they have different levels of performance," Juenger said. "The novelty here is that we're trying to actually figure out what's causing those differences rather than just observing them. Can we quantify them? Can we tie them to the genome? We can use common garden plantings of clonally propagated plants to address these questions."

Multiple collection methods were applied to gather the diversity of switchgrass plants represented in the gardens. "Tom gave me a truck and I drove all over Texas with a shovel," recalled study co-author David Lowry, who started as a postdoctoral fellow in the Juenger lab and continues to work on the project from a lab at Michigan State University that is affiliated with the GLBRC. Additional samples came from U.S. Department of Agriculture stock centers, collaborators, and collections at other field sites. "This paper is a combination of really cutting-edge genomics and genetic analysis with large scale data collection," he added.

Jeremy Schmutz, head of the JGI Plant Program, drew parallels between these common gardens and those previously grown for the DOE candidate feedstock poplar. "You're collecting natural diversity and you're planting natural diversity in multiple locations, and then you are extracting links between the genetic variation and phenotypic performance," he said. Both switchgrass and poplar are JGI Flagship Plants.

Reaping Long-Term Investment Benefits

Switchgrass has a large polyploid genome, which means most genes are found as multiple copies across the chromosomes. "In the past, we needed model systems to test genetic hypotheses in species with large and complex genomes," said Lovell. "However, new sequencing technologies have allowed us to build the necessary genome resources to directly test for genes involved in biomass yield and climate adaptation in switchgrass, despite its physical size and genome complexity."

Work on the switchgrass genome sequence started more than a decade ago. As sequencing technologies have advanced, assembly and annotation of the genome sequence have improved in parallel. For example, the current version of the genome is assembled into pieces averaging 5.5 million base pairs (bp) in length, while the previous version consisted of pieces averaging about 25,000 bp. That's the difference between assembling a 10,000-piece puzzle and doing the same puzzle with just 50 pieces.
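As a rough illustration of that arithmetic, the short sketch below converts average piece length into piece count. Only the 25,000 bp and 5.5 million bp figures come from the article; the 1.1-gigabase total is an assumed round number used purely for illustration.

```python
# Back-of-the-envelope contiguity arithmetic behind the puzzle analogy.
# GENOME_SIZE_BP is an assumed round number, not a figure from the paper;
# the two piece lengths are the averages quoted in the article.
GENOME_SIZE_BP = 1.1e9
OLD_PIECE_BP = 25_000        # average piece length, previous assembly
NEW_PIECE_BP = 5_500_000     # average piece length, current assembly

old_pieces = GENOME_SIZE_BP / OLD_PIECE_BP
new_pieces = GENOME_SIZE_BP / NEW_PIECE_BP

print(f"previous assembly: ~{old_pieces:,.0f} pieces")
print(f"current assembly:  ~{new_pieces:,.0f} pieces")
print(f"roughly {old_pieces / new_pieces:.0f}x fewer pieces to put together")
```

Whatever the true total, it is the ratio of piece counts (about 220-fold here) that drives the 10,000-piece versus 50-piece comparison.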

The combination of new genetic tools and experimental gardens allows researchers to detect climate-gene matches, which can be exploited for accelerated crop improvement. "Because of the DOE's long-term investment and the effort that has gone into this, people are going to be able to model further research on this complex species and also at the same time, take advantage of what we can do now with genomics to really make inroads into plant biology and improvements of switchgrass as the crop species," Schmutz said.

The switchgrass genotypes that were planted into the common gardens were sequenced and assembled by the JGI, allowing the research team to conduct association mapping, linking genes to traits. One of the team's findings is that the performance of switchgrass across the garden sites depended on the origin or collection location of the individual switchgrass plants. They were able to identify many regions in the switchgrass genome that are associated with genetic differences that lead to productivity in different environments.

For example, many plants collected from native habitats in Texas and other southern locales did not survive the cold winter of 2019 at the most northern common garden in South Dakota. Conversely, upper Midwest native switchgrass plants performed poorly at the southern common gardens in Texas. This reciprocal home site advantage is direct evidence of climatic adaptation. The team's database of genes that underlie adaptation to climate provides breeders with a strong foundation to improve crop productivity under specific climates.
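Findings like these rest on the association mapping mentioned above: each genetic variant is tested for a statistical relationship with a measured trait such as winter survival or biomass. The sketch below shows that idea in its simplest form with simulated data; it is a generic illustration, not the study's actual pipeline, which used genome-wide markers, many garden sites and more sophisticated statistical controls.

```python
# Toy association-mapping sketch: regress a trait on each variant's genotype.
# All data are simulated for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_plants, n_variants = 300, 1000

# Genotypes coded 0/1/2 copies of the alternate allele, per plant per variant.
genotypes = rng.integers(0, 3, size=(n_plants, n_variants))

# Simulated trait (e.g., biomass) driven by variant 42 plus noise.
trait = 0.8 * genotypes[:, 42] + rng.normal(0, 1, n_plants)

# Test every variant for association with the trait.
pvals = np.array([stats.linregress(genotypes[:, j], trait).pvalue
                  for j in range(n_variants)])

print("strongest association at variant", pvals.argmin(), "p =", pvals.min())
```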

Sourcing plants from so many parts of the country also helped the team understand why some switchgrass plants from the Northeast have traits similar to those from the Midwest, even though their genomes were very different.

The high-quality reference genome sequence of switchgrass is available on the JGI plant data portal Phytozome. This version can help breeders identify genomic regions of interest and directly introduce these features into new crop varieties. "It's going to be important to have all this information in order to facilitate breeding going forward," noted Lowry.

The team has received additional DOE funding to continue maintaining the gardens, which excites Juenger. "There will be a continuation of collecting data and information from these existing plantings, and then trying to leverage these discoveries to better understand how plants tolerate stresses and challenges in the natural environment," he said. "There aren't many efforts that have been able to study native perennial plants with these genetic and genomic resources, interweaved with this long longitudinal study perspective. Although it's been this enormous investment to set up these gardens, we have them to study for a number of years. And that's a real benefit for the research program."

Credit: 
DOE/Lawrence Berkeley National Laboratory

Focusing on field analysis

image: Integration of polymeric membrane/dielectric sphere assemblies in microfluidic chips for enhanced-contrast imaging with low-magnification systems, doi 10.1117/1.JOM.1.1.014001.

Image: 
Viri et al.

The development of cost-efficient, portable microscopy units would greatly expand their use in remote field locations and in places with fewer resources, potentially leading to easier on-site analysis of contaminants such as E. coli in water sources as well as other practical applications.

Current microscopy systems, like those used to image micro-organisms, are expensive because they are optimized for maximum resolution and minimal deformation of the images the systems produce. But some situations do not require such optimization--for instance, simply detecting the presence of pathogens in water. One potential approach to developing a low-cost portable microscopy system is to use transparent microspheres in combination with affordable low-magnification objective lenses to increase image resolution and sensitivity.

A group of researchers from Ecole Polytechnique Federale de Lausanne (EPFL) in Switzerland published a study on such an assembly composed of barium titanate spheres that are partially embedded in thin polymeric membranes. The result of their work, appearing in SPIE's new Journal of Optical Microsystems, is a proposed method to fabricate microfluidic chips using the assembly for enhanced detection of bacteria. Such customized chips with fluidic and optical components already integrated have many benefits when combined with portable low-end imagers for analyses at remote sites or in resource-limited regions.

"Cost reduction and portability are of benefit to the proliferation of analytical devices, especially in limited-resource contexts, and the integration of affordable micro-optical elements directly onto microfluidic chips can highly contribute to this," said Martin Gijs, a professor at EPFL and an author of the published work.

The assembly's ability to enhance bacteria detection paves the way for other applications suited to use at remote sites. Additionally, the researchers identified opportunities to customize specific functional microfluidic elements. Such integrations could bring to fruition applications such as on-site antibiotic testing.

Given falling costs of the components and fabrication methods, the researchers' proposed fabrication protocol could be adapted easily for a wide variety of microfluidic chips with integrated optical elements. Considered along with the lower cost of low-end imaging systems, the approach could sharply increase the use of such microscopy systems in low-resource locations for on-site analyses.

Credit: 
SPIE--International Society for Optics and Photonics

Purported phosphine on Venus more likely to be ordinary sulfur dioxide, new study shows

image: An image of Venus compiled using data from the Mariner 10 spacecraft in 1974.

URL: https://solarsystem.nasa.gov/resources/2524/newly-processed-views-of-ven...

Image: 
NASA/JPL-Caltech

In September, a team led by astronomers in the United Kingdom announced that they had detected the chemical phosphine in the thick clouds of Venus. The team's reported detection, based on observations by two Earth-based radio telescopes, surprised many Venus experts. Earth's atmosphere contains small amounts of phosphine, which may be produced by life. Phosphine on Venus generated buzz that the planet, often described as a "hellscape," could somehow harbor life within its acidic clouds.

Since that initial claim, other science teams have cast doubt on the reliability of the phosphine detection. Now, a team led by researchers at the University of Washington has used a robust model of the conditions within the atmosphere of Venus to revisit and comprehensively reinterpret the radio telescope observations underlying the initial phosphine claim. As they report in a paper accepted to the Astrophysical Journal and posted Jan. 25 to the preprint site arXiv, the U.K.-led group likely wasn't detecting phosphine at all.

"Instead of phosphine in the clouds of Venus, the data are consistent with an alternative hypothesis: They were detecting sulfur dioxide," said co-author Victoria Meadows, a UW professor of astronomy. "Sulfur dioxide is the third-most-common chemical compound in Venus' atmosphere, and it is not considered a sign of life."

The team behind the new study also includes scientists at NASA's Caltech-based Jet Propulsion Laboratory, the NASA Goddard Space Flight Center, the Georgia Institute of Technology, the NASA Ames Research Center and the University of California, Riverside.

The UW-led team shows that sulfur dioxide, at levels plausible for Venus, can not only explain the observations but is also more consistent with what astronomers know of the planet's atmosphere and its punishing chemical environment, which includes clouds of sulfuric acid. In addition, the researchers show that the initial signal originated not in the planet's cloud layer, but far above it, in an upper layer of Venus' atmosphere where phosphine molecules would be destroyed within seconds. This lends more support to the hypothesis that sulfur dioxide produced the signal.

Both the purported phosphine signal and this new interpretation of the data center on radio astronomy. Every chemical compound absorbs unique wavelengths of the electromagnetic spectrum, which includes radio waves, X-rays and visible light. Astronomers use radio waves, light and other emissions from planets to learn about their chemical composition, among other properties.

In 2017, using the James Clerk Maxwell Telescope, or JCMT, the U.K.-led team discovered a feature in the radio emissions from Venus at 266.94 gigahertz. Both phosphine and sulfur dioxide absorb radio waves near that frequency. To differentiate between the two, in 2019 the same team obtained follow-up observations of Venus using the Atacama Large Millimeter/submillimeter Array, or ALMA. Their analysis of ALMA observations at frequencies where only sulfur dioxide absorbs led the team to conclude that sulfur dioxide levels in Venus' atmosphere were too low to account for the signal at 266.94 gigahertz, and that it must instead be coming from phosphine.

In this new study by the UW-led group, the researchers started by modeling conditions within Venus' atmosphere, and using that as a basis to comprehensively interpret the features that were seen -- and not seen -- in the JCMT and ALMA datasets.

"This is what's known as a radiative transfer model, and it incorporates data from several decades' worth of observations of Venus from multiple sources, including observatories here on Earth and spacecraft missions like Venus Express," said lead author Andrew Lincowski, a researcher with the UW Department of Astronomy.

The team used that model to simulate signals from phosphine and sulfur dioxide for different levels of Venus' atmosphere, and how those signals would be picked up by the JCMT and ALMA in their 2017 and 2019 configurations. Based on the shape of the 266.94-gigahertz signal picked up by the JCMT, the absorption was not coming from Venus' cloud layer, the team reports. Instead, most of the observed signal originated some 50 or more miles above the surface, in Venus' mesosphere. At that altitude, harsh chemicals and ultraviolet radiation would shred phosphine molecules within seconds.

"Phosphine in the mesosphere is even more fragile than phosphine in Venus' clouds," said Meadows. "If the JCMT signal were from phosphine in the mesosphere, then to account for the strength of the signal and the compound's sub-second lifetime at that altitude, phosphine would have to be delivered to the mesosphere at about 100 times the rate that oxygen is pumped into Earth's atmosphere by photosynthesis."

The researchers also discovered that the ALMA data likely significantly underestimated the amount of sulfur dioxide in Venus' atmosphere, an observation that the U.K.-led team had used to assert that the bulk of the 266.94-gigahertz signal was from phosphine.

"The antenna configuration of ALMA at the time of the 2019 observations has an undesirable side effect: The signals from gases that can be found nearly everywhere in Venus' atmosphere -- like sulfur dioxide -- give off weaker signals than gases distributed over a smaller scale," said co-author Alex Akins, a researcher at the Jet Propulsion Laboratory.

This phenomenon, known as spectral line dilution, would not have affected the JCMT observations, leading to an underestimate of how much sulfur dioxide was being seen by JCMT.

"They inferred a low detection of sulfur dioxide because of that artificially weak signal from ALMA," said Lincowski. "But our modeling suggests that the line-diluted ALMA data would have still been consistent with typical or even large amounts of Venus sulfur dioxide, which could fully explain the observed JCMT signal."

"When this new discovery was announced, the reported low sulfur dioxide abundance was at odds with what we already know about Venus and its clouds," said Meadows. "Our new work provides a complete framework that shows how typical amounts of sulfur dioxide in the Venus mesosphere can explain both the signal detections, and non-detections, in the JCMT and ALMA data, without the need for phosphine."

With science teams around the world following up with fresh observations of Earth's cloud-shrouded neighbor, this new study provides an alternative explanation to the claim that something geologically, chemically or biologically must be generating phosphine in the clouds. But though this signal appears to have a more straightforward explanation -- with a toxic atmosphere, bone-crushing pressure and some of our solar system's hottest temperatures outside of the sun -- Venus remains a world of mysteries, with much left for us to explore.

Credit: 
University of Washington

Melatonin produced in the lungs prevents infection by novel coronavirus

image: The hormone acts as a barrier against SARS-CoV-2, blocking the expression of genes that encode proteins in cells serving as viral entry points, according to a study by researchers at the University of São Paulo

Image: 
NIAID/NIH

By Elton Alisson  |  Agência FAPESP – Melatonin synthesized in the lungs acts as a barrier against SARS-CoV-2, preventing expression of genes that encode proteins in cells such as resident macrophages in the nose and pulmonary alveoli, and epithelial cells lining the alveoli, all of which are entry points for the virus. The hormone, therefore, prevents infection of these cells by the virus and inhibits the immune response so that the virus remains in the respiratory tract for a few days, eventually leaving to find another host.

The discovery by researchers at the University of São Paulo (USP), in Brazil, helps explain why some people are not infected or do not manifest symptoms of COVID-19 even when reliably diagnosed as carriers of the virus by RT-PCR. In addition, it offers the prospect of nasal administration of melatonin, in drops or as a spray, to prevent disease from developing in pre-symptomatic patients.

Pre-clinical and clinical trials will be needed to prove the therapeutic efficacy of melatonin against the virus, the researchers stress in an article on the study published in the journal Melatonin Research.

The study was supported by FAPESP.

“We showed that melatonin produced in the lung acts as a barrier against SARS-CoV-2, preventing the virus from entering the epithelium, activating the immune system and triggering the production of antibodies,” Regina Pekelmann Markus, a professor at USP’s Institute of Biosciences (IB) and principal investigator for the project, told Agência FAPESP.

“This action mechanism by pulmonary melatonin must also involve other respiratory viruses such as influenza,” she added.

Markus began researching melatonin in the 1990s. In a study involving rodents, she showed that the hormone, produced at night by the pineal gland in the brain to tell the organism daylight has gone and it should prepare for sleep, can be produced in other organs, such as the lungs.

In a study also involving rodents, published in early 2020 in the Journal of Pineal Research, Markus and collaborators showed that resident macrophages in the pulmonary airspace absorb (phagocytize) particles of pollution breathed in by the animals. This aggressive stimulus induced the macrophages to produce melatonin and other molecules that stimulate mucus formation, coughing, and expectoration, expelling the particles from the respiratory tract.

When they blocked melatonin synthesis by resident macrophages, the researchers observed that the particles entered the bloodstream and spread throughout the organism, even invading the brain.

Based on the finding that melatonin produced in the lungs altered the entry points for particulate matter from air pollution, Markus and collaborators decided to investigate whether the hormone performed the same function with regard to SARS-CoV-2. “If so, the virus wouldn’t be able to bind to the ACE-2 receptor on cells, enter the epithelium and infect the organism,” Markus said.

Analysis of gene expression

To test this hypothesis, the researchers analyzed 455 genes associated in the literature with COVID-19 comorbidities, interaction between SARS-CoV-2 and human proteins, and viral entry points. The genes had been identified in studies conducted, among others, by Helder Nakaya, a professor at USP’s School of Pharmaceutical Sciences (FCF) and a co-author of the study on lung melatonin. 

From this group of genes, they selected 212 genes involved in viral cell entry, intracellular traffic, mitochondrial activity, and transcription and post-translation processes, to create a physiological signature of COVID-19.

Using RNA sequencing data downloaded from a public database, they quantified the level of expression of the 212 COVID-19 signature genes in 288 samples from healthy human lungs.

They then correlated these gene expression levels with a gene index that estimated the capacity of the lungs to synthesize melatonin (MEL-Index), based on their analysis of the lungs in healthy rodents. They found that the lower the index the higher the level of expression of genes that encode proteins for resident macrophages and epithelial cells.

The index also correlated negatively with genes that modify proteins in cell receptor CD147, a viral entry point in macrophages and other immune cells, indicating that normal lung melatonin production may be a natural protector against the virus.

The results were corroborated by three statistical techniques: the Pearson test, which measures the degree of linear correlation between two variables; a gene set enrichment analysis; and a network analysis tool that maps the connections among the most expressed genes so as to compare the same set of genes in different states. The latter was developed by Marcos Buckeridge, a professor at IB-USP and also a co-author of the study.
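As a minimal sketch of the first of those techniques, the snippet below computes a Pearson correlation between a per-sample melatonin-synthesis index and the expression of a single gene across 288 lung samples. The sample count comes from the article, but the values and variable names are simulated placeholders, not the study's actual MEL-Index or RNA sequencing data.

```python
# Illustrative Pearson-correlation step: does a higher melatonin index track
# lower expression of a viral-entry-point gene? All values are simulated.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n_samples = 288                           # healthy lung samples, per the article

mel_index = rng.normal(0, 1, n_samples)   # placeholder melatonin-synthesis index
# Simulate a gene whose expression falls as the index rises, plus noise.
gene_expression = -0.5 * mel_index + rng.normal(0, 1, n_samples)

r, p = pearsonr(mel_index, gene_expression)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")  # negative r: entry genes down when index is up
```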

“We found that when MEL-Index was high the entry points for the virus in the lungs were closed, and when it was low these ‘doors’ were open. When the doors are shut, the virus wanders around for a time in the pulmonary airspace and then tries to escape in search of another host,” Markus said.

Because lung melatonin inhibits transcription of these genes that encode proteins for viral entry point cells, application of melatonin directly into the lungs in the form of drops or spray could block the virus. More research is required to prove that this is indeed the case, however, the researchers note. 

Another idea could be to use MEL-Index, the pulmonary melatonin metric, as a prognostic biomarker to detect asymptomatic carriers of SARS-CoV-2.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Study reveals precarious employment on the rise long before COVID-19

A study led by a University of Illinois Chicago researcher uses a new approach to measure precarious, or low-quality, employment in the United States. According to those findings, precarious employment increased by 9% between 1988 and 2016.

Precarious employment, or P.E., is defined as low-quality employment, which is often characterized by low wages, job insecurity and irregular hours, making employment risky and stressful for the worker.

In her study, "Changes in precarious employment in the United States: A longitudinal analysis," Vanessa Oddo, assistant professor in UIC's School of Applied Health Sciences, sought to create a multidimensional and continuous measure of P.E. in the U.S. She also set out to describe changes in precarious employment over time, both overall and within subgroups. The paper is published in the Scandinavian Journal of Work, Environment & Health.

A better understanding of long-term trends is a critical first step for informing future policies aimed at improving P.E. and population health in the U.S., Oddo said.

Previously, the focus for measuring P.E. was on wages, hours and union membership. For this longitudinal study, she expanded the measurement criteria to add P.E. indicators including the following (a generic sketch of how such dimensions can be combined into one score follows the list):

Material rewards -- the wage and non-wage benefits afforded by employment.

Working-time arrangements -- the length and intensity of working hours, underemployment and schedule predictability.

Employment stability -- employment continuity, contractual temporariness and/or organizational changes (e.g., downsizing).

Workers' rights -- welfare state provisions associated with employment, such as access to health insurance or pensions.

Collective organization -- the possibilities (or lack thereof) for employee representation, most commonly measured through union representation.

Interpersonal relations -- employees' power relative to management (e.g., their ability to make decisions or control their schedule), including exposure to discrimination.

Training opportunities -- opportunities for promotion or to enhance skills.
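One generic way to turn dimensions like these into a single, continuous score is to standardize each indicator and average across them. The sketch below illustrates that approach with invented numbers; it is not the scoring procedure used in the published study.

```python
# Toy composite precarious-employment score: standardize each indicator
# (higher = more precarious) and average across the seven dimensions.
# The values are invented for illustration.
import numpy as np

# Rows = workers; columns = rewards, hours, stability, rights, collective,
# relations, training (each oriented so higher means more precarious).
indicators = np.array([
    [0.2, 0.5, 0.1, 0.0, 0.3, 0.4, 0.2],
    [0.8, 0.9, 0.7, 1.0, 0.9, 0.6, 0.8],
    [0.5, 0.4, 0.6, 0.5, 0.5, 0.5, 0.4],
])

# z-score each column so every dimension contributes on a comparable scale.
z = (indicators - indicators.mean(axis=0)) / indicators.std(axis=0)

pe_score = z.mean(axis=1)   # higher score = more precarious employment
print(np.round(pe_score, 2))
```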

Characterizing trends in P.E. using a multidimensional indicator is critical given that employment quality is increasingly recognized as a social determinant of health, according to Oddo.

P.E. can result in insufficient income, which compromises access to food and other necessities; greater exposure to adverse physical working conditions, such as toxic exposures; and limited control over both personal and professional lives, leading to stress.

"Importantly, poor employment quality may be contributing to widening health inequities, as women, people with lower education levels, and minorities have a higher prevalence of P.E.," Oddo said.

The research revealed that P.E. scores were significantly higher among people of color, women, people with lower levels of education and people with lower income. Between 1988 and 2016, overall P.E. scores significantly increased, indicating worsening employment quality over time.

However, the study showed the largest increases in P.E. among males, people with a college education, and higher-income individuals.

"These results suggest long-term decreases in employment quality are widespread in the U.S., rather than just confined to marginalized segments of the labor market," Oddo said.

According to the study, the largest change over time in employment precarity among males and college-educated and higher-income individuals could be because their P.E. score was lower at the study's beginning in 1988, leaving a greater opportunity for declines. Additionally, the large increase in P.E. among males may also be due to the declining rate of union membership in the U.S., as union membership is associated with better employment quality and, historically, was more common among males.

Oddo said P.E. has been studied more broadly since the 2008 recession, when employment quality worsened and there was a notable shift toward contract work and the emergence of the gig economy. She added that there is speculation as to how the COVID-19 pandemic will affect P.E., both during the pandemic and afterward, when work-from-home measures are lifted.

A holistic approach to studying P.E. is important in the future as data can inform employment policy decisions. For example, a better understanding of P.E. in the U.S. may be helpful for informing future policies around secure scheduling (i.e., advance notice of schedules) or gig work, like California's Assembly Bill 5, which changed the rules employers must use to determine whether workers are employees or independent contractors. The distinction is important because independent contractors are not entitled to most of the protections and benefits that employees get, Oddo explained.

Also, precarious employment could slow our ability to get back to work after COVID-19, as precariously employed individuals could face additional barriers to COVID-19 vaccination; for example, if they are undocumented workers or independent contractors.

Credit: 
University of Illinois Chicago

New report charts path toward superior earthquake recovery

For the last century, seismic building codes and practices have primarily focused on saving lives by reducing the likelihood of significant damage or structural collapse. Recovery of the critical functions provided by buildings and infrastructure has received less attention, however. As a result, many remain vulnerable to being knocked out of service by an earthquake for months, years or for good.

A committee of experts, formed by the National Institute of Standards and Technology (NIST) and the Federal Emergency Management Agency (FEMA) under the direction of Congress, has urged officials at all levels of government to support research and policies that could help get the buildings and services society depends on up and running quickly after an earthquake. In a report delivered to Congress, the committee outlines seven recommendations that, if acted upon, may greatly improve the resilience of communities across the nation.

"As structural engineers we feel confident that the current building codes can deliver life safety design objectives. Now, it's time to go beyond that and think about recovery of function," said Siamak Sattar, a NIST structural engineer and co-author of the report.

In 2011, a magnitude 6.3 earthquake struck Christchurch, New Zealand. Over 180 lives were lost as a result, but many more were likely saved by modern building codes. However, the city's economy and quality of life were not spared.

The quake damaged the city's central business district to the point that hundreds of buildings were closed or demolished, displacing thousands of workers. Lifeline infrastructure systems -- including power, clean water and roads -- sustained heavy damage, further crippling the community's ability to bounce back. In total, the estimated costs of rebuilding the city amounted to 40 billion New Zealand dollars ($26.6 billion).

The toll taken by the Christchurch earthquake and other damaging events can in part be attributed to limitations in seismic codes and standards, as most offer little guidance on designing buildings or lifelines to recover in a timely manner in the wake of extreme events.

To prevent major earthquakes from leaving such lasting impressions in the future, Congress entrusted NIST and FEMA -- both member agencies of the National Earthquake Hazards Reduction Program (NEHRP), which NIST leads -- with the responsibility of mapping a path to greater community resilience.

Drawing expertise from both public and private sectors, NIST and FEMA assembled a committee of more than 30 engineers, architects, building owners, code officials and social scientists -- including several of their own researchers -- to devise options for addressing gaps in codes, standards and practices, which are described in their report to Congress.

The first recommendation summarizes the core of the report. The authors call for members of the government, codes and standards organizations and industry to work together in developing a national framework for setting and achieving goals based on recovery time. To produce this framework, experts must first identify what level of function provided by buildings and lifelines should be maintained after an earthquake, and then determine an acceptable time for them to be out of commission.

"There are different metrics that we can use to help guide this framework. For example, a building may need to recover within a predefined number of days, weeks or months. If it is a hospital or emergency center then you may not want it to go down at all," said Steve McCabe, director of NEHRP.

The authors also highlight the need for new recovery-based design criteria for buildings and lifelines. If developed with recovery in mind, these criteria could steer design parameters -- such as increasing a school's structural strength to limit damage or designing an electrical power supply to return to service faster -- toward improving community resilience. A critical phase of this process would be identifying the level of ground shaking that designs should be tailored to for recovery goals, which may vary by region.

Other recommendations seek to help leaders meet recovery goals aligned with the first recommendation, offering guidance on implementing new design requirements for buildings and lifelines. They also provide direction for pre-disaster planning -- a key step in preparing authorities to make timely decisions in the immediate aftermath of a disaster.

The authors seek to empower communities as well by recommending the launch of an education campaign on earthquake risk and recovery, which could reach the public through social media, streaming services or other media.

"Informed citizens are an important resource needed to develop the kind of vision required for this effort, which may well represent the largest change in building codes in 75 years," McCabe said.

In the report, the authors encourage officials to consider adopting functional recovery approaches that go beyond the current requirements. They assert that the initial investments of adopting new recovery-focused codes and upgrading older buildings and lifelines could likely be offset by the reduction of future losses. They also suggest that increased access to financial resources through mechanisms such as grant programs, incentive systems and public financing would help local governments scale the upfront costs.

"The immediate aim of the report is to spark a national conversation about developing a consensus for recovery goals and timelines. This approach may eventually be reflected in building codes, but first, a considerable amount of research must be tackled," Sattar said.

New policies could make use of the NEHRP agencies, such as NIST and FEMA, whose expertise may enable them to provide the necessary science for sound public policy.

The road toward this goal could take years to traverse, but it is critical.

In the meantime, the authors encourage early action by leaders at state and local levels, as each community may have needs that national guidelines cannot fully address. Their experiences with functional recovery planning and design could also make for valuable feedback at the national level, speeding up progress toward widespread earthquake resilience that preserves quality of life in addition to life itself.

Credit: 
National Institute of Standards and Technology (NIST)

Study: Sudden police layoffs in one US city associated with increases in crime

Amid a sharp economic downturn in 2008, police departments around the United States experienced budget shortfalls that required them to enact cutbacks. A new study examined the effects on crime of budget shortfalls in two New Jersey cities--one of which laid off more than 10 percent of its police force while the other averted layoffs. The study found that the police layoffs were associated with significant increases in overall crime, violent crime, and property crime.

The study, by researchers at John Jay College of Criminal Justice and Rutgers University, appears in Justice Evaluation Journal, a publication of the Academy of Criminal Justice Sciences.

"Our study suggests that sudden and drastic reductions in the size of a police force via layoffs of police officers can generate significant increases in crime," explains Eric Piza, associate professor at John Jay College of Criminal Justice, who led the study. "In Newark, this meant approximately 110 additional violent crimes and 100 additional property crimes per month."

The study examined New Jersey's two largest police forces: the Newark Police Department, which released 13 percent of its police force on one day in 2010, and the Jersey City Police Department, which averted layoffs by reducing the amount of a previously requested raise and increasing the copay for medical prescriptions for officers. In addition, between 2012 to 2015, Newark's police department lost officers to attrition and did not hire new personnel, while Jersey City's department added more officers.

Prior studies on the effect of decreases in the size of police forces on crime have considered incremental reductions, finding less of an impact on crime, while this study looked at a sudden cutback.

Crime in both New Jersey cities had decreased prior to 2008.

Researchers used monthly crime counts from 2006 to 2015 to measure the effects of Newark's layoffs on crime and compared them with crime in Jersey City.
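A bare-bones version of that comparison is sketched below: average monthly crime counts before and after the layoff date in each city, with the gap between the two cities' changes as a crude difference-in-differences estimate. The counts are simulated placeholders, and the published analysis used more rigorous time-series models than this.

```python
# Crude pre/post comparison of monthly crime counts around the 2010 layoffs.
# All counts are simulated for illustration only.
import numpy as np

rng = np.random.default_rng(2)
months_pre, months_post = 58, 62            # roughly 2006-2010 vs. 2011-2015

newark_pre   = rng.poisson(900, months_pre)
newark_post  = rng.poisson(1000, months_post)   # simulated post-layoff increase
jersey_pre   = rng.poisson(700, months_pre)
jersey_post  = rng.poisson(690, months_post)    # simulated flat trend

newark_change = newark_post.mean() - newark_pre.mean()
jersey_change = jersey_post.mean() - jersey_pre.mean()

print(f"Newark change:       {newark_change:+.1f} crimes/month")
print(f"Jersey City change:  {jersey_change:+.1f} crimes/month")
print(f"Difference-in-differences estimate: {newark_change - jersey_change:+.1f}")
```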

The authors found significant increases in overall crime, violent crime (murder, robbery, and aggravated assault), and property crime (burglary, larceny theft, and motor vehicle theft) in Newark, with the increases in overall and violent crime becoming progressively more pronounced each year. In contrast, in Jersey City, violent crime decreased steadily over the 10 years of the study, while property crime peaked in early 2009 but declined steadily afterwards.

The termination of police officers requires the remaining officers to do more with less, the authors note. In addition, letting officers go may force police to discontinue evidence-based crime prevention practices, which may impact crime levels.

The study's authors note several limitations of their work, including that the crime data they used, which is the primary source of such information in the United States, provides an incomplete picture of crime. Also, researchers did not interview police officers or Newark officials, which the authors suggest would have added context to their findings. Lastly, while pre-layoff crime trends in Newark were not significantly different than Jersey City, it is difficult to determine whether the cities would have maintained similar crime trends if layoffs didn't occur (which is a key assumption of evaluation research).

"As police departments determine whether to phase out specialized units to meet budgetary constraints and to enact reforms that may reduce budgets and size of police forces, our findings can inform the national debate over the impact of such actions on crime," notes Vijay Chillar, a Ph.D. student at Rutgers University, who coauthored the study.

Credit: 
Crime and Justice Research Alliance

Confirmed improvement in first responders' brain health after shortened training protocol

DALLAS (January 26, 2021) - Many people believe that they can't change their brains, or that their brain health will inevitably decline as they age. But the Strategic Memory Advanced Reasoning Tactics (SMART) training protocol, created by researchers and clinicians at the Center for BrainHealth®, has been demonstrated over the past two decades to improve cognitive function and psychological well-being in laboratory participants. Recent research suggests that SMART can even make long-lasting improvements to people's brain health when given outside of the lab in short, informal training sessions.

A paper detailing these findings was recently published in Military Medicine. The research was a collaboration between Leanne R. Young, PhD, of Applied Research Associates, Inc. and researchers from the Center for BrainHealth, led by Sandra Bond Chapman, PhD, founder and chief director, and Jennifer Zientz, MS, CCC/SLP, head of clinical services.

Participants included 74 police officers and 425 total veterans, reservists, National Guard members and active-duty soldiers. First responders like them face uncertainty and stress every day at their jobs and have demanding, unpredictable schedules, so SMART - normally delivered to participants in the lab over a 12-week period - was tailored to their schedules.

The researchers wanted to test whether shorter workshops outside of the lab could have similar effects on participants' brain health. Trained clinicians delivered between six and ten hours of SMART over two to three days.

As expected, participants' overall cognitive function improved after SMART; for instance, both police officers and military personnel demonstrated an average of 20% improvement in innovative thinking skills. Surprisingly, SMART improved participants' integrated reasoning abilities by about 20% if they were military personnel, but not if they were police officers. And SMART improved participants' strategic attention by about 16% if they were police officers, but not if they were military members or veterans.

These differences in how SMART affects participants' cognitive function might be due to differences in their jobs. "Our brains are really driven by our own experiences, and how you use your brain is what makes you good at some things," said Zientz.

Still, the results suggest that various modes of SMART delivery can positively impact people's brain health. "Even in a training that doesn't take an inordinate amount of time, people can learn information, apply the information to their daily lives, and see a positive benefit from it," she continued.

SMART empowered the participants to take charge of their own brain health, including their psychological well-being. Four months after training, military personnel reported less stress and depressive symptoms, as well as more satisfaction and resilience in their lives. This suggests that SMART can produce lasting results when applied in short programs outside of the lab.

Young believes that SMART could even help save lives. "With the rising rates of suicide, both in civilian and military populations, perhaps the most exciting aspect of the study is the impact of the cognitive training on psychological health. I look forward to seeing this work continue as the military builds its arsenal of weapons against anxiety, stress and depression," said Young.

Credit: 
Center for BrainHealth

Hypnotic suggestions can make a complex task easy by helping vision fill in the blanks

Popular folklore and anecdotal evidence suggest that people in a hypnotic or suggestible state can experience sensory hallucinations, such as perceiving sounds and sights that are not actually there. Reliable scientific evidence of these experiences, however, has been notoriously challenging to obtain because of their subjective nature.

New research published in the journal Psychological Science provides compelling evidence that hypnotic suggestions can help highly susceptible people "see" imaginary objects, equipping them with the missing details needed to solve an otherwise challenging visual puzzle.

"Hypnosis holds intriguing effects on human behavior," said Amir Raz, a researcher at McGill University and coauthor on the paper. "The careful, systematic study of hypnotic phenomena can answer important questions about mind-body interactions and advance novel therapies in medicine, psychology, and dentistry."

For their research, Raz and his colleagues divided 32 participants into two groups: those who were found to be highly hypnotizable and those who were less suggestible. The participants viewed an array of disconnected lines moving around on a display screen. The lines, if lengthened and connected, would have formed various geometric shapes, such as diamonds or triangles.

Participants had to determine whether the rotation of the incomplete geometric figures was clockwise or counterclockwise. This task was inherently difficult because the disconnected lines lacked the visual cues necessary to easily assess the direction of rotation. The participants' success rate was approximately 50-50, or no better than chance.

The participants were then given the hypnotic suggestion to imagine that something was blocking out part of each shape being observed. Afterward, they repeated the same task of determining the direction of rotation.

The results revealed that participants who were highly susceptible to hypnotic suggestion successfully "hallucinated" visual occluders on top of moving objects. This added imaginary element enabled the participants to better visualize the full geometric shapes and more accurately determine their direction of rotation. On average, their success rate improved to approximately 70%, a statistically significant change.
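To see why a jump from roughly 50% to 70% clears the bar for statistical significance, the sketch below runs a two-sided binomial test against chance performance. The 200-trial count is an assumption for illustration, not the number of trials in the experiment.

```python
# Is 70% correct on a two-alternative task distinguishable from 50% chance?
# The trial count is assumed for illustration only.
from scipy.stats import binomtest

n_trials = 200
n_correct = int(0.70 * n_trials)

result = binomtest(n_correct, n_trials, p=0.5, alternative="two-sided")
print(f"{n_correct}/{n_trials} correct, p = {result.pvalue:.2e}")
```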

The participants in the less hypnotizable group, however, performed no better on the task following hypnotic suggestion. "Although these results are consistent with our hypothesis, the data surprised us by revealing the decisive and robust nature of the effect," said Raz.

Previous work on hypnosis has often highlighted its capacity to suppress or remove certain perceptual experiences. The new research shows compelling evidence that a hypnotic suggestion can also enhance or introduce perceptual experiences.

"Our findings support the idea that, at least in some people, suggestions can add perceptual information to sensory input," said Raz. "This observation adds meaningful weight to theoretical, clinical, and applied aspects of the brain and psychological sciences."

Credit: 
Association for Psychological Science

Scientists publish a blueprint to apply artificial intelligence to extend human longevity

image: Applications of AI in longevity medicine

Image: 
Deep Longevity Limited

27th of January, Wednesday, Hong Kong - Deep Longevity, a fully owned subsidiary of Regent Pacific (HKEX: 0575) specializing in the development and application of next-generation artificial intelligence (AI) for aging and longevity research, today announced the publication of an article in Nature Aging titled "Artificial Intelligence in Longevity Medicine".

In the article, the authors describe a new field of study, referred to as Longevity Medicine, that converges AI, basic research, and medicine. Longevity Medicine can also be defined as the preventative and restorative medicine enabled by deep aging clocks and artificial intelligence.

The article was authored by Alex Zhavoronkov, founder and chief longevity officer of Deep Longevity and a computer scientist with a PhD in biophysics; Evelyne Yehudit Bischof, a practicing medical doctor trained at top European and US medical schools who is actively engaged in aging research and gero-oncology at the University Hospital Basel in Switzerland and at Shanghai University of Medicine and Health Sciences; and Kai-Fu Lee, one of the most prolific scientists and entrepreneurs in artificial intelligence.

The traditional approach to medicine is to treat diseases. However, scientists estimate (Cutler and Mattson, 2006) that complete elimination of cancer would increase life expectancy at birth in the US by only 2.3 years, and life expectancy at age 65 by 1.3 years. Complete elimination of influenza and pneumonia would yield gains of only 0.5 and 0.2 years in life expectancy.

These numbers are so small because there are many age-associated processes and diseases that manifest in unison in late life, so the elimination of just one individual cause does not lead to the intuitively assumed gains in life expectancy. The main driver of most of these diseases and processes is aging.

Aging is a universal feature shared by all living beings. Modern artificial intelligence systems have achieved superhuman accuracy in predicting various features and learning complex patterns from many data types. When trained to predict age using large longitudinal data sets, deep neural networks (DNNs) often learn the basic biological and physiological processes that transpire slowly over time, are highly interdependent, and result in pathologies.
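As a toy illustration of what such an aging clock looks like in code, the sketch below fits a small neural network to predict chronological age from a handful of blood-marker-like features. The data are simulated and the model is far shallower than the deep aging clocks the article describes.

```python
# Toy aging clock: a small neural network regressing age on simulated
# biomarker features. Real deep aging clocks use far larger models and
# large longitudinal datasets.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_people, n_markers = 2000, 12

markers = rng.normal(0, 1, size=(n_people, n_markers))
# Simulated ages depend weakly on a few markers plus noise.
age = 50 + 10 * markers[:, 0] - 5 * markers[:, 3] + rng.normal(0, 5, n_people)

X_train, X_test, y_train, y_test = train_test_split(markers, age, random_state=0)

clock = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
clock.fit(X_train, y_train)

errors = np.abs(clock.predict(X_test) - y_test)
print(f"mean absolute error: {errors.mean():.1f} years")
```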

In the article the authors describe the basic framework for the application of deep learning to longevity research and the opportunities for longevity medicine in clinical care and the longevity industry.

"Artificial intelligence holds great potential for medicine in general; however, the ability to track and learn the minute changes that transpire in human body every second over the patient's lifetime and in large number of patients enables the development of a new field of medicine - longevity medicine", said Evelyne Yehudit Bischof, a physician at Human Longevity, Inc, and associate professor at Shanghai University of Medicine and Health Sciences.

Credit: 
Deep Longevity Ltd

Genome-editing tool TALEN outperforms CRISPR-Cas9 in tightly packed DNA

image: The research team included, from left, postdoctoral researcher Saurabh Shukla, graduate student Che Yang, chemical and biomolecular engineering professor Huimin Zhao, physics professor Paul Selvin, postdoctoral researcher Zia Fatma, graduate student Xiong Xiong, and Surbhi Jain, a former doctoral student at the U. of I. who is now a group lead in cancer biology at Northwestern University. Composite image from separate photos, in compliance with COVID-19 safety protocols.

Image: 
Composite photo by L. Brian Stauffer

CHAMPAIGN, Ill. -- Researchers used single-molecule imaging to compare the genome-editing tools CRISPR-Cas9 and TALEN. Their experiments revealed that TALEN is up to five times more efficient than CRISPR-Cas9 in parts of the genome, called heterochromatin, that are densely packed. Fragile X syndrome, sickle cell anemia, beta-thalassemia and other diseases are the result of genetic defects in the heterochromatin.

The researchers report their findings in the journal Nature Communications.

The study adds to the evidence that a broader selection of genome-editing tools is needed to target all parts of the genome, said Huimin Zhao, a professor of chemical and biomolecular engineering at the University of Illinois Urbana-Champaign who led the new research.

"CRISPR is a very powerful tool that led to a revolution in genetic engineering," Zhao said. "But it still has some limitations."

CRISPR is derived from a bacterial immune system that detects invading viruses. It can be paired with one of several enzymes, such as Cas9, that cut viral genomes at specific sites. TALEN also scans DNA to find and target specific genes. Both CRISPR and TALEN can be engineered to target specific genes to fight disease, improve crop characteristics or serve other applications.

Zhao and his colleagues used single-molecule fluorescence microscopy to directly observe how the two genome-editing tools performed in living mammalian cells. Fluorescent-labeled tags enabled the researchers to measure how long it took CRISPR and TALEN to move along the DNA and to detect and cut target sites.
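As a rough illustration of how such measurements are typically quantified (this is not the authors' analysis pipeline), a fluorescence trace from a labeled protein can be thresholded and the lengths of the "on" intervals taken as dwell times at a site. The trace, frame time, and threshold below are simulated assumptions.

```python
# Hedged sketch: estimate dwell times from a simulated single-molecule
# fluorescence trace by thresholding intensity and measuring "on" intervals.
import numpy as np

rng = np.random.default_rng(1)
frame_time = 0.1                                   # seconds per frame (assumed)
trace = rng.normal(100, 10, size=5000)             # background intensity
for start in (500, 2200, 4000):                    # three simulated binding events
    trace[start:start + rng.integers(50, 300)] += 200

bound = trace > 200                                 # simple intensity threshold
edges = np.diff(bound.astype(int))
starts = np.where(edges == 1)[0]                    # off -> on transitions
ends = np.where(edges == -1)[0]                     # on -> off transitions
dwell_times = (ends - starts) * frame_time
print("dwell times (s):", dwell_times, "mean:", dwell_times.mean())
```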

"We found that CRISPR works better in the less-tightly wound regions of the genome, but TALEN can access those genes in the heterochromatin region better than CRISPR," Zhao said. "We also saw that TALEN can have higher editing efficiency than CRISPR. It can cut the DNA and then make changes more efficiently than CRISPR."

TALEN was as much as five times more efficient than CRISPR in multiple experiments.

The findings will lead to improved approaches for targeting various parts of the genome, Zhao said.

"Either we can use TALEN for certain applications, or we could try to make CRISPR work better in the heterochromatin," he said.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Carbon-chomping soil bacteria may pose hidden climate risk

image: Soil on a chip experiments conducted by Princeton researchers mimic the interactions between soils, carbon compounds and soil bacteria, producing new evidence that large carbon molecules can potentially escape the soil much faster than previously thought. In this microscopy image, soil bacteria (red) grow around aggregates of glucose (green) that stick to pores in a transparent synthetic clay.

Image: 
Judy Q. Yang

Much of the earth's carbon is trapped in soil, and scientists have assumed that potential climate-warming compounds would safely stay there for centuries. But new research from Princeton University shows that carbon molecules can potentially escape the soil much faster than previously thought. The findings suggest a key role for some types of soil bacteria, which can produce enzymes that break down large carbon-based molecules and allow carbon dioxide to escape into the air.

More carbon is stored in soil than in all the planet's plants and atmosphere combined, and soil absorbs about 20% of human-generated carbon emissions. Yet, factors that affect carbon storage and release from soil have been challenging to study, placing limits on the relevance of soil carbon models for predicting climate change. The new results help explain growing evidence that large carbon molecules can be released from soil more quickly than is assumed in common models.

"We provided a new insight, which is the surprising role of biology and its linkage to whether carbon remains stored" in soil, said coauthor Howard Stone, the Donald R. Dixon '69 and Elizabeth W. Dixon Professor of Mechanical and Aerospace Engineering.

In a paper published Jan. 27 in Nature Communications, the researchers, led by former postdoctoral fellow Judy Q. Yang, developed "soil on a chip" experiments to mimic the interactions between soils, carbon compounds and soil bacteria. They used a synthetic, transparent clay as a stand-in for clay components of soil, which play the largest role in absorbing carbon-containing molecules.

The "chip" was a modified microscope slide, or a microfluidic device, containing silicone-walled channels half a centimeter long and several times the width of a human hair (about 400 micrometers). Inlet and outlet tubes at each end of the channels allowed the researchers to inject the synthetic clay solution, followed by suspensions containing carbon molecules, bacteria or enzymes.

After coating the channels with the see-through clay, the researchers added fluorescently labeled sugar molecules to simulate carbon-containing nutrients that leak from plants' roots, particularly during rainfall. The experiments enabled the researchers to directly observe carbon compounds' locations within the clay and their movements in response to fluid flow in real time.

Both small and large sugar-based molecules stuck to the synthetic clay as they flowed through the device. Consistent with current models, small molecules were easily dislodged, while larger ones remained trapped in the clay.

When the researchers added Pseudomonas aeruginosa, a common soil bacterium, to the soil-on-a-chip device, the bacteria could not reach the nutrients lodged within the clay's small pores. However, the enzyme dextranase, which represents enzymes released by certain soil bacteria, could break down the nutrients within the synthetic clay and make smaller sugar molecules available to fuel bacterial metabolism. In the environment, this could lead large amounts of CO2 to be released from soil into the atmosphere.
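The mechanism described above can be captured in a toy two-pool kinetic model: extracellular enzymes liberate small sugars from clay-bound large carbon, and bacteria respire the sugars to CO2. The rate constants and pool sizes below are illustrative assumptions, not values from the study.

```python
# Toy kinetic sketch (not the paper's model): enzymes free small sugars from
# clay-bound large carbon; bacteria respire the sugars, releasing CO2.
dt, t_end = 0.01, 50.0                      # days
k_enzyme, k_respire = 0.10, 0.50            # rate constants, 1/day (assumed)
large_c, small_c, co2 = 100.0, 0.0, 0.0     # carbon pools, arbitrary units

for _ in range(int(t_end / dt)):
    breakdown = k_enzyme * large_c * dt     # enzyme frees small sugars from clay
    respiration = k_respire * small_c * dt  # bacteria consume sugars -> CO2
    large_c -= breakdown
    small_c += breakdown - respiration
    co2 += respiration

print(f"after {t_end:.0f} days: {co2:.1f} of 100 units of carbon released as CO2")
# With enzymes present (k_enzyme > 0), most of the "protected" carbon escapes;
# setting k_enzyme = 0 recovers the long-term-storage assumption.
```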

Researchers have often assumed that larger carbon compounds are protected from release once they stick to clay surfaces, resulting in long-term carbon storage. Some recent field studies have shown that these compounds can detach from clay, but the reason for this has been mysterious, said lead author Yang, who conducted the research as a postdoctoral fellow at Princeton and is now an assistant professor at the University of Minnesota.

"This is a very important phenomenon, because it's suggesting that the carbon sequestered in the soil can be released [and play a role in] future climate change," said Yang. "We are providing direct evidence of how this carbon can be released -- we found out that the enzymes produced by bacteria play an important role, but this has often been ignored by climate modeling studies" that assume clay protects carbon in soils for thousands of years.

The study sprang from conversations between Stone and coauthor Ian Bourg, an assistant professor of civil and environmental engineering and the High Meadows Environmental Institute. Stone's lab has used microfluidic devices to study the properties of synthetic fibers and bacterial biofilms, while Bourg has expertise in the surface geochemistry of clay minerals -- which are thought to contribute most to soil carbon storage due to their fine-scale structure and surface charges.

Stone, Bourg and their colleagues realized there was a need to experimentally test some of the assumptions in widely used models of carbon storage. Yang joined Stone's group to lead the research, and also collaborated with Xinning Zhang, an assistant professor of geosciences and the High Meadows Environmental Institute who investigates the metabolisms of bacteria and their interactions with the soil environment.

Jinyun Tang, a research scientist in the climate sciences department at Lawrence Berkeley National Laboratory, noted that in recent years he and others have observed the degradation of large carbon molecules in soils and hypothesized that it was mediated by biologically produced enzymes.

The Princeton team's observations "provide a very strong support to our hypothesis," said Tang, who was not involved in the study. He added that the study's technique could also be used to explore such questions as "Will the reversible interaction between small-size carbon molecules and clay particles induce carbon starvation to the microbes and contribute to carbon stabilization? And how do such interactions help maintain microbial diversity in soil? It is a very exciting start."

Future studies will test whether bacteria in the model system can release their own enzymes to degrade large carbon molecules and use them for energy, releasing CO2 in the process.

While the carbon stabilization Tang described is possible, the newly discovered phenomenon could also have the opposite effect, contributing to a positive feedback loop with the potential to exacerbate the pace of climate change, the study's authors said. Other experiments have shown a "priming" effect, in which increases in small sugar molecules in soil lead to release of soil carbon, which may in turn cause bacteria to grow more quickly and release more enzymes to further break down larger carbon molecules, leading to still further increases in bacterial activity.

Credit: 
Princeton University, Engineering School

New study identifies bird species that could spread ticks and Lyme disease

image: True thrushes, like the American robin (Turdus migratorius) pictured here, were flagged as likely competent Lyme hosts.

Image: 
Fyn Kynd.

Birds play an underrecognized role in spreading tickborne disease due to their capacity for long-distance travel and their tendency to split their time between different parts of the world - patterns that are shifting due to climate change. Knowing which bird species are able to infect ticks with pathogens can help scientists predict where tickborne diseases might emerge and pose a health risk to people.

A new study published in the journal Global Ecology and Biogeography used machine learning to identify bird species with the potential to transmit the Lyme disease bacterium (Borrelia burgdorferi) to feeding ticks. The team developed a model that identified birds known to spread Lyme disease with 80% accuracy and flagged 21 new species that should be prioritized for surveillance.

Lead author Daniel Becker, a Postdoctoral Fellow at Indiana University, says, "We know birds can infect ticks with the Lyme bacterium; however, until now, no one has systematically studied the ecological and evolutionary drivers that influence which bird species are most likely to host and spread Borrelia burgdorferi on a global scale. We set out to fill this gap by identifying traits of bird species that are most likely to pass Lyme to feeding ticks."

Senior author Barbara Han, a disease ecologist at Cary Institute of Ecosystem Studies, says, "To predict and monitor species that could spread tickborne diseases to people, we first need to know which traits make certain animals good pathogen hosts. Here, we used machine learning to assess bird species traits, paired with Lyme infection data from ticks found on birds, to predict bird species that have the potential to spread Lyme."

In this study, the team searched the published literature for studies reporting Lyme infection of ticks found feeding on birds. The global search yielded 102 studies, including data from ticks found on 183 bird species; of these, 91 carried ticks that tested positive for Borrelia burgdorferi. These bird species are considered 'competent' reservoir species because they are known to infect feeding ticks with Borrelia burgdorferi. The flagged species have broad ranges, spanning the Americas, Africa, Asia, and Oceania.

Next, machine learning was used to compare traits of competent bird species with those of 4,691 other bird species. Data included information on life history features like diet composition, foraging location, body size, lifespan, reproductive rate, and fledgling age, as well as geographical information like migration distance, global dispersal, and maximum elevation. The team also looked at baseline corticosterone - the stress hormone in birds - which can influence susceptibility to infection.

The model identified birds that were known to spread Lyme to ticks with 80% accuracy, and revealed 21 new species that should be prioritized for surveillance because they share traits with known competent species. High-risk species tend to have low baseline corticosterone, breed and winter at high latitudes and low elevations, have broad distributions, and sit at either extreme of the pace-of-life continuum (species that breed early and die young, or breed late and live longer).
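For readers curious how such a trait-based model might be assembled, here is a minimal sketch in the spirit of the approach described above. The trait columns, labels, and classifier choice are hypothetical stand-ins (the study used its own trait database and modeling decisions); the point is the workflow: train on species with known competence, then rank the remaining species by predicted probability.

```python
# Minimal sketch of a trait-based competence classifier with hypothetical
# trait columns and placeholder labels; not the study's data or model.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_species = 500
traits = pd.DataFrame({
    "body_mass_g": rng.lognormal(3.5, 1.0, n_species),
    "migration_distance_km": rng.uniform(0, 8000, n_species),
    "max_breeding_latitude": rng.uniform(-40, 70, n_species),
    "baseline_corticosterone": rng.gamma(2.0, 2.0, n_species),
    "ground_forager": rng.integers(0, 2, n_species),
})
# Placeholder labels: 1 = known to infect ticks with B. burgdorferi.
competent = rng.integers(0, 2, n_species)

model = GradientBoostingClassifier(random_state=0)
print("cross-validated accuracy:",
      cross_val_score(model, traits, competent, cv=5).mean())

model.fit(traits, competent)
# Rank species by predicted probability of competence, analogous to the
# 21 species the study flagged for surveillance.
traits["predicted_risk"] = model.predict_proba(traits)[:, 1]
print(traits.sort_values("predicted_risk", ascending=False).head())
```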

Species from the genus Turdus, commonly known as true thrushes, were found to have a significantly greater likelihood of competence compared to other taxa. This finding suggests that thrushes might be the riskiest bird species for Lyme transmission. Passerines, or perching birds, also tended to have higher competence, as did birds that primarily eat seeds and those that forage on the ground - a behavior that would put them in reach of questing ticks.

Identifying Lyme-competent bird species could have direct implications for our health. Tickborne diseases, especially Lyme disease, can be difficult to diagnose. Knowing where ticks and the diseases they carry are spreading can help medical practitioners prepare for diagnosis and treatment, improving health outcomes for patients.

Due to climate change, the breeding ranges of many birds are shifting north. As birds spread into higher latitudes, so do ticks and pathogens. Some bird species have taken up full or part-time residence in cities and suburbs. Birds that can succeed in developed environments, especially those that are overwintering in these new places in close proximity to people, increase residents' risk of contracting a tickborne disease.

Becker says, "Birds don't spread Lyme directly to people, but they can carry infected ticks to new locations with no history of Lyme occurrence. A tick could drop off a bird and into a garden or yard, where it could later bite and infect a person. If local medical practitioners are unfamiliar with Lyme symptoms, proper diagnosis could be delayed. Identifying where ticks are spreading could improve medical response to Lyme and other tickborne diseases."

Han concludes, "These findings remind us that pathogen competence varies tremendously, even among animals of the same family. Machine learning techniques allow us to analyze animal traits and help us predict risky species on a global scale - not only for Lyme, but for other tickborne and zoonotic diseases that involve multiple host species. These predictions could provide crucial information to guide early interventions, prevent disease spillover, and protect our health."

Credit: 
Cary Institute of Ecosystem Studies

Even machines need their greens

A tree grows strong from years of generating its own food. Now imagine if products could be strengthened with the same living materials that provide nutrients to strengthen trees. This is the work of USC Viterbi School of Engineering Civil and Environmental Engineering Professor Qiming Wang, whose research lab is one of the first to infuse 3D-printer ink with living material. The material has the potential to be stronger, more flexible, and self-healing. The work is documented in a paper published in the Proceedings of the National Academy of Sciences.

The idea for this bio-inspired ink came from trees, which harness the power of photosynthesis to produce glucose that is transformed into cellulose, strengthening the plant's cell structure. "When trees are young," says Wang, "they are flexible; when they are mature, they are rigid."

"The research idea is also inspired by Popeye the Sailor, the animated character who can strengthen his muscles by eating spinach," says Wang, whose research is focused on bioinspired manufacturing and mechanics of unprecedented materials and structures that can potentially address engineering challenges in infrastructure, energy, robotics, healthcare and the environment.

"Now, we are using scientific innovation to realize our childhood imagination," says Wang.

The research team behind this study, which includes USC Viterbi Ph.D. students Kunhao Yu and Zhangzhengrong Feng as lead authors along with Professor Nicholas X. Fang of the Massachusetts Institute of Technology and Professor Chiara Daraio of the California Institute of Technology, used a centrifuge to extract chloroplasts from spinach purchased from Trader Joe's. They blended the spinach chloroplasts with a newly invented 3D-printable polymer ink and used the ink to 3D-print structures. By applying light to the 3D-printed structures, they created conditions that generate plant-based glucose, which reacts with the polymer to make the material progressively stronger.

By applying two to four hours of light and mimicking the power of photosynthesis, the researchers believe this "living material" can self-strengthen to six times its original strength. What's more, the strengthening effect induced by the living chloroplasts can be temporarily suspended by freezing the material at 0°C (freezing temporarily slows the chloroplasts). Once the temperature returns to room temperature, the strengthening effect resumes.

"The material behaves like a snake that hibernates through the winter," Wang says.

"Such a temporary 'suspending behavior' has never been demonstrated in existing engineering materials," Wang adds.

Yu, a lead author on the paper, notes, "This technology with gradient light illumination can create engineering structures with gradient stiffness, which exhibit an exceptional 'cushioning' property far beyond that of homogeneous ones."

"Another striking finding is that the strengthening effect can be tuned by external force," said Feng, the paper's other lead author.

"When you hang a weight on a tree branch, that branch will become much stronger than other branches, a process called "mechanotransduction." The same phenomenon happens here.

The team envisions applying photosynthesis to materials to design a custom 3D-printed sneaker sole that molds to one's foot and has a customized stiffness.

Some plants exhibit a self-healing capability during grafting and wound repair. According to the researchers, the "living material" infused with chloroplasts in a lab at USC also presents an outstanding self-repairing property. The property is induced by the photosynthesis-produced glucose, which drives molecular cross-linking (in essence, the equivalent of sutures). Such crack-repairing capability could be applied in boat propellers or even drones.

Credit: 
University of Southern California