Culture

Xenophobic and racist policies in the US may have a harmful effect on birth outcomes

December 2, 2020 -- The first 2017 U.S. executive order instituting a travel ban on individuals from Muslim-majority countries may be associated with an increase in preterm births among women from those countries residing in the U.S., according to a new study conducted at Columbia University Mailman School of Public Health. The research also showed that structurally xenophobic and racist policies in the U.S. may have a harmful effect on early-life indicators of life-long health. The findings are published online in the journal Social Science & Medicine.

This is the first national study to consider the impact of a policy that is both xenophobic and Islamophobic (anti-immigrant and anti-Muslim) on birth outcomes of women from Muslim countries impacted by the 2017 travel ban.

"Our study provides new evidence about the importance of social characteristics of host countries and structurally stigmatizing contexts and reveals the potential public health implications of the global rise in xenophobia and populism," said Goleen Samari, PhD, assistant professor of population and family health at Columbia Mailman School, and principal investigator. "Even for populations that historically experience positive birth outcomes, anti-immigrant and Islamophobic policies are associated with abrupt and detrimental shifts in health outcomes."

The researchers conducted a national-level examination of women from Middle Eastern and North African (MENA) countries included in the 2017 travel ban, many of which have Muslim-majority populations. These populations are typically overlooked and understudied because of how religion, race and ethnicity are defined in health survey research.

Women from countries impacted by the 2017 travel ban experienced a nearly 7% increase in the odds of delivering a preterm infant between September 2017 and August 2018 -- a period that began approximately eight months after the executive order was enacted. Trends in preterm birth remained unchanged for native-born non-Hispanic White women.

The analysis controlled for seasonality, other forms of autocorrelation, and population-level shifts in preterm birth among all women giving birth.
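For readers unfamiliar with odds, a 7% increase in the odds of preterm birth does not translate one-to-one into a 7% increase in the preterm birth rate. A minimal sketch of the conversion, using a hypothetical baseline rate of 8.5% that is purely illustrative and not a figure from the study:

```python
def apply_odds_ratio(p, odds_ratio):
    """Convert a baseline probability p into the probability implied
    by multiplying the baseline odds by odds_ratio."""
    odds = p / (1 - p) * odds_ratio
    return odds / (1 + odds)

# Hypothetical baseline preterm birth rate of 8.5%, with the ~7%
# increase in odds reported in the study (odds ratio 1.07).
baseline = 0.085
shifted = apply_odds_ratio(baseline, 1.07)
print(round(shifted, 4))  # → 0.0904
```

With these assumed numbers, the absolute rate rises by roughly half a percentage point, a smaller change than a naive reading of "7%" might suggest.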

"As many countries continue to shut down migration systems because of the COVID-19 pandemic, it is important to understand the secondary and tertiary effects of such restrictions. This study, therefore, is an important contribution to the literature on xenophobia, structural racism, and health and perinatal demography for an often overlooked and understudied immigrant population."

Credit: 
Columbia University's Mailman School of Public Health

Plant-inspired alkaloids protect rice, kiwi and citrus from harmful bacteria

Plants get bacterial infections, just as humans do. When food crops and trees are infected, their yield and quality can suffer. Although some compounds have been developed to protect plants, few of them work on a wide variety of crops, and bacteria are developing resistance. Now, researchers reporting in ACS' Journal of Agricultural and Food Chemistry have modified natural plant alkaloids into new compounds that kill bacteria responsible for diseases in rice, kiwi and citrus.

Currently, no effective prevention or treatment exists for some plant bacterial diseases, including rice leaf blight, kiwifruit canker and citrus canker, which result in substantial agricultural losses every year. Scientists are trying to find new compounds that attack bacteria in different ways, reducing the chances that the microbes will develop resistance. Plant compounds called tetrahydro-β-carboline (THC) alkaloids are known to have antitumor, anti-inflammatory, antifungal, antioxidant and antiviral activities. So, Pei-Yi Wang, Song Yang and colleagues wondered whether derivatives of THC alkaloids could help fight plant bacterial diseases.

The researchers used a THC alkaloid called eleagnine, which is produced by Russian olive trees and some other plants, as a scaffold. To this framework, they added different chemical groups to make a series of new compounds, two of which efficiently killed three strains of plant pathogenic bacteria in liquid cultures. The team then tested the two compounds on rice, kiwi and citrus plant twigs and leaves and found that the new alkaloids could both prevent and treat bacterial infections. The researchers determined that the compounds worked by increasing levels of reactive oxygen species in the bacteria, which caused the bacterial cells to die.

Credit: 
American Chemical Society

Unmet job expectations linked to a rise in suicide, deaths of despair

Declines in blue-collar jobs may have left some working-class men frustrated by unmet job expectations and more likely to suffer an early death by suicide or drug poisoning, according to a study led by sociologists at The University of Texas at Austin.

In the study, the researchers compared life outcomes of 11,680 men to the job expectations they held as high school seniors in the early 1980s. The study showed that men who expected to work in jobs that did not require a college degree but later faced declines in the job market were nearly three times as likely to suffer early deaths by suicide and drug poisoning as men who sought work that required a bachelor's degree.

The study, published in JAMA Network Open, is the first to link the rise in suicide and drug-poisoning deaths among men without a college degree to declines in working-class jobs.

"Work plays a major role in how individuals experience their communities, derive a sense of purpose, and thus develop a sense of psychological well-being," said lead author Chandra Muller, a sociology professor and researcher at the Population Research Center at UT Austin. "It's possible that occupational expectations developed in adolescence serve as a benchmark for perceptions of adult success and, when unmet, pose a risk of self-injury."

Early death from self-injury has risen dramatically in recent decades, especially among middle-aged white men, whose deaths by suicide and drug poisoning increased by 9 and 31 per 100,000, respectively, between 1980 and 2013. At the same time, the labor market experienced a concerning trend: the decline of well-paying jobs that do not require a college degree.

Researchers from UT Austin, the University of Minnesota and the University of Wisconsin-Madison investigated the relationship between the two trends using data from the High School and Beyond cohort, a nationally representative sample of 11,680 men who were surveyed throughout high school in the early 1980s, again in 1992 (when they were 28-30 years old), and again in 2015.

Between 1992 and 2015, less than 6% of the sample had died. The researchers compared suicide and drug poisoning deaths, which are forms of self-injury, to other causes of early adult deaths, such as heart attacks and cancer. They found that the men most likely to suffer a death by suicide or drug poisoning were those who as adolescents expected to earn enough to support a family through some type of semi-skilled labor that later declined when they reached adulthood, such as manufacturing, mechanics and carpentry.

The study showed that neither educational attainment nor the actual job held increased the risk of death by self-injury. Furthermore, unmet occupational expectations were not associated with a higher risk of early death by natural or other causes. This comparison strengthened the researchers' conclusion that there is a link between the decline of working-class jobs and deaths of despair.

"Our findings suggest closed pathways to sustaining working-class jobs may contribute to men's increasing rates of suicide and drug-poisoning mortality," Muller said. "The social, psychological and cultural ideals associated with certain occupations are important considerations in labor policy, such as minimum wage policies or job retraining programs, as strategies for suicide prevention."

Credit: 
University of Texas at Austin

Ozone breaks down THC deposited on surfaces from thirdhand cannabis smoke

Second- and thirdhand tobacco smoke have received lots of attention, but much less is known about the compounds deposited on surfaces from cannabis smoke. Now, researchers reporting in ACS' Environmental Science & Technology have discovered that ozone -- a component of outdoor and indoor air -- can react with tetrahydrocannabinol (THC), the psychoactive component of cannabis, on glass or cotton surfaces to produce new compounds, which they characterized for the first time.

Smoking emits reactive chemicals that remain in the air (so-called secondhand smoke) or deposit onto surfaces, including walls, windows, clothing and upholstery (thirdhand smoke). Unlike the secondhand variety, thirdhand smoke lingers long after a person stops smoking. Nicotine is semi-volatile and reacts with other chemicals on surfaces, producing new compounds that, if volatile, can also become airborne. Because cannabis smoke is chemically distinct from tobacco smoke, Aaron Wylie and Jonathan Abbatt wanted to characterize the compounds formed when THC on surfaces, either on its own or deposited from cannabis smoke, reacts with ozone in the air.

The researchers coated glass and cotton cloth, to simulate windows and clothing, with a THC solution. Then, they exposed the surfaces to concentrations of ozone that could exist in indoor air. In their analysis, they found that over time, the amount of THC on glass and cotton decreased, while the quantities of three THC oxidation products increased. In other experiments, the team used a smoking machine to deposit cannabis smoke onto cotton. Upon exposure to ozone, the same three compounds formed at roughly the same rate as observed for the THC-coated cloth. Because of the low volatility of THC and its oxidation products, the compounds are unlikely to become airborne and be inhaled in amounts as large as nicotine is, the researchers say. However, someone could still be exposed to THC and its derivatives, whose health effects are unknown, by, for example, licking their fingers after touching a surface contaminated by cannabis smoke.

Credit: 
American Chemical Society

Less COVID-19 transmission seen in countries with more intense testing

Lacking vaccines, countries have relied on multiple non-pharmaceutical interventions to control COVID-19 transmission. Despite the World Health Organization's (WHO) urging in March to "test, test, test," policy makers disagree on how much testing is optimal. A new study, by Ravindra Prasan Rannan-Eliya and coauthors from the Institute for Health Policy in Colombo, Sri Lanka, uses data from multiple online sources to quantify the impact of testing on COVID-19 transmissibility in 173 countries and territories (accounting for 99 percent of the world's cases) between March and June 2020. The authors found that, among interventions, testing intensity had the greatest influence: a tenfold increase in the ratio of tests to new cases reported reduced average COVID-19 transmission by 9 percent. The authors note that this helps explain why countries such as China, Australia, and New Zealand achieved near elimination of COVID-19 and why lockdowns and other interventions failed to slow the spread of the virus in others, such as India and Peru. "Even the wealthiest countries, such as the US, UK, and Qatar, cannot expand testing and tracing fast enough to achieve epidemic control," the authors conclude. "Early and continuous aggressive testing to keep incidence within capacity to test, trace and isolate may be the best implementation of flattening the curve."
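The reported effect can be sketched as a simple log-linear relationship: each tenfold increase in the tests-per-new-case ratio multiplies average transmission by roughly 0.91. A minimal illustration, with hypothetical testing ratios rather than the study's actual data:

```python
import math

def transmission_multiplier(ratio_new, ratio_old, reduction_per_log10=0.09):
    """Multiplier on average transmission implied by moving from
    ratio_old to ratio_new tests per newly reported case, assuming a
    fixed 9% proportional reduction per tenfold increase."""
    log10_change = math.log10(ratio_new / ratio_old)
    return (1 - reduction_per_log10) ** log10_change

# Going from 10 to 1,000 tests per new case = two tenfold increases.
print(round(transmission_multiplier(1000, 10), 4))  # → 0.8281
```

Under this assumed model, two tenfold increases compound to about a 17% reduction in transmission, not 18%; the reductions multiply rather than add.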

Credit: 
Health Affairs

The tree of cortical cell types describes the diversity of neurons in the brain

The tree of life describes the evolution of life and seeks to define the relationships between species. Likewise, the tree of cell types aims to organize cells in the brain into groups and describe their relationships to each other.

Scientists have long pondered just what the brain's tree of cell types looks like. Now, an international collaboration led by Dr. Andreas Tolias from Baylor College of Medicine, Dr. Philipp Berens from the University of Tübingen in Germany and Dr. Rickard Sandberg from the Karolinska Institute in Stockholm, Sweden, has published an article in Nature that provides one of the most detailed and complete characterizations of the diversity of neural types in the brain so far.

Uncovering the shape of the tree of cortical cell types with Patch-seq

Neuroscientists mostly use three fundamental features to describe neurons: their anatomy, or how they look under a microscope; their physiology, or how they respond when stimulated; and, more recently, the genes they express, which are known as their transcriptome.

For this study, the research team used an experimentally challenging technique that they developed several years ago, called Patch-seq. This technique allowed them to collect a large multimodal database including genetic, anatomical and physiological information from single cells in the mouse motor cortex.

"Gathering all three of these fundamental features from the same set of neurons was the key that enabled us to get a much deeper understanding of how neurons in the motor cortex are related to each other and a clearer view of what the tree of cell types looks like," said co-first author Dr. Federico Scala, postdoctoral associate in the Tolias lab at Baylor.

Dr. Dmitry Kobak, also co-first author and a research scientist in the Berens lab, described that while the broad genetic families of neurons had distinct anatomical and physiological properties, within each family the neurons exhibited extensive anatomical and physiological diversity. Importantly, all three basic neuronal characteristics (anatomy, physiology and transcriptome) were correlated, which enabled the team to find interesting links between them.

"Our data supports the view that the tree of cortical cell types may look more like a banana tree with few big leaves rather than an olive tree with many small ones. This view provides a simpler model to describe the diversity of neurons we find in the brain. We believe that this simpler view will lead to a more principled understanding of why we have so many cell types in the brain to begin with and what they are used for," said Tolias, Brown Foundation Endowed Chair of Neuroscience and director of the Center for Neuroscience and Artificial Intelligence at Baylor.

In this metaphor, neurons follow a hierarchy consisting of distinct, non-overlapping branches at the level of families, the large leaves of the banana tree. Within each family, neurons show continuous variation in their genetic, anatomical and physiological features, and all three features within a family are correlated. In parallel work published simultaneously in Cell, scientists from the Allen Institute for Brain Science in Seattle obtained very similar results in mouse visual cortex, underscoring that this view of cell types may be a general organizing principle of brain circuits.

Credit: 
Baylor College of Medicine

Roly polies transfer environmental toxins to threatened fish populations in California

WASHINGTON--Roly poly bugs may be a source of fun for kids and adults, but these little bugs that roll into balls at the slightest touch are causing problems for some threatened fish.

New research finds steelhead trout in a stream on the California coast accumulate mercury in their bodies when the fish eat roly polies and similar terrestrial bugs that fall into local waterways. The new study corroborates earlier findings that mercury can make its way to the top of the food chain in coastal California.

The results show for the first time that roly polies and other bugs are transferring high levels of the toxic metal to fish in an otherwise pristine watershed where environmental contaminants are not known to be a concern, according to the researchers.

"Our research is the first step in identifying [mercury] as a potential stressor on these populations," said Dave Rundio, a research fishery biologist at NOAA's Southwest Fisheries Science Center and co-leader of the study that will be presented 8 December at AGU's Fall Meeting 2020.

It is unclear whether mercury accumulation in steelhead and other species would affect humans, but the findings suggest mercury can move between connected ecosystems and affect threatened or endangered species in unexpected ways, according to the researchers.

Tracing mercury's path

Mercury is a toxic metal that can cause neurological and developmental problems in humans. Some mercury occurs naturally and is not harmful to living things, but certain bacteria convert elemental mercury into methylmercury, an organic form that can become concentrated in the tissues of living organisms and accumulate in larger and larger amounts up through the food web.
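The compounding described above can be sketched with a hypothetical biomagnification factor applied at each trophic transfer. The concentrations and fivefold enrichment below are illustrative assumptions, not measurements from the study:

```python
def concentration_at_trophic_level(base_concentration, bmf, steps):
    """Concentration after `steps` trophic transfers, each of which
    multiplies the concentration by the biomagnification factor bmf."""
    return base_concentration * bmf ** steps

# Hypothetical: fog-deposited mercury at 1 unit in leaf litter, with a
# fivefold enrichment at each of three transfers
# (litter -> roly poly -> juvenile fish -> mature steelhead).
print(concentration_at_trophic_level(1.0, 5.0, 3))  # → 125.0
```

Even a modest per-step enrichment factor, compounded over a few trophic transfers, yields the large concentrations seen in top predators.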

A 2019 study by Peter Weiss-Penzias, an atmospheric chemist at the University of California Santa Cruz, found mercury from coastal California fog can make its way to the top of the terrestrial food chain and reach nearly toxic levels in the bodies of pumas.

In the new study, Rundio wanted to see whether mercury can accumulate in top predators in aquatic ecosystems as well as on land. Rundio, Weiss-Penzias and other colleagues looked specifically at mercury levels in steelhead trout, a kind of rainbow trout that is one of the top sport fish in North America and is culturally important to some Native American tribes.

The researchers took samples of steelhead trout and their prey from a stream in the Big Sur region of central California to see if they had elevated mercury concentrations in their tissues and to determine where that mercury came from.

Steelhead are predators with a varied diet in freshwater, eating anything from small fish and crustaceans to insects and even salamanders that fall into streams. Interestingly, terrestrial bugs like roly polies make up nearly half of a steelhead's diet in streams in coastal California.

The researchers found steelhead had elevated mercury levels in their tissues, and the older, more mature fish had more mercury than juveniles. Some of the mature stream trout they sampled had mercury concentrations that met or exceeded water quality and food consumption advisory levels.

From fog to fish

The researchers also found the terrestrial bugs the fish eat - most notably roly polies, a non-native species from Europe - had higher mercury concentrations than their aquatic counterparts. The findings suggest mercury rolls into coastal California through fog, is consumed by roly polies eating leaf detritus and water droplets, and moves up the food chain to the fish - similar to Weiss-Penzias's findings of how mercury makes it up the terrestrial food chain to pumas.

The new findings came as a surprise to Rundio because the steelhead they sampled live in a nearly pristine environment scientists thought to be free of environmental contaminants.

It is important to know where mercury accumulates in the environment and in food webs because it is a difficult toxin to get rid of, said Weiss-Penzias, who will present the work. Some toxins can be diluted enough in the environment that they become essentially harmless, but mercury becomes more concentrated as it moves up the food chain.

"We have to think of mercury in that sense and be extremely concerned about it because of the continual releases from coal combustion, gold mining, and other industrial processes," Weiss-Penzias said.

Credit: 
American Geophysical Union

Amphibian die-offs worsened malaria outbreaks in Central America

WASHINGTON--The global collapse of frogs and other amphibians due to the amphibian chytrid fungus exacerbated malaria outbreaks in Costa Rica and Panama during the 1990s and 2000s, according to new research.

The findings provide the first evidence that amphibian population declines have directly affected human health and show how preserving biodiversity can benefit humans as well as local ecosystems.

"This is like a small building block showing that there could be unwanted human health consequences of amphibian collapses, and so we should really be trying to account for these impacts," said Joakim Weill, an environmental economist at the University of California Davis who will present the results Tuesday, 8 December at AGU's Fall Meeting 2020. "We really view this as an important first step leveraging this type of interdisciplinary work, trying to tease out causal relationship between environmental change and human health."

The global spread of Batrachochytrium dendrobatidis, an extremely virulent fungal pathogen known as amphibian chytrid fungus, has been responsible for massive worldwide die-offs of amphibians since the 1980s. A 2019 study found the fungal disease has played a role in the decline of over 500 amphibian species over the past five decades and presumably caused extinctions of 90 species. The authors of that study referred to the die-offs as "the greatest recorded loss of biodiversity attributable to a disease."

Chytrid fungal disease traveled across Costa Rica and Panama from the early 1980s through the 2000s. Both countries experienced large increases in malaria cases following this rolling collapse of amphibian populations.

In the new study, researchers investigated whether these malaria outbreaks were connected to the amphibian declines because amphibians eat mosquitoes that transmit the disease. They compared the timing and spatial extent of amphibian die-offs with malaria cases in Costa Rica and Panama at the county level from 1976 to 2016.

The researchers found a significant increase in malaria cases in these countries that started immediately after the amphibian die-offs began and peaked 5 to 6 years after. In 1980, there were fewer than 1,000 cases of malaria in the two countries, but cases began to rise in 1990 and peaked at about 7,000 in Costa Rica in the mid-1990s and 5,000 in Panama in the mid-2000s.

Malaria cases went back down after this peak, and the researchers suspect this is due to local public health interventions like spraying of insecticides.

The results show some of the first evidence that species extinctions and biodiversity loss can directly affect human health, according to the researchers.

Other environmental factors like deforestation also played a role in exacerbating the outbreaks, but no other factor had as much of an impact on malaria cases as the amphibian declines, according to the study.

"We are able to find what really seems to be this striking causal relationship between amphibian declines and malaria," Weill said. "It's pretty incredible that we are finding anything in the first place, because these are events that occurred 40 years ago and the right people were in the right place to make observations about amphibian populations and human disease that we can use today to arrive at new insights."

Credit: 
American Geophysical Union

Study: Telemedicine use disparity during COVID-19 among head and neck cancer patients

image: Study co-author and otolaryngologist in Henry Ford Health System's Department of Otolaryngology - Head and Neck Surgery.

Image: 
Henry Ford Health System

DETROIT (December 2, 2020) - The use of telemedicine services has been shown to be exceptionally effective in meeting the health care needs of patients throughout the COVID-19 pandemic. But an analysis by Henry Ford Health System found that socioeconomic factors may affect how certain patient populations use the technology to access care.

In a Research Letter published in JAMA Otolaryngology - Head & Neck Surgery, Henry Ford researchers report that head and neck cancer patients who were low-income, on Medicaid or uninsured were more likely to complete a virtual visit by telephone rather than by video. They also found that women with a lower median household income were less likely to complete a telemedicine visit than men in the same income bracket.

Researchers said further study is needed to explain patients' reluctance to complete a video visit, which provides a more comprehensive health care assessment than a phone call with their doctor. "While virtual care may provide a promising platform for expanded access to care in some patients, it must be implemented in a way that it doesn't create barriers to already disadvantaged patient populations," said Samantha Tam, M.D., a study co-author and otolaryngologist in Henry Ford's Department of Otolaryngology - Head and Neck Surgery.

The pandemic-driven need for accessing care using telemedicine services prompted researchers to evaluate whether socioeconomic factors impacted a patient's ability to receive virtual care. In their retrospective study, they analyzed census-based socioeconomic data of head and neck cancer patients who had a telemedicine visit between March 17 and April 24, 2020 and compared the results to a similar cohort from the same time frame in 2019.

Data included patients' age, sex, race, insurance status, household income, education, marital and employment status, and English-speaking households. Patient visits were categorized by virtual visits using live audio and video, visits completed by telephone only, in-person visits and no-show or canceled visits.

Data from 401 patient encounters during the 2020 study period were collected. Of those, 346 encounters (86.3%) were completed by 234 patients: 87 (25.1%) were in-person visits, 170 (49.1%) were video visits and 89 (23.6%) were telephone visits. In comparison, the 2019 study found 551 of 582 visits (94.7%) were completed by 394 patients, with no telemedicine visits completed that year.

"We know that access to smartphones and video technology is not universal but almost everyone has access to a telephone," said Vivian Wu, M.D., a study co-author and otolaryngologist. "As virtual care expands during and after this pandemic, we must keep in mind that a phone call remains an important communication method for patients to talk to their doctor."

Because the retrospective study was observational, the research team did not evaluate whether patients had access to smartphones and internet connectivity.

Credit: 
Henry Ford Health

Having it both ways: a combined strategy in catalyst design for Suzuki cross-couplings

The Suzuki cross-coupling reaction is a widely used technique for combining organic compounds and synthesizing complex chemicals for industrial or pharmaceutical applications. The process requires the use of palladium (Pd) catalysts and, as of today, two main types of Pd-based materials are used in practice as heterogeneous catalysts.

The first type is 'metal-loaded catalysts', which consist of Pd atoms (active sites) loaded onto inert supports made of oxides or carbon-based materials. They are easy to prepare and offer a large surface area with active sites where the Suzuki reaction can happen. However, these catalysts degrade quickly with use as the active sites aggregate or detach from the support. The second type is 'intermetallic catalysts' -- compounds of Pd and another metal. Though much more stable and effective under mild conditions, these catalysts make poor use of the large quantities of Pd required, because few active sites actually end up exposed to the reaction medium. But what if both types of catalyst were combined to overcome their inherent limitations?

In a recent study published in ACS Catalysis, a team of scientists from Tokyo Tech, Japan, came up with a new idea for a heterogeneous catalyst. They chose nanoporous zirconium carbide (ZrC) as the support on which they grew ZrPd3 nanoparticles, which act as an intermetallic catalyst. Because both the support and the active compound have the same element (Zr), the chemical preparation of the catalyst is remarkably simple. The overall benefits, moreover, go far beyond that.

First, the new Pd-ZrC catalyst is highly stable because the active sites (ZrPd3) are anchored on the nanoporous ZrC support. This strong interaction between ZrPd3 and ZrC enhances the overall catalytic stability, enabling re-use of the Pd-ZrC catalyst for more than 15 cycles. In addition, the exposed Pd sites do not clump together but remain dispersed throughout the support, providing a much larger effective area than intermetallic catalysts alone. The even distribution of ZrPd3 over the surface of the support also means that a smaller amount of palladium is needed for the same number of active sites compared with other intermetallic catalysts -- a measure referred to as Pd atom economy.

Perhaps most important is the fact that these benefits come with no strings attached; the actual performance, i.e. turnover frequency, of the new catalyst is higher than that of commercially available compounds. Professor Hideo Hosono, who led the study, explains: "Because the Pd-ZrC has both negatively charged Pd and a strong electron-donation ability, our catalyst achieved high catalytic performance for the Suzuki cross-coupling reaction even at room temperature."

Overall, the results of both the theoretical and experimental analyses conducted by the team of scientists confirm that their strategy is very promising for the development of future catalysts, as Prof Hosono remarks: "Our observations have proved the effectiveness of combining intermetallic catalysts with supports to improve upon multiple aspects simultaneously, showing that we can increase the degrees of freedom in the design of heterogeneous catalysts."

Improving catalysts is a practical way of reducing the economic and environmental costs associated with the synthesis of complex chemicals. Only time will tell how many novel catalyst designs are inspired by the strategy adopted in this study!

Credit: 
Tokyo Institute of Technology

When the rains stopped

Climate Change as a Catalyst in Greater Cahokia

Water and air are highly mutable resources that exist in a myriad of physical states and dimensions, and due to their affectivity, these entities participate in a multitude of interactions capable of sustaining life, transforming environments, and shaping human behavior. As air and water circulate between the atmosphere and the landscape through the process of evapotranspiration, humans interact with and form relationships--or bio-cultural associations--with these substances. Facets of human life, like breathing, cooking, bathing, agriculture, and engaging with the outdoors, become intertwined with a region's hydroclimate. Interactions with air and water, in turn, influence the ways humans construct and modify their societies.

As the climate shifts, bio-cultural associations are often altered in the process. Archaeologists analyze the impacts of climate change on human history and have frequently identified correlations between the Medieval Climate Optimum--spanning from the 9th century to the 13th century--and periods of societal change.

Taking this correlation into account, Timothy R. Pauketat, in the article "When the Rains Stopped: Evapotranspiration and Ontology at Ancient Cahokia," published in the Journal of Anthropological Research, explores how the trajectory of the Medieval Climate Optimum (MCO) aligns with the history of Greater Cahokia--an ancient indigenous city in the Mississippi River valley. By analyzing air flow and precipitation levels during the MCO, Pauketat examines how evapotranspiration shaped life in the Mississippi valley and argues it played a critical role in determining the progression of Cahokian urbanism. In particular, Pauketat focuses on the fluctuating popularity of an institutionalized form of evapotranspiration--a sacred rite referred to as "Steam Bath Ceremonialism" (SBC).

Utilizing an ontological approach that emphasizes relationships between human and non-human entities, Pauketat elaborates on how the river basin's weather extremes and strong storms were interpreted as spiritual transfers of power from the atmosphere to humanity. Steam Bath Ceremonialism presents another example of transferring powerful energy. In this transubstantiation ritual, liquid water was converted into steam, and those in attendance absorbed the steam and its healing energy. While circular Cahokian steam baths initially could only be found at a few sites, medicine bundle transfers enabled the spread of steam baths to smaller, rural regions.

Widespread acceptance of Steam Bath Ceremonialism was one of many changes in Greater Cahokia during a period of urbanization Pauketat has designated as the city's "Big Bang." Around 1050 CE, new architectural styles and elements--particularly ones associated with water or lunar cycles--were embraced as the city was planned according to a precinct grid and as older villages were replaced by mounds, plazas, cypress post arrangements, religious buildings, and borrow pits. Shrines were expanded, and causeways were constructed to establish pathways to mounds with steam baths.

While Greater Cahokia was accustomed to substantial amounts of rainfall, Palmer Drought Severity Index (PDSI) models reveal a hydroclimatic shift throughout the 12th century, resulting in less rainfall and progressively drier conditions. Pauketat suggests the reduction in precipitation served as a catalyst for dramatic changes, such as the migration of farmers, the construction of defensive barriers, the concealment of food supplies, and the decline of Steam Bath Ceremonialism.

Credit: 
University of Chicago Press Journals

No poaching occurring within most Channel Islands marine protected areas

image: A school of blue rockfish at one of the study sites.

Image: 
Katie Davis

NEWPORT, Ore. - Fish are thriving and poachers are staying out of marine protected areas around California's Channel Islands, a new population analysis by an Oregon State University researcher shows.

The analysis estimates fish populations and harvest rates based on the numbers of larger, older fish present; a scarcity of large fish indicates higher harvest rates. Researchers found harvest rates essentially at zero for four species of kelp forest fish inside the marine protected areas between 2003 and 2017, but found much higher rates of harvesting at unprotected sites nearby.
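The core idea, that fishing selectively removes large fish, can be illustrated with a toy calculation. This is a deliberately simplified sketch, not the study's actual population model: the surveys, the 35 cm "large" cutoff, and the function `large_fraction` are all invented for illustration.

```python
def large_fraction(lengths_cm, large_cutoff_cm=35.0):
    """Fraction of surveyed fish at or above a 'large' size cutoff.

    A higher fraction of large fish is consistent with little or no
    harvesting; a low fraction suggests heavier fishing pressure.
    """
    if not lengths_cm:
        raise ValueError("empty survey")
    big = sum(1 for length in lengths_cm if length >= large_cutoff_cm)
    return big / len(lengths_cm)

# Toy length surveys (cm): inside an MPA vs. an unprotected site nearby.
inside_mpa = [12, 18, 22, 27, 31, 36, 40, 44, 48, 52]
outside    = [11, 14, 16, 19, 21, 23, 26, 28, 33, 37]

f_in = large_fraction(inside_mpa)   # 0.5 -> consistent with an unfished site
f_out = large_fraction(outside)     # 0.1 -> consistent with heavier harvest
```

The actual analysis fits a population model to survey data to estimate harvest rates statistically; this snippet only conveys why the relative abundance of large, old fish is informative.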

"We expected we would find more large fish in the marine protected areas, and that is exactly what happened," said Will White, a marine ecologist with Oregon State's Coastal Oregon Marine Experiment Station in Newport and the paper's lead author. "We looked at the data inside and outside the marine protected areas and we found that there is no evidence of fishing inside the boundaries, but there is a lot of fishing going on just outside the boundaries."

The study is believed to be the first to directly assess whether marine protected areas have eliminated harvesting of fish inside their boundaries, said White, who is also an assistant professor in the OSU College of Agricultural Sciences' Department of Fisheries and Wildlife.

The findings were published in the journal Conservation Letters. Co-authors include Mark Yamane, an undergraduate student participating in a National Science Foundation-sponsored research experience at Hatfield Marine Science Center; Kerry Nickols of California State University, Northridge; and Jennifer Caselle of the University of California, Santa Barbara. The research was supported by the National Science Foundation and the David and Lucile Packard Foundation.

The state of California first established a network of 13 marine protected areas around the northern Channel Islands off the coast of southern California in 2003.

Marine protected areas that restrict or prohibit the harvest of wild populations of fish or other sea life are seen as a way to help replenish populations that have been overfished. They are an increasingly popular tool for fisheries management around the United States and throughout the world.

The assumption is that harvesting ceases in the protected areas, but determining if or how much illegal harvesting, or poaching, is occurring has been a challenge for researchers, which led White to develop a new method for analyzing fish populations.

The Partnership for Interdisciplinary Studies of Coastal Oceans, or PISCO, has been conducting long-term surveys of kelp forests along the West Coast for more than 20 years. PISCO researchers surveyed fish populations throughout the Channel Islands, both inside and outside the current marine protected areas, each year from 1999 through 2017.

White used that repository of data to develop the new analysis, which estimates harvest rates based on the number of larger, older fish present at a given time. The researchers were able to compare fish populations of four species of kelp forest fish year-over-year after the marine protected areas were in place.

The researchers found that overall, the harvest rates of fish in the protected areas were essentially at zero, indicating that fishing has ceased in the protected areas. Two locations showed a low harvest rate, suggesting either that some poaching is occurring or, more likely, that fish in those areas stray outside the protected boundaries, White said.

"In an unfished population, you'd expect to have lots of little fish, some medium fish and some really big fish. In a heavily fished area, you would have little fish and some medium fish, but not many big fish," he said. "The question is, 'Do these look like unfished populations?' and the answer is, they do."

However, in some locations outside the marine protected areas, harvest rates are much higher than expected, possibly because there is more fishing there since the marine protected areas were established. Those shifts in fishing behavior are something fisheries managers and policymakers would want to consider, as well.

"People are fishing where they are supposed to," White said, "but that may have additional repercussions."

The new analysis could be a useful tool for understanding the effectiveness of marine protected areas at a time when fisheries management officials in California and Oregon are preparing to embark on evaluations of their programs, White said. The state of California is expected to begin its review in 2022 and Oregon in 2023.

Credit: 
Oregon State University

Chaotic early solar system collisions resembled 'asteroids' arcade game

image: An elemental X-ray map of a portion of the Peekskill meteorite. Different colors correspond to different elements.

Image: 
Michael Lucas.

One Friday evening in 1992, a meteorite ended a journey of more than 150 million miles by smashing into the trunk of a red Chevrolet Malibu in Peekskill, New York. The car's owner reported that the 30-pound remnant of the earliest days of our solar system was still warm and smelled of sulfur.

Nearly 30 years later, a new analysis of that same Peekskill meteorite and 17 others by researchers at The University of Texas at Austin and the University of Tennessee, Knoxville, has led to a new hypothesis about how asteroids formed during the early years of the solar system.

The meteorites studied in the research originated from asteroids and serve as natural samples of the space rocks. They indicate that the asteroids formed through violent bombardment and subsequent reassembly, a finding that runs counter to the prevailing idea that the young solar system was a peaceful place.

The study was published in print Dec. 1 in the journal Geochimica et Cosmochimica Acta.

The research began when co-author Nick Dygert was a postdoctoral fellow at UT's Jackson School of Geosciences studying terrestrial rocks using a method that could measure the cooling rates of rocks from very high temperatures, up to 1,400 degrees Celsius.

Dygert, now an assistant professor at the University of Tennessee, realized that this method -- called a rare earth element (REE)-in-two-pyroxene thermometer -- could work for space rocks, too.

"This is a really powerful new technique for using geochemistry to understand geophysical processes, and no one had used it to measure meteorites yet," Dygert said.

Since the 1970s, scientists have been measuring minerals in meteorites to figure out how they formed. The work suggested that meteorites cooled very slowly from the outside inward in layers. This "onion shell model" is consistent with a relatively peaceful young solar system where chunks of rock orbited unhindered. But those studies were only capable of measuring cooling rates from temperatures near about 500 degrees Celsius.

When Dygert and Michael Lucas, a postdoctoral scholar at the University of Tennessee who led the work, applied the REE-in-two-pyroxene method, with its much higher sensitivity to peak temperature, they found unexpected results. From around 900 degrees Celsius down to 500 degrees Celsius, cooling rates were 1,000 to 1 million times faster than at lower temperatures.

How could these two very different cooling rates be reconciled?

The scientists proposed that asteroids formed in stages. If the early solar system was, much like the old Atari game "Asteroids," rife with bombardment, large rocks would have been smashed to bits. Those smaller pieces would have cooled quickly. Afterward, when the small pieces reassembled into larger asteroids we see today, cooling rates would have slowed.

To test this rubble pile hypothesis, Jackson School Professor Marc Hesse and first-year doctoral student Jialong Ren built a computational model of a two-stage thermal history of rubble pile asteroids for the first time.

Because of the vast number of pieces in a rubble pile--10^15, or a thousand trillion--and the vast array of their sizes, Ren had to develop new techniques to account for changes in mass and temperature before and after bombardment.

"This was an intellectually significant contribution," Hesse said.

The resulting model supports the rubble pile hypothesis and provides other insights as well. One implication is that cooling slowed so dramatically after reassembly not because the rock shed heat in layers, but because the rubble pile contained pores.

"The porosity reduces how fast you can conduct heat," Hesse said. "You actually cool slower than you would have if you hadn't fragmented because all of the rubble makes kind of a nice blanket. And that's sort of unintuitive."
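The "blanket" effect Hesse describes can be sketched with a back-of-the-envelope calculation: pore space lowers the effective thermal conductivity of a rubble pile, which lengthens the conductive cooling timescale t ~ R² ρ c / k. The (1 - φ)^1.5 mixing law and every parameter value below are illustrative assumptions, not figures from the paper.

```python
def effective_conductivity(k_solid, porosity):
    """Crude power-law estimate of conductivity of a porous aggregate."""
    return k_solid * (1.0 - porosity) ** 1.5

def cooling_timescale_yr(radius_m, k, rho=3300.0, heat_capacity=800.0):
    """Conductive cooling timescale t ~ R^2 * rho * c / k, in years."""
    seconds = radius_m ** 2 * rho * heat_capacity / k
    return seconds / (3600 * 24 * 365.25)

k_solid = 3.0  # W/(m K), roughly chondritic rock (assumed value)
k_porous = effective_conductivity(k_solid, porosity=0.4)

# Compare a solid vs. a 40%-porous body of the same 50 km radius.
t_solid = cooling_timescale_yr(50_000, k_solid)
t_porous = cooling_timescale_yr(50_000, k_porous)
# t_porous / t_solid = k_solid / k_porous, roughly a factor of 2:
# the porous pile cools more slowly, despite having been fragmented.
```

The scaling makes the counterintuitive point concrete: halving the conductivity doubles the cooling timescale, so a reassembled pile of insulating rubble can cool more slowly than the intact parent body would have.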

Tim Swindle of the Lunar and Planetary Laboratory at the University of Arizona, who studies meteorites but was not involved in the research, said that this work is a major step forward.

"This seems like a more complete model, and they've added data to part of the question that people haven't been talking about, but should have been. The jury is still out, but this is a strong argument."

The biggest implication of the new rubble pile hypothesis, Dygert said, is that these collisions characterized the early days of the solar system.

"They were violent, and they started early on," he said.

Credit: 
University of Texas at Austin

New machine learning tool tracks urban traffic congestion

image: TranSEC's near real-time display of traffic state estimation in the entire Los Angeles Metro Area at 6 p.m. on a weekday. This display was computed in less than one hour. Green areas indicate traffic is flowing freely and yellow and red areas indicate congestion.

Image: 
(Image courtesy of Arun Sathanur | Pacific Northwest National Laboratory)

A new machine learning algorithm is poised to help urban transportation analysts relieve bottlenecks and chokepoints that routinely snarl city traffic.

The tool, called TranSEC, was developed at the U.S. Department of Energy's Pacific Northwest National Laboratory to help urban traffic engineers get access to actionable information about traffic patterns in their cities.

WATCH: https://www.youtube.com/watch?v=8S4bLv9CtOo
(Video by Graham Bourque | Pacific Northwest National Laboratory)

Currently, publicly available traffic information at the street level is sparse and incomplete. Traffic engineers generally have relied on isolated traffic counts, collision statistics and speed data to determine roadway conditions. The new tool uses traffic datasets collected from Uber drivers and other publicly available traffic sensor data to map street-level traffic flow over time. It creates a big picture of city traffic using machine learning tools and the computing resources available at a national laboratory.

"What's novel here is the street level estimation over a large metropolitan area," said Arif Khan, a PNNL computer scientist who helped develop TranSEC. "And unlike other models that only work in one specific metro area, our tool is portable and can be applied to any urban area where aggregated traffic data is available."

UBER-fast traffic analysis

TranSEC (which stands for transportation state estimation capability) differentiates itself from other traffic monitoring methods by its ability to analyze sparse and incomplete information. It uses machine learning to connect segments with missing data, and that allows it to make near real-time street level estimations.
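To make the gap-filling idea tangible, here is a toy illustration: treat road segments as nodes of a graph and iteratively estimate a missing segment's speed from the average of its neighbors. This is a stand-in for TranSEC's actual graph-based learning, whose internals the article does not detail; the network, speeds, and the `fill_missing` routine below are invented for illustration.

```python
from statistics import mean

# Road segments and which segments they connect to at intersections.
adjacency = {
    "A": ["B", "C"],
    "B": ["A", "C", "D"],
    "C": ["A", "B", "D"],
    "D": ["B", "C"],
}

# Observed average speeds (mph) from probe-vehicle data; "C" has no data.
observed = {"A": 30.0, "B": 20.0, "D": 40.0}

def fill_missing(adjacency, observed, iterations=50):
    """Estimate speeds on unobserved segments by neighbor averaging."""
    speeds = dict(observed)
    missing = [s for s in adjacency if s not in observed]
    for s in missing:
        speeds[s] = mean(observed.values())  # crude initial guess
    for _ in range(iterations):
        for s in missing:
            speeds[s] = mean(speeds[n] for n in adjacency[s])
    return speeds

speeds = fill_missing(adjacency, observed)
# Segment "C" settles at the mean of its neighbors A, B, D: 30.0 mph
```

A real system would learn travel times and route choices jointly from large aggregated datasets, but the sketch shows why connectivity lets sparse observations constrain the unobserved parts of the network.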

In contrast, the map features on our smartphones can help us optimize our journey through a city landscape, pointing out chokepoints and suggesting alternate routes. But smartphone tools only work for an individual driver trying to get from point A to point B. City traffic engineers are concerned with how to help all vehicles get to their destinations efficiently. Sometimes a route that seems efficient for an individual driver leads to too many vehicles trying to access a road that wasn't designed to handle that volume of traffic.

Using public data from the entire 1,500-square-mile Los Angeles metropolitan area, the team reduced the time needed to create a traffic congestion model by an order of magnitude, from hours to minutes. The speed-up, accomplished with high-performance computing resources at PNNL, makes near-real-time traffic analysis feasible. The research team recently presented that analysis at the August 2020 virtual Urban Computing Workshop as part of the Knowledge Discovery and Data Mining (SIGKDD) conference, and in September 2020 they sought the input of traffic engineers at a virtual meeting on TranSEC.

"TranSEC has the potential to initiate a paradigm shift in how traffic professionals monitor and predict system mobility performance," said Mark Franz, a meeting attendee and a research engineer at the Center for Advanced Transportation Technology, University of Maryland, College Park. "TranSEC overcomes the inherent data gaps in legacy data collection methods and has tremendous potential." 

Machine learning improves accuracy over time

The machine learning feature of TranSEC means that as more data is acquired and processed it becomes more refined and useful over time. This kind of analysis is used to understand how disturbances spread across networks. Given enough data, the machine learning element will be able to predict impacts so that traffic engineers can create corrective strategies.

"We use a graph-based model together with novel sampling methods and optimization engines, to learn both the travel times and the routes," said Arun Sathanur, a PNNL computer scientist and a lead researcher on the team. "The method has significant potential to be expanded to other modes of transportation, such as transit and freight traffic. As an analytic tool, it is capable of investigating how a traffic condition spreads."

With PNNL's data-driven approach, users can upload real-time data and update TranSEC on a regular basis in a transportation control center. Engineers can use short-term forecasts for decision support to manage traffic issues. PNNL's approach is also extensible to include weather or other data that affect conditions on the road.

Computing power for transportation planners nationwide

Just as situational awareness of conditions informs an individual driver's decisions, TranSEC's approach provides situational awareness on a system-wide basis to help reduce urban traffic congestion.

"Traffic engineers nationwide have not had a tool to give them anywhere near real-time estimation of transportation network states," said Robert Rallo, PNNL computer scientist and principal investigator on the TranSEC project. "Being able to predict conditions an hour or more ahead would be very valuable, to know where the blockages are going to be."

While running a full-scale city model still requires high-performance computing resources, TranSEC is scalable. For example, a road network with only the major highways and arterials could be modeled on a powerful desktop computer.

"We are working toward making TranSEC available to municipalities nationwide," said Katherine Wolf, project manager for TranSEC.

Eventually, after further development, TranSEC could be used to help program autonomous vehicle routes, according to the research team.

Credit: 
DOE/Pacific Northwest National Laboratory

What social distancing does to a brain

image: Expression levels of the neuropeptide Pth2 in the zebrafish brain track the presence and density of others in the environment.

Image: 
Max Planck Institute for Brain Research / J. Kuhl

Have you recently wondered how social-distancing and self-isolation may be affecting your brain? An international research team led by Erin Schuman from the Max Planck Institute for Brain Research discovered a brain molecule that functions as a "thermometer" for the presence of others in an animal's environment. Zebrafish "feel" the presence of others via mechanosensation and water movements, which turn the brain hormone on.

Varying social conditions can cause long-lasting changes in animal behavior. Social isolation, for instance, can have devastating effects on humans and other animals, including zebrafish. The brain systems that sense the social environment, however, are not well understood. To probe whether neuronal genes respond to dramatic changes in the social environment, graduate student Lukas Anneser and colleagues raised zebrafish either alone or with their kin for different periods of time. The scientists used RNA sequencing to measure the expression levels of thousands of neuronal genes.

Tracking social density

"We found a consistent change in expression for a handful of genes in fish that were raised in social isolation. One of them was parathyroid hormone 2 (pth2), coding for a relatively unknown peptide in the brain. Curiously, pth2 expression tracked not just the presence of others, but also their density. Surprisingly, when zebrafish were isolated, pth2 disappeared in the brain, but its expression levels rapidly rose, like a thermometer reading, when other fish were added to the tank," explains Anneser.

Thrilled by this discovery, the scientists tested if the effects of isolation could be reversed by putting the previously isolated fish into a social setting. "After just 30 minutes swimming with their kin, there was a significant recovery of the pth2 levels. After 12 hours with kin the pth2 levels were indistinguishable from those seen in socially-raised animals," says Anneser. "This really strong and fast regulation was unexpected and indicated a very tight link between gene expression and the environment."

So which sensory modality do the animals use to detect others and drive changes in gene expression? "It turned out that the sensory modality that controls pth2 expression was not vision, taste or smell, but rather mechanosensation - they actually 'felt' the physical movements of the swimming neighboring fish," explains Schuman.

Sensing water movements

Fish perceive movement ("mechano-sense") in their immediate vicinity via a sensory organ called the lateral line. To test the role of mechanosensation in driving pth2 expression, the team ablated the mechanosensitive cells within the fish's lateral line. In previously isolated animals, the ablation of the lateral line cells prevented rescue of the neuro-hormone that was usually induced by the presence of other fish.

Just as we humans are sensitive to touch, zebrafish appear to be specifically tuned to the swimming motion of other fish. The scientists saw changes in pth2 levels caused by water movements triggered by conspecifics in the tank. "Zebrafish larvae swim in short bouts. We mimicked this water stimulation by programming a motor to create artificial fish movements. Intriguingly, in previously isolated fish the artificial movements rescued pth2 levels just like the real neighboring fish," explains Anneser.

"Our data indicate a surprising role for a relatively unexplored neuropeptide, Pth2: it tracks and responds to the population density of an animal's social environment. It is clear that the presence of others can have dramatic consequences on an animal's access to resources and ultimate survival - it is thus likely that this neuro-hormone will regulate social brain and behavioral networks," concludes Schuman.

Credit: 
Max-Planck-Gesellschaft