COVID-19 denial depends on a population's trust in social institutions

An international team of scholars studied how the COVID-19 pandemic has impacted Europeans' stress levels and their trust in national governments and healthcare systems. They found that respondents were most stressed by the state of the national economy, and only after that by the risk of catching COVID-19 and possibly being hospitalized. The results of the study were published in Royal Society Open Science.

The authors of the study represent over 50 universities. Among them is Dmitrii Dubrov, Junior Research Fellow at the HSE Center for Sociocultural Research, who developed and organized the global survey, COVIDiSTRESS. The researchers studied the psychological consequences of the current pandemic-related crisis, as reflected in stress levels. Over 150,000 respondents from over 50 countries participated in the study. The results (below) include answers from 75,570 respondents in 27 countries of the European Union (EU), who were surveyed from March 30 to April 20, 2020.

The general level of respondents' stress was measured on the 10-item scale developed by psychologists Cohen, Kamarck, and Mermelstein (1983). This scale captures people's stress levels over the past week. The study participants were asked, for example, whether they had experienced a lack of control over events, felt pressure due to growing difficulties, or felt disappointment due to unexpected change. Scores over 2.4 points were considered moderate, while those over 3.7 were considered high.
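Scoring such a scale amounts to averaging the item responses and applying the reported cutoffs. Here is a minimal sketch of that arithmetic; the function name and the respondent's answers are invented for illustration and are not the study's code or data.

```python
def classify_stress(item_scores):
    """Average 10 perceived-stress items and apply the article's cutoffs:
    a mean above 2.4 counts as moderate stress, above 3.7 as high."""
    mean_score = sum(item_scores) / len(item_scores)
    if mean_score > 3.7:
        return mean_score, "high"
    if mean_score > 2.4:
        return mean_score, "moderate"
    return mean_score, "low"

# Hypothetical respondent's answers to the 10 items:
score, level = classify_stress([3, 4, 2, 3, 4, 3, 2, 4, 3, 3])
print(f"mean = {score:.2f} -> {level} stress")  # mean = 3.10 -> moderate stress
```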

'Stress is a natural human reaction to negative change. We wanted to find out how humans would behave under stress, during the pandemic, whether they would follow recommendations by the WHO and authorities on how to protect oneself and others from COVID-19,' explains Dmitrii Dubrov, Junior Research Fellow at the HSE Centre for Sociocultural Research.

In many EU countries, levels of stress were moderate or even low. Poland and Portugal demonstrated the highest levels of stress in Europe, while the lowest rates were registered in Denmark and the Netherlands. Women worried more about the pandemic's consequences than men. The respondents were 74.18% female and 24.63% male.

The study participants also reported their sources of stress. The results showed that Europeans are most concerned about the state of the national economy, with the risk of catching COVID-19 and being hospitalized coming in second place. A total of 24 factors were indicated, including concerns about family and friends, work, or feeling isolated.

The respondents were also asked about their trust in six key institutions, including the healthcare system, the WHO, the police, social services, and national governments. Europeans demonstrated the highest levels of trust in their national healthcare systems and the WHO. Trust in national governments was lower than in other institutions. Finland and Denmark demonstrated the highest levels of trust in their governments. By contrast, people in Bulgaria and Poland were much less inclined to trust their respective national governments.

The participants also evaluated the adequacy of anti-COVID measures implemented by their governments. Citizens of Slovenia and Slovakia believed the national measures to be excessive, while people in Hungary and France thought they were insufficient. Populations in countries where people rate their governments' efforts more favorably also comply better with social distancing guidelines.

'We have learned that COVID-19 denial depends on people's trust in social institutions, a belief that the government won't leave them on their own with their problems. Institutional trust is impacted by many factors, such as the level of corruption in the country. The results of our study can be used to prepare recommendations on how governments should communicate with people in situations of uncertainty. As we discovered here, the problem is global, which means that systematic work with citizens' demands is needed,' Dmitrii Dubrov said.

Credit: 
National Research University Higher School of Economics

The impact of geopolitical boundaries on cycad conservation efforts

image: The endangered Cycas micronesica as found in the savannas of Southern Guam.

Image: 
University of Guam

Geopolitical boundaries can have a profound effect on the protection of threatened species. A case in point is the native cycads of the United States. A recent review paper written by researchers at the Western Pacific Tropical Research Center at the University of Guam highlights extinction risks of cycad species that occur in U.S. controlled lands and the profound effect geopolitical boundaries have had on the protection of these threatened species. The paper appears in the December 2020 issue of the MDPI journal Diversity.

Cycads are the most threatened plant order worldwide. This is due to a combination of factors including habitat loss, poaching, predation by invasive species, and lack of appropriate conservation measures. As a native habitat for this most endangered group of plants in the world, the United States has a responsibility to protect members of this group from extinction.

Species arise over time via several mechanisms. One of these mechanisms, called allopatric speciation, is separation and isolation through geological time. The eventual result is a distinct species that differs from the original population. This is especially true for islands, where plants and animals are isolated and evolve without defenses against plants and animals that do not occur naturally in their environment.

Since the dawn of the Anthropocene, humans have made attempts to classify life based on a hierarchy. Additionally, countries have been established and each country has unique laws pertaining to the conservation of nature. The United States is one such country with a rich history of geopolitical expansion. Over time, U.S. geopolitical changes have altered the number of cycad species that come under U.S. political purview. There are presently five cycad species that come under U.S. jurisdiction: Zamia integrifolia in the southern United States; Zamia erosa, Zamia pumila, and Zamia portoricensis in Puerto Rico; and Cycas micronesica in Micronesia.

The only cycad species endemic to Micronesia, Cycas micronesica, is listed as "threatened" by the U.S. Fish and Wildlife Service and "endangered" by the International Union for Conservation of Nature. Once the most abundant tree in Guam's forests, Cycas micronesica has declined drastically in number since the accidental introduction of several invasive insects to Guam. Additionally, extinction of this cycad would constitute the loss of the only native gymnosperm in the region. Due to shifting geopolitical designations, this cycad is found within four different political entities.

The authors highlight that species were distinct long before geopolitical boundaries were erected. For example, saying that these are "U.S. cycads" is purely a human distinction. Furthermore, the classification of life into a hierarchy is an anthropogenic attempt to understand biology.

"As taxonomy changes in light of new morphological and genetic evidence, the need becomes clear to use the best available science to inform conservation efforts. Coordinating conservation measures across multiple governments and countries often presents unique challenges but is necessary to mitigate extinction risks," said Benjamin Deloso, a cycad specialist at UOG.

Credit: 
University of Guam

Double-duty catalyst generates hydrogen fuel while cleaning up wastewater

Hydrogen is a pollution-free energy source when it's extracted from water using sunlight instead of fossil fuels. But current strategies for "splitting" or breaking apart water molecules with catalysts and light require the introduction of chemical additives to expedite the process. Now, researchers reporting in ACS ES&T Engineering have developed a catalyst that destroys medications and other compounds already present in wastewater to generate hydrogen fuel, getting rid of a contaminant while producing something useful.

Harnessing the sun's energy to split water to make hydrogen fuel is a promising renewable resource, but it is a slow process even when catalysts are used to speed it along. In some cases, alcohols or sugars are added to boost the rate of hydrogen production, but these chemicals are destroyed as hydrogen is generated, meaning the approach is not renewable. In a separate strategy, researchers have tried using contaminants in wastewater to enhance hydrogen fuel generation. While titanium-based catalysts worked for both removing contaminants and generating hydrogen, the efficiencies were lower than expected for both steps because of their overlapping reaction sites. One way to reduce such interferences is to make catalysts by fusing together different conductive metals, thus creating separate places for reactions to occur. So, Chuanhao Li and colleagues wanted to combine cobalt oxide and titanium dioxide to create a dual-functioning catalyst that would break down common drugs in wastewater while also efficiently converting water into hydrogen for fuel.

To make the catalyst, the researchers coated nanoscale titanium dioxide crystals with a thin layer of cobalt oxide. Initial tests showed that this material didn't produce much hydrogen, so as a next step, the team spiked this dual catalyst with 1% by weight of platinum nanoparticles -- an efficient though expensive catalyst for generating hydrogen. In the presence of simulated sunlight, the platinum-impregnated catalyst degraded two antibiotics and produced substantial amounts of hydrogen. Finally, the team tested their product on three samples: real wastewater, water from a river in China, and deionized water. Under simulated sunlight, the catalyst stimulated hydrogen production in all three samples. The greatest amount of hydrogen was obtained from the wastewater sample. The researchers say their catalyst could be a sustainable wastewater treatment option because it generates hydrogen fuel at the same time.

Credit: 
American Chemical Society

Arctic was once lush and green, could be again, new research shows

image: Sarah Crump and her field partner maneuver their makeshift raft across a lake on Baffin Island.

Image: 
Zach Montes Orijin Media

Imagine not a white, but a green Arctic, with woody shrubs as far north as the Canadian coast of the Arctic Ocean. This is what the northernmost region of North America looked like about 125,000 years ago, during the last interglacial period, finds new research from the University of Colorado Boulder.

Researchers analyzed plant DNA more than 100,000 years old retrieved from lake sediment in the Arctic (the oldest DNA in lake sediment analyzed in a publication to date) and found evidence of a shrub native to northern Canadian ecosystems 250 miles (400 km) farther north than its current range.

As the Arctic warms much faster than the rest of the planet in response to climate change, the findings, published this week in the Proceedings of the National Academy of Sciences, may be not only a glimpse of the past but also a snapshot of our potential future.

"We have this really rare view into a particular warm period in the past that was arguably the most recent time that it was warmer than present in the Arctic. That makes it a really useful analogue for what we might expect in the future," said Sarah Crump, who conducted the work as a PhD student in geological sciences and then a postdoctoral researcher with the Institute of Arctic and Alpine Research (INSTAAR).

To gain this glimpse back in time, the researchers not only analyzed DNA samples, they first had to journey to a remote region of the Arctic by ATV and snowmobile to gather them and bring them back.

Dwarf birch is a key species of the low Arctic tundra, where slightly taller shrubs (reaching a person's knees) can grow in an otherwise cold and inhospitable environment. But dwarf birch doesn't currently survive past the southern part of Baffin Island in the Canadian Arctic. Yet researchers found DNA of this plant in the ancient lake sediment showing it used to grow much farther north.

"It's a pretty significant difference from the distribution of tundra plants today," said Crump, currently a postdoctoral fellow in the Paleogenomics Lab at the University of California Santa Cruz.

While there are many potential ecological effects of the dwarf birch creeping farther north, Crump and her colleagues examined the climate feedbacks related to these shrubs covering more of the Arctic. Many climate models don't include these kinds of changes in vegetation, yet these taller shrubs can stick out above snow in the spring and fall, making the Earth's surface dark green instead of white--causing it to absorb more heat from the sun.

"It's a temperature feedback similar to sea ice loss," said Crump.

During the last interglacial period, between 116,000 and 125,000 years ago, these plants had thousands of years to adjust and move in response to warmer temperatures. With today's rapid rate of warming, the vegetation is likely not keeping pace, but that doesn't mean it won't play an important role in impacting everything from thawing permafrost to melting glaciers and sea level rise.

"As we think about how landscapes will equilibrate to current warming, it's really important that we account for how these plant ranges are going to change," said Crump.

The Arctic could easily see an increase of 9 degrees Fahrenheit (5 degrees Celsius) above pre-industrial levels by 2100, roughly matching the temperature of the last interglacial period. These findings can therefore help us better understand how our landscapes might change as the Arctic again approaches these ancient temperatures by the end of the century.

Mud as a microscope

To get the ancient DNA they wanted, the researchers couldn't look to the ocean or to the land--they had to look in a lake.

Baffin Island is located on the northeastern side of Arctic Canada, kitty-corner to Greenland, in the territory of Nunavut and the lands of the Qikiqtaani Inuit. It's the largest island in Canada and the fifth-largest island in the world, with a mountain range that runs along its northeastern edge. But these scientists were interested in a small lake, past the mountains and near the coast.

Above the Arctic Circle, the area around this lake is typical of a high Arctic tundra, with average annual temperatures below 15 °F (−9.5 °C). In this inhospitable climate, soil is thin and not much of anything grows.

But DNA stored in the lake beds below tells a much different story.

To reach this valuable resource, Crump and her fellow researchers carefully balanced on cheap inflatable boats in the summer--the only vessels light enough to carry with them--and watched out for polar bears from the lake ice in winter. They pierced the thick mud up to 30 feet (10 meters) below its surface with long, cylindrical pipes, hammering them deep into the sediment.

The goal of this precarious feat? To carefully withdraw a vertical history of ancient plant material to then travel back out with and take back to the lab.

While some of the mud was analyzed at a state-of-the-art organic geochemistry lab in the Sustainability, Energy and Environment Community (SEEC) at CU Boulder, it also needed to reach a special lab dedicated to decoding ancient DNA, at Curtin University in Perth.

To share their secrets, these mud cores had to travel halfway across the world from the Arctic to Australia.

A local snapshot

Once in the lab, the scientists had to suit up like astronauts and examine the mud in an ultra-clean space to ensure that their own DNA didn't contaminate that of any of their hard-earned samples.

It was a race against the clock.

"Your best shot is getting fresh mud," said Crump. "Once it's out of the lake, the DNA is going to start to degrade."

This is why older lake bed samples in cold storage don't quite do the trick.

While other researchers have also collected and analyzed much older DNA samples from permafrost in the Arctic (which acts like a natural freezer underground), lake sediments are kept cool, but not frozen. With fresher mud and more intact DNA, scientists can get a clearer and more detailed picture of the vegetation which once grew in that immediate area.

Reconstructing historic vegetation has most commonly been done using fossil pollen records, which preserve well in sediment. But pollen tends to show only the big picture, as it is easily blown about by the wind and doesn't stay in one place.

The new technique used by Crump and her colleagues allowed them to extract plant DNA directly from the sediment, sequence it, and infer which plant species were living in the area at the time. Instead of a regional picture, sedimentary DNA analysis gives researchers a local snapshot.
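As a rough schematic of that inference step, the sketch below assigns short sedimentary DNA reads to plant taxa by matching them against reference barcode sequences. The sequences and counts here are invented, and real sedaDNA pipelines rely on alignment tools and curated reference databases rather than exact string matching.

```python
# Hypothetical reference barcodes and reads, for illustration only.
references = {
    "Betula nana (dwarf birch)": "ATCGGCTTAA",
    "Salix arctica (arctic willow)": "GGCATTACGT",
}
reads = ["ATCGGCTTAA", "GGCATTACGT", "ATCGGCTTAA", "TTTTTTTTTT"]

# Count reads that exactly match each reference barcode; unmatched reads
# (e.g., degraded DNA) are simply dropped in this toy version.
counts = {taxon: sum(read == ref for read in reads)
          for taxon, ref in references.items()}
print(counts)
```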

Now that they have shown it's possible to extract DNA that's over 100,000 years old, future possibilities abound.

"This tool is going to be really useful on these longer timescales," said Crump.

This research has also planted the seed to study more than just plants. In the DNA samples from their lake sediment, there are signals from a whole range of organisms that lived in and around the lake.

"We're just starting to scratch the surface of what we're able to see in these past ecosystems," said Crump. "We can see the past presence of everything from microbes to mammals, and we can start to get much broader pictures of how past ecosystems looked and how they functioned."

Credit: 
University of Colorado at Boulder

Three times the gains

From climate change and carbon emissions to biodiversity and global hunger, humanity faces so many challenges that tackling them quickly is a daunting task. One solution that potentially addresses multiple issues could provide the impetus society needs to make significant progress.

An international team of 26 authors, including six at UC Santa Barbara, has just published a study in the prestigious journal Nature offering a combined solution to several of humanity's most pressing challenges. It is the most comprehensive assessment to date of where strict ocean protection can contribute to a more abundant supply of healthy seafood and provide a cheap, natural solution to address climate change, in addition to protecting embattled species and habitats.

The researchers identified specific areas of the ocean that could provide multiple benefits if protected. Safeguarding these regions would protect nearly 80% of marine species, increase fishing catches by more than 8 million metric tons and prevent the release of more than one billion tons of carbon dioxide by protecting the seafloor from bottom trawling, a widespread yet destructive fishing practice.

The study is also the first to quantify the potential release of CO2 into the ocean from trawling, and finds that trawling pumps hundreds of millions of tons of CO2 into the ocean every year.

"Ocean life has been declining worldwide because of overfishing, habitat destruction and climate change. Yet only 7% of the ocean is currently under some kind of protection," said the study's lead author Enric Sala, an explorer in residence at the National Geographic Society.

"In this study, we've pioneered a new way to identify the places that -- if protected --will boost food production and safeguard marine life, all while reducing carbon emissions," Sala said. "It's clear that humanity and the economy will benefit from a healthier ocean. And we can realize those benefits quickly if countries work together to protect at least 30% of the ocean by 2030."

To identify the priority areas, the authors -- leading marine biologists, climate experts and economists -- analyzed the world's unprotected ocean waters. They focused on the degree to which they are threatened by human activities that can be reduced by marine protected areas (for example, overfishing and habitat destruction).

They then developed an algorithm to identify where protections would deliver the greatest benefits across the three complementary goals of biodiversity protection, seafood production and climate mitigation. They mapped these locations to create a practical "blueprint" that governments can use as they implement their commitments to protect nature.
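The paper's optimization is far more elaborate, but the core idea of ranking sites by weighted benefits can be sketched in a few lines. Everything below is hypothetical: the cells, their normalized benefit scores, and the greedy selection are illustrative, not the authors' algorithm.

```python
def prioritize(cells, weights, budget):
    """Greedily pick up to `budget` cells maximizing a weighted benefit sum."""
    def score(cell):
        return sum(weights[k] * cell[k] for k in weights)
    return [c["id"] for c in sorted(cells, key=score, reverse=True)[:budget]]

# Hypothetical ocean cells with benefit scores normalized to [0, 1]:
cells = [
    {"id": "A", "biodiversity": 0.9, "food": 0.2, "carbon": 0.4},
    {"id": "B", "biodiversity": 0.3, "food": 0.8, "carbon": 0.1},
    {"id": "C", "biodiversity": 0.6, "food": 0.5, "carbon": 0.7},
]
# Equal weights on the three goals; the study explores how priorities shift
# as different objectives are emphasized.
print(prioritize(cells, {"biodiversity": 1/3, "food": 1/3, "carbon": 1/3}, 2))
```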

"While we consider three key benefits that marine protection is known to confer, this is really just the beginning," said co-author Darcy Bradley(link is external), co-director of the Ocean and Fisheries Program at UC Santa Barbara's Environmental Market Solutions Lab (emLab). "Our approach is a way to bring multiple stakeholders to the table, to show that their interests can be prioritized, and ultimately to demonstrate that solutions that protect large ocean areas and benefit multiple simultaneous objectives exist."

The study does not provide a single map for ocean conservation, but it offers a first-in-kind framework for countries to decide which areas to protect depending on their national priorities. However, the analysis supports the claim that 30% is the minimum amount of ocean that the world must protect in order to provide multiple benefits to humanity.

"There is no single best solution to save marine life and obtain these other benefits. The solution depends on what society -- or a given country -- cares about, and our study provides a new way to integrate these preferences and find effective conservation strategies," said coauthor Juan Mayorga(link is external), a marine scientist at emLab as well as National Geographic Society's Pristine Seas.

The study comes ahead of the 15th Conference of the Parties to the United Nations Convention on Biological Diversity, which will gather in May in Kunming, China. The meeting will bring together representatives of 190 countries to finalize an agreement to end the world's biodiversity crisis. The goal of protecting 30% of the planet's land and ocean by 2030 (the "30x30" target) is expected to be a pillar of the treaty. The report follows commitments by the United States, the United Kingdom, Canada, the European Commission and others to achieve this target on national and global scales.

"Solutions with multiple benefits are attractive to people and leaders alike," said coauthor Jane Lubchenco, a university distinguished professor at Oregon State University. "Our pioneering approach allows them to pinpoint the places that, if protected, will contribute significantly to three big problems at once: food security, climate change, and biodiversity loss. Our breakthrough in methodology can bring multiple benefits to nature and people."

The report identifies highly diverse marine areas in which species and ecosystems face the greatest threats from human activities. Establishing marine protected areas with strict regulations in those places would safeguard more than 80% of the ranges of endangered species, up from a current coverage of less than 2%. The authors found that priority locations lie throughout the ocean, with the vast majority of them contained within the 200-mile Exclusive Economic Zones (EEZs) of coastal nations.

Additional protection targets are located in the high seas -- those waters governed by international law. These include the Mid-Atlantic Ridge (a massive underwater mountain range); the Mascarene Plateau in the Indian Ocean; the Nazca Ridge off the west coast of South America; and the Southwest Indian Ridge, between Africa and Antarctica.

"Perhaps the most impressive and encouraging result is the enormous gain we can obtain for biodiversity conservation with only 21% of the ocean being protected, if we carefully chose the location of strictly protected marine areas," said coauthor David Mouillot, a professor at the Université de Montpellier in France. "One notable priority for conservation is Antarctica, which currently has little protection, but is projected to host many vulnerable species in a near future due to climate change."

Shoring up the Fishing Industry

The study finds that wisely placed marine protected areas (MPAs) that ban fishing would actually boost the production of fish at a time when supplies of wild-caught fish are dwindling and demand is rising. In doing so, the study refutes a long-held view that ocean protection harms fisheries. Instead, it opens up new opportunities to revive the industry just as it is suffering from a recession due to overfishing and the impacts of global warming.

"Some argue that closing areas to fishing hurts fishing interests. But the worst enemy of successful fisheries is overfishing, not protected areas," said lead author Sala. The study finds that protecting the right places could increase the catch of seafood by over 8 million metric tons relative to business as usual.

"It's simple: When overfishing and other damaging activities cease, marine life bounces back," said co-author Reniel Cabral(link is external), an assistant researcher at UC Santa Barbara's Marine Science Institute and in its Bren School of Environmental Science & Management. "After protections are put in place, the diversity and abundance of marine life increase over time, with measurable recovery within reserves occurring in as little as three years. Target species and large predators come back, and entire ecosystems are restored within MPAs. With time, the ocean can heal itself and again provide services to humankind."

Soaking up Carbon

The study is also the first to calculate the climate impacts of bottom trawling, a damaging fishing method used worldwide in which boats drag heavy nets across the ocean floor. The researchers found that the amount of CO2 released into the ocean from this practice is larger than most countries' annual carbon emissions, larger even than emissions from global aviation.

"The ocean floor is the world's largest carbon storehouse. If we're to succeed in stopping global warming, we must leave the carbon-rich seabed undisturbed," said coauthor Trisha Atwood of Utah State University. "Yet every day, we are trawling the seafloor, depleting its biodiversity and mobilizing millennia-old carbon and thus exacerbating climate change. Our findings about the climate impacts of bottom trawling will make the activities on the ocean's seabed hard to ignore in climate plans going forward."

The study finds that countries with large national waters and large industrial bottom trawl fisheries have the highest potential to contribute to climate change mitigation via protection of carbon stocks. The authors estimate that protecting only 4% of the ocean -- mostly within national waters -- would eliminate 90% of the present risk of carbon disturbance due to bottom trawling.

Closing a Gap

The study's range of findings helps to close a gap in our knowledge about the impacts of ocean conservation, which to date had been understudied relative to land-based conservation.

"The ocean covers 70% of the Earth; yet, until now, its importance for solving the challenges of our time has been overlooked," said coauthor Boris Worm, Killam Research Professor at Dalhousie University in Halifax, Nova Scotia. "Smart ocean protection will help to provide cheap natural climate solutions, make seafood more abundant and safeguard imperiled marine species - all at the same time.

"The benefits are clear," he continued. "If we want to solve the three most pressing challenges of our century -- biodiversity loss, climate change and food shortages -- we must protect our ocean."

Credit: 
University of California - Santa Barbara

Algae growing on dead coral could paint a falsely rosy portrait of reef health

image: Carnegie's Manoela Romanó de Orte and Ken Caldeira led a research team that deployed a cutting-edge incubator to monitor the metabolic activity of coral and algae in an area of Australia's Great Barrier Reef that had been damaged by tropical cyclones. The CISME, or Coral In Situ Metabolism and Energetics, instrument is a small chamber that can be placed directly on the coral surface and allow scientists to monitor coral growth by measuring changes in seawater chemistry.

Image: 
Image courtesy of Ken Caldeira.

Washington, DC-- Algae colonizing dead coral are upending scientists' ability to accurately assess the health of a coral reef community, according to new work from a team of marine science experts led by Carnegie's Manoela Romanó de Orte and Ken Caldeira. Their findings are published in Limnology and Oceanography.

Corals are marine invertebrates that build tiny exoskeletons, which accumulate to form giant coral reefs. Widely appreciated for their beauty, these reefs are havens for biodiversity and crucial for the economies of many coastal communities. But they are endangered by ocean warming, seawater acidification, extreme storms, pollution, and overfishing.

Coral reefs use calcium carbonate to construct their architecture, a process called calcification. For a reef to be healthy, its corals' building activities must exceed erosion, a natural phenomenon exacerbated by the many environmental stresses to which human activity exposes reefs.

"Coral reefs are dealing with so many simultaneous threats, many of which directly inhibit their ability to grow at a sustainable rate," Caldeira explained. "If they can't maintain a slow but steady amount of growth, they could get knocked out by rising sea levels in the coming years."

However, Romanó de Orte and Caldeira's research--with former Carnegie colleagues David Koweek (now at Ocean Visions), Yuichiro Takeshita (now at the Monterey Bay Aquarium Research Institute), and Rebecca Albright (now at the California Academy of Sciences)--showed that if researchers only make measurements to assess coral health during the daytime, it could lead to a false sense of security.

Why?

Because dead coral is often colonized by algal communities that can also accumulate carbonate minerals during the day. However, most of these deposits dissolve overnight, so the carbonate minerals do not accumulate over time. In contrast, living corals, which have evolved to build massive carbonate reefs visible from space, can continue to build their skeletons, albeit slowly, even at night.

"It's long been thought that measuring calcium carbonate production could be linked directly to the health of a coral community," Romanó de Orte said. "But our findings show that as algae increasingly succeed in overgrowing dead coral, it is going to be more difficult to rely on a once tried-and-true method for assessing whether a reef community is thriving."

To gain this critical understanding, the research team--which also included Tyler Cyronak of Nova Southeastern University, Alyssa Griffin of the Scripps Institution of Oceanography, Kennedy Wolfe of the University of Queensland, and Alina Szmant and Robert Whitehead of University of North Carolina Wilmington--deployed specially designed, state-of-the-art incubator technology to closely monitor both coral and colonizing algae in an area of Australia's Great Barrier Reef that had been heavily damaged by two tropical cyclones in 2014 and 2015. They were able to monitor both calcification and dissolution of carbonate minerals, as well as the organisms' metabolic activity.

"This amazing tool allowed us to home in on the specific role that each organism has in an ecosystem's total output, which gives us new insights into how reefs are changing" Romanó de Orte explained.

Credit: 
Carnegie Institution for Science

A new, vital player in graft-versus-host disease and organ transplant rejection

A long noncoding RNA whose function was previously unknown turns out to play a vital role in mobilizing the immune response following a bone marrow transplant or solid organ transplantation.

This RNA molecule, cataloged in scientific databases simply as Linc00402, helps activate immune defenders known as T cells in response to the presence of foreign human cells, according to a new study by researchers at the University of Michigan Rogel Cancer Center and Michigan Medicine.

The investigation, which included samples from more than 50 patients who underwent a bone marrow or heart transplant, suggests inhibiting the RNA therapeutically might improve outcomes for transplant recipients. Their findings appear in Science Translational Medicine.

Study lead author Daniel Peltier, M.D., Ph.D., is a pediatric bone marrow transplant physician at U-M.

"We see a lot of graft-versus-host disease -- or GVHD -- which is a potentially fatal complication that can happen after transplant when T cells in the donor's blood see the transplant recipient's cells as invaders and attack them," he says. "Unfortunately, the medicines we use to prevent GVHD suppress the immune system and can raise the risk of a cancer relapse or infection, and they also have other side effects."

In taking a deep dive into the biology, Peltier and his colleagues hoped to find a way of targeting just the problematic components of the immune system that cause GVHD.

One reason for looking at this particular type of RNA molecule is that such RNAs tend to be expressed only by a limited number of tissues in a limited number of contexts, explains senior study author Pavan Reddy, M.D., deputy director of the Rogel Cancer Center and division chief of hematology/oncology at Michigan Medicine.

"So, unlike a lot of RNAs, which are expressed in all kinds of cells by all kinds of living things, long noncoding RNAs offer the possibility that we might be able to target them in a relatively unique and disease-specific way," he says.

Meaning: If doctors can find a way to zero in on and short-circuit just the T cells' tendency to get aggressive in response to the transplant, they may not need to suppress the patient's immune system in a more general way that leaves patients susceptible to infection or a regrowth of their cancer.

The researchers are hopeful the discoveries could also be used to help predict which patients are most likely to develop GVHD.

Finding Linc00402

Very little previous work has been done to examine the role of noncoding RNAs in human T cells, especially in clinically relevant contexts, Reddy says.

"Our lab studies bone marrow transplantation, which is a T-cell mediated process," he says. "Knowing more about how T cells work will help us make bone marrow transplants, or any immunotherapy, more efficacious."

The research project began with a database of blood samples from a range of patients who had undergone a bone marrow transplant at Michigan Medicine. Some patients had closely matched donors, while others had what doctors call mismatched donors.

"The whole idea was: If you take a T cell from one individual and put it into a different individual, what happens to its RNA profile?" Reddy says.

Then, using RNA sequencing, the research team looked for patterns across the bone marrow transplant patients. Their findings were validated in two other cohorts of patients using different methodologies.

"We wanted to make sure that what we were seeing wasn't just chance or an artifact of one approach," Reddy says. "And that's how we found this particular RNA, Linc00402. It's the one that remained consistent through the various cohorts we examined and the various ways we looked at things."

The bone marrow transplant researchers also collaborated with the lab of co-author Daniel Goldstein, M.D., director of the Michigan Biology of Cardiovascular Aging program, to see if the results held true for heart transplant patients. And it turns out they did.

"So that really told us that these long noncoding RNAs are unique to T cells that are responding to foreignness, both in the context of bone marrow transplantation and solid organ transplantation," Reddy says. "This is precisely the kind of cross-disciplinary collaboration that can only happen in a place like U-M."

Probing the mysteries of Linc00402

Since Linc00402 gets improperly activated in the presence of foreign cells, the researchers also conducted experiments to see if they saw the same response in the presence of another type of invader: a viral infection. But they didn't see elevated levels of the RNA in response to the virus.

"This strongly suggests that this dysregulation is a change you only see when you put a T cell from one human being into another," Reddy says.

Through a series of experiments, the researchers dug deeper into the RNA and its behavior. They used genetic tools like CRISPR and gapmers to silence the gene that makes it and lentiviruses to amplify it. And this revealed that Linc00402 plays a key role in T cells' ability to respond to a threat by proliferating.

In collaboration with the lab of co-author Arul Chinnaiyan, M.D., Ph.D., director of the Michigan Center for Translational Pathology, they found that within each cell, the RNA is primarily expressed in the cytoplasm.

"This led us to hypothesize that the RNA's normal function is help with cellular signaling following activation of the T cells," Peltier adds.

While many long noncoding RNAs are species-specific as well as tissue-specific, Linc00402 is present and regulates the same functions in mice. This surprising detail paves the way for animal model studies that could speed the progress of translating these laboratory discoveries toward the bedside, the researchers say. (The fact that it's conserved between species is another clue to its functional importance, Peltier notes.)

As these new details about Linc00402 are coming to light, the researchers are also proposing giving the RNA a more memorable name. In the study, they propose calling it ReLot, for regulatory long noncoding RNA of T cells.

"Scientists have only relatively recently started decoding the importance of some of these parts of the genome that don't code for proteins," Peltier says. "It's really only been since the robust sequencing of the transcriptome that we realized that the 80-90% that we thought was 'junk DNA' is definitely not junk."

Credit: 
Michigan Medicine - University of Michigan

Go with the flow: New model helps cities crack bottlenecks, decrease commute times

A world-first 'flow model' devised by Australian researchers could drastically slash public transport commuter times during peak periods on some of the busiest roads in major cities, new research shows.

When this flow model was implemented to improve the worst traffic bottlenecks across Melbourne, commuters saved close to 2,000 hours of travel time during a single morning peak period (7am-9am) and approximately 11,000 hours of passenger travel time during a normal weekday.

Ameliorating major traffic bottlenecks also improved the reliability of Melbourne's public transport network by more than 23 per cent, on average, during weekdays and by up to 26 per cent on weekends.

Most bottlenecks cut through the Melbourne central business district, yet links connecting suburban sites such as La Trobe and Monash universities and the Chadstone shopping centre to the metro train network were also among the most critical bottlenecks in Melbourne's transport network.

Research by Monash University and RMIT University, published in the prestigious international journal Nature Communications, introduced a novel flow model - built upon the so-called Unaffected Demand (UD) concept - to examine the impact of road congestion on travelling passengers using the bus and tram networks.

A flow network is the mathematical model for any system with a network structure where connections are a means for carrying some kind of flow from one component to another.

Traffic and public transport systems can be perfectly modeled as complex networks where different entities (intersections or stops) are mapped to a set of components (or network nodes), and the means for transportation of flow demand between these locations are represented as connections between the nodes.
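One way to make the Unaffected Demand idea concrete: treat the UD of a link as the share of origin-destination demand that can still be served when that link is unusable, and flag links with low UD as critical bottlenecks. The sketch below is an interpretation of the description above, not the paper's exact formulation, and the network and demands are invented.

```python
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([("A", "B"), ("B", "C"), ("A", "D"), ("D", "C"), ("B", "D")])

# Hypothetical demand: (origin, destination) -> passengers per hour.
demand = {("A", "C"): 500, ("A", "B"): 200, ("B", "C"): 300}

def unaffected_demand(G, demand, link):
    """Fraction of total demand still routable if `link` becomes unusable."""
    H = G.copy()
    H.remove_edge(*link)
    served = sum(d for (o, t), d in demand.items() if nx.has_path(H, o, t))
    return served / sum(demand.values())

# The lower a link's UD, the more critical that link is to the network.
for link in G.edges():
    print(link, round(unaffected_demand(G, demand, link), 2))
```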

"Whether it be passengers in transport systems, or energy in power grids, the primary purpose of most critical infrastructures is to carry some kind of flow between different locations," Professor Hai Vu, research co-author and Director of Monash University's Institute of Transport Studies, said.

"The introduced concept of UD is our simple yet effective way of measuring this flow, as well as its significance, to determine the reliability of demand-serving networks and identifying the impact of congestion at any particular part of the network in the 'big picture'.

"From our analysis we've identified the impact of mitigating congestion by any measure on a road segment - not only for passengers normally passing through that corridor, but also for all trips between different places that can potentially benefit from that decongested road as a new pathway."

Research co-author and student in Monash University's Faculty of Engineering, Homayoun Hamedmoghadam, said the most effective solution to avoid bottlenecks is to allocate segregated lanes and give priority to public transport vehicles in those high-impact locations.

"On the one hand, dedicating road spaces to bus and tram vehicles will significantly improve functionality of the public transport network and its quality of service which encourages its use, and on the other hand, this means less road space, which is very well known to be a discouraging factor for car use," Mr Hamedmoghadam said.

The research team extracted the mobility demand of major cities from detailed transportation data and found that applying state-of-the-art theories from network science and physics can provide a different, yet more accurate, understanding of reliability and vulnerabilities of transport networks.

The real-world case studies in the paper tackle the conflict between road congestion and passenger movements by on-road (bus and tram) public transport (PT) in both Melbourne and Brisbane.

Melbourne's on-road public transport network comprised, at the time of data collection in 2017, approximately 5500 nodes, 10,500 links, and a flow demand derived from part of the 470,000 trips performed during a normal weekday.

Brisbane had a relatively smaller network with approximately 1400 nodes and 3400 links on average on a regular weekday.

Associate Professor Mahdi Jalili from RMIT's School of Engineering said the proposed model is capable of identifying vital bottlenecks of the network even when they are located on corridors other than the major roads.

"The application of our framework to any urban road network leads to determination of roads where congestion, regardless of its extent, has a detrimental effect on the quality of travel over the network as a whole. Of course, major roads are pathways for more traffic and they are more prone to congestion compared to back streets," Associate Professor Jalili said.

With the ever-increasing availability of data and unprecedented advancements in the scientific understanding of transportation systems, Professor Lewi Stone from RMIT's School of Science says there is much to be gained from cutting-edge research in policy making, planning, and management in the area of transportation.

"By exploiting this information, our analysis becomes capable of revealing the vital bottlenecks of transport networks which evolve during a day but follow a pattern that repeats day to day," he said.

"This means that we can identify adjustments required to fine-tune the network according to how passengers move throughout the day. We can also pinpoint problems in the network that hinder passenger movements at a certain time every day, like congestion bottlenecks of morning rush-hour."

Credit: 
Monash University

Low-education voters disregard policy beliefs at the polls, research finds

image: Low-education voters are more likely to abandon their social welfare policy beliefs at the voting booth.

Image: 
Luis Sanz / UC Riverside

Many people who embrace social welfare programs vote against their own interests, according to new UC Riverside research.

The mitigating factor is education: The more education one has, the more likely one is to stick to one's policy preferences.

"It means candidates who employ tactics such as fear and attaching patriotism to certain concepts can persuade people to vote for candidates who are in opposition to their social beliefs," Diogo Ferrari, a professor of political science at UC Riverside, wrote in his recently published paper, "Education, Belief Structures, Support for Welfare Polices, and Vote," published in the journal Education & Society.

For the study, Ferrari looked at public opinion surveys collected in 2016 in more than 30 European, Asian, and North American countries. The surveys included information about people's education and 18 questions gauging attitudes toward social welfare policies, including social security, unemployment, education, health spending, industry financing, and income redistribution. Lastly, the survey asked which political party the respondent voted for in the last general election.
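The headline pattern can be illustrated with a small pandas sketch: build a welfare-support index per respondent, then check how strongly it tracks left-party voting within each education group. Column names and values below are invented, not the survey's actual coding.

```python
import pandas as pd

df = pd.DataFrame({
    "educ": ["low", "low", "high", "high", "high", "low"],
    "welfare_support": [0.90, 0.80, 0.70, 0.20, 0.80, 0.85],  # mean of 18 items
    "voted_left": [0, 0, 1, 0, 1, 1],
})

# Correlation between welfare support and left vote, by education level;
# the study's finding corresponds to a stronger link in the "high" group.
for educ, group in df.groupby("educ"):
    r = group["welfare_support"].corr(group["voted_left"])
    print(educ, round(r, 2))
```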

Among people with low education, social programs such as old-age pensions and giving financial aid to low-income students are met with support. Unemployment insurance, in particular, is popular among those with low education, defined as having a high school education or less; the program is about three times more popular in this group than among those with high education, meaning those with at least an undergraduate degree.

"The least-educated support social protection more than the most-educated, as do the poorest groups within the same education group," Ferrari wrote.

But support for left-wing policies among people with low education doesn't translate to support for left-wing parties. It's not just left-wing beliefs and voting that are misaligned among the least educated voters: Attitudes against social welfare don't necessarily align with right-wing voting, either.

"It's only when schooling is high that ... positions are harmonized with the vote for right-wing or left-wing parties," Ferrari wrote. "Less-educated groups contradict, in behavior (vote), their attitudinal tendency to support welfare policies."

Ferrari wrote it makes them prey to the "demagogue candidate" who uses "caricatured notions of right and left to position himself politically before less attentive voters."

That could mean aligning party politics with patriotism, religion, or the promise of eliminating political corruption. And so, voters with low education can end up voting against their own interests.

"The idea is to deviate people's attention from some of the things they care about and focus on their attitudes on other areas," Ferrari said. "A candidate can emphasize anti-illegal-immigration policy, or economic nationalism, or anti-political-elite positions.

"The implication of the study is that, everything else the same, (such tactics) seem more effective among those who are less educated."

More educated voters, meanwhile, are less likely than low-educated voters to sacrifice their policy preferences and vote for parties more distant in terms of policy positions.

Ferrari's findings build on a long-held position among political scientists. In 1964's "The Nature of Belief Systems," political scientist Philip Converse argued citizens can't process large quantities of political information, which leads to a lack of structure and stability in their views. He asserted that, when people are asked to pair the terms "liberal" and "conservative" with ideology, they struggle.

Ferrari's new research qualifies that argument, asserting formal education can prevent that misalignment.

"The idea is that formal education, or schooling, makes people more likely to use broad organizing conceptual schemes such as notions of 'conservatism' and 'liberalism' to evaluate political affairs and categorize political actors," Ferrari said. "The fact is that we can clearly distinguish policy preferences between some social groups, and the match between those preferences and vote is stronger among the most educated, which indicates they are less likely to 'sacrifice' their overall policy preferences in favor of a few other 'issues of the day' when voting."

Credit: 
University of California - Riverside

Nurse work environment influences stroke outcomes

PHILADELPHIA (March 17, 2021) - Stroke remains a leading cause of death worldwide and one of the most common reasons for disability. While a wide variety of factors influence stroke outcomes, data show that avoiding readmissions and long lengths of stay among ischemic stroke patients has benefits for patients and health care systems alike. Although reduced readmission rates among various medical patients have been associated with better nurse work environments, it is unknown how the work environment might influence readmissions and length of stay for ischemic stroke patients.

In a new study from the University of Pennsylvania School of Nursing's (Penn Nursing) Center for Health Outcomes and Policy Research (CHOPR), researchers evaluated the association between the nurse work environment and readmission and length of stay for close to 200,000 hospitalized adult ischemic stroke patients in more than 500 hospitals. They found that in hospitals with better nurse work environments, ischemic stroke patients experienced lower odds of 7- and 30-day readmissions and shorter lengths of stay.
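Findings like these typically come from logistic regression of readmission on a hospital-level work-environment measure, adjusting for patient characteristics. The sketch below simulates that setup end to end; the variable names, effect sizes, and data are all invented, and the study's actual models adjust for far more.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "work_env": rng.normal(0, 1, n),   # standardized work-environment score
    "age": rng.normal(70, 10, n),
})
# Simulate lower readmission odds in better work environments.
logit_p = -1.0 - 0.3 * df["work_env"] + 0.01 * (df["age"] - 70)
df["readmit30"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

fit = smf.logit("readmit30 ~ work_env + age", data=df).fit(disp=False)
# Odds ratio per SD of work environment; below 1 means fewer readmissions.
print(round(np.exp(fit.params["work_env"]), 2))
```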

Their research has been published in the journal Research in Nursing & Health. The article "Better Nurse Work Environments Associated with Fewer Readmissions and Shorter Length of Stay Among Adults with Ischemic Stroke: A Cross-Sectional Analysis of United States Hospitals" is available online.

"The work environment is a modifiable feature of hospitals that should be considered when providing comprehensive stroke care and improving post?stroke outcomes," says Heather Brom, PhD, RN, NP-C, lecturer at Penn Nursing and lead author of the article. "Our findings have important implications for quality improvement initiatives for stroke care management."

Creating good work environments for nurses is especially important so that they have adequate time to spend with stroke patients and can communicate effectively with all team members and feel supported by managers to make decisions about nursing care. "All of these aspects of the nurse work environment facilitate an effective and efficient discharge planning process, which has the potential to decrease delays in discharge and avoidable readmissions," says J. Margo Brooks Carthon, PhD, RN, FAAN, Associate Professor of Nursing and one of the co-authors of the article.

Credit: 
University of Pennsylvania School of Nursing

Researchers identify barriers to use of surface electromyography in neurorehabilitation

image: Rakesh Pilkar, Ph.D. is a Senior Research Scientist in the Center for Mobility and Rehabilitation Engineering Research at Kessler Foundation and Assistant Research Professor in the Department of Physical Medicine & Rehabilitation at Rutgers - New Jersey Medical School. Dr. Pilkar's expertise is in providing engineering solutions to biomedical research problems.

Image: 
Kessler Foundation

East Hanover, NJ. March 17, 2021. Kessler Foundation researchers have identified several practical and technical barriers to the widespread use of surface electromyography (sEMG) in clinical neurorehabilitation. Based on their holistic analysis of these factors, the researchers suggest a collaborative, interdisciplinary, and unified approach to enable rehabilitation professionals to routinely use sEMG. The article, "Use of Surface EMG in Clinical Rehabilitation of Individuals With SCI: Barriers and Future Considerations" (doi: 10.3389/fneur.2020.578559), was published December 18, 2020, in Frontiers in Neurology. It is available open access at https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7780850/

The authors are Rakesh Pilkar, PhD, Kamyar Momeni, PhD, Arvind Ramanujam, Manikandan Ravi, Erica Garbarini, and Gail F. Forrest, PhD, affiliated with the Center for Mobility and Rehabilitation Engineering Research and the Tim and Caroline Reynolds Center for Spinal Stimulation at Kessler Foundation.

sEMG is a noninvasive technology that detects, records, and interprets the electrical activity of muscles. The quantifiable information on myoelectric output recorded by sEMG is extremely useful in assessing impairment and potentially determining patient-specific and effective interventions for individuals with spinal cord injury (SCI). However, while sEMG is commonly used in neurorehabilitation research, its integration into clinical practice has been limited, according to lead author Dr. Pilkar, senior research scientist at the Center for Mobility and Rehabilitation Engineering Research.
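For readers unfamiliar with what "interpreting" sEMG involves, a common textbook processing chain is: band-pass filter the raw signal, rectify it, and low-pass filter to obtain a linear envelope of muscle activation. The sketch below uses made-up data and generic filter settings; it is not a protocol from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 2000                        # assumed sampling rate, Hz
t = np.arange(0, 2, 1 / fs)
raw = np.random.randn(t.size) * (1 + np.sin(2 * np.pi * 0.5 * t))  # fake sEMG

# 1. Band-pass 20-450 Hz: the band where most sEMG power lives.
b, a = butter(4, [20, 450], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, raw)

# 2. Full-wave rectification.
rectified = np.abs(filtered)

# 3. Low-pass (~6 Hz) to extract the activation envelope.
b_env, a_env = butter(4, 6, btype="lowpass", fs=fs)
envelope = filtfilt(b_env, a_env, rectified)
print(envelope.mean(), envelope.max())
```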

In their analysis, the research team determined several factors that prevent widespread use of sEMG in clinical practice. "One major obstacle is integrating the time-consuming aspects of sEMG into the already demanding schedule of physical therapists, occupational therapists, and other clinicians," explained Dr. Pilkar. "Also, clinicians are often unfamiliar with technical aspects of sEMG data processing and may not have been exposed to or trained in certain aspects of this technology," he added.

The research team also identified technical challenges such as transferring the frequent research updates to the sEMG systems used in a clinical setting; lack of user-friendly interfaces; and the need for a standardized, multidisciplinary approach to the handling and interpretation of data. An additional consideration, specific to research in SCI, is that reading and interpreting EMGs for this population requires an additional skillset, as the physiological and structural state of the spinal cord affect how data are interpreted.

To overcome these obstacles, the Kessler researchers propose a series of actions to facilitate the use of sEMG by rehabilitation professionals. First, include hands-on sEMG experience in educational and professional training programs, and expose trainees to non-clinical experts in complementary fields, such as engineers, technicians, and data scientists. Second, develop simpler, more user-friendly technology interfaces and offer open-access user tutorials to make it easier for clinicians to integrate and use sEMG. Third, codify a means to regularly transfer research-based knowledge about sEMG and its relevance to SCI rehabilitation from researchers to clinicians, empowering rehabilitation professionals to use sEMG with more confidence.

"Addressing these barriers will improve our ability to objectively assess neuromuscular outcomes," Dr. Pilkar predicted, "which is fundamental to developing interventions that improve motor function and mobility in individuals with deficits caused by SCI."

Credit: 
Kessler Foundation

Second COVID-19 wave in Europe less lethal than first wave

image: European countries experienced more than one wave, with greater case numbers observed in subsequent waves than in the first.

Image: 
Nick James, Max Menzies, and Peter Radchenko

WASHINGTON, March 16, 2021 -- As Europe experienced its enormous second wave of COVID-19, researchers noticed the mortality rate -- the progression from cases to deaths -- was much lower than during the first wave.

This inspired researchers from the University of Sydney and Tsinghua University to study and quantify the mortality rate on a country-by-country basis to determine how much the mortality rate from the second wave decreased from the first.

In Chaos, by AIP Publishing, Nick James, Max Menzies, and Peter Radchenko introduce methods to study the progression of COVID-19 cases to deaths during the pandemic's different waves. Their methods involve applied mathematics, specifically nonlinear dynamics, and time series analysis.

"We take a time series, apply an algorithmic approach to chop it up into first and later waves, and do some relatively simple optimization and calculations to determine two different mortality numbers," said James, from the University of Sydney.

The mortality rate of the massive European second wave turned out to be much lower -- at least with respect to reported cases and deaths. But how much lower, and how did it differ between countries?

"We think answering these questions is important, and to answer this for all of Europe, not just the wealthier Western countries," said Menzies, from Tsinghua University. "In Belarus, for example, the mortality rate actually increased during its second wave, while Ukraine and Moldova were still in their first wave as of the end of November."

The researchers discovered this was very different from the Netherlands, Belgium, France, and other countries that drastically reduced their mortality rates -- at least with respect to reported numbers -- between their first and second waves.

"Our work shows sharp drops in mortality with respect to reported cases and deaths," Menzies said. "The problem will always be, what is the true number of cases in the early first wave? We may never know, but we imagine future research and analysis will try to determine it."

When the researchers reran their analysis on estimated true cases and estimated deaths, Radchenko, also from the University of Sydney, pointed out that those measures have serious limitations.

"Excess mortality is often negative relative to previous years, so it's unsuitable for measuring the true numbers of COVID-19 deaths," Radchenko said. "We hope others will more closely analyze the true numbers, perhaps using more specialized data such as out of particular hospitals or regions where testing was more reliable."

Broad parallels were also observed between Europe and the U.S., where Northeastern states behaved like wealthy Western European countries, sharply reducing mortality during the second wave.

Credit: 
American Institute of Physics

How hummingbirds hum

image: Anna's hummingbird flying in the experimental setup, drinking sugar water from a fake flower.

Image: 
Photo: Lentink Lab / Stanford University.

The hummingbird is named after the pleasant humming sound it makes when it hovers in front of flowers to feed. But only now has it become clear how the wing generates the hummingbird's namesake sound while beating rapidly, at 40 beats per second. Researchers from Eindhoven University of Technology, Sorama (a TU/e spin-off company), and Stanford University meticulously observed hummingbirds using 12 high-speed cameras, 6 pressure plates, and 2176 microphones. They discovered that the soft and complex feathered wings of hummingbirds generate sound in a fashion similar to how the simpler wings of insects do. The new insights could help make devices like fans and drones quieter.

The team of engineers succeeded in measuring, for the first time, the precise origin of the sound generated by the flapping wings of a flying animal. The hummingbird's hum originates from the pressure difference between the topside and underside of the wings, which changes in both magnitude and orientation as the wings flap back and forth. These pressure differences over the wing are essential, because they furnish the net aerodynamic force that enables the hummingbird to lift off and hover.

Unlike other species of birds, a hummingbird wing generates a strong upward aerodynamic force during both the downward and the upward wing stroke, so twice per wingbeat. While the pressure differences due to both the lift and drag forces acting on the wing contribute, it turns out that the upward lifting pressure difference is the primary source of the hum.

The difference between whining, humming, buzzing and wooshing

Professor David Lentink of Stanford University: "This is the reason why birds and insects make different sounds. Mosquitoes whine, bees buzz, hummingbirds hum, and larger birds 'woosh'. Most birds are relatively quiet because they generate most of their lift only once per wingbeat, during the downstroke. Hummingbirds and insects are noisier because they do so twice per wingbeat."

The researchers combined all measurements in a 3D acoustic model of bird and insect wings. The model not only provides biological insight into how animals generate sound with their flapping wings, but also predicts how the aerodynamic performance of a flapping wing gives the wing sound its volume and timbre. "The distinctive sound of the hummingbird is perceived as pleasant because of the many 'overtones' created by the varying aerodynamic forces on the wing. A hummingbird wing is similar to a beautifully tuned instrument," Lentink explains with a smile.

High-tech sound camera

To arrive at their model, the scientists examined six Anna's hummingbirds, the most common species around Stanford. One by one, they had the birds drink sugar water from a fake flower in a special flight chamber. Around the chamber, invisible to the bird, cameras, microphones and pressure sensors were set up to precisely record each wingbeat as the bird hovered in front of the flower.

You can't just go out and buy the equipment needed for this at an electronics store. CEO and researcher Rick Scholte of Sorama, a spin-off of TU Eindhoven: "To make the sound visible and be able to examine it in detail, we used sophisticated sound cameras developed by my company. The optical cameras are connected to a network of 2176 microphones for this purpose. Together they work a bit like a thermal camera that produces a thermal image. We make the sound visible in a 'heat map', which enables us to see the 3D sound field in detail."

New aerodynamic force sensors

To interpret the 3D sound images, it is essential to know what motion the bird's wing is making at each sound measurement point. For that, Stanford's twelve high-speed cameras came into play, capturing the exact wing movement frame-by-frame.

Lentink: "But that's not end of story. We also needed to measure the aerodynamic forces the hummingbird's wings generates in flight. We had to develop a new instrument for that." During a follow-up experiment six highly sensitive pressure plates finally managed to record the lift and drag forces generated by the wings as they moved up and down, a first.

The terabytes of data then had to be synchronized. The researchers wanted to know exactly which wing position produced which sound and how this related to the pressure differences. Scholte: "Because light travels so much faster than sound, we had to calibrate each frame separately for both the cameras and the microphones, so that the sound recordings and the images would always correspond exactly." Because the cameras, microphones and sensors were all in different locations in the room, the researchers also had to correct for that.
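The size of that correction is easy to estimate. The sketch below is an illustrative back-of-the-envelope calculation, not the team's actual calibration code; the positions, sample rate, and function name are invented. It converts each microphone's distance to the bird into the number of audio samples by which its recording lags the video, treating light's travel time as negligible.

import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees Celsius

def sample_offset(mic_position, bird_position, sample_rate_hz):
    """Audio samples by which a microphone's recording lags the video frame."""
    distance = np.linalg.norm(np.asarray(mic_position) - np.asarray(bird_position))
    delay_seconds = distance / SPEED_OF_SOUND  # light arrives effectively instantly
    return int(round(delay_seconds * sample_rate_hz))

# A microphone 1.5 m from the bird, sampled at 48 kHz, lags by about 210 samples:
print(sample_offset((1.5, 0.0, 0.0), (0.0, 0.0, 0.0), 48_000))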

Algorithm as a composite artist

Once the wing location, the corresponding sound, and the pressure differences were precisely aligned for each video frame, the researchers were confronted with the complexity of interpreting a high volume of data. They tackled this challenge by harnessing artificial intelligence, the research focus of TU/e PhD student and co-first author Patrick Wijnings.

Wijnings: "We developed an algorithm for this that can interpret a 3D acoustic field from the measurements, and this enabled us to determine the most probable sound field of the hummingbird. The solution to this so-called inverse problem resembles what a police facial composite artist does: using a few clues to make the most reliable drawing of the suspect. In this way, you avoid the possibility that a small distortion in the measurements changes the outcome."

The researchers finally managed to condense all these results into a simple 3D acoustic model, borrowed from the world of airplanes and mathematically adapted to flapping wings. It predicts the sound that flapping wings radiate: not only the hum of the hummingbird, but also the woosh of other birds and bats, the buzzing and whining of insects, and even the noise that robots with flapping wings generate.

Making drones quieter

Although it was not the focus of this study, the knowledge gained may also help improve aircraft and drone rotors, as well as laptop and vacuum cleaner fans. The new insights and tools can help make engineered devices that generate complex, animal-like forces quieter.

This is exactly what Sorama aims to do: "We make sound visible in order to make appliances quieter. Noise pollution is becoming an ever-greater problem. And a decibel meter alone is not going to solve that. You need to know where the sound comes from and how it is produced, in order to be able to eliminate it. That's what our sound cameras are for. This hummingbird wing research gives us a completely new and very accurate model as a starting point, so we can do our work even better," concludes Scholte.

Credit: 
Eindhoven University of Technology

How pregnancy turns the stress response on its head

COLUMBUS, Ohio - The link between psychological stress and physical health problems generally relates to a stress-induced immune response gone wild, with inflammation then causing damage to other systems in the body. It's a predictable cascade - except in pregnancy, research suggests.

Scientists exploring the negative effects of prenatal stress on offspring mental health set out to find the immune cells and microbes in stressed pregnant mice most likely to trigger inflammation in the fetal brain - the source for anxiety and other psychological problems identified in previous research.

Instead, the researchers found two simultaneous conditions in response to stress that made them realize just how complex the cross-talk between mom and baby is during gestation: Immune cells in the placenta and uterus were not activated, but significant inflammation was detected in the fetal brain.

They also found that prenatal stress in the mice led to reductions in gut microbial strains and functions, especially those linked to inflammation.

"I thought it was going to be a fairly straightforward tale of maternal inflammation, changes in microbes and fetal inflammation. And while the changes in microbes are there, the inflammation part is more complex than I had anticipated," said Tamar Gur, senior author of the study and assistant professor of psychiatry and behavioral health, neuroscience, and obstetrics and gynecology at The Ohio State University.

"The complex interplay between the stress response and the immune system is dysregulated by stress, which is problematic for the developing fetus. There are key changes during this critical window that can help shape the developing brain, so we want to figure out how we could potentially intervene to help regulate these systems."

The study was published recently in Scientific Reports.

Most attention paid to the negative effects of prenatal stress on offspring mental health focuses on disruptive major life events or exposure to disaster, but evidence also suggests that up to 84% of pregnant women experience some sort of stress.

In a previous study, Gur's lab found that prenatal stress's contributions to life-long anxiety and cognitive problems in mouse offspring could be traced to changes in microbial communities in both mom and baby.

Gur focuses on the intrauterine environment in her search for factors that increase the risk for prenatal stress's damaging effects, and this newer study opened her eyes to how complicated that environment is.

"The dogma would be that we're going to see an influx of immune cells to the placenta. The fact that it's suppressed speaks to the powerful anti-inflammatory response of the mom. And that makes sense - a fetus is basically a foreign object, so in order to maintain pregnancy we need to have some level of immunosuppression," said Gur, also an investigator in Ohio State's Institute for Behavioral Medicine Research and a maternal-fetal psychiatrist at Ohio State Wexner Medical Center.

"We want to figure out what is at the interface between mom and baby that is mediating the immunosuppressive effect on the maternal side and the inflammation on the fetal side. If we can get at that, we'll get really important keys to understanding how best to prevent the negative impact of prenatal stress."

Prevention could come in the form of prebiotics or probiotics designed to boost the presence of beneficial microbes in the GI tract of pregnant women. Maternal microbes affect the brains and immune systems of developing offspring by producing a variety of chemicals the body uses to manage physiological processes.

"I think microbes hold really important clues and keys, making them a tantalizing target for intervention. We can do things about individuals' microbes to benefit both mom and baby," Gur said.

To mimic prenatal stress during the second and early third trimesters, pregnant mice in her lab are subjected to two hours of restraint each day for seven days. Control mice are left undisturbed during gestation.

In this recent study, the researchers found stress in mice activated steroid hormones throughout the body - the sign of a suppressed immune system - and resulted in lower-than-expected populations of immune cells in reproductive tissue, suggesting that the uterus was effectively resisting the effects of the stress.

An examination of colon contents showed differences in microbial communities between stressed and non-stressed mice, with one family of microbes that influences immune function markedly decreased in stressed mice. The researchers found few signs that stress caused gene-level changes in the colon that could let bacteria escape into the bloodstream - one way that microbes interfere with body processes.

"There are absolutely changes in microbes that might help explain key pathways that are important for health and the immune system, especially when it comes to the placenta and the mom's immune system," Gur said.

In future studies, her lab will examine immune cells in the fetal brain and monitor how gene expression changes in cells in the placenta in response to stress. She is also leading an ongoing observational study in women, tracking microbes, inflammation and stress levels during and after pregnancy.

Credit: 
Ohio State University

Photocatalytic efficiency found to be site sensitive

Prof. HUANG Weixin and Prof. ZHANG Qun from the University of Science and Technology of China (USTC) of the Chinese Academy of Sciences (CAS), together with domestic collaborators, probed the photocatalytic oxidation of methanol on various anatase TiO2 nanocrystals. The results were published in Angewandte Chemie International Edition.

Semiconductor-based photocatalysis has attracted extensive attention since its discovery, owing to its environmentally friendly production of chemical fuel utilizing solar energy.

A photocatalytic reaction consists of light absorption and charge generation within the photocatalyst, charge separation and migration to the photocatalyst surface, and charge-participated reactions on that surface. The last step, which is rate-limiting for photocatalytic reactions, involves an interfacial charge transfer process from the photocatalyst surface to surface adsorbates, followed by surface reactions. However, due to its complexity, it has received far less study.

To uncover the underlying mechanism and its correlation with photocatalytic efficiency, the researchers studied the photochemistry of CH3OH oxidation on various TiO2 nanocrystals, a probe reaction for fundamental studies of complex photocatalytic reactions on oxide photocatalysts, combining in situ and time-resolved characterizations with density functional theory calculations.

The results demonstrated that the surface site and the corresponding adsorbed methanol species influence the valence band bending; these factors thus determine the TiO2-to-CH3OH interfacial charge transfer process and, subsequently, the photocatalytic efficiency.

The finding is consistent with Prof. HUANG's previous work, supporting the feasibility of extending oxide model catalysts from single crystals to nanocrystals.

This study unveils the site sensitivity of interfacial charge transfer, suggesting that surface structure engineering of photocatalysts is an effective approach to maximizing photocatalytic efficiency.

Credit: 
University of Science and Technology of China