Culture

Joyful screams perceived more strongly than screams of fear or anger

Screaming can save lives. Non-human primates and other mammalian species frequently use scream-like calls when embroiled in social conflicts or to signal the presence of predators and other threats. Humans likewise scream to signal danger or communicate aggression, but they also scream when experiencing strong emotions such as despair or joy. Past studies on this topic, however, have largely focused on alarming fear screams.

Humans respond to positive screams more quickly and with higher sensitivity

In a new study, a team at the University of Zurich Department of Psychology led by Sascha Frühholz investigated the meaning behind the full spectrum of human scream calls. The results revealed six emotionally distinct types of scream calls indicating pain, anger, fear, pleasure, sadness and joy. "We were surprised by the fact that listeners responded more quickly and accurately, and with a higher neural sensitivity, to non-alarming and positive scream calls than to alarming screams," says Frühholz.

Cognitive processing of joyful screams is more efficient

The research team carried out four experiments for their study. Twelve participants were asked to vocalize positive and negative screams that might be elicited by various situations. A different group of individuals rated the emotional nature of the screams and classified them into different categories. While participants listened to the screams, their brain activity was monitored with functional magnetic resonance imaging (fMRI) to track how they perceived, recognized, processed and categorized the sounds. "The frontal, auditory and limbic brain regions showed much more activity and neural connectivity when hearing non-alarm screams than when processing alarm scream calls," explains Frühholz.

More complex social environments have reshuffled neurocognitive priorities

It was previously assumed that human and primate cognitive systems were specially tailored for recognizing threat and danger signals in the form of screams. In contrast to primates and other animal species, however, human scream calls seem to have become more diversified over the course of human evolution - something that Frühholz considers to be a big evolutionary leap. "It's highly possible that only humans scream to signal positive emotions like great joy or pleasure. And unlike with alarm calls, positive screams have become increasingly important over time," he says. Researchers suggest that this may be due to the communicative demands brought about by humans' increasingly complex social environments.

Credit: 
University of Zurich

US power sector is halfway to zero carbon emissions

image: Projected versus actual outcomes for the U.S. power sector.

Image: 
Berkeley Lab

Concerns about climate change are driving a growing number of states, utilities, and corporations to set the goal of zeroing out power-sector carbon emissions. To date, 17 states plus Washington, D.C., and Puerto Rico have adopted laws or executive orders to achieve 100% carbon-free electricity in the next couple of decades. Additionally, 46 U.S. utilities have pledged to go carbon free no later than 2050. Altogether, these goals cover about half of the U.S. population and economy.

These are ambitious targets, but a new look at the past 15 years in the electricity sector shows that large reductions in emissions are possible.

New research from the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) analyzes historical trends to examine how much progress the power sector has already made in reducing emissions. The study, "Halfway to Zero: Progress towards a Carbon-Free Power Sector," looks back at the 2005 Annual Energy Outlook from the Energy Information Administration (EIA), the U.S. government's official agency for data collection and analysis.

"Business-as-usual projections saw annual carbon dioxide emissions rising from 2,400 to 3,000 million metric tons (MMT) from 2005 to 2020," said Berkeley Lab scientist Ryan Wiser, lead author of the study. "But actual 2020 emissions fell to only 1,450 MMT. The U.S. cut power sector emissions by 52% below projected levels - we are now 'halfway to zero.'"
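The "halfway to zero" arithmetic can be checked directly from the figures quoted above. A quick sanity check (not part of the study's own methodology):

```python
# Sanity-check the "halfway to zero" claim using the figures quoted above.
projected_2020_mmt = 3000   # business-as-usual projection for 2020 (MMT CO2)
actual_2020_mmt = 1450      # actual 2020 emissions (MMT CO2)

reduction = (projected_2020_mmt - actual_2020_mmt) / projected_2020_mmt
print(f"Emissions were {reduction:.0%} below the projected level")
# → Emissions were 52% below the projected level
```

The 52% figure is measured against the projected 2020 level, not against 2005 actual emissions, which is why the study frames it as progress toward a zero-carbon target rather than a simple year-over-year cut.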

According to the study, relative to projected values, total consumer electricity costs were 18% lower; costs to human health and the climate were 92% and 52% lower, respectively; and the number of jobs in electricity generation was 29% higher.

Drivers of change

From technological advances to policy, the study identified the main drivers from the last 15 years that contributed to lower carbon emissions in the U.S. power sector. Total demand for electricity was almost exactly the same in 2020 as it was in 2005, and was 24% lower than projected fifteen years earlier. "This drop in demand was due in part to sectoral and economic changes, but also to greater energy efficiency driven by policies and technology advancement," said Wiser.

The researchers found that wind and solar power dramatically outperformed expectations, delivering 13 times more generation in 2020 than projected. This is also a result of technology development and state and federal policies, as prices plummeted for new wind and solar technologies. In addition, nuclear generation has largely held steady, tracking the past projections and helping to ensure no backsliding in carbon emissions.

The study found that switching from coal to natural gas for power generation played a big role in lowering carbon emissions. Natural gas generation grew rapidly, driven by the shale gas revolution and low fuel prices.

The researchers also found that changes over the last 15 years had numerous other economic and environmental benefits. For example, total electric bills for consumers were 18% lower in 2020 than previously projected by EIA, for a total savings of $86 billion per year.

According to the study, reduced sulfur and nitrogen emissions led to lower health impacts, such as respiratory disease, with premature deaths falling from 38,000 to 3,100 per year. "Compared to the business-as-usual projection, not only did the nation significantly reduce its carbon footprint, but it did so while also reducing total energy bills and health burdens," said co-author and Berkeley Lab scientist Dev Millstein.

The study also found that while employment patterns shifted along with changes in the power sector, electricity supply is supporting 200,000 more jobs than might have been the case under the earlier projection.

Looking forward

While a look back shows that dramatic changes in emissions are possible over a 15-year span, the study points out that this does not guarantee the next 15 years will see similar progress.

Given advancements in wind, solar, and battery technologies, these resources are likely to play important near-term roles in further power-sector decarbonization. According to the study, a large share of the capacity needed to approach a zero-carbon power sector is already in the development pipeline: about 660 gigawatts (GW) of wind and solar have requested transmission access, more than half of what might be required. Not all proposed projects will be built, but the scale indicates strong interest in development.

Wiser points out there are significant infrastructure requirements related to scaling up renewable energy. The power sector will have to ensure electricity delivery, reliability, and resilience; build new transmission infrastructure; change planning and grid operations; revise siting processes; and focus attention on impacted workers and communities.

Another major challenge is how to meet the last portion of demand, to ensure a reliable power supply when the wind doesn't blow and the sun doesn't shine. The study concludes that further research, development, and demonstration is needed for the numerous technologies that can fill this gap, such as longer-duration energy storage, hydrogen or synthetic fuels, bioenergy, fossil or biomass generation with carbon capture, nuclear energy, geothermal energy, and solar-thermal with storage.

"As the country maps out a plan for further decarbonization, experience from the past 15 years offers two central lessons," said Wiser. "First, policy and technology advancement are imperative to achieving significant emissions reductions. Second, our ability to predict the future is limited, and so it will be crucial to adapt as we gain policy experience and as technologies advance in unexpected ways."

Credit: 
DOE/Lawrence Berkeley National Laboratory

Life expectancy lower near superfund sites

image: Hanadi S. Rifai, John and Rebecca Moores Professor of Civil and Environmental Engineering at University of Houston, reports that living near a toxic waste or Superfund site can cut your life short.

Image: 
University of Houston

Living near a hazardous waste or Superfund site could cut your life short by about a year, reports Hanadi S. Rifai, John and Rebecca Moores Professor of Civil and Environmental Engineering at the University of Houston. The study, published in Nature Communications and based on an evaluation of 65,226 census tracts from 2018 census data, is the first nationwide review of all hazardous waste sites and not just the 1,300 sites on the national priority list managed by the federal government.

The analysis shows a decrease of more than two months in life expectancy for those living near a Superfund site. When coupled with high sociodemographic disadvantage across factors like age, sex, marital status and income, the decrease could be nearly 15 months, according to the analysis. Prior studies confirmed that those living near hazardous waste sites generally have greater sociodemographic disadvantage and, as a result, poorer health. The average life expectancy in the U.S. is 78.7 years, and millions of children have been raised within one mile of a federally designated Superfund site.

"We have ample evidence that contaminant releases from anthropogenic sources (e.g., petrochemicals or hazardous waste sites) could increase the mortality rate in fence-line communities," reports Rifai. "Results showed a significant difference in life expectancy among census tracts with at least one Superfund site and their neighboring tracts with no sites."

Nationally there are thousands of so-called Superfund, or contaminated, sites that pose a risk to human health and the environment. These sites include manufacturing facilities, processing plants, landfills and mining sites where hazardous waste was dumped, left out in the open or poorly managed.

The study presents a nationwide geocoded statistical modeling analysis of the presence of Superfund sites, their flood potential, and the impact on life expectancy independently and in context of other sociodemographic determinants. Life expectancy is one of the most basic indicators of public health. Studies show a 1% increase in life expectancy could lead to a 1.7% to 2% increase in population.

Analysis revealed that out of 12,717 census tracts with at least one Superfund site, the adverse effect of this presence was more severe on the ones with higher sociodemographic disadvantage. For instance, the presence of a Superfund site in a census tract with smaller than median income ($52,580) could reduce life expectancy by as much as seven months.

While many studies have broken down mortality rates associated with different diseases, only a few have paid attention to hazardous waste and Superfund sites and their potential impact on mortality rates.

Other recent national studies showed a significant correlation between the residential proximity to Superfund sites and the occurrence of Non-Hodgkin's lymphoma, especially among males. In Texas, the Texas Department of State Health Services recently examined a cancer cluster in downtown Houston around a former railroad creosote treatment facility, finding the observed number of childhood acute lymphoblastic leukemia cases was greater than expected based on cancer rates in Texas.

Rifai also examined the impact of flooding, which could cause the transport of contaminants from Superfund sites and potentially affect neighborhoods farther than the nearby fence-line communities.

"When you add in flooding, there will be ancillary or secondary impacts that can potentially be exacerbated by a changing future climate," said Rifai. "The long-term effect of the flooding and repetitive exposure has an effect that can transcend generations."

Credit: 
University of Houston

Finding resiliency in local, community news gathering

image: Community newspapers often serve as the public's main source of accurate, local news. They also can be an important way to share the impact of major national events, such as a global pandemic.

Image: 
University of Missouri

When the Webster-Kirkwood Times, a community newspaper in the greater St. Louis, Missouri area, had to endure layoffs and stop publishing its print edition -- due to a loss in revenue as a result of the COVID-19 pandemic -- its readers felt the loss and began supporting the newspaper in earnest.

"A lot of times people don't know what they've got until it's gone," said Jaime Mowers, editor-in-chief of the Webster-Kirkwood Times. "Now, there is such a newfound appreciation for the newspaper. It's amazing to have the community's support, knowing we are loved that much and appreciated enough to be able to bring our print edition back. We are part of the fabric of our community, and we're lucky to still be a part of that."

Community newspapers, like the Webster-Kirkwood Times and those in rural America, often serve as the public's main source of accurate, local news. They also can be an important way to share the impact of major national events, such as a global pandemic. As the COVID-19 pandemic began spreading throughout the United States, journalism scholars at the University of Missouri and the University of Kansas found that community newspapers across the country began to reevaluate the way they had been doing business for decades.

In their new study, the journalism scholars analyzed six weeks of news articles and columns in community newspapers that described COVID-19's impact on the journalism profession. They found journalists in these local newsrooms were now open to the idea that "everything is on the table" for their survival. As part of that philosophy, journalists are beginning to embrace the need to advocate for their own profession in a way that prior generations were historically uncomfortable doing, said Ryan J. Thomas, associate professor of journalism studies in the Missouri School of Journalism.

"There has been a tendency within journalism to avoid journalism itself becoming the story, and let the quality of the work speak for itself," Thomas said. "Journalists have been holding onto this view that if they put out a quality product, the public will appreciate them. But what we've found, and which is also consistent with a few other similar studies, is that journalists' internal view of their own profession is changing. Journalists are now recognizing the need to be their own advocates, not only for the importance of their role in a democracy, but also for their own survival -- by trying to encourage people to subscribe, renew their subscription and so on."

Within two months of the COVID-19 pandemic reaching the U.S., at least 30 local newspapers across the country closed or merged under financial strain. Meanwhile, hundreds of other community newspapers responded with layoffs, furloughs and production changes, such as shifting to an exclusively online format and reducing the number of print editions. Thomas said this shift in thinking also recognizes that reliance on an advertising model, the source of revenue for much traditional journalism, is a risky venture. He said their research revealed a debate among editors and publishers about whether news about the coronavirus should be placed behind paywalls -- a newer form of revenue for many media organizations.

"On one hand, journalism is a public service and information about the coronavirus is essential for making the community aware of the health threats and safety measures that should be taken," Thomas said. "On the other hand, if that information is located behind a paywall, people cannot access it. However, a paywall is also one source of revenue and can help keep the lights on. That's what makes this a complicated situation."

Teri Finneman, an associate professor in the School of Journalism and Mass Communications at the University of Kansas, and co-author on the study, was glad to see how honest journalists were when writing about their situations.

"Over the last decade or so, I think the public keeps hearing the soundbite that journalism is dying, but I don't know if the public truly understands how journalism works," said Finneman, who received her master's degree and doctorate from MU. "So, it's important that journalists have more of these honest conversations with the public, so that the public has a greater understanding of what journalists need to be able to adequately serve them. Journalism does have business model challenges, and while some areas in journalism are thriving and great work is being done, journalism still needs communities to invest in their local newsrooms, similar to the effort journalists are making to serve the needs of their local communities."

Credit: 
University of Missouri-Columbia

Stellar feedback and an airborne observatory: scientists determine a nebula is younger than believed

image: An international team led by West Virginia University researchers studied RCW 120 to analyze the effects of stellar feedback, and found that RCW 120 must be less than 150,000 years old, which is very young for such a nebula.

Image: 
West Virginia University

In the southern sky, situated about 4,300 light years from Earth, lies RCW 120, an enormous glowing cloud of gas and dust. This cloud, known as an emission nebula, is formed of ionized gases and emits light at various wavelengths. An international team led by West Virginia University researchers studied RCW 120 to analyze the effects of stellar feedback, the process by which stars inject energy back into their environment. Their observations showed that stellar winds cause the region to expand rapidly, which enabled them to constrain the age of the region. These findings indicate that RCW 120 must be less than 150,000 years old, which is very young for such a nebula.

About seven light years from the center of RCW 120 lies the boundary of the cloud, where a plethora of stars are forming. How are all of these stars being formed? To answer that question, we need to dig deep into the origin of the nebula. RCW 120 has one young, massive star in its center, which generates powerful stellar winds. The stellar winds from this star are much like those from our own Sun, in that they throw material out from their surface into space. This stellar wind shocks and compresses the surrounding gas clouds. The energy that is being input into the nebula triggers the formation of new stars in the clouds, a process known as "positive feedback" because the presence of the massive central star has a positive effect on future star formation. The team, featuring WVU postdoctoral researcher Matteo Luisi, used SOFIA (the Stratospheric Observatory for Infrared Astronomy) to study the interactions of massive stars with their environment.

SOFIA is an airborne observatory consisting of an 8.8-foot (2.7-meter) telescope carried by a modified Boeing 747SP aircraft. SOFIA observes in the infrared regime of the electromagnetic spectrum, which is just beyond what humans can see. For observers on the ground, water vapor in the atmosphere blocks much of the light from space that infrared astronomers are interested in measuring. However, its cruising altitude of about eight miles (13 km) puts SOFIA above most of the water vapor, allowing researchers to study star-forming regions in a way that would not be possible from the ground. During its overnight flights, the observatory observes celestial magnetic fields, star-forming regions (like RCW 120), comets and nebulae. Thanks to the new upGREAT receiver that was installed in 2015, the airborne telescope can make more precise maps of large areas of the sky than ever before. The observations of RCW 120 are part of the SOFIA FEEDBACK survey, an international effort led by researchers Nicola Schneider at the University of Cologne and Alexander Tielens at the University of Maryland, which makes use of upGREAT to observe a multitude of star-forming regions.

The research team opted to observe the spectroscopic [CII] line with SOFIA, which is emitted from diffuse ionized carbon in the star-forming region. "The [CII] line is probably the best tracer of feedback on small scales, and--unlike infrared images--it gives us velocity information, meaning we can measure how the gas moves. The fact that we can now observe [CII] easily across large regions in the sky with upGREAT makes SOFIA a really powerful instrument to explore stellar feedback in more detail than was possible previously," says Matteo.

Using their [CII] observations from SOFIA, the research team found that RCW 120 is expanding at 33,000 mph (15 km/s), which is incredibly fast for a nebula. From this expansion speed, the team was able to put an age limit on the cloud and found that RCW 120 is much younger than previously believed. With the age estimate, they were able to infer the time it took for the star formation at the boundary of the nebula to kick in after the central star had been formed. These findings suggest that positive feedback processes occur on very short timescales and point to the idea that these mechanisms could be responsible for the high star formation rates that occurred during the early stages of the universe.
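The age limit follows from simple kinematics: the roughly seven-light-year distance from the central star to the nebula's boundary, divided by the measured 15 km/s expansion speed, gives an upper bound on the region's age. A back-of-envelope sketch using only the figures quoted in this article (not the team's full analysis):

```python
# Back-of-envelope age limit for RCW 120 from its size and expansion speed.
LIGHT_YEAR_KM = 9.461e12      # kilometers in one light year
SECONDS_PER_YEAR = 3.156e7    # seconds in one year

radius_km = 7 * LIGHT_YEAR_KM  # center to star-forming boundary (~7 ly)
speed_km_s = 15                # [CII]-measured expansion speed

age_years = radius_km / speed_km_s / SECONDS_PER_YEAR
print(f"Upper-bound age: ~{age_years:,.0f} years")
```

The result comes out at roughly 140,000 years, consistent with the study's conclusion that RCW 120 is less than 150,000 years old; a slower earlier expansion would only make the true age harder to reach, which is why this is an upper bound.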

Looking forward, the team hopes to expand this type of analysis to the study of more star forming regions. Matteo says, "The other regions we are looking at with the FEEDBACK survey are in different stages of evolution, have different morphologies, and some have many high-mass stars in them, as opposed to only one in RCW 120. We can then use this information to determine what processes primarily drive triggered star formation and how feedback processes differ between various types of star-forming regions."

Credit: 
West Virginia University

Chemical modification of RNA could play key role in polycystic kidney disease

image: Vishal Patel, M.D.

Image: 
UT Southwestern Medical Center

DALLAS - April 13, 2021 - A chemical modification of RNA that can be influenced by diet appears to play a key role in polycystic kidney disease, an inherited disorder that is the fourth leading cause of kidney failure in the U.S., UT Southwestern researchers report in a new study. The findings, published online today in Cell Metabolism, suggest new ways to treat this incurable condition.

About 600,000 Americans and 12.5 million people worldwide have autosomal dominant polycystic kidney disease (PKD), a condition caused by mutations in either of two genes, PKD1 or PKD2. These mutations cause kidney tubules - small tubes that filter blood and generate urine - to dilate, forming cysts that grossly enlarge the kidneys. In about 50 percent of patients, these cysts eventually cause kidney failure, necessitating dialysis or a kidney transplant.

Although one FDA-approved drug exists to treat PKD, it merely slows the decline in kidney function, explain study leaders Vishal Patel, M.D., associate professor of internal medicine at UTSW, and Harini Ramalingam, Ph.D., a postdoctoral fellow in Patel's lab. More treatments for this condition are urgently needed, they say, but the molecular mechanisms that cause PKD to develop and progress are still not fully known.

To better understand this condition, Patel, Ramalingam, and their colleagues investigated whether chemical modifications to the genetic molecule RNA, which translates instructions from DNA to produce proteins in the body, could play a part.

The researchers investigated whether the most common RNA chemical modification, known as m6A, which occurs when a methyl group is chemically attached to an RNA component called adenosine, might be altered in PKD. Specifically, they looked at the activity of Mettl3, the enzyme that performs this methylation reaction. In three mouse models of PKD, they found that Mettl3 activity and the resulting m6A levels were significantly higher than in healthy animals without this condition. The same was true for kidney samples from PKD patients compared with healthy kidney samples.

Further investigation showed that in the mouse models, Mettl3 activity increased before the first kidney cysts made their appearance, suggesting that it might be an initiating event for the disease. When researchers genetically altered healthy mice to overproduce Mettl3, the animals developed small kidney cysts, even though they didn't carry any PKD mutations. Conversely, shutting down Mettl3 in the PKD models significantly slowed cyst growth, suggesting that the enzyme plays a key role in disease progression.

Next, the scientific team tested whether limiting methionine, the dietary nutrient that supplies the raw materials for methylation, might stem cyst formation. When researchers grew kidney tissue in petri dishes with varying concentrations of methionine, cysts increased with higher concentrations. The researchers saw an opposite phenomenon when PKD mice were fed a low-methionine diet - these animals had less severe PKD.

Ramalingam notes that the findings have two important implications. First, it may be possible to partially control PKD with a vegan or vegetarian diet, since methionine is found in meat and fish. Second, identifying chemicals that stem Mettl3 activity may lead to new drugs to treat this condition.

"Both of these possibilities represent new avenues for PKD research that didn't exist before," she says.

Credit: 
UT Southwestern Medical Center

Past Global Changes Horizons - a new paleoscience magazine for teenagers and young adults

image: Illustration from the article “From the depths of the Amundsen Sea” (p. 12) by Margot Courtillat in Past Global Changes Horizons magazine, volume 1.

Image: 
Illustration by Cirenia Arias Baldrich for Past Global Changes Horizons, vol. 1

Past Global Changes Horizons is a scientific magazine that explains why the study of Earth's history is important, using comics, pictures, and drawings to support short articles with strong messages about past sciences and how to prepare for a changing future. Articles cover different environments across the planet, from caves to oceans, and from Antarctica to the Rift Valley in Africa.

Each of the 18 contributions addresses a scientific question and includes appealing and understandable figures or images, without sacrificing scientific rigor. Tips and suggestions for further research and discussion topics are also included, meaning Horizons is not only designed for students but also potentially for teachers.

It is currently an online magazine, but there will be the option (via the Past Global Changes website) to request free hard copies in May 2021. A second print run is planned for October 2021.

The first issue of Horizons was edited by Graciela Gil-Romera and Boris Vannière, and the editor-in-chief is PAGES' Science Officer Sarah Eggleston. The magazine is in English.

Graciela Gil-Romera said Horizons was created due to a lack of relatable outreach and science communication resources for this influential group.

"This is precisely the age group where more is needed to help shape their transition into decision-makers in adulthood," she said.

"The younger generation is being bombarded with fake news, extremism, and environmental anxiety. Scientists have produced this magazine for them to show that we can all be scientists, and to create not only more environmental awareness, but also to inspire them that there is still time to build a better future, and we know this thanks to the past."

Boris Vanniere believes it is imperative for active scientists to inspire and mobilise the next generation.

"Dissemination of scientific knowledge requires more attention and direct efforts on the part of scientists to reach and address all members of society, especially the younger generation," he said.

"The objective of education is to contribute to citizen power and democratic life. Only well-informed people can judge government action. Scientific advisors are useful, but only citizens who can nurture their free will will be able to see clearly the promises and uncertainties ahead, and then be freer and stronger."

Horizons is planned as an annual publication. It is a great opportunity for more senior scientists to reach out to the next generation and get them excited about their individual research topics. All topics that echo the younger generation's questions about our relationship with the environment and the Earth system, and their history, are encouraged for future issues.

Credit: 
Past Global Changes IPO

USPSTF statement on screening for vitamin D deficiency in adults

Bottom Line: The U.S. Preventive Services Task Force (USPSTF) concludes that the current evidence is insufficient to make a recommendation about screening for vitamin D deficiency in asymptomatic adults. Vitamin D plays an important role in bone metabolism. Requirements may vary by individual, and no single blood vitamin D level defines deficiency. The USPSTF routinely makes recommendations about the effectiveness of preventive care services; this recommendation updates and is consistent with its 2014 statement.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jama.2021.3069)

Editor's Note: Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Note: More information about the U.S. Preventive Services Task Force, its process, and its recommendations can be found on the newsroom page of its website.

Credit: 
JAMA Network

Study links structural brain changes to behavioral problems in children who snore

WHAT:

A large study of children has uncovered evidence that behavioral problems in children who snore may be associated with changes in the structure of their brain's frontal lobe. The findings support early evaluation of children with habitual snoring (snoring three or more nights a week). The research, published in Nature Communications, was supported by the National Institute on Drug Abuse (NIDA) and nine other Institutes, Centers, and Offices of the National Institutes of Health.

Large, population-based studies have established a clear link between snoring and behavioral problems, such as inattention or hyperactivity, but the exact nature of this relationship is not fully understood. While a few small studies have reported a correlation between sleep apnea--when pauses in breathing are prolonged--and certain brain changes, little is known about whether these changes contribute to the behaviors seen in some children with obstructive sleep-disordered breathing (oSDB), a group of conditions commonly associated with snoring that are characterized by resistance to breathing during sleep.

To address this knowledge gap, researchers led by Amal Isaiah, M.D., D.Phil., of the University of Maryland School of Medicine, capitalized on the large and diverse dataset provided by the Adolescent Brain Cognitive Development (ABCD) Study, a long-term study of child health and brain development in the United States. The team of researchers mined this wealth of data from more than 11,000 9- and 10-year-old children to examine the relationships among snoring, brain structure, and behavioral problems.

Confirming the results of previous work, their statistical analysis revealed a positive correlation between habitual snoring and behavioral problems, with the children who most frequently snored generally exhibiting worse behavior according to an assessment completed by parents. The findings further showed that snoring is linked to smaller volumes of multiple regions of the brain's frontal lobe, an area involved in cognitive functions such as problem solving, impulse control, and social interactions. The statistical analysis also suggested that the brain differences seen in children who snore may contribute to behavioral problems, but additional work on how snoring, brain structure, and behavioral problems change over time is needed to confirm a causal link.
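The kind of analysis described above - checking whether the link between snoring and behavioural problems weakens once frontal lobe volume is accounted for - can be sketched on synthetic data. Everything below (the data, the effect sizes, the simple residual-based partial correlation) is invented for illustration; the study's actual statistical models are more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Synthetic illustration: snoring frequency reduces frontal lobe volume,
# and lower volume raises the behavioural-problem score.
snoring = rng.integers(0, 7, size=n).astype(float)    # nights per week
volume = 100 - 2.0 * snoring + rng.normal(0, 5, n)    # frontal volume (a.u.)
behavior = 50 - 0.3 * volume + rng.normal(0, 3, n)    # problem score

def corr(x, y):
    return float(np.corrcoef(x, y)[0, 1])

# Total association between snoring and behaviour
r_total = corr(snoring, behavior)

# Partial association after regressing frontal volume out of both variables
def residual(y, x):
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

r_partial = corr(residual(snoring, volume), residual(behavior, volume))

print(r_total, r_partial)
```

If the brain-structure differences carry the association, the partial correlation shrinks toward zero once volume is controlled for - the pattern consistent with (though not proof of) mediation.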

This study's findings point to oSDB as a potential reversible cause of behavioral problems, suggesting that children routinely be screened for snoring. Children who habitually snore may then be referred for follow-up care. Such care may include assessment and treatment for conditions that contribute to oSDB, such as obesity, or evaluation for surgical removal of the adenoids and tonsils.

The ABCD Study, the largest of its kind in the United States, is tracking nearly 12,000 youth as they grow into young adults. Investigators regularly measure participants' brain structure and activity using magnetic resonance imaging (MRI) machines, and collect psychological, environmental, and cognitive information, as well as biological samples. The goal of the study is to define standards for normal brain and cognitive development and to identify factors that can enhance or disrupt a young person's life trajectory.

Credit: 
NIH/National Institute on Drug Abuse

Study provides novel platform to study how SARS-CoV-2 affects the gut

(Boston)--How could studying gastrointestinal cells help the fight against COVID-19, which is a respiratory disease? According to a team led by Gustavo Mostoslavsky, MD, PhD, at the BU/BMC Center for Regenerative Medicine (CReM) and Elke Mühlberger, PhD, from the National Emerging Infectious Diseases Laboratories (NEIDL) at Boston University, testing how SARS-CoV-2 affects the gut can potentially serve to test novel therapeutics for COVID-19.

In order to study SARS-CoV-2, models are needed that can duplicate disease development in humans, identify potential targets and enable drug testing. BU researchers have created human induced pluripotent stem cell (iPSC)-derived intestinal organoids - 3D models that can be infected with SARS-CoV-2 and support viral replication.

iPSCs are stem cells derived from donated skin or blood cells that are reprogrammed back to an embryonic stem cell-like state and can then be developed into any cell type in the body.

"Human induced pluripotent stem cell derived intestinal organoids represent an inexhaustible cellular resource that could serve as a valuable tool to study SARS-CoV-2, as well as other intestinal viruses that infect the intestinal epithelium," explained corresponding author Mostoslavsky, associate professor of microbiology at Boston University School of Medicine (BUSM) and co-director of the CReM.

The researchers differentiated the iPSCs into colonic and small intestine 3D organoids. The organoids were then passed along to the Mühlberger lab at the NEIDL, where they were infected with SARS-CoV-2, and the effect of infection on the cells was analyzed by staining against markers, by electron microscopy and by RNA sequencing.

"Our findings suggest that different epithelial tissues (such as the lung and the gut) react in a similar manner to SARS-CoV-2 infection and therefore can help identify common mechanisms of disease that can be targeted by drugs," added Mühlberger, director of Integrated Science Services at the NEIDL and professor of microbiology at BUSM.

Credit: 
Boston University School of Medicine

Scientists identify severe asthma species, show air pollutant as likely contributor

Asthma afflicts more than 300 million people worldwide. The most severe manifestation, known as non-Th2, or non-atopic childhood asthma, represents the majority of the cases, greater than 85%, particularly in low-income countries, according to Hyunok Choi (https://health.lehigh.edu/faculty/choi-hyunok), an associate professor at the Lehigh University College of Health (https://health.lehigh.edu/). Yet, whether non-Th2 is a distinct disease (or endotype) or simply a unique set of symptoms (or phenotype) remains unknown.

"Non-Th2 asthma is associated with very poor prognosis in children and great, life-long suffering due to the absence of effective therapies," says Choi. "There is an urgent need to better understand its mechanistic origin to enable early diagnosis and to stop the progression of the disease before it becomes severe."

Studies show that nearly 50% of the children whose asthma is poorly controlled are expected to emerge as severe adult cases. Yet, a one-size-fits-all treatment approach, currently the norm for asthma, is ineffective and, says Choi, partially responsible for asthma's growing economic burden.

"The primary reason for lack of therapeutic and preventive measures is that no etiologic, or causal, driver has ever been identified for the non-Th2 asthma," says Choi.

Now, for the first time, an epidemiological study, led by Choi, has shown that not only is non-Th2 a distinct disease, its likely inducer is early childhood exposure to airborne Benzo[a]pyrene, a byproduct of fossil fuel combustion. Choi and her colleagues are the first to demonstrate air pollution as a driver of the most challenging type of asthma, the severe subtype which is non-responsive to current therapies.

The team describes their results in an article recently published online in the journal Environmental Health, "Airborne Benzo[a]Pyrene May Contribute to Divergent Pheno-Endotypes in Children" (https://rdcu.be/cip0w). Additional authors: Miroslav Dostal, Anna Pastorkova, Pavel Rossner, Jr., and Radim J. Sram from the Department of Genetic Toxicology and Nanotoxicology, Institute of Experimental Medicine, Czech Academy of Sciences, Prague, Czech Republic.

What is termed asthma is an umbrella term for multiple diseases with common symptoms. Asthma has been broadly classified into two major sets of symptoms: T helper cell high (Th2-high) and T helper cell low (non-Th2). Th2-high is associated with early-childhood allergies to common allergens such as pet dander, tree pollens, or mold. In contrast, non-Th2 is not related to an allergic response. The non-Th2 type, marked explicitly by being non-allergy-related, is far less understood than the Th2-high type and can progress into a severe or difficult-to-treat form.

"The identification of non-Th2 asthma as a distinct disease, with early exposure to Benzo[a]pyrene as a driver, has the potential to impact tens of millions of sufferers, since this would make it possible to intervene before the onset of irreversible respiratory injuries," says Choi.

The team tested two comparable groups of children from an industrial city, Ostrava, and the surrounding semi-rural area of Southern Bohemia, in the Czech Republic: 194 children with asthma and a control group consisting of 191 children. According to the study, Ostrava is an industrial city with a high level of coal mining activities, coal processing, and metallurgical refinement. The district-level ambient mean for Benzo[a]pyrene at the time of their investigation (November 2008) was 11 times higher than the recommended outdoor and indoor air quality standard.

Not only was elevated exposure to Benzo[a]pyrene associated with correspondingly elevated odds of non-Th2 asthma, it was also associated with depressed systemic oxidant levels.

"Contrary to the current body of evidence supporting adult onset of non-atopic asthma, our data suggest for the first time that the lung function deficit and suppressed oxidative stress levels during early childhood are critical sentinel events preceding non-atopic asthma," says Choi.

Credit: 
Lehigh University

Liver transplants: Improving waitlist mortality by improved risk assessment

The top priority in the field of transplantation is to ensure that donor organs are allocated to the patients with the greatest need.

In a large-scale joint international project conducted by the Medical University of Vienna and the Mayo Clinic in Rochester (USA), researchers from the Department of General Surgery and the Division of Gastroenterology and Hepatology of MedUni Vienna's Department of Medicine III, have made a significant step forward to improve prediction of survival on the waiting list for liver transplantation by including additional laboratory parameters.

Donor livers are allocated to patients on the waiting list for a liver transplant on the basis of their individual medical need. Currently, patients are ranked by means of a score calculated from three blood values, with or without sodium (the Model for End-Stage Liver Disease, or MELD, score). However, over the past few years, significant limitations of the MELD-based system have been identified, and waitlist mortality remains at 20%.
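For reference, the classic MELD calculation can be sketched in a few lines. The clamping rules and sodium adjustment below follow commonly published definitions of the UNOS variant, but the exact formulation differs between allocation systems, so treat this as an illustration rather than a clinical calculator.

```python
import math

def meld(bilirubin_mg_dl, creatinine_mg_dl, inr):
    """Classic MELD score (UNOS-style): lab values are floored at 1.0 and
    creatinine is capped at 4.0 before taking logarithms; the total is
    bounded to the 6-40 range."""
    bili = max(bilirubin_mg_dl, 1.0)
    crea = min(max(creatinine_mg_dl, 1.0), 4.0)
    inr_c = max(inr, 1.0)
    score = (3.78 * math.log(bili)
             + 9.57 * math.log(crea)
             + 11.2 * math.log(inr_c)
             + 6.43)
    return round(min(max(score, 6), 40))

def meld_na(meld_score, sodium_mmol_l):
    """Sodium-adjusted MELD: sodium is bounded to 125-137 mmol/L, and the
    adjustment is commonly applied only when MELD exceeds 11."""
    na = min(max(sodium_mmol_l, 125), 137)
    if meld_score > 11:
        adjusted = meld_score + 1.32 * (137 - na) - 0.033 * meld_score * (137 - na)
        return round(min(max(adjusted, 6), 40))
    return meld_score

print(meld(1.0, 1.0, 1.0))  # normal labs give the minimum score
```

The study's proposal is to extend this kind of score with two further laboratory values (vWF antigen and CRP); the published article should be consulted for the actual model.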

A particular challenge in this context is the ability to identify patients with a high risk of complications due to portal hypertension or acute-on-chronic liver failure, as the MELD score might not adequately map these conditions.

"We have now attempted to address these existing weaknesses in the MELD-based allocation system and to identify patients with a high mortality risk, despite having a comparatively low MELD score," explains Principal Investigator Patrick Starlinger from MedUni Vienna's Department of General Surgery, who is currently also working at the Mayo Clinic in Rochester. "It was particularly important to us to confirm our findings internationally and thereby document the potential improvement of organ allocation at other transplant centres and in other transplant systems."

Simple blood test for assessing risk

von Willebrand factor (vWF) is a central component of the blood clotting system, but the vWF antigen level in the blood is also an excellent marker for portal hypertension. This offers a great advantage, as the vWF antigen level can readily be determined from a blood sample taken when patients are listed. C-reactive protein (CRP) is likewise an easily measured routine parameter that is involved in inflammatory processes in patients with liver disease, even before infections develop or acute-on-chronic liver failure occurs.

Patrick Starlinger's study, which was produced in collaboration with researchers from the Medical University of Vienna and the Mayo Clinic in Rochester, found that the prediction of waitlist mortality could be significantly improved by expanding the currently used MELD score to include the vWF antigen level and the CRP value. This was also confirmed in the US patient cohort. "We also managed to show that both blood parameters reflect pathophysiological processes that drive the development of acute life-threatening complications in patients on the waiting list for a liver transplant. Having established vWF-Ag and CRP as valuable markers for portal hypertension and inflammatory processes in liver cirrhosis patients in our previous papers, this study represents another important step towards the application of this blood test in clinical care," says Mattias Mandorfer, Head of the Hepatic Hemodynamic Laboratory at the Medical University of Vienna.

This study could therefore have a significant and long-lasting influence on the existing allocation system for donor livers and ultimately significantly reduce waitlist mortality among patients.

Credit: 
Medical University of Vienna

Indicators for a new audience measurement model for streaming platforms

In recent years, the boom in streaming platforms and video on demand services has disrupted audiences, making it difficult to measure the number of viewers of the content distributed by these platforms.

This new situation has not only altered the traditional television and film viewing model, but has also impacted the advertising market, a fundamental factor in the funding and business of audiovisual entertainment.

In this context, real and objective audience measurement (which is not influenced by the interests of the platforms) has become a key objective; it is fundamental to obtain real-time data on the reach of each production released so as to analyse its performance, know its market position, meet user demands and develop profitable services.

A recent study performed by researchers from the Universitat Oberta de Catalunya (UOC) analysed audience behaviour and measurement systems on the Netflix streaming platform and video on demand service. Their aim was to establish a more reliable audience measurement model.

"The audience has been the main financial driving force of television while advertising has been its main source of income, and therefore, for an evolving audiovisual sector, it is crucial to have accurate viewer and user numbers," explained Elena Neira, a researcher from the GAME group of the UOC Faculty of Information and Communication Sciences and the main author of the study.

New consumer habits influence audience measurement

The proliferation of streaming platforms and video on demand services has exponentially increased the quantity of content offered to users, leading them to change the way in which they watch series and films.

These new consumer habits have generated a new TV and video ecosystem which, among other factors, stems from a wider variety of devices on which people can view content, such as Smart TVs, smartphones, computers or tablets.

"At present, viewers can decide how, where and when to watch a series or a film, and therefore the traditional audience measurement models are not capable of covering the new consumer reality fully. Indeed, the idea of an audience in the sphere of streaming goes far beyond and is much more complex than the simple accumulation of viewings," said Neira, who also stressed that, at present, nobody knows the market share or average use of the platforms, or how many people have abandoned traditional television because they can watch content online.

"Our objective is to offer a starting point and to study in depth the real market share of streaming within the framework of the system's structure. We also want to offer some certainty and information that is of value to everyone, but in particular for the television companies and the creators," the UOC researcher underlined.

To analyse this new TV and video ecosystem, the experts chose to assess the production Money Heist, since it allowed them to measure the success and popularity of the series through different channels such as traditional television and a streaming and video on demand platform, Netflix.

Since being included among the content offered by Netflix, this Spanish-made production has become a worldwide phenomenon, for which there are no specific audience data.

New parameters for reliable audience measurement

The researchers from the UOC indicate that the concept of audience has been altered in this new TV and video ecosystem. This is due to the evolution of its parameters, which now include new metrics such as audience retention or the popularity of the content, which are difficult to standardize for measurement.

To carry out better audience measurements, factors such as audience fragmentation should be taken into account, along with the need to weight the data collected, giving importance to variables such as viewing intensity - the famous binge watching - or the volatility of the streaming platforms' users. In this respect, the researcher Elena Neira stressed that "we must include new dimensions, since the new concept of audience includes aspects that are especially relevant such as the users' commitment to or involvement with the content and the depth of attention of each viewer."
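As a purely hypothetical illustration of such a weighted measurement, raw viewings could be blended with engagement signals like completion rate, binge intensity and retention. The metric names and weights below are invented for illustration and are not drawn from the study.

```python
def engagement_score(views, completion_rate, binge_ratio, retention,
                     weights=(0.4, 0.3, 0.3)):
    """Toy composite audience metric: raw viewings scaled by a weighted
    blend of engagement signals, each expressed in the 0-1 range.
    The weights are arbitrary placeholders, not values from the study."""
    w_complete, w_binge, w_retain = weights
    quality = (w_complete * completion_rate
               + w_binge * binge_ratio
               + w_retain * retention)
    return views * quality

# A title with fewer raw views can outrank one with many shallow viewings.
deep = engagement_score(1_000_000, completion_rate=0.9,
                        binge_ratio=0.8, retention=0.85)
shallow = engagement_score(2_000_000, completion_rate=0.2,
                           binge_ratio=0.1, retention=0.15)
print(deep > shallow)
```

The point of such a composite is exactly the one Neira makes: a count of viewings alone says nothing about commitment or depth of attention.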

The heterogeneity intrinsic to the business model of these platforms introduces elements which greatly hinder the construction of a standard and global audience concept. For example, unlike traditional television channels, there is not a level playing field for streaming platforms as regards household consumption and penetration, market share and availability.

All this leads to audience measurement distortions, which may be amplified when the lifecycle of the content on a streaming platform is taken into account, since this is a significant factor determining the number of viewers.

Other external factors which may influence reliable audience measurement are the impact achieved on social media, the number of downloads or the number of searches carried out on search engines like Google.

"The use of streaming platforms is a mainstream activity; they are gaining more and more hours of the population's entertainment time. This not only affects the sector, but also has legislative implications, since these business models are not subject to the same regulations as traditional television companies and should have certain obligations so that the size of their contribution to the state coffers can be determined," said Neira, who warned that the platforms tend to provide only overall audience data, without figures specific to each territory.

This UOC research supports sustainable development goal (SDG) 8, decent work and economic growth, and industry, innovation and infrastructure.

Credit: 
Universitat Oberta de Catalunya (UOC)

"Shedding light" on the role of undesired impurities in gallium nitride semiconductors

image: Carbon impurities in gallium nitride (GaN) semiconductors affect GaN crystal growth and degrade their performance.

Image: 
Image courtesy: Masashi Kato from Nagoya Institute of Technology

The semiconductor industry and pretty much all of electronics today are dominated by silicon. In transistors, computer chips, and solar cells, silicon has been a standard component for decades. But all this may change soon, with gallium nitride (GaN) emerging as a powerful, even superior, alternative. While not widely known, GaN semiconductors have been on the electronics market since the 1990s and are often employed in power electronic devices due to their larger bandgap than silicon--an aspect that makes them better candidates for high-voltage and high-temperature applications. Moreover, current travels faster through GaN, which ensures lower losses during switching.

Not everything about GaN is perfect, however. While impurities are usually desirable in semiconductors, unwanted impurities can often degrade their performance. In GaN, impurities such as carbon atoms often lead to poorer switching performance due to trapping of charge carriers in "deep levels," energy levels created by the impurity defects in the GaN crystal layers and thought to originate from the presence of a carbon impurity on a nitrogen site.

A curious experimental manifestation of deep levels is the appearance of a long-lived yellow luminescence in the photoluminescence spectrum of GaN along with a long charge carrier recombination time reported by characterization techniques like time-resolved photoluminescence (TR-PL) and microwave photoconductivity decay (μ-PCD). However, the mechanism underlying this longevity is unclear.

In a recent study published in the Journal of Applied Physics, scientists from Japan explored the effect of deep levels on the yellow luminescence decay time and carrier recombination by observing how the TR-PL and μ-PCD signals changed with temperature. "Only after understanding the impacts of impurities in GaN power semiconductor devices can we push for the development of impurity control technologies in GaN crystal growth," says Prof. Masashi Kato from Nagoya Institute of Technology, Japan, who led the study.

The scientists prepared two samples of GaN layers grown on GaN substrates, one doped with silicon and the other with iron. The unintentional doping of carbon impurities happened during the silicon doping process. For the TR-PL measurements, the team recorded signals for temperatures up to 350°C while for μ-PCD up to 250°C due to system limitations. They used a 1 nanosecond-long UV laser pulse to excite the samples and measured the reflection of microwaves from the samples for μ-PCD.

The TR-PL signals for both samples showed a slower (decay) component with a decay time of 0.2-0.4 milliseconds. Additionally, the use of a long-pass filter with a cut-off at 461 nm confirmed that yellow light was involved. In both samples, and for both TR-PL and μ-PCD measurements, the decay time declined above 200°C, consistent with previous reports.

To explain these findings, the scientists resorted to numerical calculations, which revealed that the deep levels essentially trapped "holes" (absences of electrons) that eventually recombined with free electrons, but took a long time to do so due to the extremely small chance of an electron being captured by the deep level. However, at high temperatures, the holes managed to escape from the trap and recombined with electrons through a much faster recombination channel, explaining the decline in decay time.
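The temperature dependence described above can be illustrated with a toy two-channel model: a slow deep-level recombination channel competes with a thermally activated trap-escape rate of the Arrhenius form nu * exp(-Ea / kT). All parameter values below are invented for illustration and are not taken from the study.

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def decay_time(temp_c, tau_slow=3e-4, nu=1e12, e_a=0.85):
    """Toy two-channel model: at low temperature the slow deep-level
    channel (tau_slow, in seconds) dominates; at high temperature holes
    escape the trap at rate nu * exp(-Ea / kT) and recombine through a
    much faster channel. Illustrative parameters only."""
    t_kelvin = temp_c + 273.15
    escape_rate = nu * math.exp(-e_a / (K_B * t_kelvin))
    return 1.0 / (1.0 / tau_slow + escape_rate)

low = decay_time(25)    # near room temperature: close to tau_slow
high = decay_time(300)  # hot: thermally activated escape dominates
print(low, high)
```

With these (made-up) numbers, the observed decay time sits near tau_slow at room temperature and collapses by more than an order of magnitude at a few hundred degrees Celsius, mirroring the reported decline above 200°C.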

"To reduce the effects of the slow decay component, we must either maintain a low carbon concentration or adopt device structures with suppressed hole injections," says Prof. Kato.

With these insights, it is perhaps only a matter of time before scientists figure out how to avoid these pitfalls. But with GaN's rise to power, will the payoff be limited to better electronics?

Prof. Kato thinks otherwise. "GaN enables lower power losses in electronic devices and therefore saves energy. I think it can go a long way in mitigating greenhouse effects and climate change," he concludes optimistically. These findings on impurities may thus be what lead us to a cleaner, greener future!

Credit: 
Nagoya Institute of Technology

NTU Singapore study investigates link between COVID-19 and risk of blood clot formation

image: (from L-R) Assistant Professor Christine Cheung, Research Fellow Dr Wu Kanxing, and Research Assistant Florence Chioh, from NTU Lee Kong Chian School of Medicine.

Image: 
NTU

People who have recovered from COVID-19, especially those with pre-existing cardiovascular conditions, may be at risk of developing blood clots due to a lingering and overactive immune response, according to a study led by Nanyang Technological University, Singapore (NTU) scientists.

The team of researchers, led by NTU Assistant Professor Christine Cheung, investigated the possible link between COVID-19 and an increased risk of blood clot formation, shedding new light on "long-haul COVID" - the name given to the medium- and long-term health consequences of COVID-19.

The findings may help to explain why some people who have recovered from COVID-19 exhibit symptoms of blood clotting complications after their initial recovery. In some cases, they are at increased risk of heart attack, stroke or organ failure when blood clots block major arteries to vital organs.

The team, comprising researchers from NTU, Agency for Science, Technology and Research's (A*STAR) Singapore Immunology Network (SIgN), and the National Centre of Infectious Diseases, Singapore (NCID), collected and analysed blood samples from 30 COVID-19 patients a month after they had recovered from the infection and were discharged from hospital. They found that all recovered COVID-19 patients had signs of blood vessel damage, possibly from a lingering immune response, which may trigger the formation of blood clots.

Their findings were published on 23 March in the peer-reviewed scientific journal eLife.

"With more people recovering from COVID-19, we started hearing from clinicians about patients returning with blood clotting issues after they had been discharged and cleared of the virus," said Asst Prof Christine Cheung from NTU's Lee Kong Chian School of Medicine. "This makes a strong case for the close monitoring of recovered COVID-19 patients, especially those with pre-existing cardiovascular conditions like hypertension and diabetes who have weakened blood vessels."

Blood vessel damage due to post-recovery overactive immune system

The team found that recovered COVID-19 patients had twice the normal number of circulating endothelial cells (CECs) that had been shed from damaged blood vessel walls. The elevated levels of CECs indicate that blood vessel injury is still apparent after recovering from viral infection.

The researchers also found that recovered COVID-19 patients continued to produce high levels of cytokines - proteins produced by immune cells that activate the immune response against pathogens - even in the absence of the virus.

Unusually high numbers of immune cells, known as T cells, that attack and destroy viruses were also present in the blood of recovered COVID-19 patients.

The presence of both cytokines and higher levels of immune cells suggest that the immune systems of recovered COVID-19 patients remained activated even once the virus was gone.

The researchers hypothesise that these persistently activated immune responses may attack the blood vessels of recovered COVID-19 patients, causing even more damage and increasing the risk of blood clot formation further.

The study's first author Florence Chioh, a Research Assistant at NTU's Lee Kong Chian School of Medicine, said: "While COVID-19 is mainly a respiratory infection, the virus may also attack the linings of blood vessels, causing inflammation and damage. Leakage from these damaged vessels triggers the formation of blood clots that may result in the sort of complications seen in the patients during hospitalisation."

One of the co-authors of the paper, Professor Lisa Ng, Executive Director of A*STAR Infectious Diseases Labs and previously Senior Principal Investigator at SIgN, said: "We assessed the levels of immune mediators in these patients, which revealed several proinflammatory and activated T lymphocyte-associated cytokines sustained from infection to recovery phase. This correlated positively with CEC measure, implying cytokine-driven vessel damage. We found that COVID-19 patients with vascular complications have a higher frequency of T cells, which may in turn attack the blood vessels. Preventive therapy may be needed for these patients."

Emphasising post-hospitalisation care for at-risk COVID-19 patients

The study's key findings can help inform guidelines for post-hospitalisation care of COVID-19 patients who might be susceptible to 'long-haul COVID' symptoms, said the research team.

In January this year, the World Health Organisation (WHO) released a recommendation in their revised clinical management guidelines, targeted at the risk of blood clot formation. For hospitalised patients, WHO recommended the use of low-dose anticoagulants to prevent blood clots from forming in blood vessels.

Asst Prof Cheung added: "Those with cardiovascular conditions need to be more cautious since their underlying conditions already weaken their vascular systems. It's a double blow with COVID-19. As we gain greater understanding of the complications COVID 'long-haulers' face, there is hope to encourage vaccine take-up to protect oneself from both the virus and its long-term complications."

Moving forward, the team is investigating the longer-term effects of COVID-19 in patients who recovered from the infection six months ago or longer.

Credit: 
Nanyang Technological University