Tech

Dark energy survey releases most precise look at the universe's evolution

image: The Dark Energy Survey camera (DECam) at the SiDet clean room. The Dark Energy Camera was designed specifically for the Dark Energy Survey. It was funded by the Department of Energy (DOE) and was built and tested at DOE's Fermilab.

Image: 
DOE/FNAL/DECam/R. Hahn/CTIO/NOIRLab/NSF/AURA

In 29 new scientific papers, the Dark Energy Survey examines the largest-ever maps of galaxy distribution and shapes, extending more than 7 billion light-years across the Universe. The extraordinarily precise analysis, which includes data from the survey's first three years, contributes to the most powerful test of the current best model of the Universe, the standard cosmological model. However, hints remain from earlier DES data and other experiments that matter in the Universe today is a few percent less clumpy than predicted.

New results from the Dark Energy Survey (DES) use the largest-ever sample of galaxies observed over nearly one-eighth of the sky to produce the most precise measurements to date of the Universe's composition and growth.

DES images the night sky using the 570-megapixel Dark Energy Camera on the National Science Foundation's Víctor M. Blanco 4-meter Telescope at Cerro Tololo Inter-American Observatory (CTIO) in Chile, a Program of NSF's NOIRLab. One of the most powerful digital cameras in the world, the Dark Energy Camera was designed specifically for DES. It was funded by the Department of Energy (DOE) and was built and tested at DOE's Fermilab.

Over the course of six years, from 2013 to 2019, DES used 30% of the time on the Blanco Telescope and surveyed 5000 square degrees -- almost one-eighth of the entire sky -- in 758 nights of observation, cataloging hundreds of millions of objects. The results announced today draw on data from the first three years -- 226 million galaxies observed over 345 nights -- to create the largest and most precise maps yet of the distribution of galaxies in the Universe at relatively recent epochs. The DES data were processed at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.

"NOIRLab is a proud host for and member of the DES collaboration," said Steve Heathcote, CTIO Associate Director. "Both during and after the survey, the Dark Energy Camera has been a popular choice for community and Chilean astronomers."

At present the Dark Energy Camera is used for programs covering a huge range of science including cosmology. The Dark Energy Camera science archive, including DES Data Release 2 on which these results are based, is curated by the Community Science and Data Center (CSDC), a Program of NSF's NOIRLab. CSDC provides software systems, user services, and development initiatives to connect and support the scientific missions of NOIRLab's telescopes, including the Blanco telescope at CTIO.

Since DES studied nearby galaxies as well as those billions of light-years away, its maps provide both a snapshot of the current large-scale structure of the Universe and a view of how that structure has evolved over the past 7 billion years.

Ordinary matter makes up only about 5% of the Universe. Dark energy, which cosmologists hypothesize drives the accelerating expansion of the Universe by counteracting the force of gravity, accounts for about 70%. The remaining 25% is dark matter, whose gravitational influence binds galaxies together. Both dark matter and dark energy remain invisible. DES seeks to illuminate their nature by studying how the competition between them shapes the large-scale structure of the Universe over cosmic time.

To quantify the distribution of dark matter and the effect of dark energy, DES relied mainly on two phenomena. First, on large scales galaxies are not distributed randomly throughout space but rather form a weblike structure that is due to the gravity of dark matter. DES measured how this cosmic web has evolved over the history of the Universe. The galaxy clustering that forms the cosmic web in turn revealed regions with a higher density of dark matter.

Second, DES detected the signature of dark matter through weak gravitational lensing. As light from a distant galaxy travels through space, the gravity of both ordinary and dark matter in the foreground can bend its path, as if through a lens, resulting in a distorted image of the galaxy as seen from Earth. By studying how the apparent shapes of distant galaxies are aligned with each other and with the positions of nearby galaxies along the line of sight, DES scientists were able to infer the clumpiness of the dark matter in the Universe.
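The alignment measurement described above can be illustrated with a toy calculation (this is a minimal sketch with fabricated data, not the DES pipeline): lensing by a foreground mass tends to stretch background galaxies tangentially around it, so averaging the tangential component of many source-galaxy ellipticities around a lens position reveals the signal.

```python
import numpy as np

def mean_tangential_shear(lens_xy, src_xy, e1, e2):
    """Toy estimator: average tangential ellipticity of source galaxies
    around a single lens position. A positive mean indicates coherent
    tangential alignment, the classic weak-lensing signature."""
    dx = src_xy[:, 0] - lens_xy[0]
    dy = src_xy[:, 1] - lens_xy[1]
    phi = np.arctan2(dy, dx)  # position angle of each source around the lens
    # Rotate each ellipticity into the tangential frame of the lens
    e_t = -(e1 * np.cos(2 * phi) + e2 * np.sin(2 * phi))
    return e_t.mean()

# Fabricated demo: sources given a purely tangential distortion of 0.05
rng = np.random.default_rng(0)
src = rng.uniform(-1, 1, size=(500, 2))
phi = np.arctan2(src[:, 1], src[:, 0])
e1, e2 = -0.05 * np.cos(2 * phi), -0.05 * np.sin(2 * phi)
print(round(float(mean_tangential_shear(np.zeros(2), src, e1, e2)), 3))  # 0.05
```

Real analyses bin this average in lens-source separation and stack over millions of lens-source pairs, since the per-galaxy distortion is far smaller than intrinsic galaxy shape noise.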

To test cosmologists' current model of the Universe, DES scientists compared their results with measurements from the European Space Agency's orbiting Planck observatory. Planck used light known as the cosmic microwave background to peer back to the early Universe, just 400,000 years after the Big Bang. The Planck data give a precise view of the Universe 13 billion years ago, and the standard cosmological model predicts how the dark matter should evolve to the present.

Combined with earlier results, DES provides the most powerful test of the current best model of the Universe to date, and the results are consistent with the predictions of the standard model of cosmology. However, hints remain from DES and several previous galaxy surveys that the Universe today is a few percent less clumpy than predicted [1].

Ten regions of the sky were chosen as "deep fields" that the Dark Energy Camera imaged repeatedly throughout the survey. Stacking those images together allowed the scientists to glimpse more distant galaxies. The team then used the redshift information from the deep fields to calibrate the rest of the survey region. This and other advancements in measurements and modeling, coupled with a threefold increase in data compared to the first year, enabled the team to pin down the density and clumpiness of the Universe with unprecedented precision.
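The benefit of stacking repeated exposures can be sketched with simulated images (an illustration of the principle only; real survey coaddition also handles alignment, weighting and artifact rejection): averaging N exposures shrinks the noise roughly as 1/sqrt(N), letting faint sources emerge.

```python
import numpy as np

# Simulated field: one faint source buried well below single-exposure noise
rng = np.random.default_rng(1)
truth = np.zeros((32, 32))
truth[16, 16] = 0.5                       # faint source, amplitude 0.5
noise_sigma = 1.0
exposures = [truth + rng.normal(0, noise_sigma, truth.shape) for _ in range(100)]

single = exposures[0]
stacked = np.mean(exposures, axis=0)      # noise shrinks like 1/sqrt(N)

print(f"single-exposure noise ~{single.std():.2f}, "
      f"stacked noise ~{stacked.std():.2f}")
```

With 100 exposures the noise drops by about a factor of ten, so a source invisible in any one image stands out clearly in the stack.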

DES concluded its observations of the night sky in 2019. With the experience gained from analyzing the first half of the data, the team is now prepared to handle the complete dataset. The final DES analysis is expected to paint an even more precise picture of the dark matter and dark energy in the Universe.

The DES collaboration consists of over 400 scientists from 25 institutions in seven countries.

"The collaboration is remarkably young. It's tilted strongly in the direction of postdocs and graduate students who are doing a huge amount of this work," said DES Director and spokesperson Rich Kron, who is a Fermilab and University of Chicago scientist. "That's really gratifying. A new generation of cosmologists are being trained using the Dark Energy Survey."

The methods developed by the team have paved the way for future sky surveys such as the Rubin Observatory Legacy Survey of Space and Time. "DES shows that the era of big survey data has well and truly begun," notes Chris Davis, NSF's Program Director for NOIRLab. "DES on NSF's Blanco telescope has set the scene for the remarkable discoveries to come with Rubin Observatory over the coming decade."

Credit: 
Association of Universities for Research in Astronomy (AURA)

Video platforms normalize exotic pets

Researchers at the University of Adelaide are concerned video sharing platforms such as YouTube could be contributing to the normalisation of exotic pets and encouraging the exotic pet trade.

In a study, published in PLOS ONE, researchers analysed the reactions of people to videos on YouTube involving human interactions with exotic animals and found those reactions to be overwhelmingly positive.

The researchers analysed the reactions - via text and emoji usage - in comments posted on 346 popular videos starring exotic wild cats and primates in 'free handling situations'.

These situations involved exotic animals interacting with humans or other animals, such as domestic cats and dogs. The videos examined received more than a million views and the comments posted were made between 2006 and October 2019.
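The kind of text-and-emoji coding such an analysis relies on can be sketched as follows. The cue lists below are illustrative assumptions, not the study's actual coding scheme:

```python
# Hypothetical keyword/emoji cues for classifying comment sentiment
POSITIVE = {"cute", "adorable", "love", "want one", "😍", "❤️"}
NEGATIVE = {"cruel", "abuse", "wild animal", "sad", "😢", "😡"}

def classify(comment: str) -> str:
    """Label a comment by counting positive vs. negative cues it contains."""
    text = comment.lower()
    pos = sum(cue in text for cue in POSITIVE)
    neg = sum(cue in text for cue in NEGATIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

comments = ["So cute 😍 I want one!",
            "This is cruel, they are wild animals 😢"]
print([classify(c) for c in comments])  # ['positive', 'negative']
```

A production analysis would use a validated sentiment lexicon and handle negation, sarcasm and multilingual text, but the counting logic is the core idea.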

First author and final year veterinary medicine student at the University of Adelaide, Georgia Moloney, said, while YouTube is not the only media platform portraying images of 'unnatural interactions' with exotic animals, it is the number one video sharing platform globally and presently the third largest social media platform overall.

"The types of interactions we observed in the videos analysed on YouTube included monkeys in nappies like children, primates as pets and pet tigers chained up and interacting with people on suburban front lawns,'' she said.

"In addition to comments along the lines of 'Isn't that cute', we found that people also indicated they wanted to be close to the animal and have a similar interaction of their own.

"This is of concern because it could indicate that people think these interactions are not only normal and okay, but desirable, and could support the exotic pet trade."

The only change in sentiment observed in the study occurred in 2015, when a negative trend was observed in reaction to videos featuring primates. The researchers say this could be partly attributed to a 2015 International Animal Rescue campaign to stop cruelty against the slow loris, titled, 'Tickling is Torture'.

"The negative trend we observed in comments on primate videos in 2015 could be connected to the 'Tickling is Torture' campaign, and demonstrates the power of social media and the role it can play in preventing animal cruelty and exploitation,'' Ms Moloney said.

The exotic pet trade is a global problem, and the videos in the study were uploaded from all over the world.

"We saw content uploaded from countries within all six continents. The illegal wildlife trade is a bigger problem than people realise.

"In Australia, for example, border security continues to see all sorts of native reptiles being smuggled out of our country."

Study leader, Dr Anne-Lise Chaber from the University of Adelaide School of Animal and Veterinary Sciences, who has been examining the exotic pet and illegal wildlife trade since 2008, said, while YouTube has policies outlining expectations and limitations of content published, they rely heavily on the public to report breaches and illegal content.

"Current policies rely on the public to identify what is harmful or distressing to the animal, and yet people may not have the knowledge to do that," Dr Chaber said.

"A slow loris which appears to be smiling when tickled in a video is neither violent nor graphic content and therefore it's left to the viewer to identify this sign of distress and report the video."

Dr Chaber adds: "One way YouTube could play a more active role in educating the public about what is inappropriate is by embedding an icon on videos which when clicked take people to important information about the animal.

"Education is key but this needs to go hand-in-hand with improved policies and reporting systems."

The researchers say further improvements to YouTube's policies and reporting systems could include software to automatically detect key terms such as species names within video titles or descriptions and flag them for immediate review. They also suggest artificial intelligence systems that accurately identify threatened exotic species depicted in content and inform the public about their conservation status before permitting viewing, similar to Instagram's Wildlife Alert System.
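The keyword-flagging step the researchers propose could work along these lines (a minimal sketch; the watchlist here is a small invented sample, and a real system would match scientific names, misspellings and translations):

```python
# Illustrative watchlist of species terms that should trigger review
WATCHLIST = {"slow loris", "tiger", "chimpanzee", "serval", "capuchin"}

def flag_for_review(title: str, description: str = "") -> set[str]:
    """Return the watchlisted species terms found in a video's metadata."""
    text = f"{title} {description}".lower()
    return {species for species in WATCHLIST if species in text}

print(flag_for_review("My pet slow loris loves being tickled!"))
# {'slow loris'}
```

Flagged videos would then go to human moderators or an image-recognition stage, rather than relying solely on viewer reports.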

Credit: 
University of Adelaide

This brain circuit signals when to stop eating; could regulating it help with obesity?

Like a good story, feeding has a beginning, a middle and an end. It begins with appetite prompting the search for food, continues with eating and ends when satiation hits and food consumption stops.

At Baylor College of Medicine, Dr. Qi Wu, Dr. Yong Han and their colleagues have uncovered new aspects of the last part of this story that relate to the little-known neural circuits and neurotransmitters involved in ending food consumption.

The team discovered a novel circuit that connects a unique subset of dopamine-producing neurons with downstream neurons in the hindbrain (lower brainstem) and potently suppresses food intake by triggering satiation in mice. They also found that the FDA-approved drug methylphenidate (MPH) mediates its noticeable weight loss effect by activating this particular circuit, opening the possibility that regulating this circuit might help people control weight. The study appears in the journal Science Advances.

"Many people struggle with weight control, eating more than what the body needs, which adds extra pounds that can lead to obesity and higher risk of serious conditions such as heart disease, stroke and type 2 diabetes," said Han, a postdoctoral associate in pediatrics-nutrition in the Wu lab and the first author of this study. "Our lab is interested in improving our understanding of what goes on in the brain during feeding with the hope that our findings might one day help people better control their weight."

New insights into brain regulation of the satiation response:

"The current study is about a circuit in the brain that helps to precisely regulate the size of the food portion that is consumed," said Wu, assistant professor in pediatrics-nutrition and the corresponding author of the study. "It is not about how eating begins but about how it ends. It's about the satiation response, which is as important as appetite."

Using several advanced techniques to study neural function, including cell-specific circuitry mapping, optogenetics and real-time recordings of brain activity, the researchers discovered a novel neural circuit that connects a unique group of dopamine-producing neurons called DA-VTA with downstream target neurons known as DRD1-LPBN and regulates food consumption in mice.

The team examined the activities of the two sets of neurons while the mice were eating. They observed that the activity of these DA-VTA neurons increased immediately before the animals stopped eating. When the researchers genetically inhibited these neurons, the animals prolonged their feeding, drastically increasing the portion size. This suggests that inhibiting the circuit prevented the satiation response. They also found that enhancing the activity of the DRD1-LPBN neurons, which receive signals from the DA-VTA neurons, robustly generated the response of meal termination.
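The logic of that observation, averaging neural activity aligned to the moment each meal ends to ask whether activity ramps up just before feeding stops, can be sketched with simulated data (this is an illustration of peri-event averaging in general, not the study's analysis code):

```python
import numpy as np

# Simulate 20 "meals": activity traces for the 50 samples before each
# meal's end, with an injected ramp in the final 10 samples
rng = np.random.default_rng(2)
n_trials, window = 20, 50
trials = rng.normal(0, 0.2, (n_trials, window))
trials[:, -10:] += np.linspace(0, 1, 10)   # simulated pre-termination ramp

# Peri-event average: mean across meals at each time point before stopping
peri_event_avg = trials.mean(axis=0)
baseline = peri_event_avg[:20].mean()       # early in the window
pre_stop = peri_event_avg[-5:].mean()       # just before eating stops
print(f"baseline {baseline:.2f}, just before stopping {pre_stop:.2f}")
```

Averaging across many meals cancels trial-to-trial noise, so a consistent rise in activity immediately before meal termination, like the one the team saw in DA-VTA neurons, stands out against the baseline.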

The researchers also found that the novel circuit mediated the weight loss effect associated with taking the drug MPH, which is approved for the treatment of attention deficit hyperactivity disorder.

"Other brain circuits have been proposed to regulate feeding, but the one we discovered is the first to be fully described to regulate portion size via dopamine signaling," Han said. "Our new study shows that a circuit connecting neurons that produce dopamine, a chemical messenger previously known for the regulation of motivation and pleasure, has a new role in the control of feeding through dynamically regulating the satiety response."

"Our finding that MPH suppresses feeding and reduces body weight in laboratory mice by strengthening the dopamine-supported novel circuit we discovered, suggests a potential off-label application of a class of MPH and derivatives in tackling obesity," Wu said. "This also has implications for the future development of circuitry-based precision medicine that can deliver weight-reducing results with higher safety and effectiveness.".

Credit: 
Baylor College of Medicine

Few public-sector employees can contribute significantly to reaching sustainability goals

image: Alexander Yuriev: "Most sustainability officers had good intentions and knew what they were doing, but they found they could not push their ideas through."

Image: 
Photo courtesy Alexander Yuriev

The province of Quebec is one of only a few jurisdictions to enshrine sustainable development into law. In 2006 the then-Liberal government of Jean Charest adopted the Sustainable Development Act, creating a framework for Quebec's public bodies to follow to better integrate sustainable development into their operations. This involved the creation of sustainability plans with specific targets, submission of annual management reports and, among other things, the involvement of public employees in sustainability practices.

However, a new Concordia study shows that integration of sustainability-related practices has been uneven across the dozens of public bodies -- ranging from the biggest ministries to the smallest local tribunals -- subject to the law. While some bodies enthusiastically embrace sustainability innovations coming from their employees, many others practically ignore, discourage or pay mere lip service to them, explains Alexander Yuriev, assistant professor (LTA) of management at the John Molson School of Business and lead author on the paper.

The study, published in the journal Public Management Review, is based on anonymous interviews with 33 sustainability officers from 25 government bodies that together employ more than 60 per cent of Quebec's government workers. These include 11 ministries and 14 public companies.

The paper was co-written by Olivier Boiral, full professor at Université Laval and Canada Research Chair in Internalization of Sustainability Practices and Organizational Accountability, and David Talbot, associate professor at École nationale d'administration publique.

Bottom up, but no further

"Because of the law's requirements, we found that some of the larger organizations had dedicated sustainability officers, while the smaller ones did not because they lacked resources," Yuriev says.

"In some cases, they just had someone who was responsible for filling out forms and producing reports. They were meeting criteria on paper, but without substantially integrating anything new."

The success of the sustainability initiatives depends in large part on innovation stemming from individual employees. This allows each organization to develop its own approach to sustainable development, adapted to its mandate and resources. An aluminium-smelting plant would have a different approach to sustainable development than a ministry, for instance.

The researchers identified three types of factors that influenced proactive employee-driven innovation: individual, organizational and public-sector-specific. All three contained elements that determined whether an organization's implementation of sustainability innovations went beyond symbolic measures. According to Yuriev, only about six of the 25 bodies had implemented substantial employee-engaging measures in line with the Sustainable Development Act.

Often, it was not for lack of trying on employees' part. Yuriev says a combination of factors, from internal culture to hierarchies to political agendas to lack of resources, often stand in the way of meaningful staff-driven action.

"Most interviewed sustainability officers had good intentions and knew what they were doing," he adds. "But they found they could not push their ideas through. They would frequently be stopped at some level toward upper management."

He notes that substantial innovations were far more likely to be adopted in organizations where sustainability was the focus of their mandates.

Foster a growth culture

Yuriev says the law as it is written is partly to blame. It can be overly bureaucratic, requiring multiple time-consuming forms to be filled out, and its evaluation criteria are often vague. This allows organizations to claim compliance by adopting only superficial measures.

"Having an internal culture that looks favourably on sustainable innovation is crucial," he says. "However, it is difficult to install that kind of culture. A continuity of concrete objectives, in which one is met and then a new more challenging one is implemented, would help ensure a sense of authenticity and commitment to sustainability among employees."

Credit: 
Concordia University

Ultrasensitive blood test detects viral protein, confirms vaccine activates robust immune response

The carefully orchestrated dance between the immune system and the viral proteins that induce immunity against COVID-19 may be more complex than previously thought. A new study by investigators at Brigham and Women's Hospital used an ultrasensitive, single-molecule array (Simoa) assay to detect extremely low levels of molecules in the blood and measured how these levels change over the days and weeks following vaccination. The team found evidence of circulating protein subunits of SARS-CoV-2, followed by evidence of the body mounting its immune response and then clearing the viral protein to below the level of single-molecule detection. Results are published in Clinical Infectious Diseases.

"Because of our ultra-sensitive method, we're able to corroborate that the mRNA vaccine is operating as intended, stoking the body's immune response," said co-corresponding author David Walt, PhD, a member of the faculty in the Department of Pathology at the Brigham. Walt is also a member of the Wyss Institute and is a Howard Hughes Medical Institute Professor. "We were able to detect extremely low levels of viral protein and see that as soon as the body begins generating antibodies, those levels declined to undetectable." Walt has a financial interest in Quanterix Corporation, the company that developed the ultra-sensitive digital immunoassay platform used in this work.

To conduct their study, Walt and colleagues measured levels of SARS-CoV-2 protein subunits in plasma samples collected from 13 participants who received two doses of the Moderna (mRNA-1273) vaccine. Specifically, the team measured levels of SARS-CoV-2 antigens Spike, S1, and Nucleocapsid. The team examined plasma collected at 10-13 timepoints between 1 and 29 days after the first injection and 1-28 days after the second injection. The average age of participants was 24, and 46% of participants were female.

The team found that 11 of 13 participants had low levels of SARS-CoV-2 protein (S1 subunit) as early as one day post-vaccination. S1 subunit protein level peaked on average five days after the first injection. In all participants, the level of S1 protein declined and became undetectable by day 14. Spike protein was detected in 3 of 13 participants an average of 15 days after the first injection. After the second vaccine dose, no S1 or Spike was detectable.
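Reporting from an ultrasensitive assay like Simoa hinges on a limit of detection: any measurement below it is reported as undetectable rather than as a number. A small sketch of that bookkeeping (the limit and all values below are invented, with arbitrary units, not the study's measurements):

```python
LOD = 0.01  # assumed limit of detection, arbitrary units

def report(day_to_level: dict[int, float]) -> dict[int, str]:
    """Censor measurements below the detection limit."""
    return {day: (f"{level:.3f}" if level >= LOD else "undetectable")
            for day, level in day_to_level.items()}

# Invented S1 time course: rises, peaks near day 5, cleared by day 14
s1_levels = {1: 0.020, 5: 0.068, 9: 0.031, 14: 0.004}
print(report(s1_levels))
# {1: '0.020', 5: '0.068', 9: '0.031', 14: 'undetectable'}
```

Handling censored values explicitly matters when correlating antigen clearance with rising antibody levels, since "undetectable" is an upper bound, not a zero.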

The team collected corresponding antibody data and showed that the immune response began to mount after the viral proteins were produced. Increased antibody levels correlated with viral protein clearance from plasma.

The researchers note that the level of translated protein detected was extremely low and disappeared once antibodies were detected. All participants in the study were healthy volunteers who were vaccinated but not infected with SARS-CoV-2.

"The vaccine is designed to introduce mRNA into the body, which is then translated into the Spike protein. It is the Spike protein that can activate the immune system, which in turn creates antibodies to prevent future infections," said co-first author Alana Ogata, PhD, a postdoctoral fellow in the Walt lab. "We observed that antibodies that target Spike and S1 proteins are generated as early as 1-2 days after circulating S1 is detected, followed by the clearance of proteins. Additionally, we see that the second dose does not result in circulating protein but does provide an additional boost in antibody levels, as expected."

Researchers note that limitations of the current study include the small sample size and potential biases that result from enrolling healthy, young adults, which may not be representative of the general population. The research team plans to continue their plasma studies in other populations, including pregnant people and children, to further understand the dynamics between viral proteins and the immune response.

Credit: 
Brigham and Women's Hospital

People prefer 'natural' strategies to reduce atmospheric carbon

ITHACA, N.Y. - Soil carbon storage, carbon capture and storage, biochar - mention these terms to most people, and a blank stare might be the response.

But frame these climate change mitigation strategies as being clean and green approaches to reversing the dangerous warming of our planet, and people might be more inclined to at least listen - and even to back these efforts.

A cross-disciplinary collaboration led by Jonathon Schuldt, associate professor of communication at Cornell University, found that a majority of the U.S. public is supportive of soil carbon storage as a climate change mitigation strategy, particularly when that and similar approaches are seen as "natural" strategies.

"To me, that psychology part - that's really interesting," Schuldt said. "What would lead people, especially if they're unfamiliar with these different strategies, to support one more than the other? Our study and others suggest that a big part of it is whether people see it as natural."

The group's paper, "Perceptions of Naturalness Predict U.S. Public Support for Soil Carbon Storage as a Climate Solution," published May 26 in the journal Climatic Change. Co-authors include Johannes Lehmann, the Liberty Hyde Bailey Professor in the School of Integrative Plant Science (SIPS), Soil and Crop Sciences Section (CALS); Dominic Woolf, senior research associate in SIPS; Shannan Sweet, postdoctoral associate in the Lehmann Lab; and Deborah Bossio of the Nature Conservancy.

Schuldt's team analyzed results from a survey of 1,222 U.S. adults who reported believing in climate change at least "somewhat," to estimate public support for soil carbon storage and how it compares to other leading carbon dioxide removal strategies.

Mitigation strategies - solar and wind power, electric vehicles and sustainable land use and biodiversity, to name a few - are already capturing much attention as the world grapples with rising temperatures, melting ice caps and increasingly violent weather events.

Survey data came from an online poll conducted Sept. 19 to Oct. 4, 2019, by NORC at the University of Chicago, a leading survey research firm. The team solicited respondents' perceptions of naturalness and policy support for five CO2 removal strategies: afforestation and reforestation; bioenergy plus carbon capture and storage; direct air capture; soil carbon storage; and soil carbon storage with biochar. Each respondent viewed a randomized group of three options and was asked to estimate the likelihood that they'd support that strategy.
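The randomization described above, showing each respondent three of the five strategies, can be sketched in a few lines (a minimal illustration of the design, not the survey firm's actual instrument):

```python
import random

# The five CO2 removal strategies tested in the survey
STRATEGIES = [
    "afforestation and reforestation",
    "bioenergy plus carbon capture and storage",
    "direct air capture",
    "soil carbon storage",
    "soil carbon storage with biochar",
]

def assign(rng: random.Random) -> list[str]:
    """Draw a random subset of three strategies for one respondent."""
    return rng.sample(STRATEGIES, k=3)

rng = random.Random(0)
shown = assign(rng)
print(len(shown), len(set(shown)))  # 3 3
```

Randomizing which subset each respondent sees keeps the questionnaire short while ensuring every strategy is rated by a comparable, unbiased slice of the sample.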

They were also asked to rate their level of agreement with each of five statements related to humans' tampering with nature.

In the final analysis, perceived naturalness was a strong indicator of support for soil carbon storage as a climate change mitigation strategy. Of the five CO2 removal strategies, support was highest (73%) for afforestation and reforestation; soil carbon storage ranked second, supported by 62% of those polled.

And in this politically divided time, Schuldt said, support for soil carbon storage crossed the aisle. A total of 72% of respondents who identified as Democrats supported the strategy; among Republicans, 52% were in support.

"We expected, and found, that Democrats support all kinds of climate strategies more than Republicans do," Schuldt said. "But the error I think we sometimes make is that we categorize all Democrats as being for it, and all Republicans as being against it. That's not true."

Ultimately, Schuldt said, the goal is to allow policymakers to present the public with palatable options for addressing climate change.

"There is a whole range of solutions out there," he said. "Then the question politically becomes, where do you start? Which one has the most buy-in? I think our data help speak to that."

Credit: 
Cornell University

Keeping more ammonium in soil could decrease pollution, boost crops

image: Keeping more nitrogen in soil as ammonium may be one key way to address both challenges, according to a new paper co-authored by a researcher at Princeton SPIA.

Image: 
Egan Jimenez, Princeton University

Modern-day agriculture faces two major dilemmas: how to produce enough food to feed the growing human population and how to minimize environmental damage associated with intensive agriculture. Keeping more nitrogen in soil as ammonium may be one key way to address both challenges, according to a new paper in the Proceedings of the National Academy of Sciences (PNAS).

Today's use of nitrogen fertilizers contributes heavily to greenhouse gas emissions, air pollution, and water pollution, but they are also essential for growing crops. Reducing this pollution is critical, but nitrogen use is likely to grow with increased food production. At the same time, the world's human population is increasing, and agriculture needs to efficiently produce enough food to feed everyone without resorting to clearing more forests for agriculture.

In the past, farmers have managed to increase food production by adding more nitrogen fertilizers to their farm lands, but doing so is no longer a viable or acceptable solution. Instead, farmers should consider shifting to a blend of nitrate and ammonium, the researchers argue, which can decrease pollution and increase food production. Ammonium, a form of nitrogen, binds to soil and so is less likely to leach into waterways.

"Present fertilizer systems are polluting, inefficient, and damaging to ecosystem health," said paper co-author Guntur Subbarao, a senior researcher at the Japan International Research Center for Agricultural Sciences (JIRCAS). "If farming can shift from relying entirely on nitrate in soils to a system with a blend of nitrate and ammonium, it could have far-reaching consequences in limiting nitrogen pollution while boosting crop yields."

"New tools for maintaining more existing soil nitrogen in the form of ammonium could also enable selection of crop varieties that achieve higher yields through a blend of nitrogen forms," said co-author Tim Searchinger, a senior research scholar at the Center for Policy Research on Energy and the Environment, which is based at Princeton University's School of Public and International Affairs. "The prospect exists for a double-benefit that reduces nitrogen pollution, including greenhouse gas emissions, and helps the world to save forests by producing substantially more food on the same land."

Most strategies for mitigating nitrogen pollution rely on limiting pollution at the "front end" - by attempting to more carefully apply fertilizer. However, the authors explain that no matter how carefully fertilizer is applied, there is always a leakage of nitrogen at the "back end." This leakage occurs because soil nitrogen in croplands quickly turns into nitrate, a form of nitrogen that easily leaches into groundwater and waterways and whose breakdown releases nitrous oxide, a powerful greenhouse gas.

This is where ammonium comes in, the authors say. It does not degrade into nitrous oxide unless turned into nitrate first. The article shows that while high levels of ammonium are toxic to most plants, a little-appreciated line of research has shown that a mixture of nitrate and ammonium tends to increase crop yields substantially, even by 50% or more, compared to the common soil conditions today that are nearly all nitrate.

This academic finding was irrelevant until recently because microorganisms in crop fields rapidly turn nitrogen into nitrate in a process known as nitrification. However, the authors identify two emerging ways of keeping a greater balance of nitrogen forms in soils. One is to use synthetic nitrification inhibitors with coatings to limit nitrification for extended periods. The other is to take advantage of the natural ability of some plants to prevent nitrification. A plant trait that prevents microbes from converting ammonium to nitrate was first discovered in a commonly planted tropical grass, but researchers have recently begun breeding varieties of all the major grains, like wheat, to have this property.

By cultivating plants that benefit from ammonium and that aid ammonium retention in the soil by inhibiting nitrification, farmers, scientists, and policymakers could effectively increase food production while minimizing environmental degradation. The authors recommend additional research efforts, as there is currently no large-scale funding support for these efforts. They also recommend policies that shift fertilizer subsidies toward fertilizer forms or crop varieties that inhibit nitrification.

"One key benefit of this research is that once these varieties are created, all farmers around the world should be able to use them without additional cost and with the benefit of higher yields," Subbarao said.

Credit: 
Princeton School of Public and International Affairs

Electric fish -- and humans -- pause before communicating key points

image: African fish called mormyrids communicate using pulses of electricity.

Image: 
Tsunehiko Kohashi

American writer and humorist Mark Twain, a master of language and noted lecturer, once offered, "The right word may be effective, but no word was ever as effective as a rightly timed pause."

Electric fish and today's TED talk speakers take a page from Twain's playbook. They pause before sharing something particularly meaningful. Pauses also prime the sensory systems to receive new and important information, according to research from Washington University in St. Louis.

"There is an increased response in listeners to words -- or in this case, electrical pulses -- that happens right after a pause," said Bruce Carlson, professor of biology in Arts & Sciences and corresponding author of the study published May 26 in Current Biology. "Fish are basically doing the same thing we do to communicate effectively."

Beyond discovering interesting parallels between human language and electric communication in fish, the research reveals an underlying mechanism for how pauses allow neurons in the midbrain to recover from stimulation.

Carlson and collaborators, including first author Tsunehiko Kohashi, formerly a postdoctoral research associate at Washington University, conducted their study with electric fish called mormyrids. These fish use weak electric discharges, or pulses, to locate prey and to communicate with one another.

The scientists tracked the banter between fish housed under different conditions. They observed that electric fish that were alone in their tanks tend to hum along without stopping very much, producing fewer and shorter pauses in electric output than fish housed in pairs. What's more, fish tended to produce high frequency bursts of pulses right after they paused.

The scientists then tried an experiment where they inserted artificial pauses into ongoing communication between two fish. They found that the fish receiving a pause -- the listeners -- increased their own rates of electric signaling just after the artificially inserted pauses. This result indicates that pauses were meaningful to the listeners.

Other researchers have studied the behavioral significance of pauses in human speech. Human listeners tend to recognize words better after pauses, and effective speakers tend to insert pauses right before something that they want to have a significant impact.

"Human auditory systems respond more strongly to words that come right after a pause, and during normal, everyday conversations, we tend to pause just before speaking words with especially high-information content," Carlson said. "We see parallels in our fish where they respond more strongly to electrosensory stimuli that come after a pause. We also find that fish tend to pause right before they produce a high-frequency burst of electric pulses, which carries a large amount of information."

The scientists wanted to understand the underlying neural mechanism that causes these effects. They applied stimulation to electrosensory neurons in the midbrain of the electric fish and observed that continually stimulated neurons produced weaker and weaker responses. This progressive weakness is referred to as short-term synaptic depression.

Cue Mark Twain and his well-timed pauses.

The scientists inserted pauses into the continuous stimulation. They found that pauses as short as about one second allowed the synapses to recover from short-term depression and increased the response of the postsynaptic neurons to stimuli following the pause.
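The depression-and-recovery dynamic described above can be sketched with a toy short-term synaptic depression model in the Tsodyks-Markram style. All parameters below (recovery time constant, release fraction, pulse rates) are illustrative choices, not values fitted to mormyrid data.

```python
import math

# Illustrative parameters (not fitted to the study's recordings):
TAU_REC = 0.5   # seconds for depleted synaptic resources to recover
U = 0.4         # fraction of available resources released per pulse

def run(pulse_times):
    """Return the synaptic response amplitude evoked by each pulse."""
    x = 1.0            # available synaptic resource (1.0 = fully recovered)
    last_t = None
    responses = []
    for t in pulse_times:
        if last_t is not None:
            dt = t - last_t
            # Exponential recovery of resources during the inter-pulse interval
            x = 1.0 - (1.0 - x) * math.exp(-dt / TAU_REC)
        r = U * x          # response scales with the resources still available
        responses.append(r)
        x -= r             # resources consumed by this pulse
        last_t = t
    return responses

# A continuous 20 Hz train vs. the same train with a ~1 s pause inserted:
steady = run([i * 0.05 for i in range(40)])
paused = run([i * 0.05 for i in range(20)] + [1.95 + i * 0.05 for i in range(20)])

# In this toy model, responses weaken during continuous stimulation, and the
# first pulse after the pause evokes a much larger response than the
# depressed steady-state response.
print(steady[0], steady[-1], paused[20])
```

The comparison mirrors the experiment: continuous stimulation drives the response toward a depressed steady state, while a roughly one-second gap lets the resource variable recover, so the pulse following the pause lands on a more sensitive synapse.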

"Pauses inserted in electric speech reset the sensitivity of the listener's brain, which was depressed during the continuous part of the speech," Kohashi said. "Pauses seem to make the following message as clear as possible for the listener."

Similar to humans.

Synaptic depression and recovery are universal in the nervous system, the researchers noted.

"We expect the same mechanism, more or less, plays a role in pauses during communication in other animals, including humans," Carlson said.

Credit: 
Washington University in St. Louis

Reporting of race, sex, socioeconomic status in randomized clinical trials in medical journals

What The Study Did: Researchers compared reporting practices for race, sex and socioeconomic status in randomized clinical trials published in general medical journals in 2015 with those published in 2019.

Authors: Asad Siddiqui, M.D., of the Hospital for Sick Children in Toronto, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2021.11516)

Editor's Note: Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

UNH research: Journey of PFAS in wastewater facilities highlights regulation challenges

image: UNH researchers sampling emerging contaminants in the Great Bay Estuary.

Image: 
UNH

DURHAM, N.H.—Researchers at the University of New Hampshire have conducted two of the first studies in New England to collectively show that toxic man-made chemicals called PFAS (per-and polyfluoroalkyl substances), found in everything from rugs to product packaging, end up in the environment differently after being processed through wastewater treatment facilities—making it more challenging to set acceptable screening levels.

“PFAS are persistent substances that are not easily broken down and have been linked to adverse health effects,” said Paula Mouser, associate professor of civil and environmental engineering. “They are found in a wide variety of industrial, commercial and medicinal products and can end up in the body, human waste and the environment. If not managed correctly, they can be further distributed around the environment through landfills, waterways and even stabilized biosolids applied to agricultural fields as fertilizers.”

The researchers looked at the journey of 24 different PFAS through six New Hampshire wastewater treatment facilities, including those along the Great Bay Estuary near the N.H. Seacoast, to examine how they are distributed after being treated. PFAS come in two forms, long-chain and short-chain, which refers to the number of carbon atoms attached to fluorine in the compounds. In their first study, recently published in the journal Environmental Science: Processes and Impacts, the researchers found that short-chain PFAS ended up in the facility liquid, or effluent, while long-chain PFAS were more abundant in the sludge due to their higher affinity toward solids.

After tracking PFAS through a range of biological and disinfection processes in the municipal wastewater treatment facilities, the researchers found that roughly 10% of the PFAS present in Great Bay could be traced back to those facilities. This suggests that other sources dominate PFAS contributions to the waterways, such as septic systems, agricultural land and urban runoff (which can contain biosolids), groundwater discharge from contaminated sites and surface water runoff.

Currently, the United States Environmental Protection Agency (EPA) has only issued a drinking water health advisory for two of the 4,700 known PFAS, so individual states are working to set their own standards for PFAS in drinking water, surface water and biosolids. In 2020, the New Hampshire Department of Environmental Services established maximum contaminant levels (MCLs) for four PFAS in drinking water, while in 2019, the Maine Department of Environmental Protection (DEP) established screening levels for three PFAS in biosolids.

In the UNH researchers’ second study, featured in the New England Water Environment Association Journal, the team used Maine’s screening levels to look at both PFAS and PPCPs (pharmaceutical and personal care products such as antibiotics and flame retardants) in biosolids from wastewater treatment facilities in New Hampshire and Vermont. Of the 39 biosolid samples reviewed, 29 had PFAS levels exceeding the screening levels set by the Maine DEP.

“State agencies across New England are all considering regulating PFAS in wastewater biosolids, but there is still more we need to know about how the treatment of wastewater sludge influences these forever chemicals,” said Mouser.

The researchers say the challenge is finding a safe and acceptable level for waste residue that doesn’t force facilities to deposit these solids in landfills which would be enormously costly, fill up landfills faster than anticipated and possibly lead to the leaching of PFAS into landfill wastewater that may continue the cycle by returning the not easily broken-down chemicals right back to treatment facilities.

The researchers say the studies highlight the knowledge gaps around contaminants of emerging concern, like PFAS, in wastewater residuals and stress that more research is needed to look at the influence of the facility design and operation on their treatment before costly upgrades are implemented in wastewater treatment facilities.

This research was funded by New Hampshire Sea Grant and the UNH Collaborative Research Excellence (CoRE) Initiative.

The University of New Hampshire inspires innovation and transforms lives in our state, nation, and world. More than 16,000 students from all 50 states and 71 countries engage with an award-winning faculty in top-ranked programs in business, engineering, law, health and human services, liberal arts and the sciences across more than 200 programs of study. As one of the nation’s highest-performing research universities, UNH partners with NASA, NOAA, NSF and NIH, and receives more than $110 million in competitive external funding every year to further explore and define the frontiers of land, sea and space.

PHOTOS FOR DOWNLOAD

Image: https://www.unh.edu/unhtoday/sites/default/files/media/pfas_sampling_at_great_bay.jpg
Credit: UNH
Caption: UNH researchers sampling emerging contaminants in the Great Bay Estuary.

Image: https://www.unh.edu/unhtoday/sites/default/files/media/pfas_sampling_at_great_bay_from_boat.jpg
Credit: UNH
Caption: UNH researcher collecting samples of emerging contaminants in the Great Bay Estuary.

Image: https://www.unh.edu/unhtoday/sites/default/files/media/pfas_samples_3.jpg
Credit: UNH
Caption: Samples to be tested for traces of PFAS from one of six New Hampshire wastewater treatment facilities.

Image: https://www.unh.edu/unhtoday/sites/default/files/media/picture1_0.jpg
Credit: UNH
Caption: UNH master’s students Elham Tavasoli, Alexandria Hidrovo and Chris Rodriquez sampling emerging contaminants at a New Hampshire wastewater treatment facility.

Credit: 
University of New Hampshire

Research identifies climate-change refugia in dry-forest region

image: Dolina dos Macacos, a sinkhole in Parque Nacional Cavernas do Peruaçu. The study was based on analysis of tree rings in the species Amburana cearensis, as well as satellite images

Image: 
Luciano Fioroto

Several indicators point to the adverse impacts of climate change on the planet’s vegetation, but a little-known positive fact is the existence of climate-change refugia, in which trees are far less affected by the gradual rise in temperatures and changing rainfall regimes. Climate-change refugia are areas relatively buffered from climate change, such as wetlands, land bordering watercourses, rocky outcrops, and valleys with cold-air pools or inversions.

A study conducted in Peruaçu Caves National Park in the state of Minas Gerais, Brazil, with FAPESP’s support, confirmed and quantified this type of occurrence. “These refugia are excellent candidates for land management initiatives, offering a high probability of success and lower expenditure in conservation areas,” said Milena Godoy-Veiga, a PhD candidate at the University of São Paulo’s Institute of Biosciences (IB-USP) and lead author of the article on the study published in Forest Ecology and Management.

The other authors include Godoy-Veiga’s thesis advisers, Gregório Ceccantini and Giuliano Locosselli.

According to Godoy-Veiga, climate-change refugia are frequently located in karstic regions. Karst is a topography formed over time from the chemical dissolution of soluble rocks such as limestone and dolomite, and characterized by underground drainage systems with subterranean rivers, sinkholes and caves, as well as dramatic above-ground features such as steep cliffs and dry gullies. “This is the landscape in Peruaçu Caves National Park, where there are ground height differences of as much as 200 meters, with the high parts casting shadows over the low parts, and the environment comprising all the other features mentioned,” she said.

The researchers reached the conclusion that climate-change refugia are to be found in a large proportion of the park by analyzing growth rings in samples of the tree species Amburana cearensis (vernacular names amburana-de-cheiro and cerejeira). “We counted over 4,500 growth rings in samples from 39 trees,” Godoy-Veiga said. “Chronological analysis is usually done with a mean value for all trees, but we were able to analyze each tree individually thanks to a partnership with two researchers at Israel’s Weizmann Institute of Science, who are also co-authors of the article: Elisabetta Boaretto, who heads a laboratory, and Lior Regev, the scientist responsible for the particle accelerator in which radiocarbon dating is done.”

They were able to date the tree rings precisely using the “bomb peak” curve, which is applicable to modern samples owing to the sharp rise in carbon-14 levels in the atmosphere and all living beings following the nuclear tests conducted during the Cold War. The levels peaked in the mid-1960s and then fell again with the signing of various international treaties banning nuclear weapons tests.

“Our analysis shows that 22 out of 39 trees were sensitive to temperature and the amount of summer rain. Six were sensitive only to rainfall, and 11 were apparently not affected by the region’s weather. Based on these results, we defined areas of the park that can be considered climate-change refugia, and confirmed this using satellite images taken during the dry and rainy seasons,” Godoy-Veiga said.

“We compared the images to construct a vegetation index, which clearly showed that the presumed climate-change refugia were the least seasonal areas of the park, where most of the trees don’t lose their leaves. These areas are associated with lower terrain and deeper soil, or are near rocky outcrops and the Peruaçu River.”
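The seasonality comparison described above can be illustrated with a toy calculation. The article does not name the specific vegetation index the team constructed; the sketch below assumes the widely used NDVI, computed from near-infrared (NIR) and red reflectance, and all reflectance values are hypothetical.

```python
# A minimal sketch of flagging low-seasonality pixels, assuming the common
# NDVI = (NIR - Red) / (NIR + Red); index choice and values are illustrative.

def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel."""
    return (nir - red) / (nir + red)

def seasonality(nir_wet, red_wet, nir_dry, red_dry):
    """Rainy-season NDVI minus dry-season NDVI; values near zero suggest
    an evergreen canopy, a signature of a potential refugium."""
    return ndvi(nir_wet, red_wet) - ndvi(nir_dry, red_dry)

# Hypothetical reflectances for two pixels:
refugium = seasonality(0.50, 0.05, 0.48, 0.06)   # stays green when dry
deciduous = seasonality(0.52, 0.05, 0.25, 0.15)  # loses leaves when dry

# The refugium pixel shows a much smaller wet-to-dry NDVI change.
print(refugium, deciduous)
```

Comparing a wet-season and a dry-season image pixel by pixel in this way separates areas that stay green year-round from strongly deciduous dry forest, which is the pattern the researchers used to confirm the presumed refugia.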

Located in Brazil’s central region in a transition zone between two important biomes, Cerrado (savanna) and Caatinga (semi-arid shrubland and thorn forest), Peruaçu Caves National Park is a monumental karst landscape with huge caves and speleothems (stalactites, stalagmites and other mineral formations) created over thousands of years by rainwater and the Peruaçu, a tributary of the São Francisco.

Besides caves, the park has almost 600 square kilometers of dry forest, where the study was conducted. “Analyzing only the park’s non-degraded portions, which correspond to about 80% of the total area, we concluded that almost a quarter, or more than 100 square kilometers, could be held to contain climate-change refugia,” Godoy-Veiga said.

The various factors mentioned have created a microenvironment that is sheltered from the region’s prevailing climate, providing more favorable conditions for land management and increasing the likelihood of its success.

However, this prospect should be viewed soberly, without exaggerated expectations: it is already clear that extreme weather events, such as those caused by El Niño in 1997, have adverse effects on tree growth even in refugia. “The study is a major advance in the identification of climate-change refugia even in dry forest areas such as those in northern Minas Gerais. But despite the protection these refugia offer from rising temperatures and changing rainfall patterns, the trees there remain vulnerable to extreme weather,” Locosselli said.

Ceccantini agreed. “Large numbers of trees have died in recent years and are still standing in the park. The study helps us understand why and how we need to react in order to conserve this natural heritage,” he said.

“Understanding how climate affects trees on a microscale helps design strategies to take better care of trees, not just in conservation units such as national and state parks, but also in urban areas, where trees play a very important role in enhancing the quality of life for the inhabitants.”

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Study affirms that vaccines are safe for children and adults

A new study looking across a large body of research finds further evidence for the safety of vaccines that are Food and Drug Administration-approved and routinely recommended for children, adults and pregnant women. The study updates a vaccine safety review that was released by the federal Agency for Healthcare Research and Quality in 2014.

"This in-depth analysis found no evidence of increased risk of serious adverse events following vaccines, apart from a few - previously known - associations," said Susanne Hempel, director of the Southern California Evidence Review Center.

The meta-analysis, published in the journal Vaccine, does not address the safety of COVID-19 vaccines, but summarizes the results of 338 studies of other vaccines commonly given across the lifespan.

"These findings support decisions to vaccinate to protect ourselves and our communities from a variety of diseases," said Dr. Courtney Gidengil, the study's lead author and a senior physician policy researcher at the RAND Corporation, a nonprofit research organization. "This research is an important reminder that vaccines are safe and any risk they may pose is far outweighed by their ability to protect against diseases."

The study included reviews of vaccines for diseases such as influenza, measles, mumps, shingles, whooping cough, tetanus and human papillomavirus (HPV)-associated cancers. While vaccination rates for children remain high, rates for adults and pregnant women consistently lag.

With funding from AHRQ, researchers conducted a systematic review of relevant vaccine safety research for key adverse events, which are events of special interest that were selected with the help of vaccine experts.

Among the findings about individual vaccines, researchers found that the strength of evidence was high for no increased risk of autism among children following the measles, mumps and rubella vaccine (MMR). The strength of evidence was high that MMR is associated with an increased but still low risk of febrile seizures, an adverse event that seldom has long-term consequences.

For older children and adolescents, there was no evidence of increased risk for key adverse events for newer vaccines such as 9-valent HPV vaccine, which prevents infections that lead to cervical and other types of cancers, and serogroup B meningococcal vaccines, which prevent a type of bacterial meningitis, though there was insufficient evidence to draw firm conclusions for some key adverse events that are rare.

For adults, there was no evidence of increased risk for key adverse events for the new recombinant adjuvanted zoster vaccine which prevents shingles, the hepatitis B vaccine with novel immunostimulatory adjuvant, and newer influenza vaccines such as the adjuvanted influenza vaccine recommended for older adults.

The study found no evidence of increased risk for key adverse events among pregnant women following tetanus, diphtheria and acellular pertussis vaccine (Tdap), including stillbirth.

Researchers say that studying adverse events associated with vaccines can be difficult because the events are rare and can be caused by other factors. Therefore, it is important to continue to conduct ongoing population-based vaccine safety studies and post-marketing surveillance of vaccine safety after vaccines are licensed by the FDA to identify rare and serious adverse events.

Future vaccine safety research needs to take into account the expanding landscape of new vaccines and vaccine technologies, in particular the new COVID-19 vaccines, according to the researchers.

Credit: 
RAND Corporation

Stormwater could be a large source of microplastics and rubber fragments to waterways

In cities, heavy rains wash away the gunk collecting on sidewalks and roads, picking up all kinds of debris. However, the amount of microplastic pollution swept away by this runoff is currently unknown. Now, researchers in ACS ES&T Water report that stormwater can be a large source of microplastics and rubber fragments to water bodies and, with a proof-of-concept experiment, show that a rain garden could keep these microscopic pieces out of a storm drain.

Most cities' storm drains end up discharging directly into wetlands, creeks or rivers. Rainwater running into these drains becomes a concoction of whatever is on the ground, including dirt and grass clippings, leaked car fluids, fertilizer and garbage. Recently, researchers also found that strong rains can displace microplastics, sweeping them into stormwater, but the importance of this runoff as a source of contamination is not well understood. So, Chelsea Rochman and colleagues wanted to see whether microplastics and other tiny particles are carried into waterways by storms in urban areas, and whether a rain garden could prevent that from happening.

The researchers collected water during heavy rainstorms from 12 streams flowing into the San Francisco Bay. First, they separated floating microparticles -- which they define as less than 5 mm in size -- by color and shape and tallied them, finding higher concentrations in the streams than previous researchers had found in treated wastewater discharged into the bay. Microscopic fibers and black rubbery fragments were the most common microparticles, while natural debris, glass, paint and wool were only minor components.

Then, the team confirmed that a subset of plastic- or rubbery-looking fragments were made mostly of plastic polymers or other synthetic materials, and that many of the black rubbery particles originated from tires.

Finally, the researchers compared the microparticles entering a rain garden to those at the garden's outflow into a storm drain. The rain garden captured 91 to 98% of the microparticles and 100% of the black rubbery fragments during three rain events. The researchers say that while rain gardens are known to reduce the amount of metals, nutrients and other pollutants in stormwater runoff, this study shows they could also be effective at reducing microplastic pollution.
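As a back-of-the-envelope illustration of how a capture rate such as "91 to 98%" is derived from paired inflow and outflow measurements (the particle counts below are hypothetical, not the study's data):

```python
# Sketch of a capture-rate calculation from paired inflow/outflow counts;
# all counts are hypothetical examples.

def capture_rate(inflow_count, outflow_count):
    """Percent of microparticles retained between inflow and outflow."""
    return 100.0 * (inflow_count - outflow_count) / inflow_count

# Illustrative (inflow, outflow) particle counts for three rain events:
events = [(120, 10), (250, 5), (80, 7)]
rates = [capture_rate(i, o) for i, o in events]
print(rates)
```

Each event yields one retention percentage, and reporting the minimum and maximum across events gives a range of the "91 to 98%" form quoted in the study.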

Credit: 
American Chemical Society

Scientists find solution to measure harmful plastic particles in human sewage

image: This image was generated by siMPle, FTIR particle-analysis software, after comparison of the acquired particle spectra with a reference database.

Image: 
University of Portsmouth

Scientists have got up close and personal with human sewage to determine how best to measure hidden and potentially dangerous plastics.

Because the way microplastics are measured and counted varies from place to place, there is no agreed understanding of the scale of the problem. Until scientists can agree on one way of measuring them, life on land and sea will continue to ingest who knows how much plastic, affecting health for generations.

A new study, published today in Analytical and Bioanalytical Chemistry, by the University of Portsmouth has examined one method, using a chemical solution called 'Fenton reagent' to remove organic matter from sewage. It found it has significant advantages in processing times and costs over other currently available methods of testing.

Project Lead Dr Fay Couceiro, Senior Research Fellow in Biogeochemistry at the University of Portsmouth, said: "Multiple digestion with Fenton reagent involves mixing the sewage with hydrogen peroxide and iron sulphate multiple times to break down the organic matter. When followed by density separation, where you float off the plastics from everything else, it provides a cleaner sample so the size and type of microplastic can be determined with much less interference."

Professor Steve Fletcher, Director of the University's Revolution Plastics initiative, said: "Having some idea of the amount of microplastics in the environment is key to understanding and stopping the potential harmful impacts that this new category of emerging pollutants could have on life on earth. The need for protocols that are robust, simple and reliable together with their standardisation are of crucial importance in the fight against plastic pollution."

The study targeted the detection of microplastics in the sub-hundred-micron size range, which often get missed because of their tiny size, yet they have potentially higher health risks associated with them. This size of particle also has limited data available from previous wastewater research.

To show the value of this method, samples of raw sewage, final effluent and sludge were mixed with two different sizes and types of microplastics. The multiple digestion with Fenton reagent method showed good recovery of the added microplastics. Considering the various stages required for the separation of microplastics, time is a limiting factor in sample processing. The multiple digestion using Fenton reagent is an inexpensive and time-efficient procedure compared to other currently available methods when analysing large numbers of samples.

Dr Couceiro says: "The Fenton reagent method used in this study has huge potential for bringing about a much-needed standardisation of the measurement of microplastics. Without being able to compare and contrast concentrations of microplastics, our ability to make significant strides forward in limiting pollution will be restricted. We would welcome further research investigating other types of plastics and the recovery of even smaller plastic particles."

Credit: 
University of Portsmouth

No good decisions without good data: Climate, policymaking, the critical role of science

"If you can't measure it, you can't improve it." This holds true for climate policy, where achieving the objectives of the United Nations Framework Convention on Climate Change (UNFCCC) depends on the international community's ability to accurately measure greenhouse gas (GHG) emission trends and, consequently, to alter them.

Greenhouse gas (GHG) emission inventories represent the link between national and international political action on climate change and the climate and environmental sciences. Research communities and inventory agencies have approached the problem of climate change from different angles, using terminologies, metrics, rules and approaches that do not always match. This is particularly true when dealing with "Land Use, Land-Use Change and Forestry" (LULUCF), which represents about 25% of the emissions reductions pledged by countries in their Nationally Determined Contributions (NDCs) to the Paris Agreement. This sector is one of the most challenging inventory sectors, mainly because of the high complexity of its carbon dynamics and the difficulty of separating fluxes caused by natural processes from those caused by anthropogenic ones.

The study, led by the CMCC Foundation Euro-Mediterranean Center on Climate Change (CMCC) and recently published in Environmental Science and Policy, facilitates the research communities' understanding of the current (UNFCCC) and future (under the Paris Agreement) reporting requirements, while identifying the main issues and topics to consider when targeting improvements to GHG inventories.

"Our research", explains Lucia Perugini, CMCC scientist and first author of the study, "aims to build bridges between the research community and inventory agencies. Specifically, it provides an overview of the current and future GHG reporting and verification requirements under the Paris Agreement, identifying how and where the research community can provide an effective contribution (providing inputs, data, solutions, methodologies) to support GHG inventory agencies and, therefore, the implementation of the Paris Agreement."

At present, there is an estimated discrepancy of about 5 gigatonnes of CO2 per year (GtCO2/y) in global anthropogenic net land-use emissions between global models (assessed in the IPCC's Fifth Assessment Report, AR5) and national GHG inventories (reported to the UNFCCC), largely attributable to differences in defining the anthropogenic land flux. The global modelling community and national governments in fact apply different methods to estimate and report land-based GHG emissions. How can these conceptual differences in estimating the anthropogenic forest sink be reconciled between models and GHG inventories?

Each approach has its own advantages and limitations - the real problem is that they are not fully comparable. Reconciling these differences does not require that the research community abandon its own approach, but rather that solutions are found to ensure comparability.

The results of this study highlight that the research community needs to better understand the terms, rules, procedures and guidelines that countries follow to estimate and report their GHG emissions under the Paris Agreement. Too often, scientific papers speak a language different from that used by the GHG inventory community. Moreover, to be relevant to the improvement of countries' GHG inventories, research should provide methodological guidance and research results (e.g. more innovative methodologies and tools, databases, research infrastructures and shared protocols for data gathering).

The policy process would greatly benefit from science that considers specific inventory needs; conversely, GHG inventories can represent a valid source of data that is constantly reviewed and updated and that can be particularly useful for research studies.

Promoting national and regional networking initiatives on specific topics can help both communities exchange data and methods, solve interpretative problems, and understand each other's data and needs.

Credit: 
CMCC Foundation - Euro-Mediterranean Center on Climate Change