
Drone improves odor management in water treatment plants

image: Olfactory drone with remotely controlled sampler for system calibration and validation.

Image: 
SNIFFDRONE, ATTRACT initiative (GA 777222, Horizon 2020).

The foul odors produced by wastewater treatment plants (WWTPs) have become a growing concern in the cities and towns that host these facilities; citizens consider them, along with dust and noise, the main source of perceived pollution.

Now, thanks to a collaboration between the Institute for Bioengineering of Catalonia (IBEC) and the company DAM, a new way of detecting and treating these odors is opening up.

According to the researchers, "the results obtained in the SNIFFDRONE project (Odor monitoring by drones for environmental purposes) are very positive and represent a significant advance in the field of odor management in WWTPs. The new system will help to take appropriate control actions and therefore improve the management of the plant compared to current practices."

This is the assessment of the technicians from DAM's R&D&i department of the work carried out in this European research project, which ended on October 31st. Specifically, a drone has been developed that is capable of predicting "the odor concentration from the readings of chemical sensors and providing measurements that help to locate the sources of origin," DAM indicates.

The study, carried out together with IBEC and part of the ATTRACT initiative (GA 777222, Horizon 2020 program of the European Union), starts from the fact that current odor evaluation methodologies rely on infrequent olfactometric measurements that do not allow precise characterization, so effective monitoring of the plant cannot be carried out.

"The system provides odor concentration maps that help to take appropriate control actions and therefore improve plant management compared to current practices," explain the researchers from DAM's R + D + i department.

Research details

The drone is equipped with an electronic nose made up of 21 chemical sensors, plus temperature, humidity and pressure sensors, housed in a miniature sensor chamber. It also carries a sampling system and GPS positioning, and connects to a base station for real-time signal processing and data analysis.
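
The release does not describe the prediction pipeline in detail, but calibrating an electronic nose usually means regressing laboratory odor-concentration measurements onto the multichannel sensor readings. Below is a minimal, hypothetical sketch of that supervised-calibration step; the data, feature layout, and model choice are all illustrative assumptions, not SNIFFDRONE's actual pipeline.

```python
# Hypothetical calibration sketch for an electronic nose: regress lab
# olfactometry labels onto multichannel sensor readings. All names, data,
# and the model choice are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in: 200 air samples x (21 chemical + 3 environmental) channels.
X = rng.normal(size=(200, 24))
# Synthetic "odor concentration" labels, built so the example runs end to end.
y = np.abs(X[:, :21].sum(axis=1)) * 10 + rng.normal(scale=5.0, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out R^2:", model.score(X_te, y_te))
```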

"The system has been calibrated and validated in real operating conditions through several measurement campaigns at the Molina de Segura WWTP (Murcia). The results obtained allow us to obtain a prototype capable of predicting the intensity of the odor of ambient air samples in real time and simultaneously collecting samples for analysis in the laboratory after the flight, which allows us to calibrate and validate the operation of the system " , stand out from the Valencian water purification company.

"The developed drone allows, thanks to a combination of chemical sensors and artificial intelligence, to quickly assess the intensity of the odor emitted by a waste management plant in large areas that are sometimes difficult to access. This information is relevant for the plant operators with the ultimate goal of minimizing the impact on neighboring communities ", declares Santi Marco, Head of the Group of Signal and Information Processing for Sensing Systems at IBEC and Professor at the University of Barcelona .

For all these reasons, the technicians from DAM's innovation department consider the results of SNIFFDRONE a significant advance in the field of odor management in WWTPs, since until now "odor detection robots had been tested with single-odor chemical sources in relatively simple controlled scenarios and in most cases using ground-based robots," they conclude.

Credit: 
Institute for Bioengineering of Catalonia (IBEC)

Study finds novel evidence that dreams reflect multiple memories, anticipate future events

DARIEN, IL - Dreams result from a process that often combines fragments of multiple life experiences and anticipates future events, according to novel evidence from a new study.

Results show that 53.5% of dreams were traced to a memory, and nearly 50% of reports with a memory source were connected to multiple past experiences. The study also found that 25.7% of dreams were related to specific impending events, and 37.4% of dreams with a future event source were additionally related to one or more specific memories of past experiences. Future-oriented dreams became proportionally more common later in the night.

"Humans have struggled to understand the meaning of dreams for millennia," said principal investigator Erin Wamsley, who has a doctorate in cognitive neuroscience and is an associate professor in the department of psychology and program in neuroscience at Furman University in Greenville, South Carolina. "We present new evidence that dreams reflect a memory-processing function. Although it has long been known that dreams incorporate fragments of past experience, our data suggest that dreams also anticipate probable future events."

The study involved 48 students who spent the night in the laboratory for overnight sleep evaluation using polysomnography. During the night, participants were awakened up to 13 times to report on their experiences during sleep onset, REM sleep, and non-REM sleep. The following morning, participants identified and described waking life sources for each dream reported the previous night. A total of 481 reports were analyzed.

"This is a new description of how dreams draw simultaneously from multiple waking-life sources, utilizing fragments of past experience to construct novel scenarios anticipating future events," said Wamsley.

According to Wamsley, the proportional increase of future-oriented dreams later in the night may be driven by temporal proximity to the upcoming events. While these dreams rarely depict future events realistically, the activation and recombination of future-relevant memory fragments may nonetheless serve an adaptive function.

The research abstract was published recently in an online supplement of the journal Sleep and will be presented as a poster beginning June 9 during Virtual SLEEP 2021. SLEEP is the annual meeting of the Associated Professional Sleep Societies, a joint venture of the American Academy of Sleep Medicine and the Sleep Research Society.

Credit: 
American Academy of Sleep Medicine

Senolytics reduce COVID-19 symptoms in preclinical studies

ROCHESTER, Minn. -- Mayo Clinic researchers and colleagues at the University of Minnesota showed that COVID-19 exacerbates the damaging impact of senescent cells in the body. In preclinical studies, the senolytic drugs discovered at Mayo significantly reduced inflammation, illness, and mortality from COVID infection in older mice. The findings appear in the journal Science.

Senescent cells (damaged or non-functioning cells that persist in the body) contribute to many aspects of aging and illness, including inflammation and multiple chronic diseases. Based on the "Amplifier/Rheostat Hypothesis" of senescent cells developed at Mayo, the researchers sought to discover how COVID-19 causes much higher mortality in the elderly and chronically ill. They showed that human senescent cells have an amplified response to the SARS spike protein, provoking increased production of factors causing inflammation and tissue damage by senescent cells.

The researchers also found that older mice infected with viruses, including a coronavirus related to SARS-CoV-2, using a model developed at the University of Minnesota, showed an amplified reaction, with increased senescent cells, inflammation, and nearly 100% mortality. When the researchers treated similar mice - before or after the infection - with senolytics, drugs that selectively remove senescent cells from the body, the result was the opposite: anti-viral antibodies increased, while signs of inflammation and senescent cells decreased significantly along with mortality, so the survival of the old, infected mice became more like that of younger mice.

The researchers suggest that reducing the existing burden of senescent cells in older or chronically-diseased patients may increase their resilience and decrease their risk of dying from viral infections, including SARS-CoV-2. Three such clinical trials are now underway.

"Even though vaccine use is growing, senolytics could still be helpful to those who cannot receive the vaccine, and especially to older individuals in nursing homes with comorbidities or immunity issues," says James Kirkland, M.D., Ph.D., director of the Kogod Center on Aging and, together with Tamar Tchkonia, Ph.D., senior author for Mayo Clinic on the study. The study suggests senolytics could also improve response of the elderly to vaccines and help them fight bacterial and other viral infections.

The research was supported by the National Institutes of Health; the Connor Fund; Robert J. and Theresa W. Ryan; Robert P. and Arlene R. Kogod; the Noaber Foundation; the University of Minnesota Clinical and Translational Science Institute; the Medical Discovery Team on the Biology of Aging; and The Irene Diamond Fund/American Federation for Aging Research Postdoctoral Transition Award. Additional support came from the Fesler-Lampert Chair in Aging Studies and the AFAR Junior Faculty Award.

Credit: 
Mayo Clinic

How COVID-19 wreaks havoc on human lungs

image: New structure shows how the COVID-19 virus envelope protein (E, magenta sticks) interacts with a human cell-junction protein (PALS1, surfaces colored in blue, green, and orange). Understanding this complex structure, which was solved using a cryo-electron microscope at Brookhaven National Laboratory, could lead to the discovery of drugs that block the interaction and, potentially, the most severe effects of COVID-19.

Image: 
Brookhaven National Laboratory

UPTON, NY--Scientists at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory have published the first detailed atomic-level model of the SARS-CoV-2 "envelope" protein bound to a human protein essential for maintaining the lining of the lungs. The model showing how the two proteins interact, just published in the journal Nature Communications, helps explain how the virus could cause extensive lung damage and escape the lungs to infect other organs in especially vulnerable COVID-19 patients. The findings may speed the search for drugs to block the most severe effects of the disease.

"By obtaining atomic-level details of the protein interactions we can explain why the damage occurs, and search for inhibitors that can specifically block these interactions," said study lead author Qun Liu, a structural biologist at Brookhaven Lab. "If we can find inhibitors, then the virus won't cause nearly as much damage. That may give people with compromised health a much better chance for their immune systems to fight the virus successfully."

Scientists discovered the details and developed the molecular model using one of the new cryo-electron microscopes at Brookhaven Lab's Laboratory for BioMolecular Structure (LBMS), a new research facility adjacent to Brookhaven's National Synchrotron Light Source II (NSLS-II) and built with funding from New York State.

"LBMS opened last summer ahead of schedule because of its importance in the battle against COVID-19," said Sean McSweeney, director of LBMS and a coauthor on the paper. "LBMS and NSLS-II offer complementary protein-imaging techniques and both are playing important roles in deciphering the details of proteins involved in COVID-19. This is the first paper published based on results from the new facility."

Liguo Wang, scientific operations director of LBMS and another coauthor on the paper, explained that "cryo-electron microscopy (cryo-EM) is particularly useful for studying membrane proteins and dynamic protein complexes, which can be difficult to crystallize for protein crystallography, another common technique for studying protein structures. With this technique we created a 3-D map from which we could see how the individual protein components fit together."

"Without cryo-EM, we couldn't have gotten a structure to capture the dynamic interactions between these proteins," Liu said.

Triggering lung disruption

The SARS-CoV-2 envelope protein (E), which is found on the virus's outer membrane alongside the now-infamous coronavirus spike protein, helps to assemble new virus particles inside infected cells. Studies published early in the COVID-19 pandemic showed that it also plays a crucial role in hijacking human proteins to facilitate virus release and transmission. Scientists hypothesize that it does this by binding to human cell-junction proteins, pulling them away from their usual job of keeping the junctions between lung cells tightly sealed.

"That interaction can be good for the virus, and very bad for humans--especially elderly COVID-19 patients and those with pre-existing medical conditions," Liu said.

When lung cell junctions are disrupted, immune cells come in to try to fix the damage, releasing small proteins called cytokines. This immune response can make matters worse by triggering massive inflammation, causing a so-called "cytokine storm" and subsequent acute respiratory distress syndrome.

Also, because the damage weakens the cell-cell connections, it might make it easier for the viruses to escape from the lungs and travel through the bloodstream to infect other organs, including the liver, kidneys, and blood vessels.

"In this scenario, most damage would occur in patients with more viruses and more E proteins being produced," Liu said. And this could become a vicious cycle: More viruses making more E proteins and more cell-junction proteins being pulled out, causing more damage, more transmission, and more viruses again. Plus, any existing damage, such as lung-cell scarring, would likely make it harder for COVID patients to recover from the damage.

"That's why we wanted to study this interaction--to understand the atomic-level details of how E interacts with one of these human proteins to learn how to interrupt the interactions and reduce or block these severe effects," Liu said.

From specks to blobs to map to model

The scientists obtained atomic-level details of the interaction between E and a human lung-cell-junction protein called PALS1 by mixing the two proteins together, freezing the sample rapidly, and then studying the frozen sample with the cryo-EM. The electron microscopes use high-energy electrons to interact with the sample in much the same way that regular light microscopes use beams of light. But electrons allow scientists to see things at a much smaller scale due to their extremely short wavelength (100,000 times shorter than that of visible light).
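
The wavelength claim can be checked with the relativistic de Broglie formula. The sketch below assumes a 300 keV beam energy, a typical cryo-EM operating value that is not stated in this article, and arrives at a wavelength of about 2 picometers, on the order of 10^5 times shorter than visible light.

```python
# Relativistic de Broglie wavelength of the imaging electrons. The 300 keV
# beam energy is an assumed, typical cryo-EM value (not given in the article).
import math

h = 6.62607015e-34       # Planck constant, J*s
m0 = 9.1093837015e-31    # electron rest mass, kg
c = 2.99792458e8         # speed of light, m/s
eV = 1.602176634e-19     # joules per electron volt

E = 300e3 * eV           # kinetic energy of a 300 keV electron
lam = h / math.sqrt(2 * m0 * E * (1 + E / (2 * m0 * c**2)))

print(f"electron wavelength: {lam * 1e12:.2f} pm")          # ~1.97 pm
print(f"500 nm light is {500e-9 / lam:,.0f} times longer")  # ~2.5e5
```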

The first images didn't look like much more than specks. But image-processing techniques allowed the team to select specks that were actual complexes of the two proteins.

"We used two-dimensional averaging and started to see some structural features that are shared among these particles. Our images showed the complex from different orientations but at fairly low resolution," Liu said. "Then we use computational tools and computation infrastructure at Brookhaven's Computational Science Initiative to perform three-dimensional reconstructions. These give us a 3-D model--an experimental map of the structure."

With an overall resolution of 3.65 Angstroms (the size of just a few atoms), the map had enough information about the unique characteristics of the individual amino acids that make up the two proteins for the scientists to fit the known structures of those amino acids into the map.

"We can see how the chain of amino acids that makes up the PALS1 protein folds to form three structural components, or domains, and how the much smaller chain of amino acids that makes up the E protein fits in a hydrophobic pocket between two of those domains," Liu said.

The model provides both the structural details and an understanding of the intermolecular forces that allow E proteins deep within an infected cell to wrench PALS1 from its place at the cell's outer boundary.

"Now we can explain how the interactions pull PALS1 from the human lung-cell junction and contribute to the damage," Liu said.

Implications for drugs and evolution

"This structure provides the foundation for our computational science colleagues to run docking studies and molecular dynamics simulations to search for drugs or drug-like molecules that might block the interaction," said John Shanklin, chair of Brookhaven Lab's Biology Department and a coauthor on the paper. "And if they identify promising leads, we have the analytical capabilities to rapidly screen through such candidate drugs to identify ones that might be key to preventing severe consequences of COVID-19."

Understanding the dynamics of this protein interaction will also help scientists track how viruses like SARS-CoV-2 evolve.

"When the virus protein pulls PALS1 out of the cell junction, it could help the virus spread more easily. That would provide a selective advantage for the virus. Any traits that increase the survival, spread, or release of the virus are likely to be retained," Liu said.

The longer the virus continues to circulate, the more chances there are for new evolutionary advantages to arise.

"This is one more reason it is so essential for us to identify and implement promising therapeutics," Liu said. "In addition to preventing the most severe infections, drugs that effectively treat COVID-19 will keep us ahead of these mutations."

Credit: 
DOE/Brookhaven National Laboratory

UN: More harmful algal bloom impacts emerge amid rising seafood demand, coastal development

image: Algal bloom in the Baltic Sea, where extreme blooms are a significant problem. Captured June 2016 by the European Space Agency's Sentinel-3A satellite, which helps monitor, for example, concentrations of algae, suspended matter and chlorophyll in seawater, useful for predicting harmful algal blooms. The health and vulnerability of marine ecosystems is fundamental to our knowledge of ocean productivity and, in turn, fish stocks. Download: http://www.esa.int/spaceinimages/Images/2016/07/Baltic_swirls

Image: 
European Space Agency

An unprecedented analysis of almost 10,000 Harmful Algal Bloom (HAB) events worldwide over the past 33 years was launched today by UNESCO's Intergovernmental Oceanographic Commission.

The first-ever global statistical analysis examined ~9,500 HAB events over 33 years and found that the harm caused by HABs rises in step with the growth of the aquaculture industry and marine exploitation; the authors call for more research on these linkages.

Conducted over seven years by 109 scientists in 35 countries, the study found that reported HAB events have increased in some regions and decreased or held steady in others. The widely stated view that HABs are on the rise throughout the world, perhaps due to climate change, was not confirmed.

However, the study, "Perceived global increase in algal blooms is attributable to intensified monitoring and emerging bloom impacts," published in the Nature journal Communications Earth & Environment, creates the world's first baseline against which to track future shifts in the location, frequency and impacts of HABs. Those impacts differ depending on which of the 250 harmful marine algae species is involved and where, requiring assessment on a species-by-species and site-by-site basis.

A public webinar on the Global HAB Status Report will take place Tuesday, June 15, 2021, at 1 p.m. Paris time. To register: https://bit.ly/3z3kjCB

Databases mined

The scientists mined both the global Harmful Algae Event Database (HAEDAT), consisting of 9,503 events with one or more impacts on human society, and the Ocean Biodiversity Information System (OBIS) database, containing 7 million microalgal observation records, including 289,668 toxic algal species occurrences.

The study found that regionally recorded HAB events, after correction for higher levels of monitoring effort, have:

Increased:

Central America/Caribbean

South America

Mediterranean

North Asia

Decreased:

West Coast America

Australia/New Zealand

No significant change:

East Coast America

South East Asia

Europe

The 9,503 events' impacts on humans break down as follows:

48% involved seafood toxins

43% high phytoplankton counts and/or water discolorations with a socio-economic impact

7% mass animal or plant mortalities

2% caused other impacts (including foam and mucilage production)

(In addition, in 11% of events a single incident had multiple impacts, e.g., both water discoloration and mass mortality.)

Of the event records linked to seafood toxins:

35% were Paralytic Shellfish Toxins (PST)

30% Diarrhetic Shellfish Toxins (DST)

9% Ciguatera Poisoning (CP)

9% marine and brackish water cyanobacterial toxins

7% Amnesic Shellfish Toxins (AST)

10% others, including Neurotoxic Shellfish Toxins (NST), Azaspiracid Shellfish Toxins (AZA), and toxic aerosols

By region, the largest number of records came from, in order:

Europe

North Asia

Mediterranean

The east and west coasts of North America

Caribbean

Pacific/Oceania

Southeast Asia

More limited data sets were available for South America and Australia/New Zealand.

All geographic regions were impacted by multiple HAB types, but in varying proportions.

In the Caribbean, Benguela, Mediterranean Sea, and North and Southeast Asia, 50% of regional HAEDAT records related to high phytoplankton density problems.

Seafood toxin and fish kill impacts dominated in all other regions.

Among toxin-related impacts:

Paralytic Shellfish Toxins (PST) prevailed in North America, the Caribbean, South America, South East Asia, and North Asia

Diarrhetic Shellfish Toxins (DST) were the most frequently recorded in Europe and the Mediterranean (and are an emerging threat in the USA)

Neurotoxic Shellfish Toxins (NST) were confined to the US state of Florida, with a single outbreak also reported from New Zealand

Human poisonings from Ciguatera were prominent in the tropical Pacific, the Indian Ocean, Australia and the Caribbean.

For the most part, however, the impacts were confined to shellfish harvesting area closures; rarely to human poisonings. The exception: Ciguatera event records are almost exclusively based on medical reports of human poisonings.

HAB events over time

Eight of the nine regions used in the study showed increases in HAEDAT-logged harmful events per year; six of these increases were statistically significant.

The OBIS dataset, meanwhile, generally showed an increase in sampling effort in five of the nine regions.

When all the information was combined, the researchers could find no statistically significant global trend overall.

They also found, however, that aquaculture production increased 16-fold, from a global total of 11.35 million tonnes of seafood in 1985 to 178.5 million tonnes in 2018, with the largest increases occurring in Southeast Asia and South America/Caribbean and Central America, and with production in North America and Europe stabilising.

The number of recorded harmful algal bloom events over time was strongly correlated with intensified aquaculture production in all regions with data suitable for the study.

However, lead author Gustaaf M. Hallegraeff of the University of Tasmania notes: "Intensified aquaculture clearly drives an increase in HAB monitoring efforts essential to sustaining the industry and protecting human health."

"And, just as clearly, a secondary effect of aquaculture is nutrient pollution. But a major data gap exists here. Conducting a meta-analysis of HABs vs aquaculture we had data on HAB monitoring efforts using OBIS records as a proxy but data on nutrient pollution is inadequate. The relationship between aquaculture-related nutrients and HABs therefore represents an important direction for further research."

Greater monitoring efforts

The study revealed:

A 4-fold increase from 1985 to 2018 in observations of organisms mainly responsible for Diarrhetic Shellfish Poisoning (84,392 OBIS records)

A 7-fold increase in observations of organisms mainly responsible for Amnesic Shellfish Poisoning (128,282 OBIS records)

A 6-fold increase in observations of organisms mainly responsible for Paralytic Shellfish Poisoning (9,887 OBIS records)

(Note: Some observations may include non-toxic species or strains.)

In each case, the clear increase in the number of observations of problematic organisms paralleled an increase in records of associated toxic syndrome impacts.

They also found that the presence of toxic HAB species doesn't always accurately predict cases of human shellfish poisonings, which the study credits to the food safety risk management strategies in many affected countries. Some 11,000 non-fatal events related to Diarrhetic Shellfish Poisoning were reported worldwide, mostly from Europe, South America and Japan, with impacts consisting mostly of shellfish harvesting area closures.

Also, the study says, despite widespread distribution of the responsible algal species, there have been no human fatalities from Amnesic Shellfish Poisoning since the original 1987 incident in Prince Edward Island, Canada (150 illnesses, three fatalities). But ASP-associated mortalities of important marine mammals are of growing concern in Alaska and other parts of western North America, and ASP toxins have been linked to marine mammal calf mortalities in Argentina.

Of the world's 3,800 human Paralytic Shellfish Poisonings from 1985 to 2018, the largest number (2,555 from 1983 to 2013, including 165 fatalities) occurred in the Philippines, which depends strongly on aquaculture for human food protein.

DNA and other advanced detection methods have improved knowledge of the global distribution of ciguatera-causing organisms. Ciguatera poisonings, rarely fatal but annually affecting 10,000 to 50,000 people, have been decreasing in Hawaii, have remained stable in French Polynesia and the Caribbean, but constitute a new phenomenon in the Canary Islands.

Farmed fish killed by algal blooms: Largely a human-generated problem.

Mortalities of aquacultured finfish account for much greater economic damage than HAB-contaminated seafood. The study notes that wild marine finfish can simply swim away from blooms, but fish held captive in intensive aquaculture operations are vulnerable. Recorded losses include US $71 million in Japan in 1972, $70 million in Korea in 1995, $290 million in China in 2012, and $100 million in Norway in 2019.

A 2016 Chilean salmon mortality event caused a record $800 million loss, causing major social unrest.

Again, the presence of fish-killing HAB species doesn't accurately predict economic losses, the study shows. For example, Heterosigma blooms occur on the west and east coasts of Canada and the US, but fish mortalities are mostly confined to the west coast. In large part, the difference reflects the differences between sites where blooms occur and the relative location and size of aquaculture operations.

A harmful algae species that caused no problems in Australian lagoons killed 50,000 caged fish in Malaysia in 2014. It is now also known in Japan and the Philippines.

The authors note that, as ocean waters warm and acidify, some troublesome algal species may thrive while others decline.

Commentary

"There has been a widely-stated contention that HABs worldwide are increasing in distribution, frequency or intensity, so a quantitative global assessment is long overdue," says lead author Prof. Hallegraeff of the Institute for Marine and Antarctic Studies, University of Tasmania.

"While some of the HAB literature over the past 30 years has handpicked selected examples to claim a global increase and expansion in HABs, this new big data approach shows a much more nuanced trend," he adds.

"Our study concludes that the health and economic damages caused by harmful microalgae -- seafood poisoning, water discolouration that blights tourism, and the death of finfish in aquaculture operations, for example -- differ between regions."

Adds co-author Adriana Zingone: "We also found that overexploitation acts as a natural multiplier of the effects of HABs, leading to an increase in impacts independent of an actual trend in HABs."

"It should be noted that over the last 40 years capacity and monitoring efforts to detect harmful species and harmful events have also increased, thus increasing the reporting of harmful events across the world's seas," she says.

"The absence of events and decreasing trends, like all negative results, are rarely published. Whether or not HABs are increasing globally, however, their impacts are a growing concern all around the globe."

Says co-author Henrik Oksfeldt Enevoldsen: "As the human population continues to increase in tandem with resource demands, HABs will predictably constitute a serious threat in terms of seafood safety and security, a hindrance to recreational uses of the sea, and a problem for the tourism industry."

"Occurrences of harmful species over time and their human impacts can be expected to change locally, regionally and globally alongside the effects that climate, hydrography and human pressure impose on the coastal environment."

"Understanding the trends and distribution patterns of harmful species and events at multiple spatial and temporal scales will help predict whether, where and when to expect HABs, their frequency and intensity. This knowledge is fundamental for effective management of HABs and to optimise the uses and values of the maritime space in coastal areas."

Johan Hanssens, Secretary-General of the Flanders Department of Economy, Science and Innovation, a sponsor of this report, concluded: "This status report is a very timely reminder, at the start of the UN Decade of Ocean Science for Sustainable Development, that a thorough understanding of natural and ecological processes in the ocean is crucial for the development of the blue economy, now that many coastal countries are turning to the sea for additional resources, including food provisioning. International scientific collaboration is essential and most efficient to address the associated challenges."

Credit: 
Terry Collins Assoc

Non-altered birth cord cells boost survival of critically ill COVID-19 patients

image: Ismail Hadisoebroto Dilogo, professor of medicine at Cipto Mangunkusumo Central Hospital-Universitas Indonesia, corresponding author of the study.

Image: 
AlphaMed Press

Durham, NC - Critically ill COVID-19 patients treated with non-altered stem cells from umbilical cord connective tissue were more than twice as likely to survive as those who did not have the treatment, according to a study published today in STEM CELLS Translational Medicine.

The clinical trial, carried out at four hospitals in Jakarta, Indonesia, also showed that administering the treatment to COVID-19 patients with an added chronic health condition such as diabetes, hypertension or kidney disease increased their survival more than fourfold.

All 40 patients who took part in the double-blind, controlled, randomized study were adults in intensive care who had been intubated due to COVID-19-induced pneumonia. Half were given intravenous infusions containing umbilical mesenchymal stromal cells, or stem cells derived from the connective tissue of a human birth cord, and half were given infusions without them.

The survival rate of those receiving the stem cells was 2.5 times higher and climbed even more - 4.5 times - in the COVID-19 patients who had other chronic health conditions, said Ismail Hadisoebroto Dilogo, professor of medicine at Cipto Mangunkusumo Central Hospital-Universitas Indonesia and research team member.
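
The headline figures are risk ratios. As a minimal illustration of how such a ratio is computed from trial outcomes, the sketch below uses a 2x2 survival table with purely hypothetical counts, sized like the 40-patient, 20-per-arm trial; the study's actual patient-level numbers are not reproduced here.

```python
# How a "2.5 times more likely to survive" figure arises: a survival risk
# ratio from a 2x2 table. Counts below are purely hypothetical (sized like
# the 40-patient, 20-per-arm trial), NOT the study's actual data.
surv_treated, n_treated = 10, 20
surv_control, n_control = 4, 20

risk_treated = surv_treated / n_treated      # 0.50
risk_control = surv_control / n_control      # 0.20
print("survival risk ratio:", risk_treated / risk_control)   # 2.5
```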

The stem cell infusion also was found to be safe and well-tolerated with no life-threatening complications or acute allergic reactions in seven days of post-infusion monitoring, he said.

Previous clinical trials have shown that treating COVID-19 pneumonia patients with stem cells from umbilical cord connective tissue may help them survive and recover more quickly, but the Indonesian study is the first to treat intubated, critically ill COVID-19 pneumonia patients with a naive, or non-genetically manipulated, form of the stem cells.

"Unlike other studies, our trial used stem cells obtained through explants from actual umbilical cord tissue and we did not manipulate them to exclude ACE2, a cellular protein thought to be an entry point for COVID-19," Dilogo said.

Some research suggests that one of the main causes of acute respiratory distress in COVID-19 patients is "cytokine storm," a condition in which infection prompts the body's immune system to flood the bloodstream with inflammatory proteins.

"The exact cause of cytokine storm is still unknown, but our study indicates that the presence of non-manipulated umbilical cord stromal stem cells improves patient survival by modulating the immune system toward an anti-inflammatory immune state," Dilogo said.

Since there is no cure for COVID-19, supportive care has been the only help available for patients who are critically ill with the virus.

"Although our study focused on a small number of patients, we think this experimental treatment could potentially lead to an effective adjuvant therapy for COVID-19 patients in intensive care who do not respond to conventional supportive treatment," he said.

Dilogo's research team launched the clinical trial last year after the COVID-19 occupancy rate in Jakarta's intensive care units climbed to 80 percent and the mortality rate of critically ill COVID-19 pneumonia patients in the ICUs reached 87 percent.

"This study, which assessed the potential therapeutic effect of human umbilical-cord mesenchymal stem cells on critically-ill COVID-19 patients, provides promising results that could inform a potential treatment to increase survival rates," said Anthony Atala, M.D., Editor-in-Chief of STEM CELLS Translational Medicine and Director of the Wake Forest Institute for Regenerative Medicine. "Having additional potential therapies, such as MSCs, could be highly beneficial for these patients."

Credit: 
AlphaMed Press

Identification of RNA editing profiles and their clinical relevance in lung adenocarcinoma

image: Identification of RNA editing subtypes in LUAD patients.

Image: 
©Science China Press

The incidence of lung adenocarcinoma (LUAD) is gradually increasing, and its mortality remains high. Recent advances in genomic profiling of LUAD have identified a number of driver alterations in specific genes, enabling molecular classification and corresponding targeted therapy. However, only a fraction of LUAD patients carry those driver mutations and can benefit from targeted therapy, leaving the large remainder of patients unclassified. RNA editing events are nucleotide changes introduced at the RNA level. The role of RNA editing events in tumorigenesis and their potential clinical utility have been reported in a series of studies, but the profile of RNA editing events and their clinical relevance in LUAD remained largely unknown.

"We describe a comprehensive landscape of RNA editing events in LUAD by integrating transcriptomic and genomic data from our NJLCC project and TCGA project. We find that the global RNA editing level is significantly increased in tumor tissues and is highly heterogeneous across LUAD patients. The high RNA editing level in tumors can be attributed to both RNA and DNA alterations." said Dr. Cheng Wang, the first author for this work. The results indicated that the pattern of RNA editing events could represent the global characteristics of lung adenocarcinoma. "We then define a new molecular subtype, EC3, based on most variable RNA editing sites. The patients of this subtype show the poorest prognosis. Importantly, the subtype is independent of classic molecular subtypes based on gene expression or DNA methylation. We further propose a simplified prediction model including eight RNA editing sites to accurately distinguish EC3 subtype. " said Dr. Wang. Molecular typing based on a few RNA editing sites may have enormous potential in the clinics. "By applying the simplified model, we find that the EC3 subtype is associated with the sensitivity of specific chemotherapy drugs." said Dr. Wang.

"Our study comprehensively describes the general pattern of RNA editing in LUAD. More importantly, we propose a novel molecular subtyping strategy of LUAD based on RNA editing that could predict the prognosis of patients. A simplified model with a few editing sites makes the strategy potentially available in the clinics." said Professor Hongbing Shen, the corresponding author.

Credit: 
Science China Press

Susceptibility of COPD patients to heart rate differences associated with exposure to metals in PM2.5

image: Exposure to metals in PM2.5 and differences in heart rate in COPD and non-COPD participants.

Image: 
©Science China Press

Epidemiological and toxicological studies indicate that the adverse outcomes of PM2.5 exposure are closely associated with the chemical composition of PM2.5. Metals in PM2.5 are of particular concern because they disrupt iron homeostasis in the lung and trigger subsequent oxidative stress, one of the key mechanisms underlying the cardiovascular autonomic dysfunction caused by PM2.5 exposure. However, there is no clear evidence on whether COPD patients are more susceptible than individuals without COPD to cardiovascular autonomic dysfunction associated with exposure to metals in ambient PM2.5. Based on a panel study, the researchers directly compared metal-associated cardiovascular autonomic dysfunction between COPD patients and healthy controls.

"We observed higher levels of heart rate (HR) associated with exposure to PM2.5 and the metal elements therein. Associations between HR and metals exposure in COPD patients were stronger than those in healthy controls. Exposure to Cr had robust associations with HR, comparing to other metals." said Dr. Ke Gao, the first author for this work.

This study has many advantages. "It is a panel study design with repeated measurements of HR and exposure to metals in PM2.5. All of the recruited participants acted as their own controls, thereby lessening deviation linked to intra-individual differences. In addition, we recruited both individuals with COPD and healthy controls, allowing us to make direct comparisons of the responses of the COPD and healthy groups and to investigate the susceptibility of individuals with COPD to air pollution-linked responses," said Dr. Xi Chen, the co-first author of this work.
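
For readers unfamiliar with panel designs, the sketch below shows one standard way such repeated-measures data are analyzed: a linear mixed model with a random intercept per participant (each subject acting as their own control) and a metal-exposure x COPD interaction term on heart rate. The data and model are illustrative assumptions, not the authors' actual analysis.

```python
# One standard analysis for a repeated-measures panel: a linear mixed model
# with a random intercept per participant and a metal x COPD interaction on
# heart rate. Synthetic data and model; not the authors' actual analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_subj, n_visits = 40, 5
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_visits),
    "copd": np.repeat(rng.integers(0, 2, n_subj), n_visits),
    "metal": rng.lognormal(0.0, 0.5, n_subj * n_visits),   # e.g. Cr in PM2.5
})
baseline = np.repeat(rng.normal(0, 3, n_subj), n_visits)   # own-control offsets
df["hr"] = (75 + baseline
            + (1.0 + 2.0 * df["copd"]) * df["metal"]       # stronger COPD response
            + rng.normal(0, 2, len(df)))

fit = smf.mixedlm("hr ~ metal * copd", df, groups=df["subject"]).fit()
print(fit.summary())    # the metal:copd term captures the differential response
```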

"Although our results require further validation, they shed light on the disease burden of COPD associated with exposure to metals in airborne PM2.5 in China." Said Prof. Tong Zhu, the corresponding author of the paper.

These encouraging results provide valuable data and improve scientific understanding of the susceptibility of COPD patients to metal-associated cardiovascular autonomic dysfunction.

Credit: 
Science China Press

Discovery of a dying supermassive black hole via a 3,000-year-long light echo

image: The radio band composite image of Arp 187 obtained by VLA and ALMA telescopes (blue: VLA 4.86 GHz, green: VLA 8.44 GHz, red: ALMA 133 GHz). The image shows clear bimodal jet lobes, but the central nucleus (center of the image) is dark/non-detection.

Image: 
ALMA (ESO/NAOJ/NRAO), Ichikawa et al.

Supermassive black holes (SMBHs) occupy the centers of galaxies, with masses ranging from one million to 10 billion solar masses. Some SMBHs are in a bright phase called active galactic nuclei (AGN).

AGNs will eventually burn out, since there is a maximum mass limit for SMBHs; scientists have long pondered when that will happen.

Tohoku University's Kohei Ichikawa and his research group may have discovered an AGN towards the end of its life span by accident after catching an AGN signal from the Arp 187 galaxy.

By observing radio images of the galaxy from two observatories - the Atacama Large Millimeter/submillimeter Array (ALMA) and the Very Large Array (VLA) - they found jet lobes, a hallmark of AGN.

However, they noticed no signal from the nucleus, indicating that the AGN activity might already be silent.

Upon further analysis of the multi-wavelength data, they found all the small-scale AGN indicators to be silent, while the large-scale ones were bright. This indicates that the AGN was quenched within the last 3,000 years.

Once an AGN dies off, its smaller-scale features fade first because the photon supply also shuts down. But the large-scale ionized gas region remains visible, since it takes about 3,000 years for the last photons to arrive at the region's edge. Observing past AGN activity in this way is known as a light echo.
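
The timescale arithmetic is straightforward: light takes one year per light-year to travel outward, so a region edge roughly 3,000 light-years from the nucleus keeps glowing for about 3,000 years after shutdown. A minimal sketch, with the region size as an illustrative assumption:

```python
# Light-echo arithmetic: a region whose edge lies R light-years from the
# nucleus keeps glowing for ~R years after shutdown, since the last photons
# need that long to arrive. The region size is an illustrative assumption.
LY_PER_PC = 3.2616               # light-years per parsec

radius_pc = 920.0                # hypothetical extent of the ionized region
radius_ly = radius_pc * LY_PER_PC
print(f"echo duration: ~{radius_ly:,.0f} years")   # ~3,000 years
```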

"We used the NASA NuSTAR X-ray satellite, the best tool to observe current AGN activity," said Ichikawa. "It enables non-detection, so we were able to discover that the nucleus is completely dead."

The findings indicate that AGN turn-off occurs within a 3,000-year timescale, with the nucleus becoming over 1,000 times fainter during the last 3,000 years.

Ichikawa, who co-authored a paper for the 238th Meeting of the American Astronomical Society, says they will continue to investigate dying AGNs moving forward. "We will search for more dying AGN using a similar method as this study. We will also obtain high spatial resolution follow-up observations to investigate the gas inflows and outflows, which might clarify how the shutdown of AGN activity has occurred."

Credit: 
Tohoku University

Early endeavors on the path to reliable quantum machine learning

Anyone who collects mushrooms knows that it is better to keep the poisonous and the non-poisonous ones apart. Not to mention what would happen if someone ate the poisonous ones. In such "classification problems", which require us to distinguish certain objects from one another and to assign the objects we are looking for to certain classes by means of characteristics, computers can already provide useful support to humans.

Intelligent machine learning methods can recognise patterns or objects and automatically pick them out of data sets. For example, they could pick out those pictures from a photo database that show non-toxic mushrooms. Particularly with very large and complex data sets, machine learning can deliver valuable results that humans could not find, or could find only with much more time. However, for certain computational tasks, even the fastest computers available today reach their limits. This is where the great promise of quantum computers comes into play: that one day they will perform super-fast calculations that classical computers cannot complete in a useful period of time.

The reason for this "quantum supremacy" lies in physics: quantum computers calculate and process information by exploiting certain states and interactions that occur within atoms or molecules or between elementary particles.

The fact that quantum states can superpose and entangle gives quantum computers access to a fundamentally richer set of processing logic. For instance, unlike classical computers, quantum computers do not calculate with binary codes or bits, which process information only as 0 or 1, but with quantum bits or qubits, which correspond to the quantum states of particles. The crucial difference is that qubits can realise not only one state - 0 or 1 - per computational step, but also a state in which both superpose. These more general modes of information processing in turn allow for a drastic computational speed-up on certain problems.
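
As a concrete illustration of the superposition idea described above, the toy sketch below represents a qubit as a two-component complex state vector and applies a Hadamard gate to put |0> into an equal superposition. It is a plain numpy exercise, not a quantum SDK example.

```python
# Toy illustration of superposition: a qubit as a 2-component complex vector,
# with a Hadamard gate turning |0> into an equal superposition of 0 and 1.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)            # definite state |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate

psi = H @ ket0                                     # (|0> + |1>) / sqrt(2)
print("amplitudes:", psi)                          # [0.707..., 0.707...]
print("measurement probabilities:", np.abs(psi) ** 2)   # [0.5, 0.5]
```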

Translating classical wisdom into the quantum realm

These speed advantages of quantum computing are also an opportunity for machine learning applications - after all, quantum computers could compute the huge amounts of data that machine learning methods need to improve the accuracy of their results much faster than classical computers.

However, to really exploit the potential of quantum computing, one has to adapt the classical machine learning methods to the peculiarities of quantum computers. For example, the algorithms, i.e. the mathematical calculation rules that describe how a classical computer solves a certain problem, must be formulated differently for quantum computers. Developing well-functioning "quantum algorithms" for machine learning is not entirely trivial, because there are still a few hurdles to overcome along the way.

On the one hand, this is due to the quantum hardware. At ETH Zurich, researchers currently have quantum computers that work with up to 17 qubits (see "ETH Zurich and PSI found Quantum Computing Hub" of 3 May 2021). However, if quantum computers are to realise their full potential one day, they might need thousands to hundreds of thousands of qubits.

Quantum noise and the inevitability of errors

One challenge that quantum computers face concerns their vulnerability to errors. Today's quantum computers operate with a very high level of "noise", as errors or disturbances are known in technical jargon. For the American Physical Society, this noise is "the major obstacle to scaling up quantum computers". No comprehensive solution exists for correcting or mitigating errors: no way has yet been found to produce error-free quantum hardware, and quantum computers with 50 to 100 qubits are too small to implement error-correction software or algorithms.

To a certain extent, one has to live with the fact that errors in quantum computing are in principle unavoidable, because the quantum states on which the concrete computational steps are based can only be distinguished and quantified with probabilities. What can be achieved, on the other hand, are procedures that limit the extent of noise and perturbations to such an extent that the calculations nevertheless deliver reliable results. Computer scientists refer to a reliably functioning calculation method as "robust" and in this context also speak of the necessary "error tolerance".

This is exactly what the research group led by Ce Zhang, ETH computer science professor and member of the ETH AI Center, has recently explored, somewhat "accidentally", during an endeavor to reason about the robustness of classical distributions for the purpose of building better machine learning systems and platforms. Together with Professor Nana Liu from Shanghai Jiao Tong University and Professor Bo Li from the University of Illinois at Urbana-Champaign, they have developed a new approach that allows them to prove the robustness conditions of certain quantum-based machine learning models, under which the quantum computation is guaranteed to be reliable and the result to be correct. The researchers have published their approach, one of the first of its kind, in the scientific journal npj Quantum Information.

Protection against errors and hackers

"When we realised that quantum algorithms, like classical algorithms, are prone to errors and perturbations, we asked ourselves how we can estimate these sources of errors and perturbations for certain machine learning tasks, and how we can guarantee the robustness and reliability of the chosen method," says Zhikuan Zhao, a postdoc in Ce Zhang's group. "If we know this, we can trust the computational results, even if they are noisy."

The researchers investigated this question using quantum classification algorithms as an example - after all, errors in classification tasks are tricky because they can affect the real world, for example if poisonous mushrooms were classified as non-toxic. Perhaps most importantly, using the theory of quantum hypothesis testing, which allows quantum states to be distinguished - inspired by other researchers' recent work applying hypothesis testing in the classical setting - the ETH researchers determined a threshold above which the assignments of the quantum classification algorithm are guaranteed to be correct and its predictions robust.

With their robustness method, the researchers can even verify whether the classification of an erroneous, noisy input yields the same result as a clean, noiseless input. From their findings, the researchers have also developed a protection scheme that can be used to specify the error tolerance of a computation, regardless of whether an error has a natural cause or is the result of manipulation from a hacking attack. Their robustness concept works for both hacking attacks and natural errors.
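
In the classical setting, the question being certified can be illustrated in a few lines: does a classifier return the same label for a clean input and a noise-perturbed copy? The sketch below is only a sampling-based analogue of that question, with a hypothetical classifier and noise scale; the paper's contribution is a provable threshold for the quantum case, which this toy does not reproduce.

```python
# Sampling-based analogue (classical) of the robustness question: does a
# classifier keep its label when the input is perturbed by noise? The
# classifier and noise scale here are hypothetical.
import numpy as np

rng = np.random.default_rng(3)
w = rng.normal(size=8)            # hypothetical linear classifier weights
x_clean = rng.normal(size=8)

def classify(x):
    """Label = sign of the linear score w.x (0 or 1)."""
    return int(np.dot(w, x) > 0)

trials = 1000
agree = sum(classify(x_clean + rng.normal(scale=0.1, size=8)) == classify(x_clean)
            for _ in range(trials))
print(f"label preserved in {100 * agree / trials:.1f}% of noisy trials")
```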

"The method can also be applied to a broader class of quantum algorithms," says Maurice Weber, a doctoral student with Ce Zhang and the first author of the publication. Since the impact of error in quantum computing increases as the system size rises, he and Zhao are now conducting research on this problem. "We are optimistic that our robustness conditions will prove useful, for example, in conjunction with quantum algorithms designed to better understand the electronic structure of molecules."

Credit: 
ETH Zurich

First glimpse of the brain retrieving mistaken memories

image: Scientists have observed for the first time what it looks like in the key memory region of the brain when a mistake is made during a memory trial.

Image: 
Jenna Luecke/University of Texas at Austin

Scientists have observed for the first time what it looks like in the key memory region of the brain when a mistake is made during a memory trial. The findings have implications for Alzheimer's disease research and for advances in memory storage and enhancement; the discovery also provides a view into how the physiological events in the brain differ between a correct memory and a faulty one.

The study was published today in the journal Nature Communications.

In both correct and incorrect recall of a spatial memory, researchers could observe patterns of cell activation in the brain that were similar, though the pace of activation differed.

"We could see the memories activating," said Laura Colgin, an associate professor of neuroscience at The University of Texas at Austin and lead author of the paper. "It's like dominoes falling. One cell activates and then the next fires."

Colgin and her team used electrophysiological recordings of rats in and out of mazes to study signals in the brain as the rats attempted to remember where a food reward was located and find it.

When rats remembered where the food reward was and located it, a specific pattern of brain cells activated with similar timing. These cells, called place cells, are associated with memories involving spatial relationships and locations and are found in the hippocampus, a section of the brain where animals including humans store most of their memories. It is also a region of the brain that experiences degeneration in patients with Alzheimer's and related memory disorders.

"If we understand what happens when a memory is not properly retrieved, it may give us insights into what is happening with memory disorders like Alzheimer's," Colgin said.

What the researchers saw in rats when they got the location wrong surprised them. They expected to see cells fire in a jumble. What they saw was the same pattern observed when rats were correct in finding the location, but the timing of the cell activation was different.

"The activation started later and it was slower, but the same pattern fired," Colgin said. "There may be less energy in the network to drive the cells, and that may be why memory was not connected with action."

The study also found that on trials when rats remembered the correct location, they were accessing the memory of the location while they were resting between tests, causing the pattern of cells to activate as they waited, the way a person might practice a speech before giving it.

On trials when rats made mistakes, they did not activate the memory of the location before they entered the maze.

One of the lab's long-term goals is to contribute to understanding memory formation and retrieval enough that one day, lost memories could be accessed even by people with memory disorders with the help of brain-computer interface technology.

"If we can understand how these large ensembles of neurons that represent memories are formed and what's happening when these memories are being properly retrieved, someday we may be able to decipher and store memories," Colgin said.

Colgin and her team plan to continue the research and hope to be able to decode memory formation and activation in real time in rats.

Credit: 
University of Texas at Austin

Snowflake morays can feed on land, swallow prey without water

image: Researchers trained seven snowflake moray eels over a period of more than five years in order to demonstrate how the eels can eat on land.

Image: 
Rita Mehta/UC Santa Cruz

Most fish rely on water to feed, using suction to capture their prey. A new study, however, shows that snowflake morays can grab and swallow prey on land without water thanks to an extra set of jaws in their throats.

After a moray eel captures prey with its first set of jaws, a second set of "pharyngeal jaws" then reaches out to grasp the struggling prey and pull it down into the moray's throat. Rita Mehta, an associate professor of ecology and evolutionary biology at UC Santa Cruz, first described this astonishing feeding mechanism in a 2007 Nature paper.

The new study, published June 7 in the Journal of Experimental Biology, shows that these pharyngeal jaws enable at least one species of moray to feed on land.

"Most fishes really need water to feed," Mehta said. "This is the first example of a fish that can feed on land without relying on water."

Reports of snowflake morays coming out of the water to grab crabs on the shore prompted her to take a closer look, she said. "These particular moray eels tend to eat hard-shelled prey like crabs, and I would see reports in the literature of them moving out of the water and lunging for crabs, but it was unclear what happened next."

Even fish well adapted to an amphibious lifestyle, such as mudskippers, need water to swallow their food. "Mudskippers come up onto mudflats and grab prey like small crabs and insects. They get around the challenge of swallowing on land by sucking up water and then using the water they have reserved in their mouth to swallow," Mehta said.

Snowflake morays can do it without water because of their unusual feeding mechanics.

"They have highly moveable pharyngeal jaws in their throat," she said. "Once the moray captures prey in its oral jaws, the pharyngeal jaws grab onto the prey again and move it further back into the esophagus. This mechanical movement does not rely on water."

Demonstrating that snowflake morays can eat on land, however, was no easy task. It took Mehta and a team of undergraduates over five years to train seven snowflake morays to slither up a ramp onto a platform, grab a piece of fish, and swallow it before returning to the water.

"They feel safer in the water, so at first they would just grab the fish and go straight back into the water with it," she said. "I relied on a team of dedicated and enthusiastic undergraduate researchers to work on training them."

Coauthor Kyle Donohoe was especially helpful, she said, because of his animal training experience from working with marine mammals as a research assistant at the Pinniped Cognition and Sensory Systems Laboratory next to Mehta's lab at UCSC's Long Marine Laboratory.

Once the eels were trained to feed on the platform, Mehta documented this unusual feeding behavior on video. She said the feeding performance of young snowflake morays is as good on land as it is in water.

"As a result, these particular morays can utilize very different environments for food resources," she said.

Credit: 
University of California - Santa Cruz

Binder-free MWW-type titanosilicate for selective and durable propylene epoxidation

image: Researchers designed and prepared a structured binder-free Ti-MWW catalyst via a combined method of shaping, recrystallization and chemical modification of the active Ti sites. As a result, it had an ultra-long lifetime of 2,400 h in a single HPPO run, during which the space-time yield (STY) of PO was as high as 543 g kg-1 h-1, while solvent consumption was only 194.3 kg kmol(H2O2)-1.

Image: 
Chinese Journal of Catalysis

Propylene oxide (PO) is an important, highly reactive propylene derivative used extensively as a raw material for the manufacture of numerous commercial chemicals. The titanosilicate-catalyzed hydrogen peroxide propylene oxide (HPPO) process is considered the most advantageous route because it is highly economical and eco-friendly, giving only H2O as the theoretical byproduct and achieving high PO selectivity under mild reaction conditions. The industrial HPPO process is generally carried out in a fixed-bed reactor using shaped titanosilicate catalysts. Unfortunately, the inert and non-porous binders in shaped catalysts always negatively affect the accessibility of active sites and the reaction performance of the HPPO process. Moreover, in terms of catalyst cost, epoxidation reactivity and PO selectivity, the second-generation Ti-MWW/H2O2/acetonitrile system is superior to the currently commercialized first-generation TS-1/H2O2/methanol system. It is therefore of great academic and industrial significance to design and synthesize an applicable Ti-MWW catalyst and realize a highly efficient HPPO process, which must be engineered delicately and comprehensively.

Recently, a research team led by Prof. Peng Wu of East China Normal University, China, designed and synthesized a structured binder-free MWW-type titanosilicate catalyst with attractive HPPO performance via a combined approach of shaping, recrystallization and chemical modification of the Ti sites. A controlled dual-template-assisted hydrothermal recrystallization converted the amorphous SiO2 binders in the extruded SiO2/Ti-MWW catalyst into a crystalline zeolite phase. This procedure killed two birds with one stone: it resolved the mass-transfer problems of the shaped catalyst and, at the same time, chemically modified the micro-environments of the active Ti sites. Recrystallization not only liberated the Ti sites imprisoned by binders within the micropores, improving diffusion efficiency and the accessibility of the Ti sites, but also generated more of the active open-framework TiO6 species and abundant internal silanol nests, which promoted the accumulation and activation of H2O2 inside the Ti-MWW monolith.

Afterwards, successive piperidine treatment and fluorination of the binder-free Ti-MWW further enhanced the H2O2 activation and active-oxygen transfer ability of the Ti sites, and stabilized the Ti-OOH intermediate through hydrogen bonding between the terminal H in Ti-OOH and adjacent Si-F species, leading to a more efficient epoxidation process. In addition, the side reaction of PO hydrolysis was suppressed because the modification quenched numerous acidic Si-OH groups. The lifetime of the modified binder-free Ti-MWW catalyst was 2400 h, with H2O2 conversion and PO selectivity both above 99.5% as well as low solvent consumption. This outstanding catalytic performance points to the great potential of the structured binder-free Ti-MWW catalyst in industrial HPPO applications. The results were published in the Chinese Journal of Catalysis (10.1016/S1872-2067(20)63759-7).
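As a rough illustration of what these figures imply, the space-time yield and lifetime quoted above can be combined into a cumulative productivity estimate. The short Python sketch below uses only the numbers given in the release; the calculation is illustrative and not part of the published study.

```python
# Back-of-the-envelope productivity estimate from the reported figures.
# Inputs are taken from the press release; the arithmetic is illustrative.

sty_po = 543.0       # space-time yield of PO, g per kg catalyst per hour
lifetime_h = 2400.0  # single-run catalyst lifetime, hours

total_po_kg = sty_po * lifetime_h / 1000.0  # grams -> kilograms
print(f"Cumulative output: ~{total_po_kg:.0f} kg PO per kg catalyst per run")
# -> roughly 1300 kg of propylene oxide per kilogram of catalyst
```

In other words, over a single 2400-hour run each kilogram of catalyst turns out on the order of 1.3 tonnes of propylene oxide, which is why the long lifetime matters commercially.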

Credit: 
Dalian Institute of Chemical Physics, Chinese Academy of Sciences

New single-atom catalyst boosts reductive amination reaction

image: Schematic illustration of the preparation of Ru1/NC-T catalysts.

Image: 
DICP

The geometric isolation of metal species in single-atom catalysts (SACs) not only maximizes atomic utilization efficiency, but also endows SACs with unique selectivity in a variety of transformations.

The coordination environment of the isolated metal atoms in SACs determines their catalytic performance. However, it remains challenging to modulate the coordination structure while still maintaining single-atom dispersion.

Recently, a research group led by Prof. ZHANG Tao and Prof. WANG Aiqin from the Dalian Institute of Chemical Physics of the Chinese Academy of Sciences fabricated a Ru1/NC SAC with good catalytic activity, selectivity and stability in the reductive amination of biomass-derived aldehydes/ketones to primary amines, and elucidated the structure-performance relationships at the atomic/molecular level.

This study was published in Nature Communications on June 2.

The researchers prepared a series of Ru1/NC SACs for the target reaction and discovered that both catalytic activity and selectivity increased as the Ru-N coordination number decreased. In particular, the Ru1/NC SAC with a Ru1-N3 moiety offered the best catalytic performance, far superior to previously reported nanoparticulate and homogeneous Ru catalysts.

Moreover, Ru1/NC presented excellent durability against poisoning by CO or sulfur-containing compounds, and the single-atom dispersion was well maintained even after reduction under extreme conditions.

Credit: 
Dalian Institute of Chemical Physics, Chinese Academy of Sciences

GMRT measures the atomic hydrogen gas mass in galaxies 9 billion years ago

image: The spectrum of the detected GMRT 21 cm signal (left panel) and an image (right panel).

Image: 
Aditya Chowdhury

A team of astronomers from the National Centre for Radio Astrophysics (NCRA-TIFR) in Pune and the Raman Research Institute (RRI) in Bangalore has used the Giant Metrewave Radio Telescope (GMRT) to measure the atomic hydrogen gas content of galaxies 9 billion years ago, in the young universe. This is the earliest epoch in the universe for which there is a measurement of the atomic hydrogen content of galaxies. The new result is a crucial confirmation of the group's earlier result, in which they measured the atomic hydrogen content of galaxies 8 billion years ago, and it pushes our understanding of galaxies to even earlier in the universe. The new research has been published in the 2 June 2021 issue of The Astrophysical Journal Letters.

Galaxies consist mostly of gas and stars, with new stars forming from the existing gas over the life of a galaxy. Stars formed much more frequently when the universe was young than they do today. Astronomers have known for more than two decades that star formation activity in galaxies peaked about 8-10 billion years ago and has declined steadily since. Until recently, the cause of this decline was unknown, mostly because there was no information about the amount of atomic hydrogen gas, the main fuel for star formation, in galaxies at these early times. This changed last year, when a team of astronomers from NCRA and RRI, including some of the authors of the present study, used the upgraded GMRT to obtain the first measurement of the atomic hydrogen gas content of galaxies about 8 billion years ago, when the star-formation activity of the universe began to decline. They found that the likely cause of the decline in star formation is that galaxies were running out of fuel.

Aditya Chowdhury, a Ph.D. student at NCRA-TIFR, and the lead author of both the new study and the 2020 one, said, "Our new results are for galaxies at even earlier times, but still towards the end of the epoch of maximum star-formation activity. We find that galaxies 9 billion years ago were rich in atomic gas, with nearly three times as much mass in atomic gas as in stars! This is very different from galaxies today like the Milky Way, where the gas mass is nearly ten times smaller than the mass in stars."

The measurement of the atomic hydrogen gas mass was done by using the GMRT to search for a spectral line of atomic hydrogen, which can only be detected with radio telescopes. Unfortunately, this "21 cm" signal is very weak, and hence nearly impossible to detect from individual galaxies at such large distances, around 30 billion light years from us, even with powerful telescopes like the GMRT. The team hence used a technique called "stacking" to improve the sensitivity. This allowed them to measure the average gas content of nearly 3,000 galaxies by combining their 21 cm signals.
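Conceptually, stacking works like this: each galaxy's known optical redshift says where its 21 cm line should appear, so every spectrum is shifted to that galaxy's rest frame and the aligned spectra are averaged; the noise then drops roughly as the square root of the number of galaxies, while the common signal survives. The sketch below illustrates the idea with synthetic numbers (hypothetical arrays and values, not the team's actual pipeline):

```python
import numpy as np

# Minimal sketch of 21 cm spectral stacking (illustrative only; not the
# NCRA/RRI pipeline). Assumes each galaxy's spectrum has already been
# extracted and resampled onto a common rest-frame velocity grid, so the
# expected HI signal sits at the same channel in every spectrum.

rng = np.random.default_rng(0)
n_gal, n_chan = 3000, 200                     # ~3000 galaxies, 200 channels
velocity = np.linspace(-1000, 1000, n_chan)   # km/s, rest frame

# Fake data: each spectrum is noise plus the same weak HI line.
noise = rng.normal(0.0, 1.0, size=(n_gal, n_chan))
signal = 0.05 * np.exp(-0.5 * (velocity / 150.0) ** 2)  # buried in noise
spectra = noise + signal

# Stack: average the aligned spectra. Noise falls as ~1/sqrt(n_gal),
# so the weak line emerges in the mean spectrum.
stacked = spectra.mean(axis=0)
print(f"Peak of stacked spectrum: {stacked[n_chan // 2]:.3f} "
      f"(injected line peak ~{signal[n_chan // 2]:.3f})")
```

With roughly 3,000 spectra, the noise in the average is about 55 times lower than in any single spectrum, which is what makes the otherwise invisible line measurable.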

"The observations of our study were carried out around 5 years ago, before the GMRT was upgraded in 2018. We used the original receivers and electronics chain of the GMRT before its upgrade, which had a narrow bandwidth. We could hence cover only a limited number of galaxies; this is why our current study covers 3000 galaxies, compared to the 8,000 galaxies of our 2020 study with the wider bandwidth of the upgraded GMRT.", said Nissim Kanekar of NCRA-TIFR, a co-author of the study.

Barnali Das, another Ph.D. student at NCRA-TIFR, added, "Although we had fewer galaxies, we increased our sensitivity by observing for longer, with nearly 400 hours of observations. The large volume of data meant that the analysis took a long time!"

"The star formation in these early galaxies is so intense that they would consume their atomic gas in just two billion years. And, if the galaxies could not acquire more gas, their star formation activity would decline, and finally cease.", said Chowdhury. "It thus appears likely that the cause of the declining star-formation in the Universe is simply that galaxies were not able to replenish their gas reservoirs after some epoch, probably because there wasn't enough gas available in their environments."

"Reproducibility is foundational to science! Last year, we reported the first measurement of the atomic gas content in such distant galaxies. With the present result, using a completely different set of receivers and electronics, we now have two independent measurements of the atomic gas mass in these early galaxies. This would have been hard to believe, even a few years ago!", said Kanekar.

K. S. Dwarakanath of RRI, a co-author of the study, said, "Detecting the 21 cm signal from distant galaxies was the main original goal of the GMRT, and it continues to be a key science driver for building even more powerful telescopes like the Square Kilometre Array. These results are extremely important for our understanding of galaxy evolution."

Credit: 
Tata Institute of Fundamental Research