
Optical superoscillation without side waves

image: A pair of moonlike sharp-edged apertures enables generation of a diffractively focused light spot smaller than the optical diffraction limit, while eliminating side waves along a symmetric cut.

Image: 
Yanwen Hu.

Optical superoscillation refers to a wave packet that can oscillate locally at a frequency exceeding its highest Fourier component. This intriguing phenomenon enables the production of extremely localized waves that can break the optical diffraction barrier. Indeed, superoscillation has proven to be an effective technique for overcoming the diffraction barrier in optical superresolution imaging. The trouble is that strong side lobes accompany the main lobes of superoscillatory waves, which limits the field of view and hinders practical application.
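
The defining property can be made concrete with a standard textbook construction (shown here purely for illustration; it is not the field generated in the study):

```latex
% A canonical band-limited superoscillatory function (a standard
% illustration, not the field used in the study), for large integer N
% and a constant a > 1:
\[
  f(x) \;=\; \left(\cos\tfrac{x}{N} + i\,a\,\sin\tfrac{x}{N}\right)^{N}
\]
% Expanding the power in terms of e^{\pm ix/N} shows that f is a sum of
% harmonics e^{ikx} with |k| <= 1, so its highest Fourier component is 1.
% Yet near x = 0, f(x) \approx (1 + i\,a\,x/N)^N \to e^{iax}: the wave
% locally oscillates at frequency a > 1, faster than any component in
% its spectrum -- which is precisely superoscillation.
```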

There are also tradeoffs between the main lobe and the side lobes of a superoscillatory wave packet: reducing the superoscillatory feature size of the main lobe comes at the cost of enlarging the side lobes. This happens mainly because superoscillation is a local phenomenon, while the wave packet as a whole remains wider than the optical diffraction limit.

Significant optical superoscillation can be achieved with structural masks that precisely engineer the interference of diffracted light fields emitted from complex nanostructures. But structural masks require optimization and complex fabrication, and the resulting light field is still limited by high-intensity side lobes. Producing superoscillatory waves with a small feature size while maintaining a large field of view remained challenging until now.

As reported in Advanced Photonics, researchers from Jinan University, Guangzhou, China, recently developed a way to eliminate, to some extent, the tradeoffs involved in superoscillatory wave packets. They demonstrate, both experimentally and theoretically, generation of superoscillatory light spots without side lobes.

A central microdisc with cylindrical diffraction gives rise to a superoscillatory light spot of a size within the optical diffraction limit. A pair of sharp-edged apertures ensures constructive interference with the high-spatial-frequency waves. That interference effectively eliminates side lobes along a symmetric cut that can be adjusted in the transverse plane by rotating the moonlike apertures.

According to Yanwen Hu, a doctoral student working under the supervision of Zhenqiang Chen in the Department of Optoelectronic Engineering at Jinan University, "Due to its easy design, based on clear physics, the sharp-edged aperture is a promising candidate for realization of superoscillatory waves."

Hu explains further that the cylindrical diffraction of the central microdisc produces superoscillatory waves with Bessel-like forms. These forms allow the delicate structures of the superoscillatory waves to propagate in free space far beyond the reach of evanescent light waves. According to Hu, this intriguing propagation effect of superoscillation holds promise for applications in nanoparticle manipulation, as well as superresolution imaging.

Credit: 
SPIE--International Society for Optics and Photonics

Precision medicine becomes more accessible for Australians with cancer

image: Pancreatic cancer cell microscopy image.

Image: 
Garvan Institute of Medical Research

A new resource developed at the Garvan Institute of Medical Research and The Kinghorn Cancer Centre for oncologists could help make targeted cancer therapies more accessible for Australian patients.

The TOPOGRAPH (Therapy-Oriented Precision Oncology Guidelines for Recommending Anti-cancer Pharmaceuticals) database is an online tool that catalogues oncology research to streamline the process of recommending therapeutic treatments in precision cancer medicine.

Garvan Senior Research Officer Dr Frank Lin led the development of the platform reported this week in the journal npj Precision Oncology.

"TOPOGRAPH is uniquely useful in the Australian context because it combines up-to-date information on treatments approved for use in Australia in both clinical and trial settings," Dr Lin says. "This tool was designed to systematically organise the vast amount of data from clinical trials and regulatory authorities into an accessible, easy to use platform for oncologists to maximise the therapeutic benefit to patients."

Bringing the data together

While several resources exist that interpret the potential therapeutic significance of genomic variations and other biomarkers in cancer, a number of factors such as government subsidies and approvals by national regulators can limit access to treatments.

Dr Lin says that in Australia, there was a strong need for a "pragmatic, evidence-based, context-adapted tool to guide clinical management based on molecular biomarkers".

"We designed this tool because there's no good alternatives in Australia to help oncologists sieve through potential treatment options when facing a complex genomic report."

The TOPOGRAPH team, which included researchers from the Garvan Institute, The Kinghorn Cancer Centre, St Vincent's Hospital, Australian Genomic Cancer Medicine Program (Omico), UNSW Sydney, The University of Sydney and the NHMRC Clinical Trials Centre, conducted a comprehensive literature review and appraisal to develop a database comprising 211 predictive biomarkers, 117 cancer types and more than 400 therapies.

Oncologists can look up any of these parameters on the platform, as well as combinations of biomarkers, cancer types, and therapies to view information tailored for patients with advanced cancer.

Therapies are organised into tiers according to how effective they have been shown to be for a given cancer based on key biomarkers, whether they are approved for use in Australia, and whether their cost can be subsidised through the Pharmaceutical Benefits Scheme (PBS).
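
As a toy illustration of how such a tiered lookup could be organised in software, consider the sketch below; the schema, field names, tier labels, and the single example entry are invented for demonstration and do not reflect TOPOGRAPH's actual data model or contents:

```python
# Toy sketch of a tiered therapy lookup in the spirit of TOPOGRAPH.
# Everything here is hypothetical; it is not the real database schema
# and is not clinical guidance.
from dataclasses import dataclass

@dataclass
class TherapyEntry:
    biomarker: str        # predictive biomarker, e.g. a gene variant
    cancer_type: str
    therapy: str
    evidence_tier: str    # strength of evidence for this biomarker/cancer pair
    tga_approved: bool    # approved by the Therapeutic Goods Administration
    pbs_subsidised: bool  # subsidised under the Pharmaceutical Benefits Scheme

ENTRIES = [
    # Hypothetical placeholder entry.
    TherapyEntry("BIOMARKER_X", "CANCER_TYPE_Y", "THERAPY_Z", "1", True, False),
]

def lookup(biomarker: str, cancer_type: str) -> list[TherapyEntry]:
    """Return matching therapies, highest evidence tier first."""
    hits = [e for e in ENTRIES
            if e.biomarker == biomarker and e.cancer_type == cancer_type]
    return sorted(hits, key=lambda e: e.evidence_tier)

print(lookup("BIOMARKER_X", "CANCER_TYPE_Y"))
```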

Dr Subo Thavaneswaran, Medical Oncologist at The Kinghorn Cancer Centre and Garvan researcher, says the database is already proving useful in a clinical setting.

"Applying TOPOGRAPH to our Molecular Tumour Board recommendations at The Kinghorn Cancer Centre gives us greater confidence in their consistency, evidence-base and understanding of accessibility to therapies in the Australian context."

Emerging research

One of the reasons TOPOGRAPH was built was to keep oncologists up to date with the latest research and treatment approvals. The researchers plan to update TOPOGRAPH with treatments and therapies as they emerge and undergo assessment by the Therapeutic Goods Administration.

Professor David Thomas, senior author of the paper and Head of the Genomic Cancer Medicine Laboratory at Garvan, Director of The Kinghorn Cancer Centre and CEO of Omico, says that TOPOGRAPH could also be expanded to other jurisdictions by adjusting the platform's tier system.

"While this paper describes the use of TOPOGRAPH in the Australian context, our approach can be applied as a framework to other jurisdictions and the guidelines of different regulatory bodies. From a global oncology perspective, comparing tiered therapies between countries may help identify differences in equity of access by highlighting the disparity in drug utilisation compared to scientific advances in cancer therapeutics," he says.

"In addition, there is a potential role for TOPOGRAPH to support translational research, by informing the design of new correlative studies to explore more precise biomarkers for targeted therapies."

Credit: 
Garvan Institute of Medical Research

Bourneville's tuberous sclerosis: everything unfolds in the brain shortly after birth

A Canadian research team has uncovered a new mechanism involved in Bourneville tuberous sclerosis (BTS), a genetic disease of childhood. The team hypothesizes that a mutation in the TSC1 gene causes neurodevelopmental disorders that develop in conjunction with the disease.

Seen in one in 6,000 children, tuberous sclerosis causes benign tumours or lesions that can affect various organs such as the brain, kidneys, eyes, heart and skin. While some patients lead healthy lives, others have significant comorbidities, such as epilepsy, autism and learning disabilities.

Although the role that the TSC1 gene plays in the disease is already known, Montreal scientists have only now identified a critical period in the postnatal development of GABAergic interneurons that are so important to the development of the brain.

The results of their study are reported today in Nature Communications.

An essential 'pathway'

All mammalian cells, and the proteins that form them, need a 'pathway' to regulate their individual growth, which scientists call a 'signaling pathway,' explained Clara A. Amegandjin, a doctoral student in neurosciences at Université de Montréal and first author of the new study.

"The signaling pathway of mTOR (mechanistic target of rapamycin) controls several aspects of the development of brain cells - the neurons - by regulating different metabolic processes: the proliferation, growth and mobility of neurons, as well as the biosynthesis and transcription of their proteins," she said.

"The pathway is therefore pivotal in ensuring the development of neurons in an ideal environment."

When the mTOR signaling pathway is disrupted, certain diseases such as type-2 diabetes, obesity, neurodegeneration and cancer can occur.

"A mutation in the negative regulator of the TSC1 gene of the mTOR pathway is known to produce hyperactivation of the signaling pathway, resulting in abnormal cell proliferation," said UdeM neurosciences professor Graziella Di Cristo, a researcher at CHU Sainte-Justine children's hospital.

"This disruption is responsible for neurodevelopmental disorders associated with autism, intellectual disability and epilepsy in tuberous sclerosis, but the underlying mechanisms were not well understood," said Di Cristo, who oversaw the study.

A conductor that can't keep time

Di Cristo's laboratory specializes in the study of GABAergic interneurons. These neurons act as conductors in the cortex, controlling the dynamics of the neural networks and circuits that regulate brain function, and they are of critical importance for brain development.

"Our original hypothesis was to see if this mutation in the mTOR pathway affected the development of GABAergic cells," said Amegandjin. "In many cases of autism, these cells are deregulated. However, in tuberous sclerosis, few studies have examined their involvement in the expression of neurological comorbidities."

Using an organotypic culture that mimics brain development (growth, maturation, and stabilization) ex vivo, the research team introduced the TSC1 gene mutation into GABAergic cells of mice at specific periods during their brain development.

Using biomarkers, the researchers found early and very rapid proliferation occurring in the growth phase of the mutated cells. Synaptic connections that form too quickly become 'defective' once they mature.

"We therefore have evidence that neurodevelopmental disorders are mediated by hyperactivity of the mTOR pathway caused by the absence of the TSC1 gene," said Amegandjin.

Application in humans

Rapamycin is a drug whose mechanism of action is related to the inhibition of the mTOR protein.

"By administering this protein in preclinical models - in this case, mice - we are able to 'rescue' synaptic connections and prevent neurodevelopmental disorders," said Di Cristo. "Based on our results, this therapeutic approach would be most appropriate to prevent premature maturation of neurons."

However, she cautioned: "Since mTOR plays a very broad role in neuronal development, it is important to determine the exact timing of administration to avoid undesirable and possibly fatal results. We need to continue our research to confirm that these observations apply to humans."

Credit: 
University of Montreal

Long COVID symptoms likely caused by Epstein-Barr virus reactivation

image: The number of subjects reporting each of 13 clinical manifestations of long COVID.

Image: 
Jeffrey E. Gold, Ramazan A. Okyay, Warren E. Licht, and David J. Hurley

Epstein-Barr virus (EBV) reactivation resulting from the inflammatory response to coronavirus infection may be the cause of previously unexplained long COVID symptoms -- such as fatigue, brain fog, and rashes -- that occur in approximately 30% of patients after recovery from initial COVID-19 infection. The first evidence linking EBV reactivation to long COVID, as well as an analysis of long COVID prevalence, is outlined in a new long COVID study published in the journal Pathogens.

"We ran EBV antibody tests on recovered COVID-19 patients, comparing EBV reactivation rates of those with long COVID symptoms to those without long COVID symptoms," said lead study author Jeffrey E. Gold of World Organization. "The majority of those with long COVID symptoms were positive for EBV reactivation, yet only 10% of controls indicated reactivation."

The researchers began by surveying 185 randomly selected patients recovered from COVID-19 and found that 30.3% had long-term symptoms consistent with long COVID after initial recovery from SARS-CoV-2 infection. This included several patients with initially asymptomatic COVID-19 cases who later went on to develop long COVID symptoms.

The researchers then found, in a subset of 68 COVID-19 patients randomly selected from those surveyed, that 66.7% of long COVID subjects versus 10% of controls were positive for EBV reactivation based on positive EBV early antigen-diffuse (EA-D) IgG or EBV viral capsid antigen (VCA) IgM titers. The difference was statistically significant.

"We found similar rates of EBV reactivation in those who had long COVID symptoms for months, as in those with long COVID symptoms that began just weeks after testing positive for COVID-19," said coauthor David J. Hurley, PhD, a professor and molecular microbiologist at the University of Georgia. "This indicated to us that EBV reactivation likely occurs simultaneously or soon after COVID-19 infection."

The relationship between SARS-CoV-2 and EBV reactivation described in this study opens up new possibilities for long COVID diagnosis and treatment. The researchers indicated that it may be prudent to test patients newly positive for COVID-19 for evidence of EBV reactivation indicated by positive EBV EA-D IgG, EBV VCA IgM, or serum EBV DNA tests. If patients show signs of EBV reactivation, they can be treated early to reduce the intensity and duration of EBV replication, which may help inhibit the development of long COVID.

"As evidence mounts supporting a role for EBV reactivation in the clinical manifestation of acute COVID-19, this study further implicates EBV in the development of long COVID," said Lawrence S. Young, PhD, a virologist at the University of Warwick, and Editor-in-Chief of Pathogens. "If a direct role for EBV reactivation in long COVID is supported by further studies, this would provide opportunities to improve the rational diagnosis of this condition and to consider the therapeutic value of anti-herpesvirus agents such as ganciclovir."

Credit: 
World Organization

Influence of land use on soil erosion in European Russia for the last 30 years

image: Change in the total area of cultivated land (F) in the Russian Federation in 1970-2017. RSFSR--the Russian Soviet Federative Socialist Republic as the principal part of the former USSR until December 1991; Fav--the average area of cultivated land; ΔF--relative change in Fav between 1970-1991 and 2005-2017 with p-value (Student's t-test); R2--the coefficient of determination of a sixth-degree polynomial trend (1); 2--the years of the political and economic reform in the USSR, "Perestroika". In the sixth-degree polynomial equation, Fi is the modeled F-value for the year yi. Note: according to the five-year programs for the development of the socialist planned economy of the former USSR, the total area of cultivated land changed only slightly within each five-year period (...1971-1975, 1976-1980, 1981-1985, and 1986-1990).

Image: 
Kazan Federal University

Research Associate Artyom Gusarov studied a vast array of erosion data and concluded that soil erosion and river sediment load in European Russia have significantly decreased throughout the post-Soviet period.

"The decrease has been especially profound in the forest steppe, a part of which covers the Republic of Tatarstan, because of the combined influence of climate change and land cultivation," explains Gusarov. "To the north of the forest steppe, in the southern part of the boreal zone, the anthropogenic factor was the primary influence on the changes in soil erosion, at least in the east of the East European Plain. Here, the reduction of cultivated land was the biggest in the post-Soviet time. In the steppes, the primary role can be attributed to climate change, especially the warming of the near-soil air, which led to decreased frosting of soils during winters, and, therefore, decreased erosion-inducing sediment from tillage."

The research shows a complex interplay between seemingly negative socio-economic developments and environmental conditions.

"The recession of agriculture in contemporary Russia, including decreases in tillage areas, numbers of agricultural machines, livestock population, etc., led to decreased soil and ravine erosion in the region, decreased river sediment load and concomitant pollution," says Gusarov.

The results are very important for the comprehensive planning of soil preservation, hydrogeological construction, and artificial water bodies. Artyom Gusarov aims to continue this research, now moving to the northern part of the East European Plain and the rivers running into the Arctic Ocean.

Credit: 
Kazan Federal University

The origins of farming insects

image: The ambrosia beetle Crossoterasus externedentatus (Curculionidae: Platypodinae).

Image: 
Bjarte Jordal, University of Bergen (Norway)

A beetle bores into a tree trunk to build a gallery in the wood in order to protect its brood. As it digs the tunnel, it spreads ambrosia fungal spores that will feed the larvae. When the adult beetles bore into another tree, they become the transmission vectors of the fungal spores in a new habitat. This mutualism between insects and ambrosia fungi could be more than 100 million years old --older than was thought to date-- according to an article published in the journal Biological Reviews.

The study analyses for the first time the symbiotic associations and the coevolution between ambrosia fungi and beetles from a paleontological perspective using the Cretaceous fossil records of these biological groups. Among the authors of the study are the experts David Peris and Xavier Delclòs, from the Faculty of Earth Sciences and the Biodiversity Research Institute of the University of Barcelona (IRBio), and Bjarte Jordal, from the University of Bergen (Norway).

Beetles that grew fungi millions of years before human agriculture

Millions of years ago, some termites, ants and beetles developed the ability to grow fungi for food. This mutualism between insects and fungi --one of the most studied symbioses in nature-- is an evolutionary strategy analogous to the farming practised by the human species since the Neolithic revolution.

Understanding the origins of the symbiosis between insects and fungi is a field of interest in several scientific disciplines. Nowadays, the mutualism between ambrosia beetles and their symbiont fungi causes forest and crop plagues with serious ecological and economic losses. Yet "it remains unclear which ecological factors facilitated the origin of fungus farming and how it transformed into a symbiotic relationship with obligate dependency", notes David Peris, first author of the study.

When did the lineage of farming insects begin?

Phylogenetic studies have suggested that beetle fungiculture started more than 50 million years ago --before that of other insects-- and some studies date it back to 86 million years ago. "The symbiotic relationship between fungi and beetles probably originated more than 100 million years ago, during the early Cretaceous, in groups of beetles that had gone unnoticed", reveals David Peris.

As part of the study, the experts examined specimens of these biological groups from around the world captured in Cretaceous amber. The authors conclude that the origin of ambrosia fungi predates the main groups of fungus-growing beetles from the subfamilies Scolytinae and Platypodinae --family Curculionidae-- which now grow fungi in tree trunks.

"This suggests that these fungi used some other group of insects to spread millions of years ago", notes the researcher. Also, other beetle groups with a similar behaviour to ambrosia beetles --Bostrichidae and mostly Lymexylidae families-- present an older and abundant fossil record that would coincide with the emergence of ambrosia fungi, according to previous studies.

"The most interesting thing --he continues-- is that some studies note the ability to cultivate fungi in some of these current species".

Evolutionary convergence towards an obligate mutualism

The fungus-growing process starts when beetles colonize a new tree trunk or branch. During the Cretaceous, the abundance of fungi and wood-boring beetles facilitated the initial domestication of some groups of fungi. At first, the fungal spores were accidentally transported from tree to tree by the wood-boring beetles, "until this mutually beneficial association evolved towards a more intimate symbiosis in which fungi were inoculated into a tree, the fungal mycelia grew, and beetle larvae fed on the fungus", notes Bjarte Jordal.

This set of factors, together with the symbionts' high capacity to adapt and change, eased the morphological and ecological adaptations of biological groups that converged in an obligate mutualism. That is, a symbiotic relationship between insects and fungi, beneficial to both, that persists to this day.

"However, we need more studies on the knowledge of the ecology of the species from the Lymexylidae and Bostrichidae families to get more specific conclusions. Therefore, the discovery of new fossils in cretaceous amber of these groups will certainly help us to better understand the evolutionary history of this symbiotic relationship that still exists nowadays", concludes Professor Xavier Delclòs.

Credit: 
University of Barcelona

Machine learning aids earthquake risk prediction

image: Sink holes and liquefaction on roads in Christchurch, New Zealand after 2011 earthquake.

Image: 
Martin Luff, CC BY-SA 2.0, via Wikimedia Commons

Our homes and offices are only as solid as the ground beneath them. When that solid ground turns to liquid -- as sometimes happens during earthquakes -- it can topple buildings and bridges. This phenomenon is known as liquefaction, and it was a major feature of the 2011 earthquake in Christchurch, New Zealand, a magnitude 6.3 quake that killed 185 people and destroyed thousands of homes.

An upside of the Christchurch quake was that it was one of the best-documented earthquakes in history. Because New Zealand is seismically active, the city was instrumented with numerous sensors for monitoring earthquakes. Post-event reconnaissance provided a wealth of additional data on how the soil responded across the city.

"It's an enormous amount of data for our field," said post-doctoral researcher, Maria Giovanna Durante, a Marie Sklodowska Curie Fellow previously of The University of Texas at Austin (UT Austin). "We said, 'If we have thousands of data points, maybe we can find a trend.'"

Durante works with Prof. Ellen Rathje, Janet S. Cockrell Centennial Chair in Engineering at UT Austin and the principal investigator for the National Science Foundation-funded DesignSafe cyberinfrastructure, which supports research across the natural hazards community. Rathje's personal research on liquefaction led her to study the Christchurch event. She had been thinking about ways to incorporate machine learning into her research and this case seemed like a great place to start.

"For some time, I had been impressed with how machine learning was being incorporated into other fields, but it seemed we never had enough data in geotechnical engineering to utilize these methods," Rathje said. "However, when I saw the liquefaction data coming out of New Zealand, I knew we had a unique opportunity to finally apply AI techniques to our field."

The two researchers developed a machine learning model that predicted the amount of lateral movement that occurred when the Christchurch earthquake caused soil to lose its strength and shift relative to its surroundings.

The results were published online in Earthquake Spectra in April 2021.

"It's one of the first machine learning studies in our area of geotechnical engineering," Durante said.

The researchers first used a Random Forest approach with a binary classification to forecast whether lateral spreading movements occurred at a specific location. They then applied a multiclass classification approach to predict the amount of displacement, from none to more than 1 meter.

"We needed to put physics into our model and be able to recognize, understand, and visualize what the model does," Durante said. "For that reason, it was important to select specific input features that go with the phenomenon we study. We're not using the model as a black box-- we're trying to integrate our scientific knowledge as much as possible."

Durante and Rathje trained the model using data related to the peak ground shaking experienced (a trigger for liquefaction), the depth of the water table, the topographic slope, and other factors. In total, more than 7,000 data points from a small area of the city were used for training data -- a great improvement, as previous geotechnical machine learning studies had used only 200 data points.
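
As a rough sketch of how this two-stage Random Forest workflow might look in code (scikit-learn is assumed; the synthetic data, feature ranges, displacement bins, and hyperparameters are illustrative placeholders, not the study's actual configuration):

```python
# Illustrative sketch of the two-stage Random Forest approach described
# in the article. Data, bins, and hyperparameters are assumptions for
# demonstration only; they are not the study's values.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Placeholder features of the kind named in the article: peak ground
# shaking, depth to the water table, and topographic slope.
n = 7000
X = np.column_stack([
    rng.uniform(0.1, 0.8, n),   # peak ground shaking (g)
    rng.uniform(0.5, 10.0, n),  # water-table depth (m)
    rng.uniform(0.0, 5.0, n),   # topographic slope (%)
])
displacement = rng.uniform(0.0, 1.5, n)  # synthetic lateral spreading (m)

# Stage 1: binary classification -- did lateral spreading occur?
y_binary = (displacement > 0.05).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y_binary, random_state=0)
occurrence_model = RandomForestClassifier(n_estimators=200, random_state=0)
occurrence_model.fit(X_tr, y_tr)
print("occurrence accuracy:",
      accuracy_score(y_te, occurrence_model.predict(X_te)))

# Stage 2: multiclass classification -- how much displacement, binned
# from "none" to "more than 1 m" (bin edges are illustrative).
y_class = np.digitize(displacement, bins=[0.05, 0.3, 1.0])
X_tr, X_te, y_tr, y_te = train_test_split(X, y_class, random_state=0)
amount_model = RandomForestClassifier(n_estimators=200, random_state=0)
amount_model.fit(X_tr, y_tr)
print("displacement-class accuracy:",
      accuracy_score(y_te, amount_model.predict(X_te)))
```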

They tested their model citywide on 2.5 million sites around the epicenter of the earthquake to determine the displacement. Their model predicted whether liquefaction occurred with 80% accuracy; it was 70% accurate at determining the amount of displacement.

The researchers used the Frontera supercomputer at the Texas Advanced Computing Center (TACC), one of the world's fastest, to train and test the model. TACC is a key partner on the DesignSafe project, providing computing resources, software, and storage to the natural hazards engineering community.

Access to Frontera provided Durante and Rathje machine learning capabilities on a scale previously unavailable to the field. Deriving the final machine learning model required testing 2,400 possible models.
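
Sweeping thousands of candidate models is typically a cross-validated hyperparameter search; a hedged sketch of what such a sweep could look like follows (the grid below is an assumption chosen to yield a comparable number of fits, not the study's actual search space):

```python
# Sketch of a parallel hyperparameter sweep like the one described above;
# the parameter grid is illustrative, not the study's.
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

param_grid = {
    "n_estimators": [100, 200, 400, 800],
    "max_depth": [4, 8, 16, 32, None],
    "max_features": ["sqrt", "log2", None],
    "min_samples_leaf": [1, 5, 10, 20],
}
# 4 * 5 * 3 * 4 = 240 candidate models; with 10-fold cross-validation
# that is 2,400 fits -- the scale of sweep that motivates HPC resources.
search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid,
    cv=10,
    scoring="accuracy",
    n_jobs=-1,  # spread fits across available cores
)
# search.fit(X_tr, y_tr)   # reuse the training split from the sketch above
# print(search.best_params_)
```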

"It would have taken years to do this research anywhere else," Durante said. "If you want to run a parametric study, or do a comprehensive analysis, you need to have computational power."

She hopes their machine learning liquefaction models will one day direct first-responders to the most urgent needs in the aftermath of an earthquake. "Emergency crews need guidance on what areas, and what structures, may be most at risk of collapse and focus their attention there," she said.

Sharing, Reproducibility, and Access

For Rathje, Durante, and a growing number of natural hazard engineers, a journal publication is not the only result of a research project. They also publish all of their data, models, and methods to the DesignSafe portal, a hub for research related to the impact of hurricanes, earthquakes, tsunamis, and other natural hazards on the built and natural environment.

"We did everything on the project in the DesignSafe portal," Durante said. "All the maps were made using QGIS, a mapping tool available on DesignSafe, using my computer as a way to connect to the cyberinfrastructure."

For their machine learning liquefaction model, they created a Jupyter notebook -- an interactive, web-based document that includes the dataset, code, and analyses. The notebook allows other scholars to reproduce the team's findings interactively, and test the machine learning model with their own data.

"It was important to us to make the materials available and make it reproducible," Durante said. "We want the whole community to move forward with these methods."

This new paradigm of data-sharing and collaboration is central to DesignSafe and helps the field progress more quickly, according to Joy Pauschke, program director in NSF's Directorate for Engineering.

"Researchers are beginning to use AI methods with natural hazards research data, with exciting results," Pauschke said. "Adding machine learning tools to DesignSafe's data and other resources will lead to new insights and help speed advances that can improve disaster resilience."

Advances in machine learning require rich datasets, precisely like the data from the Christchurch earthquake. "All of the information about the Christchurch event was available on a website," Durante said. "That's not so common in our community, and without that, this study would not have been possible."

Advances also require high-performance computing systems to test out new approaches and apply them to new fields.

The researchers continue to refine the machine learning model for liquefaction. Further research, they say, is needed to develop machine learning models that are generalizable to other earthquake events and geologic settings.

Durante, who returned to her native Italy this year, says one thing she hopes to take back from the U.S. is the ability for research to impact public policy.

She cited a recent project working with Scott Brandenberg and Jonathan Stewart (University of California, Los Angeles) that developed a new methodology to determine whether a retaining wall would collapse during an earthquake. Less than three years after the beginning of their research, the recommended seismic provisions for new buildings and other structures in the U.S. included their methodology.

"I want my work to have an impact on everyday life," Durante said. "In the U.S., there is more of a direct connection between research and real life, and that's something that I would like to bring back home."

Credit: 
University of Texas at Austin, Texas Advanced Computing Center

Novel lncRNA, Caren, counteracts heart failure progression

image: Caren RNA is abundant in normal cardiomyocytes. It prevents deterioration of cardiac function by enhancing energy production through increased mitochondrial number and inhibiting the activation of the DNA damage response pathways. However, Caren RNA is reduced in cardiomyocytes that are exposed to aging and pressure stress. This leads to mitochondrial dysfunction and activation of the DNA damage response resulting in the development and exacerbation of heart failure.

Image: 
Professor Yuichi Oike

A research collaboration based at Kumamoto University (Japan) has identified a novel lncRNA, Caren, that is abundantly expressed in cardiomyocytes. They showed that it enhances energy production by increasing the number of mitochondria in cardiomyocytes, and inhibits activation of the ATM protein, a key player in the DNA damage response pathway that accelerates heart failure severity. Caren RNA in cardiomyocytes is reduced by aging and high blood pressure (hypertension), which can lead to heart failure, and markedly reduced in the hearts of heart failure patients. The researchers believe that activation of Caren in cardiomyocytes could lead to the development of new heart failure therapies.

Heart failure occurs when reduced pumping function (contraction and dilatation) of the heart muscle leaves it unable to pump enough blood to the body. It remains a disease with a poor prognosis, and the number of heart failure patients is increasing worldwide. In developed countries, the growing number of heart failure patients, especially among the elderly, is a major problem. Therefore, there is a need to develop effective new treatment strategies.

Energy production from mitochondria is essential for maintaining cardiac function. Aging and hypertension, which increase the chances of heart failure development, cause mitochondrial dysfunction in cardiomyocytes that results in reduced mitochondrial energy production and increased reactive oxygen species production. Reactive oxygen species cause DNA damage, which subsequently activates the DNA damage response resulting in the exacerbation of heart failure. Therefore, mitochondrial dysfunction and the activation of the DNA damage response have both attracted attention as a cause of heart failure.

A research group led by Professor Oike at Kumamoto University has identified a novel lncRNA abundantly expressed in mouse cardiomyocytes and named it Caren (cardiomyocyte-enriched noncoding transcript). They also found that the amount of Caren RNA in mouse cardiomyocytes is reduced by stress, which can lead to heart failure. Further analysis of the function of Caren in the mouse heart revealed that it inhibits the decline of cardiac pump function. The researchers thus suggested that aging and stress reduce the amount of Caren RNA in cardiomyocytes, thereby reducing its effects and promoting mitochondrial dysfunction and activation of the DNA damage response, leading to the development and worsening of heart failure.

The researchers then genetically engineered a non-pathogenic adeno-associated virus to selectively infect cardiomyocytes and express Caren. After infecting heart failure model mice with the virus, they found that the amount of Caren RNA in cardiomyocytes increased, the number of mitochondria increased, and activation of the DNA damage response was suppressed compared with mice infected with a control virus; as a result, the progression of heart failure was inhibited. The researchers also found that Caren RNA is present in human cardiomyocytes, and that its amount is inversely correlated with the expression level of heart failure marker genes in the heart tissue of heart failure patients (heart failure marker gene expression is high in heart tissues with low Caren RNA levels). Furthermore, they showed that decreasing the amount of human Caren RNA in cardiomyocytes generated from human iPS cells decreased the energy production capacity of mitochondria.

"Our research shows that increasing the amount of Caren RNA in cardiomyocytes can inhibit the onset and progression of heart failure, which we expect can be a strategy for developing new heart failure therapies," said Professor Oike. "In our in vivo mouse experiments, we found that Caren RNA supplement therapy using an adeno-associated virus was effective in counteracting heart failure progression. Now, we are going to verify whether human Caren has the same effect which could lead to the development of a new treatment for heart failure."

This research was published online in Nature Communications on 5 May 2021.

Credit: 
Kumamoto University

University of Minnesota Medical School report details the effects of COVID-19 on adolescent sexual health

MINNEAPOLIS/ST. PAUL (06/23/2021) -- A new report from the University of Minnesota Medical School's Healthy Youth Development - Prevention Research Center (HYD-PRC) highlights that Minnesota youth continue to contract sexually transmitted infections (STIs) at alarmingly high rates, despite the COVID-19 pandemic.

The 2021 Minnesota Adolescent Sexual Health Report says that chlamydia and gonorrhea rates among Minnesota adolescents in 2020 are likely underreported, as both STI testing and case detection were scaled back during the early stages of the pandemic. However, teen pregnancy rates remain virtually unchanged from 2018, and birth rates are at historic lows for 15- to 19-year-olds.

"We must continue to highlight the importance of condoms and other barrier methods, utilize new and innovative public health educational campaigns, address STI testing shortages and expand young people's access to STI screening and treatment," said Jill Farris, MPH, director of Adolescent Sexual Health Training and Education for the HYD-PRC at the U of M Medical School.

Minnesota youth are disproportionately impacted by STIs, with the highest chlamydia and gonorrhea rates among Black and Hispanic youth. While adolescents aged 15 to 19 are only 6.5% of Minnesota's population, they accounted for 25% of all chlamydia cases and 16% of gonorrhea cases in 2020.

The report also details that disparities in sexual health outcomes -- by geography, race and ethnicity -- continue to persist. Rural areas in Minnesota continue to experience the highest teen birth rates in the state. Birth rates for American Indian, Black and Hispanic youth are higher than for white youth, and birth rates for Asian/Pacific Islander and American Indian youth are higher than national figures.

"COVID-19 may also play into the rates of STIs and pregnancy," Farris said. "While we won't know the full impact of COVID-19 on the sexual health of adolescents for a few years, we do know that providers made extraordinary efforts to reach out and connect with youth during this difficult time."

Adolescent sexual health clinicians and educators utilized telehealth/virtual learning last year due to COVID-19. The HYD-PRC surveyed 94 organizations that provided adolescent sexual health care, sex education or both during the pandemic. Organizations reported on their use of telehealth/virtual education and shared that online and hybrid options for health care and sex education need to continue after the pandemic to improve health equity among adolescents in Minnesota.

"We are excited to analyze the data over the next few years to determine if virtual platforms increased the reach of sexual health education and services, which, in turn, hopefully continues to decrease the rates of STIs and pregnancies among Minnesota youth," Farris said.

While many programs and services focus on changing individual behaviors that lead to teen pregnancy, more attention is now being focused on the social determinants that contribute to poor health outcomes through systematic lack of access to resources, power and opportunity.

"We must fully support young people's health by addressing their physical, social, emotional and cognitive development and provide them with skills and support to make healthy decisions," Farris said. "Continued focus on reducing systematic barriers and continuing to provide online and hybrid options for health care and sex education will empower Minnesota youth to make healthy choices."

Credit: 
University of Minnesota Medical School

Cat-borne parasite Toxoplasma induces fatally bold behavior in hyena cubs

image: A spotted hyena cub in Kenya's Masai Mara National Reserve.

Image: 
Zach Laubach

Best known for its presence in house cats and a tendency to infect and alter the behaviors of rodents and humans, the parasite Toxoplasma gondii (T. gondii) is also associated with bold behavior among wild hyena cubs and risk of death during interactions with lions, finds new research from the University of Colorado Boulder.

The findings, published this week in Nature Communications, reinforce previous research which has found the parasite can prompt profound behavioral changes in its hosts, and potentially in the 2 billion people worldwide estimated to be infected by it. While T. gondii has been well studied in laboratory settings with humans and wild-caught rodents, this is one of the first studies to examine how the parasite affects wild host behavior during interactions with wild cats.

The research uses a rich data set from more than three decades of continuous field research in the Maasai Mara National Reserve in Kenya. It reveals that hyena cubs, but not subadults or adults, infected by T. gondii behave more boldly in the presence of lions, and that infected cubs have a greater risk of being killed by lions.

"This project is one of a handful of long-term continuous studies on a long-lived mammal," said Zach Laubach, co-lead author on the study and postdoctoral fellow in ecology and evolutionary biology. "Our findings suggest that infection early in life leads to bolder behavior and is particularly costly for young hyenas."

Multiple strains of T. gondii are found throughout the world, infecting warm-blooded animals--including humans who have house cats--at different life stages through contact with contaminated soil, drinking contaminated water, or eating the meat of infected animals. It can also be passed from mother to baby.

The researchers found that in the hyena populations they studied, infections are widespread but more common in older animals, meaning that it's most likely that they become infected by consuming contaminated meat or water.

Among cubs--hyenas up to one year old--the researchers found that infected animals are bolder, approaching lions from closer distances than uninfected cubs, and that infection corresponds to a higher probability of being killed by lions. In this study, lions were responsible for all deaths among infected cubs, while only 17% of uninfected cubs died before the age of one due to lion attacks.

This is a vulnerable time in the life cycle of spotted hyenas, who, despite being proficient hunters, take a long time to develop and rely on support and protection from their mothers, according to Laubach.

"Hyena moms invest a ton of both time and resources into their offspring. They nurse until they're about a year old and don't reach independence until they're about two or more years old," said Laubach. "But after they're one year old, we found no difference in how close they got to lions, regardless of infection status."

Some scientists theorize that this parasite manipulates its hosts' behavior (whether that host is a hyena or human) in order to get back to cats, where it can sexually reproduce. But the data from this study doesn't provide enough evidence to disentangle the theory supporting an adaptive mechanism for the parasite from other plausible alternative theories. It does, however, show that T. gondii has a direct and detrimental impact on hyena fitness.

Measuring the cost of confidence

Maasai Mara National Reserve is a biological hot spot in southwestern Kenya, between Lake Victoria and the bustling city of Nairobi. For more than 30 years now, the Mara Hyena Project, led by co-author Kay Holekamp, has been gathering data on the health and behavior of spotted hyenas in one of the best places in the world to study a diverse array of large, carnivorous mammals.

"It's really a one-of-a-kind system, especially for studying large carnivores in a natural setting, which is a rare opportunity," said Laubach. "I can't think of another place in the world where you can see the same numbers and diversity of species of large mammalian carnivores, it's pretty spectacular."

Laubach notes there are many misconceptions about hyenas. They're quite social and live in large groups, some with more than 120 individuals. They're also formidable in size and strength, at up to 170 pounds when full grown (twice the size of a large dog), and they have one of the strongest jaws in the animal kingdom.

"They can eat bone, their bite can crack the femur of a giraffe," said Laubach.

Yet hyenas and lions compete with one another for territory and food, and these interactions with lions are the leading natural cause of hyena injuries and mortality.

Because previous research has shown that T. gondii can impact behavior in animals in laboratory settings, what the researchers wanted to know was: How does T. gondii affect wild hyenas' behaviors around lions, and what are the consequences?

Laubach and co-lead author Eben Gering analyzed archived data collected by numerous research assistants and graduate students, who gathered blood samples and documented the hyenas' interactions with each other and with lions. From the safety of their vehicle, researchers Benson (Malit) Pioon and Holekamp of the Mara Hyena Project administered tranquilizers to 166 hyenas in order to draw and test their blood for the parasite. They found 108, or 65%, had been previously exposed.

Over the years, researchers spent many hours each morning and evening in the field, recording the behaviors of individual hyenas, which can be identified by the unique spot patterns on their coats. The data revealed that infection with the parasite T. gondii was related to boldness and greater risk of lion mortality among hyena cubs but not older animals. The lack of an effect in older animals could be because they have had time to learn, while cubs have less than one year of experience to compete with the influence of the parasite.

"One limitation of this work is that it was an observational study. But limitations are interesting, because it points to what one might do next," said Laubach. "We'd like to go back and tease apart how behaviors change in individuals by comparing how their behaviors differ before versus after infection."

Credit: 
University of Colorado at Boulder

NIH scientists describe 'multi-kingdom dialogue' between internal, external microbiota

image: The microbiome is comprised of microorganisms that live in and on us and contribute to human health and disease.

Image: 
NHGRI

WHAT:
National Institutes of Health scientists and their collaborators have identified an internal communication network in mammals that may regulate tissue repair and inflammation, providing new insights on how diseases such as obesity and inflammatory skin disorders develop. The new research is published in Cell.

The billions of organisms living on body surfaces such as the skin of mammals--collectively called microbiota--communicate with each other and the host immune system in a sophisticated network. According to the study, viruses integrated into the host genome--remnants of previous infections called endogenous retroviruses--can control how the host immune system and the microbiota interact, affecting tissue repair and antimicrobial defenses. Endogenous retroviruses can comprise up to 10% of the genome.

The newly discovered role of endogenous retroviruses adds to the scientific community's understanding of certain diseases and inflammatory states and opens new research avenues. "Together, our results support the idea that mammals may have co-opted their endogenous viromes as a means to communicate with their microbiota, resulting in a multi-kingdom dialogue that controls both immunity and inflammation," the authors state.

Scientists from NIH's National Institute of Allergy and Infectious Diseases led the project with collaborators from the NIH Center for Human Immunology, the National Cancer Institute, Stanford University and Scripps Research in California, University of Pennsylvania in Philadelphia, University of Oxford, and The Francis Crick Institute in England.

Building on a series of studies over the past decade showing that microbiota broadly promote immune protection, the NIAID scientists and collaborators sought to discover how this occurs. They used Staphylococcus epidermidis, a common skin bacterium with known helpful and harmful features, as a study model in laboratory and mouse experiments.

The models helped them identify the important roles of skin cells called keratinocytes and of endogenous retroviruses in communication between microbiota and the skin immune system. Keratinocytes are the primary interface between the host and its microbiota. Their study showed that S. epidermidis triggered an antiviral response in keratinocytes, and that finding led them to discover that endogenous retroviruses coordinate responses to the microbiota that stimulate the immune system.

The mouse model also showed that a high-fat diet triggers an inflammatory immune response to S. epidermidis that can be controlled by providing antiretroviral treatment, suggesting a role for endogenous retroviruses in driving inflammatory responses caused by microbes under high-fat conditions. The researchers will continue exploring how these ancient viruses control the beneficial role of the microbiota and how nutrition can change this dialogue toward pro-inflammatory responses.

ARTICLE:
D.S. Lima-Junior et al. Endogenous retroviruses promote homeostatic and inflammatory responses to the microbiota. Cell DOI: 10.1016/j.cell.2021.05.020 (2021).

WHO:
Yasmine Belkaid, Ph.D., chief of NIAID's Metaorganism Immunity Section in the Laboratory of Immune System Biology, is available to comment.

CONTACT:
To schedule interviews, please contact Ken Pekoc, (301) 402-1663, kpekoc@niaid.nih.gov.

Credit: 
NIH/National Institute of Allergy and Infectious Diseases

Melatonin in mice: there's more to this hormone than sleep

image: The new lab mice that naturally produce melatonin were able to enter a state of daily torpor.

Image: 
RIKEN

Researchers at the RIKEN Center for Brain Science and the RIKEN BioResource Research Center in Japan, along with collaborators at the State University of New York at Buffalo, have created a mouse model that allows the study of naturally occurring melatonin. Published in the Journal of Pineal Research, these first experiments using the new mice showed that natural melatonin was linked to a pre-hibernation state that allows mice to slow down their metabolism and survive when food is scarce, or temperatures are cold.

Melatonin is called "the hormone of darkness" because it's released by the brain in the dark, which usually means at night. It tells the body when it's dark outside so that the body can switch to 'night mode'. Although other hormones are easily studied in the laboratory, it has been difficult to study how the body reacts to melatonin because laboratory mice don't actually have any. To solve this problem, the researchers crossed laboratory mice with wild mice--which do produce melatonin--and bred new lab mice that can produce melatonin innately. This was a lot harder than it sounds and took over 10 mouse generations.

Once they had melatonin-producing lab mice, the researchers were able to study how the hormone affects entrainment--the alignment of the body clock with the outside world. Mice like to run on wheels regularly, and researchers can use this to measure entrainment after suddenly changing the light/dark cycle, which mimics sudden changes in time zones. Compared with regular lab mice, the mice with innate melatonin adapted their wheel-running times faster when darkness began six hours earlier, similar to 'east-bound jet lag'.

The researchers were also able to resolve a debate about whether life span is affected by melatonin, which has been hard to study because of the missing melatonin in lab mice. "Now we finally have an answer: endogenous melatonin has no life-extending effects," says Takaoki Kasahara, a senior author of the new study.

Despite many similarities, mice with innate melatonin differed from regular lab mice in several ways. The regular lab mice were heavier, had bigger reproductive organs, and were more successful at mating, producing more pups. On the other hand, melatonin-producing female mice were able to enter a state called daily torpor, a kind of low-power mode similar to hibernation that can last for a few hours a day. Daily torpor is a way for mice to deal with food scarcity and cold temperatures by conserving energy.

"There is an evolutionary advantage to producing melatonin, because it protects wild mice from losing weight when they can't find enough food. Lab mice, however, are typically given unlimited food and live in warm cages," Kasahara observes. "Our finding that mice lacking melatonin are more successful at reproducing can explain why lab mice lack melatonin. Over the years, by selecting for mice that reproduce the most pups, we might have also been inadvertently selecting for mice with lower and lower levels of melatonin."

Having shown that melatonin can affect circadian rhythms, the specially bred melatonin-proficient mice will be valuable for studying the detailed molecular and neural mechanisms of melatonin signaling on the circadian clock and sleep, as well as the effects of melatonin on immunity and bone formation. These relationships have been suggested, but have not yet been closely examined.

Further research on melatonin's relationship with daily torpor and hibernation is also important. Melatonin is necessary for seasonal reproduction in several animals, signaling the length of the night, which indicates the season. "This research could very well lead to a better understanding of seasonal affective disorder, or winter depression, in humans," says Kasahara. "Indeed, one of the newest antidepressants, agomelatine, activates melatonin receptors."

The study was authored by the following researchers:
Chongyang Zhang, Shannon J. Clough, Ekue B. Adamah-Biassi, Michele H. Sveinsson, Anthony J. Hutchinson, Ikuo Miura, Tamio Furuse, Shigeharu Wakana, Yui K. Matsumoto, Kazuo Okanoya, Randall L. Hudson, Tadafumi Kato, Margarita L. Dubocovich, and Takaoki Kasahara

Credit: 
RIKEN

Starchy snacks may increase CVD risk; fruits and veggies at certain meals decreases risk

DALLAS, June 23, 2021 — Can starchy snacks harm heart health? New research published today in the Journal of the American Heart Association, an open access journal of the American Heart Association, found that eating starchy snacks high in white potato or other starches after any meal was associated with at least a 50% increased risk of mortality and a 44-57% increased risk of CVD-related death. Conversely, eating fruits, vegetables or dairy at specific meals is associated with a reduced risk of death from cardiovascular disease, cancer or any cause.

“People are increasingly concerned about what they eat as well as when they eat,” said Ying Li, Ph.D., lead study author and professor in the department of nutrition and food hygiene at Harbin Medical University School of Public Health in Harbin, China. “Our team sought to better understand the effects different foods have when consumed at certain meals.”

Li and colleagues analyzed the results of 21,503 participants in the National Health and Nutrition Examination Survey (NHANES) from 2003 to 2014 in the U.S. to assess dietary patterns across all meals. Among the study population, 51% of participants were women and all participants were ages 30 or older at the start of the study. To determine patient outcomes, researchers used the U.S. Centers for Disease Control and Prevention’s National Death Index to note participants who died through December 31, 2015, due to CVD, cancer or any cause.

Researchers categorized participants’ dietary patterns by analyzing what types of food they ate at different meals. For the main meals, three main dietary patterns were identified for the morning meal: Western breakfast, starchy breakfast and fruit breakfast. Western lunch, vegetable lunch and fruit lunch were identified as the main dietary patterns for the mid-day meal. Western dinner, vegetable dinner and fruit dinner were identified as the main dietary patterns for the evening meal.

For snacks, grain snack, starchy snack, fruit snack and dairy snack were identified as the main snack patterns in between meals. Additionally, participants who did not fit into specific meal patterns were analyzed as a reference group. The researchers noted that the Western dietary pattern has higher proportions of fat and protein, which is similar to many North American meals.

Participants in the Western lunch group consumed the most servings of refined grain, solid fats, cheese, added sugars and cured meat. Participants in the fruit-based lunch group consumed the most servings of whole grain, fruits, yogurt and nuts. Participants in the vegetable-based dinner group consumed the most servings of dark vegetables, red and orange vegetables, tomatoes, other vegetables and legumes. Participants who consumed starchy snacks consumed the most servings of white potatoes.

According to their findings:

Eating a Western lunch (typically containing refined grains, cheese, cured meat) was associated with a 44% increased risk of CVD death;
Eating a fruit-based lunch was associated with a 34% reduced risk of CVD death;
Eating a vegetable-based dinner was associated with a 23% and 31% reduction in CVD and all-cause mortality, respectively; and
Consuming a snack high in starch after any meal was associated with a 50-52% increased risk of all-cause mortality and a 44-57% increased risk in CVD-related mortality.

“Our results revealed that the amount and the intake time of various types of foods are equally critical for maintaining optimal health,” said Li. “Future nutrition guidelines and interventional strategies could integrate optimal consumption times for foods across the day.”

Limitations of the study include that dietary data were self-reported by participants, which may introduce recall bias. In addition, although the researchers controlled for potential confounders, unmeasured confounding factors cannot be ruled out.

Credit: 
American Heart Association

Universal health care benefited colon cancer survival

PHILADELPHIA - Patients with colon cancer enrolled in the U.S. military's universal health care system experienced improved survival compared with patients in the general population, according to results published in Cancer Epidemiology, Biomarkers & Prevention, a journal of the American Association for Cancer Research.

"Colorectal cancer has the third highest death rate out of all cancers in the U.S. Therefore, it is highly important to improve survival of patients with colon cancer," said study author Craig D. Shriver, MD, FACS, FSSO, retired U.S. Army colonel and professor and director of the Murtha Cancer Center Research Program at Uniformed Services University of the Health Sciences in Bethesda, Maryland.

Previous research has shown that patients without health insurance or with Medicaid (the federal and state insurance program for low-income individuals) experience poorer survival from colon cancer than patients with private insurance. Little research has examined outcomes from the U.S. Military Health System (MHS), which provides universal health care for active-duty service members, retirees, National Guard members, and their family members.

To compare survival between patients with colon cancer in the Military Health System and those in the general population, the researchers evaluated data from the Department of Defense's Automated Central Tumor Registry (ACTUR), matching 11,907 ACTUR patients to 23,814 patients in the National Cancer Institute's Surveillance, Epidemiology, and End Results (SEER) database. All patients were diagnosed with colon cancer between January 1, 1987, and December 31, 2013.
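
The press release does not spell out the matching procedure, so the following pandas sketch of a 1:2 case-to-comparator match is illustrative only; the field names (age_group, sex, race, dx_year) are assumptions, not the study's actual variables.

```python
# Illustrative 1:2 exact matching of ACTUR cases to SEER comparators.
import pandas as pd

actur = pd.read_csv("actur.csv")  # hypothetical registry extracts
seer = pd.read_csv("seer.csv")

keys = ["age_group", "sex", "race", "dx_year"]
pool = seer.copy()
matched = []
for _, case in actur.iterrows():
    # SEER patients identical to this case on every matching key.
    candidates = pool[(pool[keys] == case[keys]).all(axis=1)]
    if len(candidates) >= 2:
        picks = candidates.head(2)      # take two comparators per case
        matched.append(picks)
        pool = pool.drop(picks.index)   # match without replacement
matched = pd.concat(matched)
print(f"matched {len(matched) // 2} cases 1:2 to {len(matched)} comparators")
```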

After a median follow-up time of 56 months for ACTUR patients and 49 months for SEER patients, the researchers found that the ACTUR patients with colon cancer had an 18 percent lower risk of death compared with the SEER patients. The lower risk of death was consistent across age groups, gender, race, and year of diagnosis.

The survival benefit tended to be greater for Black patients, who typically have poorer colon cancer survival rates than white Americans. Black patients in the ACTUR database were 26 percent less likely to die of colon cancer than those in the SEER database. Among white patients, the survival benefit for ACTUR patients was the same as in the overall study population; they were 18 percent less likely than the SEER patients to die of colon cancer. While this study did not aim to compare survival between racial groups within ACTUR, previous research found that in the Military Health System, older Blacks and whites with colon cancer had similar overall survival.

Shriver explained that access to health care has been implicated as one factor in the disparities in cancer survival between Black and white Americans, which may explain why the availability of universal health care through the Military Health System had a larger positive effect on the Black study population. "The survival benefit of Blacks in our study suggests that a universal health care system may be helpful to reduce racial disparity," Shriver said, adding that previous studies on lung cancer and glioma in the MHS showed similar success in reducing disparities.

The researchers also compared tumor stage at diagnosis, hoping to ascertain whether universal health care coverage resulted in earlier diagnosis. Colon cancer detected at early stages is more likely to be treated successfully.

The study showed that ACTUR patients were more likely than SEER patients to be diagnosed with stage 1 colon cancer (22.67 percent compared with 18.64 percent). ACTUR patients were less likely than the patients in the SEER database to be diagnosed with stage 4 colon cancer (18.74 percent compared with 21.63 percent).

"The Military Health System provides medical care with minimal or usually no financial barriers. Thus, our findings provide solid evidence of the benefits of access to universal health care," Shriver said. "What's more, when medical care is universally provided to all patients, racial disparity in colon cancer outcomes can be reduced."

Shriver noted that a limitation of the study is that the cancer registry data did not allow for a full comparison of all factors that could have affected survival among colon cancer patients. For example, the researchers could not compare treatment regimens, comorbidities, or quality of care.

Credit: 
American Association for Cancer Research

Low energy hydrogenation without hydrogen: Efficient catalysis in a stable emulsion gel

video: UJ researchers take a novel step to change hydrogenation into a safe, low-energy process. They use a very stable three-phase emulsion to transform a toxic waste product into valuable feedstock. The process does not need flammable, compressed hydrogen gas. It turns nitrobenzene efficiently into aniline at room temperature.

Image: 
Transmission Electron Microscope (TEM) images by Dele Peter Fapojuwo, University of Johannesburg. Video and narration by Therese van Wyk, University of Johannesburg.

UJ researchers take a novel step to change hydrogenation into a safe, low-energy process. They use a very stable three-phase emulsion to transform a toxic waste product into valuable feedstock. The process does not need flammable, compressed hydrogen gas.

The emulsion catalysis hydrogenates nitrobenzene efficiently at room temperature to produce aniline, which is widely used in the pharmaceutical industry. The bimetallic hydrogenation catalyst is fully recovered afterwards.

Without hydrogenation, it would not be possible to manufacture many of today's medicines. It is a backbone process for the pharmaceutical and chemical industries. But hydrogen is expensive, and the safety measures needed to prevent explosions in factories and laboratories are costly as well.

However, if compressed hydrogen is not needed at all, significant savings are possible. It also means that many chemical processes can be much safer and easier to work with.

Chemists from the University of Johannesburg have demonstrated this in research published in Colloids and Surfaces A: Physicochemical and Engineering Aspects (https://www.sciencedirect.com/science/article/abs/pii/S0927775721003824).

They converted nitrobenzene into aniline, using a catalyzed hydrogenation process in a Pickering emulsion.

The emulsion process has the potential to be a much safer industrial hydrogenation process than those currently in use.

"Pickering emulsions have been around for 150 years. But using them for catalysis only emerged in 2014," says Prof Reinout Meijboom. Meijboom is a researcher at the department of Chemical Sciences.

Like yoghurt, a Pickering emulsion is an emulsion: a mix of two liquids, such as oil and water, that do not dissolve in each other. What makes yoghurt a Pickering emulsion is that it also contains solid particles, in this case enzymes, which do not dissolve but instead sit at the droplet surfaces and stabilize the mixture.

Nitrobenzene is produced in huge amounts globally as waste from chemical manufacturing. It is a highly toxic, persistent organic pollutant, as documented by the WHO, EPA and CDC, among others.

The manufacture of polyurethanes uses nitrobenzene as an intermediate. It is also used as a solvent in petroleum refining. The wastewater of dye manufacturers often contains nitrobenzene. It is an oily liquid and presents a fire hazard.

Aniline is an industrially significant commodity. It is a feedstock for a vast number of chemical products, including many medicines.

The process the researchers designed uses toluene to dissolve nitrobenzene. This forms the first phase of the process: the organic, or toluene, phase. For the second, aqueous phase, they dissolved sodium borohydride in water.

The catalyst is the third phase in the process. It consists of modified silica microspheres and palladium. They also used a bimetallic catalyst, in which palladium is combined with cobalt or nickel.

If the three phases are added together, but not mixed into an emulsion, the combination can be stored for days or weeks, says Meijboom. A small amount of hydrogenation takes place, but the process only really gets going once a proper emulsion is formed.

The catalyst also acts as a stabilizing emulsifier.

When the three phases are mixed into an emulsion, the catalyst kick-starts the hydrogenation process. The formation of the emulsion takes a few seconds. The reaction takes about two hours at laboratory scale.

The hydrogen needed for hydrogenation is supplied by the dissolved sodium borohydride. Hydrogenation happens efficiently at room temperature, which saves energy.
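
For reference, the overall transformation follows the standard nitrobenzene hydrogenation stoichiometry, with the three equivalents of hydrogen delivered in situ by the borohydride rather than from a gas cylinder:

C6H5NO2 + 3 H2 → C6H5NH2 + 2 H2O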

There is no need for stored or piped hydrogen. This removes most of the explosion risk from the process.

The three-phase process in a Pickering emulsion has the advantage of a much bigger catalytic surface, compared with a single-phase or two-phase process, says Meijboom.

The catalytic efficiency can be tuned by adjusting the volume ratio of the toluene and water phases in the Pickering emulsion system.

"Each drop of toluene and nitrobenzene solution in the emulsion effectively becomes a microreactor. This is how the process can be tuned to be efficient at room temperature," he adds.

After hydrogenation is complete, the resulting emulsion is stable enough to be stored for a few days before the aniline is separated out.

The study is the first to demonstrate efficient use of a bimetallic palladium catalyst for the hydrogenation of an aromatic compound in a water-based Pickering emulsion system, says Dele Peter Fapojuwo, a postgraduate researcher in the department.

"By adding Nickel or Cobalt to the catalyst, we improved the dispersion of the Palladium on the surface of the emulsifier," he adds.

Palladium is much more expensive than nickel or cobalt, so the use of the bimetallic catalyst further reduces costs.

"The use of solid particles as both catalysts and emulsifier, or stabilizer, poses less threat to the environment compared to conventional surfactant. Their composition is less toxic," says Fapojuwo.

The reaction platform is significantly safer when using sodium borohydride as the reductant, rather than hydrogen, he adds.

"Hydrogenation using petroleum-derived hydrogen is neither totally environmentally-friendly nor economically viable. It requires high-pressure hydrogen, which demands expensive reactor equipment. That increases process costs," he adds.

"In theory, this process can be adapted to keeping one phase in a fixed-bed reactor and doing flow synthesis. The result would be a continuous process for catalytic reactions between two immiscible phases," says Meijboom.

"This is proof of principle phase. We're working towards generalizing the process," he says.

"We have designed a process that can be expanded to a range of industrially important reactions.

"By using emulsion chemistry, we have one system where catalyst, emulsifier, aqueous and organic phase all mix up into an extremely stable system," says Meijboom.

Credit: 
University of Johannesburg