Brain

Researchers capture first 3D super-resolution images in living mice

video: Researchers used their 3D-2PE-STED microscope to image the brain of a living mouse. Zooming in on part of a dendrite reveals the 3D structure of an individual spine.

Image: 
Joerg Bewersdorf, Yale School of Medicine

WASHINGTON -- Researchers have developed a new microscopy technique that can acquire 3D super-resolution images of subcellular structures from about 100 microns deep inside biological tissue, including the brain. By giving scientists a deeper view into the brain, the method could help reveal subtle changes that occur in neurons over time, during learning, or as a result of disease.

The new approach is an extension of stimulated emission depletion (STED) microscopy, a breakthrough technique that achieves nanoscale resolution by overcoming the traditional diffraction limit of optical microscopes. Stefan Hell won the 2014 Nobel Prize in Chemistry for developing this super-resolution imaging technique.

In Optica, The Optical Society's (OSA) journal for high impact research, the researchers describe how they used their new STED microscope to image, in super-resolution, the 3D structure of dendritic spines deep inside the brain of a living mouse. Dendritic spines are tiny protrusions on the dendritic branches of neurons; they receive synaptic inputs from neighboring neurons and play a crucial role in neuronal activity.

"Our microscope is the first instrument in the world to achieve 3D STED super-resolution deep inside a living animal," said leader of the research team Joerg Bewersdorf from Yale School of Medicine. "Such advances in deep-tissue imaging technology will allow researchers to directly visualize subcellular structures and dynamics in their native tissue environment," said Bewersdorf. "The ability to study cellular behavior in this way is critical to gaining a comprehensive understanding of biological phenomena for biomedical research as well as for pharmaceutical development."

Going deeper

Conventional STED microscopy is most often used to image cultured cell specimens. Using the technique to image thick tissue or living animals is a lot more challenging, especially when the super-resolution benefits of STED are extended to the third dimension for 3D-STED. This limitation occurs because thick and optically dense tissue prevents light from penetrating deeply and from focusing properly, thus impairing the super-resolution capabilities of the STED microscope.

To overcome this challenge, the researchers combined STED microscopy with two-photon excitation (2PE) and adaptive optics. "2PE enables imaging deeper in tissue by using near-infrared wavelengths rather than visible light," said Mary Grace M. Velasco, first author of the paper. "Infrared light is less susceptible to scattering and, therefore, is better able to penetrate deep into the tissue."

The researchers also added adaptive optics to their system. "The use of adaptive optics corrects distortions to the shape of light, i.e., the optical aberrations, that arise when imaging in and through tissue," said Velasco. "During imaging, the adaptive element modifies the light wavefront in the exact opposite way that the tissue in the specimen does. The aberrations from the adaptive element, therefore, cancel out the aberrations from the tissue, creating ideal imaging conditions that allow the STED super-resolution capabilities to be recovered in all three dimensions."

Seeing changes in the brain

The researchers tested their 3D-2PE-STED technique by first imaging well-characterized structures in cultured cells on a cover slip. Compared to using 2PE alone, 3D-2PE-STED resolved volumes more than 10 times smaller. They also showed that their microscope could resolve the distribution of DNA in the nucleus of mouse skin cells much better than a conventional two-photon microscope.

After these tests, the researchers used their 3D-2PE-STED microscope to image the brain of a living mouse. They zoomed in on part of a dendrite and resolved the 3D structure of individual spines. They then imaged the same area two days later and showed that the spine structure had indeed changed during this time. The researchers did not observe any changes in the structure of the neurons in their images or in the mice's behavior that would indicate damage from the imaging. However, they do plan to study this further.

"Dendritic spines are so small that without super-resolution it is difficult to visualize their exact 3D shape, let alone any changes to this shape over time," said Velasco. "3D-2PE-STED now provides the means to observe these changes and to do so not only in the superficial layers of the brain, but also deeper inside, where more of the interesting connections happen."

Credit: 
Optica

Researchers show how stem cell depletion leads to recurring pregnancy loss

image: Maria Diniz-da-Costa, PhD of Warwick Medical School, first author of the study.

Image: 
AlphaMed Press

Durham, NC - Depletion of a certain type of stem cell in the womb lining during pregnancy could be a significant factor behind miscarriage, according to a study released today in STEM CELLS. The study, by researchers at Warwick Medical School, University of Warwick, Coventry, England, reports on how recurrent pregnancy loss is a result of the loss of decidual precursor cells prior to conception.

"This raises the possibility that they can be harnessed to prevent pregnancy disorders," said corresponding author Jan J. Brosens, M.D., Ph.D., professor of obstetrics and gynecology at Warwick Medical School (WMS).

The womb lining - or endometrium - is a highly regenerative tissue capable of adopting different physiological states during the reproductive years. In the second half of the menstrual cycle when progesterone levels are high, the endometrium starts remodeling intensively, heralding the start of a short window during which an embryo can implant. Pregnancy depends on this transformation, a process called decidualization, which is driven by the differentiation of endometrial stromal cells into specialized decidual cells. These cells impart the plasticity needed for the tissue to accommodate an embryo's rapid growth.

"While the magnitude of tissue remodeling required for pregnancy makes it likely that poised progenitor and highly proliferative decidual precursor cells are critical for the formation of a robust maternal-fetal interface, the underlying mechanisms behind this are unclear," Dr. Brosens said.

The same team had recently described the presence of a discrete population of highly proliferative mesenchymal cells (hPMC) during the window of implantation. Mesenchymal stem/stromal cells can be isolated from bone marrow, adipose and other tissue sources, and can differentiate into a variety of cell types depending on the conditions of the culture they are grown in. In this latest study, the research team set out to characterize these hPMCs.

"Our findings indicate that hPMC are derived from circulating bone marrow-derived stem cells and recruited into the lining of the womb at the time of embryo implantation. These cells appear critical in pregnancy to accommodate the rapidly growing placenta." Dr. Brosens said. "We also found that these rare but highly specialist cells are depleted in the womb lining of women with recurrent pregnancy."

Siobhan Quenby, M.D., FRCOG, professor of obstetrics and Honorary Consultant at University Hospitals Coventry and Warwickshire and the University of Warwick, was part of the research team. "These are very exciting findings," she said. "We have already demonstrated that we can increase these highly proliferative cells in the lining of the womb before pregnancy. These new findings explain why these highly proliferative cells are so important for the prevention of miscarriage and possibly spontaneous preterm labor, two devastating pregnancy disorders that affect many women and couples all over the world."

Dr. Jan Nolta, Editor-in-Chief of STEM CELLS, said, "This key study begins to find answers to a very concerning problem in pregnancy disorders and gives insight into understanding factors that could contribute to pregnancy loss. We are very excited to be able to publish these important results."

The study involved recruitment of women attending the Tommy's National Miscarriage Research Centre at University Hospital Coventry and Warwickshire National Health Service Trust, England, and was supported by the Wellcome Trust.

Credit: 
AlphaMed Press

Researchers improve plant prime editing efficiency with optimized pegRNA designs

image: (a) Diagram of optimizing the PBS Tm; (b) Prime editing frequencies at different PBS Tm in rice; (c) Diagram of prime editing using the dual-pegRNA strategy; (d) Comparison of the dual-pegRNA strategy with either pegRNA alone; (e) Schematic representation of using PlantPegDesigner to design pegRNAs.

Image: 
IGDB

Precision genome editing enables the precise modification of DNA in living cells, opening a breadth of opportunities for plant breeding. Prime editors, developed by Prof. David R. Liu and his colleagues, permit the installation of desired edits at a programmable target site. They consist of an engineered Cas9 nickase (H840A)-reverse transcriptase (RT) fusion protein and a prime editing guide RNA (pegRNA).

Prof. GAO Caixia of the Institute of Genetics and Developmental Biology (IGDB) of the Chinese Academy of Sciences, together with her research team and collaborators, previously developed and optimized prime editors as an extremely versatile editing strategy for generating programmable point mutations, insertions and deletions in rice and wheat.

They found that the editing efficiency of the plant prime editor was strongly affected by the PBS and RT template sequence, suggesting the need for optimized pegRNA designs to yield higher product conversions.

To determine principles for efficient prime editing, Prof. GAO and Prof. LI Jiayang, also of IGDB, along with their research teams, reported optimized pegRNA design strategies that maximize plant prime editing efficiency.

Since the hybridization of the primer binding site (PBS) with the non-target strand ssDNA is the initial step in reverse transcription, the researchers hypothesized that the melting temperature (Tm) of the PBS sequence (referred to hereafter as PBS Tm) is an important parameter for prime editors. By analyzing prime editing efficiencies at 18 endogenous target sites in rice protoplasts, they found that PBS Tm strongly affects editing efficiency, with maximal prime editing occurring when PBS Tm is 30 °C in rice.
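To make the Tm criterion concrete, here is a minimal sketch (assuming the simple Wallace rule, 2 °C per A/T and 4 °C per G/C, which is not necessarily the Tm model the authors or PlantPegDesigner use) of how one might screen candidate PBS lengths for a value near the reported 30 °C optimum; the sequences are hypothetical.

```python
# Minimal sketch: score hypothetical PBS candidates by an approximate melting
# temperature (Wallace rule: 2 degC per A/T, 4 degC per G/C) and pick the one
# closest to the ~30 degC optimum reported for rice. This is an illustration
# only, not the Tm model used in the study or in PlantPegDesigner.

def pbs_tm_wallace(pbs: str) -> int:
    """Approximate Tm (degC) of a short DNA primer binding site."""
    pbs = pbs.upper()
    at = pbs.count("A") + pbs.count("T")
    gc = pbs.count("G") + pbs.count("C")
    return 2 * at + 4 * gc

# Hypothetical PBS candidates of different lengths for the same nick site.
candidates = ["GTCAGCTA", "AGTCAGCTAG", "CAGTCAGCTAGC"]
target_tm = 30

for seq in candidates:
    print(f"{seq:>14}  Tm ~ {pbs_tm_wallace(seq)} degC")

best = min(candidates, key=lambda s: abs(pbs_tm_wallace(s) - target_tm))
print("closest to the 30 degC optimum:", best)
```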

In addition to identifying optimal pegRNA designs, they also introduced advances to prime editing through the use of dual pegRNAs. This strategy relies on two pegRNAs generating respective ssDNA flaps that base pair with each other in trans while encoding the same edit on both strands of the newly synthesized DNA.

This new editing approach resulted in an average 3.0-fold improvement in prime editing efficiency compared to using individual pegRNAs. Furthermore, the scientists generated prime editors incorporating SpG (an engineered Cas9 with an expanded PAM targeting range) to broaden the targeting scope of the dual-pegRNA editing strategy. Together, optimizing PBS Tm and using the dual-pegRNA strategy boosted prime editing efficiency up to 17.4-fold in rice.

Based on these two advancements, the team developed a user-friendly web application, PlantPegDesigner, to help other researchers design prime editing tools best suited for their applications.

PlantPegDesigner offers users flexibility and control of various parameters based on their individual needs. This tool recommends spacer-PAM sequences, PBS sequences, RT template sequences and also PCR cloning primers for vector construction. Compared to other web applications, PlantPegDesigner-recommended dual pegRNAs resulted in up to a 46-fold improvement in editing activity in rice.

In summary, this work simplified the design of pegRNAs, thus providing a reliable solution for efficient prime editing in plants. The flexibility of the optimized plant prime editing system will advance both plant breeding and functional genomics research.

Credit: 
Chinese Academy of Sciences Headquarters

Arctic sponge survival in the extreme deep-sea

image: Still from the footage of the deep-sea sponge ground that was collected over a year.

Image: 
Copyright: NIOZ

For the first time, researchers from the SponGES project collected year-round video footage and hydrodynamic data from the mysterious world of a deep-sea sponge ground in the Arctic. Deep-sea sponge grounds are often compared to the rich ecosystems of coral reefs and form true oases. In a world where all light has disappeared and without obvious food sources, they provide a habitat for other invertebrates and a refuge for fish in the otherwise barren landscape. It is still puzzling how these biodiversity hotspots survive in this extreme environment as deep as 1500 metres below the water surface. With over 700 hours of footage and data on food supply, temperature, oxygen concentration, and currents, NIOZ scientists Ulrike Hanz and Furu Mienis found clues that could help find some answers.

Colourful, thriving communities

'The deep sea, in most places, is barren and flat', says marine geologist Furu Mienis. 'And then, suddenly, we have these sponge grounds that form colourful, thriving communities. It is intriguing how this system sustains itself in such a place.' To better understand this unexpected success, the research team deployed a bottom lander equipped with sensors and an underwater camera specially designed for the extreme environment by NIOZ engineers and technicians. The location: an enormous seamount in the Norwegian Sea that is part of the Mid-Atlantic Ridge, known as the Schulz Bank. A year later they picked it up. What they saw and measured was a world where sponges survived in temperatures below zero degrees Celsius and withstood potential food deprivation, high current speeds and 200-metre-high underwater waves.

Mienis: 'We still don't get why they grow where they grow, but we are off to a good start of a better understanding. Apparently, this seamount and the hydrodynamic conditions create a beneficial system for the sponges.' A major finding located the sponge ground at the interface between two water masses where strong internal tidal waves can spread widely and interact with the bottom landscape. The data from the sensors showed that the water flow at the summit of the seamount interacts with the seamount itself, producing turbulent conditions with temporarily high current speeds reaching up to 0.7 metres per second, which can be considered "stormy" conditions in the deep sea.

At the same time, water movements around the seamount supply the sponge ground with food and nutrients from water layers above and below. The team measured the amount of food sinking from the surface to the sponge grounds and found that in this vertical direction, fresh food was delivered only once during a major event in the summer when the phytoplankton bloomed. Hanz: 'This isn't enough to sustain the sponge grounds, which is why we expect that additionally, bacteria and dissolved matter keep the sponges and associated fauna from starving.'

Extreme environment

The long-term record shows that the sponges on Schulz Bank thrive at temperatures around zero degrees Celsius. This is at least 4 °C colder than the conditions in which stony cold-water corals, also found in the deep sea, live. Hanz: 'It is striking that they are alive with temperatures at zero degrees or even less. This is quite extreme, even for the deep sea.' In this food-deprived environment, the cold might actually play a role in the sponge's survival by lowering its metabolism. And the cold isn't the only extremity they face. Video recordings of the highest current speed events in winter show that these 'storms' further push the sponges to their limit. Hanz: 'The speeds that we witnessed might be close to the maximum that they can endure. At the highest speed, we saw some sponges as well as anemones being ripped from the seafloor.'

And the most remarkable thing after watching hundreds of hours of footage? Hanz: 'The image that I started with was almost the same as the image in the end. One year had gone by and everything looked almost the same. It's just so cold out there that no crazy things are going on. The reef is still very pristine.' However, Hanz and Mienis stress that it is a very vulnerable ecosystem. Hanz: 'Sponges can be up to two hundred years old; once damaged it takes ages for them to recover.' And there are potential threats. Hanz: 'Fishery seems to be the biggest one, but there is also the future possibility of deep-sea mining and changes in temperature and turbulence caused by climate change.' Mienis: 'It is a fragile equilibrium that consists of many tiny components. Take one of those away and the whole system can collapse.' Their research contributes to a first baseline that might become essential in future protection. Mienis: 'We now identified the first natural ranges, and gathered a bit of the information on how these rich sponge grounds can thrive.'

Credit: 
Royal Netherlands Institute for Sea Research

Relieve your stress, relieve your allergies

image: Sneezing, a common symptom of allergies

Image: 
Sambeet D. via Pixabay

Increased allergic reactions may be tied to the stress hormone corticotropin-releasing hormone (CRH), suggests a study published this month in the International Journal of Molecular Sciences. These findings may help clarify the mechanism by which CRH induces proliferation of mast cells (MC) - agents involved in the development of allergies in the human nasal cavity.

"In my daily practice, I meet many patients with allergies who say their symptoms worsened due to psychological stress," states lead researcher Mika Yamanaka-Takaichi, a graduate student of the Department of Dermatology, Osaka City University, "This is what led me to do this research."

Together with Professor Daisuke Tsuruta of the same department, they hypothesized that due to its role in inducing MC degranulation in human skin, "CRH may also be involved in stress-aggravated nasal allergies," says Professor Tsuruta.

When the team added CRH to a nasal polyp organ culture, they saw a significant increase in the number of mast cells, a stimulation of both MC degranulation and proliferation, and an increase in expression of stem cell factor (SCF), a growth factor of mast cells, in the human nasal mucosa - the lining of the nasal cavity. In exploring possible therapeutic angles, "we saw the effect of CRH on mast cells blocked by CRHR1 gene knockdown, CRHR1 inhibitors, or the addition of SCF-neutralizing antibodies," states Dr. Yamanaka-Takaichi.

In vivo, the team found an increase in the number of mast cells and in degranulation in the nasal mucosa of mouse models of restraint stress, which was inhibited by administration of the CRHR1 inhibitor antalarmin.

"In addition to understanding the effects stress has on our allergies, we have also found promising therapeutic potential in candidates like antalarmin," adds Dr. Yamanaka-Takaichi, "And this is wonderful news for my patients."

Credit: 
Osaka City University

Natural Sciences students' research published in prestigious journal

A collaborative research project by a team of undergraduate students from the University of Exeter's Natural Sciences department has been published in a prestigious academic journal.

Lewis Howell, Eleanor Osborne and Alice Franklin have had their second-year research published in The Journal of Physical Chemistry B.

Their paper, Pattern Recognition of Chemical Waves: Finding the Activation Energy of the Autocatalytic Step in the Belousov-Zhabotinsky Reaction, was a result of their extended experiment work in the Stage 2 module "Frontiers in Science 2".

Their project involved the Belousov-Zhabotinsky chemical reaction - an example of a chemical oscillator that is often used to illustrate a chaotic system.

These reactions are theoretically important because they show that chemical reactions do not have to be dominated by equilibrium thermodynamic behaviour.

For the research, the team used a Raspberry Pi camera to record images of the reaction over time, and repeated the experiment under a wide range of temperature conditions.

The group were the first to apply a filter-coupled circle finding algorithm and localised pattern analysis to the Belousov-Zhabotinsky reaction in order to extract features such as velocity of the waves.

They were soon able to get exceptionally good experimental results which, coupled with the application of novel image analysis techniques, allowed them to make unprecedented progress, unveiling some peculiar and previously undocumented features of this chemical oscillator.
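As a rough sketch of the circle-finding step described above (a generic OpenCV illustration, not the group's filter-coupled algorithm; the file names, frame interval and pixel scale are made up), expanding circular wavefronts can be detected in successive frames and a wave velocity estimated from the growth of the fitted radius:

```python
# Minimal sketch: detect circular chemical waves in two camera frames and
# estimate radial wave velocity from the change in detected radius.
# This is a generic OpenCV illustration, not the authors' algorithm;
# file names, frame interval and scale are hypothetical.

import cv2
import numpy as np

FRAME_INTERVAL_S = 2.0  # assumed time between captured frames (s)

def detect_circles(path: str) -> np.ndarray:
    """Return detected circles as an array of (x, y, radius) in pixels."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    blurred = cv2.medianBlur(gray, 5)  # suppress pixel noise before the transform
    circles = cv2.HoughCircles(
        blurred, cv2.HOUGH_GRADIENT, dp=1.2, minDist=40,
        param1=80, param2=30, minRadius=10, maxRadius=200,
    )
    return np.empty((0, 3)) if circles is None else circles[0]

def wave_velocity(frame_a: str, frame_b: str, centre, px_per_mm: float) -> float:
    """Estimate radial wave velocity (mm/s) from the wave nearest a chosen centre."""
    def radius_near(path):
        circles = detect_circles(path)
        d = np.hypot(circles[:, 0] - centre[0], circles[:, 1] - centre[1])
        return circles[np.argmin(d), 2]
    dr_px = radius_near(frame_b) - radius_near(frame_a)
    return (dr_px / px_per_mm) / FRAME_INTERVAL_S

# Example with hypothetical Raspberry Pi camera frames:
# print(wave_velocity("bz_0001.png", "bz_0002.png", centre=(320, 240), px_per_mm=12.0))
```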

Lewis said: "The experimental work went really well; we planned all the experiments ourselves and conducted them over five or six weeks of lab time. It's a really interesting chemical reaction, and we had a lot of fun doing it; we found that our data was really good, which gave us a good platform to work from.

"Together we were able to do a lot of cool analysis, using image processing techniques to extract properties such as the velocity of the chemical waves you see in the reaction."

Dr Eric Hébrard, who co-supervised the module alongside Dr. David Horsell, said: "Throughout this journey, Lewis, Eleanor and Alice have been very comfortable working alongside academic staff as well as together and have displayed engaging and collaborative team-working skills."

Alice added that "Natural Sciences at Exeter is geared towards giving undergraduate students first-hand experience of research." Eleanor also added that "without the course being so interdisciplinary, we couldn't have achieved such a high quality of results that enabled us to publish this paper; it goes to show that collaboration between scientists from different disciplines is a really effective way to approach research."

Credit: 
University of Exeter

Urban agriculture can help, but not solve, city food security problems

image: This map shows the distribution of cropland and pastureland within about 140 miles of downtown Chicago.

Image: 
Christine Costello/Penn State

While urban agriculture can play a role in supporting food supply chains for many major American cities -- contributing to food diversity, sustainability and localizing food systems -- it is unrealistic to expect rooftop gardens, community plots and the like to provide the majority of nutrition for the population of a metropolis.

That's the conclusion of a team of researchers who analyzed the nutritional needs of the population of Chicago and calculated how much food could be produced in the city by maximizing urban agriculture, and how much crop land would be needed adjacent to the city to grow the rest. The study was the first to evaluate land required to meet food demand while accounting for a range of nutritional needs instead of only calories or quantities.

"There is a tremendous enthusiasm around the country for localized food systems and urban agriculture," said lead researcher Christine Costello, assistant professor of agricultural and biological engineering, College of Agricultural Sciences, Penn State. "We wanted to determine how much nutrition urban agriculture really can contribute -- to find out what's feasible -- as well as how much land is required to meet the population's needs."

Now, with the COVID-19 pandemic exposing weaknesses in food supply chains, the focus on localizing food systems has sharpened, especially in and around big cities. Answering questions about how much food urban agriculture actually can contribute is more important than ever, Costello pointed out. For example, a recent study found that 30% of Boston's fruit and vegetable demand could be met in Boston through soil-based and rooftop urban agriculture.

With growing populations and affluence, urban food demand will increase, which presents considerable challenges to achieving economic, environmental and social sustainability, Costello noted. At the same time, more people are living in urban environments. In 2018 in the U.S., 82% of the population lived in urban areas, with an anticipated increase to 89% by 2050.

"Urban agriculture is attractive because it uses land or rooftops not currently used for food production and could increase habitat and biodiversity, enhance stormwater management, and provide fruits and vegetables, resulting in positive nutritional outcomes," Costello said. "However, fruits and vegetables do not contain sufficient calories, protein or other critical nutrients, such as vitamin B12, to support the full range of human needs."

Cultivation in soil on a rooftop typically is limited without significant restructuring of the roof, often making it infeasible, Costello explained. For this reason, hydroponic or vertical farming systems may be preferable. Hydroponic systems are best suited to produce leafy greens, such as kale and lettuce, and herbs.

In the study, researchers calculated the land required to meet the needs of Chicago and adjacent communities with and without urban agriculture food production, which they estimated two ways. One used average yields from urban and conventional farming methods; the other used optimization techniques to produce necessary nutrients using the smallest land base possible.
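To illustrate the second, optimization-based estimate (a toy linear program with invented yields, nutrient contents and requirements, not the model or data used in the study), the smallest land base covering a set of nutrient targets can be found like this:

```python
# Toy sketch of the land-minimizing idea: choose acres of each crop to meet
# the population's nutrient requirements while using as little land as
# possible. Yields, nutrient contents and requirements below are invented
# placeholders, not values from the study.

import numpy as np
from scipy.optimize import linprog

crops = ["corn", "soy", "kale"]

# nutrient delivered per acre (rows: calories [Mkcal], protein [kg], vitamin C [kg])
nutrients_per_acre = np.array([
    [15.0,   6.0,  2.0],   # calories
    [120.0, 300.0, 60.0],  # protein
    [0.0,    0.1,  1.5],   # vitamin C
])

# total annual requirement of the population for each nutrient (same units)
requirements = np.array([5_000.0, 80_000.0, 400.0])

# minimize total acres: each acre of any crop costs 1 unit of land
c = np.ones(len(crops))

# linprog expects "A_ub x <= b_ub", so flip signs to express "nutrients >= requirement"
res = linprog(c, A_ub=-nutrients_per_acre, b_ub=-requirements,
              bounds=[(0, None)] * len(crops), method="highs")

for crop, acres in zip(crops, res.x):
    print(f"{crop:>5}: {acres:,.0f} acres")
print(f"total land: {res.x.sum():,.0f} acres")
```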

The team estimated the total nutrient requirements of Chicago's population using the daily food nutrient requirements recommended by the U.S. Department of Agriculture's Center for Nutrition Policy and Promotion. Twenty-eight nutrients were considered. Foods included in the study were selected based on their current prevalence in the American agricultural system and for their nutritional qualities.

The scientists estimated the amount of land required for each animal-based commodity using a formula based on USDA recommendations and prior research done by Costello. The researchers created linkages between crops and livestock in a model and used national inventory data to estimate both cropland and pastureland utilized for each kilogram -- about 2 pounds -- of animal food commodity.

The study used satellite data to define land-type availability and incorporated USDA data on yield for conventionally grown crops over a 10-year period. Soil-based urban agricultural yield data for the 2015 and 2016 growing seasons came from the Columbia Center for Urban Agriculture, located in Missouri.

The findings, recently published in Environmental Science and Technology, suggested that it is not possible -- using the predominant commodities and common urban agricultural production of today -- to meet the nutritional needs of Chicago within a radius under 400 miles, given the cropland and pastureland available, without fortifying foods with vitamin D and supplementing foods with vitamin B12.

With vitamin D fortification, a common U.S. practice, the radius required is reduced to 110-140 miles. With vitamin B12 supplementation, the radius was further reduced to 40-50 miles. The inclusion of urban agriculture reduced the radius by another 6-9 miles and increased the diversity of foods available.

"This work demonstrates the need to include a full list of nutrients when evaluating the feasibility of localizing food systems," Costello said. "Key nutrient fortification or supplementation may significantly reduce the land area required to meet the nutritional needs of a population."

Credit: 
Penn State

Electrochemical synthesis of formate from CO2 using a Sn/reduced graphene oxide catalyst

image: Scanning electron microscope image (upper left), transmission electron microscope image (upper right), reduction characteristics (bottom left) and Faradaic efficiency (bottom right) of the Sn/rGO catalyst.
It can be seen that Sn nanoparticles of 10-50 nm are uniformly dispersed on the reduced graphene oxide sheet (upper left and upper right). Also, the absolute value of the current density under CO2 flow is larger than that of the conventional catalyst (Sn) or Sn supported on graphene oxide (Sn/GO), and the current increases from an initial electric potential of lower absolute value. Thus, it can be seen that the overpotential is significantly reduced and that the current density is augmented. In addition, the Faradaic efficiency of formate is very high using the Sn/rGO catalyst (bottom left and bottom right).

Image: 
Kanazawa University

[Background]

Decreasing emissions of carbon dioxide (CO2) and utilizing (fixing) it efficiently are worldwide priorities for preventing global warming. Promotion of the use of renewable energy is effective in reducing CO2 emissions. However, since there are large time-dependent fluctuations and large regional differences in renewable energy production, it is necessary to establish a fixation technology to allow efficient energy transportation and storage. Thus, there is increasing interest in technologies for synthesizing useful chemicals from CO2 using electricity derived from renewable energy. In particular, formic acid is attracting much attention as an energy (hydrogen) carrier because it is liquid and non-toxic at room temperature. Establishment of this technology will contribute to the efficient transportation and storage of renewable energy and to the fixation of CO2, and allow energy storage with high environmental compatibility to be achieved.

In the electrochemical reduction of CO2, it is known that formic acid can be obtained with a Faradaic efficiency*1) of about 50 to 60% by using tin (Sn) as a cathode catalyst. However, in order to develop this technology for practical use, further improvement of the Faradaic efficiency and a reduction of the overpotential*2) are necessary. There is much active interest in research to understand the catalyst design principles that would enable these goals to be achieved.
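For readers unfamiliar with the term, the Faradaic efficiency compares the charge captured in the desired product with the total charge passed through the cell; a minimal calculation for formate (two electrons per molecule), using invented numbers, looks like this:

```python
# Minimal sketch: Faradaic efficiency of formate production.
# FE = (z * n * F) / Q, where z is the number of electrons per molecule of
# product (2 for CO2 -> formate), n the moles of formate measured, F the
# Faraday constant and Q the total charge passed. The numbers below are
# invented for illustration, not data from the paper.

F = 96485.0          # C/mol, Faraday constant
z = 2                # electrons transferred per formate molecule

n_formate = 5.0e-5   # mol of formate detected (hypothetical)
charge_passed = 10.0 # C of total charge passed (hypothetical)

fe = z * n_formate * F / charge_passed
print(f"Faradaic efficiency: {fe:.1%}")   # ~96.5% with these numbers
```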

[Results]

The present research team, led by Prof. Tsujiguchi of Kanazawa University, in collaboration with scientists from the University of Tsukuba and Osaka University, prepared a tin/reduced graphene oxide*3) (Sn/rGO) catalyst in which Sn was supported on reduced graphene oxide by thermal reduction of tin chloride (SnCl2) and graphene oxide (GO), the latter obtained by oxidizing graphite powder using the improved Hummers method*4). In the catalyst thus prepared, Sn is uniformly dispersed in the rGO layer, and the composite is stacked to form a 3D morphology, i.e. rGO/Sn/rGO (Fig. 1). The carrier of this catalyst contains far more oxygen-bearing functional groups than the tin/graphite catalyst used for comparison. When we performed the electrochemical reduction of CO2 using these catalysts with CO2 dissolved in a solution of potassium hydrogen carbonate (KHCO3), it was found that the Sn/rGO catalyst significantly decreased the overpotential and allowed a high current density to be obtained compared to the Sn catalyst. In addition, when the reduction of CO2 was performed at a constant potential, almost no products other than formic acid, such as H2 and CO, were detected, and we succeeded in obtaining formic acid with a Faradaic efficiency of 98% (1.8 times that with the Sn catalyst alone) (Fig. 1).

The reason for the highly efficient formic acid production obtained using the Sn/rGO catalyst is its high CO2 adsorption capacity. Sn/rGO can adsorb four times as much CO2 as the Sn catalyst alone. Further, its rate of CO2 adsorption is eight times that of the Sn catalyst alone. Computational chemistry predicted that this high CO2 adsorption capacity would be due to the oxidized functional groups of rGO and that the production of hydrogen and carbon monoxide would be suppressed since the CO2 adsorbed by the oxidized functional groups of rGO is quickly and efficiently supplied to the adjacent Sn surface (Fig. 2). In order to confirm this mechanism experimentally, our team attempted electrochemical imaging of the catalytic activity with a scanning electrochemical cell microscope*5). A significantly higher reduction current density was observed at the interface between Sn and rGO than on the Sn or rGO surfaces (Fig. 3), suggesting that a large amount of formic acid is synthesized on Sn adjacent to rGO, in support of the above prediction by computational chemistry. This is the first experimental demonstration using a scanning electrochemical cell microscope that formic acid synthesis occurs actively at the interface between the catalyst and the support. Thus, more efficient formic acid synthesis should be possible by combining a support with a high CO2 adsorption capacity with a catalyst for the electrochemical reduction of CO2*6). This provides an important framework that could be applied to all catalysts available so far.

[Future prospects]

The results of the present study provide new insights into the development of catalysts for formic acid synthesis by the reduction of CO2, and dramatic progress is expected in the development of formic acid synthesis technology by the electrochemical reduction of CO2. In addition, we have demonstrated an improvement of selectivity due to the excellent CO2 adsorption capacity of the support as well as elucidating its reaction mechanism. This should have a great impact on electrochemical reduction technology related to CO2, including the synthesis of methanol, methane and olefins. Therefore, it has the potential to be a useful basic technology in the synthesis of chemicals from CO2. In the future, we expect that the development of electrochemical reduction cells using this catalyst will be encouraged, leading to the creation of energy storage devices with high environmental compatibility that can contribute to the fixation of CO2 and promotion of the efficient use of renewable energy.

Credit: 
Kanazawa University

Psychological forest: What trees reveal about Antarctic researchers

image: About 30 people winter over at the Japanese Antarctic research station, Syowa Station. Photographed on April 17, 2017.

Image: 
NIPR

At the bottom of the world, there's a small island about four kilometers off the coast of Antarctica. In summer, temperatures climb to freezing with uninterrupted daylight for two months. In winter, they fall to minus 40 degrees Celsius without a single sunrise for two months. It is isolated and desolate, uninhabitable to all humans -- except for the Japanese Antarctic Research Expedition (JARE). Almost every year since 1956, a JARE team has wintered over on the island, staying at Syowa Station from February to January to conduct various research projects. From 2004 to 2014, however, they were also research subjects themselves.

As part of a joint project between the National Institute of Polar Research at the Research Organization of Information and Systems (ROIS) and the Antarctic Psychological Research team at Kyoto University, a series of non-invasive psychological surveys were administered six times to five different wintering-over teams.

The results were published online on Feb. 22 in the International Journal of Circumpolar Health.

"An Antarctic wintering-over station is a unique environment as a small, isolated society facing the extreme margins of survival," said first author Tomoko Kuwabara, professor emeritus in the Graduate School of Education at Kyoto University and professor in The Open University of Japan. "Although significant developments have been made in wintering-over operations, such as improved communications, it is still hard to stay an entire year in an Antarctic station, not only because of the cold, changes in daylight and exposure to UV rays, but also the isolation and the impossibility of escaping one's small social group. These conditions make an Antarctic station a unique society that can reveal much about human nature."

To better understand how team members' psychological states changed throughout their wintering-over experience, the 172 participants were asked to complete a questionnaire designed to assess mood and to draw trees. Known as the Baum test, how a participant draws trees is thought to reveal much about their mental state and personality.

Some participants depicted trees reminiscent of Japan, such as cherry trees or even a tree in a home garden, while others drew palm trees. Many sketched apple trees.

The researchers noticed the team members split into two distinct groups: those who drew the same trees every time they were asked, and those who drew different trees.

"These results suggest two types of coping among individuals: one stabilizes life by maintaining a previous lifestyle, and the other flexibly adjusts to a new way of life," Kuwabara said.

This assessment appeared to hold true across the questionnaires, but the researchers were unable to pinpoint any distinct personality traits that might make a person more or less suited to life in Antarctica.

"In general, wintering-over team members accepted their environment, and they did not act out emotionally or deny problems," Kuwabara said, noting that many seemed to draw comfort and stability from continuing to hold onto internal relationships with their family and home. "It is expected that our survey will contribute to the understanding of other isolated groups, such as crews on the space station or in other future space travel, as well as to group management in everyday societies."

The researchers plan to continue exploring how team members manage during wintering over, as well as how they readjust to typical life.

Credit: 
Research Organization of Information and Systems

Do you know the way to Berkelium, Californium?

image: Top: (left) Droplet of solution containing californium on a transmission electron microscopy grid; (right) scanning transmission electron microscopy (STEM) image of individual californium nanoparticles. Bottom: STEM images of crystal structures of (left) Cf2O3 - blue schematic outlines californium columns - and (right) BkO2 - blue schematic illustrates berkelium lattice.

Image: 
Andy Minor and Rebecca Abergel/Berkeley Lab

Heavy elements known as the actinides are important materials for medicine, energy, and national defense. But even though the first actinides were discovered by scientists at Berkeley Lab more than 50 years ago, we still don't know much about their chemical properties because only small amounts of these highly radioactive elements (or isotopes) are produced every year; they're expensive; and their radioactivity makes them challenging to handle and store safely.

But those massive hurdles to actinide research may one day be a thing of the past. Scientists at the U.S. Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) and UC Berkeley have demonstrated how a world-leading electron microscope can image actinide samples as small as a single nanogram (a billionth of a gram) - a quantity that is several orders of magnitude less than required by conventional approaches.

Their findings were recently reported in Nature Communications, and are especially significant for co-senior author Rebecca Abergel (https://abergel.lbl.gov/), whose work on chelators - metal-binding molecules - has resulted in new advances in cancer therapies, medical imaging, and medical countermeasures against nuclear threats, among others. Abergel is a faculty scientist who leads the Heavy Element Chemistry program in the Chemical Sciences Division at Berkeley Lab, and assistant professor in nuclear engineering at UC Berkeley.

"There are still so many unanswered questions with regards to chemical bonding in the actinide series. With such state-of-the art instrumentation, we are finally able to probe the electronic structure of actinide compounds, and this will allow us to refine molecular design principles for various systems with applications in medicine, energy, and security," Abergel said.

"We demonstrated that you can work with less material - a nanogram - and get the same if not better data without having to invest in dedicated instruments for radioactive materials," said co-senior author Andy Minor, facility director of the National Center for Electron Microscopy at Berkeley Lab's Molecular Foundry, and professor of materials science and engineering at UC Berkeley.

Allowing researchers to work with just a nanogram of an actinide sample will significantly reduce the high costs of experiments conducted using previous methods. One gram of the actinide berkelium can cost a jaw-dropping $27 million, for example. An actinide sample that is only a nanogram also reduces radiation exposure and contamination risks, Minor added.

In one set of experiments at TEAM 0.5 (Transmission Electron Aberration-corrected Microscope), an atomic-resolution electron microscope at the Molecular Foundry, the researchers imaged single atoms of berkelium and californium to demonstrate how much less actinide material is needed with their approach.

In another set of experiments using EELS (electron energy loss spectroscopy), a technique for probing a material's electronic structure, the researchers were surprised to observe in berkelium a weak "spin-orbit coupling," a phenomenon that can influence how a metal atom binds to molecules. "This had never been reported before," said co-author Peter Ercius, a staff scientist at the Molecular Foundry who oversees the TEAM 0.5 microscope. "It's like finding a needle in a haystack. It's amazing what we could see."

Co-lead author Alexander Müller credits Berkeley Lab's interdisciplinary "team science" approach for bringing together the world's best experts in electron microscopy, heavy element chemistry, nuclear engineering, and materials science for the study.

"Because Berkeley Lab attracts amazing researchers from all fields of science, such interdisciplinary collaborative work comes naturally here," he said. "I personally found that aspect very rewarding for this project. And now that we have established this approach, we can pursue many new directions in actinide research." Müller was a postdoctoral scholar in Berkeley Lab's Molecular Foundry and UC Berkeley's Department of Materials Science and Engineering at the time of the study. He is now an associate at the Munich, Germany, office of Kearney, an international management consulting firm.

Safety protocols for the research included sample preparation in dedicated laboratories and careful surveying of work areas. Since samples were prepared with minuscule amounts (1-10 nanograms) of each isotope, contamination hazards to the equipment were also minimized, the researchers said.

The researchers hope to apply their approach to the investigation of other actinides, including actinium, einsteinium, and fermium.

"The more information we get from these minute amounts of radioactive elements, the better equipped we'll be to advance new materials for radiation cancer therapy and other useful applications," Minor said.

Credit: 
DOE/Lawrence Berkeley National Laboratory

Study shows DHA supplement may offset impact of maternal stress on unborn males

image: Senior author David Beversdorf, MD, a professor of radiology, neurology and psychology at the University of Missouri.

Image: 
Justin Kelley

Neurodevelopmental disorders like autism and schizophrenia disproportionately affect males and are directly linked to early life adversity caused by maternal stress and other factors, which might be impacted by nutrition. But the underlying reasons for these male-specific impacts are not well understood. Researchers from the University of Missouri School of Medicine and the MU Thompson Center for Autism and Neurodevelopmental Disorders have uncovered possible reasons for male vulnerability in the womb, and they've learned a specific maternal dietary supplement called docosahexaenoic acid (DHA) may guard against the impact of maternal stress on unborn males during early development.

"We believe differences in metabolic requirements for male and female embryos as early as the first trimester, combined with dynamic differences in the way the male and female placenta reacts to environmental factors, contributes to the increased risk for male neurodevelopmental disorders later in life," said senior author David Beversdorf, MD, a professor of radiology, neurology and psychology at MU.

Beversdorf worked with principal investigator Eldin Jašarevic, PhD, an assistant professor of pharmacology at the University of Maryland School of Medicine, and a team of researchers on the study, which involved grouping 40 mice into four different cohorts. Group 1 mothers received a standard diet and were not exposed to any early prenatal stress (EPS). Group 2 got the standard diet while being exposed to EPS, which consisted of restraint, light, noise and predator threat. Group 3 got a diet modified with supplemental DHA but was not exposed to EPS. Group 4 received DHA supplementation and EPS.

The team analyzed the embryos and placentas at 12.5 days of gestation and found exposure to prenatal distress decreased placenta and embryo weight in males but not females. In the DHA groups, they found the supplement reversed the impact of EPS on males.

"This study yielded two results regarding the interaction between maternal stress and dietary DHA enrichment in early stage embryos," Beversdorf said. "First, stress on the mother during the first week of gestation appeared to influence gene expression pattern in the placenta, and the gender of the offspring determined the magnitude of disruption. Second, a maternal diet enriched with preformed DHA during periods of high stress showed partial rescue of stress-dependent dysregulation of gene expression in the placenta."

Beversdorf said future studies will be needed to better understand the complex cellular and molecular mechanisms linking maternal diet consumption, chronic stress during pregnancy, placental gene expression and lasting health outcomes in offspring.

In addition to Beversdorf and Jašarevic, the study authors include University of Missouri colleagues Kevin Fritsche, PhD, professor of nutrition and exercise physiology; David Geary, PhD, professor of psychology; and Rocio Rivera, PhD, associate professor of animal science.

The study, "Maternal DHA supplementation influences sex-specific disruption of placental gene expression following early prenatal stress," was recently published in the journal Biology of Sex Differences. Research reported in these publications was supported by grants from the University of Missouri Research Board, F21C-Nutrition for Health Group, F21C-Reporductive Biology Group and the School of Medicine Mission Enhancement Fund. Beversdorf has consulted with Quadrant Biosciences, Impel Pharma, YAMO Pharma and Staliclca, unrelated to this work. The content is solely the responsibility of the authors and does not necessarily represent the views of the funding agencies.

Credit: 
University of Missouri-Columbia

Searching for hints of new physics in the subatomic world

image: This plot shows how the decay properties of a meson made from a heavy quark and a light quark change when the lattice spacing and heavy quark mass are varied in the calculation.

Image: 
A. Bazavov (Michigan State U.), C. Bernard (Washington U., St. Louis), N. Brown (Washington U., St. Louis), C. DeTar (Utah U.), A.X. El-Khadra (Illinois U., Urbana and Fermilab) et al.

Peer deeper into the heart of the atom than any microscope allows and scientists hypothesize that you will find a rich world of particles popping in and out of the vacuum, decaying into other particles, and adding to the weirdness of the visible world. These subatomic particles are governed by the quantum nature of the Universe and find tangible, physical form in experimental results.

Some subatomic particles were first discovered over a century ago with relatively simple experiments. More recently, however, the endeavor to understand these particles has spawned the largest, most ambitious and complex experiments in the world, including those at particle physics laboratories such as the European Organization for Nuclear Research (CERN) in Europe, Fermilab in Illinois, and the High Energy Accelerator Research Organization (KEK) in Japan.

These experiments have a mission to expand our understanding of the Universe, characterized most harmoniously in the Standard Model of particle physics; and to look beyond the Standard Model for as-yet-unknown physics.

"The Standard Model explains so much of what we observe in elementary particle and nuclear physics, but it leaves many questions unanswered," said Steven Gottlieb, distinguished professor of Physics at Indiana University. "We are trying to unravel the mystery of what lies beyond the Standard Model."

Ever since the beginning of the study of particle physics, experimental and theoretical approaches have complemented each other in the attempt to understand nature. In the past four to five decades, advanced computing has become an important part of both approaches. Great progress has been made in understanding the behavior of the zoo of subatomic particles, including bosons (especially the long sought and recently discovered Higgs boson), various flavors of quarks, gluons, muons, neutrinos and many states made from combinations of quarks or anti-quarks bound together.

Quantum field theory is the theoretical framework from which the Standard Model of particle physics is constructed. It combines classical field theory, special relativity and quantum mechanics, developed with contributions from Einstein, Dirac, Fermi, Feynman, and others. Within the Standard Model, quantum chromodynamics, or QCD, is the theory of the strong interaction between quarks and gluons, the fundamental particles that make up some of the larger composite particles such as the proton, neutron and pion.

PEERING THROUGH THE LATTICE

Carleton DeTar and Steven Gottlieb are two of the leading contemporary scholars of QCD research and practitioners of an approach known as lattice QCD. Lattice QCD represents continuous space as a discrete set of spacetime points (called the lattice). It uses supercomputers to study the interactions of quarks, and importantly, to determine more precisely several parameters of the Standard Model, thereby reducing the uncertainties in its predictions. It's a slow and resource-intensive approach, but it has proven to have wide applicability, giving insight into parts of the theory inaccessible by other means, in particular the explicit forces acting between quarks and antiquarks.

DeTar and Gottlieb are part of the MIMD Lattice Computation (MILC) Collaboration and work very closely with the Fermilab Lattice Collaboration on the vast majority of their work. They also work with the High Precision QCD (HPQCD) Collaboration for the study of the muon anomalous magnetic moment. As part of these efforts, they use the fastest supercomputers in the world.

Since 2019, they have used Frontera at the Texas Advanced Computing Center (TACC) -- the fastest academic supercomputer in the world and the 9th fastest overall -- to propel their work. They are among the largest users of that resource, which is funded by the National Science Foundation. The team also uses Summit at the Oak Ridge National Laboratory (the #2 fastest supercomputer in the world); Cori at the National Energy Research Scientific Computing Center (#20), and Stampede2 (#25) at TACC, for the lattice calculations.

The efforts of the lattice QCD community over decades have brought greater accuracy to particle predictions through a combination of faster computers and improved algorithms and methodologies.

"We can do calculations and make predictions with high precision for how strong interactions work," said DeTar, professor of Physics and Astronomy at the University of Utah. "When I started as a graduate student in the late 1960s, some of our best estimates were within 20 percent of experimental results. Now we can get answers with sub-percent accuracy."

In particle physics, physical experiment and theory travel in tandem, informing each other, but sometimes producing different results. These differences suggest areas of further exploration or improvement.

"There are some tensions in these tests," said Gottlieb, distinguished professor of Physics at Indiana University. "The tensions are not large enough to say that there is a problem here -- the usual requirement is at least five standard deviations. But it means either you make the theory and experiment more precise and find that the agreement is better; or you do it and you find out, 'Wait a minute, what was the three sigma tension is now a five standard deviation tension, and maybe we really have evidence for new physics.'"

DeTar calls these small discrepancies between theory and experiment 'tantalizing.' "They might be telling us something."

Over the last several years, DeTar, Gottlieb and their collaborators have followed the paths of quarks and antiquarks with ever-greater resolution as they move through a background cloud of gluons and virtual quark-antiquark pairs, as prescribed precisely by QCD. The results of the calculation are used to determine physically meaningful quantities such as particle masses and decays.

One of the current state-of-the-art approaches applied by the researchers uses the so-called highly improved staggered quark (HISQ) formalism to simulate interactions of quarks with gluons. On Frontera, DeTar and Gottlieb are currently simulating at a lattice spacing of 0.06 femtometers (1 femtometer = 10⁻¹⁵ meters), but they are quickly approaching their ultimate goal of 0.03 femtometers, a distance where the lattice spacing is smaller than the wavelength of the heaviest quark, consequently removing a significant source of uncertainty from these calculations.

Each doubling of resolution, however, requires about two orders of magnitude more computing power, putting a 0.03 femtometers lattice spacing firmly in the quickly-approaching 'exascale' regime.
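A back-of-the-envelope sketch of that scaling: if halving the lattice spacing costs roughly 100 times more compute, the cost grows approximately as (1/a) raised to the power log2(100) ≈ 6.6, so moving from 0.06 to 0.03 femtometers is about a 100-fold jump (illustrative only; the actual cost also depends on the algorithms and quark masses used).

```python
# Back-of-the-envelope sketch of the cost scaling quoted above: if halving
# the lattice spacing costs ~100x more compute, the cost grows roughly like
# (1/a)^p with p = log2(100) ≈ 6.6. Absolute units are arbitrary; only the
# ratios are meaningful.

import math

p = math.log(100, 2)  # exponent implied by "100x per halving of the spacing"

def relative_cost(a_fm: float, a_ref_fm: float = 0.06) -> float:
    """Cost relative to a reference lattice spacing (in femtometers)."""
    return (a_ref_fm / a_fm) ** p

for a in (0.06, 0.045, 0.03):
    print(f"a = {a:.3f} fm -> ~{relative_cost(a):,.0f}x the cost at 0.06 fm")
```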

"The costs of calculations keeps rising as you make the lattice spacing smaller," DeTar said. "For smaller lattice spacing, we're thinking of future Department of Energy machines and the Leadership Class Computing Facility [TACC's future system in planning]. But we can make do with extrapolations now."

THE ANOMALOUS MAGNETIC MOMENT OF THE MUON AND OTHER OUTSTANDING MYSTERIES

Among the phenomena that DeTar and Gottlieb are tackling is the anomalous magnetic moment of the muon (essentially a heavy electron) - which, in quantum field theory, arises from a weak cloud of elementary particles that surrounds the muon. The same sort of cloud affects particle decays. Theorists believe yet-undiscovered elementary particles could potentially be in that cloud.

A large international collaboration called the Muon g-2 Theory Initiative recently reviewed the present status of the Standard Model calculation of the muon's anomalous magnetic moment. Their review appeared in Physics Reports in December 2020. DeTar, Gottlieb and several of their Fermilab Lattice, HPQCD and MILC collaborators are among the coauthors. They find a 3.7 standard deviation difference between experiment and theory.

"... the processes that were important in the earliest instance of the Universe involve the same interactions that we're working with here. So, the mysteries we're trying to solve in the microcosm may very well provide answers to the mysteries on the cosmological scale as well."

Carleton DeTar, Professor of Physics, University of Utah
While some parts of the theoretical contributions can be calculated with extreme accuracy, the hadronic contributions (those arising from hadrons, the class of subatomic particles that are composed of two or three quarks and participate in strong interactions) are the most difficult to calculate and are responsible for almost all of the theoretical uncertainty. Lattice QCD is one of two ways to calculate these contributions.

"The experimental uncertainty will soon be reduced by up to a factor of four by the new experiment currently running at Fermilab, and also by the future J-PARC experiment," they wrote. "This and the prospects to further reduce the theoretical uncertainty in the near future... make this quantity one of the most promising places to look for evidence of new physics."

Gottlieb, DeTar and collaborators have calculated the hadronic contribution to the anomalous magnetic moment with a precision of 2.2 percent. "This gives us confidence that our short-term goal of achieving a precision of 1 percent on the hadronic contribution to the muon anomalous magnetic moment is now a realistic one," Gottlieb said. They hope to achieve a precision of 0.5 percent a few years later.

Other 'tantalizing' hints of new physics involve measurements of the decay of B mesons. There, various experimental methods arrive at different results. "The decay properties and mixings of the D and B mesons are critical to a more accurate determination of several of the least well-known parameters of the Standard Model," Gottlieb said. "Our work is improving the determinations of the masses of the up, down, strange, charm and bottom quarks and how they mix under weak decays." The mixing is described by the so-called CKM mixing matrix for which Kobayashi and Maskawa won the 2008 Nobel Prize in Physics.

The answers DeTar and Gottlieb seek are the most fundamental in science: What is matter made of? And where did it come from?

"The Universe is very connected in many ways," said DeTar. "We want to understand how the Universe began. The current understanding is that it began with the Big Bang. And the processes that were important in the earliest instance of the Universe involve the same interactions that we're working with here. So, the mysteries we're trying to solve in the microcosm may very well provide answers to the mysteries on the cosmological scale as well."

Credit: 
University of Texas at Austin, Texas Advanced Computing Center

Family ties protect against opioid misuse among U.S. young adults

image: Predicted Probability of Past-Year Prescription Opioid Misuse among U.S. Adults Ages 18-34, 2002-2018

Image: 
Data Source: National Survey on Drug Use and Health (NSDUH).

Syracuse, N.Y. - As opioid use disorders and overdoses continue to skyrocket in the United States, a study by researchers from Syracuse University and Pennsylvania State University shows that unmarried young adults who do not have children are most likely to misuse opioids.

The growing number of these "disconnected" young adults may also result in continued rises in substance use disorders and overdoses, the researchers say. The study, "Opioid misuse and family structure: Changes and continuities in the role of marriage and children over two decades," was published recently by Drug and Alcohol Dependence.

The study is also summarized in the Lerner Center for Public Health Promotion research brief "Family Ties Protect against Opioid Misuse among U.S. Young Adults."

Opioid use disorders affect over 2.1 million people in the United States, and rates of drug overdose have climbed steadily over the past three decades. In their previous studies, this group of researchers found that most people who misuse opioids, including prescription opioids, heroin and fentanyl, started their use during young adulthood.

Using a nationally representative survey of U.S. adults ages 18-34 conducted from 2002 to 2018, the researchers examined the links between family structure (marital status and presence of children in the household) and opioid misuse. They found that married young adults have lower probabilities of prescription opioid misuse and heroin use, and that the presence of children in the household is associated with lower probabilities of prescription opioid and heroin use, especially among those who have never been married.

"In the U.S., declining marriage rates have led to increases in adults without a partner or children - a group we refer to as 'disconnected adults,' '' said researcher Shannon Monnat from Syracuse University. "These family structure changes have coincided with the dismantling of economic and social institutions that once acted as safety nets, leading to increases in opioid misuse as self-medication for psychological pain, distress, and disconnection from work, family, and social institutions."

The research team included Monnat, Lerner Chair for Public Health Promotion and an associate professor of sociology at Syracuse University, and Penn State Population Research Institute scholars Alexander Chapman (a Ph.D. student in the sociology department) and Ashton Verdery, an associate professor of sociology and demography.

Here are the key findings from their study:

Nearly 20% of U.S. adults ages 18-34 misused prescription opioids between 2002 and 2018. About 2% used heroin.

Married young adults have a lower age-adjusted probability of prescription opioid misuse and heroin use.

The presence of children in the household protects against opioid misuse, especially among single adults.

Young adults who are not married and do not have children ("disconnected adults") have the highest age-adjusted probability of opioid misuse.

Increases in disconnected young adults may result in continued increases in substance use disorders and overdoses.

The researchers say these findings persisted even after accounting for several demographic and socioeconomic characteristics of respondents. Their findings suggest that increases in disconnected adults in the U.S. may result in continued increases in substance use disorders and overdoses.

"Our findings reflect the importance of social connections in preventing substance misuse," Monnat said. "Policymakers and community leaders can intervene by enacting strategies and advocating for policies that help young adults build social and community connections."

Who is at Greater Risk of Opioid Misuse?

Nearly 20% of U.S. adults ages 18-34 misused prescription opioids between 2002 and 2018. About 2% used heroin. Prescription opioid misuse increased between 2002 and 2006, leveled off in 2007-2010, and declined from 2011 to 2018. Heroin use increased from 2002 to 2014, but then declined after 2014.

The researchers found that adults who are married and those with children in the household are less likely to report prescription opioid misuse and heroin use. Adults who are both single and without children are at greatest risk of opioid misuse. These findings hold even when controlling for factors that might influence marriage, childbearing, and opioid use: educational attainment, employment status, age, race/ethnicity, and metropolitan status.
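
The age-adjusted probabilities described above are the kind of quantity produced by a regression model with covariate adjustment. The sketch below shows, in generic form, how such adjusted probabilities could be computed with a logistic regression; the file name, column names, and model specification are illustrative assumptions, not the authors' analysis code or the NSDUH variable names.

```python
# Minimal sketch: covariate-adjusted probability of opioid misuse via
# logistic regression. All names below (the CSV file, opioid_misuse,
# married, has_children, education, employed, age, race_ethnicity, metro)
# are hypothetical placeholders, not actual NSDUH variables.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nsduh_young_adults.csv")  # hypothetical analysis file, ages 18-34

results = smf.logit(
    "opioid_misuse ~ C(married) * C(has_children) + C(education) "
    "+ C(employed) + age + C(race_ethnicity) + C(metro)",
    data=df,
).fit()

# Average predicted probability for a 'disconnected' profile (unmarried,
# no children), holding all other covariates at their observed values.
disconnected = df.assign(married=0, has_children=0)
print(results.predict(disconnected).mean())
```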

A lack of social ties, such as those provided through marriage and children, can lead to social isolation and put young adults at greater risk of opioid misuse than adults with strong family and community ties. The presence of children in the household might be particularly important for otherwise disconnected young adults, because children may provide meaning, social bonds, links to institutions, and constraints on time use that are associated with lower probability of opioid misuse.

Credit: 
Syracuse University

BMI1, a promising gene to protect against Alzheimer's disease

image: Healthy neurons (left) and Alzheimer's disease neurons (right): in green, accumulation of G4 structures in Alzheimer's neurons.

Image: 
Gilbert Bernier

Another step towards understanding Alzheimer's disease has been taken at the Maisonneuve-Rosemont Hospital Research Centre. Molecular biologist Gilbert Bernier, a professor of neurosciences at Université de Montréal, has discovered a new function for the BMI1 gene, which is known to inhibit brain aging. The results of his work have just been published in Nature Communications.

In his laboratory, Bernier established that BMI1 is required to prevent the DNA of neurons from becoming disorganized into so-called G4 structures (G-quadruplexes). This phenomenon occurs in the brains of people with Alzheimer's disease, but not in healthy elderly people. BMI1 thus appears to protect against Alzheimer's by preventing, among other things, the excessive formation of G4s that disrupt the functioning of neurons.

"This discovery adds to our knowledge of the fundamental mechanisms leading to Alzheimer's," said Bernier. "There is still no cure for this disease, which now affects nearly one million Canadians. Any advance in the field brings hope to all these people and their families."

In previous articles published in the journals Cell Reports and Scientific Reports, Bernier demonstrated that the expression of the BMI1 gene is specifically reduced in the brains of people with Alzheimer's disease. He also showed that inactivation of BMI1 in cultured human neurons or in mice was sufficient to recapitulate all the pathological markers associated with Alzheimer's disease.

Credit: 
University of Montreal

Pandemic exacerbates challenges for international energy transition

The Covid-19 crisis is deepening the divide between energy transition frontrunners and laggards. In a new publication, researchers from the Institute for Advanced Sustainability Studies (IASS) in Potsdam present an overview of the global impact of the coronavirus pandemic on the energy sector. Their findings show that low- and middle-income countries need more support in their efforts to ditch fossil fuels.

The crisis will heighten existing imbalances in an uneven energy transition landscape. Despite the crisis, frontrunners in the global energy transition will continue to expand their renewable energy capacities, while laggards will fall further behind. In Europe, the Green Deal is having an effect and has even encouraged an otherwise sluggish Poland to accelerate its incipient energy transition. In low- and middle-income countries, on the other hand, the pandemic is exacerbating financing challenges and hampering investment in renewable energy infrastructure. This is particularly evident in Latin America, where all auctioning activity in the renewable power sector has ceased. In a number of G20 countries, pandemic-related financial support for fossil fuel industries has dampened efforts to support renewable energies.

Crisis has deepened dependency on fossil fuels

In countries whose economies are heavily dependent on fossil fuels, governments have pledged significant support to these sectors, further entrenching existing dependencies. In Indonesia, for example, the government has chosen to support the country's coal industry with tax breaks and by lowering regulatory requirements, while plans to replace older, fossil-fuelled power plants with renewable energy solutions have been scaled back.

However, aid for the fossil energy industry does not always go hand in hand with a slowdown in the transition to clean energy, explains lead author Rainer Quitzow: "In some large countries, our observations suggest contradictory trends. The USA and Canada, both major oil and gas exporters, were hit hard by the collapse in demand, with the governments of both countries pledging to support the fossil sector. However, it seems unlikely that this will slow the growth of renewables in these countries for the time being, as energy transition pioneers like California continue to move away from fossil fuels." In China, provincial governments have opted to prioritize investment in coal-fired power plants and oil refineries, while the central government has continued to increase its growth targets for the clean energy sector.

In the Global South, the crisis has added to the challenges presented by an already difficult investment climate for clean energy projects. On the one hand, falling government revenues are raising concerns about the sustainability of public debt burdens, leading to a depreciation of currencies and increased borrowing costs. Since renewable energies are particularly capital-intensive, this disproportionately affects investments in renewables. On the other hand, economic hardship related to the pandemic is leading to a rise in defaults on consumer electricity bills. In several countries, governments have responded by reducing electricity prices or suspending the billing of residential electricity consumers. This puts additional pressure on the utility sectors in these countries, adding to investment risks in the power sector.

Global Green Deal could raise ambitions

Scientists and policymakers must redouble their efforts to support transitions to a clean energy future in the least developed countries, explains Quitzow: "It is high time that the plight of fossil-fuel dependent economies is addressed at the international level. The Covid-19 crisis has underlined the urgent need to develop programs similar to the European Green Deal in fossil fuel-dependent countries and regions around the world. This will require the development of new international partnerships and financing arrangements that are explicitly tailored to the challenges of fossil-fuel dependent regions." A concerted international effort is needed to confront the twin challenges of economic recovery and the global fight against climate change.

Credit: 
Research Institute for Sustainability (RIFS) – Helmholtz Centre Potsdam