Rising greenhouse gases pose continued threat to Arctic ozone layer

image: Stratospheric clouds above the Arctic, like those seen here over Kiruna, Sweden, provide ideal conditions for chemical reactions that transform chlorine to a form that depletes the Earth's protective ozone layer. New research shows that unless greenhouse gas emissions are reduced, climate patterns that favor the formation of such clouds will continue to accelerate ozone loss.

Image: 
Ross Salawitch/UMD

There is a race going on high in the atmosphere above the Arctic, and the ozone layer that protects Earth from damaging ultraviolet (UV) radiation will lose the race if greenhouse gas emissions aren't reduced quickly enough.

A new study from an international team of scientists, including University of Maryland Professor Ross Salawitch, shows that extremely low winter temperatures high in the atmosphere over the Arctic are becoming more frequent and more extreme because of climate patterns associated with global warming. The study also shows that those extreme low temperatures are causing reactions among chemicals humans pumped into the air decades ago, leading to greater ozone losses.

The new findings call into question the commonly held assumption that ozone loss would grind to a halt in just a few decades following the 2010 global ban on the production of ozone-depleting chemicals called chlorofluorocarbons (CFCs) and halons.

The study--which was jointly conducted by UMD, the Alfred Wegener Institute's Helmholtz Centre for Polar and Marine Research, and the Finnish Meteorological Institute--was published in the journal Nature Communications on June 23, 2021.

"We're in a kind of race between the slow and steady decline in CFCs, which take 50 to 100 years to go away, and climate change, which is causing polar vortex temperature extremes to become colder at a rapid pace," said Ross Salawitch, who is a professor in the UMD Department of Atmospheric and Oceanic Science, the Department of Chemistry and Biochemistry, and the Earth System Science Interdisciplinary Center. "The increasingly cold temperatures create conditions that promote ozone depletion by CFCs. So, even though these compounds are slowly going away, Arctic ozone depletion is on the rise as the climate changes."

New data from the study showed the lowest Arctic polar vortex temperatures and the highest ozone losses on record in 2020, beating the previous records set in 2011.

The polar vortex is a relatively self-contained, low-pressure system that forms in the stratosphere--at an altitude of about 12 to 50 kilometers (7.5 to 31 miles)--over the Arctic every autumn and persists for varying durations from winter into spring. The pattern of warm and cold winter temperatures in the polar vortex is very irregular, so not every winter is extremely cold.

But the trend toward more frequent and more extreme low temperatures in the polar vortex concerns the researchers, because those conditions promote the formation of clouds, which in turn promote ozone loss in the polar stratosphere.

Most of the chlorine and a significant amount of the bromine in the stratosphere come from the breakdown of CFCs, halons and other ozone-depleting substances. Normally within the Arctic polar vortex the chlorine is non-reactive, but clouds provide the right conditions for the chlorine to change form and react with bromine and sunlight to destroy ozone.

Despite the drastic reduction of industrial production of CFCs and halons since the Montreal Protocol in 1987 and the global ban that followed in 2010, these long-lasting compounds are still abundant in the atmosphere. According to the World Meteorological Organization, atmospheric chlorine and bromine produced by humans are not expected to fall below 50% of their highest levels until the end of this century.

To determine what this situation means for the future, the researchers projected ozone loss out to the year 2100 based on the long-term temperature trend in the polar vortex and the expected decline in chlorine and bromine compounds. They based their predictions on the output from 53 top climate models used by the Intergovernmental Panel on Climate Change.

"All but one of the climate models we looked at show that exceptionally cold winters in the polar vortex will get colder over time," Salawitch said. "And the more greenhouse gas emissions there are, the steeper the trend, which means greater ozone depletion."

Combining these projections with analyses of meteorological data from the past 56 years, the researchers confirmed that the Arctic is already experiencing a significant trend toward lower stratospheric temperatures and associated increases in ozone losses. What's more, their observations reveal that these trends are occurring at a rate consistent with the fastest climate models.
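
As a rough illustration of the kind of trend analysis described here -- this is not the authors' code, and the numbers are invented placeholders -- the following sketch fits a linear trend to a 56-year record of winter-minimum polar vortex temperatures and extrapolates it to 2100:

```python
# Illustrative sketch only: synthetic data standing in for the kind of
# meteorological record analyzed in the study.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1965, 2021)  # a 56-year record
# Hypothetical winter-minimum stratospheric temperatures (K) with a
# slight cooling trend plus year-to-year variability.
temps = 195.0 - 0.05 * (years - years[0]) + rng.normal(0.0, 1.5, years.size)

# Least-squares linear trend (K per year) and its extrapolation to 2100.
slope, intercept = np.polyfit(years, temps, 1)
print(f"fitted trend: {slope:.3f} K/yr")
print(f"projected winter minimum in 2100: {slope * 2100 + intercept:.1f} K")
```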

"We have been saying that a train is coming for a number of years now," said Salawitch, pointing to research papers he published in 2004 and 2006 that showed extreme winters in the Arctic were becoming colder. "We've now seen the train whizzing by with record ozone loss in 2011 and now in 2020. So, this paper is really a wake-up call that something is happening in the atmosphere that's really important for ozone, and it looks like greenhouse gases are driving it."

Salawitch and his colleagues do not yet fully understand how increasing greenhouse gas emissions and the associated changes to global climate are causing the extreme cold winters in the stratospheric layer of the polar vortex. But some of the underlying mechanisms are understood. Global warming occurs in part because greenhouse gases trap heat closer to Earth's surface, which allows the upper layers of the atmosphere, including the stratosphere, where the ozone layer is located, to cool. Warming at the surface causes changes to prevailing wind patterns, and the researchers suggest that these changes also produce lower temperatures in the polar vortex.

The researchers also note that recent years have seen a rapid increase in methane, a more powerful greenhouse gas than carbon dioxide, in the lower atmosphere. As this gas travels to the stratosphere, it increases humidity, which also leads to conditions that promote ozone-destroying chemical reactions in the Arctic.

Because ozone filters much of the sun's potentially harmful UV radiation, a depleted ozone layer over the Arctic can result in more UV radiation reaching the surface of the Earth over Europe, North America and Asia when the polar vortex dips south.

But there is hope for avoiding future ozone depletion, according to the researchers. Their study shows that substantial reductions in greenhouse gas emissions over the coming decades could lead to a steady decline in conditions that favor large ozone loss in the Arctic stratosphere.

The research paper, "Climate change favours large seasonal loss of Arctic ozone," by Peter von der Gathen, Rigel Kivi, Ingo Wohltmann, Ross J. Salawitch and Markus Rex, was published in the journal Nature Communications on June 23, 2021.

Credit: 
University of Maryland

Some seafloor microbes can take the heat: And here's what they eat

image: Bathymetric map of Guaymas Basin annotated with sampling sites from Atlantis expedition AT37-06 in 2016.

Image: 
Teske et al, Front. Microbiol., 2021. Based on a template courtesy of C. Mortera, UNAM.

WOODS HOLE, Mass. -- It's cold in the depths of the world's oceans; most of the seafloor is at a chilly 4°C. Not so the seafloor of Guaymas Basin in the Gulf of California. Here, tectonic plates drift apart and heat from Earth's interior can rise up -- so far up that it bakes large areas of the seafloor sediments, turning buried organic matter into methane and other energy-rich compounds.

What kinds of organisms thrive in this oceanic hotspot? In two new studies, MBL Assistant Scientist Emil Ruff and collaborators show that distinct regions within the Basin harbor specially adapted microorganisms; discover new microbial inhabitants of this deep-sea community; and suggest how the community may be dramatically influencing carbon cycling in the hot seafloor sediments.

Right at the seafloor where the geothermal heat meets the cold deep ocean, the sediments often sit at a cozy 30-60 °C, ideal temperatures for heat-loving microbes (thermophiles). These exotic heat-lovers can use methane as an energy source and thrive in seascapes that are so different from most other ecosystems on Earth that they could well exist on another planet entirely. The methane munchers and other organisms that use the chemical energy of the hydrothermal fluids are the base of the food web, without which the ecosystem would not be possible. In the first study, Teske et al. show that these methane munchers and other microbes are specially adapted to distinct thermal and geochemical regimes within the Basin.

The microbial communities in these hydrothermal sediments are very diverse, yet only a few organisms can use methane as an energy source. So, what is everyone else doing?

A large part - or most - of the microbial diversity seems to consist of organisms that - like humans - can only use reduced organic compounds (such as sugars, proteins and fatty acids) for energy. These organisms, called heterotrophs, must live in some way off the biomass that rains down from the surface ocean or is produced by the methane munchers and other primary producers.

It is a long-standing question what compounds these heterotrophs use to make a living and why so many different species can live side-by-side without outcompeting each other. In the second study, Sherlynette Pérez Castro, a postdoctoral scientist in Ruff's lab at MBL, and collaborators show that certain heat lovers specialize in degrading the "debris" that is released to the environment when other cells perish: organic polymers and macromolecules. (See Pérez Castro's "Behind the Paper" blog post in Nature Microbiology.)

Every cell, be it a microbial or a human cell, mainly consists of four types of macromolecules: proteins, nucleic acids (DNA, RNA), lipids (fatty acids) and polysaccharides (sugars). The researchers used each of these four compound classes in turn as the sole energy and carbon source to grow and identify the deep-sea organisms that can make a living on the respective compound.

They found that all of the organisms they could cultivate in their lab experiments belonged to previously uncultivated microbial species. The experiments also showed that each polymer nourished a whole food web of organisms, which explains how a single type of molecule can sustain a zoo of species and suggests a reason for the high diversity of coexisting heterotrophs.

To their surprise, none of the 48 different cultures produced methane, a common end product of heterotrophic organisms. This could mean that the methane that is emitted at the seafloor is completely removed from the ecosystem by the microbial communities, which has implications for the deep-sea carbon cycle that remain to be explored.

Credit: 
Marine Biological Laboratory

Marine sediments explain how part of Brazil's Northeast region became semi-arid

image: Collecting samples of marine sediment near the mouth of the Parnaíba River. Precipitation in the north of Brazil's Northeast region has systematically decreased in the last 5,000 years

Image: 
Cristiano Chiessi

Rainfall associated with the Intertropical Convergence Zone (ITCZ), the belt of converging trade winds and rising air that encircles the Earth near the Equator, affects the food and water security of approximately 1 billion people worldwide. They include about 11% of the Brazilian population, concentrated in four states of the Northeast region – Rio Grande do Norte, Ceará, Piauí, and Maranhão. Large swathes of these states have a semi-arid climate, and about half of all their annual rainfall occurs in only two months (March and April), when the tropical rain belt reaches its southernmost position, over the north of the Northeast region. During the rest of the year, the tropical rain belt shifts further north. For example, it is responsible for peak rainfall in the coastal region of Venezuela in July and August.

Projecting the future behavior of precipitation in semi-arid areas like these is fundamental for society to be able to anticipate possible shifts in rainfall patterns due to ongoing climate change. A study by Cristiano Chiessi, a professor at the University of São Paulo (USP) in Brazil, and collaborators shows that precipitation in the north of Brazil’s Northeast region has systematically decreased in the last 5,000 years, contrary to an important paradigm in paleoclimatology. This revised view of what happened in the past helps produce a more realistic scenario for what may happen in the future.

An article on the study is published in Paleoceanography and Paleoclimatology. The study was supported by FAPESP.

“According to the prevailing paradigm, the tropical rain belt has migrated southward in the last 5,000 years. Our research suggests instead that its latitudinal oscillation range contracted, so that it now oscillates within a narrower band,” Chiessi told Agência FAPESP.

Valuable information about the responses of the climate system to different conditions is recorded in geological sediments deposited on the seabed. The study draws on three independent indicators of precipitation derived from sediments collected near the mouth of the Parnaíba River on the Piauí-Maranhão border.

“We analyzed the ratio between levels of the chemical elements titanium and calcium. The titanium comes from continental rock erosion, while the calcium comes from the shells of marine organisms,” Chiessi said. “We also estimated the rate at which continental sediments accumulated on the seabed, and the composition of hydrogen isotopes in continental plant wax found in marine sediment. These three datasets, together with our analysis of numerical climate model outputs, pointed to contraction of the tropical rain belt in the last 5,000 years, rather than the suggested southward migration.”
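
To illustrate how the first of those indicators is read, here is a minimal sketch with invented numbers (not the study's data): a Ti/Ca ratio that rises back through time signals more river-borne continental input, and hence wetter conditions on land in the past:

```python
# Illustrative sketch only: made-up elemental counts from a sediment core.
import numpy as np

age_ka = np.array([0, 1, 2, 3, 4, 5])  # sediment age (thousand years)
ti = np.array([800, 900, 1000, 1150, 1300, 1450])    # titanium (continental erosion)
ca = np.array([5000, 4900, 4800, 4600, 4500, 4300])  # calcium (marine shells)

ti_ca = ti / ca  # higher ratio = more terrigenous input = wetter continent
slope = np.polyfit(age_ka, ti_ca, 1)[0]
print(f"Ti/Ca rises by {slope:.3f} per kyr back in time: wetter 5,000 years ago")
```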

The study also shows that surface temperature distribution in the two hemispheres is a key factor in the tropical rain belt’s position, also in contrast with the prevailing paradigm.

“According to the paradigm, southward migration of the tropical rain belt was due to a gradual increase in radiation received from the Sun by the southern hemisphere during the summer. The opposite occurred in the northern hemisphere, increasingly hindering northward migration of the tropical rain belt. However, our attention was drawn to two weaknesses in this model,” Chiessi said. “The first was the assumption that the rain belt’s position was determined by surface temperature distribution in both hemispheres, which don’t necessarily respond in a linear manner to the distribution of solar radiation. Secondly, the evidence supporting the paradigm was located almost solely in the northern hemisphere. There was no proof of the migration in the southern hemisphere.”

Although solar radiation did undergo the changes described, he went on, responses in the hemispheres were different owing to the difference in the area of the continents and oceans in each one (continents respond faster than oceans to changes in solar radiation). “It’s therefore necessary to revise the paradigm that has influenced paleoclimatology for two decades,” he said.

Numerical climate models suggest that by the end of this century the tropical rain belt’s latitudinal oscillation range will contract, further reducing rainfall in the northern portion of Brazil’s Northeast region, with potentially severe social and environmental consequences. However, if the Atlantic meridional overturning circulation (AMOC) becomes significantly weaker, reaching the tipping point predicted in another study by Chiessi, the warming of the South Atlantic will exceed that of the North Atlantic, forcing the rain belt southward. “This would have negative consequences in various parts of the world. In Brazil, however, it would prevent an even greater decrease in precipitation in the northern portion of the Northeast,” he said (more at: agencia.fapesp.br/23134/). 

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Not all dietary proteins are created equal

Dietary protein is needed to supply essential amino acids for the synthesis of the structural and functional components of living cells. Thus, food protein quantity and quality are both essential for good health. The 2020-2025 Dietary Guidelines for Americans (DGAs) published an "ounce equivalents" recommendation to help consumers meet protein requirements with a variety of protein food sources. For example, the DGAs present a variety of "ounce equivalents" in the protein food groups, stating that 1 ounce of meat is equivalent to 1 cooked egg, ¼ cup of red kidney beans, 1 tablespoon of peanut butter, 2 ounces of tofu, or ½ ounce of mixed nuts. However, the DGAs do not currently address differences in protein quality across food sources. In general, animal proteins have higher protein digestibility and a better essential amino acid profile relative to dietary requirements. These measures of protein quality indicate that animal proteins can more readily provide the daily requirement of essential amino acids than plant proteins.

A new study recently published in The Journal of Nutrition investigated the physiological response to various ounce equivalents of protein food sources and found that consuming ounce equivalents of animal-based protein food sources resulted in a greater gain in whole-body net protein balance above baseline than ounce equivalents of plant-based protein food sources. (1) Robert Wolfe (University of Arkansas for Medical Sciences) and colleagues randomly assigned 56 young healthy adult participants to one of seven food intervention groups: 2 ounces of cooked beef sirloin, 2 ounces of cooked pork loin, 2 cooked eggs, ½ cup of red kidney beans, 2 tablespoons of peanut butter, 4 ounces of tofu, or 1 ounce of mixed nuts. Prior to the onset of the study, participants followed a 3-day weight-maintenance diet. Participants' net whole-body protein balance was assessed using a stable isotope tracer infusion protocol. The changes following consumption of the different protein food sources were compared with the baseline value for each individual.

Overall, investigators found that animal-based protein food sources elicited greater anabolic responses than plant-based protein food sources. Whole-body protein balance increased more in the beef, pork and egg groups than in all of the groups consuming plant-based protein food sources. Protein synthesis increased more in the beef group than in the groups consuming the plant-based protein foods (kidney beans, peanut butter or mixed nuts), while the egg and pork groups suppressed protein breakdown more than the mixed nuts group. The magnitude of the whole-body net balance response was correlated with the essential amino acid content of the protein food source. The researchers concluded that "ounce equivalents" of protein food sources as expressed in the DGAs are not metabolically equivalent in terms of either anabolic response or caloric value, and that this should be considered as the DGAs develop approaches to establish healthy eating patterns.

"Our research illustrates that animal-based protein foods, such as beef, eggs and pork, and plant-based protein foods, such as kidney beans, peanut butter, tofu and mixed nuts, cannot be considered to be equivalent, or a substitute for each other, when developing healthy dietary patterns, given their unique physiological effects," said lead researcher Robert Wolfe, PhD, Director, Center for Translational Research in Aging and Longevity, and Professor of Geriatrics, University of Arkansas for Medical Sciences. "While it's well-established that animal proteins can more readily provide essential amino acids than plant protein foods, our study also indicates that eating animal protein foods such as beef, pork and eggs may lead to increased protein synthesis, which has been shown to have benefits such as improved satiety and lean muscle mass maintenance."

A corresponding editorial by Glenda Courtney-Martin (University of Toronto) stresses the importance and timely contribution of this study, which could guide future decisions regarding how protein foods can be better categorized by the DGAs. (2)

Credit: 
FoodMinds LLC

Metal catalysts used for environmental sustainability found to degrade and become less effective

image: Some of the complex structural arrangements of catalysts; the structure on the left is known as a branched structure and the one on the right as a cage structure.

Image: 
Professor Anna Klinkova of the University of Waterloo

New research is showing that some tiny catalysts being considered for industrial-scale environmental remediation efforts may be unstable during operation.

Chemists from the University of Waterloo studied the structures of complex catalysts known as "nanoscale electrocatalysts" and found that they are not as stable as scientists once thought. When electricity flows through them during use, the atoms may rearrange. In some cases, the researchers found, electrocatalysts degrade completely.

Understanding why and how this rearrangement and degradation happens is the first step to using these nanoscale electrocatalysts in environmental remediation efforts such as removing atmospheric carbon dioxide and groundwater contaminants and transforming them into higher-value products such as fuels.

"Current electrocatalysts rely on complex nanoscale structures in order to optimize their efficiency," said Anna Klinkova, a professor in Waterloo's Department of Chemistry. "What we found, however, is that the superior performance of these complex nanomaterials often come at a cost of their gradual structural degradation, as there is a trade-off between their effectiveness and stability."

Klinkova and her team discovered that the rearrangement of atoms in a catalyst depended on its metal, its structural shape, and the reaction conditions.

They identified two causes of the rearrangements. Some small molecules can temporarily attach to the surface of the catalyst and reduce the energy needed for an atom to move across the surface. In other cases, narrow regions within the catalyst concentrate the flow of electrons, causing the metal atoms to displace via a process called electromigration.

Electromigration has been previously identified in microelectronics, but this is the first time it has been connected to nanoscale catalysts.

These findings establish a framework for assessing structural stability and mapping the changing geometry of nanoscale catalysts, which is an important step to designing better catalysts in the future.

"These structural effects could be used as one of the design rules in future catalyst development to maximize their stability," Klinkova said. "You could also purposefully induce reconstruction to a different structure that becomes active as the reaction starts."

The study, "Interplay of electrochemical and electrical effects induces structural transformations in electrocatalysts," was recently published in the journal Nature Catalysis.

Credit: 
University of Waterloo

Surgical treatment of brain tumors should also be considered for the elderly

image: Magnetic resonance image of a meningioma.

Image: 
Ilari Rautalin

Meningiomas, which originate in the meninges surrounding the brain, are the most common type of benign brain tumour. The primary treatment for meningiomas is neurosurgery. Since the risks associated with surgical treatment increase as people get older and develop other diseases, patients over 80 with brain tumours are hardly operated on anywhere in the world.

In Finland, the life expectancy and functional capacity of the elderly population have improved in recent decades, and the number of elderly brain tumour patients in good condition is growing continuously. This is why surgery has become increasingly common at the Neurosurgery Clinic of the Helsinki University Hospital in the treatment of elderly patients who have lost their functional capacity due to a brain tumour.

Since prior research evidence is scarce, the University of Helsinki and the Helsinki University Hospital surveyed surgical outcomes in a study covering all meningioma patients aged 80 or older treated surgically at the Neurosurgery Clinic since 2010, 83 patients in total. The results were recently published in the journal Scientific Reports.

Surgical treatment improves the prognosis for elderly brain tumour patients

According to the results, carefully considered surgical treatment carried out at a high-quality university hospital appears to be beneficial even for elderly brain tumour patients. The long-term prognosis of surgically treated patients did not differ from the life expectancy of the rest of the Finnish population of the same age. In addition, almost half of the patients who had ended up in institutional care due to a brain tumour were able to return home thanks to the surgical treatment.

"Our findings demonstrate that surgically removing a tumour can improve life quality and even save lives in even very old brain tumour patients, especially when taking into consideration the poor prognosis, without surgical treatment, for over 80-year-old brain tumour patients who have lost their functional capacity," says Ilari Rautalin, the principal investigator of the study.

Docent Miikka Korja, head of the Department of Neurosurgery at the Helsinki University Hospital and supervisor of the study, emphasises the importance of the findings from the perspective of neurosurgeons. Korja is one of the neurosurgeons who carry out meningioma surgeries on elderly patients.

"These are demanding operations with a high risk of complications. This is why we have had to consider, on a case-by-case basis, whether these relatively fragile elderly persons are able to tolerate such a surgery, which is stressful and demanding for the body, or whether we are ultimately causing more harm than good. Our preliminary investigations offer relief and encouragement - our surgical treatments make it possible for some elderly patients to return home from inpatient care, improving their functional capacity for the rest of their lives in the process," Korja says.

Credit: 
University of Helsinki

Nrf2: The custodian regulating oxidative stress and immunity against acrylamide toxicity

image: Genetic ablation of Nrf2 exacerbates neurotoxic effects of acrylamide in mice.

Image: 
Tokyo University of Science

Acrylamide is a toxic chemical compound that affects the nervous system. Not only is it widely used in industries such as paper production, plastics, and wastewater management, but it is also a byproduct of commonly used food processing methods, which makes human exposure to acrylamide inevitable. Therefore, many studies have focused on understanding the toxic effects of acrylamide and our body's response to them. Generally, in response to toxicity, the body's cells release protective factors and antioxidants to remedy the damage. This response is activated by various cellular machinery. One such activator is a protein called "nuclear factor erythroid 2-related factor 2" (Nrf2), which is a master regulator of the response to oxidative stress and the immune system.

In a recent study, a team of scientists, led by Prof. Gaku Ichihara from Tokyo University of Science, reported the role of Nrf2 in acrylamide-induced neurotoxicity. Prof. Ichihara states, "Our study showed that Nrf2 has a protective role against neurologic damage and suggests it is through activation of antioxidant stress genes and suppression of proinflammatory cytokine genes."

In their study published in the journal Toxicology, Prof. Gaku Ichihara, along with his colleagues Prof. Masayuki Yamamoto from Tohoku University, Prof. Ken Itoh from Hirosaki University, Associate Prof. Seiichiroh Ohsako from The University of Tokyo, and Prof. Sahoko Ichihara from Jichi Medical University, used mouse models to study the role of Nrf2 in acrylamide-induced neurotoxicity. They tested the hypothesis that removing the Nrf2 gene would amplify the neurotoxic effects of acrylamide. For this, they developed "knockout" mice that could not produce Nrf2, and gave both the Nrf2-knockout mice and a set of counterpart "wild-type" mice that could produce Nrf2 different concentrations of acrylamide for 4 weeks. They then compared the neurotoxicity between the two groups of mice using various sensory and motor tests, immunohistochemistry, and protein and gene expression analyses.

The scientists found that the Nrf2-knockout mice had severe neurotoxic effects such as sensory and motor system dysfunction and axonal damage. While these mice produced fewer antioxidants and protective factors in response to acrylamide, they also showed enhanced release of pro-inflammatory chemicals, called "cytokines," in the brain, which can potentially cause additional damage. Additionally, as different doses were given to the mice, the scientists also determined that the neurotoxicity was dose-dependent.

Previous studies have established the role of Nrf2 as a master regulator of protective genes, but this study explains the specific mechanisms of the immune response to acrylamide-induced toxicity, with Nrf2 at the center of it all. As Prof. Ichihara states, "The results document the first known morphological and neuro-functional evidence of the regulatory role of Nrf2 in acrylamide-induced neurotoxic effects in mice."

The findings of this study are also immensely valuable in the field of disease biology, as recent studies have shown a link between air pollution and Alzheimer's disease. Since the air contains other acrylamide-like chemical pollutants with similar neurotoxic effects, the study's findings could prove useful in the prevention of Alzheimer's disease.

Prof. Ichihara and his team's study is certainly a timely one, as reports of acrylamide intoxication are on the rise and further research is required to better understand the specific mechanisms by which the body protects itself from harm.

Credit: 
Tokyo University of Science

New high-speed method for spectroscopic measurements

image: Conceptual image of the method of using spectrally varying polarization states for high-speed spectroscopic measurements.

Image: 
Frederic Bouchard / National Research Council of Canada

Researchers at Tampere University and their collaborators have shown how spectroscopic measurements can be made much faster. By correlating polarization to the colour of a pulsed laser, the team can track changes in the spectrum of the light by simple and extremely fast polarization measurements. The method opens new possibilities to measure spectral changes on a nanosecond time scale over the entire colour spectrum of light.

In spectroscopy, the changes in wavelength, i.e. colour, of a probe light are often measured after interaction with a sample. Studying these changes is one of the key methods for gaining a deeper understanding of the properties of materials down to the atomic level. Its applications range from astronomical observations and material studies to fundamental investigations of atoms and molecules.

The research team has demonstrated a novel spectroscopic method which has the potential to speed up measurements to read-out rates that are impossible with conventional schemes. The results have now been published in the prestigious journal Optica.

Spectroscopic measurements usually rely on separating the different colour components to different positions, where the spectrum can then be read out by a detector array similar to a camera chip. While this approach enables a direct inspection of the spectrum, it is rather slow due to the limited speed of the large read-out array. The new method that the researchers implemented circumvents this restriction by generating a more complex state of laser light, thereby allowing a faster measurement scheme.

"Our work shows a simple way to have different polarizations for all colour components of the laser. By using this light as a probe, we can simply measure the polarization to gain information about changes in the colour spectrum," explains Doctoral Researcher Lea Kopf, lead author of the study.

The trick the researchers use is to perform a modulation in the temporal domain by coherently splitting a femtosecond laser pulse into two parts, each having a different polarization and slightly delayed in time with respect to the other.

"Such a modulation can easily be done using a birefringence crystal, where differently polarized light travels at different speeds. This leads to the spectrally-changing polarization required for our method," describes Associate Professor Robert Fickler, who leads the Experimental Quantum Optics group in which the experiment was performed.

High-speed spectroscopic measurements

The researchers have not only demonstrated how such complex states of light can be generated in the lab; they also tested their application in reconstructing spectral changes using only polarization analysis. As the latter only requires up to four simultaneous intensity measurements, a few very fast photodiodes can be used.
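As an aside on why four intensity read-outs suffice: a polarization state is fully described by the four Stokes parameters, which can be recovered from four analyzer measurements. The sketch below shows the textbook reconstruction; it illustrates the principle and is not code from the paper (the paper's exact analyzer set may differ):

```python
# Textbook Stokes-parameter reconstruction from four intensity measurements,
# taken behind horizontal, vertical, +45-degree linear, and right-circular
# analyzers.
def stokes_parameters(i_h, i_v, i_45, i_rc):
    s0 = i_h + i_v        # total intensity
    s1 = i_h - i_v        # horizontal vs. vertical linear component
    s2 = 2.0 * i_45 - s0  # +45 vs. -45 degree linear component
    s3 = 2.0 * i_rc - s0  # right- vs. left-circular component
    return s0, s1, s2, s3

# Example: purely horizontally polarized light.
print(stokes_parameters(1.0, 0.0, 0.5, 0.5))  # -> (1.0, 1.0, 0.0, 0.0)
```
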

Using this approach, the researchers can determine the effect of narrowband modulations of the spectrum at a precision that is comparable to standard spectrometers but at high speed. "However, we couldn't push our measurement scheme to its limits in terms of possible read-out rates, as we are limited by the speed of our modulation scheme to a few million samples per second," continues Lea Kopf.

Building on these promising initial results, future work will include applying the idea to more broadband light, such as supercontinuum light sources, and applying the scheme to spectroscopic measurements of naturally fast-varying samples to exploit its full potential.

"We are happy that our fundamental interest in structuring light in different ways has now found a new direction, which seems to be helpful for spectroscopy tasks which are usually not our focus. As a quantum optics group, we have already started discussing how to apply and benefit from these ideas in our quantum photonics experiments," adds Robert Fickler.

Credit: 
Tampere University

New report from VA-BU-Concussion Legacy Foundation Brain Bank marks 1,000 brain donations milestone with inside look at CTE research

(BOSTON) - Research collaborators from the VA, Boston University, and the Concussion Legacy Foundation (CLF) published an inspiring new report today, "1,000 Reasons for Hope," which exclusively details the first 1,000 brain donors studied at the VA-BU-CLF Brain Bank since 2008 and how they have advanced research on concussions and CTE. The report also explains how the next 1,000 brain donors will answer critical questions that take us closer to preventing, diagnosing, and treating CTE, as well as the long-term consequences of concussion and traumatic brain injury.

"Our understanding of CTE is far behind that of other neurodegenerative disease like Alzheimer's Disease and ALS," said Dr. Ann McKee, chief of neuropathology for the VA Boston Healthcare System and director of the VA-BU-CLF Brain Bank. "Each case we have the honor to study accelerates the science of CTE. Thanks to our Legacy Donors, incredible team and growing national and international collaborations, we are now on the cusp of major breakthroughs."

"Thanks to our Legacy Donors and their families, as well as our incredible research team, we now have a roadmap for how to diagnose and treat CTE during life," said Chris Nowinski, PhD, CLF co-founder and CEO. "Our discoveries have already inspired changes to sports that will prevent many future cases of CTE in the next generation of athletes."

Newly released data found only in "1,000 Reasons for Hope" shows the VA-BU-CLF Brain Bank has received tissue from donors who died as young as age 14 and as old as age 98, representing more than 30 different primary exposures to brain trauma.

Football players are by far the most represented group, with 706 men having football as their primary exposure to head impacts, including 305 former NFL players. Military veterans are the next largest group (66) followed by ice hockey (45), boxing (30), soccer (24), and rugby (18).

The report also reveals the race and sex of the first thousand Legacy Donors: 77% white, 18% Black, and 5% other races. To ensure its discoveries benefit all, CLF is actively recruiting athletes and veterans from diverse racial backgrounds and exposures to head impacts. Recruiting female donors is a priority, as only 2.8% of the first thousand brains came from women. Everyone is invited to pledge to donate their brain or join CLF's clinical research registry at PledgeMyBrain.org.

The research highlighted in the report would not be possible without the generous contributions made by the families who donated the brain of their loved one. In "1,000 Reasons for Hope," some of those family members talked publicly about the donation process for the first time.

"I'm very grateful for the insight the Concussion Legacy Foundation has been able to provide me and my family," said Dwayne Johnson, the son of former professional wrestler Rocky Johnson. "Losing my dad without warning was a tough kick in the gut, but one of the saving graces of his passing was coming to understand just how healthy his brain was. As a professional wrestler his entire life, his brain endured a lot. I know he'd be proud knowing the donation of it has impacted brain research and hopefully can shed some light and understanding, not only in science, but also to other families around the world."

"It provided my family and I with final closure," said Gail Evans, widow of former NFL player James Evans. "It gave us light into a world that had darkened around us and provided us with understanding and peace."

With the next 1,000 Legacy Donors studied, VA-BU-CLF Brain Bank researchers and collaborators expect to make monumental progress in the fight against CTE and traumatic brain injuries. The report features insights from 17 of the world's top CTE researchers about what they expect to learn from the next thousand donors.

Credit: 
Boston University School of Medicine

Blaming the pandemic for stress leaves couples happier

When the COVID-19 pandemic hit during the winter of 2020, locking down entire countries and leaving people isolated in their homes without outside contact for weeks at a time, many relationship experts wondered what that kind of stress would do to romantic couples. What they found was that when couples blamed the pandemic for their stress, they were happier in their relationships.

The findings are outlined in a paper out today in the journal Social Psychological and Personality Science.

Previous research has shown that romantic partners tend to be more critical toward each other when experiencing common stress -- what researchers call stress spillover -- but major events such as natural disasters are not always associated with poor relationship functioning. Because these significant stressors are more noticeable than routine situations, people may be more aware that stress is affecting them and spilling over into the relationship.

"Because of this awareness, when major stressors occur, romantic partners may be less likely to blame each other for their problems and more likely to blame the stressor, which may reduce the harmful effects of stress on the relationship," said Lisa Neff, an associate professor of human development and family sciences at The University of Texas at Austin and one of the study's co-authors.

The COVID-19 pandemic brought a unique opportunity to study this phenomenon, with many couples suddenly working from home, spending more time together, trying to homeschool children, dealing with job losses and dealing with the fear and anxiety of a quickly spreading deadly virus. Researchers analyzed data collected from 191 participants during the early weeks of the pandemic and again seven months later. They found that although people were generally less happy in their relationship when they were experiencing more stress, the harmful effects of stress were weaker among those individuals who blamed the pandemic for their stress.

"Some people come together and they say, 'This is a stressful situation and we're going to tackle this together, and we're not going to blame each other for things that are hard or difficult,'" said Marci Gleason, associate professor of human development and family sciences at UT Austin.

Researchers initially thought that the protective effects of blaming the pandemic might fade over time, but that was not the case.

"Even though people have been under a lot of stress for a long time, the pandemic has continued to be a major headline in the news, which may keep it in people's awareness -- making it easier to keep blaming the pandemic and to reduce stress spillover by blaming the pandemic," Neff said. "Stress is often harmful, but the more we recognize it and where it's coming from, the more we can protect ourselves from it. Talking openly about that stress can weaken some its negative effects."

Credit: 
University of Texas at Austin

Computers help researchers find materials to turn solar power into hydrogen

UNIVERSITY PARK, Pa. -- Using solar energy to inexpensively harvest hydrogen from water could help replace carbon-based fuel sources and shrink the world's carbon footprint. However, finding materials that could boost hydrogen production so that it could compete economically with carbon-based fuels has been, as yet, an insurmountable challenge.

In a study, a Penn State-led team of researchers reports it has taken a step toward overcoming the challenge of inexpensive hydrogen production by using supercomputers to find materials that could help accelerate hydrogen separation when water is exposed to light, a process called photocatalysis.

Both electricity and solar energy can be used to separate hydrogen from water, which is made up of two hydrogen atoms and an oxygen atom, according to Ismaila Dabo, associate professor of materials science and engineering, Institute for Computational and Data Sciences (ICDS) affiliate, and co-funded faculty member of the Institutes of Energy and the Environment. Using sunlight to generate electricity that then splits water into hydrogen -- electrolysis -- which would, in turn, likely be converted back into electricity, may not be technically advantageous or economically effective. While using solar energy directly to produce hydrogen from water -- photocatalysis -- avoids that extra step, researchers have not yet been able to make direct solar hydrogen conversion compete with carbon-based fuels, such as gasoline.

The researchers, who report their findings in Energy and Environmental Science, used a computational approach called high-throughput materials screening to narrow a list of more than 70,000 compounds down to six promising candidate photocatalysts, which, when added to water, can enable the solar hydrogen production process, said Dabo.

They examined the compounds listed in the Materials Project database, an online open-access repository of known and predicted materials. The team developed an algorithm to identify materials with properties that would make them suitable photocatalysts for the hydrogen production process. For example, the researchers investigated the ideal energy range -- or band gap -- for the materials to absorb sunlight. Working closely with Héctor Abruña, professor of chemistry at Cornell; Venkatraman Gopalan, professor of materials science and engineering at Penn State; and Raymond Schaak, professor of chemistry at Penn State, they also looked for materials that could effectively dissociate water and that offered good chemical stability.
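
In spirit, the screening step is a filter applied to every entry in the database. The sketch below is a hedged illustration: the records, the oxide restriction, and the 1.5-3.0 eV window for visible-light absorption are assumptions chosen for this example, not the paper's exact criteria:

```python
# Illustrative screening filter over a small, hand-written materials list.
# Band gaps shown are commonly cited experimental values.
candidates = [
    {"formula": "TiO2",  "band_gap_eV": 3.2, "is_oxide": True},
    {"formula": "BiVO4", "band_gap_eV": 2.4, "is_oxide": True},
    {"formula": "Si",    "band_gap_eV": 1.1, "is_oxide": False},
    {"formula": "Fe2O3", "band_gap_eV": 2.1, "is_oxide": True},
]

def is_candidate_photocatalyst(entry, gap_min=1.5, gap_max=3.0):
    """Keep oxides whose band gap sits in the assumed visible-light window."""
    return entry["is_oxide"] and gap_min <= entry["band_gap_eV"] <= gap_max

shortlist = [c["formula"] for c in candidates if is_candidate_photocatalyst(c)]
print(shortlist)  # -> ['BiVO4', 'Fe2O3']
```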

"We believe the integrated computational-experimental workflow that we have developed can considerably accelerate the discovery of efficient photocatalysts," said Yihuang Xiong, graduate research assistant and co-first author of the paper. "We hope that, by doing so, we will be able to reduce the cost of hydrogen production."

Dabo added the team focused on oxides -- chemical compounds made up of at least one oxygen atom -- because they can be synthesized in a reasonable amount of time using standard processes. The work required collaborations from across disciplines, which served as a learning experience for the research team.

"I found it very rewarding to have worked on such a collaborative project," said Nicole Kirchner-Hall, doctoral student and co-author of the paper. "As a graduate student specializing in computational material science, I was able to predict possible photocatalytic materials using calculations and work with experimental collaborators here at Penn State and other institutions to co-validate our computational predictions."

Other researchers have previously conducted an economic analysis of several options for using solar energy to produce electricity and determined that solar could drop the price of hydrogen production enough to compete with gasoline, said Dabo.

"Their essential conclusion was that if you were able to develop this technology, you could produce hydrogen at the cost of $1.60 to $3.20 per equivalent gallon of gasoline," said Dabo. "So, compare that to gasoline, which is around $3 a gallon, if this works, you could pay as low as $1.60 for about the same amount of energy as a gallon of gas in the ideal case scenario."

He added that if a catalyst can help boost solar hydrogen production, this could lead to a hydrogen price that is competitive with gasoline.

The team relied on Penn State ICDS's Roar supercomputer for the computations. According to Dabo, computers represent an important tool in speeding up the process to find the right materials to be used in specific processes. This computationally driven, data-intensive method could represent a revolution in efficiency over the painstaking trial-and-error approach.

"When Thomas Edison wanted to find materials for the light bulb, he looked at just about every material under the sun until he found the right material for the light bulb," said Dabo. "Here we're trying to do the same thing, but in a way to use computers to accelerate that process."

He added that computers will not replace experimentation.

"Computers can make the recommendations as to what materials will be the most promising and then you still need to do the experimental study," said Dabo.

Dabo said he expects the power of computers will streamline the process of finding the best candidates and dramatically cut the time it takes to design materials in the lab to bring them to market to address needs.

The researchers evaluated machine learning algorithms to make suggestions for chemicals that could be synthesized and used as catalysts in solar hydrogen production. Based on this preliminary investigation, they suggest future work may focus on developing machine learning models to improve the chemical screening process.

Dabo also added that they may look at chemical compounds outside of oxides to determine if they might serve as catalysts for solar hydrogen production.

"So far, we did one cycle of this process on oxides -- essentially rusted metals -- but there are a lot of compounds that could be made that aren't based on oxygen," said Dabo. "For example, there are compounds based on nitrogen or sulfur, that we could explore."

Credit: 
Penn State

Landmark field trials show potential of gene-editing

image: Field trials of gene-edited crops show immense potential

Image: 
John Innes Centre

Field trials investigating healthy compounds in agronomically important brassica crops have underlined the "immense potential" of gene editing technology, say researchers.

The trials are the first field application of the technology in the UK since the reclassification of gene-edited crops as genetically modified organisms by the Court of Justice of the European Union (CJEU) in 2018.

The results come as the UK Government is determining whether to allow gene-editing approaches for the purpose of food production, following a DEFRA-led public consultation.

"Our results demonstrate the immense potential for gene-editing to facilitate crop improvement by translating discoveries in fundamental biological processes," said Professor Lars Østergaard, a group leader at the John Innes Centre and one of the authors of the study.

"Modern technologies such as gene-editing by CRISPR provide opportunities to nutritionally fortify foods and safely adapt crops to new environments, addressing the serious challenge that the climate crisis is posing to global food production," he added.

The study focused on glucosinolates, which are known to give the distinctive, often pungent, flavour to cruciferous vegetables such as broccoli, cabbage, and kale, and are associated with beneficial effects on human health.

These sulfur-containing organic compounds are exclusively produced by plants of this group and are believed to have health-promoting effects, including being anti-carcinogenic, improving blood glucose control and reducing the risk of cardiovascular disease. For this reason, increasing their levels has been an important target for breeders of vegetable brassicas.

Previous work using model plants under optimal laboratory conditions has shown that glucosinolate biosynthesis in the Brassicaceae family is regulated by the gene MYB28. But the effects of this master regulator have not been verified by translating them into crop plants grown in the field.

In this proof-of-concept study, scientists successfully used CRISPR-Cas9 gene-editing technology to "knock out" the MYB28 gene in Brassica oleracea (a species that includes many common cultivars such as broccoli). Single-gene knockouts in the Brassica genus are complicated by the presence of multiple copies of numerous genes, including those in the glucosinolate biosynthesis pathway.

The gene-edited plants were grown in field trial conditions in compliance with the 2001/18 GMO directive, in accordance with the CJEU ruling in 2018. Genetic and metabolomic analysis showed that the gene knockout resulted in a downregulation of glucosinolate biosynthesis genes and a reduction in the accumulation of glucosinolates in the leaves and florets of field-grown myb28 mutant broccoli plants.

These results revealed for the first time that MYB28 in B. oleracea regulates glucosinolate levels in a field-based environment, in agreement with previous findings obtained with model plants and in the glasshouse.

Reducing gene activity via a knockout is one application of the gene-editing toolkit.

First author Dr Mikhaela Neequaye said: "By showing that the master regulator of methionine-derived glucosinolate biosynthesis genes, MYB28, functions in the field as we know it does in glasshouse-grown plants, we have established the MYB28 gene as a reliable target for manipulating glucosinolate levels in vegetable brassicas. This study highlights the potential of gene editing in the ongoing characterization and modification of these processes in the field, in often complex crop systems."

Credit: 
John Innes Centre

Breathing new life into existing tech: FT-IR spectrometer shows molecular orientation

image: 3D-printable optical setup with built-in sample chamber for an FT-IR spectrometer. The sample is placed on the Si ATR crystals for measurement.

Image: 
©M. Takahashi & K. Okada, Osaka Prefecture University

"Any problem can be solved with a little ingenuity". While they may not be the originators of this quote, recent work from researchers at Osaka Prefecture University into understanding the molecular orientation of hybrid thin-film material is a concrete example of its central message. "We wanted everyone to have access to this knowledge," states research lead Professor Masahide Takahashi of the OPU Graduate School of Engineering. Using laboratory-grade equipment with 3D printable optical setups, his research group has established an easy, versatile, yet highly sensitive approach to identify the orientation of molecules and chemical bonds in crystalline organic-inorganic hybrid thin film deposited on a substrate as small as 10 nm, "even film with 3 molecular layers", continues the Professor. Their work was published on June 18th in Chemical Science.

The equipment they used was a spectrometer that employs a technique called Fourier transform infrared spectroscopy (FT-IR) and polarized infrared light with an originally designed 3D-printed attenuated total reflectance (ATR) unit. FT-IR spectrometers are found in most laboratories in part because they show what molecules are found in a sample - but they have not been able to reveal the three-dimensional orientation of these molecules relative to the substrates. This is important to the manufacturing of thin-film devices that can be nanometers in size, as an unpredicted shift in molecular orientation at that level can cause the entire structure of the device to break down.

Conventionally, in FT-IR spectroscopy in transmission configuration, infrared light penetrates from the top of the sample like a skewer. This narrow point of entry and exit does not allow the sample enough interaction with the light to excite the molecules in their chemically bonded states. "We realized that by re-orienting the sample, we could introduce polarized light directly into the substrate of the thin film, generating an evanescent wave that heats up the sample, exciting certain molecules and betraying their orientation," states Bettina Baumgartner, a visiting researcher on the team. "We just needed a new kind of sample interface," adds Associate Professor Kenji Okada. This is why the team designed a brand-new ATR optical setup that bounces polarized infrared light through the entirety of the sample substrate, allowing the team to observe the vibration of all molecules aligned with the electric field component of the infrared light, revealing their orientation. Any lab with a 3D printer can make these ATR optical setups.

This method, which the team used to obtain structural information on a metal-organic framework thin film with a degree of crystal orientation comparable to X-ray structural analysis, is expected to be useful in many areas of materials science, such as where orientation control is linked to controlling physical properties, in the functional improvement of porous materials used for CO2 capture, and in the development of new heterogeneous catalysts.

Credit: 
Osaka Prefecture University

Personalized medicine, not X-rays, should guide forearm fracture treatment in older adults

A decade-long study of the most common forearm fracture in older adults revealed that personalized medicine catering to a patient's individual needs and environment, not age or X-rays, should guide treatment options.

Led by a Michigan Medicine physician, the research team examined treatment outcomes over two years for patients who fractured their distal radius, the larger of two bones in the forearm. They found no one-size-fits all method for treating the fracture, which more than 85,000 Medicare beneficiaries sustain annually.

"Traditionally, surgeons look at these broken bones on X-rays, and they have to assess various ways of fixing it based off fracture anatomy and patient age," said Kevin Chung, M.D., study lead and Charles B. G. De Nancrede Professor of Surgery at Michigan Medicine. "However, in older patients, we determined that the patient-centered care in tailoring particular treatments to their needs, social environment and risk tolerance for surgery are all considerations in prescribing treatment."

The new study, published in JAMA Network Open and funded by the National Institutes of Health, covered more than 180 participants studied at more than 20 global medical centers over a 10-year period.

"For hand surgery, this is the most intense, collaborative effort to try and answer a 200-year puzzle about distal radius fractures in older adults," Chung said. "It is one of the most common fractures in the world for this patient population - you have parents and grandparents that will get this fracture. For the good of public health, we needed to answer this question."

Participants in the trial were randomized to receive one of three surgical treatment strategies: volar locking plates, external fixation, or pinning. Those who chose not to have surgery were treated with casting. Although participants treated with volar plating reported better ability to perform daily tasks early in the follow-up period of the trial, the gap between plating and the other methods disappeared at the six-month mark. All participants were satisfied with the outcomes at the study's conclusion.

"Surgeons need to know how to perform all the techniques available to treat distal radius fracture, rather than being so ingrained to using just the plating system because, now, most trainees are taught just that system," Chung said. "But there are so many fracture patterns that require one to have all the tools and skills necessary to make sure patients receive tailored treatment for their injuries."

The results show that the interaction between the surgeon, the patient and the patient's family is key because satisfaction of the patients demands a much more personal approach than the singular interest of fixing a broken bone, Chung said.

"We know that chronological age doesn't determine a patient's physiological age," he said. "When someone is 70 years old, they may be physiologically 40-50. Those patients have a need to get back to physical activities and independent living, so we should treat them more aggressively."

Recruitment for a trial lasting 10 years proved difficult for the research team, but the older people participating in this historic study, some struggling with transportation, wanted to help others, Chung said.

"It is difficult to participate in this trial because of the time and effort invested by the patients," he said. "These are special people. They contributed their lives for the science of helping other older patients who will suffer from this fracture. Their commitment is inspirational to us, which kept our research team going, despite overwhelming challenges. We are grateful to our study participants, the National Institutes of Health and the support of the American people."

Credit: 
Michigan Medicine - University of Michigan

A key player in cell death moonlights as a mediator of inflammation

image: Caspase-1, a protease of the caspase family, is activated through inflammasome activation in response to microbial infections, cellular stresses, and other triggers, resulting in maturation of the key pro-inflammatory cytokines IL-1α and IL-1β. Although caspase-1 directly processes pro-IL-1β into the mature form, pro-IL-1α is not a substrate for caspase-1, and it has long been unclear why activation of caspase-1 causes maturation of IL-1α. This study revealed that gasdermin D (GSDMD), an executor of the form of cell death called pyroptosis, mediates IL-1α maturation downstream of caspase-1. Caspase-1 cleaves GSDMD, and its N-terminal fragment forms pores in the cell membrane. This allows the influx of calcium ions, leading to activation of calpains, which process pro-IL-1α into its mature form. Maturation of IL-1 not only increases its activity but also facilitates its release from the cell through GSDMD pores. GSDMD therefore plays a critical role in caspase-1-dependent inflammatory events, including the maturation and release of IL-1α. As GSDMD and IL-1α have been implicated in the pathogenesis of several inflammatory diseases, this signaling pathway may be a therapeutic target for such disorders.

Image: 
Kanazawa University

Kanazawa, Japan - Interleukin-1α (IL-1α) is an important part of the immune response, but until now it has been unclear how this molecule is processed from its precursor, pro-IL-1α, and exits the cell during inflammasome activation. Now, researchers from Japan have found that gasdermin D, a protein already known to mediate pyroptosis, a form of regulated cell death, plays a crucial role in the maturation and release of IL-1α.

In a study published in March in Cell Reports, researchers from Kanazawa University report that, when the inflammasome (a part of the innate immune system) is activated, gasdermin D forms pores in the cell membrane that allow factors from outside of the cell to flow in, leading to the activation and release of mature IL-1α.

Previous work has shown that inflammasome activation leads to gasdermin D being cut in two by an enzyme, and that one half of the cut protein forms membrane pores. This leads to pyroptosis whereby the pores let water into the cells, causing them to swell and burst.

"We knew that caspase-1 cleaves gasdermin D, and that this enzyme is also important for IL-1α activation," says lead author of the study Kohsuke Tsuchiya. "We therefore suspected that gasdermin D might be involved in the pathway leading to IL-1α release."

To test this possibility, the researchers deleted the gene encoding gasdermin D, and found that this almost completely eliminated IL-1α exit from the cells. Unexpectedly, though, they also saw that virtually no mature IL-1α was present inside these cells.

"This made us think that gasdermin D is important not only for IL-1α release, but also IL-1α maturation," explains Takashi Suda, lead author of the study. "When we looked into this in more detail, we found that gasdermin D did not need to cause cell lysis for these effects to take place. Instead, only its ability to form pores in the membrane was required for both IL-1α maturation and release."

The researchers went on to show that pore formation by gasdermin D allows calcium influx and the activation of calpains (a type of protease), which promote processing of the pro-IL-1α precursor into mature IL-1α.

"Our findings provide the missing link between caspase-1 activity and IL-1α maturation," says Tsuchiya.

Given that inflammasome activation is a key element of both inflammatory diseases and the response to infection, this study provides important insight into essential functions of the human immune system. Identifying gasdermin D as a regulator of immune function as well as cell death increases our understanding of how inflammasome activation ultimately leads to IL-1α release from immune cells.

Credit: 
Kanazawa University