Tech

Few realistic scenarios left to limit global warming to 1.5°C

Of the more than 400 climate scenarios assessed in the 1.5°C report by the Intergovernmental Panel on Climate Change (IPCC), only around 50 avoid significantly overshooting 1.5°C. Of those, only around 20 make realistic assumptions about mitigation options, for instance the rate and scale of carbon removal from the atmosphere or the extent of tree planting, a new study shows. All 20 scenarios need to pull at least one mitigation lever at "challenging" rather than "reasonable" levels, according to the analysis. The world therefore faces a high risk of overstepping the 1.5°C limit, and the realistic window for meeting the target is closing very rapidly.

If all climate mitigation levers are pulled, it may still be possible to limit global warming to 1.5°C in line with the Paris Agreement. The findings could help inform the heated climate policy debate. "The emission scenarios differ in their reliance on each of the five mitigation levers we looked at. Yet all scenarios that we find to be realistic pull at least several levers at challenging levels," says lead author Lila Warszawski from the Potsdam Institute for Climate Impact Research (PIK). "None of the realistic scenarios relies on a single silver bullet."

All realistic scenarios pull all five levers

"The energy sector is key to the 1.5°C target of course, with on the one hand reduction of energy demand and on the other decarbonisation of energy use and production," says Warszawski. "Yet we can't do away with the other strategies. Removing carbon dioxide from the atmosphere and for instance storing it underground also proves to be almost indispensable. Land use must become a net carbon sink, for instance by re-wetting peatlands or afforestation. Finally, emissions of the powerful greenhouse gas methane must be cut from animal production, but also from leaks in oil and gas extraction. This is quite a list."

The researchers drew on existing research to define bounds that delineate 'reasonable', 'challenging', and 'speculative' use of each of the levers by mid-century. The bounds quantify the range of emissions reduction potentials of each of the aggregate levers, resulting from technological, economic, social and resource considerations. These potentials can then be translated into contributions to keeping warming at 1.5°C with no or low temperature overshoot.
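As a rough illustration of the screening logic described above, the sketch below labels each mitigation lever's assumed deployment as 'reasonable', 'challenging', or 'speculative' against per-lever bounds, and accepts a scenario only if no lever is used at speculative levels. The lever names, units, and threshold values are hypothetical placeholders, not the study's actual data.

```python
# Hypothetical lever names and bounds -- placeholders, not the study's data.
BOUNDS = {
    # lever: (upper bound of 'reasonable' use, upper bound of 'challenging' use),
    # in notional units of mid-century abatement
    "demand_reduction": (10, 20),
    "decarbonisation": (15, 30),
    "carbon_removal": (3, 8),
    "land_sink": (2, 5),
    "methane_cuts": (1, 3),
}

def classify_lever(lever, deployment):
    """Label one lever's assumed deployment level."""
    reasonable_max, challenging_max = BOUNDS[lever]
    if deployment <= reasonable_max:
        return "reasonable"
    if deployment <= challenging_max:
        return "challenging"
    return "speculative"

def screen_scenario(deployments):
    """A scenario passes only if no lever is used at speculative levels."""
    labels = {lever: classify_lever(lever, d) for lever, d in deployments.items()}
    return "speculative" not in labels.values(), labels
```

Under these made-up bounds, a scenario leaning heavily on one lever, say carbon removal, would be rejected even if all its other levers stay within reach, mirroring the study's point that no single silver bullet exists.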

A triple challenge for humanity

"This calls for an immediate acceleration of worldwide action to reduce greenhouse gas emissions by all available means," says co-author Tim Lenton from Exeter University. "We need a sustainability revolution to rival the industrial revolution. Otherwise those most vulnerable to climate change are going to bear the brunt of missing the 1.5°C target. This is a system-wide challenge - piecemeal actions and rhetorical commitments are not going to be enough."

"Humanity is facing a triple challenge to stabilize global warming without significantly overshooting the 1.5°C commitment," says co-author Nebojsa Nakicenovic from the International Institute for Applied Systems Analysis (IIASA). "The first is to halve global emissions every decade, which requires a herculean effort and a decarbonization revolution: phasing out fossil energy, a quantum leap in efficiency and sufficiency, and climate-friendly behaviors and diets. The second is to pursue nature-friendly carbon removal through afforestation and land-use change; and the third is to assure the safe operation of the Earth systems that currently remove half of global emissions from the atmosphere."

Unrealistically optimistic scenarios over-estimate, for example, carbon capture and storage potentials

Those scenarios classified by the analysis as unrealistically optimistic most frequently over-estimate carbon capture and storage potentials, while others over-estimate the potential for reducing energy consumption or non-CO2 greenhouse gases like methane. Still others make all too bold assumptions about dietary shifts towards more plant-based food or about limited population growth.

The authors also took a closer look at the scenarios provided by the International Energy Agency (IEA) in 2018 and the one called 'Sky' produced by the Shell oil and gas company. Both scenarios foresee net emissions falling to zero globally only as late as 2070. The researchers found that neither lies within the corridor of carbon dioxide emissions over the rest of this century that seems to offer a realistic chance of meeting the 1.5°C target. The Shell Sky scenario shows emissions levels in 2030 well above the other scenarios considered in this study.

"The Shell Sky scenario has been called a pie in the sky, and that's indeed what it is," says co-author Gail Whiteman from the University of Exeter's Business School. "From a science perspective, this is quite clear. In the business community some still like it because it seems to offer, in comparison to other scenarios, a relatively easy way out of the climate crisis. Our analysis shows, however, that there are no easy ways out."

Irrespective of the specific climate target, rapid emission reductions are key

"The necessary emissions reductions are hard to achieve, technically but also politically. They require unprecedented innovation of lifestyles and international cooperation," concludes co-author Johan Rockström from PIK. "I understand anyone who thinks we might fail the 1.5°C target. Also, it is clear that irrespective of the specific climate target, rapidly implementing strong emission reductions is key now. Yet I think limiting warming to 1.5°C is worth every effort, because this would limit the risk of giving some tipping elements in the Earth system an additional push, such as ice sheets or ecosystems like the Amazon rainforest. As technical as this all might sound, it really is about assuring a safe climate future for all."

Credit: 
Potsdam Institute for Climate Impact Research (PIK)

Climate change threatens one-third of global food production

image: High emissions scenario: areas within and outside Safe Climatic Space for food production 2081-2100 (see comparison image for legend)

Image: 
Matti Kummu/Aalto University

Climate change is known to negatively affect agriculture and livestock, but there has been little scientific knowledge of which regions of the planet would be affected or what the biggest risks might be. New research led by Aalto University assesses just how global food production will be affected if greenhouse gas emissions are left uncut. The study is published in the prestigious journal One Earth on Friday 14 May.

'Our research shows that rapid, out-of-control growth of greenhouse gas emissions may, by the end of the century, lead to more than a third of current global food production falling into conditions in which no food is produced today - that is, out of safe climatic space,' explains Matti Kummu, professor of global water and food issues at Aalto University.

According to the study, this scenario is likely to occur if carbon dioxide emissions continue growing at current rates. In the study, the researchers define the concept of safe climatic space as those areas where 95% of crop production currently takes place, as determined by a combination of three climate factors: rainfall, temperature and aridity.
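The 'safe climatic space' idea can be sketched in a strongly simplified form: take, for each climate variable, the production-weighted range that encloses 95% of today's crop production, and test whether a grid cell's future climate still falls within it. The study itself uses a more sophisticated multivariate definition; all function names and numbers below are illustrative.

```python
def weighted_percentile(values, weights, q):
    """Production-weighted percentile of one climate variable."""
    pairs = sorted(zip(values, weights))
    total = sum(weights)
    cum = 0.0
    for value, weight in pairs:
        cum += weight
        if cum / total >= q:
            return value
    return pairs[-1][0]

def safe_space(cells, variables, coverage=0.95):
    """Per-variable bounds enclosing `coverage` of current production."""
    tail = (1 - coverage) / 2
    bounds = {}
    for var in variables:
        vals = [cell[var] for cell in cells]
        wts = [cell["production"] for cell in cells]
        bounds[var] = (weighted_percentile(vals, wts, tail),
                       weighted_percentile(vals, wts, 1 - tail))
    return bounds

def inside(cell, bounds):
    """Does a cell's (future) climate stay within the safe climatic space?"""
    return all(lo <= cell[var] <= hi for var, (lo, hi) in bounds.items())
```

Applied per grid cell under a future climate scenario, the fraction of current production in cells that fall outside the bounds corresponds to the "more than a third" figure quoted above.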

'The good news is that only a fraction of food production would face as-of-yet unseen conditions if we collectively reduce emissions, so that warming would be limited to 1.5 to 2 degrees Celsius,' says Kummu.

Changes in rainfall and aridity as well as the warming climate are especially threatening to food production in South and Southeast Asia as well as the Sahel region of Africa. These are also areas that lack the capacity to adapt to changing conditions.

'Food production as we know it developed under a fairly stable climate, during a period of slow warming that followed the last ice age. The continuous growth of greenhouse gas emissions may create new conditions, and food crop and livestock production just won't have enough time to adapt,' says Doctoral Candidate Matias Heino, the other main author of the publication.

Two future scenarios for climate change were used in the study: one in which carbon dioxide emissions are cut radically, limiting global warming to 1.5-2 degrees Celsius, and another in which emissions continue growing unhalted.

The researchers assessed how climate change would affect 27 of the most important food crops and seven livestock species, accounting for societies' varying capacities to adapt to changes. The results show that threats affect countries and continents in different ways; in 52 of the 177 countries studied, the entire food production would remain within the safe climatic space in the future. These include Finland and most other European countries.

Already vulnerable countries such as Benin, Cambodia, Ghana, Guinea-Bissau, Guyana and Suriname will be hit hard if no changes are made; up to 95 percent of current food production would fall outside of safe climatic space. Alarmingly, these nations also have significantly less capacity to adapt to changes brought on by climate change when compared to rich Western countries. In all, 20% of the world's crop production and 18% of livestock production under threat are located in countries with low resilience to adapt to changes.

If carbon dioxide emissions are brought under control, the researchers estimate that the world's largest climatic zone of today - the boreal forest, which stretches across northern North America, Russia and Europe - would shrink from its current 18.0 to 14.8 million square kilometres by 2100. Should we not be able to cut emissions, only roughly 8 million square kilometres of the vast forest would remain. The change would be even more dramatic in North America: in 2000, the zone covered approximately 6.7 million square kilometres - by 2090 it may shrink to one-third.

Arctic tundra would be even worse off: it is estimated to disappear completely if climate change is not reined in. At the same time, the tropical dry forest and tropical desert zones are estimated to grow.

'If we let emissions grow, the increase in desert areas is especially troubling because in these conditions barely anything can grow without irrigation. By the end of this century, we could see more than 4 million square kilometres of new desert around the globe,' Kummu says.

While the study is the first to take a holistic look at the climatic conditions where food is grown today and how climate change will affect these areas in coming decades, its take-home message is by no means unique: the world needs urgent action.

'We need to mitigate climate change and, at the same time, boost the resilience of our food systems and societies - we cannot leave the vulnerable behind. Food production must be sustainable,' says Heino.

Credit: 
Aalto University

The mechanism of action of genes with high mutation frequency in cancer

After the p53 tumour suppressor gene, the genes most frequently found mutated in cancer are those encoding two proteins of the SWI/SNF chromatin remodelling complex. This complex's function is to "accommodate" the histones that cover the DNA of the chromosomes so that transcription, DNA repair, replication or chromosome segregation can occur as appropriate. A University of Seville group working at CABIMER has demonstrated that the inactivation of BRG1, the factor responsible for the enzymatic activity of SWI/SNF complexes, leads to high genetic instability, a characteristic common to the vast majority of tumours.

This study's most important contribution is that it deciphers the mechanism by which this occurs. The SWI/SNF complex is necessary for cells to resolve the conflicts that occur in chromosomes when the transcription and replication machineries collide at the same site and hinder each other. If any part of the SWI/SNF complex is mutated, DNA replication is defective and chromosomal breaks occur, largely promoted by the accumulation of DNA-RNA hybrids at the sites where conflicts occur.

This work solves a very significant problem in molecular and cell biology by assigning the SWI/SNF complex a new function, and it opens the possibility that this chromatin remodeler acts as a tumour suppressor and that DNA-RNA hybrids constitute an important source of tumourigenicity. The study, whose main author is Dr. Aleix Bayona-Feliu of the Juan de la Cierva programme, forms part of an ERC Advanced Grant project led by Andrés Aguilera, professor of genetics, and is published in the prestigious journal Nature Genetics.

Credit: 
University of Seville

Male hormones regulate stomach inflammation in mice

image: Glucocorticoids and androgens promote a healthy stomach pit by inhibiting inflammation, left, while their absence promotes inflammation and SPEM seen in a diseased pit, right. SPEM glands are also much larger than healthy stomach glands.

Image: 
Jonathan Busada, Ph.D./NIEHS

Scientists at the National Institutes of Health determined that stomach inflammation is regulated differently in male and female mice after finding that androgens, or male sex hormones, play a critical role in preventing inflammation in the stomach. The finding suggests that physicians could consider treating male patients with stomach inflammation differently than female patients with the same condition. The study was published in Gastroenterology.

Researchers at NIH's National Institute of Environmental Health Sciences (NIEHS) made the discovery after removing the adrenal glands from mice of both sexes. Adrenal glands produce glucocorticoids, hormones with several functions, one of which is suppressing inflammation. With no glucocorticoids, the female mice soon developed stomach inflammation; the males did not. However, once androgens were also removed from the males, they exhibited the same stomach inflammation seen in the females.

"The fact that androgens are regulating inflammation is a novel idea," said co-corresponding author John Cidlowski, Ph.D., deputy chief of the NIEHS Laboratory of Signal Transduction and head of the Molecular Endocrinology Group. "Along with glucocorticoids, androgens offer a new way to control immune function in humans."

While this study provides insight into how inflammation is being regulated in males, Cidlowski said additional research is underway to understand the process in females. The scientist handling this phase of research is co-corresponding author Jonathan Busada, Ph.D., assistant professor at West Virginia University School of Medicine in Morgantown. When Busada started the project several years ago, he was a postdoctoral fellow working in Cidlowski's group.

Whether inflammation is inside the stomach or elsewhere in the body, Busada said rates of chronic inflammatory and autoimmune diseases vary depending on sex. He said eight out of 10 individuals with autoimmune disease are women, and his long-term goal is to figure out how glucocorticoids and androgens affect stomach cancer, which is induced by chronic inflammation.

The current research focused on stomach glands called pits, which are embedded in the lining of the stomach.

Busada said the study showed that glucocorticoids and androgens act like brake pedals on the immune system and are essential for regulating stomach inflammation. In his analogy, glucocorticoids are the primary brakes and androgens are the emergency brakes.

"Females only have one layer of protection, so if you remove glucocorticoids, they develop stomach inflammation and a pre-cancerous condition in the stomach called spasmolytic polypeptide-expressing metaplasia (SPEM)," Busada said. "Males have redundancy built in, so if something cuts the glucocorticoid brake line, it is okay, because the androgens can pick up the slack."

The research also offered a possible mechanism -- or biological process -- behind this phenomenon. In healthy stomach glands, the presence of glucocorticoids and androgens inhibits special immune cells called type 2 innate lymphoid cells (ILC2s). But in diseased stomach glands, these hormones are missing. As a result, ILC2s may act like a fire alarm, directing other immune cells called macrophages to promote inflammation and damage gastric glands, leading to SPEM and ultimately cancer.

"ILC2s are the only immune cells that contain androgen receptors and could be a potential therapeutic target," Cidlowski said.

Credit: 
NIH/National Institute of Environmental Health Sciences

Ion transporters in chloroplasts affect the efficacy of photosynthesis

A study led by LMU plant biologist Hans-Henning Kunz uncovers a new role for ion transporters: they participate in gene regulation in chloroplasts.

In plants, photosynthesis takes place in intracellular 'factories' called chloroplasts. Plant chloroplasts evolved from photosynthetic cyanobacteria, which were engulfed by a non-photosynthetic cell in the course of evolution. As a result of this evolutionary event, chloroplasts possess two envelope membranes and have retained functional remnants of their original cyanobacterial genomes. Henning Kunz (LMU Biocenter), his group, and their collaborators have now demonstrated that proteins involved in ion transport across the inner chloroplast membrane participate in the regulation of gene expression in the organelle, thereby playing an important role in the control of photosynthesis.

Photosynthesis actually occurs in a highly specialized 'thylakoid' membrane system in the chloroplast. This system is encapsulated by the inner and outer envelope. Among the proteins embedded in the inner membrane are ion transporters that regulate the ion balance in the medium, known as the stroma, which bathes the thylakoid membranes in the center of the organelle. It also contains the organelle's DNA genome, and the ribosomes that synthesize the proteins which it encodes. Notably, photosynthesis is crucially dependent on the coordinated expression of the chloroplast genome and genes that now reside in the cell nucleus. "Using the model plant Arabidopsis thaliana, we have now shown that the ion balance in the stroma plays a role in the exchange of regulatory information between chloroplasts and the nucleus," says Kunz.

He had previously observed that chloroplast development was delayed and plant growth retarded when two genes that code for specific ion-transport proteins were deleted. "Our experiments show that, in the absence of these ion transporters, certain RNA-binding proteins that are encoded by nuclear genes are unable to bind to their target RNAs in the stroma of the chloroplasts," he explains. As a consequence, the processing and maturation of these RNAs (which are necessary for the transfer of genetic information from the chloroplast genome into stromal proteins) are inhibited. "This defect has a particularly significant effect on RNAs that are required for ribosome assembly in the stroma. The resulting reduction in the number of functional ribosomes severely impairs chloroplast protein synthesis in these mutants," Kunz explains.

The results reported in the new study could contribute to efforts to enhance the efficiency of photosynthesis under otherwise unfavorable environmental conditions, which would help crop plants to adapt to climate change. "Ion transporters could serve as important tools in this context," says Kunz. "Photosynthetic efficiency is highly dependent on the biochemistry of the stroma. These transport proteins have an important influence on this fine-tuned balance. Only if we start to understand their structures and complex functions in detail will it be possible to manipulate and benefit from the transporters' action."

Credit: 
Ludwig-Maximilians-Universität München

Researchers suggest pathway for improving stability of next-generation solar cells

Scientists have uncovered the exact mechanism that causes new solar cells to break down, and suggest a potential solution.

Solar cells harness energy from the Sun and provide an alternative to non-renewable energy sources like fossil fuels. However, they face challenges from costly manufacturing processes and poor efficiency - the fraction of sunlight converted to usable energy.

Perovskites are materials developed for next-generation solar cells. Although perovskites are more flexible and cheaper to make than traditional silicon-based solar panels and deliver similar efficiency, they contain toxic lead compounds. Versions of perovskites using alternatives to lead are therefore being investigated.

Versions using tin instead of lead show promise but degrade quickly. Now, researchers at Imperial and the University of Bath have shown how these perovskites degrade to tin iodide, which, when exposed to moisture and oxygen, forms iodine. This iodine then helps form more tin iodide, causing cyclic degradation.

The team also show how the selection of a crucial layer within the perovskite can mitigate against degradation under ambient conditions and increase stability. They hope this will help researchers design more stable high-performance tin perovskites that show potential as new solar cells.

Lead researcher Professor Saif Haque, from the Department of Chemistry at Imperial, said: "Knowing the mechanism will help us overcome a major stumbling block for this exciting new technology. Our results will also enable the design of tin perovskite materials with improved stability, paving the way for cheaper, more flexible solar harvesting devices."

Credit: 
Imperial College London

New research optimizes body's own immune system to fight cancer

video: This high magnification video shows T cells migrating on nanopatterns engineered to mimic tumor architectures. University of Minnesota researchers are studying the mechanical properties of the cells to better understand how the immune cells and cancer cells interact.

Image: 
Provenzano Group, University of Minnesota

A groundbreaking study led by engineering and medical researchers at the University of Minnesota Twin Cities shows how engineered immune cells used in new cancer therapies can overcome physical barriers to allow a patient's own immune system to fight tumors. The research could improve cancer therapies in the future for millions of people worldwide.

The research is published in Nature Communications, a peer-reviewed, open access, scientific journal published by Nature Research.

Unlike treatments that use chemicals or radiation, immunotherapy is a type of cancer treatment that helps the patient's own immune system fight cancer. T cells are a type of white blood cell of key importance to the immune system. Cytotoxic T cells are like soldiers that search out and destroy the targeted invader cells.

While there has been success in using immunotherapy for some types of cancer in the blood or blood-producing organs, a T cell's job is much more difficult in solid tumors.

"The tumor is sort of like an obstacle course, and the T cell has to run the gauntlet to reach the cancer cells," said Paolo Provenzano, the senior author of the study and a biomedical engineering associate professor in the University of Minnesota College of Science and Engineering. "These T cells get into tumors, but they just can't move around well, and they can't go where they need to go before they run out of gas and are exhausted."

In this first-of-its-kind study, the researchers are working to engineer the T cells and develop engineering design criteria to mechanically optimize the cells or make them more "fit" to overcome the barriers. If these immune cells can recognize and get to the cancer cells, then they can destroy the tumor.

In a fibrous mass of a tumor, the stiffness of the tumor causes immune cells to slow down about two-fold--almost like they are running in quicksand.

"This study is our first publication where we have identified some structural and signaling elements where we can tune these T cells to make them more effective cancer fighters," said Provenzano, a researcher in the University of Minnesota Masonic Cancer Center. "Every 'obstacle course' within a tumor is slightly different, but there are some similarities. After engineering these immune cells, we found that they moved through the tumor almost twice as fast no matter what obstacles were in their way."

To engineer cytotoxic T cells, the authors used advanced gene editing technologies (also called genome editing) to change the DNA of the T cells so they are better able to overcome the tumor's barriers. The ultimate goal is to slow down the cancer cells and speed up the engineered immune cells. The researchers are working to create cells that are good at overcoming different kinds of barriers. When these cells are mixed together, the goal is for groups of immune cells to overcome all the different types of barriers to reach the cancer cells.

Provenzano said the next steps are to continue studying the mechanical properties of the cells to better understand how the immune cells and cancer cells interact. The researchers are currently studying engineered immune cells in rodents and in the future are planning clinical trials in humans.

While initial research has been focused on pancreatic cancer, Provenzano said the techniques they are developing could be used on many types of cancers.

"Using a cell engineering approach to fight cancer is a relatively new field," Provenzano said. "It allows for a very personalized approach with applications for a wide array of cancers. We feel we are expanding a new line of research to look at how our own bodies can fight cancer. This could have a big impact in the future."

Credit: 
University of Minnesota

You're so vein: Scientists discover faster way to manufacture vascular materials

video: See polymerization and vascularization happen in tandem

Image: 
Mayank Garg, Nancy Sottos, Jeff Moore, and Philippe Geubelle

Developing self-healing materials is nothing new for Nancy Sottos, lead of the Autonomous Materials Systems Group at the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign.

Drawing inspiration from biological circulatory systems -- such as blood vessels or the leaves on a tree -- University of Illinois Urbana-Champaign researchers have worked on developing vascularized structural composites for more than a decade, creating materials that are lightweight and able to self-heal and self-cool.

But now, a team of Beckman researchers led by Sottos and Mayank Garg, a postdoctoral research associate and the lead author of the newly published Nature Communications paper "Rapid Synchronized Fabrication of Vascularized Thermosets and Composites," has shortened a two-day manufacturing process to approximately two minutes by harnessing frontal polymerization of readily available resins.

"For the past several years we've been looking for ways to make vascular networks in high-performance materials," said Sottos, who is also Swanlund Endowed Chair and head of the Department of Materials Science and Engineering at Illinois. "This is a real breakthrough for making vascular networks in structural materials in a way that saves a lot of time and saves a lot of energy."

Garg said the simplest way to understand their work is to picture the composition of a leaf with its internal channels and structural networks. Now, imagine that the leaf is made from a tough structural material; inside, fluid flows through different spouts and channels of its interconnected vasculature. In the case of the researchers' composites, the liquid is capable of a variety of functions, such as cooling or heating in response to extreme environments.

"We want to create these life-like structures, but we also want them to maintain their performance over substantially longer times compared to existing infrastructure by adopting an approach biology uses ubiquitously," Garg said. "Trees have networks for transporting nutrients and water from the ground against gravity and transporting synthesized food from the leaf to the rest of the tree. The fluids flow in both directions to regulate temperature, grow new material, and repair existing material over the entire lifecycle of the tree. We try to replicate these dynamic functions in a non-biological system."

However, creating these complex materials has historically been a long, daunting process for the Autonomous Materials Systems Group. In previous research on self-healing materials, researchers needed a hot oven, vacuum, and at least a day to create the composites. The lengthy manufacturing cycle involved curing the host material and subsequently burning or vaporizing a sacrificial template to leave behind hollow, vascular networks. Sottos said the latter process can take 24 hours. The more complicated the vascular network, the more difficult and time-consuming it is to remove.

To create the host materials, scientists opt for frontal polymerization, a reaction-thermal diffusion system that uses the generation and diffusion of heat to promote two different chemical reactions concurrently. The heat is created internally during solidification of the host and surplus heat deconstructs an embedded template in tandem to manufacture the vascular material. This means the researchers are able to shorten the process by combining two steps into one, creating the vascular networks as well as the polymerized host material without an oven. Additionally, the new process enables researchers to have more control in the creation of the networks, meaning the materials could have increased complexity and function in the future.
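A toy one-dimensional model can illustrate the reaction-thermal diffusion idea described above: heat diffuses along the material, and wherever the temperature exceeds an ignition threshold the resin cures, releasing more heat that ignites its neighbours, so a front propagates on its own without an oven. All parameters below are illustrative, not fitted to any real resin chemistry or to the templates' depolymerization step.

```python
def simulate_front(nx=100, steps=5000, dt=0.2, diffusivity=1.0,
                   rate=1.0, heat=2.0, t_ignite=0.5):
    """Explicit finite-difference sketch of a self-propagating cure front."""
    temp = [0.0] * nx    # temperature profile along the material
    cure = [0.0] * nx    # degree of cure (0 = liquid resin, 1 = polymer)
    for i in range(6):   # hot trigger at the left edge starts the front
        temp[i] = 5.0
    for _ in range(steps):
        # first-order cure reaction wherever T exceeds the ignition threshold
        dcure = [dt * rate * (1.0 - cure[i]) if temp[i] > t_ignite else 0.0
                 for i in range(nx)]
        # heat diffusion (insulated ends) plus exothermic heat release
        new_temp = temp[:]
        for i in range(nx):
            left = temp[i - 1] if i > 0 else temp[i]
            right = temp[i + 1] if i < nx - 1 else temp[i]
            new_temp[i] = (temp[i]
                           + dt * diffusivity * (left - 2.0 * temp[i] + right)
                           + heat * dcure[i])
        temp = new_temp
        cure = [cure[i] + dcure[i] for i in range(nx)]
    return temp, cure
```

With these made-up numbers the front, once triggered at one end, propagates well into the domain on the heat it releases itself, which is the energy-saving behaviour the paragraph above describes.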

"With this research, we've figured out how to put in vascular networks by using frontal polymerization to drive the vascularization," Sottos said. "It gets done in minutes now instead of days -- and we don't have to put it in an oven."

Two processes in one: Tandem polymerization and vascularization allow scientists to create self-healing structural materials in a matter of minutes.

Self-healing materials can be beneficial wherever strong materials are essential to maintain function under sustained damage -- such as the construction of a skyscraper. But in the case of the researchers, the most likely applications are for planes, spaceships, and even the International Space Station. Sottos explained materials produced in this manner could be commercially manufactured in five to 10 years, though the researchers note that all required materials and processing equipment are currently commercially available.

Beckman Institute Director Jeff Moore, a Stanley O. Ikenberry Endowed Chair of chemistry, as well as Philippe Geubelle, the Bliss Professor of aerospace engineering and executive associate dean of The Grainger College of Engineering, were also involved in the project.

From a computational standpoint, Geubelle explained that he was able to capture the competition between the frontal polymerization of the host and the endothermic phase change taking place in the sacrificial templates.

"We performed adaptive, transient, nonlinear finite element analyses to study this competition and determine the conditions under which this simultaneous frontal polymerization and vascularization of the gel can be achieved," he said. "This technology will lead to a more energy efficient and substantially faster way to create composites with complex microvascular networks."

Thanks to the team's interdisciplinary discovery, dynamic multifunctional materials are now easier to manufacture than ever before.

"This research is a combination of experimental work as well as computational work," Garg said. "It requires synchronized communication among team members from various disciplines -- chemistry, engineering, and materials science -- to overhaul traditional non-sustainable manufacturing strategies."

"There's nothing better than to see ideas bubble up from students and postdocs in the AMS group resulting from interactions and joint group meetings," Moore added. "The Moore Group has studied chain unzipping depolymerization reactions for years. I was delighted when I learned that the AMS team recognized how the thermal energy produced in a heat-evolving polymerization reaction could be synced to chain unzipping depolymerization in another material for the purpose of fabricating channels. The first time I saw Mayank's results, I thought to myself, 'I wish I'd have thought of that idea.'"

Credit: 
Beckman Institute for Advanced Science and Technology

U-M researchers trace path of light in photosynthesis

Three billion years ago, light first zipped through chlorophyll within tiny reaction centers, the first step plants and photosynthetic bacteria take to convert light into food.

Heliobacteria, a type of bacteria that use photosynthesis to generate energy, have reaction centers thought to be similar to those of the common ancestor of all photosynthetic organisms. Now, a University of Michigan team has determined the first steps in converting light into energy for this bacterium.

"Our study highlights the different ways in which nature has made use of the basic reaction center architecture that emerged over 3 billion years ago," said lead author and U-M physicist Jennifer Ogilvie. "We want to ultimately understand how energy moves through the system and ends up creating what we call the 'charge-separated state.' This state is the battery that drives the engine of photosynthesis."

Photosynthetic organisms contain "antenna" proteins that are packed with pigment molecules to harvest photons. The collected energy is then directed to "reaction centers" that power the initial steps that convert light energy into food for the organism. These initial steps happen on incredibly fast timescales--femtoseconds, or one millionth of one billionth of a second. During the blink of an eye, this conversion happens many quadrillions of times.

Researchers want to understand how this transformation takes place, both to learn how plants and photosynthetic organisms convert light into nourishing energy and to gain insight into how photovoltaics work--and how to build them better.

When light hits a photosynthetic organism, pigments within the antenna gather photons and direct the energy toward the reaction center. In the reaction center, the energy bumps an electron to a higher energy level, from which it moves to a new location, leaving behind a positive charge. This is called a charge separation. This process happens differently based on the structure of the reaction center in which it occurs.

In the reaction centers of plants and most photosynthetic organisms, the pigments that orchestrate charge separation absorb similar colors of light, making it difficult to visualize charge separation. Using the heliobacteria, the researchers identified which pigments initially donate the electron after they're excited by a photon, and which pigments accept the electron.

Heliobacteria are a good model to examine, Ogilvie said, because their reaction centers contain a mixture of chlorophyll and bacteriochlorophyll, which means these different pigments absorb different colors of light. For example, she said, imagine trying to follow a person in a crowd where everyone is wearing blue jackets, you're watching from a distance, and you can only take snapshots of the person moving through the crowd.

"But if the person you were watching was wearing a red jacket, you could follow them much more easily. This system is kind of like that: It has distinct markers," said Ogilvie, professor of physics, biophysics, and macromolecular science and engineering.
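Ogilvie's "red jacket" markers are simply pigments that absorb photons of different energies. As a back-of-the-envelope illustration (the wavelengths below are rough textbook values, not measurements from this study), a photon's energy follows E = hc/λ:

```python
# Photon energy E = h*c / wavelength, illustrating why different pigments
# "see" different colors: each absorbs photons of a characteristic energy.
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electronvolt

def photon_energy_ev(wavelength_nm: float) -> float:
    """Energy of a single photon of the given wavelength, in eV."""
    return H * C / (wavelength_nm * 1e-9) / EV

# Illustrative wavelengths only (not values from the study):
for label, nm in [("chlorophyll a (~670 nm)", 670.0),
                  ("bacteriochlorophyll g (~790 nm)", 790.0)]:
    print(f"{label}: {photon_energy_ev(nm):.2f} eV")
```

Because the two pigment families absorb at well-separated wavelengths, their signals can be told apart in a spectrum, which is exactly the "distinct marker" advantage the quote describes.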

Previously, heliobacteria were difficult to understand because their reaction center structure was unknown. The structures of membrane proteins like reaction centers are notoriously difficult to determine, but Ogilvie's co-author, Arizona State University biochemist Kevin Redding, developed a way to resolve the crystal structure of these reaction centers.

To probe reaction centers in heliobacteria, Ogilvie's team uses a type of ultrafast spectroscopy called multidimensional electronic spectroscopy, implemented in Ogilvie's lab by lead author and postdoctoral fellow Yin Song. The team aims a sequence of carefully timed, very short laser pulses at a sample of bacteria. The shorter the laser pulse, the broader the spectrum of light it can excite.

Each time the laser pulse hits the sample, the light excites the reaction centers within. The researchers vary the time delay between the pulses, and then record how each of those pulses interacts with the sample. When pulses hit the sample, its electrons are excited to a higher energy level. The pigments in the sample absorb specific wavelengths of light from the laser--specific colors--and the colors that are absorbed give the researchers information about the energy level structure of the system and how energy flows through it.
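A toy kinetic model can illustrate what scanning the delay between pulses reveals. This sketch is not the team's actual analysis; it simply assumes a single, made-up 100-femtosecond transfer time between two pigment pools:

```python
import math

# Toy kinetic model (not the actual spectroscopic analysis): an excitation
# created by the first pulse transfers from one pigment pool ("donor") to
# another ("acceptor") with rate constant K. Scanning the delay T between
# pulses samples these populations, which is how energy flow is read out.
K = 1.0 / 100.0  # transfer rate, 1/fs (illustrative 100 fs time constant)

def populations(t_fs: float) -> tuple[float, float]:
    """(donor, acceptor) excited-state populations at delay t_fs."""
    donor = math.exp(-K * t_fs)
    return donor, 1.0 - donor

for t in (0.0, 50.0, 100.0, 300.0):
    d, a = populations(t)
    print(f"delay {t:5.0f} fs: donor {d:.3f}, acceptor {a:.3f}")
```

In the real experiment the populations are inferred from which colors the sample absorbs and emits at each delay, but the principle is the same: the delay axis turns snapshots into a movie of energy flow.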

"That's an important role of spectroscopy: When we just look at the structure of something, it's not always obvious how it works. Spectroscopy allows us to follow a structure as it's functioning, as the energy is being absorbed and making its way through those first energy conversion steps," Ogilvie said. "Because the energies are quite distinct in this type of reaction center, we can really get an unambiguous look at where the energy is going."

Getting a clearer picture of this energy transport and charge separation allows the researchers to develop more accurate theories about how the process works in other reaction centers.

"In plants and bacteria, it's thought that the charge separation mechanism is different," Ogilvie said. "The dream is to be able to take a structure and, if our theories are good enough, we should be able to predict how it works and what will happen in other structures--and rule out mechanisms that are incorrect."

Credit: 
University of Michigan

Harvesting light like nature does

image: POSS-peptoid molecules self-assemble into rhomboid-shaped nanocrystals.

Image: 
(Illustration by Stephanie King | Pacific Northwest National Laboratory)

Inspired by nature, researchers at Pacific Northwest National Laboratory (PNNL), along with collaborators from Washington State University, created a novel material capable of capturing light energy. This material provides a highly efficient artificial light-harvesting system with potential applications in photovoltaics and bioimaging.

The research provides a foundation for overcoming the difficult challenges involved in the creation of hierarchical functional organic-inorganic hybrid materials. Nature provides beautiful examples of hierarchically structured hybrid materials such as bones and teeth. These materials typically showcase a precise atomic arrangement that allows them to achieve many exceptional properties, such as increased strength and toughness.

PNNL materials scientist Chun-Long Chen, corresponding author of this study, and his collaborators created a new material that reflects the structural and functional complexity of natural hybrid materials. This material combines the programmability of a protein-like synthetic molecule with the complexity of a silicate-based nanocluster to create a new class of highly robust nanocrystals. They then programmed this 2D hybrid material to create a highly efficient artificial light-harvesting system.

"The sun is the most important energy source we have," said Chen. "We wanted to see if we could program our hybrid nanocrystals to harvest light energy--much like natural plants and photosynthetic bacteria can--while achieving a high robustness and processibility seen in synthetic systems." The results of this study were published May 14, 2021, in Science Advances. 

Big dreams, tiny crystals

Though these types of hierarchically structured materials are exceptionally difficult to create, Chen's multidisciplinary team of scientists combined their expert knowledge to synthesize a sequence-defined molecule capable of forming such an arrangement. The researchers created an altered protein-like structure, called a peptoid, and attached a precise silicate-based cage-like structure (abbreviated POSS) to one end of it. They then found that, under the right conditions, they could induce these molecules to self-assemble into perfectly shaped crystals of 2D nanosheets. This created another layer of cell-membrane-like complexity similar to that seen in natural hierarchical structures while retaining the high stability and enhanced mechanical properties of the individual molecules.

"As a materials scientist, nature provides me with a lot of inspiration," said Chen. "Whenever I want to design a molecule to do something specific, such as act as a drug delivery vehicle, I can almost always find a natural example to model my designs after."

Designing bio-inspired materials

Once the team successfully created these POSS-peptoid nanocrystals and demonstrated their unique properties, including high programmability, they set out to exploit those properties. They programmed the material to include special functional groups at specific locations and intermolecular distances. Because these nanocrystals combine the strength and stability of POSS with the variability of the peptoid building block, the programming possibilities were endless.

Once again looking to nature for inspiration, the scientists created a system that could capture light energy much in the way pigments found in plants do. They added pairs of special "donor" molecules and cage-like structures that could bind an "acceptor" molecule at precise locations within the nanocrystal. The donor molecules absorb light at a specific wavelength and transfer the light energy to the acceptor molecules. The acceptor molecules then emit light at a different wavelength. This newly created system displayed an energy transfer efficiency of over 96%, making it one of the most efficient aqueous light-harvesting systems of its kind reported thus far.
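For intuition about how such high transfer efficiencies arise, the textbook Förster (FRET) point-dipole model relates efficiency to the donor-acceptor distance. The actual mechanism and geometry in the PNNL system may differ, so the numbers below are purely illustrative:

```python
def fret_efficiency(r: float, r0: float) -> float:
    """Förster energy-transfer efficiency for a donor-acceptor pair
    separated by distance r, given the Förster radius r0 (same units).
    E = 1 / (1 + (r/r0)^6): efficiency falls off steeply with distance."""
    return 1.0 / (1.0 + (r / r0) ** 6)

# Illustrative: holding the pair at ~0.55 Förster radii already puts the
# predicted transfer efficiency above 96%.
print(f"{fret_efficiency(0.55, 1.0):.3f}")
```

The sixth-power distance dependence is why precise placement of donors and acceptors within the nanocrystal, as described above, matters so much for efficiency.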

Demonstrating the uses of POSS-peptoids for light harvesting

To showcase the use of this system, the researchers then inserted the nanocrystals into live human cells as a biocompatible probe for live cell imaging. When light of a certain color shines on the cells and the acceptor molecules are present, the cells emit a light of a different color. When the acceptor molecules are absent, the color change is not observed. Though the team has only demonstrated the system for live cell imaging so far, the enhanced properties and high programmability of this 2D hybrid material lead them to believe it is just one of many possible applications.

"Though this research is still in its early stages, the unique structural features and high energy transfer of POSS-peptoid 2D nanocrystals have the potential to be applied to many different systems, from photovoltaics to photocatalysis," said Chen. He and his colleagues will continue to explore avenues for application of this new hybrid material.

Credit: 
DOE/Pacific Northwest National Laboratory

Etching process enhances the extraction of hydrogen during water electrolysis

image: Schematic diagram of the etching process used on metal phosphide electrocatalysts

Image: 
The authors

Extracting hydrogen from water through electrolysis offers a promising route for increasing the production of hydrogen, a clean and environmentally friendly fuel. But one major challenge of water electrolysis is the sluggish reaction of oxygen at the anode, known as the oxygen evolution reaction (OER).

A collaboration between researchers at Hunan University and Shenzhen University in China has led to a discovery that promises to improve the OER process. In their recent paper, published in the KeAi journal Green Energy & Environment, they report that etching - or, in other words, chemically removing - the oxide overlayers that form on the surface of the metal phosphide electrocatalysts regularly used in electrolysis can increase OER efficiency.

Professor Shuangyin Wang of the State Key Laboratory of Chem/Bio-sensing and Chemometrics at Hunan University led the study. He explains: "While metal phosphides are often used as catalysts due to their unique physicochemical properties, such as high conductivity, earth abundance and excellent performance, a common but often neglected fact is that they quickly suffer atmospheric oxidation when they are exposed to air. This causes them to form oxide overlayers on their surface, which can change the surface reconstruction process and confuse the structure-performance relationship."

To solve this problem, Professor Wang and his colleagues decided to etch away those oxide overlayers using a dielectric barrier discharge plasma technique. And they discovered that the etching process not only accelerated the surface reconstruction process, but greatly enhanced the formation of metal hydroxides and OER activity.

According to Prof. Wang: "These findings are helpful for understanding the structure-performance relationship of metal phosphides in electrooxidation reactions. And we suspect that the same etching process has the potential to be used on other oxygen-susceptible metal compounds such as chalcogenides, nitrides and carbides.

"Our hope is that our study guides the rational design and engineering of more efficient electrocatalysts for water electrolysis."

Credit: 
KeAi Communications Co., Ltd.

Finding control in hard-to-predict systems

Input one, output one; input two, output two; input three, output purple -- what kind of system is this? Computer algorithms can exist as non-deterministic systems, in which there are multiple possible outcomes for each input. Even if one outcome is more likely than another, that doesn't eliminate the possibility of putting in three and getting purple instead of three. Now, a research team from Iowa State University has developed a way to control such systems with more predictability. The results were published in IEEE/CAA Journal of Automatica Sinica.

"The supervisory control problem for discrete event systems involves identifying a supervisor, if one exists," said paper author Ratnesh Kumar, Harpole Professor in the Department of Electrical and Computer Engineering, Iowa State University, USA. "If a supervisor exists, synchronously composing it with the system results in a system that conforms to the control specification."

A discrete event system behaves based on its current state; if the state changes, the value changes. In the example system above, something about the system's state changed to make it take three and produce purple. Kumar's approach examines the system as it currently exists and finds the least fixed-point operator, or the piece that is most easily changed. Identifying such a component can result in a new model that acts as the supervisor of the system.

The researchers used quotienting to determine the possible outcomes and build parameters to identify possible controllers. In simple terms, a quotient captures the known number of possibilities: divide 10 by three, and the quotient is three, with a fractional part left over. In Kumar's system, each event is referred to as a "plant" and the entire system is understood as a "warehouse." The quotient is the plant divided by a specification determined by the warehouse, resulting in multiple possible answers depending on what the system looks like in the moment.

"Given a plant and the specification of the controlled plant, the quotienting operation generates a new specification describing the obligation on the supervisor such that the plant, when controlled by a supervisor, satisfies the specification," Kumar said.

Say the plant is the input of three, and the controller is purple. The calculus involved in describing the operation produces a new parameter of purple. The controller, or supervisor, is obligated to take the input of three and output purple. If the supervisor does not exist, the quotienting process still results in a supervisory control operation.
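A minimal Python sketch can make the idea concrete. This is not the paper's quotienting calculus, just an illustration of a nondeterministic plant, a specification, and the effect of composing the plant with a supervisor:

```python
# Toy sketch (not the paper's method): a nondeterministic "plant" maps each
# input to a SET of possible outputs, and a supervisor restricts which
# outputs are allowed. Synchronous composition keeps only the outcomes both
# agree on, which is how a supervisor forces conformance to a specification.
plant = {
    "one":   {"one"},
    "two":   {"two"},
    "three": {"three", "purple"},  # nondeterministic: two possible outputs
}
spec = {"one", "two", "three"}     # outputs the specification allows

def compose(plant, allowed):
    """Synchronous composition: for each input, intersect the plant's
    possible outputs with the supervisor's allowed set. An empty
    intersection would mean no supervisor can enforce the spec there."""
    return {inp: outs & allowed for inp, outs in plant.items()}

controlled = compose(plant, spec)
print(controlled["three"])  # the 'purple' outcome has been pruned
```

Under composition, an input of three can no longer yield purple: the composed system conforms to the specification even though the underlying plant is nondeterministic.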

"The central tenet of our technique is to develop a quotienting-based technique to decide the existence of a supervisor and generate the same if one exists," Kumar said.

The researchers conducted simulations to verify their approach and next plan to investigate their method in systems where only some of the actions are observable.

Credit: 
Chinese Association of Automation

Using micro-sized cut metal wires, Japanese team forges path to new uses for terahertz waves

image: Researchers from Tokyo University of Agriculture and Technology successfully tested reflectionless, high-refractive-index metasurfaces that may eventually be used in practical applications to send, receive, and manipulate light and radio waves in the terahertz (THz) waveband.

Image: 
Takehito Suzuki, Tokyo University of Agriculture and Technology

Japanese researchers successfully tested a reflectionless, high-refractive-index metasurface that may eventually be used in practical applications to send, receive, and manipulate light and radio waves in the terahertz (THz) waveband. Terahertz waves have wavelengths measured in micrometers, or millionths of a meter. The metasurface, an artificial two-dimensional flat material, was made of micro-sized cut metal wires of silver paste ink placed on both the front and back of a polyimide film. The team, led by Takehito Suzuki, Associate Professor at the Tokyo University of Agriculture and Technology (TUAT) Institute of Engineering, published their findings on April 29, 2021 in Optics Express.

Such flat metasurfaces represent a leap forward in the study of THz optics, because they may be flexible, adaptable to a much wider array of potential uses, and far smaller than the present generation of THz optics, which rely upon naturally occurring materials that have fixed indices of refraction in the THz waveband, such as cyclo-olefin polymer, magnesium oxide, and silicon. A material's index of refraction indicates how much more slowly electromagnetic waves travel in the material than in a vacuum.
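The relationship is easy to make concrete: at frequency f, a wave in a medium of index n travels at v = c/n and has wavelength λ = c/(n·f). A quick sketch (the index value below is illustrative, not the metasurface's measured value):

```python
C = 2.99792458e8  # speed of light in vacuum, m/s

def wavelength_um(freq_thz: float, n: float = 1.0) -> float:
    """Wavelength in micrometers of a wave at freq_thz terahertz
    travelling in a medium of refractive index n (v = c/n)."""
    return C / n / (freq_thz * 1e12) * 1e6

# In vacuum, 3.0 THz corresponds to a roughly 100-micrometer wavelength;
# a refractive index of 5 (illustrative only) compresses it fivefold.
print(f"{wavelength_um(3.0):.1f} um in vacuum")
print(f"{wavelength_um(3.0, n=5.0):.1f} um at n = 5")
```

Compressing the wavelength inside a high-index structure is what lets flat, sub-wavelength components do the job of bulky conventional optics.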

A greater ability to receive, transmit, control, and manipulate electromagnetic waves above 1.0 THz is necessary to unlock their potential, which remains largely untapped, according to Suzuki. "The reflectionless metasurface with a high refractive index above 1.0 THz can offer an accessible platform for terahertz flat optics such as 6G wireless communications and other possible commercial applications," Suzuki said. "In addition to vastly faster wireless data transfer speeds, a better ability to manipulate THz waves using metasurfaces may greatly advance technology in the areas of wavefront shaping, beam forming, polarization control, and optical vortices - subjects of great interest to the scientific and communication communities."

Suzuki's research team set out to support the greater scientific community's goal of replacing conventional three-dimensional bulky optical components with two-dimensional flat ones, a feat that would free up space and allow the development of smaller, more adaptable scientific and communication instruments, as well as more advanced security cameras.

The team of Harumi Asada, Kota Endo, and Takehito Suzuki created their experimental metasurface using silver paste ink and a very thin polyimide film. Cut metal wires of silver paste ink were laid onto the film by a super-fine ink-jet printer (SIJ Technology, Inc.) capable of drawing lines on the order of 10 micrometers in width, yielding the result they had hoped for: the metasurface, made of 80,036 pairs of cut metal wires on both the front and back of a 6 x 6 millimeter plot of polyimide film (roughly the size of an infant's thumbnail), has a high refractive index and low reflection at 3.0 THz.

Suzuki and his collaborating scientists plan to further investigate the potential of flat optics for use in the THz waveband, with the hope of finding scalable, commercially viable materials suitable for a wide array of future uses.

Credit: 
Tokyo University of Agriculture and Technology

Kaiser Permanente cancer survival rate higher among insured

PASADENA, Calif. -- Among cancer patients with health coverage in Southern California, those who were diagnosed and treated at Kaiser Permanente, an integrated health care organization, had better survival rates, especially Black and Latino patients, according to Kaiser Permanente research published in The American Journal of Managed Care.

"Kaiser Permanente is committed to finding and addressing health care inequities," said the study's senior author, Reina Haque, PhD, a cancer epidemiologist in the Kaiser Permanente Southern California Department of Research & Evaluation. "We investigated survival among insured patients with cancer to help pinpoint factors associated with mortality. We found that although Kaiser Permanente Southern California had a higher proportion of minority patients and those from lower socioeconomic status groups, the overall mortality rate among Kaiser Permanente members was still lower than in the group with other health coverage."

She added that the researchers also found markedly lower mortality across all age groups and diagnosis stages among Kaiser Permanente members in Southern California.

Researchers conducted a retrospective cohort analysis of all insured adults diagnosed with one of 8 common cancers (breast, prostate, lung, colon, melanoma, uterine, kidney, and bladder) between 2009 and 2014, identified through the California Cancer Registry, and followed them through December 2017. Nearly 165,000 adults were included in the study. Patients were covered through various health plans, such as HMOs, PPOs, and other private insurance plans. About one-quarter were covered through Medicare.

The study found that, after accounting for socioeconomic status, age, stage at diagnosis, gender, cancer site, and primary cancer treatments, African American patients diagnosed in non-Kaiser Permanente hospitals had a 14% higher risk of death, and Latino patients a 23% higher risk, than patients diagnosed with cancer at Kaiser Permanente in Southern California.

The study highlights the value of integrated health care delivery when treating patients with complex conditions such as cancer, said the lead author of the study, Robert Cooper, MD, a pediatric hematologist and oncologist at the Kaiser Permanente Los Angeles Medical Center.

"We suspect that the integrated nature of the Kaiser Permanente care system, where all care occurs in the same system and all caregivers are connected through the same electronic health record, may optimize care for patients with complex diseases," Dr. Cooper said. "The long history of connectedness has allowed for the development of systematic practices such as sharing of expertise and safety nets focused on caring for all of a patient's needs. These practices may help mitigate the poor outcomes for patients experiencing health disparities."

A related study that used the same California Cancer Registry data set was published in February in Cancer Causes & Control by the same authors. That study also examined mortality in insured cancer patients throughout Southern California and found that lower socioeconomic status was a stronger driver of mortality risk than race or ethnicity in the covered population of patients. In that study, the highest mortality rates were observed in the lowest socioeconomic status groups compared with patients in the highest socioeconomic status group, and that pattern persisted across all racial and ethnic groups. In fact, the study showed a 70% increased risk of mortality when comparing those in the lowest socioeconomic group with those in the highest.

Both studies were supported with funds from Kaiser Permanente Community Health, administered through the Kaiser Permanente Regional Research Committee. Authors, in addition to Drs. Cooper and Haque, are Joanie Chung, MPH, and Tiffany Hogan, MD.

Credit: 
Kaiser Permanente

COVID-19 vaccines in patients with cancer

What The Viewpoint Says: Questions regarding the safety of COVID-19 vaccination for patients with cancer are explored in this article.

Authors: Dimitrios P. Kontoyiannis, M.D., Sc.D., Ph.D., of the University of Texas MD Anderson Cancer Center in Houston, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamaoncol.2021.1218)

Editor's Note: The article includes conflicts of interest disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network