
Oldest enzyme in cellular respiration isolated

image: Ph.D. student Dragan Trifunovic with a big bottle and a small test tube containing cultured Thermotoga maritima bacteria

Image: 
Uwe Dettmar for Goethe University Frankfurt, Germany

For the first billion years of Earth's history, there was no oxygen. Life developed in an anoxic environment. Early bacteria probably obtained their energy by breaking down various substances through fermentation. However, there also seems to have been a kind of "oxygen-free respiration". This was suggested by studies on primordial microbes that are still found in anoxic habitats today.

"We already saw ten years ago that there are genes in these microbes that perhaps encode for a primordial respiration enzyme. Since then, we - as well as other groups worldwide - have attempted to prove the existence of this respiratory enzyme and to isolate it. For a long time unsuccessfully because the complex was too fragile and fell apart at each attempt to isolate it from the membrane. We found the fragments, but were unable to piece them together again," explains Professor Volker Müller from the Department of Molecular Microbiology and Bioenergetics at Goethe University.

Through hard work and perseverance, his doctoral researchers Martin Kuhns and Dragan Trifunovic then achieved a breakthrough in two successive doctoral theses. "In our desperation, we at some point took a heat-loving bacterium, Thermotoga maritima, which grows at temperatures between 60 and 90°C," explains Dragan Trifunovic, who will shortly complete his doctorate. "Thermotoga also contains Rnf genes, and we hoped that the Rnf enzyme in this bacterium would be a bit more stable. Over the years, we then managed to develop a method for isolating the entire Rnf enzyme from the membrane of these bacteria."

As the researchers report in their current paper, the enzyme complex functions a bit like a pumped-storage power plant that pumps water into a lake higher up and produces electricity via a turbine from the water flowing back down again.

In the bacterial cell, however, the Rnf enzyme (biochemical name: ferredoxin:NAD oxidoreductase) transports sodium ions from the cell's interior across the cell membrane to the outside and in so doing produces an electric field. This electric field is used to drive a cellular "turbine" (ATP synthase): it allows the sodium ions to flow back along the electric field into the cell's interior and in so doing obtains energy in the form of the cellular energy currency ATP.

The biochemical proof and the bioenergetic characterization of this primordial Rnf enzyme explain how the first forms of life produced the central energy currency ATP. The Rnf enzyme evidently functions so well that it is still found in many bacteria and some archaea today, including some pathogenic bacteria in which the role of the Rnf enzyme is still entirely unclear.

"Our studies thus radiate far beyond the organism Thermotoga maritima under investigation and are extremely important for bacterial physiology in general," explains Müller, adding that it is important now to understand exactly how the Rnf enzyme works and what role the individual parts play. "I'm happy to say that we're well on the way here, since we're meanwhile able to produce the Rnf enzyme ourselves using genetic engineering methods," he continues.

Credit: 
Goethe University Frankfurt

Novel approach reduces SCA1 symptoms in animal model

Research has shown that a mutation in the ATAXIN-1 gene leads to accumulation of Ataxin-1 (ATXN1) protein in brain cells and is the root cause of a rare genetic neurodegenerative disease known as spinocerebellar ataxia type 1 (SCA1). How healthy cells maintain a precise level of ATXN1 has remained a mystery, but now a study led by researchers at Baylor College of Medicine and the Jan and Dan Duncan Neurological Research Institute at Texas Children's Hospital reveals a novel mechanism that regulates ATXN1 levels.

Manipulating this mechanism in animal models of SCA1 reduced ATXN1 levels and improved some of the symptoms of the condition. The findings, published in the journal Genes & Development, offer the possibility of developing treatments that could improve the condition, for which there is no cure.

"SCA1 is characterized by progressive problems with movement, including loss of coordination, and balance (ataxia) and muscle weakness. People with SCA1 typically survive 15 to 20 years after symptoms first appear," said first author Larissa Nitschke, doctoral candidate in the lab of Dr. Huda Zoghbi at Baylor and Texas Children's.

"SCA1 is one of the adult-onset neurodegenerative diseases for which we know the genetic cause, in this case the gene ATXN1," said Zoghbi, corresponding author of the work and professor of molecular and human genetics, pediatrics and neuroscience, and Ralph D. Feigin, M.D. Endowed Chair at Baylor. "When we identified the gene, we learned that mutations can cause the ATXN1 protein to remain in cells longer than normally. This is bad news for neurons as too much ATXN1 leads to their death."

The findings suggested that lowering the levels of ATXN1 might result in improved symptoms, so Nitschke and her colleagues looked for mechanisms that cells use to control the levels of ATXN1.

How cells regulate ATXN1 levels

As with other genes, part of the ATXN1 gene codes for the protein itself and the rest is involved in regulating the expression of the RNA and protein encoded by the gene.

"We looked at a regulatory region known as 5-prime untranslated region (5' UTR), which is unusually long for the ATXN1 gene, and found that it keeps the protein in check so it does not accumulate to reach toxic levels," Nitschke said.

The researchers studied this region in great detail, piece by piece, looking to identify individual sequences or elements that might control the amount of ATXN1 that cells produce. They found several elements that fulfilled that function.

Nitschke and her colleagues focused on one regulatory element that seemed important because it is conserved in many species. They discovered that this short piece could regulate ATXN1 levels.

"We also found that we could reduce the amount of ATXN1 produced with a microRNA called miR760 that binds specifically to the conserved small piece in the 5'UTR region. MicroRNAs are tiny RNA molecules that cells use to regulate the production of specific proteins by interacting with regulatory regions," Nitschke said. "This finding encouraged us to test whether miR760 could reduce the amount of ATXN1 in animal models of SCA1."

Reducing ATXN1 in the cerebellum improves SCA1 symptoms in animal models

Testing the effect of miR760 on animal models of SCA1 had to be planned carefully.

"The role of ATXN1 in the brain is complex," said Zoghbi, director of the Jan and Dan Duncan Neurological Research Institute and member of the Howard Hughes Medical Institute. "Having too much ATXN1 in the back of the brain, the region called the cerebellum, which is involved in balance and coordination, results in balance problems. Having too little ATXN1 in the part of the brain for learning and memory increases the risk of Alzheimer's disease."

The researchers designed their experiments to reduce the levels of ATXN1 only in the cerebellum using gene therapy directed just at this brain region. The results were encouraging. Providing miR760 lowered the levels of ATXN1 and, importantly, improved motor and coordination deficits in the animal models of SCA1.

"The most exciting part of our findings was that we could reduce some of the symptoms of SCA1 in the animal models," Nitschke said. "Although we only lowered the levels of ATXN1 by about 25 percent, the mice significantly improved their movements. This result strongly supports further studies to explore the effectiveness of this approach to treat the human condition."

The findings not only highlight the importance of ATXN1 gene regulatory regions in SCA1, but also raise the possibility that mutations in these DNA elements could lead to increased levels of ATXN1 and in turn increase the risk of balance problems. Identifying and analyzing the sequences of such elements in people with balance problems might help provide a diagnosis.

Credit: 
Baylor College of Medicine

Measuring electron emission from irradiated biomolecules

When fast-moving ions cross paths with large biomolecules, the resulting collisions produce many low-energy electrons which can go on to ionise the molecules even further. To fully understand how biological structures are affected by this radiation, it is important for physicists to measure how electrons are scattered during collisions. So far, however, researchers' understanding of the process has remained limited. In new research published in EPJ D, researchers in India and Argentina, led by Lokesh Tribedi at the Tata Institute of Fundamental Research, have successfully determined the characteristics of electron emission when high-velocity ions collide with adenine - one of the four key nucleobases of DNA.

Since high-energy ions can break strands of DNA as they collide with them, the team's findings could improve our understanding of how radiation damage increases the risk of cancer developing within cells. In their experiment, they considered the 'double differential cross section' (DDCS) of adenine ionisation. This value defines the probability that electrons with specific energies and scattering angles will be produced when ions and molecules collide head-on, and is critical for understanding the extent to which biomolecules will be ionised by the electrons they emit.
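
Formally, in the standard notation used for ionisation measurements (not notation taken from the paper itself), the DDCS is the cross section differentiated with respect to the ejected-electron energy and the emission solid angle; integrating over angle and then over energy recovers the single differential and total ionisation cross sections:

$$
\mathrm{DDCS} \;=\; \frac{d^{2}\sigma}{d\varepsilon\, d\Omega},
\qquad
\frac{d\sigma}{d\varepsilon} \;=\; \int_{4\pi} \frac{d^{2}\sigma}{d\varepsilon\, d\Omega}\, d\Omega,
\qquad
\sigma \;=\; \int_{0}^{\varepsilon_{\max}} \frac{d\sigma}{d\varepsilon}\, d\varepsilon
$$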

To measure the value, Tribedi and colleagues carefully prepared a jet of adenine molecule vapour, which they crossed with a beam of high-energy carbon ions. They then measured the resulting ionisation through electron spectroscopy, which allowed them to determine adenine's electron emissions over a wide range of energies and scattering angles. Subsequently, the team could characterise the DDCS of the adenine-ion collision, producing a result which largely agreed with predictions made by computer models based on previous theories. Their findings could now lead to important advances in our knowledge of how biomolecules are affected by high-velocity ion radiation, potentially leading to a better understanding of how cancer can arise in cells following radiation damage.

Credit: 
Springer

Skoltech supercomputer helps scientists reveal most influential parameters for crop yields

image: A heatmap of the impact of key soil parameters on yield

Image: 
Pavel Odinev / Skoltech

Agriculture is becoming AI-native: Skoltech researchers have used the Zhores supercomputer to perform a highly precise sensitivity analysis revealing the crucial parameters for different crop yields in the chernozem region. Their paper was published in the proceedings of the International Conference on Computational Science 2020.

Farmers all over the world use digital crop models to predict crop yields; these models describe soil processes, climate, and crop properties and require environmental and agricultural management input data to calibrate them and improve the forecasts. In some countries, however, agrochemical data is not freely available for users of these models, and this calibration can become expensive and time-consuming.

A Skoltech team led by full professor Ivan Oseledets and assistant professor Maria Pukalchik used a popular open-source process-based model called MONICA and figured out a way to reveal only the most important parameters for crop yield based on historical data and process modeling. Moreover, they increased computational throughput from one simulation per day to half a million model simulations per hour using Zhores, the flagship Skoltech supercomputer.

This enormous number of simulations is necessary to perform a high-quality sensitivity analysis, which helps determine how changes in certain input factors (such as soil parameters or fertilizer) influence the predicted crop yield.

The research team used field data from an experiment in the Russian chernozem region, with seasonal crop-rotation of sugar beet (Beta vulgaris), spring barley (Hordeum vulgare), and soybean (Glycine max) observed from 2011 to 2017. They picked six main soil parameters for sensitivity analysis and performed what's called Sobol sensitivity analysis (named after Ilya Sobol, a Russian mathematician who proposed it in 2001).
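
As a rough illustration of what a Sobol analysis of this kind involves, the sketch below uses the open-source SALib package with six hypothetical soil parameters and a toy yield function standing in for the MONICA simulations; the parameter names, bounds and model are placeholders for illustration, not values from the study.

```python
# Minimal Sobol sensitivity analysis sketch with SALib (illustrative only).
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 6,
    "names": ["clay_frac", "sand_frac", "soil_organic_carbon",
              "pH", "bulk_density", "field_capacity"],      # hypothetical names
    "bounds": [[0.1, 0.5], [0.2, 0.7], [0.5, 4.0],
               [5.0, 8.0], [1.1, 1.6], [0.2, 0.45]],
}

def toy_yield_model(x):
    """Placeholder standing in for a MONICA crop-yield simulation."""
    clay, sand, soc, ph, bd, fc = x
    # clay and sand are deliberately unused, so their indices should be near zero.
    return 5.0 + 2.0 * soc - 1.5 * abs(ph - 6.5) + 3.0 * fc - 0.5 * bd

X = saltelli.sample(problem, 1024)              # quasi-random parameter samples
Y = np.apply_along_axis(toy_yield_model, 1, X)  # run the "model" for each sample
Si = sobol.analyze(problem, Y)                  # first-order and total indices

for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name:22s}  S1={s1:5.2f}  ST={st:5.2f}")
```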

"Soil is a very complicated issue in this country. Unfortunately, the data about soil properties and crop yield are not published. We have found an opportunity to overcome this barrier and set up the Zhores supercomputer to solve this issue. Now we can simulate all possible variants and reveal the most crucial parameters without time-consuming and costly work. We hope that our achievements will help farmers digitalize their crop growth," said Maria Pukalchik.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

Materials science researchers develop first electrically injected germanium tin laser

image: Fisher Yu, University of Arkansas

Image: 
University of Arkansas

Materials science researchers, led by electrical engineering professor Shui-Qing "Fisher" Yu, have demonstrated the first electrically injected laser made with germanium tin.

Because germanium tin is used as a semiconducting material for circuits on electronic devices, the diode laser could improve micro-processing speed and efficiency at much lower cost.

In tests, the laser operated in pulsed conditions up to 100 kelvins, or 279 degrees below zero Fahrenheit.
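
For reference, that figure is simply the standard kelvin-to-Fahrenheit conversion:

$$
T_{\mathrm{F}} \;=\; \left(T_{\mathrm{K}} - 273.15\right) \times \tfrac{9}{5} + 32
\;=\; (100 - 273.15) \times 1.8 + 32 \;\approx\; -279.7\ ^{\circ}\mathrm{F}
$$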

"Our results are a major advance for group-IV-based lasers," Yu said. "They could serve as the promising route for laser integration on silicon and a major step toward significantly improving circuits for electronics devices."

The research is sponsored by the Air Force Office of Scientific Research, and the findings have been published in Optica, the journal of The Optical Society. Yiyin Zhou, a U of A doctoral student in the microelectronics-photonics program, authored the article. Zhou and Yu worked with colleagues at several institutions, including Arizona State University, the University of Massachusetts Boston, Dartmouth College in New Hampshire and Wilkes University in Pennsylvania. The researchers also collaborated with Arktonics, an Arkansas semiconductor equipment manufacturer.

The alloy germanium tin is a promising semiconducting material that can be easily integrated into electronic circuits, such as those found in computer chips and sensors. The material could lead to the development of low-cost, lightweight, compact and low power-consuming electronic components that use light for information transmission and sensing.

Yu has worked with germanium tin for many years. Researchers in his laboratory have demonstrated the material's efficacy as a powerful semiconducting alloy. After reporting the fabrication of a first-generation, "optically pumped" laser, meaning the material was injected with light, Yu and researchers in his laboratory have continued to refine the material.

Credit: 
University of Arkansas

Sugar-based signature identifies T cells where HIV hides despite antiretroviral therapy

image: Dr. Mohamed Abdel-Mohsen

Image: 
The Wistar Institute

PHILADELPHIA -- (August 4, 2020) -- Scientists at The Wistar Institute may have discovered a new way of identifying and targeting hidden HIV viral reservoirs during treatment with antiretroviral therapy (ART). These findings were published today in Cell Reports and may have translational implications for improving the long-term care of HIV positive people.

ART has dramatically increased the health and life expectancy of HIV-infected individuals, suppressing virus replication in the host immune cells and stopping disease progression; however, low yet persistent amounts of virus remain in the blood and tissues despite therapy. Viral persistence limits immune recovery and is associated with chronic inflammation, so that treated HIV-infected individuals have a higher risk of developing a number of diseases.

This persistent infection stems from the ability of HIV to hide in a rare population of CD4 T cells. Finding new markers to identify the virus reservoir is of paramount importance to achieve HIV eradication.

"With recent advances that we are making in the fields of glycobiology and glycoimmunology, it has become clear that the sugar molecules present on the surface of immune cells play a critical role in regulating their functions and fate," said corresponding author Mohamed Abdel-Mohsen, Ph.D., assistant professor in The Wistar Institute Vaccine & Immunotherapy Center. "However, the relevance of host cell-surface glycosylation in HIV persistence remained largely unexplored, making it a 'dark matter' in our understanding of HIV latency. For the first time, we described a cell-surface glycomic signature that can impact HIV persistence."

Persistently infected cells can be divided into two groups: cells where the virus is completely silent and does not produce any RNA (i.e., silent HIV reservoir); and cells where the virus produces low levels of RNA (i.e., active HIV reservoir). Targeting and eliminating both types of reservoirs is the focus of the quest for an HIV cure. A main challenge in this quest is that we do not have a clear understanding of how these two types of infected cells are different from each other and from HIV-uninfected cells. Therefore, identifying markers that can distinguish these cells from each other is critical.

For their studies, Abdel-Mohsen and colleagues used a primary cell model of HIV latency to characterize the cell-surface glycomes of HIV-infected cells. They confirmed their results in CD4 cells directly isolated from HIV-infected individuals on ART.

They identified a process called fucosylation as a feature of persistently infected T cells in which the viral genome is actively being transcribed. Fucosylation is the attachment of a sugar molecule called fucose to proteins present on the cell surface and is critical for T-cell activation.

Researchers also found that the expression of a specific fucosylated antigen called Sialyl-LewisX (SLeX) identifies persistent HIV transcription in vivo and that primary CD4 T cells with high levels of SLeX have higher levels of T-cell pathways and proteins known to drive HIV transcription during ART. Such glycosylation patterns were not found on HIV-infected cells in which the virus is transcriptionally inactive, providing a distinguishing feature between these two cell compartments. Interestingly, researchers also found that HIV itself promotes these cell-surface glycomic changes.

Importantly, having a high level of SLeX is a feature of some cancer cells that allows them to metastasize (spread to other sites in the body). Indeed, researchers found that HIV-infected cells with high levels of SLeX are enriched in molecular pathways involved in trafficking between blood and tissues. These differential levels of trafficking might play an important role in the persistence of HIV in tissues, which are the main sites where HIV hides during ART.

Based on these findings, the role of fucosylation in HIV persistence warrants further studies to identify how it contributes to HIV persistence and how it could be used to target HIV reservoirs in blood and tissues.

Credit: 
The Wistar Institute

Russian developers created a platform for self-testing of AI medical services

image: The first working prototype of the platform is hosted on the popular GitHub service, and developers from all over the world can take part in its improvement by adding verification criteria depending on the purpose of the services.

Image: 
Center for Diagnostics and Telemedicine

Experts from the Center for Diagnostics and Telemedicine have developed a platform for self-testing of AI-based services designed for medical tasks, such as analyzing diagnostic images. The first working prototype of the platform is hosted on the popular GitHub service, and developers from all over the world can take part in its improvement by adding verification criteria depending on the purpose of the services. Sergey Morozov, CEO of the Center for Diagnostics and Telemedicine, spoke about this at the thematic week dedicated to artificial intelligence that was part of the program of the European Congress of Radiology (ECR 2020).

Before implementing a service based on artificial intelligence (AI) in routine clinical practice, it is necessary to test it for technical readiness and to verify whether it meets its stated characteristics. This process is called analytical validation of the algorithm. Services that pass it are allowed to be integrated into medical systems, including city healthcare.

Integration is a complex and expensive process, so it becomes a barrier for many teams that cannot guarantee the required accuracy and speed when their algorithm processes data from the system into which it is integrated. Currently, analytical validation is performed manually. Manual validation allows accidental or deliberate deviations from the approved test program as well as manipulation of datasets, and can potentially put different test participants in unequal conditions.

To solve these problems and automate the verification process, ensuring user trust, specialists at the Center for Diagnostics and Telemedicine have developed a platform that allows developers of AI-based services to independently conduct preliminary tests (analytical validation) of their algorithms. A prototype of the platform has been hosted on GitHub, and the first version of the service for exchanging datasets and data analysis results has already been uploaded.

"The platform provides unlimited access to single samples of data instances from the test set so that developers can fine-tune their algorithms. It has uniform rules of use, and several services can be tested simultaneously. At the same time, the platform records the time the software spends on data processing (a time study), and developers receive an automatic report on the test results," explains Sergey Morozov, CEO of the Center for Diagnostics and Telemedicine.

Automating the entire process on the self-testing platform minimizes the human factor, which makes data manipulation (to improve results) impossible. In addition, the comparison of the service's verification results with the reference data is completely transparent: the developer can see which metrics were used and how the final result reflected in the report was calculated.
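
A minimal sketch of the kind of automated check described above - running a service on reference-labelled cases, computing transparent metrics, and timing each case - might look as follows. This is not the platform's actual code; the case format, metric choices and decision threshold are assumptions for illustration.

```python
# Illustrative automated validation: metrics against reference labels plus timing.
import time
from sklearn.metrics import confusion_matrix, roc_auc_score

def validate(predict_fn, cases, threshold=0.5):
    """Time predict_fn on each case and report standard binary metrics."""
    y_true, y_score, timings = [], [], []
    for case in cases:
        start = time.perf_counter()
        score = predict_fn(case["image_path"])       # probability of pathology
        timings.append(time.perf_counter() - start)
        y_true.append(case["reference_label"])       # 0 = normal, 1 = pathology
        y_score.append(score)

    y_pred = [int(s >= threshold) for s in y_score]
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    return {
        "auc": float(roc_auc_score(y_true, y_score)),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "mean_seconds_per_case": sum(timings) / len(timings),
    }

# Example with a dummy "service" and two labelled cases (hypothetical files):
cases = [{"image_path": "a.dcm", "reference_label": 1},
         {"image_path": "b.dcm", "reference_label": 0}]
print(validate(lambda path: 0.9 if path == "a.dcm" else 0.2, cases))
```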

"Anyone can take part in improving the platform and add the metrics needed to evaluate an algorithm's performance for particular medical purposes (for example, analyzing radiographs or mammograms). However, additions to the platform will be moderated: only metrics with scientific justification will be included in the platform operated by the Center," notes Nikolai Pavlov, developer of the platform and Head of the Dataset Labeling Conveyor in the Medical Informatics, Radiomics and Radiogenomics Sector at the Center for Diagnostics and Telemedicine.

The creators of the platform invite developers of AI algorithms, programmers and researchers to take part in updating and improving the platform, so that the international community gains a uniform, universal and user-friendly tool for self-testing artificial intelligence algorithms intended for medical purposes. At the moment, no such tool aimed specifically at the clinical implementation of AI-based services exists.

Credit: 
Center for Diagnostics and Telemedicine

Integration of gene regulatory networks in understanding animal behavior

image: Saurabh Sinha, Director of Computational Genomics and computer science professor at University of Illinois

Image: 
L. Brian Stauffer

For years, scientists have attributed animal behavior to the coordinated activities of neuronal cells and their circuits, known as the neuronal network (NN). However, researchers are now pushing the boundaries of our understanding of animal behavior by integrating gene regulation.

Fueled by a long-time collaboration with Carl R. Woese Institute for Genomic Biology (IGB) Director and entomology professor Gene Robinson at the University of Illinois Urbana-Champaign, incoming IGB Director of Computational Genomics and computer science professor Saurabh Sinha helped organize a workshop on "Cis-Regulatory Evolution in Development and Behavior" in 2018 to push a new line of thinking.

"One of the remarkable findings from a study led by Gene and his collaborators was that more eusocial insects seemed to have something different about their regulatory genome," said Sinha. "It seemed that there was some sort of evolutionary signature of complex social behavior that we hadn't really expected and was one of those findings that really made you re-think the implications."

The two-day workshop brought together people with a diverse set of skill sets, who exchanged and challenged ideas during discussions on various topics. Two years later, the results of those discussions culminated in a perspective article published in the Proceedings of the National Academy of Sciences.

"The starting point for this perspective is that the NN is the de facto standard for understanding what goes on in the brain as pertinent to behavior," said Sinha. "Our goal was to highlight another level of dynamics that accompany behavior and not just the dynamics of the NN."

The authors of the perspective synthesized current evidence on the role of the gene regulatory networks (GRNs) - a collection of regulatory interactions between genes - in the context of animal behavior along with the NN. Behavior-associated GRNs (bGRNs) impact gene expression changes associated with a certain animal behavior while developmental GRNs (dGRNs) influence development of new cells and connections in the brain. The integration of NNs, bGRNs and dGRNs across multiple scales holds potential in understanding how these networks work in concert to regulate animal behavior.

"Our first goal was to simply emphasize the significance of the GRN in the behavioral context, before speculating on how the GRN might interact with the NN since current research is lacking," said Sinha. "One example of an interaction between the NN and GRN could be the modulation of neuronal transmission activity through control of protein or peptide expression by the GRN."

Through experimental mapping of these networks, changes in gene expression can be correlated with behaviors in different cell types. Emerging technologies will play a key role in these efforts. "Measuring gene expression in the brain has been fraught with the heterogeneity of the brain where you have so many different cell types," said Sinha. "The fact that we have single-cell technology really taking off means that we can have a proper resolution of GRNs in the brain and therefore, examine how cell type-specific GRNs interact with signal transmission through the NN."

The perspective also touches on how environmental factors and social behavior affect GRNs, which then go on to modulate NN function and behavior. "The environment can induce epigenetic and longer-lasting changes that then lead to the GRN becoming different," said Sinha. "Looking at brain function not only through the lens of the NN but also through GRNs allows us to bring in the environment in a credible way. In regard to social behavior, there is probably a difference in the GRN of more eusocial bees and that is a starting point for the intriguing possibility that social behavior has some unique characteristics in its GRNs."

With the emergence of technologies, future analyses of bGRNs and the interchange between bGRNs, dGRNs and NNs in various behavioral contexts will provide a deeper understanding of animal behavior.

Credit: 
Carl R. Woese Institute for Genomic Biology, University of Illinois at Urbana-Champaign

Inexpensive, accessible device provides visual proof that masks block droplets

DURHAM, N.C. - Duke physician Eric Westman was one of the first champions of masking as a means to curtail the spread of coronavirus, working with a local non-profit to provide free masks to at-risk and under-served populations in the greater Durham community.

But he needed to know whether the virus-blocking claims made by mask suppliers were true, to ensure he wasn't providing ineffective masks that spread viruses along with false security. So he turned to colleagues in the Duke Department of Physics: Could someone test various masks for him?

Martin Fischer, Ph.D., a chemist and physicist, stepped up. As director of the Advanced Light Imaging and Spectroscopy facility, he normally focuses on exploring new optical contrast mechanisms for molecular imaging, but for this task, he MacGyvered a relatively inexpensive apparatus from common lab materials that can easily be purchased online. The setup consisted of a box, a laser, a lens, and a cell phone camera.
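
Counting the bright flashes that droplets produce as they cross the laser beam is the kind of task that can be automated with off-the-shelf tools. The sketch below is a minimal illustration under assumptions; it is not the Duke team's analysis code, and the brightness threshold and minimum blob size are arbitrary placeholder values.

```python
# Illustrative droplet-flash counting in a darkened-box laser recording.
import cv2

def count_droplet_flashes(video_path, threshold=60, min_area=3):
    """Count bright scattering spots per frame of a recording."""
    counts = []
    capture = cv2.VideoCapture(video_path)
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
        n_labels, _, stats, _ = cv2.connectedComponentsWithStats(mask)
        # Label 0 is the background; keep blobs above a minimum pixel area.
        spots = sum(1 for i in range(1, n_labels)
                    if stats[i, cv2.CC_STAT_AREA] >= min_area)
        counts.append(spots)
    capture.release()
    return counts

# Example comparison (hypothetical file names):
# print(sum(count_droplet_flashes("no_mask.mp4")), sum(count_droplet_flashes("n95.mp4")))
```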

In a proof-of-concept study appearing online Aug. 7 in the journal Science Advances, Fischer, Westman and colleagues report that the simple, low-cost technique provided visual proof that face masks are effective in reducing droplet emissions during normal wear.

"We confirmed that when people speak, small droplets get expelled, so disease can be spread by talking, without coughing or sneezing," Fischer said. "We could also see that some face coverings performed much better than others in blocking expelled particles."

Notably, the researchers report, the best face coverings were N95 masks without valves - the hospital-grade coverings that are used by front-line health care workers. Surgical or polypropylene masks also performed well.

But hand-made cotton face coverings provided good coverage, eliminating a substantial amount of the spray from normal speech.

On the other hand, bandanas and neck fleeces such as balaclavas didn't block the droplets much at all.

"This was just a demonstration - more work is required to investigate variations in masks, speakers, and how people wear them - but it demonstrates that this sort of test could easily be conducted by businesses and others that are providing masks to their employees or patrons," Fischer said.

"Wearing a mask is a simple and easy way to reduce the spread of COVID-19," Westman said. "About half of infections are from people who don't show symptoms, and often don't know they're infected. They can unknowingly spread the virus when the cough, sneeze and just talk.

"If everyone wore a mask, we could stop up to 99% of these droplets before they reach someone else," Westman said. "In the absence of a vaccine or antiviral medicine, it's the one proven way to protect others as well as yourself."

Westman and Fischer said it's important that businesses supplying masks to the public and employees have good information about the products they're providing, to ensure the best protection possible.

"We wanted to develop a simple, low-cost method that we could share with others in the community to encourage the testing of materials, masks prototypes and fittings," Fischer said. "The parts for the test apparatus are accessible and easy to assemble, and we've shown that they can provide helpful information about the effectiveness of masking."

Westman said he put the information immediately to use: "We were trying to make a decision on what type of face covering to purchase in volume, and little information was available on these new materials that were being used."

The masks that he was about to purchase for the "Cover Durham" initiative?

"They were no good," Westman said. "The notion that 'anything is better than nothing' didn't hold true."

Credit: 
Duke University Medical Center

Advance in programmable synthetic materials

image: Rods of multivariate MOFs (left) can be programmed with different metal atoms (colored balls) to do a series of chemical tasks, such as controlled drug release, or to encode information like the ones and zeros in a digital computer.

Image: 
UC Berkeley image by Omar Yaghi and Zhe Ji

Artificial molecules could one day form the information unit of a new type of computer or be the basis for programmable substances. The information would be encoded in the spatial arrangement of the individual atoms - similar to how the sequence of base pairs determines the information content of DNA, or sequences of zeros and ones form the memory of computers.

Researchers at the University of California, Berkeley, and Ruhr-Universität Bochum (RUB) have taken a step towards this vision. They showed that atom probe tomography can be used to read a complex spatial arrangement of metal ions in multivariate metal-organic frameworks.

Metal-organic frameworks (MOFs) are crystalline porous networks of multi-metal nodes linked together by organic units to form a well-defined structure. To encode information using a sequence of metals, it is essential first to be able to read the metal arrangement. However, reading the arrangement has been extremely challenging. Recently, interest in characterizing metal sequences has been growing because of the extensive information such multivariate structures could offer.

Until now, there was no method to read the metal sequence in MOFs. In the current study, the research team has successfully done so using atom probe tomography (APT), a technique in which the Bochum-based materials scientist Tong Li is an expert. The researchers chose MOF-74, made by the Yaghi group in 2005, as an object of interest. They designed the MOFs with mixed combinations of cobalt, cadmium, lead, and manganese, and then decrypted their spatial structure using APT.
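
As a purely conceptual illustration of the encoding idea (not a description of how MOF-74 sequences are actually written or read), four distinguishable metals can carry two bits per site, just as zeros and ones carry one bit each:

```python
# Toy encoding: four metals -> two bits of information per read-out site.
METAL_TO_BITS = {"Co": "00", "Cd": "01", "Pb": "10", "Mn": "11"}
BITS_TO_METAL = {bits: metal for metal, bits in METAL_TO_BITS.items()}

def encode(bitstring):
    """Map a bit string (length divisible by 2) onto a metal sequence."""
    return [BITS_TO_METAL[bitstring[i:i + 2]] for i in range(0, len(bitstring), 2)]

def decode(metal_sequence):
    """Recover the bit string from a metal sequence (e.g. as read out by APT)."""
    return "".join(METAL_TO_BITS[m] for m in metal_sequence)

sequence = encode("0110110001")          # 10 bits -> 5 metal sites
print(sequence)                          # ['Cd', 'Pb', 'Mn', 'Co', 'Cd']
print(decode(sequence) == "0110110001")  # True
```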

Li, professor and head of the Atomic-Scale Characterisation research group at the Institute for Materials at RUB, describes the method together with Dr. Zhe Ji and Professor Omar Yaghi from UC Berkeley in the journal Science, published online on August 7, 2020.

Just as sophisticated as biology

In the future, MOFs could form the basis of programmable chemical molecules: for instance, an MOF could be programmed to introduce an active pharmaceutical ingredient into the body to target infected cells and then break down the active ingredient into harmless substances once it is no longer needed. Or MOFs could be programmed to release different drugs at different times.

"This is very powerful, because you are basically coding the behavior of molecules leaving the pores," Yaghi said.

They could also be used to capture CO2 and, at the same time, convert the CO2 into a useful raw material for the chemical industry.

"In the long term, such structures with programmed atomic sequences can completely change our way of thinking about material synthesis," write the authors. "The synthetic world could reach a whole new level of precision and sophistication that has previously been reserved for biology."

Credit: 
University of California - Berkeley

Decline in plant breeding programs could impact food security

image: A team of scientists led by Kate Evans, a Washington State University horticulture professor who leads WSU's pome fruit (apples and pears) breeding program, found that public plant breeding programs are seeing decreases in funding and personnel.

Image: 
WSU

Public plant breeding programs are declining across the United States.

A team of scientists led by Kate Evans, a Washington State University horticulture professor who leads WSU's pome fruit (apples and pears) breeding program, found that public plant breeding programs are seeing decreases in funding and personnel.

The study was published in the journal Crop Science.

Evans and her colleagues conducted a survey of 278 plant breeding programs around the country. Public programs are chiefly federal programs, like those run by the U.S. Department of Agriculture, or based at public research universities.

In the surveys, respondents estimated a 21.4% decline in full-time equivalent (FTE) time for program leaders over the past five years and a 17.7% decline in FTE time for technical support personnel.

The researchers also found that retirement looms for a significant number of plant breeding program leaders. Over a third of the responding programs reported having leaders over the age of 60 and 62% are led by people over 50.

This decline is concerning because plant breeding has a direct impact on food security, Evans said.

"Plant breeding plays a fundamental part of the long-term food security of this country," Evans said. "The tremendous increases in food production over the past century are largely due to plant breeding, and the world's population is only increasing."

Food security has received even more attention in the last few months, as the COVID-19 pandemic has moved around the world, she said.

"Plant breeding is a long-term, sustainable way to address concerns over having enough food and keeping our food sources secure," said Evans, who is based at WSU's Tree Fruit Research & Extension Center in Wenatchee.

Plant breeding takes many forms, from breeding for disease tolerance and increased production to introducing delicious new varieties and improving drought tolerance.

"It could be a disease, a pest, climate change, any number of things," Evans said. "We do not live in a stable environment, and there are many different ways to deal with that."

Plant pathogens, like bacteria, and pests are always adapting, so varieties of crops that were bred to naturally fight off a disease start to lose their defenses. Plant breeding programs help growers stay ahead of those potentially harmful adaptations.

Another impact of declining breeding programs is losing those with a local focus.

"In Washington, for example, our cereal breeding programs are very focused on local production," Evans said. "They breed wheat that grows very well for eastern Washington."

Another example is the citrus industry. Citrus greening disease has been devastating to growers, particularly in Florida, causing trees to produce bitter, green and misshapen fruit. Plant breeding programs are working hard to develop varieties that naturally repel the pest that causes the problems.

One reason that plant breeding programs are declining is expense. It takes many years to develop a new variety of a crop, Evans said. And funding a program for that long requires significant investment.

"We can't rely on grants because those are often only for a few years," she said. "You can't do anything in plant breeding in three years, it requires long-term sustained funding to get a program going."

Credit: 
Washington State University

COVID recovery choices shape future climate

A post-lockdown economic recovery plan that incorporates and emphasises climate-friendly choices could help significantly in the battle against global warming, according to a new study.

This is despite the sudden reduction of greenhouse gas emissions and air pollutants during lockdown having a negligible impact on holding down global temperature change.

The researchers warn that even with some lockdown measures staying in place to the end of 2021, without more structural interventions global temperatures will only be roughly 0.01°C lower than expected by 2030.

However, the international study, led by the University of Leeds, estimates that including climate policy measures as part of an economic recovery plan with strong green stimulus could prevent more than half of additional warming expected by 2050 under current policies.

This would provide a good chance of global temperatures staying below the Paris Agreement's aspirational 1.5°C global warming limit and avoiding the risks and severe impacts that higher temperatures will bring.

Piers Forster began working with his daughter, Harriet, after her A levels were cancelled. They analysed the newly accessible global mobility data from Google and Apple. They calculated how 10 different greenhouse gases and air pollutants changed between February and June 2020 in 123 countries. They then brought in a wider team to help with the detailed analysis.

The team's findings, published today in Nature Climate Change, detail how, despite carbon dioxide (CO2), nitrogen oxides (NOx) and other emissions falling by between 10 and 30% globally through the massive behavioural shifts seen during lockdown, there will be only a tiny impact on the climate, mainly because the decrease in emissions from confinement measures is temporary.

The researchers also modelled options for post-lockdown recovery, showing that the current situation provides a unique opportunity to implement a structural economic change that could help us move towards a more resilient, net-zero emissions future.

Study lead author Professor Piers Forster, director of the Priestley International Centre for Climate at Leeds and Principal Investigator of the CONSTRAIN consortium, said: "The choices made now could give us a strong chance of avoiding 0.3°C of additional warming by mid-century, halving the expected warming under current policies. This could mean the difference between success and failure when it comes to avoiding dangerous climate change.

"The study also highlights the opportunities in lowering traffic pollution by encouraging low emissions vehicles, public transport and cycle lanes. The better air quality will immediately have important health effects - and it will immediately start cooling the climate."

Study co-author Harriet Forster, who has just completed her studies at Queen Margaret's School, said: "Our paper shows that the actual effect of lockdown on the climate is small. The important thing to recognise is that we've been given a massive opportunity to boost the economy by investing in green industries - and this can make a huge difference to our future climate.

"I'm going to London next month to study art but I also did chemistry at A-level so was glad to use what I learned in my chemistry classes to do something useful."

Study co-author Corinne Le Quéré from the University of East Anglia said: "The fall in emissions we experienced during COVID-19 is temporary and therefore it will do nothing to slow down climate change, but the Government responses could be a turning point if they focus on a green recovery, helping to avoid severe impacts from climate change."

Study co-author Joeri Rogelj from the Grantham Institute - Climate Change and the Environment at Imperial College London said: "Both sobering and hopeful, the flash crash in global emissions due to lockdown measures will have no measurable impact on global temperatures by 2030; but the decisions we make this year about how to recover from this crisis can put us on a solid track to meet the Paris Agreement. Out of this tragedy comes an opportunity, but unless it is seized a more polluting next decade is not excluded."

Study co-author Matthew Gidden from Climate Analytics, Berlin said: "The lasting effect of COVID-19 on climate will not depend on what happens during the crisis, but what comes after. "Stimulus focused on green recovery and low-carbon investment can provide the economic kick start needed while putting the world on track to meet climate pledges."

Study co-author Professor Mathew Evans, from the Wolfson Atmospheric Chemistry Laboratories at the University of York and the National Centre for Atmospheric Science, said: "The analysis of air quality observations from around the world showed us that the emissions reductions captured by Google and Apple's mobility data were pretty close to those actually being experienced."

Study co-author Christoph Keller from Goddard Earth Sciences, Technology and Research (GESTAR) based in the Global Modeling and Assimilation Office (GMAO) at NASA GSFC said: "The decrease in human activity in the wake of the COVID-19 pandemic has created a unique opportunity to better quantify the human impact on atmospheric air pollution.

"Near real-time analysis of observations, mobility data, and NASA model simulations offers quantitative insights into the impact of COVID-19 containment measures on air pollution. This study demonstrates how such information can help to advance our understanding of the complicated interactions between air quality and climate."

Further information:

Link to media resources: https://constrain-eu.org/media-resources-forster-et-al-2020/

(Includes animation of fraction of usual NOx and SO2 emissions due to COVID-19)

Please credit all use of resources to CONSTRAIN


The paper Current and future global climate impacts resulting from COVID-19 is published in Nature Climate Change on 07 August 2020. (DOI: 10.1038/s41558-020-0883-0)

Once published the paper will be available at: https://www.nature.com/articles/s41558-020-0883-0

Christoph Keller is based in the Global Modeling and Assimilation Office (GMAO) at NASA GSFC, in Greenbelt, MD, just outside Washington, DC. He is employed by the Universities Space Research Association (USRA) in the institute "Goddard Earth Sciences, Technology and Research (GESTAR)" funded by GSFC. https://gmao.gsfc.nasa.gov/

For additional information contact University of Leeds press officer at a.harrison@leeds.ac.uk

Q&A Current and future global climate impacts resulting from COVID-19

What did the study do?

* The team used newly available mobility data from Apple and Google to estimate how emissions of 10 different greenhouse gases and air pollutants changed between February and June 2020, a time of unprecedented restrictions on work and travel due to COVID-19 lockdowns.

* The data, which covered a total of 123 countries responsible for 99% of global fossil fuel CO2 emissions, provided a unique opportunity to rapidly compare emissions trends consistently across countries and sectors.

* For each country, the team used the mobility data to establish changes in activity levels for six economic sectors (surface transport, residential, power, industry, public/commercial, and domestic aviation).

** For countries where access to Google data was not possible, such as China, Russia and Iran (all large emitters that imposed strict lockdowns), the methodology developed by Le Quéré et al. (2020) was used.

** The team also used Le Quéré et al. to provide estimates for international aviation and shipping.

* The changes in activity/mobility over time were used to estimate how emissions had changed during lockdown, compared to recent baseline emissions (a minimal illustration of this sector-by-sector scaling step appears after this list):

** For CO2 the baseline was taken from Le Quéré et al. (2019 levels).

** For all other emissions we used the EDGAR database (2015 levels).

* For CO2 alone, results are consistent with the study of Le Quéré et al. based on the analysis of confinement measures and activity data. The method used here makes use of mobility trends at the country level, which is more direct than using confinement measures but could overestimate changes by around 20%.

* Observed concentrations of nitrogen dioxide (NO2) from surface air-quality monitoring sites in 32 countries around the world were coupled to NASA's global air pollution model to predict what the concentration of NO2 would have been without the COVID-19 restrictions. Comparing the actual observed concentrations during the restrictions to those predicted by the model provided another way to estimate the change in the emissions of oxides of nitrogen.

* The team then developed a simple set of assumptions to estimate how lockdown emissions changes translated into temperature change - the direct effect of global lockdown on climate. In doing so, the team assumed some restrictions on activity due to COVID-19 (66% of the restriction levels seen in June 2020) would remain in place until the end of 2021, representing a "two-year blip".

* Using a simple climate model, the team also considered how choices made around economic recovery from the COVID-19 crisis will affect future emissions pathways and therefore global temperatures, from now until 2050.

* These choices included economic recoveries driven by green stimulus packages or increasing reliance on fossil fuels, which were compared to a baseline reflecting a direct return, post-lockdown, to pre-COVID-19 policies and associated emissions levels. In each case, the team included the "two-year blip" at the start.

** Our baseline represents emissions levels reflecting Nationally Determined Contributions (NDC) until 2030, with no significant strengthening of climate action thereafter.

** The fossil-fuelled recovery assumes strong support for fossil-fuels (an additional 1% of GDP invested). Emissions are 10% higher in 2030 compared to the baseline and continue to rise thereafter.

** The moderate green stimulus assumes that recovery packages target low-carbon energy supply and energy efficiency (an additional 0.8% of GDP invested), do not support bailouts for fossil firms, and begin to structurally change the carbon intensity of economic activity. Greenhouse gas emissions decrease by about 35% by 2030 relative to the baseline and reach global net-zero CO2 by 2060.

** The strong green stimulus invests an additional 1.2% of GDP in low carbon technologies and reduces investment in fossil fuels, leading to a 50% decrease in greenhouse gas emissions by 2030 and global net-zero CO2 by 2050.
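
As referenced in the method list above, lockdown emission changes were estimated by scaling sectoral baseline emissions by the mobility-derived change in activity. The sketch below is a minimal illustration of that scaling step only; the sector baselines and activity changes are made-up placeholders, not the study's values (which drew on Le Quéré et al. for CO2 and EDGAR for other emissions).

```python
# Illustrative sector-by-sector scaling of emissions by activity changes.
BASELINE_CO2_MT_PER_DAY = {        # hypothetical country baselines, Mt CO2/day
    "surface_transport": 20.0,
    "residential": 6.0,
    "power": 40.0,
    "industry": 22.0,
    "public_commercial": 4.0,
    "domestic_aviation": 1.0,
}

def daily_emission_change(activity_change):
    """Total change in daily CO2 emissions (Mt/day) given fractional
    activity changes per sector (e.g. -0.5 means a 50% reduction)."""
    return sum(BASELINE_CO2_MT_PER_DAY[sector] * change
               for sector, change in activity_change.items())

# Example: a hypothetical lockdown day in April 2020
april_changes = {
    "surface_transport": -0.50,
    "residential": +0.05,
    "power": -0.10,
    "industry": -0.20,
    "public_commercial": -0.35,
    "domestic_aviation": -0.70,
}
print(f"Change: {daily_emission_change(april_changes):+.1f} Mt CO2/day")
```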

What did the study find? (Only central values are presented here; full uncertainty ranges are reported in the paper.)

* The team's analysis shows that emissions reductions likely peaked in mid-April 2020, with carbon dioxide (CO2), nitrogen oxides (NOx) and other emissions falling by between 10-30% globally.

** Changes in surface transport were the biggest driver for most types of emission.

** Changes also occurred worldwide, with most countries contributing to the fall in emissions (mobility fell by 10% or more during April 2020 in all but one country, and by 80% in five or more countries).

** These findings are also reflected in satellite data and local ground-based observations, which show similar declines in air pollution.

** The reductions calculated by the mobility data were very similar to the reductions calculated from the air quality monitoring data.

* However, the direct temperature impact of the pandemic will be negligible: even with some lockdown measures staying in place to the end of 2021, global temperatures will only be around 0.01°C lower than expected by 2030 (compared to the current baseline).

** It will be difficult to see any effect of the pandemic on climate before 2030 because of the temporary nature of the lockdown emission changes and also the short-term cancellation effects on climate from changes in NOx and SO2 described below.

** Falls in NOx emissions would normally lead to further cooling in the short term, but this is offset by warming from a 20% reduction in SO2 emissions, and the two will balance out by 2030 (SO2 emissions lead to the formation of aerosols, which reflect sunlight back to space and cool the planet, so reducing SO2 reduces this cooling effect).

* Although it will be difficult to see the effects of lockdown on climate in the next decade, after 2030, differences begin to emerge depending on the choices made:

** If, after a two-year blip, economic recovery goes back to current investment levels, or we choose a recovery that strongly invests in fossil fuels, we are likely (>80% probability) to see warming of more than 1.5 °C above preindustrial levels by 2050.

** But if we choose a pathway with a strong green stimulus, investing around 1.2% of global GDP in low carbon technologies, and including climate policy measures, we could prevent around 0.3°C of additional warming by 2050.

** This would give us a good chance (~55%) of staying below the Paris Agreement's 1.5°C aspirational temperature goal.

What are the implications?

* As above, the direct effect of the COVID-19 pandemic on the climate will be negligible - a difference of only around 0.01°C by 2030.

* Lockdown's massive but temporary shifts in behaviour have therefore only had a tiny impact on the climate, and pollution levels across the world are already returning to near normal. This means we need structural change in the long-term in order to avoid dangerous climate change.

* The investment choices we make about economic recovery will strongly affect our climate trajectory to mid-century:

** A green recovery that invests in low carbon technologies, avoids fossil fuel lock-in, and cuts global emissions to net-zero by 2050 would mean we avoid around 0.3°C of warming by 2050 - half of the expected 0.6°C warming under current policies.

** This would also set the world on track for meeting the Paris Agreement's long-term temperature goal.

** This 0.3°C could therefore represent the difference between us facing or avoiding dangerous climate change.

* In the short term, policies that cut road transport emissions (NOx) will help to offset any temporary warming from cleaning up SO2 emissions from the power and industry sectors.

** This will be especially important at a regional level where changes in aerosol concentration can lead to risks from extreme weather, such as heatwaves or rainfall, adding to the economic and health burden caused by the pandemic.

* Finally, rapid and easy access to big data can clearly contribute, in new and unexpected ways, to the evidence base for scientific studies relating to COVID-19. We encourage Google, Apple and others to make their data freely available, and to promote its application.

Credit: 
University of Leeds

Transgender and gender-diverse individuals more likely to be autistic

Transgender and gender-diverse adults are three to six times more likely than cisgender adults (individuals whose gender identity corresponds to their sex assigned at birth) to be diagnosed as autistic, according to a new study by scientists at the University of Cambridge's Autism Research Centre.

This research, conducted using data from over 600,000 adult individuals, confirms previous smaller scale studies from clinics. The results are published today in Nature Communications.

A better understanding of gender diversity in autistic individuals will help provide better access to health care and post-diagnostic support for autistic transgender and gender-diverse individuals.

The team used five different datasets, including a dataset of over 500,000 individuals collected as a part of the Channel 4 documentary "Are you autistic?". In these datasets, participants had provided information about their gender identity, and if they received a diagnosis of autism or other psychiatric conditions such as depression or schizophrenia. Participants also completed a measure of autistic traits.

Strikingly, across all five datasets, the team found that transgender and gender-diverse adult individuals were between three and six times more likely to indicate that they were diagnosed as autistic compared to cisgender individuals. While the study used data from adults who indicated that they had received an autism diagnosis, it is likely that many individuals on the autistic spectrum may be undiagnosed. As around 1.1% of the UK population is estimated to be on the autistic spectrum, this result would suggest that somewhere between 3.5% and 6.5% of transgender and gender-diverse adults are on the autistic spectrum.

Dr Meng-Chuan Lai, a collaborator on the study at the University of Toronto, said: "We are beginning to learn more about how the presentation of autism differs in cisgender men and women. Understanding how autism manifests in transgender and gender-diverse people will enrich our knowledge about autism in relation to gender and sex. This enables clinicians to better recognize autism and provide personalised support and health care."

Transgender and gender-diverse individuals were also more likely to indicate that they had received diagnoses of mental health conditions, particularly depression, which they were more than twice as likely as their cisgender counterparts to have experienced. Transgender and gender-diverse individuals also, on average, scored higher on measures of autistic traits compared to cisgender individuals, regardless of whether they had an autism diagnosis.

Dr Varun Warrier, who led the study, said: "This finding, using large datasets, confirms that the co-occurrence between being autistic and being transgender and gender-diverse is robust. We now need to understand the significance of this co-occurrence, and identify and address the factors that contribute to well-being of this group of people."

The study investigated the co-occurrence of gender diversity and autism; the team did not investigate whether one causes the other.

Professor Simon Baron-Cohen, Director of the Autism Research Centre at Cambridge and a member of the team, said: "Both autistic individuals and transgender and gender-diverse individuals are marginalized and experience multiple vulnerabilities. It is important that we safeguard the rights of these individuals to be themselves, receive the requisite support, and enjoy equality and celebration of their differences, free of societal stigma or discrimination."

Credit: 
University of Cambridge

New Zealand's Southern Alps glacier melt has doubled

image: Rob Roy glacier in the Matukituki valley in December 2018; this glacier, on steep hillslopes, is now disconnected from a valley-floor section (out of sight).

Image: 
Jonathan Carrivick, University of Leeds

Glaciers in the Southern Alps of New Zealand have lost more ice mass since pre-industrial times than remains today, according to a new study.

Research led by the University of Leeds, in collaboration with the National Institute of Water and Atmospheric Research (NIWA) in New Zealand, mapped Southern Alps ice loss from the end of the Little Ice Age -- roughly 400 years ago -- to 2019.

The study found that the rate of ice loss has doubled since glaciers were at their Little Ice Age peak extent, with the loss accelerating in recent decades. In total, the Southern Alps have lost up to 77% of their Little Ice Age glacier volume.

Climate change has had a significant impact on ice loss around the world. Not only do local communities depend on glaciers as sources of fresh water, hydropower and irrigation, but mountain glacier and ice cap melt presently accounts for 25% of global sea-level rise.

Rapid changes observed today for mountain glaciers need to be put into a longer-term context to understand global sea-level contributions, regional climate-glacier systems and local landscape evolution.

The study, published in the journal Scientific Reports, determined volume changes for 400 mountain glaciers across New Zealand's Southern Alps for three time periods: the pre-industrial Little Ice Age to 1978, 1978 to 2009, and 2009 to 2019.

The team reconstructed glacier volumes using historical records of glacier outlines, as well as examinations of moraines and trimlines, which are accumulations of glacial debris and clear lines on the side of a valley formed by a glacier, respectively. Moraines and trimlines can indicate former ice margin extent and ice thickness changes through time.

By comparing the glacier surface reconstructed for the Little Ice Age peak with glacier surfaces in more recent digital elevation models, the study found that ice loss has increased two-fold since the Little Ice Age, with a rapid increase in ice volume loss in the last 40 years.
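The volume-change calculation behind that comparison is, in essence, a surface-differencing exercise: subtract the modern glacier surface from the reconstructed Little Ice Age surface and multiply by the grid cell area. The short sketch below illustrates the idea with NumPy; the synthetic grids and the 25 m cell size are assumptions for illustration, not the study's data or code.

    # Minimal sketch of DEM differencing to estimate ice volume change.
    # Synthetic surfaces and a 25 m grid are assumed purely for illustration.
    import numpy as np

    cell_size = 25.0                                  # grid resolution in metres
    lia_surface = np.full((200, 200), 1500.0)         # reconstructed Little Ice Age surface (m a.s.l.)
    modern_surface = lia_surface - np.random.uniform(0, 120, lia_surface.shape)

    lowering = lia_surface - modern_surface           # surface lowering per cell (m)
    volume_loss_m3 = lowering.sum() * cell_size ** 2  # lowering x cell area
    print(f"Estimated ice volume loss: {volume_loss_m3 / 1e9:.2f} km^3")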

Up to 17% of the volume present at the Little Ice Age peak was lost between 1978 and 2019 alone. In 2019, only 12% of the ice mass remained in what was formerly the low-altitude part of the Little Ice Age glacier region - also called the ablation zone - and much of what used to be ice-covered in the Little Ice Age ablation zone is now completely ice-free.

Study lead author Dr Jonathan Carrivick, from the School of Geography, said: "These findings quantify a trend in New Zealand's ice loss. The acceleration in the rate of ice mass loss may only get worse as not only the climate but also other local effects become more pronounced, such as more debris accumulating on glacier surfaces and lakes swelling at the bottom of glaciers, both of which exacerbate melt.

"Our results suggest that the Southern Alps has probably already passed the time of 'peak water' or the tipping point of glacier melt supply. Looking forwards, planning must be made for mitigating the decreased runoff to glacier-fed rivers because that affects local water availability, landscape stability and aquatic ecosystems."

Co-author Dr Andrew Lorrey is a Principal Scientist based at NIWA who was involved with the study. He says: "The long-term ice volume decline, rising snowlines, and rapid disintegration of glaciers across the Southern Alps that we have observed are alarming. Photographic evidence that has been regularly collected since the late 1970s shows the situation has dramatically worsened since 2010.

"Our findings provide a conservative baseline for rates of Southern Alps ice volume change since pre-industrial times. They agree with palaeoclimate reconstructions, early historic evidence and instrumental records that show our ice is shrinking from a warming climate."

Credit: 
University of Leeds

The costs and benefits of addressing customer complaints

Researchers from Michigan State University, the University of South Florida, St. John's University, and the American Customer Satisfaction Index (ACSI) have published a new paper that analyzes the relationships among customer complaints, companies' complaint handling, and customer loyalty to understand how complaint management affects company performance and to help companies manage complaints more effectively and consistently.

The study, forthcoming in the Journal of Marketing, is titled "Turning Complaining Customers into Loyal Customers: Moderators of the Complaint Handling - Customer Loyalty Relationship" and is authored by Forrest Morgeson, Tomas Hult, Sunil Mithas, Tim Keiningham, and Claes Fornell.

The angry restaurant patron. The irritated airline passenger. The retail customer screaming about a return or refund. Every company worries about complaining customers. They can be loud, disruptive, and damage a company's brand reputation, sales, employee morale, and market value. But are customer complaints as damaging as they seem?

As it turns out, customers who lodge complaints are not a lost cause. They can still be satisfied and remain loyal if their complaints are handled well. Regrettably, companies rarely handle complaints consistently, partly because they don't know how.

The research team carried out the largest study to date on customer complaints to learn how companies can manage them better and more consistently. The team analyzed data from the American Customer Satisfaction Index (ACSI) on the behaviors of 35,597 complaining customers over a 10-year period across 41 industries.

The study finds that the relationship between a company's complaint recovery and customer loyalty is stronger during periods of faster economic growth, in more competitive industries, for customers of luxury products, and for customers with higher overall satisfaction and higher expectations of customization. On the other hand, the recovery-loyalty relationship is weaker when customers' expectations of product/service reliability are higher, for manufactured goods, and for males compared to females.
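Findings like these are typically tested with moderation (interaction) models, in which the strength of the recovery-loyalty slope is allowed to vary with factors such as industry competitiveness or economic growth. The sketch below shows one such specification on simulated data with hypothetical variable names; it is not the authors' model or the ACSI data.

    # Minimal sketch of a moderation (interaction) analysis in the spirit of the
    # recovery-loyalty moderators described above. Data and variable names are
    # simulated and hypothetical, not the ACSI data or the authors' specification.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 1000
    df = pd.DataFrame({
        "recovery": rng.uniform(0, 10, n),     # perceived complaint-handling quality
        "competition": rng.uniform(0, 1, n),   # industry competitiveness (0 = low, 1 = high)
    })
    # Simulate loyalty so that the recovery effect strengthens with competition.
    df["loyalty"] = (2 + 0.3 * df["recovery"] + 0.5 * df["competition"]
                     + 0.4 * df["recovery"] * df["competition"]
                     + rng.normal(0, 1, n))

    # The recovery:competition coefficient captures the moderation effect.
    model = smf.ols("loyalty ~ recovery * competition", data=df).fit()
    print(model.summary().tables[1])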

Hult explains that "We draw two key conclusions from the results. First, companies need to recognize not only that industries vary widely in the percentage of customers who complain (on average, about 11.1 percent), but also that economic, industry, customer-firm, product/service, and customer segment factors dictate the importance of complaint recovery to customers and their future loyalty. Companies should develop complaint management strategies accordingly."

He continues, "Secondly, the financial benefits of complaint management efforts differ significantly across companies. Since complaint management's effect on customer loyalty varies across industries and companies offering different kinds of goods, the economic benefit from seeking to reaffirm customer loyalty via complaint recovery varies as well. Through this study, these performance factors can be identified and considered when designing a company's complaint management system."

Without context, these conclusions suggest that a profit-maximizing strategy simply requires that managers understand the impact of complaint recovery on customer loyalty in their industry. Added to this complexity, however, is the reality that profitability is not evenly distributed throughout the customer base. Fornell says that "Companies need to implement complaint management systems that make it easier for front-line employees to respond to complaining customers in ways that optimize customer satisfaction, customer loyalty, and the economic contribution of customers."

Without a deeper understanding of the boundaries of the complaint handling-customer loyalty relationship and of the effects of economic, industry, customer-firm, product/service, and customer segment factors, companies are likely to underestimate the cost of the required recovery actions, overestimate the resulting customer loyalty, or both, rather than reach an optimal recovery-loyalty yield.

Fornell advises that "Achieving an optimal recovery-loyalty yield is more advantageous than adopting the mantra that the customer is always right. It is a folly to believe that the customer is always right. Economically speaking, the customer is only 'right' if there is an economic gain for the company to keep that customer. In reality, some complaining customers are very costly and not worth keeping."

Credit: 
American Marketing Association