Earth

Reconstruction shows increased global warming trends since 1850s

image: Rebuilt Meteorological Pavilion at Hong Kong Ancient Observatory

Image: 
Qingxiang Li

Earth is warming rapidly, but there are too few observations in some regions, such as the Arctic, or in high-altitude areas like the Qinghai-Tibetan Plateau, to assess temperature variations adequately and consistently across the globe. To better understand how temperatures have increased, an international team led by researchers at Sun Yat-Sen University in China has released a newly merged global surface temperature dataset, including reconstructed land and marine measurements from the 1850s to 2018. The study provides evidence of a consistently stronger warming trend than previous estimates, one that closely matches the available observational data and updated simulations covering the past two decades.

The approach and results, including dataset description, were published on Jan 28 in Advances in Atmospheric Sciences.

"The global surface temperature is one of the most important and accurate essential climate variables in the Earth system, yet there are still a number of discrepancies among the evaluation of magnitude of global warming," said paper author Qingxiang Li, professor in the School of Atmospheric Sciences and Key Laboratory of Tropical Atmosphere-Ocean System, Sun Yat-Sen University, and in the Southern Laboratory of Ocean Science and Engineering in China.

"The importance of global complete coverage dataset is emphasized in recent studies of the 'hiatus' period, when global warming appeared to slow from 1998 to 2012, especially for observations of high-latitude regions such as the Arctic."

Li also noted that observed surface temperatures recorded over the decades appeared to increase more slowly than model simulations suggested, but that this could be explained by inaccurate instruments or varied measuring practices.

To better understand how the globe has warmed, the team previously combined the global land surface temperature dataset with the extended reconstructed sea surface temperature dataset from the National Oceanic and Atmospheric Administration (NOAA) into the China Merged Surface Temperature (CMST) dataset. In this paper, they updated the CMST with a reconstructed global land surface temperature dataset, using ensembles from both datasets of land and sea surface temperatures.

"The resulting dataset is the CMST-Interim, a global monthly surface temperature dataset spanning 1854 to 2018," Li said, noting the new dataset includes improved coverage of the Earth's surface, with 90% of the globe include from the 1950s onwards. "The CMST-Interim shows a significantly increased warming rate of the global surface temperature compared to the original dataset."

The researchers previously found that existing datasets overestimated surface temperature anomalies prior to the 20th century yet underestimated surface temperature anomalies in the 21st century -- which helps explain the apparent hiatus period.

Next, the team plans to continue improving and testing CMST-Interim, with a specific focus on improving the assessment of sea ice surface temperatures, and eventually upgrade CMST-Interim to a new version of CMST.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

NUS scientists discover a new pathway essential for blood formation

Blood is vital to life, and a healthy body replenishes worn-out blood cells with new ones throughout one's lifetime. If something goes wrong with this process, serious illness will result.

Researchers from the National University of Singapore (NUS) have now discovered a mechanism controlling the replenishment of blood cells, which could have relevance for new treatments for blood cancers and other blood-related diseases.

The international research team, helmed by Dr Akihiko Numata while he was a Postdoctoral Fellow in the laboratory of Professor Daniel Tenen of the Cancer Science Institute of Singapore and Yong Loo Lin School of Medicine at NUS, focused their investigations on a protein called Tip60, which catalyzes important biological processes in many living organisms. In particular, Tip60 controls hematopoietic stem cells, the source of new blood cells.

In a 10-year-long study, the scientists developed sophisticated molecular tools and experiments to understand the role Tip60 plays in hematopoietic stem cells. They knocked out the protein by modifying its genetic code, thereby deleting certain parts of the protein and preventing it from binding to other biological molecules. The scientists then compared the malfunctioning Tip60 with the normal version.

"We discovered that Tip60 plays a crucial role, activating genes that are in turn responsible for maintaining the hematopoietic stem cells and their DNA. In fact, when completely deprived of Tip60, many of the cells suffered 'catastrophic' DNA damage and died. On the other hand, some of the genes that Tip60 affects can lead to leukemia, and understanding this pathway may lead to novel therapeutic approaches," explained Prof Tenen.

Credit: 
National University of Singapore

Fetal and neonatal therapies improve prognosis of congenital cytomegalovirus infection

image: Comparison of the neurological prognoses for the FT group and the NT only group.

Image: 
Tanimura et al. J Reprod Immunol. 2021.

A cross-institutional research group has revealed for the first time in the world that infants with symptomatic congenital cytomegalovirus (CMV) infection who were treated with a combination of immunoglobulin fetal therapy and neonatal therapy with antiviral drugs were less likely to experience the severe aftereffects associated with the infection than those who only received the neonatal therapy.

It is hoped that the number of children suffering severe aftereffects resulting from congenital CMV infection will decrease in the future.

The research group included the following members:

Doctor YAMADA Hideto (Director of the Center for Recurrent Pregnancy Loss and Genome Medical Center, Teine Keijinkai Hospital, visiting professor at Osaka University and former professor of the Department of Obstetrics and Gynecology, Kobe University Graduate School of Medicine)

Associate Professor TANIMURA Kenji (Department of Obstetrics and Gynecology, Kobe University Graduate School of Medicine)

Associate Professor FUJIOKA Kazumichi (Department of Pediatrics, Kobe University Graduate School of Medicine)

Professor MORIOKA Ichiro (Department of Pediatrics and Child Health, Nihon University School of Medicine)

The pre-proof version of the paper was published online in the Journal of Reproductive Immunology on December 16, 2020, with the final version made available on January 8, 2021.

Main Points

When CMV infects fetuses, it can cause severe handicaps such as hearing difficulties, as well as mental and physical development disorders. In particular, 90% of babies who are born with the symptoms of congenital CMV infection experience serious long-term aftereffects.

It is known that hearing difficulties and delays in mental development can be ameliorated in babies born with symptoms of CMV infection, if they are given prompt treatment with antiviral drugs after birth.

Although there have been some attempts to treat fetuses diagnosed with congenital CMV infection during the gestational period (fetal therapies), there is currently no established method of treatment for babies before they are born.

Until now, there has been no research carried out into the effectiveness of combining fetal treatment with neonatal treatment.

This research team has shown for the first time in the world, that a combination of fetal and neonatal treatments is more likely to lessen the aftereffects of congenital CMV infection in infants than neonatal treatment alone.

Research Background

Cytomegalovirus (CMV) can infect babies while they are in the uterus. Congenital CMV infection can cause severe aftereffects in these children, such as hearing difficulties and mental and physical developmental disorders. This is a major issue worldwide; for example, it is estimated that around 1,000 babies are born with congenital CMV infection each year in Japan. Fetal growth restriction (*1), microcephaly (*2), ventriculomegaly (*3), hepatomegaly (*4), pleural effusion and peritoneal effusion (*5) are among the characteristic symptoms of congenital CMV infection. Approximately 90% of infants who experience these clinical manifestations are left with the aforementioned severe aftereffects.

In recent years, it has been discovered that treating newborns who have these clinical manifestations of congenital CMV infection with the antiviral drug valganciclovir can not only improve hearing issues but also reduce delays in mental and physical development. In Japan, a clinical trial is ongoing to approve this neonatal therapy as a treatment covered by public health insurance.

On the other hand, there are some cases where clear clinical manifestations of CMV infection can be diagnosed in fetuses via ultrasound while they are still in the uterus. Infants who exhibit these symptoms in the uterus are thought likely to experience more severe aftereffects than infants who are diagnosed with congenital CMV infection after birth.

Up until now, there have been some reports on the effectiveness of administering immunoglobulin blood products (*6) during the gestational period via the mother's bloodstream, the amniotic fluid, the umbilical cord or into the fetus's abdomen, to treat fetuses diagnosed with symptomatic congenital CMV infection. A previous study has also looked at administering valacyclovir, which is an antiviral drug used against herpesvirus, to pregnant women carrying fetuses with CMV.

However, these studies included fetuses with no clear clinical manifestations of CMV infection (asymptomatic cases) or those with only mild symptoms. Therefore no clear answer has been obtained as to whether these treatments are effective for fetuses with serious forms of congenital CMV infection. Unfortunately, as a consequence of this, there is currently no established fetal treatment method.

In 1996, Doctor Yamada et al. became the first in the world to try treating symptomatic congenital CMV infection with immunoglobulin fetal therapy (J Perinatol 1998). After that, the results of a multicenter study showing the effectiveness of immunoglobulin fetal treatment in 12 cases were published (J Perinatol 2012). However, up until now research has focused on fetal treatment and neonatal treatment separately. There has been no research published on the effectiveness of a combined fetal and neonatal treatment approach.

Therefore this research group sought to investigate for the first time in the world, whether or not an integrated fetal and neonatal treatment approach (immunoglobulin therapy while in the uterus and antiviral drug therapy after birth) could lessen the severity of aftereffects experienced by infants with congenital CMV infection.

Research Findings

The researchers carried out fetal therapy (FT) and neonatal therapy (NT) at Kobe University Hospital. The pregnant women and their husbands were given a thorough explanation of the treatments and their consent was obtained prior to the study's commencement.

The presence of congenital CMV infection during the gestational period was determined either through ultrasound examinations in which characteristic symptoms of congenital CMV infection were observed, or through CMV-DNA PCR analysis of amniotic fluid sampled from the pregnant women. In CMV positive cases, the pregnant women and their partners were asked whether or not they wished to receive FT. This FT was administered to those who agreed; immunoglobulin was administered either in the form of an injection into the abdomen of the fetus or via intravenous injection to the mothers. If the FT was effective, the treatment was continued. If it was found to be ineffective, the researchers considered inducing birth once the fetus was over 32 weeks with a weight of over 1200g, in order to start neonatal therapy as soon as possible.

All babies who received FT were given comprehensive examinations after being born, including ultrasound examinations, CT scans, Auditory Brainstem Response tests (*7), and ophthalmoscopy (eye examinations). Based on these results, babies were given NT with the antiviral drug valganciclovir (which has been shown to be effective against CMV) if they exhibited clinical manifestations of congenital CMV infection and PCR analysis of their urine was positive for CMV. Babies who did not show any symptoms of congenital CMV infection were not given NT, even if the PCR test result was positive.

On the other hand, some cases could only receive NT. This included cases where babies had been delivered at another hospital, been diagnosed with congenital CMV infection and then been transferred to the Department of Pediatrics at Kobe University, as well as cases where the parents refused FT for their diagnosed baby, and where the baby was born too early to receive FT. In addition, to make sure no cases of CMV infection were overlooked, PCR tests on the urine of all newborns delivered at Kobe University Hospital and several related hospitals were also carried out. Those newborns who were found to be CMV positive after the PCR test and also demonstrated clinical manifestations of the infection after thorough testing only received NT.

The research team conducted a long-term follow-up on two groups of children: the FT group (those who received both FT and NT, or FT alone) and the NT only group (those who received NT alone). Between the ages of 1.5 and 3 years, periodical assessments were carried out to investigate whether or not each child had hearing loss and, if so, whether it was in one ear or both, as well as examinations to determine whether there were any mental abnormalities. Based on these examinations and a Developmental Quotient (*8), neurological prognosis was divided into three categories: normal, mild impairments, and severe impairments. The researchers investigated whether there was a difference in these prognoses between the two groups.

In the ten-year period between 2009 and 2019, 15 cases received FT. In 4 of the 6 cases with fetal growth restriction, fetal growth increased after receiving FT. In 1 case both ventriculomegaly and hepatomegaly disappeared after FT, and in another case peritoneal effusion temporarily disappeared. Furthermore, the amount of CMV in the amniotic fluid either decreased or disappeared in 7 of the cases, and in 1 case CMV in the peritoneal fluid disappeared. Sadly, in 2 cases the babies died soon after birth due to respiratory failure caused by fluid build-up and insufficiently developed lungs. On the other hand, 1 of the cases did not require neonatal treatment because, even though the post-birth PCR test was positive for CMV, the ventriculomegaly and hepatomegaly observed during the gestational period had disappeared and there were no other symptoms of CMV infection. The remaining 12 cases received both FT and NT. Of the 19 cases receiving only NT in this ten-year period, one was born prematurely at 24 weeks and sadly died one month later, so NT could not be completed.

Regarding the comparison of infants' neurological prognoses, some cases were omitted from the final analysis of the longitudinal data. These included cases where the periodical assessments at 1.5 years old could not be carried out for reasons such as the infant's death, not reaching the age for the assessments and the parents' refusal (4 cases in the FT group, and 3 in the NT only group). In the NT only group, two cases had chorioretinitis (*9) as the only manifestation of CMV infection, and as this could not be diagnosed during the gestational period, the infants did not receive fetal treatment. Therefore these two cases were also omitted.

The percentage of infants who demonstrated normal development at ages 1.5 and 3 was 45.5% for the FT group (5 out of 11) and 21.4% for the NT only group (3 out of 14). Although the percentage for the FT group was higher, no statistically significant difference was found. However, in terms of the percentage of children with severe impairments, there was a clear statistical difference: in the NT only group, 64.3% (9 out of 14) had severe impairments, whereas this percentage was significantly lower for the FT group, at only 18.2% (2 out of 11) (see Figure).
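
The severe-impairment comparison can be checked from the reported counts, which form a 2x2 table. The snippet below is purely illustrative, using SciPy's Fisher's exact test; it is not necessarily the statistical method used in the published paper.

```python
# Illustrative check of the reported severe-impairment counts.
# This is not necessarily the test used in the published paper.
from scipy.stats import fisher_exact

table = [[2, 11 - 2],   # FT group: 2 of 11 infants with severe impairments
         [9, 14 - 9]]   # NT-only group: 9 of 14 infants with severe impairments
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, two-sided p = {p_value:.3f}")
```

With these counts, the exact test gives a p-value below the conventional 0.05 threshold, consistent with the clear difference described above.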

Further Research

It is supposed that infants who are diagnosed with congenital CMV infection during the gestational period have more severe symptoms than those who are diagnosed with it after being born. In a world first, the results of this research have shown the possibility that the long term neurological outcomes of fetuses diagnosed with congenital CMV infection can be improved via a combination of immunoglobulin fetal therapy and neonatal therapy with antiviral drugs.

The next issue is that a double-blind trial (*10) needs to be carried out in order to prove the effectiveness of treatment methods using drugs. However, it would be extremely unethical to use a placebo in cases where congenital CMV infection is detected during the gestational period and it is clear that without treatment the prognosis will worsen. After a new law related to clinical trials was passed, the fetal therapy method in this study became available but only as a treatment not covered by public health insurance. Therefore, this presents an issue from a financial point of view.

It is hoped that the results of this study will lead to further research into the use of immunoglobulin and other drugs in fetal therapy for congenital CMV infection, and that this will enable such treatments to be offered under public health insurance. This would hopefully lead to a decrease in the number of children affected by the aftereffects of this disease.

Credit: 
Kobe University

Ecologists conducted a novel study on vegetation transpiration from a global network of 251 sites

image: An ecologist from RUDN University and colleagues from 14 countries compared three methods for estimating ecosystem transpiration, using land-atmosphere water vapor flux data collected at 251 locations worldwide.

Image: 
RUDN University

An ecologist from RUDN University, together with colleagues from 14 countries, compared three methods for estimating ecosystem transpiration. In the first study with such a comprehensive dataset, the team used land-atmosphere water vapor flux data collected at 251 locations all over the planet, from Australia to Greenland. The outcomes of the research help to clarify the role of plants in the global water and carbon cycles under ongoing global warming. The results of the study were published in the December 2020 issue of the journal Global Change Biology.

Plant roots absorb water from the soil and transport it through the stems up to the leaves, driven by a gradient of water vapor pressure. Once it reaches the leaves, water evaporates through leaf pores called stomata and enters the atmosphere. This physical process, by which plants release water to the atmosphere, is called transpiration. Transpiration is a 'meeting point' of the carbon, water, and energy cycles in terrestrial ecosystems, since plants need water to fix atmospheric CO2 by photosynthesis and convert a large fraction of the incoming solar energy into this process. By improving the modelling of transpiration, scientists can therefore analyze the role of vegetation in climate change scenarios. An international group of scientists led by Dr. Jacob Nelson from the Max Planck Institute for Biogeochemistry (Germany), and including an ecologist from RUDN University, compared three methods for estimating ecosystem transpiration based on micrometeorological data from FLUXNET, a global network of stations.

The team used the data collected at 251 FLUXNET sites. Among many environmental physical and chemical parameters, these stations provide continuous measurements of the fluxes of water vapor and carbon dioxide between the monitored ecosystems and the atmosphere. To do so, the eddy covariance method is applied, which relies on high-frequency, three-dimensional monitoring of turbulent flows of trace gases. The team chose three methodological approaches to retrieve transpiration from the eddy covariance data and used independent tree sap-flow measurements from six test sites to compare the transpiration estimates.

"All three methods are based on the ratio between evapotranspiration and fluxes of carbon uptaken by photosynthesis from the atmosphere, that is termed water use efficiency, and differ by initial assumptions and parameterization. At daily scale, transpiration estimates yielded by the three methods were highly correlated, between 89 and 94%. However, the ratio of transpiration to evapotranspiration differed across models ranging from 45% to 77%." said Dr. Luca Belelli Marchesini researcher at the Agrarian and Technological Institute of RUDN University (Russia) and at the Fondazione Edmund Mach (Italy).

Having further analyzed the results in search of driving factors, the team concluded that the geographic variation in the transpiration to evapotranspiration ratio (T/ET) was mainly controlled by vegetation and soil characteristics rather than by climatic variables such as temperature and precipitation.

To explain the relative stability of T/ET among sites, the team suggested two hypotheses. The first consists in a trade-off between the amount of precipitation intercepted by vegetation canopies and soil evaporation: ecosystems with a dense leaf cover, not limited by water availability, would thus intercept more rain and soil evaporation would be reduced. In contrast, water limited ecosystems, characterized by a smaller vegetation cover, would have a larger fraction of water evaporated from the soil.

According to the second hypothesis, ecosystems tend to adapt to the available water resources, therefore, for instance, vegetation in dry climates would improve the utilization of the limited precipitation, thus increasing the T/ET ratio.

"The combination of these two hypotheses likely explains the relative stability of the T/ET ratio in different ecosystems. This study represents the first extensive estimate of ecosystem transpiration based on in-situ data and allows shedding new light on the role of plants' water use in the context of the global water and carbon cycles," added Dr. Luca Belelli Marchesini.

Credit: 
RUDN University

An efficient tool to link X-ray experiments and ab initio theory

image: The electronic structure of complex molecules can be assessed by the method of resonant inelastic X-ray scattering (RIXS) at BESSY II.

Image: 
Martin Künsting /HZB

Molecules consisting of many atoms are complex structures. The outer electrons are distributed among the different orbitals, and their shape and occupation determine the chemical behaviour and reactivity of the molecule. The configuration of these orbitals can be analysed experimentally. Synchrotron sources such as BESSY II provide a method for this purpose: Resonant inelastic X-ray scattering (RIXS). However, to obtain information about the orbitals from experimental data, quantum chemical simulations are necessary. Typical computing times for larger molecules take weeks, even on high-performance computers.

Speeding up the evaluation

"Up to now, these calculations have mostly been carried out subsequent to the measurements", explains theoretical chemist Dr. Vinicius Vaz da Cruz, postdoc in Prof. Dr Alexander Föhlisch's team. Together with the RIXS expert Dr. Sebastian Eckert, also a postdoc in Föhlisch's team, they have developed a sophisticated new procedure that speeds up the evaluation many times over.

"With our method, it takes a few minutes and we don't need a super-computer for this, it works on desktop machines," says Eckert. The HZB scientists have tested the method on the molecule 2-thiopyridone, a model system for proton transfer, which are essential processes in living cells and organisms. Despite the short computing time, the results are precise enough to be very useful.

"This is a huge step forward," emphasises Föhlisch. "We can run through many options in advance and get to know the molecule, so to speak. In addition, this method also makes it possible to simulate far more complex molecules and to interpret the experimentally obtained data in a meaningful way". Experimental physicist Eckert adds: "We can now also run the simulations during the measurement and see immediately where it might be particularly exciting to take a closer look".

The procedure is an extension of the well established and highly efficient time-dependent density functional theory, which is much faster than the traditional concepts to simulate the RIXS process. "The simplicity of the method allows for a large degree of automatization," says Vaz da Cruz: "It can be used like a black box."

Credit: 
Helmholtz-Zentrum Berlin für Materialien und Energie

Pharmaceutical research: when active substance and target protein 'embrace' each other

image: Upper part: Long residence time. An inhibitor (left: stick model) binds to the signalling protein FAK (right: part of the FAK protein depicted as a space-filling model). The structural change of FAK creates hydrophobic contacts (yellow, the so-called DFG motif) and a long-lasting engagement.
Lower part: Short residence time. The PYK2 signalling protein does not change its structure upon inhibitor binding, resulting in fast inhibitor dissociation.

Image: 
Knapp Laboratory, Goethe University Frankfurt

FRANKFURT. Many anti-cancer drugs block signals in cancer cells that help degenerated cells to multiply uncontrollably and detach from tissue. For example, blocking the signalling protein FAK, a so-called kinase, causes breast cancer cells to become less mobile and thus less likely to metastasise. The problem is that when FAK is blocked by an inhibitor, the closely related signalling protein PYK2 becomes much more active and thus takes over some of FAK's tasks. The ideal would therefore be an inhibitor that inhibits both FAK and PYK2 in the same way for as long as possible.

An international team led by the pharmaceutical chemist Prof. Stefan Knapp from Goethe University has investigated a series of specially synthesised FAK inhibitors. All inhibitors bound to the FAK protein at about the same rate. However, they differed in the duration of binding: The most effective inhibitor remained bound to the FAK signalling protein the longest.

Using structural and molecular biological analyses as well as computer simulations, the research team discovered that the binding of inhibitors that remain in the FAK binding pocket for a long time induces a structural change. Through binding of these inhibitors, FAK changes its shape and forms a specific, water-repellent structure at the contact sites with the inhibitor, comparable to an intimate embrace.

The closely related protein PYK2, on the other hand, remained comparatively rigid, and although the most effective FAK inhibitor also blocked PYK2, its effect was significantly weaker because the inhibitor dissociated quickly from the binding site. Interestingly, computer simulations were able to predict the binding kinetics very well, providing a method for accurately simulating drug dissociation rates in the future optimisation of drug candidates.
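
The kinetic quantity behind these comparisons is the drug-target residence time. As a back-of-the-envelope illustration with invented rate constants (not values measured in the study), for simple 1:1 binding the residence time is just the reciprocal of the dissociation rate constant:

```python
# Illustrative numbers only, not measurements from the study: for simple 1:1
# binding, residence time tau = 1 / k_off, and once free inhibitor is removed
# the bound fraction decays as exp(-k_off * t).
import numpy as np

def residence_time(k_off_per_s):
    return 1.0 / k_off_per_s                     # seconds

def fraction_still_bound(k_off_per_s, t_seconds):
    return np.exp(-k_off_per_s * t_seconds)

# Hypothetical inhibitor dissociating slowly from one kinase, quickly from another:
for target, k_off in [("slow off-rate (FAK-like)", 1e-4),
                      ("fast off-rate (PYK2-like)", 1e-2)]:
    tau_min = residence_time(k_off) / 60.0
    print(f"{target}: tau = {tau_min:.0f} min, "
          f"still bound after 1 h = {fraction_still_bound(k_off, 3600):.2f}")
```

A hundred-fold difference in off-rate translates directly into a hundred-fold difference in how long the target stays blocked, which is why predicting dissociation kinetics by simulation is so valuable for drug optimisation.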

Prof. Stefan Knapp explains: "Because we now have a better understanding of the molecular mechanisms of the interaction of potent inhibitors of these two kinases, we hope to be able to use computer simulations to better predict drug residence times of inhibitors and drug candidates in the future. So far, little attention has been paid to the kinetic properties of drug binding. However, this property has now emerged as an important parameter for the development of more effective drugs that are designed to inhibit their target proteins - as in the case of FAK and PYK2 - not only potently but also for a long time."

Credit: 
Goethe University Frankfurt

Research illuminates lobsters' genetic response to changing climate

image: Maura Niemisto, a research associate at Bigelow Laboratory in East Boothbay, Maine, prepares a lobster sample in the lab. Niemisto is lead author on a recent paper showing the effects of ocean warming and ocean acidification on postlarval lobsters.

Image: 
Bigelow Laboratory for Ocean Sciences

The American lobster, which supports the most valuable fishery in North America, may be more susceptible to the effects of climate change than previously thought, according to a new study published in Ecology and Evolution. This finding could help fishery managers anticipate the long-term effects of climate change for one of the nation's most precious natural resources.

The American lobster's range extends from Atlantic Canada to the mid-Atlantic waters of the United States, but increased carbon dioxide emissions by humans are warming and acidifying their ocean habitat.

To date, studies of the early life stages of lobsters have concluded that ocean acidification, compared to warming, had relatively limited impact on growth and metabolism. However, according to the new publication, their genes tell a different story.

"Our study indicated that acidification is affecting these lobsters on a molecular scale," said Maura Niemisto, lead author and research associate at Bigelow Laboratory for Ocean Sciences. "Because of environmental changes, they have genes firing at an even higher rate."

Niemisto published the research with Bigelow Laboratory Senior Research Scientist David Fields and University of Maine Research Professor Richard Wahle, who is based at the Darling Marine Center. Co-authors Spencer Greenwood of the University of Prince Edward Island, and Fraser Clark of Dalhousie University contributed expertise in genomics. The work built on previous experiments by co-author Jesica Waller of the Maine Department of Marine Resources.

Through their genes, all living organisms have the capacity to regulate a wide range of biological processes. Changes in their environment can influence this. For example, many organisms have a heat shock protein gene. When they get overheated, the environmental stress triggers cells to produce a protein that helps the organism adapt.

This study focused on the postlarval stage of lobsters because they live in the upper water column, where ocean acidification and temperature are changing quickly. In addition, they are planktonic, which means they have limited control of their movement to avoid unfavorable environments.

The scientists conducted laboratory experiments to expose postlarval lobsters to the temperature and acidity levels projected for the end of the century. Under these conditions, they found lobsters' cells adjusted gene regulation to support shell structure and immune functions.

This response was stronger in relation to increased acidity than increased temperature. The study also revealed that, when their environment was both warm and acidic, the lobsters showed significantly more genetic response than when exposed to either stressor alone.

"Stressors on an organism have the ability to compound into something that makes it really hard to grow through all the developmental stages to get to a full-grown lobster," said Niemisto, who conducted the research as part of her master's thesis in marine biology at UMaine.

The implications of these responses remain unclear for lobsters in the Gulf of Maine, which is acidifying and warming faster than the majority of the world's oceans.

While the results suggest that lobsters have the genetic capacity to adapt to their changing environment, it may come at a cost. The genetic processes take a significant amount of energy, which comes from a limited budget.

"So, if they're spending a lot of energy on building proteins to respond to stressors, something else has to give," Niemisto said.

Researchers are now looking to see exactly where that extra energy is coming from and what is being given up. For example, if lobsters have to expend energy on maintaining their shells, they may not be able to invest in their immune system.

The authors say the next step is to better understand these genetic responses in relation to larger-scale changes. Pairing this study's findings with further research on the ecology and biology of lobsters can help reveal their likely response as a population.

"This study has cracked open the door to our understanding of the basic biology of lobster larvae and their response to the changing environment," said co-author Fields. "However, there is still a lot to learn. As the use of these genetic tools grows, so will our understanding of the larval lobster biology and the impact of climate change on this iconic fishery."

Credit: 
Bigelow Laboratory for Ocean Sciences

Nanoparticle drug delivery technique shows promise for treating pancreatic cancer

image: Study researchers Drs. Snigdha Banerjee, Suman Kambhampati, Sushanta Banerjee, and a colleague examine a pancreatic cancer image.

Image: 
Jeff Gates

Researchers with the Kansas City Veterans Affairs Medical Center and North Dakota State University have designed a new way to deliver pancreatic cancer drugs that could make fighting the disease much easier. Encapsulating cancer drugs in nanoparticles shows potential to target tumors more effectively and avoid danger to other parts of the body.

The study results appeared in the Jan. 4, 2021, issue of the journal Molecular Pharmaceutics.

Study author Dr. Sushanta Banerjee, a researcher with the Kansas City VA and University of Kansas medical centers, explains that this technology has the potential to drastically improve Veterans' cancer care. "Veteran health care will benefit immensely from such therapeutic models, as they are effective in delivering the drug to the tumor site without any toxic side effects [and with] minimal dosing. Once ready for patient use, this technique will reduce the number of doses required by a patient as well as effectively hinder the progression of the tumor."

Pancreatic ductal adenocarcinoma is the most aggressive form of pancreatic cancer. It is one of the leading causes of death from cancer worldwide. Patients with this form of cancer have a five-year survival rate of about 8%. Around 7% of all cancer deaths in the United States come from pancreatic ductal adenocarcinoma.

The medication gemcitabine is the current standard of care for treating this cancer. However, gemcitabine offers only a modest improvement to patients' chances of survival. Gemcitabine degrades quickly within the body, limiting its effectiveness. Pancreatic cancer tumors also often develop resistance to the drug.

A more effective treatment for this type of cancer, called an extracellular signal-regulated kinase inhibitor (ERKi), has been developed. Genetic research has shown that a specific gene mutation is one of the main drivers of pancreatic tumor growth. The enzyme ERK interacts with this mutation, so inhibiting the enzyme can slow the cancer. Research also suggests that developed resistance to gemcitabine involves this enzyme.

Unfortunately, several problems make treating patients with ERKi difficult. The drug is toxic and can cause damage in other parts of the body. ERKi does not dissolve in water, making it difficult to prepare an effective formulation. It is also prone to breaking down in the body, limiting its effectiveness.

To combat these problems, the researchers created a new way to deliver medications to pancreatic tumors. They designed a nanoparticle delivery system to get both gemcitabine and ERKi to the pancreas where they will be most effective.

The two drugs are encased in nanoparticles made of polymers. The nanoparticles stop the drugs from breaking down and protect other areas of the body from the toxic effects.

The pH inside tumor cells is lower than the pH of the rest of the body. The nanoparticles are designed to release the drugs when they come in contact with a lower pH environment. In this way, the researchers can target the drugs specifically to cancer cells and not other areas of the body.

Using a nanoparticle vehicle to deliver the medications also allows for a higher dose to be given without needing multiple separate doses, says Banerjee.

In the study, the researchers tested their new technique on cancer cells cultured in the lab. The nanoparticle encapsulation effectively delivered the two drugs to the targeted cells. The testing showed that this drug combination can suppress cancer cell growth. Results also showed that this delivery method was "markedly" more effective than administering the drugs without the nanoparticles.

Additionally, the researchers found that adding ERKi to gemcitabine increased the body's sensitivity to gemcitabine. The two drugs work together synergistically to fight the cancer, according to the researchers.

While more research is needed, the study shows that the drug delivery method is a promising new way to fight pancreatic cancer.

According to Banerjee, this technique could also be used to treat other types of cancer, such as breast, prostate, and ovarian cancers. The nanoparticle polymers developed by the research team can be combined with different chemotherapy drugs to target tumors in different parts of the body, he explains.

The research team is currently working on different drug combinations to treat various cancers, and is also creating new polymers to improve cancer treatment.

Credit: 
Veterans Affairs Research Communications

Marine heatwaves becoming more intense, more frequent

image: Bleached corals from warm ocean water temperatures.

Image: 
NOAA

When thick, the surface layer of the ocean acts as a buffer against extreme marine heating--but a new study from the University of Colorado Boulder shows this "mixed layer" is becoming shallower each year. The thinner it becomes, the easier it is to warm. The new work could explain recent extreme marine heatwaves, and points to a future of more frequent and destructive ocean warming events as global temperatures continue to climb.

"Marine heatwaves will be more intense and happen more often in the future," said Dillon Amaya, a CIRES Visiting Fellow and lead author on the study out this week in the Bulletin of the American Meteorological Society's Explaining Extreme Events. "And we are now understanding the mechanics of why. When the mixed layer is thin, it takes less heat to warm the ocean more."

The mixed layer--the near-surface water in which temperature stays nearly uniform--blankets the top 20-200 meters of the ocean. Its thickness governs how the ocean responds to heating events: the thicker it is, the more the layer can act as a buffer shielding the waters below from incoming hot air. But as this armor thins, the mixed layer becomes more susceptible to rapid swings in temperature.

"Think of the mixed layer as boiling a pot of water," said Amaya. "It will take no time at all for an inch of water to come to a boil, but much longer for a pot filled to the brim to heat through."

Amaya and his team used a combination of ocean observations and models to estimate the depth of the mixed layer back to 1980, and also project out into the future. They determined that over the last 40 years, the layer has thinned by nearly 3 meters (9 feet) in some regions of the North Pacific. And by 2100, the mixed layer will be 4 meters (12 feet) thinner--30 percent less than what it is today. This thin mixed layer combined with warmer global temperatures will set the stage for drastic swings in ocean temperatures, leading to much more frequent and extreme heating events, the researchers say.
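
The pot-of-water analogy can be put into numbers with a simple mixed-layer heat budget. The sketch below is back-of-the-envelope arithmetic, not the study's model, and the heat-flux anomaly and layer depths are illustrative choices only:

```python
# Back-of-the-envelope sketch, not the study's model: a net surface heat-flux
# anomaly Q sustained for time dt warms a well-mixed layer of depth h by
#   dT = Q * dt / (rho * c_p * h),
# so the same forcing warms a thinner mixed layer more.
RHO_SEAWATER = 1025.0    # kg/m^3
CP_SEAWATER = 3990.0     # J/(kg K), approximate specific heat of seawater

def mixed_layer_warming(q_wm2, days, depth_m):
    return q_wm2 * days * 86400.0 / (RHO_SEAWATER * CP_SEAWATER * depth_m)

# Hypothetical 30 W/m^2 heat-flux anomaly sustained for 60 days:
for depth in (30.0, 26.0, 20.0):
    print(f"mixed layer {depth:4.0f} m -> warming {mixed_layer_warming(30.0, 60.0, depth):.2f} C")
```

In this toy example, thinning the layer from 30 to 20 meters raises the warming from the same forcing by about 50 percent, which is the mechanism Amaya describes.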

And it's already happening. Take the 2019 heatwave in the Northeast Pacific. Weakened winds and higher air temperatures came together to warm Pacific Ocean waters by about 3 degrees C (5.5 F). A thinning mixed layer most likely contributed to this surge of warm waters, the authors found. And it will get worse.

"If you take the same wind and ocean conditions that occurred in 2019 and you apply them to the estimated mixed layer in 2100, you get a marine heatwave that is 6.5 degrees C (12 F) warmer than what we say in 2019," said Amaya. "An event like that would absolutely devastate sensitive marine ecosystems along the U.S. west coast."

Amaya also points out that, as climate continues to warm and the mixed layer continues to thin, scientists might start to lose the ability to predict year-to-year ocean surface temperatures. Without the ability to accurately forecast ocean temperatures, fisheries and other coastal operations could be in danger.

Other studies also suggest marine heatwaves will become more common in the future, but not many have explored the root cause: ocean dynamics and physics. "In order to simulate these events in models and help predict them, we must understand the physics of why that's happening," said Amaya.

Credit: 
University of Colorado at Boulder

Counties with more cannabis dispensaries show reduced opioid deaths

Counties with a greater number of cannabis dispensary storefronts experience reduced numbers of opioid-related deaths relative to other locales, a recent University of California, Davis, study has found. This is the first study to examine the association between active cannabis dispensary operations -- both medical and recreational -- and opioid-related mortality rates at the county level, suggesting that providing alternative pain management could improve public health outcomes, researchers said.

"While the associations documented cannot be assumed to be causal, they suggest a potential relationship between increased prevalence of medical and recreational cannabis dispensaries and reduced opioid-related mortality rates," said Greta Hsu, professor of management at the Graduate School of Management at UC Davis and lead author of the study.

The study was published Wednesday, Jan. 27, in the British Medical Journal. The article was co-authored by Balazs Kovacs of Yale University.

"Given the alarming rise in the U.S.'s fentanyl-based market and in deaths involving fentanyl and its analogs in recent years, the question of how legal cannabis availability relates to opioid-related deaths can be regarded as a particularly pressing one," researchers said.

According to the U.S. Centers for Disease Control and Prevention -- the source for researchers' opioid data -- opioids were involved in 46,802 overdose deaths in 2018, accounting for nearly 70 percent of all drug overdose deaths.

"Overall, greater understanding the public health outcomes of cannabis legalization on opioid misuse is needed for policymakers to properly weigh the potential benefits versus harms of promoting cannabis legalization," Hsu said.

Significant death rate from synthetic opioids

"As the spread of COVID-19 has overtaken global health resources and attention, another health crisis appears to be silently raging in the background: increasing opioid-related overdose deaths," Hsu said. Reports from public health tracking systems such as the University of Michigan's System for Opioid Overdose Surveillance suggest, she said, that opioid-related mortality rates in 2020 increased substantially over previous years.

"But there is an alternative approach to thinking about how to address widespread misuse of opioids: altering the supply of available drugs with potential medical usages in pain management," Hsu said. "In particular, public health researchers have wondered whether increasing the availability of cannabis, which is generally thought to be a less addictive substance relative to opioids, could be associated with a decrease in opioid-related deaths."

In this study, researchers looked at the 812 counties (out of more than 3,000 total) in the United States that legally allowed cannabis dispensaries by the end of 2017. They documented opioid mortality rates and counted dispensaries selling cannabis between 2014 and 2018, aggregating all opioid-related deaths but also separating out deaths due to prescription opioids, heroin and synthetic opioids.
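
The design just described -- county-year dispensary counts paired with county-year opioid deaths -- lends itself to a panel regression. The sketch below is a generic example of that kind of model run on synthetic data; it is not the published analysis, and the variable names, model choice, and effect size are assumptions made only for illustration:

```python
# Generic sketch of a county-level panel regression on synthetic data.
# This is NOT the published analysis; names and effect sizes are invented.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
rows = []
for county in [f"c{i}" for i in range(40)]:
    base_rate = rng.uniform(0.0002, 0.0006)              # baseline opioid mortality rate
    population = int(rng.integers(50_000, 500_000))
    for year in range(2014, 2019):
        dispensaries = int(rng.integers(0, 15))           # storefront count that year
        rate = base_rate * np.exp(-0.02 * dispensaries)   # hypothetical association
        rows.append(dict(county=county, year=year, dispensaries=dispensaries,
                         population=population,
                         opioid_deaths=rng.poisson(rate * population)))
df = pd.DataFrame(rows)

# Poisson model of death counts with a population offset and county/year fixed effects.
model = smf.glm("opioid_deaths ~ dispensaries + C(county) + C(year)",
                data=df, family=sm.families.Poisson(),
                offset=np.log(df["population"]))
print(model.fit().params["dispensaries"])                 # negative => fewer deaths
```

A negative coefficient on the dispensary count corresponds to fewer deaths per resident in county-years with more storefronts, after absorbing stable county differences and nationwide year-to-year trends.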

The relationship between county-level medical and recreational dispensary numbers and reduced opioid-related mortality rates appears particularly strong for deaths associated with synthetic (non-methadone) opioids, the researchers found. This class of opioids includes fentanyl and its analogs, which have sharply overtaken other types of opioids in number of deaths in the U.S. in recent years. (Opioids refer to a class of chemically related drugs that includes the illicit drug heroin, prescription pain relievers such as oxycodone, and synthetic opioids such as fentanyl.)

By the end of 2018 -- the latest date for which county-level mortality rates were available from the CDC -- 33 states had legalized medical cannabis, which requires purchasers to have a medical identification card or doctor recommendation. By then, ten states and the District of Columbia had legalized recreational cannabis, which requires buyers to be 21.

There is no federal database of dispensaries. Under federal law, cannabis is still illegal in the United States. To gather data on cannabis dispensaries, researchers examined county-level data from Weedmaps.com to count actual storefront businesses, with street addresses, that were in operation. Researchers noted that counties oversee criminal justice, social service, and health and emergency service programs -- all vital dimensions of the public infrastructure related to drug use and markets.

Other studies of the association between cannabis laws and opioid-related deaths often focus on the state level. However, the researchers noted that a focus on state legislation can be less reliable because state laws sometimes take months or years to go into effect. Further, many local counties within cannabis-legal states do not allow storefronts or other sales opportunities, limiting access to cannabis among individuals within those counties.

"As business school researchers, we tracked evolving cannabis markets across the U.S. from 2014 onwards in an effort to understand how this new category of organizations emerged," Hsu said.

"We realized, however, that our county-level database could also be used to examine whether the availability of legal cannabis in an increasing number of geographic areas has any implications for opioid misuse. Allowing for legal sale of cannabis is a key step in increasing its availability, since it shifts the cost structure of supplying cannabis, making cannabis more easily and widely accessible to customers."

Researchers said more study is needed to determine if there is true cause and effect, and that other issues still must be considered. "At the same time, cannabis' potential harms for adolescents' cognitive development, medical conditions such as schizophrenia, and public safety risks should not be ignored," Hsu added.

Credit: 
University of California - Davis

People's acceptance of inequality affects response to company wrongdoings

UNIVERSITY PARK, Pa. -- People who do not accept inequality are more likely to negatively evaluate companies that have committed wrongdoings than people who do accept inequality, and this response varies by culture, according to researchers at Penn State. The team also found that companies can improve their standing with consumers when they offer sincere apologies and remedies for the harm they caused to victims.

"Some prominent examples of company moral transgressions include Nike's and Apple's questionable labor practices in developing countries, BP's oil spill in the Gulf of Mexico and Volkswagen's emissions scandal," said Felix Xu, graduate student in marketing at Penn State.

In their paper, which published on Jan. 22 in the Journal of Consumer Research, the team examined people's power distance beliefs (PDBs), or the extent to which they accept social inequality.

"Our key finding is that individuals with lower PDB -- those with more egalitarian values -- tend to react more harshly toward morally transgressing companies because they spontaneously feel more empathy for potential victims, even though they are not personally impacted by the transgression and the victims are complete strangers," said Xu.

The researchers conducted a set of 13 studies to examine consumer responses following company moral transgressions. In one study, they looked at Google search patterns regarding the Volkswagen emissions scandal, which occurred in 2015. They also investigated how these patterns varied by country.

"We found that in countries with lower PDBs there was greater searching on the scandal, which suggests they were more concerned with the Volkswagen moral transgression," said Karen Winterich, Gerald I. Susman Professor in Sustainability.

Specifically, countries with low PDBs, such as the United States and many European countries, were more likely to search for information regarding the Volkswagen emissions scandal, which is consistent with a less favorable response to the moral transgression.

"Individuals with high levels of PDB tend to accept power disparity, often perceiving it as functional because they view social hierarchy and structure as the basis for societal order and security. In other words, they respect authority and believe everyone should have a defined place within the social hierarchy," said Winterich. "For example, research shows that people in high power distance cultures such as Japan and China tend to constantly monitor their behaviors to ensure they act consistently with their status in the social hierarchy, suggesting that social structure is highly respected in such cultures."

India is also a high-PDB culture, and one that the researchers included in another of their studies. For that study, the team recruited one-third of its participants from the United States, one-third from India and the final one-third from other parts of the world. The participants were given a short description about a fictitious appliance company, accompanied by a short news release from Consumer Reports that depicted a moral transgression involving unfair employee treatment. Next, they were asked to indicate their attitude and intentions toward the appliance company. Finally, their PDB was assessed using an established scale.

The researchers again found that people with low PDB more harshly evaluated the transgressing company and that this occurred more frequently in low-PDB countries.

In a third study, the team investigated the relationship between PDB and empathy toward victims of morally transgressing companies. Participants were given a news story about the appliance company to read and asked to imagine how the workers felt about what happened and how it had affected their lives. Afterwards, participants were asked, "What thoughts and feelings ran through your mind as you read the news post?"

The researchers found that high-PDB consumers did not react as negatively toward morally transgressing companies due to a lack of spontaneously felt empathy toward transgression victims. However, when the researchers induced empathy in participants, the high-PDB consumers became more empathetic toward victims.

"Inequality/equality is an important moral principle, and our research shows that the more people are accepting of inequality, the less empathy they feel for victims and the less negatively they react to company moral transgressions," said Lisa Bolton, professor of marketing.

In yet another study, the researchers provided participants with news stories about two morally transgressing companies -- one in which the victims were more salient, or noticeable, and one in which the victims were less salient. The participants were allowed to choose a gift card for one of the companies. The team found that lower-PDB participants were less likely to choose a gift card for the morally transgressing company with high victim salience.

"People tend to assume that a company that 'does wrong' is punished; yet, there are findings that stock prices and purchase behavior don't change much," said Winterich. "These findings offer some insight into why this may be, as only those with lower PDB are likely to respond negatively to moral transgressions and only when the victims of the transgression are salient."

Finally, the team examined how company response strategies can help mitigate these negative consumer reactions.

"We've probably all seen examples of this -- a company does wrong, then tries to apologize or make restitution to victims," said Bolton. We find that the most effective response strategy is a combination of a sincere apology and a remedy, which signals company empathy toward victims and helps attenuate the negative reactions of consumers who are themselves empathetic toward victims."

Xu noted that the best strategy is for companies to avoid any moral transgressions.

"Unfortunately, some will inevitably occur, and in today's digital age, such a problem can quickly spread around the world and become a global issue," he said. "Our findings suggest that firms operating globally should be prepared for different levels of repercussions if they are accused of ethical misconduct, especially so in low-PDB countries like the United States."

Credit: 
Penn State

Social media study reveals diabetics' fear of disrupted insulin supplies because of Brexit

Diabetics living in the UK worry about disruption to insulin supplies as a result of Brexit, new research shows.

Insulin is the hormone that helps control the body's blood sugar level and is critical to the survival of many people living with Type 1 diabetes. Currently most insulin used in the UK is imported.

The research - by the University of York - analysed 4,000 social media posts from the UK and the United States in order to explore the experiences of living as an insulin-dependent person. Around 25 per cent of the health-related posts were made by diabetics, and about nine per cent of all the posts were about the availability of insulin.

The study looked at tweets posted in 2019 before the Brexit negotiations were concluded but researchers said that people were concerned regardless of outcome.

Dr Su Golder from the Department of Health Sciences said: "People talked a lot about stockpiling and being scared of not being able to get insulin whatever the Brexit outcome. Many of the tweets on this topic discussed the fact that insulin was a lifesaver for them as it is for so many other Type 1 diabetics."

The study also showed that the health consequences of the cost of insulin were the most discussed topic in the States. Some users talked about how they manage having to pay for their own medication, with many facing the choice of paying for their rent or paying for insulin.

The research also identified issues patients may conceal from healthcare professionals, such as purchasing medications from unofficial sources.

Dr Golder added: "This research gives an insight into the real-life issues individuals face when taking antidiabetic drugs. It shows there is a fear of not having access to insulin, whether due to cost or physical availability, and also highlights the impact of the sacrifices made to access insulin."

Type 1 diabetes is a chronic condition in which the pancreas produces little or no insulin by itself. The most common form is type 2 diabetes, which usually develops in adults and occurs when the body becomes resistant to insulin or doesn't make enough insulin. The World Health Organisation estimates about 422 million people worldwide have diabetes.

More than a million people with diabetes in the UK rely on insulin, according to the charity Diabetes UK.

Credit: 
University of York

Growth of northern Tibet proved the key to East Asian biodiversity

image: Snowy mountain peaks of Northern Tibet

Image: 
ZUO Linren

Pioneering work led by a joint China-UK consortium has revealed the origin of one of the world's most important ecosystems, the East Asian biodiversity "hotspot," thus solving a longstanding riddle as to what prompted its formation and evolution.

In a recent study published in Science Advances, a joint research team led by scientists from Xishuangbanna Tropical Botanical Garden (XTBG) of the Chinese Academy of Sciences, the University of Bristol (UK) and the Open University (UK) has revealed the first direct mechanism explaining how the growth of mountains in Northern Tibet drastically altered climate, vegetation and plant diversity in East Asia.

The researchers used an innovative climate model that simulates vegetation and plant diversity, coupled with spectacular new fossil finds, to demonstrate Northern Tibet's importance in transforming what was previously near desert into lush forests.

"We approached this question by integrating modeling results and fossil data," said Dr. LI Shufeng from XTBG, lead author of the study.

The researchers conducted 18 sensitivity experiments using different Tibetan topographies representing various late Paleogene to early Neogene conditions, which tested almost all possible Tibetan orographic evolution scenarios.
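The paper's 18 specific model configurations are not listed here, but the overall design of such a sensitivity study can be sketched as a grid of orography scenarios fed to a climate-vegetation model. Everything below is a hypothetical stand-in: the elevation fractions are illustrative and run_simulation is a placeholder, not the model the team actually used.

```python
# Hypothetical sketch of a topographic sensitivity-experiment design
# (illustrative scenarios only; not the study's actual 18 configurations).
from itertools import product

north_tibet_fractions = [0.0, 0.25, 0.5, 0.75, 1.0]  # fraction of modern elevation
south_tibet_fractions = [0.5, 1.0]

def run_simulation(north_frac: float, south_frac: float) -> dict:
    """Placeholder for a full climate-vegetation model run; it returns a dummy
    'winter rainfall index' so the loop is runnable end to end."""
    return {"winter_rainfall_index": round(0.2 + 0.8 * north_frac, 2)}

results = {
    (north, south): run_simulation(north, south)
    for north, south in product(north_tibet_fractions, south_tibet_fractions)
}
# Comparing diagnostics across the grid shows which part of the orography
# the simulated East Asian climate is most sensitive to.
```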

They found that from the late Paleogene to the early Neogene (40-23 million years ago), the growth of the northern and northeastern portions of Tibet was the most important factor in the development of biodiversity, because it increased rainfall, especially winter rainfall, over East Asia, where previously dry winter conditions had prevailed. This allowed the development of a stable, wet and warm climate that was vital for a wide variety of unique plant and animal species to evolve in vast numbers, creating what we know today as a biodiversity hotspot. The region is now one of the world's natural medicine cabinets and a source of important new pharmaceutical drugs.

According to Prof. Paul J. Valdes of the University of Bristol, most previous studies attempting to identify the source of this hotspot focused on southern Tibet and the Himalaya. But Valdes said the rise of northern Tibet was the key.

"The topography of northern Tibet decreases the East Asian winter monsoon winds in the southern part of China, causing wetter winters in eastern Asia and this allows the expansion of vegetation and biodiversity," said Valdes.

So dramatic was the change that even in Chinese folklore this area is known as the "land of fish and rice," due to its agricultural wealth, fertile soils, pleasant climate and variety of unique species.

"If there was no northern Tibetan growth, there would be no 'land of fish and rice' in eastern Asia," said Prof. ZHOU Zhekun from XTBG.

Credit: 
Chinese Academy of Sciences Headquarters

Study shows racial disparities in elementary school disciplinary actions

Even after accounting for differences in income, education, caregiver support, special education services and parental reports of misbehavior and family conflict, elementary school-age Black children are 3.5 times more likely to be suspended or placed in detention than their white peers, a new study finds.

The results were unsettling even to the researchers themselves, who were familiar with previous research into racial disparities in school discipline. Previous studies primarily used school records, but this study was able to use a nationwide self-reported dataset, with data collected as part of a long-term investigation into how the brain develops through the preteen and teen years into early adulthood.

Because they had so much data on the participants, the researchers could do what previous studies could not and control for factors that are thought to account for discipline problems, like socioeconomic status and levels of family conflict.

And, in fact, those factors did even out the discipline disparities between white and Hispanic children; however, they couldn't account for the discipline disparities between white and Black children. Among study participants, before factoring in those controls, 3% of white children and 15.2% of Black children received a detention or suspension in the past year.
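As a rough illustration of what "controlling for" these factors means in practice, the sketch below fits a logistic regression on synthetic data and reports the adjusted odds ratio for the group term. The variable names, effect sizes, and data are all invented for illustration; this is not the ABCD study's actual model, variables, or results.

```python
# Minimal sketch of covariate adjustment with synthetic data (illustrative only;
# not the ABCD study's variables, model specification, or results).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "black": rng.integers(0, 2, n),           # 1 = Black, 0 = white (toy indicator)
    "income": rng.normal(50, 15, n),          # household income, arbitrary units
    "parent_edu": rng.integers(10, 21, n),    # years of caregiver education
    "externalizing": rng.normal(50, 10, n),   # parent-reported behavior score
})

# Simulate an outcome that depends on the covariates AND on group membership.
logit_p = -4 + 0.02 * df["externalizing"] - 0.01 * df["income"] + 1.2 * df["black"]
df["suspended"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

# The coefficient on `black` is the disparity that remains AFTER adjusting
# for income, education, and reported behavior.
model = smf.logit("suspended ~ black + income + parent_edu + externalizing", data=df).fit()
print(np.exp(model.params["black"]))  # adjusted odds ratio for the group term
```

Comparing raw suspension percentages answers a different question from the adjusted odds ratio, which is why the study can report both before-controls rates and an after-controls disparity.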

"We were alarmed about how strong the findings were," said co-first author Matthew Fadus, M.D. "Even when we controlled for many of these predictors of school discipline such as family income and education, the disparities remained."

"With all of those factors controlled for, there has to be something else accounting for the differences in discipline rates," Fadus said. He said that racism or unconscious bias is likely at the root of the higher discipline rates for Black children. In their paper, the researchers note that the problem goes beyond individuals' actions.

"We believe that the findings of this study as a whole are not reflective of individual behaviors and responsibility from youth, but instead are the result of a long history of societal inequities and systemic racism," they said.

Fadus, now with the Department of Child and Adolescent Psychiatry at Massachusetts General Hospital, and co-first author Emilio Valadez, Ph.D., now with the Department of Human Development and Quantitative Methodology at the University of Maryland, College Park, were both working as trainees at the Medical University of South Carolina with co-senior author Lindsay Squeglia, Ph.D., of the Department of Psychiatry and Behavioral Sciences, when they began developing the report.

Squeglia is a principal site investigator on the Adolescent Brain Cognitive Development, or ABCD, study. Conducted at 21 sites across the United States, the ABCD study recruited more than 11,000 children, ages 9 and 10, with the expectation of conducting brain scans, in-depth interviews, psychological tests and cognitive tasks over the course of 10 years to chart typical brain development. The anonymized data is then available to researchers around the world to use to explore social, psychological, neurological and biological questions. Only a few years into the study, the first batches of data have already yielded insights into the effects of prenatal alcohol exposure, parental depression and neighborhood poverty.

Prior studies that considered racial disparities in school discipline examined school records, and those studies found that Black children were more likely to receive detentions and suspensions. Those previous studies' detractors, however, said the disparities were probably the result of differences in student behavior.

The ABCD study contains a wealth of data unavailable in school records, allowing the researchers to control for factors that often are associated with school discipline, like family income and education, as well as special education services and caregiver reports of behavior and conflict at home. As part of the ABCD study, parents also complete the Child Behavior Checklist, a widely used questionnaire about children's behavior.

The checklist includes questions about things like whether the child disobeys at home or school or gets in fights - indicators that allow researchers to benchmark a child's "externalizing behaviors." A certain amount of externalizing behavior is expected for all adolescents, Valadez said. Further, the rates of these behaviors in the ABCD study participants matched the general population, Squeglia said, meaning the ABCD study participants are not an unusually disruptive bunch.

"Children often communicate with behaviors more so than words," Fadus said, "and for this reason, the authors of the study hope that educators will pause for a moment before imposing detention or suspension and instead work to understand more the purposes and meaning of student behaviors." The authors advocated using restorative practices, which they noted emphasize "belonging, social engagement and accountability rather than control and punishment."

"Suspensions and detentions don't work. This is not a practice that is helping kids," Squeglia added. "These are kids. These are 9- and 10-year-olds. We're using practices that we know don't work, but we keep doing this, with these huge disparities in who's getting affected."

This is also a pivotal age, when children can develop negative attitudes about themselves or about school, Fadus said.

Not only are suspensions ineffective, but they especially burden single-parent households when a parent must take time off work to stay home with a suspended child, Squeglia said.

Because the ABCD study follows participants until they are 19 or 20 years old, future research could compare suspension and detention rates once the children enter middle and high school. Only 5% of children overall in the study received a detention or suspension, but that figure would be expected to grow as the children enter their teen years, Squeglia said, which would then allow researchers to make more detailed comparisons.

For example, because of the relatively small number of suspensions and detentions in this first study, the researchers couldn't categorize the rates of discipline for subjective versus objective reasons - subjective reasons being things like talking back and objective reasons being quantifiable violations like bringing drugs or a weapon to school.

And because they're looking at the same individuals over time, researchers could also see whether receiving such discipline at age 9 or 10 alters a child's likelihood of future behavioral or legal problems or leads to worse mental health outcomes, Valadez said.

In their paper, the researchers call for more research into the psychological, social and economic consequences of this type of discipline.

"Unless clinicians, educators, and legislators broadly address racist policies in areas such as education, public benefits, housing opportunities, and the justice system, the findings in this study will likely persist," they said.

Credit: 
Medical University of South Carolina

More than just CO2: It's time to tackle short-lived climate-forcing pollutants

Climate change mitigation is about more than just CO2. So-called "short-lived climate-forcing pollutants" such as soot, methane, and tropospheric ozone all have harmful effects. Climate policy should be guided by a clearer understanding of their differentiated impacts.

It is common practice in climate policy to bundle climate-warming pollutants together and express their total effect in terms of "CO2 equivalence". This "equivalence" is based on a comparison of climate effects over a 100-year timescale. The approach is problematic, as IASS scientist Kathleen Mar explains in a new research paper: "The fact is that climate forcers simply aren't 'equivalent' - their effects on climate and ecosystems are distinct. Short-lived climate forcers have the largest impact on near-term climate, whereas CO2 has the largest impact on long-term climate." The use of a 100-year time horizon as the primary basis for evaluating climate effects understates the impacts of short-lived climate-forcing pollutants (SLCPs) and thus undervalues the positive near-term effects that can be achieved by reducing SLCP emissions.
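As a rough arithmetic illustration of why the time horizon matters, take methane: under commonly cited IPCC AR5 global warming potentials (about 28 over 100 years and about 84 over 20 years, without carbon-cycle feedbacks; exact values differ between assessments and are not taken from the paper), the same emission translates into very different CO2-equivalent figures.

```python
# Illustrative only: how the choice of time horizon changes "CO2 equivalence".
# GWP values are approximate IPCC AR5 figures, not values from the paper.
GWP_CH4 = {"100-year": 28, "20-year": 84}

methane_tonnes = 1.0
for horizon, gwp in GWP_CH4.items():
    print(f"{methane_tonnes} t CH4 ~ {methane_tonnes * gwp} t CO2-eq ({horizon} horizon)")
# The same tonne of methane counts roughly three times more heavily when
# near-term impacts are weighted (20-year horizon) than under the standard
# 100-year metric - which is the understatement Mar describes.
```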

Reducing SLCP emissions benefits climate, human health, and food security

One of the more pernicious qualities of CO2 is that it accumulates in the atmosphere - once it is emitted, it takes a long time for it to be removed. SLCPs, on the other hand, remain in the atmosphere for significantly shorter periods. The atmosphere and climate system react much more quickly to reductions in the emission of these pollutants. This should be a boon for policymakers, Mar argues: "Reducing SLCP emissions would slow near-term climate warming, reduce air pollution, and improve crop yields - all positive benefits that citizens could experience today and in the near future." Studies indicate that a rapid reduction in SLCP emissions could slow the rate of climate change, reducing the risk of triggering dangerous and potentially irreversible climate tipping points and allowing more time for climate adaptation.

Measures to reduce SLCP emissions could be implemented using existing technologies and practices, such as the collection of landfill gas to generate energy. Changes in other sectors will be needed to achieve further reductions. Methane and soot emissions from the agriculture and waste management sectors, for example, have important climate as well as health impacts. Similarly, hydrofluorocarbons (HFCs), which are significantly more potent climate forcers than CO2, are still widely used as coolants.

Designing effective climate change mitigation: don't overlook SLCPs

Mar argues that clear communication on the different time horizons relevant for CO2 and SLCP mitigation would help guide climate policy discussions towards more effective outcomes: "To mitigate the most harmful consequences of climate change as a whole, we need to minimize both near-term and long-term climate impacts - and that means reducing SLCPs in parallel to CO2. However, the positive benefits of SLCP mitigation are simply not captured when using the 100-year time horizon as the single benchmark for assessing climate impacts."

Recognizing the broader benefits of SLCP reductions, several countries have stepped up and made SLCP mitigation a central element of their national climate strategies. Chile, Mexico, and Nigeria all include SLCPs in their national commitments under the Paris Agreement. If this type of holistic approach can be expanded and translated into on-the-ground emissions reductions at a global scale, then it will certainly be a win, not only for climate, but for air quality, health, and sustainable development.

Credit: 
Research Institute for Sustainability (RIFS) – Helmholtz Centre Potsdam