Earth

Surprising news: drylands are not getting drier

image: A dryland ecosystem in Northern California shows decreasing soil moisture but little change in surface water availability.

Image: 
Columbia Engineering

New Columbia Engineering study--the first to investigate the long-term effects of soil moisture-atmosphere feedbacks in drylands--finds that soil moisture exerts a negative feedback on surface water availability in drylands, offsetting some of the expected decline

New York, NY--January 4, 2021--Scientists have thought that global warming will increase the availability of surface water--freshwater resources generated by precipitation minus evapotranspiration--in wet regions, and decrease water availability in dry regions. This expectation is based primarily on atmospheric thermodynamic processes. As air temperatures rise, more water evaporates into the air from the ocean and land. Because warmer air can hold more water vapor than cooler air, a more humid atmosphere is expected to amplify the existing pattern of water availability, causing the "dry-get-drier, wet-get-wetter" atmospheric response to global warming.

A Columbia Engineering team led by Pierre Gentine, Maurice Ewing and J. Lamar Worzel Professor of Earth and Environmental Engineering and affiliated with the Earth Institute, wondered why coupled climate model predictions do not project significant "dry-get-drier" responses over drylands--tropical and temperate areas with an aridity index of less than 0.65--even when researchers use the high-emissions global warming scenario. Sha Zhou, a postdoctoral fellow at Lamont-Doherty Earth Observatory and the Earth Institute who studies land-atmosphere interactions and the global water cycle, thought that soil moisture-atmosphere feedbacks might play an important part in future predictions of water availability in drylands.
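
For readers unfamiliar with that threshold: the aridity index is conventionally defined as the ratio of mean annual precipitation to potential evapotranspiration. Below is a minimal sketch of the classification, assuming the standard UNEP convention; the function names and sample values are illustrative, not from the study.

```python
# Minimal sketch of the dryland classification referenced above, assuming the
# standard UNEP convention AI = P / PET; not code from the study.

def aridity_index(precipitation_mm: float, potential_evapotranspiration_mm: float) -> float:
    """Aridity index AI = P / PET, computed from mean annual values."""
    return precipitation_mm / potential_evapotranspiration_mm

def is_dryland(precipitation_mm: float, potential_evapotranspiration_mm: float) -> bool:
    """Drylands are regions with AI < 0.65, the definition used in the study."""
    return aridity_index(precipitation_mm, potential_evapotranspiration_mm) < 0.65

# Hypothetical example: 300 mm/yr of precipitation against 1500 mm/yr of
# potential evapotranspiration gives AI = 0.2, well inside the dryland range.
print(is_dryland(300.0, 1500.0))  # True
```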

The new study, published today in Nature Climate Change, is the first to show the importance of long-term soil moisture changes and associated soil moisture-atmosphere feedbacks in these predictions. The researchers identified a long-term soil moisture regulation of atmospheric circulation and moisture transport that largely offsets the decline in future dryland water availability that would otherwise be expected in the absence of soil moisture feedbacks.

"These feedbacks play a more significant role than realized in long-term surface water changes," says Zhou. "As soil moisture variations negatively impact water availability, this negative feedback could also partially reduce warming-driven increases in the magnitudes and frequencies of extreme high and extreme low hydroclimatic events, such as droughts and floods. Without the negative feedback, we may experience more frequent and more extreme droughts and floods."

The team combined a unique, idealized multi-model land-atmosphere coupling experiment with a novel statistical approach they developed for the study. They then applied the algorithm to observations to examine the critical role of soil moisture-atmosphere feedbacks in future water availability changes over drylands, and to investigate the thermodynamic and dynamic mechanisms underpinning future water availability changes driven by these feedbacks.

They found that, in response to global warming, surface water availability (precipitation minus evaporation, P-E) declines strongly in dry regions over the oceans, but only slightly over drylands. Zhou suspected that this contrast is associated with land-atmosphere processes. "Over drylands, soil moisture is projected to decline substantially under climate change," she explains. "Changes in soil moisture would further impact atmospheric processes and the water cycle."

Global warming is expected to reduce water availability and hence soil moisture in drylands. But this new study found that the drying of soil moisture actually feeds back negatively onto water availability--declining soil moisture reduces evapotranspiration and evaporative cooling, and enhances surface warming in drylands relative to wet regions and the ocean. The land-ocean warming contrast strengthens the air pressure difference between ocean and land, driving stronger winds and greater water vapor transport from the ocean to land.

"Our work finds that soil moisture predictions and associated atmosphere feedbacks are highly variable and model dependent," says Gentine. "This study underscores the urgent need to improve future soil moisture predictions and accurately represent soil moisture-atmosphere feedbacks in models, which are critical to providing reliable predictions of dryland water availability for better water resources management."

Credit: 
Columbia University School of Engineering and Applied Science

Scientists reach limit of multi-parameter quantum measurement with zero trade-off

image: There are three modules in the experiment: state preparation, evolution, and measurement.

Image: 
HOU Zhibo et al.

Real-life applications such as magnetometry and quantum gyroscopes typically involve the precise measurement of multiple parameters. How to achieve the ultimate precision limit for all parameters simultaneously has long been a sought-after grail in the field.

It is widely believed that the ultimate precision limits for all parameters cannot be achieved simultaneously, since the generators of different parameters are generally non-commuting, which induces trade-offs among the achievable precisions.

Yet such trade-offs have now been escaped by the group of Prof. LI Chuanfeng and Prof. XIANG Guoyong from the Key Laboratory of Quantum Information at the University of Science and Technology of China, Chinese Academy of Sciences, together with their collaborator Prof. YUAN Haidong from the Chinese University of Hong Kong.

They counteracted the trade-offs and simultaneously achieved the precision limit for estimating all three parameters of SU(2) operators, with a 13.27 dB improvement over the shot-noise limit. The work has been published in the journal Science Advances.

XIANG and his colleagues extended the control-enhanced sequential measurement scheme from single-parameter estimation to multi-parameter estimation.

They related simultaneous multi-parameter quantum estimation directly to the Heisenberg uncertainty relations, and showed that achieving the precision limit for multiple parameters simultaneously requires the simultaneous saturation of the minimum uncertainty in multiple Heisenberg uncertainty relations.
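
To see concretely why the generators matter, consider that the three parameters of an SU(2) rotation are generated by the Pauli matrices. The following minimal numerical sketch (illustrative only, not the team's analysis) confirms that these generators do not commute, which is the source of the trade-offs described above.

```python
# Illustrative sketch: the three SU(2) parameters are generated by the Pauli
# matrices, whose non-zero commutators underlie the precision trade-offs
# discussed above. Not code from the study.
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def commutator(a, b):
    return a @ b - b @ a

# [sx, sy] = 2i*sz, which is non-zero: the generators of the three
# parameters cannot, in general, be optimally measured independently.
print(np.allclose(commutator(sx, sy), 2j * sz))          # True
print(np.allclose(commutator(sx, sy), np.zeros((2, 2))))  # False: non-commuting
```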

As the first experimental demonstration of multi-parameter quantum estimation with zero trade-off, the work reveals the deep connection between quantum metrology and the Heisenberg uncertainty principle and marks a crucial step towards achieving the ultimate precision of multi-parameter quantum estimation.

XIANG's group has been dedicated to counteracting the trade-offs in multi-parameter estimation. The researchers first developed new experimental techniques for collective measurements, which successfully reduced the trade-offs in quantum state tomography and quantum orienteering. They then optimized the entangled probe states in quantum magnetometry and obtained the ultimate precision limit for the three magnetic-field components with minimal trade-offs. Though diminished, trade-offs still remained in those earlier works.

Credit: 
University of Science and Technology of China

Elephant ivory continues to be disguised and sold on eBay

Research from the University of Kent's Durrell Institute of Conservation and Ecology (DICE) has found that elephant ivory is still being sold on the online marketplace eBay, despite its 10-year-old policy banning the trade in ivory.

The trafficking of wildlife over the internet continues to be a problem, and detecting illegal activity remains challenging. Despite the efforts of law enforcement, the demand for illegal wildlife products online has continued to increase. In some cases, vendors have adopted 'code words' to disguise the sale of illicit items.

Sofia Venturini and Dr David Roberts of DICE investigated the misrepresentation of materials in the advertisement descriptions of netsuke being sold on eBay UK. Netsuke are carved objects attached to the cord of the Japanese kimono, and are often made of elephant ivory.

A comparison was made between the materials declared by the vendors and the authors' identification based on the images in the advertisements. As it was not ethically desirable to obtain the physical items for analysis, the researchers verified authentic elephant ivory by analysing the presence of Schreger lines (a unique pattern found in elephant ivory).

The researchers found that authentic elephant ivory was most frequently described as bone in listings of netsuke. Further, by returning a month later, they found that only a small percentage (between 1.3% and 6.9%) of these elephant-ivory netsuke had been removed by eBay. Over half had been sold, while among the items that remained unsold, half were relisted. If eBay were effectively enforcing its policy (introduced in 2008) on ivory, these items would have been removed.

Dr Roberts said: 'Despite eBay's strict policy on Animal and Wildlife Products, there is still an ongoing trade in ivory, mostly concealed as other non-restricted materials. While detecting illegal sales of ivory items can be particularly difficult as, for example, the word "ivory" can be used to describe a colour, companies like eBay have the resources and data that could be mobilised to tackle the challenge of illegal wildlife trade.'

Credit: 
University of Kent

New data-driven global climate model provides projections for urban environments

image: A new climate model that makes projections specific to urban areas predicts that by the end of this century, average warming across global cities will increase by 1.9 degrees Celsius to 4.4 C, depending on the rate of emissions.

Image: 
Graphic by Michael Vincent

CHAMPAIGN, Ill. -- Cities only occupy about 3% of the Earth's total land surface, but they bear the burden of the human-perceived effects of global climate change, researchers said. Global climate models are set up for big-picture analysis, leaving urban areas poorly represented. In a new study, researchers take a closer look at how climate change affects cities by using data-driven statistical models combined with traditional process-driven physical climate models.

The results of the research, led by University of Illinois Urbana-Champaign engineer Lei Zhao, are published in the journal Nature Climate Change.

Home to more than 50% of the world's population, cities experience more heat stress, water scarcity, air pollution and energy insecurity than suburban and rural areas because of their layout and high population densities, the study reports.

"Cities are full of surfaces made from concrete and asphalt that absorb and retain more heat than natural surfaces and perturb other local-scale biophysical processes," said Zhao, a civil and environmental engineering professor and National Center for Supercomputing Applications affiliate. "Incorporating these types of small-scale variables into climate modeling is crucial for understanding future urban climate. However, finding a way to include them in global-scale models poses major resolution, scale and computational challenges."

Global climate models project future scenarios by modeling how broader-scale processes like greenhouse gas emissions force the global climate to respond. By combining this technique with a statistical model that emulates a complex and detailed climate model for urban landscapes, Zhao's team confronted the urban-to-global information gap.

The team applied its urban climate emulation technique to data from 26 global climate models under intermediate- and high-emissions scenarios. This approach allowed the researchers to translate model outputs into city-level projections of temperature and relative humidity through the year 2100, permitting quantification of both the climate change signal and its uncertainty.
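
The press release does not specify the emulator's functional form, but the general idea of emulating an expensive urban climate model with a cheap statistical map can be sketched as follows. This is a toy example with synthetic data; the predictors, coefficients, and fitting method are hypothetical, not the study's.

```python
# Toy sketch of a statistical climate-model emulator, for illustration only;
# the study's actual emulator is more sophisticated. We fit a simple linear
# map from coarse GCM predictors to an urban temperature signal.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: rows are samples, columns are GCM predictors
# (e.g., regional mean temperature and humidity); the target is the urban
# temperature produced by a detailed (expensive) urban climate model.
X_train = rng.normal(size=(200, 2))
y_train = 1.5 * X_train[:, 0] + 0.3 * X_train[:, 1] + rng.normal(scale=0.1, size=200)

# Ordinary least squares with an intercept column.
A = np.column_stack([X_train, np.ones(len(X_train))])
coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)

# The fitted emulator can now be applied cheaply to output from many GCMs
# (here, one hypothetical new sample plus the intercept term).
x_new = np.array([0.8, -0.2, 1.0])
print(x_new @ coef)  # emulated urban temperature anomaly
```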

The model predicts that by the end of this century, average warming across global cities will increase by 1.9 degrees Celsius with intermediate emissions and 4.4 C with high emissions, with good agreement among existing climate models over certain regions, Zhao said.

The projections also predicted a near-universal decrease in relative humidity in cities, making surface evaporation more efficient and implying that adaptation strategies like urban vegetation could be useful.

"Our findings highlight the critical need for global projections of local urban climates for climate-sensitive urban areas," Zhao said. "This could give city planners the support they need to encourage solutions such as green infrastructure intervention to reduce urban heat stress on large scales."

Currently, the projections do not account for the effects of future urban development. However, the researchers hypothesize that they can extend their strategy to make up for this. "The methodology, overall, is very flexible and can be adjusted to capture things like finer time scales and can even be applied to other ecosystems, like forests and polar regions, for example," Zhao said.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Researchers discover a new tool for reconstructing ancient sea ice to study climate change

image: A compound that was notorious for throwing off reconstructions of sea surface temperature turns out to be a good proxy for reconstructing past sea ice, a new study finds.

Image: 
Karen Wang

PROVIDENCE, R.I. [Brown University] -- Sea ice is a critical indicator of changes in the Earth's climate. A new discovery by Brown University researchers could provide scientists a new way to reconstruct sea ice abundance and distribution information from the ancient past, which could aid in understanding human-induced climate change happening now.

In a study published in Nature Communications, the researchers show that an organic molecule often found in high-latitude ocean sediments, known as tetra-unsaturated alkenone (C37:4), is produced by one or more previously unknown species of ice-dwelling algae. As sea ice concentration ebbs and flows, so do the algae associated with it, as well as the molecules they leave behind.

"We've shown that this molecule is a strong proxy for sea ice concentration," said Karen Wang, a Ph.D. student at Brown and lead author of the research. "Looking at the concentration of this molecule in sediments of different ages could allow us to reconstruct sea ice concentration through time."

Other types of alkenone molecules have been used for years as proxies for sea surface temperature. At different temperatures, algae that live on the sea surface make differing amounts of the alkenones known as C37:2 and C37:3. Scientists can use the ratio between those two molecules found in sea sediments to estimate past temperature. C37:4 -- the focus of this new study -- had long been considered a bit of a problem for temperature measurements. It turns up in sediments taken closer to the Arctic, throwing off the C37:2/C37:3 ratios.
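
For context, the conventional temperature index built from those two molecules is the U37K' ratio, while the new sea-ice proxy rests on the relative abundance of C37:4. Below is a minimal sketch of both quantities; it is illustrative only, with hypothetical abundances, and the study's actual calibration against sea ice concentration is more involved.

```python
# Illustrative sketch of alkenone indices; not the study's calibration.

def u37k_prime(c37_2: float, c37_3: float) -> float:
    """U37K' = C37:2 / (C37:2 + C37:3), the conventional SST index."""
    return c37_2 / (c37_2 + c37_3)

def c37_4_fraction(c37_2: float, c37_3: float, c37_4: float) -> float:
    """Relative abundance of the tetra-unsaturated alkenone, %C37:4."""
    return c37_4 / (c37_2 + c37_3 + c37_4)

# Hypothetical measured abundances from one sediment sample:
print(u37k_prime(0.6, 0.4))           # 0.6
print(c37_4_fraction(0.6, 0.4, 0.5))  # ~0.33; higher when sea ice is extensive
```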

"That was mostly what the C37:4 alkenone was known for -- throwing off the temperature ratios," said Yongsong Huang, principal investigator of the National Science Foundation-funded project and a professor in Brown's Department of Earth, Environmental and Planetary Science. "Nobody knew where it came from, or whether it was useful for anything. People had some theories, but no one knew for sure."

To figure it out, the researchers studied sediment and sea water samples containing C37:4 taken from icy spots around the Arctic. They used advanced DNA sequencing techniques to identify the organisms present in the samples. That work yielded previously unknown species of algae from the order Isochrysidales. The researchers then cultured those new species in the lab and showed that they were indeed the ones that produced an exceptionally high abundance of C37:4.

The next step was to see whether the molecules left behind by these ice-dwelling algae could be used as a reliable sea ice proxy. To do that, the researchers looked at concentrations of C37:4 in sediment cores from several spots in the Arctic Ocean near the present-day sea ice margins. In the recent past, sea ice in these spots is known to have been highly sensitive to regional temperature variation. That work found that the highest concentrations of C37:4 occurred when the climate was coldest and ice was at its peak. The highest concentrations dated back to the Younger Dryas, a period of very cold and icy conditions that occurred around 12,000 years ago. When the climate was at its warmest and ice ebbed, C37:4 was sparse, the research found.

"The correlations we found with this new proxy were far stronger than other markers people use," said Huang, a research fellow at the Institute at Brown for Environment and Society. "No correlation will be perfect because modeling sea ice is a messy process, but this is probably about as strong as you're going to get."

And this new proxy has some additional advantages over others, the researchers say. One other method for reconstructing sea ice involves looking for the fossil remains of another kind of algae, called diatoms. But that method becomes less reliable further back in time because the fossils can degrade. Molecules like C37:4 tend to be more robustly preserved, making them potentially better for reconstructions over deep time than other methods.

The researchers plan further studies of these new algae species to better understand how they become embedded in sea ice, and how they produce this alkenone compound. The algae appear to live in brine bubbles and channels inside sea ice, but they may also bloom just after the ice melts. Understanding those dynamics will help the researchers better calibrate C37:4 as a sea ice proxy.

Ultimately, the researchers hope that the new proxy will enable better understanding of sea ice dynamics through time. That information would improve models of past climate, which would make for better predictions of future climate change.

Credit: 
Brown University

Immunology study finds protein critical to T cell metabolism and anti-tumor immune response

image: Shao-Cong Sun, Ph.D.

Image: 
MD Anderson Cancer Center

HOUSTON -- Researchers at The University of Texas MD Anderson Cancer Center have discovered that a protein called NF-kappa B-inducing kinase (NIK) is essential for the shift in metabolic activity that occurs with T cell activation, making it a critical factor in regulating the anti-tumor immune response.

The preclinical research, published today in Nature Immunology, suggests that elevating NIK activity in T cells may be a promising strategy to enhance the effectiveness of immunotherapy, including adoptive cellular therapies and immune checkpoint blockade.

In a preclinical melanoma model, the researchers evaluated melanoma-specific T cells engineered to express higher levels of NIK. Compared to controls, these T cells displayed stronger tumor-killing abilities and improved survival, suggesting that increasing NIK activity may improve the effectiveness of adoptive T cell therapies.

"NIK is a novel regulator of T cell metabolism that works in a very unique manner. Biologically, NIK activity stabilizes the HK2 glycolytic enzyme through regulating the cellular redox pathway," said corresponding author Shao-Cong Sun, Ph.D., professor of Immunology. "From the therapeutic point of view, we were able to improve the efficacy of adoptive T cell therapies in preclinical models by overexpressing NIK in those cells."

T cells generally exist in a relatively quiet state with low energy demands and little cell division, Sun explained. However, upon recognizing an antigen, T cells begin expanding and activate the glycolysis metabolic pathway to meet the increased energy demands of carrying out their immune function.

This metabolic shift is closely regulated by immune checkpoint proteins, such as CTLA-4 and PD-1, which act to repress T cell metabolism. Thus, immune checkpoint inhibitors can reinvigorate T cell anti-tumor activity by boosting metabolism. In addition, T cells begin producing proteins called costimulatory molecules after they become activated, which work to stimulate metabolism and the immune response.

Knowing that the NIK protein functions downstream of many of these costimulatory molecules, the researchers sought to better understand its role in regulating T cell function. In melanoma models, NIK loss resulted in an increased tumor burden and fewer tumor-infiltrating T cells, suggesting NIK plays a crucial role in anti-tumor immunity and T cell survival.

Further experiments revealed that NIK is essential for the metabolic reprogramming in activated T cells through its control of the cellular redox system. Increased metabolism can lead to elevated levels of reactive oxygen species (ROS), which can damage the cell and stimulate protein degradation.

The researchers discovered that NIK maintains the NADPH redox system, an important antioxidant mechanism to reduce the accumulation of ROS. This in turn leads to the stabilization of the HK2 protein, a rate-limiting enzyme within the glycolysis pathway.

"Our findings suggest that without NIK, the HK2 protein is not stable, and is constantly being degraded. You need NIK to maintain HK2 levels in T cells," Sun said. "Interestingly, we found that adding more NIK to the cells, you can further increase the levels of HK2 and make glycolysis more active."

As a potential therapeutic application, the researchers currently are working to evaluate modified chimeric antigen receptor (CAR) T cells engineered in the laboratory to overexpress NIK. In the future, they hope to explore other therapeutic approaches, such as targeted therapies that could manipulate NIK activity in tandem with other immunotherapy approaches, including immune checkpoint inhibitors.

Credit: 
University of Texas M. D. Anderson Cancer Center

Severe sepsis predicted by common protein

A sugar-binding protein could fuel terrible inflammation and worsen sepsis, a disease that kills more than 270,000 people every year in the US alone, reports a team of researchers led by UConn Health in the 4 January issue of Nature Immunology.

Sepsis is caused mostly by bacterial infections. The immune system escapes its normal controls and triggers a cytokine storm, a condition in which inflammation-causing proteins flood the blood. Organs may break down, and death often follows.

Other diseases can also cause cytokine storms; medical historians believe cytokine storms were behind the lethality of the 1918-1919 flu pandemic, as well as the Black Death. Cytokine storms are also observed in patients with severe COVID-19 and are believed to contribute to deaths from the disease.

A main trigger for the cytokine storms during sepsis is the overreaction of the body when it detects an infection inside cells. When a cell detects bacteria or pieces of bacteria inside itself, it immediately activates enzymes that in turn activate a protein that pokes holes in the cell membrane from within, eventually causing the cell to burst open and spill cytokines into the bloodstream. Cytokines are alarm signals, calling in the immune system to fight the bacteria. Cytokines also make other cells more likely to burst open and sound the alarm. Usually, the system damps down after a while and calms down, but in sepsis it spins out of control, causing more and more cells to burst, die, and release even more cytokines into the bloodstream.

When cells burst open, they release not only cytokines but also other danger molecules called alarmins, which alert the body to an infection or injury and can amplify the ongoing cytokine storm.

UConn Health immunologist Vijay Rathinam wanted to know which alarmins were released when a cell detected a specific kind of bacterial molecule called lipopolysaccharide inside itself. Dr. Ashley Russo, then a graduate student in the Rathinam lab, catalogued--in collaboration with immunologists Tony Vella and Antoine Menoret at UConn Health--proteins released by these cells when they detected lipopolysaccharide. 

And they found something exciting. Galectin-1, a protein that binds sugars and sugar-coated proteins, seemed to be emanating from the cells. Interestingly, they found that galectin-1 is small enough to slip out of the holes poked in the cells' membranes, even before the cells burst open.

Once they noticed that, they began to look at the role galectin-1 played in sepsis. They found that galectin-1 seemed to be suppressing a brake on inflammation, causing the cytokine storm to ramp up. They also found that mice lacking galectin-1 had less inflammation, less organ damage, and survived longer than normal mice did during sepsis resulting from a bacterial infection and lipopolysaccharide. 

To find out if galectin-1 is released during sepsis in human patients, the team collaborated with Jena University Hospital's Drs. Deshmukh, Bauer, and Sponholz, and found that sepsis patients had higher levels of galectin-1 than both non-sepsis patients in critical care and healthy people.

The team is considering whether galectin-1 might be a good drug target to help dampen cytokine storms during sepsis, as well as a useful marker doctors could use to identify critically ill patients at risk.

Credit: 
University of Connecticut

Why do males have to wait for 'round 2'? The reason may be different from what we think

If you type into a search engine - "why do men have to wait before having sex again?" - you will very quickly come across prolactin. This little hormone is thought to be involved in hundreds of physiological processes in the body. Among them is the male post-ejaculatory refractory period. This period begins when a male ejaculates and ends when he recovers his sexual capacity.

If you search a bit more, you'll see that this theory has even led to the development of so-called "treatments". These promise to shorten the length of a person's refractory period by reducing their body's prolactin levels.

Well, here is some bad news for anyone who has bought any such merchandise. A new study in mice by scientists at the Champalimaud Centre for the Unknown in Portugal reveals that prolactin may actually not be the culprit after all. These results were published today (January 4th) in the journal Communications Biology.

The Theory

Ironically, the research project that ended up refuting the theory never aimed to do so.

"When we started working on this project, we actually set off to explore the theory", recalls Susana Lima, the principal investigator who led the study. "Our goal was to investigate in more detail the biological mechanisms by which prolactin might generate the refractory period."

What is the basis of the theory? According to Lima, it emerged through several lines of evidence.

For one, some studies have shown that prolactin is released around the time of ejaculation in humans and rats. And since the refractory period starts right after ejaculation, prolactin seemed like a good candidate. Also, chronically and abnormally high levels of prolactin are associated with decreased sexual drive, anorgasmia and ejaculatory dysfunction. Finally, in situations of chronically high prolactin, treatment with drugs that inhibit prolactin release reverses sexual dysfunction.

"These different results all point towards a central role for prolactin in suppressing male sexual behaviour", says Lima. "However, a direct link between prolactin and the male post-ejaculatory refractory period was never directly demonstrated. Still, this theory has become so widespread that it now appears in textbooks as well as in the popular press."

Why Not Prolactin?

How did the team end up discovering that the theory was wrong?

To study the role of prolactin in the male refractory period, Lima and her team performed a series of experiments in mice.

"We chose mice as our model animal because the sequence of sexual behaviour in mice is very similar to that of humans", explains Susana Valente, the first author of the study. "Also, with mice, we can test different strains that exhibit different sexual performance, which makes the data richer. In this case we used two different strains. One that has a short refractory period, and another that has a long one, lasting several days."

The team began by checking if prolactin levels also increase during sexual activity in male mice. "We measured the levels during the different stages of sexual behaviour using blood samples. And sure enough, they significantly increased during sexual interaction," says Valente.

Once this aspect was confirmed, the researchers moved forward to investigate the relation between prolactin and the length of the animals' refractory period.

"Our first manipulation was to artificially increase prolactin levels before the animals became sexually aroused. We specifically made sure that the artificial levels matched those we measured during natural sexual behaviour. If prolactin was indeed the cause of the refractory period, the animals' sexual activity should have decreased", Valente explains.

To their surprise, this manipulation had no effect on the sexual behaviour of the mice. "Despite the elevation in prolactin levels, both strains of mice engaged in sexual behaviour normally", she recalls.

Next, the researchers turned to see if blocking prolactin would have the opposite effect on the refractory period. In other words, if animals without prolactin would be more sexually active. Again, the answer was "No".

"If prolactin was indeed necessary for the refectory period, males without prolactin should have regained sexual activity after ejaculation faster than controls", Valente points out. "But they did not."

Back To The Drawing Board

Together, Valente and Lima's results provide strong counterevidence against the theory that prolactin triggers the male refractory period. Still, prolactin is undoubtedly a part of male sexual behaviour. What could its role be?

"There are many possibilities", says Lima. "For instance, there are studies that point towards a role for prolactin in the establishment of parental behaviour. Also, it's important to note that prolactin dynamics are quite different in male mice and men. In mice, prolactin levels rise during mating. However, in men, prolactin seems to only be released around the time of ejaculation, and only when ejaculation is achieved. So there may be some differences in its role across species."

So what is the reason males have to wait before round two?

"Our results indicate that prolactin is very unlikely to be the cause", says Lima. "Now we can move on and try to find out what's really happening", she concludes.

Credit: 
Champalimaud Centre for the Unknown

Scientists seek faster route to treat depression

image: The Brazilian research group used epigenetic modulators to try to 'erase' the damage done by stress to neuroplasticity. The study showed that acute intervention in epigenetic mechanisms produces antidepressant-like effects more rapidly than conventional drugs.

Image: 
FCFRP-USP

By Karina Ninni | Agência FAPESP – Treatment of depression faces two main challenges. The first is that almost 50% of patients do not respond well to existing antidepressants. The second is that conventional medications take a relatively long time – around three to five weeks – to have the desired effect. A group of researchers affiliated with the University of São Paulo (USP) in Brazil set out to tackle the second problem by using epigenetic modulators to try to “erase” the consequences of stress. Epigenetic mechanisms are part of a complex system that controls how and when genes are switched on or off.

Exposure to stress, a key trigger of depression, alters certain epigenetic markers in the brain. Many of these alterations occur in genes associated with neuroplasticity, the brain’s ability to change in response to experience. Stress increases DNA methylation in these genes.

DNA methylation is a chromatin remodeling process that regulates gene expression by recruiting proteins involved in gene repression or by inhibiting the binding of transcription factors to DNA. Most existing antidepressants are designed to reduce this process.

The team led by Sâmia Joca, a professor at USP and the University of Aarhus in Denmark, decided to conduct an in-depth investigation into the action of BDNF (brain-derived neurotrophic factor), a nervous system protein with well-documented effects on the regulation of neuronal plasticity.

“Stress reduces expression of BDNF and, as shown in the literature, antidepressants have no effect if BDNF signaling is blocked. That’s why we focused on BDNF,” said Joca, who is affiliated with the Biomolecular Science Department at USP’s Ribeirão Preto School of Pharmaceutical Sciences (FCFRP).

The group tested the hypothesis that stress increases methylation of the gene for BDNF, reducing its expression, and that this reduction is linked to depressive behavior. “Our starting point was this: if we administered an epigenetic modulator that inhibited DNA methylation, the process wouldn’t happen, BDNF levels would be normal, and there would be an antidepressant effect,” Joca said. “If the antidepressant effect is indeed linked to normalization of the methylation profile, so that conventional drugs take time to work because it takes time to eliminate stress-induced alterations, we imagined that direct modulation of these epigenetic mechanisms would produce the effect rapidly. We found this was indeed the case.”

They report the results in an article published in the journal Molecular Neurobiology. The first author is Amanda Juliana Sales, who was supported by FAPESP. The other authors are Izaque S. Maciel and Angélica Suavinha, researchers supervised by last author Joca and also supported by FAPESP.

“We tested two drugs, one of which is used to treat cancer (gliomas). The other is completely experimental,” Joca said. “It’s important to note that these drugs can’t be used to treat depression because if they reduce DNA methylation unrestrictedly they’ll increase the expression of several genes rather than just the gene that interests us. So there will be adverse effects. The findings point not to prospects for novel antidepressants but to an interesting angle from which to develop novel treatments.”

Behavior

According to Joca, to test the hypothesis that direct modulation of epigenetic mechanisms would work faster, it was necessary to use (and validate) a model that distinguished very clearly between chronic and acute treatment. The scientists first validated a stress-induced depression model in rats treated with well-known conventional drugs. In this model, called “learned helplessness”, the rats were exposed to inescapable stress, followed seven days later by a situation in which it was possible to avoid stress by moving to the other side of the chamber they were in.

The results showed a higher number of failures to learn this avoidance behavior among stressed than non-stressed animals, which was expected. This trend was attenuated by chronic treatment with conventional antidepressants and acute treatment with epigenetic modulators.

“What we call learned helplessness in this model is similar to depression in humans, to the feeling that there’s nothing the person can do to make the situation better,” Joca said. “The model was validated and showed that when continuously treated with antidepressants, the animals returned to normal and resembled non-stressed animals in behavioral terms. However, this only happened if they were treated repeatedly. The same applies to depressed people, who have to take the drug continuously. There is no acute effect from a single dose.”

The forced swimming test was also used to stress the rats, whose behavior was observed after 24 hours. In this case, too, conventional drugs reduced the level of stress-induced depression. Having validated the model, the researchers ran another series of experiments in which epigenetic modulators were found to have an antidepressant-like effect.

Retest reliability

The team tested two different drugs as modulators, 5-AzaD and RG108. Both inhibit the enzyme responsible for DNA methylation, “but they aren’t chemically related,” Joca explained. “We wanted to avoid the possibility that the effect was due to some non-specific mechanism in one of the drugs. So we used entirely different drugs and obtained the same result. We measured the effect at two different times, shortly after the inescapable stress in one group and before the helplessness test in the other. We observed a rapid antidepressant effect in both cases.”

The next step was a molecular analysis of 5-AzaD in order to produce a methylation profile of the gene of interest. “We found that stress did indeed increase methylation of BDNF as well as TrkB, another nervous system protein, and this was moderately attenuated by our treatments,” Joca said.

Because the alteration was very subtle, the researchers decided to analyze retest reliability. “Using a different model, we reproduced results of the forced swimming test and injected the drug systemically while also administering a BDNF signaling inhibitor to the cortex. This had no antidepressant effect,” Joca said.

The study was a continuation of the work Joca and her team have been doing for several years. “In 2010, we published an article showing that these drugs had an antidepressant effect. Not long after that, we published another article showing that antidepressant treatment modulated DNA methylation. The interesting point in this latest study was the production of the antidepressant effect by means of an acute intervention. This is the first time epigenetic modulators have been shown to have a rapid antidepressant effect,” Joca said.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Focusing on diversion yields positive results for kids with behavioral issues

Of the 5,300 children enrolled in the Ohio Behavioral Health Juvenile Justice Initiative since 2006, 21% reported that someone close to them had been murdered in the past year. Nearly half of the boys and more than a quarter of the girls in the program have both a substance abuse disorder and a mental health disorder.

But there's good news, too: From 2017 through 2019, 81% of the participants--aged 10 through 17--successfully completed the state's juvenile diversion program, and data indicated that 79% of youth reduced their contact with police while in treatment.

Those findings are from a new detailed evaluation of the Ohio Behavioral Health Juvenile Justice Initiative (BHJJ) by researchers at the Jack, Joseph and Morton Mandel School of Applied Social Sciences at Case Western Reserve University.

The key conclusion: Many youthful offenders can benefit from community-based diversion programs designed to address mental health and substance use issues in lieu of commitment to local or state-run detention centers.

"The majority of justice-involved youth have a history of mental health and/or substance-use issues, and have experienced a great deal of trauma," said Jeff Kretschmar, co-author of the study and the research associate professor at the university's Begun Center for Violence Prevention Research and Education. "However, local jurisdictions are often ill-equipped to accurately assess youth for behavioral health problems and provide appropriate treatment. Ohio's Behavioral Health Juvenile Justice Initiative was intended to transform and expand the local systems' options to better serve these youths."

The report focused on youth currently enrolled in the program rather than retrospectively, Kretschmar said, to "identify emerging behavioral health trends and better understand the effectiveness of the model as it operates across Ohio today."

Report highlights include:

Youth reported a significant decrease in trauma symptoms and problem severity from intake to termination, and a significant improvement in functioning.

Since 2015, only 3.8% of youth enrolled in BHJJ were committed to a state-run detention facility after enrollment.

BHJJ costs about $5,200 per child, compared with $196,000 per child who enters a state-run detention facility.

"The breadth of the data provides us with an opportunity to examine outcomes for youth in BHJJ from a variety of angles and provides practitioners with enough information to match programming with behavioral health needs," said Fredrick Butcher, research assistant professor at the Begun Center.

Credit: 
Case Western Reserve University

Researchers regenerate deactivated catalyst in methanol-to-olefins process

image: (a) First-principles-based simulations provide criteria for the stability and functionality of organic intermediates confined in the nano-cavity. (b) Selective transformation of coke into a catalyst rich in specific naphthalenic species, and improvement of MTO performance and atom economy, implemented in the circulating fluidized bed reactor-regenerator configuration.

Image: 
GAO Mingbin

The methanol-to-olefins (MTO) process, first commercialized in 2010, catalytically converts methanol--typically made from coal, natural gas, biomass, or CO2--over a SAPO-34 zeolite catalyst. It is becoming one of the main routes for producing light olefins, including ethylene and propylene, from non-oil resources.

One of the major challenges in MTO is the rapid deactivation of the zeolite catalyst due to coke deposition.

In industrial practice, a fluidized bed reactor-regenerator configuration is normally used to maintain continuous operation: in the regenerator, air or oxygen is fed in to burn off the deposited coke and restore catalyst activity. This transforms the coke species into CO2, converting a substantial fraction of the carbon resource into a low-value greenhouse gas.

A research group led by Prof. YE Mao and Prof. LIU Zhongmin from the Dalian Institute of Chemical Physics (DICP) of the Chinese Academy of Sciences has regenerated the deactivated catalyst in the industrially important methanol-to-olefins (MTO) process by directly transforming the coke deposited on the zeolite catalyst into active intermediates, rather than burning it off to carbon oxides.

This work was published in Nature Communications on Jan. 4.

It was previously shown that MTO follows the hydrocarbon pool mechanism, i.e., light olefins are preferentially formed with the participation of active intermediate species, known as hydrocarbon pool species (HCPs), during the reaction. These HCPs eventually evolve into the coke species that deactivate the catalyst.

By using density functional theory (DFT) calculations and multiple spectroscopic techniques, the team showed that naphthalenic cations, among the HCPs, are highly stable within SAPO-34 zeolites at high temperature, and that steam cracking can selectively transform the coke species in SAPO-34 zeolites into naphthalenic species at high temperature.

This technology not only recovers the catalyst activity but also promotes the formation of light olefins, owing to the synergistic effect of the naphthalenic species.

Furthermore, the researchers verified this technology in the fluidized bed reactor-regenerator pilot plant at DICP under industry-like continuous operation, achieving an unexpectedly high light-olefins selectivity of 85% in the MTO reaction, and 88% valuable CO and H2 with negligible CO2 in regeneration.
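
In idealized terms, the carbon chemistry behind these figures is straightforward: conventional oxidative regeneration burns coke roughly as C + O2 → CO2, releasing the carbon as a greenhouse gas, whereas steam-based regeneration gasifies the residual coke roughly as C + H2O → CO + H2, recovering the carbon as synthesis gas. This idealized picture is consistent with the reported 88% yield of valuable CO and H2 with negligible CO2.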

This technology opens a new avenue for controlling product selectivity via regeneration in industrial catalytic processes.

Credit: 
Dalian Institute of Chemical Physics, Chinese Academy Sciences

Nanoparticle drug-delivery system developed to treat brain disorders

Use of the delivery system in mouse models results in unprecedented siRNA penetration across the intact blood-brain barrier

Technology could offer potential for a variety of human neurological disorders

In the past few decades, researchers have identified biological pathways leading to neurodegenerative diseases and developed promising molecular agents to target them. However, the translation of these findings into clinically approved treatments has progressed at a much slower rate, in part because of the challenges scientists face in delivering therapeutics across the blood-brain barrier (BBB) and into the brain. To facilitate successful delivery of therapeutic agents to the brain, a team of bioengineers, physicians, and collaborators at Brigham and Women's Hospital and Boston Children's Hospital created a nanoparticle platform that can facilitate therapeutically effective delivery of encapsulated agents in mice with a physically breached or intact BBB. In a mouse model of traumatic brain injury (TBI), they observed that the delivery system showed three times more accumulation in the brain than conventional methods of delivery and was therapeutically effective as well, which could open possibilities for the treatment of numerous neurological disorders. Findings were published in Science Advances.

Previously developed approaches for delivering therapeutics into the brain after TBI rely on the short window of time after a physical injury to the head, when the BBB is temporarily breached. However, after the BBB is repaired within a few weeks, physicians lack tools for effective drug delivery.

"It's very difficult to get both small and large molecule therapeutic agents delivered across the BBB," said corresponding author Nitin Joshi, PhD, an associate bioengineer at the Center for Nanomedicine in the Brigham's Department of Anesthesiology, Perioperative and Pain Medicine. "Our solution was to encapsulate therapeutic agents into biocompatible nanoparticles with precisely engineered surface properties that would enable their therapeutically effective transport into the brain, independent of the state of the BBB."

The technology could enable physicians to treat secondary injuries associated with TBI that can lead to Alzheimer's, Parkinson's, and other neurodegenerative diseases, which can develop during ensuing months and years once the BBB has healed.

"To be able to deliver agents across the BBB in the absence of inflammation has been somewhat of a holy grail in the field," said co-senior author Jeff Karp, PhD, of the Brigham's Department of Anesthesiology, Perioperative and Pain Medicine. "Our radically simple approach is applicable to many neurological disorders where delivery of therapeutic agents to the brain is desired."

Rebekah Mannix, MD, MPH, of the Division of Emergency Medicine at Boston Children's Hospital and a co-senior author on the study, further emphasized that the BBB inhibits delivery of therapeutic agents to the central nervous system (CNS) for a wide range of acute and chronic diseases. "The technology developed for this publication could allow for the delivery of a large number of diverse drugs, including antibiotics, antineoplastic agents, and neuropeptides," she said. "This could be a game changer for many diseases that manifest in the CNS."

The therapeutic used in this study was a small interfering RNA (siRNA) molecule designed to inhibit the expression of the tau protein, which is believed to play a key role in neurodegeneration. Poly(lactic-co-glycolic acid), or PLGA, a biodegradable and biocompatible polymer used in several existing products approved by the U.S. Food and Drug Administration, was used as the base material for nanoparticles. The researchers systematically engineered and studied the surface properties of the nanoparticles to maximize their penetration across the intact, undamaged BBB in healthy mice. This led to the identification of a unique nanoparticle design that maximized the transport of the encapsulated siRNA across the intact BBB and significantly improved the uptake by brain cells.

A 50 percent reduction in the expression of tau was observed in TBI mice that received anti-tau siRNA through the novel delivery system, irrespective of whether the formulation was infused within or outside the temporary window of breached BBB. In contrast, tau was not affected in mice that received the siRNA through a conventional delivery system.

"In addition to demonstrating the utility of this novel platform for drug delivery into the brain, this report establishes for the first time that systematic modulation of surface chemistry and coating density can be leveraged to tune the penetration of nanoparticles across biological barriers with tight junction," said first author Wen Li, PhD, of the Department of Anesthesiology, Perioperative and Pain Medicine.

In addition to targeting tau, the researchers have studies underway to attack alternative targets using the novel delivery platform.

"For clinical translation, we want to look beyond tau to validate that our system is amenable to other targets," Karp said. "We used the TBI model to explore and develop this technology, but essentially anyone studying a neurological disorder might find this work of benefit. We certainly have our work cut out, but I think this provides significant momentum for us to advance toward multiple therapeutic targets and be in the position to move ahead to human testing."

Credit: 
Brigham and Women's Hospital

DUAL takes AI to the next level

image: Prof. Yeseong Kim from the Department of Information and Communication Engineering, DGIST

Image: 
DGIST

"Today's computer applications generate a large amount of data that needs to be processed by machine learning algorithms," says Yeseong Kim of Daegu Gyeongbuk Institute of Science and Technology (DGIST), who led the effort.

Powerful 'unsupervised' machine learning involves training an algorithm to recognize patterns in large datasets without providing labelled examples for comparison. One popular approach is a clustering algorithm, which groups similar data into different classes. These algorithms are used for a wide variety of data analyses, such as identifying fake news on social media, filtering spam in our e-mails, and detecting criminal or fraudulent activity online.

"But running clustering algorithms on traditional cores results in high energy consumption and slow processing, because a large amount of data needs to be moved from the computer's memory to its processing unit, where the machine learning tasks are conducted," explains Kim.

Scientists have been looking into processing-in-memory (PIM) approaches. But most PIM architectures are analog-based and require analog-to-digital and digital-to-analog converters, which take up a huge share of the computer chip's power and area. They also work better with supervised machine learning, which relies on labelled datasets to train the algorithm.

To overcome these issues, Kim and his colleagues developed DUAL, which stands for digital-based unsupervised learning acceleration. DUAL enables computations on digital data stored inside a computer memory. It works by mapping all the data points into high-dimensional space; imagine data points stored in many locations within the human brain.
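
The release does not detail DUAL's encoding, but the general flavor of hyperdimensional computing can be sketched as follows. This is a hypothetical toy example, not DUAL's actual architecture: data points are projected into very high-dimensional ±1 vectors, where similarity reduces to element-wise operations and dot products that digital in-memory hardware can evaluate without shuttling data to a processor.

```python
# Toy sketch of hyperdimensional encoding for similarity-based clustering,
# for illustration only; DUAL's digital in-memory design is not reproduced.
import numpy as np

rng = np.random.default_rng(0)
DIM = 10_000  # hypervectors are typically thousands of dimensions wide

# A fixed random projection encodes low-dimensional inputs into bipolar
# (+1/-1) hypervectors.
projection = rng.normal(size=(DIM, 2))

def encode(x: np.ndarray) -> np.ndarray:
    return np.sign(projection @ x)

# Similarity between hypervectors is a simple dot product, the kind of
# bulk element-wise operation PIM hardware can evaluate inside memory.
a = encode(np.array([1.0, 0.5]))
b = encode(np.array([1.1, 0.4]))   # similar input  -> high similarity
c = encode(np.array([-1.0, 2.0]))  # dissimilar input -> low similarity
print(a @ b / DIM, a @ c / DIM)
```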

The scientists found DUAL efficiently speeds up many different clustering algorithms, using a wide range of large-scale datasets, and significantly improves energy efficiency compared to a state-of-the-art graphics processing unit. The researchers believe this is the first digital-based PIM architecture that can accelerate unsupervised machine learning.

"The existing approach of the state-of-the-arts in-memory computing research focuses on accelerating supervised learning algorithms through artificial neural networks, which increases chip design costs and may not guarantee sufficient learning quality," says Kim. "We showed that combining hyper-dimensional and in-memory computing can significantly improve efficiency while providing sufficient accuracy."

Credit: 
DGIST (Daegu Gyeongbuk Institute of Science and Technology)

Scientists find the error source of a sea-ice model varies with the season

image: Schematic diagram of the sea-ice simulation error sources of a regional configuration of MITgcm.

Image: 
Yue Sun

Arctic sea ice has been rapidly declining in recent decades, and changes in Arctic sea ice can have a significant impact on global weather and climate through interactions with the atmosphere and oceans. In addition, the Arctic shipping routes are a shortcut connecting the major countries of the Northern Hemisphere, and the Arctic region is rich in natural and biological resources. Simulation of Arctic sea ice could provide valuable information for Arctic shipping as well as climate studies, so it is urgent to evaluate models' ability to simulate Arctic sea ice and to diagnose the sources of their simulation errors.

To address the issue of error source identification, Prof. Fei Zheng and his team from the Institute of Atmospheric Physics at the Chinese Academy of Sciences evaluated the sea-ice simulations of an Arctic regional ocean-ice coupled configuration of the Massachusetts Institute of Technology general circulation model (MITgcm).

"We evaluated the model's performance in the Arctic cold season (March) and warm season (September), and found the model performances are different in the two months," says Zheng. "Due to the uncertainty of the model, the model's insufficient response to the signal of atmospheric forcings, and the insufficient response to the ocean boundary signal, there were disagreements between the simulations and observations in both March and September."

According to their paper, published in Advances in Atmospheric Sciences, the characteristics of these seasonally varying model error sources could be fully accounted for by means of an ensemble approach, with the goal of improving the simulation and prediction of Arctic sea ice in different seasons in future work.
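
The paper's ensemble design is not described in the release, but the basic idea can be sketched with a toy example: run several simulations whose error sources are perturbed differently, then average them, so that no single season-dependent error source dominates. Everything below (member count, error scale) is hypothetical.

```python
# Toy sketch of the ensemble idea mentioned above, for illustration only;
# not the configuration used in the paper.
import numpy as np

rng = np.random.default_rng(0)
truth = 1.0  # hypothetical observed sea-ice quantity (arbitrary units)

# Hypothetical ensemble members, each offset by a differently perturbed
# error source (model uncertainty, atmospheric forcing, ocean boundary).
members = truth + rng.normal(scale=0.3, size=20)

# The ensemble mean is typically closer to the truth than the average member.
ensemble_mean = members.mean()
print(abs(ensemble_mean - truth) < np.abs(members - truth).mean())  # usually True
```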

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

NIH study uncovers blood vessel damage & inflammation in COVID-19 patients' brains but no infection

image: In an in-depth study, NIH researchers consistently found blood vessel damage in the brains of COVID-19 patients but no signs of SARS-CoV-2 infections. Here is a high-resolution scan of a patient's brain stem. Arrows point to light and dark spots that are indicative of blood vessel damage observed in the study.

Image: 
Courtesy of NIH/NINDS.

In an in-depth study of how COVID-19 affects a patient's brain, National Institutes of Health researchers consistently spotted hallmarks of damage caused by thinning and leaky brain blood vessels in tissue samples from patients who died shortly after contracting the disease. In addition, they saw no signs of SARS-CoV-2 in the tissue samples, suggesting the damage was not caused by a direct viral attack on the brain. The results were published as a correspondence in the New England Journal of Medicine.

"We found that the brains of patients who contract infection from SARS-CoV-2 may be susceptible to microvascular blood vessel damage. Our results suggest that this may be caused by the body's inflammatory response to the virus" said Avindra Nath, M.D., clinical director at the NIH's National Institute of Neurological Disorders and Stroke (NINDS) and the senior author of the study. "We hope these results will help doctors understand the full spectrum of problems patients may suffer so that we can come up with better treatments."

Although COVID-19 is primarily a respiratory disease, patients often experience neurological problems including headaches, delirium, cognitive dysfunction, dizziness, fatigue, and loss of the sense of smell. The disease may also cause patients to suffer strokes and other neuropathologies. Several studies have shown that the disease can cause inflammation and blood vessel damage. In one of these studies, the researchers found evidence of small amounts of SARS-CoV-2 in some patients' brains. Nevertheless, scientists are still trying to understand how the disease affects the brain.

In this study, the researchers conducted an in-depth examination of brain tissue samples from 19 patients who had died after experiencing COVID-19 between March and July 2020. Samples from 16 of the patients were provided by the Office of the Chief Medical Examiner in New York City, while the other three were provided by the department of pathology at the University of Iowa College of Medicine, Iowa City. The patients died across a wide range of ages, from 5 to 73 years old, and within a few hours to two months after reporting symptoms. Many had one or more risk factors, including diabetes, obesity, and cardiovascular disease. Eight of the patients were found dead at home or in public settings. Another three collapsed and died suddenly.

Initially, the researchers used a special, high-powered magnetic resonance imaging (MRI) scanner that is 4 to 10 times more sensitive than most MRI scanners to examine samples of the olfactory bulbs and brainstems from each patient. These regions are thought to be highly susceptible to COVID-19. Olfactory bulbs control our sense of smell, while the brainstem controls our breathing and heart rate. The scans revealed that both regions had an abundance of bright spots, called hyperintensities, that often indicate inflammation, and dark spots, called hypointensities, that represent bleeding.

The researchers then used the scans as a guide to examine the spots more closely under a microscope. They found that the bright spots contained blood vessels that were thinner than normal and sometimes leaking blood proteins, like fibrinogen, into the brain. This appeared to trigger an immune reaction. The spots were surrounded by T cells from the blood and the brain's own immune cells called microglia. In contrast, the dark spots contained both clotted and leaky blood vessels but no immune response.

"We were completely surprised. Originally, we expected to see damage that is caused by a lack of oxygen. Instead, we saw multifocal areas of damage that is usually associated with strokes and neuroinflammatory diseases," said Dr. Nath.

Finally, the researchers saw no signs of infection in the brain tissue samples even though they used several methods for detecting genetic material or proteins from SARS-CoV-2.

"So far, our results suggest that the damage we saw may not have been not caused by the SARS-CoV-2 virus directly infecting the brain," said Dr. Nath. "In the future, we plan to study how COVID-19 harms the brain's blood vessels and whether that produces some of the short- and long-term symptoms we see in patients."

Credit: 
NIH/National Institute of Neurological Disorders and Stroke