Earth

In shaky times, focus on past successes if overly anxious or depressed

The more chaotic things get, the harder it is for people with clinical anxiety and/or depression to make sound decisions and to learn from their mistakes. On a positive note, overly anxious and depressed people's judgment can improve if they focus on what they get right, instead of what they get wrong, suggests a new UC Berkeley study.

The findings, published today, Dec. 22, in the journal eLife, are particularly salient in the face of a COVID-19 surge that demands tactical and agile thinking to avoid illness and even death.

UC Berkeley researchers tested the probabilistic decision-making skills of more than 300 adults, including people with major depressive disorder and generalized anxiety disorder. In probabilistic decision making, people, often without being aware of it, use the positive or negative results of their previous actions to inform their current decisions.

The researchers found that the study participants whose symptoms intersect with both anxiety and depression -- such as worrying a lot, feeling unmotivated or not feeling good about themselves or about the future -- had the most trouble adjusting to changes when performing a computerized task that simulated a volatile or rapidly changing environment.

Conversely, emotionally resilient study participants, with few, if any, symptoms of anxiety and depression, learned more quickly to adjust to changing conditions based on the actions they had previously taken to achieve the best available outcomes.

"When everything keeps changing rapidly, and you get a bad outcome from a decision you make, you might fixate on what you did wrong, which is often the case with clinically anxious or depressed people," said study senior author Sonia Bishop, a professor of neuroscience at UC Berkeley. "Conversely, emotionally resilient people tend to focus on what gave them a good outcome, and in many real-world situations that might be key to learning to make good decisions."

That doesn't mean people with clinical anxiety and depression are doomed to a life of bad decisions, Bishop said. For example, individualized treatments, such as cognitive behavior therapy, could improve both decision-making skills and confidence by focusing on past successes, instead of failures, she noted.

The study expands on Bishop's 2015 study, which found that people with high levels of anxiety made more mistakes when tasked with making decisions during computerized assignments that simulated both stable and rapidly changing environments. Conversely, non-anxious study participants quickly adjusted to the changing patterns in the task.

For this latest study, Bishop and her team looked at whether people with depression would also struggle to make sound decisions in volatile environments and whether this would hold true when challenged with different versions of the task.

"We wanted to see if this weakness was unique to people with anxiety, or if it also presented in people with depression, which often goes hand in hand with anxiety," Bishop said. "We also sought to find out if the problem was a general one or specific to learning about potential reward or potential threat."

The first experiment involved 86 men and women aged between 18 and 50. The group included people diagnosed with generalized anxiety disorder or major depressive disorder, people who showed symptoms of anxiety or depression but had no formal diagnosis of these disorders, and people with neither anxiety nor depression.

In a laboratory setting, study participants played a game on a computer screen in which they repeatedly chose between two shapes -- a circle and a square. One shape, if selected, would deliver a mild to moderate electrical shock; the other would deliver a monetary prize. The probability of a shape delivering a reward or a shock was predictable at some points in the task and volatile at others. Participants with high levels of symptoms common to depression and anxiety had trouble keeping pace with these changes.
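The kind of learning this task probes can be sketched with a simple delta-rule model. This is a generic illustration with hypothetical parameters, not the authors' computational model: an agent tracks one option's reward probability, and the schedule flips to simulate the volatile condition.

```python
import random

def delta_rule_agent(outcomes, lr_pos=0.4, lr_neg=0.4):
    """Track the estimated reward probability of one option with a
    delta rule: estimate += learning_rate * (outcome - estimate).
    Separate learning rates for good and bad outcomes let the model
    weight successes and failures differently."""
    v = 0.5  # start with no prior preference
    estimates = []
    for outcome in outcomes:
        lr = lr_pos if outcome == 1 else lr_neg
        v += lr * (outcome - v)
        estimates.append(v)
    return estimates

# Volatile schedule: the option's reward probability flips between
# 0.8 and 0.2 every 20 trials, simulating a rapidly changing environment.
random.seed(0)
p_schedule = [0.8 if (t // 20) % 2 == 0 else 0.2 for t in range(80)]
outcomes = [1 if random.random() < p else 0 for p in p_schedule]

est = delta_rule_agent(outcomes)
```

In models of this family, a learner that weights good outcomes more heavily (a higher `lr_pos`) re-latches onto the better option faster after each reversal, loosely mirroring the pattern the study associates with emotional resilience.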

In the second experiment, 147 U.S. adults with varying degrees of anxiety and depression were recruited via Amazon's Mechanical Turk crowdsourcing marketplace and given the same task remotely. This time, they chose between red and yellow squares on a screen. They still received monetary rewards, but instead of being penalized with electric shocks, they lost money.

The results echoed the in-laboratory outcomes. Overall, compared with their emotionally resilient counterparts, participants with symptoms common to both anxiety and depression struggled most to make sound decisions in the face of changing circumstances, regardless of whether outcomes took the form of rewards or losses.

"We found that people who are emotionally resilient are good at latching on to the best course of action when the world is changing fast," Bishop said. "People with anxiety and depression, on the other hand, are less able to adapt to these changes. Our results suggest they might benefit from cognitive therapies that redirect their attention to positive, rather than negative, outcomes."

Credit: 
University of California - Berkeley

Bait and switch

Perhaps that sauteed snapper you enjoyed last evening at your neighborhood restaurant was not snapper at all. Perhaps it was Pacific Ocean perch, cloaked in a wine sauce to disguise its true identity. The same goes for that grouper you paid a handsome price for at your local fishmonger's and cooked up at home. Instead, you may have been feasting on a plateful of whitefin weakfish and been none the wiser.

Seafood is the world's most highly traded food commodity, and reports of seafood mislabeling have increased over the past decade. However, evidence of the environmental effects of mislabeled seafood has been scant, as has research on the question. So, Arizona State University researcher Kailin Kroetz and her colleagues analyzed the impact of seafood mislabeling on marine population health, fishery management effectiveness, and habitats and ecosystems in the United States, the world's largest seafood importer.

The results of the study were published in the Proceedings of the National Academy of Sciences.

The study found that approximately 190,000 to 250,000 tons of mislabeled seafood are sold in the United States each year, or 3.4% to 4.3% of consumed seafood. What's more, the substituted seafood was 28% more likely to be imported from other countries, which may have weaker environmental laws than the United States.
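The tonnage and percentage figures quoted above are mutually consistent, as a quick back-of-the-envelope check shows (a sanity check on the quoted numbers, nothing more):

```python
# If 190,000-250,000 tons of mislabeled seafood represent 3.4%-4.3%
# of US consumption, the implied annual total consumption is roughly
# 5.6-5.8 million tons of seafood.
low_total = 190_000 / 0.034   # implied total at the low end (tons)
high_total = 250_000 / 0.043  # implied total at the high end (tons)
```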

"In the United States, we're actually very good at managing our fisheries," said Kroetz, assistant professor in ASU's School of Sustainability. "We assess the stock so we know what's out there. We set a catch limit. We have strong monitoring and enforcement capabilities to support fishers adhering to the limit. But many countries we import from do not have the same management capacity."

The authors used the Monterey Bay Aquarium Seafood Watch scores for wild-caught product pairs to assess marine population health and fishery management effectiveness.

"Although we would like to do a global assessment in the longer run, we focused on the U.S. first because Seafood Watch assesses about 85% of U.S. seafood consumed," Kroetz explained. "The data we were able to access in the U.S. were much more detailed than what we could access on the global scale."

The study found that substitute species came from fisheries that performed worse in terms of population impacts 86% of the time. The population impact metric accounted for fish abundance, fishing mortality, bycatch and discards -- that is, fish thrown back to sea after being caught. In addition, 78% of the time the substituted seafood fared worse than the expected products listed on the label when it came to fishery management effectiveness.

Prior studies have focused on the rates at which specific seafood is mislabeled. But it's the quantity of substituted fish consumed that is key to determining environmental impacts.

"The rates themselves don't tell us the full story about the impact of mislabeling," Kroetz said. That's because some fish that have high rates of substitution have low levels of consumption and vice versa. In fact, the majority of pairs have relatively low rates of substitution and low consumption.

Shrimp and snapper are good examples. The researchers found that giant tiger prawns are substituted for white leg shrimp more than any other seafood product -- and Americans eat more shrimp than any other type of seafood, opening the door to potentially substantial environmental impacts. Meanwhile, snapper has a higher rate of mislabeling, but Americans consume much less of it than shrimp.
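The shrimp/snapper contrast comes down to quantity substituted = consumption x mislabeling rate. A toy calculation (with hypothetical figures chosen only to mirror the contrast, not values from the study) makes the point:

```python
def substituted_volume(consumption_tonnes, mislabeling_rate):
    """Environmental exposure scales with the quantity substituted,
    i.e. consumption multiplied by the mislabeling rate."""
    return consumption_tonnes * mislabeling_rate

# Hypothetical illustrative figures, not data from the study:
shrimp = substituted_volume(700_000, 0.05)   # huge consumption, modest rate
snapper = substituted_volume(20_000, 0.30)   # high rate, low consumption
```

Even with a six-fold higher mislabeling rate, the low-consumption product contributes far less substituted volume, which is why rates alone understate or overstate impact.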

At the very minimum, mislabeling fish undermines good population management, and in turn, sustainable fisheries.

Mislabeling can shake consumers' confidence in their quest to eat only sustainable, local seafood. That's because substituted fish is more likely to be imported and to come from poorly managed fisheries, thereby creating a market for fish that shouldn't be liberally consumed.

For example, you might think you're getting this wonderful local blue crab, supporting local fisheries, and experiencing local cuisine, but in reality, you could be eating something that was imported from Indonesia. Learning about mislabeling might reduce the amount you'd pay for blue crab in the future or result in you not consuming it at all.

"The expected species is often really well managed," Kroetz said. "Consuming fish from a fishery shouldn't have a negative impact in terms of the population now or in the future if the management is good. But if you're consuming fish from poorly managed fisheries, that's not sustainable."

Credit: 
Arizona State University

Preventing nurse suicides as new study finds shift in method

In a new study, University of California San Diego School of Medicine and UC San Diego Health researchers report that the rate of firearm use by female nurses who die by suicide increased between 2014 and 2017. Published December 21, 2020 in the journal Nursing Forum, the study examined more than 2,000 nurse suicides that occurred in the United States from 2003 to 2017 and found a distinct shift, beginning in 2014, from pharmacological poisoning to firearms.

As part of the longitudinal study, researchers looked at data provided by the Centers for Disease Control and Prevention's National Violent Death Reporting System dataset.

"In past research, we determined opioids or other medications were more commonly used as the suicidal method in female nurses," said senior author Judy Davidson, DNP, RN, research scientist at UC San Diego. "From those findings, there was a possibility that there might be a change in the way nurses die by suicide over time. Now that we've looked at the data with a focus on firearms, we are finding that shift and it's resulted in an increase in female nurses sadly taking their own life through the use of firearms."

The World Health Organization reports that one person dies every 40 seconds by suicide, occurring at a rate of 10 per 100,000 persons. While overall mortality rates are decreasing in the U.S., the suicide rate is rising, and many fear the COVID-19 pandemic may accelerate this rise.
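The two WHO figures are consistent with each other, as a quick calculation shows (assuming a world population near 7.8 billion, which the article does not state):

```python
# One death every 40 seconds implies roughly 790,000 deaths per year;
# at a rate of 10 per 100,000, that corresponds to a population of
# about 7.9 billion, i.e. approximately the world population.
SECONDS_PER_YEAR = 365 * 24 * 3600
deaths_per_year = SECONDS_PER_YEAR / 40
rate_per_100k = 10
implied_population = deaths_per_year / rate_per_100k * 100_000
```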

"Unfortunately it's very common for suicide rates to increase in conjunction with world health emergencies. We've seen it happen before during such events, including the Ebola and SARS epidemics, and we're seeing it happen now with the COVID-19 pandemic," said co-author Sidney Zisook, MD, professor of psychiatry at UC San Diego School of Medicine. "The use of firearms in death by suicide is more common amongst male nurses, so it's alarming to see this increase among female nurses now as well."

According to Davidson, many of the individuals who died by suicide shared three circumstances: use of a firearm, a previous suicide attempt and known depression.

"Those three elements together represent preventable deaths of individuals experiencing very similar circumstances," said Davidson. "If the firearm had been removed from the home, research tells us these deaths may not have happened. It is vital that we inform the public about firearm safety, especially during high-risk times, such as those we're facing now."

Since 2009, UC San Diego has offered the Healer Education, Assessment and Referral Program, otherwise known as HEAR, to address the high prevalence of burnout, stress and depression specific to the health care community. HEAR provides education about risk factors and proactive screening focused on identifying, supporting and referring clinicians for untreated depression and/or suicide risk. HEAR has been acclaimed as a best practice in suicide prevention by the American Nurses Association and American Medical Association.

Co-founded by Zisook, the HEAR program was first targeted to prevent suicides in physicians and is now inclusive of all UC San Diego Health staff and faculty. The program has been replicated by other institutions throughout the country.

"As we enter the holiday season, amid a pandemic that's lasted more than 11 months now, we are concerned about members of our health care community in need of support during such a challenging time," said co-author Christine Moutier, MD, chief medical officer of the American Foundation for Suicide Prevention. "In 2020, there has been an unprecedented increase in the purchase of firearms across the country. This is sobering information. But there are evidence-based approaches to suicide prevention that our community can benefit from, potentially leading to lives saved."

According to the team, tested approaches to preventing suicide include:

1. Remove lethal means from a person who is being treated for depression.

2. If firearms are in the home, ensure they are locked and that ammunition is stored separately. If the person with suicidal ideation lives alone with a firearm in the home, make arrangements to remove it while the person is being treated for depression.

3. Increase contact. Loneliness is a risk factor for those who are depressed, so increased social presence through phone calls, or virtually, can help significantly.

"Nurses are being challenged now in ways they never have been before, and suicide risk among nurses is higher than in the general population," said Davidson. "It's important to know that people who are considering suicide are not alone and that action is being taken to protect our nursing workforce. Help is available, and we are here to get our team through this challenging time, together."

Credit: 
University of California - San Diego

Coastal ecosystems 'bright spots'

CSIRO, Australia's national science agency, has identified coastal 'bright spots' to repair marine ecosystems globally, paving the way to boost biodiversity, local economies and human wellbeing.

Dr Megan Saunders, CSIRO Oceans and Atmosphere Senior Research Scientist, said successful coastal restoration efforts could be achieved over large areas, deliver positive impacts for decades, expand restored areas by up to 10 times in size, and generate jobs.

"Coastal ecosystems across the globe including saltmarshes, mangroves, seagrasses, oyster reefs, kelp beds and coral reefs have declined by up to 85 per cent over decades," Dr Saunders said.

"Identifying bright spots that have delivered successful coastal and marine restoration in the past enables us to apply this knowledge to help save marine areas that are struggling to recover from degradation.

"Re-establishing coastal marine ecosystems at large scales can support human health and wellbeing and boost the adaptation response to climate change."

The research, published today in the journal Current Biology, outlined successful restoration examples from across the world that could be learned from and applied in similar marine environments.

"A range of techniques have resulted in significant restoration of saltmarshes, coral reefs and seagrass meadows over extended periods of time," Dr Saunders said.

"In Australia, there are some really innovative examples of marine restoration.

"For example, CSIRO is harvesting coral larvae in the Great Barrier Reef to boost large-scale coral restoration efforts.

"Simple changes to how we plant saltmarshes have also resulted in doubled survivorship and biomass."

At least 775 million people have a high dependency on coastal marine ecosystems.

Coastal ecosystems help to remove carbon dioxide from the atmosphere and protect and stabilise shorelines.

Coastal marine restoration is an important nature-based answer to the impacts of global climate change.

"Restoration of coastal marine ecosystems back to a healthy state is an important tool for responding to threats to the marine environment such as coastal development, land use change and overfishing," Dr Saunders said.

The United Nations recognised the importance of restoration and declared a Decade on Ecosystem Restoration beginning in 2021.

The UN panel for a Sustainable Ocean Economy also emphasised the need to undertake restoration and other nature-based approaches at large scales to restore and protect coastal ecosystems.

Professor Brian Silliman, co-author and CSIRO Distinguished Fulbright Chair in Science and Technology and Professor at Duke University, USA, said 'bright spots' gave us the window to understand which restoration methods worked best so we could identify where to focus research efforts and investment to protect people's livelihoods.

"Investing into coastal restoration creates jobs and can be used as a strategy to boost economic recovery and coastal marine health," Professor Silliman said.

"Restoration of marine habitats, such as kelp forests and oyster reefs, has improved commercial and recreational fishing in some countries, which boosted the local economy.

"In the USA, the propagation and dispersal of seagrass seeds resulted in seagrass meadows recovering in areas where they had been lost many decades ago, removing an estimated 170 tonnes of nitrogen and 630 tonnes of carbon from the environment per year.

"In another study, recovery of reefs impacted by blast fishing in Indonesia has been achieved by placing rocks or other hard structures underwater to help with coral colonisation, with persistent growth of coral recorded for more than 14 years."

Dr Chris Gillies, The Nature Conservancy's Oceans Program Director and a co-author on the study, said demonstrating that projects could be successful was important for securing investment into restoration.

"We are starting to see more and more investment into marine restoration in Australia," Dr Gillies said.

"For example, the Australian Government recently invested $20 million into 'Reef Builder' to restore 20 of Australia's lost shellfish reefs."

Credit: 
CSIRO Australia

Speeding toward improved hydrogen fuel production

image: An illustration of the 2D boron nitride substrate with imperfections that host tiny nickel clusters. The catalyst aids the chemical reaction that removes hydrogen from liquid chemical carriers, making it available for use as a fuel.

Image: 
Jeff Urban/Berkeley Lab

Hydrogen is a sustainable source of clean energy that avoids toxic emissions and can add value to multiple sectors of the economy, including transportation, power generation and metals manufacturing. Technologies for storing and transporting hydrogen bridge the gap between sustainable energy production and fuel use, and therefore are an essential component of a viable hydrogen economy. But traditional means of storage and transportation are expensive and susceptible to contamination. As a result, researchers are searching for alternative techniques that are reliable, low-cost and simple. More-efficient hydrogen delivery systems would benefit many applications, such as stationary power, portable power and vehicles.

Now, as reported in the journal Proceedings of the National Academy of Sciences, researchers have designed and synthesized an effective material for speeding up one of the limiting steps in extracting hydrogen from alcohols. The material, a catalyst, is made from tiny clusters of nickel metal anchored on a 2D substrate. The team, led by researchers at Lawrence Berkeley National Laboratory's (Berkeley Lab) Molecular Foundry, found that the catalyst could cleanly and efficiently accelerate the reaction that removes hydrogen atoms from a liquid chemical carrier. The material is robust and made from earth-abundant metals rather than the precious metals of existing options, and it will help make hydrogen a viable energy source for a wide range of applications.

"We present here not merely a catalyst with higher activity than other nickel catalysts that we tested, for an important renewable energy fuel, but also a broader strategy toward using affordable metals in a broad range of reactions," said Jeff Urban, the Inorganic Nanostructures Facility director at the Molecular Foundry who led the work. The research is part of the Hydrogen Materials Advanced Research Consortium (HyMARC), a consortium funded by the U.S. Department of Energy's Office of Energy Efficiency and Renewable Energy Hydrogen and Fuel Cell Technologies Office (EERE). Through this effort, five national laboratories work toward addressing the scientific gaps that block the advancement of solid hydrogen storage materials. Outputs from this work will directly feed into EERE's H2@Scale vision for affordable hydrogen production, storage, distribution and utilization across multiple sectors of the economy.

Chemical compounds that act as catalysts, like the one developed by Urban and his team, are commonly used to increase the rate of a chemical reaction without the compound itself being consumed -- they might hold a particular molecule in a stable position, or serve as an intermediary that allows an important step to be reliably completed. For the chemical reaction that produces hydrogen from liquid carriers, the most effective catalysts are made from precious metals. However, those catalysts are associated with high costs and low abundance, and are susceptible to contamination. Other, less expensive catalysts, made from more common metals, tend to be less effective and less stable, which limits their activity and their practical deployment in hydrogen production industries.

To improve the performance and stability of these earth-abundant metal-based catalysts, Urban and his colleagues modified a strategy that focuses on tiny, uniform clusters of nickel metal. Tiny clusters are important because they maximize the exposure of reactive surface in a given amount of material. But they also tend to clump together, which inhibits their reactivity.
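Why small clusters maximize exposed surface can be seen with a rough shell-count estimate. This is a geometric sketch assuming an effective nickel atomic radius of about 0.125 nm; the numbers are illustrative, not results from the paper:

```python
def surface_fraction(cluster_radius_nm, atom_radius_nm=0.125):
    """Crude estimate of the fraction of atoms at a cluster's surface:
    atoms fill the sphere, and those within one atomic diameter of the
    surface count as surface atoms."""
    n_total = (cluster_radius_nm / atom_radius_nm) ** 3
    inner = max(cluster_radius_nm - 2 * atom_radius_nm, 0.0)
    n_core = (inner / atom_radius_nm) ** 3
    return (n_total - n_core) / n_total

frac_small = surface_fraction(0.75)  # ~1.5 nm cluster: most atoms exposed
frac_large = surface_fraction(5.0)   # ~10 nm particle: far fewer exposed
```

Under these assumptions, roughly 70% of the atoms in a 1.5 nm cluster sit at the surface, versus under 15% for a 10 nm particle, which is why shrinking the clusters (while keeping them from clumping) pays off.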

Postdoctoral research assistant Zhuolei Zhang and project scientist Ji Su, both at the Molecular Foundry and co-lead authors on the paper, designed and performed an experiment that combated clumping by depositing 1.5-nanometer-diameter nickel clusters onto a 2D substrate made of boron and nitrogen and engineered to host a grid of atomic-scale dimples. The nickel clusters became evenly dispersed and securely anchored in the dimples. Not only did this design prevent clumping, but the substrate's thermal and chemical properties also greatly improved the catalyst's overall performance through direct interaction with the nickel clusters.

"The role of the underlying surface during the cluster formation and deposition stage has been found to be critical, and may provide clues to understanding its role in other processes," said Urban.

Detailed X-ray and spectroscopy measurements, combined with theoretical calculations, revealed much about the underlying surfaces and their role in catalysis. Using tools at the Advanced Light Source, a DOE user facility at Berkeley Lab, and computational modeling methods, the researchers identified changes in the physical and chemical properties of the 2D sheets as tiny nickel clusters formed and deposited on them. The team proposed that the material forms as metal clusters occupy pristine regions of the sheets and interact with nearby edges, thus preserving the tiny size of the clusters. The tiny, stable clusters facilitated the processes through which hydrogen is separated from its liquid carrier, endowing the catalyst with excellent selectivity, productivity and stable performance.

Calculations showed that the clusters' small size was the reason the catalyst's activity ranked among the best recently reported. David Prendergast, director of the Theory of Nanostructured Materials Facility at the Molecular Foundry, along with postdoctoral research assistant and co-lead author Ana Sanz-Matias, used models and computational methods to uncover the unique geometric and electronic structure of the tiny metal clusters. Bare metal atoms, abundant on these tiny clusters, more readily attracted the liquid carrier than did larger metal particles. These exposed atoms also eased the steps of the chemical reaction that strips hydrogen from the carrier, while preventing the formation of contaminants that may clog the surface of the cluster. Hence, the material remained free of contamination during key steps in the hydrogen production reaction. These catalytic and anti-contamination properties emerged from the imperfections that had been deliberately introduced to the 2D sheets, which ultimately helped keep the cluster size small.

"Contamination can render possible non-precious metal catalysts unviable. Our platform here opens a new door to engineering those systems," said Urban.

In their catalyst, the researchers achieved the goal of creating a relatively inexpensive, readily available, and stable material that helps to strip hydrogen from liquid carriers for use as a fuel. This work came out of a DOE effort to develop hydrogen storage materials to meet the targets of EERE's Hydrogen and Fuel Cell Technologies Office and to optimize the materials for future use in vehicles.

Future work by the Berkeley Lab team will further hone the strategy of modifying 2D substrates in ways that support tiny metal clusters, to develop even more efficient catalysts. The technique could help to optimize the process of extracting hydrogen from liquid chemical carriers.

The Molecular Foundry and the Advanced Light Source are DOE Office of Science user facilities at Berkeley Lab.

The research was supported by the DOE Office of Science and EERE's Hydrogen and Fuel Cell Technologies Office.

Credit: 
DOE/Lawrence Berkeley National Laboratory

The mechanics of the immune system

Highly complicated processes constantly take place in our body to keep pathogens in check: The T-cells of our immune system are busy searching for antigens - suspicious molecules that fit exactly into certain receptors of the T-cells like a key into a lock. This activates the T-cell and the defense mechanisms of the immune system are set in motion.

How this process takes place at the molecular level is not yet well understood. What is now clear, however, is that not only chemistry plays a role in the docking of antigens to the T-cell; micromechanical effects are important too. Submicrometer structures on the cell surface act like microscopic tension springs. Tiny forces that occur as a result are likely to be of great importance for the recognition of antigens. At TU Wien, it has now been possible to observe these forces directly using highly developed microscopy methods.

This was made possible by a cooperation between TU Wien, Humboldt-Universität zu Berlin, ETH Zurich and MedUni Vienna. The results have now been published in the scientific journal "Nano Letters".

Smelling and feeling

As far as physics is concerned, our human sensory organs work in completely different ways. We can smell, i.e. detect substances chemically, and we can touch, i.e. classify objects by the mechanical resistance they present to us. It is similar with T cells: they can recognize the specific structure of certain molecules, but they can also "feel" antigens in a mechanical way.

"T cells have so-called microvilli, which are tiny structures that look like little hairs," says Prof. Gerhard Schütz, head of the biophysics working group at the Institute of Applied Physics at TU Wien. As the experiments showed, remarkable effects can occur when these microvilli come into contact with an object: The microvilli can encompass the object, much as a curved finger holds a pencil. The protrusion can then even elongate, so that the finger-like structure eventually becomes a cylinder that envelops the object.

"Tiny forces occur in the process, on the order of less than a nanonewton," says Gerhard Schütz. One nanonewton corresponds roughly to the weight force that a water droplet with a diameter of one-twentieth of a millimeter would exert.
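The droplet comparison checks out numerically (a quick estimate assuming the density of water and standard gravity):

```python
import math

# Weight of a water droplet 1/20 mm (50 micrometers) in diameter:
radius_m = 25e-6                           # half of the 50 um diameter
volume_m3 = 4 / 3 * math.pi * radius_m ** 3
mass_kg = 1000 * volume_m3                 # water density ~1000 kg/m^3
weight_n = mass_kg * 9.81                  # ~6.4e-10 N, roughly a nanonewton
```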

Force measurement in the hydrogel

Measuring such tiny forces is a challenge. "We succeed by placing the cell together with tiny test beads in a specially developed gel. The beads carry molecules on their surface to which the T cell reacts," explains Gerhard Schütz. "If we know the resistance that our gel exerts on the beads and measure exactly how far the beads move in the immediate vicinity of the T-cell, we can calculate the force that acts between the T-cell and the beads."
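The measurement principle Schütz describes can be sketched as a spring model. This is a deliberate simplification with hypothetical numbers: real hydrogel rheology is more complex than a linear spring, but the force-from-displacement logic is the same.

```python
def bead_force(gel_stiffness_n_per_m, displacement_m):
    """Force on a tracer bead under a linear-spring approximation
    of the gel's resistance: F = k * x."""
    return gel_stiffness_n_per_m * displacement_m

# Hypothetical numbers: an effective stiffness of 1e-5 N/m and a
# measured bead displacement of 50 nm imply a sub-piconewton force,
# well below the sub-nanonewton scale quoted above.
f = bead_force(1e-5, 50e-9)
```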

These tiny forces and the behavior of the microvilli are likely to be important in recognizing the molecules and thus triggering an immune response. "We know that biomolecules such as proteins show different behavior when they are deformed by mechanical forces or when bonds are simply pulled," says Gerhard Schütz. "Such mechanisms are also likely to play a role in antigen recognition, and with our measurement methods this can now be studied in detail for the first time."

Credit: 
Vienna University of Technology

Brazilian researcher experiments with electron-plasma interactions

image: The traveling wave tube (TWT) used in the experiments at Aix-Marseille University in France.

Image: 
Meirielen Caetano de Sousa/USP

A paper on research conducted by Meirielen Caetano de Sousa, postdoctoral fellow at the University of São Paulo's Physics Institute (IF-USP) in Brazil, is highlighted as Editor's Pick in the September issue of Physics of Plasmas, published by the American Institute of Physics with the cooperation of The American Physical Society. The paper, entitled "Wave-particle interactions in a long traveling wave tube with upgraded helix", is signed by Sousa, Iberê Caldas, her supervisor at IF-USP, and collaborators at Aix-Marseille University in France, where Sousa completed a research internship with the support of a scholarship from FAPESP (São Paulo Research Foundation) and CAPES, the Brazilian Ministry of Education's higher research council.

The focus of Sousa's research was an experimental study of electron-plasma interactions. Because plasma is a medium with substantial background noise, analogous conditions to those of plasma were simulated in a vacuum by the use of short electromagnetic waves propagating in a traveling wave tube or TWT.

"TWTs are devices in which electromagnetic waves interact with electron beams," Sousa explained. "Industrial TWTs are between 2 cm and 30 cm long and are mainly used to amplify radio frequency signals in space communications. The TWT at Aix-Marseille University is 4 m long and specially designed for research in plasma physics with a very low level of noise. It's currently the only one of its kind in operation in the world."

The waves are produced at frequencies ranging from 10 to 100 megahertz (the intermediate region of the radio band of the electromagnetic spectrum) by a waveform generator and propagate through a helix coupled to the horizontal axis of the TWT. The helix has recently been upgraded to make it as regular as possible. Sousa took part in the upgrade, and her study would not have been possible without it. In its previous configuration, the TWT was less precise owing to small variations in the pitch of the helix and hence in the waveform generated.

"In the first part of the study we analyzed wave propagation without the electron beam," Sousa said. "We found excellent agreement between the theoretical predictions and the experimental data. This means both that the theoretical model was producing accurate predictions and that the device was working perfectly."

But the really important findings were produced in the second part, which consisted of an investigation of the interaction between the electromagnetic wave and the electron beam. "We observed the energy exchange between the wave and the electrons," Sousa said. "The electrons travel slightly faster than the wave's phase velocity and transmit this kinetic energy differential to the wave, increasing its amplitude. When the wave reaches maximum amplitude, it starts to oscillate, rising and falling, and the electrons are trapped in the wave potential, with their velocities varying around the wave's phase velocity. They transfer energy to the wave and then receive energy from the wave."
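The trapping Sousa describes is captured by the textbook single-wave model, in which an electron viewed in the frame of the wave behaves like a pendulum in the wave's potential. The sketch below is our illustration, not the authors' analysis; units are normalized so the wave's phase velocity is zero, and a slightly-too-fast electron is launched to show its velocity oscillating around the phase velocity rather than running away.

```python
import math

def simulate_electron(v0, steps=20000, dt=0.01):
    """Integrate d2x/dt2 = -sin(x), the pendulum equation for an
    electron in a single wave's potential (wave frame, normalized
    units), using a semi-implicit Euler scheme. Returns the velocity
    history of the electron."""
    x, v = 0.0, v0
    vs = []
    for _ in range(steps):
        v += -math.sin(x) * dt  # acceleration from the wave potential
        x += v * dt
        vs.append(v)
    return vs

# An electron slightly faster than the wave (v = 0.5 in the wave frame).
vs = simulate_electron(0.5)
print(min(vs), max(vs))
```

Because its energy is below the trapping threshold, the electron stays caught in the potential well: its velocity swings between roughly -0.5 and +0.5, alternately giving energy to and taking energy from the wave, as in the experiment.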

For low values of electric current, she added, the phenomenon matches the linear theory predictions, but when current values are high the electrons interact not only with the wave but also with each other. This results in non-linear effects no longer aligned with theoretical predictions.

"Studying these effects is one aim of the future experiments we're planning," Sousa said. "Another is studying non-monokinetic beams, in which the electrons travel at different velocities, and the interaction between these beams and a broad wave spectrum, meaning several waves that propagate at the same time inside the device."

The completed study and the planned experiments are in the field of basic science, investigating the accuracy of the frontier theory and phenomena not yet described adequately by the theoretical model. Possible technological applications are on the horizon, however. "A more immediate application would be upgrading of industrial TWTs. A more ambitious one would be contributing to upgrades of other devices that use the interaction between electromagnetic waves and electrically charged particles, such as particle accelerators, for example," Sousa said.

In addition to the research internship abroad scholarship awarded to Sousa, FAPESP supported the study via a direct doctorate scholarship and a postdoctoral fellowship, also awarded to Sousa; and a Thematic Project on "Non-linear dynamics", led by Caldas.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Moffitt researchers discover potential new drug target to treat cutaneous T cell lymphoma

TAMPA, Fla. - Cutaneous T cell lymphomas (CTCLs) are a group of non-Hodgkin lymphomas that develop from T cells and mainly impact the skin with painful lesions. The two main subtypes of CTCL are mycosis fungoides and Sézary syndrome. CTCLs are extremely rare, with approximately six cases per 1 million people each year. It is unclear how CTCL develops, and unfortunately there are limited treatment options and no cure.

Moffitt Cancer Center treats approximately 16% of CTCL patients nationwide. In order to improve their understanding of how CTCL develops in hopes of developing new therapies, a team of Moffitt immunologists and hematologists, including Jose Conejo-Garcia, Ph.D., Javier Pinilla-Ibarz, M.D., Ph.D., and Lubomir Sokol, M.D., Ph.D., conducted a series of studies. In an article published in The Journal of Clinical Investigation, they demonstrate that decreased expression of the protein SATB1 contributes to CTCL development and that drugs that cause SATB1 to become re-expressed may be potential treatment options for this disease.

The protein SATB1 plays an important role in cell death, proliferation and invasion, and has also been shown to be involved in the processes that control T cell differentiation. Recently, it was reported that mycosis fungoides is associated with lower levels of SATB1, suggesting that it may play an important role in the development of this disease.

To understand how SATB1 contributes to CTCL, Moffitt researchers created a mouse model that lacked SATB1 in combination with overexpression of NOTCH1, which is known to be involved in the development of CTCL. The mice developed enlarged spleens and livers, swollen lymph nodes, a high level of T cells and lived for a significantly shorter time than control mice. When the researchers examined the skin of the mice that were missing SATB1, they discovered characteristics that were similar to CTCL, suggesting that loss of SATB1 cooperates with NOTCH1 to promote CTCL development. The researchers confirmed these observations by showing that T cells from patients with Sézary syndrome have significantly lower levels of SATB1 than cells from healthy donors.

The researchers wanted to determine mechanistically how loss of SATB1 contributes to the development of CTCL in mice by assessing signaling pathways that were activated both upstream and downstream of SATB1. They performed a series of laboratory experiments showing that SATB1 regulated the downstream protein STAT5 and other receptor proteins in T cells that cause them to expand in number.

Next, the researchers focused on upstream processes to determine how SATB1 loss occurs in patients' samples. They discovered that a process called epigenetic regulation plays an important role in this loss. Specifically, they showed that DNA regulatory proteins called histones become methylated, which results in loss of SATB1 expression. They also discovered that drugs that prevent this methylation cause SATB1 to become re-expressed. These methylation inhibitors also prevented the growth of Sézary syndrome cells more effectively than the drug romidepsin, which is commonly used to treat CTCL patients. These observations suggest that drugs that target these methylation processes may be viable options to treat CTCL patients.

The researchers hope that their preclinical studies will eventually lead to clinical trials of methylation inhibitors in CTCL patients. "Our results offer new insight into the pathophysiology of CTCL, as well as a mechanistic rationale for targeting histone methyltransferases to abrogate malignant expansion and skin homing in advanced CTCL patients," said Carly Harro, study first author and student in Moffitt's Cancer Biology Ph.D. Program.

Credit: 
H. Lee Moffitt Cancer Center & Research Institute

California lockdown suppressed excess pandemic deaths

Nearly 20,000 more Californians died in the first six months of the pandemic than would have been expected to die in a normal year, with a disproportionate number of those deaths occurring among older adults, Black or Latino residents, or those who had not completed high school, according to an analysis by researchers at UC San Francisco.

Researchers used excess deaths -- the number of deaths above what would be predicted in a given year without a mass casualty event -- as a rough indicator of the pandemic's overall death toll. But the exact number is hard to discern, and the excess death total may include deaths from causes other than COVID-19.
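The excess-death measure described above is simple arithmetic: observed deaths minus the baseline expected from normal years. A minimal sketch with hypothetical weekly counts (the figures below are illustrative, not the study's data):

```python
# Hypothetical weekly death counts for illustration only.
observed = [1300, 1450, 1600, 1750]
# Baseline expectation, e.g. an average of the same weeks in prior years.
expected = [1200, 1210, 1190, 1205]

# Excess deaths per week: observed minus expected.
excess = [o - e for o, e in zip(observed, expected)]
total_excess = sum(excess)
print(excess, total_excess)
```

As the article notes, this difference is only a rough indicator: it captures the pandemic's overall mortality impact but cannot by itself separate COVID-19 deaths from other causes.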

Despite the pandemic's high death toll, which continues to climb even as the state endures a second period of sheltering in place, the researchers concluded that the first lockdown from March 19 through May 9 lowered the number of excess deaths for most but not all groups.

"The early shutdown worked for California," said Kirsten Bibbins-Domingo, MD, PhD, professor and chair of the Department of Epidemiology and Biostatistics at UCSF and the senior author of the paper. "Mortality rates that were rising early in the pandemic dropped substantially in a timeframe that coincides with the shutdown. But, importantly, not all Californians seemed to benefit."

Specifically, Latinos--who make up nearly 40 percent of California residents--and adults without a high school degree did not experience a decline in excess deaths with the shutdown. These groups are disproportionately represented among low-wage essential workers, who continued stocking grocery store shelves, making deliveries and performing other work deemed necessary to keep society functioning. These workers were also at heightened risk of infection because of crowded living conditions that made it hard to protect against the coronavirus.

Excess deaths in these groups continued rising through the lockdown and rose even higher once it was lifted. Among Californians with no more than a high school degree, there were 500 excess deaths per week early in the spring. That figure rose to 1,000 per week by mid-August.

"These numbers are much higher than in other educational groups," said Yea-Hung Chen, PhD, an epidemiologist at UCSF and first author of the study, published Monday, Dec. 21, 2020, in JAMA Internal Medicine. "The differences are even more dramatic when we account for population size."

On a per capita basis, excess deaths were highest among California's Black population, although they declined toward the end of the lockdown period. But excess deaths never declined for Latinos, or those without a high school degree.

Moreover, once lockdown ended, per capita excess deaths went up for everyone, regardless of their racial or ethnic group, or their level of education.

"This suggests shutdowns are an important tool during the pandemic, but they must be accompanied by attention and resources to high-risk communities," Bibbins-Domingo said.

She and her team have posted a follow-up study on the medRxiv preprint server highlighting the specific sub-groups of Latinos with the highest death rates based on occupation and country of origin. "We hope this data will help to guide the pandemic response in a way that is responsive to the needs of these communities."

Credit: 
University of California - San Francisco

Under Antarctica's ice, Weddell seals produce ultrasonic vocalizations

image: University of Oregon evolutionary biologist Paul Cziko looks over the underwater camera during a dive at the National Science Foundation-funded McMurdo Oceanographic Observatory. The observatory, completed in 2017, is located 21 meters below the sea ice cover at McMurdo Sound, Antarctica, 850 miles from the South Pole.

Image: 
Photo by Henry Kaiser

EUGENE, Ore. -- Dec. 21, 2020 -- Weddell seals are chirping, whistling and trilling under Antarctica's ice at sound frequencies that are inaudible to humans, according to a research team led by University of Oregon biologists.

Two years of recordings at a live-streaming underwater observatory in McMurdo Sound have captured nine types of tonal ultrasonic seal vocalizations that reach 50 kilohertz. Humans hear in the sonic range of 20 to 20,000 hertz, or 20 kilohertz.

The discovery is detailed in a paper published online Dec. 18, ahead of print, in the Journal of the Acoustical Society of America.

Weddell seals (Leptonychotes weddellii), the world's southernmost-ranging mammal, thrive under the continent's sea ice, using their large teeth to create air holes. They can dive to 600 meters in search of prey and remain submerged for 80 minutes. Researchers had first identified 34 seal call types at sonic frequencies in 1982, tying the sounds to social interactions.

The study's lead author Paul Cziko, a visiting research professor in the UO's Institute of Ecology and Evolution, began recording the seals' sonic-ranged vocalizations in 2017 after completing the installation of the McMurdo Oceanographic Observatory. Workers at McMurdo Station, he said, often fell asleep listening to broadcasts of the seals' sonic sounds coming from below.

"The Weddell seals' calls create an almost unbelievable, otherworldly soundscape under the ice," Cziko said. "It really sounds like you're in the middle of a space battle in 'Star Wars,' laser beams and all."

Over the next two years, the observatory's broadband digital hydrophone - more sensitive than equipment used in earlier recordings - picked up the higher-frequency vocalizations during passive monitoring of the seals.

"We kept coming across these ultrasonic call types in the data," said co-author Lisa Munger, a marine biologist who studies marine mammal acoustics and a career instructor in the UO's Clark Honors College. "Finally, it dawned on us that the seals were actually using them quite regularly."

The nine new call types were composed of single or multiple vocal elements having ultrasonic fundamental frequencies. Eleven elements, including chirps, whistles and trills, were above 20 kHz. Two exceeded 30 kHz and six were always above 21 kHz. One whistle reached 44.2 kHz and descending chirps in another call type began at about 49.8 kHz. Harmonics, or the overtones, of some vocalizations exceeded 200 kHz.

"It was really surprising that other researchers previously had, in effect, missed a part of the conversation," said Cziko, who earned a doctorate in evolutionary biology from the UO in 2014.

What the ultrasonic vocalizations mean in the Weddell seals' repertoire is unknown. The seals are among 33 species of fin-footed mammals grouped as pinnipeds. Until now, pinnipeds, which also include sea lions and walruses, were believed to vocalize only at sonic levels.

It could be, Cziko said, that the seals produce the sounds simply to "stand out over all the lower-frequency noise, like changing to a different channel for communicating."

Or, the researchers noted, the ultrasonic vocalizations may be used for echolocation, a biological sonar that dolphins, toothed whales and bats use to navigate in limited visibility to avoid obstacles and locate friends or prey.

"The possibility of seals using some kind of echolocation has really been discounted over the years," Cziko said. "We actually had a lot of somewhat heated discussions in our group about whether or how the seals use these ultrasonic sounds for echolocation-like behaviors."

It is not known how Weddell seals navigate and find prey during the months of near absolute darkness in the Antarctic winter. The study provides no evidence for echolocation.

"We'd like to know who is producing the ultrasonic calls -- males, females, juveniles, or all of the above," Munger said. "And how are the seals using these sounds when they're out in deeper water, looking for fish? We need to record in more places to be able to correlate sounds with behaviors."

Credit: 
University of Oregon

Climate change: threshold for dangerous warming will likely be crossed between 2027 and 2042

The threshold for dangerous global warming will likely be crossed between 2027 and 2042 - a much narrower window than the Intergovernmental Panel on Climate Change's estimate of between now and 2052. In a study published in Climate Dynamics, researchers from McGill University introduce a new and more precise way to project the Earth's temperature. Based on historical data, it considerably reduces uncertainties compared to previous approaches.

Scientists have been making projections of future global warming using climate models for decades. These models play an important role in understanding the Earth's climate and how it will likely change. But how accurate are they?

Dealing with uncertainty

Climate models are mathematical simulations of different factors that interact to affect Earth's climate, such as the atmosphere, ocean, ice, land surface and the sun. While they are based on the best understanding of the Earth's systems available, when it comes to forecasting the future, uncertainties remain.

"Climate skeptics have argued that global warming projections are unreliable because they depend on faulty supercomputer models. While these criticisms are unwarranted, they underscore the need for independent and different approaches to predicting future warming," says co-author Bruno Tremblay, a professor in the Department of Atmospheric and Oceanic Sciences at McGill University.

Until now, wide ranges in overall temperature projections have made it difficult to pinpoint outcomes under different mitigation scenarios. For instance, if atmospheric CO2 concentrations are doubled, the General Circulation Models (GCMs) used by the Intergovernmental Panel on Climate Change (IPCC) predict a very likely global average temperature increase of between 1.9 and 4.5C - a vast range covering moderate climate change at the lower end and catastrophic change at the upper.

A new approach

"Our new approach to projecting the Earth's temperature is based on historical climate data, rather than the theoretical relationships that are imperfectly captured by the GCMs. Our approach allows climate sensitivity and its uncertainty to be estimated from direct observations with few assumptions," says co-author Raphael Hebert, a former graduate researcher at McGill University, now working at the Alfred-Wegener-Institut in Potsdam, Germany.

In a study for Climate Dynamics, the researchers introduced the new Scaling Climate Response Function (SCRF) model to project the Earth's temperature to 2100. Grounded in historical data, it reduces prediction uncertainties by about half, compared to the approach currently used by the IPCC. In analyzing the results, the researchers found that the threshold for dangerous warming (+1.5C) will likely be crossed between 2027 and 2042. This is a much narrower window than the GCMs' estimate of between now and 2052. On average, the researchers also found that expected warming was a little lower, by about 10 to 15 percent. They also found, however, that the "very likely warming ranges" of the SCRF were within those of the GCMs, giving the latter support.
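To see how a threshold-crossing window can be read off historical data, here is a deliberately simple stand-in: an ordinary least-squares trend fitted to made-up temperature anomalies and extrapolated to +1.5C. This is not the SCRF model, only a toy illustration of projecting forward from observations rather than from simulated physics.

```python
# Hypothetical global temperature anomalies (C above pre-industrial),
# rising linearly for illustration only.
years = list(range(2000, 2021))
anoms = [1.0 + 0.02 * (y - 2000) for y in years]

# Ordinary least-squares fit of anomaly vs. year.
n = len(years)
mean_y = sum(years) / n
mean_a = sum(anoms) / n
slope = (sum((y - mean_y) * (a - mean_a) for y, a in zip(years, anoms))
         / sum((y - mean_y) ** 2 for y in years))
intercept = mean_a - slope * mean_y

# Extrapolate the fitted trend to the +1.5 C threshold.
threshold = 1.5
crossing_year = (threshold - intercept) / slope
print(round(crossing_year))
```

A real projection would of course carry uncertainty bands around both the trend and the crossing year; narrowing those bands is precisely what the study's approach is about.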

"Now that governments have finally decided to act on climate change, we must avoid situations where leaders can claim that even the weakest policies can avert dangerous consequences," says co-author Shaun Lovejoy, a professor in the Physics Department at McGill University. "With our new climate model and its next generation improvements, there's less wiggle room."

Credit: 
McGill University

Scientists and philosopher team up, propose a new way to categorize minerals

Washington, DC-- A diamond lasts forever, but that doesn't mean all diamonds have a common history.

Some diamonds formed billions of years ago in space as the carbon-rich atmospheres of dying stars expanded and cooled. In our own planet's lifetime, high temperatures and pressures in the mantle produced the diamonds that are familiar to us as gems. And about 5,000 years ago, a large meteorite striking carbon-rich sediment on Earth produced impact diamonds.

Each of these diamonds differs from the others in both composition and genesis, but all are categorized as "diamond" by the authoritative guide to minerals--the International Mineralogical Association's Commission on New Minerals, Nomenclature and Classification.

For many physical scientists, this inconsistency poses no problem. But the IMA system leaves unanswered questions for planetary scientists, geobiologists, paleontologists and others who strive to understand minerals' historical context.

So, Carnegie's Robert Hazen and Shaunna Morrison teamed up with CU Boulder philosophy of science professor Carol Cleland to propose that scientists address this shortcoming with a new "evolutionary system" of mineral classification--one that includes historical data and reflects changes in the diversity and distribution of minerals through more than 4 billion years of Earth's history.

Their work is published by the Proceedings of the National Academy of Sciences.

"We came together from the very different fields of philosophy and planetary science to see if there was a rigorous way to bring the dimension of time into discussions about the solid materials that compose Earth," Hazen said.

The IMA classification system for minerals dates to the 19th century when geologist James Dwight Dana outlined a way to categorize minerals on the basis of unique combinations of idealized compositions of major elements and geometrically idealized crystal structure.

"For example, the IMA defines quartz as pure silicon dioxide, but the existence of this idealized version is completely fictional," said Morrison. "Every specimen of quartz contains imperfections--traces of its formation process that make it unique."

This approach to the categorization system means minerals with distinctly different historical origins are lumped together--as with the example of diamonds--while other minerals that share a common causal history are split apart.

"The IMA system is typical," said lead author Cleland, explaining that most classification systems in the natural sciences, such as the periodic table of the elements, are time independent, categorizing material things "solely on the basis of manifest similarities and differences, regardless of how they were produced or what modifications they have undergone."

For many researchers, a time-independent system is completely appropriate. But this approach doesn't work well for planetary and other historically oriented geosciences, where the emphasis is on understanding the formation and development of planetary bodies.

Differences in a diamond or quartz crystal's formative history are critical, Cleland said, because the conditions under which a sample was formed and the modifications it has undergone "are far more informative than the mere fact that a crystal qualifies as diamond or quartz."

She, Hazen, and Morrison argue that what planetary scientists need is a new system of categorizing minerals that includes historical "natural kinds."

Biology faced an analogous issue before Darwin put forward his theory of evolution. For example, lacking an understanding of how organisms are historically related through evolutionary processes, 17th century scholars debated whether bats are birds. With the advent of Darwin's work in the 19th century, however, biologists classified them separately on evolutionary grounds: bats and birds do not share a common winged ancestor.

Because a universal theory of "mineral evolution" does not exist, creating such a classification system for the geosciences is challenging. Hazen, Morrison, and Cleland's proposed solution is what they call a "bootstrap" approach based on historically revelatory, information-rich chemical, physical, and biological attributes of solid materials. This strategy allows scientists to build a historical system of mineral kinds while remaining agnostic about its underlying theoretical principles.

"Minerals are the most durable, information-rich objects we can study to understand our planet's origin and evolution," Hazen said. "Our new evolutionary approach to classifying minerals complements the existing protocols and offers the opportunity to rigorously document Earth's history."

Morrison concurred, adding: "Rethinking the way we classify minerals offers the opportunity to address big, outstanding scientific mysteries about our planet and our Solar System, through a mineralogical lens. In their imperfections and deviations from the ideal, minerals capture the story of what has happened to them through deep time--they provide a time machine to go back and understand what was happening on our planet and other planets in our solar system millions or billions of years ago."

Credit: 
Carnegie Institution for Science

Volcanic eruptions directly triggered ocean acidification during Early Cretaceous

image: Calcium carbonate samples from a sediment core drilled from the mid-Pacific Mountains show evidence of ocean acidification 127 to 100 million years ago.

Image: 
Northwestern University

EVANSTON, Ill. -- Around 120 million years ago, the earth experienced an extreme environmental disruption that choked oxygen from its oceans.

Known as oceanic anoxic event (OAE) 1a, the oxygen-deprived water led to a minor -- but significant -- mass extinction that affected the entire globe. During this age in the Early Cretaceous Period, an entire family of sea-dwelling nannoplankton virtually disappeared.

By measuring calcium and strontium isotope abundances in nannoplankton fossils, Northwestern earth scientists have concluded the eruption of the Ontong Java Plateau large igneous province (LIP) directly triggered OAE1a. Roughly the size of Alaska, the Ontong Java LIP erupted for seven million years, making it one of the largest known LIP events ever. During this time, it spewed tons of carbon dioxide (CO2) into the atmosphere, pushing Earth into a greenhouse period that acidified seawater and suffocated the oceans.

"We go back in time to study greenhouse periods because Earth is headed toward another greenhouse period now," said Jiuyuan Wang, a Northwestern Ph.D. student and first author of the study. "The only way to look into the future is to understand the past."

The study was published online last week (Dec. 16) in the journal Geology. It is the first study to apply stable strontium isotope measurements to the study of ancient ocean anoxic events.

Andrew Jacobson, Bradley Sageman and Matthew Hurtgen -- all professors of earth and planetary sciences at Northwestern's Weinberg College of Arts and Sciences -- coauthored the paper. Wang is co-advised by all three professors.

Clues inside cores

Nannoplankton and many other marine organisms build their shells out of calcium carbonate, which is the same mineral found in chalk, limestone and some antacid tablets. When atmospheric CO2 dissolves in seawater, it forms a weak acid that can inhibit calcium carbonate formation and may even dissolve preexisting carbonate.

To study the earth's climate during the Early Cretaceous, the Northwestern researchers examined a 1,600-meter-long sediment core taken from the mid-Pacific Mountains. The carbonates in the core formed in a shallow-water, tropical environment approximately 127 to 100 million years ago and are presently found in the deep ocean.

"When you consider the Earth's carbon cycle, carbonate is one of the biggest reservoirs for carbon," Sageman said. "When the ocean acidifies, it basically melts the carbonate. We can see this process impacting the biomineralization process of organisms that use carbonate to build their shells and skeletons right now, and it is a consequence of the observed increase in atmospheric CO2 due to human activities."

Strontium as corroborating evidence

Several previous studies have analyzed the calcium isotope composition of marine carbonate from the geologic past. The data can be interpreted in a variety of ways, however, and calcium carbonate can change throughout time, obscuring signals acquired during its formation. In this study, the Northwestern researchers also analyzed stable isotopes of strontium -- a trace element found in carbonate fossils -- to gain a fuller picture.

"Calcium isotope data can be interpreted in a variety of ways," Jacobson said. "Our study exploits observations that calcium and strontium isotopes behave similarly during calcium carbonate formation, but not during alteration that occurs upon burial. In this study, the calcium-strontium isotope 'multi-proxy' provides strong evidence that the signals are 'primary' and relate to the chemistry of seawater during OAE1a."

"Stable strontium isotopes are less likely to undergo physical or chemical alteration over time," Wang added. "Calcium isotopes, on the other hand, can be easily altered under certain conditions."

The team analyzed calcium and strontium isotopes using high-precision techniques in Jacobson's clean laboratory at Northwestern. The methods involve dissolving carbonate samples and separating the elements, followed by analysis with a thermal ionization mass spectrometer.

Researchers have long suspected that LIP eruptions cause ocean acidification. "There is a direct link between ocean acidification and atmospheric CO2 levels," Jacobson said. "Our study provides key evidence linking eruption of the Ontong Java Plateau LIP to ocean acidification. This is something people expected should be the case based on clues from the fossil record, but geochemical data were lacking."

Modeling future warming

By understanding how oceans responded to extreme warming and increased atmospheric CO2, researchers can better understand how earth is responding to current, human-caused climate change. Humans are currently pushing the earth into a new climate, which is acidifying the oceans and likely causing another mass extinction.

"The difference between past greenhouse periods and current human-caused warming is in the timescale," Sageman said. "Past events have unfolded over tens of thousands to millions of years. We're making the same level of warming (or more) happen in less than 200 years."

"The best way we can understand the future is through computer modeling," Jacobson added. "We need climate data from the past to help shape more accurate models of the future."

Credit: 
Northwestern University

Invasive in the U.S., lifesaver Down Under

image: Sean Doody, assistant professor and graduate director of integrative biology at the USF St. Petersburg campus, studies the nesting biology of the monitor lizard.

Image: 
University of South Florida

Ten years of research led by the University of South Florida has revealed that a monitor lizard should be regarded as an "ecosystem engineer," a rarity for reptiles. Tortoises and sea turtles are the only reptiles considered to be ecosystem engineers, a term used to describe organisms that have a great impact on their environment based on their ability to create, modify, maintain or destroy a habitat. Sean Doody, assistant professor and graduate director of integrative biology at the USF St. Petersburg campus, discovered that while a related species is considered invasive in the United States, in Australia, small animal communities rely on the monitor lizards' burrow system, called a warren, using it as a habitat and as a place to forage for food and nest.

In his study published in Ecology, Doody and his Australian collaborators investigated the nesting biology of the yellow-spotted monitor lizard, which can measure nearly five feet, and its smaller sister species, the Gould's monitor lizard. The team had recently discovered that the lizards are unique in that they lay their eggs as deep as 13 feet, easily the deepest vertebrate nests on earth. They loosen the soil, creating warm, moist conditions, which are ideal for laying eggs and trapping viable seeds and fruits. But now, the researchers have discovered that the burrows hosted a wide range of animals, including reptiles, frogs, insects and even marsupial mammals. The team found 747 individual animals representing 28 species of vertebrates.

The timing of the research revealed clues as to why certain species utilized the warrens. For example, throughout the winter dry season, the researchers found hibernating frogs using the burrows to maintain their body moisture. During one excavation, Doody and his team discovered 418 individual frogs in a single warren.

"The finding is significant as it shows that nesting warrens provide critical shelter and other resources for the small animal community," Doody said. "The invasive cane toad is decimating the monitor lizards in some areas, meaning that these nesting warrens, which are re-used year after year, will disappear. This can impact the relative number of predators and prey, which can have unexpected consequences for the ecosystem, such as an overabundance of one species at the cost of another, which in other systems has threatened species with local extinction."

The arrival of the toxic cane toad emphasized the extent of the monitor lizard's impact on the food web. In studies conducted between 2009 and 2017, Doody's research team uncovered abandoned burrows and an increase in the lizard's prey, including smaller lizards, snakes, turtles and birds. Australian researchers and natural resource managers have been unable to successfully control cane toads.

Doody is now expanding his research to include the perentie, another large monitor lizard that likely nests at great depths in the Australian desert, to see if it too should be deemed an ecological engineer. His team is also looking at how climate warming will impact the facilitation of these animal communities.

Credit: 
University of South Florida

Potential preventative treatment demonstrated for Crohn's disease

BIRMINGHAM, Ala. - A potential preventive treatment for Crohn's disease, a form of inflammatory bowel disease, has been demonstrated in a mouse model and with immune-reactive T cells from patients with Crohn's disease.

This research, led by University of Alabama at Birmingham researcher Charles O. Elson, M.D., professor of medicine, focused on a subset of T cells known as T memory, or Tm cells. The UAB researchers used a triple-punch treatment to remove Tm cells and increase the number of T regulatory, or Treg, cells. Both of these results were able to prevent colitis in a T cell transfer mouse model, and they had similar inhibitory effects on immune-reactive CD4-positive T cells isolated from Crohn's disease patient blood samples.

These results, Elson says, support a potential immunotherapy to prevent or ameliorate inflammatory bowel disease.

Some background is needed to understand how and why the triple-punch treatment, which was reported in the journal Science Immunology, works.

Inflammatory bowel diseases result from an over-activation of the immune response against gut microbes in genetically susceptible hosts. One specific microbial antigen causing this over-reaction by short-lived T effector cells is flagellin, the protein subunit of bacterial flagella, the long tail-like structures that twirl like a propeller to make some bacteria motile.

One group of immuno-dominant flagellins are those from the Lachnospiraceae family, including CBir1; more than half of Crohn's disease patients have elevated serological reactivity to CBir1 and related flagellins.

Unlike the short-lived T effector cells that act like soldiers to help fight infections, T memory cells serve as sentinels that remember a previous encounter with flagellins. They are long-lived and quiescent, with a low level of metabolism. If reactivated by a fresh encounter with flagellin antigens, they undergo a profound metabolic transition and quickly expand into large numbers of pathogenic T effector cells.

This metabolic switch is controlled by a signaling protein, mTOR, located in the Tm cell.

Thus, activation of mTOR is necessary for T cell expansion, making it an inescapable metabolic checkpoint to create activated Tm cells. It is also the checkpoint for T naïve cells that are encountering flagellin for the first time.

So, Elson and colleagues hypothesized that activating CD4-positive Tm or T naïve cells with flagellin antigens, while simultaneously shutting down the metabolic checkpoint through mTOR inhibition, would result either in cell death or in anergy, an absence of the normal immune response to an antigen. These effects comprise two parts of the triple-punch treatment, with the third being induction of Treg cells.

The activation was prompted by a synthetic peptide that had multiple repeats of one CBir1 epitope. Such a peptide can selectively stimulate memory cells without activating an innate immune response.

To shut down the metabolic checkpoint, the UAB researchers used two existing drugs, rapamycin and metformin. Rapamycin directly inhibits mTOR, and metformin adds to that inhibition by activating a kinase called AMPK that negatively regulates mTOR activity.

Elson calls this treatment cell activation with concomitant metabolic checkpoint inhibition, or CAMCI.

Parenteral application of CAMCI in mice successfully targeted microbiota flagellin-specific CD4-positive T cells, leading to significant antigen-specific CD4-positive T cell death, impaired development and impaired reactivation of CD4-positive memory responses, and substantial induction of a CD4-positive Treg cell response. It prevented colitis in the mouse model and had similar inhibitory effects on microbiota-flagellin-specific CD4-positive T cells isolated from patients with Crohn's disease.

For a potential future treatment of patients with Crohn's disease, only targeting a single flagellin is unlikely to have much effect, Elson says. "Instead, we anticipate the future use of a synthetic multi-epitope peptide containing multiple CD4-positive T cell flagellin epitopes to target many microbiota-flagellin-reactive CD4-positive Tm cells," Elson said. "Depending on the serologic or CD4-positive T cell response to certain microbiota antigens, this CAMCI approach could be tailored to individuals with different combinations of epitopes as a personalized immunotherapy."

Elson says he envisions this CAMCI approach as an intermittent pulse therapy to maintain remission in patients with Crohn's disease. "And with autoantigen epitopes better studied in the future," he said, "this approach could be expanded to treat other inflammatory or autoimmune diseases such as Type 1 diabetes or multiple sclerosis."

In developed countries, three of every 1,000 people have inflammatory bowel disease. Its major forms, Crohn's disease and ulcerative colitis, have substantial morbidity and large medical care costs, and no current therapy alters the natural history of these diseases.

Credit: 
University of Alabama at Birmingham