Earth

Self-eating decisions

The idea of the cell as a city is a common introduction to biology, conjuring depictions of the cell's organelles as power plants, factories, roads, libraries, warehouses and more. Like a city, these structures require a great deal of resources to build and operate, and when resources are scarce, internal components must be recycled to provide essential building blocks, particularly amino acids, to sustain vital functions.

But how do cells decide what to recycle when they are starving? One prevailing hypothesis suggests that starving cells prefer to recycle ribosomes--cellular protein-production factories rich in important amino acids and nucleotides--through autophagy, a process that degrades proteins in bulk.

However, new research by scientists at Harvard Medical School suggests otherwise. In a study published in Nature in July, they systematically surveyed the entire protein landscape of normal and nutrient-deprived cells to identify which proteins and organelles are degraded by autophagy.

The analyses revealed that, contrary to expectations, ribosomes are not preferentially recycled through autophagy; instead, a small number of other organelles, particularly parts of the endoplasmic reticulum, are degraded this way.

The results shed light on how cells respond to nutrient deprivation and on autophagy and protein degradation processes, which are increasingly popular targets for drug development in cancers and other disease conditions, the authors said.

"When cells are starving, they don't haphazardly degrade ribosomes en masse through autophagy. Instead, they appear to have mechanisms to control what they recycle," said senior study author Wade Harper, the Bert and Natalie Vallee Professor of Molecular Pathology and chair of cell biology in the Blavatnik Institute at HMS.

"Our findings now allow us to rethink previous assumptions and better understand how cells deal with limited nutrients, a fundamental question in biology," Harper said.

Protein turnover is a constant and universal occurrence inside every cell. To recycle unneeded or misfolded proteins, remove damaged organelles, and carry out other internal housekeeping tasks, cells utilize two primary tools, autophagy and the ubiquitin-proteasome system.

Autophagy, derived from Greek words meaning "self-eating," allows cells to degrade proteins in bulk, as well as larger cellular structures, by engulfing them in bubble-like structures and transporting them to the cell's waste disposal organelle, called the lysosome.

In contrast, the proteasome pathway allows cells to break down individual proteins by tagging them with a marker known as ubiquitin. Ubiquitin-modified proteins are then recognized by the proteasome and degraded.

Surprising discrepancy

Previous studies in yeast have suggested that nutrient-starved cells use autophagy to specifically recycle ribosomes, which are abundant and a reservoir of key amino acids and nucleotides. However, cells have many other mechanisms to regulate ribosome levels, and how they do so when nutrients are low has not been fully understood.

Using a combination of quantitative proteomics and genetic tools, Harper and colleagues investigated protein composition and turnover in cells that were deprived of key nutrients. To probe the role of autophagy, they also focused on cells with genetically or chemically inhibited autophagy systems.

One of the first analyses they carried out revealed that, in starving cells, total ribosomal protein levels decrease only slightly relative to other protein levels. This reduction appeared to be independent of autophagy. Cells that lacked the capacity for autophagy had no obvious defects when nutrient deprived.

"This was a very surprising finding that was at odds with existing hypotheses, and it really led us to consider that something was missing in how we think about autophagy and its role in ribosome degradation," Harper said. "This simple result hides a huge amount of biology that we tried to uncover."

Searching for an explanation for this discrepancy, the team, spearheaded by study co-first authors Heeseon An and Alban Ordureau, research fellows in cell biology at HMS, systematically analyzed the production of new ribosomes and the fate of existing ones in starving cells.

They did so through a variety of complementary techniques, including Ribo-Halo, which allowed them to label different ribosomal components with fluorescent tags. They could apply these tags at different time points and measure how many new ribosomes were being synthesized at the level of a single cell, as well as how many old ribosomes remained after a set amount of time.
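The logic of such a pulse-chase readout can be illustrated with a toy calculation (this is not the authors' code, and all numbers are hypothetical): if existing ribosomes are lost by simple first-order turnover, the fraction of "old"-tagged ribosomes remaining after a chase interval falls exponentially, so measuring that fraction at a few time points recovers the apparent turnover rate.

```python
import math

def remaining_fraction(t_hours, half_life_hours):
    """Fraction of 'old'-tagged ribosomes surviving after a chase of
    t_hours, assuming simple first-order (exponential) turnover."""
    k = math.log(2) / half_life_hours  # first-order decay constant
    return math.exp(-k * t_hours)

# Hypothetical example: with an assumed 100-hour half-life, report the
# surviving 'old' signal at several chase times.
for t in (0, 24, 48):
    print(f"t={t} h: {remaining_fraction(t, 100):.2f} of old ribosomes remain")
```

Comparing such decay curves with and without functional autophagy is one way to ask whether autophagy contributes to ribosome turnover.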

When cells were deprived of nutrients, the experiments showed, the primary factors driving lower overall ribosome levels were a reduction in new ribosome synthesis and turnover through non-autophagy-dependent pathways. However, both cell volume and the rate of cell division decreased as well, which allowed cells to maintain their cellular density of ribosomes.

Global picture

Next, the team examined the patterns of degradation for more than 8,300 proteins throughout the cell during nutrient deprivation. They confirmed that the pattern of ribosome turnover appeared to be independent of autophagy and, instead, matched proteins that are known to be degraded via the ubiquitin-proteasome system.

"With our quantitative proteomics toolbox, we could look simultaneously in an unbiased manner at how thousands of proteins are made and turnover in the cell under different conditions with or without autophagy," Ordureau said. "This allowed us to gain a global picture that wasn't based on inferences drawn from analyses of a limited number of proteins."

The analyses showed that a small number of organelles and proteins were degraded by autophagy in higher amounts than ribosomes, particularly the endoplasmic reticulum, which the Harper lab has previously shown is selectively remodeled by autophagy during nutrient stress.

These proteome-wide data may reveal other organelles and proteins that are selectively degraded in response to nutrient stress, the authors said, and the team is pursuing further analyses.

Together, the findings shed light on how starving cells respond to nutrient stress and, in particular, clarify previous assumptions regarding ribosome turnover. Critically, the authors said, the results demonstrate that proteasome-dependent turnover of ribosomes likely contributes to a much greater extent than autophagy during nutrient stress.

This is an important step toward a better, unbiased understanding of autophagy, a widely studied process that is the target of numerous drug discovery efforts.

"Controlling autophagy is being explored in a wide range of contexts such as killing tumor cells by starving them of key nutrients or allowing neurons to remove harmful protein aggregates," An said. "But our understanding of autophagy is incomplete, and many aspects are still unclear."

Only relatively recently have scientists found that starvation-induced autophagy can be selective, she added, and questions such as what organelles are targeted and why, whether autophagy affects only damaged organelles or random ones, and many others remain mostly unanswered.

"We are using the context of starvation to better understand how cells use autophagy, and under what circumstances, to understand this important process better," An said.

Credit: 
Harvard Medical School

Pioneering method reveals dynamic structure in HIV

image: An artist's rendition of Gag proteins, which form a hexagonal lattice structure, diffusing across virus-like particles.

Image: 
Dave Meikle/Saffarian Lab

Viruses are scary. They invade our cells like invisible armies, and each type brings its own strategy of attack. While viruses devastate communities of humans and animals, scientists scramble to fight back. Many utilize electron microscopy, a tool that can "see" what individual molecules in the virus are doing. Yet even the most sophisticated technology requires that the sample be frozen and immobilized to get the highest resolution.

Now, physicists from the University of Utah have pioneered a way of imaging virus-like particles in real time, at room temperature, with impressive resolution. In a new study, the method reveals that the lattice, which forms the major structural component of the human immunodeficiency virus (HIV), is dynamic. The discovery of a diffusing lattice made from Gag and GagPol proteins, long considered to be completely static, opens up potential new therapies.

When HIV particles bud from an infected cell, the viruses experience a lag time before they become infectious. Protease, an enzyme that is embedded as a half-molecule in GagPol proteins, must bond to other similar molecules in a process called dimerization. This triggers the viral maturation that leads to infectious particles. No one knows how these half protease molecules find each other and dimerize, but it may have to do with the rearrangement of the lattice formed by Gag and GagPol proteins that lie just inside the viral envelope. Gag is the major structural protein and has been shown to be sufficient to assemble virus-like particles. Gag molecules form a hexagonal lattice structure that intertwines with itself, with minuscule gaps interspersed. The new method showed that the Gag protein lattice is not a static one.

"This method is one step ahead by using microscopy that traditionally only gives static information. In addition to new microscopy methods, we used a mathematical model and biochemical experiments to verify the lattice dynamics," said lead author Ipsita Saha, graduate research assistant at the U's Department of Physics & Astronomy. "Apart from the virus, a major implication of the method is that you can see how molecules move around in a cell. You can study any biomedical structure with this."

The paper was published in Biophysical Journal on June 26, 2020.

Mapping a nanomachine

The scientists weren't looking for dynamic structures at first--they just wanted to study the Gag protein lattice. Saha led the two-year effort to "hack" microscopy techniques to be able to study virus particles at room temperature and observe their behavior in real life. The scale of the virus is minuscule -- about 120 nanometers in diameter -- so Saha used interferometric photoactivated localization microscopy (iPALM).

First, Saha tagged the Gag with a fluorescent protein called Dendra2 and produced virus-like particles of the resulting Gag-Dendra2 proteins. These virus-like particles are the same as HIV particles, but made only of the Gag-Dendra2 protein lattice structure. Saha showed that the Gag-Dendra2 proteins assembled into virus-like particles the same way as virus-like particles made up of regular Gag proteins. The fluorescent attachment allowed iPALM to image the particle with 10-nanometer resolution. The scientists found that each immobilized virus-like particle incorporated 1,400 to 2,400 Gag-Dendra2 proteins arranged in a hexagonal lattice. When they used the iPALM data to reconstruct a time-lapse image of the lattice, it appeared that the lattice of Gag-Dendra2 was not static over time. To make sure, they independently verified this in two ways: mathematically and biochemically.

First, they divided up the protein lattice into uniform separate segments. Using a correlation analysis, they tested how each segment correlated with itself over time, from 10 to 100 seconds. If each segment continued to correlate with itself, the proteins were stationary. If they lost correlation, the proteins had diffused. They found that over time, the proteins were quite dynamic.
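The segment-by-segment correlation test can be sketched as follows. This is an illustrative reimplementation on synthetic data, not the authors' analysis code: a stationary lattice keeps a correlation near 1 between early and late frames, while a lattice whose contents diffuse loses that correlation.

```python
import numpy as np

def lag_correlation(frames, lag):
    """Pearson correlation between the segment intensities at time 0
    and at time `lag` (frames: list of 1-D arrays, one per time point)."""
    return float(np.corrcoef(frames[0], frames[lag])[0, 1])

rng = np.random.default_rng(0)
segments = rng.random(50)                  # intensities of 50 lattice segments

static = [segments.copy() for _ in range(11)]   # stationary lattice: no change

diffusing, cur = [], segments.copy()
for _ in range(11):                        # proteins gradually randomize segments
    diffusing.append(cur.copy())
    cur = cur + rng.normal(0, 0.5, 50)

print(lag_correlation(static, 10))         # stays ~1.0: segments unchanged
print(lag_correlation(diffusing, 10))      # well below 1: contents have moved
```

A correlation that decays with increasing lag, as in the second case, is the mathematical signature of a dynamic lattice.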

The second way they verified the dynamic lattice was biochemically. For this experiment, they created virus-like particles whose lattice consisted of 80% Gag wild-type proteins, 10% Gag tagged with SNAP, and 10% Gag tagged with Halo. SNAP and Halo are proteins that can each bind a linker that joins them together irreversibly. The idea was to identify whether the molecules in the protein lattice stayed stationary or migrated positions.

"The Gag-proteins assemble themselves randomly. The SNAP and Halo molecules could be anywhere within the lattice--some may be close to one another, and some will be far away," Saha said. "If the lattice changes, there's a chance that the molecules come close to one another."

Saha introduced a molecule called Haxs8 into the virus-like particles. Haxs8 is a dimerizer--a molecule that covalently binds SNAP and Halo proteins when they are within binding radius of one another. If SNAP and Halo molecules move next to each other, they'll produce a dimerized complex. She tracked the concentration of these dimerized complexes over time: because the bond is covalent, any increase would indicate that new pairs of molecules had found each other, and thus that movement had taken place. They found that, over time, the percentage of dimerized complexes increased; Halo- and SNAP-tagged Gag proteins were moving all over the lattice and coming together.

A new tool to study viruses

This is the first study to show that the protein lattice structure of an enveloped virus is dynamic. This new tool will be important to better understand the changes that occur within the lattice as new virus particles go from immaturity to dangerously infectious.

"What are the molecular mechanisms that lead to infection? It opens up a new line of study," said Saha. "If you can figure out that process, maybe you can do something to prevent them from finding each other, like a type of drug that would stop the virus in its tracks."

Credit: 
University of Utah

Industry-made pits are beneficial for beavers and wolverines, study shows

image: Beavers, like the one pictured here, are making their homes on sites of industry activity. Image credit: A. Colton.

Image: 
A. Colton.

Beavers and wolverines in Northern Alberta are using industry-created borrow pits as homes and feeding grounds, according to a new study by University of Alberta ecologists.

The research examined the relationship between local wildlife and borrow pits, which are industry-created sites where material such as soil, gravel, or sand has been dug up for road construction. The results show that when revegetated, the sites provide homes for beavers, which in turn support the survival of wolverines.

"The borrow pits enhance habitats for a number of species of wildlife in the bogs of Northern Alberta," said Mark Boyce, co-author on the paper, professor in the Department of Biological Sciences, and Alberta Conservation Association Chair in Fisheries and Wildlife.

"The deep water and adjacent forage create excellent habitats for beavers. And wolverines thrive when beavers do. Not only do they prey on beavers, wolverines also have been shown to use beaver lodges as dens where they have their cubs."

The displacement of wildlife by industrial development is a complex issue, Boyce explains. "In this case, industrial development created the borrow pits that are now used by beavers, which actually enhances habitats for our wilderness icon, the wolverine."

The research was led by PhD student Matthew Scrafford, who formed a partnership with the Dene Tha First Nation that proved instrumental for the study.

"The most important partner on this research was the Dene Tha First Nations," said Boyce. "Several young people in the area were enthusiastic about the project. They were instrumental in building traps and supporting our research."

Credit: 
University of Alberta

Dehydration increases amphibian vulnerability to climate change

Amphibians have few options to avoid the underappreciated one-two punch of climate change, according to a new study from Simon Fraser University researchers and others.

Rising summer temperatures are also resulting in higher rates of dehydration among wet-skinned amphibians as they attempt to keep themselves cool.

Researchers from SFU and the University of California, Santa Cruz predict that by the 2080s, habitats previously thought to be safe for amphibians will be either too hot or too dehydrating for them to inhabit. Even the edges of wetlands may be too hot for up to 74 percent of the summer, while sunny, dry spots may be too dehydrating for up to 95 percent of it.

The study was published yesterday in the journal Global Change Biology.

Researchers studied the environmental conditions in shaded and damp nooks and crannies at the edges of wetlands in the mountains of the Pacific Northwest. They sought to predict how suitable those environments will be for amphibians in the future.

These findings are significant because most previous research on the effects of climate change on amphibians has focused solely on temperature, ignoring an equally important physiological process for amphibians--evaporative water loss.

By incorporating rates of water loss, the researchers found that previous studies may have dramatically underestimated the already dire predictions of climate change impacts on amphibians.

Instead of subjecting live amphibians to invasive measurements, the researchers estimated water loss rates and internal body temperatures using model frogs made of agar (seaweed extract) that closely mimic the water loss and temperatures of live amphibians.

These model frogs were placed in four habitats that encompass the behavior of many different amphibian species--shaded locations on land and in shallow wetlands, and sun-exposed locations on land and in shallow wetlands.

The data was related to key environmental conditions, including air temperature, precipitation, and relative humidity, and then linked to forecasts of future climate change.

The study also found that amphibians face a difficult trade-off: animals in cool shaded places on dry land face harmful rates of dehydration, and those in shallow water face harmful high temperatures.

"Such trade-offs will only get more challenging with future climate change, with no single habitat being safe at all times," says Gavia Lertzman-Lepofsky, the study's lead author.

This also means that to remain within their environmental limits, frogs and salamanders will have to move between habitats much more often, using up energy for movement rather than for finding food.

Unfortunately, the larger landscape surrounding amphibians is also changing. As individuals become more dependent on finding damp and shaded spots to escape the heat, there will also be less water available in the landscape as small ponds and wetlands dry up over the long, dry summers.

This puts increasing pressure on populations and provides a sobering view of how amphibians will survive in a hotter, drier world.

Credit: 
Simon Fraser University

Antarctica more widely impacted than previously thought

Antarctica is considered one of the Earth's largest, most pristine remaining wildernesses. Yet since its formal discovery 200 years ago, the continent has seen accelerating and potentially impactful human activity.

How widespread this activity is across the continent has never been quantified. We know Antarctica has no cities, agriculture or industry. But we have never had a good idea of where humans have been, how much of the continent remains untouched or largely unimpacted, and to what extent these largely unimpacted areas serve to protect biodiversity.

A team of researchers led by Monash University has changed all of that. Reporting today in the journal Nature, using a data set of 2.7 million human activity records, they show just how extensive human use of Antarctica has been over the last 200 years.

With the exception of some large areas mostly in the central parts of the continent, humans have set foot almost everywhere.

Although many of these visited areas have only been negligibly affected by people, biodiversity is not as well represented within them as it should be.

Only 16% of the continent's Important Bird Areas, areas identified internationally as critical for bird conservation, are located within negligibly impacted areas. And little of the total negligibly impacted area is represented in Antarctica's Specially Protected Area network.

High human impact areas, for example some areas where people build research stations or visit for tourism, often overlap with areas important for biodiversity.

Lead author, Rachel Leihy, a PhD student in the Monash School of Biological Sciences, points out that "While the situation does not look promising initially, the outcomes show that much opportunity exists to take swift action to declare new protected areas for the conservation of both wilderness and biodiversity."

Steven Chown, the corresponding author based at Monash University, adds "Informatics approaches using large data sets are providing new quantitative insights into questions that have long proven thorny for environmental policymakers."

"This work offers innovative ways to help the Antarctic Treaty Parties take forward measures to secure Antarctica's Wilderness."

Credit: 
Monash University

When should you neuter your dog to avoid health risks?

Some dog breeds have higher risk of developing certain cancers and joint disorders if neutered or spayed within their first year of life. Until now, studies had only assessed that risk in a few breeds. A new, 10-year study by researchers at the University of California, Davis, examined 35 dog breeds and found vulnerability from neutering varies greatly depending on the breed. The study was published in the journal Frontiers in Veterinary Science.

"There is a huge disparity among different breeds," said lead author Benjamin Hart, distinguished professor emeritus at the UC Davis School of Veterinary Medicine. Hart said there is no "one size fits all" when it comes to health risks and the age at which a dog is neutered. "Some breeds developed problems, others didn't. Some may have developed joint disorders but not cancer or the other way around."

Researchers analyzed 15 years of data from thousands of dogs examined each year at the UC Davis Veterinary Medical Teaching Hospital to try to understand whether neutering, the age of neutering, or differences in sex when neutered affect certain cancers and joint disorders across breeds. The joint disorders examined include hip dysplasia, cranial cruciate ligament tears and elbow dysplasia. Cancers examined include lymphoma; hemangiosarcoma, or cancer of the blood vessel walls; mast cell tumors; and osteosarcoma, or bone cancer.

In most breeds examined, the risk of developing problems was not affected by age of neutering.

BREED DIFFERENCES BY SIZE AND SEX

Researchers found that vulnerability to joint disorders was related to body size.

"The smaller breeds don't have these problems, while a majority of the larger breeds tend to have joint disorders," said co-author Lynette Hart, professor at the UC Davis School of Veterinary Medicine.

One of the surprising exceptions to this was among the two giant breeds -- Great Danes and Irish wolfhounds -- which showed no increased risk of joint disorders when neutered at any age.

Researchers also found the occurrence of cancers in smaller dogs was low, whether neutered or kept intact. However, in two breeds of smaller dogs, the Boston terrier and the shih tzu, there was a significant increase in cancers with neutering.

Another important finding was that the sex of the dog sometimes made a difference in health risks when neutered. Female Boston terriers neutered at the standard six months of age, for example, had no increased risk of joint disorders or cancers compared with intact dogs, but male Boston terriers neutered before a year of age had significantly increased risks.

Previous studies have found that neutering or spaying female golden retrievers at any age increases the risk of one or more of the cancers from 5 percent to up to 15 percent.

DISCUSS CHOICES WITH VETERINARIANS

Dog owners in the United States are overwhelmingly choosing to neuter their dogs, in large part to prevent pet overpopulation and euthanasia or to reduce shelter intake. In the U.S., surgical neutering is usually carried out by six months of age.

This study suggests that dog owners should carefully consider when and if they should have their dog neutered.

"We think it's the decision of the pet owner, in consultation with their veterinarian, not society's expectations that should dictate when to neuter," said Benjamin Hart. "This is a paradigm shift for the most commonly performed operation in veterinary practice."

The study lays out guidelines for pet owners and veterinarians for each of the 35 breeds to assist in making a neutering decision.

Credit: 
University of California - Davis

Reprogramming of immune cells enhances effects of radiotherapy in preclinical models of brain cancer

image: Ludwig Lausanne Member Johanna Joyce

Image: 
Ludwig Cancer Research

JULY 15, 2020, NEW YORK-- A Ludwig Cancer Research study has dissected how radiotherapy alters the behavior of immune cells known as macrophages found in glioblastoma (GBM) tumors and shown how these cells might be reprogrammed with an existing drug to suppress the invariable recurrence of the aggressive brain cancer.

Led by Ludwig Lausanne Member Johanna Joyce and published in the current issue of Science Translational Medicine, the study details how radiotherapy dynamically alters gene expression programs in two subtypes of tumor-associated macrophages (TAMs) and describes how those changes push TAMs into a state in which they aid therapeutic resistance and growth. Joyce and her colleagues, led by first author Leila Akkari, now at the Netherlands Cancer Institute, also demonstrate that combining radiotherapy with daily dosing of a drug that targets macrophages--an inhibitor of the colony stimulating factor-1 receptor (CSF-1R)--reverses that transformation and dramatically extends survival in mouse models of GBM.

"What these preclinical data tell us is that for patients receiving radiotherapy for glioblastoma, adding CSF-1R inhibition to the treatment regimen could have the effect of prolonging survival," says Joyce.

GBM patients typically survive little more than a year following diagnosis, as the cancer inevitably recurs and typically resists multiple therapies. But it was not known whether TAMs--which are linked to cancer cell survival and drug resistance in a variety of tumor types--promote GBM resistance to ionizing radiation, which is part of the standard of care for the aggressive tumor.

Two types of macrophages populate glioma tumors. One is the brain's resident macrophage, or microglia (MG). The other is the monocyte-derived macrophage (MDM) that patrols the body, gobbling up pathogens and dead cells, or their detritus, and initiating additional immune responses. Macrophages can, however, be pushed into an alternative state--often termed the M2-like activation phenotype--in which they aid tissue healing rather than respond to threats. Many cancers coax macrophages into this alternative phenotype, which supports tumor survival and growth.

Joyce and her team found that both MG and MDMs flood into GBM tumors in mice to clean up the cellular detritus following an initial course of radiotherapy. But when the gliomas recur, interestingly, it is MDMs that predominate in the TAM populations. The gene expression profiles of these MDMs in irradiated tumors, however, more closely resemble those of MG. They found, moreover, that both MDMs and MG in irradiated gliomas are alternatively activated into a wound-healing phenotype and secrete factors that bolster DNA repair in cancer cells.

"Not only were these macrophage populations changing but, more importantly, they were now able to interfere with the efficacy of radiotherapy because they could help cancer cells repair the DNA damage it causes," explains Joyce.

"So you have this yin/yang situation. The irradiation is of course destroying many of the cancer cells, but it has also caused all these macrophages to rush into the tumor to clean up the mess and, as a consequence, they've been super-activated to create a permissive niche for the remaining cancer cells to form new tumors."

To see if depleting MDMs specifically might reverse that effect, the researchers treated different GBM mouse models with an antibody that blocks the entry of MDMs into the brain. But that only nominally improved survival in one of the models.

The Joyce lab has previously reported that TAMs can be pushed out of the wound-healing phenotype by CSF-1R inhibitors, so they next tested whether that strategy might bolster the efficacy of radiotherapy.

They found that a single, 12-day cycle of CSF-1R inhibitor treatment following radiotherapy enhanced the initial therapeutic response and extended the median survival of mice by about three weeks beyond the modest increase seen with radiotherapy alone. By contrast, a continuous, daily regimen of CSF-1R inhibition for several months following radiotherapy yielded the most striking results, reprogramming TAMs and dramatically extending median survival.

"We had approximately 95% of mice survive the full course of this six-month study," says Joyce. In addition, mice engrafted with patient-derived tumors showed increased survival.

Joyce and colleagues are further exploring the mechanism by which TAMs promote DNA repair and otherwise assist cancer cell survival in GBM.

Credit: 
Ludwig Institute for Cancer Research

Polycatenanes in mesoscale

video: CG animation of self-assembled polycatenanes

Image: 
Shiki Yagai

An international research group led by Chiba University Professor Shiki Yagai has for the first time succeeded in forming self-assembled "polycatenanes," structures composed of mechanically interlocked small-molecule rings. The research group also succeeded in observing the geometric structure of the polycatenanes using atomic force microscopy (AFM). This work, published in the journal Nature, is the first to achieve synthesis of nano-polycatenanes through molecular self-assembly without using additional molecular templates. Yagai, a professor of applied chemistry and biotechnology at Chiba University, sees this as the first vital step in technological innovation for creating nanometer-sized topological structures.

Catenane synthesis has been widely researched, especially since Jean-Pierre Sauvage devised a metal-templated strategy to synthesize a catenane. In recognition of their pioneering work, Sauvage and two other researchers were awarded the Nobel Prize in Chemistry for the design and synthesis of molecular machines in 2016. As the molecules in catenanes are linked together into a chain, the links can move relative to one another. This makes synthesis and characterization of the structure very difficult, especially when the rings are not held together by strong covalent bonds.

By modifying the self-assembly protocol with a templated strategy, the research group from Japan, Italy, Switzerland, and the UK was able to create polycatenanes, including structures of five interlocked rings in a linear arrangement reminiscent of the Olympic rings, that were large enough to be observed by atomic force microscopy. While searching for methods to purify the nano-rings, the group found that adding the rings to a hot monomer solution facilitates the formation of new assemblies on the ring surfaces, a process known as secondary nucleation. Based on this finding, the group identified optimal conditions for secondary nucleation and successfully created a poly[22]catenane made up of as many as 22 connected rings. Atomic force microscopy confirmed that this structure reached up to 500 nm in length.

"The innovative finding of this research lies in the utilization of the self-assembly characteristic of the molecules," says Professor Yagai. "We were able to create intricate geometric structures in mesoscale without using complex synthetic methods. This paves the way to creating even more complex geometric compounds such as "rotaxane" and "trefoil knot" in a similar scale. As the molecular assemblies used in this research are made up of molecules which react to light and electricity, this finding can potentially be applied to organic electronics and photonics, and other molecular machines."

Credit: 
Chiba University

Simple twist of DNA determines fate of placenta

The development of the mammalian placenta depends upon an unusual twist that separates DNA's classic double helix into a single-stranded form, Yale researchers report July 15 in the journal Nature.

The Yale team also identified the molecular regulator that acts upon this single strand to accelerate or stop placental development, a discovery with implications not only for diseases of pregnancy but also for understanding how cancer tumors proliferate.

"Placental tissue grows very fast, stimulates blood vessel formation, and invades into neighboring tissues, like a tumor," said senior author Andrew Xiao, associate professor of genetics and a researcher with the Yale Stem Cell Center. "Unlike a tumor, however, the placenta grows through a precise, coordinated, and well-controlled manner."

At the earliest stage of fetal development, two linked processes begin simultaneously. As the fertilized egg begins forming the specialized cells of the new life, another set of cells begins producing blood vessels in the placenta to nourish the growing fetus.

"In many ways, pregnancy is like a prolonged state of inflammation, as the placenta constantly invades the uterine tissue," Xiao said.

The DNA of the cells that will make up the growing placenta shares an unusual trait -- the double helix begins to twist. The resulting torsion causes certain sections of the genome to break into a single strand. Although the primary sequences of the DNA are the same between the placenta and embryo, the different structure of the DNA between the two helps determine the fate of the cells.

The Yale team led by Xiao discovered that placental growth is then regulated by the sixth base of DNA, N6-methyladenine. This base stabilizes the single-stranded regions of DNA and repels SATB1, a protein critical for the organization of chromatin, the material that makes up chromosomes.

Placentas without N6-methyladenine grow uncontrollably while placentas with abnormally high levels of N6-methyladenine develop severe defects that eventually halt embryo development, the researchers found.

The findings could help researchers develop new therapies for conditions such as preeclampsia in pregnancy as well as certain types of cancer characterized by activity from single strands of DNA, the researchers said.

Credit: 
Yale University

Data analytics can predict global warming trends, heat waves

image: Early warning signals, seen as an increasing autocorrelation coefficient and standard deviation, prior to the early 20th century global warming (left) and the 2010 mega heat wave in Russia (right).

Image: 
Chenghao Wang, Stanford University

New research from Arizona State and Stanford universities is augmenting meteorological studies that predict global warming trends and heat waves, adding human-originated factors into the equation.

The approach quantifies the changing statistics of temperature records before the early 20th century global warming and before recent heat wave events, which serve as early warning signals of potentially catastrophic changes. In addition, the study illustrates the contrast between urban and rural early warning signals for extreme heat waves.

Tracking the pre-event signatures of tipping points behind the increasing frequency and intensity of heat extremes will support the development of countermeasures to restore the resilience of the climate system.

"Many studies have identified such changes in climate systems, like the sudden end of a glacial period," said Chenghao Wang, a former ASU research scientist now at the Department of Earth System Science at Stanford University. "These qualitative changes are usually preceded by early-warning signals several thousand years in advance."

"We detected similar signals in events much shorter than previous studies," said Chenghao Wang. "We found early-warning signals also exist before global warming and heat waves on the time scale of years and days."
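The autocorrelation and standard-deviation indicators mentioned in the image caption above are the standard diagnostics of "critical slowing down" before a tipping point, and can be sketched in a few lines. This is a generic illustration, not the study's code; the window size and the synthetic temperature-like series are arbitrary assumptions:

```python
import numpy as np

def rolling_ews(series, window):
    """Rolling lag-1 autocorrelation and standard deviation -- the two
    classic early-warning indicators of critical slowing down."""
    ac, sd = [], []
    for i in range(len(series) - window + 1):
        w = series[i:i + window] - series[i:i + window].mean()
        denom = np.sum(w * w)
        ac.append(np.sum(w[:-1] * w[1:]) / denom if denom > 0 else 0.0)
        sd.append(w.std())
    return np.array(ac), np.array(sd)

# Synthetic demo: an AR(1) process whose "memory" slowly increases,
# mimicking a system drifting toward a tipping point.
rng = np.random.default_rng(0)
n = 1000
x = np.zeros(n)
for t in range(1, n):
    phi = 0.2 + 0.7 * t / n              # lag-1 memory ramps from 0.2 to 0.9
    x[t] = phi * x[t - 1] + rng.normal()

ac, sd = rolling_ews(x, window=200)
# Both indicators rise as the series approaches the transition.
```

Applied to real temperature records (after detrending), a sustained rise in both indicators is what the authors treat as an early-warning signal.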

In addition to global historical temperature data, the team tracks current temperature variances from airport weather stations. If temperatures are abnormally hot compared with the 30-year record for at least three consecutive days, the event is considered a heat wave.
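That working definition -- at least three consecutive days abnormally hot relative to a 30-year baseline -- is straightforward to implement. The threshold and temperatures below are hypothetical values for illustration, not the study's data:

```python
import numpy as np

def heat_wave_days(temps, threshold, min_run=3):
    """Flag days belonging to a heat wave: at least `min_run` consecutive
    days with temperature above a climatological threshold (e.g. one
    derived from a 30-year record)."""
    hot = np.asarray(temps) > threshold
    flags = np.zeros(len(hot), dtype=bool)
    run_start = None
    for i, h in enumerate(hot):
        if h and run_start is None:
            run_start = i                      # a hot run begins
        elif not h and run_start is not None:
            if i - run_start >= min_run:       # run long enough to count
                flags[run_start:i] = True
            run_start = None
    if run_start is not None and len(hot) - run_start >= min_run:
        flags[run_start:] = True               # run extends to the end
    return flags

temps = [28, 31, 35, 36, 37, 29, 34, 35, 30]   # hypothetical daily highs (C)
flags = heat_wave_days(temps, threshold=33)
# Only the 3-day run (35, 36, 37) qualifies; the 2-day run (34, 35) does not.
```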

"This method isn't just applicable for predicting extreme weather events in the next few days or weeks," said Zhihua Wang, an ASU environmental and water research engineering associate professor. "It observes human-induced variabilities and will support prediction over the next decades or even century." Zhihua Wang also serves as co-director of climate systems research at ASU's National Center of Excellence on Smart Materials for Urban Climate and Energy.

The emergence of early-warning signals before heat waves provides new insights into the underlying mechanisms (e.g., possible feedback via land-atmosphere interactions). In particular, given the increasing frequency and intensity of heat extremes, the results will facilitate the design of countermeasures to reverse the tipping and restore the resilience of climate systems.

According to Zhihua Wang, this method creates a "completely new frontier" for evaluating how things like global energy consumption and, conversely, the introduction of urban green infrastructure, are affecting climate change. "We're not replacing existing evaluation tools," he said. "The data is already there. It's enabling us to gauge what actions are having an impact."

Based on the study results, researchers surmise that urban greening, or the use of public landscaping and forestry projects, along with adequate irrigation, may promote reverse tipping.

In addition to Chenghao Wang and Zhihua Wang, the team included rising high school junior Linda Sun from Horace Greeley High School in Chappaqua, NY.

Credit: 
Arizona State University

Identifying sources of deadly air pollution in the United States

MINNEAPOLIS/SAINT PAUL (07/15/20) -- A new study from University of Minnesota researchers provides an unprecedented look at the causes of poor air quality in the United States and its effects on human health.

The research, to be published Wednesday in the journal Environmental Science and Technology Letters, finds that air pollution from sources in the United States leads to 100,000 deaths in the U.S. each year. About half of these deaths are from burning fossil fuels, but researchers also identified less obvious sources of lethal pollution.

"People usually think of power plants and cars, but nowadays, livestock and wood stoves are as big of a problem. It's also our farms and our homes," said Sumil Thakrar, postdoctoral research associate in the Departments of Bioproducts and Biosystems Engineering and Applied Economics.

The researchers found that while some sectors of the economy, such as electricity production and transportation, have reduced pollution in response to government regulations, others have received less attention, including agriculture and residential buildings.

Researchers examined U.S. Environmental Protection Agency data on all pollution sources in the United States, including their location and how much pollution they emit. They then used newly-developed computer models to determine where pollution travels and how it affects human health.

Researchers focused on one particularly harmful pollutant: fine particulate matter, also known as PM2.5, which is associated with heart attacks, strokes, lung cancer and other diseases. In examining the data, they discovered that about half of all PM2.5 air pollution-related deaths are from burning fossil fuels, with the remaining largely from animal agriculture, dust from construction and roads, and burning wood for heating and cooking.

"Essentially we're asking, 'what's killing people and how do we stop it?'" Thakrar said. "The first step in reducing deaths is learning the impact of each and every emission source."

In the U.S., air quality is largely regulated by the federal government, which sets maximum allowable levels of pollution in different areas. States and local governments are then charged with enforcing those limits. The authors suggest regulators can improve this broad-brush approach by focusing instead on reducing emissions from specific sources.

"Targeting particularly damaging pollution sources is a more efficient, and likely more effective, way of regulating air quality," said Jason Hill, professor in the Department of Bioproducts and Biosystems Engineering within the University's College of Food, Agricultural and Natural Resource Sciences and College of Science and Engineering. "Think of springing a leak in your boat while out fishing. Why fret too much about how much water is coming in when what you really should be doing is plugging the hole?"

The researchers also report a surprising finding about the sources of PM2.5 responsible for harming human health. Most people are familiar with PM2.5 as soot -- such as the exhaust from a dirty bus -- or road dust. But PM2.5 also forms from other pollutants like ammonia.

Ammonia is released from animal manure and the fertilization of crops. However, unlike many other sources of PM2.5, ammonia is not regulated to any large extent, despite being responsible for about 20,000 deaths, or one-fifth of all deaths caused by PM2.5 pollution from human activity.
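The figures quoted above tally up in a quick back-of-the-envelope check (these are the article's approximate numbers, not the study's raw data):

```python
# Bookkeeping of the reported annual U.S. death tolls from domestic PM2.5.
total_pm25_deaths = 100_000                    # all deaths from U.S. sources
fossil_fuel_deaths = total_pm25_deaths // 2    # "about half" from fossil fuels
ammonia_deaths = 20_000                        # attributed to ammonia emissions

ammonia_share = ammonia_deaths / total_pm25_deaths
print(f"Ammonia accounts for {ammonia_share:.0%} of PM2.5 deaths")  # → 20%
```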

To improve air quality in the future, the authors suggest more drastic reductions of emissions from sources that are already regulated, such as electricity generation and passenger vehicles. They also suggest novel ways to target pollutant sources that have not been as extensively regulated, such as manure management, changing personal diets and improving formulations of cleaning supplies, paints and inks.

This research -- the underlying data and results of which are available to the public -- can complement current efforts to mitigate climate change and other environmental problems.

"Our work provides key insights into the sources of damage caused by air pollution and suggests ways to reduce impacts," said Thakrar. "We hope policymakers and the public will use this to improve the lives of Americans."

Credit: 
University of Minnesota

Children exposed to Deepwater Horizon oil spill suffered physical, mental health effects

image: A shoreline clean-up crew removes oil from a beach in Louisiana.

Image: 
Photo: U.S. Coast Guard/Patrick Kelley

On April 20, 2010, an explosion on the Deepwater Horizon oil rig triggered what would become the largest marine oil spill in history. Before the well was finally capped 87 days later on July 15, an estimated 4 million barrels of oil had gushed into the Gulf of Mexico, harming ecosystems, contaminating shorelines, and strangling the fishing and tourism industries.

A study recently published in Environmental Hazards has found that the disaster was also harmful to the mental and physical health of children in the area. Led by Jaishree Beedasy from the National Center for Disaster Preparedness (NCDP) at Columbia University's Earth Institute, the study found that Gulf Coast children who were exposed to the oil spill -- either directly, through physical contact with oil, or indirectly through economic losses -- had a significantly higher likelihood of experiencing physical and mental health problems compared to kids who were not exposed. When interviewed in 2014, three out of five parents reported that their child had experienced physical health symptoms and nearly one third reported that their child had mental health issues after the oil spill. The researchers hope their findings can inform future disaster recovery plans.

The findings also show that "the impacts of the oil spill on children's health appear to persist years after the disaster," said Beedasy.

Although natural disasters don't discriminate, they do disproportionately harm vulnerable populations, such as people of color and people with lower incomes. Children are another vulnerable group, because their coping and cognitive capacities are still developing, and because they depend on caregivers for their medical, social, and educational needs. A growing body of evidence demonstrates that disasters are associated with severe and long-lasting health impacts for children. However, very few studies have evaluated the impacts of oil spills on children.

Oil spills have the potential to affect children in many ways. The child might come into direct contact with the oil by touching it, inhaling it, or ingesting it. Direct exposure to oil, dispersants, and burned oil can cause itchy eyes, trouble breathing, headache, dizziness, rashes, and blisters, among other issues. Children can also suffer from secondary impacts if a parent loses their job, if their daily routines are disrupted, or if others in the family feel distressed or suffer health problems.

To find out how the oil spill might be affecting children in the area, in 2014, the researchers interviewed 720 parents and caregivers who lived in Louisiana communities highly impacted by the oil spill. They collected information such as whether the child or parent had been in contact with oil, whether the household was economically impacted, and the health status of the child and parent.

In the interviews, 60 percent of the parents reported that their child had experienced physical health problems -- defined as respiratory symptoms, vision problems, skin problems, headaches, and unusual bleeding -- at some time after Deepwater Horizon. Thirty percent of the parents said their child had experienced mental health issues such as feeling depressed or very sad, feeling nervous or afraid, having sleeping problems, or having problems getting along with other children.

The survey found that physical health problems were 4.5 times more common in children who had been directly exposed to oil, and in children whose parents had been exposed to oil smell. Children with indirect exposure to oil through their parents were also much more likely to have physical health issues. And those living in households that reported loss of income or jobs as a result of the oil spill were nearly three times more likely to have physical health problems compared with kids whose families hadn't suffered those losses. In households where the parent was white, held at least a college degree, or the household income was more than $70,000 a year, the parent was less likely to report physical health issues for the child.

The study found similar links in regard to children's mental health. Kids who had been directly exposed to oil were 4.5 times more likely to have mental health issues. These effects were also three times more common in children whose parents had been exposed to oil smell, or whose parents had lost incomes or jobs as a result of the spill.
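Figures like "4.5 times more likely" in studies of this kind are typically derived as odds ratios from a 2×2 exposure/outcome table. The counts below are hypothetical, chosen only to illustrate the arithmetic, and are not the study's data:

```python
def odds_ratio(exposed_yes, exposed_no, unexposed_yes, unexposed_no):
    """Odds of the outcome among the exposed, divided by the odds
    among the unexposed."""
    return (exposed_yes / exposed_no) / (unexposed_yes / unexposed_no)

# Hypothetical counts: children with / without health issues, by oil exposure.
print(round(odds_ratio(90, 40, 60, 120), 2))  # → 4.5
```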

The researchers acknowledge that the results of the study could have been affected by certain limitations such as parents not having proper recall of some of the effects in their children. However, the results strongly indicate that children exposed to the Deepwater Horizon oil spill were more likely to suffer from adverse physical and mental health effects. The findings also emphasize the importance of considering secondary impacts such as job loss and family tensions during disaster recovery.

To help with recovery, Beedasy and her colleagues at the National Center for Disaster Preparedness previously ran a program called SHOREline for young people who had been affected by disasters along the Gulf Coast. SHOREline empowered youths and taught them disaster preparedness skills so that they could help themselves, their families, communities, and youth in other communities to recover from the losses and disruptions caused by extreme events.

"Programs like SHOREline are particularly helpful to children in disasters as they can lead to the development of skills that can enable them to help themselves, their peers and communities to recover from disasters," said Beedasy.

However, resilience also needs to happen at other levels of society as well. Beedasy said she hopes the findings will help in designing evidence-based policies that enhance disaster resilience. "Our findings underscore the need for communities to have access to healthcare services, social services, job opportunities and education before and after a disaster to enhance their resilience and recovery trajectories," she said.

Credit: 
Columbia Climate School

Setting up an alarm system in the Atlantic Ocean

Climate scientists Laura Jackson and Richard Wood from The Met Office, UK have identified metrics that may give us early warnings of abrupt changes to the European Climate. The work is part of the EU Horizon 2020 TiPES project which is coordinated by the Niels Bohr Institute at the University of Copenhagen, Denmark.

An important goal in climate science is to establish early warning systems - a climate alarm device, one might say - for abrupt changes to the system of sea currents in the Northern Atlantic Ocean.

These currents, known as the Atlantic meridional overturning circulation (AMOC), include the Gulf Stream, which transports upper-ocean waters northwards in the Atlantic. There, they become colder and denser and then sink. In the process, the AMOC transports heat to the coasts of northwestern Europe, keeping the continent much warmer than comparable landmasses at the same latitudes.

From the study of past climates, it is well documented that large and sudden changes of temperatures have occurred in and around the North Atlantic. This is thought to be caused by the AMOC shifting abruptly between stronger and weaker states by passing over tipping points.

A collapse of the AMOC in the next century is considered unlikely, but since it would have big impacts on society, we need to be prepared to identify signals of tipping in time to mitigate or prepare for abrupt shifts.

One question to answer in that line of work is which metrics should trigger the alarm system.

The scientific paper "Fingerprints for early detection of changes in the AMOC" now contributes to the clarification of this important question. The study is based on climate simulations and was published in the Journal of Climate by Laura Jackson and Richard Wood of The Met Office, UK, as part of the European Horizon 2020 TiPES project.

"We show that using metrics based on temperatures and densities in the North Atlantic, in addition to continuing to directly monitor the AMOC, can improve our detection of AMOC changes and possibly even provide an early warning," explains Laura Jackson.

The authors also conclude that using multiple metrics for monitoring is important to improve detection.

Two systems directly monitor the AMOC. The RAPID array runs from the Florida Strait to the west coast of Northern Africa. The OSNAP array spans from Labrador in Canada to the tip of Greenland and on to the west coast of Scotland. Observing systems are also already in place that allow the temperature and density metrics to be monitored.

"Still, it is difficult from these measurements to tell whether a change in the AMOC is from natural variability that takes place across decades, from a gradual weakening because of anthropogenic climate change, or from crossing a tipping point," says Laura Jackson.

In other words, neither is the alarm fully developed, nor does anyone today know exactly which kind of changes to expect, should it go off.

More science is needed. One step in the right direction will be the evaluation of the available metrics in competing climate models to estimate the robustness of the results from the current work.

Credit: 
University of Copenhagen

What COVID-19 can teach tourism about the climate crisis

The global coronavirus pandemic has hit the tourism industry hard worldwide. Not only that, but it has exposed a lack of resilience to any type of downturn, according to new research from Lund University in Sweden. While the pandemic may prove temporary, the climate crisis is here to stay - and tourism will have to adapt, says Stefan Gössling, professor of sustainable tourism.

Tourism had been under pressure even before the current pandemic paralysed the world. The airline industry has seen declining profits, and flying has become cheaper. The platform economy - AirBnB, booking.com and TripAdvisor, to name a few of the dominant global players - has further toughened the market. Tourists book shorter stays, meaning destinations have to attract a higher volume of travelers.

"Even though we have warned for decades that a virus, for example SARS, could significantly affect tourism, nobody expected a virus to have this kind of impact", says Stefan Gössling.

In the same way, there have been clues that another looming crisis is starting to affect tourism. The demise of UK tour operator Thomas Cook in 2019 was attributed to the summer heatwave leading to fewer bookings - and the heatwave has, in turn, been attributed to a changing climate.

"Imagine several crises of similar magnitude to COVID-19. Extreme and unpredictable weather, a global food shortage or other consequences of climate change. And since this will possibly go on for longer than the current pandemic, the tourism industry will suffer greatly," says Stefan Gössling.

Gössling says there are a few tangible guidelines for both the industry and tourists that would make tourism more resilient as well as climate friendlier. Encouraging destinations closer to the traveler, making stays longer and keeping profits local are some ways to move away from the focus on volume and energy-intense products:

Increase the length of stay or the length of days in packages sold.

Focus on closer markets; long-haul travelers are the ones contributing the most to greenhouse gas emissions.

Rethink the food that you serve; organic and regional options can benefit nearby farmers.

Move towards a high-value model, where individuals spend more.

Think about what you buy: a lot of the profit is made by foreign-owned, global platforms such as AirBnB and booking.com.

Rethink carbon-intense travel, for example cruise holidays.

Despite this advice, there are many conditions that are beyond the grasp of individual businesses, says Gössling.

"Even if you are a small family-owned business that does everything according to the sustainability book, you may still suffer the consequences of climate change. Many of the major structural changes will of course have to come from policy makers", Stefan Gössling concludes.

Credit: 
Lund University

Moffitt researchers identify factors to predict severe toxicities in CAR T patients

TAMPA, Fla. -- Chimeric antigen receptor T-cell therapy (CAR T) has proved to be a valuable treatment option for patients with lymphoma who have failed other therapies. In clinical trials, the cellular immunotherapy was shown to provide durable remissions for nearly 40% of large B cell lymphoma patients. Despite its success, CAR T may not be the best option for all patients due to their cancer prognosis and the risk for developing severe toxicities or treatment-related death. In a new study published in Clinical Cancer Research, Moffitt Cancer Center researchers identify possible factors that could help physicians know if patients are at higher risk for severe adverse events before they receive CAR T therapy.

The development of immune-mediated toxicities, such as cytokine release syndrome and neurotoxicity, remains a common challenge of CAR T therapy. For this therapy, a patient's own T cells are removed, re-engineered in a lab and infused back into the patient. The new army of immune cells is designed to seek out and attack cancer cells. But that immune boost can cause large amounts of cytokines to be released into the blood, which can cause a patient to develop a fever, increased heart rate, difficulty breathing or low blood pressure.

"Identifying which CAR T patients may be more susceptible to those severe toxicities before therapy could allow us to better tailor their care to mediate or reduce those adverse reactions," said Marco Davila, M.D., Ph.D., study corresponding author, associate member of the Blood and Marrow Transplant and Cellular Immunotherapy Department and medical director of Cell Therapies at Moffitt.

To better understand what may put a patient at higher risk for toxicities, the Moffitt researchers followed 75 patients with large B cell lymphoma who were treated with the CAR T cell product axicabtagene ciloleucel (Yescarta®) as the standard of care. Serum levels of cytokines and catecholamines, a type of neurotransmitter, were measured before patients received lymphodepleting chemotherapy prior to treatment, on the day of their CAR T infusion, and daily thereafter during hospitalization. Tumor biopsies were also performed before treatment to analyze gene expression of the tumor and its microenvironment.

The researchers found that increased pre-treatment levels of interleukin 6, an inflammatory molecule, indicated a high risk of neurotoxicity and cytokine release syndrome from CAR T therapy. This group also had an elevated risk of death from the treatment. "These patients experienced significant toxicities despite management with early cytokine-blockade and steroids," said Rawan Faramand, M.D., lead study co-author and assistant member of the Blood and Marrow Transplant and Cellular Immunotherapy Department at Moffitt.

Tumor gene expression data showed myeloid cells and regulatory T cells may also play an important role in the development of neurotoxicity and cytokine release syndrome. The researchers believe the interaction between infused CAR T cells and the recipient's immune cells may determine the severity of the toxicities and suggest further studies on reducing inflammation and targeting the tumor microenvironment prior to therapy.

Credit: 
H. Lee Moffitt Cancer Center & Research Institute