Culture

The German press disparages dissenting voices on climate change

The German media have disparaged dissenting voices on climate change, according to research published on 8 October in the high impact factor journal Media, Culture & Society. Its authors are Lena von Zabern, a UPF alumna who won the UPF Planetary Wellbeing award for the best master's degree final project, and Christopher D. Tulloch, who supervised her work and is a researcher with the Department of Communication.

The study is one of the first on the media representation of Greta Thunberg and the so-called "Fridays for Future" demonstrations, in which European youth protest for the implementation of more environmental measures to reverse climate change.

In their work, the researchers performed a quantitative and qualitative content analysis of the German newspapers Bild.de, Zeit Online and FAZ.net, between August 2018 and March 2019. Their results reveal that while Zeit Online shows a framing towards intergenerational justice, the coverage of FAZ.net and Bild.de strongly adheres to the protest paradigm.

The researchers reveal that most of the newspaper articles studied grant protesters a voice, but this voice is often reduced to apolitical testimonies, and the protesters' self-agency is undermined through disparagement. The study highlights how the press, both popular (Bild) and quality (FAZ), uses a set of eight frames to undermine legitimate protests. "Thus, German media coverage tends to reproduce existing power structures by marginalizing and depoliticizing the political agenda of a system-critical protest", von Zabern and Tulloch assert.

However, the study also demonstrates that the idea of climate change as a matter of intergenerational justice and children's rights has become established on the agenda of the media studied.

Work on planetary wellbeing driven by the University

Lena von Zabern is an alumna of the master's degree in International Studies on Media, Power and Difference of the Department of Communication, and her work won a prize awarded by the UNESCO Chair in Life Cycle and Climate Change ESCI-UPF and the UPF Barcelona School of Management, as part of the awards for the best bachelor's degree final project (TFG) and master's degree final project (TFM) on social responsibility and planetary wellbeing. Now, this work has resulted in an article published in a high impact factor journal. Tulloch stresses that "this is not the first time a student of this master's degree programme has published on environmental issues in a highly indexed journal". Indeed, in 2018, Elida Hoeg published an article on climate change refugees, also in conjunction with Christopher Tulloch, in the Journal of Communication Inquiry.

Credit: 
Universitat Pompeu Fabra - Barcelona

Alpha animals must bow to the majority when they abuse their power

image: Vulturine guineafowl occur in the savannahs of Kenya. The birds live in groups, with a strict dominance hierarchy.

Image: 
Danai Papageorgiou

Many animal groups decide where to go by a process similar to voting, giving not only alphas but all group members an equal say in where the group goes next. However, in many species that live in stable groups - such as primates and birds - the dominant, or alpha, group members often monopolise resources, such as the richest food patches and access to mates. Scientists at the Max Planck Institute of Animal Behavior and the Cluster of Excellence Centre for the Advanced Study of Collective Behaviour at the University of Konstanz have studied the links between dominance and group decision-making in wild vulturine guineafowl. They report that democratic decision-making plays an essential role in keeping the power of alphas in check: when alphas monopolise resources, the rest of the group can decide where to move next.

Vulturine guineafowl are large birds native to the savannahs of East Africa. They are the first bird species reported to live in a multilevel society, in which social groups comprising from 15 to more than 60 individuals interact preferentially with other social groups. Within these large groups there is a clear dominance hierarchy. As in wolves and primates, the dominant, or alpha, group members can outcompete other group members and exclude them from food.

While it had long been thought that alphas lead the way and decide where the group moves next, studies over the past decade have suggested that all group members can have an equal say by 'voting' for where the group goes next. However, it has remained unclear whether this form of democratic decision-making exists in order to keep the power of dominants in check. "Working together as a group is critical for these birds, as their bright plumage makes isolated individuals easy targets for predators such as leopards and martial eagles", says Damien Farine, the senior author of the study and lead researcher on the vulturine guineafowl project.

Despotic leadership versus democratic decision-making

The scientists found that who initiated, and therefore decided, where the group moved to next depended on the recent actions of the dominant group members. When groups were feeding in large, open areas where food was equally accessible to everyone, all group members contributed equally to movement decisions. However, when dominant individuals monopolised a particularly rich food patch - chasing other group members out - the excluded subordinates combined their votes to move the group away from the patch, ultimately forcing the dominants to abandon their rich resources. These findings suggest that democratic decision-making, as opposed to despotic leadership, has evolved so that all group members can obtain the resources (e.g. food and water) that they need to survive. This would not be possible if dominant individuals always decided what was best for themselves.

The researchers combined observations on foot, video tracking, and high-resolution GPS tracking across multiple groups of vulturine guineafowl, spanning several years. They first recorded all disputes between individual birds to assign each animal a rank in the dominance hierarchy, using a rating procedure common in chess, football, and table tennis that assesses players' ranks based on whom they won and lost matches against. The scientists also monitored which bird initiated each departure for a new feeding site, and the order in which the other individuals followed, from first to last.
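
The rating procedure referred to here works like the Elo system: after each dispute, the winner's score rises and the loser's falls, by an amount that depends on how unexpected the outcome was. The sketch below is a minimal illustration of such an Elo-style update; the dispute records and parameter values are invented for demonstration and are not the study's data.

```python
# Illustrative Elo-style rating of a dominance hierarchy from pairwise disputes.
# The dispute list and the K-factor below are made up for demonstration; they
# are not data or parameters from the study.

def expected_score(r_winner, r_loser):
    """Probability that the eventual winner wins, under the Elo model."""
    return 1.0 / (1.0 + 10 ** ((r_loser - r_winner) / 400.0))

def rank_by_elo(disputes, k=32, start=1000.0):
    """disputes: list of (winner_id, loser_id) tuples, in chronological order."""
    ratings = {}
    for winner, loser in disputes:
        rw = ratings.setdefault(winner, start)
        rl = ratings.setdefault(loser, start)
        e = expected_score(rw, rl)          # expected outcome for the winner
        ratings[winner] = rw + k * (1 - e)  # winner gains more after an upset
        ratings[loser] = rl - k * (1 - e)   # loser drops by the same amount
    # Higher rating = higher dominance rank
    return sorted(ratings, key=ratings.get, reverse=True)

disputes = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("A", "D")]
print(rank_by_elo(disputes))  # e.g. ['A', 'B', 'C', 'D']
```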

Baboons also follow the majority vote

Having worked with baboons that inhabit the same savannah habitats, Damien Farine previously found that individuals can 'vote with their feet' by moving away from the group in their preferred direction. His previous study, conducted as part of a team of researchers now all based at the Max Planck Institute of Animal Behavior and the Centre for the Advanced Study of Collective Behaviour, found that when these individuals reach a majority, the rest of the group follows in that direction.

The current study on guineafowl suggests that democratic decisions about group movements are a critical piece of the puzzle that allows groups to remain cohesive despite large inequalities in access to resources among individuals. By being able to initiate movements and pull the group away from monopolised resources, democratic decision-making allows subordinates to take back control when dominants gain too much power. They can even force higher-ranking animals to leave a feeding site that suits those dominants best. "We were very excited when we first noticed that subordinates were leading their group after having lost access to resources, and we nicknamed this the 'losers lead mechanism'", says Danai Papageorgiou, doctoral researcher at the Max Planck Institute of Animal Behavior and lead researcher of the study. "Our findings highlight how collectives can react to rising social inequality. Democratic decision-making is critical for maintaining a balance of power in societies where functioning as a group is critical to survival", Papageorgiou adds.

Credit: 
Max-Planck-Gesellschaft

Safe ultraviolet light could be used to sterilise high-risk COVID-19 environments

Research at Cranfield University is paving the way for a new solution to kill aerosolised COVID-19 in enclosed environments such as hospitals and long-term care facilities.

Computational modelling has shown that low dose far-ultraviolet C (UVC) lighting can be used to disinfect in-room air, increasing disinfection rates by 50-85% compared to a room's ventilation alone.
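
As a rough illustration of why an in-room inactivation mechanism changes disinfection rates, the sketch below models a single well-mixed room in which airborne virus decays exponentially under ventilation alone versus ventilation plus far-UVC. The air-change and inactivation rates are assumed values chosen for demonstration; the Cranfield work used detailed computational fluid dynamics and radiation transport modelling, not this simplified box model.

```python
import math

# Toy well-mixed-room model of airborne virus removal. The ventilation rate and
# far-UVC inactivation rate below are illustrative assumptions, not values from
# the Cranfield study.

ach_ventilation = 3.0   # air changes per hour from ventilation alone (assumed)
k_far_uvc = 6.0         # extra first-order inactivation rate from far-UVC, per hour (assumed)

def fraction_removed(rate_per_hour, minutes):
    """Fraction of airborne virus removed after `minutes`, assuming exponential decay."""
    return 1.0 - math.exp(-rate_per_hour * minutes / 60.0)

for label, rate in [("ventilation only", ach_ventilation),
                    ("ventilation + far-UVC", ach_ventilation + k_far_uvc)]:
    print(f"{label:>22}: {fraction_removed(rate, 15):.0%} removed after 15 minutes")
```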

Unlike typical UVC - which has been used to kill microorganisms for decades but is extremely harmful to humans, potentially causing cataracts or skin cancer - evidence has shown that far-UVC is safe to use around people.

Dr Liang Yang, Lecturer in Marine Renewable Energy Systems in the Centre for Renewable Energy Systems, Cranfield University, said: "In indoor environments where it may not be possible to socially distance, aerosolised coronavirus released through breathing increases the chance of spreading the disease. Infection controls focus on a combination of personal hygiene and the correct use of personal protective equipment, which has been in short supply in many countries. This research has shown that far-UVC lighting could provide an alternative, safe and inexpensive way to mitigate SARS-CoV-2 transmission.

"We found that far-UVC illumination in poorly ventilated spaces can be as effective as N95 masks in preventing transmission. With detailed and accurate computational fluid dynamics modelling, we were able to track and eliminate the airborne transmission of pathogens."

The research was carried out in collaboration with Dr Andrew Buchan from Queen Mary University of London and Dr Kirk Atkinson from Ontario Tech University in Canada.

Dr Kirk Atkinson said: "Imagine if you could simply screw a far-UVC light bulb into a standard light fixture. Switching the light on will sterilise the air for everyone in the room. This is what we're aiming for. While industry and governments are investing heavily in ventilation infrastructure, ventilation by itself has mixed effectiveness if the virus is not being killed. Far-UVC is much cheaper to implement."

UVC is one of the three types of ultraviolet light, a form of electromagnetic radiation with wavelengths shorter than those of visible light. At 100 to 280 nanometres (nm), UVC has a shorter wavelength than UVA and UVB. Human-safe far-UVC falls in the 207 to 222 nm range, can be produced by special bulbs and lamps, and can be used to inactivate pathogens.

Far-UVC is safe because it has the unique property of interacting with biological material more readily (and losing energy more rapidly) than longer-wavelength UVC, so it cannot penetrate far enough to reach living human cells.

A radiation transport and fluid dynamics model was developed as part of the research to quantify disinfection rates within a 3 metre by 3 metre air-conditioned single occupancy private room, representative of typical environments found in hospitals and long-term care facilities.

Researchers are now hoping to secure new sources of funding for further investigation and to address the outstanding issues necessary to bring far-UVC light into service quickly.

Credit: 
Cranfield University

It's not too late to save 102 species at risk of extinction

image: The Fraser River estuary in British Columbia is home to 102 species at risk of extinction

Image: 
Alex Harris

The Fraser River estuary in British Columbia is home to 102 species at risk of extinction. A new study says it's not too late to save these species if action is taken now.

"There is currently no overarching plan to save them. If we don't act quickly, many species, including species of salmon and southern resident killer whales, are likely to be functionally extinct in the next 25 years," says senior author Tara Martin, a professor of conservation science at UBC, in a paper published today in Conservation Science and Practice.

The Fraser estuary is the largest on the Pacific coast of North America. More than three million people in B.C.'s Lower Mainland live near the Fraser River, and many of them rely on these species and ecosystems for their livelihoods, culture and well-being.

Applying a conservation decision framework called Priority Threat Management, developed by Martin and her team, the authors brought together over 65 experts in the ecology and management of species that use the Fraser River estuary to identify conservation actions and to estimate their benefit to species recovery, their cost and their feasibility.
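
Priority Threat Management ranks candidate strategies by weighing their expected benefit to species recovery, discounted by feasibility, against their cost. The sketch below illustrates that kind of cost-effectiveness ranking; the strategy names, benefits, costs and feasibilities are invented for demonstration and are not the study's data or its exact scoring procedure.

```python
# Illustrative cost-effectiveness ranking in the spirit of Priority Threat
# Management. All numbers below are invented for demonstration.

strategies = [
    # (name, expected benefit to species recovery, cost in $M over 25 yr, feasibility 0-1)
    ("Aquatic habitat restoration", 80.0, 120.0, 0.7),
    ("Transport regulation",        45.0,  40.0, 0.6),
    ("Green infrastructure",        30.0,  90.0, 0.8),
    ("Public land management",      25.0,  35.0, 0.9),
]

def cost_effectiveness(benefit, cost, feasibility):
    """Expected benefit per dollar, discounted by the chance of successful implementation."""
    return benefit * feasibility / cost

ranked = sorted(strategies, key=lambda s: cost_effectiveness(*s[1:]), reverse=True)
for name, benefit, cost, feasibility in ranked:
    print(f"{name:<30} CE = {cost_effectiveness(benefit, cost, feasibility):.2f}")
```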

The plan includes the implementation of an environmental co-governance body in which First Nations, federal and provincial governments work together with municipalities, NGOs and industry to implement these strategies. The research finds that co-governance underpins conservation success in urban areas by increasing the feasibility of management strategies.

Throughout history, humans have settled in areas of high biodiversity. Today, these areas are home to our biggest urban centres with biodiversity at increasing risk from escalating cumulative threats.

Lead author Laura Kehoe, who did the work while a postdoctoral fellow at the University of Victoria and UBC, says it's not too late to save these species if we act now.

"The price tag is $381 million over 25 years or $15 million a year and invests in strategies ranging from aquatic habitat restoration and transport regulation to green infrastructure and public land management. This amounts to less than $6 per person a year in Greater Vancouver--the price of a single beer or latte."

The authors acknowledge that identifying the management strategies to conserve species within such regions, and ensuring effective governance to oversee their implementation, presents enormous challenges--yet the cost of not acting is staggering.

"Not only do we risk losing these species from the region, but the co-benefits from investing in these conservation actions are enormous," explains Martin. "For example, along with generating more than 40 full-time jobs for 25 years, historically the value of a Fraser salmon fishery exceeds $300 million a year, and whale watching is more than $26 million. If we lose thriving populations of these species, we lose these industries. Our study suggests that investing in conservation creates jobs and economic opportunities."

Crucially, the study found that future major industrial developments, including the contentious Trans Mountain pipeline and Roberts Bank port terminal expansion, jeopardize the future of many of these species, including the southern resident killer whale, salmon and sturgeon, and the migratory western sandpiper.

The study concludes that biodiversity conservation in heavily urbanized areas is not a lost cause but requires urgent strategic planning, attention to governance, and large-scale investment.

Credit: 
University of British Columbia

Doctors confirm the existence of multiple chronotypes

image: Having conducted a large-scale study, a team of scientists improved the classification of human diurnal activity and suggested using 6 chronotypes instead of just 'early birds' and 'night owls'. Two thousand participants, including the employees of the Institute of Medicine of RUDN University, were tested in the course of the research.

Image: 
RUDN University

Having conducted a large-scale study, a team of scientists improved the classification of human diurnal activity and suggested using 6 chronotypes instead of just 'early birds' and 'night owls'. Two thousand participants, including the employees of the Institute of Medicine of RUDN University, were tested in the course of the research. The results of the work were published in the Personality and Individual Differences journal.

The physiological functions of our bodies are subject to diurnal rhythms. It means that a person can be more or less active and efficient depending on the time of the day. The two widely known chronotypes are 'early birds' that are most active in the morning and 'night owls' for which evenings are the most productive time. However, these two types are not clearly distinguished: about 60% of all people fail to fit into one of the categories. A team of scientists from the Institute of Medicine of RUDN University together with the leading Russian and foreign chronobiologists carried out a large-scale study and identified 4 additional chronotypes, thus expanding the common classification.

"The research of individual chronobiological and chronopsychological differences is predominantly focused on the morning and evening chronotypes. However, recent studies suggest that the existing classification needs to be reconsidered and expanded. Our team conducted a test and asked the participants to choose their diurnal activity types from six suggested options. Based on the results of the test, we studied the dynamics of sleep-wake patterns throughout the day," said Dmitry S. Sveshnikov, MD, an Assistant Professor at the Department of Human Physiology, Institute of Medicine of RUDN University.

The team suggested a new classification that includes 6 chronotypes characterized by different criteria: sleep duration, excessive diurnal drowsiness, ability to wake up and fall asleep as and when required, and so on. The four new chronotypes in the classification are highly active type (that remains active throughout the day), daytime sleepy type (that is active in the mornings and evenings, not in the afternoon), daytime active type (that is active in the afternoon), and moderately active type (with reduced activity throughout the day).

The team conducted a number of online tests with a total of 2,283 participants, and 95% of the respondents identified with one of the six chronotypes. Only about a third of them chose either the morning or the evening type (13% and 24%, respectively). The majority of the participants went for the other four types: 15% chose the daytime active type, 18% the daytime sleepy type, 9% the highly active type, and 16% the moderately active type.

"Taking into account the incidence of the types in question, we consider our hypothesis about the existence of additional chronotypes fully confirmed," added Dmitry S. Sveshnikov.

Credit: 
RUDN University

Researchers discovered solid phosphorus from a comet

An international study led from the University of Turku, Finland, discovered phosphorus and fluorine in solid dust particles collected from a comet. The finding indicates that all the most important elements necessary for life may have been delivered to the Earth by comets.

Researchers have discovered phosphorus and fluorine in solid dust particles collected from the inner coma of comet 67P/Churyumov-Gerasimenko. It takes the comet 6.5 years to orbit the Sun.

The dust particles were collected with the COmetary Secondary Ion Mass Analyser (COSIMA), an instrument on board the European Space Agency's Rosetta spacecraft, which tracked the comet at a distance of a few kilometres between September 2014 and September 2016. The instrument collected the dust particles directly in the vicinity of the comet: three 1 cm² target plates were photographed remotely, particles were selected from these images, and the selected particles were finally measured with a mass spectrometer. All the steps were controlled from Earth.

The phosphorus detected as P+ ions in the solid particles is contained in minerals or in metallic phosphorus.

"We have shown that apatite minerals are not the source of phosphorus, which implies that the discovered phosphorus occurs in some more reduced and possibly more soluble form", says project leader Harry Lehto from the Department of Physics and Astronomy at the University of Turku.

This is the first time that all of the life-necessary CHNOPS elements have been found in solid cometary matter. Carbon, hydrogen, nitrogen, oxygen and sulphur had been reported in previous studies by the COSIMA team, for example in organic molecules. The newly discovered phosphorus, or P, is the last of the CHNOPS elements. The discovery of P points to cometary delivery as a potential source of these elements for the young Earth.

Fluorine was also detected, in the form of CF+ secondary ions originating from the cometary dust. CF gas was first discovered in interstellar dust in 2019; CF+ has now been detected on a comet, and its characteristics in the cometary environment are still unknown.

Credit: 
University of Turku

Big data powers design of 'smart' cell therapies for cancer

Finding medicines that can kill cancer cells while leaving normal tissue unscathed is a Holy Grail of oncology research. In two new papers, scientists at UC San Francisco and Princeton University present complementary strategies to crack this problem with "smart" cell therapies--living medicines that remain inert unless triggered by combinations of proteins that only ever appear together in cancer cells.

Biological aspects of this general approach have been explored for several years in the laboratory of Wendell Lim, PhD, and colleagues in the UCSF Cell Design Initiative and the National Cancer Institute-sponsored Center for Synthetic Immunology. But the new research adds a powerful new dimension by combining cutting-edge therapeutic cell engineering with advanced computational methods.

For one paper, published September 23, 2020 in Cell Systems, members of Lim's lab joined forces with the research group of computer scientist Olga G. Troyanskaya, PhD, of Princeton's Lewis-Sigler Institute for Integrative Genomics and the Simons Foundation's Flatiron Institute. Using a machine learning approach, the team analyzed massive databases of thousands of proteins found in both cancer and normal cells. They then combed through millions of possible protein combinations to assemble a catalog of combinations that could be used to precisely target only cancer cells while leaving normal ones alone.
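
Conceptually, the screen boils down to searching combinations of antigens for ones that are jointly present on cancer cells but absent from normal cells. The toy sketch below illustrates that idea with a brute-force search over antigen pairs and invented presence/absence profiles; the published analysis mined large expression databases with machine learning rather than this simple enumeration.

```python
from itertools import combinations

# Illustrative brute-force screen for antigen pairs whose joint presence marks
# tumour cells but not normal cells. The antigen profiles below are invented.

tumour_cells = [{"A", "B", "D"}, {"A", "B"}, {"A", "B", "C"}]   # assumed toy data
normal_cells = [{"A", "C"}, {"B", "D"}, {"C", "D"}, {"A", "D"}]
antigens = {"A", "B", "C", "D"}

def and_gate_score(pair):
    """Fraction of tumour cells hit minus fraction of normal cells hit by 'x AND y'."""
    hits_tumour = sum(pair <= cell for cell in tumour_cells) / len(tumour_cells)
    hits_normal = sum(pair <= cell for cell in normal_cells) / len(normal_cells)
    return hits_tumour - hits_normal

best = max((frozenset(p) for p in combinations(sorted(antigens), 2)), key=and_gate_score)
print(sorted(best), and_gate_score(best))  # e.g. ['A', 'B'] 1.0
```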

In another paper, published in Science on November 27, 2020, Lim and colleagues then showed how this computationally derived protein data could be put to use to drive the design of effective and highly selective cell therapies for cancer.

"Currently, most cancer treatments, including cell therapies, are told 'block this,' or 'kill this,'" said Lim, also professor and chair of cellular and molecular pharmacology and a member of the UCSF Helen Diller Family Comprehensive Cancer Center. "We want to increase the nuance and sophistication of the decisions that a therapeutic cell makes."

Over the past decade, chimeric antigen receptor (CAR) T cells have been in the spotlight as a powerful way to treat cancer. In CAR T cell therapy, immune system cells are taken from a patient's blood, and manipulated in the laboratory to express a specific receptor that will recognize a very particular marker, or antigen, on cancer cells.

While scientists have shown that CAR T cells can be quite effective, and sometimes curative, in blood cancers such as leukemia and lymphoma, so far the method hasn't worked well in solid tumors, such as cancers of the breast, lung, or liver. Cells in these solid cancers often share antigens with normal cells found in other tissues, which poses the risk that CAR T cells could have off-target effects by attacking healthy organs. Solid tumors also often create suppressive microenvironments that limit the efficacy of CAR T cells.

For Lim, cells are akin to molecular computers that can sense their environment and then integrate that information to make decisions. Since solid tumors are more complex than blood cancers, "you have to make a more complex product" to fight them, he said.

In the Cell Systems study--led by Ruth Dannenfelser, PhD, a former graduate student in Troyanskaya's team at Princeton, and Gregory Allen, MD, PhD, a clinical fellow in the Lim lab--the researchers explored public databases to examine the gene expression profile of more than 2,300 genes in normal and tumor cells to see what antigens could help discriminate one from the other. The researchers used machine learning techniques to come up with the possible hits, and to see which antigens clustered together.

Based on this gene expression analysis, Lim, Troyanskaya, and colleagues applied Boolean logic to antigen combinations to determine if they could significantly improve how T cells recognize tumors while ignoring normal tissue. For example, using the Booleans AND, OR, or NOT, tumor cells might be differentiated from normal tissue using markers "A" OR "B," but NOT "C," where "C" is an antigen found only in normal tissue.
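
The sketch below shows what evaluating such a gate looks like in code, using the article's example of markers "A" OR "B" but NOT "C"; the antigen profiles are invented for illustration and do not correspond to real cell types.

```python
# Minimal sketch of the Boolean antigen logic described above: trigger killing
# only if the cell shows marker A OR marker B, AND NOT marker C.
# Antigen profiles are invented for illustration.

def gate_a_or_b_not_c(antigens: set) -> bool:
    """Return True (trigger killing) for cells matching (A OR B) AND NOT C."""
    return ("A" in antigens or "B" in antigens) and "C" not in antigens

cells = {
    "tumour cell":        {"A", "B"},
    "normal tissue cell": {"A", "C"},   # C marks normal tissue, so it is spared
    "unrelated cell":     {"D"},
}
for name, profile in cells.items():
    print(f"{name:<20} -> {'kill' if gate_a_or_b_not_c(profile) else 'spare'}")
```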

To program these instructions into T cells, they used a system known as synNotch, a customizable molecular sensor that allows synthetic biologists to fine-tune the programming of cells. Developed in the Lim lab in 2016, synNotch is a receptor that can be engineered to recognize a myriad of target antigens. The output response of synNotch can also be programmed, so that the cell executes any of a range of responses once an antigen is recognized.

To demonstrate the potential power of the data they had amassed, the team used synNotch to program T cells to kill kidney cancer cells that express a unique combination of antigens called CD70 and AXL. Although CD70 is also found in healthy immune cells, and AXL in healthy lung cells, T cells with an engineered synNotch AND logic gate killed only the cancer cells and spared the healthy cells.

"The field of big data analysis of cancer and the field of cell engineering have both exploded in the last few years, but these advances have not been brought together," said Troyanskaya. "The computing capabilities of therapeutic cells combined with machine learning approaches enable actionable use of the increasingly available rich genomic and proteomic data on cancers."

The work described in the new Science paper, led by former UCSF graduate student Jasper Williams, shows how multiple synNotch receptors can be daisy-chained to create a host of complex cancer recognition circuits. Since synNotch can activate the expression of selected genes in a "plug and play" manner, these components can be linked in different ways to create circuits with diverse Boolean functions, allowing for precise recognition of diseased cells and a range of responses when those cells are identified.

"This work is essentially a cell engineering manual that provides us with blueprints for how to build different classes of therapeutic T cells that could recognize almost any possible type of combinatorial antigen pattern that could exist on a cancer cell," said Lim.

For example, a synNotch receptor can be engineered so that when it recognizes antigen A, the cell makes a second synNotch that recognizes B, which in turn can induce the expression of a CAR that recognizes antigen C. The result is a T cell that requires the presence of all three antigens to trigger killing. In another example, if the T cell encounters an antigen present in normal tissues but not in the cancer, a synNotch receptor with a NOT function could be programmed to cause the T cell carrying it to die, sparing the normal cells from attack and possible toxic effects.

In the Science paper, using complex synNotch configurations like this, Lim and colleagues show they can selectively kill cells carrying different combinatorial markers of melanoma and breast cancer. Moreover, when synNotch-equipped T cells were injected into mice carrying two similar tumors with different antigen combinations, the T cells efficiently and precisely located the tumor they had been engineered to detect, and reliably executed the cellular program the scientists had designed.

Lim's group is now exploring how these circuits could be used in CAR T cells to treat glioblastoma, an aggressive form of brain cancer that is nearly always fatal with conventional therapies.

"You're not just looking for one magic-bullet target. You're trying to use all the data," Lim said. "We need to comb through all of the available cancer data to find unambiguous combinatorial signatures of cancer. If we can do this, then it could launch the use of these smarter cells that really harness the computational sophistication of biology and have real impact on fighting cancer."

Credit: 
University of California - San Francisco

Tree rings capture an abrupt, irreversible shift in East Asia's climate

The abrupt shift to hotter and drier conditions over inner East Asia is unprecedented and may herald an irreversible shift to a new climate regime for the region, according to a new study. The findings reveal a positive feedback loop fueled by declining soil moisture, which may have nudged the area's climate over an important tipping point. Extreme heatwaves and droughts are two of society's most pressing concerns as climate change driven by human activity continues unabated. Global warming has already led to recent and rapid shifts in climate worldwide, including inner East Asia, which has experienced some of the most pronounced heatwave-drought concurrences in recent decades. It's thought that such abrupt and substantial changes represent the crossing of critical thresholds within the climate system and signal irreversible shifts from one climate regime to another. But identifying tipping points and new climate norms is difficult: not only does it require a thorough understanding of a region's natural climate variability over timescales that often exceed available records, but regime shifts are also likely triggered by complex interactions between a variety of poorly understood factors in the highly dynamic climate system. To determine whether the heatwave-drought trends observed in inner East Asia indeed exceed the range of natural climate variability, Peng Zhang and colleagues used tree-ring data to reconstruct a record of heatwave frequency and soil moisture for the region spanning the past 260 years. Zhang et al. found that the rise in concurrent heatwave-drought events over the last two decades is unique and beyond the natural variability revealed by their centuries-long record. What's more, the authors demonstrate that the rise in heatwave-drought events may be caused by a positive feedback loop between decreasing soil moisture and increasing surface warming - a pattern that's potentially the start of an irreversible trend that's likely to produce more frequent and severe events. While much remains to be understood about the occurrence and mechanisms of climate regime shifts, developing trustworthy long-term climate records is crucial in detecting changes in the interactions of climate variables, discovering past shifts and predicting tipping points of potential future shifts, write Qi-Bin Zhang and Ouya Fang in a related Perspective.
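
One simple way to ask whether recent decades exceed natural variability is to compare the most recent 20-year mean of a reconstructed index against the distribution of all 20-year means in the earlier part of the record. The sketch below illustrates that comparison with a synthetic series; it is not the study's reconstruction or its statistical method.

```python
import numpy as np

# Sketch: does a recent 20-year mean exceed the range of 20-year means in a
# 260-year record? The series below is synthetic stand-in data, not the
# tree-ring reconstruction used in the study.

rng = np.random.default_rng(0)
years = np.arange(1760, 2020)
index = rng.normal(0.0, 1.0, size=years.size)   # stand-in heatwave-drought index
index[-20:] += 2.5                               # impose an anomalous final two decades

recent_mean = index[-20:].mean()
# distribution of 20-year running means over the earlier part of the record
baseline = np.convolve(index[:-20], np.ones(20) / 20, mode="valid")
percentile = (baseline < recent_mean).mean() * 100

print(f"Recent 20-yr mean: {recent_mean:.2f}")
print(f"Exceeds {percentile:.1f}% of 20-yr means in the prior record")
```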

Credit: 
American Association for the Advancement of Science (AAAS)

High genomic variability predicts success in desert tortoise refugees; could inform conservation

Tortoise refugees with the highest genetic variation are far more likely to survive conservation translocation than tortoises whose genetic diversity is lower, according to a new study. The findings suggest that translocation efforts should account for genetic variation when selecting target individuals rather than focusing solely on those determined to be most geographically or genetically similar to the target populations. Human activity and climate change are driving record numbers of species towards extinction and represent a challenge that is being addressed by a myriad of conservation efforts worldwide. One conservation strategy employed to preserve threatened species is the translocation of individual plants and animals to areas where they've become locally extinct or to new locations where they might bolster declining resident populations. While the approach is becoming increasingly common, it is often reserved as a last resort because long-term success rates are often quite poor. An ongoing debate in this area concerns whether such efforts are most successful when they select individuals from environmentally similar regions or from populations genetically close to the target population, or when they focus on overall genetic diversity. To test these hypotheses, Peter Scott and colleagues used a long-term dataset of displaced Mojave desert tortoises - many previously captive pets - brought to the Desert Tortoise Conservation Center's translocation site in Nevada. Scott et al. analyzed genomic data for 166 desert tortoise refugees that either survived or died over a period of twenty years and found that neither geographical distance nor genetic similarity had any effect on post-translocation survival. Instead, the greatest predictor for success was heterozygosity - individuals with the highest genomic variation survived at much higher rates than others. While the authors note that further research is needed to understand the reasons behind this increased survival, the new insights suggest ways to improve current translocation efforts.

Credit: 
American Association for the Advancement of Science (AAAS)

Study revealing the secret behind a key cellular process refutes biology textbooks

COLUMBUS, Ohio - New research has identified and described a cellular process that, despite what textbooks say, has remained elusive to scientists until now - precisely how the copying of genetic material, once started, is properly turned off.

The finding concerns a key process essential to life: the transcription phase of gene expression, which enables cells to live and do their jobs.

During transcription, an enzyme called RNA polymerase wraps itself around the double helix of DNA, using one strand to match nucleotides to make a copy of genetic material - resulting in a newly synthesized strand of RNA that breaks off when transcription is complete. That RNA enables production of proteins, which are essential to all life and perform most of the work inside cells.

Just as with any coherent message, RNA needs to start and stop in the right place to make sense. A bacterial protein called Rho was discovered more than 50 years ago because of its ability to stop, or terminate, transcription. In every textbook, Rho is used as a model terminator that, using its very strong motor force, binds to the RNA and pulls it out of RNA polymerase. But a closer look by these scientists showed that Rho wouldn't be able to find the RNAs it needs to release using the textbook mechanism.

"We started studying Rho, and realized it cannot possibly work in ways people tell us it works," said Irina Artsimovitch, co-lead author of the study and professor of microbiology at The Ohio State University.

The research, published online by the journal Science today, Nov. 26, 2020, determined that instead of attaching to a specific piece of RNA near the end of transcription and helping it unwind from DNA, Rho actually "hitchhikes" on RNA polymerase for the duration of transcription. Rho cooperates with other proteins to eventually coax the enzyme through a series of structural changes that end with an inactive state enabling release of the RNA.

The team used sophisticated microscopes to reveal how Rho acts on a complete transcription complex composed of RNA polymerase and two accessory proteins that travel with it throughout transcription.

"This is the first structure of a termination complex in any system, and was supposed to be impossible to obtain because it falls apart too quickly," Artsimovitch said.

"It answers a fundamental question - transcription is fundamental to life, but if it were not controlled, nothing would work. RNA polymerase by itself has to be completely neutral. It has to be able to make any RNA, including those that are damaged or could harm the cell. While traveling with RNA polymerase, Rho can tell if the synthesized RNA is worth making - and if not, Rho releases it."

Artsimovitch has made many important discoveries about how RNA polymerase so successfully completes transcription. She didn't set out to counter years of understanding about Rho's role in termination until an undergraduate student in her lab identified surprising mutations in Rho while working on a genetics project.

Rho is known to silence the expression of virulence genes in bacteria, essentially keeping them dormant until they're needed to cause infection. But these genes do not have any RNA sequences that Rho is known to preferentially bind. Because of that, Artsimovitch said, it has never made sense that Rho looks only for specific RNA sequences, without even knowing if they are still attached to RNA polymerase.

In fact, the scientific understanding of the Rho mechanism was established using simplified biochemical experiments that frequently left out RNA polymerase - in essence, defining how a process ends without factoring in the process itself.

In this work, the researchers used cryo-electron microscopy to capture images of RNA polymerase operating on a DNA template in Escherichia coli, their model system. This high-resolution visualization, combined with high-end computation, made accurate modeling of transcription termination possible.

"RNA polymerase moves along, matching hundreds of thousands of nucleotides in bacteria. The complex is extremely stable because it has to be - if the RNA is released, it is lost," Artsimovitch said. "Yet Rho is able to make the complex fall apart in a matter of minutes, if not seconds. You can look at it, but you can't get a stable complex to analyze."

Using a clever method to trap complexes just before they fall apart enabled the scientists to visualize seven complexes that represent sequential steps in the termination pathway, starting from Rho's engagement with RNA polymerase and ending with a completely inactive RNA polymerase. The team created models based on what they saw, and then made sure that these models were correct using genetic and biochemical methods.

Though the study was conducted in bacteria, Artsimovitch said this termination process is likely to occur in other forms of life.

"It appears to be common," she said. "In general, cells use similar working mechanisms from a common ancestor. They all learned the same tricks as long as these tricks were useful."

Credit: 
Ohio State University

UCLA study of threatened desert tortoises offers new conservation strategy

In Nevada's dry Ivanpah Valley, just southeast of Las Vegas, a massive unintended experiment in animal conservation has revealed an unexpected result.

From 1997 to 2014, the U.S. Fish and Wildlife Service moved more than 9,100 Mojave desert tortoises to the 100-square-kilometer (about 39-square-mile) Large Scale Translocation Site. The newcomers, many of which were abandoned pets or had been displaced by development, joined nearly 1,500 desert tortoises already living there.

Conventional wisdom would suggest that tortoises from areas closest to the translocation site would fare best. But a new UCLA study, published today in Science, found no connection between the tortoises' place of origin and their chances of survival. It did, however, uncover a far better predictor.

Tortoises with lots of genetic variation were much more likely to survive after their relocation, said UCLA conservation ecologist Brad Shaffer, the study's senior author. Like most organisms, tortoises have two copies of their entire genome, with one copy from each parent. The more those copies differ from each other, the higher the organism's heterozygosity.

The researchers compared translocated tortoises that lived or died over the same time period after being relocated to the site. They found that survivors averaged 23% greater heterozygosity than those that perished. Simply put, tortoises with more genetic variation had higher survival rates.
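
Heterozygosity of the kind described here can be computed directly from genotype data as the fraction of sites at which an individual carries two different alleles, and then compared between survivors and non-survivors. The sketch below illustrates that calculation with simulated genotypes; the group sizes follow the samples described later in this article (79 survivors, 87 that died), but the genetic data and the resulting difference are invented.

```python
import numpy as np

# Sketch of per-individual heterozygosity from genotypes coded 0/1/2 (copies of
# the alternate allele), compared between survivors and non-survivors.
# The genotype matrices below are simulated, not the study's sequence data.

rng = np.random.default_rng(1)

def simulate_genotypes(n_individuals, n_sites, alt_freq):
    """Each genotype is the sum of two allele draws; a value of 1 means heterozygous."""
    return rng.binomial(1, alt_freq, (n_individuals, n_sites)) + \
           rng.binomial(1, alt_freq, (n_individuals, n_sites))

def heterozygosity(genotypes):
    """Fraction of sites at which an individual carries two different alleles."""
    return (genotypes == 1).mean(axis=1)

survivors = simulate_genotypes(79, 5000, alt_freq=0.30)      # more variable (assumed)
nonsurvivors = simulate_genotypes(87, 5000, alt_freq=0.20)

h_surv = heterozygosity(survivors).mean()
h_dead = heterozygosity(nonsurvivors).mean()
print(f"Mean heterozygosity, survivors: {h_surv:.3f}, non-survivors: {h_dead:.3f}")
print(f"Relative difference: {(h_surv - h_dead) / h_dead:.0%}")
```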

"It flies in the face of what we know from other translocation studies, but lots of genetic variation was hands-down the best predictor of whether a tortoise lived or died," said Shaffer, a professor of ecology and evolutionary biology and director of the UCLA La Kretz Center for California Conservation Science. "Relocating endangered plants and animals is increasingly necessary to counteract the effects of climate change, and this gives us a new tool to increase survival rates."

Although the relationship between heterozygosity and survival was well supported by the study, it's unclear why greater genetic variation is linked to survival rates, said former UCLA postdoctoral scholar Peter Scott, the study's lead author.

"Potentially, individuals with higher heterozygosity have more genomic flexibility," said Scott, who is now an assistant professor at West Texas A&M University. "It's likely that tortoises with more variation have a better chance of having one copy of a gene that works really well in stressful or new environments compared to those individuals with two identical copies that only work really well in their environment of origin."

The researchers wanted to make tortoise conservation efforts more effective, and uncover trends that would help other species as well, Scott said.

"Oftentimes, the chances of success for relocating plants or animals is pretty dismal," he said. "We wanted to understand why, and use that understanding to increase survival."

Over the years, tortoises that were given up as pets, or removed from places like developments in suburban Las Vegas and solar farms in the desert, were surrendered to the U.S. Fish and Wildlife Service.

The agency took blood samples to screen for diseases and marked each animal before releasing them into the Ivanpah Valley site, which enabled the animals to be tracked in later surveys. The UCLA researchers sequenced blood samples drawn from 79 tortoises that were released to the site and were known to be alive in 2015, and from another 87 known to have died after they were released at the site.

Although the Large Scale Translocation Site provided an intriguing dataset, it's not the same as a controlled experiment. Additional studies would be needed to understand why more heterozygous tortoises had a higher survival probability and precisely how much of an increase in genetic variation improves a tortoise's odds of surviving.

"The only reason we could do this study was because the U.S. Fish and Wildlife Service was incredibly forward-thinking when they set up the translocation site and tracked who lived and died," Shaffer said. "Many died, and no one was happy about that. But we can learn a lot from that unfortunate result to help conservation management improve.

"When thinking about moving animals or plants out of danger, or repopulating an area emptied by wildfire, now we can easily and economically measure genetic variability to better gauge the survival probability of those translocated individuals. It's not the only criteria, but it's an important piece of the puzzle."

Credit: 
University of California - Los Angeles

Offshore submarine freshwater discovery raises hopes for islands worldwide

image: University of Hawai'i Research Affiliate Faculty Eric Attias at Wailupe Beach Park on O'ahu.

Image: 
University of Hawai'i

Twice as much freshwater is stored offshore of Hawai'i Island as was previously thought, according to a University of Hawai'i study with important implications for volcanic islands around the world. An extensive reservoir of freshwater within the submarine southern flank of the Hualālai aquifer has been mapped by UH researchers with the Hawai'i EPSCoR 'Ike Wai project. The groundbreaking findings, published in Science Advances, reveal a novel way in which substantial volumes of freshwater are transported from onshore to offshore submarine aquifers along the coast of Hawai'i Island.

This mechanism may provide alternative renewable resources of freshwater to volcanic islands worldwide. "Their evidence for separate freshwater lenses, stacked one above the other, near the Kona coast of Hawai'i, profoundly improves the prospects for sustainable development on volcanic islands," said UH Mānoa School of Ocean and Earth Science and Technology (SOEST) Dean Brian Taylor.

Paradigm shift

Through the use of marine controlled-source electromagnetic imaging, the study revealed the onshore-to-offshore movement of freshwater through a multilayer formation of basalts embedded between layers of ash and soil, diverging from previous groundwater models of this area. The work was conducted as part of the National Science Foundation-supported 'Ike Wai project, with research affiliate faculty member Eric Attias leading the marine geophysics campaign.

"Our findings provide a paradigm shift from the conventional hydrologic conceptual models that have been vastly used by multiple studies and water organizations in Hawai'i and other volcanic islands to calculate sustainable yields and aquifer storage for the past 30 years," said Attias. "We hope that our discovery will enhance future hydrologic models, and consequently, the availability of clean freshwater in volcanic islands."

Co-author Steven Constable, a professor of geophysics at the Scripps Institution of Oceanography, who developed the controlled source electromagnetic system used in the project, said, "I have spent my entire career developing marine electromagnetic methods such as the one used here. It is really gratifying to see the equipment being used for such an impactful and important application. Electrical methods have long been used to study groundwater on land, and so it makes sense to extend the application offshore."

Kerry Key, an associate professor at Columbia University who employs electromagnetic methods to image various oceanic Earth structures and who was not involved in this study, said, "This new electromagnetic technique is a game changing tool for cost-effective reconnaissance surveys to identify regions containing freshwater aquifers, prior to more expensive drilling efforts to directly sample the pore waters. It can also be used to map the lateral extent of any aquifers already identified in isolated boreholes."

Two-times more water

Donald Thomas, a geochemist with the Hawai'i Institute of Geophysics and Planetology in SOEST who also worked on the study, said the findings confirm the presence of roughly twice as much stored groundwater as previously thought.

"Understanding this new mechanism for groundwater...is important to better manage groundwater resources in Hawai'i," said Thomas, who leads the Humu?ula Groundwater Research project, which found another large freshwater supply on Hawai'i Island several years ago.

Offshore freshwater systems similar to those flanking the Hualālai aquifer are thought to be present around the island of O'ahu, where the electromagnetic imaging technique has not yet been applied but, if demonstrated, could provide an entirely new way to manage freshwater resources.

The study proposes that this newly discovered transport mechanism may be the governing mechanism in other volcanic islands. With offshore reservoirs considered more resilient to climate change-driven droughts, volcanic islands worldwide can potentially consider these resources in their water management strategies.

Credit: 
University of Hawaii at Manoa

Landmark study generates first genomic atlas for global wheat improvement

image: Curtis Pozniak in wheat field

Image: 
Christina Weese/USask

SASKATOON - In a landmark discovery for global wheat production, a University of Saskatchewan-led international team has sequenced the genomes for 15 wheat varieties representing breeding programs around the world, enabling scientists and breeders to much more quickly identify influential genes for improved yield, pest resistance and other important crop traits.

The research results, just published in Nature, provide the most comprehensive atlas of wheat genome sequences ever reported. The 10+ Genome Project collaboration involved more than 95 scientists from universities and institutes in Canada, Switzerland, Germany, Japan, the U.K., Saudi Arabia, Mexico, Israel, Australia, and the U.S.

"It's like finding the missing pieces for your favorite puzzle that you have been working on for decades," said project leader Curtis Pozniak, wheat breeder and director of the USask Crop Development Centre (CDC). "By having many complete gene assemblies available, we can now help solve the huge puzzle that is the massive wheat pan-genome and usher in a new era for wheat discovery and breeding."

Scientific groups across the global wheat community are expected to use the new resource to identify genes linked to in-demand traits, which will accelerate breeding efficiency.

"This resource enables us to more precisely control breeding to increase the rate of wheat improvement for the benefit of farmers and consumers, and meet future food demands," Pozniak said.

One of the world's most cultivated cereal crops, wheat plays an important role in global food security, providing about 20 per cent of human caloric intake globally. It's estimated wheat production must increase by more than 50 per cent by 2050 to meet an increasing global demand.

In 2018 as part of another international consortium, USask researchers played a key role in decoding the genome for the bread wheat variety Chinese Spring, the first complete wheat genome reference and a significant technical milestone. The findings were published in the journal Science.

"Now we have increased the number of wheat genome sequences more than 10-fold, enabling us to identify genetic differences between wheat lines that are important for breeding," Pozniak said. "We can now compare and contrast the full complement of the genetic differences that make each variety unique."

Nils Stein of the Leibniz Institute of Plant Genetics and Crop Plant Research (IPK) and project co-leader from Germany said, "Given the significant impact of the Chinese Spring reference genome on research and application, it is a major achievement that just two years later we are providing additional sequence resources that are relevant to wheat improvement programs in many different parts of the world."

The 10+ Genome study represents the start of a larger effort to generate thousands of genome sequences of wheat, including genetic material brought in from wheat's wild relatives.

The research team was able to track the unique DNA signatures of genetic material that breeders have incorporated into modern cultivars from several of wheat's undomesticated relatives over the past century.

"These wheat relatives have been used by breeders to improve disease resistance and stress resistance of wheat," said Pozniak. "One of these relatives contributed a DNA segment to modern wheat that contains disease-resistant genes and provides protection against a number of fungal diseases. Our collaborators from Kansas State University and CIMMYT (Mexico) showed that this segment can improve yields by as much as 10 per cent. Since breeding is a continual improvement process, we can continue to cross plants to select for this valuable trait."

Pozniak's team, in collaboration with scientists from Agriculture and Agri-Food Canada and National Research Council of Canada, also used the genome sequences to isolate an insect-resistant gene (called Sm1) that enables wheat plants to withstand the orange wheat blossom midge, a pest which can cause more than $60 million in annual losses to Western Canadian producers.

"Understanding a causal gene like this is a game-changer for breeding because you can select for pest resistance more efficiently by using a simple DNA test than by manual field testing," Pozniak said.

The USask team also included the paper's first author Sean Walkowiak (formerly with Pozniak's team and now with the Canadian Grain Commission), computer scientist Carl Gutwin who developed visualization software and a user-friendly database to compare the genome sequences, and Andrew Sharpe, director of genomics and bioinformatics at the USask Global Institute for Food Security, who did sequencing work through the Omics and Precision Agriculture Laboratory (OPAL), a state-of-the-art laboratory that provides genomics, phenomics and bioinformatics services.

The 10+ Genome Project was sanctioned as a top priority by the Wheat Initiative, a co-ordinating body of international wheat researchers.

"This project is an excellent example of co-ordination across leading research groups around the globe. Essentially every group working in wheat gene discovery, gene analysis and deployment of molecular breeding technologies will use the resource," said Wheat Initiative Scientific Co-ordinator Peter Langridge.

Canadian funding came from the Canadian Triticum Applied Genomics (CTAG2) research project funded by Genome Canada, Genome Prairie, the Western Grains Research Foundation, Government of Saskatchewan, Saskatchewan Wheat Development Commission, Alberta Wheat Commission, Viterra, Manitoba Wheat and Barley Growers Association, and the Canada First Research Excellence Fund through USask's Plant Phenotyping and Imaging Research Centre (P2IRC) initiative.

"This project is a prime example of how genomics can support increased resilience in food production and strengthen Canada's export leadership," said Genome Canada President and CEO Rob Annan.

"Deploying genomics to adapt agricultural production to climate change, address food and nutritional insecurity, and improve crop health is good for farmers and consumers, and our economy will see tangible returns from this research. Genome Canada is immensely proud of the exceptional work by the Canadian researchers and their international collaborators, which underscores the potential of genomics to make a positive impact on the lives of Canadians and others around the world."

Credit: 
University of Saskatchewan

Research provides new insights on health effects of long-duration space flight

image: Professor Susan Bailey and Dr. Kjell Lindgren, NASA astronaut, during a visit to Bailey's lab in 2016. Lindgren is an alumnus of Colorado State University.

Image: 
CSU Photography

The historic NASA Twins Study investigated identical twin astronauts Scott and Mark Kelly and provided new information on the health effects of spending time in space.

Colorado State University Professor Susan Bailey was one of more than 80 scientists across 12 universities who conducted research on the textbook experiment; Mark remained on Earth while Scott orbited high above for nearly one year. The massive effort was coordinated by NASA's Human Research Program.

Bailey has continued her NASA research and now joins more than 200 investigators from dozens of academic, government, aerospace and industry groups to publish a package of 30 scientific papers in five Cell Press journals on Nov. 25.

Jared Luxton, who recently received his doctoral degree in cell and molecular biology at CSU, is the first author of two of the studies. He is now a data scientist with the United States Department of Agriculture in Fort Collins.

The research - including an over-arching paper that covers what the investigators have learned about the fundamental features of space flight - represents the largest set of space biology and astronaut health effects data ever produced.

For Bailey, it is also a milestone marking many years of working with NASA, which included her lead role on basic radiation studies and the honor of being selected as an investigator for the Twins Study and concurrent research projects involving astronauts. During this time, several graduate students in her lab earned doctoral degrees under her mentorship.

"We now have a foundation to build on - things we know to look for in future astronauts, including telomere length changes and DNA damage responses," Bailey said. "Going forward, our goal is to get a better idea of underlying mechanisms, of what's going on during long-duration space flight in the human body and how it varies between people. Not everybody responds the same way. That was one of the good things about having the larger cohort of astronauts in these studies."

Studying the ends of chromosomes, with implications for aging

Bailey is an expert on telomeres and radiation-induced DNA damage, areas of research that were of keen interest around the world when the Twins Study was published. In that study, she and her team found that Scott's telomeres in his white blood cells got longer while in space, and subsequently returned to near normal length after he was back on Earth.

Telomeres are protective "caps" on the ends of chromosomes that shorten as a person ages. Large changes in telomere length could mean a person is at risk for accelerated aging or for the diseases that come along with getting older, such as cardiovascular disease and cancer.

In the latest research, Bailey, Luxton, Senior Research Associate Lynn Taylor and team studied a group of 10 unrelated astronauts, including CSU alum Dr. Kjell Lindgren, comparing the results with findings from the Kelly twins. The researchers did not have access to in-flight blood and other samples for all of the crewmembers, but Bailey said they did have blood samples before and after space flight for everyone.

The investigations involved astronauts who spent approximately six months on the International Space Station in low-Earth orbit, which is protected from some space radiation. Despite the protection, scientists found evidence of DNA damage that could be warning signs of potential health effects.

New findings on oxidative stress

Among the new findings, the research team reports that chronic oxidative stress during spaceflight contributed to the telomere elongation they observed, and that astronauts in general had shorter telomeres after spaceflight than before. Responses also varied from one individual to another.

To gain more insight on these findings, Bailey's team also studied twin mountain climbers who scaled Mt. Everest, an extreme environment on Earth. The non-climbing twins remained at lower altitude, including in Boulder, Colorado. Remarkably, the team found similar evidence of oxidative stress and changes in telomere length in the climbers.

Christopher Mason, associate professor at Weill Cornell Medicine and a co-author with Bailey, performed gene expression analyses on the Mt. Everest climbers. He found evidence of a telomerase-independent, recombination-based pathway of telomere length maintenance known to result in longer telomeres.

Bailey said that when chronic oxidative stress occurs, it damages telomeres.

"Normal blood cells are dying and trying to survive," she said. "They're adapting to their new environment. Some cells will activate an alternative pathway to keep their telomeres going. It's similar to what happens with some tumors. Some of the cells emerge from that process. That's what we think we're seeing during spaceflight as well."

Luxton said the mechanism described above - known as alternative lengthening of telomeres, or ALT - was an unexpected finding.

"You usually see that in cancer or in developing embryos," he said.

Take care of your telomeres

Echoing conclusions from the Twins Study, Bailey said the new findings have implications for future space travelers, whether they are establishing a base on the Moon, traveling to Mars, or flying as space tourists. Long-duration exploration missions will involve increased time and distance beyond the protection of the Earth.

Although longer telomeres in space might seem like a good thing, perhaps even a "fountain of youth," the scientist said she suspects a somewhat different ending to the story.

"Extended lifespan, or immortality, of cells that have suffered space radiation-induced DNA damage, such as chromosomal inversions, is a recipe for increased cancer risk," she said.

Bailey said she and the team observed increased frequencies of inversions in all crewmembers, during and after spaceflight.

"Telomeres really are reflective of our lifestyles - whether on or off the planet," said Bailey. "Our choices do make a difference in how quickly or how well we are aging. It's important to take care of your telomeres."

Credit: 
Colorado State University

Early birth linked to greater risk of hospital visits during childhood

Being born early (before 37 weeks' gestation) is associated with a higher risk of hospital admission throughout childhood than being born at full term (40 weeks' gestation), finds a study published by The BMJ today.

Although the risk declined as the children grew up, particularly after age 2, an excess risk remained up to age 10, even for children born at 38 and 39 weeks' gestation, representing many potentially vulnerable children, say the researchers.

Preterm birth is a major contributor to childhood ill health. Existing evidence suggests that the risk of illness associated with preterm birth declines as children grow up, but it remains unclear at what age this begins to happen and how these changes vary by week of gestational age at birth.

To explore this further, a team of UK researchers set out to examine the association between gestational age at birth and hospital admissions up to age 10, and how admission rates change throughout childhood.

Their findings are based on data from more than 1 million children born in NHS hospitals in England between 1 January 2005 and 31 December 2006. Children were monitored from birth until 31 March 2015 (an average of 9.2 years per child), during which time the researchers analysed numbers of hospital admissions.
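To make the person-years framing used below concrete, here is a minimal illustrative sketch (not taken from the study; the birth dates are invented) of how each child's follow-up time from birth to the 31 March 2015 censoring date would be summed into a person-years denominator.

```python
from datetime import date

# Hypothetical birth dates for three children in the 2005-06 cohort
births = [date(2005, 3, 14), date(2006, 7, 2), date(2005, 11, 20)]
study_end = date(2015, 3, 31)  # censoring date used in the study

# Follow-up time for each child, in years (days / 365.25)
follow_up_years = [(study_end - b).days / 365.25 for b in births]

person_years = sum(follow_up_years)          # total person-years of follow-up
mean_follow_up = person_years / len(births)  # average follow-up per child

print(f"Total person-years: {person_years:.1f}")
print(f"Mean follow-up per child: {mean_follow_up:.1f} years")
```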

Gestational age at birth was analysed in weeks, from less than 28 up to 42 weeks.

Over 1.3 million hospital admissions occurred during the study period, of which 831,729 (63%) were emergency admissions. Just over half (525,039) of children were admitted to hospital at least once during the study period.

After taking account of other potentially influential risk factors, such as mother's age, marital status and level of social deprivation, and child's sex, ethnicity and month of birth, the researchers found that hospital admissions during childhood were strongly associated with gestational age at birth.

The hospital admission rate during infancy in babies born at 40 weeks was 28 per 100 person years - this figure was about six times higher in babies born extremely prematurely (less than 28 weeks). By the time the children were aged 7-10 years, the hospital admission rate in children born at 40 weeks was 7 per 100 person years - this figure was about three times higher in those born at less than 28 weeks.

But even children born just a few weeks early had higher admission rates. Compared with infants born at 40 weeks, being born at 37, 38, and 39 weeks' gestation was associated with an extra 19, 9, and 3 admissions per 100 person years during infancy, respectively.
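To illustrate how the per-100-person-year figures above are calculated, the following sketch uses invented admission and follow-up counts chosen only so that the resulting rates echo those quoted in the text; the study derives the real numbers from more than 1 million children's records.

```python
# Hypothetical counts; only the resulting rates mirror the figures quoted above.
groups = {
    "40 weeks": {"admissions": 2_800, "person_years": 10_000},  # ~28 per 100 py
    "37 weeks": {"admissions": 470,   "person_years": 1_000},   # ~47 per 100 py
}

def rate_per_100_person_years(group):
    """Admission rate per 100 person-years of follow-up."""
    return 100 * group["admissions"] / group["person_years"]

reference = rate_per_100_person_years(groups["40 weeks"])
for name, group in groups.items():
    rate = rate_per_100_person_years(group)
    # Rate difference relative to the 40-week reference group
    print(f"{name}: {rate:.0f} admissions per 100 person years "
          f"(difference vs 40 weeks: {rate - reference:+.0f})")
```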

The risk of hospital admission associated with gestational age decreased over time, particularly after age 2. However, an excess risk remained up to age 10, even for children born at 38 and 39 weeks' gestation.

Although this excess risk at 38 and 39 weeks was relatively small, the large number of babies born globally at these gestational ages means the associated admissions are likely to have a large impact on hospital services, say the researchers.

Infections were the main cause of excess hospital admissions at all ages, but particularly during infancy. Respiratory and gastrointestinal conditions also accounted for a large proportion of admissions during the first two years of life.

This is an observational study, so it cannot establish cause, and the researchers point to some limitations, such as being unable to take account of several factors that can affect child health, like maternal smoking and breastfeeding.

However, they say this was a large study using routinely collected data over a 10 year period, and the findings remained relatively stable after further analyses, suggesting that the results withstand scrutiny.

As such, the researchers say their findings indicate that gestational age at birth "is a strong predictor of childhood illness, with those born extremely preterm being at the greatest risk of hospital admission throughout childhood."

And the finding that infections were the main cause of excess hospital admissions at all ages prompts the researchers to call for targeted strategies to help prevent and better manage childhood infections.

Future research should also consider gestational age as a continuum and explore it for outcomes week by week, they conclude.

Credit: 
BMJ Group