Tech

Climate change increases the risk of wildfires, confirms new review

Human-induced climate change promotes the conditions on which wildfires depend, increasing their likelihood - according to a review of research on global climate change and wildfire risk published today.

In light of the Australian fires, scientists from the University of East Anglia (UEA), Met Office Hadley Centre, University of Exeter and Imperial College London have conducted a Rapid Response Review of 57 peer-reviewed papers published since the IPCC's Fifth Assessment Report in 2013.

All the studies show links between climate change and increased frequency or severity of fire weather - periods with a high fire risk due to a combination of high temperatures, low humidity, low rainfall and often high winds - though some note anomalies in a few regions.

Rising global temperatures, more frequent heatwaves and, in some regions, associated droughts increase the likelihood of wildfires by promoting hot and dry conditions. The resulting fire weather can be used as an overall measure of the impact of climate change on the risk of fires occurring.

Observational data shows that fire weather seasons have lengthened across approximately 25 per cent of the Earth's vegetated surface, resulting in about a 20 per cent increase in the global mean length of the fire weather season.

The literature review was carried out using the new ScienceBrief.org online platform, set up by UEA and the Tyndall Centre for Climate Change Research. ScienceBrief is written by scientists and aims to share scientific insights with the world and keep pace with the latest science by making sense of peer-reviewed publications in a rapid and transparent way.

Dr Matthew Jones, Senior Research Associate at UEA's Tyndall Centre and lead author of the review, said: "Overall, the 57 papers reviewed clearly show human-induced warming has already led to a global increase in the frequency and severity of fire weather, increasing the risks of wildfire.

"This has been seen in many regions, including the western US and Canada, southern Europe, Scandinavia and Amazonia. Human-induced warming is also increasing fire risks in other regions, including Siberia and Australia.

"However, there is also evidence that humans have significant potential to control how this fire risk translates into fire activity, in particular through land management decisions and ignition sources."

At the global scale, burned area has decreased in recent decades, largely due to clearing of savannahs for agriculture and increased fire suppression. In contrast, burned area has increased in closed-canopy forests, likely in response to the dual pressures of climate change and forest degradation.

Co-author Professor Richard Betts, Head of Climate Impacts Research at the Met Office Hadley Centre and University of Exeter, said: "Fire weather does occur naturally but is becoming more severe and widespread due to climate change. Limiting global warming to well below 2°C would help avoid further increases in the risk of extreme fire weather."

Professor Iain Colin Prentice, Chair of Biosphere and Climate Impacts and Director of the Leverhulme Centre for Wildfires, Environment and Society, Imperial College London, added: "Wildfires can't be prevented, and the risks are increasing because of climate change. This makes it urgent to consider ways of reducing the risks to people. Land planning should take the increasing risk in fire weather into account."

The Rapid Response Review is published on ScienceBrief. The papers used in the review can be viewed at https://sciencebrief.org/topics/climate-change-science/wildfires.

This is the first review to use the ScienceBrief resource, with further work planned on areas related to climate change science and its impacts in the run up to the United Nations Climate Change Conference - COP26 - in November.

Credit: 
University of East Anglia

Man versus machine: Can AI do science?

image: The pyrochlore crystal structure contains magnetic atoms, which are arranged to form a lattice of tetrahedral shapes, joined at each corner.

Image: 
Theory of Quantum Matter Unit, OIST

Over the last few decades, machine learning has revolutionized many sectors of society, with machines learning to drive cars, identify tumors and play chess - often surpassing their human counterparts.

Now, a team of scientists based at the Okinawa Institute of Science and Technology Graduate University (OIST), the University of Munich and the CNRS at the University of Bordeaux has shown that machines can also beat theoretical physicists at their own game, solving complex problems just as accurately as scientists, but considerably faster.

In the study, recently published in Physical Review B, a machine learned to identify unusual magnetic phases in a model of pyrochlore - a naturally occurring mineral with a tetrahedral lattice structure. Remarkably, when using the machine, solving the problem took only a few weeks, whereas previously the OIST scientists had needed six years.

"This feels like a really significant step," said Professor Nic Shannon, who leads the Theory of Quantum Matter (TQM) Unit at OIST. "Computers are now able to carry out science in a very meaningful way and tackle problems that have long frustrated scientists."

The Source of Frustration

In all magnets, every atom is associated with a tiny magnetic moment - also known as "spin." In conventional magnets, like the ones that stick to fridges, all the spins are ordered so that they point in the same direction, resulting in a strong magnetic field. This order is like the way atoms order in a solid material.

But just as matter can exist in different phases - solid, liquid and gas - so too can magnetic substances. The TQM unit is interested in more unusual magnetic phases called "spin liquids", which could have uses in quantum computation. In spin liquids, there are competing, or "frustrated" interactions between the spins, so instead of ordering, the spins continuously fluctuate in direction - similar to the disorder seen in liquid phases of matter.

Previously, the TQM unit set out to establish which different types of spin liquid could exist in frustrated pyrochlore magnets. They constructed a phase diagram, which showed how different phases could occur when the spins interacted in different ways as the temperature changed, with their findings published in Physical Review X in 2017.

But piecing together the phase diagram and identifying the rules governing the interactions between spins in each phase was an arduous process.

"These magnets are quite literally frustrating," joked Prof. Shannon. "Even the simplest model on a pyrochlore lattice took our team years to solve."

Enter the machines

With increasing advances in machine learning, the TQM unit was curious whether machines could solve such a complex problem.

"To be honest, I was fairly sure that the machine would fail," said Prof. Shannon. "This is the first time I've been shocked by a result - I've been surprised, I've been happy, but never shocked."

The OIST scientists teamed up with machine learning experts from the University of Munich, led by Professor Lode Pollet, who had developed a "tensorial kernel" - a way of representing spin configurations in a computer. The scientists used the tensorial kernel to equip a "support vector machine", which is able to categorize complex data into different groups.

"The advantage of this type of machine is that unlike other support vector machines, it doesn't require any prior training and it isn't a black box - the results can be interpreted. The data are not only classified into groups; you can also interrogate the machine to see how it made its final decision and learn about the distinct properties of each group," said Dr Ludovic Jaubert, a CNRS researcher at the University of Bordeaux.

The Munich scientists fed the machine a quarter of a million spin configurations generated by the OIST supercomputer simulations of the pyrochlore model. Without any information about which phases were present, the machine successfully managed to reproduce an identical version of the phase diagram.
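For readers curious what this kind of classification looks like in code, here is a minimal, purely illustrative sketch in Python. It uses scikit-learn's ordinary kernel support vector machine on synthetic "spin configuration" vectors; the study itself used a specialised tensorial-kernel machine that, unlike this stand-in, needs no phase labels, and every number and data point below is invented for the example.

```python
# Illustrative sketch only: a generic kernel SVM separating toy spin
# configurations into two phases. The study's tensorial-kernel SVM is a
# specialised, unsupervised variant; this supervised stand-in just shows
# the basic idea of kernel classification of spin data.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_sites = 16                      # spins per configuration (toy size)

# Toy data: "ordered" configurations (spins roughly aligned along z,
# plus small noise) versus "disordered" ones (random spin components),
# each flattened into a feature vector of 3 * n_sites numbers.
ordered = np.tile([0.0, 0.0, 1.0], (500, n_sites)) \
          + 0.1 * rng.normal(size=(500, 3 * n_sites))
disordered = rng.normal(size=(500, 3 * n_sites))
X = np.vstack([ordered, disordered])
y = np.array([0] * 500 + [1] * 500)   # phase labels (toy)

# A degree-2 polynomial kernel can pick up quadratic order parameters
# (spin-spin correlations), loosely analogous in spirit to the rank-2
# tensors a tensorial kernel probes.
clf = SVC(kernel="poly", degree=2).fit(X, y)
print("training accuracy:", clf.score(X, y))
```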

Importantly, when the scientists deciphered the "decision function" which the machine had constructed to classify different types of spin liquid, they found that the computer had also independently figured out the exact mathematical equations that exemplified each phase - with the whole process taking a matter of weeks.

"Most of this time was human time, so further speed ups are still possible," said Prof. Pollet. "Based on what we now know, the machine could solve the problem in a day."

"We are thrilled by the success of the machine, which could have huge implications for theoretical physics," added Prof. Shannon. "The next step will be to give the machine an even more difficult problem, that humans haven't managed to solve yet, and see whether the machine can do better."

Credit: 
Okinawa Institute of Science and Technology (OIST) Graduate University

Mysteries of grasshopper response to gravity unlocked

image: Jacob Campbell, ASU PhD candidate in the School of Life Sciences, readies a grasshopper for x-ray imaging at Argonne National Laboratory.

Image: 
Jake Socha

If you jump out of bed too quickly, you might feel a bit light-headed.

That's because when you're lying down, gravity causes your blood to pool in the lower parts of your body rather than in your brain. Fortunately, when you stand up, within a fraction of a second, your heart begins beating faster, moving the blood to your brain and allowing you to maintain your balance.

The opposite happens when you're standing on your head. Gravity causes the blood to rush to your brain, so your heart beats more slowly and the blood vessels leading to your brain constrict to prevent too much blood from building up there.

But insects don't have closed circulatory systems with vessels that can restrict fluid flow to certain parts of the body. So how do they control the effects of gravity when they need to climb a tree or hang upside down on a branch waiting for prey?

Arizona State University School of Life Sciences Professor Jon Harrison, along with Professor Jake Socha of Virginia Tech's College of Engineering, has published the first study that demonstrates how insects adjust their cardiovascular and respiratory activity in response to gravity. The findings appear today in Proceedings of the National Academy of Sciences.

"Interestingly, this has never been looked at in invertebrates," Harrison said. "It's something I've always been interested in because the blood is not in vessels. It's an open circulatory system so the typical biologist would probably say, well the blood must just be sloshing around in there, but we actually don't know much about what's going on with blood circulation in insects."

But thanks to this study, he's now beginning to have an idea. Harrison and his colleagues took x-ray images of live insects and discovered that when grasshoppers were in a head-up position, their heads were filled with open air sacs and very little fluid (known as hemolymph, or invertebrate blood). In their abdomens, it was the opposite: compressed air sacs and lots of fluid.

However, when grasshoppers were head-down, their heads were filled with fluid and compressed air sacs, while their abdomens had very little fluid and many open air sacs.

To learn more, they injected a radioactive tracer to track the hemolymph through the body and found that it was reacting to gravity. This indicates that gravity causes blood to flow downward in grasshoppers just like in humans.

And, interestingly, grasshoppers could substantially combat the effects of gravity on blood flow when awake but not when anesthetized. Thus, similar to vertebrates, grasshoppers have mechanisms to adjust to gravitational effects on their blood. So it's not just a pool of fluid sloshing around. But what are the mechanisms? Harrison and his colleagues are starting to figure that out.

First, just as in vertebrates, there seems to be some kind of functional valve within a grasshopper's body to prevent gravity-driven blood flow. Researchers at Virginia Tech showed that blood pressure is not related to gravity, supporting this hypothesis. In addition, blood pressure in a grasshopper's head is unrelated to blood pressure in its abdomen, further evidence of valving.

At ASU, undergraduate and postdoctoral researchers in Harrison's lab discovered that both heart and respiratory rates respond to body orientation and gravity. The grasshoppers that had their heads down (similar to a human standing on his or her head) had decreased heart rates to reduce fluid pooling in the brain. However, their ventilation rate increased. Harrison said they think this is because air sacs are being compressed around the brain so it's struggling to get enough oxygen.

"So, grasshoppers have at least three ways to compensate for gravity; variation in heart rate, breathing rate and functional valving. And I'm sure there's other stuff we don't know about," said Harrison.

As for other aspects of their physiology, insect bodies are capable of sophisticated responses to their active lifestyles.

"If you watch grasshoppers, they're all over the place. They're head up, head down, sideways," Harrison said. "They're very flexible in their body orientation, as are most insects. And now we know that when they change their orientation they have to respond to gravity just like humans, and they even show many of the same physiological responses. This is a dramatic example showing how similar animals are physiologically, despite how different they may appear."

Credit: 
Arizona State University

Participants in environmental health studies vulnerable to re-identification

Newton, Mass. (January 13, 2020) -- Before sharing human research data, scientists routinely strip it of personal information such as name, address, and birthdate in order to protect the privacy of their study participants. However, reporting in the journal Environmental Health Perspectives, researchers at Silent Spring Institute and their colleagues show that for environmental health studies, that might not be enough--even anonymized data can sometimes be traced back to individuals.

The new study highlights the need for greater protections for participants in human research studies. It also has implications for a proposed federal rule by the U.S. Environmental Protection Agency (EPA) that would require scientists to make their data public in order for their research to be used as a basis for environmental regulations.

"Researchers promise to protect the privacy of their study participants--a routine practice in nearly all scientific studies involving people," says lead author Katherine Boronow, a staff scientist at Silent Spring. "Our research shows that making data publicly available from environmental health studies, even after obvious identifiers are removed, could violate these pledges."

In a previous study, Silent Spring researchers conducted an experiment in which they shared anonymized data from the Institute's Household Exposure Study in California with a team of Harvard researchers skilled in re-identification techniques. By linking housing and demographic data from the study to publicly available data such as tax assessor records, and using other information described in the study such as the location of the housing developments and the levels of indoor air pollutants measured, the team successfully re-identified 25 percent of participants from one housing development by name.
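As a generic illustration of how such a linkage attack works (not the Harvard team's actual method), the sketch below joins an "anonymized" study table to a public records table on shared quasi-identifiers; every column name and record is invented for the example.

```python
# Hypothetical linkage attack: if the combination of quasi-identifiers
# (e.g. housing attributes) is unique, joining the anonymized study data
# to a public dataset restores names. All data here are made up.
import pandas as pd

study = pd.DataFrame({
    "participant_id": ["P1", "P2"],
    "year_built": [1962, 1987],
    "num_bedrooms": [3, 4],
})

public_records = pd.DataFrame({      # e.g. tax assessor data (invented)
    "owner_name": ["A. Smith", "B. Jones"],
    "year_built": [1962, 1987],
    "num_bedrooms": [3, 4],
})

# The join re-identifies participants whose attribute combination is unique.
linked = study.merge(public_records, on=["year_built", "num_bedrooms"])
print(linked[["participant_id", "owner_name"]])
```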

Now, in this latest investigation, the researchers show that vulnerability to re-identification is a common aspect of environmental health data. They reviewed a dozen environmental health studies and identified five different types of data (location, medical, genetic, occupation, and housing) that overlap with outside databases and could contribute to the risk of re-identification.

The researchers found that all 12 studies included at least two out of the five data types, and three studies included all five. "Having multiple data types provides more opportunities for someone to match research data against existing commercial or public databases," says Boronow.

Measurements of pollutants in people's bodies or in their homes are also a characteristic data type of many environmental health studies. Currently, however, these measurements alone are less vulnerable to data linkage because there are few databases that include chemical measurements that could be used for matching.

To explore a different way that chemical exposure data might be used in re-identification, the team conducted a cluster analysis using data from Silent Spring's Household Exposure Study in California and in Massachusetts and from the Centers for Disease Control's Green Housing Study in Boston and Cincinnati. They fed the raw chemical measurements to an algorithm that sorted the data within each study into two groups. The groups created by the algorithm corresponded to geographic location with 80 to 98 percent accuracy.

If the data cluster into groups by location, says Boronow, then each group can be matched to data narrowed to that location, making it more likely for a re-identification attack to produce correct matches. This shows how someone could use chemical data to infer a characteristic of people in a study even if that characteristic is excluded when the study data are shared.
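The sketch below illustrates the general idea with synthetic data: an off-the-shelf two-group clustering of raw chemical measurements, checked afterwards against withheld location labels. The data, the algorithm choice (k-means) and all numbers are illustrative assumptions, not the study's own.

```python
# Sketch of the kind of cluster analysis described above: unsupervised
# 2-group clustering of chemical measurements, then checking how well
# the clusters line up with (withheld) location labels.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Two hypothetical sites with systematically different pollutant profiles.
site_a = rng.normal(loc=1.0, scale=0.5, size=(60, 10))   # 10 chemicals
site_b = rng.normal(loc=2.0, scale=0.5, size=(60, 10))
X = np.vstack([site_a, site_b])
location = np.array([0] * 60 + [1] * 60)                 # held-out labels

clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
# Cluster indices are arbitrary, so score the better of the two labelings.
agreement = max(np.mean(clusters == location), np.mean(clusters != location))
print(f"cluster/location agreement: {agreement:.0%}")
```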

Data sharing has many benefits. By pooling data, researchers can create larger, more diverse datasets that could lead to advances in knowledge. It can also give researchers access to data that are difficult or expensive to obtain, such as data from biological or environmental samples collected after an environmental disaster. However, as the new study shows, it also has its risks.

Dr. Julia Brody, executive director at Silent Spring and a co-author of the study, says the implications of privacy risks are not trivial. Loss of privacy could result in stigma for individuals and communities. It could affect property values, insurance, or a person's chances of employment. It could also damage trust in research.

In 2018, EPA released a proposed rule called "Strengthening Transparency in Regulatory Science," which would require researchers to disclose their raw data as a precondition for the agency using a study to support regulatory decisions. Because the requirement could jeopardize confidential information about study participants, it could disqualify critical environmental health studies that form the basis of existing regulations, such as current limits on air pollutants. EPA is expected to release a revised version of the proposed rule early this year.

"Thousands of Americans have contributed personal data to scientific research with the goal of improving health for all," says Brody. "We must not take advantage of their generosity with rules that threaten their privacy and discourage future participation in research."

With growing pressure on scientists to share their data, and with more consumer data available online, Brody says it is important to fully characterize the risks of data sharing and identify solutions. Results from their research, she says, could help scientists develop informed consent documents that are more forthcoming about the risks and could help determine what types of data should be excluded from public sharing. It could also lay the groundwork for legal and policy protections for participants should they fall victim to re-identification.

Credit: 
Silent Spring Institute

Using light to learn

image: Courtship behavior of male fruit flies (left). Light triggers gene expression for memory maintenance (right).

Image: 
Inami et al., JNeurosci 2020

Maintaining long-term memories requires environmental light, according to research in fruit flies recently published in JNeurosci.

Memories begin in a temporary form and are converted into long-term memories as protein expression and brain circuits change. But long-term memories require active maintenance in order to survive the changing molecular landscape of the brain. Previous research indicates that exposure to different colors of light alters memory function in humans and animals, but the role of natural lighting conditions in memory maintenance remains unknown.

Inami et al. explored this question by testing whether male fruit flies could learn, through courting unreceptive females, that their advances would be rejected. After the learning period, the male fruit flies were exposed to constant darkness, constant light, or a 12-hour light/dark cycle. The flies experiencing a light/dark cycle retained the memory for five days, whereas flies kept in constant darkness could not maintain it. The researchers found that environmental light exposure activates light-sensitive neurons, triggering the production of memory maintenance proteins. Darkness during the learning period did not affect memory formation, indicating that light is required for the maintenance, but not the creation, of long-term memories.

Credit: 
Society for Neuroscience

Research identifies new route for tackling drug resistance in skin cancer cells

image: A drug-resistant melanoma cell that has altered its cytoskeleton.

Image: 
Queen Mary University of London

Researchers have found that melanoma cells fight anti-cancer drugs by changing their internal skeleton (cytoskeleton) - opening up a new therapeutic route for combatting skin and other cancers that develop resistance to treatment.

The team, led by Queen Mary University of London, found that melanoma cells stop responding to both immunotherapies and drugs targeted at the tumour's faulty genes (B-RAF or N-Ras mutations in the MAPK pathway) by increasing the activity of two cytoskeletal proteins - ROCK and Myosin II. The team found that these molecules were key for cancer cell survival and resistance to these treatments.

The molecules had previously been linked to the process of metastatic spread but not to the poor impact of current anti-melanoma therapies. This work points to a strong connection between metastasis and therapy resistance - confirming that the cytoskeleton is important in determining how aggressive a cancer is.

The findings are published in the journal Cancer Cell today.

Malignant melanoma has very poor survival rates despite being at the forefront of personalised immunotherapy. This is largely due to the development of resistance. Around 16,000 people in the UK are diagnosed with malignant melanoma each year, with more than 2,300 deaths.

Tests in mice suggest that the therapy resistant (or non-responding) tumours are effectively addicted to ROCK-Myosin II to grow. The team discovered that blocking the ROCK-Myosin II pathway not only reduces cancer cell growth, but also attacks faulty immune cells (macrophages and regulatory T cells) that are failing to kill the tumour. This action boosts anti-tumour immunity.

Lead author Professor Victoria Sanz-Moreno, Professor of Cancer Cell Biology at Queen Mary, said: "We were very surprised to find that the cancerous cells used the same mechanism, changing their cytoskeleton, to escape two very different types of drugs. In a nutshell, if you are a cancer cell, what does not kill you makes you stronger. However, their dependence on ROCK-Myosin II is a vulnerability that combination drug therapy tests on mice suggest we can exploit in the clinic by combining existing anti-melanoma therapies with ROCK-Myosin II inhibitors."

She said the research may have implications for cancers with similar faulty genes.

First author Jose L. Orgaz, Research Fellow at Queen Mary's Barts Cancer Institute, said: "An important observation was finding increased Myosin II activity levels in resistant human melanomas, which suggests its potential as a biomarker of therapy failure. Resistant melanomas also had increased numbers of faulty immune cells (macrophages and regulatory T cells), which could also contribute to the lack of response."

Fiona Miller Smith, chief executive of Barts Charity, which part-funded the research, said: "We are pleased to support the research led by the extremely talented Professor Victoria Sanz-Moreno and her passionate team who were recruited to the Barts Cancer Institute, as part of Barts Charity's recent £10m strategic investment into cancer research at Queen Mary. Overcoming resistance to cancer therapy is an important area of research, and we are extremely proud that our investment has contributed to these important new findings which could benefit melanoma patients."

Credit: 
Queen Mary University of London

Machine keeps human livers alive for one week outside of the body

image: A surgeon connects the donor liver to the perfusion machine.

Image: 
USZ

Researchers from the University Hospital Zurich, ETH Zurich, Wyss Zurich and the University of Zurich have developed a machine that repairs injured human livers and keeps them alive outside the body for one week. This breakthrough may increase the number of organs available for transplantation, saving the lives of many patients with severe liver disease or cancer.

Until now, livers could be stored safely outside the body for only a few hours. With the novel perfusion technology, livers - even injured livers - can now be kept alive outside of the body for an entire week. This is a major breakthrough in transplantation medicine, which may increase the number of available organs for transplantation and save the lives of many patients suffering from severe liver disease or a variety of cancers. Injured cadaveric livers, initially not suitable for use in transplantation, may regain full function while perfused in the new machine for several days. The basis for this technology is a complex perfusion system that mimics most core body functions close to physiology. The corresponding study was published on January 13 in the scientific journal Nature Biotechnology.

Offering what other machines cannot

"The success of this unique perfusion system - developed over a four-year period by a group of surgeons, biologists and engineers - paves the way for many new applications in transplantation and cancer medicine helping patients with no liver grafts available" explains Prof. Pierre-Alain Clavien, Chairman of the Department of Surgery and Transplantation at the University Hospital Zurich (USZ). When the project started in 2015, livers could only be kept on the machine for 12 hours. The seven-day successful perfusion of poor-quality livers now allows for a wide range of strategies, e.g. repair of preexisting injury, cleaning of fat deposits in the liver or even regeneration of partial livers.

Liver4Life: A project from Wyss Zurich

The Liver4Life project was developed under the umbrella of the Wyss Zurich institute, which brought together the highly specialized technical know-how and biomedical knowledge of experts from the University Hospital Zurich (USZ), ETH Zurich and the University of Zurich (UZH). "The biggest challenge in the initial phase of our project was to find a common language that would allow communication between the clinicians and engineers," explains Prof. Philipp Rudolf von Rohr, Professor of Process Engineering at ETH Zurich and co-leader with Professor Clavien of the study now published in Nature Biotechnology.

Technology with great potential

The inaugural study shows that six of the ten perfused poor-quality human livers, declined for transplantation by all centers in Europe, recovered to full function within one week of perfusion on the machine. The next step will be to use these organs for transplantation. The proposed technology opens up a wide range of applications, offering new hope for many patients with end-stage liver disease or cancer.

Credit: 
University of Zurich

Atlantic circulation collapse could cut British crop farming

Crop production in Britain will fall dramatically if climate change causes the collapse of a vital pattern of ocean currents, new research suggests.

The Atlantic Meridional Overturning Circulation (AMOC) brings heat from the tropics, making Britain warmer and wetter than it would otherwise be.

University of Exeter scientists show that, while warming Britain is expected to boost food production, if the AMOC collapses it would not just wipe out these gains but cause the "widespread cessation of arable (crop-growing) farming" across Britain.

Such a collapse - a climate change "tipping point" - would leave Britain cooler, drier and unsuitable for many crops, the study says.

The main problem would be reduced rainfall and, though irrigation could be used, the amount of water and the costs "appear to be prohibitive".

"If the AMOC collapsed, we would expect to see much more dramatic change than is currently expected due to climate change," said Dr Paul Ritchie, of the University of Exeter.

"Such a collapse would reverse the effects of warming in Britain, creating an average temperature drop of 3.4°C and leading to a substantial reduction in rainfall (?123mm during the growing season).

"These changes, especially the drying, could make most land unsuitable for arable farming."

The study examines a "fast and early" collapse of the AMOC, which is considered "low-probability" at present - though the AMOC has weakened by an estimated 15% over the last 50 years.

Professor Tim Lenton, Director of the Global Systems Institute at the University of Exeter, said worst-case scenarios must be considered when calculating risks.

"Any risk assessment needs to get a handle on the large impacts if such a tipping point is reached, even if it is a low-probability event" he said.

"The point of this detailed study was to discover how stark the impacts of AMOC collapse could be."

The study follows a recent paper by Lenton and colleagues warning of a possible "cascade" of inter-related tipping points.

The new study reinforces the message that "we would be wise to act now to minimise the risk of passing climate tipping points" said Lenton.

Growing crops is generally more profitable than using land as pasture for livestock rearing, but much of northern and western Britain is unsuitable for arable farming.

"With the land area suitable for arable farming expected to drop from 32% to 7% under AMOC collapse, we could see a major reduction in the value of agricultural output," said Professor Ian Bateman, of Exeter's Land, Environment, Economics and Policy Institute.

"In this scenario, we estimate a decrease of £346 million per year - a reduction of over 10% in the net value of British farming."

Speaking about the expectation that moderate warming would boost agricultural production in Britain, he added: "It's important to note that the wider effects for the UK and beyond will be very negative as import costs rise steeply and the costs of most goods climb."

The study focusses on agriculture, but AMOC collapse and the resulting temperature drop could lead to a host of other economic costs for the UK.

The AMOC is one reason that average temperatures in Britain are warmer than those of many places at similar latitudes. For example, Moscow and the southern extremes of Alaska are further south than Edinburgh.

The paper, published in the inaugural issue of the journal Nature Food, is entitled: "Shifts in national land use and food production in Great Britain after a climate tipping point."

Credit: 
University of Exeter

Circular RNA limits skin cancer spread

A mysterious piece of genetic material restrains the spread of skin cancer cells, but is frequently lost as they mature, a new study finds.

Published online January 13 in Cancer Cell, the new work revolves around circular RNA, a recently described type of ribonucleic acid (RNA). Typically, DNA blueprints are converted into RNA and then into proteins with cellular functions. While most RNA are linear molecules, some form circles when their ends loop around and attach.

Instead of encoding proteins, circular RNA (circRNA) seem to be part of complex regulatory systems, but their functions are still unclear, say the study authors.

Led by researchers at NYU Grossman School of Medicine, the study in cell cultures and mice is the first to show that a circRNA called CDR1as blocks the aggressive spread of melanoma cancers, and that its loss promotes it. A study analysis of human melanoma tissues also linked higher CDR1as levels with increased survival.

In patients who die from melanoma, the most lethal form of skin cancer, the aggressive spread, or metastasis, of cancer cells is the primary cause of death. Cancer cells arise from normal cells because of genetic errors, but changes in DNA do not fully explain how the cells spread.

"Our study provides new insights into the aggressive behavior of melanoma, and is the first to expose a circRNA as a suppressor of metastasis," says senior study author Eva Hernando, PhD, associate professor in the Department of Pathology at NYU Langone Health.

"We found CDR1as restrains a known pro-cancer protein called IGF2BP3, revealing a new function of CDR1as that may have therapeutic implications," adds first author Douglas Hanniford, PhD, an instructor in the same department.

Spread Follows Loss of Restraint

Recent work had suggested previously unknown functions for circRNAs, including their binding with proteins that attach to RNA to influence cell functions. Specifically, the new study reveals that metastasis of melanomas proceeds when the interaction between CDR1as and the RNA-binding protein IGF2BP3 is disrupted.

If CDR1as is active, researchers say, IGF2BP3 proteins bind to its circular RNA molecules, instead of attaching to other RNAs that code for pro-metastatic proteins. When CDR1as was removed using molecular techniques, IGF2BP3 was free to promote cancer cell invasion, which occurs when cells penetrate skin layers and spread to distant organs.

The study identifies the source of CDR1as in cells as LINC00632, an example of yet another class of RNA called long non-coding RNA. Experiments revealed further that an "epigenetic" mechanism in melanoma cells silences the LINC00632 gene, which halts CDR1as production.

Epigenetic changes adjust the operation of genes without changing their DNA code, researchers say. These include the attachment of molecules called methyl groups to histones, the "spools" around which DNA chains are wrapped. Methylation status determines whether a given stretch of DNA is unwound and accessible; or instead compacted, with the genes residing there silenced, researchers say.

The new study found that a particular histone methylation, H3K27me3, silences the gene for LINC00632 in melanoma cells starting to spread, which come to lack CDR1as. The authors say that this interaction could represent a mechanism that helps cells migrate during normal (fetal) development, but that then drives cancer spread when it mistakenly re-occurs in tumors.

Credit: 
NYU Langone Health / NYU Grossman School of Medicine

Investigational drugs block bone loss in mice receiving chemotherapy

image: Red and green lines represent new bone in this image of a normal mouse bone. Exposure to chemotherapy and radiation during cancer treatment leads to bone loss and increases the risk of osteoporosis and fractures. A new study from Washington University School of Medicine in St. Louis identifies the trigger for this bone loss and suggests ways to prevent it.

Image: 
Zhangting Yao

Bone loss that can lead to osteoporosis and fractures is a major problem for cancer patients who receive chemotherapy and radiation. Since the hormone estrogen plays an important role in maintaining bone health, bone loss is especially pronounced among postmenopausal women with breast cancer who are treated using therapies aimed at eliminating estrogen.

Men and children treated for other cancers also experience bone loss, suggesting that eliminating estrogen is not the only trigger leading to bone degeneration.

Studying mice, researchers from Washington University School of Medicine in St. Louis have found a driver of bone loss related to cancer treatment. They have shown that radiation and chemotherapy can halt cell division in bone, which results in a stress response referred to as senescence. According to the new study, cell senescence drives bone loss in female mice beyond that seen from the absence of estrogen alone. The researchers further found that this process occurs in males and females and is independent of cancer type. And perhaps most importantly, the researchers showed that such bone loss can be stopped by treating the mice with either of two investigational drugs already being evaluated in clinical trials.

The study appears Jan. 13 in Cancer Research, a journal of the American Association for Cancer Research.

"Researchers have understood that this bone loss has to be due to more than just hormone loss," said senior author Sheila A. Stewart, PhD, a professor of cell biology & physiology. "Cancer patients who receive chemotherapy and radiation lose a lot more bone than women with breast cancer treated with aromatase inhibitors, which eliminate estrogen. And children who have not yet gone through puberty, and aren't making much estrogen, also lose bone. We wanted to understand what causes bone loss beyond a lack of estrogen and whether we can do anything to stop it."

Stopping bone loss could improve quality of life for cancer patients. Bone loss leads to an increased risk of fractures that continues many years after treatment. This loss of bone density makes it much more likely that patients will develop fractures in the pelvis, hips and spine, which affect mobility and increase the risk of death.

The researchers studied bone loss in mice treated with two common chemotherapy drugs -- doxorubicin and paclitaxel -- as well as in mice that received radiation to one limb, to understand whether the bone loss effects were similar in different types of cancer therapies. In all situations, the treatments induced the process of cellular senescence.

"Senescence is a chronic stress response in a cell that stops it from dividing and also results in the release of many molecules, some of which we showed drive bone loss," said Stewart, who is also the associate director for basic sciences at Siteman Cancer Center at Barnes-Jewish Hospital and Washington University School of Medicine.

The cells in mouse bones that were most affected by the cancer therapies included those responsible for bone remodeling. These are sets of cells that strike a vital balance between dismantling old bone and building new bone in its place. This balance is disturbed in conditions such as osteoporosis, in which the bone-building cells can no longer keep up with the bone-dismantling cells. The new study suggests that the balance is even more off-kilter following cancer therapy: Bone-building cell activity slows down, and the activity of cells that remove old bone actually accelerates.

The researchers showed they could prevent bone loss in the mice if they took steps to remove the cells that are no longer dividing, thus eliminating the molecular signals that the cells produce that drive bone loss. Toward possible therapies, Stewart and her colleagues then showed they could achieve a similar effect with two different types of compounds that block these molecular signals.

The investigational drugs, a p38MAPK inhibitor and a MK2 inhibitor, block different parts of the same pathway leading to bone loss. Stewart and her colleagues also published a study in 2018 showing that the two inhibitors slowed the growth of metastatic breast cancer in mice. The p38MAPK inhibitor is being tested in U.S. clinical trials for inflammatory diseases, such as chronic obstructive pulmonary disease (COPD). And the MK2 inhibitor is about to be evaluated as a therapy for rheumatoid arthritis.

Cancer patients at risk of bone loss often are treated with drugs for osteoporosis, including bisphosphonates and denosumab. Both have some undesirable side effects, such as muscle and bone pain and, because of the way they work, they may be less desirable for children whose bones are still growing.

"The inhibitors we studied have extremely low toxicity, so we are interested in exploring whether they could be an improved option to stop bone loss in children receiving cancer therapy," Stewart said. "We're also interested in pursuing a clinical trial to evaluate these drugs in women with metastatic breast cancer to see if we can slow metastatic growth and also preserve bone health in these patients."

Credit: 
Washington University School of Medicine

Bacteria-shredding tech to fight drug-resistant superbugs

image: Golden Staph bacteria before (left) and after (right) exposure to the magnetic liquid metal nanoparticles. In the images, magnified 70,000 times, sharp pieces of liquid metal particle can be seen physically disrupting the bacteria after treatment.

Image: 
RMIT University

Researchers have used liquid metals to develop new bacteria-destroying technology that could be the answer to the deadly problem of antibiotic resistance.

The technology uses nano-sized particles of magnetic liquid metal to shred bacteria and bacterial biofilm - the protective "house" that bacteria thrive in - without harming good cells.

Published in ACS Nano, the research led by RMIT University offers a groundbreaking new direction in the search for better bacteria-fighting technologies.

Antibiotic resistance is a major global health threat, causing at least 700,000 deaths a year. Without action, the death toll could rise to 10 million people a year by 2050, overtaking cancer as a cause of death.

The biggest issues are the spread of dangerous, drug-resistant superbugs and the growth of bacterial biofilm infections, which can no longer be treated with existing antibiotics.

Dr Aaron Elbourne said antibiotics had revolutionised health since they were discovered 90 years ago but were losing effectiveness due to misuse.

"We're heading to a post-antibiotic future, where common bacterial infections, minor injuries and routine surgeries could once again become deadly," Elbourne, a Postdoctoral Fellow in the Nanobiotechnology Laboratory at RMIT, said.

"It's not enough to reduce antibiotic use, we need to completely rethink how we fight bacterial infections.

"Bacteria are incredibly adaptable and over time they develop defences to the chemicals used in antibiotics, but they have no way of dealing with a physical attack.

"Our method uses precision-engineered liquid metals to physically rip bacteria to shreds and
smash through the biofilm where bacteria live and multiply.

"With further development, we hope this technology could be the way to help make antibiotic resistance history."

Let's get physical: New way to kill bacteria

The RMIT team behind the technology is the only group in the world investigating the antibacterial potential of magnetic liquid metal nanoparticles.

When exposed to a low-intensity magnetic field, these nano-sized droplets change shape and develop sharp edges.

When the droplets are placed in contact with a bacterial biofilm, their movements and nano-sharp edges break down the biofilm and physically rupture the bacterial cells.

In the new study, the team tested the effectiveness of the technology against two types of bacterial biofilms (Gram-positive and Gram-negative).

After 90 minutes of exposure to the liquid metal nanoparticles, both biofilms were destroyed and 99% of the bacteria were dead. Importantly, laboratory tests showed the bacteria-destroying droplets did not affect human cells.

Postdoctoral Fellow Dr Vi Khanh Truong said the versatile technology could one day be used in a range of ways to treat infections.

"It could be used as a spray coating for implants, to make them powerfully antibacterial and reduce the high rates of infection for procedures like hip and knee replacements," said Truong, currently at North Carolina State University on a Fulbright Scholarship to further the research.

"There's also potential to develop this into an injectable treatment that could be used at the site of infection."

The next stage for the research - testing the effectiveness of the technology in pre-clinical animal trials - is already underway, with the team hoping to move to clinical human trials in coming years.

Led by Truong, Elbourne and Dr James Chapman, the multi-disciplinary team is also planning to expand the technology beyond antibacterial treatment, exploring how it could be used to:

treat fungal infections - the next superbugs

break through cholesterol plaques and battle heart problems

stop tumours by being injected directly into cancer cells.

Credit: 
RMIT University

Accelerated speed of discovery could lead to more effective smoking cessation aids

As smokers know all too well, nicotine is highly addictive. It's hard to quit smoking, a habit that claims the lives of more than seven million people each year.

Smoking tobacco delivers nicotine to neuroreceptors in the brain, affecting the nervous system and causing addiction.

A new study into the molecular interactions involved, led by scientists from the University of Bristol, has revealed how these neuroreceptors respond to nicotine.

The researchers used new computational simulation methods to discover how receptors in the brain respond to nicotine.

One of the key features of the study is the speed at which the discovery was made, thanks to the use of Oracle Cloud Infrastructure, which allowed the researchers to run a large number of simulations in an unprecedentedly short time.

The work brought together computational chemists, biochemists and research software engineers, working together to deploy large numbers of simulations of nicotine receptors in the cloud.

Reducing the time to results to just five days using Oracle's high-performance cloud infrastructure is transformational from a research perspective. Calculations that might otherwise have taken months to complete were completed in a matter of days.

The study, carried out by researchers from Bristol in partnership with Oracle, whose cloud technologies were a key part of the investigation, is reported in the Journal of the American Chemical Society, the flagship publication of the American Chemical Society, the world's largest scientific society and a global leader in providing access to chemistry-related research. The project was supported by funding from EPSRC.

Co-author of the study, Professor Adrian Mulholland, from Bristol's Centre for Computational Chemistry, part of Bristol's School of Chemistry, said: "Nicotine is highly addictive: it's very hard to give up smoking. To understand why it is so addictive, and to make molecules to help people quit, we need to understand how it affects the nervous system.

"We have used simulations to model and understand how nicotine affects receptors in the brain. Using the power of cloud computing, we were able to show how nicotine exerts its effects, at the molecular level, the first stage of signaling in the brain. This information, and the methods we have developing, will help in developing new smoking cessation aids."

Researchers are now working with Achieve Life Sciences to design and develop molecules that mimic nicotine, and computer simulations that will help test their potential effectiveness. This work builds on previous studies using chemical synthetic approaches to develop new smoking cessation aids, which will be investigated and tested in simulation scenarios.

Smoking is the second most common cause of death worldwide, but most current anti-smoking drugs are only moderately effective in reducing symptoms of withdrawal and may cause undesirable side effects. New, specific and effective smoking cessation aids are needed.

Nicotine is the major psychoactive agent in tobacco and causes addiction by binding to specific receptors in the brain. Understanding how nicotine binds to these receptors and creates the nicotine 'hit' and subsequent craving is a key focus for public health research.

The study saw researchers perform 450 individual molecular dynamics simulations of the biochemistry associated with the binding of nicotine to a subtype (α7) of nicotinic acetylcholine receptors in the brain. They were able to compare these with other types of nicotine receptor and identify common features of receptor activation.
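The pattern behind that kind of speed-up, many independent simulations dispatched in parallel, can be sketched generically in Python. This is not the Bristol team's actual workflow or Oracle's tooling; the placeholder function below merely stands in for one molecular dynamics run.

```python
# Generic sketch of the embarrassingly parallel pattern: many independent
# simulation replicas dispatched concurrently. A real cloud workflow would
# distribute these across machines; a process pool shows the same idea.
from concurrent.futures import ProcessPoolExecutor

def run_simulation(replica_id: int) -> float:
    """Placeholder for one MD simulation; returns a fake 'result'."""
    return float(replica_id) ** 0.5   # stand-in computation

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(run_simulation, range(450)))
    print(f"completed {len(results)} independent runs")
```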

The study also showed how cloud computing can be combined effectively with more traditional high-performance computing.

This work shows how rigorous simulations can be used to predict effects on drug targets in a matter of days.

On this quick timescale, calculations help to plan and interpret experiments, and will help design and develop effective drugs. More broadly, the agility and other benefits of using cloud computing for research offers the potential to accelerate the pace of discovery dramatically.

Credit: 
University of Bristol

Surrey lithium monitor could improve lives of people with bipolar disorder

A wearable lithium drug monitor developed by the University of Surrey could change the lives of patients with bipolar disorder and depression.

It is estimated that bipolar disorder affects one in 100 people, and lithium remains the most effective long-term therapy for the condition. It is incredibly important to monitor lithium intake, as the drug has a narrow therapeutic range and can be toxic once levels rise above it.

In a new study published in Biosensors and Bioelectronics, researchers from the University of Surrey, in collaboration with the University of Bath, detail how they have built on their ground-breaking research by developing extraction fibres to draw lithium from under the skin. They combined lithium extraction with a lithium sensor fibre and a reference fibre to create a miniaturised and flexible potentiometric cell - a wearable monitor that can be used by patients without the need to be "primed" in solution.

The researchers demonstrated that their monitor was able to detect lithium and, through lab tests, found that the device could also determine lithium concentration levels and potentially give a warning signal when high levels had been reached. The team is now investigating whether these devices can be made sensitive enough to detect the extremely narrow therapeutic range of lithium.
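To give a flavour of what such a device must do, here is a hedged sketch of converting a potentiometric reading into a concentration and an alert. The Nernstian slope of roughly 59.2 mV per decade for a monovalent ion at 25°C is standard electrochemistry, but the calibration intercept and the ~1.2 mmol/L alert threshold below are illustrative assumptions, not values from the Surrey study.

```python
# Minimal sketch: potentiometric reading -> lithium concentration -> alert.
# Nernst slope for a monovalent ion at 25 C is ~59.2 mV per decade;
# E0 and the alert threshold are hypothetical calibration choices.
SLOPE_MV_PER_DECADE = 59.2   # Nernst slope for Li+ (z = 1) at 25 C
E0_MV = 100.0                # hypothetical calibration intercept
ALERT_MMOL_L = 1.2           # illustrative upper therapeutic threshold

def lithium_mmol_per_l(measured_mv: float) -> float:
    """Invert E = E0 + slope * log10(c) to recover the concentration c."""
    return 10 ** ((measured_mv - E0_MV) / SLOPE_MV_PER_DECADE)

reading_mv = 105.0           # example sensor output
conc = lithium_mmol_per_l(reading_mv)
print(f"{conc:.2f} mmol/L",
      "ALERT: above range" if conc > ALERT_MMOL_L else "within range")
```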

Many physical wearable sensors have been developed to monitor people's temperature, heart rate and respiration rate and, while wearable glucose monitors are currently on the market, few other commercial wearable chemical sensors exist. It is believed that using wearable chemical sensors at home, instead of in a traditional clinical setting, for screening or follow-up care would help to reduce the burden on health professionals' time.

Dr Carol Crean, Senior Lecturer in Physical and Material Chemistry at the University of Surrey, said: "We are incredibly excited by the potential of this proof-of-concept study, which has shown that wearable fibre-based lithium sensors are viable and potentially life changing for the many living with bipolar disorder. Importantly, these devices could also save valuable time for health professionals because most of the monitoring of the therapeutic drug can be done at the patient's convenience."

Credit: 
University of Surrey

McMaster chemists find new way to break down old tires into material for new ones

image: Graduate student Sijia Zheng pictured with Michael Brook of McMaster University

Image: 
Georgia Kirkos, McMaster University

A team of chemists at McMaster University has discovered an innovative way to break down and dissolve the rubber used in automobile tires, a process that could replace recycling methods which have so far proven expensive, difficult and largely inefficient.

The method, outlined in the journal Green Chemistry, addresses the enormous environmental burden posed by tires, approximately 3 billion of which were manufactured and purchased worldwide in 2019. Most of those will end up in massive landfills or storage facilities, ultimately leaching contaminants into the ecosystem.

In 1990, a massive fire burned out of control in a pile of 14 million scrap tires near Hagersville, Ontario. It raged for 17 days, spewing toxic smoke into the environment and driving 4,000 residents from their homes. The fire has been linked to many long-term health issues, including rare cancers among the firefighters who worked on scene for days.

Tires are a typical example of a product prepared for a single use from a non-renewable resource. While some are used as fuel in the cement industry or broken down into crumbs for use as fillers in asphalt, cement or artificial turf, there is no convenient method for recovering the petroleum-based polymers from which they are made, so they cannot be easily reused, effectively repurposed, or recycled.

"The chemistry of the tire is very complex and does not lend itself to degradation - for good reason," says Michael Brook, a professor in the Department of Chemistry & Chemical Biology at McMaster and lead author of the study. "The properties that make tires so durable and stable on the road also make them exceptionally difficult to break down and recycle."

Charles Goodyear first developed the technique of curing tires in 1850 by combining sulfur with natural rubber, which forms bridges between the natural polymers and transforms the mixture from fluid to rubber.

In the paper, the researchers describe a process that efficiently breaks the rubber down into polymeric oils by cleaving its sulfur-to-sulfur bonds. Brook likens the structure to a piece of fishnet.

"We have found a way to cut all the horizontal lines so instead of having a net, you now have a large number of ropes, which can be isolated and reprocessed much more easily," he says.

The new method could help to eliminate and prevent the major environmental concerns and dangers posed by stockpiled tires.

While promising, researchers caution that the new method has some limitations because it is expensive for industrial applications.

"We're working on it, but this is the first major step. This process closes the loop on automotive rubber, allowing old tires to be converted into new products," says Brook.

Credit: 
McMaster University

Climate gas budgets greatly overestimate methane discharge from Arctic Ocean

video: These are gas flare locations in the study area. The yellow dots represent seeps observed during the winter survey, while the red dots represent seeps observed during the summer survey. Methane seeps during the colder season decrease their emissions by 43 percent.

Image: 
B. Ferré/CAGE, UiT.

The atmospheric concentration of methane, a potent greenhouse gas, has almost tripled since the beginning of industrialisation. Methane emissions from natural sources are poorly understood. This is especially the case for emissions from the Arctic Ocean.

The Arctic Ocean is a harsh working environment. That is why many scientific expeditions are conducted in the summer and early autumn months, when the weather and the waters are more predictable. Most extrapolations regarding the amount of methane discharged from the ocean floor are thus based on observations made in the warmer months.

"This means that the present climate gas calculations are disregarding the possible seasonal temperature variations. We have found that seasonal differences in bottom water temperatures in the Arctic Ocean vary from 1,7°C in May to 3,5°C in August. The methane seeps in colder conditions decrease emissions by 43 percent in May compared to August." says oceanographer Benedicte Ferré, researcher at CAGE Centre for Arctic Gas Hydrate, Environment and Climate at UiT The Arctic University of Norway.

"Right now, there is a large overestimation in the methane budget. We cannot just multiply what we find in August by 12 and get a correct annual estimate. Our study clearly shows that the system hibernates during the cold season."

A frozen lid on top of large methane accumulations

The study was conducted west of the Norwegian Arctic archipelago Svalbard - an area affected by a branch of the North Atlantic current called the West Spitzbergen Current. The observations were made at 400 meters water depth, where the ocean floor is known for its many methane seeps.

"We see bubbles from the methane seeps as flares during echo sounder surveys. There are plenty of them in this area. They probably originate from free gas migrating upwards from reservoirs, through sedimentary layers or tectonic faults." says Ferré.

The area in question is at the limit of the so-called gas hydrate stability zone. Gas hydrates are solid, icy compounds of water and, often, methane. They remain solid beneath the ocean floor as long as the temperatures are cold enough and the pressure is high enough.

The bottom water temperatures affect the extent of the boundary of this stability zone.

"The hydrates form from the upward moving methane gas, in the uppermost sediments. This can happen rapidly given sufficiently cold-water temperatures. So, we get this hydrate lid containing these large accumulations of the greenhouse gas and slowing down the rate of emissions during cold periods. This lid then depletes during summertime, with warmer temperatures. The bottom-water warming, affects the equilibrium and we get seasonal variations of the methane emissions."

Seasonal changes strongly affect methane-consuming bacteria

Luckily, more than 90 percent of the methane released from the ocean floor never reaches the atmosphere, partly due to physical properties of the ocean itself, such as currents and water column stratification.

Methane is also consumed by specific bacteria (methanotrophs) in the water column. These are affected by the seasonal variations described here to a surprising extent.

"The activity of the methanotrophic bacteria decreases a lot in the colder periods. Which is somewhat logical as there is less methane to consume. However, methane discharge decreases by 43 percent, and one would think that bacterial activity decreased accordingly. But the bacterial activity goes down by some orders of magnitude in spite of there still being methane in the water. There is very little methanotrophs in the system during winter." says co-author of the study Helge Niemann, professor in geomicrobiology at Royal Netherlands Institute of Sea Research (NIOZ).

The seasonal changes have been important for understanding primary production in the ocean for a long time. But biogeochemical processes, such as methane oxidation by bacteria, have not been considered to be strongly influenced by seasonal changes.

"In this paper we prove that assumption wrong", states Niemann

The next step is to conduct more winter cruises to account for seasonal changes related to the West Spitzbergen Current all the way from the Norwegian Arctic to the East Siberian shelf.

Potential tipping point

How methane will react in future ocean temperature scenarios is still unknown. The Arctic Ocean is expected to become between 3°C and a whopping 13°C warmer in the future, due to climate change. The study in question does not look into the future, but focuses on correcting the existing estimates in the methane emissions budget. However:

"We need to calculate the peculiarities of the system well, because the oceans are warming. The system such as this is bound to be affected by the warming ocean waters in the future." says Benedicte Ferré;.

A consistently warm bottom water temperature over a 12-month period will have an effect on this system.

"At 400 meters water depth we are already at the limit of the gas hydrate stability. If these waters warm merely by 1,3°C this hydrate lid will permanently lift, and the release will be constant." says Ferré.

Credit: 
UiT The Arctic University of Norway