
FloChIP, a new tool optimizing gene-regulation studies

image: The FloChIP microfluidic cartridge.

Image: 
Riccardo Dainese (EPFL)

In the cell, proteins often interact directly with DNA to regulate and influence the expression of genes. For this to happen, proteins need to travel into the cell's nucleus where the DNA is tightly twisted and packed as chromatin, which forms the well-known chromosomes.

When the protein reaches its target location, chromatin unwinds to reveal the section of DNA that the protein will interact with. This interaction is of great interest to biologists, as it lies at the heart of multiple important cell functions, as well as malfunctions that lead to disease.

To study protein-chromatin interactions, biologists use a technique called "chromatin immunoprecipitation" (ChIP). The basic idea behind ChIP is to use an antibody that targets the chromatin-binding protein, and then to "pull it down" or precipitate it with the captured section of DNA. The DNA that is bound by the protein is then identified via sequencing, which is why the technique is usually referred to as "ChIP-seq".

Since it was invented in 2007, ChIP-seq has become the most popular method for studying chromatin-associated proteins like histones and transcription factors. However, it requires a long sequence of manual steps that limit both its throughput and sensitivity.

Now, scientists led by Bart Deplancke at EPFL's Institute of Bioengineering have developed a new approach to ChIP that promises to automate the technique and lower its cost and complexity. The new method, dubbed "FloChIP", uses microfluidics, a bioengineering field that EPFL has helped develop and expand.

Microfluidics essentially involves the precise manipulation of fluids through chips that contain multiple, carefully designed channels. Because it mimics the inner dynamics of a cell, this technique is already used in a number of bioengineering processes.

FloChIP implements microfluidics to greatly streamline the ChIP workflow. In a paper published in PNAS, the EPFL scientists demonstrate that FloChIP is highly modular and can perform multiple ChIP-seq assays simultaneously and reproducibly in an automated way. In the paper, the researchers show this for both histone marks and transcription factors.

"Thanks to its cost-effectiveness, throughput and general applicability, we believe that FloChIP will establish itself as a valid complement to the existing tools for the study of chromatin biology and protein-DNA interactions," says Riccardo Dainese, the study's first author.

"With this new technology, true automation of a difficult assay such as ChIP is within reach," adds Deplancke. "This will hopefully catalyze an increased use of chromatin-bound proteins as highly informative diagnostic indicators for a wide range of diseases including cancer."

Credit: 
Ecole Polytechnique Fédérale de Lausanne

An MRI technique has been developed to improve the detection of tumors

image: DWI of the phantom with polyvinylpyrrolidone (PVP) solutions (b value 500 s/mm2).

Image: 
Kristina Sergunova et al.

Early diagnosis of cancer is one of the highest-priority problems for the healthcare system, because it is critical for overall treatment success and saving patients' lives. Diffusion-weighted imaging (DWI) may be used to detect a malignancy in various tissues and organs. It has the advantage of providing insight into the diffusion of water molecules in body tissues without exposing patients to radiation.

The way the H2O molecules move depends on whether they are inside or outside the cells. Inside the cell, water movement is somewhat restricted by organelles that get in the way (slow diffusion) and by semi-permeable cell membranes (hindered diffusion). Since there is more room between the cells, the only thing that restricts water movement there is the cell membrane.

Such movement can be estimated through the mathematical processing of DWI data using apparent diffusion coefficient (ADC) maps. In the absence of pathological tissue, intracellular ADC is lower than intercellular ADC. However, an increase in cell density (in the presence of malignancies, especially those made up of many small cells) leads to a decrease in intercellular diffusion.
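
As a rough illustration of the underlying calculation, here is a minimal sketch of how an ADC value is obtained from the standard mono-exponential signal model (the signal values are hypothetical; this is not the authors' actual processing pipeline):

    import math

    # Mono-exponential DWI signal model: S(b) = S0 * exp(-b * ADC), so
    # ADC = ln(S0 / S(b)) / b for a voxel measured at b = 0 and one non-zero b value.
    b = 500.0      # s/mm^2, the b value quoted for the phantom image above
    s0 = 1000.0    # hypothetical signal without diffusion weighting (arbitrary units)
    s_b = 480.0    # hypothetical signal at b = 500 s/mm^2

    adc = math.log(s0 / s_b) / b
    print(f"ADC = {adc:.2e} mm^2/s")  # about 1.5e-3 mm^2/s, typical of normal soft tissue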

Since ADC is not an absolute value, it depends on external factors such as sequence and reconstruction parameters, image quality and hardware features. In order to boost the efficacy of differential tumor diagnostics, ADC must be estimated with greater accuracy and reproducibility. This can be achieved with phantoms (also called test objects) that allow assessing imaging quality and building various diffusion models (i.e. non-restricted, hindered, and restricted diffusion with permeable and semi-permeable membranes). Such a phantom has been developed by a research team from the Innovation Technology Department of the Moscow Center for Diagnostics & Telemedicine.

The scope of application of the phantoms developed by the US National Institute of Standards and Technology, the Quantitative Imaging Biomarkers Alliance and the Institute of Cancer Research in the UK is limited because they utilize polymer solutions. Instead, the authors of this paper suggested using a combination of siloxane-based water-in-oil (W/O) emulsions and aqueous solutions of polyvinylpyrrolidone (PVP). These components allow emulating both hindered and restricted diffusion while maintaining relatively high signal intensity. What is more, the newly designed phantom can be used to assess image quality in terms of fat suppression, which again is critical for detecting pathological processes.

To simulate hindered diffusion, the investigators used aqueous PVP solutions with concentrations of 0 to 70%. The W/O emulsions imitated restricted diffusion in intracellular space. To attain a high DWI signal (maximum radiolucent areas), the authors incorporated silicone oils: cyclomethicone and caprylyl methicone. The phantom was scanned using a 1.5T magnetic resonance scanner with various fat-suppression techniques.

After a series of control experiments, the authors concluded that such a phantom with the control substances allows modelling apparent diffusion coefficients ranging from normal tissue to benign and malignant lesions: from 2.29×10^-3 to 0.28×10^-3 mm2/s. Correspondingly, it is suitable for assessing ADC measurement quality and the efficacy of fat suppression, as well as for calibrating MR scanners from various manufacturers.

This study is part of a larger research project started at the Moscow Center for Diagnostics & Telemedicine in 2017 that addresses the standardization and optimization of MR scanners. The project is also aimed at developing means of controlling scanner parameters to ensure high image quality and increase the diagnostic value of imaging studies.

Credit: 
Center of Diagnostics and Telemedicine

New study provides maps, ice favorability index to companies looking to mine the moon

image: UCF Planetary Scientist Kevin Cannon led a team that created the model system.

Image: 
University of Central Florida

The 49ers who panned for gold during California's Gold Rush didn't really know where they might strike it rich. They had word of mouth and not much else to go on.

Researchers at the University of Central Florida want to give prospectors looking to mine the moon better odds of striking gold, which on the moon means rich deposits of water ice that can be turned into resources, like fuel, for space missions.

A team led by planetary scientist Kevin Cannon created an Ice Favorability Index. The geological model explains the process of ice formation at the poles of the moon and maps the terrain, including craters that may hold ice deposits. The model, which has been published in the peer-reviewed journal Icarus, accounts for what asteroid impacts on the surface of the moon may do to deposits of ice found meters beneath the surface.

"Despite being our closest neighbor, we still don't know a lot about water on the moon, especially how much there is beneath the surface," Cannon says. "It's important for us to consider the geologic processes that have gone on to better understand where we may find ice deposits and how to best get to them with the least amount of risk."

The team was inspired by mining companies on Earth, which conduct detailed geological work and take core samples before investing in costly extraction sites. Mining companies carry out field mapping, take core samples from the potential site and try to understand the geological reasons behind the formation of the particular mineral they are looking for in an area of interest. In essence, they create a model of what a mining zone might look like before deciding to plunk down money to drill.

The team at UCF followed the same approach using data collected about the moon over the years and ran simulations in the lab. While they couldn't collect core samples, they had data from satellite observations and from the first trip to the moon.

Why Mine the Moon

In order for humans to explore the solar system and beyond, spacecraft have to be able to launch and continue on their long missions. One of the challenges is fuel. There are no gas stations in space, which means spacecraft have to carry extra fuel with them for long missions, and that fuel weighs a lot. Mining the moon could result in creating fuel, which would help ease the cost of flights since spacecraft wouldn't have to haul the extra fuel.

Water ice can be purified and processed to produce both hydrogen and oxygen for propellant, according to several previously published studies. Sometime in the future, this process could be completed on the moon, effectively producing a gas station for spacecraft. Asteroids may also provide similar resources for fuel.

Some believe a system of these "gas stations" would be the start of the industrialization of space.

Several private companies are exploring mining techniques to employ on the moon. Both Luxembourg and the United States have adopted legislation giving citizens and corporations ownership rights over resources mined in space, including the moon, according to the study.

"The idea of mining the moon and asteroids isn't science fiction anymore," says UCF physics Professor and co-author Dan Britt. "There are teams around the world looking to find ways to make this happen and our work will help get us closer to making the idea a reality."

Credit: 
University of Central Florida

Solubilizer Captisol enables body to absorb authorized COVID-19 drug therapy

image: Rendering of the molecular structure of Captisol, the solubilizer invented at the University of Kansas, which allows remdesivir to be administered to the patient. Remdesivir was recently authorized under an emergency-use protocol to treat patients with COVID-19.

Image: 
Valentino Stella, University Distinguished Professor Emeritus at KU

LAWRENCE -- When the Food and Drug Administration issued an emergency-use authorization for the investigational pharmaceutical remdesivir to treat COVID-19 on May 1, in part it was due to pioneering work performed by pharmaceutical chemists at the University of Kansas School of Pharmacy in 1990. Today, KU graduates still hold important jobs at the firms producing and distributing the potentially life-saving therapy to people around the world during the coronavirus pandemic.

Remdesivir's formulation includes the solubilizer Captisol, developed at KU, which allows remdesivir to be administered to the patient. Captisol was invented and patented by Valentino Stella, University Distinguished Professor Emeritus, and Roger Rajewski, research professor in the Department of Pharmaceutical Chemistry.

"We use Captisol as an excipient to enhance the aqueous solubility and chemical stability of remdesivir," said KU alumnus Reza Oliyai, senior vice president for pharmaceutical and biological operations at Gilead, the manufacturer of remdesivir. "All of us at Gilead are committed to doing everything we can to help address the global COVID-19 pandemic. The work of Professor Valentino Stella and also Dr. Roger Rajewski.at the University of Kansas has played a key role in the development of the remdesivir product that is being evaluated now for the treatment of COVID-19 infection."

As co-inventor Stella himself recently detailed in a history of Captisol, at the time of its invention there was a need for a new solubilizer for cancer and other therapies to replace surfactant-based solubilizers that were leading to dire complications for patients.

"A lot of drugs don't dissolve in water, or they're chemically unstable because of their physical and chemical properties," Stella said. "So, if you want to give a drug by intravenous injection, it's got to be in solution. You cannot inject particles into your vein that get trapped by the lung and you end up with a lung embolism. Also, if you take the drug orally with a tablet, they've got to dissolve in the gastrointestinal tract. If a drug is extremely insoluble, it's not absorbed and doesn't dissolve while it's going down the GI tract. So, there's a need for solubilizers to modify the properties of a drug."

Stella and Rajewski landed on the idea for the uniquely modified cyclodextrin, which came to be called Captisol, over beers one night. They were ironing out the path of Rajewski's doctoral research to develop a less toxic solubilizer for cancer drugs. At the time, Rajewski was Stella's graduate student.

Rajewski had the idea to modify the position of the cyclodextrin molecule's charge so it could solubilize drugs without also dangerously interacting with cholesterol in the body. "We thought we could get the safety and have the binding back, and that was what ended up working," Stella said.

Before Rajewski publicly defended his doctoral thesis on the new solubilizer, the pair filed a patent on their invention that also would pay royalties to KU. A startup firm called CyDex was spun out of the research to produce and market Captisol. Eventually, that firm worked with Pfizer but today Captisol is manufactured and marketed by the firm Ligand. It's in the formulation of more than a dozen therapies on the market, including Kyprolis, Vfend IV, Noxafil Injection, Zulresso, Evomela and Nexterone.

But it's the solubilizer's inclusion in remdesivir that's most notable during a global pandemic that's cost so many lives and decimated economies around the globe.

"It's gratifying to know the results of research I conducted in the Department of Pharmaceutical Chemistry here at KU are playing a role in the formulation of remdesivir for COVID-19 patients," Rajewski said. "While I was fortunate to be involved in the invention of Captisol more than 30 years ago, the development of Captisol for human use was the culmination of efforts by many individuals, both at CyDex and then Ligand. Likewise, the ingenuity of scientists such as those at Gilead leading to drugs like remdesivir is humbling. The KU community should be proud to be part of the team that makes such treatments possible."

Like Rajewski and Oliyai, many key players in the invention, production and marketing of remdesivir and Captisol were trained at KU under Stella.

"In 2001, I was fortunate to join a team of scientists and business entrepreneurs, many of whom are KU graduates, to develop and commercialize the nascent Captisol technology," said James Pipkin, vice president of new product development at Ligand, who earned his pharmaceutical chemistry degree from KU in 1981. "Captisol has a proud tradition with KU. The last few months have been intense as Ligand works closely with Gilead to provide support for its use in formulating remdesivir. More Captisol than ever before may be required to make remdesivir available for treating COVID-19 in the U.S. and around the world. I'm grateful for the education and mentoring I received at KU and the opportunity to participate with innovators to create and make available on a global scale many life-saving and life-changing medicines, but none more urgently important than remdesivir."

Those contributing to the manufacture of the drug, along with medical professionals and policymakers, are optimistic that remdesivir can make a difference to people being treated for serious COVID-19 infection.

"FDA's emergency authorization of remdesivir, two days after the National Institutes of Health's clinical trial showed promising results, is a significant step forward in battling COVID-19," Alex Azar, secretary of Health and Human Services said in a statement.

For Stella, Captisol and his many other important contributions to pharmaceutical chemistry can best be measured by the benefit they have to human health.

"It's not publishing the papers, it's not earning the grants, it's not the accolades you receive," Stella said. "It's the impact you make on people's lives. And that's been my mantra to myself and to my kids. It's a very humbling experience, let me tell you, because you don't necessarily start off thinking that way, even though people might say they did in retrospect. You know, I just wanted to be a professor, a teacher and a researcher -- and I had some great mentors along the way. Then, as I've realized the work that I did was so impactful, it's been pretty humbling."

Credit: 
University of Kansas

Female college students more affected academically by high alcohol use than men

Female college students appear to be more affected by high alcohol use than men, which may lead to less interest in academics, according to new research including by faculty at Binghamton University, State University of New York.

Lina Begdache, assistant professor of health and wellness studies at Binghamton University, and fellow researchers sought to compare neurobehaviors and academic effort between college students with low alcohol use and those with high alcohol consumption, and to build conceptual models that represent the integration of the different variables.

They sent out an anonymous survey assessing college students' alcohol use and drinking frequency, along with questions on sleep, academic performance and attitude toward learning. They compared responses by gender and found that both young men and women exhibit common behavioral responses to high alcohol use, such as abuse of other substances and risk-taking. These behaviors are regulated by the limbic system of the brain. However, the cognitive effects of high alcohol use differed between young men and women.

"Cognitive aptitudes of young women appear to be more affected than for men with high alcohol use," said Begdache. "Young women reported generally less interest in academic work and performance than young men. The latter reported more risky behaviors, such as being arrested, from excessive drinking. We also found that young women are more likely to depend on alcohol to improve mental well-being, which is also concerning, as they may self-medicate through drinking."

Because male and female brains are morphologically different, the long-term impact of excessive drinking may also differ. In both genders, the researchers reported an increase in impulsive behaviors, which are under the control of the limbic system (the oldest part of the brain, evolutionarily speaking). However, cognitive functions and decision-making are controlled by the prefrontal cortex (the newest part of the brain, evolutionarily speaking), which completes its maturation by the mid to late 20s. Therefore, seeing differential behaviors could imply that excessive alcohol use has differential effects on prefrontal cortex function and brain maturity, which may have an impact on mental health as well.

"These findings are also explained by the fact that women tend to have higher connectivity between cortices, while men have a large cortical volume in the areas on the limbic system that support impulsivity," said Begdache. "Thus, the differential behaviors noted with increasing alcohol levels are potentially related to the gender-based differences in the brain. We did find that men and women who don't drink or drink minimally exhibit responsible behaviors and academic effort, which are reflective of a normal trajectory of brain maturity."

Another reason for the difference seen is the differential metabolism of alcohol. Women metabolize alcohol at a slower rate; therefore, they are more likely to feel its effects. Consequently, their brain is more likely to accumulate a toxic metabolite, acetaldehyde, which may alter brain chemistry further, adding to the differential behaviors identified in this study.

Academic performance and risky behaviors among college students may be linked to their drinking habits, so more education and awareness should be shared with college students, said Begdache.

Going forward, Begdache would like to assess the interrelation of nutrition, alcohol and mental health.

The paper, "Common and differential associations between levels of alcohol drinking, gender-specific neurobehaviors and mental distress in college students," was published in Trends in Neuroscience and Education.

Credit: 
Binghamton University

Ten years of ecosystem services matrix: Review of a (r)evolution

image: There has been an increasing need for robust and practical methodologies to assess ecosystem services: the benefits people obtain from ecosystems.

Image: 
Pensoft

In recent years, the concept of Ecosystem Services (ES), the benefits people obtain from ecosystems (such as pollination provided by bees for crop growing, timber provided by forests or recreation enabled by appealing landscapes), has been greatly popularised, especially in the context of impending ecological crises and constantly degrading natural environments.

Hence, there has been an increasing need for robust and practical methodologies to assess ES, in order to provide key stakeholders and decision-makers with crucial information. One such method to map and assess ES, the ES Matrix approach, has been increasingly used over the last decade.

The ES Matrix approach is based on the use of a lookup table consisting of geospatial units (e.g. types of ecosystems, habitats, land uses) and sets of ES, meant to be assessed for a specific study area, which means that the selection of a particular study area is the starting point in the assessment. Only then can suitable indicators and methods for ES quantification be defined. Based on this information, a score for each of the ES considered is generated, referring to ES potential, ES supply, ES flow/use or demand for ES.
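
As a rough illustration, here is a minimal sketch of what such a lookup table can look like in code (the land-cover classes, services and scores below are invented for illustration; the actual matrices in the reviewed studies are far larger):

    # ES Matrix: rows are geospatial units (e.g. land-cover classes),
    # columns are ecosystem services, cells hold relative capacity scores
    # (here 0 = no relevant capacity, 5 = very high capacity).
    es_matrix = {
        "Broadleaf forest":  {"Timber": 5, "Pollination": 3, "Recreation": 4},
        "Cropland":          {"Timber": 0, "Pollination": 2, "Recreation": 1},
        "Urban green space": {"Timber": 0, "Pollination": 1, "Recreation": 5},
    }

    # Scoring a study area then reduces to summing (or area-weighting) the rows
    # of the geospatial units that occur in it.
    area_shares = {"Broadleaf forest": 0.6, "Cropland": 0.4}  # hypothetical shares
    recreation = sum(share * es_matrix[unit]["Recreation"]
                     for unit, share in area_shares.items())
    print(f"Area-weighted recreation score: {recreation:.1f}")  # 0.6*4 + 0.4*1 = 2.8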

Originally developed in a 2009 paper by a team led by Prof Dr Benjamin Burkhard (Leibniz University Hannover and Leibniz Centre for Agricultural Landscape Research ZALF), the ES Matrix allows the assessment of the capacity of particular ecosystem types or geospatial units to provide ES.

Ten years later, research led by Dr C. Sylvie Campagne (Leibniz University Hannover, Germany), Dr Philip Roche (INRAE, France), Prof Dr Felix Muller (University of Kiel, Germany) and Prof Dr Benjamin Burkhard reviewed 109 published studies applying the ES matrix approach, to find out how it has been applied and whether this was done in an oversimplified way.

Their recent paper, published in the open-access, peer-reviewed journal One Ecosystem, confirms the method's flexibility, appropriateness and utility for decision-making, as well as its ability to increase awareness of ES. Nevertheless, the ES matrix approach has often been used in a "quick and dirty" way, which calls for more transparency and integration of variability analyses, they conclude.

"We analysed the diversity of application contexts, highlighted trends of uses and proposed future recommendations for improved applications of the ES matrix. Amongst the main patterns observed, the ES matrix approach allows for the assessment of a higher number of ES than other ES assessment methods. ES can be jointly assessed with indicators for ecosystem condition and biodiversity in the ES matrix," explains Campagne.

"Although the ES matrix allows us to consider many data sources to achieve the assessment scores for the individual ES, these were mainly used together with expert-based scoring (73%) and/or ES scores that were based on an already-published ES matrix or deduced by information found in related scientific publications (51%)," she elaborates.

In 29% of the studies, an already existing matrix was used as the initial matrix for the assessment, and in 16%, no other data were used for the matrix scores or the existing matrix was used without adaptation.

"Nevertheless, we recommend to use only scores assessed for a specific study or, if one wishes to use pre-existing scores from another study, to revise them in depth, taking into account the local context of the new assessment," she points out.

The researchers also acknowledge the fact that 27% of the reviewed studies did not clearly explain their methodology, which underlines a lack of clarity about how the data had been used and where the scores came from. Although some studies addressed the need to consider variabilities and uncertainties in ES assessments, only a minority of studies (15%) did so. Thus, the team also recommends systematically reporting and considering variabilities and uncertainties in every ES assessment.

"We emphasise the need for all scientific studies to describe clearly and extensively the whole methodology used to score or evaluate ES, in order to be able to rate the quality of the scores obtained. The increasing number of studies that use the ES matrix approach confirms its success, appropriateness, flexibility and utility to generate information for decision-making, as well as its ability to increase awareness of ES, but the application of the ES matrix has to become more transparent and integrate more variability analyses," they conclude.

Credit: 
Pensoft Publishers

Major gaps in HIV programs in Africa

image: Focusing on the Tigray region in northern Ethiopia, a Flinders University project led by Dr Fisaha Tesfay examined 1757 hospital records of adults living with HIV who enrolled in a nutritional program and also interviewed 33 people living with HIV, health providers and food program managers.

Image: 
Flinders University

HIV management in developing countries varies with socioeconomic and structural circumstances, and two Flinders University studies have found examples of key ways to close the gap for those worst affected.

The studies, just published in the journal PLOS ONE, call for reforms to nutritional programs and for better treatment of HIV-affected prisoners, providing guidance for several sub-Saharan regions as well as other low- and middle-income countries.

Focusing on the Tigray region in northern Ethiopia, a Flinders University project led by Dr Fisaha Tesfay examined 1757 hospital records of adults living with HIV who enrolled in a nutritional program and also interviewed 33 people living with HIV, health providers and food program managers.

"We conclude that the nutritional programs in HIV settings should be reoriented towards addressing the underlying challenges such as poverty, poor livelihood, food insecurity, rather than just malnutrition," he says.

"As well, inefficiencies in the programs can discourage people from using them, particularly if they are not well and less able to resist infection."

Dr Tesfay says stigma and discrimination in the community flows through to food sharing, with the religious practice of fasting - and challenges of distance and transportation - also detrimental to the delivery and uptake of HIV-directed food programs in Ethiopia.

"The need for comprehensive and timely nutritional counselling to people living with HIV is relevant in all countries, including developed economies," he says.

The other Flinders University study was a systematic review of HIV care in low and middle-income countries compared to high-income countries.

Lead researcher Terefe Fuge says incarcerated people are at increased risk of human immunodeficiency virus (HIV) infection relative to the general population.

"Despite a high burden of infection, HIV care use among prison populations is often suboptimal and varies among settings, and little evidence exists explaining the discrepancy.

The Flinders review of 42 reports found a number of barriers to optimal use of HIV care in prisons around the world, particularly in resource-limited countries.

"As well as structural factors, a history of incarceration and re-incarceration, lack of community and social support, stigma, discrimination, substance abuse and negative attitudes to antiretroviral therapy reduced outcomes for HIV positive inmates," says public health researcher Dr Terefe Fuge.

"While correctional facilities often didn't match community standards of HIV care, we round they could have substantial powers to contribute to the use of HIV treatment as a prevention strategy.

"There is therefore an urgent need to improve interventions in poorer countries where there appears to be higher rates of delayed initiation and suboptimal outcomes of ART in prisoners, aiming to reduce the considerable disparity in the practice of standard of HIV care in low-income countries such as many in sub-Saharan Africa," he says.

Credit: 
Flinders University

ASCO 2020: UK-first study shows feasibility of genetic screening for prostate cancer

Genetic screening for prostate cancer in GP surgeries could be effective at picking up otherwise undiagnosed cases of the disease, a new pilot study shows.

Researchers 'barcoded' men for their genetic risk of prostate cancer by testing each for 130 DNA changes - and gave those at higher risk follow-up checks.

Their study found that population screening was safe and feasible, and identified new prostate cancers in over a third of apparently healthy men who were found to have the highest levels of inherited risk.

The pilot was the first ever in the UK to assess genetic screening for prostate cancer in the general population, and will now be followed by a larger-scale study that could prove the potential of a new screening programme for the disease.

The Institute of Cancer Research, London, and The Royal Marsden NHS Foundation Trust worked with GPs to invite more than 300 healthy Caucasian men aged 55-69 to participate in screening. The findings of the pioneering study will be presented today (Friday) at the American Society of Clinical Oncology (ASCO) virtual annual meeting.

The study was funded by the European Research Council with additional support from Cancer Research UK and the National Institute for Health Research.

The researchers collected DNA from saliva samples of 307 men and looked for more than 130 genetic changes that can each influence the risk of developing prostate cancer by a small amount.

They combined the effects of the genetic changes to assign each man an overall risk score. This in turn allowed men to be placed in different risk bands depending on how their level of risk compared with others in the population.
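
The scoring step can be illustrated with a small sketch (the effect sizes, genotypes and simulated cohort below are hypothetical, for illustration only; this is not the study's actual risk model):

    import numpy as np

    # Each risk variant contributes its effect size (log odds ratio) once per
    # copy of the risk allele carried (0, 1 or 2 copies).
    effect_sizes = np.array([0.12, 0.08, 0.05, 0.20])  # hypothetical log odds ratios
    genotype     = np.array([1,    2,    0,    1])     # risk-allele copies for one man

    risk_score = float(np.dot(effect_sizes, genotype))  # overall polygenic risk score

    # Ranking scores against the rest of the cohort places each man in a risk band;
    # the study followed up men in the top 10% of the distribution.
    cohort_scores = np.random.default_rng(0).normal(0.4, 0.15, 307)  # simulated cohort
    percentile = (cohort_scores < risk_score).mean() * 100
    in_top_decile = percentile >= 90
    print(f"Score {risk_score:.2f}, percentile {percentile:.0f}, top 10%: {in_top_decile}")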

Men in the top 10 per cent of risk - 26 out of the 307 - were selected for screening and contacted by the researchers. Of these, 18 men accepted and underwent an MRI scan and a biopsy, and of these 18 apparently healthy men, seven were diagnosed with prostate cancer.

The good level of uptake among men and effectiveness at detecting undiagnosed disease show that population screening is possible and could be reproduced on an even larger scale.

Researchers also looked at how aggressive the cancers of those within the top 10 per cent of the genetic score were. All seven prostate cancers turned out to be manageable by active surveillance, with a mean prostate-specific antigen (PSA) score of 1.8 - a level between 0 and 2.5 is considered safe.

Now that the initiative has been shown to be feasible, a full pilot study, called BARCODE1, is ready to be launched. This study will involve 5,000 patients from 70 GP practices, and aims to provide a definitive answer on the potential role of population genetic screening for improving detection of prostate cancer.

Researchers believe that genetic screening could detect potentially aggressive cancers more effectively than PSA testing - which is controversial because of its high rates of over-diagnosis.

Study leader Professor Ros Eeles, Professor of Oncogenetics at The Institute of Cancer Research, London, and Consultant in Clinical Oncology and Oncogenetics at The Royal Marsden NHS Foundation Trust, said:

"A man's risk of prostate cancer is determined in part by which combination of at least 170 different genetic changes they happen to inherit.

"Our pilot study assessed men's genetic risk by testing for more than 130 genetic changes that have been linked to prostate cancer. We showed that genetic barcoding of men can safely and effectively identify those at the highest level of risk for prostate cancer, so they can be targeted for follow-up checks.

"We were able to identify prostate cancers in over a third of the 18 apparently healthy men who we found to have the highest levels of inherited risk. Our hope is that the larger BARCODE-1 pilot study will now be able to definitively show that population genetic screening for prostate cancer can cost-effectively improve diagnosis and ultimately save lives."

Professor Paul Workman, Chief Executive of The Institute of Cancer Research, London, said:

"It's vital that we find ways of putting our increased knowledge of the genetics and biology of cancer to work not only to find new treatments, but also to identify targeted methods for early detection of the disease.

"This is an exciting early pilot study which for the first time in the UK demonstrates that genetic screening for prostate cancer is safe, feasible and potentially effective. It's great to see that this research is now progressing into a larger-scale pilot, which if successful could show the potential of genetic screening to be a life-saver."

Patient Remy Smits, 59, said:

"I signed up for the trial after seeing the details advertised at my local GP Practice. Although I met all the criteria for joining, I did not think I would be in the high-risk group. I had a PSA test not long before joining the trial and it was relatively low (2.1) so I was quite surprised when I got called back for further investigations. I had another PSA test, followed by an MRI scan and then finally a biopsy where they detected cancer the size of a grain of sand which is quite remarkable.

"I have been put under 'active surveillance' and come into the clinic at The Royal Marsden every six months for repeat PSA testing and MRI scans.

"Whilst the realisation that I have cancer came as a shock; I feel better knowing that it has been identified at a very early stage. I also feel that I am now in a much better position to make an informed decision about any future treatment options. I also like the fact that being part of this trial will make a difference for many men in the future."

Professor David Cunningham, Director of Clinical Research at The Royal Marsden, said:

"Earlier and faster diagnosis is often the key to successfully treating cancer. Using genetic screening for men most at risk for prostate cancer will mean we have a much greater chance of being able to treat the disease successfully at an earlier stage, often with less invasive procedures and fewer long-term side effects."

Credit: 
Institute of Cancer Research

Study shows hydroxychloroquine's harmful effects on heart rhythm

image: Images show the voltage surface on a rabbit heart with and without HCQ. Without the drug (normal) the electrical activation spreads homogeneously, while with HCQ, waves propagate unevenly, generating complex spatiotemporal patterns and arrhythmias.

Image: 
Georgia Tech School of Physics

The malaria drug hydroxychloroquine, which has been promoted as a potential treatment for Covid-19, is known to have potentially serious effects on heart rhythms. Now, a team of researchers has used an optical mapping system to observe exactly how the drug creates serious disturbances in the electrical signals that govern heartbeat.

The research, reported May 29 in the journal Heart Rhythm, found that the drug made it "surprisingly easy" to trigger worrisome arrhythmias in two types of animal hearts by altering the timing of the electrical waves that control heartbeat. While the findings of animal studies can't necessarily be generalized to humans, the videos created by the research team clearly show how the drug can cause cardiac electrical signals to become dysfunctional.

"We have illustrated experimentally how the drug actually changes the waves in the heart, and how that can initiate an arrhythmia," said Flavio Fenton, a professor in the School of Physics at the Georgia Institute of Technology and the paper's corresponding author. "We have demonstrated that with optical mapping, which allows us to see exactly how the waveform is changing. This gives us a visual demonstration of how the drug can alter the wave propagation in the heart."

What the team saw was an elongation of the T wave, a portion of the heart cycle during which voltages normally dissipate in preparation for the next beat. By extending the QT portion of one wave cycle, the drug sets the stage for disturbances in the next wave, potentially creating an arrhythmia. Such disturbances can transition to fibrillation that interferes with the heart's ability to pump.

The ability to easily trigger disturbances known as "long QT" reinforces cautions about using hydroxychloroquine (HCQ) in humans - particularly in those who may have heart damage from Covid-19, cautioned Dr. Shahriar Iravanian, a co-author of the paper and a cardiologist in the Division of Cardiology, Section of Electrophysiology, at Emory University Hospital.

"The hearts used in the study are small and very resistant to this form of arrhythmia," Iravanian said. "If we had not seen any HCQ-induced arrhythmias in this model, the results would not have been reassuring. However, in reality, we observed that HCQ readily induced arrhythmia in those hearts. This finding is very concerning and, in combination with the clinical reports of sudden death and arrhythmia in Covid-19 patients taking HCQ, suggests that the drug should be considered a potentially harmful medication and its use in Covid-19 patients be restricted to clinical trial settings."

Georgia Tech postdoctoral fellow Ilija Uzelac administered HCQ to the animal hearts - one from a guinea pig and one from a rabbit - while quantifying wave patterns changing across the hearts using a high-powered, LED-based optical mapping system. Voltage-sensitive fluorescent dyes made the electrical waves visible as they moved across the surface of the hearts.

"The effect of the arrhythmia and the long QT was quite obvious," said Uzelac. "HCQ shifts the wavelengths to larger values, and when we quantified the dispersion of the electrical current in portions of the heart, we saw the extension of the voltage across the tissue. The change was very dramatic comparing the waveforms in the heart with and without the HCQ."

The drug concentration used in the study was at the high end of what's being recommended for humans. HCQ normally takes a few days to accumulate in the body, so the researchers used a higher initial dose to simulate the drug's effect over time.

In a normal heartbeat, an electrical wave is generated in specialized cells of a heart's right atrium. The wave propagates through the entire atria and then to the ventricles. As the wave moves through the heart, the electrical potential created causes calcium ions to be released, which stimulates contraction of the heart muscle in a coordinated pattern.

Drugs such as HCQ modify the properties of these ion channels and inhibit the flow of potassium currents, which prolongs the length of the electrical waves and creates spatial variations in their properties. Ultimately, that can lead to the development of dangerously rapid and dysfunctional heart rhythms.

"The wavelength becomes less homogeneous and that affects the propagation of additional waves, producing sections of the heart where the waves do not propagate well," Fenton said. "In the worst case, there are multiple waves going in different directions. Every section of the heart is contracting at a different time, so the heart is just quivering. At that point, it can no longer pump blood throughout the body."

Patients taking HCQ for diseases such as lupus and rheumatoid arthritis rarely suffer from arrhythmia because the doses they take are smaller than those being recommended for Covid-19 patients, Iravanian said.

"Covid-19 patients are different and are at a much higher risk of HCQ-induced arrhythmia," he said. "Not only is the proposed dose of HCQ for Covid-19 patients two to three times the usual dose, but Covid-19 has effects on the heart and lowers potassium levels, further increasing the risk of arrythmias."

Fenton and his colleagues have already begun a new study to evaluate the effects of HCQ with the antibiotic azithromycin, which has been suggested as a companion treatment. Azithromycin can also cause the long QT effect, potentially increasing the impact on Covid-19 patients.

Credit: 
Georgia Institute of Technology

Nanoparticles can make home refrigeration more accessible for low-income households

image: Nanoparticles can reduce the power consumption of cooling devices such as refrigerators, freezers and air-conditioning by substantial amounts. In a drop-in refrigerant replacement test, researchers from the University of Johannesburg observed a 29% cut in electricity use in a home fridge. Environmentally unfriendly R134a was replaced by a mix of more energy-efficient R600a and mineral oil, dosed with multi-walled carbon nanotubes (MWCNTs).

Image: 
Photos supplied by Mr Taiwo O. Babarinde, University of Johannesburg.

Power consumption of a home refrigerator can be cut by 29% while improving cooling capacity. Researchers replaced widely-used, but environmentally unfriendly, R134a refrigerant with the more energy-efficient R600a. They dosed R600a with multi-walled carbon nanotube (MWCNT) nanoparticles. Drop-in refrigerant replacement in the field by trained technicians is possible, says an engineer from the University of Johannesburg.

This test of nanoparticle-dosed refrigerants is the first of its kind and was recently published in Energy Reports, an open-access journal. The results can help make home refrigeration more accessible for low-income families.

R134a is one of the most widely-used refrigerants in domestic and industrial refrigerators. It is safe for many applications because it is not flammable. However, it has high global warming potential, contributing to climate change. It also causes fridges, freezers and air-conditioning equipment to consume a lot of electrical energy. The energy consumption contributes even more to climate change.

Meanwhile, a more energy-efficient refrigerant can result in much lower electricity bills. For vulnerable households, energy security can be improved as a result. Improved energy economy and demand-side management can also benefit planners at power utilities, as cooling accounts for about 40% of energy demand.
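
As a back-of-the-envelope illustration of what a 29% cut can mean for a household (the consumption and tariff figures below are assumptions, not data from the study):

    # Hypothetical household fridge figures, purely for illustration.
    annual_kwh = 400.0       # assumed yearly consumption with the original refrigerant
    saving_fraction = 0.29   # the 29% reduction reported for the R600a-MWCNT drop-in
    tariff = 0.15            # assumed electricity price in currency units per kWh

    kwh_saved = annual_kwh * saving_fraction
    cost_saved = kwh_saved * tariff
    print(f"Roughly {kwh_saved:.0f} kWh and {cost_saved:.2f} currency units saved per year")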

Nanoparticles enhance power reduction

Nano eco-friendly refrigerants have been made with water and ethylene glycol. Previous studies showed reduced energy use in nano-refrigeration, where refrigerants were dosed with multi-walled carbon nanotube (MWCNT) nanoparticles. The nanoparticles also resulted in reduced friction and wear on appliance vapour compressors.

But previous research did not test the effects of MWCNTs on hydrocarbon refrigerants such as R600a.

In a recent study, researchers at the University of Johannesburg tested the drop-in replacement of environmentally-unfriendly refrigerant R134a, in a home refrigerator manufactured to work with 100g R134a.

They replaced R134a with the more energy-efficient refrigerant R600a, dosed with MWCNT nanoparticles.

Reduces electricity use by more than a quarter

The researchers removed the R134a refrigerant and its compressor oil from a household fridge. They used a new refrigerant, R600a, and dosed it with multi-walled carbon nanotubes (MWCNTs). Mineral oil was used as a lubricant. The new mix was fed into the fridge and performance tests were conducted.

They found that the R600a-MWCNT refrigerant resulted in much better performance and cooling capacity for the fridge.

"The fridge cooled faster and had a much lower evaporation temperature of -11 degrees Celsius after 150 minutes. This was lower than the -8 degrees Celsius for R134a. It also exceeded the ISO 8187 standard, which requires -3 degrees Celsius at 180 minutes," says Dr Daniel Madyira.

Dr Madyira is from the Department of Mechanical Engineering Science at the University of Johannesburg.

"Electricity usage decreased by 29% compared to using R134a. This is a significant energy efficiency gain for refrigerator users, especially for low income earners," he adds.

To gain these advantages, the choice of MWCNT nanoparticles is critical, he says.

"The MWCNT's need to have nanometer-scale particle size, which is extremely small. The particles also need to reduce friction and wear, prevent corrosion and clogging, and exhibit very good thermal conductivity," says Dr Madyira.

Managing flammability

The new refrigerant mix introduces a potential risk, though. Unlike R134a, R600a is flammable. On the other hand, it is more energy efficient and has a low global warming potential. Some refrigerator manufacturers have already adopted production with R600a, and these appliances are available on the market.

"To do a safe drop-in replacement, no more than 150g of R600a should be used in a domestic fridge," says Dr Madyira. "Before the replacement, the fridge used 100g of R134a gas. We replaced that with 50g to 70g of R600a, to stay within safety parameters."

An untrained person should not attempt this drop-in replacement, says Dr Madyira. Rather, a trained refrigeration technician or technologist should do it.

Replacement procedure

"Mineral oil is used as the compressor oil. This should be mixed with the recommended concentration. A magnetic stirrer and ultrasonicator are needed to agitate and homogenize the ingredients in the mixture. The mixture can then be introduced into the compressor. After that, R600a can be charged into the refrigerator compressor, while taking care to not use more than 150g of the gas," says Dr Madyira.

A woman's fridge is her castle

A far more energy-efficient refrigerant, such as the R600a-MWCNT mix, can save consumers a lot of money. Vulnerable households in hot climates in developing countries can benefit even more.

Low income earners in many countries are dependent on home fridges and freezers to safely store bulk food supplies. This greatly reduces the risk of wasting food due to spoilage, or food poisoning due to improperly stored food. These appliances are no longer a luxury but a necessity, says Dr Madyira.

Without fridges, people may be forced to buy food daily in small quantities and at much higher prices. With a fridge, daily buying is no longer required, so travel time and costs for buying food can be much lower as well.

Refrigeration also makes it possible to safely store more diverse food supplies, such as fresh fruit and vegetables. Medicines that require cooling can be stored at home. This can make more balanced diets and nutrition, and better physical health, more accessible for a low-income household.

Grid power still rules for low-income refrigeration

From a sustainability point of view, it can look preferable to run most home fridges and freezers from solar power.

However, solar panels, backup batteries, and direct current (DC) fridges are still too expensive for most low-income families in areas served by power utilities.

Energy-efficient, alternating current (AC) fridges running on grid power may be more affordable for most. Further cutting power consumption with R600a-MWCNT refrigerant can bring down costs even more.

Refrigeration for all vs demand-side management

As more low-income households and small businesses switch on grid-powered fridges, freezers and air-conditioning, power demand needs to be managed better.

In South Africa where the study was conducted, the state-operated power utility faces huge challenges in meeting demand consistently. Long-lasting rolling blackouts, known as load-shedding, have been implemented as a demand-side power management measure.

Shaving off more than a quarter of the power consumption of fridges, freezers and air-conditioning units can free up national power supply for improved energy security.

Credit: 
University of Johannesburg

'Single pixel' vision in fish helps scientists understand how humans can spot tiny details

image: This is a cross-section of a zebrafish eye, showing UV capabilities.

Image: 
Tom Baden

Recently discovered 'single-pixel vision' in fish could help researchers understand how humans are able to spot tiny details in their environment - like stars in the sky.

In a paper published this week, researchers at the University of Sussex found that zebrafish are able to use a single photoreceptor to spot their tiny prey.

This photoreceptor is like an 'eye pixel' and seems to provide enough of a signal for the fish to go and investigate the stimulus.

Professor Tom Baden and his co-authors believe that this could provide insights as to how animals, including humans, are able to process tiny details in their environment.

Professor of Neuroscience Tom Baden said: "Zebrafish have what's known as an 'acute zone' in their eyes, which is basically an evolutionary forerunner to the fovea that we have in our own retina. In both the zebrafish acute zone and the human fovea, visual acuity is at its highest.

"Because of this similarity, zebrafish are really good models to help us understand how the human eye might work.

"We found that, in this acute zone, zebrafish are using single photoreceptors to spot their tiny prey - the equivalent of us spotting a star in the sky. These photoreceptors are a little bit like pixels - but 'special' pixels at that: They have been specifically tuned to be sensitive to prey-like stimuli.

"There have been suggestions that primates and therefore humans too, use similar tricks to enhance our own foveal vision."

The paper, published in the journal Neuron, also reveals how zebrafish are able to see UV light and actively use UV vision to see their prey.

Prof Baden added: "This is effectively super-vision."

"Zebrafish prey is really hard to see with human vision, which ranges from 'red' to 'blue'. However, beyond blue, zebrafish can also see UV, and as it turns out their prey is easily spotted when illuminated with the UV component in natural sunlight. Our research found that without UV-vision, zebrafish have a much harder time spotting their prey."

While humans don't have this skill, the similarities between the zebrafish's acute zone and our own fovea have provided researchers with a model to further investigate how our eyes work and how we're able to see in such high detail.

Scientists now have the possibility to manipulate visual functions in the zebrafish acute zone to see how this affects their sense of sight. These sorts of tests aren't possible in humans so doing them in fish will give researchers new insights into the function and dysfunction of extreme spatial acuity vision.

Credit: 
University of Sussex

Vision: Observing the world during childhood affects the rest of life

Much of what we will be as adults depends on the first years of life, on what we simply observe happening around us and not only on what we are taught explicitly. This also applies to the development of the visual system. This is the conclusion reached by two neuroscientists at SISSA (Scuola Internazionale Superiore di Studi Avanzati), who, for the first time, have experimentally shown the importance of passive visual experience for the maturation and proper functioning of some key neurons involved in the process of vision. The research, published in Science Advances, is a fundamental step towards understanding learning mechanisms during development. It also has potential clinical implications for the study of new visual rehabilitation therapies, and technological implications, as it could lead to improvements in the learning algorithms employed by artificial vision systems.

From the early stages of gestation, our visual system is subject to continuous stimuli that become increasingly intense and structured after birth. They are at the centre of the learning mechanisms that, according to some theories, are fundamental to the development of vision. "Learning comes in two flavours: either 'supervised' (i.e., guided by a 'teacher') or 'unsupervised' (i.e., based on spontaneous, passive exposure to the environment)" explains Davide Zoccolan, director of the Visual Neuroscience Lab of SISSA and lead researcher. "The first is the one we can all associate with our parents or teachers, who direct us to the recognition of an object. The second one happens spontaneously, passively, when we move around the world observing what happens around us."

Giulio Matteucci and Davide Zoccolan have studied the role of spontaneous visual experience and, in particular, the role of the temporal continuity of visual stimuli. This property of natural visual experience is considered fundamental for the maturation of the visual system by some theoretical models that mathematically describe the biological learning processes.

To test this hypothesis, the researchers exposed two groups of young rodents daily to different visual environments. "We played a series of videos, in either their original version or after randomly shuffling the single frames (or images), thus destroying the temporal continuity of visual experience," explain the scientists. "In the subjects exposed to this discontinuous visual flow we observed the impairment of the maturation of some cells of the visual cortex called 'complex'. These neurons play a key role in visual processing: they allow recognising the orientation of the contour of an object regardless of its exact position in the visual field, a perceptual ability that only recently has been implemented in artificial vision systems. Having shown that their maturation is highly sensitive to the degree of continuity of visual experience is the first direct experimental confirmation of the theoretical prediction."
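
A minimal sketch of this frame-shuffling manipulation (the array sizes are hypothetical; this is not the authors' actual stimulus code):

    import numpy as np

    rng = np.random.default_rng(0)

    # A hypothetical movie: 600 frames of 64x64-pixel grayscale images.
    movie = rng.random((600, 64, 64))

    # Control condition: frames in their original order (temporal continuity preserved).
    continuous = movie

    # Manipulated condition: the same frames in random order, which keeps the image
    # statistics of each frame but destroys the temporal continuity of the stimulus.
    shuffled = movie[rng.permutation(movie.shape[0])]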

These observations show the importance of passive visual experience for the development of the visual system. They also indicate that forms of spontaneous learning underlie the development of at least some elementary visual functions, while other forms of learning only come into play later, for the acquisition of more specific and sophisticated skills.

These are results with potential clinical and technological implications, as Zoccolan explains. "In some developing countries, there are children suffering from congenital cataract, who, after the surgery to remove it, have to develop substantially from scratch their visual recognition skills. Already today, some rehabilitative approaches exploit the temporal continuity of specific visual stimuli (for example, geometric shapes in motion) to teach these patients to discriminate visual objects. Our results confirm the validity of these approaches, revealing the neuronal mechanisms behind it and suggesting possible improvements and simplifications," concludes the neuroscientist. "Furthermore, the development of artificial visual systems currently uses mainly 'supervised' learning techniques, which require the use of millions of images. Our results suggest that these methods should be complemented by 'unsupervised' learning algorithms that mimic the processes at work in the brain, to make training faster and more efficient".

Credit: 
Scuola Internazionale Superiore di Studi Avanzati

Electronic health records fail to detect up to 33% of medication errors

Despite improvements in their performance over the past decade, electronic health records (EHRs) commonly used in hospitals nationwide fail to detect up to one in three potentially harmful drug interactions and other medication errors, according to scientists at University of Utah Health, Harvard University, and Brigham and Women's Hospital in Boston. In tests using simulated medical records, the researchers found that EHR systems consistently failed to detect errors that could injure or kill patients.

"EHRs are supposed to ensure safe use of medications in hospitals," says David C. Classen, M.D., the study's corresponding author and a professor of internal medicine at U of U Health. "But they're not doing that. In any other industry, this degree of software failure wouldn't be tolerated. You would never get on an airplane, for instance, if an airline could only promise it could get you to your destination safely two-thirds of the time."

The study appears in the May 29 issue of JAMA Network Open.

First deployed in the 1960s, EHRs replaced written medical records and manual filing systems. They became almost universally adopted in the early 21st century after an Institute of Medicine report found that medical errors accounted for 1 million inpatient injuries and 98,000 deaths annually. According to the report, medication safety problems were the most frequent cause of preventable harm.

Medical professionals hoped that widespread use of EHRs would reduce this problem. The computerized systems are designed to issue warnings to doctors if their orders for medication could result in allergic reactions, adverse drug interactions, excessive doses, or other potentially harmful effects. However, recent studies suggest that medication safety problems, and safety problems in general, continue to occur in hospitals at a high rate despite the almost ubiquitous use of EHRs.

One snag is that hospitals must customize and adapt their EHR software to meet their own needs, Classen says. This is a complex process that makes it difficult to keep up with every change in drug safety. So, for example, a serious drug interaction that would trigger EHR warnings at one hospital might not trigger them at another.

"Although EHRs are now widely used, their safety performance continues to vary from hospital to hospital," says David W. Bates, M.D., a study co-author and chief of the Division of General Internal Medicine and Primary Care at Brigham and Women's Hospital in Boston. "Hospitals decide what drug-related decision supports to turn on within their systems. They have a great deal of latitude around this."

However, Classen says federal regulators only evaluate EHR systems in their factory configuration, meaning that whatever alterations hospitals make after installation aren't accounted for.

To determine the effectiveness of EHRs in real-world settings, Classen, Bates, and colleagues studied the results of tests conducted with an EHR safety evaluation tool, the Leapfrog CPOE EHR test, which simulates actual drug orders that have harmed or could potentially harm patients. Almost all of the scenarios were based on actual adverse drug events that harmed or killed patients in the real world.

In one scenario, for instance, a 52-year-old woman is admitted to the hospital with pneumonia. Prior to hospitalization, she was taking warfarin, a blood-thinning medication, once a day to combat deep vein thrombosis. After admission, she receives warfarin three times a day. This excessive dosage goes undetected by the hospital's EHR system for five days. As a result, the patient has a large hemorrhage and dies of causes directly related to the overdose of warfarin.
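To make this kind of failure concrete, the sketch below shows a minimal, rule-based dose-frequency check of the sort a CPOE/EHR alerting module is meant to perform on an order like the one in the scenario. It is a hypothetical illustration only: the drug table, threshold, and function names are assumptions made for this example, not the logic of any real EHR product or of the Leapfrog test.

```python
from dataclasses import dataclass

# Hypothetical maximum daily dosing frequencies. Real CPOE systems draw on
# curated drug knowledge bases, not hard-coded values like these.
MAX_DAILY_FREQUENCY = {
    "warfarin": 1,  # typically dosed once daily
}


@dataclass
class MedicationOrder:
    drug: str
    times_per_day: int


def check_order(order: MedicationOrder) -> list[str]:
    """Return alert messages for a single medication order."""
    alerts = []
    limit = MAX_DAILY_FREQUENCY.get(order.drug.lower())
    if limit is not None and order.times_per_day > limit:
        alerts.append(
            f"ALERT: {order.drug} ordered {order.times_per_day}x/day "
            f"exceeds the usual maximum of {limit}x/day"
        )
    return alerts


# The scenario above: warfarin ordered three times a day
print(check_order(MedicationOrder("warfarin", 3)))
# ['ALERT: warfarin ordered 3x/day exceeds the usual maximum of 1x/day']
```

A production system would also consider dose, route, indication, lab values, and patient factors rather than a single frequency threshold, which is part of why hospital-specific configuration matters so much.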

Scenarios like this one were fed directly into EHR systems at 2,314 hospitals nationwide to see whether the systems would catch the errors. The tests were conducted over a 10-year span, from 2009 to 2018.

The researchers found that, in 2009, these systems correctly issued warnings or alerts about potential medication problems only 54% of the time. By 2018, EHRs detected about 66% of these errors.

"These systems meet the most basic safety standards less than 70% of the time," the researchers conclude. "These systems have only modestly increased their safety during a 10-year period, leaving critical deficiencies in these systems to detect and prevent critical safety issues."

In addition, Classen notes that the hospitals in this study used the EHR evaluation tool on a voluntary basis to improve patient safety and care. Many hospitals do not participate in such evaluations, suggesting that the true safety performance of U.S. hospitals could be worse than the study found.

Credit: 
University of Utah Health

Images in neurology: Brain of patient with COVID-19, smell loss

What The Study Did: This case report describes a 25-year-old female radiographer with no significant medical history who had been working in a COVID-19 ward and presented with a mild dry cough that lasted for one day, followed by persistent severe anosmia (loss of smell) and dysgeusia (an impaired sense of taste).

Authors: Letterio S. Politi, M.D., of the IRCCS Istituto Clinico Humanitas in Rozzano, Italy, is the corresponding author.

To access the embargoed study, visit our For The Media website at this link: https://media.jamanetwork.com/

Editor's Note: Please see the articles for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

People more likely to accept nudges if they know how they work and how effective they are

The more people know about when and why behavioural interventions are being used, and how effective they are, the more likely they are to accept their use to change their behaviour, according to recent research from Queen Mary University of London and the University of Oxford.

The study, published in the journal Behavioural Public Policy, investigated people's views on how acceptable they found the use of behavioural interventions, or nudges, in a variety of different situations.

Over 1,700 participants from the US and UK were presented with examples of genuine behavioural interventions used by policy makers and asked how acceptable they found them, based on several factors including the intervention's effectiveness, how easy it was to identify how the nudge works, and which expert was proposing it.

The results showed that, when judging whether a nudge is acceptable, knowing how an intervention works and how effective it is mattered much more to people than who was 'nudging' them.

Dr Magda Osman, Reader in Experimental Psychology at Queen Mary, said: "This study reinforces the position that people don't just accept behavioural interventions without consideration, and shows that the US and UK public make informed decisions about the acceptability of behavioural interventions. Our findings suggest that, whatever the combination of methods used to achieve behavioural change, accepting them comes down to where they are going to be used, how they are going to be used, and how likely they are to work."

Dr Natalie Gold, a Senior Research Fellow at the University of Oxford and first author of the study, added: "Much academic debate has focused on the transparency aspect of nudges; however, here we show that effectiveness is just as important for people to accept them. Whilst politicians place a lot of emphasis on the use of nudges, in reality they make a small overall contribution to behaviour change, and their use needs to be considered alongside traditional policy measures in order to be effective."

The researchers also found that context was important, with study participants widely accepting the use of nudges in health and wellbeing areas, but tending to think that nudges should not be used in personal finance contexts.

"Pre Covid-19, as well as now, we have clear evidence that many people have few savings, and behavioural interventions have been proposed as a way to target matters of this kind. However, these findings show that any policy makers need to act with caution as people find behavioural interventions in the domain of personal finances the least acceptable," said Dr Osman.

"We can only speculate why people are more accepting of health-related nudges, but one consideration could be that usually with health there's a general agreement that certain things are unhealthy, whereas for financial decisions the right approach is often dependent on the person and their attitude to risk. Another reason for this difference could be a continued lack of trust towards the financial services industry following on from the 2008 financial crisis. The conditions we face during the current Covid-19 pandemic present a further test of the public's acceptability of the use of behavioural interventions alongside other typical policy measures."

Behavioural interventions, or nudges, are psychology-based tools used to generate a change in behaviour that, in theory, is better for the individual and for society.

They are used in public policy all over the world, from getting people to pay their taxes on time to encouraging healthy habits, such as reduced alcohol consumption.

Credit: 
Queen Mary University of London