Tech

VRK1: a protein linked to reduced survival in patients with neuroblastoma

Researchers from the Departments of Cell Biology and Medical Physiology at the University of Seville have found that high expression of the human protein VRK1 is associated with tumour aggressiveness and low survival among neuroblastoma patients. Aggressive neuroblastoma is one of the most common solid childhood cancers and causes disproportionately high mortality in affected children. Although advances have been made in recent years, the outlook for recovery in children affected by aggressive neuroblastoma remains poor, and a better understanding of this tumour's biology is needed to develop new treatments and prognostic tools.

Researchers have characterised the function of VRK1 in neuroblastoma tumour cells and have determined that this protein is essential for tumour cell growth and proliferation. "By studying the expression of this protein in tumours, we were able to identify a priori patients where tumour progression is going to be worse, even in groups where current tools do not predict that behaviour," notes Francisco M. Vega.

This study suggests that VRK1 works in conjunction with other oncogenes such as MYCN, which is frequently amplified in this cancer, to boost tumour progression and make it more aggressive. The researchers therefore suggest that inhibiting VRK1 could be a new strategy for cancer therapy in neuroblastoma. "VRK1 is a protein kinase. These are some of the best targets for targeted cancer treatment, as we can potentially produce inhibitors in the laboratory that override their activity," explains Professor Vega.

Credit: 
University of Seville

Screening for endocrine disruption in artificial zebrafish for long-term risk assessment

image: Picture of zebrafish

Image: 
Korea Institute of Science and Technology (KIST)

Newly developed chemical substances and cosmetics can be distributed only after passing risk assessments for human health and the environment. Distribution is prohibited for products that fail the safety assessment for endocrine-disrupting chemicals (EDCs), as happened with bisphenol A (BPA), which was removed from the market. Environmental toxicity assessment of EDCs generally requires OECD testing on three species: the water flea Daphnia pulex, green algae, and zebrafish. Zebrafish are especially valuable because they share similarity with more than 90% of human genes. However, zebrafish are classified as vertebrates under animal-testing rules, so experiments on them are restricted and require permission from a hospital or designated organization owing to the ethics of animal testing.

To address this problem, the Korea Institute of Science and Technology (KIST) announced that a collaborative research team, led by Dr. Young Jun Kim, leader of environmental safety at KIST Europe, and Professor Hyunjoon Kong from the University of Illinois at Urbana-Champaign, has developed a way to assess long-term toxicity and risk by cultivating organoids that mimic the liver of zebrafish.

Measurement of vitellogenin (VTG), a biomarker for endocrine disruptors, in zebrafish has been widely used in toxicology. However, zebrafish organoids, developed as an alternative to using actual zebrafish for toxicity assessment, have been of limited use because the cultured tissues could not produce high concentrations of VTG. The collaborative research team used polyethylene glycol (PEG) to produce organoid scaffolds on which to cultivate zebrafish liver cells. As a result, the team was able to culture zebrafish liver cells that combined and assembled themselves, maintaining their shape for 28 days.

Based on this cultivation method, the research team developed artificial zebrafish liver organoids suitable for chronic toxicity testing, supporting long-term effect analysis over more than six weeks. The team expects that this artificial liver will yield results similar to direct tests on zebrafish, thereby replacing animal testing and avoiding its ethical problems. In particular, the three-dimensional zebrafish liver cell biomimetic system developed by the team makes it possible to measure the long-term environmental effects of EDCs in a shorter time.

"KIST Europe has been focusing on the research of 'framework development for the adverse outcome pathway of EDCs in the ecosystem' based on our accumulated work and research experiences in the environmental safety area since 2018," KIST Europe Director Joon-kyeong Kim said. "We will keep putting efforts to support the tangible safety and medical technology for the public based on our abilities in toxicity experiments and in developing alternative methods to animal testing."

"Our short-term goal is to secure technologies regarding potential toxicity assessment based on global-level alternatives to animal testing, and to arrange the foundation for domestic technology transfer," Young Jun Kim, who led the collaborative research team, said. "In the future, we plan to focus on developing alternative methods to animal testing by developing a toxicity information alert system that can analyze the effects of various EDCs on the environment."

Credit: 
National Research Council of Science & Technology

Impacts of COVID-19 emissions reductions remain murky in the oceans

image: Off the coast of Hawaii, a Woods Hole Oceanographic Institution (WHOI) Hawaii Ocean Time-series Station buoy makes measurements of air-sea carbon dioxide, seawater pH, and other oceanographic parameters.

Image: 
Al Plueddemann, WHOI.

As the COVID-19 pandemic took hold in the first half of 2020, humans around the world stopped moving and making, resulting in a 9% drop in the greenhouse gas emissions at the root of climate change.

Almost overnight, the Himalayas became visible from a distance for the first time in years. Rivers flowed free of toxic pollutants and the air sparkled with blue skies in major cities like New Delhi and Los Angeles. While internet rumors of swans and dolphins returning to Venetian canals were debunked, the idea that "nature is healing" in 2020 quickly took root.

Unfortunately, any silver lining from the pandemic remains murky in the oceans.

Nicole Lovenduski, associate professor of atmospheric and oceanic sciences and director of the Ocean Biogeochemistry Research Group at the Institute of Arctic and Alpine Research, delved into the data and found no detectable slowing of ocean acidification due to COVID-19 emissions reductions. Even at emissions reductions four times the rate of those in the first half of 2020, the change would be barely noticeable.

"It's almost impossible to see it in pH," said Lovenduski. "So has this solved ocean acidification? No, it has not."

Lovenduski shared the results Friday, Dec. 11 at the American Geophysical Union 2020 Fall Meeting. The findings will also be submitted to the journal Geophysical Research Letters.

On the bright side, this study yields important insights into how to track changes in ocean carbon going forward. Lovenduski and fellow oceanographers now have a better idea of where to look for signs that emissions reductions are having an impact on the Earth system, what those signs will look like, and what resources they will need to gather that data.

The study results also put COVID-19 emissions reductions in sharp perspective as short-term, one-time gains in comparison with the committed, long-term cuts needed to reduce the impacts of human-caused climate change.

"It's a little bit wild to think that that complete economic shutdown of the world didn't do anything immediately that we could detect in terms of ocean acidification or atmospheric carbon. But it's also a little bit wild to think that this reduction in emissions is what it will take every single year to get us back to something that's a healthy version of our climate," said Lovenduski.

Lovenduski analyzed data shared by a group of Canadian modelers, who ran a suite of experiments to see how the climate has been impacted by the reduction in emissions in 2020. She used a fingerprinting technique on the data, often used to differentiate humans' impacts on the climate from non-human impacts like volcanic eruptions and sunspots. Using this method allowed her to separate COVID-19 emissions reductions from non-human influences on the oceans.
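
Fingerprinting, in essence, asks whether the average difference between model runs with and without the COVID-19 emissions cut rises above the spread caused by internal variability. Here is a minimal Python sketch of that signal-to-noise test; the arrays, the size of the imposed signal and the two-sigma threshold are illustrative assumptions, not the study's actual method:

```python
import numpy as np

# Toy ensembles of simulated surface-ocean pH:
# rows = ensemble members, columns = months.
rng = np.random.default_rng(1)
baseline = 8.05 + 0.01 * rng.standard_normal((30, 24))  # no emissions cut
covid = baseline + 0.001                                # tiny forced signal added

signal = covid.mean(axis=0) - baseline.mean(axis=0)  # estimated forced response
noise = baseline.std(axis=0, ddof=1)                 # internal variability
detectable = np.abs(signal) > 2 * noise              # two-sigma detection criterion
print(detectable.any())  # False: the pH signal is lost in the noise
```

With a forced signal this small relative to the ensemble spread, the test returns no detection, mirroring the study's conclusion that the pH change is effectively invisible.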

While she found no perceptible change in ocean acidity, her analysis showed that by 2021, the oceans were already absorbing slightly less carbon from the atmosphere due to COVID-19 emissions reductions.

"What this suggests is that pretty much immediately, the exchange of carbon between the ocean and atmosphere responds to the change in the loading of carbon in the atmosphere because we've decreased our emissions," said Lovenduski.

The ocean is a major climate change buffer, absorbing a large fraction of the carbon dioxide that human activity emits into the atmosphere every year. This mitigates the immediate impacts of climate change, such as rising global temperatures, but heats up the ocean instead, causing the water to expand and contribute to rising sea levels.

Increased carbon in the ocean is also the cause of ocean acidification, which is detrimental to coral reefs and a significant swath of ocean life. However, if we mitigate our emissions year after year to avoid the worst global warming scenarios, we have a chance to slow the rate of ocean acidification in the long term, according to Lovenduski.

While she doesn't have the dramatic good news that friends and neighbors were hoping to hear, this work offers clues as to what it will take to stop the worsening impacts of global climate change in the world's oceans.

"This sudden precipitous drop in emissions is a big deal," she said. "It can offer insight into what might happen if we actually follow a plan like the Paris Climate Agreement."

Credit: 
University of Colorado at Boulder

The edible marine snail genus Tegula now contains a new species

image: The shell of Tegula xanthostigma (left) and the shell of Tegula kusairo (right).

Image: 
Daishi Yamazaki

Recognizing species is important for understanding regional biodiversity and for environmental conservation. However, taxonomic identity is sometimes obscure even with the organisms that are closest to human life.

Researchers from Tohoku University and Okayama University have found and described a new species in the marine snail genus Tegula, a very familiar and edible group. The new species has been named Tegula kusairo Yamazaki, Hirano, Chiba & Fukuda, 2020.

Tegula xanthostigma is a common and edible marine snail species with two shell colour forms: black and brown-green. Historically, the two shell forms have been treated as intraspecific variations of a single nominal species, T. xanthostigma. However, recent molecular phylogenetic and ecological studies have demonstrated that these two forms are genetically distinct and differ in habitat use. It is thus reasonable to separate them into two different species.

To do so, researchers performed morphological analysis and assessed literature from the 19th century onwards. While the black shell corresponds to T. xanthostigma, the brown-green shell, which has never had a valid scientific name, is now considered its own species: T. kusairo.

The differences between the two species are:

T. xanthostigma's shell surface is black while the underside is bright green. The species is found on rocky shores facing the open sea. The distribution area is from the Japanese mainland to the northern part of Vietnam.

T. kusairo's shell surface is brown-green while the underside is pale yellow or white. The shell is smaller than that of T. xanthostigma. This species mainly inhabits sheltered coastal environments, specifically muddy gravel bottoms in the upper to middle intertidal zones.

Its distribution is limited to sheltered Japanese coastal areas such as the Seto Inland Sea, Ōmura Bay in Nagasaki Prefecture and Kinkō Bay in Kagoshima Prefecture, as well as the southern part of South Korea.

Familiar marine snail groups caught for food still contain undescribed species. A similar case occurred in 2017, when the turban shell, well known in Japan, was given a new classification (Turbo sazae Fukuda, 2017). In short, people did not know what species they were eating.

"Our results highlight the difficulty of identifying and classifying marine molluscs," said Daishi Yamazaki, co-author of the study. "It also showed that integrating morphology, ecology, and genetics provides a powerful method to detect species' boundaries."

In the future, the researchers hope to establish a classification system of marine snail groups, including the Tegula genus, and clarify the process of species diversification in East Asia.

Credit: 
Tohoku University

Single-crystal technology holds promise for next-generation lithium-ion batteries

image: A nickel-rich single crystal created by the PNNL team.

Image: 
Image courtesy of PNNL

RICHLAND, Wash. - A promising technology under development by major battery makers has become even more attractive, thanks to researchers who have taken an unprecedented look at one key barrier to better, longer-lasting lithium-ion batteries.

Scientists at the U.S. Department of Energy's Pacific Northwest National Laboratory report new findings about how to make a single-crystal, nickel-rich cathode hardier and more efficient. The team's work on the cathode, one critical component in the lithium-ion batteries that are common in electric vehicles today, appears in the Dec. 11 issue of the journal Science.

Researchers around the globe are working to create batteries that deliver more energy, last longer and are less expensive to produce. Improved lithium-ion batteries are critical for broader adoption of electric vehicles.

The challenges are plenty. A battery's simple appearance belies its complexity, and controlling the complex molecular interactions within is essential for the device to operate properly. Constant chemical reactions take their toll, limiting how long a battery lasts and influencing its size, cost and other factors.

The promise of a nickel-rich cathode: More energy capacity

Scientists are working on ways to store more energy in the cathode materials by increasing nickel content. Nickel is on the drawing board of lithium-ion battery makers largely because of its relatively low cost, wide availability and low toxicity compared to other key battery materials, such as cobalt.

"Nickel-rich cathode materials have real potential to store more energy," said Jie Xiao, corresponding author of the paper and group leader of PNNL's battery research program. "But large-scale deployment has been a challenge."

While nickel holds great promise, in high amounts it can pose problems in batteries. The more nickel in the material's lattice, the less stable the cathode. High nickel content can increase unwanted side reactions, damaging the material and making storage and handling very difficult.

Exploiting all the benefits from more nickel while minimizing the drawbacks poses a challenge.

Currently the most common nickel-rich cathode is in the form of polycrystals - aggregates of many nanocrystals in one larger particle. These carry advantages for storing and discharging energy faster. But the polycrystals sometimes break down during repeated cycling. This can leave much of the surface area exposed to electrolyte, accelerating unwanted chemical reactions induced by high nickel content and generating gas. This irreversible damage results in a battery with a nickel-rich cathode that fails faster and raises safety concerns.

Of single crystals, ice cubes and lithium-ion batteries

Scientists like Xiao are trying to sidestep many of these problems by creating a single-crystal, nickel-rich cathode. The PNNL researchers developed a process to grow high-performance crystals in molten salts - sodium chloride, common table salt - at high temperature.

What's the advantage of a single crystal compared to a polycrystalline material? Think of keeping your food cool while camping. A solid block of ice melts much more slowly than the same amount of ice that comes in small cubes; the block of ice is more resistant to damage from higher temperatures and other outside forces.

It's similar with nickel-rich cathodes: An aggregate of small crystals is much more vulnerable to its surroundings than a single crystal under certain conditions, especially when there's high nickel content, since nickel is prone to induce unwanted chemical reactions. Over time, with repeated battery cycles, the aggregates are ultimately pulverized, ruining the cathode's structure. That's not so much a problem when the amount of nickel in the cathode is lower; under such conditions, a polycrystalline cathode containing nickel offers high power and stability. The problem becomes more pronounced, though, when scientists create a cathode with more nickel--a cathode truly rich in nickel.

Cathode's microcracks reversible, preventable

The PNNL team discovered one reason why a single-crystal, nickel-rich cathode breaks down: It's due to a process known as crystal gliding, where a crystal begins to break apart, leading to microcracks. They found that the gliding is partially reversible under certain conditions and have proposed ways to avoid the damage altogether.

"With the new fundamental understanding, we will be able to prevent the gliding and microcracks in the single crystal. This is unlike the damage in the polycrystalline form, where the particles are pulverized in a process that is not reversible," said Xiao.

It turns out that gliding motions within the crystal's lattice layers are at the root of microcracks. The layers move back and forth, like cards in a deck as they're shuffled. The gliding occurs as the battery charges and discharges - lithium ions depart from and return to the cathode, straining the crystal ever so slightly each time. Over many cycles, the repeated gliding results in microcracks.

Xiao's team learned that the process can partially reverse itself through the natural actions of the lithium atoms, which create stresses in one direction when the ions enter the crystal lattice and in the opposite direction when they leave. But the two actions don't completely cancel each other out, and over time, microcracks will occur. That's why single crystals ultimately fail, though they don't break down into small particles like their polycrystalline counterparts.

The team is pursuing several strategies to prevent the gliding. The researchers have discovered that operating the battery at a common voltage - around 4.2 volts - minimizes damage while still within the normal range of lithium-ion batteries for electric vehicles. The team also predicts that keeping the size of a single crystal below 3.5 microns may avoid damage even at higher voltages. And the team is exploring ways to stabilize the crystal lattice to better accommodate the arrival and departure of lithium ions.

The team estimates that the single-crystal, nickel-rich cathode packs at least 25 percent more energy compared to the lithium-ion batteries used in today's electric vehicles.

Now, PNNL researchers led by Xiao are working with Albemarle Corporation, a major specialty chemical manufacturing company and one of the world's leading producers of lithium for electric vehicle batteries. In a collaboration funded by DOE, the team will research the impacts of advanced lithium salts on the performance of single-crystal nickel-rich cathode materials by demonstrating the process at kilogram scale.

Credit: 
DOE/Pacific Northwest National Laboratory

Calibrating kidney function for cancer patients

image: To tame a tumor, clinicians must give a patient the right amount of drugs. Too little, and it will not destroy the tumor. Too much, and it could damage tissues beyond the tumor. Because most drugs are removed from the body through the kidney, it is critical to know how well the kidney functions. Previous methods for estimating kidney function underestimated how quickly kidneys were clearing out the drugs.

Image: 
Ben Wigler/CSHL, 2020

Cold Spring Harbor Laboratory (CSHL) scientists have developed a new model for assessing kidney function in cancer patients as part of an international collaboration that involved contributions from the United Kingdom and Sweden. The model gives clinicians a new tool to aid dose determinations for chemotherapy to treat a patient's disease while limiting the drugs' harmful effects.

Chemotherapy drugs are usually injected into the bloodstream and removed from the body as blood filters through the kidneys. Sometimes the drugs harm the kidneys. Understanding how efficiently a patient can filter blood is essential for predicting how the body will process a drug and monitoring treatment effects. To be sure a treatment reaches tumor cells at the right dose, patients whose kidneys remove compounds from their blood efficiently will need more drug than someone whose kidney function is impaired. Calculating an appropriate dose is a crucial balancing act. CSHL Assistant Professor Tobias Janowitz, the senior author of the study, says:

"This is the constant balance that we are trying to get right in cancer therapy across the board. We have to be careful to make sure that we don't put the patient at risk. And we have to make sure that we dose the drugs in a way that we get this balance right between how effective is the drug and how many side effects does the drug cause."

Clinicians assess kidney function before every round of cancer treatment. Measuring the kidneys' filtration rate directly is costly and requires specialized facilities, so instead, it is frequently estimated by measuring how much creatinine--a chemical waste product--is in the blood. The relationship between creatinine levels and filtration rate depends on several factors, including age, gender, and body mass index. Mathematical models are used to integrate these data.
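
The new CSHL model itself is not reproduced in this release, but the long-standing Cockcroft-Gault equation illustrates how creatinine-based estimates of this class fold patient factors into a single filtration figure. The sketch below shows that classic formula purely as a familiar example; it is not the model developed in the study:

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Classic Cockcroft-Gault estimate of creatinine clearance (mL/min)."""
    crcl = (140 - age_years) * weight_kg / (72 * serum_creatinine_mg_dl)
    return 0.85 * crcl if female else crcl

# Example: a 60-year-old woman weighing 70 kg with serum creatinine 1.0 mg/dL
print(round(cockcroft_gault(60, 70, 1.0, female=True), 1))  # ~66.1 mL/min
```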

Previous models were developed using data from patients with kidney disease, whose failing kidney function must be closely monitored. But it hasn't been clear how those models represent what's happening in patients with cancer, Janowitz says. "Cancer patients may have a unique physiology, and most of them don't have impaired kidney function. So, it is conceivable that the models that are available don't serve cancer patients so well."

Janowitz and his colleagues, including CSHL fellow Hannah Meyer, developed their model using data from patients at three cancer centers. The new model, described in the journal Clinical Cancer Research, is compatible with two widely used methods for measuring creatinine and provides enough precision to guide clinical decisions. Follow-up studies will be needed to determine whether using the new model improves treatment outcomes for cancer patients.

Credit: 
Cold Spring Harbor Laboratory

COVID-19 lockdown causes unprecedented drop in global CO2 emissions in 2020

video: From Future Earth and the Global Carbon Project, this animation shows the atmosphere as a "bucket" filling with greenhouse gas pollution, from 1870 to 2020.

Image: 
Concept by Rob Jackson, visualisation by Alistair Scrutton and Jerker Lokrantz.

The global COVID-19 lockdowns caused fossil carbon dioxide emissions to decline by an estimated 2.4 billion tonnes in 2020 - a record drop according to researchers at Future Earth's Global Carbon Project.

The fall is considerably larger than previous significant decreases - 0.5 (in 1981 and 2009), 0.7 (1992), and 0.9 (1945) billion tonnes of CO2 (GtCO2). It means that in 2020 fossil CO2 emissions are predicted to be approximately 34 GtCO2, 7% lower than in 2019.
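
As a quick back-of-envelope check on those figures (our illustration, not part of the study), the 2.4 GtCO2 drop and the 34 GtCO2 projection together imply 2019 emissions of roughly 36.4 GtCO2, consistent with the quoted 7% decline:

```python
emissions_2020 = 34.0                         # GtCO2, projected for 2020
drop = 2.4                                    # GtCO2, estimated decline
emissions_2019 = emissions_2020 + drop        # ~36.4 GtCO2
print(round(100 * drop / emissions_2019, 1))  # ~6.6%, i.e. roughly 7%
```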

Emissions from transport account for the largest share of the global decrease. Those from surface transport, such as car journeys, fell by approximately half at the peak of the COVID-19 lockdowns. By December 2020, emissions from road transport and aviation were still below their 2019 levels, by approximately 10% and 40%, respectively, due to continuing restrictions.

Total CO2 emissions from human activities - from fossil CO2 and land-use change - are set to be around 39 GtCO2 in 2020.

The release of this year's Global Carbon Budget comes ahead of tomorrow's fifth anniversary of the adoption of the UN Paris Climate Agreement, which aims to reduce greenhouse gas emissions to limit global warming. Cuts of around 1 to 2 GtCO2 are needed each year on average between 2020 and 2030 to limit climate change in line with its goals.

"Given that we need to reduce global emissions by more than 7% year over year through 2030, this analysis shows that social responses alone will not drive the sustained reductions needed to effectively combat climate change," says Josh Tewksbury, Interim Executive Director at Future Earth. "Alongside energy transformations, smart policy in areas like emissions-free transportation and the future of work may help lock in these observed reductions."

Five years on from the landmark agreement, the international team behind the annual carbon update say growth in global CO2 emissions had begun to falter, with emissions increasing more slowly in recent years, which could be partly a response to the spread of climate policy. For the decade prior to 2020, fossil CO2 emissions decreased significantly in 24 countries while their economies continued to grow.

However, the researchers warn that it is too early to say how much emissions will rebound by during 2021 and beyond, as the long-term trend in global fossil emissions will be largely influenced by actions to stimulate the global economy in response to the COVID-19 pandemic.

Prof Corinne Le Quéré, Royal Society Research Professor at UEA's School of Environmental Sciences, contributed to this year's analysis. She said: "All elements are not yet in place for sustained decreases in global emission, and emissions are slowly edging back to 2019 levels. Government actions to stimulate the economy at the end of the COVID-19 pandemic can also help lower emissions and tackle climate change.

"Incentives that help accelerate the deployment of electric cars and renewable energy and support walking and cycling in cities are particularly timely given the extensive disturbance observed in the transport sector this year."

The emissions decrease appears more pronounced in the US (-12%) and EU27 countries (-11%), where COVID-19 restrictions accelerated previous reductions in emissions from coal use. It appears least pronounced in China (-1.7%), where the effect of COVID-19 restrictions on emissions occurred on top of rising emissions. In addition, restrictions in China occurred early in the year and were more limited in their duration, giving the economy more time to recover.

In the UK, which first introduced lockdown measures in March, emissions are projected to decrease about 13%. The large decrease in UK emissions is due to the extensive lockdown restrictions and the second wave of the pandemic.

In India, where fossil CO2 emissions are projected to decrease about 9%, emissions were already lower than normal in late 2019 because of economic turmoil and strong hydropower generation, and the COVID-19 effect is potentially superimposed on this changing trend.

For the rest of the world the effect of COVID-19 restrictions occurred on top of rising emissions, with emissions this year projected to decrease by about 7%.

Globally, the peak of the decrease in emissions in 2020 occurred in the first half of April, when lockdown measures were at their maximum, particularly across Europe and the USA.

Emissions from industry, for example metal production, chemicals, and manufacturing, reduced by up to a third during the COVID-19 lockdown in spring. However, they could already be back up to near or even above 2019 levels by now.

Despite lower emissions in 2020, the level of CO2 in the atmosphere continues to grow - by about 2.5 parts per million (ppm) in 2020 - and is projected to reach 412 ppm averaged over the year, 48% above pre-industrial levels.

Lead researcher Prof Pierre Friedlingstein, of the University of Exeter, said: "Although global emissions were not as high as last year, they still amounted to about 39 billion tonnes of CO2, and inevitably led to a further increase in CO2 in the atmosphere. The atmospheric CO2 level, and consequently, the world's climate, will only stabilize when global CO2 emissions are near zero."

Preliminary estimates based on fire emissions in deforestation areas indicate that emissions from deforestation and other land-use change for 2020 are similar to the previous decade, at around 6 GtCO2. Approximately 16 GtCO2 was released, primarily from deforestation, while the uptake of CO2 from regrowth on managed land, mainly after agricultural abandonment, was just under 11 GtCO2. Measures to better manage land could both halt deforestation and help increase the CO2 sink from regrowth.

Deforestation fires were lower this year compared to 2019 levels, which saw the highest rates of deforestation in the Amazon since 2008. In 2019 deforestation and degradation fires were about 30% above the previous decade, while other tropical emissions, mainly from Indonesia, were twice as large as the previous decade because unusually dry conditions promoted peat burning and deforestation.

Land and ocean carbon sinks continue to increase in line with emissions, absorbing about 54% of the total human-induced emissions.

Credit: 
Future Earth

Rapid lateral flow immunoassay developed for fluorescence detection of SARS-CoV-2 RNA

image: a, The principle of HC-FIA. b, Representative results on a lateral flow strip. c, The fluorescence analysis device. d, A photograph of the portable suitcase laboratory, which has a length of 55.5 cm, a width of 37 cm and a height of 23 cm, with a weight of 8.5 kg. e, The testing process of the HC-FIA assay. Step 1: nucleic acid hybridization in an incubator at constant temperature. Step 2: determination of the intensity of the fluorescence signals on the test card using the device shown in c. f, Probe distribution in the genome of SARS-CoV-2.

Image: 
SIBET

The COVID-19 pandemic has highlighted the need for rapid and accurate nucleic acid detection at the point of care. To meet this need, scientists from the Suzhou Institute of Biomedical Engineering and Technology have developed a novel amplification-free rapid SARS-CoV-2 nucleic acid detection platform based on hybrid capture fluorescence immunoassay (HC-FIA).

The distinctive feature of this process is the use of the monoclonal antibody S9.6, which recognizes DNA-RNA double-stranded hybrids and enables the conversion of nucleic acid testing into immunofluorescence on a simple lateral flow dipstick.

The whole test procedure involves two steps, namely hybridization and immunofluorescence analysis, and it can be finished in less than an hour.

The process uses optimized DNA probes to target the conserved open reading frame 1ab, envelope protein and nucleocapsid regions of the SARS-CoV-2 genome in clinical throat swab samples. It achieves a limit of detection of 500 copies/mL, comparable to commercial kits based on real-time quantitative polymerase chain reaction (qPCR). In addition, no significant cross-reactivity has been found between the SARS-CoV-2 probes and 55 common pathogens.

The performance of the HC-FIA detection reagent is free of interference by hemoglobin, mucin and various possible drugs that might exist in clinical samples. This assay and test kit can also be adapted for the detection of other viral RNA.

The HC-FIA process described here was supported by a multi-hospital randomized double-blind trial involving 734 samples. The current research not only represents a proof of concept but has also resulted in the development of a commercial test kit for the diagnosis of SARS-CoV-2. The test kit was recently approved by the National Medical Products Administration and received European Conformity (CE) certification.

The test kit requires little in terms of equipment and professional personnel to confirm infection. It can function as a "suitcase laboratory" and thus be used as an on-site detection method for outpatient departments, emergency departments, customs and grassroots disease control.

"We are confident that the HC-FIA test is capable of serving as a sure safeguard in containing the epidemic through reducing community spread and imported cases, especially in developing countries," said WANG Daming, lead researcher of the study.

Credit: 
Chinese Academy of Sciences Headquarters

Tiny bubbles on electrodes key to speeding up chemical processes

New Curtin University-led research has shown that the formation of bubbles on electrodes, usually thought to be a hindrance, can be beneficial. Deliberately added bubbles, or oil droplets, can accelerate processes such as the removal of pollutants like hydrocarbons from contaminated water and the production of chlorine.

Dr Simone Ciampi, from Curtin's School of Molecular and Life Sciences, explained that many industrial processes are electrochemical, meaning the desired chemical reaction to create an end product is assisted by the flow of electrical currents.

"Electrodes assist chemists to achieve required electrochemical reactions, such as in the purification of alumina, and the technology used to produce chlorine for swimming pools," Dr Ciampi said.

"Often over the course of their use, small bubbles of gas begin to form on these electrodes, blocking parts of their surface. These bubbles prevent fresh solution from reaching the electrodes, and therefore, slow down the necessary reactions.

"It was generally thought these bubbles essentially stopped the electrode from working properly, and the appearance of the bubbles was a bad thing. However, our new research suggests otherwise," Dr Ciampi said.

Using fluorescence microscopy, electrochemistry and multi-scale modelling, the research team showed that in the vicinity of bubbles that stick to an electrode surface, valuable chemical reactions occur under conditions where normally such reactions would be considered impossible.

Co-researcher Dr Yan Vogel, also from Curtin's School of Molecular and Life Sciences, said it was these 'impossible' reactions occurring in the corona of bubbles that piqued the team's interest, and warranted further exploration.

"We revealed for the first time that the surrounding surface of an electrode bubble accumulates hydroxide anions, to surprisingly large concentrations," Dr Vogel said.

"This population of negatively charged ions surrounding bubbles is unbalanced by ions of the opposite sign, which was quite unexpected. Usually charged chemical species in solution are generally balanced, so this finding showed us more about the chemical reactivity of bubbles.

"Basically we've learned that surface bubbles can actually speed up electrochemical reactions where small molecules are joined to form large networks of molecules in a polymer, like in camera films or display devices like glucose sensors for blood sugar monitoring."

Credit: 
Curtin University

Energy-efficient magnetic RAM: A new building block for spintronic technologies

image: Schematic representation of Fe3GeTe2-based non-volatile memory prototype. Fe3GeTe2 is a ferromagnet, where its spins (little white arrows) align in the same direction. The orientation of the spins defines the 1 or 0 binary bits. a) Initial state, where the information 0 is recorded. b) To write new information, a small current (orange arrow) is applied, which changes the material from a hard magnet to a soft magnet so that the stored information can be easily modified (say, from 0 to 1). c) Once the current is turned off, the material changes back to a hard magnet, and the information 1 written in stage b) can be maintained for a long time without any external power supply, making it a non-volatile memory.

Image: 
POSTECH & SNU

Researchers at Pohang University of Science and Technology (POSTECH) and Seoul National University in South Korea have demonstrated a new way to enhance the energy efficiency of a non-volatile magnetic memory device called SOT-MRAM. Published in Advanced Materials, this finding opens up a new window of exciting opportunities for future energy-efficient magnetic memories based on spintronics.

In modern computers, random access memory (RAM) is used to store information. SOT-MRAM (spin-orbit torque magnetic RAM) is one of the leading candidates for the next-generation memory technologies that aim to surpass the performance of existing RAMs. SOT-MRAM may operate faster than the fastest existing RAM (SRAM) and maintains information even after the power is switched off, whereas all of today's fast RAMs lose their information as soon as the energy supply is cut. The present level of SOT-MRAM technology falls short of being satisfactory, however, due to its high energy demand: it requires a large energy supply (or large current) to write information. Lowering the energy demand and enhancing the energy efficiency are outstanding problems for the SOT-MRAM.

In the SOT-MRAM, the magnetization directions of tiny magnets store information, and writing amounts to changing the magnetization directions to the desired directions. The magnetization direction change is achieved by a special physical phenomenon called SOT, which modifies the magnetization direction when a current is applied. To enhance energy efficiency, soft magnets are the ideal material choice for the tiny magnets, since their magnetization directions can be easily altered by a small current. But soft magnets are a bad choice for the safe storage of information, since their magnetization direction may be altered even when not intended, due to thermal or other noise. For this reason, most attempts to build the SOT-MRAM adopt hard magnets, because they magnetize very strongly and their magnetization direction is not easily altered by noise. This material choice, however, inevitably makes the energy efficiency of the SOT-MRAM poor.

A joint research team led by Professor Hyun-Woo Lee in the Department of Physics at POSTECH and Professor Je-Geun Park in the Department of Physics at Seoul National University (former associate director of the Center for Correlated Electron Systems within the Institute for Basic Science in Korea) demonstrated a way to enhance the energy efficiency without sacrificing safe storage. They reported that ultrathin iron germanium telluride (Fe3GeTe2, FGT) - a ferromagnetic material with special geometrical symmetry and quantum properties - switches from a hard magnet to a soft magnet when a small current is applied. Thus, when information writing is not intended, the material remains a hard magnet, which is good for safe storage; only when writing is intended does the material switch to a soft magnet, allowing for enhanced energy efficiency.

"Intriguing properties of layered materials never cease to amaze me: the current through FGT induces a highly unusual type of spin-orbit torque (SOT), which modifies the energy profile of this material to switch it from a hard magnet to a soft magnet. This is in clear contrast to SOT produced by other materials, which may change the magnetization direction but cannot switch a hard magnet to a soft magnet," explains Professor Lee.

Experiments by Professor Park's group revealed that this FGT-based magnetic memory device is highly energy-efficient. In particular, the measured magnitude of SOT per applied current density is two orders of magnitude larger than the values reported previously for other candidate materials for the SOT-MRAM.

"Controlling magnetic states with a small current is essential for the next-generation of energy-efficient devices. These will be able to store greater amounts of data and enable faster data access than today's electronic memories, while consuming less energy," notes Dr. Kaixuan Zhang who is a team leader in Professor Park's group, interested in studying the application of correlated quantum physics in spintronic devices.

"Our findings open up a fascinating avenue of electrical modulation and spintronic applications using 2D layered magnetic materials," closed Professor Lee.

Credit: 
Pohang University of Science & Technology (POSTECH)

Beta-blockers display anti-inflammatory effects in advanced liver disease

Approximately 170,000 people die every year in Europe from the direct consequences of advanced chronic liver disease (cirrhosis). In Austria, cirrhosis of the liver is usually due to a fatty liver resulting from excessive alcohol consumption or from overeating and/or a poor diet; thanks to effective antiviral therapies, it is now less commonly due to viral hepatitis.

Chronic liver damage leads to scarring (fibrosis) of the liver tissue, which can ultimately lead to increased blood pressure in the vascular system of the gastrointestinal tract (i.e. portal hypertension).

Portal hypertension can result in serious complications such as ascites and variceal bleeding. A paper recently published by Thomas Reiberger's research group at MedUni Vienna has already shown that increasing severity of portal hypertension is paralleled by inflammatory reactions in the body (systemic inflammation). Pronounced systemic inflammation may ultimately trigger the development of serious complications, such as acute-on-chronic liver failure.

Beta-blocker treatment and portal vein pressure measurement at MedUni Vienna

For many years, beta-blockers have been used as a standard drug treatment but only 50% to 60% of patients achieve a clinically relevant reduction in portal vein pressure. It is therefore necessary to assess the hemodynamic response invasively by means of hepatic vein catheterisation. Over the last few years, beta-blocker therapy and the invasive technique of hemodynamic measurements have been continuously optimised in the Vienna Hepatic Haemodynamic Laboratory at the Division of Gastroenterology and Hepatology of the Department of Medicine III of MedUni Vienna and Vienna General Hospital. Head of the Vienna Hepatic Haemodynamic Laboratory, Thomas Reiberger, explains: "We use portal pressure measurements to ensure that our patients with liver cirrhosis receive both best diagnosis and effective beta-blocker therapy."

Anti-inflammatory effects of beta-blockers in cirrhosis

The recently published study shows for the first time that beta-blockers also have an impact on systemic inflammation. For this study, the researchers measured biomarkers of systemic inflammation before and during ongoing treatment with beta-blockers. Patients with advanced stages of cirrhosis were not only more likely to display a pronounced systemic inflammatory response but also more likely to benefit from the anti-inflammatory effects of beta-blockers. Cirrhotic patients who achieved a relevant reduction in inflammatory markers (such as the white blood cell count) under beta-blocker therapy developed significantly fewer complications of portal hypertension and had a lower risk of liver-related mortality.

"After further validation in clinical trials, these promising data may help us to predict the individual benefit of beta-blocker treatment more accurately and thus to give our patients optimal advice regarding their prognosis and other treatment options," explains Mathias Jachs, who was mainly responsible for the study as lead author. The study findings were published in the journal Gut, one of the leading international journals in the field of gastroenterology and hepatology.

Credit: 
Medical University of Vienna

"Electronic amoeba" finds approximate solution to traveling salesman problem in linear time

image: A single-celled amoeboid organism, a plasmodium of true slime mold Physarum polycephalum (Photo: Masashi Aono)

Image: 
Masashi Aono

Researchers at Hokkaido University and Amoeba Energy in Japan have, inspired by the efficient foraging behavior of a single-celled amoeba, developed an analog computer for finding a reliable and swift solution to the traveling salesman problem -- a representative combinatorial optimization problem.

Many real-world application tasks, such as planning and scheduling in logistics and automation, are mathematically formulated as combinatorial optimization problems. Conventional digital computers, including supercomputers, cannot solve these complex problems in a practically permissible time, as the number of candidate solutions they need to evaluate increases exponentially with the problem size -- a phenomenon known as combinatorial explosion. New computers called "Ising machines," including "quantum annealers," have therefore been actively developed in recent years. These machines, however, require complicated pre-processing to convert each task into a form they can handle, and they risk presenting illegal solutions that do not meet some constraints and requests -- both major obstacles to practical application.

These obstacles can be avoided using the newly developed "electronic amoeba," an analog computer inspired by a single-celled amoeboid organism. The amoeba is known to maximize nutrient acquisition efficiently by deforming its body. It has been shown to find an approximate solution to the traveling salesman problem (TSP): given a map of a certain number of cities, the problem is to find the shortest route that visits each city exactly once and returns to the starting city. This finding inspired Professor Seiya Kasai at Hokkaido University to mimic the dynamics of the amoeba electronically using an analog circuit, as described in the journal Scientific Reports. "The amoeba core searches for a solution under the electronic environment where resistance values at intersections of crossbars represent constraints and requests of the TSP," says Kasai. Using the crossbars, the city layout can be easily altered by updating the resistance values, without complicated pre-processing.

Kenta Saito, a PhD student in Kasai's lab, fabricated the circuit on a breadboard and succeeded in finding the shortest route for a 4-city TSP. He evaluated the performance on larger problems using a circuit simulator, where the circuit reliably found high-quality legal solutions with route lengths significantly shorter than the average length obtained by random sampling. Moreover, the time required to find a high-quality legal solution grew only linearly with the number of cities. Compared with the representative TSP algorithm "2-opt," the electronic amoeba becomes more advantageous as the number of cities increases. "The analog circuit reproduces well the unique and efficient optimization capability of the amoeba, which the organism has acquired through natural selection," says Kasai.
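
For readers unfamiliar with the benchmark, 2-opt is a classic local-search heuristic that repeatedly reverses a segment of the tour whenever doing so shortens it. Below is a minimal Python sketch of the standard algorithm, shown for context only; it is our illustration, not the amoeba circuit or the paper's implementation:

```python
import math
import random

def tour_length(tour, pts):
    """Total length of the closed tour visiting pts in the given order."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(pts):
    """Repeatedly reverse tour segments while any reversal shortens the tour."""
    tour = list(range(len(pts)))
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
                if tour_length(cand, pts) < tour_length(tour, pts):
                    tour, improved = cand, True
    return tour

random.seed(0)
cities = [(random.random(), random.random()) for _ in range(20)]
print(round(tour_length(two_opt(cities), cities), 3))
```

The nested loops make each improvement sweep cost grow roughly quadratically with the number of cities, which is why a device whose search time grows only linearly becomes increasingly attractive as problems scale up.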

"As the analog computer consists of a simple and compact circuit, it can tackle many real-world problems in which inputs, constraints, and requests dynamically change and can be embedded into IoT devices as a power-saving microchip," says Masashi Aono who leads Amoeba Energy to promote the practical use of the amoeba-inspired computers.

Credit: 
Hokkaido University

Robots could replace real therapy dogs

video: Robotic animals could be the 'pawfect' replacement for our real-life furry friends, a new study published today by the University of Portsmouth has found.

Image: 
University of Portsmouth

Robotic animals could be the 'pawfect' replacement for our real-life furry friends, a new study published today by the University of Portsmouth has found.

Animals, especially dogs, can have therapeutic benefits for children and young people. A new paper, published in The International Journal of Social Robotics, has found that the robotic animal, 'MiRo-E', can be just as effective and may even be a better alternative.

Dr Leanne Proops from the Department of Psychology, who supervised the study said: "We know that real dogs can provide calming and enjoyable interactions for children - increasing their feelings of wellbeing, improving motivation and reducing stress.

"This preliminary study has found that biomimetic robots - robots that mimic animal behaviours - may be a suitable replacement in certain situations and there are some benefits to using them over a real dog."

Dogs are the most commonly used animals for therapy because of their training potential and generally social nature. However, there are concerns about using them in a setting with children because of the risk of triggering allergies or transmitting disease, and some people do not like dogs, so may not be comfortable in the presence of a real therapy dog.

Olivia Barber, who owns a therapy dog herself, and is first author of the paper, said: "Although lots of people in schools and hospitals benefit greatly from receiving visits from a therapy dog, we have to be mindful of the welfare of the therapy dog. Visits can be stressful and incredibly tiring for therapy dogs, meaning that we should be exploring whether using a robotic animal is feasible."

There are lots of positives to using a robotic animal over a therapy dog. They can be thoroughly cleaned and can work for longer periods of time. They can also be incredibly lifelike, mirroring the movements and behaviour of a real animal, such as wagging their tails to show excitement, expressing "emotions" through sounds and colour, turning their ears towards sounds and even going to sleep.

The researchers used real dogs and a biomimetic robot in a mainstream secondary school in West Sussex to interact with 34 children aged 11-12.

The two real-life therapy dogs were a three-year-old Jack Russell crossed with a Poodle and a 12-year-old Labrador retriever from the charity Pets as Therapy. The robot was a MiRo-E biomimetic robot developed by Consequential Robotics.

The children were asked to complete a questionnaire about their beliefs and attitudes towards dogs and robots, before they took part in two separate free-play sessions, one with a real-life dog and one with a robot.

The researchers found the children spent a similar amount of time stroking both the real-life dog and the robot, but they spent more time interacting with the robot.

Despite the children reporting they significantly preferred the session with the living dog, overall enjoyment was high and they actually expressed more positive emotions following interaction with the robot. The more the children attributed mental states and sentience to the dog and robot, the more they enjoyed the sessions.

Dr Proops said: "This is a small-scale study, but the results show that interactive robotic animals could be used as a good comparison to live dogs in research, and a useful alternative to traditional animal therapy."

Credit: 
University of Portsmouth

Reactive Video playback that you control with your body

image: Using Reactive Video to follow a Tai Chi instructor

Image: 
Chris Clarke

Computer scientists have developed an entirely new way of interacting with video content that adapts to, and is controlled by, your body movement.

Fitness videos and other instructional content that aim to teach viewers new martial arts skills, exercises or yoga positions have been popular since the VHS era of the 1980s and are abundant on Internet platforms like YouTube.

However, these traditional forms of instructional videos can lead to frustration, and even the potential for physical strain, as novice viewers, or those with limited physical mobility, struggle to keep up and mimic the movements of the expert instructors.

Now an international team of researchers from Lancaster University, Stanford University and FXPAL has created a solution that dynamically adapts to mirror the position of the viewer's body and matches the speed of video playback to the viewer's movements.

The system, called 'Reactive Video', uses a Microsoft Kinect sensor, the latest in skeleton-tracking software, and probabilistic algorithms to identify the position and movement of joints and limbs - such as elbows, knees, arms, hands, hips and legs. By working out the viewer's movements, it can match and compare them with the movement of the instructor in the video footage. It then estimates the time the user will take to perform a movement and adjusts playback of the video to the correct position and pace for the viewer.
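
The release does not spell out the algorithm, but the core loop -- comparing the viewer's current joint positions against per-frame instructor poses and jumping playback to the best match -- can be sketched as below. The joint-array layout, window size and distance metric are our illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def pose_distance(user_pose, frame_pose):
    """Mean Euclidean distance between corresponding joints; each pose is an
    (n_joints, 3) array of Kinect-style 3D joint coordinates."""
    return np.linalg.norm(user_pose - frame_pose, axis=1).mean()

def next_playback_frame(user_pose, frame_poses, current, window=30):
    """Look a short window ahead of the current frame and jump to the
    instructor frame that best matches the viewer's pose, so playback
    naturally speeds up or slows down with the viewer."""
    lo = current
    hi = min(current + window, len(frame_poses))
    if lo >= hi:
        return current  # already at the final frame
    dists = [pose_distance(user_pose, frame_poses[f]) for f in range(lo, hi)]
    return lo + int(np.argmin(dists))
```

A real system would also smooth the noisy Kinect joint estimates before comparing poses; the sketch omits that step.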

As well as providing a more immersive experience, Reactive Video also helps users to more accurately mimic and learn new movements.

The researchers tested the system on study participants performing tai chi and radio exercises - a form of callisthenics popular in Japan. The results from the study showed that both systems could adapt to the users' movements.

Dr Christopher Clarke, researcher from Lancaster University and co-author on the paper, said: "Since the 1980s, and especially now with the Internet, videos have helped people stay active and have offered a cheaper, more convenient alternative to gym memberships and personal trainers. However, traditional video players do have limitations - they can't provide feedback, or adapt the pace and intensity of the physical movement to the user.

"We know performing movements in slow motion is beneficial for learning by providing opportunities to analyse your movements, and developing timing. We also know it can result in less physical strain for inexperienced users.

"For some people, keeping pace can be tricky - especially when learning something new, and for older people or those with impaired movement. Also, constantly reaching for a remote to pause, rewind and replay, can be frustrating and breaks the immersion.

"Our system overcomes these issues by having the video automatically adjust itself to play back at the user's speed, which is less stressful and more beneficial for learning."

Don Kimber, co-author of the research, said: "Reactive Video acts and feels like a magic mirror where, as you move, the video mirrors your movement, but with a cleaned-up version of the procedure, or position, performed correctly by an expert for the user to mimic and learn from."

An additional benefit of Reactive Video, and something that sets it apart from exercise content developed for game consoles, is that it can be applied to existing footage of appropriate video content, removing the need to create specially produced bespoke content.

"By using this system we can post-process existing instructional video content and enhance it to dynamically adapt to users providing a fundamental shift in how we can potentially interact with videos," said Dr Clarke.

The team believe that with further research this kind of adaptive technology could be developed for sports and activities such as learning dance routines or honing golf swings.

Credit: 
Lancaster University

Testing memory over four weeks could predict Alzheimer's disease risk

New research suggests testing people's memory over four weeks could identify who is at higher risk of Alzheimer's disease before the disease develops. Importantly, the trial found that testing people's ability to retain memories over longer periods could predict this more accurately than classic memory tests, which assess memory over half an hour.

The study, led by researchers at the University of Bristol and published in Alzheimer's Research and Therapy, wanted to find out whether testing people's memory of a word list four weeks after they were initially read it could predict who will experience the most cognitive decline over the following year, even if they have no cognitive or memory problems to begin with.

Forty-six cognitively healthy older people (with an average age of 70.7) were recruited to the trial. The participants performed three memory tasks on which delayed recall was tested after 30 minutes and four weeks, as well as the Addenbrooke's Cognitive Examination III (ACE-III) test (a commonly used test for detecting cognitive impairment) and an MRI brain scan. The ACE-III test was repeated after 12 months to assess the change in cognitive ability.

The research found the memory of 15 of the 46 participants declined over the year and that the four-week verbal memory tests predicted cognitive decline in these healthy older people better than the clinical gold standard memory tests. The prediction was made even more accurate by combining the four-week memory test score with information from the MRI brain scan that shows the size of a part of the brain responsible for memory, which is damaged by Alzheimer's disease.
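
The statistical details are in the paper rather than this release, but the general recipe -- combining a four-week recall score with a brain-volume measurement in a simple classifier -- can be sketched as follows. The toy numbers, the two features and the choice of logistic regression are our illustrative assumptions, not the study's model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy data: [four-week recall score, hippocampal volume in cm^3], one row
# per participant; y = 1 if cognition declined over the following year.
X = np.array([[12, 3.1], [5, 2.4], [9, 2.9], [3, 2.2], [11, 3.0], [4, 2.3]])
y = np.array([0, 1, 0, 1, 0, 1])

model = LogisticRegression().fit(X, y)
print(model.predict_proba([[7, 2.6]])[0, 1])  # estimated risk of decline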

Testing long-term memory recall could enable earlier detection of Alzheimer's disease. This is critical, as any future treatment that slows or stops Alzheimer's disease from getting worse will be most effective if given at the very earliest stages of the disease, before significant memory problems are detectable using current tests.

Dr Alfie Wearn, Research Associate in the Bristol Medical School: Translational Health Sciences (THS) and co-author, said: "Our study shows evidence for a low-cost and quick to administer screening tool that could be used to identify the very earliest signs of Alzheimer's disease. It could also directly speed up the development of effective Alzheimer's disease therapies, and enable earlier treatment when such therapies are available."

Dr Liz Coulthard, Associate Professor in Dementia Neurology at the University of Bristol and neurologist at North Bristol NHS Trust, and co-author, added: "It is important to note the participants were healthy older people who did not develop Alzheimer's during the trial, but some people did show the type of change over the course of a year in memory and thinking that can precede Alzheimer's disease. Future work will establish whether this test predicts full-blown Alzheimer's dementia."

The next step for the researchers will be to test how specific this test is for detecting Alzheimer's disease compared to other disorders that cause cognitive decline. The research team will do this by comparing long-term memory test scores in people with and without evidence for Alzheimer's disease according to analysis of their cerebrospinal fluid (a very accurate but invasive method of detecting Alzheimer's disease).

Credit: 
University of Bristol