Record resolution in X-ray microscopy

Researchers at Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU), the Paul Scherrer Institute in Switzerland and other institutions in Paris, Hamburg and Basel have succeeded in setting a new record in X-ray microscopy. With improved diffractive lenses and more precise sample positioning, they were able to achieve spatial resolution on the single-digit nanometre scale. This new dimension in direct imaging could provide a significant impetus for research into nanostructures and further advance the development of solar cells and new types of magnetic data storage. The findings have now been published in the renowned journal Optica under the title 'Soft X-ray microscopy with 7 nm resolution'.

Soft X-ray microscopy, which uses low-energy X-rays, is used to investigate the properties of materials at the nanoscale. The technology can be used to determine the structure of organic films that play an important role in the development of solar cells and batteries, and it also enables chemical processes and catalytic reactions of particles to be observed. The method further allows the investigation of so-called spin dynamics: electrons not only transport electric charge, they also have an internal direction of rotation, which could be exploited for new types of magnetic data storage.

To improve research into these processes in the future, researchers need to be able to 'zoom' in to the single-digit nanometre scale. This is theoretically possible with soft X-rays, but up to now it has only been possible to achieve spatial resolution of below 10 nanometres using indirect imaging methods that require subsequent reconstruction. 'For dynamic processes such as chemical reactions or magnetic particle interaction, we need to be able to view the structures directly,' explains Prof. Dr. Rainer Fink from the Chair of Physical Chemistry II at FAU. 'X-ray microscopy is especially suitable for this as it can be used more flexibly in magnetic environments than electron microscopy, for example.'

Improved focusing and calibration

Working with the Paul Scherrer Institute and other institutions in Paris, Hamburg and Basel, the researchers have now set a new record in X-ray microscopy, achieving a resolution of 7 nanometres in several different experiments. This success is not based primarily on more powerful X-ray sources, but on better focusing of the beam using diffractive lenses and more precise calibration of the test samples. 'We optimised the structure size of the Fresnel zone plates which are used to focus X-rays,' explains Rainer Fink. 'In addition, we were able to position the samples in the instrument much more accurately and reproducibly.' It is precisely the limited positioning accuracy and the overall stability of the system that have prevented improvements in resolution in direct imaging up to now.
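For orientation, the finest detail a Fresnel zone plate can resolve is set largely by the width of its outermost, narrowest zone; the Rayleigh criterion usually quoted for zone plates is the standard textbook relation below, included here as general optics background rather than a figure from the paper:

```latex
\delta \approx 1.22\,\Delta r_{N}
```

where \Delta r_{N} is the outermost zone width. On this relation alone, a 7 nm resolution points to outermost zones only around 6 nm wide, which illustrates why zone plate fabrication was central to the result.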

Remarkably, this record resolution was not only achieved with specially designed test structures, but also in practical applications. For example, the researchers studied the magnetic field orientation of iron particles measuring 5 to 20 nanometres with their new optics. Prof. Fink explains: 'We assume that our results will push forward research into energy materials and nanomagnetism in particular. The relevant structure sizes in these fields are often below current resolution limits.'

Credit: 
Friedrich-Alexander-Universität Erlangen-Nürnberg

'The robot made me do it': Robots encourage risk-taking behaviour in people

image: A SoftBank Robotics Pepper robot was used in the two robot conditions. Pepper, 1.21-meter-tall with 25 degrees of freedom, is a medium-sized humanoid robot designed primarily for Human-Robot Interaction (HRI).

Image: 
University of Southampton

New research has shown robots can encourage people to take greater risks in a simulated gambling scenario than they would if there was nothing to influence their behaviour. Increasing our understanding of whether robots can affect risk-taking could have clear ethical, practical and policy implications, which this study set out to explore.

Dr Yaniv Hanoch, Associate Professor in Risk Management at the University of Southampton, who led the study, explained: "We know that peer pressure can lead to higher risk-taking behaviour. With the ever-increasing scale of interaction between humans and technology, both online and physically, it is crucial that we understand more about whether machines can have a similar impact."

This new research, published in the journal Cyberpsychology, Behavior, and Social Networking, involved 180 undergraduate students taking the Balloon Analogue Risk Task (BART), a computer assessment that asks participants to press the spacebar on a keyboard to inflate a balloon displayed on the screen. With each press of the spacebar, the balloon inflates slightly, and 1 penny is added to the player's "temporary money bank". The balloons can explode randomly, meaning the player loses any money they have won for that balloon and they have the option to "cash-in" before this happens and move on to the next balloon.
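For readers unfamiliar with the task, a minimal sketch of the BART mechanics described above (one-penny increments, random explosions, the option to cash in) might look like the following; the explosion probabilities, pump target and trial count here are illustrative assumptions, not the study's actual parameters.

```python
import random

def play_balloon(max_pumps=128, pump_value=0.01):
    """Simulate one BART balloon: each pump adds pump_value to a temporary
    bank, but the balloon may explode on a randomly chosen pump, losing it all."""
    explosion_point = random.randint(1, max_pumps)  # unknown to the player
    bank = 0.0
    pumps = 0
    while True:
        if not decide_to_pump(pumps):
            return bank          # cash in: keep the temporary bank
        pumps += 1
        if pumps >= explosion_point:
            return 0.0           # balloon exploded: temporary bank is lost
        bank += pump_value

def decide_to_pump(pumps_so_far, target=32):
    """Toy policy standing in for a participant: keep pumping up to a fixed target."""
    return pumps_so_far < target

total = sum(play_balloon() for _ in range(30))   # a 30-balloon session
print(f"Session earnings: {total:.2f}")
```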

One third of the participants took the test in a room on their own (the control group), one third took the test alongside a robot that provided them with the instructions but was otherwise silent, and the final third, the experimental group, took the test with the robot providing the instructions as well as speaking encouraging statements such as "why did you stop pumping?"

The results showed that the group who were encouraged by the robot took more risks, blowing up their balloons significantly more frequently than those in the other groups did. They also earned more money overall. There was no significant difference in the behaviours of the students accompanied by the silent robot and those with no robot.

Dr Hanoch said: "We saw participants in the control condition scale back their risk-taking behaviour following a balloon explosion, whereas those in the experimental condition continued to take as much risk as before. So, receiving direct encouragement from a risk-promoting robot seemed to override participants' direct experiences and instincts."

The researchers now believe that further studies are needed to see whether similar results would emerge from human interaction with other artificial intelligence (AI) systems, such as digital assistants or on-screen avatars.

Dr Hanoch concluded, "With the widespread adoption of AI technology and its interactions with humans, this is an area that needs urgent attention from the research community."

"On the one hand, our results might raise alarms about the prospect of robots causing harm by increasing risky behavior. On the other hand, our data points to the possibility of using robots and AI in preventive programs, such as anti-smoking campaigns in schools, and with hard to reach populations, such as addicts."

Credit: 
University of Southampton

The pressure sensor of the Venus flytrap

image: Open trap of Dionaea muscipula with potential prey. Middle: basal part of a trigger hair, where action potentials are elicited in the sensory cells upon touch stimulation. During the late phase of the action potential, potassium ions need to be reimported into the sensory cells via KDM1 to enable the generation of consecutive action potentials.

Image: 
(Picture: Ines Kreuzer, Soenke Scherzer / University of Wuerzburg)

All plant cells can be made to react to touch or injury. The carnivorous Venus flytrap (Dionaea muscipula) has highly sensitive organs for this purpose: sensory hairs that register even the weakest mechanical stimuli, amplify them and convert them into electrical signals that then spread quickly through the plant tissue.

Researchers from Julius-Maximilians-Universität (JMU) Würzburg in Bavaria, Germany, have isolated individual sensory hairs and analysed the gene pool that is active in catching insects. "In the process, we found for the first time the genes that presumably serve throughout the plant kingdom to convert local mechanical stimuli into systemic signals," says JMU plant researcher Professor Rainer Hedrich.

This is fortunate, because virtually nothing was known about mechanoreceptors in plants until now. Hedrich's team presents the results in the open-access journal PLOS Biology.

Sensory hairs convert touch into electricity

The hinged trap of Dionaea consists of two halves, each carrying three sensory hairs. When a hair is bent by touch, an electrical signal, an action potential, is generated at its base. At the base of the hair are cells in which ion channels burst open due to a stretching of their envelope membrane and become electrically conductive. The upper part of the sensory hair acts as a lever that amplifies the stimulus triggered by even the lightest prey.

These micro-force-touch sensors thus transform the mechanical stimulus into an electrical signal that spreads from the hair over the entire trap. After two action potentials, the trap snaps shut. Based on the number of action potentials triggered by the prey animal during its attempts to free itself, the carnivorous plant estimates whether the prey is big enough - whether it is worth setting the elaborate digestion process in motion.

From genes to the function of the touch sensor

To investigate the molecular basis for this unique function, Hedrich's team "harvested" about 1000 sensory hairs. Together with JMU bioinformatician Professor Jörg Schultz, they set out to identify the genes in the hairs.

"In the process, we noticed that the fingerprint of the genes active in the hair differs from that of the other cell types in the trap," says Schulz. How is the mechanical stimulus converted into electricity? "To answer this, we focused on the ion channels that are expressed in the sensory hair or are found exclusively there," says Hedrich.

In search of further ion channels

The sensory hair-specific potassium channel KDM1 stood out. Newly developed electrophysiological methods showed that without this channel, the electrical excitability of the sensory hairs is lost, i.e. they can no longer fire action potentials. "Now we need to identify and characterise the ion channels that play an important role in the early phases of the action potential," Hedrich said.

Credit: 
University of Würzburg

Majority of pregnant women who tested positive for COVID-19 were asymptomatic, study finds

The majority of pregnant women who tested positive for COVID-19 on arrival to the delivery room were asymptomatic, according to a paper by Mount Sinai researchers published in PLOS One on Thursday, December 10. The pregnant patients who tested positive for the coronavirus were also more likely than those who tested negative to identify as Hispanic and report their primary language as Spanish.

In a retrospective cross-sectional study of universal screening for SARS-CoV-2, the virus that causes COVID-19, implemented in the labor and delivery unit of Elmhurst Hospital in Queens, New York, during March and April, the researchers found that more than one-third of almost 130 pregnant women tested positive for the coronavirus. This is a much higher proportion than reported at other hospitals in New York City during the pandemic surge, and is likely related to social inequities experienced by the surrounding population. Elmhurst Hospital is a public hospital that serves a diverse, largely immigrant and low-income patient population that was severely affected by the COVID-19 pandemic in the spring. The majority, or 72 percent, of the pregnant patients who tested positive were asymptomatic, meaning they did not display any symptoms associated with COVID-19. These findings add to the evidence that there was early and rampant asymptomatic spread of the disease at a time when most community and hospital testing was limited to symptomatic individuals.

"This study is instructive for other labor and delivery units and hospitals across the world as we continue to refine pandemic preparedness," says Sheela Maru, MD, MPH, Assistant Professor of Global Health, and Obstetrics, Gynecology and Reproductive Science, at the Icahn School of Medicine at Mount Sinai. "In future epidemics, it may be prudent to look at labor and delivery screening numbers much earlier on, as pregnant women continue to seek essential care despite social distancing measures and also represent the general young and healthy community population."

Dr. Maru said universal screening in the labor and delivery unit ensured safety of patients and staff during an acute surge in COVID-19 infections through appropriate identification and isolation of pregnant women with positive test results. Women were roomed by their status and were provided postpartum counseling and follow-up protocols tailored to their specific social needs.

In addition to their status for COVID-19, the study reviewed patients' demographic data including age, ethnicity, primary language, zip code, marital status, and health insurance status, and clinical data including the mode of delivery, length of stay, and comorbidities such as chronic hypertension, preeclampsia, prepregnancy obesity, asthma, diabetes, depression, and anxiety.

Credit: 
The Mount Sinai Hospital / Mount Sinai School of Medicine

Masonic Medical Research Institute studies brown fat: Implications in obesity

image: The Lin Lab at the MMRI quantified the number of brown fat cells present in newborn animals. For years, researchers have argued over whether brown fat continues to grow after birth. Dr. Lin and his team have become the first to prove that it does.

Image: 
Masonic Medical Research Institute

UTICA, NY -- Brown fat, also known as brown adipose tissue, is a special type of fat that "turns on" (becomes activated) when you get cold, to help maintain body temperature. Importantly, brown fat is a biological fuel that can increase the metabolic rate, decrease fat storage, and thereby lower one's propensity for developing obesity. It was previously thought that individuals were born with only a finite number of brown fat cells. Now, for the first time, a publication in the journal Scientific Reports shows that brown fat can continue to grow and divide even after birth. This finding has major implications: scientists can now try to increase the overall number of these cells to prevent or reduce the onset of obesity.

Dr. Zhiqiang Lin, Assistant Professor and senior author of the manuscript, together with his research team at the Masonic Medical Research Institute (MMRI), quantified the number of brown fat cells present in newborn animals. "For years, researchers have been arguing over whether brown fat continues to grow after birth - we can now say with certainty that it does. This discovery opens a whole new direction for future breakthroughs. Our next step will focus on identifying the developmental signals responsible for the growth of brown fat cells and determining whether we can manipulate gene expression to generate more" said Dr. Lin.

The human body has two main forms of fat: brown and white. Brown fat acts as a furnace in the body, burning energy and turning it into heat. White fat, on the other hand, operates as a freezer, storing energy for later use. When the energy one consumes exceeds the energy expended, obesity ensues; it is a consequence of energy imbalance - white fat storing more energy than brown fat is burning. This manuscript is the first to suggest that brown fat cells continue to divide after birth, albeit only for a small window of 1-2 weeks. With this knowledge in hand, researchers can now interrogate the mechanisms that allow brown fat to grow and potentially devise ways to continue its propagation as a means to control weight.

Credit: 
Masonic Medical Research Institute

Predicting British railway delays using artificial intelligence

image: Train schedule at a station near London.

Image: 
Debra Larson, The Grainger College of Engineering

Over the past 20 years, the number of passengers traveling on British train networks has almost doubled, to 1.7 billion annually. With numbers like that, it's clear how much people rely on rail service in Great Britain, and how many disgruntled passengers there are when delays occur. A recent study used real British railway data and an artificial intelligence model to improve the ability to predict delays in railway networks.

"We wanted to explore this problem using our experience with graph neural networks," said Huy Tran, an aerospace engineering faculty member at the University of Illinois Urbana-Champaign. "These are a specific class of artificial intelligence models that focus on data modeled as a graph, where a set of nodes are connected by edges."

"This was a collaboration with Simon Hu, an expert on railway networks, from the Zhejiang University-University of Illinois Urbana-Champaign Institute," Tran said. "We worked together to develop a new way to represent a rail network and apply it to real-world data to predict delays."

They applied the Spatial-Temporal Graph Convolutional Network model to predict delays within a portion of the British rail network where Didcot Parkway and London Paddington serve as gateway stations.
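As a rough illustration of the graph representation described above, the sketch below propagates delay features across a toy station graph with a single graph-convolution step. The adjacency matrix, feature values and weights are invented for illustration; this is not the study's STGCN model or data.

```python
import numpy as np

# Toy rail graph: 4 stations (nodes), edges = track segments between them.
# A[i, j] = 1 if stations i and j are directly connected.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Node features: current delay (minutes) and trains per hour at each station.
X = np.array([[2.0, 10.0],
              [5.0, 12.0],
              [0.0,  8.0],
              [1.0,  6.0]])

# One graph-convolution step: normalise the adjacency (with self-loops) so each
# station mixes its own features with its neighbours', then apply a weight matrix.
A_hat = A + np.eye(4)                               # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt            # symmetric normalisation

W = np.random.default_rng(0).normal(size=(2, 4))    # untrained weights
H = np.maximum(A_norm @ X @ W, 0)                   # ReLU(A_norm X W)

print(H.shape)   # (4, 4): one hidden feature vector per station
```

In a trained model such as an STGCN, many of these layers are stacked and combined with temporal convolutions so that a station's predicted delay depends on the recent history of its neighbours as well as its own.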

"Compared with other statistical models, this one outperforms them for forecasting delays up to 60 minutes in the future," Tran said.

He added that others have developed similar models for social networks, but he wanted to apply it to an engineering problem.

"One challenge was that this data only captures the full trip of a train from start to finish. It doesn't tell us where it was delayed along the way," he said. "The new formulation we developed was a way to approximate in which leg of the trip the delay occurred. From this research, we learned that using our formulation with this class of AI models can work well with real-world networks to predict behaviors."

Tran said they are addressing some limitations, however.

"A lot of times with AI models, we don't really understand why the model says what it does. We just try to predict what the delay will be, but we don't have any insight into why it was delayed, or where," Tran said. "So, one of the things we're interested in is getting more explainability into these models, so we can better understand why it's making the suggestion or predictions that it does.

"We'd also like to at some point, close the loop and say, given this information, here's how you might want to react to that delay."

Credit: 
University of Illinois Grainger College of Engineering

Nutrigenomics: new frontiers

image: The journal addresses the latest advances at the intersection of postgenomics medicine, biotechnology and global society, including the integration of multi-omics knowledge, data analyses and modeling, and applications of high-throughput approaches to the study of complex biological and societal problems, as well as public policy, governance and societal aspects of large-scale biology and 21st-century data-enabled sciences.

Image: 
Mary Ann Liebert, Inc., publishers

New Rochelle, NY, December 10, 2020--Plant omics and food engineering offer novel perspectives and value to sustainable agriculture and ecological sciences, and they contribute to advances in nutrigenomics, according to a Special Issue on Plant Omics, Food Engineering and New Frontiers in Nutrigenomics in the peer-reviewed journal OMICS: A Journal of Integrative Biology.

"Nutrigenomics contributes to precision nutrition by unraveling the mechanisms of person-to-person and population differences in response to food exposures. Multi-omic variability in plants, and engineering of food composition, are the "input" functions leading to variability in nutritional outcomes. The OMICS December special issue presents new findings on plant omics and food engineering that address these knowledge dimensions. The articles signal new frontiers in nutrigenomics and the potentials of systems science-driven food engineering for precision nutrition," states Vural Özdemir, MD, PhD, Editor-in-Chief of OMICS: A Journal of Integrative Biology.

Plant omics is also relevant to identifying novel therapeutics for use in the current COVID-19 pandemic.

Featured in the special issue is an article by Anil Kumar, PhD, at Rani Lakshmi Bai Central Agricultural University, and coauthors. They discuss the future of food based on multi-omic analyses of iron and zinc homeostasis in finger millet, a staple food of agricultural importance worldwide.

Maria Taciana Cavalcanti Vieira Soares, at Rural Federal University of Pernambuco, Recife, and coauthors focused on the genetic mobility of Enterococcus faecium. They report on new findings and an approach based on comparison of the genetic mobility of (1) probiotic, (2) pathogenic, and (3) nonpathogenic and non-probiotic strains, so as to differentiate probiotics, and inform their safe use.

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

Signs of healthy aging found in ergothioneine telomere study

image: A potent antioxidant, ergothioneine helps fight oxidative stress and cellular imbalance that contribute to cell damage associated with aging and several health-related issues.*

* Emerging science shows potential cognitive, immune, prostate and cardiovascular health benefits.

Blue California's ErgoActive® ergothioneine is non-synthetic and GMO-free; it is made by fermentation through a proprietary technology and manufacturing process, and has received a No-Objection Letter from the US FDA in response to its GRAS notification, GRN 734.

Image: 
Blue California

Rancho Santa Margarita, Calif. (Dec. 10, 2020) --- An in vitro study published in the Journal of Dietary Supplements demonstrated that Blue California's ErgoActive® ergothioneine helped preserve telomere length and reduced the rate of telomere shortening under oxidative stress.

The in vitro study is the first time ergothioneine has been studied for its effect on telomere length. Blue California provided its ErgoActive ergothioneine, which is produced by a proprietary fermentation process.

"Our results suggest that ergothioneine as part of a healthy diet could potentially mitigate the negative effects of oxidative stress and support healthy aging by helping to preserve telomere length and reduce the rate of shortening," said Chief Science Officer, Dr. Priscilla Samuel.

Telomeres are complex protein structures located at the end of each DNA strand, protecting chromosomes from becoming damaged. When DNA strands are frayed or worn down, cells are challenged with performing specialized functions, thus making the protection offered by telomeres critical for the life of cells.

Shortened telomeres are associated with many chronic conditions such as cancer, cardiovascular disease, and diabetes. "Many areas of health are impacted by oxidative stress during aging, including longevity, bone health, cardiovascular health, cognition and skin vitality," said Samuel. "As oxidative stress accelerates the shortening of telomeres, antioxidants such as ergothioneine may help to decelerate it."

Ergothioneine is a naturally occurring amino acid with potent antioxidant properties that the body does not make but obtains from dietary sources such as specific species of mushrooms, beans and oat bran. However, for most people, the dietary consumption of foods rich in ergothioneine tends to be low.

Moreover, humans produce a highly specific ergothioneine transporter (ETT), leading many researchers to infer its importance and to suggest that it may be essential to human health. Renowned scientist Dr. Bruce Ames has proposed classifying ergothioneine as a "longevity vitamin."

In the in vitro study, human neonatal dermal fibroblast cells were used to observe the effect of ergothioneine on telomerase activity and telomeres under standard and oxidative stress conditions over an 8-week period.

Under oxidative conditions, at week 8, median telomere length was significantly longer than in controls across all four tested concentrations of ergothioneine (0.04 to 1.0 mg/ml), and a significantly reduced percentage of short telomeres was also observed, demonstrating a protective effect of ergothioneine.

"Blue California actively invests in clinical studies to advance the science and impact of our ErgoActive ergothioneine on overall health and wellness and look forward to investigating these effects in human clinical studies as well," said Samuel. "We are committed to furthering research for substantiating functional benefits and claims associated with ingredients for use in dietary supplements, functional foods and beverages, personal care products, cosmetics and pet nutrition."

Early in February 2020, Blue California filed a patent application reporting the discovery of ErgoActive ergothioneine's impact on telomere shortening associated with oxidative stress.

Credit: 
Blue California

Double element co-doped carbon quantum dots enhance photocatalytic efficiency

image: Photocatalytic process of phosphorus and nitrogen co-doped carbon quantum dots (PNCQDs)/TiO2 nanosheets. Due to the quantum wells created by synergy between N and P elements, photogenerated electrons can be localized to the dopant sites of PNCQDs, improving the quantum efficiency.

Image: 
Authors

In a paper published in NANO, researchers from Nanjing Tech University propose a theory that attributes the enhanced photocatalytic efficiency of a phosphorus- and nitrogen-co-doped carbon quantum dot (PNCQD)/TiO2 nanosheet composite photocatalyst to quantum wells within the PNCQDs.

Doped carbon quantum dots (CQDs) have become popular nanomaterials for enhancing the performance of composite semiconductor photocatalysts in recent years, owing to their excellent optical and electronic characteristics. In the field of fluorescence, it is well known that double-element co-doping can effectively increase the quantum yield of CQDs because of the synergy between the doping elements, but the specific mechanism is still unclear.

This study identified differences in surface states and bandgaps between PNCQDs and nitrogen-doped CQDs (NCQDs). The potential barriers produced by the energy-level difference between the P and N dopants create numerous well-like electron traps, which can effectively capture photogenerated electrons. Moreover, the PNCQDs reduce the fluorescence of the composite photocatalyst, reflecting improved separation of photogenerated carriers, and they also enhance the UV absorption of the catalyst. The kinetic constant of the PNCQDs/TiO2 sample for the photodegradation of methylene blue (MB) under simulated sunlight reached 3.4 times that of pure TiO2, following a Z-scheme mechanism.
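Dye-degradation results of this kind are commonly summarised with an apparent pseudo-first-order rate constant; the sketch below shows how such a constant could be extracted from concentration-versus-time data. The numbers are made up for illustration and are not the paper's measurements, and pseudo-first-order behaviour is assumed here rather than taken from the study.

```python
import numpy as np

# Illustrative MB concentration data (C/C0) over irradiation time in minutes.
t = np.array([0, 20, 40, 60, 80, 100], dtype=float)
c_over_c0 = np.array([1.00, 0.72, 0.51, 0.37, 0.26, 0.19])

# Pseudo-first-order model: ln(C0/C) = k * t, so k is the slope of a linear fit.
k, _ = np.polyfit(t, np.log(1.0 / c_over_c0), 1)
print(f"apparent rate constant k = {k:.4f} per minute")
```

Comparing the k values fitted for two catalysts under identical conditions is what a statement such as "3.4 times that of pure TiO2" refers to.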

Compared with traditional nitrogen-doped carbon quantum dots (NCQDs), phosphorus and nitrogen co-doped carbon quantum dots (PNCQDs) show unique carrier-separation and electron-localization characteristics owing to the quantum-well structure created by the synergistic effect of the doping elements. In this work, double-element co-doped CQDs are shown to have potential in the field of photocatalysis, and the quantum-well theory provides a reference for designing CQD photocatalyst systems with better performance.

Credit: 
World Scientific

Genetic differences important in Alzheimer's diagnosis

image: Schematic illustration of how brain imaging and cerebrospinal fluid analysis, respectively, measure the accumulation of amyloid protein.

Image: 
The research team.

The two methods used for detecting amyloid pathology in Alzheimer's disease do not give unambiguous results, with the risk of incorrect or delayed care interventions. Now, researchers at Karolinska Institutet in Sweden have found genetic explanations for the differences. The study is published in Molecular Psychiatry and may be important for more individualized diagnostics and the development of future drugs.

Alzheimer's disease is the most common dementia disorder and leads to gradual memory loss and premature death. Approximately 120,000 people in Sweden have Alzheimer's, and there are approximately 50 million people with the disease worldwide. According to Hjärnfonden, the Swedish Brain Foundation, the number will increase by 70 percent in 50 years, partly because we are living longer and longer.

One of the earliest signs of Alzheimer's is a pathological accumulation of amyloid protein forming insoluble deposits in the brain, also called plaques. This process can last for many years without appreciably affecting the person's cognitive ability.

Amyloid plaques are present in the brain from an early stage of Alzheimer's disease, even before mild cognitive impairment. At the same time, an early diagnosis is important for care interventions that could slow the course of the disease.

Today, brain imaging of amyloid plaques with a PET camera and analysis of cerebrospinal fluid, CSF, from the spinal cord are the accepted methods for detecting pathological accumulations of amyloid.

But in up to 20 percent of cases, especially at early stages of the disease, the methods show different results. These differences can have implications for the patient for early diagnosis and treatment.

Now, researchers at Karolinska Institutet and Vita-Salute San Raffaele University in Milano have identified two alternative pathways for the development of amyloid pathology in Alzheimer's disease.

The results are based on PET imaging and CSF analyses in 867 participants, including patients with mild cognitive impairment, patients with Alzheimer's dementia and healthy controls. Over a two-year period, amyloid accumulation in a subset of nearly 300 participants was documented with both a PET camera and CSF analysis.

The results show that pathological changes in some individuals are first detected in the brain with a PET camera, and in other individuals first with CSF analysis. In the latter group, the researchers also saw a higher incidence of a genetic risk factor for Alzheimer's and faster accumulation of amyloid plaques in the brain compared to the former group.

According to the researchers, the results reveal two different groups of patients, with different genetics and speed of amyloid plaque accumulation in the brain.

"The results may be important as amyloid biomarkers play a significant role as early diagnostic markers for clinical diagnosis. Today, CSF-analysis and PET are considered equivalent to determine the degree of amyloid accumulation, but the study indicates that the two methods should rather be seen as complementary to each other," says first author Arianna Sala, currently a post-doctoral fellow at the University of Liège, Belgium and Technical University of Munich, Germany.

"The differences in the results for biomarkers in the brain and CSF provide unique biological information and the opportunity for earlier and more individualized diagnosis and treatment for Alzheimer's disease in the future. The results may also be important for the design of clinical trials of new drugs against amyloid accumulation in the brain," says last author Elena Rodriguez-Vieitez, senior researcher at the Department of Neurobiology, Caring Sciences and Society, Karolinska Institutet.

Credit: 
Karolinska Institutet

Bristol researchers publish significant step toward quantum advantage

image: Layout of qubits in Google's Sycamore architecture

Image: 
A. Montanaro

Researchers from the University of Bristol and quantum start-up, Phasecraft, have advanced quantum computing research, bringing practical hybrid quantum-classical computing one step closer.

The team, led by Bristol researcher and Phasecraft co-founder, Dr. Ashley Montanaro, has discovered algorithms and analysis which significantly lessen the quantum hardware capability needed to solve problems which go beyond the realm of classical computing, even supercomputers.

In the paper, published in Physical Review B, the team demonstrates how optimised quantum algorithms can solve instances of the notorious Fermi-Hubbard model on near-term hardware.

The Fermi-Hubbard model is of fundamental importance in condensed-matter physics as a model for strongly correlated materials and a route to understanding high-temperature superconductivity.
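For context, the single-band Fermi-Hubbard Hamiltonian is conventionally written as follows; this is the standard textbook form, quoted here for orientation rather than taken from the paper, which may study particular instances or variants:

```latex
H = -t \sum_{\langle i,j \rangle, \sigma} \left( c^{\dagger}_{i\sigma} c_{j\sigma} + c^{\dagger}_{j\sigma} c_{i\sigma} \right) + U \sum_{i} n_{i\uparrow} n_{i\downarrow}
```

Here t is the hopping amplitude between neighbouring lattice sites and U is the on-site repulsion between electrons of opposite spin; finding the ground state quickly becomes classically intractable as the lattice grows.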

Finding the ground state of the Fermi-Hubbard model has been predicted to be one of the first applications of near-term quantum computers and one that offers a pathway to understanding and developing novel materials.

Dr. Ashley Montanaro, research lead and co-founder of Phasecraft: “Quantum computing has critically important applications in materials science and other domains. Despite the major quantum hardware advances recently, we may still be several years from having the right software and hardware to solve meaningful problems with quantum computing. Our research focuses on algorithms and software optimisations to maximise the quantum hardware’s capacity, and bring quantum computing closer to reality.

“Near-term quantum hardware will have limited device and computation size. Phasecraft applied new theoretical ideas and numerical experiments to put together a very comprehensive study on different strategies for solving the Fermi-Hubbard model, zeroing in on strategies that are most likely to have the best results and impact in the near future.”

Lana Mineh, a PhD student in the School of Mathematics and the Centre for Doctoral Training in Quantum Engineering, who played a key role in the research, said, “The results suggest that optimising over quantum circuits with a gate depth substantially less than a thousand could be sufficient to solve instances of the Fermi-Hubbard model beyond the capacity of current supercomputers. This new research shows significant promise for producing the ground state of the model on near-term quantum devices, improving on previous research findings by around a factor of 10.”

Physical Review B, published by the American Physical Society, is the top specialist journal in condensed-matter physics. The peer-reviewed research paper was also chosen as the Editors’ Suggestion and to appear in Physics magazine.

Andrew Childs, Professor in the Department of Computer Science and Institute for Advanced Computer Studies at the University of Maryland: “The Fermi-Hubbard model is a major challenge in condensed-matter physics, and the Phasecraft team has made impressive steps in showing how quantum computers could solve it. Their work suggests that surprisingly low-depth circuits could provide useful information about this model, making it more accessible to realistic quantum hardware.”

Hartmut Neven, Head of Quantum Artificial Intelligence Lab, Google: “Sooner or later, quantum computing is coming. Developing the algorithms and technology to power the first commercial applications of early quantum computing hardware is the toughest challenge facing the field, which few are willing to take on. We are proud to be partners with Phasecraft, a team that are developing advances in quantum software that could shorten that timeframe by years.”

Phasecraft Co-founder Dr. Toby Cubitt: “At Phasecraft, our team of leading quantum theorists have been researching and applying quantum theory for decades, leading some of the top global academic teams and research in the field. Today, Ashley and his team have demonstrated ways to get closer to achieving new possibilities that exist just beyond today’s technological bounds.”

Phasecraft has closed a record seed round for a quantum company in the UK with £3.7m in funding from private-sector VC investors, led by LocalGlobe with Episode1 along with previous investors. Former Songkick founder Ian Hogarth has also joined as board chair for Phasecraft. Phasecraft previously raised a £750,000 pre-seed round led by UCL Technology Fund with Parkwalk Advisors and London Co-investment Fund and has earned several grants facilitated by InnovateUK. Between equity funding and research grants, Phasecraft has raised more than £5.5m.

Dr. Toby Cubitt: “With new funding and support, we are able to continue our pioneering research and industry collaborations to develop the quantum computing industry and find useful applications faster.”

Paper:
‘Strategies for solving the Fermi-Hubbard model on near-term quantum computers,’ by Cade, C., Mineh, L., Montanaro, A. and Stanisic, S. in Physical Review B.

About Phasecraft

Since it was founded in 2019, Phasecraft has established itself as an emerging leader in quantum research.

It was started by leading quantum scientists, Dr. Toby Cubitt, Dr. Ashley Montanaro and Prof John Morton, who have spent decades leading top research teams at UCL and University of Bristol. Together they have built a team to enable useful applications of quantum computing by developing high-efficiency algorithms to optimise the capabilities of near-term quantum hardware.

Phasecraft works in partnership with leading quantum hardware companies, including Google and Rigetti, as well as academic and industry leaders, to develop high-efficiency software that evolves quantum computing from experimental demonstrations to useful applications.

Notes to editors:

Read the insight piece on the research from the Phasecraft team here: https://www.phasecraft.io/insight/strategies-for-solving-the-fermi-hubbard-model-on-near-term

Journal

Physical Review B

Credit: 
University of Bristol

A biased evaluation of employees' performance can be useful for employers

image: Sergey Stepanov, Assistant Professor, Faculty of Economic Sciences (HSE University)

Image: 
Sergey Stepanov (HSE University)

In assessing an employee's performance, employers often listen to the employee's immediate supervisor or colleagues, and these opinions can be highly subjective. Sergey Stepanov, an economist from HSE University, has shown that biased evaluations can actually benefit employers. An article substantiating this finding was published in the Journal of Economic Behavior and Organization.

The model described in the article 'Biased Performance Evaluation in a Model of Career Concerns: Incentives versus Ex-Post Optimality' was developed within the 'career concerns' framework pioneered by Bengt Holmström. That framework represents the relationship between an employee (often called an agent by economists) and an employer, or principal (broadly speaking, this can be the market as a whole). The modelling considers three components of performance: talent, effort and random factors. An agent's incentive to exert effort arises from the fact that better performance results in a higher evaluation of the agent's talent by the market, which, in turn, can help to increase his future wage.
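A minimal numerical sketch of the signal structure just described, with invented parameters: performance is talent plus effort plus noise, and the market updates its belief about talent from observed performance using standard normal-normal updating, which is what links today's effort to tomorrow's wage. This is a toy illustration of the career-concerns logic, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Prior belief about the agent's talent, and noise in observed performance.
mu_prior, var_talent, var_noise = 0.0, 1.0, 1.0

effort = 0.8                                   # effort chosen by the agent
talent = rng.normal(mu_prior, np.sqrt(var_talent))
performance = talent + effort + rng.normal(0.0, np.sqrt(var_noise))

# The market conjectures the equilibrium effort level, filters it out, and
# updates its estimate of talent from the residual signal (normal-normal).
signal = performance - effort
weight = var_talent / (var_talent + var_noise)
talent_estimate = mu_prior + weight * (signal - mu_prior)

wage_next_period = talent_estimate             # wage tracks perceived talent
print(f"perceived talent (next-period wage): {wage_next_period:.2f}")
```

The biased evaluator in Stepanov's model effectively distorts the observed signal before the market sees it, which changes how strongly effort feeds into the perceived-talent term above.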

In the canonical model, an employer (or the market) observes the results of an agent's work. Sergey Stepanov, Assistant Professor of HSE University's Faculty of Economic Sciences, modified the model by adding an intermediate party - an evaluator. If the principal is busy or has many employees, it would be difficult for her to monitor each agent individually, and thus she will often rely on the evaluation of an agent by his supervisor or peers. For a variety of reasons, their assessments are likely to be biased, either in favour of the agent or against. With this in mind, the question the researcher sought to answer in this study was: 'what should the best direction and degree of the bias be?'

'In classic career concerns models, the principal observes the performance of an agent directly. However, we know that this is often not the case, and principals receive such information through 'evaluators'. However, the interests of these people may not coincide with those of the principal. And I thought: maybe it's actually a good thing that they don't? Objective evaluation is, of course, optimal from the point of view of making correct decisions about an agent (e.g., to promote him or not), but such an evaluation may create sub-optimal incentives to exert effort,' the author of the article explained.

Agents who are very talented a priori will lose motivation if they are evaluated fairly, because they know they will most likely clear the performance bar even with a low effort. Similarly, agents who are initially believed to be below average will lose motivation because they are unlikely to succeed even with a high effort. Hence, an ideal evaluator should be stricter on employees who seem to be capable and talented, but more lenient towards those who are less capable. In addition, the greater the degree of career concerns of an agent, the less objective the optimal evaluator should be, while the performance of those whose abilities are initially very uncertain, for example, without a prior track record, should be judged most objectively.

Thus, the 'unfair' opinion of an evaluator may prove to be more useful in motivating an employee than an objective assessment.

The model may be useful, for example, for organizing internships. It suggests that stronger interns with good CVs should indeed be given more demanding supervisors, whereas the supervisors of applicants with very brief CVs (which say very little about their experience or skills) should be more balanced in their assessments.

The results of this research will be useful in evaluating the performance of government officials working on public projects or senior corporate managers, as well as in making internal promotion decisions.

Credit: 
National Research University Higher School of Economics

Getting the right grip: Designing soft and sensitive robotic fingers

image: Scientists at Ritsumeikan University, Japan, design a 3D printable soft robotic finger containing a built-in sensor with adjustable stiffness

Image: 
Ritsumeikan University

Although robotics has reshaped and even redefined many industrial sectors, there still exists a gap between machines and humans in fields such as health and elderly care. For robots to safely manipulate or interact with fragile objects and living organisms, new strategies to enhance their perception while making their parts softer are needed. In fact, building a safe and dexterous robotic gripper with human-like capabilities is currently one of the most important goals in robotics.

One of the main challenges in the design of soft robotic grippers is integrating traditional sensors onto the robot's fingers. Ideally, a soft gripper should have what's known as proprioception--a sense of its own movements and position--to be able to safely execute varied tasks. However, traditional sensors are rigid and compromise the mechanical characteristics of the soft parts. Moreover, existing soft grippers are usually designed with a single type of proprioceptive sensation; either pressure or finger curvature.

To overcome these limitations, scientists at Ritsumeikan University, Japan, have been working on novel soft gripper designs under the leadership of Associate Professor Mengying Xie. In their latest study published in Nano Energy, they successfully used multimaterial 3D printing technology to fabricate soft robotic fingers with a built-in proprioception sensor. Their design strategy offers numerous advantages and represents a large step toward safer and more capable soft robots.

The soft finger has a reinforced inflation chamber that makes it bend in a highly controllable way according to the input air pressure. In addition, the stiffness of the finger is also tunable by creating a vacuum in a separate chamber. This was achieved through a mechanism called vacuum jamming, by which multiple stacked layers of a bendable material can be made rigid by sucking out the air between them. Both functions combined enable a three-finger robotic gripper to properly grasp and maintain hold of any object by ensuring the necessary force is applied.

Most notable, however, is that a single piezoelectric layer was included among the vacuum jamming layers as a sensor. The piezoelectric effect produces a voltage difference when the material is under pressure. The scientists leveraged this phenomenon as a sensing mechanism for the robotic finger, providing a simple way to sense both its curvature and initial stiffness (prior to vacuum adjustment). They further enhanced the finger's sensitivity by including a microstructured layer among the jamming layers to improve the distribution of pressure on the piezoelectric material.

The use of multimaterial 3D printing, a simple and fast prototyping process, allowed the researchers to easily integrate the sensing and stiffness-tuning mechanisms into the design of the robotic finger itself. "Our work suggests a way of designing sensors that contribute not only as sensing elements for robotic applications, but also as active functional materials to provide better control of the whole system without compromising its dynamic behavior," says Prof Xie. Another remarkable feature of their design is that the sensor is self-powered by the piezoelectric effect, meaning that it requires no energy supply--essential for low-power applications.

Overall, this exciting new study will help future researchers find new ways of improving how soft grippers interact with and sense the objects being manipulated. In turn, this will greatly expand the uses of robots, as Prof Xie indicates: "Self-powered built-in sensors will not only allow robots to safely interact with humans and their environment, but also eliminate the barriers to robotic applications that currently rely on powered sensors to monitor conditions."

Let's hope this technology is further developed so that our mechanical friends can soon join us in many more human activities!

Credit: 
Ritsumeikan University

Bio-inspired lanthanide-transition metal cluster for efficient overall water splitting

image: Crystal structures of the synthetic NdCo3 cluster (a, b) and the native CaMn4 cluster of PSII (c, d). Schematic diagram of the synthesis process of NdCo3/PCN (e).

Image: 
©Science China Press

Photosynthesis in nature uses the CaMn4O5 cluster as the oxygen-evolving center to catalyse water oxidation efficiently in photosystem II (PSII). The synergistic effect among the multi-metal centers of PSII plays a key role in this high catalytic activity. Mimicking natural photosynthesis, light-driven overall water splitting to produce H2 and O2, comprising both the hydrogen evolution reaction (HER) and the oxygen evolution reaction (OER), is a promising pathway for the artificial conversion and storage of solar energy. In the past, various inorganic and organic systems have been developed as overall water splitting photocatalysts. However, direct photocatalytic overall water splitting still faces the great challenge of low activity.

Recently, researchers from Xiamen University demonstrated bio-inspired heterometallic LnCo3 (Ln = Nd, Eu and Ce) clusters, which can be viewed as synthetic analogues of the CaMn4O5 cluster. Anchoring LnCo3 on phosphorus-doped graphitic carbon nitride (PCN) gives efficient overall water splitting activity without any sacrificial reagents and a quantum efficiency of 2.0% at 350 nm.

The lanthanide-transition metal cluster mimics the structure of the CaMn4O5 cluster of PSII. The NdCo3 cluster can be viewed as CaMn4O5 with one metal vertex missing from the cubane and one bridging-O atom added between Nd3+ and Co3+. In addition, the coordination mode of the bridging-O atoms in the NdCo3 cluster is very similar to that in the CaMn4O5 of PSII, except that the five bridging-O atoms in the biological CaMn4O5 cluster are O2-, while the six bridging-O atoms in NdCo3 come from the -OH groups of two btp-3H ligands. Notably, the mixed oxidation states of the cobalt ions (+2 and +3) in the NdCo3 cluster resemble the mixed oxidation states of the manganese ions (+3 and +4) in CaMn4O5, suggesting that the NdCo3 cluster can be viewed as a synthetic model of the OEC. The NdCo3 cluster also shows high stability because of the chelating btp-3H ligand.

Considering the monotonic change in radius and chemical properties across the lanthanide series, the clusters are an attractive system for investigating these physical characteristics. "The synthetic biomimetic OECs should also be studied in an integrated system to reveal their true potential and provide a better understanding of the synergistic effect in catalysis at the atomic level," the researchers emphasized. By anchoring the bio-inspired LnCo3 as the OEC on phosphorus-doped graphitic carbon nitride (PCN), they realized light-driven spontaneous overall water splitting to produce O2 and H2 efficiently.

"Traditionally, overall water splitting is composed of two half reactions, hydrogen evolution reaction and oxygen evolution reaction, lanthanide-transition cluster serve as the oxygen evolution catalyst and the P-doped carbon nitride as the hydrogen evolution catalyst. The NdCo3/PCN-c exhibited remarkable water-splitting activity with high H2 production rate of ~297.7 μmol h-1 g-1 and O2 evolution rate of 148.9 μmol h-1 g-1 under light irradiation." They state in an article titled "Integration of Bio-Inspired Lanthanide-Transition Metal Cluster and P-doped Carbon Nitride for Efficient Photocatalytic Overall Water Splitting."

"The lanthanide-transition cluster based photocatalysts not only performed a synthetic model of bio-inspired oxygen-evolving center but also an effective catalyst to realize light-driven overall water splitting. This is a promising way to artificially convert and store solar energy." Interestingly, ultrafast transient absorption spectroscopy revealed the transfer of photoexcited electron and hole into the PCN and LnCo3 for hydrogen and oxygen evolution reactions, respectively. DFT calculation showed the cooperative water activation of O-O bond formation on lanthanide and transition metal for water oxidation. This work provided an effective strategy to realize light-driven overall water splitting.

Credit: 
Science China Press

Researchers design a tool that will automate device programming in the IoT

The Internet of Things (IoT) has ushered in a new era, with everyday items evolving into what we now refer to as cyber-physical systems. These systems are physical mechanisms controlled or monitored by computer algorithms and deeply intertwined through the internet. Such systems have made their way into industry and are being deployed above all to manage and control industrial processes, giving rise to the so-called Industry 4.0. ICREA research professor Jordi Cabot and researcher Abel Gómez, two members of the Systems, Software and Models (SOM) Research Lab at the Universitat Oberta de Catalunya (UOC) Internet Interdisciplinary Institute (IN3), in collaboration with the IKERLAN technology research centre, have designed an innovative tool for automating and streamlining the creation of systems that employ asynchronous event-driven communication, one of the most widely used computer architectures in this sector. The tool is the first to use the recently published AsyncAPI specification, which standardizes work with this type of architecture. The prototype, initially developed as part of the MegaM@Rt2 project, is open source and thus available for free online.

In IT infrastructures where numerous devices have to communicate with each other, such as in factories with different machinery to be monitored and controlled, the entirety of information is usually managed by a central node. In order to prevent these infrastructures from collapsing due to a faulty component, event-driven asynchronous architectures are deployed. Among the advantages of these architectures is that a breakdown in one component does not trigger a full system crash. One of the most popular paradigms is called the publish-subscribe architecture, where messages are not sent to specific receivers. According to Abel Gómez, "a channel shared by all the devices in a network is set up, and when one element, whether a computer, server or other type of device, wants to receive certain information, all it has to do is subscribe to a specific category of messages. This way, when another device is able to provide the information, it simply publishes it in the shared channel under the agreed category, and only the subscribed devices will receive the messages published on this topic."
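The publish-subscribe pattern described here can be illustrated in a few lines of code. The sketch below uses a simple in-memory broker and invented topic names purely to show the message flow; it is not the researchers' tool nor any particular messaging middleware.

```python
from collections import defaultdict
from typing import Callable

class Broker:
    """Minimal in-memory publish-subscribe broker: publishers send messages to
    a topic; only the callbacks subscribed to that topic receive them."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[dict], None]):
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, message: dict):
        for callback in self._subscribers[topic]:
            callback(message)

broker = Broker()

# A monitoring node subscribes to temperature readings on the shared channel.
broker.subscribe("factory/oven1/temperature",
                 lambda msg: print("monitor received:", msg))

# A sensor node publishes a reading; only subscribers to that topic see it.
broker.publish("factory/oven1/temperature", {"celsius": 212.5, "ts": 1607590800})
broker.publish("factory/oven1/pressure", {"bar": 1.9})   # no subscribers: ignored
```

In a real deployment the broker would be a networked service, so a failing sensor or monitor does not bring down the rest of the system, which is the resilience property the article describes.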

Although these distributed architectures are particularly scalable and flexible, they are not without problems, since there is still no established programming standard as there is for website creation, for instance. The sector therefore still needs to agree on the message categories, as well as their internal format and structure. The researcher said: "As there is no common language and they are such distributed infrastructures, the likelihood that each element is programmed by a different person is high, meaning messages may vary between devices. As a result, if there is any divergence in the topic names or format used, the receivers will either not receive the content or not know how to decipher it."

A new solution is now on the table, seeking to standardize the programming of event-driven architectures: the AsyncAPI specification. This specification allows users to define all the relevant information needed to design and run IoT devices in these environments. However, AsyncAPI is still in the early stages of development and therefore the tools that support it remain in short supply. Despite this, the researchers have developed a tool based on this new proposal that allows users to automate the creation of messages in the appropriate format, as well as the sending and receiving of these messages. Abel Gómez said: "Much of the work that goes into implementing a program for an IoT device involves creating messages in the format that subscribers to the channel expect and also "translating" messages from other devices in order to process the information. A large amount of code must therefore be programmed and, when done manually, this can be a source of errors."
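To give a feel for the boilerplate the article says such a tool generates, here is a hand-written sketch of the "create a message in the agreed format / parse an incoming one" code for a hypothetical temperature event. The field names and structure are invented for illustration; they do not come from the UOC/IKERLAN tool or any real AsyncAPI definition.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class TemperatureMeasured:
    """Hypothetical payload for a 'sensors/temperature' topic."""
    sensor_id: str
    celsius: float
    timestamp: int

def to_wire(event: TemperatureMeasured) -> bytes:
    """Serialise the event into the agreed JSON format for publishing."""
    return json.dumps(asdict(event)).encode("utf-8")

def from_wire(payload: bytes) -> TemperatureMeasured:
    """Parse an incoming message back into a typed object."""
    data = json.loads(payload.decode("utf-8"))
    return TemperatureMeasured(**data)

msg = to_wire(TemperatureMeasured("oven1-probe", 212.5, 1607590800))
print(from_wire(msg))
```

Generating this kind of serialisation and parsing code automatically from a shared specification is precisely what removes the hand-written, error-prone step the researchers describe.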

The researcher continued: "By adopting this new tool, we can significantly shorten the amount of time needed to develop and launch programs, which favours interoperability, improves code quality and in turn limits the number of errors in the software development life cycle."

A model for optimizing programs based on time series data

Another challenge posed by the integration of cyber-physical systems in Industry 4.0 is the need to improve computerized management of time series data, such as temperature or other data collected on a continuous basis. These series of historical data are key to monitoring system runtimes, improving industrial processes and pinpointing possible sources of error after a catastrophic failure. In this area, the UOC researchers have teamed up with an Austrian research group to design a theoretical model that combines model-based engineering and time series databases to automate part of the development process.

Time series databases support the storage and analysis of massive amounts of historical data, such as the temperature reading of an industrial refrigerator at regular intervals. This information then allows different operations to be carried out, such as calculating the maximum or average temperature over a specific period of time. Above all, the project aims to integrate time series calculations such as these into a model, i.e. a representation of a certain computer system which automates the whole process and removes the need to code its functions repeatedly for different cases. Abel Gómez said: "We have come up with a model that allows us to optimize historical queries by taking advantage of time series databases. This model outlines the preferred query language and the appropriate structure of the time series databases that would support that model."
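As a concrete example of the historical queries mentioned above, the sketch below computes the maximum and average temperature over a time window from an in-memory series. The readings are invented, and a real deployment would push this query down to a time series database rather than scan a Python list.

```python
from datetime import datetime, timedelta

# (timestamp, temperature in °C) readings from an industrial refrigerator.
readings = [
    (datetime(2020, 12, 10, 8, 0) + timedelta(minutes=15 * i), temp)
    for i, temp in enumerate([4.1, 4.3, 4.0, 3.8, 4.6, 5.2, 4.9, 4.4])
]

def window_stats(series, start, end):
    """Max and average of readings whose timestamp falls in [start, end)."""
    values = [temp for ts, temp in series if start <= ts < end]
    return max(values), sum(values) / len(values)

hi, avg = window_stats(readings,
                       datetime(2020, 12, 10, 8, 0),
                       datetime(2020, 12, 10, 9, 0))
print(f"max={hi:.1f}°C avg={avg:.2f}°C")
```

In the researchers' proposal, queries like this would be expressed once at the model level and the corresponding database code generated automatically, rather than re-coded by hand for each case.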

According to the researchers, this proposal is beneficial in that it would make it possible to automate the programming process and circumvent the risk of potential coding errors, since the model would specify all the information necessary for it to run properly. "This model would reduce programming time and the prevalence of errors. By generating the code automatically from the model, you don't have a programmer doing it by hand who can make mistakes," the researcher concluded.

Credit: 
Universitat Oberta de Catalunya (UOC)