Tech

A nice day for a quantum walk

image: First, a phonon is prepared at the location of ion 2 by illuminating it with light. The vibration propagates among the four ions because of the Coulomb interaction between them. After a certain time (which varied between 0 and 0.01 seconds), the probability of finding the phonon at each ion was measured with another beam of light. The probability shows a complex pattern that matches theoretical expectations precisely.

Image: 
Osaka University

Osaka, Japan - Researchers at the Center for Quantum Information and Quantum Biology at Osaka University used trapped ions to demonstrate the spreading of vibrational quanta as part of a quantum random walk. This work relies on their exquisite control of individual ions using lasers, and can lead to new quantum simulations of biological systems.

Here's a simple game you can play with a group of friends. Everyone lines up shoulder to shoulder, and then each person flips a coin to decide whether to take a step forward or backward. After a few rounds of flips, you will find that your neat line has spread out randomly. While this game sounds trivial, scientists have found that these "random walks" are incredibly useful for explaining diverse phenomena, from molecular diffusion to problems in statistics and probability.
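
You can reproduce this game in a few lines of code. The sketch below is a minimal simulation with made-up parameters (10,000 walkers, 100 coin flips each); it shows the line spreading out, with a spread that grows like the square root of the number of flips.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

n_walkers = 10_000   # friends standing in a line
n_rounds = 100       # coin flips per person

# Each person flips a fair coin every round: +1 step forward or -1 back.
steps = rng.choice([-1, +1], size=(n_walkers, n_rounds))
positions = steps.sum(axis=1)

# The line spreads out: the standard deviation grows like sqrt(n_rounds),
# and the histogram of positions approaches a bell curve.
print(f"mean position: {positions.mean():+.2f}")
print(f"spread (std):  {positions.std():.2f}  (theory: sqrt(100) = 10)")
```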

Among the very weird features of quantum mechanics—the laws of physics that govern the behavior of small objects like individual atoms—is the surprising mix of randomness and predictability. In particular, while the probability of finding a particle at a certain location spreads out predictably over time, like ripples in a pond, there is inherent uncertainty when you actually make a measurement. This makes quantum random walks fundamentally different from their conventional counterparts. Unlike gas molecules spreading out in a room, the waves of a quantum random walk can interfere with themselves, creating a distinct oscillation pattern.
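
The difference is easy to see in simulation. The sketch below implements a standard discrete-time quantum walk with a Hadamard "coin"; the step count and the symmetric initial coin state are illustrative choices, not details of the experiment described here. Unlike the classical bell curve, the resulting probability piles up in two outer lobes, the signature of interference.

```python
import numpy as np

n_steps = 50
n_pos = 2 * n_steps + 1          # positions -n_steps .. +n_steps

# State: amplitude[position, coin], coin 0 = "step left", 1 = "step right".
amp = np.zeros((n_pos, 2), dtype=complex)
amp[n_steps, 0] = 1 / np.sqrt(2)      # start at the origin in a
amp[n_steps, 1] = 1j / np.sqrt(2)     # symmetric coin superposition

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard "quantum coin flip"

for _ in range(n_steps):
    amp = amp @ H.T                  # flip the quantum coin
    shifted = np.zeros_like(amp)
    shifted[:-1, 0] = amp[1:, 0]     # coin 0 moves one site left
    shifted[1:, 1] = amp[:-1, 1]     # coin 1 moves one site right
    amp = shifted

prob = (np.abs(amp) ** 2).sum(axis=1)
# Unlike the classical bell curve, the probability concentrates in two
# lobes near +/- n_steps/sqrt(2): the interference pattern of the walk.
print("total probability:", prob.sum().round(6))
print("most likely site (offset from origin):", prob.argmax() - n_steps)
```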

The scientists at Osaka University started by creating an artificial crystal: a row of four calcium ions held in place with lasers. The ions could still influence each other through their electric charge. The team then showed that they could set one ion vibrating by shining a separate laser on it.

This minimum possible vibration, called a phonon, acted like a packet of energy that could be passed to a neighboring ion. As first author Masaya Tamura explains, "By employing the capability to prepare and observe a localized phonon, its propagation in a four-ion linear crystal can be observed with single-site resolution." After waiting for various lengths of time up to 10 milliseconds, the team found that the measured phonon locations matched the theoretical predictions.
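
A toy model of this propagation is a continuous-time quantum walk: the phonon hops between ions at a rate set by their Coulomb coupling. The sketch below assumes a uniform nearest-neighbour hopping rate J, a simplification of the real ion-chain physics (where couplings also extend beyond nearest neighbours); it simply evolves a phonon that starts on ion 2 and prints the probability of finding it on each ion.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical hopping-rate matrix for a phonon on a 4-ion chain. The real
# Coulomb-mediated couplings depend on trap parameters and reach beyond
# nearest neighbours; uniform nearest-neighbour hopping is an assumption.
J = 2 * np.pi * 100                      # hopping rate (rad/s), assumed value
H = J * (np.diag(np.ones(3), 1) + np.diag(np.ones(3), -1))

psi0 = np.array([0, 1, 0, 0], dtype=complex)   # phonon starts on ion 2

for t_ms in (0.0, 2.0, 5.0, 10.0):
    t = t_ms * 1e-3
    psi = expm(-1j * H * t) @ psi0       # Schrodinger time evolution
    probs = np.abs(psi) ** 2             # probability of finding the phonon
    print(f"t = {t_ms:4.1f} ms  ->  " +
          "  ".join(f"ion {i + 1}: {p:.2f}" for i, p in enumerate(probs)))
```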

"Our system using phonons offers a platform for realizing quantum simulations for studying open questions in chemistry and biology," says senior author Kenji Toyoda. "For example, it has been hypothesized that the incredible 95% efficiency of photosynthesis depends, at least in part, on the fact that quantum random walks act differently compared with classical randomness. The system shown here may be able to resolve these and other important issues."

Credit: 
Osaka University

Reducing neighborhood crime: Place management of alcohol outlets

Recent research from the Prevention Research Center of the Pacific Institute for Research and Evaluation and the University of Pittsburgh School of Public Health suggests that neighborhood crime may be reduced by enhancing "place management" resources in and around off-premise alcohol sales outlets, particularly at small and independent stores.

Place management means monitoring and controlling what people do in and around a place. Poor place management may provide opportunities for crime. Neighborhood crime is sometimes higher where alcohol outlet density is higher, and alcohol outlet managers are often held responsible for these problems. However, it's unclear how alcohol store staff can control neighborhood crime.

To understand how place management operates across a wide range of store and neighborhood types, researchers assessed crime prevention strategies at all 403 off-premise outlets in six contiguous California cities; interviewed managers in 40 of these outlets; and conducted extensive observations in 15 of these 40 outlets.

According to some managers, physical and verbal threats from customers and intoxicated persons, together with insufficient law enforcement response, made it difficult for store staff to control the behavior of people in and around their stores. Some managers reported relying on their own strong personalities and friendships with neighborhood-based customers to manage problems. Managers reported some ability to assert their authority over interior spaces, but less so over exterior, public spaces.

Further, small and independently operated stores were the most common type in the study but had fewer resources for place management than large and chain stores.

Says Principal Investigator Dr. Juliet Lee: "In under-regulated alcohol markets like California's, many stores, and many kinds of stores, can get a license to sell alcohol, but not all retailers have sufficient resources or authority to prevent area crimes. Improved law enforcement and manager training may help reduce crime in areas with a high density of alcohol sales outlets."

Credit: 
Pacific Institute for Research and Evaluation

Does MRI have an environmental impact?

image: Samples were taken along rivers around Tokyo. Measurements of rare earth element quantities indicate a clearly elevated amount of gadolinium compared to that in natural shale.

Image: 
Tokyo Metropolitan University

Tokyo, Japan - Researchers from Tokyo Metropolitan University have surveyed the amount of gadolinium found in river water in Tokyo. Gadolinium is contained in contrast agents given to patients undergoing medical magnetic resonance imaging (MRI) scans, and it has been shown in lab experiments to become toxic when exposed to ultraviolet rays. The researchers found significantly elevated levels, particularly near water treatment plants, highlighting the need for new public policy and removal technologies as MRI scans become even more commonplace.

Modern medicine owes a lot to magnetic resonance imaging (MRI). Doctors can see tumors, inflammation and hemorrhaging deep inside the human body without the need for invasive surgery; unlike with CT scans, patients are also not exposed to any ionizing radiation. Its many benefits have meant that MRI machines are now more widespread than ever. For example, in 1995, Japan had 6.12 machines per million residents; in 2017, it had 55.21, the highest number per million in the world.

But it might not be all good news. MRI scans are often carried out after patients are injected with a contrast agent that makes features inside the body show up more clearly. The agent contains gadolinium, a rare earth element that is toxic in its free form but rendered safe for medical use by binding it to a chelating agent, which makes it unreactive. After completing its task, 98% of the compound is expelled from a patient's body in the urine within 24 hours and makes its way through the wastewater system. Common wastewater treatment plants cannot remove it, so it passes directly into the environment, albeit in small quantities. Lab experiments have shown that, on exposure to UV light, it may transform back into a toxic form. This makes it vital to track how much gadolinium finds its way into the environment.

Thus, a team led by Professor Kazumasa Inoue of Tokyo Metropolitan University set out to measure how much gadolinium was being released into rivers in Tokyo. They took samples from a number of locations along the city's major rivers. Using mass spectrometry, they carried out a broad survey of rare earth elements, correcting for the amounts expected in natural shale, and found a significant elevation in the amount of gadolinium in the water. Importantly, they noticed large spikes in the amounts depending on proximity to water treatment plants. These findings agree with previous work on samples taken inside a treatment plant on the River Weser in Germany.
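
One common way to quantify such an excess is a "gadolinium anomaly": shale-normalized rare earth concentrations vary smoothly from element to element, so the natural gadolinium level can be interpolated from its neighbours and compared with the measured value. The sketch below uses entirely fictitious river-water concentrations and approximate shale (PAAS) reference values; whether the team used exactly this scheme is not stated here.

```python
# Illustrative gadolinium-anomaly calculation. Sample concentrations are
# made-up numbers; the shale reference (PAAS) values are approximate, and
# the choice of neighbouring elements for interpolation varies by study.
paas = {"Sm": 5.55, "Gd": 4.66, "Tb": 0.774}        # shale reference (ppm)
sample = {"Sm": 0.010, "Gd": 0.060, "Tb": 0.0015}   # river water (fictitious)

norm = {el: sample[el] / paas[el] for el in sample}  # shale-normalized

# Expected natural Gd, interpolated from its neighbours Sm and Tb:
gd_expected = (norm["Sm"] * norm["Tb"]) ** 0.5       # geometric mean
anomaly = norm["Gd"] / gd_expected

print(f"Gd anomaly: {anomaly:.1f}  (values well above ~1 suggest an "
      "anthropogenic source such as MRI contrast agents)")
```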

It should be remembered that the reason why gadolinium is released in the first place is that a patient's kidneys safely pass it from the body. This means that, for the most part, it is also non-reactive in the environment. But as more MRI machines are installed to cater to an ageing population with more healthcare needs, the research team noted that new public policy and the development of new treatment techniques are vital to mitigating the environmental impact of this well-established, lifesaving medical technology.

Credit: 
Tokyo Metropolitan University

MS risk 29% higher for people living in urban areas, new research reveals

(Vienna, Saturday, 23 May, 2020) Air pollution could be a risk factor for the development of multiple sclerosis (MS), a new study conducted in Italy has found.

The research, presented today at the European Academy of Neurology (EAN) Virtual Congress, detected a reduced risk for MS in individuals residing in rural areas that have lower levels of air pollutants known as particulate matter (PM). It showed that the MS risk, adjusted for urbanisation and deprivation, was 29% higher among those residing in more urbanised areas.

The study sample included over 900 MS patients within the region, and MS rates were found to have risen 10-fold in the past 50 years, from 16 cases per 100,000 inhabitants in 1974 to almost 170 cases per 100,000 people today. Whilst this rise can partly be explained by improved survival among MS patients, greater exposure to risk factors could also play a role.

The analysis was conducted in the north-western Italian region of Lombardy, home to over 547,000 people, and focused on winter, given that this is the season with the highest pollutant concentrations.

Commenting on the findings at the EAN Virtual Congress, lead researcher Professor Roberto Bergamaschi explained, "It is well recognised that immune diseases such as MS are associated with multiple factors, both genetic and environmental. Some environmental factors, such as vitamin D levels and smoking habits, have been extensively studied, yet few studies have focused on air pollutants. We believe that air pollution interacts through several mechanisms in the development of MS and the results of this study strengthen that hypothesis."

Particulate matter (PM) describes a mixture of solid particles and droplets in the air and is divided into two categories: PM10, which includes particles with a diameter of 10 micrometres or smaller, and PM2.5, which includes particles with a diameter of 2.5 micrometres or smaller.

Both PM10 and PM2.5 are major pollutants and are known to be linked to various health conditions, including heart and lung disease, cancer and respiratory issues. According to the World Health Organisation, 4.2 million deaths occur every year because of exposure to ambient (outdoor) air pollution.

Three different areas were compared within the study region based on their levels of urbanisation, of which two areas were found to be above the European Commission threshold of air pollution. "In the higher risk areas, we are now carrying out specific analytical studies to examine multiple environmental factors possibly related to the heterogeneous distribution of MS risk", added Professor Bergamaschi.

The number of people living with MS around the world is growing, with more than 700,000 sufferers across Europe. The vast majority (85%) of patients present with relapsing-remitting MS, characterised by unpredictable, self-limited episodes affecting the central nervous system. Whilst MS can be diagnosed at any age, it most frequently occurs between the ages of 20 and 40 and is more common in women. Symptoms can change in severity daily and include fatigue, walking difficulty, numbness, pain and muscle spasms.

Credit: 
Spink Health

Scientists identify a temperature tipping point for tropical forests

image: An aerial view of a tropical forest along the eastern Pacific Ocean shoreline of Barro Colorado Island, Panama.

Image: 
Smithsonian Tropical Research Institute photo

All living things have tipping points: points of no return, beyond which they cannot thrive. A new report in Science shows that maximum daily temperatures above 32.2 degrees Celsius (about 90 degrees Fahrenheit) cause tropical forests to lose stored carbon more quickly. To prevent this escape of carbon into the atmosphere, the authors, including three scientists affiliated with the Smithsonian Tropical Research Institute (STRI) in Panama, recommend immediate steps to conserve tropical forests and stabilize the climate.

Carbon dioxide is an important greenhouse gas, released as fossil fuels are burned. It is absorbed by trees as they grow and stored as wood. When trees get too hot and dry, they may close the pores in their leaves to save water, but that also prevents them from taking in more carbon. And when trees die, they release stored carbon back into the atmosphere.

Tropical forests hold about 40% of all the carbon stored by land plants. For this study, researchers measured the ability of tropical forests in different sites to store carbon.

"Tropical forests grow across a wide range of climate conditions," said Stuart Davies, director of the Smithsonian's Forest Global Earth Observatories (ForestGEO), a worldwide network of 70 forest study sites in 27 countries. "By examining forests across the tropics, we can assess their resilience and responses to changes in global temperatures. Many other studies explored how individual forests respond to short-term climatic fluctuations. This study takes a novel approach by exploring the implications of thermal conditions currently experienced by all tropical forests."

By comparing carbon storage in trees at almost 600 sites around the world that are part of several different forest monitoring initiatives (RAINFOR, AfriTRON, T-FORCES and ForestGEO), the huge research team led by Martin Sullivan of the University of Leeds and Manchester Metropolitan University found major differences in the amount of carbon stored by tropical forests in South America, Africa, Asia and Australia. South American forests store less carbon than forests in the Old World, perhaps due to evolutionary differences in which tree species grow there.

They also found that the two most important factors predicting how much carbon is lost by forests are the maximum daily temperature and the amount of precipitation during the driest times of the year.

As temperatures reach 32.2 degrees Celsius, carbon is released much faster. Trees can deal with increases in the minimum nighttime temperature (a global warming phenomenon observed at some sites), but not with increases in maximum daytime temperature.

They predict that South American forests will be the most affected by global warming because temperatures there are already higher than on other continents and the projections for future warming are also highest for this region. Increasing carbon in the atmosphere may counterbalance some of this loss, but would also exacerbate warming.

Forests can adapt to warming temperatures, but it takes time. Tree species that cannot take the heat die and are gradually replaced by more heat-tolerant species. But that may take several human generations.

"This study highlights the importance of protecting tropical forests and stabilizing the Earth's climate," said Jefferson Hall, co-author and director of the Smithsonian's Agua Salud Project in Panama. "One important tool will be to find novel ways to restore degraded land, like planting tree species that help make tropical forests more resilient to the realities of the 21st century."

The Agua Salud project asks how native tree species adapted to an area can be used to manage water, store carbon and promote biodiversity conservation at a critical point where North and South America connect.

The authors also note that one of the first permanent tropical forest study sites in the world, located at STRI's research station on Barro Colorado Island in Panama, is currently going unmonitored for the first time in 40 years as a result of the COVID-19 pandemic, giving scientists less of a handle on any climate change effects that may be in play.

Steve Paton, director of STRI's physical monitoring program, notes that in 2019 there were 32 days with maximum temperatures over 32 degrees Celsius at a weather station in the forest canopy on the island, and a first glance at his data indicates that these exceptionally hot days are becoming more common.

Credit: 
Smithsonian Tropical Research Institute

Blood flow recovers faster than brain in micro strokes

image: Rice neurobiologists show that increased blood flow to the brain is not an accurate indicator of neuronal recovery after a microscopic stroke. The researchers created a custom implant that can simultaneously monitor both blood flow and brain activity.

Image: 
Luan Laboratory/Rice University

HOUSTON - (May 22, 2020) - Increased blood flow to the brain after a microscopic stroke doesn't mean that part of the brain has recovered. At least not yet.

A study in Science Advances by Rice University neuroengineer Lan Luan and her colleagues used advanced neural monitoring technology to discover a significant disconnect between how long it takes blood flow and brain function to recover in the region of a microinfarct, a tiny stroke in tissue less than 1 millimeter in size.

The study led by Luan, a core faculty member of Rice's Neuroengineering Initiative, shows "a pronounced neurovascular dissociation that occurs immediately after small-scale strokes, becomes the most severe a few days after, lasts into chronic periods and varies with the level of ischemia," the researchers wrote.

The study in rodent models revealed the restoration of blood flow in the brain occurs first, followed by restoration of neuronal electrical activity. They observed that neuronal recovery could take weeks even for small strokes, and possibly longer for larger strokes.

The study required implants and instrumentation designed to monitor both blood flow and brain activity simultaneously before, during and after the onset of strokes.

"This started with the device," said Luan, an assistant professor of electrical and computer engineering at Rice's Brown School of Engineering, who developed a flexible neural electrode with co-author Chong Xie while both were at the University of Texas at Austin. "That was my transition from being trained as a material physicist to neuroengineering.

"As soon as we had the electrodes, I wanted to use them to understand brain functions and dysfunctions in a domain that was difficult to probe with previous technology," she said. "The electrodes are extremely flexible and well suited to be combined with optical imaging in exactly the same brain regions."

The electrodes were combined with optical lines able to measure blood flow by recording laser speckle patterns. The combined data, gathered for as long as eight weeks, gave the researchers an accurate comparison between blood flow and electrical activity.
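
Laser speckle contrast imaging commonly works as follows: faster-moving blood blurs the speckle pattern during a camera exposure, lowering the local contrast (the standard deviation of intensity divided by its mean). The sketch below is a generic illustration of that calculation on synthetic data, not the authors' actual processing pipeline.

```python
import numpy as np

def speckle_contrast_map(frame: np.ndarray, win: int = 7) -> np.ndarray:
    """Local speckle contrast K = std/mean over win x win windows.

    Faster blood flow blurs the speckle pattern during the exposure,
    lowering K; a common flow index is proportional to 1/K**2.
    """
    out = np.zeros((frame.shape[0] - win + 1, frame.shape[1] - win + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = frame[i:i + win, j:j + win]
            out[i, j] = patch.std() / (patch.mean() + 1e-12)
    return out

# Demo: fully developed static speckle has exponentially distributed
# intensity, so its contrast K is close to 1. Moving scatterers would
# blur the pattern and push K toward 0.
rng = np.random.default_rng(0)
static = rng.exponential(scale=1.0, size=(128, 128))
print("mean K (static speckle):", speckle_contrast_map(static).mean().round(2))
```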

"The strokes we focus on are so small that when they happen, it's very hard to detect them from behavioral measures," Luan said. "We would not easily see impairment in animal locomotion, meaning the animal could walk away just fine, from a lay perspective.

"The implications in humans are similar," she said. "These microinfarcts can occur spontaneously, especially in aged populations. Because they're so tiny, it's not like you're having a stroke. You will not notice it at all. But it has been long hypothesized that it's related to vascular dementia."

Luan said the neurological impact of individual microinfarcts is largely unknown. "That's what motivated us to set up a series of experiments to really directly measure the impacts of those extremely small-scale injuries," she said.

While the study would be hard to replicate in humans, the implications could improve diagnoses of patients who suffer microinfarcts.

"There are a lot of similarities in neurovascular coupling in rodent models and in humans," she said. "What we observed in rodents likely has a similar signature in humans, and I hope that can be of use to clinicians."

Luan said she is continuing her research at Rice, supported by a five-year R01 grant from the National Institute of Neurological Disorders and Stroke.

"We're interested in knowing not just how a single microinfarct would alter neural activity but also, cumulatively, whether the effect of multiple microinfarcts that occur at different times would be stronger or weaker than the sum of the individuals," she said.

Fei He, a research specialist in Luan's Rice laboratory, is lead author of the paper. Co-authors are Rice graduate students Hanlin Zhu and Xue Li and postdoctoral researchers Zhengtuo Zhao and Colin Sullender, along with University of Texas at Austin graduate student Michael Williamson; Theresa Jones, a professor of behavioral neuroscience; and Andrew Dunn, a professor of diagnostic medicine and biomedical engineering. Xie is an associate professor of bioengineering and of electrical and computer engineering at Rice.

Credit: 
Rice University

Combinatorial screening approach opens path to better-quality joint cartilage

image: The biomaterials-based high-throughput screening approach can simultaneously test combinations of physical and biochemical factors for their ability to synergistically form functional joint cartilage from stem cells, enabling the team to identify specific cartilage-inducing microenvironments.

Image: 
Khademhosseini and Alsberg labs

(LOS ANGELES and CHICAGO) -- Cartilage is far from being just one uniform tissue. A rubber-like elastic tissue with widely varying properties, it lubricates our joints to keep them healthy and in motion, and it forms many of our internal structures, such as the intervertebral discs in our spine, the flexible connections between our ribs, and our voice box, as well as external features like the nose and ears.

Specifically, in joints, the wear and tear of cartilage over time can eventually result in painful bone-on-bone contact, bone damage, and the inflammatory reactions that plague patients with osteoarthritis, the most common form of arthritis. In the US alone, 32.5 million adults are affected by osteoarthritis, and thus far there is no strategy that allows lasting repair or replacement of degenerating joint (articular) cartilage.

To overcome this problem, researchers are using tissue engineering strategies to generate cartilage from stem cells outside of the human body, but "it can be challenging to prevent fibrocartilage and hypertrophic cartilage from forming when using tissue engineering strategies," said Eben Alsberg, Ph.D., the Richard and Loan Hill Professor of Bioengineering, Orthopedics, Pharmacology and Mechanical & Industrial Engineering at the University of Illinois at Chicago, who was previously at Case Western Reserve University. Upon implantation into joints, engineered cartilage can become unstable and dysfunctional, and methods for determining the more complex conditions needed to produce high-quality cartilage ex vivo and maintain it in vivo have thus far been limited.

Now, a collaborative research team led by Ali Khademhosseini, Ph.D., the Director and CEO of the Terasaki Institute who was previously Director of the University of California, Los Angeles (UCLA) Center for Minimally Invasive Therapeutics, and Alsberg, has developed a multi-component biomaterial-based screening approach that identifies material compositions, and mechanical and molecular stimuli enabling human stem cells to differentiate into cells capable of generating higher-quality articular cartilage. The study is published in Science Advances.

"We took a holistic approach to cartilage engineering with this multicomponent in vitro approach by screening with high-throughput through many combinations of material, biomechanical and molecular parameters, which in this complexity had not been done before," said Khademhosseini. "This allowed us to define material properties and compositions, and specific mechanical, biochemical, and pharmacological contributions that help guide human mesenchymal stem cells (hMSCs) down a differentiation path towards articular cartilage-producing chondrocytes in vitro, and better maintain their functionality when transferred into mice."

Chondrocytes, which differentiate from hMSCs, form cartilage by secreting collagen and other biomolecules into their extracellular environment, where these molecules form a hydrated elastic matrix. However, because differentiated cartilage retains relatively few normally functioning chondrocytes and lacks supportive blood vessels, it cannot efficiently repair and regenerate itself.

In the study, the team assembled a compression bioreactor from 3D printed components with an array of 288 individual hydrogel-based biomaterials for screening of multiple parameters presented in the native developing cartilage microenvironment. These hydrogels were made up of two different biomaterials, oxidized methacrylated alginate (OMA) and polyethylene glycol (PEG). The two hydrogel components can be cross-linked to each other to create a biodegradable and biocompatible dense interconnected elastic network. Within the biomaterial, the researchers embedded hMSCs and, in addition, cell-binding ligands that mimic the normal extracellular environment of developing cartilage, and growth factors favoring cartilage cell differentiation. The hydrogel biomaterial with the encapsulated hMSCs could be mechanically manipulated between fixed and movable plates, whereby the movable plate is cyclically pushed up from the bottom with finely calibrated forces, causing the biomaterial scaffold to be compressed and then relaxed again each time.

To be able to support the hMSCs with cartilage-specific cell culture medium and expose them to additional biochemical cues while they differentiate, the device was separated into multiple chambers and each chamber was linked to a microfluidic support system. Since all relevant biomaterial, mechanical and chemical parameters could be individually varied between biomaterials of the array, the researchers could study multiple combinations of cues simultaneously.
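
To give a sense of the scale of such a factorial screen, the sketch below enumerates a hypothetical design grid. The factor names and levels are placeholders, not the study's actual conditions; they are chosen only so that the grid contains 288 combinations, matching the size of the array described above.

```python
from itertools import product

# Hypothetical design factors. The actual levels used in the study are not
# specified here; these placeholders are chosen so the full factorial grid
# has 288 conditions, matching the size of the 288-hydrogel array.
oma_ratio     = ["low", "mid-low", "mid-high", "high"]      # 4 OMA:PEG mixes
crosslinking  = ["soft", "medium", "stiff", "very stiff"]   # 4 stiffness levels
ligand        = ["none", "ligand A", "ligand B"]            # 3 cell-binding cues
growth_factor = ["none", "GF 1", "GF 2"]                    # 3 soluble cues
compression   = ["static", "cyclic"]                        # 2 loading modes

conditions = list(product(oma_ratio, crosslinking, ligand,
                          growth_factor, compression))
print(len(conditions), "combinations")   # 4 * 4 * 3 * 3 * 2 = 288
for cond in conditions[:3]:              # first few rows of the design grid
    print(cond)
```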

"Our approach pinpointed biomaterial compositions that provided a sweet spot of hydrogel physical properties, just the right amounts of extracellular matrix and critical growth factors, and mechanical stimulation that hMSCs needed in this complexity to develop into highly functional articular chondrocytes in the engineered system," said co-first author Junmin Lee, Ph.D., a postdoctoral fellow in Khademhosseini's group.

Alsberg added that the team's device-driven biomaterials strategy "identified cues in the cellular microenvironment that could preferentially drive engineered tissue constructs to a preferred hyaline cartilage phenotype". Chondrocytes that matured in the biomaterials secreted substantial amounts of extracellular matrix molecules that compose natural joint cartilage.

Lee together with the other co-first author Oju Jeon, Ph.D., a Research Professor working with Alsberg, and additional team members, also studied molecular pathways that chondrocytes normally use to transduce mechanical signals from their extracellular environment to control their gene expression. "We found that suboptimal biomaterial properties that elevated the activity of a mechanotransducing protein called YAP and its downstream effects were causing chondrocytes to adopt a less functional state strongly resembling the one in hypertrophic cartilage in patients," said Jeon. "In contrast, inhibiting YAP with a specific drug favored the formation of functional articular chondrocytes in our system."

The team identified the YAP inhibitor, along with an inhibitor of WNT, another protein involved in mechanotransduction, in a search for drugs that would favor the formation of healthy articular cartilage in their system.

To investigate whether their overall approach could enable the generation of chondrocytes that would also be more effective in vivo, they scaled up a successful condition that resulted from their screening procedure from a hydrogel 1 mm in diameter to one that measured 8 mm in diameter. "When we actively inhibited YAP or the mechanical signal transducer WNT during 21 days of chondrocyte differentiation in vitro, implanted the engineered tissue under the skin of mice, and analyzed the implants again after an additional 21 days, we observed higher-quality chondrocytes with significantly less hypertrophy compared to controls that were not treated with inhibitors prior to implantation," said Jeon.

"The opportunities that our approach offers and the information it already helped us provide is an important step towards the generation of truly therapeutic articular cartilage, and some of the insights we gleaned could also be tooled for enhancing the function of existing joint cartilage in patients with osteoarthritis and for more personalized strategies," said Khademhosseini. His group continues their efforts at the interface of the Terasaki Institute's Personalized Implants, Personalized Cells, and Personalized Materials platforms in collaboration with the Alsberg Stem Cell & Engineered Novel Therapeutics (ASCENT) Laboratory.

Credit: 
Terasaki Institute for Biomedical Innovation

COVID-19 test results after clinical recovery, hospital discharge among patients in China

What The Study Did: Reverse transcriptase-polymerase chain reaction tests were used to assess potential viral shedding among patients who previously had been diagnosed with and had clinically recovered from COVID-19.

Authors: Pa Wu, Ph.D., of the Hunan Normal University in Changsha, China, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2020.9759)

Editor's Note: Please see the article for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

A stitch in time: How a quantum physicist invented new code from old tricks

image: Dr Benjamin Brown is a Research Fellow at the University of Sydney Nano Institute and School of Physics.

Image: 
University of Sydney

A scientist at the University of Sydney has achieved what one quantum industry insider has described as "something that many researchers thought was impossible".

Dr Benjamin Brown from the School of Physics has developed a type of error-correcting code for quantum computers that will free up more hardware to do useful calculations. It also provides an approach that will allow companies like Google and IBM to design better quantum microchips.

He did this by applying already-known code that operates in three dimensions to a two-dimensional framework.

"The trick is to use time as the third dimension. I'm using two physical dimensions and adding in time as the third dimension," Dr Brown said. "This opens up possibilities we didn't have before."

His research is published today in Science Advances.

"It's a bit like knitting," he said. "Each row is like a one-dimensional line. You knit row after row of wool and, over time, this produces a two-dimensional panel of material."

Fault-tolerant quantum computers

Reducing errors in quantum computing is one of the biggest challenges facing scientists before they can build machines large enough to solve useful problems.

"Because quantum information is so fragile, it produces a lot of errors," said Dr Brown, a research fellow at the University of Sydney Nano Institute.

Completely eradicating these errors is impossible, so the goal is to develop a "fault-tolerant" architecture where useful processing operations far outweigh error-correcting operations.

"Your mobile phone or laptop will perform billions of operations over many years before a single error triggers a blank screen or some other malfunction. Current quantum operations are lucky to have fewer than one error for every 20 operations - and that means millions of errors an hour," said Dr Brown who also holds a position with the ARC Centre of Excellence for Engineered Quantum Systems.

"That's a lot of dropped stitches."

Most of the building blocks in today's experimental quantum computers - quantum bits or qubits - are taken up by the "overhead" of error correction.

"My approach to suppressing errors is to use a code that operates across the surface of the architecture in two dimensions. The effect of this is to free up a lot of the hardware from error correction and allow it to get on with the useful stuff," Dr Brown said.

Dr Naomi Nickerson is Director of Quantum Architecture at PsiQuantum in Palo Alto, California, and unconnected to the research. She said: "This result establishes a new option for performing fault-tolerant gates, which has the potential to greatly reduce overhead and bring practical quantum computing closer."

Path to universal computation

Start-ups like PsiQuantum, as well as the big technology firms Google, IBM and Microsoft, are leading the charge to develop large-scale quantum technology. Finding error-correcting codes that will allow their machines to scale up is urgently needed.

Dr Michael Beverland, a senior researcher at Microsoft Quantum and also unconnected with the research, said: "This paper explores an exciting, exotic approach to perform fault-tolerant quantum computation, pointing the way towards potentially achieving universal quantum computation in two spatial dimensions without the need for distillation, something that many researchers thought was impossible."

Two-dimensional codes that currently exist require what Dr Beverland refers to as distillation, more precisely known as 'magic-state distillation'. This is where the quantum processor takes many imperfect copies of a special resource state, called a magic state, and distils them into fewer, higher-quality copies that are then consumed to run certain logic gates.

This chews up a lot of computing hardware just suppressing the errors.

"I've applied the power of the three-dimensional code and adapted it to the two-dimensional framework," Dr Brown said.

Dr Brown has been busy this year. In March he published a paper in top physics journal Physical Review Letters with colleagues from EQUS and the University of Sydney. In that research he and colleagues developed a decoder that identifies and corrects more errors than ever before, achieving a world record in error correction.

"Identifying the more common errors is another way we can free up more processing power for useful computations," Dr Brown said.

Professor Stephen Bartlett is a co-author of that paper and leads the quantum information theory research group at the University of Sydney.

"Our group at Sydney is very focused on discovering how we can scale-up quantum effects so that they can power large-scale devices," said Professor Bartlett, who is also Associate Dean for Research in the Faculty of Science.

"Dr Brown's work has shown how to do this for a quantum chip. This type of progress will enable us to go from small numbers of qubits to very large numbers and build ultra-powerful quantum computers that will solve the big problems of tomorrow."

Credit: 
University of Sydney

How the darter got stripes: Expanding a sexual selection theory explains animal patterns

image: Samuel Hulse (right), from UMBC, and Julien Renoult, from the Centre National de la Recherche Scientifique, collect darter fish in a freshwater stream. The complex Fourier analysis they completed based on photos of the fish and their habitats supported and greatly expanded on sensory drive theory, a sexual selection theory that emphasizes the role of the environment in determining which sexual signals are selected for. The new work, published in Nature Communications, shows for the first time that the theory holds true for complex visual patterns, not only simple signals like color.

Image: 
Courtesy of Tamra Mendelson

Samuel Hulse, a Ph.D. candidate at UMBC, spent a lot of time in waders over the last two years. He traipsed from stream to stream across the eastern U.S., carefully collecting live specimens of small, colorful freshwater fish known as darters and taking photos of their habitats. Then he brought them back to the lab to capture high-quality images of their coloration patterns.

Hulse developed a precise, quantitative analysis of those visual patterns, such as stripes, spots, and various mottled looks. His work shows, for the first time, a strong correlation between the complicated patterns on male fish and the fishes' highly variable environments. The results were published today in Nature Communications.

These findings represent a major expansion of a theory in sexual selection known as "sensory drive," which emphasizes how an animal's environment can influence what sexual signals--like visual patterns--are selected for over time.

Driving progress

So far, sensory drive has successfully explained examples such as coloration in cichlids, a group of freshwater fish in Africa. Hulse was working to expand on this research.

Different species of cichlids live at different depths, and which colors the fish can easily see changes as you go deeper and there is less light. Why does this matter? The idea of sensory drive is that animals perceive visual signals, like colors, as more attractive when they are easier for their brains to process. And which signals are easier to process is dependent on the environment. When male fish are perceived as more attractive, they are more likely to reproduce, and their colors are more likely to be passed to the next generation of fish. So, if the theory of sensory drive is true, eventually, most male fish will have colors that are easy for mates to perceive in their particular environment.

In cichlid fish, "you see this depth-dependent change in the male colors as you go deeper," Hulse says. With the new work, "we were able to expand on this theory to explain more complicated traits, such as visual patterns," like stripes and spots.

Using math to understand biology

Hulse, who is also taking courses toward an M.S. in mathematics at UMBC, brought his quantitative skills to bear on this research. He used a technique called Fourier analysis to examine his fish images, looking at variations in color contrast.

For example, if you were to look at a photo of a grassy hill under a bright blue sky, the greatest contrast in brightness would be between the large areas above and below the horizon line. That contrast is on a larger scale than the differences in brightness between, say, tiny blades of grass. The differences between each blade are small, but occur frequently across the image.

Fourier analysis can translate the contrast patterns in an image into a representative set of mathematical sine and cosine waves. The low-frequency waves, which only swoop up and down once or twice across the entire image, represent large-scale differences, like above and below the horizon. High-frequency waves swoop up and down many times across an image and represent small-scale differences, like between blades of grass.

Researchers can look at the relationships between those waves--how much high-frequency versus low-frequency contrast there is in the image. Hulse's work looked at that measure to examine the visual relationship between a habitat and the fish that lived in it. And sure enough, his calculations revealed a strong correlation, providing evidence of sensory drive in male darters.
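
One widely used way to summarize such an analysis, sketched below on synthetic data, is to radially average the 2D Fourier power spectrum of an image and fit a slope on log-log axes; the slope captures how much contrast sits at coarse versus fine scales, and slopes of fish and habitat photos can then be compared. This is a generic illustration rather than Hulse's exact pipeline.

```python
import numpy as np

def fourier_slope(image: np.ndarray) -> float:
    """Slope of the radially averaged log-log Fourier power spectrum.

    A steep (more negative) slope means contrast lives mostly at large
    scales (low frequencies); a shallow slope means fine-grained,
    high-frequency pattern.
    """
    f = np.fft.fftshift(np.fft.fft2(image - image.mean()))
    power = np.abs(f) ** 2

    h, w = image.shape
    y, x = np.indices((h, w))
    r = np.hypot(y - h // 2, x - w // 2).astype(int)  # radial frequency bins

    counts = np.bincount(r.ravel())
    radial = np.bincount(r.ravel(), weights=power.ravel()) / np.maximum(counts, 1)

    freqs = np.arange(1, min(h, w) // 2)              # skip DC, stay in-band
    slope, _ = np.polyfit(np.log(freqs), np.log(radial[freqs]), 1)
    return slope

rng = np.random.default_rng(42)
noise = rng.normal(size=(256, 256))   # white noise: flat spectrum, slope near 0
print("white-noise slope:", round(fourier_slope(noise), 2))
```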

Moving past "wishy-washy terminology"

One argument against the idea that these patterns are attractive to females is the idea of camouflage. Wouldn't it make sense for animals to match the visual patterns of their environment to avoid getting eaten rather than to attract females? Darters are under strong predation pressure, so, Hulse says, it's a valid point.

However, the fact that he found that only male fish match their environment is a strong argument in favor of sensory drive. Predators don't discriminate between males and females, so you would expect females to also match their environment if camouflage was the reason.

"Quantitatively describing visual patterns is a big challenge, and there's not one easy way to do that, so being able to use tools like Fourier analysis is wonderful," Hulse says. "That actually lets us quantify some of these things that have historically been very hard to describe other than with wishy-washy terminology."

Perfect timing

Tamra Mendelson, professor of biological sciences, is Hulse's advisor and a co-author on the new paper. She had just begun formulating the ideas for this research with visual ecologist Julien Renoult, a colleague at Centre National de la Recherche Scientifique (CNRS) in Montpellier, France, and another co-author, when Hulse joined her laboratory in 2016.

"Julien had inspired me to take concepts from a field called human empirical aesthetics, which is the mathematical and biological basis of human appreciation of art, and apply them to animals' appreciation of other animals," Mendelson says. "I was super excited about it, but I didn't have the mathematical chops to really take it as far as it could go."

So, when Hulse arrived, "It was a perfect match. Sam is the ideal student to be doing this project."

Hulse also spent several months in France working with Renoult to iron out some of the statistical challenges of the work--which were many. "The data analysis became a lot more complicated than we thought, and there were a lot of technical snags," Hulse says. "So it was really great to be able to be there working directly with Julien, who has a lot of background with these sorts of methods."

Bringing it all together

Hulse was drawn to this work by the unique blend of skills it requires. "I love the interdisciplinary nature of it. We're bringing together field biology, sensory biology, a little bit of neurobiology, and image analysis," he says. "That's one of the most attractive things about this project for me--how much I get to learn and how much I get to take little pieces from so many different areas."

Now, Hulse, Mendelson, and Renoult are excited to see where their new work leads. "There's not a lot of theory in sexual selection that can be used to explain why you see one pattern evolve in one animal where you see a different one evolve in a closely related species," Hulse says.

The new findings open the door to much more exploration with different species, including animals that live on land. In any group of animals that relies on vision, has visually distinct environments, and where the animals have distinct habitat preferences, Hulse argues, "this theory should hold."

Credit: 
University of Maryland Baltimore County

Mechanism underlying the development of diabetes and fatty liver illuminated

image: Figure 1

Image: 
Kobe University

A research group including Professor OGAWA Wataru (Division of Diabetes and Endocrinology, Kobe University Graduate School of Medicine) and Project Associate Professor HOSOOKA Tetsuya (Division of Development of Advanced Therapy for Metabolic Disease, Kobe University Graduate School of Medicine) has clarified the mechanism underlying the development of diabetes and non-alcoholic steatohepatitis (NASH), a severe form of fatty liver.

NASH is a chronic liver disease often associated with diabetes. It sometimes progresses to more serious conditions such as liver cirrhosis and liver cancer. However, the mechanism by which NASH develops remains unclear, and no approved medication for the disease is available.

The current study revealed that insufficient insulin action in adipocytes leads to metabolic defects affecting the entire body via the hyperactivation of the protein FoxO1, which in turn results in the development of diabetes and NASH. The pathway revealed by the current study could serve as a potential target for the development of new drugs for these conditions.

These findings were published in the American scientific journal 'Proceedings of the National Academy of Sciences of the USA' on May 11, 2020.

Main Points

Diabetes and NASH develop as a result of insufficient action of the hormone insulin in fat cells.

This insufficient action of insulin in fat cells leads to FoxO1 overactivation, which in turn causes diabetes and NASH to develop.

There is a demand for better diabetes medication and, furthermore, there is currently no available treatment for NASH.

The pathway uncovered in this study could serve as a potential target for the development of new medications for these conditions.

Research Background

In Japan, more than 10 million people suffer from diabetes, which can lead to a wide variety of health problems. Prevention of diabetes and diabetes-associated diseases is an important medical issue worldwide. NASH is a health problem often associated with diabetes. NASH develops from fatty liver and may progress into more serious conditions such as liver cirrhosis and liver cancer. However, the mechanistic relationship between diabetes and NASH is unclear and no medication for NASH is currently available.

Fat cells, or adipocytes, play an important role in regulating the entire body's metabolism. The impairment of adipocyte functions is thought to contribute to the development and progression of various diseases, including diabetes and NASH. However, the mechanism by which adipocyte dysfunction triggers these diseases is not fully understood.

Professor Ogawa's research team has found that insufficient action of insulin in adipocytes results in the overactivation of the protein FoxO1, which in turn leads to the development of diabetes and NASH through the alteration of the entire body's metabolism. The link between insufficient insulin action in adipocytes and NASH had not been proposed until now. In addition, the research team discovered that the overactivation of FoxO1 causes an abundant increase in leukotriene B4 (*1), and that this inflammation-causing substance plays an important role in the onset of metabolic dysfunction.

Summary of the Discovery

Insulin is an important hormone that regulates the metabolism of the entire body, and insufficient insulin action, often called "insulin resistance", serves as a basis for various diseases. Professor Ogawa's team generated mice in which PDK1, a protein essential for insulin action, was deficient only in adipocytes. These mice exhibited insufficient insulin action not only in adipocytes but also throughout the body, which led to the development of diabetes and NASH.

Insulin is known to suppress the activity of the protein FoxO1 via PDK1 activation. The team therefore generated mice lacking both FoxO1 and PDK1 only in their adipocytes to test whether overactivation of FoxO1 contributes to the development of diabetes and NASH, and found that neither condition manifested in these mice at all.

These results indicate that insufficient insulin action in adipocytes triggers diabetes and NASH via FoxO1 overactivation, culminating in insulin resistance throughout the whole body (Figure 1).

The researchers further examined how the overactivation of FoxO1 influences the functions of other organs. They revealed that FoxO1 is capable of increasing the amount of the protein 5-lipoxygenase (Figure 2), which is responsible for the production of the inflammatory substance leukotriene B4. In mice with adipocyte-specific PDK1 deficiency, inhibition of the production or the function of leukotriene B4 ameliorated diabetes (*1), indicating that the overactivation of FoxO1 triggers diabetes through leukotriene B4 activity. The team also revealed that the overactivation of FoxO1 and the consequent upregulation of 5-lipoxygenase occurred in the adipose tissue of obese normal mice fed a high-fat diet.

This research revealed that diabetes and NASH develop as a result of insufficient insulin action (i.e. insulin resistance) in adipocytes, which leads to exaggerated activation of FoxO1 and subsequent upregulation of leukotriene B4.

The link between insufficient insulin action in adipocytes and the onset of NASH, as well as insulin's ability to control the production of leukotriene B4, are new discoveries that had not previously been hypothesized.

How can these new research findings be applied to medical care?

In Japan, at least 3 million patients suffer from NASH; however, there is no approved medication available for the disease. The findings uncovered by this research may pave the way for the development of drugs for NASH that target the insufficient action of insulin in adipocytes.

Drugs that inhibit the production or the function of leukotriene B4 have already been developed; some of them have previously been used to treat asthma in some countries. Therefore, drug repositioning (*2) using these existing drugs presents a promising option for the development of new diabetes medication.

Credit: 
Kobe University

Kidney transplants: Better results can be inferred from a larger number of cases

In complex operations, is there a correlation between the volume of service provided per hospital and the quality of the treatment outcome? This question is addressed in eight commissions on minimum volumes that the Federal Joint Committee (G-BA) has issued to the Institute for Quality and Efficiency in Health Care (IQWiG). The IQWiG report is now available for the fifth intervention to be tested, kidney transplantation.

According to this report, in the case of kidney transplantation there is a correlation between the volume of services and the quality of the treatment outcome: In hospitals with larger case numbers, the chances of survival are higher up to one year after transplantation. For the target figure "transplant failure" no correlation between the volume of services and the quality of treatment can be deduced.

The most frequent organ transplantation in Germany

In cases of chronic kidney failure, most often caused by diabetes or high blood pressure, kidney transplantation is the only treatment option besides dialysis. The organ donation is made either as a post-mortem donation or as a living donation from direct relatives or people very close to the patient. Five years after transplantation, 78 percent of post-mortem donated kidneys and 87 percent of living-donor kidneys still function in the new body (figures for Europe).

Kidney transplantation is the most common organ transplantation in Germany: In 2018, doctors in Germany transplanted 1,671 kidneys after post-mortem organ donation and 638 kidneys after living donation. The waiting list for a donor kidney included more than 7,500 patients in the same year. The average waiting time for a kidney transplant is currently more than 8 years.

Currently, a minimum of 25 treatments per hospital location and year is required for kidney transplants (including living donations) in Germany. In contrast to the regulation on the annual minimum volume for liver transplants, organ removals are not counted as part of the number of interventions required to achieve the minimum quantities.

Positive correlation between service volume and chance of survival

The question of whether hospitals with larger case numbers achieve better treatment results for kidney transplantation than hospitals with smaller case numbers can be answered in the affirmative for patients' chances of survival over a short-term observation period: for all-cause mortality up to 12 months after transplantation, two of the three studies evaluated in this context show a lower probability of dying with a higher volume of services, although the certainty of the results is low. The IQWiG researchers cannot derive such a correlation for medium-term all-cause mortality after 36 months, for which a US study had collected data. After evaluating the data from two relevant studies, the Institute also sees no overall connection between the volume of services and the quality of treatment for the outcome "transplant failure". No usable data were available for the outcomes "adverse effects of therapy", "health-related quality of life" and "length of hospital stay", so no statements can be made on these.

Since none of the included studies recorded the individual case numbers of surgeons, it is also not possible to assess whether greater routine in kidney transplantation leads to better treatment results.

There are no studies on the effects of minimum case numbers specifically introduced into the care system for kidney transplants. Accordingly, IQWiG cannot make a statement on this.

The report preparation process

In February 2019, the Federal Joint Committee commissioned IQWiG to prepare the report on the relationship between the volume of services and quality in kidney transplantation in an accelerated procedure as a "rapid report". Intermediate products were therefore not published and not submitted for consultation. The work on this rapid report started in August 2019 and after completion it was sent to the contracting agency, the G-BA, in April 2020.

An English extract of this rapid report will be available soon. If you would like to be informed when it is published, please contact info@iqwig.de

Credit: 
Institute for Quality and Efficiency in Health Care

Why toothpaste and cement harden over time

image: University of Delaware chemical and biomolecular engineering professor and chair Eric Furst and a team of researchers from the Ecole des Ponts and University Paris-Est in France have discovered a process called contact-controlled aging that explains some age-related changes in paste materials.

Image: 
Photo illustration by Joy Smoker

Take a look inside the cap of your favorite toothpaste, and you might see hard, white residue, a firm version of the smooth paste you squeeze onto your brush.

Many paste materials, also known as dense colloidal suspensions, stiffen as they age. Structural dynamics, or changes in the loads the materials undergo over time, are partly responsible for this change, but for decades, experts have suspected that there's more going on inside these materials.

Now, University of Delaware chemical and biomolecular engineering professor and chair Eric Furst and a team of researchers from the Ecole des Ponts and University Paris-Est in France have discovered a process called contact-controlled aging that explains some age-related changes in paste materials.

They found that contacts form between particles, stabilizing the microstructure of these materials. Then, those contacts stiffen, increasing the stiffness of the materials.

The team described their findings in a paper published in the journal Nature Materials.

"When people think about aging in materials and the mechanical properties of materials as they age, especially in rheology or the study of how things flow, this mechanism has been overshadowed by changes in the organization, or microstructure, of the material," said Furst.

Not only are the findings novel, they are likely to prove useful. By understanding how materials age, the people who use them can design better ways to predict and mitigate unwanted changes in materials performance. The experiments closely tie the chemistry of the particle surfaces, which can be tailored by chemical reactions or with additives like surfactants and polymers, to the bulk material properties.

"This paper has some broad-ranging implications because there are a lot of types of problems out there where this type of contact aging may be really important," said Furst.

People in a wide range of industries could benefit from understanding the aging process of materials of this type, which includes cements, clays, soils, inks, paints, and more.

The researchers used a variety of methods to explore aging in silica and polymer latex suspensions. Initial experiments showed that the microstructure of the materials does not change over time. If the particles don't change positions, the team thought, then something must be happening between them.

In previous experiments, Furst has used laser tweezers -- a focused laser beam used to manipulate, bend, and break microscopic structures of particles -- which proved to be the right experimental setup for spelunking this particular problem. Francesco Bonacci, then a doctoral student in France, visited UD to conduct laser tweezer experiments and study the stiffness of bonds in the silica and latex materials under investigation. These experiments enabled the discovery of contact aging.

Additional experiments suggested genericity -- that the results are likely to apply to a wide variety of dense colloidal suspensions.

For Furst, this project is an example of the power of collaborating with experts around the world.

"This was the result of an incredible international collaboration, just a beautiful team," he said. The co-authors on the paper include Bonacci, Xavier Chateau, Julie Goyon, Jennifer Fusier, and Anaël Lemaître.

Credit: 
University of Delaware

When predictions of theoretical chemists become reality

image: Honeycomb-kagome structure

Image: 
Yu Jing

Ultrathin materials are extremely interesting as building blocks for next-generation nanoelectronic devices, as it is much easier to make circuits and other complex structures by shaping 2D layers into the desired forms. Thomas Heine, Professor of Theoretical Chemistry at TU Dresden, works on the prediction of such innovative materials. Their properties can be precisely calculated using modern methods of computational chemistry, even before they have been realized in the laboratory.

This research is particularly interesting for 2D polymers: their lattice type is defined by the shape of their building blocks, and those can be selected from the almost infinite variety of planar organic molecules that match the required structure. A particularly interesting example is the kagome lattice, which consists of the corners and edges of a trihexagonal tiling. In 2019, Yu Jing and Thomas Heine proposed synthesizing such 2D polymers from triangular organic molecules (so-called triangulenes). These materials have a combined honeycomb-kagome structure (see figure). Their calculations suggest that these 2D structures combine the properties of graphene (quasi-massless charge carriers) with those of superconductors (flat electronic bands).
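
The flat-band prediction can be illustrated with a generic nearest-neighbour tight-binding model of the kagome lattice, a textbook calculation rather than the authors' full treatment: diagonalizing the 3x3 Bloch Hamiltonian yields one momentum-independent (flat) band alongside two dispersive, graphene-like bands.

```python
import numpy as np

# Nearest-neighbour tight-binding model of the kagome lattice (a textbook
# model, not the calculation from the Nature Materials paper). The three
# sites per unit cell give a 3x3 Bloch Hamiltonian at each momentum k.
t = 1.0                               # hopping amplitude (arbitrary units)
a1 = np.array([1.0, 0.0])             # lattice vectors
a2 = np.array([0.5, np.sqrt(3) / 2])
d = [a1 / 2, a2 / 2, (a2 - a1) / 2]   # half-bond vectors between sublattices

def bands(k):
    c = [np.cos(k @ dv) for dv in d]
    H = -2.0 * t * np.array([[0.0,  c[0], c[1]],
                             [c[0], 0.0,  c[2]],
                             [c[1], c[2], 0.0]])
    return np.linalg.eigvalsh(H)      # eigenvalues in ascending order

# Sample random momenta: with this sign convention the highest band sits
# at exactly +2t everywhere (the flat band), while the lower two disperse
# like graphene's bands, with Dirac cones at the zone corners.
ks = np.random.default_rng(1).uniform(-np.pi, np.pi, size=(500, 2))
top = np.array([bands(k)[-1] for k in ks])
print("flat-band spread:", float(top.max() - top.min()))  # ~1e-15, i.e. flat
```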

Now the Italian materials scientist Giorgio Contini and his international team have succeeded in synthesizing this 2D honeycomb-kagome polymer, as published in Nature Materials earlier this week. An innovative surface-synthesis method made it possible to produce crystals of such high quality that they were suitable for the experimental characterization of electronic properties. Indeed, the predicted topological properties were revealed. Thus, for the first time, it has been experimentally proven that topological materials can be realized via 2D polymers.

Research on 2D polymers is thus placed on a solid footing. The kagome lattice described here is only one example out of hundreds of possibilities for connecting planar molecules into regular lattices. For some of these variants, other interesting electronic properties have already been predicted theoretically. This opens up numerous new possibilities for theorists and experimentalists in chemistry and physics to develop materials with previously unknown properties.

For the renowned journal Nature Materials, this was the occasion to invite Thomas Heine to write a News and Views article, which was published this week. Under the title "Making 2D Topological Polymers a reality", Prof. Heine describes how his theory became reality.

Prof. Heine explains: "These results show that 2D polymers can be materials with useful electronic properties, although their structures are much more wide-meshed than regular electronic materials, with distances of more than one nanometer between the lattice points. The prerequisite is that the materials are of excellent structural quality. This includes high crystallinity and a very low defect density. Another important contribution of the colleagues led by Prof. Contini is that, although the 2D polymers were produced on a metal surface, they can be detached and transferred to any other substrate, such as silicon oxide or mica, and thus be incorporated into electronic devices."

Credit: 
Technische Universität Dresden

Parasitic wasp discovery offers chemical-free pest control for growers

A species of parasitic wasp discovered by chance could provide growers with a chemical-free way of controlling a major pest.

Researchers made the discovery when the wasps appeared mysteriously in colonies of cabbage stem flea beetles (CSFB) they were studying to test feeding preferences on oilseed rape.

The wasps appeared even though the beetles were confined to potted oilseed rape plants inside micro-perforated bags.

Further exploration revealed that the colonies of around 3,000 beetles, collected from three sites around Norfolk, had been parasitised by a wasp that lays its eggs inside the beetle's body.

Genetic sequencing and enquiries by the Natural History Museum, UK, and the Swedish Museum of Natural History identified the wasp as an obscure species called Microctonus brassicae, first reported in 2008 and not recorded again until now.

The study, carried out by researchers at the John Innes Centre, is the first description of this parasitoid of the adult CSFB to be published in English.

Experiments showed that, under controlled conditions, the presence of wasps in sufficient numbers led to the collapse of CSFB colonies.

Beetle hosts are rendered sterile and die after the wasp larvae, having passed through the host's digestive system, emerge from its body. The short generation time of 43.5 days from egg to adult (roughly eight generations a year) means it would be possible to rapidly rear multiple generations in controlled conditions.

The study raises the possibility of employing Microctonus brassicae and other genetically similar species of parasitic wasp as a biocontrol to protect oilseed rape and a range of commercially important crops prone to attack by CSFB.

The beetle is a major threat to oilseed rape, particularly the winter crop, throughout the UK and Europe. It causes characteristic damage known as "shot-holing" to leaves, often resulting in crop failure or poor crop establishment.

The beetle has become a prominent pest in the UK, particularly in East Anglia and surrounding counties, following the European ban on the use of neonicotinoid seed treatments in flowering crops.

The ban on these and other systemic pesticides followed research linking their use to declines in pollinators. Further legislation in 2019 extended the ban to other broad-spectrum pesticides.

With the removal of seed treatments for oilseed rape, the numbers of CSFB and the damage they cause have increased. Figures for 2014 put the damage at £23 million, with an approximate loss of 3.5% of the national crop area of winter oilseed rape to CSFB.

The estimated best-case crop production for 2020/21 is 1.26 million tonnes, a year-on-year decline of 489,000 tonnes (roughly 28%), putting the future of the valuable UK rapeseed crop in doubt.

Using beneficial insects for biocontrol has been investigated in the past, with five species of parasitoid wasp found to target CSFB. But these demonstrated limited effectiveness and were considered economically unnecessary while the now-banned pesticides were in use.

In captivity in this study, the rate of parasitism by M. brassicae was greater than 44%. The research suggests that the wasp may have the potential to deliver positive effects under field conditions.
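
As a back-of-the-envelope illustration only (not a model from the study): if parasitised beetles are sterilised, a fraction p of each generation leaves no offspring, so a colony whose unparasitised net reproduction rate is R shrinks whenever R*(1 - p) < 1. The reproduction rate below is hypothetical; only the 44% parasitism rate comes from the study.

```python
# Back-of-the-envelope host-parasitoid sketch (illustrative assumptions,
# not a model from the study): parasitised beetles are sterilised, so a
# fraction p of each generation leaves no offspring.

R = 1.6     # net offspring per unparasitised beetle (hypothetical)
p = 0.44    # parasitism rate observed in captivity (from the study)

n = 3000.0  # starting colony size, matching the reared colonies
for gen in range(1, 11):  # ~10 generations at 43.5 days each
    n *= R * (1.0 - p)    # growth factor 0.896 < 1 -> steady decline
    print(f"generation {gen:2d}: ~{n:6.0f} beetles")
```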

Further research has been performed by Rothamsted Research to look at wasp presence and parasitism levels across the UK. Agricultural practices such as the promotion of field margins, beetle banks, and conservation headlands may provide habitats to support the beneficial parasitoids.

"Something that was initially very annoying leading to the collapse of our research colonies has turned out to be fortunate," says lead author Dr Rachel Wells of the John Innes Centre. "It offers the possibility of using parasitoid wasps as bio-controls for farmers and growers of oilseed rape and brassica vegetables against cabbage stem flea beetle as part of an integrated pest management approach."

The research, "The potential of the solitary parasitoid Microctonus brassicae for the biological control of the adult cabbage stem flea beetle, Psylliodes chrysocephala", is published in the journal Entomologia Experimentalis et Applicata.

Credit: 
John Innes Centre