Tech

Proenkephalin (penKid®) included in the ADQI consensus statements as a functional kidney biomarker for the management of AKI patients

The latest consensus meeting of international experts in critical care and nephrology supports the use of novel biomarkers in the prevention and management of Acute Kidney Injury (AKI)

The consensus recommends using a combination of damage and functional biomarkers together with clinical information for routine practice

Proenkephalin (penKid®), the kidney function biomarker, proposed as a marker for the assessment of AKI progression and kidney recovery

The CE-IVD marked assay for penKid® is available for point-of-care use on the fully automated Nexus IB10 platform

Hennigsdorf/Berlin, Germany, December 16, 2020 - Diagnostics company SphingoTec GmbH ("SphingoTec") announced today that the Acute Disease Quality Initiative (ADQI) recommends the use of novel biomarkers for AKI, including functional biomarkers such as penKid®. Since AKI affects 1 in 3 Intensive Care Unit (ICU) patients [1], and the current standard-of-care diagnostics have considerable sensitivity and specificity limitations, there is an urgent need to implement new biomarkers that support better management of AKI.

The current consensus recommendations [2] support clinicians in making more informed decisions and improving outcomes with biomarker-guided management of AKI patients, including triage, diagnosis, and guidance of therapy. Among the main recommendations of the ADQI meeting is the use of novel biomarkers to assess AKI progression and kidney recovery. The consensus statements highlight the performance and added value of penKid® for predicting the duration and recovery of AKI. The evidence presented, based on the results of a multicenter trial [3], shows that penKid® concentration is significantly lower in AKI patients with improving kidney function than in patients without kidney recovery. The ADQI experts draw on additional data [4] to show that significantly higher penKid® levels indicate patients with major adverse kidney events, patients with persistent AKI, and those with worsening kidney function. Furthermore, the consensus statement underlines that penKid® identifies patients with worsening kidney function earlier than today's standard-of-care diagnostics.

Prof. Peter Pickkers (Radboud University, Nijmegen), a member of the ADQI, explained: "Since the last evaluation of novel AKI biomarkers 9 years ago, we have now collected enough evidence to consider the use of functional and damage biomarkers in the prediction and management of AKI. Although many novel biomarkers can measure damage that has already occurred in the kidneys, there are few options available for measuring kidney function. Besides Cystatin C, penKid is the only novel functional biomarker available for clinical routine practice."

The kidney function biomarker penKid® has previously been validated in over 40,000 patients, and published data [5] demonstrate that penKid® can detect the presence and severity of AKI and enables the identification of patients at high risk of unfavorable outcomes. Moreover, previous findings from AdrenOSS-1, a 24-center study, show that penKid® not only diagnoses AKI earlier than today's standard of care but also indicates renal recovery [4]. The utility of penKid® has been demonstrated in both adult and pediatric populations [6].

Dr. Andreas Bergmann, CEO and founder of SphingoTec, stated: "We are excited that penKid® has been recognized by ADQI as a suitable functional kidney biomarker. Not only does penKid® reflect kidney function and true GFR independently of inflammation and comorbidities, but unpublished clinical data could also complement the consensus recommendation to further evaluate the functional biomarker's role in defining the optimal timing for initiating and stopping kidney replacement therapy."

To support timely treatment decisions that are likely to improve the management of critical care patients, SphingoTec makes the CE-IVD marked assay for penKid® available on its proprietary Nexus IB10 platform. The fully automated point-of-care analyzer uses whole blood, delivers results in only 20 minutes, and can be flexibly deployed in near-patient as well as laboratory settings.

Credit: 
sphingotec GmbH

New salmonella proteins discovered

image: The combination of different bioinformatics methods has brought new small proteins from salmonella to light.

Image: 
Image: Sandy Westermann / SCIGRAPHIX

Salmonella are bacteria that can cause food poisoning with severe diarrhoea. If they penetrate from the intestine into the blood system, this can lead to sepsis, life-threatening inflammatory reactions in the entire organism. Since salmonellae are also becoming increasingly resistant to antibiotics, new approaches are being sought to combat them.

In the new research journal microLife, an international research team led by scientists from Würzburg shows how this search can succeed.

More than 100 new proteins found

In a bioinformatic reassessment of the Salmonella genome, the team led by JMU doctoral student Elisa Venturini identified many unknown small proteins that may play a crucial role in infection. As a result, the number of known small Salmonella proteins has grown by 139 to over 600.

The small protein MgrB, which consists of 47 amino acids, stood out in the analyses. If the gene containing the blueprint for this protein is switched off, the salmonellae can no longer infect human cells. Although the protein had been studied before, this important function had not been recognised. This has only now been achieved thanks to a new combinatorial approach. Among other things, three data sets that had been generated in earlier infection studies were used for this purpose.

Blueprint for other bacteria too?

"Hopefully our approach will provide a blueprint that can also be applied to other organisms for which data sets already exist," says Venturini. The study has clearly shown that the method can still bring new relevant genes to light even in comprehensively studied organisms such as salmonella: The scientific community now has a priority list of previously unknown infection-related small salmonella proteins for further investigation.

Credit: 
University of Würzburg

Flexible and powerful electronics

Tsukuba, Japan - Researchers at the University of Tsukuba have created a new class of carbon-based electrical devices, π-ion gel transistors (PIGTs), by using an ionic gel made of a conductive polymer. This work may lead to cheaper and more reliable flexible printable electronics.

Organic conductors, which are carbon-based polymers that can carry electrical currents, have the potential to radically change the way electronic devices are manufactured. These conductors have properties that can be tuned via chemical modification and may be easily printed as circuits. Compared with current silicon solar panels and transistors, systems based on organic conductors could be flexible and easier to install. However, their electrical conductivity can be drastically reduced if the conjugated polymer chains become disordered because of incorrect processing, which greatly limits their ability to compete with existing technologies.

Now, a team of researchers led by the University of Tsukuba have formulated a novel method for preserving the electrical properties of organic conductors by forming an "ion gel." In this case, the solvent around the poly(para-phenyleneethynylene) (PPE) chains was replaced with an ionic liquid, which then turned into a gel. Using confocal fluorescent microscopy and scanning electron microscopy, the researchers were able to verify the morphology of the organic conductor.

"We showed that the internal structure of our π-ion gel is a nanofiber network of PPE, which is very good at reliably conducting electricity" says author Professor Yohei Yamamoto.

In addition to acting as wires for delocalized electrons, the polymer chains direct the flow of mobile ions, which can help move charge-carriers to the carbon rings. This allows current to flow through the entire volume of the device. The resulting transistor can switch on and off in response to voltage changes in less than 20 microseconds--which is faster than any previous device of this type.

"We plan to use this advance in supramolecular chemistry and organic electronics to design a whole arrange of flexible electronic devices," explains Professor Yamamoto. The fast response time and high conductivity open the way for flexible sensors that enjoy the ease of fabrication associated with organic conductors, without sacrificing speed or performance.

Credit: 
University of Tsukuba

Physicists solve geometrical puzzle in electromagnetism

A team of scientists have solved the longstanding problem of how electrons move together as a group inside cylindrical nanoparticles.

The new research provides an unexpected theoretical breakthrough in the field of electromagnetism, with promising prospects for metamaterials research.

The team of theoretical physicists, from the University of Exeter and the University of Strasbourg, created an elegant theory explaining how electrons move collectively in tiny metal nanoparticles shaped like cylinders.

The work has led to a new understanding of how light and matter interact at the nanoscale, and has implications for the realization of future nanoscale devices exploiting nanoparticle-based metamaterials with spectacular optical properties.

Metallic nanoparticles have a positively charged ionic core, with a cloud of negatively charged electrons swirling around it. When light is shone on such a metallic object, the electronic cloud is displaced.

This displacement causes the whole group of electrons to be set into oscillation about the positive core. The group of electrons sloshing back and forth behaves like a single particle (a so-called quasiparticle), known as a "plasmon".

The plasmon is primarily characterized by the frequency at which it oscillates, which is known as the plasmon resonance frequency.

Exploring how the resonance frequency of the plasmon changes depending on the geometry of its hosting nanoparticle is a fundamental task in modern electromagnetism. It is commonly thought that only some particular nanoparticle geometries can be described with analytical theory - that is, without recourse to heavy, time-consuming numerical computations.

The list of geometries permitting an analytical description is widely believed to be very short, being composed of only spherical and ellipsoidal nanoparticles.

This fact is highly inconvenient due to the experimental ubiquity of cylindrical nanoparticles, which arise in a variety of aspect ratios from long, needle-like nanowires to thin, pancake-like nanodisks.
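To make "analytical description" concrete: for the simplest case, a small Drude-metal sphere in vacuum, the textbook closed-form result (included here for context; it is not a result of the new paper) is

```latex
\omega_{\mathrm{sphere}} = \frac{\omega_p}{\sqrt{3}},
\qquad
\omega_p = \sqrt{\frac{n e^2}{m \varepsilon_0}},
```

where the bulk plasma frequency is fixed by the electron density n, charge e and effective mass m. No comparably compact formula was available for cylinders of arbitrary aspect ratio; that is the gap the new theory fills.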

In the new study, the researchers addressed how plasmons in cylindrical nanoparticles oscillate. Using a theoretical technique inspired by nuclear physics, they built an elegant analytic theory describing the behaviour of plasmons in cylinders with an arbitrary aspect ratio.

The theory has enabled a complete description of cylindrical plasmonic nanoparticles, describing simply the plasmonic resonance in metallic nanoparticles from nanowires to circular nanodisks.

The two condensed matter theorists also considered the plasmonic response of a pair of coupled cylindrical nanoparticles and found quantum mechanical corrections to their classical theory, which is relevant due to the small, nanometric dimensions of the nanoparticles.

Dr Charles Downing from the University of Exeter's Physics and Astronomy department explains: "Quite unexpectedly, our theoretical work provides deep, analytic insight into plasmonic excitations in cylindrical nanoparticles, which can help to guide our experimental colleagues fabricating metallic nanorods in their laboratories".

Guillaume Weick from the University of Strasbourg adds: "There is a trend for increasing reliance on heavy duty computations in order to describe plasmonic systems. In our throwback work, we reveal humble pen-and-paper calculations can still explain intriguing phenomena at the forefront of metamaterials research".

The theoretical breakthrough is of immediate utility to a swathe of scientists working with nano-objects in the cutting edge science of plasmonics. Longer term, it is hoped that plasmonic excitations can be exploited in the next generation of ultra-compact circuitry, solar energy conversion and data storage as our technology becomes increasingly miniaturized.

Credit: 
University of Exeter

Study of dune dynamics will help scientists understand the topography of Mars

image: Researchers at the University of Campinas conducted more than 120 experiments with dunes of up to 10 cm that interact for a few minutes, obtaining a model valid for dunes on the surface of Mars that are many miles long and take more than a thousand years to interact

Image: 
Agência FAPESP

Barchans are crescent-shaped sand dunes whose two horns face in the direction of the fluid flow. They appear in different environments: inside water pipes or on river beds, where they take the form of ten-centimeter ripples; in deserts, where they can exceed 100 meters; and on the surface of Mars, where they can be a kilometer in length or more. Just as their size varies greatly, so does the time they take to form and interact, with orders of magnitude ranging from a minute for small barchans in water to a year for large desert formations and a millennium for the giants on Mars.

They are formed by the interaction between the flow of a fluid, such as gas or liquid, and granular matter, typically sand, under predominantly unidirectional flow conditions (read more at: agencia.fapesp.br/29178). 

“What’s interesting is the similarity of their formation and interaction dynamics, regardless of size. As a result, we can study aquatic barchans in the laboratory to make predictions about the evolution of the dunes in Lençóis Maranhenses [a coastal ecosystem in the Northeast of Brazil] or to investigate the origins of the topography in the Hellespontus region on Mars,” said Erick Franklin, a researcher and professor at the University of Campinas’s School of Mechanical Engineering (FEM-UNICAMP) in the state of São Paulo, Brazil.

Working with his PhD student Willian Righi Assis, Franklin performed more than 120 experiments and identified five basic types of interaction between barchans. 

The study, conducted entirely at UNICAMP, is reported in an article published in the journal Geophysical Research Letters. It was supported by FAPESP via a Phase 2 Young Investigator Grant awarded to Franklin and a direct doctorate scholarship awarded to Assis.

A striking aspect of the topic is that as well as having a robust shape that appears in many different environments, barchans typically form corridors in which their sizes are approximately the same. Analysis of individual dunes suggests they should grow indefinitely, becoming steadily larger, but this is not the case. One explanation for their characteristic size in a given environment is that binary interactions, especially collisions, redistribute the mass of sand, and instead of growing continuously they subdivide into smaller dunes.

“This has been proposed in the past, but no one had extensively tested and mapped these interactions, as dune collisions take decades to happen in terrestrial deserts,” Franklin said. “Taking advantage of the fact that underwater barchans are small and move much faster, we conducted experiments in a hydrodynamic channel made of transparent material, with turbulent water flow forming and transporting pairs of barchans while a camera filmed the process. We identified for the first time the five basic types of binary interaction.”

In the experiments, the researchers independently varied each of the parameters involved in the problem, such as grain diameter, density and roundness, water flow velocity, and initial conditions. The images acquired were processed by computer using a numerical code written by the researchers. Based on the results, they proposed two maps that supply a general classification of the possible interactions.
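The article does not reproduce the group's code, but the kind of frame-by-frame processing described can be sketched roughly as follows: threshold each video frame to separate sand from the channel bed, label the connected regions, and track each dune's centroid over time. The pipeline below is an illustrative guess using the scikit-image library, not the UNICAMP code.

```python
import numpy as np
from skimage import filters, measure

def dune_centroids(frame, min_area=100):
    """Locate barchans in one grayscale video frame of the channel bed."""
    threshold = filters.threshold_otsu(frame)   # separate sand from bed
    labels = measure.label(frame > threshold)   # connected sand regions
    return [r.centroid for r in measure.regionprops(labels)
            if r.area >= min_area]              # one (row, col) per dune

# Chaining dune_centroids over all frames yields trajectories from which
# dune velocities, gaps, and collision outcomes can be measured.
```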

“Our experiments showed that when a binary collision occurs, the barchan that was originally downstream, i.e. in front, expelled a dune of an approximately equal mass to that of the barchan upstream, i.e. behind,” Franklin said. “The first impression was that the upstream barchan passed over the other barchan like a wave, but the use of colored grains helped us show this didn’t happen. Actually, the upstream barchan entered the downstream barchan, which became too large and released a mass more or less equal to the mass received.”

Interactions between the two barchans basically involved two mechanisms. One was the disturbance caused in the fluid: the flow bypassed the upstream barchan, accelerated, and impacted the downstream barchan, eroding it. This is termed the "wake effect". The other was the collision, in which the colliding barchans' grains merged.

“Our experimental data showed that these two mechanisms caused five types of barchan-barchan interaction,” Franklin said. “Bearing in mind that the velocity of a dune is inversely proportional to its size, the simplest two are what we call chasing and merging.”

Chasing occurs when the two barchans are roughly the same size and erosion due to the wake effect makes the downstream dune decrease in size. The two barchans then move at the same velocity and remain at a constant distance from each other. Merging happens when the upstream barchan is much smaller than the downstream barchan. Erosion caused by the wake does not substantially decrease the size of the upstream dune, so that the barchans collide and merge, forming a single dune.

The third type of interaction is exchange, which is more complicated. “Exchange happens when the upstream barchan is smaller than the downstream barchan, but not much smaller. Here, too, the upstream dune catches up with the downstream dune and they collide. As they do so, the smaller dune ascends and spreads over the larger one. During this process, however, the fluid flow, which is deflected by the new dune, strongly erodes the front of the dune, which ejects a new dune. Because it is smaller and emerges downstream, the new dune moves faster and a gap opens up between the two dunes,” Franklin said.

The last two types of interaction happen when fluid flow is very strong. “What we call ‘fragmentation-chasing’ is when the dunes are of different sizes. The wake effect on the downstream dune is so strong that it splits into two. Both the resulting dunes are smaller than the upstream dune. The result is three dunes with gaps widening between them. The last type is ‘fragmentation-exchange’, which is similar. The difference is that the upstream dune reaches the downstream dune before its division into two is complete,” Franklin said.

The researchers were able to construct this typology thanks to the visual support afforded by the movies described in the article, which make the five types easy to understand. "Our results, obtained for subaqueous barchans that were centimeters in length and developed in minutes, significantly advance the understanding of the dynamics and formation of this type of dune," Franklin said. "Through laws of scale, they enable us to transpose the findings to other environments, where sizes are larger and timespans longer. Understanding the past of Mars or projecting its distant future, both of which are currently of interest to scientists, could be greatly facilitated by these findings."

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Semiconductor material analysis made possible with artificial intelligence

image: (A) Data generation process showing the sampled spin configurations generated through the simulated annealing process. The color wheel indicates the in-plane magnetization directions, and the grayscale indicates the out-of-plane magnetization directions. (B and C) The training and testing processes used in this study. (D) The additional validation process with experimentally observed magnetic domain images.

Image: 
Korea Institute of Science and Technology (KIST)

Studies on spintronics, which exploits the intrinsic spin of electrons alongside their charge in electronic engineering, are being actively conducted to address the integration limits of the silicon semiconductors currently in use and to develop ultra-low-power, high-performance next-generation semiconductors. Magnetic materials are among the most commonly used materials for developing spintronic devices such as magneto-resistive random-access memory (MRAM). It is therefore essential to accurately identify various properties of these magnetic materials, such as thermal stability, dynamic behaviors, and the ground-state configuration, through analysis of the magnetic Hamiltonian and its parameters.

Previously, the magnetic Hamiltonian parameters were measured directly through various experiments in order to acquire a more accurate and deeper understanding of the properties of magnetic materials, and such processes required extensive amounts of time and resources.

To overcome these limitations, researchers in South Korea have developed an artificial intelligence (AI) system that can analyze magnetic systems in an instant. The Korea Institute of Science and Technology (KIST) reported that a collaborative research team led by Dr. Heeyong Kwon and Dr. Junwoo Choi from the Spin Convergence Research Center and Professor Changyeon Won from Kyung Hee University developed a technique for estimating magnetic Hamiltonian parameters from spin-structure images using AI.

They constructed a deep neural network and trained it with machine learning algorithms on existing magnetic domain images. As a result, the magnetic Hamiltonian parameters could be estimated in real time by inputting spin-structure images obtained from an electron microscope. Furthermore, when compared with experimentally determined parameter values, the estimation errors of the AI system were less than 1%, indicating high accuracy. According to the team, the AI system uses deep learning to complete in an instant a material parameter estimation process that previously took up to tens of hours.
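The release does not give the network's architecture, but the basic scheme (a convolutional network that maps a magnetic domain image to a handful of Hamiltonian parameters, trained by regression on simulated spin configurations) can be sketched as below. The layer sizes, the 64x64 single-channel input and the three-parameter output are illustrative assumptions, not the KIST design.

```python
import torch
import torch.nn as nn

# Sketch: a CNN that regresses magnetic Hamiltonian parameters
# (e.g. exchange, DMI and anisotropy strengths) from a spin image.
class HamiltonianRegressor(nn.Module):
    def __init__(self, n_params: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 128), nn.ReLU(),
            nn.Linear(128, n_params),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, 64, 64) magnetization maps
        return self.head(self.features(x))

model = HamiltonianRegressor()
images = torch.randn(8, 1, 64, 64)            # stand-in for simulated domains
estimates = model(images)                     # (8, 3) parameter estimates
loss = nn.functional.mse_loss(estimates, torch.zeros(8, 3))  # regression loss
```

In practice such a model would be trained on spin configurations generated by simulated annealing (as the figure caption describes) and then validated against experimentally observed domain images.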

"We presented a novel approach on how AI technologies can be implemented to analyze the properties of magnetic systems," Dr. Hee-young Kwon at KIST said. "We expect that new methods for studying physical systems using such AI technologies will be able to reduce the gap between experimental and theoretical aspects, and will further lead to expanding a new research field called convergence of AI technology and fundamental science research."

Credit: 
National Research Council of Science & Technology

Energy transition at the crossroads: New topical issue in Russian Journal of Economics

image: Three Czarevnas of Underground Kingdom (1879). A painting by Viktor Vasnetsov. The painting is based on the Russian fairy tale The Three Kingdoms: Copper, Silver, and Golden

Image: 
Viktor Vasnetsov; Tretyakov Gallery

Titled "Energy transition at the crossroads", the new issue of the Russian Journal of Economics gets a set of profound messages across, which can be summarized as: "transition matters, transition goes, yet transition is not a simple, unified march towards a Green future".

Together, the seven articles in the issue bring forward the notion that the world is a diverse place in terms of resources, population growth, human capital, development and political agendas. However, the path to the energy transition--an integral part of the United Nations' Sustainable Development Goals (SDGs)--is one every nation needs to face in the wake of multiple, unprecedented and simultaneous global crises: health, economic, environmental, political and humanitarian.

"All papers look into substantive issues that have emerged following key, global decisions made in recent years. It's high time that we stop persuading each other into taking the Green path and, instead, turn to the actual problems, costs and obstacles on the way," says the issue's editor Prof. Leonid Grigoryev, HSE University, Moscow, Russia.

The collection of articles analyses the outcomes of recent trends in the field of global energy, as well as the mechanisms behind the dramatic changes in the business world, public attitudes and government policies.

"Gazing into the crystal ball right now may be a question of analyzing the casualties from the 2020 recession and the interaction with the COVID-19 pandemic, as well as the logic of interests and intentions of all parties and actors involved in the decision-making process. This is certainly an 'interesting time' -- as the oriental curse allegedly says," comments Grigoryev.

The first article in the issue, "Long-term development of the global energy sector under the influence of energy policies and technological progress", is authored by renowned energy economists at the Energy Research Institute at the Russian Academy of Sciences and the Energy Centre of the Moscow School of Management SKOLKOVO: Alexey Makarov, Tatyana Mitrova and Vyatcheslav Kulagin. It addresses the current period of transformation in the world energy sector, defined by the emergence of a whole range of cost-effective technologies and the formation of new state priorities that can radically change the structure of energy use. The researchers use a complex forecasting model to predict how the world energy markets will develop in the period up to 2040. Among their estimates is that oil and coal will pass their consumption peaks before 2040. That will lead not only to a radical change in the price environment of energy markets, but also to a transformation of the way they are organized and regulated, as well as to a revision of the business models of most energy companies.

The second article, authored by Emre Hatipoglu, Saleh Al Muhanna and Brian Efird and titled "Renewables and the future of geopolitics: Revisiting main concepts of international relations from the lens of renewables", covers a diverse scope of research. It presents a review of the geopolitical, institutional, and technological aspects of the development of renewable energy sources, including the transportation and delivery of energy across national borders. With their work, the research team at the King Abdullah Petroleum Studies and Research Center (Saudi Arabia) warns that the global issues currently linked to the use of non-renewable sources are most likely to remain after the energy transition. These include security, export interdependence, and the availability of source materials.

The third article focuses on the energy transition in the European Union, tracing the evolution from the European Green Package to the New Green Deal. The paper is authored by Manfred Hafner and Pier Paolo Raimondi, affiliated with Fondazione Eni Enrico Mattei (Italy), Johns Hopkins University SAIS Europe (Italy) and the Sciences Po Paris School of International Affairs (France). The authors give a full-fledged analytical account of the growing ambition of the EU to lead the global transition to a climate-neutral world. They also argue that the transition will impact the external relations of the EU, for example with Russia, and suggest that the two blocs can preserve their energy relationship in light of the energy transition, notably through the conversion of natural gas to hydrogen and the storage of the resulting CO2.

The fourth article, "The role of gases in the European energy transition" by Prof. Jonathan Stern (Oxford Institute for Energy Studies, United Kingdom), suggests an interesting approach to the role of gases in the global economy, with a focus on the EU. He bases his evaluation of the next three decades on the forecast for global gas demand, which is expected to peak by 2030 in North America, Eurasia and China, and then diminish by 2050. Demand for liquefied natural gas (LNG) is predicted to increase, given its relatively low prices. Later on, however, LNG producers will need "revolutionary" decarbonization technologies as well.

In the fifth article by Kirsten Westphal (German Institute for International and Security Affairs): "German-Russian gas relations in face of the energy transition", the author reviews the subject of energy transition from a few different perspectives. Firstly, she discusses it as a part of the historically tested alliance between Russia and Germany, developed and framed over time. Then, she refers to it as an "energy diplomacy" case. "This would require a political shift away from securitization to decarbonization, not only in Germany, but even more so in the EU, and in particular, in Russia," concludes Westphal.

The sixth article, "Fossil fuels markets in the 'energy transition' era", is authored by Vyatcheslav Kulagin, Dmitry Grushevenko and Nikita Kapustin of the Energy Research Institute at the Russian Academy of Sciences. In their study, the researchers investigate the long-term impact of the energy transition and related processes on the markets for the key fossil fuels: oil, natural gas, and coal. In addition, they discuss important cases, such as traditional versus electric cars, with subsidies also factored in. Overall, the article can be seen as a "technology-friendly" one that nevertheless avoids overoptimistic expectations about efficiency and decarbonization. One may call this approach to the energy transition "optimism through a rational lens".

The last article in the special issue, "Global energy trilemma" by Leonid Grigoryev and Dzhanneta Medzhidova (HSE University and Primakov National Research Institute of World Economy and International Relations, Russia), brings us back to the profound interaction between growth, poverty, inequality and the problems concerning energy transition and climate change. Essentially, the authors pose a rather simple and straightforward question: if the EU succeeds in its fast decarbonization program by 2030-2050, but the globe ends up with another billion people suffering from poverty (including energy poverty), would this be a satisfactory outcome according to the UN, as outlined in the SDGs? This problem is referred to as the "global energy trilemma", which sets poverty and inequality, growth, and energy and climate against each other. The authors also point out that reaching the UN Agenda 2030 will be very difficult, given the consequences of the 2020 recession. In the aftermath of the COVID-19 pandemic, financial resources for development and energy transition are expected to be heavily diverted to inevitable health care reforms, while investments are declining on a global level. The researchers conclude that the solution lies in coordinated actions aimed at avoiding the potential aggravation of global social problems.

Credit: 
Pensoft Publishers

Dartmouth-led research featured in national journal focused on health system performance

New findings published by Dartmouth researchers are featured in a special issue of Health Services Research--a top journal that reports on original investigations that enhance understanding of the healthcare field and help improve the health of individuals and communities.

The special issue focuses on new research highlighting the role and impact of integrated health systems in the U.S.--a type of organization that has grown dramatically in recent years and includes hospitals and outpatient physician practices.

In the paper, "Organizational Integration, Practice Capabilities and Outcomes in Clinically Complex Medicare Beneficiaries," Carrie Colla, PhD, a professor of The Dartmouth Institute for Health Policy and Clinical Practice at the Geisel School of Medicine, and colleagues assessed the association between clinical integration and financial integration, quality?focused care delivery processes, and beneficiary utilization and outcomes.

Physician practices have become increasingly financially integrated through ownership relationships in recent years. Health system leaders have argued this financial integration improves patient care through better clinical integration.

However, the results of the study show that financial integration and clinical integration were not well correlated, and that financial integration had few positive effects on health or spending indicators. "We found that within multi-physician practices, higher levels of financial integration were not associated with improved care delivery or better outcomes for Medicare patients," explains Colla. "Greater clinical integration, however, was associated with greater use of quality-focused care delivery processes and less spending."

Clinical integration deserves greater attention, she says, as organizations with high clinical integration are more likely to adopt care processes to improve the experience and outcomes of patients.

Colla's paper is one of 10 peer-reviewed journal articles featured in the special issue that together examine health systems from multiple perspectives, including impacts on quality, costs, and care delivery.

While collectively the findings show that healthcare systems as they currently exist are not yet a cure for the problems that ail healthcare across the country, Colla's paper and the others, including recently published research by Dartmouth professor Elliott Fisher, are leading to new understandings about the prevalence, roles, and impacts of integrated health systems.

The U.S. Agency for Healthcare Research and Quality (AHRQ)--the lead federal agency responsible for improving the safety and quality of the U.S. healthcare system--supported this research by creating the Comparative Health System Performance (CHSP) initiative.

Dartmouth is one of three centers of excellence established by the CHSP that are tasked with taking different approaches to studying the landscape, impact, and trends across health systems--including factors that impact use of evidence-based practices in care delivery.

The Dartmouth Institute for Health Policy and Clinical Practice is a world leader in studying and advancing models for disruptive change in healthcare delivery. The work of Dartmouth Institute faculty and researchers includes developing the concept of shared decision-making between patients and healthcare professionals, creating the model for Accountable Care Organizations (ACOs), and introducing the game-changing concept that more healthcare is not necessarily better care.

Credit: 
The Geisel School of Medicine at Dartmouth

Reversible superoxide-peroxide conversion drives high-capacity potassium-metal battery

image: (a) Theoretical output potential, specific capacity and energy density for KO2 and other typical K-ion battery cathodes. (b) XRD pattern and TEM images of KO2-RuO2@rGO cathode composites. The XRD pattern of KO2-rGO is shown for comparison. (c) Galvanostatic discharge/charge curves (initial cycle) of KO2-rGO (grey traces) and KO2-RuO2@rGO (blue traces) cathodes, respectively. (d) Typical discharge/charge curves of the KO2-RuO2@rGO cathode collected in a half-cell (coupled with a large excess of K-metal anode). The average overpotentials (vs. the thermodynamic potential of 1.94 V) upon charging and discharging are shown in the inset for clarity. The current density is fixed at 300 mA/g KO2.

Image: 
Science China Press

There is an urgent need for high-energy-density rechargeable batteries to satisfy the ever-growing demand for electrical energy storage devices. Triggering O-related anionic redox activity (e.g. in typical Li/Na/K-O2 batteries and Li/Na-rich cathodes) has been regarded as the most promising capacity-boosting strategy for batteries. However, the practical realization of Li/Na/K-O2 batteries, which require a gas-open cell architecture, is severely hampered by intrinsic drawbacks related to gaseous O2. For example, the porous air cathode is easily clogged by the solid O2-reduction products it hosts, so the energy stored in practice falls far below the theoretical value. Moreover, because of the phase changes between gaseous O2 and solid Li/Na/KxO, sluggish kinetics lead to a large round-trip overpotential. In addition, air-purification devices or O2 storage cylinders have to be fitted, further dragging down the achievable energy density.

In new research published in the Beijing-based National Science Review, scientists at Nanjing University in China and at the National Institute of Advanced Industrial Science and Technology in Japan present the realization of reversible superoxide-peroxide conversion in a K-based high-capacity rechargeable sealed battery device.

Co-authors Yu Qiao and Haoshen Zhou likewise outline the potential development directions and design principles of this reversible superoxide/peroxide (KO2/K2O2) interconversion in a KO2-based cathode system for potassium-ion battery (KIB) technology.

"Traditionally, A novel synergistical modification ideal is abandoning the utilization of gaseous O2, and controlling the high-energy-density oxygen-based redox reaction processes within the redox interconversion among different solid phases (a more commercialized sealed cell environment)." they state in an article titled "A high-capacity cathode for rechargeable K-metal battery based on reversible superoxide-peroxide conversion"

"In 2019, our group successfully trapped the O-related redox activity within a reversible peroxide-oxide (Li2O2/Li2O) interconversion stage (Nature Catalysis, 2019, doi.org/10.1038/s41929-019-0362-z), and achieved a novel high-energy-density Li-ion battery. However, to the best of our knowledge, the interconversion between superoxide-peroxide has not been reported yet, which is another potential high-energy-density redox candidate." Qiao and Zhou state. They also point out that "The most difficult point to realize the reversible superoxide-peroxide interconversion focused on the stabilization of superoxide."

"In this study, we originally achieved the reversible interconversion between superoxide (KO2) and peroxide (K2O2) in a rechargeable battery system, and made great improvements on the specific capacity for potassium-ion (K-ion) battery." they state.

They point out two key achievements of this work:

1) On the side of chemistry: for the first time, the reversible superoxide-peroxide interconversion has been realized, with irreversible oxygen loss (O2 evolution, etc.) successfully restrained. Employing advanced systematic operando spectroscopies (in-situ Raman, SERS and GC-MS), the authors found that the formation of the nucleophilic superoxo anion (O2-) was the chief culprit behind the irreversible redox behavior. Moreover, through a sharp comparison with a blank control group, they proved that hybridization between the RuO2 catalyst and K-deficient K1-xO2 induces the formation of a stable surface protection layer (confirmed by hard-XAS spectroscopy) that prevents oxygen loss. For the first time, the essential difference between the moderate superoxide and the dangerous superoxo anion (O2-) has been clarified. "We believe these findings are full of novelty from the viewpoint of chemistry," they state.

2) On the side of battery technology: the development of the practical K-ion battery is severely hindered by the limited capacity of cathode candidates (typically around 100 mAh/g). In this work, benefitting from the high-capacity KO2/K2O2 redox couple, the cathode capacity has been boosted to 300 mAh/g (a back-of-the-envelope capacity check follows below). This sealed battery system is entirely different from the previously reported gas-open K-O2 battery (O2/KO2 conversion). Moreover, the round-trip overpotential has been successfully restrained within 0.2 V (at a quite high current rate of 300 mA/g), indicating a high energy efficiency of around 90%. Reversible cycling was achieved for around 900 cycles, indicating remarkable long-term stability. Beyond the half-cell mode, a practical full cell with high energy density and superior cycle stability was also fabricated after electrolyte modification. "We believe these large improvements are of great significance at the practical battery level," Qiao and Zhou state.
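As a back-of-the-envelope check (our own arithmetic, not a figure from the paper): the KO2/K2O2 couple transfers one electron per formula unit of KO2, via KO2 + K+ + e- ⇌ K2O2, so with a KO2 molar mass of about 71.1 g/mol the theoretical cathode capacity is

```latex
Q_{\mathrm{theo}}
= \frac{F}{3.6\, M_{\mathrm{KO_2}}}
= \frac{96485\ \mathrm{C\,mol^{-1}}}{3.6 \times 71.1\ \mathrm{g\,mol^{-1}}}
\approx 377\ \mathrm{mAh\,g^{-1}},
```

so the 300 mAh/g reported here corresponds to roughly 80% of the one-electron theoretical value.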

"We believe that the demonstration of a superoxide/peroxide redox dominated battery system with ultralong cycle stability will open up a new gate and spur the development of more effective catalytic cathode frameworks." Qiao and Zhou state. "More importantly, the identification of the feasibility (from the mechanism/chemistry perspective) and the realization of the impressive cyclability (from the practical viewpoint) herein would stimulate the development of oxygen-based anionic redox activity in enhancing the energy density of rechargeable battery technologies." they added. "The desirable features revealed in the current battery system also triggers a design direction for synergistically combining the electrode and electrocatalyst materials, engineering the next-generation high-energy-density battery technologies."

Credit: 
Science China Press

New approach can improve COVID-19 predictions worldwide

video: Jonas Lybker Juul and Sune Lehmann give a brief overview of their paper explaining how the standard approach for summarizing ensembles of epidemic trajectories systematically suppresses critical epidemiological information.

Image: 
Sune Lehmann

"It is about understanding best and worst-case scenarios - and the fact that worst case is one of the most important things to keep track of when navigating through pandemics - regardless whether it be in Denmark, the EU, the USA or the WHO. If you are only presented with an average estimate for the development of an epidemic - not knowing how bad it possible can get, then it is difficult to act politically", says Professor Sune Lehmann, one of four authors of the article Fixed-time descriptive statistics underestimate extremes of epidemic curve ensembles just published in Nature Physics.

Researchers Jonas L. Juul, Kaare Græsbøll, Lasse Engbo Christiansen and Sune Lehmann, all from DTU Compute, act as advisors to the National Board of Health in Denmark during the corona crisis. Partly through their own experience as advisors, they have become aware that the existing methods of projecting the development of epidemics such as COVID-19 have a problem describing the extreme possibilities of the expected development.

Epidemics are unpredictable

"Disease outbreaks are fundamentally stochastic processes. The same disease introduced in the same population can infect a large number of people or disappear quickly without having a particular prevalence. It depends in part on coincidences," explains postdoc Jonas L. Juul.

It is precisely this unpredictability that makes it so difficult to make the right decisions across society when an epidemic hits. How many beds and respirators will be needed? And how much can we reduce this demand by enforcing restrictions?

However, the general unpredictability is just one of many problems in estimating the development of an epidemic.

"It is not just the unpredictable nature of epidemics that makes it difficult to predict their course - it is also our lack of knowledge about the disease's characteristics and prevalence in society at any given time. Just to give a few concrete examples of this: there is typically no one who has any idea exactly when an outbreak has started, how many infected we have in an area on any given day, or in which regions the epidemic is getting a foothold right now. The only thing we know for sure is that when the health authorities discover an outbreak, it has been going on for a while, "says Sune Lehmann.

The common way to deal with the lack of information, almost everywhere in the world, is to model many scenarios based on, for example, different numbers of unknown infections and starting times, and then to summarize by looking at each day separately and assessing the 'middle' predictions as the most likely outcomes for that day. If most input parameters give infection numbers of less than 4,000 on Christmas Day, more than 4,000 new infections that day is subsequently assessed to be unlikely.

The 'day-based' way of making these predictions is used all over the world, and although the link between the development of an epidemic and specific dates is useful in some contexts, it systematically excludes data on how bad or mild the epidemic will be.

If, for example, all projections predict that the epidemic will peak at 4,000 infections in a single day, but no two curves peak on the same day, then on any given day that peak value is an extreme and is therefore not included in the estimate.

"We, therefore, suggest making the summary 'curve-based': Instead of assessing which infection rates are probable or unlikely on individual days, we should look at one entire simulation at a time. Is the entire simulated infection curve probable or not? And based on that you can make a summary of the most likely curves for the development of the epidemic, "says Jonas L. Juul.

"By looking at entire prediction curves instead of individual days, you will get a more realistic estimate of how bad the epidemic can become. It is especially useful if you are trying to avoid the hospital system being overloaded," concludes Sune Lehmann.

Credit: 
Technical University of Denmark

Positive messages encourage safer driver behavior than fear tactics

A new study has shown that films demonstrating responsible behaviour could lead to young drivers taking fewer risks on the road than if they only saw videos aimed at provoking fear of accidents.

Over one million people are killed in road accidents each year and drivers below the age of 25 account for nearly half of road deaths.

Dr Yaniv Hanoch, Associate Professor of Risk Management at the University of Southampton, said: "Governments around the world have adopted a plethora of interventions aimed at encouraging safer driving, the majority of which use fear-based content, such as graphic depictions of sudden car crashes. We are all familiar with the UK Government's "Think" campaign, especially at this time of year.

"However, previous research has suggested that such messages can be counter-productive, possibly because the emotive content can trigger defensive reactions and message rejection."

In this latest research, led by Dr. Cutello of the University of Antwerp in partnership with the University of Southampton and the University of Warwick, 146 young drivers undertook tests to compare differences in their attitudes to risky driving.

Half of the group viewed a six-minute video aimed at instilling fear through a crash caused by a reckless driver distracted by his passengers. The other half saw a video showing a positive scene, with a careful driver asking the passengers not to distract him and successfully reaching their destination. Both road safety films were developed specifically for, and used by, the Fire and Rescue Services across the United Kingdom.

The study also tested whether watching the videos in an immersive setting on a Virtual Reality (VR) headset made a difference compared with watching them on a two-dimensional (2D) TV screen.

Each participant took a questionnaire to assess his or her attitude to risk-taking on the road before and after the trial. At the end of the trial, they also took a second test, the Vienna Risk-Taking Test-Traffic, in which they watched video clips of driving situations that require a driver reaction (for example, considering whether to overtake in icy conditions) and were asked to indicate if, and when, they regarded the manoeuvre as too risky.

The findings, published in the journal Risk Analysis, indicate that the positively framed films significantly decreased risky driving after being seen on a 2D screen, and even more so when viewed in VR format. In contrast, the fear film shown in VR failed to reduce risky driving behaviours and, in fact, increased young drivers' risk-taking.

Dr Hanoch continued, "Finding the best means to tackle this issue is of paramount importance.

"By studying driver safety interventions currently used by the Fire and Rescue service across the United Kingdom, this research provides the first examination of the effects of both message content and mode of delivery on risky driving behaviour among young drivers."

Dr. Cutello added, "Our results provide key insights into the role of positive versus fear-framed messages in tackling risky driving among young drivers. On the one hand, they build on previous work showing the effectiveness of positively framed messages in promoting road safety through the portrayal of safe driving and its positive consequences. They also show that allowing participants to experience what proactive behaviours can lead to, and giving them the illusion that the events occurring are authentic through VR, can encourage the creation of positive role models and strategies for being safer on the roads."

Credit: 
University of Southampton

Augmented reality visor makes cake taste moister, more delicious

image: The modified image looks fresh and soft while the real food is a dried one. Crossmodally, the taste and flavor were changed by the appearance of the food.

Image: 
Yokohama National University

Researchers have developed an augmented reality (AR) visor system that enables them to manipulate the light coming off food in such a way as to 'trick' people consuming the food into experiencing it as more or less moist, watery, or even delicious. The findings not only reveal how human taste is experienced in a multisensorial way -- through a combination of visual perception, smell and even sound -- but the technique could be used in hospitals to improve the palatability of food, or as a design development tool in the food industry.

The findings were published in Scientific Reports on September 30, 2020.

It has long been known that taste is not only a product of a food's chemical composition, which directly shapes the experience of consuming food and drink, but that a food's visual appearance also contributes to how we experience its taste.

For example, research has shown that humans tend to associate sour-tasting food and carbonated beverages with sharper shapes, and to associate creamy food and still drinks with more rounded shapes. Underlying that visual experience is the way that light bounces off an object, or -- to put it in more scientific terms -- the distribution of luminance. Earlier research had shown that variation in this luminance distribution influences how fresh a cabbage appears to people when looking at a series of still pictures of the vegetable. But pictures are not the same as the dynamic experience of actually eating a piece of food.

"So we wondered whether manipulating this luminance distribution while someone was eating something would produce a similar effect," said Katsunori Okajima, who specializes in vision and brain sciences at Yokohama National University in Japan.

The researchers developed an augmented reality (AR) system that allows them to manipulate the standard deviation of the luminance distribution, a statistical term describing how spread out a set of values is around its average. For example, the total amount of light bouncing off two different slices of cake might be the same, but if the deviation of the luminance distribution is small for the first slice, it has a smoother appearance, while a large deviation gives the second slice a rougher appearance.
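In pixel terms, the manipulation amounts to rescaling each luminance value's distance from the image mean: a factor below 1 narrows the spread (a smoother, moister look), a factor above 1 widens it. The snippet below is an illustrative stand-in for the AR system, operating on a random patch rather than a real cake image.

```python
import numpy as np

def scale_luminance_std(luma, factor):
    """Rescale the spread of a luminance channel around its mean.

    luma   : 2-D array of luminance values in [0, 1]
    factor : < 1 smooths the apparent texture, > 1 roughens it
    """
    mean = luma.mean()
    # Mean brightness is (approximately) preserved; clipping to the
    # valid range can shift it very slightly.
    return np.clip(mean + factor * (luma - mean), 0.0, 1.0)

rng = np.random.default_rng(1)
patch = rng.beta(5, 5, size=(64, 64))        # stand-in for a cake surface
smoother = scale_luminance_std(patch, 0.5)   # reads as moister
rougher = scale_luminance_std(patch, 1.5)    # reads as drier
print(patch.std(), smoother.std(), rougher.std())     # spread halves / grows
print(patch.mean(), smoother.mean(), rougher.mean())  # means stay close
```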

They used their AR system in two experiments. In the first, people wore the AR visor system while eating slices of Baumkuchen, a type of German cake widely available in Japan, and in the second they wore it while eating a spoonful of ketchup. The researchers were thus able to manipulate the appearance of the food during consumption, going one step further than the earlier photographs of cabbage.

Upon interviewing the participants, they found that manipulating the standard deviation of the luminance distribution (while keeping the color and the overall luminance constant) altered not only what the participants expected to taste in terms of moistness, wateriness and deliciousness, but also the actual taste and texture properties upon sampling the food itself.

The AR manipulation was most effective for moistness (of the cake) and wateriness (of the ketchup), while the effect of the system on the perception of sweetness was relatively modest.

"This suggests that the association between visual texture and sweetness is weak," added Dr. Okajima.

The researchers now hope to develop new image-processing technology that can manipulate any aspect of the appearance of any food in real time. Ultimately, they want to use these techniques to quantify all the ways that visual information affects our taste perception, and to describe the precise mechanisms of such processing within the brain.

Credit: 
Yokohama National University

Drug for pulmonary hypertension may become an option against cancer

By André Julião  |  Agência FAPESP – A drug used to treat pulmonary hypertension significantly reduced the capacity of tumor cells to migrate and invade other tissues in trials involving pancreatic, ovarian and breast cancer cell lines, as well as leukemia cells. Furthermore, in mice with an aggressive form of breast cancer, the drug reduced the incidence of metastasis in the liver and lungs by 47% and lengthened survival compared with untreated animals.

The study is published in Scientific Reports.

“The drug ambrisentan is an inhibitor of the endothelin type A receptor, which is known to play a role in vasoconstriction, so the drug is used to treat pulmonary hypertension [typically caused by autoimmune diseases such as lupus and systemic sclerosis]. In the laboratory, we found that the drug prevented migration of tumor cells to other tissues and had other effects we’re still investigating,” said Otávio Cabral Marques, a researcher at the University of São Paulo’s Biomedical Sciences Institute (ICB-USP) in Brazil and principal investigator for the study, which was funded by FAPESP.

Marques conducted the study while he was a postdoctoral fellow at the University of Freiburg in Germany, collaborating with researchers there and in the United Arab Emirates. He is currently principal investigator for a project supported by FAPESP via a Young Investigator Grant.

Endothelin type A receptor is expressed in endothelium, the layer of cells that line the inner surface of blood vessels, and in the cells of the immune system. Other research has also shown its involvement in tumor growth and metastasis.

“The effects of the drug appear not to be confined to preventing tumor cell migration, but also to include inhibition of neoangiogenesis, the formation of new blood vessels required to sustain tumor growth,” Marques said. “We’re currently doing experiments to confirm this. If so, the drug must have a systemic effect, preventing tumor migration to other tissues and inhibiting tumor growth by blocking the generation of new vessels.”

The drug’s benefits in cancer treatment have yet to be proven. Its use without a physician’s guidance can be harmful to health, especially in pregnancy.

Experiments

Using special techniques to measure cell migration, the researchers found that the drug significantly reduced both the migration of tumor cells in response to a stimulus and spontaneous migration. They tested ovarian, pancreatic and breast cancer cell lines, as well as leukemia cell lines.

Next, 4T1 cells derived from the mammary gland tissue of a mouse strain were injected into mice to mimic the initial stage of an aggressive form of breast cancer in humans. The mice were treated with the drug for two weeks before the injection and another two weeks afterward. In this experiment, the drug reduced metastasis by about 43% and enhanced median survival by about 30%.

“Metastasis of 4T1 cells is very fast in mice, so we began treatment earlier in order to approximate what happens in humans,” Marques explained.

Marques is now preparing to perform clinical trials with other researchers at ICB-USP. The drug will be tested on a group of cancer patients undergoing chemotherapy to see if they recover better than the control group that will not be given the drug. 

Although the drug can be administered by mouth, which is an advantage, Marques wants to test applying it directly to the tumor in order to enhance its effect. The type of cancer on which it will be tested has not yet been decided. 

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Researchers develop Si-based super-high frequency nanoelectromechanical resonator

Silicon single-electron/hole transistors (SETs/SHTs) and super-high-frequency nanoelectromechanical resonators show great potential in quantum computation, sensing and many other areas.

Recently, a group led by Prof. GUO Guoping from the University of Science and Technology of China of the Chinese Academy of Sciences, collaborating with Prof. ZHANG Zhen's group from Uppsala University, Sweden, designed and fabricated CMOS-compatible suspended SHT devices which worked as super-high frequency nanoelectromechanical resonators. The work was published in Advanced Materials.

The researchers developed the devices using the standard complementary metal-oxide-semiconductor (CMOS) fabrication technology, which is convenient for large-scale integration. The observed Coulomb diamond transport features confirmed the formation of SHT.

When suspended, the SHT can also work as a super-high frequency nanoelectromechanical resonator, demonstrating excellent mechanical properties. At ultra-low temperature and under high vacuum, the device showed single-hole tunneling behavior and a mechanical resonance at a record high value of 3 GHz.

These properties will be helpful for exploring the interactions between mechanical vibrations and charge carriers, and investigating potential quantum effects.

In addition, the researchers found that the electrical readout of the mechanical resonance relied mainly on the piezoresistive effect and was strongly correlated with single-hole tunneling. In the SHT regime, the piezoresistive gauge factor was an order of magnitude larger than at other driving powers. This property can be applied to study the piezoresistive effect of silicon at the nanoscale and to design novel mechanical sensing devices.

Credit: 
University of Science and Technology of China

Moffitt identifies genomic and immune indicators that predict lethal outcomes in prostate cancer

TAMPA, Fla. -- Prostate cancer is one of the most common cancers among men in the United States. One in nine men will be diagnosed during his lifetime. When diagnosed, a patient's disease is graded from 1 to 5 based on how aggressive it is, with 5 being the most aggressive. Those with grade 4/5 disease are at the highest risk of poor outcomes or death from the disease; however, there are no immunologic or genomic indicators that can help physicians determine the best course of treatment for this group of patients.

Moffitt Cancer Center researchers, led by Kosj Yamoah, M.D., Ph.D., associate member and director of cancer disparities research in the departments of Radiation Oncology and Cancer Epidemiology, are hoping to change that. The team conducted studies to determine if genomic heterogeneity in tumors from grade 4/5 prostate cancer patients can be exploited to identify patient subsets that are at higher risk for lethal outcomes and that may benefit from targeted treatment strategies. Their results were published in the journal European Urology.

Their studies focused on transcriptomic interactions between the tumor immune content score and the Decipher score, a 22-gene classifier that provides a score predicting the probability that cancer will spread. The researchers analyzed data from 8,071 prostate cancer patient samples of any disease grade (6,071 prostatectomy and 2,000 treatment naïve) in the Decipher Genomics Resource Information Database (GRID) registry. Each patient sample was also given an immune content score (ICS) that was derived using the mean expression of 264 immune cell-specific genes.

The samples were separated into four distinct immunogenomic subsets based on their results: ICS high/Decipher high, ICS low/Decipher high, ICS high/Decipher low and ICS low/Decipher low. The researchers discovered that approximately 25% of all grade 4/5 patient samples were in the ICS high/Decipher high subset.
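Schematically, the four-way grouping is a two-threshold stratification of each sample's two scores. The toy table and median cut-points below are illustrative assumptions; the study defines its own thresholds on the GRID data.

```python
import pandas as pd

# Toy scores for six hypothetical patient samples (illustration only).
df = pd.DataFrame({
    "ics":      [0.80, 0.20, 0.90, 0.10, 0.70, 0.30],  # immune content score
    "decipher": [0.90, 0.80, 0.20, 0.10, 0.95, 0.25],  # 22-gene Decipher score
})

# Dichotomize each score at the cohort median (an assumed cut-point).
ics_high = df["ics"] >= df["ics"].median()
dec_high = df["decipher"] >= df["decipher"].median()

df["subset"] = (ics_high.map({True: "ICS high", False: "ICS low"}) + "/"
                + dec_high.map({True: "Decipher high", False: "Decipher low"}))
print(df["subset"].value_counts())   # counts per immunogenomic subset
```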

The ICS high/Decipher high patient samples were further evaluated for the association between immunogenomic subtype and radiation response signatures. The researchers found that the ICS high/Decipher high subset was genomically more radiosensitive, meaning these tumors would respond well to radiation therapy. This subset also had a higher abundance of T cells and monocytes/macrophages. However, the research team says further work is needed to unravel the biological mechanisms of this association.

"Our results will aid in the subtyping of aggressive prostate cancer patients who may benefit from combined immune-radiotherapy modalities," said Yamoah.

Although the findings may not be applicable to other tumor genomic platforms at this time, he said the Decipher GRID platform is used routinely in clinical care throughout the country, and the results can be readily validated in various ongoing clinical trials and promise to be practice-changing in the near future.

Credit: 
H. Lee Moffitt Cancer Center & Research Institute