How commonly do patients develop persistent opioid use after cardiac surgery?

What The Study Did: A large, national database was used to determine how commonly patients with no prior opioid use who underwent coronary artery bypass grafting or a heart valve procedure went on to develop persistent opioid use after surgery.

Authors: Nimesh D. Desai, M.D., Ph.D., of the Hospital of the University of Pennsylvania in Philadelphia, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamacardio.2020.1445)

Editor's Note: The article includes funding/support disclosures. Please see the articles for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Study calls for reallocation of subsidies for biocontrols to fight fall armyworm

image: A new CABI-led study is calling for governments to reallocate subsidies to encourage the use of lower risk control options -- such as biopesticides -- in the fight against the devastating maize pest fall armyworm.

Image: 
CABI

A new CABI-led study is calling for governments to reallocate subsidies to encourage the use of lower risk control options - such as biopesticides - in the fight against the devastating maize pest fall armyworm (FAW).

The research, which was spearheaded by Dr Justice Tambo and published in the journal Science of the Total Environment, also suggests that enforcement of pesticide regulations is needed to curb the use of highly toxic and banned products such as monocrotophos, dichlorvos and methamidophos.

Dr Tambo and his team also argue that mass media campaigns and the training of plant health advisory service providers are essential to inform farmers about the recommended pesticides for FAW control, risks and safety precautions, as well as alternative and more sustainable management options.

The advice follows surveys carried out amongst 2,356 maize-growing households in Ghana, Rwanda, Uganda, Zambia and Zimbabwe which revealed that while a variety of cultural, physical, chemical and local options are employed to mitigate the effects of FAW, the use of synthetic pesticides remains the most popular option.

The scientists found, for example, that across the five countries the use of synthetic pesticides ranged from 33% in Zimbabwe to 87% in Rwanda. In respect of biopesticides, only 22% of farmers in Ghana used them compared to less than 5% in the other four countries.

Where a variety of cultural, physical, chemical and local options were used to mitigate the effects of FAW, these included, particularly in Rwanda and Uganda, the handpicking of egg masses and caterpillars and the destruction of infested maize plants.

Farmers also avoided late or staggered planting; carried out regular weeding to remove alternative host plants such as pasture grasses; applied manure or inorganic fertilizer to support healthy plant growth so that the maize plants could withstand FAW infestations; and intercropped and rotated maize with non-host crops such as cowpea and cassava.

In addition, the researchers express concern at the low use of personal protective equipment (PPE) by farmers when applying pesticides (e.g. half of the farmers in Ghana did not use PPE at all), with almost 30% suffering from headaches and 20% from skin irritation and dizziness.

Dr Tambo said, "Given the lack of knowledge and experience related to FAW, the rapid build-up of the pest, the devastating havoc it is wreaking on farms and the absence of resistant varieties, there is a tendency of farmers to opt for pesticides, which are generally effective against most pests and offer rapid control.

"However, recognising the negative effects of synthetic pesticides on humans and the environment, biopesticides have been recommended as an appropriate alternative option but these need to be incentivised for farmers to adopt them as part of an Integrated Pest Management (IPM) plan."

The scientists also suggest it is necessary to test the effectiveness of the various FAW management practices identified and to intensify research efforts into other low-cost and low-risk IPM options. These include resistant maize varieties and biological control which have shown great potential for FAW control elsewhere but are not yet options for farmers in Ghana, Rwanda, Uganda, Zambia and Zimbabwe.

Dr Tambo added that a relatively high use of biopesticides was observed in Ghana, where there is an ongoing national effort to promote this option for FAW control, but stated that there was almost no adoption of biological control using predators and parasitoids for the management of FAW by any of the smallholders in their sample.

Credit: 
CABI

Using tiny electrodes to measure electrical activity in bacteria

image: The organic electrochemical transistor in which the researchers have been able to deposit Shewanella oneidensis on one of the microelectrodes.

Image: 
Thor Balkhed

Scientists at Laboratory of Organic Electronics, Linköping University, have developed an organic electrochemical transistor that they can use to measure and study in fine detail a phenomenon known as extracellular electron transfer in which bacteria release electrons.

The study of bacteria and their significance for the natural world, and for human society and health, is a growing research field, as new bacteria are continuously being discovered. A human body contains more bacteria than human cells, and a millilitre of fresh water can hold as many as a million bacteria. Respiration in a normal human cell and in many bacteria takes place through biochemical reactions in which a compound, often glucose, reacts with oxygen to form carbon dioxide and water. During the process, energy is converted to a form that the cell can use. In oxygen-free environments, bacteria are found that metabolise organic compounds, like lactate, and instead of forming water, they release, or respire, electric charges, a by-product of metabolism, into the environment. The process is known as extracellular electron transfer, or extracellular respiration.

The phenomenon is currently used in several electrochemical systems in applications such as water purification, biosensors and fuel cells. Adding bacteria is an eco-friendly way to convert chemical energy to electricity.

One such bacterium often used in research is Shewanella oneidensis, which previous research has shown to produce electrical current when fed with arsenic, arabinose (a type of sugar) or organic acids. A similar bacterium has recently been discovered in the human gastrointestinal system.

We do not, however, understand in detail what happens when bacteria release charges. In order to capture and measure the amount of charge released, electrodes are placed into the microbial systems. An individual bacterium gives a very weak signal, and thus until now, researchers have had to be satisfied with studying extracellular electron transfer in large systems with large numbers of bacteria.

In order to increase our understanding, scientists at the Laboratory of Organic Electronics at Linköping University have employed a combination of microelectronics, electrochemistry and microbiology. They have developed an organic electrochemical transistor in which they have been able to deposit Shewanella oneidensis on one of the microelectrodes, with a surface area of only a quarter of a square millimetre. The amplification of the signal that occurs in the transistor makes it possible for them to study in detail what happens when various substances are added to the system. They describe in an article in Advanced Science experiments in which they fed lactate to the bacteria.

"We have shown that we can detect very small differences in extracellular electron transfer, in other words the amount of charge released by the bacteria. Another plus is that we can achieve very short response times, and obtain a stable signal within ten minutes", says principal research engineer Gábor Méhes, who, together with senior lecturer Eleni Stavrinidou, is corresponding author for the article.

"This is a first step towards understanding extracellular electron transfer in bacteria occupying only a small area with the help of a transistor, and how the conversion takes place between the bacteria and the electrode", says Gábor Méhes. "One future goal is to learn how bacteria interact with each other, and with other cells and chemical substances in the human gastrointestinal tract."

The research is being conducted within the framework of the Biocom Lab at the Laboratory of Organic Electronics, and is financed by Vinnova, the Swedish Research Council, the Swedish Foundation for Strategic Research, the Wallenberg Wood Science Center and the European Research Council, ERC.

It is hoped that the research will lead to optimising microbial electrochemical systems that harvest energy, and increase our understanding of, for example, serious gastrointestinal conditions. Looking far into the future, the idea has been raised among researchers of using bacteria that respire iron compounds to support human life on the oxygen-free planet Mars.

Credit: 
Linköping University

During the COVID-19 outbreak in China, marked emission reductions, but unexpected air pollution

Using a combination of satellite and ground-based observations to study air pollution changes in China during COVID-19 lockdowns, researchers report up to 90% reductions of certain emissions, but also an unexpected increase in particulate matter pollution. The results will inform efforts to regulate precursor gases from all possible sectors when developing an emission control strategy. In China, as elsewhere, COVID-19 shutdowns reduced or suspended motor vehicle traffic and some manufacturing.

Here, to assess the related atmospheric effects, in a country that has continued to battle particulate haze pollution, Tianhao Le and colleagues used a combination of satellite-retrieved atmospheric compositions, national ground station measurements of major pollutants, meteorology data, and atmospheric chemistry model simulations. Using spaceborne measurements of nitrogen dioxide - an important precursor for both ozone production and secondary aerosol formation - Le and colleagues report that in 2020, Wuhan experienced a 93% fractional reduction in nitrogen dioxide at the peak of the outbreak. "Such a short-term human-induced reduction in nitrogen dioxide is unprecedented," the authors say.

Unexpectedly, at the same time, and particularly in northern China, particulate matter levels were extremely high, the authors found, likely due to anomalously high humidity, along with stagnant airflow and uninterrupted emissions from power and petrochemical facilities. These features contributed to severe haze formation driven by a "multi-phase chemistry" approach that requires further study, Le and colleagues write. The authors say their work shows that a protocol of reducing emissions by focusing on the traffic and manufacturing sectors alone achieves only limited effects.
"We suggest a more comprehensive regulation of precursor gases from all possible sectors when developing an emission control strategy," they say, including power plants and heavy industry, such as petrochemical facilities.

Credit: 
American Association for the Advancement of Science (AAAS)

Stocks of vulnerable carbon twice as high where permafrost subsidence is factored in

image: A time series shows ground-ice 'atlases' in permafrost struggling to support the active layer as soil temperatures warm and accelerate thaw. As ice is lost, we see a significant shift in the soil surface over time, and the need to account for subsidence in measurements.

Image: 
Victor Leshyk, Center for Ecosystem Science and Society

New research from a team at Northern Arizona University suggests that subsidence, gradually sinking terrain caused by the loss of ice and soil mass in permafrost, is causing deeper thaw than previously thought and making vulnerable twice as much carbon as estimates that don't account for this shifting ground. These findings, published this week in the Journal of Geophysical Research: Biogeosciences, suggest traditional methods of permafrost thaw measurement underestimate the amount of previously-frozen carbon unlocked from warming permafrost by over 100 percent.

"Though we've known for a long time that subsidence happens across the permafrost zone, this phenomenon hasn't been systematically accounted for when we talk about thaw and carbon vulnerability," said Heidi Rodenhizer, a researcher at the Center for Ecosystem Science and Society at Northern Arizona University and lead author of the study, which was co-authored by a team from NAU, Woods Hole Research Center, Instituto de Ciencias Agrarias, and Yale University. "We saw that in both warming and control environments, slight temperature increases drove significant thaw and unlocked more carbon than we saw when we weren't looking at subsidence."

Traditionally, permafrost thaw has been calculated by measuring active layer thickness. To do that, scientists insert a metal rod into the ground until it hits permafrost, and measure from that depth to the soil surface. However, subsidence can mask actual thaw by lowering the soil surface and changing the frame of reference; for instance, some long-term experiments that rely on measuring active layer thickness have not recorded significant changes in thaw depth from year to year, despite rapid temperature warming.

So Rodenhizer and her team combined subsidence with active layer measurements to discover how much the ground was sinking, and how much unlocked carbon was being missed. At their warming site near Healy, Alaska, the team used high-accuracy GPS to measure the elevation of experimental plots at six time points over nine years. At each plot, Rodenhizer and her team found that permafrost thawed deeper than the active layer thickness indicated: 19 percent deeper in the control plots, and 49 percent deeper in the warming plots. The amount of newly-thawed carbon within the active layer was between 37 percent and 113 percent greater.
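The correction described above is simple arithmetic: the true thaw depth is the probe-measured active layer thickness plus the drop in the soil surface measured by GPS. A minimal sketch, with illustrative numbers (not the study's field data):

```python
# Hypothetical sketch of the subsidence correction: a probe measures thaw
# from the (sunken) soil surface down to frozen ground, so the true thaw
# depth adds the GPS-measured surface subsidence back in.

def corrected_thaw_depth(active_layer_cm: float, subsidence_cm: float) -> float:
    """True thaw depth = probe-measured active layer + surface subsidence."""
    return active_layer_cm + subsidence_cm

def percent_deeper(active_layer_cm: float, subsidence_cm: float) -> float:
    """How much deeper the corrected thaw is, relative to the probe reading."""
    return 100 * subsidence_cm / active_layer_cm

# e.g. a 60 cm probe reading with 11.4 cm of subsidence understates thaw by 19%
print(corrected_thaw_depth(60, 11.4))  # 71.4
print(percent_deeper(60, 11.4))        # 19.0
```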

As the Arctic warms twice as fast as the rest of the planet, these findings have potentially vast implications for global carbon fluxes. Due to the widespread nature of subsidence--about 20 percent of the permafrost zone is visibly subsided, and contains approximately 50 percent of all carbon stored in permafrost--failing to account for subsidence could lead to significant underestimates of future carbon release in global climate change projections. Rodenhizer's team hopes that this study will convince more Arctic researchers across the permafrost monitoring network to apply this method and help change that.

"We know that these vast carbon stores in permafrost are at risk, and we have the tools to account for subsidence and track where the carbon is going," said permafrost researcher and senior author Ted Schuur. "We should be using everything in our toolbox to make the most accurate estimates, because so much depends on what happens to Arctic carbon."

Credit: 
Northern Arizona University

10 percent of patients continue to use opioids three to six months after heart surgery

PHILADELPHIA -- Nearly 10 percent of patients who are prescribed opioid medications following heart surgery will continue to use opioids more than 90 days after the procedure, according to a new study led by researchers in the Perelman School of Medicine at the University of Pennsylvania.

The study, published today in JAMA Cardiology, also revealed a direct link between the dosage of opioids--or oral morphine equivalent (OME)--first prescribed following discharge and the likelihood of persistent opioid use 90 to 180 days after the procedure. Patients who were prescribed more than 300mg OMEs (about 40 tablets of 5mg oxycodone) had a significantly higher risk of prolonged use compared to those who received a lower dosage.
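The 300 mg OME threshold above can be reproduced with the standard morphine-equivalent conversion arithmetic. A minimal sketch, assuming the widely used CDC conversion factors (the function and drug list here are illustrative, not the study's code):

```python
# Hypothetical sketch: converting a discharge prescription to oral morphine
# equivalents (OMEs) using standard CDC conversion factors. The 300 mg
# threshold mirrors the cutoff described in the study.

OME_FACTORS = {"oxycodone": 1.5, "morphine": 1.0, "codeine": 0.15, "tramadol": 0.1}

def total_ome(drug: str, mg_per_tablet: float, n_tablets: int) -> float:
    """Total oral morphine equivalents for one prescription."""
    return mg_per_tablet * n_tablets * OME_FACTORS[drug]

# 40 tablets of 5 mg oxycodone -> 5 * 40 * 1.5 = 300 mg OME,
# i.e. exactly at the study's high-risk threshold
dose = total_ome("oxycodone", 5, 40)
print(dose)        # 300.0
print(dose > 300)  # False: at, not above, the threshold
```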

"Our findings support a much-needed shift toward decreasing opioid dosages at discharge and using alternative approaches to reduce the risk for persistent opioid use," said the study's lead author Chase Brown, MD, MHSP, a Cardiovascular Surgery resident and research fellow.

Opioids, such as oxycodone, codeine, tramadol and morphine, are routinely prescribed for postoperative pain management in many countries. However, recent research suggests that overprescribing opioid medications for short-term pain may be widespread in the United States. The excessive prescribing can increase the risk of drug diversion, new long-term opioid use and the development of opioid use disorder.

Heart disease is the leading cause of death in the United States, accounting for about one in every four deaths. Every year, hundreds of thousands of people undergo heart surgery to treat conditions of the heart. While recent studies revealed that persistent opioid use occurs in 3 to 10 percent of patients after minor and major general surgery procedures, there is limited large-scale research that examines this issue among cardiac surgery patients in the United States.

In this study, the Penn team sought to determine the proportion of opioid-naïve patients who develop persistent opioid use after heart surgery and to investigate the link between the dosage first prescribed and the patient's risk of prolonged use. Using a national database, the team examined data of 25,673 patients who underwent coronary artery bypass grafting--the most common type of heart surgery--or heart valve repair or replacement between 2004 and 2016.

More than half of the patients--about 60 percent of CABG patients and 53 percent of valve surgery patients--filled an opioid prescription within 14 days of the surgery. Researchers found that 9.6 percent of the cardiac surgery patients continued to fill prescriptions between three and six months after surgery, with the refill rate slightly higher among CABG patients. In fact, nearly 9 percent of CABG patients continued to fill an opioid prescription 180 to 270 days after surgery. The team also found a higher incidence rate among women, younger patients and those with preexisting medical conditions, such as congestive heart failure, chronic lung disease, diabetes and kidney failure.

To examine whether the results would apply to patients considered "low risk," researchers excluded patients with preoperative use of benzodiazepines or muscle relaxants, alcoholism, chronic pain, or drug use, as well as those discharged to a facility after cardiac surgery. Researchers found a similar incidence rate among the low-risk cohort, with 8 percent of the patients continuing to use opioids between 90 and 180 days after discharge.

"Cardiothoracic surgeons, cardiologists and primary care physicians should work together to enact evidence-based protocols to identify high-risk patients and minimize prescriptions via a multi-faceted pain management approach," said the study's senior author Nimesh Desai, MD, PhD, a cardiovascular surgeon and an associate professor of Surgery. "Centers must adopt protocols to increase patient education and limit opioid prescriptions at discharge."

Credit: 
University of Pennsylvania School of Medicine

Non-invasive fetal oxygen monitor could make for safer deliveries

image: Prototype of the external fetal oxygen monitor developed by engineers at UC Davis. The device uses light to directly measure fetal oxygen saturation. Existing monitors used in the delivery room infer fetal oxygen by measuring fetal heart rate and contractions, and have a high false positive rate leading to many Cesarean sections. The new device has so far been successfully tested in sheep.

Image: 
Daniel Fong, UC Davis

A device to directly measure blood oxygen saturation in a fetus during labor has been developed by researchers at the University of California, Davis. By providing better information about the health of a fetus right before birth, the device could both reduce the rate of Cesarean sections and improve outcomes in difficult deliveries.

Since the 1970s, U.S. obstetricians have monitored fetal heart rate and the mother's rate of contractions as a way to assess the health of the fetus during labor. Taken together, these measurements are a proxy for fetal blood oxygen levels. If the fetus is deprived of oxygen before birth, it may suffer lasting damage or die - leading doctors to perform C-sections if they think a fetus is getting into trouble.

This practice has led to a high rate of C-sections, but without much improvement in the rate of fetal complications associated with oxygen deficiency.

"We wondered if we could build a device to measure fetal blood oxygen saturation directly," said Soheil Ghiasi, professor of electrical and computer engineering at UC Davis.

Results from the work have been presented at the Society for Maternal-Fetal Medicine pregnancy meeting in Grapevine, Texas in February, and in an upcoming issue of IEEE Transactions on Biomedical Engineering.

Direct measurement of fetal blood oxygen saturation

The new device is based on the same principle as the oximeter you might have slipped on your finger at the doctor's office. Hemoglobin in red blood cells absorbs colors of light differently depending on how much oxygen it has bound. A finger oximeter measures different wavelengths of light to calculate the oxygen saturation in your blood.
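The oximetry principle described above is commonly computed as a "ratio of ratios": the pulsatile (AC) to steady (DC) signal at the red wavelength, divided by the same ratio at the infrared wavelength, is mapped to oxygen saturation through an empirical calibration. A minimal sketch; the linear calibration constants below are generic illustrative values, not those of the UC Davis device:

```python
# Hypothetical sketch of pulse-oximetry math: hemoglobin absorbs red
# (~660 nm) and infrared (~940 nm) light differently depending on oxygen
# binding, so the ratio of pulsatile-to-steady absorbance at the two
# wavelengths tracks arterial oxygen saturation (SpO2).

def ratio_of_ratios(red_ac: float, red_dc: float,
                    ir_ac: float, ir_dc: float) -> float:
    """R = (AC_red / DC_red) / (AC_ir / DC_ir)."""
    return (red_ac / red_dc) / (ir_ac / ir_dc)

def spo2_percent(r: float, a: float = 110.0, b: float = 25.0) -> float:
    """Common empirical linear calibration: SpO2 = a - b * R (illustrative)."""
    return a - b * r

r = ratio_of_ratios(0.02, 1.0, 0.04, 1.0)  # R = 0.5
print(spo2_percent(r))                     # 97.5
```

In the fetal case the same principle applies, but the reflected signal is far weaker and must be separated from the mother's pulse, which is the core difficulty the paragraphs below describe.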

Measuring blood oxygen saturation in a fetus within the mother poses additional problems. First, there's more tissue to get through to reach the fetus, so only a tiny amount of light can be reflected back to be measured non-invasively.

Second, there's the problem of separating the signal from fetal blood from that of the mother.

Experimental tests in pregnant sheep, published in IEEE Transactions on Biomedical Engineering, show that the new device could accurately measure oxygen levels in the fetus.

Ghiasi became interested in the problem when he and his wife had their first child five years ago. Although like many couples they had wanted a natural childbirth, they found that the care team soon recommended C-section based on fetal monitoring.

Credit: 
University of California - Davis

Multispecialty centers for pediatric dysphagia deliver better outcomes, reduced costs

Children who choke when they drink or eat may have what's known as dysphagia, or a swallowing disorder -- one of the most common medical complaints seen in young children. This condition can be due to various causes that require care from clinicians with expertise in areas including otolaryngology-head and neck surgery, gastroenterology, pulmonology, pediatric surgery, and speech-language pathology.

A new study has found that when these different medical disciplines are combined in one center, rather than the typical care journey of making appointments with one specialist at a time, children had better outcomes, needed fewer procedures, and incurred lower health care costs.

The new research, published online June 18 in NEJM Catalyst Innovations in Care Delivery, is the first study to look at value-based health care metrics by comparing integrative practice units (IPUs) to a control group of traditional, single-specialty care models.

"Children with swallowing disorders are not just an ear, a nose, a throat, or a lung or stomach; each needs to be looked at holistically as a child, not a body part," said study principal investigator Christopher J. Hartnick, MD, MS, Director of the Division of Pediatric Otolaryngology and the Pediatric Airway, Voice and Swallowing Center at Massachusetts Eye and Ear, and professor of Otolaryngology-Head and Neck Surgery at Harvard Medical School. "Our study shows that by caring for children with swallowing issues with many disciplines in one setting, centers would provide excellent care, while diminishing unnecessary visits and procedures, and better streamlining a diagnosis. We feel based on these results that this integrative care model can be applied to other common medical conditions."

Hartnick partnered with a leader in costing and value-based healthcare modeling, Robert S. Kaplan, Senior Fellow, Marvin Bower Professor of Leadership Development, Emeritus, at Harvard Business School. Kaplan observed, "This is the first study to demonstrate that using a multi-disciplinary integrated practice unit (IPU) to treat a complex medical condition is not only better for patients and their families; it also lowers the hospital's costs to treat the condition."

Unique method to compare care for children's swallowing disorders

Swallowing problems when eating or drinking can lead to fluid entering the lungs, known as aspiration, which in turn can lead to serious lung infections like pneumonia.

Pediatric patients with feeding and swallowing difficulties who get typical care outside of an IPU may have to make appointments with one specialist at a time until they get the right diagnosis, whether it's a pulmonologist, gastroenterologist, or an ENT, among others. Specialists will look at the issue under the umbrella of their expertise and order procedures accordingly, and once they rule out a possible condition, will refer to the next specialist. This process can take weeks or months until a diagnosis is made, and bouncing around from providers and extra appointments can add stress and costs to families. In the case of pediatric aerodigestive care, each separate visit may require a test or procedure that requires undergoing general anesthesia. The researchers sought to see whether the IPU model could reduce the number of these visits, thereby reducing health care costs and improving the experience for the family.

Leaders at six hospitals partnered to assess and compare the health outcomes and costs of pediatric aerodigestive care. Four of the hospitals delivered pediatric aerodigestive care with an IPU: Massachusetts Eye and Ear in Boston, Children's Hospital Colorado in Aurora, Seattle Children's Hospital, and the Children's Hospital at Vanderbilt in Nashville, Tennessee; while two delivered care traditionally -- Children's Hospital of the King's Daughters (CHKD) in Norfolk, Virginia, and the Hospital for Sick Children in Toronto -- with isolated specialists.

The researchers selected swallowing disorders because they represented a large percentage of the children seen within each aerodigestive IPU, and had a clear one-year full cycle of care that included all procedures and tests, with objective and patient (caregiver) reported outcome measures. Their study included children aged 0-10 years old with swallowing difficulties. Each center developed process maps to get a sense of which providers would see a particular patient based on presentation.

Once the data and process maps were collected, Kaplan's HBS team helped the clinicians conduct a time-driven activity-based costing (TDABC) analysis. TDABC estimates cost per minute for time and the quantity of resources used over each patient's treatment cycle and cost of all resource inputs required, including the average time, the personnel type, and space and equipment required.
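The TDABC calculation described above reduces to summing, over every resource a patient uses, that resource's cost per minute times the minutes used. A minimal sketch with purely illustrative rates and times (not the study's figures):

```python
# Hypothetical sketch of time-driven activity-based costing (TDABC):
# each resource type has a capacity cost rate (cost per minute), and the
# cost of one care cycle is the sum of rate * minutes across all resources.

COST_PER_MINUTE = {      # illustrative capacity cost rates, $/minute
    "surgeon": 8.00,
    "nurse": 1.50,
    "exam_room": 0.40,
    "endoscope": 0.75,
}

def cycle_cost(resource_minutes: dict) -> float:
    """Total cost of one patient's care cycle under TDABC."""
    return sum(COST_PER_MINUTE[r] * m for r, m in resource_minutes.items())

# One hypothetical same-day IPU visit:
visit = {"surgeon": 20, "nurse": 45, "exam_room": 60, "endoscope": 15}
print(cycle_cost(visit))  # 8*20 + 1.5*45 + 0.4*60 + 0.75*15 = 262.75
```

The comparison in the study works the same way at larger scale: process maps determine which resources each patient touches, and the rate-times-time sums are compared between IPU and non-IPU sites.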

Benefits from multidisciplinary care may be applicable to other diseases

The researchers found that providing care for swallowing disorders in a pediatric IPU led to better post-operative swallowing outcomes, and also lowered costs by having a single nursing team support the multiple physicians in the IPU.

In the traditional, fragmented care model, delayed diagnoses caused higher costs for families and the health care system: multiple physician office visits, unexpected emergency room use that led to hospital admissions, and parents' increased time off from work, travel and other care considerations.

The IPU enabled pediatric patients to get a single evaluation on the same day from multiple providers, each using different diagnostic tests. If necessary based on the multidisciplinary evaluation, the procedures needed for testing can all be done at the same time, with only one dose of anesthesia.

Some have criticized the IPU model, believing it would reduce the number of patients that specialists can see in one day. This study found that by incorporating mid-level providers such as nurse practitioners, physician assistants and speech language pathologists into the treatment team, there was less than a 5 percent difference in the number of patients seen.

The non-IPU sites required separate nursing teams for each physician. This led the average total personnel costs of the non-IPU sites to be 28 percent higher than the average of the four IPU sites ($4,284 versus $3,347). And this understates the total increase in costs. The non-IPU sites impose much higher costs and risk on the child and parents when they require separate visits to each physician, with separate procedures performed, across many more days. The lead clinicians unanimously agreed that the nurse coordinator was most important to the success of the IPU.

The study enabled each hospital to see the process maps of the others, and to implement changes that would improve future efficiencies. For example, based on a different skill mix used at one IPU site, the others saw how to reduce nursing costs by task downshifting from a nurse to a medical assistant to room patients, and adding a patient care coordinator to increase efficiency, reduce extra testing, and improve the family's experience with the IPU.

The research has prompted the two hospitals used as controls to implement their own IPU programs, and future studies will track these institutions' changes in outcomes and costs. A similar IPU model has been employed at Mass. Eye and Ear in some programs, and may be applicable to other areas of medicine as well.

"We have seen great success with integrated practice model within our multidisciplinary Head and Neck Surgery program at Mass. Eye and Ear and Mass General Hospital," said Mark A. Varvares, MD, FACS, Associate Chair of Otolaryngology-Head and Neck Surgery at Mass. Eye and Ear, who was not involved in the study. "This new study provides evidence that the IPU model can be applied successfully to other areas of otolaryngology."

Credit: 
Mass Eye and Ear

Using a Gaussian mathematical model to define eruptive stages of young volcanic rocks

image: Laser 40Ar/39Ar isochron ages and age-probability diagrams. (p) probability statistics for all ages; (q) probability statistics for the first peak; (r) probability statistics for the second peak; (s) probability statistics for the third peak.

Image: 
©Science China Press

Reconstruction of Quaternary environments, late Cenozoic geodynamics and evaluation of volcanic hazards, all depend on the precise delineation of eruptive stages. To date, it has been difficult to achieve high-precision dating of young volcanic rocks.

A study entitled "Using a Gaussian mathematical model to define eruptive stages of young volcanic rocks in Tengchong based on laser 40Ar/39Ar dating", with Zhao Xinwei as first author and Zhou Jing, Ma Fang, Ji Jianqing and Alan Deino as co-authors, was published in Science China Earth Sciences.

As one of the youngest volcanic areas in Southwest China, the Tengchong volcanoes are characterized by multi-stage eruptions and complex overlapping relationships. However, the stages of volcanic eruption within this area are controversial. This paper applied high-precision laser 40Ar/39Ar dating to the main volcanic units in the Tengchong area and obtained ages in the range of 0.025~5.1 Ma using conventional data processing methods. But dating young volcanic rocks is very difficult and requires high precision: even small factors, such as very low radiogenic 40Ar content, inconsistency in the initial 40Ar/36Ar ratios, limits of instrumental precision, and errors from preparation and measurement methods, can contribute to large age deviations. Moreover, lacking a unified timescale, conventional methods are unable to strictly define the stages of the Tengchong volcanic eruptions. There is therefore great uncertainty in using a single age as an eruption age in the absolute chronological study of volcanic eruptions; it can even be misleading.

To solve these issues, the researchers applied a Gaussian mathematical model to all 378 original ages from 13 samples (Figure 1). An apparent age-probability diagram consisting of three independent waveforms was obtained. The corresponding isochron ages of the three waveforms point to three volcanic eruptive stages: the Pliocene (3.78±0.04 Ma), the early Middle Pleistocene (0.63±0.03 Ma), and the late Middle Pleistocene to early Late Pleistocene (0.139±0.005 Ma). Because it works within a single timescale, the stage-identification method proposed in this paper has the advantages of controlling variables, reducing deviations, approaching the true age value, weakening the influence of subjective factors and precisely defining the eruptive stages of young volcanic rocks. In addition, as the number of measurements increases, the final result becomes more representative of the real volcanic history, with reproducible and verifiable eruptive stages.
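The core of the approach, stacking each measured age as a Gaussian whose width is its analytical error and reading eruptive stages off the peaks of the summed probability curve, can be sketched as follows. This is a minimal illustration with synthetic ages; the function name, the data and the uniform error are our assumptions, not the paper's 378 measurements.

```python
import numpy as np

def age_probability(ages, errors, t):
    """Apparent age-probability density: a sum of unit-area Gaussians,
    one per measured age, each with sigma equal to that age's 1-sigma error."""
    t = np.asarray(t)[:, None]
    g = np.exp(-0.5 * ((t - ages) / errors) ** 2) / (errors * np.sqrt(2.0 * np.pi))
    return g.sum(axis=1) / ages.size

# Synthetic ages (Ma) clustered around the three reported stage ages;
# the values are illustrative, not the study's measurements.
rng = np.random.default_rng(0)
ages = np.concatenate([rng.normal(3.78, 0.02, 30),
                       rng.normal(0.63, 0.02, 30),
                       rng.normal(0.139, 0.005, 30)])
errors = np.full(ages.size, 0.04)  # assumed uniform analytical error

t = np.linspace(0.0, 5.0, 5001)
p = age_probability(ages, errors, t)

# Local maxima of the diagram mark candidate eruptive stages
peaks = t[1:-1][(p[1:-1] > p[:-2]) & (p[1:-1] > p[2:])]
```

Well-separated age clusters then appear as distinct waveforms, and each peak position serves as a candidate stage age to be refined with isochron analysis.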

Credit: 
Science China Press

High performance sodium-ion capacitors based on Nb2O5 nanotubes@carbon cloth

image: The growth of Nb2O5-based products on carbon cloth under different pH values.

Image: 
©Science China Press

Hybrid sodium-ion capacitors (SICs) bridge the gap between supercapacitors (SCs) and batteries and have huge potential applications in large-scale energy storage. However, designing appropriate anode materials with fast kinetics behavior as well as long cycle life to match with the cathode electrodes remains a crucial challenge.

Recently, joint research groups from the University of Science and Technology Beijing and the Institute of Semiconductors, Chinese Academy of Sciences, directly synthesized Nb2O5 nanotubes and nanowire-to-nanotube homojunctions on carbon cloth (CC) via a simple hydrothermal process; the work was published in Science China Materials (DOI: 10.1007/s40843-020-1278-9). The as-prepared Nb2O5@CC nanotubes displayed a high reversible capacity of 175 mAh/g at a current density of 1 A/g, with a Coulombic efficiency of 97% after 1500 cycles. Moreover, SICs fabricated with Nb2O5@CC and activated carbon (AC) electrodes showed a high energy density of 195 Wh/kg at 120 W/kg, a power density of 7328 W/kg at 28 Wh/kg, and 80% capacitance retention after 5000 cycles.
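As a quick sanity check on how the two reported Ragone points relate, the discharge time implied by each point follows from t = E / P, using only the figures quoted above:

```python
# Discharge time implied by each reported Ragone point: t = E / P
points = [(195.0, 120.0), (28.0, 7328.0)]  # (energy in Wh/kg, power in W/kg)
for energy_wh, power_w in points:
    t_seconds = energy_wh / power_w * 3600.0
    print(f"{energy_wh} Wh/kg at {power_w} W/kg -> {t_seconds:.0f} s discharge")
```

The high-energy point corresponds to a slow discharge of roughly 1.6 hours, while the high-power point corresponds to about 14 seconds, the familiar battery-versus-supercapacitor trade-off that hybrid SICs are designed to span.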

Prof. Shen stated: "Although Nb2O5 has good chemical stability and large interplanar spacing, its conductivity is relatively poor. To overcome this, we directly grew Nb2O5 nanomaterials with different morphologies on the current collectors (carbon cloth) using the synergetic effect of pyridine and the pH value of the acid solution. The physical and electrochemical properties of the prepared materials were systematically studied. We found that the nanotubes have a large specific surface area and pore volume, which allows more active sites to take part in electrochemical reactions. The Nb2O5@CC nanotube electrode not only possesses good conductivity, but also reduces the volume expansion caused by sodium-ion intercalation/de-intercalation. All these advantages contribute to good electrochemical performance in sodium-ion capacitors. Additionally, the flexible SIC devices can operate normally under various bending conditions. The Nb2O5@CC nanotubes in this work are promising electrode materials for flexible and wearable energy storage devices."

Credit: 
Science China Press

Energy storage using oxygen to boost battery performance

image: Schematic of the formation process of EG-water complexes and illustration of the penetration process of an isolated water molecule.

Image: 
Jeung Ku Kang, KAIST

Researchers have presented a novel electrode material for an advanced energy storage device that is directly charged with oxygen from the air. Professor Jeung Ku Kang's team synthesized and preserved sub-nanometric particles of atomic-cluster size at high mass loadings within metal-organic frameworks (MOFs) by controlling the behavior of reactants at the molecular level. The new strategy ensures high performance for lithium-oxygen batteries, acclaimed as a next-generation energy storage technology with potential applications in electric vehicles.

Lithium-oxygen batteries in principle can generate ten times higher energy densities than conventional lithium-ion batteries, but they suffer from very poor cyclability. One of the methods to improve cycle stability is to reduce the overpotential of electrocatalysts in cathode electrodes. When the size of an electrocatalyst material is reduced to the atomic level, the increased surface energy leads to increased activity while significantly accelerating the material's agglomeration.

As a solution to this challenge, Professor Kang from the Department of Materials Science and Engineering aimed to maintain the improved activity by stabilizing atomic-scale sized electrocatalysts into the sub-nanometric spaces. This is a novel strategy for simultaneously producing and stabilizing atomic-level electrocatalysts within metal-organic frameworks (MOFs).

Metal-organic frameworks are porous structures formed by the continuous assembly of metal ions and organic linkers.

The team controlled hydrogen affinities between water molecules to separate them and transfer the isolated water molecules one by one through the sub-nanometric pores of MOFs. The transferred water molecules reacted with cobalt ions to form di-nuclear cobalt hydroxide under precisely controlled synthetic conditions, then the atomic-level cobalt hydroxide is stabilized inside the sub-nanometric pores.

The di-nuclear cobalt hydroxide stabilized in the sub-nanometric pores of the metal-organic frameworks reduced the overpotential by 63.9% and delivered a ten-fold improvement in cycle life.

Professor Kang said, "Simultaneously generating and stabilizing atomic-level electrocatalysts within MOFs can diversify materials according to numerous combinations of metal and organic linkers. It can expand not only the development of electrocatalysts, but also various research fields such as photocatalysts, medicine, the environment, and petrochemicals."

Credit: 
The Korea Advanced Institute of Science and Technology (KAIST)

Liver perfusion could save 7 in 10 rejected donor livers

image: The VITTAL team perform a liver transplant using a liver that has undergone the perfusion technique

Image: 
Mr Hynek Mergental, Honorary Senior Lecturer at the University of Birmingham and Consultant Surgeon at the UHB Liver Unit

A major study investigating the effectiveness of liver perfusion as a technique to improve the function of donor livers that would otherwise have been rejected has shown that up to 7 in every 10 could be used after just 4-6 hours of assessment.

The study, 'Transplantation of discarded livers following viability testing with normothermic machine perfusion', published today in Nature Communications, could have significant implications for the liver transplant waiting list and the commissioning of local transplant services.

Currently, across the UK, a third of donated livers don't meet the desired transplant criteria and aren't used. Chronic liver disease is rising annually in the UK as a result of obesity and increasing alcohol misuse, and causes approximately 8,500 deaths per year. For those with end-stage liver disease, a transplant is the only hope of survival, but demand for livers suitable for transplantation far outstrips supply. According to the latest NHS Blood and Transplant report, up to 20% of people awaiting a transplant operation died or were removed from waiting lists due to ill health.

A growing proportion of donated livers come from high-risk donors: people with a history of alcohol misuse or obesity, or elderly people with comorbidities, often after an unexpected cardiac arrest when the patient cannot or should not be resuscitated. These livers are of lower quality and pose risks to recipients, so the majority are not transplanted.

Funded by the Wellcome Trust, experts from the University of Birmingham's Centre for Liver and Gastrointestinal Research, University Hospitals Birmingham NHS Foundation Trust and the NIHR Birmingham Biomedical Research Centre have found that just 4-6 hours of normothermic machine perfusion assessment enabled 70 per cent of currently discarded livers to recover enough to allow successful transplantation into a recipient.

Mr Hynek Mergental, Honorary Senior Lecturer at the University of Birmingham and Consultant Surgeon at the UHB Liver Unit, said: "Whilst liver transplantation is one of the most advanced surgical procedures, up to now there has been no objective means of assessing the suitability of donor livers for transplantation. The VITTAL trial validated our pre-clinical research and pilot clinical observations, and these viability criteria can now guide transplant teams worldwide to give more patients in need access to life-saving transplantation."

VITTAL project lead, Professor Darius Mirza, Consultant Transplant Surgeon at University Hospitals Birmingham NHS Foundation Trust, added: "This challenging study was designed to assess the function of discarded livers in a real-life situation using normothermic machine perfusion. The major challenge in this pioneering clinical trial was to assure patient safety while pushing the envelope of sub-optimal liver utilisation."

Mr Thamara Perera, Consultant Transplant Surgeon at UHB, explains: "This ground-breaking trial has proven that objective parameters can be used to make the decision to use a borderline liver. The observed 100% post-transplant survival among study participants was reassuring and gave our patients and the surgical team the confidence to implement and further expand this approach, which now helps the sickest patients on our waiting list to undergo transplantation sooner and more safely."

Dr Simon Afford, Reader in Liver Immunobiology at the University of Birmingham's Institute of Immunology and Immunotherapy, said: "It has long been recognised that, as a consequence of our ageing population, the quality of donated livers keeps declining. Based on our latest discoveries, we believe that in the near future the machine perfusion platform will facilitate therapeutic interventions to improve liver viability. We expect to be able to salvage even more organs than the 70% observed in the VITTAL trial, including livers from donors with known alcohol misuse or obesity."

Tim Knott, Head of Innovation Programmes at the Wellcome Trust, said: "Many more patients who need liver transplants will benefit from this technology. Giving surgeons the tools to assess if a liver transplant will be viable will help the thousands of people who have chronic liver disease globally."

John Forsythe, Medical Director of Organ Donation and Transplantation for NHS Blood and Transplant, said: "New techniques of organ perfusion and preservation are a vital developing area of organ donation and transplantation. We are delighted that a number of doctors and scientists in the UK are leading the way in this field of research.

"Each year a small number of donated organs are not transplanted for a variety of reasons. Transplant success relies on a significant amount of activity taking place in a short space of time. New techniques are already allowing us to transplant donated organs that would not have been possible in the past. More research in this area is likely to increase that ability."

Credit: 
University of Birmingham

Simple blood test could one day diagnose motor neurone disease

Scientists at the University of Sussex have identified a potential pattern within blood which signals the presence of motor neurone disease, a discovery which could significantly improve diagnosis.

Currently, it can take up to a year for a patient to be diagnosed with amyotrophic lateral sclerosis (ALS), more commonly known as motor neuron disease (MND).

But after comparing blood samples from patients with ALS, patients with other motor-related neurological diseases, and healthy controls, researchers were able to identify specific biomarkers which act as a diagnostic signature for the disease.

Researchers hope that their findings, published in the journal Brain Communications, and funded by the Motor Neurone Disease Association (MNDA), could lead to the development of a blood test which will identify the unique biomarker, significantly simplifying and speeding up diagnosis.

With patients living, on average, just 2-5 years after diagnosis, this time could be crucial.

Professor Majid Hafezparast, a professor of Molecular Neuroscience at the University of Sussex, led the research in collaboration with Professors Nigel Leigh and Sarah Newbury from the Brighton and Sussex Medical School, Martin Turner from the University of Oxford, Andrea Malaspina from Queen Mary, University of London, and Albert Ludolph from the University of Ulm.

He said: "In order to effectively diagnose and treat ALS, we are in urgent need of biomarkers as a tool for early diagnosis and for monitoring the efficacy of therapeutic interventions in clinical trials.

"Biomarkers can indicate the disease is present and help us to predict its progression rate.

"In our study, we compared serum samples taken from the blood of 245 patients and controls, analysing their patterns of non-coding ribonucleic acids (ncRNA).

"We found a biomarker signature for motor neurone disease that is made up of a combination of seven ncRNAs. When these ncRNAs are expressed in a particular pattern, we are able to classify whether our samples come from ALS patients or controls."
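To make the idea of a multi-ncRNA signature concrete, a pattern over seven expression levels can in principle be turned into a patient-versus-control classifier. The sketch below fits a plain logistic regression to synthetic data; the effect sizes, sample counts and model are illustrative assumptions, not the study's actual serum data or analysis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic expression levels of 7 ncRNAs for 40 "ALS-like" and 40 control
# samples; the per-marker shifts are invented for illustration.
n, k = 40, 7
shift = np.array([1.0, -0.8, 0.6, 0.9, -0.5, 0.7, -0.6])
X = np.vstack([rng.normal(0, 1, (n, k)) + shift,  # ALS-like samples
               rng.normal(0, 1, (n, k))])         # controls
y = np.concatenate([np.ones(n), np.zeros(n)])

# Plain logistic regression fitted by gradient descent
w, b = np.zeros(k), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probability of ALS
    w -= 0.1 * (X.T @ (p - y)) / y.size
    b -= 0.1 * (p - y).mean()

# Fraction of samples the fitted signature labels correctly
accuracy = (((X @ w + b) > 0) == (y == 1)).mean()
```

The fitted weights play the role of the "particular pattern": it is the weighted combination of the seven markers, not any single one, that separates the two groups.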

Dr Greig Joilin, the research fellow who undertook this work in Professor Hafezparast's team said: "We hope that, with further work to validate these biomarkers, a blood test could be developed to help improve diagnosis of motor neuron disease.

"We are now looking to see whether they can predict prognosis to give patients and their families some insight as they begin to understand the disease. Our work could also help other scientists to measure the effectiveness of potential drug treatments against the ncRNA levels. Further, it provides new insight into the cellular and molecular events that contribute to the disease."

ALS is a group of conditions which affect the nerves in the brain and spinal cord, leading to weakness in the muscles and rapid deterioration.

Doctors still don't know why this happens and there is currently no cure, although existing drug treatments can help patients with daily life and extend life expectancy - but only by two to four months on average.

Stephen Hawking is perhaps one of the most famous cases of motor neuron disease, but more recently Geoff Whaley and his wife Ann brought to light the troubling situation of patients in the UK who wish to end their life before the final phase of the disease takes hold.

Professor Hafezparast hopes that his team's discovery will improve the outlook for patients by improving diagnosis and giving other researchers a valuable tool to test potential treatments. The researchers are now looking to validate this biomarker signature in a larger cohort of patients and begin to understand why these ncRNAs change in ALS patients.

Credit: 
University of Sussex

Half of the world's population exposed to increasing air pollution, study shows

image: Figure 1: (a) Map of global PM2.5 in 2016; (b) Changes in concentrations between 2010 and 2016. Units for both are μg/m3.

Image: 
Professor Gavin Shaddick/University of Exeter

Half of the world's population is exposed to increasing air pollution, new research has shown.

A team of researchers, led by Professor Gavin Shaddick at the University of Exeter, has shown that, despite global efforts to improve air quality, vast swathes of the world's population are experiencing increased levels of air pollution.

The study, carried out with the World Health Organisation, suggests that air pollution constitutes a major, and in many areas increasing, threat to public health.

The research is published in leading journal Climate and Atmospheric Science on Wednesday, June 17th 2020.

Professor Shaddick, Chair of Data Science & Statistics at the University of Exeter said: "While long-term policies to reduce air pollution have been shown to be effective in many regions, notably in Europe and the United States, there are still regions that have dangerously high levels of air pollution, some as much as five times greater than World Health Organization guidelines, and in some countries air pollution is still increasing".

The World Health Organization has estimated that more than four million deaths annually can be attributed to outdoor air pollution.

Major sources of fine particulate matter air pollution include the inefficient use of energy by households, industry, the agriculture and transport sectors, and coal-fired power plants. In some regions, sand and desert dust, waste burning and deforestation are additional sources of air pollution.

Although air pollution affects high- and low-income countries alike, low- and middle-income countries experience the highest burden, with the highest concentrations seen in Central, Eastern, Southern and South-Eastern Asia.

For the study, the research team examined trends in global air quality between 2010 and 2016 against a backdrop of global efforts to reduce air pollution through both short- and long-term policies.

The team used ground monitoring data together with information from satellite retrievals of aerosol optical depth, chemical transport models and other sources to provide yearly air quality profiles for individual countries, regions and globally.
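For intuition, once a yearly gridded PM2.5 surface exists, country-level exposure profiles reduce to population-weighted averages, and the headline "share of population exposed to increasing pollution" to a weighted comparison between years. A minimal sketch with invented numbers (none of these values come from the study):

```python
import numpy as np

# Illustrative grid cells for one country: annual-mean PM2.5 (ug/m3)
# in 2010 and 2016, and the population living in each cell.
pm25_2010 = np.array([12.0, 35.0, 58.0, 21.0])
pm25_2016 = np.array([10.0, 40.0, 66.0, 19.0])
population = np.array([2.0e6, 8.0e6, 5.0e6, 1.0e6])

def pop_weighted_mean(pm25, pop):
    """Population-weighted exposure: sum(pop_i * pm_i) / sum(pop_i)."""
    return float(np.sum(pop * pm25) / np.sum(pop))

exposure_2010 = pop_weighted_mean(pm25_2010, population)
exposure_2016 = pop_weighted_mean(pm25_2016, population)

# Fraction of the population living where PM2.5 rose between the two years
share_increasing = float(population[pm25_2016 > pm25_2010].sum()
                         / population.sum())
```

Weighting by population matters because a small rise in a dense city shifts national exposure far more than a large change over sparsely populated land.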

This methodology constitutes a major advance in the ability to track progress towards the air quality-related indicators of the United Nations' Sustainable Development Goals, and to expand the evidence base on the impacts of air pollution on health.

Professor Shaddick added: "Although precise quantification of the outcomes of specific policies is difficult, coupling the evidence for effective interventions with global, regional and local trends in air pollution can provide essential information for the evidence base that is key in informing and monitoring future policies."

Credit: 
University of Exeter

Red squirrels making comeback as return of pine marten spells bad news for invasive grey squirrel

image: The number of red squirrels is on the increase in Ireland thanks to the return of the pine marten, a native carnivore, a new survey led by NUI Galway has found.

Image: 
Poshey Aherne

The number of red squirrels is on the increase in Ireland thanks to the return of the pine marten, a native carnivore, a new survey led by NUI Galway has found.

The new findings indicate that the return of the red squirrel is due to the decrease in the number of grey squirrels, which compete with them for food and carry a disease that is fatal to the native species. The re-emergence of the pine marten, which had previously almost disappeared in Ireland, is linked to the local demise of the greys.

High densities of pine martens were found in areas, particularly the midlands, where grey squirrels had disappeared, with red squirrel numbers recovering in many of these areas, indicating that they are capable of sharing habitat with the native carnivore, unlike grey squirrels. In urban areas, such as Dublin and Belfast, the grey squirrel continues to thrive.

Grey squirrels were introduced to Ireland early in the twentieth century and spread to cover the eastern half of the island. As a result, the red squirrel's range contracted over several years and the native species was struggling to survive.

The citizen science survey, a cross-border collaboration led by NUI Galway with Ulster Wildlife and the Vincent Wildlife Trust, detected significant changes in the ranges of squirrels and pine martens, particularly in the midlands and Northern Ireland.

Dr Colin Lawton of the Ryan Institute, NUI Galway said: 'This study brought together colleagues from institutions all across the island, and this collaborative approach gives us a full picture of the status of these three mammals in Ireland. We are delighted with the response from the public, who were enthusiastic and showed a wealth of knowledge of Ireland's wildlife. It is great news to see two native species recovering and doing well.'

The report on the survey makes recommendations to ensure that the red squirrel and pine marten continue to thrive, with further monitoring required to allow early intervention if conservation at a local or national level is required.

Dr Lawton added: 'We encourage our citizen scientists to continue to log their sightings of Irish wildlife on the two national database platforms. Our collective knowledge is a powerful tool in conservation.'

The survey used online platforms provided by the National Biodiversity Data Centre (RoI) and Centre for Environmental Data and Recording (NI) to develop the data. It was funded by the National Parks and Wildlife Service.

The All Ireland Squirrel and Pine Marten Survey 2019 report (Irish Wildlife Manual No.121) can be downloaded at https://www.npws.ie/sites/default/files/publications/pdf/IWM121.pdf

Credit: 
University of Galway