Tech

Catalyst opens door to more efficient, environmentally friendly ethylene production

image: Reaction pathways for oxidative dehydrogenation of ethane facilitated by the molten carbonate shell modified perovskite redox catalyst.

Image: 
Fanxing Li, NC State University

A research team led by North Carolina State University has engineered a new catalyst that can more efficiently convert ethane into ethylene, which is used in a variety of manufacturing processes. The discovery could be used in a conversion process to drastically reduce ethylene production costs and cut related carbon dioxide emissions by up to 87%.

"Our lab previously proposed a technique for converting ethane into ethylene, and this new redox catalyst makes that technique more energy efficient and less expensive while reducing greenhouse gas emissions," says Yunfei Gao, a postdoctoral scholar at NC State and lead author of a paper on the work. "Ethylene is an important feedstock for the plastics industry, among other uses, so this work could have a significant economic and environmental impact."

"Ethane is a byproduct of shale gas production, and the improved efficiency of our new catalyst makes it feasible for energy extraction operations in remote locations to make better use of that ethane," says Fanxing Li, corresponding author of the paper and an associate professor and University Faculty Scholar in NC State's Department of Chemical Engineering.

"It is estimated that more than 200 million barrels of ethane are rejected each year in the lower 48 states due to the difficulty of transporting it from remote locations," Li says. "With our catalyst and conversion technique, we think it would be cost effective to convert that ethane into ethylene. The ethylene could then be converted into liquid fuel, which is much easier to transport.

"The problem with current conversion techniques is that you can't scale them down to a size that makes sense for remote energy extraction sites - but our system would work well in those locations."

The new redox catalyst is a molten-carbonate-promoted mixed metal oxide, and the conversion process takes place between 650 and 700 degrees Celsius with integrated ethane conversion and air separation. Current conversion techniques require temperatures above 800 degrees Celsius.

"We estimate that the new redox catalyst and technique cut energy requirements by 60-87%," Li says.

"Our technique would require an initial investment in the installation of new, modular chemical reactors, but the jump in efficiency and ability to convert stranded ethane would be significant," Li says.

Credit: 
North Carolina State University

Underprotected marine protected areas in a global biodiversity hotspot

image: Buoy delimiting the fully protected zone within the Portofino Marine Protected Area, Italy.

Image: 
Joachim Claudet

Through an assessment of the 1,062 marine protected areas (MPAs) in the Mediterranean Sea, covering 6% of the Mediterranean Basin, a research team led by the CNRS has shown that 95% of the total area protected lacks regulations to reduce human impacts on biodiversity. Unevenly distributed across political boundaries and eco-regions, effective levels of protection for biodiversity conservation cover only 0.23% of the Mediterranean Basin. This study, published on April 24th, 2020 in One Earth by scientists from the Centre de recherche insulaire et observatoire de l'environnement (CRIOBE, CNRS/UPVD/EPHE) and the Royal Belgian Institute of Natural Sciences, shows that current efforts are insufficient for managing human uses of nature at sea, and that protection levels should be increased to deliver tangible benefits for biodiversity conservation.

Credit: 
CNRS

Scientists discover just how runny a liquid can be

image: The image shows how fundamental constants of Nature set the fundamental lower limit for liquid viscosity.

Image: 
Image by thehackneycollective.com

Scientists from Queen Mary University of London and the Russian Academy of Sciences have found a limit to how runny a liquid can be.

Viscosity, the measure of how runny a fluid is, is a property that we experience daily when we fill a kettle, take a shower, pour cooking oil or move through air.

We know that liquids get thicker when cooled and runnier when heated, but how runny can a liquid ever get if we keep heating it?

Eventually, the liquid boils and becomes a gas, or, if heated at high enough pressure, a dense gas-like substance. The minimum value of viscosity lies at the point where the system transitions between the liquid-like and gas-like states.

Viscosity is considered impossible to calculate from theory because it strongly depends on liquid structure, composition and interactions as well as external conditions in a complicated way. Nobel laureate Steven Weinberg compared the difficulty of calculating the viscosity of water to the problem of calculating fundamental physical constants, the constants which shape the fabric of our Universe.

Despite this difficulty, the researchers have developed an equation to do so.

In the study, published in Science Advances, they show that two fundamental physical constants govern how runny a liquid can be. Physical constants, or constants of Nature, are measurable properties of the physical universe that do not change.

Their equation relates the minimal value of elementary viscosity (the product of viscosity and volume per molecule) to the Planck constant, which governs the quantum world, and the dimensionless proton-to-electron mass ratio.
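
As a rough numerical illustration, the bound can be evaluated from the two constants named above. The prefactor 1/(4π) used here, and the room-temperature figures for water, are illustrative assumptions consistent with the published result; consult the Science Advances paper for the exact expression.

```python
import math

# Fundamental constants (CODATA values)
hbar = 1.054571817e-34   # reduced Planck constant, J*s
mp_over_me = 1836.15267  # proton-to-electron mass ratio (dimensionless)

# Lower bound on "elementary viscosity" (viscosity * volume per molecule),
# assuming the form (hbar / 4*pi) * sqrt(m_p / m_e) -- prefactor is an
# illustrative assumption, not quoted in this article.
elementary_viscosity_min = hbar / (4 * math.pi) * math.sqrt(mp_over_me)

# Compare with water at room temperature (textbook values, for illustration):
eta_water = 8.9e-4  # dynamic viscosity, Pa*s
v_water = 3.0e-29   # approximate volume per molecule, m^3
ratio = (eta_water * v_water) / elementary_viscosity_min

print(f"bound ~ {elementary_viscosity_min:.2e} J*s; water sits ~{ratio:.0f}x above it")
```

The bound comes out to a few times 10^-34 J·s, only a couple of orders of magnitude below ordinary water, which is why everyday liquids are already surprisingly close to the fundamental limit.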

Professor Kostya Trachenko, lead author of the paper from Queen Mary University of London, said: "This result is startling. Viscosity is a complicated property varying strongly for different liquids and external conditions. Yet our results show that the minimal viscosity of all liquids turns out to be simple and universal."

There are practical implications of discovering this limit too. It could be applied where a new fluid for a chemical, industrial or biological process with a low viscosity is required. One example where this is important is the recent use of supercritical fluids for green and environmentally clean ways of treating and dissolving complex waste products.

In this instance, the discovered fundamental limit provides a useful theoretical guide of what to aim for. It also tells us that we should not waste resources trying to beat the fundamental limit because the constants of Nature will mould the viscosity at or above this point.

Fundamental physical constants and in particular dimensionless constants (fundamental constants that do not depend on the choice of physical units) are believed to define the Universe we live in. A finely-tuned balance between the proton-to-electron mass ratio and another dimensionless constant, the fine structure constant, governs nuclear reactions and nuclear synthesis in stars leading to essential biochemical elements including carbon.

This balance provides a narrow 'habitable zone' where stars and planets can form and life-supporting molecular structures can emerge. Change one of the dimensionless fundamental constants slightly, and the Universe becomes very different, with no stars, heavy elements, planets and life.

Professor Trachenko said: "The lower fundamental limit reminds us how fundamental constants of Nature affect us daily, starting from making a morning cup of tea by extending their overarching rule to specific, yet complex, properties such as liquid viscosity."

Vadim Brazhkin, co-lead author from the Russian Academy of Sciences, added: "There are indications that the fundamental lower limit of liquid viscosity may be related to very different areas of physics: black holes as well as the new state of matter, quark-gluon plasma, which appears at very high temperature and pressure. Exploring and appreciating these and other connections is what makes science ever so exciting."

Credit: 
Queen Mary University of London

Two steps closer to flexible, powerful, fast bioelectronic devices

image: Conformable enhancement-mode, internal ion-gated organic electrochemical transistor (e-IGT)
A) Micrograph displaying the top view of an e-IGT (top). Scale bar, 5 μm. Ultra-flexible, ultra-thin e-IGT array conforming to the surface of a human hand (bottom).
B) Optical micrograph of an e-IGT-based device with four transistors for LFP and spike recording. The anchor hole facilitates insertion of the conformable device into deep layers of cortex. Scale bar, 80 μm.

Image: 
Columbia Engineering

New York, NY--April 24, 2020--Dion Khodagholy, assistant professor of electrical engineering, is focused on developing bioelectronic devices that are not only fast, sensitive, biocompatible, soft, and flexible, but also have long-term stability in physiological environments such as the human body. Such devices would greatly improve human health, from monitoring in-home wellness to diagnosing and treating neuropsychiatric diseases, including epilepsy and Parkinson's disease. The design of current devices has been severely constrained by the rigid, non-biocompatible electronic components needed for safe and effective use, and solving this challenge would open the door to a broad range of exciting new therapies.

In collaboration with Jennifer N. Gelinas of the Department of Neurology and the Institute for Genomic Medicine at Columbia University Irving Medical Center, Khodagholy has recently published two papers. The first, in Nature Materials (March 16), describes ion-driven soft organic transistors that he and Gelinas designed to record individual neurons and perform real-time computation, which could facilitate diagnosis and monitoring of neurological disease.

The second paper, published today in Science Advances, demonstrates a soft, biocompatible smart composite--an organic mixed-conducting particulate material (MCP)--that enables the creation of complex electronic components which traditionally require several layers and materials. It also enables easy and effective electronic bonding between soft materials, biological tissue, and rigid electronics. Because it is fully biocompatible and has controllable electronic properties, MCP can non-invasively record muscle action potentials from the surface of the arm and, in collaboration with Sameer Sheth and Ashwin Viswanathan at Baylor College of Medicine's department of neurosurgery, large-scale brain activity during neurosurgical procedures to implant deep brain stimulation electrodes.

"Instead of having large implants encapsulated in thick metal boxes to protect the body and electronics from each other, such as those used in pacemakers, and cochlear and brain implants, we could do so much more if our devices were smaller, flexible, and inherently compatible with our body environment," says Khodagholy, who directs the Translational NeuroElectronics Lab at Columbia Engineering. "Over the past several years, my group has been working to use unique properties of materials to develop novel electronic devices that allow efficient interaction with biological substrates--specifically neural networks and the brain."

Conventional transistors are made of silicon, so they cannot function in the presence of ions and water; in fact, they break down because of ion diffusion into the device. The devices therefore need to be fully encapsulated in the body, usually in metal or plastic. Moreover, although they work well with electrons, they are not very effective at interacting with ionic signals, which is how the body's cells communicate. These properties restrict abiotic/biotic coupling to capacitive interactions on the surface of the material, resulting in lower performance. Organic materials have been used to overcome these limitations because they are inherently flexible, but the electrical performance of such devices has not been sufficient for real-time brain signal recording and processing.

Khodagholy's team took advantage of both the electronic and the ionic conduction of organic materials to create ion-driven transistors, which they call e-IGTs (enhancement-mode, internal ion-gated organic electrochemical transistors), that have embedded mobile ions inside their channels. Because the ions do not need to travel long distances to participate in the channel switching process, the transistors can be switched on and off quickly and efficiently. The transient responses depend on electron-hole mobility rather than ion mobility, and combine with high transconductance to yield a gain-bandwidth that is several orders of magnitude above that of other ion-based transistors.

The researchers used their e-IGTs to acquire a wide range of electrophysiological signals, such as in vivo recordings of neural action potentials, and to create soft, biocompatible, long-term implantable neural processing units for the real-time detection of epileptic discharges.

"We're excited about these findings," says Gelinas. "We've shown that e-IGTs offer a safe, reliable, and high-performance building block for chronically implanted bioelectronics, and I am optimistic that these devices will enable us to safely expand how we use bioelectronic devices to address neurologic disease."

Another major advance is demonstrated by the researchers in their Science Advances paper: enabling bioelectronic devices, specifically those implanted in the body for diagnostics or therapy, to interface effectively and safely with human tissue, while also making them capable of performing complex processing. Inspired by electrically active cells, similar to those in the brain that communicate with electrical pulses, the team created a single material capable of performing multiple, non-linear, dynamic electronic functions just by varying the size and density of its composite mixed-conducting particles.

"This innovation opens the door to a fundamentally different approach to electronic device design, mimicking biological networks and creating multifunctional circuits from purely biodegradable and biocompatible components," says Khodagholy.

The researchers designed and created mixed-conducting particulate (MCP)-based high-performance anisotropic films, independently addressable transistors, resistors, and diodes that are pattern-free, scalable, and biocompatible. These devices carried out a variety of functions, including recording neurophysiologic activity from individual neurons, performing circuit operations, and bonding high-resolution soft and rigid electronics.

"MCP substantially reduces the footprint of neural interface devices, permitting recording of high-quality neurophysiological data even when the amount of tissue exposed is very small, and thus decreases the risk of surgical complications," says Gelinas. "And because MCP is composed of only biocompatible and commercially available materials, it will be much easier to translate into biomedical devices and medicine."

Both the e-IGTs and MCP hold great promise as critical components of bioelectronics, from wearable miniaturized sensors to responsive neurostimulators. The e-IGTs can be manufactured in large quantities and are accessible to a broad range of fabrication processes. Similarly, MCP components are inexpensive and easily accessible to materials scientists and engineers. In combination, they form the foundation for fully implantable biocompatible devices that can be harnessed both to benefit health and to treat disease.

Khodagholy and Gelinas are now working on translating these components into functional long-term implantable devices that can record and modulate brain activity to help patients with neurological diseases such as epilepsy.

"Our ultimate goal is to create accessible bioelectronic devices that can improve peoples' quality of life," says Khodagholy, "and with these new materials and components, it feels like we have stepped closer to that."

Credit: 
Columbia University School of Engineering and Applied Science

Organic heterostructures composed of one- and two-dimensional polymorph

image: The morphologies of the 1D organic microrod (α phase), the 2D microplate (β phase) and the organic heterostructures (OHSs); the chemical structure of m-B2BCB; and the molecular packing arrangements of the ab plane of the α phase and the ac plane of the β phase.

Image: 
©Science China Press

Organic heterostructures (OHSs) with high spatial and angular precision are key components of organic optoelectronics, such as organic photovoltaics (OPV), organic light-emitting diodes (OLEDs), and photodetectors. Moreover, heterostructures can integrate multiple components into one structure, overcoming the limitation of a single output channel and making additional transmission modes feasible. OHSs also offer inherent advantages, including simple solution-based preparation and synthesis, flexible molecular structure design, and broad spectral tunability.

However, OHSs usually require two or more kinds of organic molecules, which makes it difficult to find universal conditions under which all the materials can grow together. Moreover, phase separation during the self-assembly of OHSs remains a major challenge. Polymorphism is a convenient approach to tuning the chemical and physical properties of an organic crystal based on a single organic compound. Thus, organic materials with polymorphic properties can be used to fabricate OHSs with enormous structural diversity and novel optical and electronic properties.

Very recently, exploiting this polymorphism, Dr. Xue-Dong Wang and colleagues at Soochow University fabricated OHSs from a single organic compound, 3,3'-((1E,1'E)-anthracene-9,10-diylbis(ethene-2,1-diyl))dibenzonitrile (m-B2BCB), which can simultaneously self-assemble into one-dimensional (1D) microrods (α phase) and 2D microplates (β phase). The OHSs grow because of the low interplanar spacing mismatch of 5.8% between the (010) crystal plane of the 2D branch microplate and the (001) plane of the 1D trunk microrod. Both polymorph microcrystals show good optical waveguide performance, with loss coefficients of Rα = 0.022 dB/μm for the 1D microrods and an average Rβ = 0.036 dB/μm for the 2D microplates. More significantly, multiple output channels have been achieved in the OHSs, which exhibit structure-dependent optical signals (green and yellow light) at the different output channels. This work demonstrates the great value of polymorphism in OHSs and could enable applications in multifunctional organic integrated photonic circuits.
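
To put the reported loss coefficients in perspective, a decibel figure per micron can be converted into the fraction of light surviving a given propagation distance. The 50 μm distance below is an illustrative assumption, not a figure from the article.

```python
# Reported optical-loss coefficients from the article:
R_alpha = 0.022  # dB/um, 1D microrod (alpha phase)
R_beta = 0.036   # dB/um, 2D microplate (beta phase, average)

def transmitted_fraction(loss_db_per_um, distance_um):
    """Fraction of light remaining after distance d: I/I0 = 10**(-R*d/10)."""
    return 10 ** (-loss_db_per_um * distance_um / 10)

d = 50  # um, illustrative propagation distance
print(f"microrod:   {transmitted_fraction(R_alpha, d):.0%} of light remains after {d} um")
print(f"microplate: {transmitted_fraction(R_beta, d):.0%} of light remains after {d} um")
```

Over 50 μm the microrod loses about a quarter of the guided light and the microplate about a third, which is consistent with describing both as good waveguides at microcrystal length scales.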

Credit: 
Science China Press

Fueling the world sustainably: Synthesizing ammonia using less energy

image: Ammonia (NH3) is one of the most important industrial chemicals today, synthesized globally for use in fertilizers that then enable food production for approximately 70% of the world's population. Ammonia is currently obtained by reacting nitrogen (N2) from air with hydrogen (H2). This reaction requires high energy and is, therefore, powered by fossil fuels, contributing to over 3% of the global CO2 emissions.

Image: 
Irasutoya, Michikazu Hara

Scientists at Tokyo Institute of Technology (Tokyo Tech) have developed an improved catalyst by taking the common dehydrating agent calcium hydride and adding fluoride to it. The catalyst facilitates the synthesis of ammonia at merely 50 °C, by using only half the energy that existing techniques require. This opens doors to ammonia production with low energy consumption and reduced greenhouse gas emission.

Ammonia is critical for making plant fertilizer, which in turn feeds approximately 70% of the world's population. Industrially, ammonia is produced via the Haber-Bosch process, in which methane is first reacted with steam to produce hydrogen, and the hydrogen is then reacted with nitrogen to give ammonia. The problem with this process is that the yield decreases as the temperature increases. To maintain a good yield, the pressure in the reaction chamber must be raised, which requires much energy. Further, the iron-based catalysts used for the reaction are only effective above 350 °C, and maintaining such high temperatures demands yet more energy. Even then, the yield is only 30-40%.
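
The temperature-yield trade-off follows directly from the thermodynamics of the exothermic ammonia synthesis reaction. The sketch below uses standard textbook values for N2 + 3 H2 → 2 NH3 (not figures from the article) and neglects the temperature dependence of the reaction enthalpy and entropy.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

# Standard textbook values for N2 + 3 H2 -> 2 NH3 (illustrative assumptions):
dH = -92.2e3  # reaction enthalpy, J/mol (exothermic)
dS = -198.7   # reaction entropy, J/(mol*K)

def equilibrium_constant(T):
    """K = exp(-dG/RT) with dG = dH - T*dS, treating dH and dS as constant."""
    dG = dH - T * dS
    return math.exp(-dG / (R * T))

# Equilibrium strongly favors ammonia at room temperature but not at typical
# reactor temperatures, which is why industrial plants compensate with
# very high pressure (Le Chatelier's principle: 4 moles of gas -> 2 moles).
for T in (298, 500, 700):
    print(f"T = {T} K: K = {equilibrium_constant(T):.2e}")
```

The equilibrium constant collapses by many orders of magnitude between room temperature and reactor temperatures, which is the quantitative reason a low-temperature catalyst is so valuable.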

Fossil fuels are currently used to power the process, contributing large amounts of carbon dioxide to the atmosphere. Renewable resource alternatives, such as wind energy, have been applied, but those have not proven sustainable. To increase the yield while reducing harm to the environment, therefore, the reaction must take place at low temperatures. For this to happen, catalysts that enable the reaction at low temperatures are required.

So far, such catalysts have been elusive to scientists. "Conventional catalysts lose the catalytic activity for ammonia formation from N2 and H2 gases at 100-200 °C, even if they exhibit high catalytic performance at high temperatures," remark a group of scientists from Tokyo Tech, Japan, who appear to have finally solved the catalyst problem. The scientists, led by Dr. Michikazu Hara, developed a catalyst that is effective even at 50 °C. "Our catalyst produces ammonia from N2 and H2 gases at 50 °C with an extremely small activation energy of 20 kJ/mol, which is less than half that reported for conventional catalysts," Dr. Hara and colleagues report in their paper published in Nature Communications.
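
The impact of halving the activation energy can be estimated with the Arrhenius relation. The 50 kJ/mol figure for a conventional catalyst below is a hypothetical value chosen only to satisfy "more than double" the reported 20 kJ/mol; the real comparison depends on the specific catalyst.

```python
import math

R = 8.314        # gas constant, J/(mol*K)
T = 50 + 273.15  # reaction temperature reported for the new catalyst, K

Ea_new = 20e3           # activation energy of the new catalyst, J/mol (from the paper)
Ea_conventional = 50e3  # hypothetical conventional value, J/mol (illustrative)

# Arrhenius: k ~ A * exp(-Ea / RT). At fixed temperature (and assuming equal
# pre-exponential factors A), the rate ratio isolates the effect of Ea alone.
rate_ratio = math.exp((Ea_conventional - Ea_new) / (R * T))
print(f"At 50 C, the low-barrier catalyst is roughly {rate_ratio:.0f}x faster")
```

Even with these rough assumptions, the rate advantage at 50 °C runs to several orders of magnitude, which is why conventional catalysts are effectively inactive at this temperature.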

Their catalyst comprises a solid solution of CaFH, with ruthenium (Ru) nanoparticles deposited on its surface. The addition of fluoride (F-) to calcium hydride (CaH2), a common dehydrating agent, is what makes the catalyst effective at lower temperatures and pressures. After conducting spectroscopic and computational analyses, the scientists propose a possible mechanism by which the catalyst facilitates ammonia production.

The calcium-fluoride (Ca-F) bond is stronger than the calcium-hydrogen (Ca-H) bond. The presence of the Ca-F bond therefore weakens the Ca-H bond, and the Ru is able to extract H atoms from the catalyst crystal, leaving electrons in their place. The H atoms then desorb from the Ru nanoparticles as H2 gas, even at 50 °C. The resultant charge repulsion between the trapped electrons and the F- ions in the crystal lowers the energy barrier for releasing these electrons, giving the material a high electron-donating capacity. The released electrons attack the bonds between the nitrogen atoms in the N2 gas, facilitating the production of ammonia.

This new method of ammonia production cuts energy demands, thereby reducing the carbon dioxide emissions from the use of large amounts of fossil fuels. The findings of this study illuminate the possibility of an environmentally sustainable Haber-Bosch process, opening the door to the next revolution in agricultural food production.

Credit: 
Tokyo Institute of Technology

Quantum electrodynamics experiment

image: Artistic visualization: Symmetries constrain the motion of ultracold atoms in the lab.

Image: 
© Cellule

The fundamental laws of physics are based on symmetries that, among other things, determine the interactions between charged particles. Using ultracold atoms, researchers at Heidelberg University have experimentally constructed the symmetries of quantum electrodynamics. They hope to gain new insights for implementing future quantum technologies that can simulate complex physical phenomena. The results of the study were published in the journal Science.

The theory of quantum electrodynamics deals with the electromagnetic interaction between electrons and light particles. It is based on so-called U(1) symmetry, which for instance specifies the movement of particles. With their experiments, the Heidelberg physicists, under the direction of Junior Professor Dr Fred Jendrzejewski, seek to advance the efficient investigation of this complex physical theory. They recently succeeded in experimentally realising one elementary building block. "We see the results of our research as a major step towards a platform built from a chain of properly connected building blocks for a large-scale implementation of quantum electrodynamics in ultracold atoms," explains Prof. Jendrzejewski, who directs an Emmy Noether group at Heidelberg University's Kirchhoff Institute for Physics.

According to the researchers, one possible application would be developing large-scale quantum devices to simulate complex physical phenomena that cannot be studied with particle accelerators. The elementary building block developed for this study could also benefit the investigation of problems in materials research, such as in strongly interacting systems that are difficult to calculate.

Credit: 
Heidelberg University

FSU researchers discover new structure for promising class of materials

Florida State University researchers have discovered a novel structure for organic-inorganic hybrid materials that shows potential for more efficient technologies.

Professor of Chemistry and Biochemistry Biwu Ma and his team have published a new study in the journal Science Advances that explains how they created a hollow nanostructure for metal halide perovskites that allows the material to emit a highly efficient blue light. Metal halide perovskites are a class of materials that has shown great potential for photon-related technologies such as light-emitting diodes and lasers, but scientists are still working to make them more efficient and effective.

"The fabrication of new generation color displays and solid-state lighting requires luminescent materials and devices of the three primary colors, red, green and blue," Ma said. "Although multiple ways of color tuning have been demonstrated for perovskites to achieve highly efficient green and red emissions, producing efficient and stable blue emissions is not trivial. This work provides a facile technique to prepare highly efficient blue emitting thin films."

Ma's research group at FSU has been working on the development and study of metal halide perovskites and perovskite-related materials for optoelectronics and energy applications since 2014. His team has pioneered scientific research on the structural and compositional control of metal halide perovskites and hybrids that would allow them to exhibit unique and useful properties.

In this case, researchers worked with a metal halide perovskite made of cesium lead bromide nanocrystals to build the structure. Previous nanostructures made from this material, including nanoplatelets, nanowires and quantum dots, had positive curvatures; this is the first negative curvature hollow structure of a metal halide perovskite that exhibits pronounced quantum size effects.

"We believe that our work would stimulate exploration of other nanostructures with remarkable and unique properties," Ma said.

Credit: 
Florida State University

Researchers rebuild the bridge between neuroscience and artificial intelligence

image: Advanced learning mechanisms of our brain might lead to more efficient AI algorithms

Image: 
Prof. Ido Kanter, Bar-Ilan University

The origin of machine and deep learning algorithms, which increasingly affect almost all aspects of our lives, is the learning mechanism of the synaptic (weight) strengths connecting neurons in our brain. Attempting to imitate these brain functions, researchers first bridged neuroscience and artificial intelligence over half a century ago. Since then, however, experimental neuroscience has not directly advanced the field of machine learning, and the two disciplines seem to have developed independently.

In an article published today in the journal Scientific Reports, researchers reveal that they have successfully rebuilt the bridge between experimental neuroscience and advanced artificial intelligence learning algorithms. Conducting new types of experiments on neuronal cultures, the researchers were able to demonstrate a new accelerated brain-inspired learning mechanism. When the mechanism was utilized on the artificial task of handwritten digit recognition, for instance, its success rates substantially outperformed commonly-used machine learning algorithms.

To rebuild this bridge, the researchers set out to prove two hypotheses: that the common assumption that learning in the brain is extremely slow might be wrong, and that the dynamics of the brain might include accelerated learning mechanisms. Surprisingly, both hypotheses were proven correct.

"A learning step in our brain is believed to typically last tens of minutes or even more, while in a computer it lasts for a nanosecond, which is roughly a million times a million faster," said the study's lead author Prof. Ido Kanter, of Bar-Ilan University's Department of Physics and Gonda (Goldschmied) Multidisciplinary Brain Research Center. "Although the brain is extremely slow, its computational capabilities outperform, or are comparable to, typical state-of-the-art artificial intelligence algorithms," added Kanter, who was assisted in the research by Shira Sardi, Dr. Roni Vardi, Yuval Meir, Dr. Amir Goldental, Shiri Hodassman and Yael Tugendfaft.
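
The quoted speed comparison can be checked with simple arithmetic, taking ten minutes as a representative brain learning step (the lower end of "tens of minutes") against a one-nanosecond computer step.

```python
# Sanity check of the quoted brain-vs-computer speed comparison.
brain_step_s = 10 * 60  # ten minutes, in seconds
computer_step_s = 1e-9  # one nanosecond, in seconds

speedup = brain_step_s / computer_step_s
print(f"computer step is ~{speedup:.0e}x faster than a brain learning step")
```

The ratio lands near 10^12, matching the "million times a million" figure in the quote.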

The team's experiments indicated that adaptation in our brain accelerates significantly with training frequency. "Learning by observing the same image 10 times in a second is as effective as observing the same image 1,000 times in a month," said Shira Sardi, a main contributor to this work. "Repeating the same image speedily enhances adaptation in our brain to seconds rather than tens of minutes. It is possible that learning in our brain is even faster, but beyond our current experimental limitations," added Dr. Roni Vardi, another main contributor to the research. A learning scheme based on this newly discovered, brain-inspired accelerated mechanism substantially outperforms commonly used machine learning algorithms on tasks such as handwritten digit recognition, especially when only small datasets are available for training.

The reconstructed bridge from experimental neuroscience to machine learning is expected to advance artificial intelligence and especially ultrafast decision making under limited training examples, similar to many circumstances of human decision making, as well as robotic control and network optimization.

Credit: 
Bar-Ilan University

Protein produced in sepsis lowers blood pressure, treatment identified to reverse effects

(Philadelphia, PA) - Overreaction is rarely useful, and in the case of the human immune system, it can be outright deadly. When the body overreacts to an infection, the result is sepsis - a life-threatening condition that frequently leads to acute organ dysfunction, including deterioration of the heart and blood vessels, which make up the cardiovascular system. A major indication that the cardiovascular system is failing in sepsis is a drop in blood pressure, the only treatment for which is fluid replacement.

Now, in a new study published online April 23 in the journal JCI Insight, scientists at the Lewis Katz School of Medicine at Temple University (LKSOM) show that when a molecule known as c-Jun N-terminal kinase (JNK) becomes active in sepsis, it increases the production of a protein called B-type natriuretic peptide (BNP) - the more BNP that is produced in sepsis, the greater the deterioration of cardiovascular function. But perhaps more significantly, in mice, the researchers show that JNK and BNP activity can be halted, reversing cardiovascular damage and reducing the risk of death from sepsis.

"Low blood pressure is characteristic of the most severe form of sepsis, known as septic shock, in which fluid loss and decreased oxygen and nutrient delivery to tissues severely damages organ function," explained Konstantinos Drosatos, PhD, Assistant Professor of Pharmacology and Assistant Professor in the Center for Translational Medicine, the Center for Metabolic Disease Research, and the Alzheimer's Center at LKSOM and senior investigator on the new study.

In previous work, Dr. Drosatos's team found out why heart cells decrease their energy output in sepsis. They also knew from earlier studies that blocking JNK activation could correct cardiovascular dysfunction and lower BNP levels in an animal model of sepsis. The new study expands on this work and demonstrates how treatments aimed at improving cardiovascular function facilitate communication between the heart and blood vessels, which circulate blood throughout the body.

The researchers carried out their investigation of JNK and BNP activation in heart cells and in a mouse model of sepsis. Their experiments revealed a direct relationship between the two molecules, in which activated c-Jun, a protein downstream of JNK, attaches to the gene that encodes BNP. When this happens, the gene is switched on, resulting in the production of BNP. In sepsis, the BNP-encoding gene is always "on," explaining why BNP protein is secreted in excess by the heart.

In separate experiments, the researchers blocked either JNK activation, using a chemical inhibitor, or BNP activity, using an antibody against the protein that was developed in Dr. Drosatos's laboratory. Both approaches restored blood pressure in septic mice, though JNK inhibition yielded the most robust benefits. Inhibition of either molecule also led to improvements in survival from sepsis.

"At a clinical level, JNK or BNP inhibition could stabilize blood pressure and give other medications, such as antibiotics, time to work," Dr. Drosatos explained. "This strategy could be used alongside current supportive strategies, which attempt to slow or prevent fluid loss to stabilize blood pressure."

In follow-up research, Dr. Drosatos is working closely with Nina Gentile, MD, Professor of Emergency Medicine at LKSOM, to explore potential clinical applications of BNP inhibition. "We hope to develop innovative treatments that will work in patients to combat septic shock," Dr. Drosatos said.

Another important next step is to explore whether the new monoclonal antibody against BNP can be used as an intervention to treat low blood pressure in sepsis patients. "In light of the ongoing COVID-19 global pandemic, the association of COVID mortality with viral sepsis, and elevated BNP plasma levels in critical COVID-19 patients, the topic of BNP inhibition is very timely," Dr. Drosatos noted.

Credit: 
Temple University Health System

New targeted agent produces considerable responses in patients with uterine cancer

TORONTO -- In its first clinical trial in patients with a hard-to-treat form of uterine cancer, a targeted drug that subjects tumor cells to staggering levels of DNA damage caused tumors to shrink in nearly one-third of patients, investigators at Dana-Farber Cancer Institute report.

The preliminary results, to be presented online at Thursday's virtual session of the Society for Gynecologic Oncology (SGO) Annual Meeting on Women's Cancer, demonstrated strong activity of WEE1-directed therapy in uterine serous carcinoma (USC), which accounts for about 10% of uterine cancers but up to 40% of deaths from the disease, trial leaders say.

The drug tested in the study - adavosertib - takes advantage of an inherent weakness in the relentless growth of some cancer cells. Their non-stop proliferation creates a condition known as replication stress, where their ability to duplicate their DNA effectively is significantly impaired. The cell cycle - the carefully choreographed process by which cells grow, copy their DNA, and divide into two daughter cells - includes several checkpoints that halt the cycle so DNA can be inspected and repaired, if necessary. In some cancers, a checkpoint fails to function due to a genetic mutation or other problem, allowing the cycle to proceed even as DNA damage accumulates.

USC is one such cancer. More than 90% of cases are marked by a mutation or other abnormality in the TP53 gene, which plays a critical role in the checkpoint between the first phase of cell growth and the DNA-duplication phase. Without a working TP53 gene, cells can barrel into the DNA-duplication phase with extensive DNA damage on board.

The absence of functional TP53 places enormous strain on a checkpoint further on in the cell cycle called G2/M. Providing a final quality check, G2/M guards the entry to mitosis, the act of dividing into two daughter cells. Hobbling G2/M by blocking one of the proteins involved in it could burden tumor cells with so much DNA damage that they cannot survive.

That is the strategy behind adavosertib, which targets a protein called WEE1 that helps regulate the G2/M checkpoint. The new trial marked the first time the drug, which has been tested in patients with other cancers, including breast and ovarian cancer, was tested in patients with USC.

The trial involved 35 patients, all of whom had previously been treated with platinum-based chemotherapy. They took adavosertib orally on a set schedule. At a median follow-up of 3.8 months, 10 of the 34 patients who could be evaluated had shrinkage of their tumors - a response rate of almost 30%.

In some cases, the responses were exceptionally durable, with some patients still responding more than a year after undergoing treatment, study leaders say.

The most common adverse side effects of the treatment were anemia, diarrhea, nausea, and fatigue.

"Adavosertib demonstrated remarkable activity as a single agent in this group of patients," says the study's lead author, Joyce Liu, MD, MPH, of Dana-Farber. "It's especially encouraging in a disease such as USC, for which current treatments are of limited effectiveness."

Credit: 
Dana-Farber Cancer Institute

UTEP researchers develop nanohybrid vehicle to optimally deliver drugs into the human body

image: Mahesh Narayan, Ph.D., along with Sreeprasad Sreenivasan, Ph.D. (not pictured), has developed a nanohybrid vehicle that can be used to optimally deliver drugs into the human body. Both researchers are faculty members in UTEP's Department of Chemistry and Biochemistry.

Image: 
Mahesh Narayan.

EL PASO, Texas - Researchers in The University of Texas at El Paso's Department of Chemistry and Biochemistry have developed a nanohybrid vehicle that can be used to optimally deliver drugs into the human body.

The research was published in April 2020 in ACS Applied Materials & Interfaces. Leading the study are Mahesh Narayan, Ph.D., professor, and Sreeprasad Sreenivasan, Ph.D., assistant professor, both from the Department of Chemistry and Biochemistry and the Border Biomedical Research Center (BBRC) in UTEP's College of Science.

Drug candidates that show promise against a particular disease often are toxic to other cell types. One such drug is the polyphenol ellagic acid (EA). This antioxidant, derived from nature, demonstrates the potential to mitigate pathologies including Parkinson's and Alzheimer's diseases. To selectively use EA in the brain against neurodegenerative disorders requires that its cytotoxic potential be reduced and only its anti-oxidant potential be exploited. Narayan, Sreenivasan and colleagues created a nanohybrid vehicle to circumvent this problem.

"We are very excited about the new drug delivery materials developed by Drs. Narayan and Sreenivasan," said Robert Kirken, Ph.D., dean of UTEP's College of Science. "This platform allows for molecules to be impregnated into the material so that the drug can more specifically target the tumor or other tissue site, thus increasing the beneficial effects of the drug while reducing its negative side effects."

The researchers discovered that encapsulating EA in chitosan, a sugar derived from chitin, the material that forms the hard outer skeleton of shellfish, reduces its inherent cytotoxicity while enhancing its antioxidant properties. The chitosan shell also permits EA delivery in two phases: a rapid burst followed by a relatively slow release. This further enhances drug delivery because the nanohybrid vehicle is well suited to releasing drugs over extended time periods.

"This work creates a new type of bio-friendly drug-delivery vehicle made of recyclable materials," Narayan said. "The other special feature of this vehicle is that it can deliver the drug via two mechanisms: one rapid and the other a slow-release."

Other project collaborators include UTEP doctoral student Jyoti Ahlawat, who led the research project under the supervision of her mentors; Eva Deemer, Ph.D., of UTEP's Department of Materials Science and Engineering; and Rabin Neupane, a graduate student in the department of industrial pharmacy at the University of Toledo.

Narayan's laboratory focuses on mitigating oxidative stress induced by neurotoxins as a means to prevent neurodegenerative disorders such as Parkinson's disease and Alzheimer's disease. Sreenivasan's lab works to bridge and interface chemistry, materials physics, and biological sciences to develop uniquely designed quantum structures and devices.

Credit: 
University of Texas at El Paso

Hummingbirds show up when tropical trees fall down

image: Hummingbirds like this one appeared in droves after a treefall gap opened up in the Panamanian forest.

Image: 
Photo by Henry Pollock, University of Illinois.

URBANA, Ill. - When the tree fell in October 2015, the tropical giant didn't go down alone. Hundreds of neighboring trees went with it, opening a massive 2.5-acre gap in the Panamanian rainforest.

Treefalls happen all the time, but this one just happened to occur in the exact spot where a decades-long ecological study was in progress, giving University of Illinois researchers a rare look into tropical forest dynamics.

"I've been walking around that tree for 30 years now. It was just humongous," says Jeff Brawn, Professor and Stuart L. and Nancy J. Levenick Chair in Sustainability in the Department of Natural Resources and Environmental Sciences at Illinois. "Here we are, running around on this plot for years and all of a sudden I couldn't even find my way around. We just lucked into it."

What's lucky is that Brawn and his colleagues had amassed decades of data on the bird community in that exact spot, meaning they had a clear before-and-after view of what a treefall could mean for tropical birds.

This particular gap meant hummingbirds. Lots and lots of hummingbirds.

"After the treefall, we saw a very large spike in the total number of hummingbird species," says Henry Pollock, a postdoctoral scholar working with Brawn and lead author on a study published in the Journal of Field Ornithology. "Within the previous 25 years of the study, we had only documented three or four hummingbird species, and they were usually present in low numbers. There was one species, the snowy-bellied hummingbird, which we had never captured on either of our two plots in 25 years of sampling. The year after the treefall happened, we got 16 unique individuals of this one species, and total diversity of hummingbirds more than doubled."

The gap also attracted fruit-eating birds. The researchers documented a doubling of this group compared to pre-treefall numbers, with certain species being more than three times as abundant. Other species, including the thick-billed seed-finch, which typically inhabits grasslands, appeared as if out of thin air.

"They just swooped in," Brawn says. "It's analogous to a backyard bird feeder. As soon as you put one in, you'll see species you've never seen before."

And then, almost as quickly, the birds disappeared.

Within one to four years, depending on the species, the birds returned to pre-treefall numbers or were not detected again.

"What that suggests is these birds are incredibly mobile and opportunistic," Pollock says. "They are probably just cruising around the landscape prospecting for their preferred food sources and habitats. Given the sheer size of this gap, it acted as a sort of magnet, pulling in species from potentially kilometers away. I mean, 16 snowy-bellied hummingbirds and we've never caught one before? It's pretty astounding."

Treefalls are a common and necessary occurrence in forests all over the world. As sunshine streams in from above, trees hunkered down in the understory finally get their chance to rise. Basking in the suddenly resource-rich environment, tropical trees and other plants produce nectar-filled flowers and fruit, important food sources for birds and other animals.

Previous research has hinted at how important these food sources are for tropical birds, but no one had documented before-and-after differences until now. Instead, researchers typically compared treefall gaps with intact forest areas at a single time point. That approach has its uses, but it can't capture what Brawn and Pollock found: just how quickly the birds arrived on the scene, and how quickly they left.

"I was just really astonished at how quickly and how efficiently these birds seem to be able to find and exploit a new source of food," Brawn says.

Gaps don't stay open long in the tropics. Understory trees shoot up, elbowing each other out of the way to take the top spot. Soon, there's no evidence a gap - or its riotous array of feathered occupants - was there at all.

As short-lived as they may be, treefall gaps represent critical opportunities for species turnover, especially in the tropics where forest fires are comparatively rare.

"This kind of periodic disturbance is probably necessary for these birds to persist in the landscape matrix," Pollock says. "That's true for many organisms and ecosystems; our study provides evidence to back that up in these birds."

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

How much does it cost California cannabis growers to safety test?

The high cost of testing cannabis in California leads to higher prices for the consumer, which could drive consumers to unlicensed markets.

A new study from researchers at the University of California, Davis, finds the safety tests cost growers about 10 percent of the average wholesale price of legal cannabis. The biggest share of this expense comes from failing the test.

"Testing itself is costly," said study author Dan Sumner, a professor of agricultural economics at UC Davis. "But growers have to destroy the product that doesn't pass the test and that is where the biggest losses occur."

In California, every batch of cannabis has to be tested for more than 100 contaminants before it can be sold to consumers by a licensed retailer. The safety testing laws -- the most stringent in the nation -- include testing for 66 pesticides, with tolerance levels lower than those allowed for any other agricultural product.

Sumner said an elaborate track-and-trace system for cannabis plants makes it difficult for a batch that failed testing to enter the legal market.

Zero tolerance

Most testing failures are the result of the state's low- or zero-tolerance levels for pesticide residues. Food that complies with food-safety regulations can carry pesticide residue levels higher than those permitted under cannabis laws and regulations.

While labs can and often do re-test cannabis that fails, some labs have reported up to a 10 percent variation in test results taken from the same sample. The cost of testing also varies by batch size, especially for batches under 10 pounds. The maximum batch size allowed in California is 50 pounds, but many are smaller than 15 pounds. Sumner said failure rates declined from 5.6 percent in 2018 to 4 percent in 2019.
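The study's point that destroyed product, not the lab fee, drives the biggest losses can be illustrated with a toy expected-cost model. The lab fee and wholesale batch value below are hypothetical placeholders, not figures from the study; only the 4 percent failure rate is the 2019 number quoted above.

```python
def expected_testing_cost(test_fee, batch_value, failure_rate):
    """Expected per-batch cost of safety testing: the lab fee plus the
    expected loss from having to destroy a batch that fails the test."""
    return test_fee + failure_rate * batch_value

# Hypothetical numbers: a $500 lab fee on a batch worth $15,000 wholesale,
# with the 4% failure rate reported for 2019.
cost = expected_testing_cost(test_fee=500, batch_value=15_000, failure_rate=0.04)
print(cost)  # fee ($500) plus expected failure loss ($600)
```

Even with these made-up inputs, the expected loss from destroyed product ($600) exceeds the testing fee itself, which is the pattern Sumner describes.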

High costs vs. unlicensed market

The study finds that higher testing costs translate into higher prices for the licensed cannabis market.

"No one wants a policy shift away from testing cannabis," said Sumner. "But for price-sensitive consumers, the alternative is an illegal market. That means they consume a product with no testing at all."

Further investigation is needed to determine the costs and benefits of current regulations in relationship to the unlicensed cannabis market, he said.

Credit: 
University of California - Davis

Insects: Largest study to date finds declines on land, but recoveries in freshwater

image: The global number of land-dwelling insects is in decline.

Image: 
Gabriele Rada

A worldwide compilation of long-term insect abundance studies shows that the number of land-dwelling insects is in decline. On average, there is a global decrease of 0.92% per year, which translates to approximately 24% over 30 years. At the same time, the number of insects living in freshwater, such as midges and mayflies, has increased on average by 1.08% each year. This is possibly due to effective water protection policies. Despite these overall averages, local trends are highly variable, and areas that have been less impacted by humans appear to have weaker trends. These are the results from the largest study of insect change to date, including 1676 sites across the world, now published in the journal Science. The study was led by researchers from the German Centre for Integrative Biodiversity Research (iDiv), Leipzig University (UL) and Martin Luther University Halle-Wittenberg (MLU). It fills key knowledge gaps in the context of the much-discussed issue of "insect declines".

Over the past few years, a number of studies have been published that show dramatic declines in insect numbers through time. The most prominent, from nature reserves in Western Germany, suggested remarkable declines of flying insect biomass (>75% decrease over 27 years). This was published in 2017 and sparked a media storm suggesting a widespread "insect apocalypse". Since then, there have been several follow-up publications from different places across the world, most showing strong declines, others less so, and some even showing increases. But so far, no one has combined the available data on insect abundance trends across the globe to investigate just how widespread and severe insect declines are. Until now.

Largest data compilation to date

An international team of scientists collaborated to compile data from 166 long-term surveys performed at 1676 sites worldwide, between 1925 and 2018, to investigate trends in insect abundances (number of individuals, not species). The complex analysis revealed a high variation in trends, even among nearby sites. For example, in countries where many insect surveys have taken place, such as Germany, the UK and the US, some places experienced declines while others quite close by indicated no changes, or even increases. However, when all of the trends across the world were combined, the researchers were able to estimate how total insect abundances were changing on average across time. They found that for terrestrial insects (insects that spend their whole lives on land, like butterflies, grasshoppers and ants), there was an average decrease of 0.92% per year.

Insects disappear quietly

First author Dr Roel van Klink, a scientist at iDiv and UL, said: "0.92% may not sound like much, but in fact it means 24% fewer insects in 30 years' time and 50% fewer over 75 years. Insect declines happen in a quiet way and we don't take notice from one year to the next. It's like going back to the place where you grew up. It's only because you haven't been there for years that you suddenly realise how much has changed, and all too often not for the better."
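The compounding arithmetic behind these figures is easy to verify. A minimal sketch (the annual rates are the study's; the helper function is just illustrative):

```python
def cumulative_change(annual_rate, years):
    """Total fractional change after compounding an annual rate over `years` years."""
    return (1 + annual_rate) ** years - 1

terrestrial_30 = cumulative_change(-0.0092, 30)  # ~ -24%
terrestrial_75 = cumulative_change(-0.0092, 75)  # ~ -50%
freshwater_30 = cumulative_change(0.0108, 30)    # ~ +38%

print(f"Terrestrial insects, 30 years: {terrestrial_30:.1%}")
print(f"Terrestrial insects, 75 years: {terrestrial_75:.1%}")
print(f"Freshwater insects, 30 years: {freshwater_30:.1%}")
```

An annual decline of 0.92% compounds to roughly a 24% loss over 30 years and 50% over 75 years, matching the quoted numbers, while the 1.08% annual freshwater increase compounds to about 38% over 30 years.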

Insect declines were strongest in some parts of the US (West and Midwest) and in Europe, particularly in Germany. For Europe in general, trends became on average more negative over time, with the strongest declines since 2005.

Fewer insects in the air

When reporting about "insect decline", the mass media have often referred to the "windscreen phenomenon": people's perception that there are fewer insects being splattered on the windscreens of their cars now compared to some decades ago. The new study confirms this observation, at least on average. Last author Jonathan Chase, professor at iDiv and MLU, said: "Many insects can fly, and it's those that get smashed by car windshields. Our analysis shows that flying insects have indeed decreased on average. However, the majority of insects are less conspicuous and live out of sight - in the soil, in tree canopies or in the water."

For the new study, the researchers also analysed data from many of these hidden habitats. This showed that on average, there are fewer insects living in the grass and on the ground today than in the past - similar to the flying insects. By contrast, the number of insects living in tree canopies has, on average, remained largely unchanged.

Freshwater insects have recovered

At the same time, studies of insects that live (part of) their lives under water, like midges and mayflies, showed an average annual increase of 1.08%. This corresponds to a 38% increase over 30 years. This positive trend was particularly strong in Northern Europe, in the Western US, and since the early 1990s, in Russia. For Jonathan Chase this is a good sign. He said: "These numbers show that we can reverse these negative trends. Over the past 50 years, several measures have been taken to clean up our polluted rivers and lakes in many places across the world. This may have allowed the recovery of many freshwater insect populations. It makes us hopeful that we can reverse the trend for populations that are currently declining."

Roel van Klink added: "Insect populations are like logs of wood that are pushed under water. They want to come up, while we keep pushing them further down. But we can reduce the pressure so they can rise again. The freshwater insects have shown us this is possible. It's just not always easy to identify the causes of declines, and thus the most effective measures to reverse them. And these may also differ between locations."

No simple solutions

Ann Swengel, co-author of the study, has spent the last 34 years studying butterfly populations across hundreds of sites in Wisconsin and nearby states in the US. She stresses how complex the observed abundance trends are and what they mean for effective conservation management: "We've seen so much decline, including on many protected sites. But we've also observed some sites where butterflies are continuing to do well. It takes lots of years and lots of data to understand both the failures and the successes, species by species and site by site. A lot is beyond the control of any one person, but the choices we each make in each individual site really do matter."

Habitat destruction most likely causes insect declines

Although the scientists were unable to say for certain exactly why such trends - both negative and positive - emerged, they were able to point to a few possibilities. Most importantly, they found that destruction of natural habitats - particularly through urbanisation - is associated with the declines of terrestrial insects. Other reports, such as the IPBES Global Assessment, also noted that land-use change and habitat destruction are a main cause of global biodiversity change.

This new study was made possible by iDiv's synthesis centre sDiv. It is currently the most comprehensive analysis of its kind. It depicts the global status of insects and shows where insect protection is most urgently needed.

Credit: 
German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig