Earth

New fault zone measurements could help us to understand subduction earthquakes

image: The outcrop of the pseudotachylyte-bearing fault zone in pelagic sedimentary rocks.

Image: 
University of Tsukuba

Tsukuba, Japan - A research team from the University of Tsukuba has conducted detailed structural analyses of a fault zone in central Japan, with the aim of helping to identify the specific conditions that lead to earthquake faulting, a hazard that can cause enormous social damage.

Subduction is a geological process that takes place where two tectonic plates meet, such as at the Japan Trench, in which one plate moves under another and is forced to sink. Regions where this process occurs are known as subduction zones, and the seismic activity they produce causes devastating damage through ground shaking and tsunamis. However, understanding these seismic processes can be difficult because of the problems associated with taking measurements from their deepest sections, where much of the activity occurs.

"To overcome this problem, we examined fault rocks exhumed from the source depths of subduction earthquakes, which are now exposed at the land surface in the Jurassic accretionary complex in central Japan," explains study lead author Professor Kohtaro Ujiie. "At this complex, we were able to examine pseudotachylyte, a solidified frictional melt produced during subduction earthquakes, to help us infer what may occur in subduction zones deep beneath the oceans."

The exposed fault zone was characterized using a range of techniques, including scanning electron microscopy and Raman spectroscopy, to provide a detailed picture of the pseudotachylytes and to constrain the heating conditions at the time of their formation.

"The pseudotachylyte at the site derived from the frictional melting of black carbonaceous mudstone together with chert, which accumulated under low-oxygen conditions," says Ujiie. "Thermal fracturing tends to occur along slip zones flanked by rocks with high thermal diffusivities, such as chert, and may happen during seismic slip within the Jurassic accretionary complex. This thermal fracturing could lead to a fluid pressure drop in the slip zone and a reduction in the stiffness of the surrounding rocks, potentially contributing to the generation of frictional melt and the acceleration of seismic slip."

The seismic slip processes recorded in the studied complex may be applicable to other fault zones with similar rock layers, such as the Japan Trench subduction zone. The data gathered from this area could therefore be useful in future attempts to describe or model the subduction earthquakes that cause ground shaking and tsunami risk.

Credit: 
University of Tsukuba

Bacilli and their enzymes show prospects for several applications

The Laboratory of Enzyme Biosynthesis and Bioengineering of Kazan Federal University is engaged in several projects in fundamental and applied fields of biology and medicine. For several years, the main focus of research has been the production and characterization of enzymes. Enzymes such as subtilisin-like proteinase, glutamyl endopeptidase, and metalloproteinase have been described in detail in the dissertations of Ye. Mikhailova, T. Shamsutdinov, and N. Rudakova, respectively. In her dissertation, Iuliia Danilova, who is a co-author of this new paper, characterized the biological effects of all three enzymes. Enzymes, such as phytases, have been studied and described in the works of A. Suleimanova and A. Akhmetova. A vital contribution to the research on these proteins was made by N. Balaban. The study of enzymes is supervised by Professor Margarita Sharipova.

This publication is devoted to the description of different microbial enzymes with prospects for practical application. The interest in microbial enzymes is due to the inability of animal and plant proteolytic enzymes to fully meet the needs of the global population. Microorganisms are an accessible source of enzymes owing to their wide variety, the safety of handling, ease of cultivation, and genetic transformability. Screening and characterizing these enzymes from different sources offer many benefits from both environmental and industrial standpoints. Microorganisms are considered an important source of proteases because they can be rapidly obtained in large quantities using established fermentation methods, and they produce a regular and abundant supply of the desired product. Microorganisms are producers of a wide range of proteins and enzymes with useful properties. In addition, microbial proteins have a longer shelf life and can be stored under sub-optimal conditions for several weeks without a significant loss in activity.

In recent years, several studies have been carried out on the characterization of bacillary proteases. Researchers from different countries have obtained enzymes with fibrinolytic and thrombolytic properties. Strains of the genus Bacillus are known to produce proteases that are highly active in degrading the β-amyloid peptide, which causes Alzheimer's disease. It is recognized that most microbes live in complex communities called biofilms. Biofilm formation can be detrimental in healthcare, agriculture, and industry. The anti-biofilm activity of bacillary proteases has been tested on a large number of bacterial biofilms. Phytases are considered a means of increasing the availability of organic phosphorus in grain feed for farm animals, which could provide an alternative to the mineral sources of phosphorus currently added to feed.

The lack of therapeutic agents in areas of medicine such as cardiovascular therapy, antiviral therapy, and the treatment of neurodegenerative diseases has spurred the search for new therapeutic enzyme preparations. Fibrinolytic proteases represent a promising alternative to existing drugs for thrombolytic therapy. The proteins described in the article are also proposed as potential agents for drug development against Alzheimer's disease. The creation of drugs based on bacilli or their metabolites that disrupt the integrity of the biofilm matrix is innovative and promising.

To increase the efficiency of animal feed, microbial enzymes that increase the digestibility of nutrients are used. The publication describes commercially available feed enzymes of the phytase and protease families, which are included mainly in the feed for pigs and poultry.

Further investigations will focus on developing technologies that reduce the cost of enzyme production while scaling up its application in practice.

Credit: 
Kazan Federal University

Radical changes in ecosystems

Earth and all the living organisms on it are constantly changing. But is there any way we can detect whether these changes are occurring at an abnormal rate? What are the consequences of these changes for the organisms affected? An international team of researchers including scientists from Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU) has developed a method for detecting such developments and tracking how new ecosystems are formed. They have published their findings in the journal Science.

Changes in the environment are becoming increasingly apparent due to climate change. Temperatures are rising, and rain either fails to fall at all or comes down in severe storms. These changes affect ecosystems and the living conditions of the organisms within them. If conditions within an ecosystem become too unfavourable for individual organisms, they either migrate or die out. At the same time, new ecological niches arise that other species can populate, or new species evolve. 'The problem is knowing at which point we can say it's a new ecosystem,' explains Prof. Wolfgang Kießling from the Chair of Palaeoenvironmental Research at FAU. 'We have now developed a method that allows us to distinguish such events from normal background noise.'

When is it a new ecosystem?

An ecosystem is considered new by scientists if extremely rapid changes in the range of organisms within it lead to a state that previously did not exist. The speed at which the changes occur is extremely important. Ecosystems are always changing to a certain extent. Significant shifts only occur above a certain limit, for example as caused by man-made climate change. Scientists are now able to precisely determine this limit using statistics.

Creation of new ecosystems is risky for the species involved

Times of increased change are extremely dynamic and pose special challenges for the species involved. Wolfgang Kießling and his colleagues successfully tested their method on fossil ecosystems from a period spanning 66 million years, and their results are alarming. The risk of extinction during such dynamic periods of change is two to four times higher than under normal conditions. At the same time, however, there is a greater chance that new species will migrate in or evolve. 'Changes have always occurred in ecosystems and they will continue to do so,' Kießling continues. 'In terms of protecting the environment, it's therefore important not to prevent changes in general, but to try and steer them in a direction that does not involve an increased risk of extinction.'

Credit: 
Friedrich-Alexander-Universität Erlangen-Nürnberg

Well oriented

Polypropylene (PP) is one of the most widely used plastics in the world. By controlling the spatial orientation of the propylene building blocks and additional polar components, it should be possible to create a new generation of attractive, engineered, specialty plastics, with improved wettability or enhanced degradability, based on PP. In the journal Angewandte Chemie, Japanese scientists have introduced the basis for a new class of palladium catalysts for such polymerizations.

The properties of PP depend largely on the spatial orientation of the individual monomers as they are added to the growing chain (tacticity). In atactic PP (aPP) the orientation is random. In syndiotactic PP (sPP) the CH₃ side groups on the monomers alternately point toward the two sides of the polymer backbone. The most sought-after version--isotactic PP (iPP), in which all of the side groups point the same way--has particularly advantageous mechanical properties. Incorporation of additional functional, polar monomers into iPP is an important step toward the development of novel plastics.
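
To make the three tacticity classes concrete, here is a minimal illustrative sketch (not from the paper) that labels a chain by the pattern of its side-group orientations, encoded as 'u' (up) or 'd' (down) along the backbone:

```python
# Illustrative toy, not from the study: classify a PP chain's tacticity
# from the orientations of its CH3 side groups.
def tacticity(side_groups: str) -> str:
    if len(set(side_groups)) == 1:
        return "isotactic"     # all side groups point the same way (iPP)
    if all(a != b for a, b in zip(side_groups, side_groups[1:])):
        return "syndiotactic"  # strict alternation (sPP)
    return "atactic"           # random orientations (aPP)

print(tacticity("uuuuuu"))  # isotactic
print(tacticity("ududud"))  # syndiotactic
print(tacticity("uudduu"))  # atactic
```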

This type of copolymerization is heavily restricted with conventional Ziegler-Natta and metallocene catalysts because typical polar monomers first need to be "masked", meaning they must be attached to special protective groups. With nickel and palladium catalysts it is possible to work without masking, but at the cost of significant losses in isotacticity. There has been some success with special nickel and palladium phosphine complexes (phosphines are a class of phosphorus-containing organic compounds), though synthesis of these catalysts is arduous and time-consuming.

Researchers working with Kyoko Nozaki at the University of Tokyo have now developed a new approach that allows more suitable catalysts to be produced much more easily. The spatial orientation of propylene monomers during polymerization is influenced by the special spatial structure (stereogenicity) at certain carbon atoms in the organic menthol substituents on the phosphine. The researchers wanted to develop phosphine compounds that have the required stereogenicity at the phosphorus atom.

To avoid the tedious synthetic challenges faced to date, they developed significantly faster synthetic protocols using storable, modular building blocks and phosphinites (a class of organic compounds containing phosphorus and oxygen). This allowed for the rapid and easy synthesis of many different phosphines and their corresponding palladium complexes. A rapid screening process successfully yielded suitable catalyst candidates.

In this way, the scientists found catalysts that polymerize propylene with polar monomers to form copolymers with particularly high isotacticity--a material they called isotactic polar polypropylene (iPPP).

Credit: 
Wiley

Children with asymptomatic brain bleeds as newborns show normal brain development at age 2

image: John H. Gilmore, MD, is senior author of the study.

Image: 
UNC School of Medicine

CHAPEL HILL, N.C. - Oct. 30, 2020 - In 2007, UNC researchers published unexpected and surprising results from a study based on magnetic resonance imaging (MRI) of newborn brains. Twenty-six percent of the newborns in the study were found to have asymptomatic subdural hemorrhages, or bleeding in and around the brain.

It was an unexpected finding because subdural hemorrhage had been considered unusual in full-term newborns. But the 2007 findings suggested that small, asymptomatic brain bleeds might be a fairly common consequence of a normal vaginal delivery.

Now 13 years later, John H. Gilmore, MD, professor and vice chair of research in the UNC Department of Psychiatry and senior author of the 2007 study, and J. Keith Smith, MD, PhD, vice chair of the UNC Department of Radiology, have published a follow-up study in the journal Radiology, which also published the 2007 study.

"We were one of the first groups to systematically scan the brains of newborns and were very surprised to discover that small subdural bleeds are very common," said Gilmore, senior author of the new study and director of the UNC Center of Excellence in Community Mental Health. "Since the bleeds were so common, we believed that they did not have a significant impact on brain development, but had no hard data to know for sure. This follow-up study is reassuring and demonstrates that children with these minor perinatal bleeds have normal cognitive development at two years of age."

The new article is based on data collected from 311 infants between 2003 and 2016 as part of the UNC Early Brain Development Study. Neurodevelopmental outcomes were evaluated at two years of age using the Mullen Scales of Early Learning (MSEL). All of the infants had MRI brain scans and were evaluated for subdural hemorrhage as neonates and at ages one and two years.

In comparing the children with a history of subdural hemorrhage to those without, study authors found no differences between the two groups in either MSEL scores or in total gray matter volumes. Also, at age two there was no evidence of rebleeding in the children who had subdural hemorrhages as neonates.

"There are two really important findings of this work," said Smith, who is the corresponding author of the 2020 study. "These small bleeds, which are very common, do not seem to harm brain development, and they also go away and don't predispose to later bleeding or other abnormalities."

Credit: 
University of North Carolina Health Care

A new spin on atoms gives scientists a closer look at quantum weirdness

image: Artist's rendering of a method of measuring and controlling quantum spins developed at Princeton University.

Image: 
Rachel Davidowitz

When atoms get extremely close, they develop intriguing interactions that could be harnessed to create new generations of computing and other technologies. These interactions in the realm of quantum physics have proven difficult to study experimentally due to the basic limitations of optical microscopes.

Now a team of Princeton researchers, led by Jeff Thompson, an assistant professor of electrical engineering, has developed a new way to control and measure atoms that are so close together no optical lens can distinguish them.

Described in an article published Oct. 30 in the journal Science, their method excites closely spaced erbium atoms in a crystal using a finely tuned laser in a nanometer-scale optical circuit. The researchers take advantage of the fact that each atom responds to slightly different frequencies, or colors, of laser light, allowing them to resolve and control multiple atoms without relying on spatial information.

In a conventional microscope, the space between two atoms effectively disappears when their separation is below a key distance called the diffraction limit, which is roughly equal to the light's wavelength. This is analogous to two distant stars that appear as a single point of light in the night sky. However, this is also the scale at which atoms start to interact and give rise to rich and interesting quantum mechanical behavior.
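
As a rough worked example (assuming the standard Abbe expression and erbium's telecom-band emission near 1536 nm, with a numerical aperture close to 1; neither figure is given in this article), the smallest resolvable separation is

$$d \approx \frac{\lambda}{2\,\mathrm{NA}} \approx \frac{1536\ \text{nm}}{2 \times 1} \approx 770\ \text{nm},$$

hundreds of times larger than the nanometer-scale separations at which atoms begin to interact.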

"We always wonder, at the most fundamental level -- inside solids, inside crystals -- what do atoms actually do? How do they interact?" said physicist Andrei Faraon, a professor at the California Institute of Technology who was not involved in the research. "This [paper] opens the window to study atoms that are in very, very close proximity."

Studying atoms and their interactions at tiny distances allows scientists to explore and control a quantum property known as spin. A form of angular momentum, spin is usually described as being either up or down (or both, but that's another story). When the distance between two atoms grows vanishingly small -- mere billionths of a meter -- the spin of one exerts influence over the spin of the other, and vice versa. As spins interact in this realm, they can become entangled, a term scientists use to describe two or more particles that are inextricably linked. Entangled particles behave as if they share one existence, no matter how far apart they later become.

Entanglement is the essential phenomenon that separates quantum mechanics from the classical world, and it's at the center of the vision for quantum technologies. The new Princeton device is a stepping stone for scientists to study these spin interactions with unprecedented clarity.

One important feature of the new Princeton device is its potential to address hundreds of atoms at a time, providing a rich quantum laboratory in which to gather empirical data. It's a boon for physicists who hope to unlock reality's deepest mysteries, including the spooky nature of entanglement.

Such inquiry is not merely esoteric. Over the past three decades, engineers have sought to use quantum phenomena to create complex technologies for information processing and communication, from the logical building blocks of emerging quantum computers, capable of solving otherwise impossible problems, to ultrasecure communication methods that can link machines into an unhackable quantum Internet. To develop these systems further, scientists will need to entangle particles reliably and exploit their entanglement to encode and process information.

Thompson's team saw an opportunity in erbium. Traditionally used in lasers and magnets, erbium was not widely explored for use in quantum systems because it is difficult to observe, according to the researchers. The team made a breakthrough in 2018, developing a way to enhance the light emitted by these atoms, and to detect that signal extremely efficiently. Now they've shown they can do it all en masse.

When the laser illuminates the atoms, it excites them just enough for them to emit a faint light at a unique frequency, but delicately enough to preserve and read out the atoms' spins. These frequencies change ever so subtly according to the atoms' different states, so that "up" has one frequency and "down" has another, and each individual atom has its own pair of frequencies.

"If you have an ensemble of these qubits, they all emit light at very slightly different frequencies. And so by tuning the laser carefully to the frequency of one or the frequency of the other, we can address them, even though we have no ability to spatially resolve them," Thompson said. "Each atom sees all of the light, but they only listen to the frequency they're tuned to."

The light's frequency is then a perfect proxy for the spin. Switching the spins up and down gives researchers a way to make calculations. It's akin to transistors that are either on or off in a classical computer, giving rise to the zeroes and ones of our digital world.
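
A toy numerical sketch can make this frequency addressing concrete. The atom count, linewidth, and frequency spread below are illustrative assumptions, not parameters from the study; each atom is modeled with a simple Lorentzian response to the laser:

```python
import numpy as np

# Illustrative toy, not the Princeton team's code: address one emitter out
# of several by tuning a narrow laser to its unique transition frequency.
rng = np.random.default_rng(0)

N_ATOMS = 5
LINEWIDTH = 1.0e6   # Hz; assumed homogeneous linewidth of each atom
SPREAD = 1.0e9      # Hz; assumed spread of transition frequencies

# Each atom has its own transition frequency (offset from a common carrier).
freqs = rng.uniform(-SPREAD / 2, SPREAD / 2, N_ATOMS)

def response(laser_freq):
    """Lorentzian response of every atom to a narrow laser at laser_freq."""
    detuning = freqs - laser_freq
    return 1.0 / (1.0 + (2.0 * detuning / LINEWIDTH) ** 2)

# Tune the laser onto atom 2: it responds fully, the others barely at all,
# even though no lens could tell the atoms apart spatially.
for i, r in enumerate(response(freqs[2])):
    print(f"atom {i}: relative response {r:.3e}")
```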

To form the basis of a useful quantum processor, these qubits will need to go a step further.

"The strength of the interaction is related to the distance between the two spins," said Songtao Chen, a postdoctoral researcher in Thompson's lab and one of the paper's two lead authors. "We want to make them close so we can have this mutual interaction, and use this interaction to create a quantum logic gate."

A quantum logic gate requires two or more entangled qubits, making it capable of performing uniquely quantum operations, such as computing the folding patterns of proteins or routing information on the quantum internet.

Thompson, who holds a leadership position at the U.S. Department of Energy's new $115M quantum science initiative, is on a mission to bring these qubits to heel. Within the materials thrust of the Co-Design Center for Quantum Advantage, he leads the sub-thrust on qubits for computing and networking.

His erbium system, a new kind of qubit that is especially useful in networking applications, can operate using the existing telecommunications infrastructure, sending signals in the form of encoded light over silicon devices and optical fibers. These two properties give erbium an industrial edge over today's most advanced solid-state qubits, which transmit information through visible light wavelengths that don't work well with optical-fiber communication networks.

Still, to operate at scale, the erbium system will need to be further engineered.

While the team can control and measure the spin state of its qubits no matter how close they get, and use optical structures to produce high-fidelity measurement, they can't yet arrange the qubits as needed to form two-qubit gates. To do that, engineers will need to find a different material to host the erbium atoms. The study was designed with this future improvement in mind.

"One of the major advantages of the way we have done this experiment is that it has nothing to do with what host the erbium sits in," said Mouktik Raha, a sixth-year graduate student in electrical engineering and one of the paper's two lead authors. "As long as you can put erbium inside it and it doesn't jitter around, you're good to go."

Credit: 
Princeton University, Engineering School

Standardized measures needed to screen kinship foster placements

Kinship caregiving--placing a child in a relative's home if the child cannot safely stay in the family home--is becoming more common and is a preferred option for children, says UBC Okanagan Assistant Professor Sarah Dow-Fleisner.

The study was originally conducted at the Children and Family Research Center, part of the University of Illinois School of Social Work. The data were reanalyzed by UBCO student Kathrine Stene as part of an honours thesis in psychology, with the work completed by researchers at UBCO's Centre for the Study of Services to Children and Families.

While it seems to make sense to keep a child with a relative, Dow-Fleisner says there are no clear screening tools for agencies to use that address the unique circumstance of kinship caregiving--which is in stark contrast to the tools available when screening voluntary non-relative foster caregivers.

"Kinship care can be an informal or formal placement arranged between individuals related to the youth, either biologically, culturally or legally through marriage," says Dow-Fleisner, who teaches in the School of Social Work. "And while this placement type is preferred as it maintains family connection and cultural ties for the child, there are no standardized and validated measures available to evaluate the quality of care available in those unique placements."

The problem, according to Susan Wells, professor emerita of psychology and social work and principal investigator of the original project, is that there is very little research examining the measurement of quality of care within kinship placement settings.

"We need the development of a scale for assessing the quality of care in a kinship setting and also explore to see if such a tool would work consistently," says Wells. "Considering the differences between kinship and traditional foster care placements it is necessary that a standardized measure of quality of care be available for use in a kinship care setting."

To address the problem, researchers conducted focus groups with caregivers, children and caseworkers and then extensively reviewed the literature to develop a tool to measure the quality of care unique to kinship settings. The final tool includes 36 items that fall under five key criteria for kinship settings: the caregiver's capacity to meet the child's needs, their commitment to and acceptance of foster caregiving, their social functioning, their ability to protect the child from maltreatment, and neighbourhood support.

Each aspect has the potential to provide insight into interventions and supports to improve the quality of care.

"This measure has the potential to be utilized by child protection workers as part of the initial assessment for placement in kinship settings and for ongoing screening, and could be used in conjunction with other screening tools," Dow-Fleisner says. "By using these measures together, caseworkers may be able to determine which services, or lack thereof, impact the quality of care provided for children in kinship placements."

Credit: 
University of British Columbia Okanagan campus

Molecular compass for cell orientation

image: When the scientists removed one part of the receptor, CANAR, the plant's veins formed seemingly randomly (right side). The left side shows normal vein formation.

Image: 
Jakub Hajný / IST Austria

Plants have veins that transport nutrients throughout their whole body. These veins are organized in a highly ordered manner. The plant hormone auxin travels directionally from cell to cell and provides cells with positional information, coordinating them during vein formation and regeneration. Until now, it remained a mystery how cells translate the auxin signal into the formation of a complex system of veins. Scientists at the Institute of Science and Technology (IST) Austria have discovered a molecular machinery that perceives the local auxin concentration and allows cells to synchronize their behavior to coordinate vein formation and regeneration. The scientists published their study in the journal Science. The findings also apply to wound healing and might lead to more mechanically resistant plants, with further implications for agriculture.

The human body uses veins and blood to transport nutrients and oxygen throughout the body. Plants use a similar approach: the vascular system. These veins transport nutrients for survival, define the size, structure, and position of new leaves, and allow long-range communication between distant organs. Now, scientists from the group of Prof Jiri Friml at IST Austria have discovered how the plant hormone auxin dictates the position of newly formed veins. "Auxin decides which cells will differentiate into vascular tissue and orchestrates them to form intricate vein patterns," explains Jakub Hajný, who led the study. When cells cannot sense the auxin signal, the plant forms disorganized veins with disconnections that limit the distribution of nutrients. Such plants also regenerate poorly after mechanical damage.

Cell orientation in a tissue

Decades ago, scientists already suspected that auxin is the vein-inducing signal that organizes tissue into conserved vein patterns. Until now, however, it was not understood how cells decrypt this chemical signal into a cellular response. The Friml group managed to identify the responsible proteins, called CAMEL and CANAR, which together serve as an auxin sensor. The CAMEL/CANAR complex most likely perceives the auxin concentration in the cell's neighborhood and allows cells to synchronize their orientations to create continuous veins. "It is basically a molecular compass for cell orientation, only instead of a magnetic field, it detects auxin concentration," explains Jakub Hajný. The team thus discovered the molecular machinery underlying auxin-mediated vein formation and regeneration.

Credit: 
Institute of Science and Technology Austria

Sensors driven by machine learning sniff out gas leaks fast

image: ALFaLDS is deployed during blind tests at the model oil and gas test facility at Fort Collins, Colorado.

Image: 
Los Alamos National Laboratory

LOS ALAMOS, N.M., October 28, 2020--A new study confirms the success of a natural-gas leak-detection tool pioneered by Los Alamos National Laboratory scientists that uses sensors and machine learning to locate leak points at oil and gas fields, promising new automatic, affordable sampling across vast natural gas infrastructure.

"Our automated leak location system finds gas leaks fast, including small ones from failing infrastructure, and lowers cost as current methods to fix gas leaks are labor intensive, expensive and slow," said Manvendra Dubey, the lead Los Alamos National Laboratory scientist and coauthor of the new study. "Our sensors outperformed competing techniques in sensitivity to detecting methane and ethane. In addition, our neural network can be coupled to any sensor, which makes our tool very powerful and will enable market penetration."

The Autonomous, Low-cost, Fast Leak Detection System (ALFaLDS) was developed to discover accidental releases of methane, a potent greenhouse gas, and won a 2019 R&D 100 award. ALFaLDS detects, locates and quantifies a natural gas leak based on real-time methane and ethane (in natural gas) and atmospheric wind measurements that are analyzed by a machine-learning code trained to locate leaks. The code is trained using Los Alamos National Laboratory's high resolution plume dispersion models and the training is finessed on-site by controlled releases.
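
The paper's code is not reproduced here, but the recipe described above -- train a model on simulated plume data to map gas and wind readings to a leak location -- can be sketched in miniature. The toy plume model, network size, and single-sensor layout below are all illustrative assumptions, not details of ALFaLDS:

```python
# Illustrative sketch, not the ALFaLDS code: regress leak location from
# methane concentration and wind readings using synthetic plume data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)

def synthetic_sample():
    """One training example: sensor readings for a random leak position."""
    leak_xy = rng.uniform(-50, 50, size=2)  # leak location (m); the target
    wind = rng.uniform(-5, 5, size=2)       # mean wind vector (m/s)
    d = leak_xy                             # sensor fixed at the origin
    along = d @ wind / (np.linalg.norm(wind) + 1e-9)  # downwind distance
    across = np.sqrt(max(d @ d - along**2, 0.0))      # crosswind distance
    # Crude Gaussian-plume-like concentration at the sensor (toy physics).
    conc = np.exp(-across**2 / (2 * (5 + abs(along))**2)) / (1 + abs(along))
    return np.array([conc, *wind]), leak_xy

X, y = map(np.array, zip(*(synthetic_sample() for _ in range(5000))))
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
model.fit(X_tr, y_tr)

err = np.linalg.norm(model.predict(X_te) - y_te, axis=1)
print(f"mean location error: {err.mean():.1f} m")
```

In the real system, training is done against Los Alamos's high-resolution plume dispersion models and refined on-site with controlled releases, and time-resolved measurements replace this single synthetic snapshot.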

Test results using blind releases at an oil and gas well-pad facility at Colorado State University in Fort Collins, Colorado, demonstrated that ALFaLDS locates engineered methane leaks precisely and quantifies their size. This novel capability for locating leaks with high skill, speed and accuracy at lower cost promises new automatic, affordable sampling of fugitive gas leaks at well pads and oil and gas fields, the paper in the journal Atmospheric Environment: X concludes.

ALFaLDS's success in locating and quantifying fugitive methane leaks at natural gas facilities could lead to a 90 percent reduction in methane emissions if implemented by the industry.

ALFaLDS uses a small sensor, which makes it ideal for deployment on cars and drones. The Los Alamos team developed the sensors, which were integrated with a mini 3D sonic anemometer and the machine-learning code in these studies.

However, the code is autonomous and can read data from any gas and wind sensors to help find leaks fast and minimize fugitive emissions from the vast network of natural gas extraction, production and consumption.

With this integration, ALFaLDS offers a revolutionary leak-detection approach to oil and gas service providers, to non-profit organizations surveying the issue, and to national laboratories and academia researching natural gas production.

Credit: 
DOE/Los Alamos National Laboratory

HSE Faculty of Chemistry scientists discover new anti-cancer molecule

image: Snapshot from molecular dynamics simulation of compound 2c bound to tubulin. A system of hydrogen bonds was formed that involves the ligand, protein residues Asn349 and Lys352 and a water molecule

Image: 
Victor S. Stroylov et al.

A group of Moscow scientists has discovered and explained the activity mechanism of a new anti-cancer molecule -- diphenylisoxazole. This molecule has been shown to be effective against human cancer cells. The research, published in the journal Bioorganic & Medicinal Chemistry Letters (https://www.sciencedirect.com/science/article/pii/S0960894X20307198?via%3Dihub), makes it possible to produce an affordable cancer treatment drug.

Every cell in our body has a cytoskeleton, a system of microtubules and filaments that support the cell's rigid shape. Microtubules are formed by the protein tubulin and play a key role in the division of both healthy and tumor cells. Therefore, microtubules are a target for antimitotics -- anti-cancer drugs that inhibit tumor growth by disrupting tubulin polymerization. Because the unlimited proliferation of cancer cells is what makes the disease so dangerous, many drugs aim at inhibiting this process.

The tubulin molecule has four binding sites (sites where it can interact with a drug), namely the colchicine, taxane/epothilone, laulimalide and vinca alkaloid binding sites. Several substances are known to bind with tubulin at the colchicine site and ultimately disrupt tubulin polymerization, and all of them contain a trimethoxyphenyl ring.

With the help of computer simulations, the Moscow researchers determined which compounds, including those without a trimethoxyphenyl ring, were able to bind to tubulin, and were able to predict the effectiveness of a new substance for such studies -- diphenylisoxazole. This molecule is unique in that it is easily synthesized using available compounds -- benzaldehydes, acetophenones, and aryl nitromethanes.

The simulation also showed for the first time that the molecule of a substance needn't have a trimethoxyphenyl ring in order to bind to tubulin at the colchicine site. All previously known tubulin polymerization inhibitors interacting with the colchicine site had a trimethoxyphenyl substituent in their structure, but this element is absent in diphenylisoxazole. This means that there is a yet unexplored structural class of compounds with antimitotic activity that can be used to create anti-cancer drugs with new properties.

It was later shown that diphenylisoxazole inhibits tubulin polymerization in sea urchin embryos, whose rapid cell division resembles that of cancer, making them a frequent subject of such studies. Adding diphenylisoxazole to a vessel containing fertilized sea urchin eggs inhibited cell reproduction and caused the embryos to rotate instead of swimming forward. This observation indicates that the substance affected the cells' microtubules. Subsequent experiments proved the molecule's effectiveness not only on sea urchin embryos but also on human cancer cells.

The scientists pointed out that not only the results of the research but also its methodology hold value.

According to HSE University professor Igor Svitanko (https://www.hse.ru/en/org/persons/219432788), one of the authors of the study, 'Previous work by these researchers on the synthesis of drugs against leukaemia and rheumatoid arthritis, as well as on other anti-cancer drugs, has shown the importance of this sequence in designing the scientific experiment -- first simulating the structure of the matter with the desired properties, and only then synthesising and testing its biological activity. Posing the question in this way gives only secondary importance to organic synthesis and requires that it take the simplest possible path to the predicted structure. This makes it possible to dramatically reduce the cost of finding and introducing new drugs.'

Professor Svitanko also said that computer modelling makes it possible for young researchers, who lack years of experience and synthetic intuition, to participate in such complex studies. HSE University has proposed creating a new computer-modelling laboratory that would synthesize new drugs and other substances from computer-predicted structures.

Credit: 
National Research University Higher School of Economics

Brainstem neurons control both behaviour and misbehaviour

image: Gene expression determines the differentiation path of embryonic brainstem precursor cells into excitatory (glutamatergic) or inhibitory (GABAergic) neurons.

Image: 
Samir Sadik-Ogli

A recent study at the University of Helsinki reveals how gene control mechanisms define the identity of developing neurons in the brainstem. The researchers also showed that a failure in differentiation of the brainstem neurons leads to behavioural abnormalities, including hyperactivity and attention deficit.

The mammalian brain is big, but the state of its activity is controlled by a much smaller number of neurons. Many of these are located in the brainstem, an evolutionarily conserved part of the brain, which controls mood, motivation and motor activity. What are the brainstem neurons like? How do they develop in the embryonic brain? How are defects in their development reflected in brain activity and behaviour?

The research group, led by Professor Juha Partanen at the Faculty of Biological and Environmental Sciences, University of Helsinki, has addressed these questions by studying gene regulation in the embryonic brainstem.

The phenotype of a neuron is, to a large extent, determined early in the embryo. We have shown how certain selector genes, which are expressed soon after the onset of neuronal differentiation and control the activity of other neuron-specific genes, determine the identity of the developing neuron.

The past few years have provided us with very powerful tools to study gene expression in individual cells. By analysing gene products in embryonic brain cells, we can now follow the differentiation paths of neurons and examine what exactly happens when the developing cells take different paths - for example in becoming a neuron either inhibiting or activating its target. Differentiation paths branch to produce the remarkable neuronal diversity that brain function is based on. According to the gene-expression-based identities, the immature neurons find their location in the brain and make contacts with other components of the neural circuitry.

What if the gene expression signposts point in wrong directions and the developing neurons are misrouted? In the brainstem, this has grave consequences on both brain function and behaviour.

"We have studied mice with an imbalance in the differentiation of neurons that either activate or inhibit the dopaminergic and serotonergic neurotransmitter systems. These mice are hyperactive and impulsive, and they show changes in reward sensing and learning. Their hyperactivity can be alleviated with drugs used to treat human attention and hyperactivity deficits," Partanen clarifies.

"Despite active research, the developmental basis of many human behavioural disorders is still poorly understood," Partanen concludes. "We do not know yet if the human counterparts of the neurons we studied are involved in these deficits. Nevertheless, from the perspective of behavioural regulation, this specific group of neurons is highly important and there is still a lot to learn about them."

Credit: 
University of Helsinki

Copolymer helps remove pervasive PFAS toxins from environment

image: Illinois engineers Kwiyong Kim, left, Xiao Su, Johannes Elbert and Paola Baldaguez Medina are part of a team that developed a new polymer electrode device that can capture and destroy PFAS contaminants present in water.

Image: 
Photo by L. Brian Stauffer

CHAMPAIGN, Ill. -- Researchers have demonstrated that they can attract, capture and destroy PFAS - a group of federally regulated substances found in everything from nonstick coatings to shampoo and nicknamed "the forever chemicals" due to their persistence in the natural environment.

Using a tunable copolymer electrode, engineers from the University of Illinois at Urbana-Champaign captured and destroyed perfluoroalkyl and polyfluoroalkyl substances present in water using electrochemical reactions. The proof-of-concept study is the first to show that copolymers can drive electrochemical environmental applications, the researchers said.

The results of the study are published in the journal Advanced Functional Materials.

"Exposure to PFAS has gained intense attention recently due to their widespread occurrence in natural bodies of water, contaminated soil and drinking water," said Xiao Su, a professor of chemical and biomolecular engineering who led the study in collaboration with civil and environmental engineering professors Yujie Men and Roland Cusick.

PFAS are typically present in low concentrations, and devices or methods designed to remove them must be highly selective toward them over other compounds found in natural waters, the researchers said. PFAS are electrically charged, held together by highly stable bonds, and are water-resistant, making them difficult to destroy using traditional waste-disposal methods.

"We have found a way to tune a copolymer electrode to attract and adsorb - or capture - PFAS from water," Su said. "The process not only removes these dangerous contaminants, but also destroys them simultaneously using electrochemical reactions at the opposite electrode, making the overall system highly energy-efficient."

To evaluate the method, the team used various water samples that included municipal wastewater, all spiked with either a low or moderate concentration of PFAS.

"Within three hours of starting the electrochemical adsorption process in the lab, we saw a 93% reduction of PFAS concentration in the low concentration spiked samples and an 82.5% reduction with a moderate concentration spiked samples, which shows the system can be efficient for different contamination contexts - such as in drinking water or even chemical spills," Su said.

Based on concepts first proposed in Su's previous work on arsenic removal, the process combines the separation and reaction steps in one device. "This is an example of what we call process intensification, which we believe is an important approach for addressing environmental concerns related to energy and water," Su said.

The team plans to continue to work with various emerging contaminants, including endocrine disruptors. "We are also very interested in seeing how these basic copolymer concepts might work outside of environmental systems and help perform challenging chemical separations, such as drug purification in the pharmaceutical industry," Su said.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Priming the immune system to attack cancer

Immunotherapies, such as checkpoint inhibitor drugs, have made worlds of difference for the treatment of cancer. Most clinicians and scientists understand these drugs to act on what's known as the adaptive immune system, the T cells and B cells that respond to specific threats to the body.

New research from an international team co-led by George Hajishengallis of the University of Pennsylvania School of Dental Medicine suggests that the innate immune system, which responds more generally to bodily invaders, may be an important yet overlooked component of immunotherapy's success.

Their work, published in the journal Cell, found that "training" the innate immune system with β-glucan, a compound derived from fungus, inspired the production of innate immune cells, specifically neutrophils, that were primed to prevent or attack tumors in an animal model.

"The focus in immunotherapy is placed on adaptive immunity, like checkpoint inhibitors inhibit the interaction between cancer cells and T cells," says Hajishengallis, a co-senior author on the work. "The innate immune cells, or myeloid cells, have not been considered so important. Yet our work suggests the myeloid cells can play a critical role in regulating tumor behavior."

The current study builds on earlier work published in Cell by Hajishengallis and a multi-institutional team of collaborators, which showed that trained immunity, elicited through exposure to the fungus-derived compound β-glucan, could improve immune recovery after chemotherapy in a mouse model.

In that previous study, the researchers also showed that the "memory" of the innate immune system was held within the bone marrow, in hematopoietic stem cells that serve as precursors of myeloid cells, such as neutrophils, monocytes, and macrophages.

The team next wanted to get at the details of the mechanism by which this memory was encoded. "The fact that β-glucan helps you fight tumors doesn't necessarily mean it was through trained immunity," says Hajishengallis.

To confirm that link, the researchers isolated neutrophils from mice that had received the innate immune training via exposure to β-glucan and transferred them, along with cells that grow into melanoma tumors, to mice that had not received β-glucan. Tumor growth was significantly dampened in animals that received cells from mice that had been trained.

To further support this link between myeloid precursors and the protective quality of trained immunity, the scientists performed bone marrow transplants, transferring bone marrow cells from "trained" mice to untrained mice that had been irradiated, effectively eliminating their own bone marrow.

When challenged later, the mice that were recipients of bone marrow from trained mice fought tumors much better than those that received bone marrow from untrained mice.

"This is innate immune memory at work," said Technical University Dresden's Triantafyllos Chavakis, a long-term collaborator of Hajishengallis and co-senior author of the study.

The experiment relied on the memory of bone marrow precursors of neutrophils of the trained donor mice, which were transferred by transplantation to the recipient mice and gave rise to neutrophils with tumor-killing ability.

The researchers found that the antitumor activity likely resulted from trained neutrophils producing higher levels of reactive oxygen species, or ROS, than did untrained neutrophils. ROS can cause harm in certain contexts, but in cancer they can be beneficial, as they act to kill tumor cells.

Looking closely at the myeloid precursors in the bone marrow of trained animals, the team found significant changes in gene expression that biased the cells toward making neutrophils, specifically a type associated with anti-tumor activity, a classification known as tumor-associated neutrophils type I (TAN1).

Further investigation revealed that these changes elicited by innate immune training cause an epigenetic rewiring of bone marrow precursor cells, changes that acted to make certain genes more accessible to being transcribed and also pointed to the Type I interferon signaling pathway as a likely regulator of innate immune training. Indeed, mice lacking a receptor for Type I interferon couldn't generate trained neutrophils.

β-glucan is already in clinical trials for cancer immunotherapy, but the researchers say this finding suggests a novel mechanism of action and points to new treatment approaches.

"This is a breakthrough concept that can be therapeutically exploited for cancer immunotherapy in humans," Hajishengallis says, "specifically by transferring neutrophils from β-glucan-trained donors to cancer patients who would be recipients."

Credit: 
University of Pennsylvania

Losing ground in biodiversity hotspots worldwide

image: Biologists have identified 34 areas on the globe where biodiversity is both extremely high and at risk.

Image: 
Francesco Cherubini/NTNU

Between 1992 and 2015, the world's most biologically diverse places lost an area more than three times the size of Sweden when the land was converted to other uses, mainly agriculture, or gobbled up by urban sprawl.

These losses all occurred in what are called "biodiversity hotspots": 34 areas scattered across the globe that contain "exceptional concentrations of endemic species that were undergoing exceptional loss of habitat," according to the originators of the idea. To be considered a hotspot, an area must have already lost at least 70 per cent of its primary vegetation and yet still remain home to at least 1,500 species of plants found nowhere else on Earth.

When the concept of a biodiversity hotspot was first introduced in 2000, the idea was that governments and land managers could focus their conservation efforts on these areas, because protecting them would protect the greatest number of species in the most at-risk places.

Now, researchers from the Norwegian University of Science and Technology (NTNU) have found that even these high-profile, extremely important areas are losing ground to agriculture and urban development. Their analysis is the first ever to look at all hotspots worldwide over a long time frame of nearly a quarter of a century.

"We see that not even focusing protection on a small range of areas worked well," said Francesco Cherubini, the senior author on the paper. Cherubini is a professor at NTNU and director of the university's Industrial Ecology Programme. "There was major deforestation even in areas that were supposed to be protected."

The findings have just been published in Frontiers in Ecology and the Environment.

Cherubini and his colleagues from the Industrial Ecology Programme were able to document this trend by analysing high-resolution land-cover maps released by the European Space Agency. The maps present information on land cover worldwide from 1992-2015, or 24 years, at a resolution of 300 metres.

By integrating the maps with maps of the world's biodiversity hotspots, the researchers were able to see how land cover in the hotspots changed over the period.

The researchers also wanted to see if protected areas inside hotspots fared better than areas outside of the protected areas, but still within the hotspots. For this, they determined which protected areas were inside hotspots based on the World Database on Protected Areas.

In both cases, the trends were not encouraging, said Xiangping Hu, the first author of the paper and a researcher at the Industrial Ecology Programme.

At least 148 million hectares in the hotspots -- that's 3.2 times the size of Sweden -- were converted from the vegetation present in 1992 to some other use, the researchers found. Those losses over 24 years equated to 6 per cent of the hotspots' total area.

Most of these losses -- nearly 40 per cent, or 54 million hectares -- were forests. Agricultural expansion gobbled up 38 million hectares of the areas that were once forests, Hu said.
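
As a quick back-of-envelope check of these figures (taking Sweden's land area as roughly 45 million hectares, a number not given in the article):

```python
# Sanity check of the reported figures; Sweden's ~45 Mha is an outside value.
lost_mha = 148                # hectares lost in hotspots, in millions
print(lost_mha / 45)          # ~3.3, consistent with "3.2 times Sweden"
print(lost_mha / 0.06)        # implied total hotspot area: ~2,467 Mha
print(54 / lost_mha)          # ~0.36, i.e. "nearly 40 per cent" was forest
```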

The three hotspots that lost the most forested area were Sundaland (which includes much of Indonesia), Indo-Burma and Mesoamerica.

The characteristic that links these three hotspots and makes them most susceptible to losing forests is that they are all in the tropics, Cherubini said.

"The soils in these areas are very fertile, and agricultural yields can be very high," he said. "So it's very productive land from an agricultural point of view, and attractive to farmers and local authorities who have to think about rising local incomes by feeding a growing population."

Another issue is that the Earth's rapidly growing population is boosting demand for agricultural products -- and tropical areas are more at risk of being converted to fields and pastures because of socioeconomic and political factors, Cherubini said.

That, combined with weak environmental protection laws and regulations, explains why forests are being transformed into farms, he said.

"if you don't have strong measures that can prevent conversion of key habitat to agricultural production, that's where you have the expansion," he said. "But those are also areas which are exposed to food insecurity."

And although the planet's booming population does put pressure on regions to increase food production, the reality is that most of the land is used to produce palm oil or soybeans for cattle feed, not food for people directly. And growing crops to feed beef cattle doesn't really benefit local populations in the long term, Cherubini said.

"You have these big companies that are making these investments, with high risks of land overexploitation and environmental degradation" in producing cattle feed, he said. "The local  population might get some benefits from revenues, but not much."

The second question the researchers wanted to answer was if specific protected areas inside hotspots actually did what they were supposed to do.

Here again, the findings were somewhat discouraging. Even formally protected areas lost the equivalent of 5 per cent of their forest cover during the 24 years the researchers looked at.

The good news was that protected areas within hotspots generally lost less forest cover than the land outside protected areas, especially during the most recent period the researchers looked at, from 2010-2015.

The researchers also found that some hotspots gained forested areas, especially the mountains of Central Asia, the Irano-Anatolian area and the Atlantic Forest in South America.

"Most of this increase in forests was due to reforestation of agricultural land. However, while planting trees on agricultural land will help over the long term, it can't make up for the losses of biodiversity over the short term," said Francesca Verones, another co-author who is also a professor at the Industrial Ecology Programme.

Cherubini and his colleagues say that the trends they have identified will only grow stronger as time goes on unless there is a concerted effort to reverse the losses.

There are things that people can do to reduce the pressure to convert lands, Cherubini said. Some of these actions were described in a special report from the Intergovernmental Panel on Climate Change, for which Cherubini was one of the lead authors.

"Increasing efficiency in agricultural production and the food value chain and distribution, cutting food waste and a change in diets to eat less meat can all help decrease pressure on land, which will make more space for conservation efforts and climate change mitigation," he said.

Nevertheless, climate change mitigation itself has the potential to put pressure on land by requiring areas for bioenergy crops or for tree plantations to soak up CO2, the researchers wrote.

And because many of these highly diverse areas are in poorer countries, biodiversity conservation won't succeed unless the issues of poverty are addressed, Cherubini said.

"We need to be able somehow to link protection to poverty alleviation, because most of the biodiversity hotspots are in underdeveloped countries and it's difficult to go there and say to a farmer, 'Well, you need to keep this forest -- don't have a rice paddy or a field to feed your family'," he said. "We need to also make it possible for the local communities to benefit from protection measures. They need income, too."

Credit: 
Norwegian University of Science and Technology

'Time machine' offers new pancreatic cancer drug testing approach

image: Bumsoo Han, a professor of mechanical engineering, developed a tool that is helping scientists to more efficiently discover and test new drugs for pancreatic cancer.

Image: 
Purdue University/Rebecca McElhoe

WEST LAFAYETTE, Ind. -- Patients with pancreatic cancer have only about a 10% chance of surviving five years past their diagnosis, in part because their tumors tend to become resistant to chemotherapy, past studies have indicated.

A "time machine" that Purdue University engineers designed to observe pancreatic cancer behavior over time suggests a new drug testing approach that could help scientists better catch resistance.

The researchers found that testing potential drugs on multiple tumor cell subtypes - rather than on just one cell subtype - can reveal drug resistance that may occur due to how different cancer subtypes interact with each other.

The study was recently published in the Royal Society of Chemistry journal Lab on a Chip.

"The drug discovery and screening process has been using one cancer cell subtype and studying how it interacts with neighboring non-cancer cells, but this may overestimate the efficacy of the drug," said Bumsoo Han, a Purdue professor of mechanical engineering and program leader of the Purdue Center for Cancer Research. Han has a courtesy appointment in biomedical engineering.

"By condensing time to look at how cancer cells interact within a pancreatic tumor, we found that one cancer cell subtype can not only be more drug-resistant than the others, but drug-sensitive cells can also become resistant through interaction between the subtypes."

The "time machine" is a type of lab tool called a microfluidic device. These devices are gum strip-sized platforms, such as a chip or slide, where cancer cells can be cultured in channels smaller than a millimeter in diameter. The cells then grow in a lifelike environment on the platform, such as in a collagen tube that Han's lab created to mimic the pancreatic duct.

Microfluidic devices are starting to become more mainstream in the drug development process because they allow scientists to test drugs in realistic simulations of a biological system using real tissue samples, but on a faster time scale than in animal models.

Han's group found that about 25% of 2019 research publications indexed by PubMed, a biomedical literature database, had used microfluidic devices as models to study tumors from animals or patients.

But most microfluidic devices just show late-stage tumor growth. With Han's device, scientists can load in cell lines from an animal model or patient before gene mutation has happened, enabling them to see all stages of tumor progression.

While findings made using microfluidic devices need to be validated in humans before being put into clinical practice, they can still shorten the drug development process by offering new research approaches.

The findings from Han's device highlight the need for studying interactions between cancer cells.

"Not much research has been done on what kind of interaction happens within tumors, so those mechanisms of drug resistance have been overlooked," Han said.

These findings are already informing the development of new drug compounds.

Zhong-Yin Zhang, the director of the Purdue Institute for Drug Discovery, is using Han's microfluidic device to test a compound aimed at blocking an oncogenic process that Zhang's lab has previously identified as playing a role in cancer development.

The device allows Zhang's team to evaluate the compound not only for pancreatic cancer specifically, but also on multiple cancer cell subtypes.

"The nice thing about this device is that we don't have to use as much of a compound to see how well it works," said Zhang, who is also a distinguished professor of medicinal chemistry in Purdue's Department of Medicinal Chemistry and Molecular Pharmacology and Department of Chemistry.

Credit: 
Purdue University