Tech

New aluminum and samarium hexaboride-based composite material with near-zero expansion

image: Typical optical micrographs of the composite material

Image: 
Serebrennikov et al. / Results in Physics, 2021

Precision, or invar, alloys have been developed by scientists for more than a century. These iron- and nickel-based alloys keep their dimensions virtually unchanged within a given temperature range. Because of this, they are used in the manufacture of precision gauges, length standards, components of mechanical dials, and similar devices. However, invar alloys lack many other useful physical characteristics, which limits their use in other areas, for example, those that require high thermal conductivity. Therefore, scientists have long been trying to create a composite material based on other metals that would combine thermal expansion typical of invar alloys with additional physical properties.

A team of researchers from Immanuel Kant Baltic Federal University (BFU) suggested their own approach to this problem. To develop a new composite material, they used a traditional method for reducing the thermal expansion of functional materials: ceramic or other particles with considerably lower thermal expansion than the host metal are added to it. This time, however, the scientists added an intermediate valence compound to the mix. Unlike ordinary integer-valence compounds, such compounds can have anomalous properties: for example, some of them shrink when heated, and the degree of this shrinkage can be tuned. Composites based on a metal and an intermediate valence system therefore allow one to manage their thermal expansion and bring it down to almost zero, which considerably widens the range of their applications.
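As a rough illustration of why an admixture with negative expansion is attractive, here is a minimal sketch (not the authors' model) using a simple linear rule of mixtures; all numbers are hypothetical except aluminum's typical expansion coefficient:

```python
# Illustrative only: a simple rule-of-mixtures estimate of composite thermal
# expansion. The filler value below is hypothetical, not from the study.

def composite_cte(alpha_matrix, alpha_filler, vol_frac_filler):
    """Volume-weighted average of the two coefficients of thermal expansion."""
    return (1 - vol_frac_filler) * alpha_matrix + vol_frac_filler * alpha_filler

# Aluminum expands strongly (~23e-6 / K); a filler with *negative* expansion
# can pull the composite toward zero net expansion.
alpha_al = 23e-6       # 1/K, typical for aluminum
alpha_filler = -40e-6  # 1/K, hypothetical negative-expansion filler

# Volume fraction at which the composite's net expansion vanishes:
v_zero = alpha_al / (alpha_al - alpha_filler)
print(f"zero-expansion filler fraction: {v_zero:.2f}")
print(f"CTE at that fraction: {composite_cte(alpha_al, alpha_filler, v_zero):.2e} / K")
```

In practice the expansion of real composites also depends on the stiffness of the phases and on temperature, so this linear estimate only shows the basic trade-off.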

In their study, the team used aluminum and samarium hexaboride. Although both substances are well known, this was the first time they had been combined. To obtain the composite, the components were hot-pressed in powder form. After that, the team examined the result under an optical microscope and used X-ray tomography to probe the internal structure of the sample without additional polishing and finishing. Using layer-by-layer scanning, the scientists built a 3D model of the new material and found that the samarium hexaboride particles were evenly distributed in the aluminum. This confirmed that the composite was fit for further studies. To measure its thermal expansion, the team used capacitive dilatometry over the temperature range of 10-210 K. The sample showed zero thermal expansion at 45 K and demonstrated invar behavior up to 60 K.
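To illustrate how a zero-expansion point can be read off dilatometry-style data, here is a short sketch on synthetic length-versus-temperature values (not the measured curve); the zero crossing is placed at 45 K only to mirror the reported result:

```python
# Sketch: locating the zero-crossing of thermal expansion from dilatometry-style
# length-vs-temperature data. The data here are synthetic, not the measured curve.

def linear_cte(temps, lengths):
    """Finite-difference estimate of alpha(T) = (1/L0) * dL/dT per interval."""
    l0 = lengths[0]
    return [
        ((lengths[i + 1] - lengths[i]) / (temps[i + 1] - temps[i])) / l0
        for i in range(len(temps) - 1)
    ]

# Synthetic sample whose length is minimal at 45 K, mimicking invar-like behavior:
temps = list(range(10, 215, 5))                         # K
lengths = [1.0 + 1e-9 * (t - 45) ** 2 for t in temps]   # arbitrary units

alphas = linear_cte(temps, lengths)

# alpha changes sign where the length curve bottoms out:
t_zero = None
for i in range(len(alphas) - 1):
    if alphas[i] < 0 <= alphas[i + 1]:
        t_zero = temps[i + 1]
        break
print(f"zero thermal expansion near {t_zero} K")
```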

"Our work is the first in its field, and we are not ready to consider scaling to the industrial level yet. Currently, we are focused on specific problems that require unique solutions. The issue of reducing the thermal expansion of functional materials by adding small particles of low- or zero-expansion substances has been relevant for many years in the instrument-making industry, radio electronics, aviation, and the space industry, as well as in laser and cryogenic technologies," said Dmitry Serebrennikov, a Candidate of Physical and Mathematical Sciences and a research associate at the Laboratory for Strongly Correlated Electron Systems, Science and Research Center "Functional Nanomaterials" at BFU.

Credit: 
Immanuel Kant Baltic Federal University

Evolved to stop bacteria, designed for stability

Connections are crucial. Bacteria may be most dangerous when they connect - banding together to build fortress-like structures known as biofilms that afford them resistance to antibiotics. But a biomolecular scientist in Israel and a microbiologist in California have forged their own connections that could lead to new protocols for laying siege to biofilm-protected colonies. Their research was published in the Proceedings of the National Academy of Sciences (PNAS), USA.

This interdisciplinary collaboration began with a lecture given at the Weizmann Institute of Science in the Life Sciences Colloquium. Prof. Dianne Newman of the California Institute of Technology was the speaker, and the Institute's Prof. Sarel Fleishman, of the Biomolecular Sciences Department, decided to attend, even though the lecture had no immediately apparent bearing on his own research. Newman described an enzyme she had discovered that could interrupt the metabolism of the biofilm-building bacterium Pseudomonas aeruginosa. The enzyme interferes with the functioning of pyocyanin, a molecule that the bacteria generate as they reach high cell density and begin to run out of oxygen; pyocyanin helps bacteria deep within the biofilm remain viable and better tolerate conventional antibiotics. The molecule, however, is a double-edged sword: it can also be toxic to P. aeruginosa in the outer layers of the biofilm, where oxygen is present. Since pyocyanin affects both biofilm development and antibiotic tolerance, Newman's lab focused on identifying ways to disrupt its activities. Newman's only problem, she said, was that the newly discovered pyocyanin-blocking enzyme was unstable and produced in minute amounts, and standard lab methods for growing such proteins had thus far been unsuccessful.

Pseudomonas aeruginosa is an opportunistic bacterium that causes disease mainly in those with underlying conditions: in the lungs of cystic fibrosis patients, in peripheral wounds of diabetics, and on implanted medical devices of hospital patients. Hard-to-eradicate biofilms may help infections return even after treatment, contributing to the bacteria's growing antibiotic resistance, particularly in hospital-acquired strains.

After the lecture, Fleishman suggested to Newman that they try a new approach to producing larger quantities of the enzyme. His lab specializes in computational protein design, and some of their recent work had involved redesigning vaccine proteins to make them more stable.

Rosalie Lipsh-Sokolik, a research student in his lab, together with Dr. Olga Khersonsky, a research associate, took up the challenge of designing an improved, more stable biofilm-busting enzyme. But the enzyme was unlike any Fleishman's lab had worked with before, and it would require them to develop a new methodology: It was a trimer - three identical copies of a protein bundled "like barrels strapped together," says Fleishman, and that meant that, in addition to the structure of the individual protein, they would need to understand how the entire package fit together.

The group's first step was to map the enzyme down to its atomic structure. This gave them a detailed picture of the forces that hold the protein together. When they added the resulting models of the three copies together to understand the trimer formation, they noticed that the areas of contact between copies were poorly packed, atomically speaking, and they thought these particular weak points would be a good place to start in designing a more stable structure.

But even after narrowing down the potential sites for adjustment, the number of design possibilities for such a protein complex was huge. Lipsh-Sokolik ended up adopting a combined, two-pronged approach. The first was to look for proteins made by other bacteria that are similar but slightly different, to see what could be borrowed. The second was a sort of "subtle" atomistic design approach, identifying just a dozen or so points on the enzyme that might be tweaked and trying out different modeled combinations of amino acids at just those points.
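The enumeration step of such an atomistic approach can be sketched as follows; the sequence, positions, and allowed residues are invented for illustration and have no relation to the actual enzyme:

```python
# Toy sketch of the combinatorial step: enumerate candidate amino-acid
# substitutions at a handful of chosen positions. Positions and allowed
# residues here are invented for illustration.
from itertools import product

wild_type = "MKTAYIAKQR"           # hypothetical 10-residue stretch
# Candidate residues allowed at each tweakable position (0-based index):
allowed = {2: "TSA", 5: "ILV", 8: "QEK"}

def variants(seq, allowed):
    """Yield every sequence obtained by combining the allowed substitutions."""
    positions = sorted(allowed)
    for combo in product(*(allowed[p] for p in positions)):
        chars = list(seq)
        for pos, residue in zip(positions, combo):
            chars[pos] = residue
        yield "".join(chars)

designs = list(variants(wild_type, allowed))
print(len(designs))   # 3 options at each of 3 positions -> 27 candidates
```

Even this tiny example shows why the design space explodes: a dozen positions with a handful of options each already yields millions of combinations, which is why the modeled candidates must be ranked rather than tested exhaustively.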

The beauty of the computer design methods developed in Fleishman's lab is not only that they can produce, in a very short time, hundreds of thousands of possible protein designs, but also that they rank them from most likely to work to not likely to work at all. Still, the only way to know if your hypothesis is correct - about the areas that need reinforcement or the ability of an enzyme to still function despite changes to its protein sequence - is to make those proteins and test them in real biological systems. Enter Dr. Chelsey M. VanDrisse, a postdoctoral fellow in the Newman lab, who led all the experimental tests of the Fleishman lab designs.

Fleishman admits his team was nervous when VanDrisse and Newman told them that they could conduct experiments on only ten of the designed trimer enzymes due to the highly challenging nature of the experiments. Their challenges were not limited to creating these new proteins, but included figuring out how to purify them in sufficient quantities in the lab and then testing them on real biofilm-building Pseudomonas aeruginosa in combination with standard antibiotic treatment. The question was, could the team not only produce a more active protein, but determine whether its application could facilitate biofilm control and begin to understand the mechanisms underpinning the enzyme's effects?

"Both teams were over the moon when the results came in," says Fleishman: Eight of the ten designed enzymes were produced in larger-than-normal quantities in Newman's lab, without, it seemed, compromising their biofilm-fighting abilities. VanDrisse jokingly raved that she could now produce so much protein she "could have put it in my cereal every morning!" "This showed that our hypothesis about the contact areas was correct," says Fleishman. One enzyme seemed especially robust and was produced in substantial amounts, so VanDrisse and Newman went all the way: They set out to check if this version of the enzyme could, at least in the lab, work together with a commonly used clinical antibiotic to eradicate the biofilm.

In fact, they found that the enzyme, in combination with this antibiotic, worked much better than they had expected. Further analysis suggested the enzyme first helps the antibiotic kill the bacteria in the oxygenated outer regions of the biofilm in a way not seen before, leading in a short time to a significant reduction in the total number of viable biofilm cells.

Fleishman adds that, as the collaboration deepened between two groups of scientists who normally read different journals, attend different conferences and experiment with very different methods on different scales (VanDrisse even made it to the Weizmann lab just before the first COVID-19 lockdowns), he realized that what had started for him as a test of his lab's computational protein design methods now had a very real chance of leading to a cure for some of the most aggressive bacterial infections. It all came down to making the right connections.

Credit: 
Weizmann Institute of Science

Solar cells: Losses made visible on the nanoscale

image: A conductive AFM tip is used to scan the sample surface of an a-Si:H/c-Si interface under ultra-high vacuum on the nm scale, revealing the transport channels of the charge carriers via defects in the a-Si:H (red states in the magnified section).

Image: 
Martin Künsting /HZB

Silicon solar cells are now so cheap and efficient that they can generate electricity at prices below 2 cents/kWh. The most efficient silicon solar cells today use selective amorphous silicon (a-Si:H) contact layers less than 10 nanometres thick, which are responsible for separating the light-generated charges. Efficiencies of over 24% are achieved at HZB with such silicon heterojunction solar cells, and they also form part of a tandem solar cell that led to a recently reported efficiency record of 29.15% (A. Al-Ashouri, et al. Science 370, (2020)). The current world record from Japan for a single-junction silicon solar cell is also based on this heterocontact (26.6%: K. Yoshikawa, et al. Nature Energy 2, (2017)).

There is still considerable efficiency potential in such heterocontact systems; however, it is not yet understood in detail how these layers enable charge-carrier separation or what their nanoscopic loss mechanisms are. The a-Si:H contact layers are characterised by their intrinsic disorder, which on the one hand enables excellent coating of the silicon surface and thus minimises the number of interfacial defects, but on the other hand can lead to local recombination currents and the formation of transport barriers.

For the first time, a team from HZB and the University of Utah has experimentally measured, at the atomic level, how such leakage currents form between c-Si and a-Si:H and how they influence solar cell performance. In a joint effort led by Prof. Christoph Boehme at the University of Utah and Prof. Dr. Klaus Lips at HZB, the researchers resolved the loss mechanism at the interface of the above-mentioned silicon heterocontact on the nanometre scale using ultrahigh-vacuum conductive atomic force microscopy (cAFM).

The physicists were able to determine with near-atomic resolution where the leakage current penetrates the selective a-Si:H contact and creates a loss process in the solar cell. In cAFM these loss currents appear as nanometre-sized current channels and are the fingerprint of defects associated with the disorder of the amorphous silicon network. "These defects act as stepping stones for charges to penetrate the selective contact and induce recombination; we refer to this as trap-assisted quantum mechanical tunnelling," explains Lips. "This is the first time that such states have been made visible in a-Si:H and that we were able to unravel the loss mechanism under working conditions of a solar cell of the highest quality," the physicist reports enthusiastically.

The Utah/Berlin team was also able to show that the channelled dark current fluctuates stochastically over time. The results indicate a short-term current blockade, caused by local charge trapped in neighbouring defects, which shifts the energetic position of the tunnelling states (stepping stones). This trapped charge can also cause the local photovoltage at a current channel to rise above 1 V, far more than one could use with a macroscopic contact. "At this transition from the nano to the macro world we find the exciting physics of heterojunctions and the key to further improving the efficiency of silicon solar cells in an even more targeted way," says Dr. Bernd Stannowski, who is responsible for the development of industrial silicon heterojunction solar cells at HZB.

Credit: 
Helmholtz-Zentrum Berlin für Materialien und Energie

Looking for new explanations of TC genesis from the vertical coupling of Durian's embryo

A tropical cyclone (TC) is an intense atmospheric vortex with a warm core and a low-pressure structure that forms over tropical or subtropical warm oceans. TC genesis has received great attention from scientists since the 1950s, but owing to the lack of observational data over the sea, it remains one of the most difficult and challenging topics in TC research.

Cumulus convection is considered the most basic element of the TC generation process. TC formation in the Northwest Pacific is often associated with a mesoscale convective system (MCS) or mesoscale convective complex (MCC). In the stratiform precipitation region of an MCS or MCC, a mesoscale convective vortex (MCV) can be produced, and the MCV is considered the embryo of a TC. However, how the cumulus convection and the embryo organize, develop, and finally form a synoptic-scale vortex has always been difficult to answer.

For a long time, the effect of cumulus convection was explained by the CISK (conditional instability of the second kind) and WISHE (wind-induced surface heat exchange) mechanisms. Since the beginning of the 21st century, researchers have gradually realized that while these two mechanisms can explain TC intensification, neither is a reasonable explanation for TC genesis, because both implicitly assume that a finite-amplitude cyclonic circulation with intense surface vorticity already exists. Therefore, how cumulus convection organizes before the surface cyclonic circulation is established remains a difficult problem that needs deep investigation.

TC Durian formed over the South China Sea on June 28, 2001. "Based on the multi-scale interaction mechanism, we described a series of stories about the genesis of TC Durian. In the first story, we mainly described the activities of the monsoon surge associated with the cross-equatorial jet and the low-level jet from Somalia, the upscaling of the vorticity zone accompanying the band-shaped convective clouds, and the role of large-scale conditions in dominating the time and location of TC genesis. The occurrence of the MCC is the result of the organization of the large-scale monsoon trough. Among the types of TC genesis in which the monsoon trough acts as a large-scale disturbance in the lower troposphere over the Western Pacific, an MCC is prone to occur as a precursor," Dr. Zhang Wenlong said.

"In the second story, we focused on the role of the mesoscale system, the mid-level MCV, in the genesis of TC Durian. The first is the mesoscale organizing role of the MCV, which makes the vortical hot towers (VHTs) in the area gradually concentrate in the central region and interact easily with each other, and promotes an axisymmetric distribution of intense VHTs in the embryonic area. The second is its storage role: it can retain the heat, water vapor, and vorticity carried by decayed VHTs, making the MCV area more favorable to TC genesis, so that it ultimately becomes the embryo of Durian," Dr. Zhang Wenlong said.

In tropical large-scale disturbance systems, why do some convective cloud clusters eventually develop into TCs, while others fail? Now Dr. Zhang Wenlong and his collaborators have written a new story about the genesis of TC Durian, looking for new explanations in the vertical coupling (VC) characteristics of the TC and the vertical connecting role of small-scale VHTs.

The new story, titled "Vertical Coupling Characteristics and Mechanism of Tropical Cyclone Durian (2001)", was published in the 3rd issue of Science China: Earth Science in 2021 by Dr. Zhang Wenlong from the Beijing Institute of Urban Meteorology, Professor Cui Xiaopeng from the Institute of Atmospheric Physics, Chinese Academy of Sciences, and Professor Dong Jianxi from the National Marine Environmental Forecasting Center, China; the corresponding author is Dr. Zhang Wenlong. The study adopts numerical simulation and compares its results with those of the well-known meteorologist Montgomery. It reveals that even if TC genesis occurs in a barotropic environment, a VC process still occurs between the mid-level and lower-level troughs (vortices) in the TC embryo area, with the VHTs playing the vertical connecting role as the actual agents of the VC.

"The VC process and its mechanism in TC genesis is an important but unanswered basic scientific problem hidden under the condition of weak vertical shear of horizontal wind, the key factor for TC genesis," Professor Cui Xiaopeng said. "The VC problem has not been studied in a targeted way. One possible reason is that most previous numerical simulations of TC genesis are idealized experiments with an artificially constructed axisymmetric vortex in the initial field, and a few are real-case simulations starting when a tropical depression circulation already exists. These simulations all skip the VC stage, which may bring difficulties to the simulations, and the researchers may therefore have missed the opportunity to observe the VC characteristics of a TC."

Based on a simulation of TC genesis starting from a monsoon trough with unclosed circulation, Zhang Wenlong's research group successfully observed the VC process of the TC embryo in a barotropic environment. The study pointed out that, through the VHTs' vertical connections, the middle- and lower-tropospheric trough axes move towards each other and realize the VC. The VHTs promote the VC of the wind field through the stretching and tilting terms of the vorticity budget, the VC of the temperature field through the release of latent heat, and the VC of the humidity field through deep cloud towers. Owing to the collective contributions of the VHTs, the embryo area develops into a warm, nearly saturated core with strong cyclonic vorticity. The axisymmetric distribution of VHTs is an important sign of TC genesis; when a TC is about to form, the axisymmetrization of the VHTs may be accompanied by vortex Rossby waves. "Only the TC embryo that has achieved VC may be the 'real' embryo that can further develop into a tropical storm with spiral cloud bands," Professor Dong Jianxi said.
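For reference, the stretching and tilting terms mentioned here are the standard terms of the vertical-vorticity budget; in its common textbook form (not reproduced from the paper), the budget for relative vorticity $\zeta$ with planetary vorticity $f$ reads:

```latex
\frac{\partial \zeta}{\partial t}
  = -\,\mathbf{v}\cdot\nabla\zeta
    \underbrace{-\,(\zeta + f)\!\left(\frac{\partial u}{\partial x}
      + \frac{\partial v}{\partial y}\right)}_{\text{stretching}}
    \underbrace{+\left(\frac{\partial w}{\partial y}\,\frac{\partial u}{\partial z}
      - \frac{\partial w}{\partial x}\,\frac{\partial v}{\partial z}\right)}_{\text{tilting}}
    \;+\;\text{(baroclinic and friction terms)}
```

The stretching term amplifies existing vorticity where low-level convergence occurs under the VHTs, while the tilting term converts horizontal vorticity into the vertical, both acting to link the circulations at different levels.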

This research has enriched and deepened understanding of the role of small-scale VHTs in the multi-scale interaction mechanism of TC genesis, and it is of great significance to basic theoretical research on TC genesis. Meanwhile, more case studies are expected to verify and improve these findings.

Credit: 
Science China Press

Cambodian study assesses 3D scanning technologies for prosthetic limb design

image: The international study gathered benchmarking data for new 3D scanning technologies

Image: 
University of Southampton

Cutting-edge 3D scanners have been put to the test by researchers from the University of Southampton and partners Exceed Worldwide to help increase the quality and quantity of prosthetics services around the world.

The study, carried out within the People Powered Prosthetics research group, compared plaster casts and 3D scans for prosthetic limb users in Cambodia to establish the suitability of different digital technologies.

The results, published in the Journal of Prosthetics and Orthotics, will help people to choose the right scanner for different uses - including new prosthesis design, replicating worn-out prostheses, or limb shape monitoring - and assess whether affordable scanners in lower-income countries are fit for purpose.

A prosthetic limb is attached to the body using a bespoke socket that fits over the patient's residual limb (or stump). Well-fitting prosthetic sockets are crucial for the wearer's comfort and enable people to be independent and carry out functional activities like standing, walking, working and using transport.

Typically, these sockets are produced by experts using a hands-on plaster casting method. This gets excellent results but is often iterative, which comes at a cost and some inconvenience to patients.

Computer aided design and manufacturing (CAD/CAM) methods involve 3D scanning the person's residual limb, designing the socket in software and using robotic carvers in manufacturing. The scans give a true indication of limb shape but variation might arise because limbs change as people's muscles twitch and relax.

Dr Alex Dickinson, lead author from the Bioengineering Science Research Group, says: "Before this paper, evidence for the reliability of 3D scanners for prosthetic limb design was produced using plaster models and mannequins that don't twitch, or have trouble balancing. We are generating and sharing first-of-kind reliability data by scanning a group of prosthesis users' limbs directly, and comparing it to important reference data (the reliability of expert clinicians using plaster casting) to benchmark scanner accuracy and effectiveness."

For each participant, two plaster casts were made by a prosthetist and their residual limbs were scanned after each cast.

The research found that some low-cost scanners could capture limb shape with similar repeatability (the difference between their two repeated measurements) to the expert prosthetist using their hands and plaster, but that other devices gave considerably different results between their two measurements compared to the clinician.
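The repeatability notion used here, the disagreement between two repeated measurements of the same limb, can be sketched with invented numbers:

```python
# Toy illustration of the repeatability comparison: for each method, take two
# repeated measurements of the same limb and summarize how much they disagree.
# The numbers are invented, not the study's data.

def repeatability(measure_1, measure_2):
    """Mean absolute difference between two repeated shape measurements."""
    diffs = [abs(a - b) for a, b in zip(measure_1, measure_2)]
    return sum(diffs) / len(diffs)

# Hypothetical limb circumferences (mm) sampled at several heights:
plaster_1 = [310.0, 298.5, 284.0, 270.5]
plaster_2 = [311.5, 297.0, 285.0, 269.5]   # expert casting, second attempt

scanner_1 = [309.5, 298.0, 284.5, 270.0]
scanner_2 = [315.0, 303.5, 290.0, 276.5]   # a less repeatable device

print(f"plaster repeatability: {repeatability(plaster_1, plaster_2):.2f} mm")
print(f"scanner repeatability: {repeatability(scanner_1, scanner_2):.2f} mm")
```

A device whose two scans of the same limb disagree by much more than the clinician's two casts do, as in this made-up example, would be the kind flagged by the study as unsuitable.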

The Southampton researchers collaborated on the study with expert prosthetists at Exceed Worldwide, a charity that provides prosthetic limbs to thousands of people in Cambodia and across South East Asia.

Dr Dickinson adds: "As with many emerging technologies, there's a risk that people could choose very high accuracy scanners that might limit the benefit of the approach to wealthy clinics; on the other hand, people could choose the new, very low-cost scanners which are developing alongside 3D printing, which might not be accurate enough at capturing anatomic shape for prosthetic limb design.

"This new data should help people choose the right scanner for the right job. We suggest that prosthetics design and measurement technologies should be benchmarked against the expert clinician and used to support them, never to replace their skill and training. By choosing appropriate technologies we can help make sure prosthetics services are accessible to as many people as possible, more sustainable, and with no compromise on quality."

Credit: 
University of Southampton

What brings olfactory receptors to the cell surface

image: Work under sterile conditions. Scientist fills a 96-well plate with cell culture medium.

Image: 
Photo: C. Schranner / Leibniz-LSB@TUM

A team of scientists led by Dietmar Krautwurst from the Leibniz Institute for Food Systems Biology at the Technical University of Munich has now identified address codes in odorant receptor proteins for the first time. Similar to zip codes, the codes ensure that the sensor proteins are targeted from inside the cell to the cell surface, where they begin their work as odorant detectors. The new findings could contribute to the development of novel test systems with which the odorant profiles of foods can be analyzed in a high-throughput process and thus could be better controlled.

The genes of the approximately 400 human odorant receptor types have been identified for about 20 years. Nevertheless, for about 80 percent of these sensor proteins, it is still not known to which odorants they respond. Knowing this, however, is an important prerequisite for developing bio-based "artificial noses" for food controls.

Cellular test systems

But how can this problem be solved? Normally, scientists use cellular test systems to find out to which substances a receptor protein reacts. A particular problem with odorant receptors, however, is that they often become stuck inside the test cells and hardly reach the cell surface. Even a suitable odorant then has difficulty docking onto enough receptors to activate a cellular function, which hampers the assignment of odorants to individual receptor types.

However, why do odorant receptors so often become stuck in test cells, and what molecular mechanisms are involved in the transport of odorant receptors to the cell surface? To help answer these fundamental questions, the team of scientists examined and compared the protein sequences of 4,808 odorant receptors from eight different species using statistical and phylogenetic analysis methods. This enabled the team to identify highly conserved amino acid motifs. These are localized in the respective C-terminal end of the receptor proteins, which protrudes into the cell interior (cytoplasm).
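A conservation scan of this kind can be sketched in a few lines; the aligned C-terminal sequences below are invented for illustration, not the receptors studied:

```python
# Toy version of a conservation scan: given aligned C-terminal tails from
# several receptors, find positions where one residue dominates. Sequences
# are invented for illustration.
from collections import Counter

tails = [
    "RNKDVKRAL",
    "RNKEVKRSL",
    "RNKDVKRTL",
    "RSKDVKRAL",
]

def conserved_positions(seqs, threshold=0.75):
    """Return (index, residue) pairs where one residue's frequency >= threshold."""
    hits = []
    for i in range(len(seqs[0])):
        residue, count = Counter(s[i] for s in seqs).most_common(1)[0]
        if count / len(seqs) >= threshold:
            hits.append((i, residue))
    return hits

hits = conserved_positions(tails)
print(hits)
```

The actual study worked with 4,808 receptor sequences from eight species and combined such statistics with phylogenetic analysis, but the underlying question per alignment column is the same: is one residue conserved across species?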

Address codes identified

"The structure-function analyses we performed indicate that certain amino acid motifs and their combinations in different receptor types individually promote their cell surface expression and signaling. They function like address codes, or 'zip codes'," reports Dietmar Krautwurst, who led the study. "Such amino acid motifs were previously unknown for olfactory receptors," the biologist continued. "We assume that the odorant receptor molecules interact with cellular proteins via these motifs, which guide the sensor proteins to their site of action on the cell surface via mechanisms that are still unknown."

The researchers led by Dietmar Krautwurst hope that their new findings will help to optimize cellular assay systems for odorant receptors in such a way that it will soon be possible to determine the corresponding odorant partners for each odorant receptor. The team agrees that only if the recognition spectra of as many odorant receptors as possible are known will it be possible to develop receptor-based test systems with which the odor quality of foods can be reliably and rapidly monitored online during production.

Credit: 
Leibniz-Institut für Lebensmittel-Systembiologie an der TU München

Crystal structure prediction of multi-elements random alloy

image: Schematic diagram of the crystal structure classification module of multi-elements alloys.

Image: 
POSTECH

Alchemy, which attempted to turn cheap metals such as lead and copper into gold, never succeeded. However, with the development of alloys in which two or three auxiliary elements are mixed with a principal element, modern alchemy can produce high-strength, high-tech metallic materials such as high-entropy alloys. Now, together with artificial intelligence, the era of predicting the crystal structure of advanced materials without repetitive experiments has arrived.

A joint research team of Professor Ji Hoon Shim and Dr. Taewon Jin (first author, currently at KAIST) of POSTECH's Department of Chemistry, and Professor Jaesik Park of POSTECH Graduate School of Artificial Intelligence have together developed a system that predicts the crystal structures of multi-element alloys with expandable features without needing massive training data. These research findings were recently published in Scientific Reports.

Properties of solid-state materials depend on their crystal structures. In a solid-solution high-entropy alloy (HEA) - a material that keeps the same crystal structure while its chemical composition changes continuously within a certain range - mechanical properties such as strength and ductility vary depending on the structural phase. Therefore, predicting the crystal structure of a material plays a crucial role in finding new functional materials. Methods to predict the crystal structure through machine learning have been studied recently, but there is an enormous cost attached to preparing the data necessary for training.

To address this, the research team designed an artificial intelligence model that predicts the crystal structures of HEAs using expandable features and binary alloy data, instead of the conventional models that use more than 80% of the HEA data in the training process. This is the first study to predict the crystal structure of multi-element alloys, including HEAs, with an artificial intelligence model trained only on the compositions and structural phase data of binary alloys.
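The paper's model is far more sophisticated, but the core idea - describe an alloy by composition-weighted elemental descriptors, train only on binary alloys, then query a multi-element composition - can be sketched as follows; the elemental descriptor values, phase labels, and classifier choice here are all hypothetical:

```python
# Minimal sketch of the idea (not the paper's model): featurize an alloy by
# composition-weighted elemental descriptors, fit a simple nearest-centroid
# classifier on *binary* alloys only, then query a multi-element composition.
# Elemental values and phase labels below are hypothetical.

# Toy elemental descriptors: (atomic radius in pm, electronegativity)
ELEMENTS = {
    "Fe": (126, 1.83), "Ni": (124, 1.91), "Cr": (128, 1.66),
    "Co": (125, 1.88), "Al": (143, 1.61),
}

def featurize(composition):
    """Composition-weighted average of each elemental descriptor."""
    total = sum(composition.values())
    n_desc = len(next(iter(ELEMENTS.values())))
    return tuple(
        sum(frac / total * ELEMENTS[el][k] for el, frac in composition.items())
        for k in range(n_desc)
    )

# Training set: binary alloys with made-up phase labels.
train = [
    ({"Fe": 1, "Ni": 1}, "FCC"), ({"Ni": 1, "Co": 1}, "FCC"),
    ({"Fe": 1, "Cr": 1}, "BCC"), ({"Fe": 1, "Al": 1}, "BCC"),
]

# Per-phase centroid in feature space:
groups = {}
for comp, phase in train:
    groups.setdefault(phase, []).append(featurize(comp))
centroids = {
    p: tuple(sum(col) / len(col) for col in zip(*feats))
    for p, feats in groups.items()
}

def predict(composition):
    """Assign the phase whose centroid is closest in feature space."""
    x = featurize(composition)
    return min(
        centroids,
        key=lambda p: sum((a - b) ** 2 for a, b in zip(x, centroids[p])),
    )

# Query an equiatomic five-element composition never seen in training:
hea = {el: 1 for el in ("Fe", "Ni", "Cr", "Co", "Al")}
print(predict(hea))
```

The "expandable features" of the actual work serve the same purpose as the composition-weighted averages here: they let a model trained on binaries generalize to compositions with any number of elements.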

Through experiments, the researchers confirmed that the structural phase of the multi-element alloy was predicted with an accuracy of 80.56%, even though the multi-element alloy data were not involved in the training process. In the case of HEAs, it was predicted with an accuracy of 84.20%. According to the method developed by the research team, it is anticipated that the calculation cost can be saved by about 1,000 times compared to previous methods.

"An immense dataset is required to apply artificial intelligence methodologies to the development of new materials," explained Professor Ji Hoon Shim, who led the research. "This study is significant in that it enables the crystal structure of advanced materials to be predicted effectively without securing a huge dataset."

Credit: 
Pohang University of Science & Technology (POSTECH)

FAU researchers break bonds in molecular nitrogen with calcium

Chemists all over the world are constantly searching for simple ways to make elemental nitrogen or N2 in the air available for chemical reactions. This is no easy task, as nitrogen is a particularly non-reactive gas with a triple bond, which is one of the strongest known chemical bonds. A research team at Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU) has now demonstrated that calcium, a metal commonly found in nature, is able to break the highly-stable nitrogen bond and can do so at minus 60°C. This is significant for two reasons. On the one hand, the researchers at FAU have made a new discovery in terms of the bond-breaking capabilities of calcium, which had been largely disregarded in the past. On the other hand, their findings could form the basis for developing industrial processes in the future.

Nitrogen is one of the main components of air and is in unlimited supply. It is also used as an inert gas for protecting food due to its particularly low chemical reactivity and it can keep products such as part-baked rolls fresh for months. Plants also require nitrogen for growth. However, they cannot use nitrogen directly from the air. The greatest challenge lies in converting the highly-stable diatomic molecule N2 into useful chemicals. Two German chemists succeeded in doing so in the early 1900s when they developed the Haber-Bosch process, which converts N2 into ammonia (NH3). Whilst ammonia was originally used to manufacture explosives, today it is mostly used as a fertiliser. In the Haber-Bosch process, a transition metal catalyst triggers the chemical reaction. Conversion of highly-stable nitrogen into ammonia requires high pressures and high temperatures, which means Haber's 'bread from the air' process requires large amounts of energy.

Chemists are looking for other methods of breaking the strong N≡N triple bond to simplify this and other processes. The team of researchers led by Prof. Dr. Sjoerd Harder, Chair of Inorganic and Organometallic Chemistry at FAU, has now successfully demonstrated that the main group element calcium is capable of achieving this feat. Calcium, a metal commonly found in nature, mainly in limestone, had been regarded in the past as incapable of breaking strong chemical bonds. Unlike the transition metals, which are often toxic, calcium is generally not capable of utilising d orbitals - wave functions with a specific symmetry that facilitate bond-breaking reactions.

While searching for calcium atoms in the unusual oxidation state +I, the FAU researchers accidentally discovered that the metal reacts with nitrogen, which was only supposed to serve as an inert gas during the experiment. Harder and his team isolated a molecule in which nitrogen was trapped between two calcium atoms, and were able to continue the conversion to hydrazine. In contrast to nitrogen, which is extremely stable, hydrazine is used as a highly reactive rocket fuel. Working with theoretical chemists at the universities of Marburg in Germany and Nanjing in China, the FAU research team discovered that d orbitals do actually play a significant role in nitrogen activation with calcium. This controversial but significant discovery dispels the dogma that d orbitals are irrelevant for metals assigned to the main group in the periodic system.

Despite the fact that the process is neither catalytic nor economical, it provides new fundamental and important insights into bond breaking reactions with calcium. These findings will not only rewrite students' textbooks, but could also contribute to the development of simplified industrial processes.

Credit: 
Friedrich-Alexander-Universität Erlangen-Nürnberg

Magnetism meets topology on a superconductor's surface

image: An illustration depicting a topological surface state with an energy band gap (an energy range where electrons are forbidden) between the apices of the top and corresponding bottom cones (allowed energy bands, or the range of energies electrons are allowed to have). A topological surface state is a unique electronic state, only existing at the surface of a material, that reflects strong interactions between an electron's spin (red arrow) and its orbital motion around an atom's nucleus. When the electron spins align parallel to one another, as they do here, the material has a type of magnetism called ferromagnetism.

Image: 
Dan Nevola, Brookhaven National Laboratory

UPTON, NY--Electrons in a solid occupy distinct energy bands separated by gaps. Energy band gaps are an electronic "no man's land," an energy range where no electrons are allowed. Now, scientists studying a compound containing iron, tellurium, and selenium have found that an energy band gap opens at a point where two allowed energy bands intersect on the material's surface. They observed this unexpected electronic behavior when they cooled the material and probed its electronic structure with laser light. Their findings, reported in the Proceedings of the National Academy of Sciences, could have implications for future quantum information science and electronics.

The particular compound belongs to the family of iron-based high-temperature superconductors, which were initially discovered in 2008. These materials not only conduct electricity without resistance at relatively higher temperatures (but still very cold ones) than other classes of superconductors but also show magnetic properties.

"For a while, people thought that superconductivity and magnetism would work against each other," said first author Nader Zaki, a scientific associate in the Electron Spectroscopy Group of the Condensed Matter Physics and Materials Science (CMPMS) Division at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory. "We have explored a material where both develop at the same time."

Aside from superconductivity and magnetism, some iron-based superconductors have the right conditions to host "topological" surface states. The existence of these unique electronic states, localized at the surface (they do not exist in the bulk of the material), reflects strong interactions between an electron's spin and its orbital motion around the nucleus of an atom.

"When you have a superconductor with topological surface properties, you're excited by the possibility of topological superconductivity," said corresponding author Peter Johnson, leader of the Electron Spectroscopy Group. "Topological superconductivity is potentially capable of supporting Majorana fermions, which could serve as qubits, the information-storing building blocks of quantum computers."

Quantum computers promise tremendous speedups for calculations that would take an impractical amount of time or be impossible on traditional computers. One of the challenges to realizing practical quantum computing is that qubits are highly sensitive to their environment. Small interactions cause them to lose their quantum state and thus stored information becomes lost. Theory predicts that Majorana fermions (sought-after quasiparticles) existing in superconducting topological surface states are immune to environmental disturbances, making them an ideal platform for robust qubits.

Seeing the iron-based superconductors as a platform for a range of exotic and potentially important phenomena, Zaki, Johnson, and their colleagues set out to understand the roles of topology, superconductivity and magnetism.

CMPMS Division senior physicist Genda Gu first grew high-quality single crystals of the iron-based compound. Then, Zaki mapped the electronic band structure of the material via laser-based photoemission spectroscopy. When light from a laser is focused onto a small spot on the material, electrons from the surface are "kicked out" (i.e., photoemitted). The energy and momentum of these electrons can then be measured.

When they lowered the temperature, something surprising happened.

"The material went superconducting, as we expected, and we saw a superconducting gap associated with that," said Zaki. "But what we didn't expect was the topological surface state opening up a second gap at the Dirac point. You can picture the energy band structure of this surface state as an hourglass or two cones attached at their apex. Where these cones intersect is called the Dirac point."

As Johnson and Zaki explained, when a gap opens up at the Dirac point, it's evidence that time-reversal symmetry has been broken. Time-reversal symmetry means that the laws of physics are the same whether you look at a system going forward or backward in time--akin to rewinding a video and seeing the same sequence of events playing in reverse. But under time reversal, electron spins change their direction and break this symmetry. Thus, one of the ways to break time-reversal symmetry is by developing magnetism--specifically, ferromagnetism, a type of magnetism where all electron spins align in a parallel fashion.
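The connection between ferromagnetism and the gap at the Dirac point can be made concrete with the standard low-energy model of a topological surface state. This is the generic textbook form, not the specific Hamiltonian analyzed in the paper:

```latex
% Dirac-cone surface state with an exchange (magnetization) term m:
H(\mathbf{k}) = \hbar v_F \,(k_x \sigma_y - k_y \sigma_x) + m\,\sigma_z

% Energy bands:
E_\pm(\mathbf{k}) = \pm\sqrt{\hbar^2 v_F^2 |\mathbf{k}|^2 + m^2}

% m = 0: the two cones touch at the Dirac point (time-reversal symmetric).
% m \neq 0 (ferromagnetic order breaks time reversal): a gap of size 2|m|
% opens at \mathbf{k} = 0, separating the apices of the two cones.
```

Here $\sigma_{x,y,z}$ are spin Pauli matrices and $v_F$ is the surface-state velocity; the $m\,\sigma_z$ term plays the role of the out-of-plane magnetization described in the article.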

"The system is going into the superconducting state and seemingly magnetism is developing," said Johnson. "We have to assume the magnetism is in the surface region because in this form it cannot coexist in the bulk. This discovery is exciting because the material has a lot of different physics in it: superconductivity, topology, and now magnetism. I like to say it's one-stop shopping. Understanding how these phenomena arise in the material could provide a basis for many new and exciting technological directions."

As previously noted, the material's superconductivity and strong spin-orbit effects could be harnessed for quantum information technologies. Alternatively, the material's magnetism and strong spin-orbit interactions could enable dissipationless (no energy loss) transport of electrical current in electronics. This capability could be leveraged to develop electronic devices that consume low amounts of power.

Coauthors Alexei Tsvelik, senior scientist and group leader of the CMPMS Division Condensed Matter Theory Group, and Congjun Wu, a professor of physics at the University of California, San Diego, provided theoretical insights on how time reversal symmetry is broken and magnetism originates in the surface region.

"This discovery not only reveals deep connections between topological superconducting states and spontaneous magnetization but also provides important insights into the nature of superconducting gap functions in iron-based superconductors--an outstanding problem in the investigation of strongly correlated unconventional superconductors," said Wu.

In a separate study with other collaborators in the CMPMS Division, the experimental team is examining how different concentrations of the three elements in the sample contribute to the observed phenomena. Seemingly, tellurium is needed for the topological effects, too much iron kills superconductivity, and selenium enhances superconductivity.

In follow-on experiments, the team hopes to verify the time-reversal symmetry breaking with other methods and explore how substituting elements in the compound modifies its electronic behavior.

"As materials scientists, we like to alter the ingredients in the mixture to see what happens," said Johnson. "The goal is to figure out how superconductivity, topology, and magnetism interact in these complex materials."

Credit: 
DOE/Brookhaven National Laboratory

Abundant and stable rocks are critical egg-laying habitat for insects in restored streams

image: A recent study highlights the importance of rock characteristics in stream habitat, which should inform stream restoration efforts. Here, first author Samantha Jordt records the presence of insect eggs in a stream.

Image: 
Michelle Jewell

The abundance and other characteristics of rocks partially extending above the water surface could be important for improving the recovery of aquatic insect populations in restored streams.

Nearly three quarters of stream insects reproduce by crawling underneath large rocks that sit above the water surface and attaching their eggs to the undersides. Increasing the number of large, stable emergent rocks in streams could provide more egg-laying habitat and allow insects to quickly repopulate restored streams.

"We found that restored streams had fewer emergent rocks for egg-laying and fewer total eggs than naturally intact streams," says Samantha Jordt, first author of the paper and an M.Sc. student at NC State's Department of Applied Ecology.

The study also found that some of the large rocks in restored streams were unstable, rolling or becoming buried by sediment between Samantha's visits. According to the study, these factors combined - fewer large rocks available for egg-laying, and the instability of some of those rocks - may delay insect recovery.

"When a rock rolls, any eggs on that rock will likely be destroyed either by being crushed or scraped off as the rock rolls, being buried by sediment, or by drying out if the rock settles into a new position that exposes the eggs to the air," says Jordt. "You end up with lots of insects laying eggs on the one good rock in the stream, truly putting all of their eggs in one, rolling, basket."

Less suitable egg-laying habitat means fewer larvae and adult insects - both important for the long-term health and recovery of restored streams. Aquatic insects provide several ecosystem services, including breaking down leaf litter, consuming algae, cycling nutrients, and serving as food for fish, salamanders, and birds.

"Many people rely on streams for drinking water, which means they rely on all of the ecological processes that happen upstream before the water reaches them," says Jordt. "Aquatic insects maintain water quality for free. So we develop techniques to ensure restored streams have habitats in which they can rebound and thrive."

Most stream restoration projects focus on the recovery of physical and chemical aspects. This study highlights how incorporating the natural history of aquatic insects will be another critical tool for both the initial design and the long-term success of restoring streams.

"Unavailable or unstable egg-laying habitat may be a primary reason why biological recovery in restored streams lags decades behind geomorphological and hydrological recovery," says Brad Taylor, co-author of the paper and assistant professor of applied ecology at NC State. "Ensuring stable and suitable rocks for insect egg-laying could be a small design change to increase the return on our multi-million-dollar investment in stream restoration."

Credit: 
North Carolina State University

Icy ocean worlds seismometer passes further testing in Greenland

image: SIIOS demobilization team in Greenland. (left to right) Natalie Wagner, Juliette (Bella) Broadbeck, Dani DellaGiustina, Namrah Habib, Susan Detweiler, Angela Marusiak, and pilot Sebastian Holst.

Image: 
Tonny Olsen

The NASA-funded Seismometer to Investigate Ice and Ocean Structure (SIIOS) performed well in seismic experiments conducted in snowy summer Greenland, according to a new study by the SIIOS team led by the University of Arizona published this week in Seismological Research Letters.

SIIOS could be a part of proposed NASA spacecraft missions to the surface of Europa or Enceladus. These moons of Jupiter and Saturn are encrusted by an icy shell over subsurface liquid oceans, and seismic data could be used to better define the thickness and depth of these layers. Other seismic points of interest on these worlds could include ice volcanoes, drainage events below the ice shell and possibly even a timely glimpse of the reverberations from a meteorite impact.

To better mimic mission conditions, the SIIOS team attached flight candidate seismometers to the platform and legs of a buried and aluminum-shielded mock spacecraft lander on the Greenland Ice Sheet. Angela Marusiak of NASA's Jet Propulsion Laboratory and colleagues found that the lander's recordings of seismic waves from passive and active seismic sources were comparable to recordings made by other ground seismometers and geophones up to a kilometer away.

Although the attached seismometers did pick up some of the shaking of the lander itself, Marusiak said the lander and ground-based seismometers "performed very similar to each other, which is definitely promising," in detecting earthquakes and ice cracking.

The experimental array was placed over a subglacial lake (a new feature in Greenland that had not yet been studied with seismic approaches) and the lander-coupled seismometers were also able to detect the ice-water interface, which would be one of the instrument's primary tasks on the icy ocean worlds.

The scientists buried the lander and nearby seismometers a meter deep in granular snow, and covered the lander with an aluminum box, to reduce the effects of wind and temperature variation on the instruments. This brought the experiment closer to the atmospheric conditions that might be expected on an airless moon like Europa. During an icy ocean world mission, however, the seismometer would likely only be deployed to the surface and may not be buried.

"What we're hoping for is if we are able to go to Europa or Enceladus or one of these icy worlds that doesn't have huge temperature fluctuations or a very thick atmosphere and we're taking away that wind noise, essentially you're taking away what's going to cause a lot of shaking of the lander," explained Marusiak, who conducted the research while she was a Ph.D. student at the University of Maryland.

And unlike on Earth, researchers for these missions wouldn't be able to deploy a large array of seismometers and gather data for months at a time to build a picture of the moon's interior. The available solar energy to power the devices would be 25 times less than that on Earth, and devastating radiation would be likely to destroy the instruments within a couple weeks on a moon like Europa, she said.

After taking an Air Greenland helicopter ride to the site in the summer of 2018, the SIIOS deployment team set up the experimental lander and array on the ice sheet about 80 kilometers north of Qaanaaq. For the active source experiment, the instruments recorded seismic signals created by the team members striking aluminum plates with a sledgehammer at locations up to 100 meters from the array's center.

The array then made passive recordings of local and regional seismic events and the ice sheet's ambient creaking and cracking noises for about 12 days, until an unusual summer snow buried the solar panels powering the array.

Marusiak was proud to be a member of an all-female demobilization team, and was grateful for the warm reception the scientists received at Thule AFB. The work would not have been possible without the logistics support provided by the National Science Foundation, Polar Field Services, and local guides.

The team plans to return to Greenland this summer to test a prototype seismometer that has been designed to account for more mission-ready conditions of radiation, vacuum and launch vibration, she said.

Credit: 
Seismological Society of America

Immune receptor protein could hold key to treatment of autoimmune diseases

video: In a new study, scientists from Japan have explored the potential role of TARM1 in the pathogenesis of rheumatoid arthritis by analyzing mouse models. They found that TARM1 activated dendritic cells, and that development of collagen-induced arthritis (CIA) was notably suppressed in TARM1-deficient mice and by treatment with inhibitory soluble TARM1 proteins.

Image: 
Tokyo University of Science

Autoimmune diseases are typically caused when the immune system, whose purpose is to deal with foreign threats to the body, incorrectly recognizes the body's own proteins and cells as threats and activates immune cells to attack them. In the case of rheumatoid arthritis, a well-known autoimmune disease, immune cells erroneously attack the body's own joint components and proteins, causing painful inflammation and even the destruction of bone! Scientists from Japan have now taken a massive step toward understanding and, potentially, treating rheumatoid arthritis better, with their discovery in a brand-new study. Read on to understand how!

The development of autoimmune diseases is an incredibly complex process, involving several key players including genetic and environmental factors. Dendritic cells (DCs), which are responsible for kick-starting the immune response against infections, are one of the main immune cells involved in the pathogenesis of autoimmune diseases. All immune cells, including DCs, are equipped with a variety of receptors on their surfaces, which can either amplify or suppress the immune response. One such receptor is the T cell-interacting, activating receptor on myeloid cells-1 (TARM1). It is a member of the leukocyte immunoglobulin-like receptor family, and helps in the activation of other immune cells such as neutrophils and macrophages. TARM1's functions suggest that it may have an important role to play in the immune response, but the possibility of its role in the pathogenesis of rheumatoid arthritis remains largely unexplored.

The aforementioned team of scientists, led by Professor Yoichiro Iwakura from Tokyo University of Science, and Rikio Yabe and Shinobu Saijo from Chiba University, wanted to find out more about this association. In their study published in Nature Communications, they identified genes that were overexpressed in various mouse models of arthritis. Interestingly, they found that Tarm1 was one of many such genes. As Prof. Iwakura explains, "Tarm1 expression is elevated in the joints of rheumatoid arthritis mouse models, and the development of collagen-induced arthritis (CIA) is suppressed in TARM1-deficient mice."

The scientists observed that the immune system's response to type II collagen (IIC), a protein crucial for the development of CIA in mice, was suppressed in TARM1-deficient mice. They also found that the antigen-presenting ability of DCs in TARM1-deficient mice was impaired. Regarding the significance of these findings, Prof. Iwakura explains, "We have shown that TARM1 plays an important role in the maturation and activation of DCs through interaction with IIC." Finally, they injected inhibitory soluble TARM1 proteins into the knee of a mouse with CIA. Notably, this suppressed the progression of CIA, suggesting that TARM1 inhibition is effective in weakening autoimmune arthritis.

The team's findings about the TARM1 protein have wide implications with respect to the treatment of rheumatoid arthritis as well as other autoimmune and allergic diseases. Commenting on their important discoveries, Prof. Iwakura states, "Because excess DC activation is suggested in many autoimmune and allergic diseases, our observations suggest that TARM1 is a good target for the development of new drugs to treat such diseases."

The findings of this exciting new study surely indicate that there still remains much to be understood about autoimmune diseases like rheumatoid arthritis--and that the more we understand them, the better we can fight them!

Credit: 
Tokyo University of Science

Tin-doped cobalt oxide electrodes could enable more powerful micro-supercapacitors

image: Researchers doped cobalt oxide with tin to create a more efficient electrode for use in supercapacitors. This microscopic image shows the new material on graphene film.

Image: 
JIA ZHU/PENN STATE

A sustainable, powerful micro-supercapacitor may be on the horizon, thanks to an international collaboration of researchers from Penn State and the University of Electronic Science and Technology of China. Until now, the high-capacity, fast-charging energy storage devices have been limited by the composition of their electrodes -- the connections responsible for managing the flow of electrons during charging and dispensing energy. Now, researchers have developed a better material to improve connectivity while maintaining recyclability and low cost.

They published their results on Feb. 8 in the Journal of Materials Chemistry A.

"The supercapacitor is a very powerful, energy-dense device with a fast-charging rate, in contrast to the typical battery -- but can we make it more powerful, faster and with a really high retention cycle?" asked Jia Zhu, corresponding author and doctoral student conducting research in the laboratory of Huanyu "Larry" Cheng, Dorothy Quiggle Career Development Professor in Penn State's Department of Engineering Science and Mechanics.

Zhu worked under Cheng's mentorship to explore the connections in a micro-supercapacitor, which they use in their research on small, wearable sensors to monitor vital signs and more. Cobalt oxide, an abundant, inexpensive material that has a theoretically high capacity to quickly transfer energy charges, typically makes up the electrodes. However, the materials that mix with cobalt oxide to make an electrode can react poorly, resulting in a much lower energy capacity than theoretically possible.

The researchers ran simulations of materials from an atomic library to see if adding another material -- also called doping -- could amplify the desired characteristics of cobalt oxide as an electrode by providing extra electrons while minimizing, or entirely removing, the negative effects. They modeled various material species and levels to see how they would interact with cobalt oxide.

"We screened possible materials but found many that might work were too expensive or toxic, so we selected tin," Zhu said. "Tin is widely available at a low cost, and it's not harmful to the environment."
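The screening logic described above - simulate a library of candidate dopants, then discard those that are too expensive or toxic before ranking the rest - can be sketched as a toy filter. Every number and flag below is invented for illustration; in the actual study the ranking would come from atomistic simulations, not a lookup table.

```python
# Hypothetical candidate dopant library:
# (element, relative cost, toxic/hazardous?, simulated conductivity gain)
CANDIDATES = [
    ("Ru", 120.0, False, 0.95),
    ("Cd",   2.0, True,  0.80),
    ("Sn",   1.0, False, 0.85),
    ("In",  15.0, False, 0.70),
    ("Pb",   1.5, True,  0.60),
]

def screen(candidates, max_cost=10.0):
    """Keep non-toxic, affordable dopants; rank by predicted conductivity gain."""
    ok = [c for c in candidates if not c[2] and c[1] <= max_cost]
    return sorted(ok, key=lambda c: c[3], reverse=True)

ranked = screen(CANDIDATES)
print(ranked[0][0])  # with these invented inputs, "Sn" is the only survivor
```

With this (deliberately contrived) data, high-performing but expensive or hazardous options fall out of the list first, mirroring the team's rationale for settling on tin.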

In the simulations, the researchers found that by partially substituting some of the cobalt for tin and binding the material to a commercially available graphene film -- a single-atom thick material that supports electronic materials without changing their properties -- they could fabricate what they called a low-cost, easy-to-develop electrode.

Once the simulations were completed, the team in China conducted experiments to see if the simulation could be actualized.

"The experimental results verified a significantly increased conductivity of the cobalt oxide structure after partial substitution by tin," Zhu said. "The developed device is expected to have promising practical applications as the next-generation energy storage device."

Next, Zhu and Cheng plan to use their own version of graphene film -- a porous foam created by partially cutting and then breaking the material with lasers -- to fabricate a flexible capacitor to allow for easy and fast conductivity.

"The supercapacitor is one key component, but we're also interested in combining with other mechanisms to serve as both an energy harvester and a sensor," Cheng said. "Our goal is to put a lot of functions into a simple, self-powered device."

Credit: 
Penn State

Cellular benefits of gene therapy seen decades after treatment

An international collaboration between Great Ormond Street Hospital, the UCL GOS Institute for Child Health, and Harvard Medical School has shown that the beneficial effects of gene therapy can be seen decades after the transplanted blood stem cells have been cleared by the body.

The research team monitored five patients who were successfully cured of SCID-X1 using gene therapy at GOSH. For 3 to 18 years, the patients' blood was regularly analysed to detect which cell types and biomarker chemicals were present. The results showed that even though the stem cells transplanted as part of gene therapy had been cleared by the patients' bodies, the all-important corrected immune cells, called T cells, were still forming.

Gene therapy works by first removing some of the patients' blood-forming stem cells, which create all types of blood and immune cells. Next, a viral vector is used to deliver a new copy of the faulty gene into the DNA of the patients' cells in a laboratory. These corrected stem cells are then returned to patients in a so-called 'autologous transplant', where they go on to produce a continual supply of healthy immune cells capable of fighting infection.

In the gene therapy for SCID-X1, the corrected stem cells were eventually cleared by the body, but the patients remained cured of their condition. The researchers suggested that the 'cure' was down to the fact that the body was still able to continually produce newly engineered T cells - an important part of the body's immune system.

They used state-of-the-art gene tracking technology and numerous tests to give unprecedented details of the T cells in SCID-X1 patients decades after gene therapy.

The team believe that this gene therapy has created the ideal conditions for the human thymus (the part of the body where T cells develop) to host a long-term store of the correct type of progenitor cells that can form new T cells. Further investigation of how this happens and how it can be exploited could be crucial for the development of next generation gene therapy and cancer immunotherapy approaches.

Credit: 
University College London

Predicting the likelihood of bone fractures in older men

Fractures in the vertebrae of the spine and calcification in a blood vessel called the abdominal aorta can both be visualized through the same spinal imaging test. A new study published in the Journal of Bone and Mineral Research that included 5,365 older men indicates that each of these measures is linked with a higher risk of developing hip and other fractures.

Investigators found that including both measures, rather than only abdominal aortic calcification or only vertebral fractures, improved the ability to predict which men were most likely to experience a hip or other fracture in the future.
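The intuition that two independent radiographic findings stratify risk better than either alone can be illustrated with a toy multiplicative risk model. The baseline probability and hazard ratios below are invented placeholders, not values from the study:

```python
# Hypothetical illustration of combining two radiograph findings into one
# fracture-risk estimate. All numbers are invented for demonstration.
BASE_RISK = 0.05          # assumed baseline fracture probability
HR_CALCIFICATION = 1.4    # hypothetical hazard ratio: abdominal aortic calcification
HR_VERT_FRACTURE = 1.8    # hypothetical hazard ratio: prevalent vertebral fracture

def fracture_risk(has_calcification, has_vert_fracture):
    """Multiplicative risk model over the two radiographic findings."""
    risk = BASE_RISK
    if has_calcification:
        risk *= HR_CALCIFICATION
    if has_vert_fracture:
        risk *= HR_VERT_FRACTURE
    return risk

# Men with both findings land in the highest-risk stratum.
for calc in (False, True):
    for vert in (False, True):
        print(calc, vert, round(fracture_risk(calc, vert), 4))
```

The point mirrored here is purely structural: screening for both findings on one image splits the cohort into four strata instead of two, which is what improved prediction in the study.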

"Both abdominal aortic calcification and a prevalent vertebral fracture can be simultaneously and quickly detected on standard radiographs or lateral spine bone density images, and this may aid fracture risk assessment in older men who have either or both risk factors," said lead author John T. Schousboe, MD, PhD, of the University of Minnesota and Park Nicollet Clinic & HealthPartners Institute.

Credit: 
Wiley