Tech

A funnel of light

image: The figure shows how light is caught through the light funnel.

Image: 
University of Rostock / Alexander Szameit

Professor Ronny Thomale holds the chair for theoretical condensed matter physics (TP1) at the Julius Maximilian University of Würzburg. The discovery and theoretical description of new quantum states of matter is a prime objective of his research. "Developing a theory for a new physical phenomenon, which then inspires new experiments looking for this effect, is one of the greatest moments in a theoretical physicist's work," he says. In an ideal case, such an effect would even unlock unexpected technological potential.

All this has come together in a recent project that Thomale pursued together with the experimental optics group of Professor Alexander Szameit at the University of Rostock; the results have now been published in the journal Science.

Spot landing in a 10-kilometre optical fibre

"We have managed to realise an effect we call a 'light funnel'," Thomale explains. Through this new effect, light in a 10-kilometre optical fibre can be accumulated at one specific point of choice along the fibre. The mechanism underlying this phenomenon is the so-called "non-Hermitian skin effect", to which Thomale contributed relevant theoretical work in 2019. Specifically, Thomale's work placed the skin effect in the framework of topological states of matter.

Topological matter has evolved into one of the most vibrant areas of research in contemporary physics. In Würzburg, the field was pioneered by the semiconductor research of Gottfried Landwehr and Klaus von Klitzing (Nobel laureate 1985), and was continued over the past decade by Laurens W. Molenkamp.

Research on the topology of nature

The term topology originates from the ancient Greek words for "place" and "study". Founded as a predominantly mathematical discipline, it has since spread broadly into physics, including optics. Together with other platforms of synthetic matter, these optical systems form the broader field of topological metamaterials, from which researchers expect fundamental technological innovation in the future.

Here, physicists do not restrict themselves to materials and chemical compositions found in nature. Rather, they develop new synthetic crystals composed of tailored artificial degrees of freedom. For the light funnel developed by Thomale and Szameit, the platform of choice is an optical fibre, which conducts light along its length while allowing detailed, spatially resolved manipulation.

Optical detectors with high sensitivity

"The light accumulation achieved by the light funnel could be the basis for improving the sensitivity of optical detectors and thus enable unprecedented optical applications," Thomale explains. According to Thomale, however, the light funnel is only the beginning. "Already at this stage we are working on many new ideas in the realm of topological photonics and their potential technological applications."

In Thomale's view, Würzburg provides an excellent environment for pursuing this direction of research. This has recently been demonstrated by the Cluster of Excellence ct.qmat, which was jointly awarded to JMU Würzburg and TU Dresden. A major research pillar of ct.qmat centres on synthetic topological matter, strongly supported by the work done at Thomale's chair TP1 in Würzburg.

The research team in Rostock around Alexander Szameit is closely integrated into ct.qmat. For instance, Thomale and Szameit jointly supervise PhD students funded through the cluster. "Only a few months after its foundation, the synergies created by ct.qmat are already paying off and demonstrate the stimulating impact of such an excellence cluster on cutting-edge research in Germany," Thomale concludes.

Credit: 
University of Würzburg

Researchers look for dark matter close to home

ANN ARBOR--Eighty-five percent of the universe is composed of dark matter, but we don't know what, exactly, it is.

A new study from the University of Michigan, Lawrence Berkeley National Laboratory (Berkeley Lab) and University of California, Berkeley has ruled out dark matter being responsible for mysterious electromagnetic signals previously observed from nearby galaxies. Prior to this work there were high hopes that these signals would give physicists hard evidence to help identify dark matter.

Dark matter can't be observed directly because it does not absorb, reflect or emit light, but researchers know it exists because of the effect it has on other matter. We need dark matter to explain gravitational forces that hold galaxies together, for example.

Physicists have suggested dark matter could be a close cousin of the neutrino, called the sterile neutrino. Neutrinos--subatomic particles with no charge that rarely interact with matter--are released during nuclear reactions taking place inside the sun. They have a tiny amount of mass, but this mass is not explained by the Standard Model of particle physics. Physicists suggest that the sterile neutrino, a hypothetical particle, could account for this mass and also be dark matter.

Researchers should be able to detect the sterile neutrino because it is unstable, says Ben Safdi, co-author and an assistant professor of physics at U-M. It decays into ordinary neutrinos and electromagnetic radiation. To detect dark matter, then, physicists scan galaxies to hunt for this electromagnetic radiation in the form of X-ray emission.

In 2014, a seminal work discovered excess X-ray emission from nearby galaxies and galaxy clusters. The emission appeared to be consistent with that which would arise from decaying sterile neutrino dark matter, Safdi said.

Now, a meta-analysis of raw data taken by the XMM-Newton space X-ray telescope, covering objects in the Milky Way over a period of 20 years, has found no evidence that the sterile neutrino is what comprises dark matter. The research team includes U-M doctoral student Christopher Dessert and Nicholas Rodd, a physicist with the Berkeley Lab theory group and the Berkeley Center for Theoretical Physics. Their results are published in the journal Science.

"This 2014 paper and follow-up works confirmed the signal generated a significant amount of interest in the astrophysics and particle physics communities because of the possibility of knowing, for the first time, precisely what dark matter is at a microscopic level," Safdi said. "Our finding does not mean that the dark matter is not a sterile neutrino, but it means that--contrary to what was claimed in 2014--there is no experimental evidence to-date that points towards its existence."

Space-based X-ray telescopes such as XMM-Newton point at dark-matter-rich environments to search for this faint electromagnetic radiation in the form of X-ray signals. The 2014 discovery named the X-ray emission the "3.5 keV line"--keV stands for kilo-electronvolt--because of where the signal appeared on X-ray detectors.
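For context, the standard two-body decay kinematics behind that number (a textbook relation, not a result of this study) is simple: a sterile neutrino at rest decaying into a nearly massless active neutrino and a photon gives the photon roughly half its rest energy, so a 3.5 keV line would point to a roughly 7 keV sterile neutrino.

```latex
% Two-body decay \nu_s \to \nu + \gamma with m_\nu \approx 0:
% the photon carries about half of the sterile neutrino's rest energy.
E_\gamma \simeq \frac{m_s c^2}{2}
\qquad\Rightarrow\qquad
E_\gamma = 3.5\ \mathrm{keV} \;\leftrightarrow\; m_s c^2 \approx 7\ \mathrm{keV}
```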

The research team searched for this line in our own Milky Way using 20 years of archival data taken by the XMM-Newton space X-ray telescope. Physicists know dark matter collects around galaxies, so when previous analyses looked at nearby galaxies and galaxy clusters, each of those images would have captured some column of the Milky Way dark matter halo.

The team used those images to look at the "darkest" part of the Milky Way. This significantly improved on the sensitivity of previous analyses looking for sterile neutrino dark matter, Safdi said.

"Everywhere we look, there should be some flux of dark matter from the Milky Way halo," said the Berkeley Lab's Rodd, because of our solar system's location in the galaxy. "We exploited the fact that we live in a halo of dark matter" in the study.

Christopher Dessert, a study co-author who is a physics researcher and Ph.D. student at U-M, said the galaxy clusters where the 3.5 keV line has been observed also have large background signals, which serve as noise in observations and can make it difficult to pinpoint specific signals that may be associated with dark matter.

"The reason why we're looking through the galactic dark matter halo of our Milky Way galaxy is that the background is much lower," Dessert said.

For example, XMM-Newton has taken images of isolated objects like individual stars in the Milky Way. The researchers took these images and masked the objects of original interest, leaving pristine and dark environments in which to search for the glow of dark matter decay. Combining 20 years of such observations allowed for a probe of sterile neutrino dark matter to unprecedented levels.

If sterile neutrinos were dark matter, and if their decay led to an emission of the 3.5 keV line, Safdi and his fellow researchers should have observed that line in their analysis. But they found no evidence for sterile neutrino dark matter.

"While this work does, unfortunately, throw cold water on what looked like what might have been the first evidence for the microscopic nature of dark matter, it does open up a whole new approach to looking for dark matter which could lead to a discovery in the near future," Safdi said.

Credit: 
University of Michigan

New type of immunotherapy hinders the spread of ovarian cancer

Malignant ovarian cancer is insidious: it is known and feared for vague, uncharacteristic symptoms that often mean the disease is discovered so late that, on average, only four out of six patients are still alive after five years. Researchers from Aarhus University, Denmark, are hoping to change this situation in the future.

Together with research colleagues from France and the UK, they have published a study conducted in mice showing that it appears to be possible to hinder the spread of ovarian cancer and shrink the tumour by removing specific immune cells, known as macrophages, from the fat that is stored in the abdominal cavity and hangs in front of the intestines. This omental fat, as it is known, became well known a few years ago under the name 'skinny fat'. However, it should not be confused with the layer of fat seen in overweight people as visible rolls of fat under the skin.

Anders Etzerodt, PhD, is an assistant professor of cancer immunology at the Department of Biomedicine at Aarhus University and the lead author of the study, which has been published in the Journal of Experimental Medicine. He explains that ovarian cancer most often arises in the fallopian tubes, and that the starting point for the research project was the well-established knowledge that cancer cells from this type of cancer can detach and shed into the abdominal cavity. Because this occurs very early in the course of the disease, the 'homeless' cancer cells need to fasten onto something to survive.

"This is where the omental fat becomes a kind of host for cells which would otherwise perish, and our research now shows that when tumour cells move into the omental fat, two specific types of immune cell known as macrophages alter character. They develop into the disease's small supporters," says Anders Etzerodt.

"One of the macrophage types which is already present in the tissue simply begins to help the tumour spread further to the other organs in the abdominal cavity. At the same time, the second type of macrophage, which comes from the bloodstream and is recruited as a reaction to the infiltration of tumour cells into the omental fat, begins to counteract the immune system's attempt to fight the invasive cancer cells. In this way, they help the tumour to grow larger," says Anders Etzerodt about the new findings.

In the study, the researchers initially experimented with removing the macrophages already found in the tissue, which led them to establish that this inhibited the spread of cancer in the abdominal cavity - though without the tumour in the omental fat becoming smaller. When the researchers simultaneously removed the above-mentioned macrophages from the bloodstream, the result was both less spreading and a shrinking tumour.

"We describe a type of immunotherapy which differs from the immunotherapy that is characterised by supporting the T-cells that kill a tumour, and which has become an established part of modern immunological treatment," says Anders Etzerodt.

"What we're doing is also immunotherapy, but it focuses on another part of the immune system. This project is only the third scientific article to describe how macrophages with different origins affect tumour development, and precisely how the macrophages that are found to inhibit the immune system's ability to hamper the cancer can be removed. They 'put the brakes on the brake', if you will," he explains.

Anders Etzerodt also explains that he and his colleagues found the new types of macrophages using a new technique called single-cell sequencing, a method which gives the researchers very detailed information about the processes that take place in each individual cell.

According to Anders Etzerodt, the research result has obvious potential for improved treatment in the future, though with the important proviso that the testing has only been conducted on mice so far. The next step is to develop a medicine which can be tested on people. According to Anders Etzerodt, this is particularly interesting because the research group has previously shown that similar macrophages from the bloodstream are also present in models for skin cancer.

"So far, we've gained a new and deeper understanding of what is helping and what is hindering the body in the development of ovarian cancer, and I'm looking forward to testing this in clinical trials on patients who currently have a really poor prognosis," he says.

Credit: 
Aarhus University

Local community involvement crucial to restoring tropical peatlands

image: Indonesia's peatlands are a crucial habitat for many birds and animals, including endangered species such as orang-utans and tigers.

Image: 
Caroline Ward, University of York.

Local community involvement is vital in efforts to raise water levels to help restore Indonesia's tropical peatlands, a new study has found.

Unspoilt peatlands act as a carbon sink and play an important role in reducing global carbon emissions. They are also a crucial habitat for many birds and animals, including endangered species such as orang-utans and tigers.

Draining peatland for farming destroys habitats and causes the peat to emit the carbon it once stored. The dry land also becomes prone to fire - leading to increased carbon emissions and a threat to the lives of many species including humans.

The study, led by researchers at the University of York, interviewed people involved in work to conserve and restore Indonesia's 15 million hectares of peatland - more than half of which has been drained and converted to cropland.

The international team, which included researchers from Jambi University and the Indonesia Soil Research Institute, gained the views of scientists, charities, and government officials. All participants agreed that getting support from local communities and raising water levels were key to successful restoration.

Lead author of the study, Dr Caroline Ward from the Leverhulme Centre for Anthropocene Biodiversity at the University of York, said: "While many peatlands in Indonesia have been drained for large palm oil plantations, many hectares have also been drained for local smallholders and farmers to grow crops on.

"Many of these local people have no other option but to use the land in this way. Efforts need to be made across all stakeholders involved in peatland restoration to provide people with an alternative source of income or a crop which can be farmed in a more sustainable way."

Restoring drained peatland involves a process of "re-wetting" where canals draining water away are blocked or filled in.

Professor Lindsay Stringer from the University of Leeds commented that "rewetting brings the water table closer to the surface, so this makes it less likely that the peatlands will catch fire. Rewetting is often used together with revegetation and revitalisation to improve the overall peatland condition".

Professor Jane Hill, project leader at the University of York, commented that "restoration of peatlands to prevent fires has huge benefits for local communities and preventing forest fires is vital for conserving wildlife".

Dr Ward added that "Tropical peatland restoration is really important to reducing global carbon emissions.

"Our study highlights that while everyone agrees that restoration of peatland is crucial, there is no consensus on how this should be achieved and how much progress has been made to date.

"We need to gather an evidence base urgently, and establish better ways of collaborating and sharing information between different groups."

"Wading through the swamp: what does tropical peatland restoration mean to national level stakeholders in Indonesia?" is published in Restoration Ecology.

Credit: 
University of York

Quantum phenomenon governs organic solar cells

image: Olle Inganäs, professor emeritus, Linköping University

Image: 
Thor Balkhed

Researchers at Linköping University have discovered a quantum phenomenon that influences the formation of free charges in organic solar cells. "If we can properly understand what's going on, we can increase the efficiency", says Olle Inganäs, professor emeritus.

Doctoral student Qingzhen Bian obtained unexpected results when he set up an experiment to optimise a solar cell material consisting of two light-absorbing polymers and an acceptor material. Olle Inganäs, professor emeritus in the Division of Biomolecular and Organic Electronics, asked him to repeat the experiment to eliminate the possibility of measurement errors. Time after time, in experiments carried out both at LiU and by colleagues in Lund, the same thing happened: a tiny periodic waveform lasting a few hundred femtoseconds appeared in the optical absorption signature as a photocurrent formed in the solar cell material. What was going on?

The explanation has been published in Nature Communications.

Some background: when light in the form of photons is absorbed in a semiconducting polymer, an exciton forms. Excitons are bound electron-hole pairs in the polymer. On their own, the electrons are not released, and the transport of charges--the photocurrent--does not arise. When the electron-donating polymer is mixed with a molecule that accepts electrons, the electrons can be released. The electrons then only need to take a small jump to become free, and the loss of energy is kept to a minimum. The holes and the electrons transport the photocurrent, and the solar cell starts to produce electricity.

This has been well-known for a long time. However, the remarkable waveform then appeared in Qingzhen Bian's experiment.

"The only conceivable explanation is that coherence arises between the excited system and the separated charges. We asked the quantum chemists to look into this and the results we obtain in repeated experiments agree well with their calculations", says Olle Inganäs.

At the quantum scale, atoms vibrate, and they vibrate faster when they are heated. It is these vibrations that interact with each other in some way and with the excited system of electrons: the phases of the waves follow each other and a state of coherence arises.

"The coherence helps to create the charges that give the photocurrent, which takes place at room temperature. But we don't know why or how yet", says Olle Inganäs.

The same quantum coherence is found in the biological world.

"An intense debate is ongoing among biophysics researchers whether systems that use photosynthesis have learnt to exploit coherence or not. I find it unlikely that millions of years of evolution have not resulted in the natural world exploiting the phenomenon", says Olle Inganäs.

"If we understood better how the charge carriers are formed and how the process is controlled, we should be able to use it to increase the efficiency of organic solar cells. The vibrations depend on the structure of the molecule, and if we can design molecules that contribute to increasing the photocurrent, we can also use the phenomenon to our advantage", he says.

Credit: 
Linköping University

Artificial intelligence identifies optimal material formula

image: A look into the sputtering system where nanostructured layers are generated.

Image: 
© Lars Banko

Porous or dense, columns or fibres

During the manufacture of thin films, numerous control variables determine the condition of the surface and, consequently, its properties. Relevant factors include the composition of the layer as well as process conditions during its formation, such as temperature. All these elements put together result in the creation of either a porous or a dense layer during the coating process, with atoms combining to form columns or fibres. "In order to find the optimal parameters for an application, it used to be necessary to conduct countless experiments under different conditions and with different compositions; this is an incredibly complex process," explains Professor Alfred Ludwig, Head of the Materials Discovery and Interfaces Team.

Findings yielded by such experiments are so-called structure zone diagrams, from which the surface of a certain composition resulting from certain process parameters can be read. "Experienced researchers can subsequently use such a diagram to identify the most suitable location for an application and derive the parameters necessary for producing the suitable layer," points out Ludwig. "The entire process requires an enormous effort and is highly time consuming."

Algorithm predicts surface

Striving to find a shortcut towards the optimal material, the team took advantage of artificial intelligence, more precisely machine learning. To this end, PhD researcher Lars Banko, together with colleagues from the Interdisciplinary Centre for Advanced Materials Simulation at RUB, Icams for short, modified a so-called generative model. He then trained this algorithm to generate images of the surface of a thoroughly researched model layer of aluminium, chromium and nitrogen using specific process parameters, in order to predict what the layer would look like under the respective conditions.

"We fed the algorithm with a sufficient amount of experimental data in order to train it, but not with all known data," stresses Lars Banko. Thus, the researchers were able to compare the results of the calculations with those of the experiments and analyse how reliable its prediction was. The results were conclusive: "We combined five parameters and were able to look in five directions simultaneously using the algorithm - without having to conduct any experiments at all," outlines Alfred Ludwig. "We have thus shown that machine learning methods can be transferred to materials research and can help to develop new materials for specific purposes."

Credit: 
Ruhr-University Bochum

Interactive product labels require new regulations, study warns

Artificial intelligence will be increasingly used on labels on food and other products in the future to make them interactive, and regulations should be reformed now so they take account of new innovations, a study warns.

Thanks to the increased use of smartphones, smart-watches and other interconnected products, labelling on foods and other goods may become more personalised and thus more helpful, addressing consumer concerns, such as nut allergies.

Facial recognition technology can be used by shops and manufacturers to collect data on the specific needs of consumers, as well as to prompt shop staff to offer assistance or enable features such as large print on labels, if necessary.

The study says AI technology could play a significant role in making labelling more comprehensive and personalised, but regulators across Europe must ensure the technology is also used for public good and to help consumers. Changes are especially needed because AI is currently mainly being used to collect data about customers, or to help manufacturing or distribution.

The changes should include the introduction of more specific rules about the design and content of consumer product labels in order to prevent producers from manipulating consumers' product and safety expectations by using AI.

The EU Product Liability Directive is being reviewed and it is hoped the research, published in the European Journal of Risk Regulation, can contribute to this work.

Dr Joasia Luzak, from the University of Exeter Law School, who carried out the research, said: "Modern technologies mean consumers can have more personal and comprehensive product labelling. The pace at which AI is being used means it would be wise to rethink the whole framework of the Product Liability Directive or to design a separate set of rules for products using modern technology. There is a danger this technology will only be used for the benefit of companies, not consumers."

The technology could benefit consumers: for example, they could store information about their allergies on a smartwatch, and that information would then be picked up by interconnected in-store devices. As a result, ingredients to which a consumer is allergic could be highlighted on product labels as the consumer approaches them. Companies often argue that they are not responsible when a fault occurs after the product leaves the manufacturing process and is put into circulation. However, if they continue to monitor the product through AI after it is put on the market, or if they retain the right to adjust information on the labels, the study says this justification should no longer apply.
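As a purely hypothetical illustration of that kind of label personalisation (no specific device or retailer API is implied), an interactive label could cross-check a consumer's stored allergen list against a product's ingredient list:

```python
def highlight_allergens(ingredients, consumer_allergens):
    """Return the product's ingredients with any of the consumer's
    stored allergens flagged for emphasis on an interactive label."""
    allergens = {a.lower() for a in consumer_allergens}
    return [
        f"**{item}**" if item.lower() in allergens else item
        for item in ingredients
    ]

# Hypothetical data, e.g. from a smartwatch profile and a product database.
profile = ["peanuts", "hazelnuts"]
label = ["wheat flour", "sugar", "peanuts", "palm oil"]
print(highlight_allergens(label, profile))
# ['wheat flour', 'sugar', '**peanuts**', 'palm oil']
```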

The research says more extensive AI product labelling, providing consumers with a greater list of warnings about product risks, should not be an excuse for manufacturers to avoid taking action or pass responsibility when a product becomes unsafe or malfunctions.

Dr Luzak said: "Consumers will likely pay more attention to personalised labelling. The use of modern technologies to personalise product labelling could be in the interests of both producers and consumers.

"Producers could gain more insights into their supply chain and more control over their products, as well as reaching more consumers with their product information. Consumers should be able to rely on better product information and to form more realistic expectations regarding consumer products.

"The increased tracking and monitoring of products should raise the level of product safety, which always reduces the instances of product liability."

The study says the definition of a "defective product" in the Product Liability Directive should be based on an objective assessment of product safety. This could be the product's adherence to safety standards and to the safety expectations of experts, rather than those of the public at large.

Credit: 
University of Exeter

Wildfire perceptions largely positive after hiking in a burned landscape

image: Students in a UC Davis fire ecology class walk along a burned ridge top of Stebbins Cold Canyon Natural Reserve in 2016.

Image: 
Alexandra Weill, UC Davis

When hikers returned to UC Davis Stebbins Cold Canyon Reserve in 2016, a year after a wildfire swept through its expanse of oak trees and chaparral in Northern California, half of them expected to see a devastated landscape. But pre- and post-hike surveys conducted by the University of California, Davis, reveal that roughly a third returned energized, awed and excited about the changes they saw.

Among the survey responses: "This area is restoring itself." "Awe-inspiring." "Nature is always changing, sometimes sad. Today I felt hopeful."

Results of the survey, published in the International Journal of Wildland Fire, indicate that people understand and appreciate the role of fire in natural landscapes more than is commonly assumed.

"People can have really largely positive experiences hiking in a place that has burned," said lead author Alexandra Weill, who conducted the survey while a graduate student researcher in Professor Andrew Latimer's lab in the UC Davis Department of Plant Sciences. "They engage in it and find it very interesting and surprisingly beautiful. That can be used as a tool in education and outreach as places around us recover from wildfire."

GETTING THE PRESCRIBED BURN MESSAGE

Survey responses were gathered from about 600 people between May 2016 and June 2017. Responses indicate that most participants -- about 70 percent -- were getting the message that prescribed burns can benefit ecosystems and reduce the threat of catastrophic fire.

Survey participants were highly familiar with the narrative of the West's history of fire suppression and fairly familiar with fire topics related to conifer forests. But they were less knowledgeable about fire's history and role in the shrublands and woodlands that dominate much of Northern California.

Several of the state's most devastating recent fires -- the Camp Fire in Paradise, the Tubbs and Kincade fires in Santa Rosa, the Mendocino Complex Fire -- burned in environments of oak woodland and chaparral, such as at Stebbins Cold Canyon. Fires in these areas burn differently than those in conifer forests.

This disconnect could indicate a gap in fire outreach and education. Weill suggested that educators and agencies adjust the narrative to reflect people's local landscape.

NUANCED VIEWS

While positive responses were far more common than expected, most people held mixed views regarding effects of the fire. For example: "I know it's good, but it's sad when it's out of control and people lose homes." "I understand [it] needs to happen -- but devastating!"

Such wariness is not surprising but it is illuminating, Weill said.

"People have more nuanced opinions than we give them credit for in understanding positive and negative effects of fire, but also difficulty in reconciling what they know about good fire versus what they see in the news or personal experiences," said Weill.

Credit: 
University of California - Davis

Upgrading biomass with selective surface-modified catalysts

image: (Top row) Jiayi Fu, Jonathan Lym, (middle row) Weiqing Zheng, Konstantinos Alexopoulos, Alexander Mironenko, and (bottom row) Dionisios Vlachos of the Catalysis Center for Energy Innovation--a U.S. Department of Energy (DOE) Energy Frontier Research Center at the University of Delaware--and Anibal Boscoboinik of Brookhaven Lab's Center for Functional Nanomaterials. The scientists are part of a team that designed and characterized a catalyst to promote the selective breaking of a carbon-oxygen bond in a plant derivative (chemical structure with the OH side group shown in the image background) to produce a potential biofuel.

Image: 
University of Delaware

UPTON, NY--Scientists have designed a catalyst composed of very low concentrations of platinum (single atoms and clusters smaller than billionths of a meter) on the surface of titanium dioxide. They demonstrated how this catalyst significantly enhances the rate of breaking a particular carbon-oxygen bond for the conversion of a plant derivative (furfuryl alcohol) into a potential biofuel (2-methylfuran). Their strategy--described in a paper published in Nature Catalysis on Mar. 23--could be applied to design stable, active, and selective catalysts based on a wide range of metals supported on metal oxides to produce industrially useful chemicals and fuels from biomass-derived molecules.

"For a molecule to generate a particular product, the reaction has to be directed along a certain pathway because many side reactions that are not selective for the desired product are possible," explained co-author Anibal Boscoboinik, a staff scientist in the Center for Functional Nanomaterials (CFN) Interface Science and Catalysis Group at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory. "To convert furfuryl alcohol into biofuel, the bond between carbon and oxygen atoms on the side group attached to the ring-shaped part of the molecule must be broken, without producing any reactions in the ring. Typically, the metal catalyst that breaks this bond also activates ring-related reactions. However, the catalyst designed in this study only breaks the side group carbon-oxygen bond."

Aromatic rings are structures with atoms connected through single or double bonds. In molecules derived from plant waste, aromatic rings often have oxygen-containing side groups. Transforming plant waste derivatives into useful products requires the removal of oxygen from these side groups by breaking specific carbon-oxygen bonds.

"Biomass contains a lot of oxygen, which needs to be partially removed to leave behind more useful molecules for the production of renewable fuels, plastics, and high-performance lubricants," said co-first author Jiayi Fu, a graduate student at the Catalysis Center for Energy Innovation (CCEI) at the University of Delaware (UD). "Hydrodeoxygenation, a reaction in which hydrogen is used as a reactant to remove oxygen from a molecule, is useful for converting biomass into value-added products."

In this study, the scientists hypothesized that adding noble metals to the surfaces of moderately reducible metal oxides--those that can lose and gain oxygen atoms--would promote hydrodeoxygenation.

"Removing oxygen from the oxide surface forms an anchoring site where molecules can be held in place so the necessary bonds can be broken and formed," said co-first author and UD CCEI graduate student Jonathan Lym. "Previous studies in the catalysis and semiconductor communities have shown how much impurities can influence the surface."

To test their hypothesis, the team selected platinum as the noble metal and titanium dioxide (titania) as the metal oxide. Theoretical calculations and modeling indicated that the formation of oxygen vacancies is more energetically favorable when single atoms of platinum are introduced onto the surface of titania.

After synthesizing the platinum-titania catalyst at UD, they performed various structural and chemical characterization studies using facilities at Brookhaven and Argonne National Labs. At the CFN Electron Microscopy Facility, they imaged the catalyst at high resolution with a scanning transmission electron microscope. At Brookhaven's National Synchrotron Light Source II (NSLS-II), they used the In situ and Operando Soft X-ray Spectroscopy (IOS) beamline and the Quick X-ray Absorption and Scattering (QAS) beamline to track the chemical (oxidation) state of platinum. Through complementary x-ray spectroscopy studies at Argonne's Advanced Photon Source (APS), they determined the distance between atoms in the catalyst.

"This work is a great example of how scientific user facilities provide researchers with the complementary information needed to understand complex materials," said CFN Director Chuck Black. "The CFN is committed to our partnership with NSLS-II to enable these sorts of studies by scientists from around the world."

Back at Delaware, the team performed reactivity studies in which they put the catalyst and furfuryl alcohol in a reactor and detected the products through gas chromatography, an analytical chemistry separation technique. In addition to these experiments, they theoretically calculated the amount of energy required for different steps of the reaction to proceed. On the basis of these calculations, they ran computer simulations to determine the preferable reaction pathways. The simulated and experimental product distributions both indicated that negligible ring-reaction products are generated when a low concentration of platinum is present. As this concentration is increased, the platinum atoms begin to aggregate into larger clusters that incite ring reactions.

"The complementary experimental and computational framework allows for a detailed understanding of what is happening on the surface of a very complex material in a way that we can generalize concepts for the rational design of catalysts," said Boscoboinik. "These concepts can help in predicting suitable combinations of metals and metal oxides to carry out desired reactions for converting other molecules into valuable products."

"This multimember teamwork can only be enabled by center-like activities," added corresponding author Dionisios Vlachos, the UD Allan & Myra Ferguson Chair of Chemical Engineering.

Credit: 
DOE/Brookhaven National Laboratory

Designing lightweight glass for efficient cars, wind turbines

A new machine-learning algorithm for exploring lightweight, very stiff glass compositions can help design next-gen materials for more efficient vehicles and wind turbines. Glasses can reinforce polymers to generate composite materials that provide strength similar to metals but with less weight.

Liang Qi, a professor of materials science and engineering at the University of Michigan, answered questions about his group's new paper in npj Computational Materials.

What is elastic stiffness? Elastic and glass don't seem to be two words that go together.

All solid materials, including glass, have a property called elastic stiffness--also known as elastic modulus. It's a measure of how much force per unit area is needed to make the material bend or stretch. If that change is elastic, the material can totally recover its original shape and size once you stop applying the force.
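In equation form (the standard definition, not something specific to this study), the elastic modulus is the ratio of applied stress to the resulting elastic strain:

```latex
% E: elastic modulus; sigma: stress (force F per cross-sectional area A);
% epsilon: strain (change in length relative to the original length L_0).
E = \frac{\sigma}{\varepsilon} = \frac{F/A}{\Delta L / L_0}
```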

Why do we want light and very stiff glasses?

Elastic stiffness is critical for any materials in structural applications. Higher stiffness means that you can sustain the same force loading with a thinner material. For example, the structural glass in car windshields, and in touch screens on smartphones and other screens, can be made thinner and lighter if the glasses are stiffer. Glass fiber composites are widely used lightweight materials for cars, trucks and wind turbines, and we can make these parts even lighter.

Lighter vehicles can go further on a gallon of gas--6-8% further for a 10% reduction in weight, according to the U.S. Office of Energy Efficiency and Renewable Energy. Weight reduction can also significantly increase the range of electric vehicles.

Lighter, stiffer glass can enable wind turbine blades to transfer wind power into electricity more efficiently because less wind power is "wasted" to make the blades rotate. It can also enable longer wind turbine blades, which can generate more electricity under the same wind speed.

What are the challenges in trying to design light but resilient glasses?

Because glasses are amorphous--or disordered--materials, it's hard to predict their atomistic structures and the corresponding physical/chemical properties. We use computer simulations to speed up the study of glasses, but they require so much computing time that it is impossible to investigate each possible glass composition.

The other problem is that we don't have enough data about glass compositions for machine learning to be effective at predicting glass properties for new glass compositions. Machine learning algorithms are fed data, and they find patterns in the data that enable them to make predictions. But without enough of the right training data, their predictions aren't reliable--just like a political poll conducted in Ohio cannot predict the election in Michigan.

How did you overcome these barriers?

First, we used existing high-throughput computer simulations to generate data on the densities and elastic stiffnesses of various glasses. Second, we developed a machine learning model that is more suitable for a small amount of data--because we still didn't have a lot of data by machine learning standards. We designed it so that the key thing it pays attention to is the strength of the interaction between atoms. In essence, we used physics to give it hints about what was important in the data, and that improves the quality of its predictions for new compositions.
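A minimal, hypothetical sketch of that kind of workflow (illustrative only; the descriptors, numbers and model choice below are placeholders, not the paper's actual method): describe each glass by a few physics-motivated features, fit a model that copes with small datasets, and then screen a large batch of candidate compositions.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

# Placeholder physics-motivated descriptors for a handful of simulated
# glasses, e.g. a mean interatomic interaction strength and a packing
# density; stiffness targets (GPa) are made-up numbers for illustration.
X_train = np.array([
    [0.62, 2.20], [0.71, 2.35], [0.55, 2.10],
    [0.80, 2.50], [0.67, 2.28], [0.74, 2.41],
])
y_train = np.array([72.0, 81.0, 64.0, 95.0, 76.0, 88.0])

# Kernel ridge regression copes reasonably well with small training sets.
model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=5.0)
model.fit(X_train, y_train)

# Screen a large batch of hypothetical candidate compositions at once.
rng = np.random.default_rng(0)
candidates = rng.uniform([0.5, 2.0], [0.85, 2.6], size=(100_000, 2))
predicted_stiffness = model.predict(candidates)

best = candidates[np.argmax(predicted_stiffness)]
print("Most promising candidate descriptors:", best)
```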

What can your model do?

While we trained our machine learning model with glasses made of silicon dioxide and one or two other additives, we found that it could accurately predict the lightness and elastic stiffness of more complex glasses, with more than 10 different components. It can screen as many as 100,000 different compositions at once.

What are the next steps?

Lightness and elastic stiffness are only two properties that are important in designing glasses. We also need to know their strength, toughness, and their melting temperatures. By openly sharing our data and methods, we hope to inspire the development of new models in the glass research community.

Credit: 
University of Michigan

Coral tells own tale about El Niño's past

image: Georgia Tech climate scientist Kim Cobb samples an ancient coral for radiometric dating. She is part of a team of Rice University and Georgia Tech scientists using data from coral fossils to build a record of temperatures in the tropical Pacific Ocean over the last millennium.

Image: 
Cobb Lab

HOUSTON - (March 26, 2020) - There is no longer a need to guess what ocean temperatures were like in the remote tropical Pacific hundreds of years ago. The ancient corals that lived there know all.

A study in Science led by Rice University and Georgia Tech researchers parses the record archived by ancient tropical Pacific coral over the past millennium. That record could help scientists refine their models of how changing conditions in the Pacific, particularly from volcanic eruptions, influence the occurrence of El Niño events, which are major drivers of global climate.

They found that the ratio of oxygen isotopes sequestered in coral, an accurate measure of historic ocean temperatures, shows no correlation between tropical volcanic eruptions--as estimated from the sulfate particles they ejected into the atmosphere--and El Niño events.

That result could be of particular interest to scientists who suggest seeding the atmosphere with sun-blocking particles may help reverse global warming.

According to Rice climate scientist and primary author Sylvia Dee, previous climate model studies often tie volcanic eruptions, which increase sulfate aerosols in the atmosphere, to increased chances for an El Niño event. But the ability to analyze climate conditions based on oxygen isotopes trapped in fossil corals extends the climatological record in this key region across more than 20 ancient eruptions. Dee said this allows for a more rigorous test of the connection.

"A lot of climate modeling studies show a dynamical connection where volcanic eruptions can initiate El Niño events," Dee said. "We can run climate models many centuries into the past, simulating volcanic eruptions for the last millennium.

"But the models are just that -- models -- and the coral record captures reality."

Coral data that Georgia Tech climate scientist Kim Cobb and her team arduously collected on trips to the Pacific show little connection between known volcanoes and El Niño events over that time. Like tree rings, these paleoclimate archives hold chemical indicators, the oxygen isotopes, of oceanic conditions at the time they formed.

The coral data yields a high-fidelity record with a resolution of less than a month, tracking the El Niño-Southern Oscillation (ENSO) in the heart of the central tropical Pacific.

The eight time-overlapped corals Cobb and her colleagues recently studied held an unambiguous record of conditions over 319 years, from 1146-1465. This and data from other corals spans more than 500 years of the last millennium and, they wrote, "presents a window into the effects of large volcanic eruptions on tropical Pacific climate."

That span of time includes the 1257 eruption of Mt. Samalas, the largest and most sulfurous of the last millennium.

Cobb said her lab has been developing techniques and expanding the coral record for years. "My first expedition to the islands was in 1997, and it has been my sole focus pretty much since then to extract the best records that we can from these regions," she said, noting the lab has issued many papers on the topic, including a groundbreaking 2003 study on ENSO in Nature.

Cobb said dating the ancient coral samples depends on precise uranium-thorium dating, followed by thousands of mass spectrometric analyses of coral oxygen isotopes from powders drilled every 1 millimeter across the coral's growth axis. "That speaks to the temperature reconstruction," she said. "We're borrowing on 70 years of work with this particular chemistry to establish a robust temperature proxy in corals."

The ratio of oxygen-16 to oxygen-18 isotopes revealed by spectrometry shows the temperature of the water at the time the coral formed, Cobb said. "The ratio of those two isotopes in carbonates is a function of the temperature," she said. "That's the magic: It's based on pure thermodynamics."
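For reference, the isotope ratio is reported as a delta value relative to a standard (the textbook convention; the calibration slope quoted in the comment is only an approximate, illustrative figure, not a number from this study):

```latex
% Oxygen isotope ratio expressed as a delta value (in per mil)
% relative to a reference standard:
\delta^{18}\mathrm{O} =
\left(
  \frac{(^{18}\mathrm{O}/^{16}\mathrm{O})_{\text{sample}}}
       {(^{18}\mathrm{O}/^{16}\mathrm{O})_{\text{standard}}} - 1
\right) \times 1000
% In coral aragonite, delta-18-O falls by roughly 0.2 per mil for each
% degree Celsius of warming (approximate; it also depends on the
% isotopic composition of the seawater itself).
```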

"This beautiful coral record is highly sensitive to El Niño and La Niña events based on its location," Dee added. "My collaborators worked to extend this coral record to span a period where we know there were a lot of explosive volcanic eruptions, especially in the first half of the millennium.

"Scientists have reconstructed the timing of those volcanic eruptions from ice-core records," she said. "We compared the timing of the largest eruptions to the coral record to see if volcanic cooling events had any impact on tropical Pacific climate."

Only some volcanoes launch particulate matter -- particularly sulfate particles, leading to a phenomenon called sulfate aerosol forcing -- into the stratosphere, where the particles reflect incoming sunlight and cool the planet over the short term, Dee said. "But that cooling's impact on the tropical Pacific is uncertain, and might be regionally heterogeneous," she said.

"Our study suggests that linkage (between volcanoes and ENSO) doesn't exist or, if it does, it is obscured by the large natural variability in the climate system," Dee said. "In general, El Niño is a natural oscillator in the climate system. It's a product of chaos, like a Slinky going back and forth. It is so strong that the system might be immune to big climate perturbations like short-term volcanic cooling.

"Incidentally, our scientific community uses the same climate models that we evaluated to estimate the climate's response to geoengineering and solar radiation management schemes that employ sulfate aerosols," Dee said.

Cobb and Dee characterized the study as a cautionary tale for those who study geoengineering. "There is no doubt whatsoever that if we inject stratospheric aerosols, we will cool the planet," Cobb said. "That's been shown and modelled. What we're trying to ask is, what else happens? And how well can we predict that? Our work really motivates further study to flesh out the full scope of climate impacts from sulfate aerosols."

Credit: 
Rice University

Moffitt researchers discover novel role of specific histone deacetylase in lung cancer

TAMPA, Fla. - The survival rates for patients with non-small cell lung cancer (NSCLC) have improved greatly over the past decade thanks to several new targeted treatment options. However, lung cancer still remains the number one cause of cancer-related mortality, leading to approximately 154,000 deaths each year in the United States. Many patients do not respond to these new targeted therapies or they may develop drug resistance. Researchers at Moffitt Cancer Center are trying to identify alternative strategies to treat this disease. In a new article published online in Scientific Reports, they highlight how targeting the histone deacetylase HDAC11 may be a novel therapeutic strategy for NSCLC.

Histone deacetylases (HDACs) are proteins that regulate the expression and activity of genes by altering DNA compaction and modifying proteins. HDACs are often deregulated in different types of cancer and several drugs that inhibit HDACs have been approved to treat these diseases. HDAC11 is one of the newest HDACs to be identified, but its role in cancer is not yet known.

Moffitt researchers conducted a series of preclinical studies to investigate the role of HDAC11 in NSCLC. They discovered that high levels of HDAC11 are found in samples from patients with NSCLC and that these high levels are associated with poor survival. The research team wanted to further delineate the potential role of HDAC11 in NSCLC development. They focused their studies on cancer stem cells (CSCs), which are slowly dividing cells that can undergo self-renewal. CSCs are known to contribute to tumor development and progression and are also highly resistant to chemotherapy and targeted drug treatments.

"It has been suggested that drugs that can eliminate CSCs would be effective as anti-cancer agents and could potentially overcome drug resistance," said Srikumar Chellappan, Ph.D., chair of the Department of Tumor Biology at Moffitt.

Chellappan and his team discovered that HDAC11 is found at high levels in CSCs and is associated with expression of the protein Sox2 - a gene that is highly important for the self-renewal of CSCs. When the researchers targeted HDAC11 in NSCLC with specific inhibitors, they observed that the ability of CSCs to undergo self-renewal and expression of Sox2 were greatly reduced. Given the importance of CSCs to lung cancer growth and development, these observations suggest that targeting HDAC11 may be a potential strategy to block the self-renewal process of CSCs and inhibit NSCLC progression.

In additional studies, the research team found that these specific HDAC11 inhibitors impacted several processes associated with cancer development, including the formation of vascular networks, anchorage independent growth and cell motility. The HDAC11 inhibitors also reduced the growth of lung cancer cells that were resistant to other targeted therapies and inhibited the growth of lung cancer cells grown in the presence of other cells that contribute to drug resistance.

"This study presents a mechanistic basis for the role of HDAC11 in lung adenocarcinoma and suggests that once the parameters for in vivo studies and efficacy are met, they would be of immense potential in combating NSCLC," said Chellappan.

Credit: 
H. Lee Moffitt Cancer Center & Research Institute

Quantum effect triggers unusual material expansion

image: Daniel Mazzone led the project to explore the mechanism that causes samarium sulfide to expand dramatically when cooled.

Image: 
Brookhaven National Laboratory

UPTON, NY--You know how you leave space in a water bottle before you pop it in the freezer--to accommodate the fact that water expands as it freezes? Most metal parts in airplanes face the more common opposite problem. At high altitudes (low temperatures) they shrink. To keep such shrinkage from causing major disasters, engineers make airplanes out of composites or alloys, mixing materials that have opposite expansion properties to balance one another out.

New research conducted in part at the U.S. Department of Energy's Brookhaven National Laboratory may bring a whole new class of chemical elements into this materials science balancing act.

As described in a paper just published in the journal Physical Review Letters, scientists used x-rays at Brookhaven's National Synchrotron Light Source II (NSLS-II)--a U.S. Department of Energy Office of Science user facility--and two other synchrotron light sources to explore an unusual metal that expands dramatically at low temperature. The experiments on samarium sulfide doped with some impurities revealed details about the material's atomic-level structure and the electron-based origins of its "negative thermal expansion."

This work opens avenues for designing new materials where the degree of expansion can be precisely tuned by tweaking the chemical recipe. It also suggests a few related materials that could be explored for metal-mixing applications.

"In practical applications, whether an airplane or an electronic device, you want to make alloys of materials with these opposite properties--things that expand on one side and shrink on the other when they cool down, so in total it stays the same," explained Daniel Mazzone, the paper's lead author and a postdoctoral fellow at NSLS-II and Brookhaven Lab's Condensed Matter Physics and Materials Science Department.

But materials that mimic water's expansion when chilled are few and far between. And while the expansion of freezing water is well understood, the dramatic expansion of samarium sulfide had never been explained.

Like other materials Mazzone has studied, this samarium-based compound (specifically samarium sulfide with some yttrium atoms taking the place of a few samarium atoms) is characterized by competing electronic phases (somewhat analogous to the solid, liquid, and gaseous phases of water). Depending on external conditions such as temperature and pressure, electrons in the material can do different things. In some cases, the material is a gold-colored metal through which electrons can move freely--a conductor. In other conditions, it's a black-colored semiconductor, allowing only some electrons to flow.

The golden metallic state is the one that expands dramatically when chilled, making it an extremely unusual metal. Mazzone and his colleagues turned to x-rays and theoretical descriptions of the electrons' behavior to figure out why.

At NSLS-II's Pair Distribution Function (PDF) beamline, the scientists conducted diffraction experiments. The PDF beamline is optimized for studies of strongly correlated materials under a variety of external conditions such as low temperatures and magnetic fields. For this experiment, the team placed samples of their samarium metal inside a liquid-helium-cooled cryostat in the beam of NSLS-II's x-rays and measured how the x-rays bounced off atoms making up the material's crystal structure at different temperatures.

"We track how the x-rays bounce off the sample to identify the locations of atoms and the distances between them," said Milinda Abeykoon, the lead scientist of the PDF beamline. "Our results show that, as the temperature drops, the atoms of this material move farther apart, causing the entire material to expand up to three percent in volume."

The team also used x-rays at the SOLEIL synchrotron in France and SPring-8 synchrotron in Japan to take a detailed look at what electrons were doing in the material at different stages of the temperature-induced transition.

"These 'x-ray absorption spectroscopy' experiments can track whether electrons are moving into or out of the outermost 'shell' of electrons around the samarium atoms," explained co-corresponding author Ignace Jarrige, a physicist at NSLS-II.

If you think back to one of the basics of chemistry, you might remember that atoms with unfilled outer shells tend to be the most reactive. Samarium's outer shell is just under half full.

"All the physics is essentially contained in this last shell, which is not full or not empty," Mazzone said.

The electron-tracking x-ray experiments revealed that electrons flowing through the samarium-sulfide metal were moving into that outer shell around each samarium atom. As each atom's electron cloud grew to accommodate the extra electrons, the entire material expanded.

But the scientists still had to explain the behavior based on physics theories. With the help of calculations performed by Maxim Dzero, a theoretical physicist from Kent State University, they were able to explain this phenomenon with the so-called Kondo effect, named after physicist Jun Kondo.

The basic idea behind the Kondo effect is that electrons will interact with magnetic impurities in a material, aligning their own spins in the opposite direction of the larger magnetic particle to "screen out," or cancel, its magnetism.
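In textbook form (a standard model Hamiltonian, not an equation taken from this paper), that screening is captured by an antiferromagnetic exchange coupling between the conduction electrons and the local moment:

```latex
% Conduction electrons (first term) couple antiferromagnetically (J > 0)
% to the local impurity spin S through their spin density s(0) at the
% impurity site, which lets them screen out the impurity's magnetism.
H_{\mathrm{K}} = \sum_{\mathbf{k},\sigma} \epsilon_{\mathbf{k}}\,
  c^{\dagger}_{\mathbf{k}\sigma} c^{\phantom{\dagger}}_{\mathbf{k}\sigma}
  \;+\; J\, \mathbf{S} \cdot \mathbf{s}(0), \qquad J > 0
```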

In the samarium-sulfide material, Dzero explained, the almost-half-full outer shell of each samarium atom acts as a tiny magnetic impurity pointing in a certain direction. "And because you have a metal, you also find free electrons that can approach and cancel out these little magnetic moments," Dzero said.

In elements subject to the Kondo effect, the electrons do not always fill the outermost shell; the effect can also go the other way, causing electrons to leave the shell. The direction is determined by a delicate energy balance dictated by the rules of quantum mechanics.

"For some elements, because of the way the outer shell fills up, it is more energetically favorable for electrons to move out of the shell. But for a couple of these materials, the electrons can move in, which leads to expansion," Jarrige said. In addition to samarium, the other two elements are thulium and ytterbium.

It would be worth exploring compounds containing these other elements as additional possible ingredients for creating materials that expand upon cooling, Jarrige said.

Finally, the scientists noted that the extent of the negative thermal expansion in samarium sulfide can be tuned by varying the concentration of impurities.

"This tunability makes this material very valuable for engineering expansion-balanced alloys," Mazzone said.

"The application of highly developed many-body theory modeling was an important part of the work to identify the connection between the magnetic state of this material and its volume expansion," said Jason Hancock, a collaborator at the University of Connecticut (UConn). "This collaboration between Kent State, UConn, Brookhaven Lab, partner synchrotrons, and synthesis groups in Japan could potentially guide new materials discovery efforts that make use of the unusual properties of these rare-earth materials."

Credit: 
DOE/Brookhaven National Laboratory

Is the coronavirus outbreak of unnatural origins?

Did the coronavirus mutate from a virus already prevalent in humans or animals, or did it originate in a laboratory? As scientists grapple with understanding the source of this rapidly spreading virus, the Grunow-Finke assessment tool (GFT) may assist them in determining whether the coronavirus outbreak is of natural or unnatural origin.

Unless the question of origin is asked, unnatural outbreaks cannot be identified. Public health training, practice and culture default to the assumption that every outbreak is natural in origin and do not routinely include risk assessments for unnatural origins.

A study, "Application of a risk analysis tool to Middle East respiratory syndrome (MERS-CoV) outbreak in Saudi Arabia," recently published in Risk Analysis, developed a modified GFT (mGFT) to improve the sensitivity of the tool, which has been validated against previous outbreaks.

The mGFT contains 11 criteria for determining if an outbreak is of unnatural origin. The criteria are as follows:

1. Existence of a biological risk: The presence of a political or terrorist environment from which a biological attack could originate.

2. Unusual strain: In unnatural outbreaks, the strains may be atypical, rare, antiquated, newly emerging, carrying mutations or of different origins, or genetically edited or created with synthetic biotechnology. The strain may demonstrate increased virulence, unusual environmental sustainability, resistance to prophylactic and therapeutic measures, or difficulty in detection and identification.

3. Special aspects of the biological agent: It cannot be ruled out that a biological agent has been genetically manipulated.

4. Peculiarities of the geographic distribution of disease: It is unusual from an epidemiological perspective if a disease is identified in the region concerned for the first time ever, or again after a long period of time.

5. High concentration of the biological agent in the environment: If a biological agent is released artificially, we can expect to find it in unusually high concentrations in the air, soil and drinking or surface water over a large area.

6. Peculiarities of the intensity and dynamics of the epidemic: These are characterized by the percentage of cases of a disease per unit of time or by the total number of cases.

7. Peculiarities of the transmission mode of the biological agent: In general, natural epidemics feature paths of transmission that are typical for the pathogen and its natural hosts; deviations from the natural paths of infection could indicate that biological agents have been deliberately disseminated.

8. Peculiarities of the time of the epidemic: Epidemics of certain infectious diseases occur in increased numbers during certain seasons, either because they depend on the weather or because they recur at certain intervals in time.

9. Unusually rapid spread of the epidemic: The speed at which an epidemic spreads is determined by the virulence, resistance and concentration of the pathogen, the contagiousness of the disease and the intensity of the transmission process on the one hand, and by the susceptibility and disposition of the exposed population on the other.

10. Limitation of the epidemic to a specific population: Biological attacks can be directed against large heterogeneous population groups and military contingents or against selected target groups.

11. Special insights: Any suspicious circumstances identified prior to the outbreak, during the period of outbreak or post-outbreak, which would point to an unnatural outbreak.

Each criterion is given a value between 0 and 3, based on available data, and that value is multiplied by a set weighting factor between 1 and 3. The weighted points are summed and divided by the maximum possible number of points, yielding a probability that indicates the likelihood of bioterrorism. If the tool yields a score of less than 30 out of 60 possible points, the outbreak is classified as natural in origin. This tool can be applied to the coronavirus outbreak to flag unusual patterns.
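A minimal sketch of that scoring arithmetic, in Python, might look as follows; the criterion names are paraphrased and the weighting factors are placeholders, since the published mGFT assigns its own weight to each criterion:

```python
# Minimal sketch of the mGFT-style scoring arithmetic described above.
# The weighting factors below are placeholders for illustration; the published
# mGFT assigns its own weighting factor (1-3) to each of the 11 criteria.

CRITERIA = [
    "biological risk", "unusual strain", "special aspects of the agent",
    "geographic peculiarities", "high environmental concentration",
    "intensity and dynamics", "transmission peculiarities",
    "timing peculiarities", "unusually rapid spread",
    "limitation to a specific population", "special insights",
]

PLACEHOLDER_WEIGHTS = {name: 2 for name in CRITERIA}  # hypothetical weights; 1-3 in the real tool


def mgft_score(values, weights):
    """Return (total weighted points, probability of an unnatural outbreak).

    `values` maps each criterion to a 0-3 assessment based on available data;
    each value is multiplied by its weighting factor, the weighted points are
    summed, and the sum is divided by the maximum attainable score.
    """
    total = sum(values[c] * weights[c] for c in CRITERIA)
    max_points = sum(3 * weights[c] for c in CRITERIA)
    return total, total / max_points


# Example: an outbreak assessed at 1 on every criterion with the placeholder weights.
points, probability = mgft_score({c: 1 for c in CRITERIA}, PLACEHOLDER_WEIGHTS)
print(points, round(probability, 2))
```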

Credit: 
Society for Risk Analysis

Neural networks facilitate optimization in the search for new materials

When searching through theoretical lists of possible new materials for particular applications, such as batteries or other energy-related devices, there are often millions of potential materials that could be considered, and multiple criteria that need to be met and optimized at once. Now, researchers at MIT have found a way to dramatically streamline the discovery process, using a machine learning system.

As a demonstration, the team arrived at a set of the eight most promising materials, out of nearly 3 million candidates, for an energy storage system called a flow battery. This culling process would have taken 50 years by conventional analytical methods, they say, but they accomplished it in five weeks.

The findings are reported in the journal ACS Central Science, in a paper by MIT professor of chemical engineering Heather Kulik, Jon Paul Janet PhD '19, Sahasrajit Ramesh, and graduate student Chenru Duan.

The study looked at a set of materials called transition metal complexes. These can exist in a vast number of different forms, and Kulik says they "are really fascinating, functional materials that are unlike a lot of other material phases. The only way to understand why they work the way they do is to study them using quantum mechanics."

To predict the properties of any one of millions of these materials would require either time-consuming and resource-intensive spectroscopy and other lab work, or time-consuming, highly complex physics-based computer modeling for each possible candidate material or combination of materials. Each such study could consume hours to days of work.

Instead, Kulik and her team took a small number of different possible materials and used them to teach an advanced machine-learning neural network about the relationship between the materials' chemical compositions and their physical properties. That knowledge was then applied to generate suggestions for the next generation of possible materials to be used for the next round of training of the neural network. Through four successive iterations of this process, the neural network improved significantly each time, until reaching a point where it was clear that further iterations would not yield any further improvements.
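As a rough illustration of that workflow (not the MIT group's actual code), a toy active-learning loop might look like the sketch below, with a stand-in surrogate model and an invented two-property objective:

```python
# Toy sketch of the iterative "train, suggest, retrain" loop described above.
# None of these functions correspond to the study's actual code; they only
# illustrate the workflow on an invented one-dimensional design space.

import random

random.seed(0)


def true_properties(x):
    """Stand-in for an expensive quantum-mechanical calculation of two
    competing properties (e.g. a solubility-like and an energy-density-like value)."""
    return (x, 1.0 - x ** 2)  # hypothetical trade-off between the two objectives


def fit_surrogate(labeled):
    """Toy 'surrogate': predict a candidate's properties from its nearest
    labeled neighbour. A real workflow would train a neural network here."""
    def predict(x):
        nearest = min(labeled, key=lambda pair: abs(pair[0] - x))
        return nearest[1]
    return predict


candidates = [i / 1000 for i in range(1001)]                 # the design space
labeled = [(x, true_properties(x)) for x in random.sample(candidates, 10)]

for generation in range(4):                                  # four successive iterations
    surrogate = fit_surrogate(labeled)
    already_done = {x for x, _ in labeled}
    # Rank candidates by a simple combined objective and label the best new ones.
    ranked = sorted(candidates, key=lambda x: -sum(surrogate(x)))
    new_picks = [x for x in ranked if x not in already_done][:10]
    labeled += [(x, true_properties(x)) for x in new_picks]

print(f"labeled {len(labeled)} of {len(candidates)} candidates across 4 generations")
```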

This iterative optimization system greatly streamlined the process of arriving at potential solutions that satisfied the two conflicting criteria being sought. In situations where improving one factor tends to worsen another, the set of best available trade-offs is known as a Pareto front: a graph of the points such that any further improvement of one factor would make the other worse. In other words, the graph represents the best possible compromise points, depending on the relative importance assigned to each factor.
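For readers unfamiliar with the term, a small sketch of how a Pareto front can be identified for two properties that are both being maximized is shown below; the data points are hypothetical, not values from the study:

```python
# Identify the Pareto front of a set of hypothetical (property_1, property_2) points,
# where both properties are being maximized. Illustrative data only.

points = [(0.2, 0.9), (0.5, 0.7), (0.6, 0.75), (0.8, 0.3), (0.4, 0.6)]


def pareto_front(points):
    """Return the non-dominated points: those for which no other candidate is
    at least as good in both properties and strictly better in one."""
    front = []
    for p in points:
        dominated = any(
            q != p and q[0] >= p[0] and q[1] >= p[1] and (q[0] > p[0] or q[1] > p[1])
            for q in points
        )
        if not dominated:
            front.append(p)
    return front


print(pareto_front(points))  # -> [(0.2, 0.9), (0.6, 0.75), (0.8, 0.3)]
```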

Training typical neural networks requires very large data sets, ranging from thousands to millions of examples, but Kulik and her team were able to use this iterative process, based on the Pareto front model, to streamline the approach and provide reliable results using only a few hundred samples.

In the case of screening for the flow battery materials, the desired characteristics were in conflict, as is often the case: The optimum material would have high solubility and a high energy density (the ability to store energy for a given weight). But increasing solubility tends to decrease the energy density, and vice versa.

Not only was the neural network able to rapidly come up with promising candidates, it also was able to assign levels of confidence to its different predictions through each iteration, which helped to allow the refinement of the sample selection at each step. "We developed a better than best-in-class uncertainty quantification technique for really knowing when these models were going to fail," Kulik says.

The challenge they chose for the proof-of-concept trial was materials for use in redox flow batteries, a type of battery that holds promise for large, grid-scale batteries that could play a significant role in enabling clean, renewable energy. Transition metal complexes are the preferred category of materials for such batteries, Kulik says, but there are too many possibilities to evaluate by conventional means. They started out with a list of 3 million such complexes before ultimately whittling that down to the eight good candidates, along with a set of design rules that should enable experimentalists to explore the potential of these candidates and their variations.

"Through that process, the neural net both gets increasingly smarter about the [design] space, but also increasingly pessimistic that anything beyond what we've already characterized can further improve on what we already know," she says.

Apart from the specific transition metal complexes suggested for further investigation using this system, she says, the method itself could have much broader applications. "We do view it as the framework that can be applied to any materials design challenge where you're really trying to address multiple objectives at once. You know, all of the most interesting materials design challenges are ones where you have one thing you're trying to improve, but improving that worsens another. And for us, the redox flow battery redox couple was just a good demonstration of where we think we can go with this machine learning and accelerated materials discovery."

For example, optimizing catalysts for various chemical and industrial processes is another kind of such complex materials search, Kulik says. Presently used catalysts often involve rare and expensive elements, so finding similarly effective compounds based on abundant and inexpensive materials could be a significant advantage.

"This paper represents, I believe, the first application of multidimensional directed improvement in the chemical sciences," she says. But the long-term significance of the work is in the methodology itself, because of things that might not be possible at all otherwise. "You start to realize that even with parallel computations, these are cases where we wouldn't have come up with a design principle in any other way. And these leads that are coming out of our work, these are not necessarily at all ideas that were already known from the literature or that an expert would have been able to point you to."

Credit: 
Massachusetts Institute of Technology