Tech

Shining a light on nanoscale dynamics

image: Ultrafast electron diffraction of optically-excited metamaterials

Image: 
Kathrin Mohler, Ludwig-Maximilians-Universität München

Physicists from the University of Konstanz, Ludwig-Maximilians-Universität München (LMU Munich) and the University of Regensburg have successfully demonstrated that ultrashort electron pulses experience a quantum mechanical phase shift through their interaction with light waves in nanophotonic materials, a shift that can be used to uncover the nanomaterials' functionality. The corresponding experiments and results are reported in the latest issue of Science Advances.

Nanophotonic materials and metamaterials

Many materials found in nature can influence electromagnetic waves such as light in many different ways. However, generating novel optical effects for the purpose of developing particularly efficient solar cells, cloaking devices or catalysts often requires artificial structures, so-called metamaterials. These materials achieve their extraordinary properties through sophisticated structuring at the nanoscale, i.e. through a grid-like arrangement of extremely small building blocks on length scales well below the wavelength of the excitation.

The characterization and development of such metamaterials requires a deep understanding of how the incident light waves behave when they hit these tiny structures and how they interact with them. Consequently, the optically-excited nanostructures and their electromagnetic near fields must be measured at spatial resolutions in the range of nanometres (~10^-9 m) and, at the same time, at temporal resolutions below the duration of the excitation cycle (~10^-15 s). However, this cannot be achieved with conventional light microscopy alone.

Ultrafast electron diffraction of optically-excited nanostructures

In contrast to light, electrons have a rest mass and therefore offer 100,000 times better spatial resolution than photons. In addition, electrons can be used to probe electromagnetic fields and potentials because of their charge. A team led by Professor Peter Baum (University of Konstanz) has now succeeded in applying extremely short electron pulses to achieve such a measurement. To that end, the duration of the electron pulses was compressed in time by means of terahertz radiation to such an extent that the researchers were able to resolve the optical oscillations of the electromagnetic near fields at the nanostructures in detail.
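The quoted factor of 100,000 follows from comparing the de Broglie wavelength of the electrons used in the study (75 keV, see below) with the wavelength of visible light. A minimal sketch of that back-of-the-envelope check, using standard physical constants:

```python
from math import sqrt

# Physical constants (SI units, CODATA values)
H = 6.62607015e-34      # Planck constant, J*s
M_E = 9.1093837015e-31  # electron rest mass, kg
C = 2.99792458e8        # speed of light, m/s
EV = 1.602176634e-19    # joules per electron volt

def de_broglie_wavelength(kinetic_energy_ev: float) -> float:
    """Relativistically corrected de Broglie wavelength of an electron, in metres."""
    e = kinetic_energy_ev * EV
    # p = sqrt(2*m*E*(1 + E/(2*m*c^2))) includes the relativistic correction,
    # which matters at tens of keV
    p = sqrt(2 * M_E * e * (1 + e / (2 * M_E * C**2)))
    return H / p

wavelength = de_broglie_wavelength(75e3)  # 75 keV electrons, as in the study
ratio = 500e-9 / wavelength               # compared with ~500 nm visible light
```

For 75 keV electrons this gives a wavelength of roughly 4.3 picometres, about 10^5 times shorter than green light, consistent with the resolution advantage stated above.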

High spatial and temporal resolutions

"The challenge involved with this experiment lies in making sure that the resolution is sufficiently high both in space and in time. To avoid space charge effects, we only use single electrons per pulse and accelerate these electrons to energies of 75 kiloelectron volts", explains Professor Peter Baum, last author on the study and head of the working group for light and matter at the University of Konstanz's Department of Physics. When being scattered by the nanostructures, these extremely short electron pulses interfere with themselves due to their quantum mechanical properties and generate a diffraction image of the sample.

Interaction with the electromagnetic fields and potentials

The investigation of the optically excited nanostructures is based on the well-known principle of pump-probe experiments. After the optical excitation of the near fields, the ultrashort electron pulse arrives at a defined point in time and measures the time-frozen fields in space and time. "According to the predictions of Aharonov and Bohm, the electrons experience a quantum mechanical phase shift of their wave function when travelling through electromagnetic potentials," explains Kathrin Mohler, a doctoral researcher at LMU Munich and first author on the study. These optically-induced phase shifts provide information about the ultrafast dynamics of light at the nanostructures, ultimately delivering a movie-like sequence of images that reveals the interaction of light with the nanostructures.

A new application regime for electron holography and diffraction

These experiments illustrate how electron holography and diffraction can be harnessed in the future to improve our understanding of fundamental light-matter interactions underlying nanophotonic materials and metamaterials. In the long term, this may even lead to the development and optimization of compact optics, novel solar cells or efficient catalysts.

Credit: 
University of Konstanz

New light on polar explorer's last hours

image: Sledge team 1 from the Denmark Expedition 1906-08. From left: Niels Peter Høeg Hagen, expedition commander Mylius-Erichsen, and Jørgen Brønlund. All three died on the expedition.

Image: 
wikipedia

Jørgen Brønlund was one of the participants in the legendary Denmark Expedition to Greenland 1906-08, led by Mylius-Erichsen. In 1907, he died of hunger and frostbite in a small cave, but before that, he made one last note in his diary:

"Perished 79 Fjord after trying to return home over the ice sheet, in November Month I come here in waning moonlight and could not continue from Frost in the Feet and the Dark".

The Danish expedition had traveled to Northeast Greenland the year before to explore and map the most northerly part of Greenland, and also to determine whether the 50,000 square kilometer Peary Land was a peninsula or an island. If it were an island, it would fall to the Americans; if a peninsula, it would be part of Danish territory.

It was after a failed attempt to get into the Independence Fjord that Jørgen Brønlund and two other participants on the expedition's sled team 1 eventually had to give up.

A few days before Brønlund died, the two others from sled team 1 died: expedition commander Mylius-Erichsen and Niels Peter Høeg Hagen. Neither their corpses nor their diaries have since been found.

Jørgen Brønlund's body and diary were found, and almost ever since, the diary has been kept at the Royal Library in Copenhagen.

Now chemists from the University of Southern Denmark have had the opportunity to analyze a very specific part of the diary's last page; more specifically, a black spot below Jørgen Brønlund's last entry and signature.

The results are published in the journal Archaeometry.

The analyses reveal that the spot consists of the following components: burnt rubber, various oils, petroleum and feces.

- This new knowledge gives a unique insight into Brønlund's last hours, says professor of chemistry, Kaare Lund Rasmussen, Department of Physics, Chemistry and Pharmacy at the University of Southern Denmark.

- I can picture how he, weakened and with dirty, shaking hands, fumbled in an attempt to light the burner, but failed, he says.

As the last survivor of sled team 1, Brønlund had reached a depot at Lambert Land and had at his disposal a LUX petroleum burner, matches and petroleum. But there was no methylated spirit to preheat the burner.

- He had to find something else to get the burner going. You can use paper or oiled fabric, but it is difficult. We think he tried with the oils available, because the black spot contains traces of vegetable oil and oils that may come from fish, animals or wax candles, says Kaare Lund Rasmussen.

The spot's content of burnt rubber probably comes from a gasket in the Lux burner. The gasket may have been burned long before Brønlund's crisis in the cave, but it may also have happened during his last vain attempt to light a fire.

Brønlund's corpse and diary were found four months later, when spring came, by Johan Peter Koch and Tobias Gabrielsen, who had left Danmarkshavn to find the missing members of sled team 1.

The diary was found at Brønlund's feet and was taken back to Denmark.

Brønlund's Lux burner was found in 1973 by the Danish Defense Sirius Patrol. After the re-burial of Brønlund in 1978, it was donated to the Arctic Institute in Copenhagen.

Peary Land:

Peninsula in northeast Greenland, named after the American polar explorer R.E. Peary, who believed that the area was an island and thus not part of Denmark. This was disproved by the Denmark Expedition, and Peary Land remained Danish. Peary Land is uninhabited.

Credit: 
University of Southern Denmark

Extraction method affects the properties of a sustainable stabiliser, spruce gum

image: From the right: unpurified spruce gum obtained from hot water extraction, spruce gum obtained from industrial streams, and purified spruce gum.

Image: 
Mamata Bhattarai

Biomass obtained from wood and the fractions extracted from it can serve as precursors for future sustainable and cost-efficient raw materials in various industrial sectors. A good example is spruce gum, a common, renewable and sustainable raw material found in nature.

"These hemicelluloses in wood have promising properties as stabilisers of emulsions, such as salad dressings and yoghurts. Stabilisers help achieve the desired texture and mouthfeel in food products. So far, oil-in-water emulsions have been primarily stabilised using more expensive polysaccharides, which are imported. Furthermore, the whole chain from their extraction to global supply makes them less sustainable," says Mamata Bhattarai, who is defending her doctoral thesis in a public examination at the Faculty of Agriculture and Forestry, University of Helsinki.

In her doctoral thesis, Bhattarai investigated spruce gum extracted through three different approaches, observing that individual processes had markedly different effects on the solubility of spruce gum and its functioning in emulsions.

"Particles of varying sizes and structures, brought about by partial solubility, affected the functioning of spruce gum as an emulsion stabiliser. For example, spruce gum recovered through a modified hot water extraction had the best solubility, but its ability to stabilise emulsions was poorer than that of the spruce gum samples obtained by the two other processes. Recovery processes are also expected to modify the chemical composition of spruce gum, which also affects its functioning."

Spruce gum offers a plant-based alternative for the food, cosmetics and pharmaceutical industries, which manufacture a broad range of emulsion-based products. The new information gained on spruce gum also promotes its use in biobased films, fillers and biofuels.

Credit: 
University of Helsinki

Hormone found to switch off hunger could help tackle obesity

A hormone that can suppress food intake and increase the feeling of fullness in mice has shown similar results in humans and non-human primates, says a new study published today in eLife.

The hormone, called Lipocalin-2 (LCN2), could be used as a potential treatment in people with obesity whose natural signals for feeling full no longer work.

LCN2 is mainly produced by bone cells and is found naturally in mice and humans. Studies in mice have shown that giving LCN2 to the animals long term reduces their food intake and prevents weight gain, without leading to a slow-down in their metabolism.

"LCN2 acts as a signal for satiety after a meal, leading mice to limit their food intake, and it does this by acting on the hypothalamus within the brain," explains lead author Peristera-Ioanna Petropoulou, who was a Postdoctoral Research Scientist at Columbia University Irving Medical Center, New York, US, at the time the study was carried out, and is now at the Helmholtz Diabetes Center, Helmholtz Zentrum München, Munich, Germany. "We wanted to see whether LCN2 has similar effects in humans, and whether a dose of it would be able to cross the blood-brain barrier."

The team first analysed data from four different studies of people in the US and Europe who were either normal weight, overweight or living with obesity. The people in each study were given a meal after an overnight fast, and the amount of LCN2 in their blood before and after the meal was studied. The researchers found that in those who were of normal weight, there was an increase in LCN2 levels after the meal, which coincided with how satisfied they felt after eating.

By contrast, in people who were overweight or had obesity, LCN2 levels decreased after a meal. Based on this post-meal response, the researchers grouped people as non-responders or responders. Non-responders, who showed no increase in LCN2 after a meal, tended to have a larger waist circumference and higher markers of metabolic disease - including BMI, body fat, increased blood pressure and increased blood glucose. Remarkably, however, people who had lost weight after gastric bypass surgery were found to have a restored sensitivity to LCN2 - changing their status from non-responders before their surgery, to responders afterwards.
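The grouping described above comes down to the sign of the post-meal change in circulating LCN2. A minimal sketch of that rule (the function name and the example values are illustrative, not data from the study):

```python
def classify_lcn2_response(pre_meal: float, post_meal: float) -> str:
    """Group a participant by their post-meal change in circulating LCN2.

    A rise in LCN2 after the meal marks a "responder" (the pattern seen in
    normal-weight participants); no rise marks a "non-responder" (the pattern
    associated with overweight and obesity in the study).
    """
    return "responder" if post_meal > pre_meal else "non-responder"

# Illustrative concentrations only (arbitrary units), not study data:
a = classify_lcn2_response(pre_meal=20.0, post_meal=28.0)
b = classify_lcn2_response(pre_meal=25.0, post_meal=19.0)
```

Under this rule, the gastric bypass result reads as a participant's classification flipping from "non-responder" before surgery to "responder" afterwards.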

Taken together, these results mirror those seen in mice, and suggest that this loss of post-meal LCN2 regulation is a new mechanism contributing to obesity and could be a potential target for weight-loss treatments.

After verifying that LCN2 can cross into the brain, the team explored whether treatment with the hormone might reduce food intake and prevent weight gain. To do this, they treated monkeys with LCN2 for a week. Within that week, they saw a 28% decrease in food intake compared with intake before treatment, and the monkeys also ate 21% less than their counterparts who were treated only with saline. Moreover, after only one week of treatment, measurements of body weight, body fat and blood fat levels showed a declining trend in treated animals.

"We have shown that LCN2 crosses to the brain, makes its way to the hypothalamus and suppresses food intake in non-human primates," concludes senior author Stavroula Kousteni, Professor of Physiology and Cellular Biophysics at Columbia University Irving Medical Center. "Our results show that the hormone can curb appetite with negligible toxicity and lay the groundwork for the next level of LCN2 testing for clinical use."

Credit: 
eLife

Stable catalysts for new energy

image: Carina Brunnhofer (left), Dominik Dworschak (right)

Image: 
TU Wien

On the way to a CO2-neutral economy, we need to perfect a whole range of technologies - including the electrochemical extraction of hydrogen from water, fuel cells, or carbon capture. All these technologies have one thing in common: they only work if suitable catalysts are used. For many years, researchers have therefore been investigating which materials are best suited for this purpose.

At TU Wien and the Comet Center for Electrochemistry and Surface Technology CEST in Wiener Neustadt, a unique combination of research methods is available for this kind of research. Together, the scientists have now shown that looking for the perfect catalyst is not only about finding the right material, but also about its orientation. Depending on the direction in which a crystal is cut, and which of its atoms it thus presents to the outside world on its surface, its behavior can change dramatically.

Efficiency or stability

"For many important processes in electrochemistry, precious metals are often used as catalysts, such as iridium oxide or platinum particles," says Prof. Markus Valtiner from the Institute of Applied Physics at TU Wien (IAP). In many cases these are catalysts with particularly high efficiency. However, there are also other important points to consider: The stability of a catalyst and the availability and recyclability of the materials. The most efficient catalyst material is of little use if it is a rare metal, dissolves after a short time, undergoes chemical changes or becomes unusable for other reasons.

For this reason, other, more sustainable catalysts are of interest, such as zinc oxide, even though they are less efficient. By combining different measuring methods, it has now become possible to show that the effectiveness and stability of such catalysts can be significantly improved by studying how the surface of the catalyst crystals is structured on the atomic scale.

It all depends on the direction

Crystals can have different surfaces: "Let's imagine a cube-shaped crystal that we cut in two," says Markus Valtiner. "We can cut the cube straight through the middle to create two cuboids. Or we can cut it exactly diagonally, at a 45-degree angle. The cut surfaces that we obtain in these two cases are different: Different atoms are located at different distances from each other on the cut surface. Therefore, these surfaces can also behave very differently in chemical processes".

Zinc oxide crystals are not cube-shaped; they form honeycomb-like hexagons. But the same principle applies here, too: their properties depend on the arrangement of the atoms on the surface. "If you choose exactly the right surface angle, microscopically small triangular holes form there, with a diameter of only a few atoms," says Markus Valtiner. "Hydrogen atoms can attach there, chemical processes take place that support the splitting of water, but at the same time stabilize the material itself".

The research team has now been able to prove this stabilization for the first time: "At the catalyst surface, water is split into hydrogen and oxygen. While this process is in progress, we can take liquid samples and examine whether they contain traces of the catalyst," explains Markus Valtiner. "To do this, the liquid must first be strongly heated in a plasma and broken down into individual atoms. Then we separate these atoms in a mass spectrometer and sort them, element by element. If the catalyst is stable, we should hardly find any atoms from the catalyst material. Indeed, we could not detect any decomposition of the material at the atomic triangle structures when hydrogen was produced". This stabilizing effect is surprisingly strong - now the team is working on making zinc oxide even more efficient and transferring the physical principle of this stabilization to other materials.

Unique research opportunities for energy system transformation

Atomic surface structures have been studied at TU Wien for many years. "At our institute, these triangular structures were first demonstrated and theoretically explained years ago, and now we are the first to demonstrate their importance for electrochemistry," says Markus Valtiner. "This is because we are in the unique situation here of being able to combine all the necessary research steps under one roof - from sample preparation to simulation on supercomputers, from microscopy in ultra-high vacuum to practical tests in realistic environments."

"This collaboration of different specialties under one roof is unique, and it is our great advantage, enabling us to be a global leader in research and teaching in this field," says Carina Brunnhofer, student at the IAP.

"Over the next ten years, we will develop stable and commercially viable systems for water splitting and CO2 reduction based on methodological developments and a fundamental understanding of surface chemistry and physics," says Dominik Dworschak, the first author of the recently published study. "However, at least a sustainable doubling of the current power output must be achieved in parallel," Markus Valtiner notes. "We are therefore on an exciting path, on which we will only achieve our climate targets through consistent, cross-sector research and development."

Credit: 
Vienna University of Technology

Secrets of the 'lost crops' revealed where bison roam

image: American bison at the Joseph H. Williams Tallgrass Prairie Preserve in Oklahoma.

Image: 
Natalie Mueller

Blame it on the bison.

If not for the woolly, boulder-sized beasts that once roamed North America in vast herds, ancient people might have looked past the little barley that grew under those thundering hooves. But the people soon came to rely on little barley and other small-seeded native plants as staple foods.

New research from Washington University in St. Louis helps flesh out the origin story for the so-called "lost crops." These plants may have fed as many Indigenous people as maize, but until the 1930s had been lost to history.

As early as 6,000 years ago, people in the American Northeast and Midwest were using fire to maintain the prairies where bison thrived. When Europeans slaughtered the bison to near-extinction, the plants that relied on these animals to disperse their seeds began to diminish as well.

"Prairies have been ignored as possible sites for plant domestication, largely because the disturbed, biodiverse tallgrass prairies created by bison have only been recreated in the past three decades after a century of extinction," said Natalie Mueller, assistant professor of archaeology in Arts & Sciences.

Following the bison

In a new publication in The Anthropocene Review, Mueller reports on four field visits during 2019 to the Joseph H. Williams Tallgrass Prairie Preserve in eastern Oklahoma, the largest protected remnant of tallgrass prairie left on Earth. The roughly 40,000-acre preserve is home to about 2,500 bison today.

Mueller waded into the bison wallows after years of attempting to grow the lost crops from wild-collected seed in her own experimental gardens.

"One of the great unsolved mysteries about the origins of agriculture is why people chose to spend so much time and energy cultivating plants with tiny, unappetizing seeds in a world full of juicy fruits, savory nuts and plump roots," Mueller said.

They may have gotten their ideas from following bison.

Anthropologists have struggled to understand why ancient foragers chose to harvest plants that seemingly offer such a low return on labor.

"Before any mutualistic relationship could begin, people had to encounter stands of seed-bearing annual plants dense and homogeneous enough to spark the idea of harvesting seed for food," Mueller said.

Recent reintroductions of bison to tallgrass prairies offer some clues.

For the first time, scientists like Mueller are able to study the effects of grazing on prairie ecosystems. Turns out that munching bison create the kind of disturbance that opens up ideal habitats for annual forbs and grasses -- including the crop progenitors that Mueller studies.

These plants include goosefoot (Chenopodium berlandieri), little barley (Hordeum pusillum), sumpweed (Iva annua), maygrass (Phalaris caroliniana) and erect knotweed (Polygonum erectum).

Harvesting at the wallow's edge

At the Tallgrass Prairie Preserve, Mueller and her team members got some tips from local expert Mike Palmer.

"Mike let us know roughly where on the prairie to look," Mueller said. "His occurrence data were at the resolution of roughly a square mile, but that helps when you're on a 60-square-mile grassland.

"I thought it would be hard to find trails to follow before I went out there, but it's not," she said. "They are super easy to find and easy to follow, so much so that I can't imagine humans moving through a prairie any other way!"

So-called 'little barley' is one of the small-seeded crop progenitors that Mueller has identified in stands around bison paths.

Telltale signs of grazing and trampling marked the "traces" that bison make through shoulder-high grasses. By following recently trodden paths through the prairie, the scientists were able to harvest seeds from continuous stands of little barley and maygrass during their June visit, and sumpweed in October.

"While much more limited in distribution, we also observed a species of Polygonum closely related to the crop progenitor and wild sunflowers in bison wallows and did not encounter either of these species in the ungrazed areas," Mueller said.

It was easier to move through the prairie on the bison paths than to venture off them.

"The ungrazed prairie felt treacherous because of the risk of stepping into burrows or onto snakes," she said.

With few landscape features for miles in any direction, the parts of the prairie that were not touched by bison could seem disorienting.

"These observations support a scenario in which ancient people would have moved through the prairie along traces, where they existed," Mueller said. "If they did so, they certainly would have encountered dense stands of the same plant species they eventually domesticated."

Diverse landscapes

Mueller encourages others to consider the role of bison as 'co-creators' -- along with Indigenous peoples -- of landscapes of disturbance that gave rise to greater diversity and more agricultural opportunities.

"Indigenous people in the Midcontinent created resilient and biodiverse landscapes rich in foods for people," she said. "They managed floodplain ecosystems rather than using levees and dams to convert them to monocultures. They used fire and multispecies interactions to create mosaic prairie-savanna-woodland landscapes that provided a variety of resources on a local scale."

Mueller is now growing seeds that she harvested from plants at the Tallgrass Prairie Preserve and also seeds that she separated from bison dung from the preserve. In future years, Mueller plans to return to the preserve and also to visit other prairies in order to quantify the distribution and abundance of crop progenitors under different management regimes.

"These huge prairies would not have existed if the Native Americans were not maintaining them," using fire and other means, Mueller said. But to what end? Archaeologists have not found caches of bones or other evidence to indicate that Indigenous people were eating lots of prairie animals. Perhaps the ecosystems created by bison and anthropogenic fire benefited the lost crops.

"We don't think of the plants they were eating as prairie plants," she said. "However, this research suggests that they actually are prairie plants -- but they only occur on prairies if there are bison.

"I think we're just beginning to understand what the botanical record was telling us," Mueller said. "People were getting a lot more food from the prairie than we thought."

Credit: 
Washington University in St. Louis

Taking a shine to polymers: Fluorescent molecule betrays the breakdown of polymer materials

image: H-DAAN could work as a radical scavenger for polymeric mechanoradicals in the bulk and generate DAAN*, which could potentially be evaluated by EPR spectroscopy and fluorescence measurements owing to its high stability toward oxygen.

Image: 
Tokyo Tech

Nylon, rubber, silicone, Teflon, PVC - these are all examples of man-made polymers - long chains of repeated molecular units that we call monomers. While polymers also exist in nature (think wool, silk, or even hair), the invention of synthetic polymers, the most famous of which is plastic, revolutionized the industry. Light, stretchy, flexible, yet strong and resistant, synthetic polymers are one of the most versatile materials on the planet, used in everything from clothing to building, packaging and energy production. Since the very beginning of this new era in material engineering, understanding the influence of external forces on polymers' strength and stability has been crucial to evaluate their performance.

When subjected to mechanical stress, the weak bonds that hold some polymer chains together are overcome, and one inevitably breaks. When this happens, a free radical (a molecule with an unpaired electron, which is naturally unstable and very reactive, called a "mechanoradical" in this case) is generated. By estimating the amount of mechanoradicals produced, we can infer a material's resistance to mechanical stress. While this phenomenon is well documented, scientists have struggled to observe it at ambient temperature in the bulk state, because mechanoradicals produced in bulk polymers are not stable, owing to their high reactivity toward oxygen and other agents.

Researchers from Tokyo Institute of Technology led by Professor Hideyuki Otsuka decided to take up the challenge. In their study published in Angewandte Chemie International Edition, they used a small molecule called diarylacetonitrile (H-DAAN) to capture the rogue free radicals. "Our theory was that H-DAAN would emit a distinctive fluorescent light when it reacts with the free radicals, which we could then measure to estimate the extent of polymer breakdown," explains Prof Otsuka. "The theory is simple; the higher the force exerted on the polymer, the more mechanoradicals are produced, and the more they react with H-DAAN. This higher reaction rate results in more intense fluorescent light, changes in which can easily be measured."

The researchers now wanted to see how this would work in practice. When polystyrene (in the presence of H-DAAN) was subjected to mechanical stress via grinding, the H-DAAN acted as a radical scavenger for polymeric mechanoradicals, and bound with them to produce "DAAN*," which has fluorescent properties. This caused a visible yellow fluorescence to appear.

"More important, probably, is the clear correlation that we found between fluorescence intensity and the amount of DAAN radicals generated by the ground-up polystyrene, as we had predicted," reports Prof Otsuka. "This means that it is possible to estimate the amount of DAAN radicals generated in the bulk system just by measuring the fluorescence intensity."
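The correlation described above means that radical amounts can be read off a fluorescence calibration curve. A minimal sketch of that idea, assuming a linear calibration forced through the origin (the calibration points below are invented for illustration; they are not data from the study):

```python
def fit_slope_through_origin(intensities, radical_amounts):
    """Least-squares slope k for amount = k * intensity, forced through the origin.

    Forcing the fit through the origin encodes the physical assumption that
    zero fluorescence means zero DAAN radicals.
    """
    numerator = sum(i * r for i, r in zip(intensities, radical_amounts))
    denominator = sum(i * i for i in intensities)
    return numerator / denominator

def estimate_radicals(intensity, slope):
    """Estimate the radical amount from a measured fluorescence intensity."""
    return slope * intensity

# Hypothetical calibration points: (fluorescence intensity, radical amount),
# both in arbitrary units
intensities = [10.0, 20.0, 40.0]
amounts = [1.1, 2.0, 3.9]

k = fit_slope_through_origin(intensities, amounts)
estimate = estimate_radicals(30.0, k)  # radicals at an unseen intensity
```

Once the slope is calibrated, any new fluorescence reading maps directly to an estimated radical amount, which is the practical convenience the authors highlight.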

The implications of their findings are wide-ranging: by being able to visually quantify how materials respond to different external stimuli, they can test how suitable polymers are for various uses, depending on the mechanical stress they will be expected to undergo. This method could prove to be an invaluable tool for scientists and engineers as they strive to improve material performance and specificity.

This exciting research shines a light on the responses of polymers to mechanical stress and illuminates the way forward for research on polymer mechanoradicals!

Credit: 
Tokyo Institute of Technology

Exploring blended materials along compositional gradients

image: Yale University PhD student Kristof Toth (pictured above) with the electrospray deposition tool he designed, built, and validated in collaboration with staff scientist Gregory Doerk of Brookhaven Lab's Center for Functional Nanomaterials (CFN). This CFN tool allows users to blend multiple components--such as polymers, nanoparticles, and small molecules--over a range of compositions in a single sample. Next door to the CFN, at the National Synchrotron Light Source II, users can probe how the structure of the blended material changes across this entire composition space.

Image: 
Brookhaven National Laboratory

UPTON, NY--Blending is a powerful strategy for improving the performance of electronics, coatings, separation membranes, and other functional materials. For example, high-efficiency solar cells and light-emitting diodes have been produced by optimizing mixtures of organic and inorganic components.

However, finding the optimal blend composition to produce desired properties has traditionally been a time-consuming and inconsistent process. Scientists synthesize and characterize a large number of individual samples with different compositions one at a time, eventually compiling enough data to create a compositional "library." An alternative approach is to synthesize a single sample with a compositional gradient so that all possible compositions can be explored at once. Existing combinatorial methods for rapidly exploring compositions have been limited in terms of the types of compatible materials, the size of compositional increments, or the number of blendable components (often only two).

To overcome these limitations, a team from the U.S. Department of Energy's (DOE) Brookhaven National Laboratory, Yale University, and University of Pennsylvania recently built a first-of-its-kind automated tool for depositing films with finely controlled blend compositions made of up to three components onto single samples. Solutions of each component are loaded into syringe pumps, mixed according to a programmable "recipe," and sprayed as tiny electrically charged droplets onto the surface of a heated base material called a substrate. By programming the flow rates of the pumps as a stage underneath the substrate changes position, users can obtain continuous gradients in composition.
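The flow-rate programming described above amounts to holding the total flow constant while varying each pump's share of it as the stage moves. A minimal sketch of such a "recipe" under that assumption (function names, units, and step counts are illustrative, not details of the actual instrument control software):

```python
def pump_rates(fractions, total_flow_ul_min=10.0):
    """Per-pump flow rates (uL/min) for a target blend composition.

    `fractions` are the desired fractions of each component and must sum
    to 1. Total flow is held constant, so only the blend composition
    changes from point to point along the gradient.
    """
    if abs(sum(fractions) - 1.0) > 1e-9:
        raise ValueError("fractions must sum to 1")
    return [f * total_flow_ul_min for f in fractions]

def binary_gradient(steps):
    """Recipe for a 1D gradient strip: 100% component A at one end,
    100% component B at the other, in equal compositional increments."""
    recipe = []
    for i in range(steps):
        frac_a = 1.0 - i / (steps - 1)
        recipe.append(pump_rates([frac_a, 1.0 - frac_a]))
    return recipe

recipe = binary_gradient(5)  # five evenly spaced compositions along the strip
```

A three-component gradient works the same way with a third syringe pump; the key design point is that composition is controlled purely by flow-rate ratios, so arbitrary gradients reduce to programming a list of fractions against stage position.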

Now, the team has combined this electrospray deposition tool with the structural characterization technique of x-ray scattering. Together, these capabilities form a platform to probe how material structure changes across an entire composition space. The scientists demonstrated this platform for a thin-film blend of three polymers--chains made of molecular building blocks linked together by chemical bonds--designed to spontaneously arrange, or "self-assemble," into nanometer-scale (billionths of a meter) patterns. Their platform and demonstration are described in a paper published today in RSC Advances, a journal of the Royal Society of Chemistry (RSC).

"Our platform reduces the time to explore complex compositional dependencies of blended material systems from months or weeks to a few days," said corresponding author Gregory Doerk, a staff scientist in the Electronic Nanomaterials Group at Brookhaven Lab's Center for Functional Nanomaterials (CFN).

"We constructed a morphology diagram with more than 200 measurements on a single sample, which is like making 200 samples the conventional way," said first author Kristof Toth, a PhD student in the Department of Chemical and Environmental Engineering at Yale University. "Our approach not only reduces sample preparation time but also sample-to-sample error."

This diagram mapped how the morphologies, or shapes, of the blended polymer system changed along a compositional gradient of 0 to 100 percent. In this case, the system contained a widely studied self-assembling polymer made of two distinct blocks (PS-b-PMMA) and this block copolymer's individual block constituents, or homopolymers (PS and PMMA). The scientists programmed the electrospray deposition tool to consecutively create one-dimensional gradient "strips" with all block copolymer at one end and all homopolymer blend at the other end.

To characterize the structure, the team performed grazing-incidence small-angle x-ray scattering experiments at the Complex Materials Scattering (CMS) beamline, which is operated at Brookhaven's National Synchrotron Light Source II (NSLS-II) in partnership with the CFN. In this technique, a high-intensity x-ray beam is directed toward the surface of a sample at a very low angle. The beam reflects off the sample in a characteristic pattern, providing snapshots of nanoscale structures at different compositions along each five-millimeter-long strip. From these images, the shape, size, and ordering of these structures can be determined.

"The synchrotron's high intensity x-rays allow us to take snapshots at each composition in a matter of seconds, reducing the overall time to map the morphology diagram," said co-author Kevin Yager, leader of the CFN Electronic Nanomaterials Group.

The x-ray scattering data revealed the emergence of highly ordered morphologies of different kinds as the blend composition changed. Normally, the block copolymers self-assemble into cylinders. However, blending in very short homopolymers resulted in well-ordered spheres (with increasing PS) and vertical sheets (with increasing PMMA). The addition of these homopolymers also tripled or quadrupled the speed of the self-assembly process, depending on the ratio of PS to PMMA homopolymer. To further support their results, the scientists performed imaging studies with a scanning electron microscope at the CFN Materials Synthesis and Characterization Facility.

Though the team focused on a self-assembling polymer system for their demonstration, the platform can be used to explore blends of a variety of materials such as polymers, nanoparticles, and small molecules. Users can also study the effects of different substrate materials, film thicknesses, x-ray beam focal spot sizes, and other processing and characterization conditions.

"This capability to survey a broad range of compositional and processing parameters will inform the creation of complex nanostructured systems with enhanced or entirely new properties and functionalities," said co-author Chinedum Osuji, the Eduardo D. Glandt Presidential Professor of Chemical and Biomolecular Engineering at the University of Pennsylvania.

In the future, the scientists hope to create a second generation of the instrument that can create samples with mixtures of more than three components and is compatible with a range of characterization methods--including in situ methods to capture morphology changes during the electrospray deposition process.

"Our platform represents a huge advance in the amount of information you can get across a composition space," said Doerk. "In a few days, users can work with me at the CFN and the beamline staff next door at NSLS-II to create and characterize their blended systems."

"In many ways, this platform complements autonomous methods developed by CFN and NSLS-II scientists to identify trends in experimental data," added Yager. "Pairing them together has the potential to dramatically accelerate soft matter research."

Credit: 
DOE/Brookhaven National Laboratory

Study: Clean Air Act saved 1.5 billion birds

image: Great Blue Heron in front of an oil refinery.

Image: 
Gerrit Vyn

Ithaca, NY--U.S. pollution regulations meant to protect humans from dirty air are also saving birds. So concludes a new continentwide study published today in the Proceedings of the National Academy of Sciences. Study authors found that improved air quality under a federal program to reduce ozone pollution may have averted the loss of 1.5 billion birds during the past 40 years. That's nearly 20 percent of birdlife in the United States today. The study was conducted by scientists at Cornell University and the University of Oregon.

"Our research shows that the benefits of environmental regulation have likely been underestimated," says Ivan Rudik, a lead author and Ruth and William Morgan Assistant Professor at Cornell's Dyson School of Applied Economics and Management. "Reducing pollution has positive impacts in unexpected places and provides an additional policy lever for conservation efforts."

Ozone is a gas that occurs in nature and is also produced by human activities, including by power plants and cars. It can be good or bad. A layer of ozone in the upper atmosphere protects the Earth from the harmful ultraviolet rays of the sun. But ground-level ozone is hazardous and is the main pollutant in smog.

To examine the relationship between bird abundance and air pollution, the researchers used models that combined bird observations from the Cornell Lab of Ornithology's eBird program with ground-level pollution data and existing regulations. They tracked monthly changes in bird abundance, air quality, and regulation status for 3,214 U.S. counties over a span of 15 years. The team focused on the NOx (nitrogen oxide) Budget Trading Program, which was implemented by the U.S. Environmental Protection Agency to protect human health by limiting summertime emissions of ozone precursors from large industrial sources.

Study results suggest that ozone pollution is most detrimental to the small migratory birds (such as sparrows, warblers, and finches) that make up 86 percent of all North American landbird species. Ozone pollution directly harms birds by damaging their respiratory system, and indirectly affects birds by harming their food sources.

"Not only can ozone cause direct physical damage to birds, but it also can compromise plant health and reduce numbers of the insects that birds consume," explains study author Amanda Rodewald, Garvin Professor at the Cornell Department of Natural Resources and the Environment and Director of the Center for Avian Population Studies at the Cornell Lab of Ornithology. "Not surprisingly, birds that cannot access high-quality habitat or food resources are less likely to survive or reproduce successfully. The good news here is that environmental policies intended to protect human health return important benefits for birds too."

Last year, a separate study by the Cornell Lab of Ornithology showed that North American bird populations have declined by nearly 3 billion birds since 1970 (Rosenberg et al., Science, 2019). This new study shows that without the regulations and ozone-reduction efforts of the Clean Air Act, the loss of birdlife may have been 1.5 billion birds more.

"This is the first large-scale evidence that ozone is associated with declines in bird abundance in the United States and that regulations intended to save human lives also bring significant conservation benefits to birds," says Catherine Kling, Tisch University Professor at the Cornell Dyson School of Applied Economics and Management and Faculty Director at Cornell's Atkinson Center for Sustainability. "This work contributes to our ever increasing understanding of the connectedness of environmental health and human health."

Credit: 
Cornell University

Team uses copper to image Alzheimer's aggregates in the brain

image: Research scientist Hong-Jun Cho is the first author of the study.

Image: 
Photo courtesy Hong-Jun Cho

CHAMPAIGN, Ill. -- A proof-of-concept study conducted in a mouse model of Alzheimer's disease offers new evidence that copper isotopes can be used to detect the amyloid-beta protein deposits that form in the brains of people living with - or at risk of developing - Alzheimer's.

Several types of isotopes give off positively charged particles called positrons that are detectable by positron emission tomography scanners. The copper isotope used in the study, Cu-64, lasts much longer than the carbon or fluorine isotopes currently approved for use in human subjects, researchers report. Having access to longer-lasting diagnostic agents would make the process of diagnosing Alzheimer's more accessible to people who live far from major medical centers. Any clinic with a PET scanner could have the agents shipped to it in time to use the compounds in brain scans of patients living nearby.
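The practical advantage of a longer-lived tracer can be sketched with simple decay arithmetic. The half-lives below are standard nuclear data rather than figures from the paper: copper-64 is roughly 12.7 hours, fluorine-18 roughly 1.83 hours, and carbon-11 roughly 20 minutes.

```python
def remaining_fraction(hours_elapsed, half_life_hours):
    """Fraction of a radiotracer's activity remaining after a given time."""
    return 0.5 ** (hours_elapsed / half_life_hours)

# Approximate half-lives from standard nuclear data (not from the study):
# Cu-64 ~12.7 h; F-18 ~1.83 h; C-11 ~0.34 h.
cu64_after_6h = remaining_fraction(6, 12.7)  # roughly 0.72 -- most activity survives shipping
f18_after_6h = remaining_fraction(6, 1.83)   # roughly 0.10 -- largely decayed in transit
```

After a six-hour shipment, a Cu-64 dose retains most of its activity, while an F-18 dose has lost about ninety percent -- which is why the longer-lived isotope could reach clinics far from the production site.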

University of Illinois at Urbana-Champaign researchers report their findings in the Proceedings of the National Academy of Sciences.

The effort to develop copper-based compounds to detect Alzheimer's disease in living patients is a complicated affair, said Liviu Mirica, a chemistry professor who led the new study with research scientist Hong-Jun Cho. Any diagnostic agent created in the lab must meet several criteria.

"There is a part that binds copper and another part that binds to these amyloid peptides," Mirica said.

In tests with compounds created in Mirica's lab, the team discovered that the copper-binding region of the molecule interfered with the amyloid-binding fragment. To overcome this problem, the researchers introduced a tiny chemical spacer between the two components. This improved their molecule's affinity for the amyloid peptides.

To be effective, however, such diagnostic agents also must be able to cross the blood-brain barrier.

"They have to be small enough and greasy enough to make it into the brain," Mirica said. "But they can't be too greasy, because then they might not be bioavailable."

The imaging agent must last long enough for imaging but ultimately decay, leaving no potentially problematic radioactive metals in the body or the brain.

When they first tested their compounds in mouse brain tissue, the researchers saw that their agents' affinity for amyloid deposits was limited. Adding a second amyloid-binding component to the molecule enhanced its binding and improved its ability to pass through the blood-brain barrier.

"If we do live PET imaging of mice with and without Alzheimer's disease pathologies, we see a statistically significant difference in signal intensity," Mirica said.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Novel chemical process a first step to making nuclear fuel with fire

image: Combustion synthesis of LnBTA compound.

Image: 
Los Alamos National Laboratory.

LOS ALAMOS, N.M., November 24, 2020 -- Developing safe and sustainable fuels for nuclear energy is an integral part of Los Alamos National Laboratory's energy security mission. Uranium dioxide, a radioactive actinide oxide, is the most widely used nuclear fuel in today's nuclear power plants. A new "combustion synthesis" process recently established for lanthanide metals -- non-radioactive and positioned one row above actinides on the periodic table -- could be a guide for the production of safe, sustainable nuclear fuels.

"Actinide nitride fuels are potentially a safer and more economical option in current power-generating systems," said Bi Nguyen, Los Alamos National Laboratory Agnew postdoc and lead author of research recently published in the journal Inorganic Chemistry, which was selected as an American Chemical Society Editors' Choice Featured Article.

"Nitride fuels are also well suited to future Generation IV nuclear power systems, which focus on safety, and feature a sustainable closed reactor fuel cycle. Actinide nitrides have superior thermal conductivity compared to the oxides and are significantly more energy dense," said Nguyen. Nitrides are a class of chemical compounds that contain nitrogen, versus oxides, which contain oxygen.

Actinide nitride fuels would provide more safety and sustainability because of their energy density, offering up more energy from less material, as well as better thermal conductivity -- allowing for lower temperature operations, giving them a larger margin to meltdown under abnormal conditions.

Actinide nitrides, however, are very challenging to make, and producing large amounts of high-purity actinide nitrides remains a major impediment to their application. Both actinides and lanthanides sit at the bottom of the periodic table, and potential methods for making actinide materials are typically first tested with the lanthanides, which behave similarly but are not radioactive.

Los Alamos National Laboratory and Naval Research Laboratory scientists discovered that LnBTA [lanthanide bis(tetrazolato)amine] compounds can be burned to produce high-purity lanthanide nitride foams in a unique technique called combustion synthesis. This method uses a laser pulse to initiate dehydrated LnBTA complexes, which then undergo a self-sustained combustion reaction in an inert atmosphere to yield nanostructured lanthanide nitride foams. This work was funded by the Laboratory Directed Research and Development (LDRD) program.

LnBTA compounds are easily prepared in bulk and their combustion is readily scalable. There is an ongoing collaboration between the Laboratory's Weapons Modernization and Chemistry divisions to examine actinide analogues for combustion synthesis of actinide nitride fuels.

Credit: 
DOE/Los Alamos National Laboratory

Tel Aviv University researchers go underwater to study how sponge species vanished

Researchers from Tel Aviv University (TAU) embarked on an underwater journey to solve a mystery: Why did sponges of the Agelas oroides species, which used to be common in the shallow waters along the Mediterranean coast of Israel, disappear? Today, the species can be found in Israel mainly in deep habitats that exist at a depth of 100 meters (330 feet).

The researchers believe that the main reason for the disappearance of the sponges was the rise in seawater temperatures during the summer months, which have climbed by about 3°C (5.4°F) over the past 60 years.
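A note on the arithmetic: a temperature *change* converts between Celsius and Fahrenheit with the 9/5 scale factor alone; the +32 offset applies only to absolute readings. A minimal sketch of both conversions:

```python
def delta_c_to_f(delta_c):
    """Convert a temperature CHANGE: only the 9/5 scale factor applies."""
    return delta_c * 9 / 5

def abs_c_to_f(temp_c):
    """Convert an absolute temperature reading: scale factor plus the 32-degree offset."""
    return temp_c * 9 / 5 + 32

rise_f = delta_c_to_f(3)    # a 3 degree C rise is a 5.4 degree F rise
reading_f = abs_c_to_f(29)  # a 29 degree C reading is 84.2 degrees F
```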

The study was led by Professor Micha Ilan and PhD student Tal Idan of TAU's School of Zoology at the George S. Wise Faculty of Life Sciences and Steinhardt Museum of Natural History. The article was published in the journal Frontiers in Marine Science in November 2020.

"Sponges are marine animals of great importance to the ecosystem, and also to humans," Idan explains. "They feed by filtering particles or obtaining substances dissolved in the seawater, making them available to other animals. Sponges are also used as a habitat for many other organisms and contain a wide variety of natural materials used as a basis for the development of medicines.

"In our study, we focused on the Agelas oroides species, a common Mediterranean sponge that grew throughout the Mediterranean Sea from a depth of less than a meter to 150 meters deep. But the sponge has not been observed in Israel's shallow waters for over 50 years."

During the study, the researchers used a research vessel and an underwater robot belonging to the nongovernmental organization EcoOcean. With their help, the researchers located particularly rich rocky habitats on the seabed at a depth of about 100 meters (330 feet), approximately 16 kilometers (10 miles) west of the Israeli coast. The most dominant animals in these habitats are sponges, which is why the habitats are called "sponge gardens."

The researchers collected 20 specimens of the Agelas oroides sponge, 14 of which were transferred to shallow waters at a depth of 10 meters (about 30 feet), at a site where the sponge was commonly found in the 1960s. The remaining six specimens were returned to the sponge gardens from which they were taken and used as a control group.

The findings showed that when the water temperature ranged from 18°-26°C (64°-79°F), usually in the months of March to May, the sponges grew and flourished: they pumped and filtered water, the action by which they feed, and their volume increased. But as the water temperature continued to rise, the sponges' condition deteriorated. At a temperature of 28°C (82°F), most of them stopped pumping water, and during the month of July, when the water temperature exceeded 29°C (84°F), all of the sponges that had been transferred to the shallow water died within a short period of time.

At the same time, the sponges in the control group continued to enjoy a relatively stable and low temperature between 17°-20°C (63°-68°F), which allowed them to continue to grow and thrive.

The researchers believe that the decisive factor that led to the disappearance of the sponges from the shallow area was prolonged exposure to high seawater temperature. According to them, "In the past, the temperature would also reach 28.5°C (83°F) in the summer, but only for a short period of about two weeks. So the sponges, even if damaged, managed to recover. Today, seawater temperatures rise above 29°C (84°F) for three months, which likely causes multi-system damage in sponges and leaves them no chance of recovering and surviving."

"From 1960 until today, the water temperature on the Israeli Mediterranean coast has risen by 3°C (5.4°F), which may greatly affect marine organisms, including sponges," Professor Ilan concludes. "Our great concern is that the changes taking place on our shores are a harbinger of what may take place in the future throughout the Mediterranean. Our findings suggest that continued climate change and the warming of seawater could fatally harm sponges and marine life in general."

Credit: 
American Friends of Tel Aviv University

Pesticide deadly to bees now easily detected in honey

image: Professor Janusz Pawliszyn transformed his lawn into a wildflower meadow to attract bees.

Image: 
Janusz Pawliszyn

A common insecticide that is a major hazard for honeybees is now effectively detected in honey thanks to a simple new method.

Researchers at the University of Waterloo developed an environmentally friendly, fully automated technique that extracts pyrethroids from the honey. Pyrethroids are one of two main groups of pesticides that contribute to colony collapse disorder in bees, a phenomenon where worker honeybees disappear, leaving the queen and other members of the hive to die. Agricultural producers worldwide rely on honeybees to pollinate hundreds of billions of dollars worth of crops.

Extracting the pyrethroids with the solid-phase microextraction (SPME) method makes it easier to measure whether their levels in the honey are above those considered safe for human consumption. It can also help identify locations where farmers use the pesticides and in what amounts. These substances have traditionally been difficult to extract because of their chemical properties.

"Pyrethroids are poorly soluble in water and are actually suspended in honey," said Janusz Pawliszyn, a professor of chemistry at Waterloo. "We add a small amount of alcohol to dissolve them prior to extraction by the automated SPME system."

Farmers spray the pesticides on crops. Pyrethroids are neurotoxins, which affect the way the brain and nerves work, causing paralysis and death in insects.

"It is our hope that this very simple method will help authorities determine where these pesticides are in use at unsafe levels to ultimately help protect the honeybee population," said Pawliszyn.

The Canadian Food Inspection Agency tests for chemical residues in food in Canada. Maximum residue limits are regulated under the Pest Control Products Act. The research team found that of the honey products they tested that contained the pesticide, all were at allowable levels.

Credit: 
University of Waterloo

Microbes help unlock phosphorus for plant growth

image: Poplar trees such as these along the Snoqualmie River are able to thrive on rocky riverbanks, despite low availability of nutrients like phosphorus in their natural habitat. Microbes help these trees capture and use the nutrients they need for growth.

Image: 
Sharon Doty/University of Washington

Phosphorus is a necessary nutrient for plants to grow. But when it's applied to plants as part of a chemical fertilizer, phosphorus can react strongly with minerals in the soil, forming complexes with iron, aluminum and calcium. This locks up the phosphorus, preventing plants from being able to access this crucial nutrient.

To overcome this, farmers often apply an excess of chemical fertilizers to agricultural crops, leading to phosphorus buildup in soils. The application of these fertilizers, which contain chemicals other than just phosphorus, also leads to harmful agricultural runoff that can pollute nearby aquatic ecosystems.

Now a research team led by the University of Washington and Pacific Northwest National Laboratory has shown that microbes taken from trees growing beside pristine mountain-fed streams in Western Washington could make phosphorus trapped in soils more accessible to agricultural crops. The findings were published in October in the journal Frontiers in Plant Science.

Endophytes, which are bacteria or fungi that live inside a plant for at least some of their lifecycle, can be thought of as "probiotics" for plants, said senior author Sharon Doty, a professor in the UW School of Environmental and Forest Sciences. Doty's lab has shown in previous studies that microbes can help plants survive and even thrive in nutrient-poor environments -- and help clean up pollutants.

In this new study, Doty and collaborators found that endophytic microbes isolated from wild-growing plants helped unlock valuable phosphorus from the environment, breaking apart the chemical complexes that had rendered the phosphorus unavailable to plants.

"We're harnessing a natural plant-microbe partnership," Doty said. "This can be a tool to advance agriculture because it's providing this essential nutrient without damaging the environment."

Research scientist Andrew Sher and UW undergraduate researcher Jackson Hall demonstrated in lab experiments that the microbes could dissolve the phosphate complexes. Poplar plants inoculated with the bacteria in Doty's lab were sent to collaborator Tamas Varga, a materials scientist at the Environmental Molecular Sciences Laboratory at Pacific Northwest National Laboratory in Richland, Washington. There, researchers used advanced imaging technologies at their lab and at other U.S. Department of Energy national laboratories to provide clear evidence that the phosphorus made available by the microbes did make it up into the plants' roots.

The imaging also revealed that the phosphorus gets bound up in mineral complexes within the plant. Endophytes, living inside plants, are uniquely positioned to re-dissolve those complexes, potentially maintaining the supply of this essential nutrient.

While previous work in Doty's lab demonstrated that endophytes can supply plants with nitrogen obtained from the air, such direct evidence of plants using phosphorus dissolved by endophytes had been lacking until now.

The bacteria used in these experiments came from wild poplar trees growing along the Snoqualmie River in Western Washington. In this natural environment, poplars are able to thrive on rocky riverbanks, despite low availability of nutrients like phosphorus in their natural habitat. Microbes help these trees capture and use the nutrients they need for growth.

These findings can be applied to agricultural crops, which often sit on an abundance of "legacy" phosphorus that has accumulated, unused, in the soil from years of fertilizer applications. Microbes could be applied in the soil among young crop plants, or as a coating on seeds, helping to unlock captive phosphorus and make it available for plant growth. Reducing the use of fertilizers and employing endophytes -- such as those studied by Doty and Varga -- opens the door to more sustainable food production.

"This is something that can easily be scaled up and used in agriculture," Doty said.

UW has already licensed the endophyte strains used in this study to Intrinsyx Bio, a California-based company working to commercialize a collection of endophyte microbes. The direct evidence provided by Doty and Varga's research of endophyte-promoted phosphorus uptake is "game-changing for our research on crops," said John Freeman, chief science officer of Intrinsyx Bio.

Credit: 
University of Washington

For people with diabetes, medicaid expansion helps, but can't do it all: BU study

Medicaid expansion through the Affordable Care Act has insured millions of low-income people in the United States, improving outcomes for patients with many different diseases. But expansion alone has not been enough to improve outcomes for patients with diabetes, according to a new Boston University School of Public Health (BUSPH) study.

Published in the American Journal of Preventive Medicine, the study finds an increase in insurance coverage, ability to see a physician, and foot examinations among patients with diabetes in states that expanded Medicaid. However, the study did not find significant changes in follow-up examinations, care, or treatment for diabetes, pointing to the need for other structural changes.

"There are likely many steps between having health insurance and successfully getting treatment for diabetes--including providers needing to recognize the importance of screening and patients needing to implement rigorous lifestyle changes," says study lead author Dr. Lily Yan, who was a master of science in population health student at BUSPH while working on the study and is now a global health research fellow at Weill Cornell Medicine.

"While having health insurance through a program like Medicaid expansion may be necessary for better health, it may not be sufficient alone."

The researchers used data from the Behavioral Risk Factor Surveillance System from 2008 through 2018 to compare 24 states that expanded Medicaid as of 2018 and 16 states that had not. The study included all non-pregnant, Medicaid-eligible residents of these states with self-reported diabetes.
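A before/after comparison between expansion and non-expansion states is, in essence, a difference-in-differences calculation. The toy sketch below uses invented numbers, not the study's data, and the study's actual estimates come from regression models rather than this simple subtraction:

```python
# Toy difference-in-differences arithmetic with invented numbers.
# Insurance coverage rates (%) among Medicaid-eligible adults with diabetes:
expansion_before, expansion_after = 70.0, 78.0  # states that expanded Medicaid
control_before, control_after = 71.0, 73.0      # states that did not

# Subtracting the control states' trend removes changes common to all states,
# isolating the change plausibly attributable to expansion itself.
did = (expansion_after - expansion_before) - (control_after - control_before)
# did == 6.0 percentage points
```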

The study examined diabetes outcomes using the "continuum of care" model of successful diabetes management: Screening would ideally lead to diagnosis, then linkage to care for the disease, then treatment, and ultimately control of the disease.

The researchers found improvements at the beginning of the continuum in the first years following a state's Medicaid expansion: Health insurance coverage rates for people with diabetes increased by 7.2 percentage points, and as a result the ability to afford a physician visit increased by 5.5 percentage points. This in turn led to a 5.3-percentage-point increase in diabetic foot examinations by healthcare providers. A few years after expansion, the researchers also found a 7.2-percentage-point increase in self-administered foot examinations.

The researchers also found an increase in linkage to care among Hispanic patients. However, the researchers found no statistically significant improvement overall in linkage to care, lifestyle changes and self-monitoring of conditions, or treatment.

"Medicaid coverage on its own is not enough to manage diabetes. Our policymakers should think about insurance coverage and beyond: supporting behavioral interventions, bolstering healthcare workforces, and addressing the underlying socioeconomic determinants of health," says study senior author Dr. Kiersten Strombotne, assistant professor of health law, policy & management at BUSPH.

Credit: 
Boston University School of Medicine