Tech

Verifying 'organic' foods

Organic foods are increasingly popular -- and pricey. Organic fruits and vegetables are grown without synthetic pesticides, and because of that, they are often perceived to be more healthful than those grown with these substances. But not all foods with this label are fully pesticide-free, and it can be challenging to detect low amounts of the substances. Now, scientists report in ACS' Journal of Agricultural and Food Chemistry a new strategy to determine organic authenticity.

The high cost and popularity of organic foods can be an incentive to try to pass off pesticide-treated foods as organic. Pesticide detection can be challenging, or even impossible, especially because some of these substances break down rapidly after being applied, leading to a false impression that a food has not been treated. However, a bit of pesticide on the surface of a fruit doesn't necessarily signal intentional fraud. The compound might have just blown over from a neighboring field. To help improve the practice of verifying organic foods, Jana Hajslova and colleagues developed a method to analyze the metabolites generated within plants when pesticides break down, using an experimental vineyard as their testing ground.

The researchers used a combination of ultra-high-performance liquid chromatography and high-resolution mass spectrometry to identify and screen the metabolites of seven common pesticides. The team then used the method on the leaves and fruits of treated grapevines at different intervals between planting and harvest, as well as the wine made from the treated fruits. With the technique, the team observed decreasing levels of the initial pesticides as degradation occurred. The group also detected the metabolites of these substances as their levels varied over time. Many metabolites were still detectable at higher levels than the applied pesticide compound in wine made from the treated fruits, meaning that organic wines, not just fruits and leaves, could potentially be verified using the strategy. The researchers say that their methodology, with some refinement, should aid in food regulators' efforts to crack down on illegal practices in organic farming.
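The pattern the team observed -- the applied pesticide falling while its metabolites persist and can even exceed it -- follows from simple degradation kinetics. As an illustration only (the rate constants and dose below are invented for the sketch, not taken from the study), a two-step first-order model shows how a metabolite can end up at higher levels than the parent compound:

```python
import math

def parent(p0, k1, t):
    """Parent pesticide remaining after first-order decay at rate k1."""
    return p0 * math.exp(-k1 * t)

def metabolite(p0, k1, k2, t):
    """Metabolite formed from the parent (rate k1) and itself degrading
    more slowly (rate k2); standard two-step first-order kinetics,
    valid for k1 != k2."""
    return p0 * k1 / (k2 - k1) * (math.exp(-k1 * t) - math.exp(-k2 * t))

p0 = 100.0  # applied dose (arbitrary units) -- assumed value
k1 = 0.5    # parent degrades quickly (per day) -- assumed value
k2 = 0.05   # metabolite degrades slowly (per day) -- assumed value

for t in (0, 2, 10, 30):
    print(f"day {t:2d}: parent {parent(p0, k1, t):7.2f}, "
          f"metabolite {metabolite(p0, k1, k2, t):7.2f}")
```

Under these assumed rates, by day 10 the metabolite dominates even though the parent is nearly gone -- which is why screening for metabolites, as Hajslova's group does, can reveal pesticide use that direct residue analysis would miss.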

Credit: 
American Chemical Society

New polymer tackles PFAS pollution

image: Flinders University PhD candidate Nicholas Lundquist (left) and Dr Justin Chalker (right) with a field sample of PFAS contaminated water.

Image: 
Flinders University

The problem of cleaning up toxic per- and polyfluoroalkyl substances (PFAS) pollution - the compounds are commonly used in non-stick and protective coatings, lubricants and aviation fire-fighting foams - has been solved through the discovery of a new low-cost, safe and environmentally friendly method that removes PFAS from water.

In the US, contamination by PFAS and other so-called "forever chemicals" has been detected in foods, including grocery store meats and seafood, by FDA tests, prompting calls for regulations to be applied to these manmade compounds. Consistent associations between very high levels of the industrial compounds in people's blood and health risks have been reported, but the evidence presented so far is insufficient to prove that the compounds are the cause.

In Australia, PFAS pollution - which does not break down readily in the environment - has been a hot news item due to the extensive historical use of fire-fighting foams containing PFAS at airports and defence sites, resulting in contaminated ground water and surface water being reported in these areas.

Researchers from the Flinders University Institute for NanoScale Science and Technology have - on World Environment Day - revealed a new type of absorbent polymer, made from waste cooking oil and sulfur combined with powdered activated carbon (PAC).

While there have been few economical options for removing PFAS from contaminated water, the new polymer adheres to carbon in a way that prevents caking during water filtration. It takes up PFAS faster than the commonly used and more expensive granular activated carbon method, and it dramatically lowers the amount of dust generated when handling PAC, reducing the respiratory risks faced by clean-up workers.

"We need safe, low-cost and versatile methods for removing PFAS from water, and our polymer-carbon blend is a promising step in this direction," says Flinders University's Dr Justin Chalker, co-director of the study. "The next stage for us is to test this sorbent on a commercial scale and demonstrate its ability to purify thousands of litres of water. We are also investigating methods to recycle the sorbent and destroy the PFAS."

During the testing phase, the research team was able to directly observe the self-assembly of PFOA hemi-micelles on the surface of the polymer. "This is an important fundamental discovery about how PFOA interacts with surfaces," explains Dr Chalker.

The team demonstrated the effectiveness of the polymer-carbon blend by purifying a sample of surface water obtained near a RAAF airbase. The new filter material reduced the PFAS content of this water from 150 parts per trillion (ppt) to less than 23 ppt, well below the 70 ppt guidance value for PFAS limits in drinking water issued by the Australian Government Department of Health.
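The reported figures can be checked with simple arithmetic. A quick sketch (the concentrations are those quoted above; the formula is just percent removal):

```python
initial_ppt = 150.0   # PFAS in the surface-water sample before filtration
treated_ppt = 23.0    # upper bound on PFAS after filtration
guideline_ppt = 70.0  # Australian drinking-water guidance value

# Percent removal; since 23 ppt is an upper bound, this is a lower bound.
removal = (initial_ppt - treated_ppt) / initial_ppt * 100
print(f"removal efficiency: at least {removal:.0f}%")
print(f"meets guideline: {treated_ppt < guideline_ppt}")
```

So the blend removed at least about 85 percent of the PFAS in a single pass, with the treated water comfortably under the guidance value.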

The core technology for this PFAS sorbent is protected by a provisional patent.

"Our canola oil polysulfide was found to be highly effective as a support material for powdered activated carbon, enhancing its efficiency and prospects for implementation," says Nicholas Lundquist, PhD candidate at Flinders University and first author in the ground-breaking study.

The research paper, "Polymer supported carbon for safe and effective remediation of PFOA- and PFOS-contaminated water", by Nicholas Lundquist, Martin Sweetman, Kymberley Scroggie, Max Worthington, Louisa Esdaile, Salah Alboaiji, Sally Plush, John Hayball and Justin Chalker, has been published in ACS Sustainable Chemistry & Engineering (DOI:10.1021/acssuschemeng.9b01793).

This project was a collaboration funded by the South Australian Defence Innovation Partnership, with further support from industry partners Puratap and the Salisbury Council. Co-directors of the study were A/Prof Sally Plush and Prof John Hayball at UniSA and Dr Justin Chalker at Flinders University.

Flinders PhD student Nicholas Lundquist was the lead author of the study in collaboration with Research Fellow Dr Martin Sweetman of UniSA.

"This successful project has laid the groundwork for significant ongoing, collaborative research between Flinders and UniSA," says Dr Sweetman, "as well as with our two industry partners Membrane Systems Australia and Puratap."

Credit: 
Flinders University

Children's brains reorganize after epilepsy surgery to retain visual perception

image: fMRI scans show that patient TC's word-specific region, which is normally found on the left, has remapped to the right hemisphere.

Image: 
Erez Freud, Ph.D., York University.

Children can retain full visual perception - the ability to process and understand visual information - after brain surgery for severe epilepsy, according to a study funded by the National Eye Institute (NEI), part of the National Institutes of Health. While brain surgery can halt seizures, it carries significant risks, including impaired visual perception. However, a new report from researchers at Carnegie Mellon University in Pittsburgh, based on a study of children who had undergone epilepsy surgery, suggests that the lasting effects on visual perception can be minimal, even among children who lost tissue in the brain's visual centers.

Normal visual function requires not just information sent from the eye (sight), but also processing in the brain that allows us to understand and act on that information (perception). Signals from the eye are first processed in the early visual cortex, a region at the back of the brain that is necessary for sight. They then travel through other parts of the cerebral cortex, enabling recognition of patterns, faces, objects, scenes, and written words. In adults, even if their sight is still present, injury or removal of even a small area of the brain's vision processing centers can lead to dramatic, permanent loss of perception, making them unable to recognize faces, locations, or to read, for example. But in children, who are still developing, this part of the brain appears able to rewire itself, a process known as plasticity.

"Although there are studies of the memory and language function of children who have parts of the brain removed surgically for the treatment of epilepsy, there have been rather few studies that examine the impact of the surgery on the visual system of the brain and the resulting perceptual behavior," said Marlene Behrmann, Ph.D., senior author of the study. "We aimed to close this gap."

Behrmann and colleagues recruited 10 children who had undergone surgery for severe epilepsy - caused in most cases by an injury such as stroke in infancy, or by a tumor - and 10 matched healthy children as a control group. Of the children with surgery, three had lost parts of the visual cortex on the right side, three on the left side, and the remaining four had lost other parts of the brain not involved in perception, serving as a second kind of control group. Of the six children who had areas of the visual cortex removed, four had permanent reductions in peripheral vision on one side due to loss of the early visual cortex. The epilepsy was resolved or significantly improved in all children after surgery. The children ranged in age from 6 to 17 years at the time of surgery, and most joined the study a few years later.

The researchers tested the children's perception abilities, including facial recognition, the ability to classify objects, reading, and pattern recognition. Despite in some cases completely lacking one side of the visual cortex, nearly all the children were able to successfully complete these behavioral tasks, falling within the normal range even for complex perception and memory activities.

To better understand how the children were able to compensate after surgery, the team imaged the children's brains with functional magnetic resonance imaging (fMRI) while the children engaged in perceptual tasks. fMRI allows researchers to visualize which regions of the brain are activated during specific activities. The team was able to map specific locations in the brain required for individual perception tasks both in the control children and in the children who had undergone surgery. These regions included the early visual cortex, the fusiform face area (required for facial recognition), the parahippocampal place area (required for processing scenes and locations), the lateral occipital complex (required for object recognition), and the visual word form area (necessary for reading).

Most of the regions for visual perception exist bilaterally - that is, both sides of the brain are involved in these tasks. The exceptions, however, are for facial recognition (fusiform face area), which tends to be more dominant in the right hemisphere, and for the visual word form area.

"We think there's some competition between face representation and word representation," explained Erez Freud, Ph.D., a lead author of the study, now an assistant professor at York University, Toronto. "When we learn to read, a reading-specific area arises on the left, and that pushes face recognition to the right hemisphere."

Curiously, for one participant whose surgery had removed most of the visual cortex in the left hemisphere, this reading-specific visual word form area region remapped to the right hemisphere, sharing space next to the facial recognition region on that side. But even for those participants who did not show such clear remapping, the remaining hemisphere was still able to compensate for missing regions in a way not usually seen in adults.

It isn't clear exactly when this compensation took place, but the researchers believe that it may begin well before surgery, in response to the damage that caused the epilepsy in the first place.

"It's possible that early surgical treatment for children with epilepsy might be what allows this remapping, although more research is needed to understand what drives this type of brain plasticity," said Freud.

"It turns out that the residual cortex actually can support most of the visual functions we were looking at," said Tina Liu, Ph.D., a lead author on the study. "Those visual functions - recognizing patterns, facial recognition, and object recognition - are really important to support daily interactions."

Credit: 
NIH/National Eye Institute

Ant reactions to habitat disruptions are a result of evolution, says Concordia researcher

A Concordia biology professor is calling on ant experts to develop a set of common principles that influence the way the insects respond when their habitat undergoes severe disruption.

Writing in the Journal of Animal Ecology, Jean-Philippe Lessard synthesizes the work of Alan Andersen, a leading researcher in the field of ant ecology based at Charles Darwin University in Australia.

Lessard writes that Andersen's system of grouping ant communities along certain criteria is a helpful start, particularly when it comes to how different species respond to disturbances to their environment. But much more work is needed before ant ecologists -- known formally as myrmecologists -- have an agreed-upon standard framework.

Andersen's groupings provide a base from which researchers can compare changes in the makeup of ant communities around the world. Ants are a highly diverse group of organisms: there are more than 12,000 separate species, found on all continents except Antarctica and in almost all ecosystems, from arctic taiga to arid desert. This makes them easy to sample and identify, says Lessard, and easy to monitor when measuring recovery efforts and response to disturbance.

Biogeographic and evolutionary history

He writes that comparing those responses offers several important insights. For instance, all ant communities around the globe react strongly to habitat openness, or how much vegetation covers the ground, regardless of how that openness comes about.

"Ant communities will not respond so differently to a fire versus the cutting down of a forest versus an outbreak of herbivores eating up biomass," explains Lessard, Concordia University Research Chair in Biodiversity and Ecosystem Functioning.

"They will respond to the openness that these create. It doesn't matter what the actual source of the disturbance is, what matters is whether the canopy is open or closed."

He notes that ant communities' responses to disturbance can also be quite heterogeneous. An ant community in the Brazilian savanna, for instance, will react differently to a change in its ecosystem than a community in the Australian savanna.

This is thanks to millions of years of biogeographic and evolutionary history. Most ant communities in Brazil are evolutionarily adapted to forest habitats, and so the loss of canopy to events like forest fires will have a greater effect on them than a similar event would have on a species adapted to the hot, dry Australian ecosystem.

As Lessard writes, these findings show that their presence over the eons of so-called deep time "has left a signature on contemporary structure of ant communities."

Toward a common framework

As useful and interesting as he finds them, however, Lessard believes Andersen's functional groupings are at least somewhat arbitrary.

"If someone else decided which ants would belong to which groups, how meaningful would that be?" he asks.

Without an existing common framework, ant ecologists are "out of sync" in what functional traits they measure to assess the consequences of man-made disturbances, he argues.

"If someone is trying to measure one trait and someone else is measuring a different trait, we'll never be able to compare how they might facilitate or prevent extinction in the face of a disturbance," Lessard says.

"In the ant world, we really don't have much widespread agreement on which traits would be most useful when it comes to measuring how communities respond to disturbance and understanding the fundamental process of how species come together in one place."

Credit: 
Concordia University

New way to estimate current-induced magnetization switching efficacy in ultrathin systems

image: Maxim Stebliy, Laboratory for Film Technologies, Far Eastern Federal University, School of Natural Sciences.

Image: 
FEFU press-office

Scientists at Far Eastern Federal University (FEFU), in collaboration with colleagues from South Ural State University and the Chinese Academy of Sciences, have developed an alternative method for numerically evaluating the current-induced local magnetization effect in ultrathin ruthenium-cobalt-ruthenium films with an added tungsten layer. It is another step toward understanding how to control the spin orientation needed for the correct operation of spintronic devices. A related article was published in Physical Review Applied.

Developing a reliable method for controlling local magnetization (the orientation of spins) remains among the key unsolved problems in spintronics and an important direction for its progress, along with the issues of high current density and the need to apply an external magnetic field to the system.

The study revealed the possibility of changing the magnetic parameters of the system over a wide range by adjusting the thickness of the materials' layers. In the ruthenium-cobalt-ruthenium system studied, the current affected the magnetization only weakly. However, when a tungsten layer was added to the system, an additional source of spin-polarized electrons appeared, and the efficacy of spin switching increased.

Current studies aim to make possible devices operating on new principles. Among them are non-volatile magnetic memory and logic, high-precision sensors, ultrafast information processing systems, and artificial intelligence systems.

'For the time being, such devices are based on semiconductors, and all processes in them are driven by the movement of electrons. Accumulating electrons in memory cells makes it possible to store information; if the electrons are lost for some reason, the information is lost too,' explains Maxim Stebliy, Ph.D. in Physical and Mathematical Sciences, a researcher at the Laboratory for Film Technologies, FEFU School of Natural Sciences. 'In magnetic materials, processing and storing information is qualitatively different. Imagine that every element of the magnetic material is associated with a compass needle. This is a magnetic moment. Operating on the magnetic material means changing the orientation of this needle. The electrons responsible for the magnetization remain static, and a significant external influence is required to change the orientation of their compass needles (spins). That makes the needles' state stable and, in that sense, non-volatile. This is how a bit in a computer switches from zero to one, and this very process runs in hard disks. The big technological drawback of that process is that a small coil is needed, through which a current is passed to generate a magnetic field. To switch one bit, the orientation of the current in the coil must be changed very quickly. This is a relatively slow and energy-consuming process, but most importantly it cannot be parallelized. There is only one coil, while there can be terabytes of information, i.e. about 10^13 bits. Hence, a hard disk platter has to rotate at very high speed, since the coil must be brought to every site on it. The problem with magnetic materials is that there is no convenient and fast way to control the changes in the orientation of their electrons' needles. A coil is required.'

The scientist added that in recent years a technology has been developed that allows one to switch the 'compass needle' by applying the current not to a coil but to the whole structure. In this case, the spins of electrons localized in the crystal lattice of the structure, which have a stable needle orientation, begin to interact with the spins of the conduction electrons.

'If we somehow manage to 'comb out', that is, to polarize, the randomly oriented spins of the current's electrons, aligning them in one direction, then the spins of the localized electrons will 'feel' this and, under certain conditions, will switch,' says Maxim Stebliy. 'Our article is devoted to a structure containing layers of ruthenium and tungsten. When a current flows through these layers, the electron spins become polarized according to a right-hand rule. This is the spin Hall effect.'

Researchers deposited a layer of cobalt on a layer of tungsten and passed a current through the system. Under certain conditions, this can cause the cobalt layer's magnetization to switch. This is the so-called spin-torque effect, which FEFU scientists are studying in different materials, searching for the system in which it is most pronounced.

Credit: 
Far Eastern Federal University

Rare fossils provide more detailed picture of biodiversity during Middle Ordovician

image: Marine fossils from Portugal are shedding light on the Middle Ordovician, where there had been a gap in the fossil record.

Image: 
Julien Kimmig / KU News Service

LAWRENCE -- A clutch of marine fossil specimens unearthed in northern Portugal, from organisms that lived between 470 and 459 million years ago, is filling a gap in understanding evolution during the Middle Ordovician period.

The discovery, explained in a new paper just published in The Science of Nature, details three fossils found in a new "Burgess Shale-type deposit." (The Burgess Shale is a deposit in Canada renowned among evolutionary biologists for excellent preservation of soft-bodied organisms that don't have a biomineralized exoskeleton.)

"The paper describes the first soft-body fossils preserved as carbonaceous films from Portugal," said lead author Julien Kimmig, collections manager at the University of Kansas Biodiversity Institute and Natural History Museum. "But what makes this even more important is that it's one of the few deposits that are actually from the Ordovician period -- and even more importantly, they're from the Middle Ordovician, a time when very few soft-bodied fossils are known."

Kimmig and his KU Biodiversity Institute colleagues, undergraduate researcher Wade Leibach and senior curator Bruce Lieberman, along with Helena Couto of the University of Porto in Portugal (who discovered the fossils), describe three marine fossil specimens: a medusoid (jellyfish), possible wiwaxiid sclerites and an arthropod carapace.

"Before this, there had been nothing found on the Iberian Peninsula in the Ordovician that even resembled these," Kimmig said. "They close a gap in time and space. And what's very interesting is the kind of fossils. We find Medusozoa -- a jellyfish -- as well as animals which appear to be wiwaxiids, which are sluglike armored mollusks that have big spines. We found these lateral sclerites of animals which were actually thought to have gone extinct in the late Cambrian. There might have been some that survived into the Ordovician in a Morocco deposit, but nothing concrete has ever been published on those. And here we have evidence for the first ones actually in the middle of the Ordovician, so it extends the range of these animals incredibly."

Kimmig said the discovery of uncommon wiwaxiid fossils in this time frame suggests the animals lived on Earth for a far greater span of time than previously understood.

"Especially with animals that are fairly rare that we don't have nowadays like wiwaxiids, it's quite nice to see they lived longer than we ever thought," he said. "Closely after this deposit, in the Upper Ordovician, we actually get a big extinction event. So, it's likely the wiwaxiids survived up to that big extinction event and didn't go extinct earlier due to other circumstances. But it might have been whatever caused the big Ordovician extinction event killed them off, too."

According to the researchers, the soft-bodied specimens fill a gap in the fossil record for the Middle Ordovician and suggest "many soft-bodied fossils in the Ordovician remain to be discovered, and a new look at deep-water shales and slates of this time period is warranted."

"It's a very interesting thing with these discoveries -- we're actually getting a lot of information about the distribution of animals chronologically and geographically," Kimmig said. "Also, this gives us a lot of information on how animals adapted to different environments and where they actually managed to live. With these soft-body deposits, we get a much better idea of how many animals there were and how their environment changed over time. It's something that applies to modern days, with changing climate and changing water temperatures, because we can see how animals over longer periods of time in the geologic record have actually adapted to these things."

Co-author Couto discovered the fossils in the Valongo Formation in northern Portugal, an area famed for containing trilobites. When the animals were alive, the Valongo Formation was part of a shallow sea on the margin of northern Gondwana, the primeval supercontinent.

"Based on the shelly fossils, the deposit looks like it was a fairly common Ordovician community," Kimmig said. "And now we know that in addition to those common fossils jellyfish were floating around, we had sluglike mollusks roaming on the ground, too, and we had bigger arthropods, which might have been predatory animals. So, in that regard, we're getting a far better image with these soft-bodied fossils of what these communities actually looked like."

According to the KU researcher, scientists didn't grasp until recently that deposits from this period could preserve soft-bodied specimens.

"For a long time, it was just not known that these kinds of deposits survived into the Ordovician," Kimmig said. "So, it is likely these deposits are more common in the Ordovician than we know of; it's just that people were never looking for them."

Kimmig led analysis of the fossils at KU's Microscopy and Analytical Imaging Laboratory to ensure the fossils were made of organic material. Leibach, the KU undergraduate researcher, conducted much of the lab work.

"We analyzed the material and looked at the composition because sometimes you can get pseudo fossils -- minerals that create something that looks like a fossil," Kimmig said. "We had to make sure that these fossils actually had an organic origin. And what we found is that they contain carbon, which was the big indication they would actually be organic."

Credit: 
University of Kansas

Zebrafish capture a 'window' on the cancer process

image: Micrograph of a region of zebrafish skin where a track of cancer cells has disrupted the epithelium much like a mole burrowing beneath a lawn of grass.

Image: 
University of Bristol

Cancer-related inflammation impacts significantly on cancer development and progression. New research has observed in zebrafish, for the first time, that inflammatory cells use weak spots or micro-perforations in the extracellular matrix barrier layer to access skin cancer cells.

The research, led by the University of Bristol and published in Cell Reports today [Tuesday 4 June], used translucent zebrafish to model several sorts of skin cancer and live image how inflammatory cells find the growing cancer cells in the skin. To access the cancer cells, the immune cells need to first breach an extracellular matrix barrier layer called the basement membrane zone.

The researchers observed weak spots in the basement membrane zone, which the inflammatory cells use as easy routes to access the cancer. Those clones of cancer cells nearest to the weak spots tend to receive more inflammatory cell visits and as a consequence they grow faster.

Paul Martin, Professor of Cell Biology in the Schools of Biochemistry and Physiology, Pharmacology & Neuroscience at the University of Bristol, said: "As the zebrafish are translucent, we can watch inflammatory cells interacting with cancer cells in ways not possible in our own tissues.

"This 'window' on the cancer process has revealed 'weak spots' in the barrier layer that inflammatory cells must breach in order to access and feed the cancer cells within the skin. Now we know these micro-perforations exist we can target them with cancer therapeutics."

Inflammatory cells (white blood cells) are known to be key players encouraging cancer malignancy from the earliest stages of cancer initiation, but it has been unclear how they gain access to the cancer cells which generally develop within epithelial tissues, which have a barrier layer of extracellular matrix between them and the tissues from where the inflammatory cells arise. Inflammatory cells could kill the growing cancer but instead tend to nourish it and encourage cancer progression.

The study's findings have clear clinical relevance to cancer patients because micro-perforations in the basement membrane zone have been shown to occur in human airways and guts and so could act as similar portals to let inflammatory cells gain access to the cancer.

Credit: 
University of Bristol

Separation anxiety no more: A faster technique to purify elements

image: (From left) Rebecca Abergel, Abel Ricano, and Gauthier Deblonde of Berkeley Lab's Chemical Sciences Division have pioneered a faster method of purifying elements.

Image: 
Marilyn Chung/Berkeley Lab

The actinides - those chemical elements on the bottom row of the periodic table - are used in applications ranging from medical treatments to space exploration to nuclear energy production. But purifying the target element so it can be used, by separating out contaminants and other elements, can be difficult and time-consuming.

Now researchers at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) have developed a new separation method that is vastly more efficient than conventional processes, opening the door to faster discovery of new elements, easier nuclear fuel reprocessing, and, most tantalizing, a better way to attain actinium-225, a promising therapeutic isotope for cancer treatment.

The research, "Ultra-Selective Ligand-Driven Separation of Strategic Actinides," has been published in the journal Nature Communications. The authors are Gauthier Deblonde, Abel Ricano, and Rebecca Abergel of Berkeley Lab's Chemical Sciences Division. "The proposed approach offers a paradigm change for the production of strategic elements," the authors wrote.

"Our proposed process appears to be much more efficient than existing processes, involves fewer steps, and can be done in aqueous environments, and therefore does not require harsh chemicals," said Abergel, lead of Berkeley Lab's Heavy Element Chemistry group. "I think this is really important and will be useful for many applications."

Berkeley Lab is one of a handful of institutions around the world studying the nuclear and chemical properties of the heaviest elements. Most of them were, in fact, discovered at Berkeley Lab in the last century. Abergel's group has previously published discoveries on berkelium and plutonium and treatments for radioactive contamination.

Abergel noted that the new separation method achieves separation factors that are many orders of magnitude higher than current state-of-the-art methods. The separation factor is a measure of how well an element can be separated from a mixture. "The higher the separation factor, the fewer contaminants there are," she said. "Usually when you purify an element you'll go through the cycle many times to reduce contaminants."

With a higher separation factor, fewer steps and less solvent are needed, making the process faster and more cost-effective. For example, for one of the three systems they purified, the scientists demonstrated that they could reduce the process from 25 steps to just two steps.
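As a back-of-the-envelope illustration (not the Berkeley Lab team's actual chemistry), the sketch below treats each purification cycle as dividing the impurity-to-product ratio by the separation factor. Under that idealization, it shows why a vastly larger separation factor collapses many cycles into one or two:

```python
import math

def cycles_needed(separation_factor, impurity_in=0.5, impurity_target=1e-6):
    """Cycles required to reach a target impurity level, assuming each
    cycle divides the impurity-to-product ratio by the separation factor
    (an idealization for illustration only)."""
    ratio = impurity_in / (1 - impurity_in)
    target = impurity_target / (1 - impurity_target)
    return math.ceil(math.log(ratio / target) / math.log(separation_factor))

print(cycles_needed(separation_factor=3))    # modest factor: 13 cycles
print(cycles_needed(separation_factor=1e6))  # very large factor: 1 cycle
```

The concentrations and factors here are arbitrary; the point is only the logarithmic trade-off between separation factor and cycle count.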

The Berkeley Lab researchers demonstrated their method first on actinium-225, an isotope of actinium that has shown very promising radio-therapeutic applications. Through targeted delivery, it kills cancer cells while sparing healthy cells.

DOE's Isotope Program is actively working on ramping up production of actinium-225 throughout the complex of national laboratory-based accelerators. This new separation method could be an alternative to chemical processes currently under development. "With any production process, you need to purify the final isotope," Abergel said. "Our method could be used right after production, before distribution."

The two other actinides purified in this study were plutonium and berkelium. An isotope of plutonium, plutonium-238, is used for power generation in robots being sent to explore Mars. Plutonium isotopes are also present in waste generated at nuclear power plants, where they must be separated out from the uranium in order to recycle the uranium.

Lastly, berkelium is important for fundamental science research. One of its uses is as a target for discovery of new elements.

The process relies on the unprecedented ability of synthetic ligands - small molecules that bind metal atoms - to be highly selective in binding to metallic cations (positive ions) based on the size and charge of the metal.

The next step, said Abergel, is to explore using the process on other medical isotopes. "Based on what we've seen, this new method can really be generalized, as long as we have different charges on the metals we want to separate," she said. "Having a good purification process available could make everything easier in terms of post-production processing and availability."

Credit: 
DOE/Lawrence Berkeley National Laboratory

New process to rinse heavy metals from soils

When poisonous heavy metals like lead and cadmium escape from factories or mines, they can pollute the nearby soil. With no easy ways to remove these contaminants, fields must be cordoned off to prevent these toxins from entering the food chain where they threaten human and animal health.

According to the Environmental Protection Agency, heavy metals have been found at thousands of locations nationwide. While some have been cleaned up through a combination of federal, state and private efforts, the need remains for new technologies to address heavy metal contamination.

Now a research team led by Stanford materials scientist Yi Cui has invented a way to wash heavy metals from contaminated soils using a chemical process that's a bit like brewing coffee.

As they describe June 4 in Nature Communications, the researchers started by rinsing contaminated soil with a mixture of water and a chemical that attracts heavy metals. When that mixture percolates through the soil, the chemical pulls heavy metals loose. The team members then collected this toxic brew and ran it through an electrochemical filter that captured the heavy metals out of the water. In this way they cleansed the soil of heavy metals and recycled the water and chemical mixture to percolate through more contaminated ground.
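The rinse-and-filter loop can be sketched as a toy mass balance. The removal fraction and concentrations below are arbitrary illustrative numbers, not values from the Stanford study:

```python
def wash_soil(initial_ppm, removal_per_pass=0.6, target_ppm=100.0):
    """Toy mass balance for repeated EDTA rinses (illustrative numbers only).

    Each pass removes a fixed fraction of the remaining metal; the
    electrochemical filter then strips the metal out of the rinse water
    so the water-and-EDTA mixture can be reused on the next pass."""
    ppm, passes = initial_ppm, 0
    while ppm > target_ppm:
        ppm *= (1 - removal_per_pass)
        passes += 1
    return passes, ppm

passes, final = wash_soil(initial_ppm=2000.0)
print(passes, round(final, 1))  # 4 passes bring 2000 ppm down to ~51 ppm
```

Because the chemical mixture is recycled rather than consumed, the cost per pass in such a loop is dominated by pumping and filtering, not by fresh reagent.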

"This is a new approach to soil cleanup," said Cui, who is a professor of materials science and engineering and photon science. "Our next step is a pilot test to make sure that what works in the lab is practical in the field, and to figure out how much this process will cost."

So far, his team has cleansed soils contaminated with lead and cadmium, two prevalent and dangerous toxins, as well as with copper, which is only dangerous in high concentrations. Cui believes this process of chemical cleansing and electrochemical filtering will work with other dangerous heavy metals like mercury and chromium, but further lab experiments are needed to demonstrate that.

No more sacrificial plants

Cui said the project began two years ago when he and graduate student Jinwei Xu brainstormed about how to solve the basic problem: Heavy metals bind to the soil and become virtually inextricable. Today, Cui said, cleanup may involve digging up contaminated soils and sequestering them somewhere. Agricultural researchers have also developed phytoremediation techniques - growing sacrificial plants in contaminated soil to absorb heavy metals, then harvesting these crops and taking them to an extraction and disposal facility. But phytoremediation can take many years of repeated harvests.

Seeking a quick, cost-effective way to extract heavy metals from contaminated fields, the researchers tried washing toxic soil samples with plain water. They soon realized that plain water couldn't break the chemical bond between the heavy metals and the soil. They needed some additive to pry the contaminants loose. They found the answer in a common chemical known by its initials: EDTA.

In retrospect, EDTA was the obvious choice because this same chemical is used to treat patients poisoned with lead or mercury. Negatively charged EDTA bonds so strongly to positively charged heavy metal ions that it pulls the lead or mercury from the patient's tissues. The researchers reasoned that, when dissolved in water, EDTA's negative hooks would rip heavy metals loose from soils. Experiments bore this out. When EDTA-treated water percolated through contaminated soil, it carried the heavy metals away.

But the team's job was only half done. The soil was clean, but the treated water was still toxic. They needed a way to separate the EDTA from the heavy metals in the rinse water and capture those toxins once and for all.

Isolating heavy metals

The scientists knew that EDTA remained strongly negative even after it captured a positively charged metal ion. So the researchers built a sieve with the electrical and chemical properties to pull the negatively charged EDTA and positively charged heavy metals apart. The result was isolated heavy metals and a mixture of water and EDTA ready to purify more soil.

Next, Cui would like to run the experiment on other heavy metals like mercury, which is so toxic that it requires special handling to protect the researchers. He believes the chemistry is sound enough to be confident of success in the lab. The bigger question is whether the process can be scaled up to treat tons of contaminated soil. The researchers have sought to patent the process through the Stanford Office of Technology Licensing and would like to find an opportunity to run a pilot project in a contaminated field.

"We really have no good remediation technology for heavy metals," Cui said. "If this proves practical on a large scale it will be a significant advance."

Credit: 
Stanford University

Choosing the right drug to fight cancer

Canadian researchers have discovered a molecular indicator of a mechanism that drives cancer progression, opening the door to precision medicine: choosing which patients will respond to a particular anticancer drug.

In a study published in Cancer Research, a team of biochemists at Université de Montreal found that a group of enzymes called SRC kinases chemically modify a tumour-suppressing protein called SOCS1.

"SOCS1 is part of a gene-regulation circuit centered around the master cell proliferation regulator p53, often called the guardian of the genome," said senior author Gerardo Ferbeyre, an UdeM biochemistry professor and researcher at its hospital research centre, the CRCHUM.

"If p53 or another protein in its network is mutated or becomes chemically modified in some abnormal way, a pattern of gene activation occurs that programs cells to proliferate without control, as occurs in cancers."

In their research - led by UdeM PhD student Emmanuelle Saint-Germain, with UdeM biochemist Frédéric Lessard and Université de Sherbrooke biochemist Subburaj Ilangumaran - Ferbeyre's team uncovered a new mechanism by which the p53 circuit becomes unbalanced.

Normally, the SRC kinases add phosphates to proteins in a cell in a highly regulated manner. But in cancer cells the regulation of these enzymes can break down. As a consequence, SOCS1 is abnormally targeted by these enzymes, leading to an effective inhibition of its ability to regulate p53 and stop the proliferation of cancer cells.

The UdeM scientists believe their discovery could have multiple therapeutic implications.

Since effective anticancer drugs that target SRC kinases already exist, detection of modified SOCS1 in a tumour could be used to predict whether these drugs would be an effective treatment for the tumour.

"We were able to detect phosphorylated SOCS1 in patients' samples with an antibody that we developed," said Saint-Germain. "The same antibody could be used to detect phosphorylated SOCS1 in a clinical setting, providing a way to decide whether SRC kinase inhibitors would be an effective treatment."

Added Ilangumaran, who has been studying SOCS1 in immune cells and cancers for many years: "This new mechanism for SOCS1 inactivation may actually represent a regulatory control that is hijacked by cancer cells. On a more fundamental level, our group's discovery that phosphorylation gives SOCS1 a new physical form opens the door to hitherto unknown ways of regulating SOCS1 functions.

"And this has implications for the treatment of autoimmune diseases and for anticancer immunity."

Credit: 
University of Montreal

SwRI's ActiveVision enables transportation agencies to automate traffic monitoring

image: ActiveVision applies a combination of advanced computer vision and machine learning capabilities to detect and report actionable traffic condition changes.

Image: 
Southwest Research Institute

SAN ANTONIO - June 4, 2019 - Southwest Research Institute (SwRI) has announced the release of ActiveVision, a machine vision tool that transportation agencies can use to autonomously detect and report traffic condition changes. ActiveVision's algorithms process camera data to provide real-time information on weather conditions and other anomalies affecting congestion. Designed for integration with intelligent transportation systems (ITS), ActiveVision can be configured with existing traffic cameras to analyze roadway conditions with no human monitoring required.
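ActiveVision's algorithms are proprietary, but the general idea of autonomously flagging condition changes from camera frames can be sketched with simple frame differencing. The threshold, frame sizes, and scenario below are arbitrary illustrations, not SwRI's implementation:

```python
import numpy as np

def condition_change_score(frame_prev, frame_curr, threshold=30):
    """Fraction of pixels whose brightness changed by more than `threshold`
    between two grayscale frames. A crude stand-in for camera-based change
    detection; ActiveVision's actual algorithms are not public."""
    diff = np.abs(frame_curr.astype(int) - frame_prev.astype(int))
    return float((diff > threshold).mean())

# Simulated 8-bit frames: a uniform roadway, then a bright patch appears
# (e.g. glare, snow cover, or a stalled vehicle).
prev = np.full((120, 160), 90, dtype=np.uint8)
curr = prev.copy()
curr[40:80, 60:100] = 200
print(round(condition_change_score(prev, curr), 3))  # ~8% of pixels changed
```

A real pipeline would add machine-learned classification of what changed; the value of running this at the camera is that only the actionable score, not the raw video, needs to reach traffic operators.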

"The goal is to help transportation officials enhance their ITS capabilities with advanced algorithms that autonomously scan vast amounts of visual data, extracting and reporting actionable data," said Dan Rossiter, an SwRI research analyst leading ActiveVision development.

ActiveVision integrates with the SwRI-developed ActiveITS software in addition to other ITS and advanced traffic management systems (ATMS) used by state and local agencies across the country. A leader in transportation and traffic management software, SwRI has over 20 years of experience developing and deploying ITS software for state and local agencies. SwRI-developed intelligent transportation systems have been applied to more than 13,000 miles of urban and rural managed roadways in 10 states and Puerto Rico.

"Work in the ITS arena inspired our team to find a solution that could be integrated agnostically into just about any advanced traffic management system, using existing cameras and infrastructure," added Steve Dellenback, vice president of SwRI's Intelligent Systems Division. The division specializes in traffic management systems and connected and automated vehicles in addition to machine learning solutions used in autonomous robotics, health diagnostics, markerless motion capture, and methane leak detection.

Credit: 
Southwest Research Institute

NASA-NOAA satellite sees System 91L's reach into the western Gulf of Mexico

image: NASA-NOAA's Suomi NPP satellite passed over the Gulf of Mexico and captured a visible image of developing low pressure System 91L on June 3, 2019. Clouds associated with the system filled the Bay of Campeche and stretched north into the western Gulf of Mexico.

Image: 
NASA Worldview, Earth Observing System Data and Information System (EOSDIS)

System 91L is an area of tropical low pressure located in the Bay of Campeche. On June 3, when NASA-NOAA's Suomi NPP satellite passed the western Gulf of Mexico, it captured an image of the storm that showed its extensive reach.

The Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard Suomi NPP provided a visible image of the storm. The VIIRS image showed fragmented bands of thunderstorms around System 91L's circulation center, which filled the Bay of Campeche and stretched north into the western Gulf of Mexico. System 91L's clouds extended from Mexico's Yucatan state westward over the bordering states of Campeche, Tabasco and Veracruz, and as far north as Tamaulipas.

At 8 a.m. EDT on June 4, NOAA's National Hurricane Center (NHC) noted that System 91L had a 40 percent chance of forming into a depression over the next two days: "Shower and thunderstorm activity associated with a broad area of low pressure located over the southwestern Gulf of Mexico has decreased since yesterday and remains disorganized. This system could briefly become a tropical depression before moving inland over northeastern Mexico later today or tonight."

Even if System 91L doesn't develop into a depression, it's still packing a punch with rainfall.

The NHC noted "the disturbance will likely produce heavy rainfall over portions of eastern Mexico, southeastern Texas and the Lower Mississippi Valley during the next few days. Interests along the Gulf coast of Mexico should monitor the progress of this system."

Credit: 
NASA/Goddard Space Flight Center

Researchers find seaweed helps trap carbon dioxide in sediment

TALLAHASSEE, Fla. -- Every beachgoer can spot seaweed in the ocean or piling up on the beach, but Florida State University researchers working with colleagues in the United Kingdom have found that these slimy macroalgae play an important role in permanently removing carbon dioxide from the atmosphere.

Their work is published in the journal Ecological Monographs by the Ecological Society of America.

The researchers, who partnered with ecologists from Plymouth Marine Laboratory in the United Kingdom, investigated how seaweed absorbed carbon and processed it, trapping it in the seafloor.

"Seaweeds have been ignored in the 'blue carbon' storage literature in favor of seagrasses and mangroves, which physically trap carbon from sediments and their own biomass in root structures," said Assistant Professor of Biological Science Sophie McCoy. "Macroalgae are also often overlooked by oceanographers who study the carbon cycle, as their high productivity occurs close to shore and has been thought to stay and cycle locally."

In designing the study, researchers suspected that the high productivity and huge amount of seasonal biomass of annual algae would provide carbon subsidies farther offshore than typically considered, and that these subsidies would be important to benthic food webs there.

That was exactly what they found. They also discovered that this was the process that leads to the burial of seaweed carbon in ocean sediments.

Blue carbon is carbon captured in marine systems, first through photosynthesis and then through burial in the seafloor. The researchers sequenced environmental DNA and modeled stable isotope data for over a year off the coast of Plymouth, England. Through this, they found that seaweed debris was an important part of the food web for marine organisms and that much of that debris was ultimately stored in sediments or entered the food web on the seafloor.

Jeroen Ingels, a researcher at the FSU Coastal and Marine Laboratory who conducted the meiofauna work for the study, said the research not only explains seaweed's role in the food web, but it also shows that human activities that affect seaweed and the sea floor are important to monitor.

"The human activities that are impacting macroalgae and sediment habitats and their interstitial animals are undermining the potential for these systems to mitigate climate change by affecting their potential to take up and cycle carbon," he said. "The study really illustrates in a new way how seaweed and subsequently benthic animals can contribute in a significant way to blue carbon."

The team found that about 8.75 grams of macroalgae carbon are trapped per square meter of sediment each year.
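For a sense of scale, the reported burial rate can be multiplied out over a hypothetical receiving area. The 100 km&#178; figure below is arbitrary, chosen only to illustrate the arithmetic, and is not from the study:

```python
# Scale the reported burial rate -- 8.75 g of macroalgal carbon per square
# meter of sediment per year -- to a hypothetical area.
BURIAL_G_PER_M2_YR = 8.75

def tonnes_buried(area_km2, years=1):
    m2 = area_km2 * 1_000_000          # km^2 -> m^2
    grams = BURIAL_G_PER_M2_YR * m2 * years
    return grams / 1_000_000           # g -> tonnes

print(tonnes_buried(area_km2=100))     # 875.0 tonnes of carbon per year
```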

Ana M. Queiros, a scientist at Plymouth Marine Laboratory and the paper's lead author, said these first measurements of seaweed carbon trapped in the sediment give scientists more information to help them develop sustainable environmental practices.

"They tell us that the global extent of blue carbon-meaningful marine habitats could be much wider than we previously thought," she said. "Identifying these areas and promoting their management will let us capitalize on the full potential of the ocean's blue carbon towards the stabilization of the global climate system."

Credit: 
Florida State University

New interaction between thin film magnets discovered

We stream videos, download audiobooks to mobile devices, and store huge numbers of photos on our devices. In short, the storage capacity we need is growing rapidly. Researchers are working to develop new data storage options. One possibility is the racetrack memory device, in which data is stored in nanowires in the form of oppositely magnetized areas, so-called domains. The results of this research have recently been published in the scientific journal Nature Materials.

A research team from Johannes Gutenberg University Mainz (JGU) in Germany, together with colleagues from Eindhoven University of Technology in the Netherlands as well as Daegu Gyeongbuk Institute of Science and Technology and Sogang University in South Korea, has now made a discovery that could significantly improve these racetrack memory devices. Instead of using individual domains, in the future one could store the information in three-dimensional spin structures, making the memories faster and more robust and providing a larger data capacity.

"We were able to demonstrate a hitherto undiscovered interaction," explained Dr. Kyujoon Lee of Mainz University. "It occurs between two thin magnetic layers separated by a non-magnetic layer." Usually, spins align either parallel or antiparallel to each other, and this would also be expected for two such separate magnetic layers. However, the researchers have shown that in particular systems the spins in the two layers are twisted against each other. More precisely, they couple so as to align perpendicular to one another, at an angle of 90 degrees. This new interlayer coupling was explained through theoretical calculations performed by the project partners at the Peter Grünberg Institute (PGI) and the Institute for Advanced Simulation (IAS) at Forschungszentrum Jülich.

The Mainz-based researchers examined a number of different combinations of materials grown in multi-layers. They were able to show that this previously unknown interaction exists in different systems and can be engineered by the design of the layers. Theoretical calculations allowed them to understand the underlying mechanisms of this novel effect.
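The competition between the usual symmetric exchange and an antisymmetric interlayer term can be illustrated with a toy two-layer energy model. The functional form and coefficients here are schematic stand-ins, not the study's Hamiltonian:

```python
import math

def equilibrium_angle(J, D, n=3600):
    """Angle (in degrees) that minimizes a toy bilayer coupling energy
    E(theta) = -J*cos(theta) - D*sin(theta), found by a grid search.

    J plays the role of the usual symmetric exchange; D stands in for an
    antisymmetric interlayer term. Schematic model for illustration only."""
    def energy(i):
        theta = 2 * math.pi * i / n
        return -J * math.cos(theta) - D * math.sin(theta)
    best = min(range(n), key=energy)
    return 360.0 * best / n

print(equilibrium_angle(J=1.0, D=0.0))  # symmetric term alone: parallel, 0 degrees
print(equilibrium_angle(J=0.0, D=1.0))  # antisymmetric term alone: 90-degree twist
```

In this toy picture, a dominant antisymmetric term twists the two layers into the perpendicular alignment the researchers observed, while mixed couplings give intermediate angles.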

With their results, the researchers reveal a missing component in the interaction between such layers. "These results are very interesting to the scientific community in that they show that the missing antisymmetric element of interlayer interaction exists," commented Dr. Dong-Soo Han from JGU. This opens up the possibility of designing various new three-dimensional spin structures, which could lead to new magnetic storage units in the long term.

Professor Mathias Kläui, senior author of the publication, added: "I am very happy that this collaborative work in an international team has opened a new path to three-dimensional structures that could become a key enabler for new 3D devices. Through the financial support of the German Research Foundation and the German Academic Exchange Service, the DAAD, we were able to exchange students, staff, and professors with our foreign partners in order to realize this exciting work."

Credit: 
Johannes Gutenberg Universitaet Mainz

Downpours of torrential rain more frequent with global warming

image: Dr. Simon Papalexiou of the Global Institute for Water Security, at the University of Saskatchewan, Canada, has found 'extreme' downpours of rain have increased in the past 50 years.

Image: 
University of Saskatchewan, Canada

The frequency of downpours of heavy rain--which can lead to flash floods, devastation, and outbreaks of waterborne disease--has increased across the globe in the past 50 years, research led by the Global Institute for Water Security at the University of Saskatchewan (USask) has found.

The number of extreme downpours increased steadily between 1964 and 2013--a period when global warming also intensified, according to research published in the journal Water Resources Research.

The frequency of 'extreme precipitation events' increased in parts of Canada, most of Europe, the Midwest and northeast regions of the U.S., northern Australia, western Russia, and parts of China.

"By introducing a new approach to analyzing extremes, using thousands of rain records, we reveal a clear increase in the frequency of extreme rain events over the recent 50 years when global warming accelerated," said Simon Papalexiou, a hydro-climatologist in USask's College of Engineering and an expert in hydroclimatic extremes and random processes.

Papalexiou, who led the research, added: "This upward trend is highly unlikely to be explained by natural climatic variability. The probability of this happening is less than 0.3 per cent under the model assumptions used."

The USask study, which analyzed more than 8,700 of the most complete daily rain records from the 100,000 stations monitoring rain worldwide, found that the frequency of torrential rain between 1964 and 2013 increased as the decades progressed.

Between 2004 and 2013, there were seven per cent more extreme bouts of heavy rain overall than expected globally. In Europe and Asia, there were 8.6 per cent more 'extreme rain events' overall, during this decade.
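The bookkeeping behind such decadal counts can be sketched on synthetic data. The threshold choice and rainfall model below are illustrative, not the study's statistical methodology:

```python
import random

def extreme_counts_by_decade(daily_mm, start_year, q=0.99):
    """Count days above the record's q-quantile threshold, grouped by decade.
    Simplified bookkeeping, not the study's statistical treatment."""
    threshold = sorted(daily_mm)[int(q * len(daily_mm))]
    counts = {}
    for day, value in enumerate(daily_mm):
        decade = (start_year + day // 365) // 10 * 10
        if value > threshold:
            counts[decade] = counts.get(decade, 0) + 1
    return counts

# Synthetic 50-year daily record whose heavy-rain tail thickens over time.
random.seed(1)
series = [random.expovariate(1 / 8) * (1 + 0.004 * (d // 365))
          for d in range(50 * 365)]
print(extreme_counts_by_decade(series, start_year=1964))
```

Because the threshold is fixed from the whole record, a thickening tail shows up directly as rising counts in the later decades.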

Global warming can lead to increased precipitation because more heat in the atmosphere allows it to hold more water vapor, which, in turn, falls as rain.
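Behind this is the Clausius-Clapeyron relation: each degree Celsius of warming raises the atmosphere's saturation vapor pressure by roughly 6-7 percent, as a quick calculation with Bolton's widely used approximation shows (a standard empirical formula, not part of this study):

```python
import math

def saturation_vapor_pressure_hpa(t_celsius):
    """Saturation vapor pressure in hPa, via Bolton's (1980) approximation."""
    return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

# Warming the air from 15 C to 16 C raises its water-holding capacity ~6.6%.
increase = saturation_vapor_pressure_hpa(16.0) / saturation_vapor_pressure_hpa(15.0) - 1
print(f"{increase:.1%}")
```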

Torrents of rain not only lead to flooding, but can threaten public health, overwhelming sewage treatment plants and increasing microbial contaminants of water. More than half a million deaths were caused by rain-induced floods between 1980 and 2009.

Heavy rain can also cause landslides, damage crops, collapse buildings and bridges, wreck homes, and lead to chaos on roads and to transport, with huge financial losses.

Co-author Alberto Montanari, professor of hydraulic works and hydrology at the University of Bologna and president of the European Geoscience Union, said:

"Our results are in line with the assumption that the atmosphere retains more water under global warming. The fact that the frequency, rather than the magnitude, of extreme precipitation is significantly increasing has relevant implications for climate adaptation. Human systems need to increase their capability to react to frequent shocks."

The researchers screened data for quality and consistency, selecting the most robust and complete records from the 100,000 stations worldwide monitoring precipitation. Regions in South America and Africa were excluded from the study, as records for the study period were not complete or robust.

Papalexiou said planning for more frequent 'extreme' rain should be a priority for governments, local authorities and emergency services.

"If global warming progresses as climate model projections predict, we had better plan strategies for dealing with frequent heavy rain right now," said Papalexiou. "Our study of records from around the globe shows that potentially devastating bouts of extreme rain are increasing decade by decade.

"We know that rainfall-induced floods can devastate communities, and that there are implications of increasing bouts of heavy rain for public health, agriculture, farmers' livelihoods, the fishing industry and insurance, to name but a few."

Credit: 
University of Saskatchewan