Tech

Streamlining quantum information transmission

image: A tele-meeting with the results: from left, William J. Munro, Nicolò Lo Piparo, Kae Nemoto, Michael Hanks, and Claude Gravel.

Image: 
Kae Nemoto, Global Research Center for Quantum Information Science, the National Institute of Informatics in Japan

The quantum realm holds the key to the next revolution in communication technology. With the promise of unprecedented performance and impenetrable security, quantum technology is taking its first steps towards applications such as highly secure financial transactions conducted at nearly the speed of light. However, the ability of quantum computers to communicate with one another has been limited by the resources such exchanges require, constraining both the amount of information that can be exchanged and the length of time it can be stored.

Researchers based in Japan have taken a major step toward addressing these resource limitations. They published their findings on May 27 in Physical Review Letters.

"To connect remote quantum computers together, we need the capacity to perform quantum mechanical operations between them over very long distances, all while maintaining their important quantum coherence," said Professor Kae Nemoto, paper author and director of the Global Research Center for Quantum Information Science at the National Institute of Informatics (NII) in Japan.

"However, interestingly, while quantum computers have emerged at the small scale, quantum communication technology is still at the device level and has not been integrated together to realize communication systems. In this work, we show a route forward."

Quantum information must be protected from the significant amount of noise surrounding it, as well as from the tendency of information to be lost from the initial message. This protection process, called quantum error correction, entangles one piece of information across many qubits, the basic units of quantum information. Imagine a letter torn into nine pieces, each placed in an envelope and sent to the same destination to be re-assembled and read. In the quantum world, the envelopes are mailed via photons, and enough redundancy is spread across the envelopes that the entire letter can be recreated even if some of them are lost or destroyed.
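The redundancy idea can be illustrated with a toy example. The sketch below is a classical three-bit repetition code, not the quantum code used in the paper (genuine quantum error correction, such as Shor's nine-qubit code, protects quantum states without copying them); the error probability and trial count are arbitrary and purely illustrative.

```python
# Toy illustration (not the authors' scheme): a classical 3-bit repetition code.
# Real quantum error correction spreads one logical qubit across several physical
# qubits without copying it, but the redundancy idea is similar: the encoded
# message survives corruption of individual carriers.

import random

def encode(bit):
    """Spread one logical bit across three physical carriers."""
    return [bit, bit, bit]

def corrupt(codeword, p):
    """Flip each carrier independently with probability p (channel noise)."""
    return [b ^ 1 if random.random() < p else b for b in codeword]

def decode(codeword):
    """Majority vote recovers the logical bit if at most one carrier was flipped."""
    return int(sum(codeword) >= 2)

random.seed(0)
trials, p = 100_000, 0.05
errors_raw = sum(corrupt([1], p)[0] != 1 for _ in range(trials))
errors_coded = sum(decode(corrupt(encode(1), p)) != 1 for _ in range(trials))
print(f"uncoded error rate: {errors_raw / trials:.4f}")    # ~0.05
print(f"encoded error rate: {errors_coded / trials:.4f}")  # ~0.007, i.e. 3p^2 - 2p^3
```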

"The overhead to protect quantum information from noise and loss will be large, and the size of the required devices to realize this will cause serious problems, as we have started to see in today's quantum computer development," Nemoto said. "As the efforts to realize the quantum internet are occurring worldwide, it is important to think of it as a system, and not simple devices."

Nemoto and her team addressed this issue using a process called quantum multiplexing, in which they reduced not only noise, but also the number of resources needed to transmit information. In multiplexing, the information contained within two separate photons is combined into one photon, like two envelopes being sent in a portfolio, so the information is still individually protected but only one stamp is needed for transport.

"In this system, quantum error correction will play an essential role, not only of protecting the quantum information transmitted, but also for significantly reducing the necessary resources to achieve whatever tasks one needs," said paper co-author William J. Munro, a researcher at NTT's Basic Research Laboratories. "Quantum multiplexing enables significant resource reduction without requiring new technology to be developed for such quantum communication devices."

The researchers are currently extending their work to large-scale quantum complex network scenarios.

"The quantum revolution has allowed us to design and create new technologies previously thought impossible in our classical world," Nemoto said. "Small-scale quantum computers have already shown computing performance better than today's largest supercomputers. However, many other forms of quantum technology are emerging and one of the most profound could be the quantum internet - a quantum-enabled version of today's internet - which will allow us to network devices together, including quantum computers."

Next, the researchers will build upon the first steps they have already taken to increase both the amount of information and the storage time.

Credit: 
Research Organization of Information and Systems

Recognising fake images using frequency analysis

image: Images of people transformed into the frequency domain: the upper left corner represents low-frequency image areas, the lower right corner represents high-frequency areas. On the left, you can see the transformation of a photo of a real person: the frequency range is evenly distributed. The transformation of the computer-generated photo (right) contains a characteristic grid structure in the high-frequency range - a typical artefact.

Image: 
© RUB, Lehrstuhl für Systemsicherheit

They look deceptively real, but they are made by computers: so-called deep-fake images are generated by machine learning algorithms, and humans are pretty much unable to distinguish them from real photos. Researchers at the Horst Görtz Institute for IT Security at Ruhr-Universität Bochum and the Cluster of Excellence "Cyber Security in the Age of Large-Scale Adversaries" (Casa) have developed a new method for efficiently identifying deep-fake images. To this end, they analyse the images in the frequency domain, an established signal processing technique.

The team presented their work on 15 July 2020 at the International Conference on Machine Learning (ICML), one of the leading conferences in the field of machine learning. The researchers have also made their code freely available online at https://github.com/RUB-SysSec/GANDCTAnalysis, so that other groups can reproduce their results.

Interaction of two algorithms results in new images

Deep-fake images - a portmanteau of "deep learning" and "fake" - are generated with the help of computer models known as Generative Adversarial Networks, or GANs for short. Two algorithms work together in these networks: the first creates images based on random input data, while the second decides whether each image is a fake. If an image is judged to be a fake, the second algorithm tells the first to revise it - and the cycle repeats until the second algorithm no longer recognises the image as a fake.

In recent years, this technique has helped make deep-fake images more and more authentic. On the website http://www.whichfaceisreal.com, users can check if they're able to distinguish fakes from original photos. "In the era of fake news, it can be a problem if users don't have the ability to distinguish computer-generated images from originals," says Professor Thorsten Holz from the Chair for Systems Security.

For their analysis, the Bochum-based researchers used the data sets that also form the basis of the above-mentioned page "Which face is real". In this interdisciplinary project, Joel Frank, Thorsten Eisenhofer and Professor Thorsten Holz from the Chair for Systems Security cooperated with Professor Asja Fischer from the Chair of Machine Learning as well as Lea Schönherr and Professor Dorothea Kolossa from the Chair of Digital Signal Processing.

Frequency analysis reveals typical artefacts

To date, deep-fake images have been analysed using complex statistical methods. The Bochum group chose a different approach by converting the images into the frequency domain using the discrete cosine transform. The generated image is thus expressed as the sum of many different cosine functions. Natural images consist mainly of low-frequency functions.

The analysis has shown that images generated by GANs exhibit artefacts in the high-frequency range. For example, a typical grid structure emerges in the frequency representation of fake images. "Our experiments showed that these artefacts do not only occur in GAN generated images. They are a structural problem of all deep learning algorithms," explains Joel Frank from the Chair for Systems Security. "We assume that the artefacts described in our study will always tell us whether the image is a deep-fake image created by machine learning," adds Frank. "Frequency analysis is therefore an effective way to automatically recognise computer-generated images."
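As an illustration of the approach, the sketch below transforms a greyscale image with the discrete cosine transform and measures how much energy sits in the high-frequency corner of the spectrum, where GAN artefacts tend to appear. This is only a minimal sketch of the idea, not the authors' pipeline (their code is at the GitHub link above); the cutoff parameter and synthetic test image are arbitrary choices.

```python
# Minimal frequency-domain check: DCT an image and compare the energy in the
# high-frequency corner with the total spectral energy.

import numpy as np
from scipy.fft import dctn

def dct_spectrum(image):
    """2-D discrete cosine transform; low frequencies land in the upper-left corner."""
    return dctn(image.astype(float), norm="ortho")

def high_frequency_fraction(image, cutoff=0.75):
    """Fraction of spectral energy beyond `cutoff` of the frequency range in both axes."""
    spec = np.abs(dct_spectrum(image))
    h, w = spec.shape
    energy = spec ** 2
    high = energy[int(cutoff * h):, int(cutoff * w):].sum()
    return high / energy.sum()

# Quick check on a synthetic smooth image: almost all energy sits at low frequencies.
img = np.outer(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
print(f"high-frequency energy fraction: {high_frequency_fraction(img):.2e}")

# Usage sketch: an unusually large high-frequency fraction, or a visible grid in
# np.log1p(np.abs(dct_spectrum(img))), would flag a candidate deep fake.
```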

Credit: 
Ruhr-University Bochum

Polymers self-assembling like links of a chain for innovative materials

image: Nano-[2]catenane, nano-[5]catenane and nano-poly[22]catenane

Image: 
Politecnico di Torino

The "nanolimpiad", as the complex structure of self-assembled links has been named, looks like the five Olympic Rings and may become the source of new polymeric materials with innovative properties.

A collaboration of research groups from Japan (Chiba University), Italy (Politecnico di Torino), Switzerland (SUPSI) and the UK (Keele University, the Diamond Light Source and the ISIS Pulsed Neutron and Muon Source) has succeeded in developing and studying supramolecular poly-catenanes: hierarchical structures composed of mechanically interlocked self-assembled rings, made solely from one elementary molecular ingredient.

In 2016, the Nobel Prize in Chemistry was awarded to Ben Feringa, Fraser Stoddart and Jean-Pierre Sauvage for the design and synthesis of molecular machines; Sauvage had managed to connect two ring-shaped molecules into what is called a "catenane". Unlike ordinary polymers, which consist of monomers united via covalent chemical bonds, catenanes are composed of interconnected units mechanically interlocked like rings in a chain. This allows the links to move relative to each other, giving these materials unique properties in terms of absorption, conversion and dissipation of energy, super-elasticity, and more.

The synthesis and characterisation of such structures are notoriously difficult, particularly when the fundamental rings themselves are not held together by strong covalent bonds.

This research work, led by Shiki Yagai (Chiba University), has just been published in the journal Nature.

It is the first report of the creation of nano-poly[n]catenanes via molecular self-assembly, without the use of templates or other supporting materials. By altering the self-assembly conditions, the scientists were able to create intricate structures, including a nano-[5]catenane with interlocked rings in a linear arrangement, which they named "nanolympiadane" in homage to the [5]catenane system "olympiadane" first reported by Fraser Stoddart and colleagues in 1994, itself so called for its similarity to the well-known symbol of the Olympic games.

The scientists were able to probe these impressive structures composed of nano-rings using atomic force microscopy, X-ray and neutron scattering.

Each component ring (a nano-toroid roughly 30 nm in diameter) comprises around 600 identical small molecules (monomers). These monomers first spontaneously assemble into six-membered flat "rosettes", which then stack on each other to form a ring. The consortium designed methods to purify the rings, removing any material that had not assembled as desired, and found that adding such rings to the hot monomer solution facilitates the formation of new assemblies on the surface of the rings, a process known as secondary nucleation. Building on this finding, they added monomers sequentially and were able to create poly[n]catenanes with up to 22 rings.
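A rough back-of-envelope reading of those figures, purely illustrative and not a calculation reported by the authors, is sketched below.

```python
# Approximate geometry implied by the reported numbers (illustrative estimates only).
import math

monomers_per_toroid = 600
monomers_per_rosette = 6
diameter_nm = 30

rosettes_per_ring = monomers_per_toroid / monomers_per_rosette  # ~100 rosettes per ring
circumference_nm = math.pi * diameter_nm                        # ~94 nm around the ring
spacing_nm = circumference_nm / rosettes_per_ring               # ~0.9 nm between adjacent rosettes
print(rosettes_per_ring, round(circumference_nm, 1), round(spacing_nm, 2))
```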

Multiscale molecular simulations carried out by the research group of Giovanni M. Pavan, full professor at Politecnico di Torino, were instrumental in understanding the formation of the poly-catenanes. This computational work was supported by the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme, by the Swiss National Science Foundation (SNSF), and by the Swiss National Supercomputing Centre (CSCS) and CINECA, which provided the needed computational resources.

The simulations made it possible to model the secondary nucleation taking place on the surface of the rings and, together with small-angle scattering experiments, to characterize this process.

The molecular simulations showed that the prime trigger for the nucleation and growth of new rings on pre-existing ones is the monomers' limited solubility in the solvent, which causes monomers and rosettes to stick to the surface of pre-formed toroids.

Stepwise addition of monomers allowed the authors to greatly improve the extent of interlocking of the rings, generating unprecedentedly long poly-catenanes.

The size of these interlocked structures will allow in-depth study of the unique physical properties that a structure made up of minuscule interlocked chain links may have, and exploration of their potential for creating new types of molecular machines and active materials.

Credit: 
Politecnico di Torino

ENSO influences trans-Pacific ozone transport from vegetation fires in Southeast Asia

image: Conceptual schematic of the trans-Pacific O3 enhancement induced by ENSO and ENSO-mediated vegetation fires in Southeast Asia in spring. The vertical cross-section shows the O3 plumes and wind flows (grey lines show the climatological average; red dashed lines show the ENSO-induced wind anomalies in El Niño spring). Red and brown bold lines show the main transport pathways of biomass burning plumes from the Indochinese Peninsula and Indonesia, respectively. Blue and pink areas over the Pacific indicate the SST anomaly between El Niño and La Niña years.

Image: 
©Science China Press

Long-range transboundary transport of air pollutants such as ozone is a major environmental concern globally. Previous studies on trans-Pacific transport of air pollutants have mainly focused on the influence of anthropogenic fossil fuel combustion sources in Asia, especially from China. This study reveals that El Niño-Southern Oscillation (ENSO)-modulated vegetation fires in Southeast Asia, rather than fossil fuel plumes from China, dominate the springtime trans-Pacific transport of ozone across the entire North Pacific Ocean.

Ozone, produced mainly by photochemical reactions, is one of the main secondary pollutants in many countries. It also plays a key role in atmospheric chemistry and climate change because of its dominant role in the atmosphere's oxidizing capacity and the cycling of reactive trace gases in the troposphere. With a lifetime of a few weeks in the free troposphere, ozone can be transported on regional to intercontinental scales. Previous studies attributed the long-range transport of ozone and its precursors from Asia to anthropogenic fossil fuel combustion sources, particularly those from China.

Based on satellite retrievals and ground-based measurements, this study detected strong anomalies of ozone and its precursors in El Niño and La Niña springs. During El Niño springs, intensified fires in Southeast Asia produce enhanced ozone plumes that stretch over 15,000 km, from the South China Sea northeastward to southwestern North America and the Gulf of Mexico, in both the lower-middle and upper troposphere. The enhancement is also seen in the in-situ measurements at Mauna Loa Observatory (MLO) in Hawaii, the best-known Global Atmosphere Watch station, with long continuous records of ambient baseline carbon dioxide and ozone.

The study then applied a global-scale numerical model to explore the mechanism and to quantify the impact through a series of experiments. Both the ENSO-induced circulation anomalies and the enhanced vegetation fire emissions in Southeast Asia contribute to the ozone and carbon monoxide anomalies stretching from subtropical Asia to North America in El Niño spring; however, the year-to-year difference in fire emissions modulated by ENSO makes a stronger contribution than the anomalies due to meteorological variability alone. The study also reveals completely different transport pathways over the Pacific Ocean for the enhanced fire emissions from the Indochinese Peninsula and Indonesia: via the lower-to-middle troposphere and the upper troposphere, respectively.

The study proves once more how complex the interactions in the climate system are. It highlights the importance of continuous measurements in the remote North Pacific Ocean for characterizing the impacts from both natural climate variability and human activity, and also suggests that better ENSO forecasting could improve modelling of continental-scale long-range transport of air pollutants.

Credit: 
Science China Press

Breakthrough in studying ancient DNA from Doggerland that separates the UK from Europe

image: The sediment from which the sedaDNA was studied

Image: 
Dr Martin Bates, UWTSD

Thousands of years ago, the UK was physically joined to the rest of Europe through an area known as Doggerland. A marine inundation during the mid-Holocene then separated the British landmass from the rest of Europe; Doggerland is now covered by the North Sea.

Scientists from the School of Life Sciences at the University of Warwick have studied sedimentary ancient DNA (sedaDNA) from sediment deposits in the southern North Sea, an area which has not previously been linked to a tsunami that occurred 8150 years ago.

The paper, 'Multi-Proxy Characterisation of the Storegga Tsunami and Its Impact on the Early Holocene Landscapes of the Southern North Sea', published in the journal Geosciences, was led by the University of Bradford and involved the Universities of Warwick, St Andrews, Cork, Aberystwyth and Tartu, the University of Wales Trinity Saint David, the Smithsonian and the Natural History Museum. Life scientists from the University of Warwick worked specifically on the sedimentary ancient DNA from Doggerland.

The University of Warwick scientists achieved a number of innovative breakthroughs in analysing the sedaDNA. One was the concept of biogenomic mass: for the first time, they were able to see how biomass changes with events. Evidence presented in the paper includes the large woody mass of trees swept up by the tsunami, detected in the DNA of the ancient sediment.

New ways of authenticating the sedaDNA were also developed, as current authentication methods do not apply to sedaDNA that has been damaged over thousands of years under the sea, because there is too little information for each individual species. The researchers therefore devised a new metagenomic assessment methodology, in which the characteristic damage found at the ends of ancient DNA molecules is analysed collectively across all species rather than one at a time.
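As a minimal sketch of that idea (not the authors' published pipeline), the snippet below pools the tell-tale terminal C-to-T damage signal across all aligned reads, regardless of which organism each read maps to; the input format and window size are assumptions made purely for illustration.

```python
# Pool ancient-DNA damage signal across all taxa: C->T mismatch frequency by
# distance from the 5' read end, aggregated over every aligned read.

from collections import defaultdict

def terminal_ct_profile(aligned_pairs, window=15):
    """aligned_pairs: iterable of (read_sequence, reference_sequence) pairs."""
    ct = defaultdict(int)       # position -> C->T mismatches observed
    c_total = defaultdict(int)  # position -> reference C sites observed
    for read, ref in aligned_pairs:
        for i, (r, q) in enumerate(zip(ref[:window], read[:window])):
            if r == "C":
                c_total[i] += 1
                if q == "T":
                    ct[i] += 1
    return [ct[i] / c_total[i] if c_total[i] else 0.0 for i in range(window)]

# Ancient, deaminated DNA typically shows an elevated C->T rate at the first few
# positions that decays with distance from the read end; modern DNA does not.
print(terminal_ct_profile([("TTAGC", "CTAGC"), ("CTAGC", "CTAGC")]))
```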

Alongside this, a key part of analysing the sedaDNA is determining whether it was deposited in situ or has moved over time. This led the researchers to develop statistical methods to establish which scenario applied: by assessing the vertical movement of the biomolecules within the core column and the stratigraphic integrity of the deposits, they determined that the sedaDNA had not moved substantially since deposition.

Identifying which organisms the ancient, fragmented molecules of DNA came from is also challenging, because often there is nothing to compare them with directly. In a fourth innovation, the researchers refined algorithms to define these regions of "dark phylogenetic space" from which the organisms must have originated, overcoming this issue.

Professor Robin Allaby from the School of Life Sciences at the University of Warwick comments: "This study represents an exciting milestone for sedimentary ancient DNA studies establishing a number of breakthrough methods to reconstruct an 8,150 year old environmental catastrophe in the lands that existed before the North Sea flooded them away into history."

Professor Vince Gaffney from the School of Archaeological and Forensic Sciences at the University of Bradford said: "Exploring Doggerland, the lost landscape underneath the North Sea, is one of the last great archaeological challenges in Europe. This work demonstrates that an interdisciplinary team of archaeologists and scientists can bring this landscape back to life and even throw new light on one of prehistory's great natural disasters, the Storegga Tsunami.

"The events leading up to the Storegga tsunami have many similarities to those of today. Climate is changing and this impacts on many aspects of society, especially in coastal locations."

Credit: 
University of Warwick

COVID-19 lockdown reduced dangerous air pollutants in five Indian cities by up to 54 percent

A team of 10 interdisciplinary researchers from the University of Surrey's renowned Global Centre for Clean Air Research (GCARE), including PhD students and post-doctoral researchers, has come together to produce a rapid assessment of the impact COVID-19 has had on air quality.

Figures from the World Health Organisation show the ongoing pandemic had caused more than 477,000 deaths worldwide as of June 2020, 14,000 of which occurred in India. On 25 March 2020, a complete lockdown of internal and external borders, together with social isolation measures, came into effect in India, affecting the lives and mobility of its population of 1.3 billion.

In this recent study, published in Sustainable Cities and Society, researchers from Surrey's GCARE studied the levels of harmful fine particulate matter (PM2.5) from vehicular and non-vehicular sources in five Indian cities - Chennai, Delhi, Hyderabad, Kolkata, and Mumbai - from the start of the lockdown period until 11 May 2020. The team analysed the PM2.5 distribution and contextualised their findings against those from other cities across the world. They also explored potential factors behind the divergent concentration changes in different cities, as well as aerosol loadings at the regional scale.

In their work, titled "Temporary reduction in fine particulate matter due to 'anthropogenic emissions switch-off' during COVID-19 lockdown in Indian cities", the GCARE team compared these lockdown air pollution figures with those from similar periods of the preceding five years.

The results showed that the lockdown reduced concentrations of harmful particles across all five cities, from a 10% reduction in Mumbai up to a 54% reduction in Delhi. These reductions in PM2.5 were found to be comparable to reductions in other cities across the world, such as in Vienna (60%) and Shanghai (42%).
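As a minimal sketch of the kind of comparison described above, the snippet below computes the percentage reduction of a lockdown-period PM2.5 mean relative to the mean of the same calendar window in the five preceding years; the numbers are invented for illustration and are not the study's data.

```python
# Percent reduction of a 2020 lockdown-window mean versus a 2015-2019 baseline mean.
import numpy as np

def percent_reduction(lockdown_mean, baseline_means):
    """Percent change of the lockdown mean relative to the multi-year baseline mean."""
    baseline = np.mean(baseline_means)
    return 100.0 * (baseline - lockdown_mean) / baseline

# Hypothetical PM2.5 values (ug/m3), purely for illustration:
print(percent_reduction(60.0, [125.0, 130.0, 128.0, 135.0, 132.0]))  # ~54%
```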

The team from GCARE also investigated the monetary value of avoided premature mortality due to reduced PM2.5 concentrations, and calculated that the reduction may have saved 630 people from premature death and $690 million in health costs in India.

The team point out that the present lockdown situation offers observational opportunities regarding potential control systems and regulations for improved urban air quality. An integrated approach might help in understanding overall impacts of COVID-19 lockdown-style interventions and support the implementation of relevant policy frameworks.

Professor Prashant Kumar, Director of GCARE at the University of Surrey, said: "COVID-19 has had a devastating effect on the lives and livelihoods of billions across the world. This tragic global event has allowed us to quantify the impact that human activity has had over our environment and, in particular, our air quality.

"While the reduction in PM2.5 pollution may not be surprising, the size of the reduction should make us all take notice of the impact we have been having on the planet. This is an opportunity for us all to discuss and debate what the 'new normal' should look like - particularly when it comes to the quality of the air we breathe."

This work is supported by the NERC-funded ASAP-Delhi project (NE/P016510/1) and the EPSRC-funded INHALE project (EP/T003189/1). It builds upon GCARE's previous work on the efficacy of odd-even trials in Delhi, on the key pollution challenges of land-locked Delhi and other Indian megacities and proposed mitigation strategies, on a method for mapping the spatial distribution of pollutants across Delhi, and on the group's most recent long-term assessment of ambient particulate matter and trace gases in the Delhi-NCR region.

Credit: 
University of Surrey

Why governments have the right to require masks in public

Requirements for consumers to wear masks at public places like retail stores and restaurants are very similar to smoking bans, according to three university experts.

In a paper published today (July 16, 2020) in the American Journal of Preventive Medicine, the professors say mask requirements to stop the spread of COVID-19 should be considered "fundamental occupational health protections" for workers at stores, restaurants and other public places.

"Both tobacco smoke and COVID-19 are air-based health hazards to workers who may be exposed to them for hours on end," said Michael Vuolo, co-author of the paper and associate professor of sociology at The Ohio State University.

"Requiring that members of the public wear masks is a form of workplace protection."

Vuolo, who researches the effectiveness of smoking bans, wrote the article with Brian Kelly, a professor of sociology at Purdue University who is an expert on health policy, and Vincent Roscigno, a professor of sociology at Ohio State who is an expert on labor and worker rights.

The main argument typically made against a mask requirement, as was the case with smoking bans, is that it violates the individual liberties of Americans.

"But even the strictest individual liberty philosophies still recognize that those liberties only go to the point of harm against others," Vuolo said.

"It is clear that COVID-19 is a threat to workers who may be exposed to it and mask wearing can help minimize that threat."

The issue is also one of inequality, because many of the workers in service and retail industries are people who earn lower wages and are racial and ethnic minorities.

Mask requirements may be a key means to reduce the already evident inequalities in who gets COVID-19, the researchers said.

The risks of contracting COVID-19 for workers are, in some ways, even more insidious than those related to smoking, Vuolo noted.

"The risk from smokers is clear. But workers don't know who may have COVID-19 and who doesn't. That makes mask requirements for everyone even more important," he said.

Many business owners enforce smoking bans even when not required by law for a very good reason, according to Vuolo.

"Research has shown that workplace productivity is higher in workplaces that are seen as healthy and safe."

Vuolo said it is important to remember how controversial smoking bans were when they were first implemented. Now, they are hardly mentioned.

"No one is out there policing smoking for the most part. Health authorities could if they had to, but it is usually not necessary," he said.

"The way we got people to stop smoking in public was simply to make it abnormal. We could do a similar thing by making it abnormal not to wear a mask," he said.

If mask-wearing is required, it could become as normalized in the United States as it is in East Asia. At some point, people may even consider wearing masks during normal flu seasons, Vuolo said.

But until that time, we need legal requirements to protect workers, according to the authors.

"Wearing a mask may seem like a nuisance, just like having to step outside to smoke may seem like a nuisance," Vuolo said.

"But both are a small inconvenience when compared to workers' rights to a safe work environment."

Credit: 
Ohio State University

Air pollution from wildfires linked to higher death rates in patients with kidney failure

Highlight

Exposure to higher amounts of fine particulate air pollution was associated with higher death rates among patients with kidney failure.

Washington, DC (July 16, 2020) -- New research suggests that individuals with kidney failure may face a higher risk of dying prematurely if they're exposed to air pollution from wildfires. The findings appear in an upcoming issue of JASN.

Wildfires generate high levels of tiny particles of air pollution--called fine particulate matter--that can have a range of effects on health. When inhaled, fine particulate matter can travel into the respiratory tract and bloodstream and trigger oxidative stress and inflammation. Because of their frailty, patients with kidney failure might be especially susceptible to this environmental stressor, but little is known about the effects of air pollution exposures in these individuals.

To investigate, a team led by Ana Rappold, PhD (the US Environmental Protection Agency), along with Yuzhi Xi, MSPH and Abhijit V. Kshirsagar, MD, MPH (University of North Carolina at Chapel Hill), analyzed information from 253 US counties near a major wildfire between 2008 and 2012.

"This study was possible because the US Renal Disease System, a registry of patients with kidney failure, included vital records on almost all US patients receiving in-center hemodialysis, as well as the counties of the dialysis clinics. Secondly, we utilized an air quality model to estimate daily exposure to wildfire fine particulate matter across the country at the counties of the dialysis units," explained Ms. Xi.

The researchers found 48,454 deaths among patients with kidney failure who were receiving dialysis in the 253 counties. Each 10 μg/m3 increase in the concentration of wildfire fine particulate matter in the air was associated with a 4% higher death rate on the same day and a 7% higher rate over the next month. On days with wildfire fine particulate matter greater than 10 μg/m3, exposure to the pollution accounted for 8.4% of daily mortality.
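To illustrate how such a figure scales, the sketch below applies the reported same-day rate ratio of 1.04 per 10 μg/m3 to a hypothetical smoke day and converts it into an attributable fraction of deaths, assuming the association scales log-linearly with concentration; the concentration used is made up, and this is not the authors' analysis.

```python
# Translate a rate ratio per 10 ug/m3 into excess risk and attributable mortality.

def rate_ratio(pm25, rr_per_10=1.04):
    """Multiplicative increase in the death rate at a given wildfire PM2.5 level,
    assuming log-linear scaling of the per-10-ug/m3 rate ratio."""
    return rr_per_10 ** (pm25 / 10.0)

def attributable_fraction(rr):
    """Share of deaths attributable to the exposure at that rate ratio."""
    return (rr - 1.0) / rr

rr = rate_ratio(25.0)                      # hypothetical 25 ug/m3 wildfire-smoke day
print(f"rate ratio: {rr:.3f}")             # ~1.103
print(f"attributable fraction: {attributable_fraction(rr):.3f}")  # ~0.093
```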

"The findings highlight the impact of air pollution exposure in individuals receiving hemodialysis, and they support the need for more research to develop and implement interventions to manage exposure during wildfire smoke episodes in this population," said Dr. Rappold.

Credit: 
American Society of Nephrology

Membrane technology could cut emissions and energy use in oil refining

image: New membrane technology could reduce carbon emissions and energy intensity associated with refining crude oil. Georgia Tech Associate Professor Ryan Lively shows a module containing the new membrane material, while Professor M.G. Finn holds vials containing some of the other polymers used in this study.

Image: 
Christopher Moore, Georgia Tech

New membrane technology developed by a team of researchers from the Georgia Institute of Technology, Imperial College London, and ExxonMobil could help reduce carbon emissions and energy intensity associated with refining crude oil. Laboratory testing suggests that this polymer membrane technology could replace some conventional heat-based distillation processes in the future.

Fractionation of crude oil mixtures using heat-based distillation is a large-scale, energy-intensive process that accounts for nearly 1% of the world's energy use: 1,100 terawatt-hours per year (TWh/yr), which is equivalent to the total energy consumed by the state of New York in a year. By substituting the low-energy membranes for certain steps in the distillation process, the new technology might one day allow implementation of a hybrid refining system that could help reduce carbon emissions and energy consumption significantly compared to traditional refining processes.

"Much in our modern lives comes from oil, so the separation of these molecules makes our modern civilization possible," said M.G. Finn, professor and chair of Georgia Tech's School of Chemistry and Biochemistry. Finn also holds the James A. Carlos Family Chair for Pediatric Technology. "The scale of the separation required to provide the products we use is incredibly large. This membrane technology could make a significant impact on global energy consumption and the resulting emissions of petroleum processing."

To be reported in the July 17 issue of the journal Science, the paper is believed to be the first report of a synthetic membrane specifically designed for the separation of crude oil and crude-oil fractions. Additional research and development will be needed to advance this technology to industrial scale.

Membrane technology is already widely used in such applications as seawater desalination, but the complexity of petroleum refining has until now limited the use of membranes. To overcome that challenge, the research team developed a novel spirocyclic polymer that was applied to a robust substrate to create membranes able to separate complex hydrocarbon mixtures through the application of pressure rather than heat.

Membranes separate molecules from mixtures according to differences such as size and shape. When molecules are very close in size, that separation becomes more challenging. Using a well-known reaction for making bonds between nitrogen and carbon atoms, the researchers constructed the polymers by connecting building blocks with a kinked structure, creating disordered materials with built-in void spaces.

The team was able to balance a variety of factors to create the right combination of solubility - to enable membranes to be formed by simple and scalable processing - and structural rigidity - to allow some small molecules to pass through more easily than others. Unexpectedly, the researchers found that the materials needed a small amount of structural flexibility to improve size discrimination, as well as the ability to be slightly "sticky" toward certain types of molecules that are found abundantly in crude oil.

After designing the novel polymers and achieving some success with a synthetic gasoline, jet fuel, and diesel fuel mixture, the team decided to try to separate a crude oil sample and discovered that the new membrane was quite effective at recovering gasoline and jet fuel from the complex mixture.

"We were initially trying to fractionate a mixture of molecules that were too similar," said Ben McCool, a senior research associate at ExxonMobil and one of the paper's coauthors. "When we took on a more complex feed, crude oil, we got fractionalization that looked like it could have come from a distillation column, indicating the concept's great potential."

The researchers worked collaboratively, with polymers designed and tested at Georgia Tech, then converted to 200-nanometer-thick films, and incorporated into membrane modules at Imperial using a roll-to-roll process. Samples were then tested at all three organizations, providing multi-lab confirmation of the membrane capabilities.

"We have the foundational experience of bringing organic solvent nanofiltration, a membrane technology becoming widely used in pharmaceuticals and chemicals industries, to market," said Andrew Livingston, professor of chemical engineering at Imperial. "We worked extensively with ExxonMobil and Georgia Tech to demonstrate the scalability potential of this technology to the levels required by the petroleum industry."

The research team created an innovation pipeline that extends from basic research all the way to technology that can be tested in real-world conditions.

"We brought together basic science and chemistry, applied membrane fabrication fundamentals, and engineering analysis of how membranes work," said Ryan Lively, associate professor and John H. Woody faculty fellow in Georgia Tech's School of Chemical and Biomolecular Engineering. "We were able to go from milligram-scale powders all the way to prototype membrane modules in commercial form factors that were challenged with real crude oil - it was fantastic to see this innovation pipeline in action."

ExxonMobil's relationship with Georgia Tech goes back nearly 15 years and has produced innovations in other separation technologies, including a new carbon-based molecular sieve membrane that could dramatically reduce the energy required to separate a class of hydrocarbon molecules known as alkyl aromatics.

"Through collaboration with strong academic institutions like Georgia Tech and Imperial, we are constantly working to develop the lower-emissions energy solutions of the future," said Vijay Swarup, vice president of research and development at ExxonMobil Research and Engineering Company.

Credit: 
Georgia Institute of Technology

Genetics could help protect coral reefs from global warming

video: Researchers dive to collect coral samples on the Great Barrier Reef during the 2017 coral bleaching event.

Image: 
Australian Institute of Marine Science

Coral reefs are dying at an alarming rate as water temperatures rise worldwide as a result of global warming, pollution and human activities. In the last three decades, half of Australia's Great Barrier Reef has lost its coral cover.

A new study from Columbia University provides more evidence that genetic sequencing can reveal evolutionary differences in reef-building corals that one day could help scientists identify which strains could adapt to warmer seas.

The findings, published in Science on July 17, provide a window into the genetic processes that allow some corals to resist dramatic climate shifts, insights that could complement or improve current preservation efforts.

"We need to use as many tools as possible to intervene or we will continue to see coral reefs vanish," said Zachary Fuller a postdoctoral researcher in biology at Columbia and first author on the study. "Using genomics can help identify which corals have the capacity to live at higher temperatures and reveal genetic variants associated with climate resilience."

Coral reefs, found throughout the world in tropical oceans, are one of the most diverse and valuable ecosystems on Earth. They are actually living animal colonies and important for many reasons. Reefs provide a habitat for a large variety of marine species; protect coastlines from storms, floods and erosion; and help sustain the fishing and tourism industries.

In the late 1990s, reefs worldwide experienced their first wave of mass bleaching, which occurs when high water temperatures destroy the symbiotic relationship with a colony's colorful algae, causing the corals to turn white. The loss effectively starves them, as corals depend on the photosynthetic algae that live within their tissues for nutrients. Reefs can recover from bleaching, but prolonged periods of environmental stress can eventually kill them.

The Columbia research predicts, to some degree, which corals are likely to withstand unusually high temperatures and resist bleaching events.

"Genomics allows us to examine the genetic differences that could influence survival and bleaching tolerance, helping us work out how we might support coral health," said Molly Przeworski, professor in the Departments of Biological Sciences and Systems Biology at Columbia and senior study author.

To collect their genetic data, Fuller, Przeworski and their collaborators from the Australian Institute of Marine Science analyzed 237 samples collected at 12 locations along the Great Barrier Reef, generating the highest quality sequences of any corals to date. The sequencing allowed the researchers to look across the genome for signatures where adaptation occurred and to find genetically distinct variations associated with bleaching tolerance.

"What we discovered is that no single gene was responsible for differences in a coral's response to bleaching, but instead many genetic variants influence the trait," Fuller said. "On their own, each has a very small effect, but when taken together we can use all these variants to predict which corals may be able to survive in the face of hotter seas."

Fuller and Przeworski said the findings offer a pathway for coral biologists to search for strains that can better cope with ocean warming, and enable similar approaches to be used in other species most at risk from climate change.

"The best chance we have to save what's left of the Earth's coral reefs is to mitigate the effects of climate change by rapidly reducing greenhouse gas emissions," Fuller said. "In the meantime, genetic approaches may be able to buy us time."

Credit: 
Columbia University

Range of commercial infant foods has grown markedly in past seven years

The range of commercial foods on sale for babies has grown rapidly in recent years but their sugar levels are still too high, suggests research published in Archives of Disease in Childhood.

Although fewer foods are now being marketed to infants aged four months, there are far more snack foods for babies being sold and the sweetness of savoury foods designed for babies is a concern, researchers found.

Although parents are encouraged to offer home-made baby foods, 58% of UK babies are estimated to receive commercial baby foods between the ages of 6 and 12 months.

In their previous research, in 2013, a team of researchers from the University of Glasgow's College of Medical, Veterinary and Life Sciences reported concerns about the nutritional quality of these foods and the recommendations on their labels.

Experts recommend that the transition from an exclusively milk-based diet to solid foods for an infant should be a gradual process, starting at around 6 months. But the Glasgow team found that many commercial baby foods were marketed to infants from age 4 months, and nearly half the products were sweet.

For their latest study, the same researchers set out to assess how the baby food market in the UK has changed between 2013 and 2019, by carrying out a cross-sectional survey of all infant food products available to buy in the UK online and in-store in 2019.

Nutritional content and product descriptions were recorded and compared with an existing 2013 database.

They set out to measure changes in the proportion of products marketed to infants aged 4 months, the proportion classified as sweet versus savoury, and the average sugar content of spoonable versus dry (snack) products.

Overall, the range of commercial baby food products had grown, with 84% more brands and twice as many products compared with 2013.

Results showed there were 32 brands selling baby foods, including 27 brands that were not included in the 2013 survey.

In 2019, a total of 898 commercial baby foods were identified. Of these, 611 (68%) were spoonable products, mostly packed in pouches (54%), while 253 (28%) were dry products.

The researchers focused on 865 products overall for their analysis to be in line with their 2013 survey.

Analysis of the results showed that a smaller proportion of products were described as suitable for infants aged 4 months in 2019 (201 products, or 23%) than in 2013 (178, or 43%), while the proportion aimed at the 6-7-month age range increased from 135 (33%) in 2013 to 369 (43%) in 2019.

The proportion of sweet and savoury products was unchanged while sweet spoonable products showed a small but significant decrease in sugar content (6%) between 2013 and 2019.

However, savoury spoonable products showed a 16% increase in sugar content.

Sweet snacks remained very sweet; in the 2019 data, concentrated juice was added to 29% of products, and 18% of "savoury" products consisted of more than 50% sweet vegetables or fruit.

The number and proportion of snacks increased markedly in 2019, to 185 compared with just 42 in 2013, while the proportion of wet spoonable foods decreased from 79% in 2013 to 71% in 2019.

The researchers said that although clinical evidence was currently unavailable, the health consequences of snacking for baby feeding skills, liquid/milk intake and continued exposure to sugars in the oral cavity were likely to have implications for healthy eating guidelines.

Further research on the prevalence and extent of these marketing strategies was required, they say, and there may be a need for tighter regulations on packaging to discourage the use of baby snacks.

They conclude: "The product range of commercial infant foods has expanded dramatically in the last 7 years, both in the number of brands and the types of products. Fewer foods are now marketed to infants aged 4 months, but the increase in snack foods and the sweetness of savoury foods is a concern."

Credit: 
BMJ Group

Avoiding food contamination with a durable coating for hard surfaces

image: A new study from the University of Missouri demonstrates that a durable coating -- made from titanium dioxide -- is capable of eliminating foodborne germs, such as salmonella and E. coli, and provides a preventative layer of protection against future cross-contamination on stainless steel food-contact surfaces.

Image: 
University of Missouri

In the future, a durable coating could help keep food-contact surfaces clean in the food processing industry, including in meat processing plants. A new study from a team of University of Missouri engineers and food scientists demonstrates that the coating -- made from titanium dioxide -- is capable of eliminating foodborne germs, such as salmonella and E. coli, and provides a preventative layer of protection against future cross-contamination on stainless steel food-contact surfaces.

The study was conducted by Eduardo Torres Dominguez, who is pursuing a doctorate in chemical engineering in the MU College of Engineering, and includes a team of researchers from the College of Engineering and the MU College of Agriculture, Food and Natural Resources. Dominguez is also a Fulbright scholar.

"I knew that other researchers had developed antimicrobial coatings this way, but they hadn't focused on the coatings' mechanical resistance or durability," Dominguez said. "In the presence of ultraviolet light, oxygen and water, the titanium dioxide will activate to kill bacteria from the food contact surfaces on which it is applied. Although the coating is applied as a liquid at the beginning of the process, once it is ready to use it becomes a hard material, like a thin layer of ceramic."

Heather K. Hunt, an associate professor in the College of Engineering and one of Dominguez's advisors, guided Dominguez through the process of finding, selecting, synthesizing and characterizing the titanium dioxide material -- a known disinfecting agent that is also food safe.

"We picked this material knowing it would have good antimicrobial behavior, and we strengthened its mechanical stability to withstand normal wear and tear in a typical food processing environment," said Hunt, whose appointment is in the Department of Biomedical, Biological and Chemical Engineering. "In addition to normal cleaning procedures, our coating can add an additional layer of prevention to help stop the spread of foodborne contamination."

Once Dominguez developed the coating, Azlin Mustapha, a professor in the College of Agriculture, Food and Natural Resources' Food Science program and Dominguez's other advisor, helped him optimize its antimicrobial, or disinfecting, properties. Matt Maschmann, an assistant professor in the Department of Mechanical and Aerospace Engineering in the College of Engineering, helped Dominguez optimize the material's durability through hardness testing.

Mustapha is encouraged by the group's progress as this could be a way to deter the spread of foodborne germs in a food processing environment.

"This will not only be helpful in the raw food processing lines of a processing plant but also ready-to-eat food lines, like deli counters, as well," Mustapha said. "All surfaces in a food processing plant that come into contact with food are prone to be contaminated by foodborne germs spread by the handling of a contaminated food product."

The researchers said this is the first step toward future testing of the coating's properties in a real-world environment. Although the team has not tested the coating against the novel coronavirus, Hunt and Mustapha believe its durability and disinfecting qualities give it the potential to help limit the spread of the virus in food processing environments. So far, it has been shown to be effective against a strain of E. coli that can be deadly in people, and more work is being done to test the coating against other disease-causing bacteria.

The study, "Design and characterization of mechanically stable, nanoporous TiO2 thin film antimicrobial coatings for food contact surfaces," was published in Materials Chemistry and Physics. Co-authors include Phong Nguyen at MU and Annika Hylen at St. Louis University. Funding was provided by the graduate fellowship program of the Fulbright Program and the Comision Mexico-Estados Unidos para el Intercambio Educativo y Cultural (COMEXUS). The content is solely the responsibility of the authors and does not necessarily represent the official views of the funding agencies.

Credit: 
University of Missouri-Columbia

Gel that breaks down, puts itself back together could improve delivery of oral drugs

image: Microrheological characterization of covalent adaptable hydrogel degradation in response to temporal pH changes that mimic the gastrointestinal tract, Nan Wu et al.

An artistic rendering shows covalent adaptable hydrogel degradation in response to pH changes over time that mimic the gastrointestinal tract. The yellow dots represent the particles in the gel used to measure this process in microrheological experiments.

Image: 
Courtesy of Soft Matter/Illustration by Sayo Studio LLC

An emerging hydrogel material with the capacity to degrade and spontaneously reform in the gastrointestinal tract could help researchers develop more effective methods for oral drug delivery.

“The majority of drugs and nutrients are absorbed into the body in the intestines, but to get there, they have to traverse the stomach—a very acidic, harsh environment that can interfere with the active molecules in pharmaceuticals,” says Kelly Schultz, an associate professor of chemical and biomolecular engineering in Lehigh University’s P.C. Rossin College of Engineering and Applied Science.

Schultz and fourth-year chemical engineering PhD student Nan Wu are studying covalent adaptable hydrogels (CAHs), which are being designed to release molecules as they lose polymer in the stomach but then re-gel on their own, which protects the molecules and allows them to stay active for targeted delivery in the intestines. The team’s microrheology research is featured in an article and inside cover illustration in the current issue of Soft Matter.

To characterize the material and provide insight into its pharmaceutical potential, Wu has repurposed a microfluidic device originally developed in Schultz’s lab for research into fabric and home care products to create a “GI tract-on-a-chip.” The experimental setup allows her to exchange the fluid environment around the gel to mimic the pH environment of all the organs in the GI tract, simulating how the material would react over time if ingested.   

Using microrheology, Wu collects microscopy data and measures how much particles within the gel wiggle, with some experiments taking hours and others spanning days, depending on the digestive organ she is replicating. Wu tracks the particles using an algorithm that yields scientifically meaningful information on the properties of the material, which was originally developed by University of Colorado at Boulder professor Kristi S. Anseth.
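The core quantity in such particle-tracking measurements is the mean-squared displacement (MSD) of the probe particles. The sketch below computes an ensemble MSD from simulated trajectories; it is a minimal illustration rather than the lab's actual analysis code, and the trajectory statistics are invented.

```python
# Ensemble mean-squared displacement from tracked particle positions. A log-log MSD
# slope near 1 indicates freely diffusing probes (a degraded, liquid-like sample);
# a slope near 0 indicates probes trapped in an intact gel.

import numpy as np

def ensemble_msd(trajectories, max_lag):
    """trajectories: array of shape (n_particles, n_frames, 2) of x, y positions."""
    msd = np.empty(max_lag)
    for lag in range(1, max_lag + 1):
        disp = trajectories[:, lag:, :] - trajectories[:, :-lag, :]
        msd[lag - 1] = np.mean(np.sum(disp ** 2, axis=-1))
    return msd

# Simulated Brownian trajectories as a stand-in for tracked probe particles.
rng = np.random.default_rng(1)
steps = rng.normal(0, 0.05, size=(50, 300, 2))   # 50 particles, 300 frames
trajectories = np.cumsum(steps, axis=1)

msd = ensemble_msd(trajectories, max_lag=50)
lags = np.arange(1, 51)
slope = np.polyfit(np.log(lags), np.log(msd), 1)[0]
print(f"log-log MSD slope: {slope:.2f}")          # ~1 for a freely diffusing (sol) state
```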

“CAHs exhibit unusual spontaneous re-gelation that is really surprising,” Schultz says. “Typically, gels won’t degrade and then reform without any added stimuli as these do. We’ve demonstrated viability of CAHs as means of oral drug and nutrient delivery, and now we’re starting to work on molecular release studies and adding in other components to make the experiments more complex.”

Wu has been investigating these materials over the course of her entire PhD studies, says Schultz. “She’s doing amazing work and is committed to understanding every aspect of the research.” 

Schultz’s research lab focuses on the characterization of colloidal and polymeric gel scaffolds and the development of new techniques to characterize these complex systems, which play important roles in fields such as health care and consumer products.  

“What we do in biomaterials is somewhat unique: There's a lot of work on the cross-linking chemistry and actually developing these materials, and there's a lot of animal research that implants and tests them, but there's not that much work in the middle. A great deal of mystery lies between designing a material and understanding what’s going on when it's working. We're trying to find new ways that we can replicate what’s going on inside of an animal or a person and collect important measurements to connect the dots and inform further studies.”

Research was supported in part by the National Institute of General Medical Sciences of the National Institutes of Health under award number R15GM119065. The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH.

About Kelly M. Schultz

Kelly M. Schultz is an associate professor in the Department of Chemical and Biomolecular Engineering at Lehigh University. She earned her BS in chemical engineering from Northeastern University (2006) and PhD in chemical engineering with Professor Eric Furst from the University of Delaware (2011) as a National Science Foundation graduate research fellow. While at Delaware, she was invited to speak in the American Chemical Society Excellence in Graduate Polymers Research Symposium and was selected as the Fraser and Shirley Russell Teaching Fellow. Following her PhD, she was a Howard Hughes Medical Institute postdoctoral research associate at the University of Colorado at Boulder working in the laboratory of Professor Kristi Anseth. As a postdoc, she was invited to participate in the Distinguished Young Scholars Summer Seminar Series at the University of Washington. She joined Lehigh as an assistant professor in 2013 and was a P.C. Rossin Assistant Professor from 2016 to 2018. Schultz was named one of TA Instruments Distinguished Young Rheologists in 2014 and was recognized by the NSF Faculty Early Career Development (CAREER) Program in 2018. She received Lehigh’s Libsch Early Career Research Award in 2019 and the Excellence in Research Scholarship & Leadership Award from the P.C. Rossin College of Engineering and Applied Sciences in 2020. Schultz and her research group study emerging hydrogel materials developed for biological applications, such as wound healing and tissue regeneration.

Related Links:

Faculty Profile: Kelly Schultz
Soft Matter: Microrheological characterization of covalent adaptable hydrogel degradation in response to temporal pH changes that mimic the gastrointestinal tract
Rossin College: Schultz Lab

Credit: 
Lehigh University

Heat stress: The climate is putting European forests under sustained pressure

image: In a forest near Basel researchers study the effects of climate change on the most important and sensitive part of the trees - the canopy. A total of 450 trees between 50 and 120 years old grow on the 1.6 hectare research area.

Image: 
University of Basel

No year since weather records began was as hot and dry as 2018. A first comprehensive analysis of the consequences of this drought and heat event shows that central European forests sustained long-term damage. Even tree species considered drought-resistant, such as beech, pine and silver fir, suffered. The international study was directed by the University of Basel, which is conducting a forest experiment unique in Europe.

Until now, 2003 had been the driest and hottest year since regular weather records began. That record has now been broken. A comparison of climate data from Germany, Austria and Switzerland shows that 2018 was significantly warmer: the average temperature during the vegetation period was 1.2°C above the 2003 value and a full 3.3°C above the average of the years 1961 to 1990.

Part of the analysis, which has now been published, includes measurements taken at the Swiss Canopy Crane II research site in Basel, where extensive physiological investigations were carried out in tree canopies. The goal of these investigations is to better understand how and when trees are affected by a lack of water in order to counter the consequences of climate change through targeted management measures.

When trees die of thirst

Trees lose a lot of water through their surfaces. If the soil also dries out, the tree cannot replace this water, which is shown by the negative suction tension in the wood's vascular tissue. It's true that trees can reduce their water consumption, but if the soil water reservoir is used up, it's ultimately only a matter of time until cell dehydration causes the death of a tree.

Physiological measurements at the Basel research site showed the researchers that negative suction tension and water shortage set in earlier than usual in 2018, and that the shortage was more severe across Germany, Austria and Switzerland than ever measured before. Over the course of the summer, severe drought-related stress symptoms therefore appeared in many tree species important to forestry: leaves wilted, aged and were shed prematurely.

Spruce, pine and beech most heavily affected

The true extent of the summer heatwave only became evident in 2019: many trees no longer formed new shoots - they were partially or wholly dead. Others had survived the drought and heat stress of the previous year but became increasingly vulnerable to bark beetle infestation or fungal attack. Trees with partially dead canopies were particularly affected, as the canopy damage reduced their ability to recover.

"Spruce was most heavily affected. But it was a surprise for us that beech, silver fir and pine were also damaged to this extent," says lead researcher Professor Ansgar Kahmen. Beech in particular had until then been classified as the "tree of the future", although its supposed drought resistance has been subject to contentious discussion since the 2003 heatwave.

Future scenarios to combat heat and drought

According to the latest projections, precipitation in Europe will decline by up to a fifth by 2085, and drought and heat events will become more frequent. Redesigning forests is therefore essential. "Mixed woodland is often advocated," explains plant ecologist Kahmen, "and it certainly has many ecological and economic advantages. But whether mixed woodland is also more drought-resistant has not yet been clearly demonstrated. We still need to study which tree species work well in which combinations, including from a forestry perspective. That will take a long time."

Another finding of the study is that conventional methods can capture the impacts of extreme climate events on European forests only to a limited extent, so new analytical approaches are needed. "The damage is obvious. What is more difficult is quantifying it precisely and drawing the right conclusions for the future," says Kahmen. Earth observation data from satellites could help track tree mortality at a finer spatial scale. Spatial patterns containing important ecological and forestry-related information can be derived from such data: which tree species were heavily impacted, when and at which locations, and which survived without damage? "A system like this already exists in some regions of the US, but central Europe still lacks one."
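
As a rough illustration of the kind of satellite-based mapping the researchers have in mind, the sketch below compares vegetation greenness (NDVI) before and after the drought year and flags pixels with a large decline as potential canopy loss. The band values and the 0.2 decline threshold are invented for illustration and are not calibrated values from the study.

# Hedged sketch of drought-damage mapping from satellite imagery:
# compare NDVI before and after 2018 and flag strong declines.
import numpy as np

def ndvi(red, nir):
    """Normalized difference vegetation index, computed pixel-wise."""
    return (nir - red) / (nir + red + 1e-9)

rng = np.random.default_rng(0)
shape = (100, 100)                                   # toy 100 x 100 pixel scene
red_2017 = rng.uniform(0.05, 0.15, shape)            # stand-in red reflectance
nir_2017 = rng.uniform(0.40, 0.60, shape)            # stand-in near-infrared reflectance
red_2019, nir_2019 = red_2017.copy(), nir_2017.copy()
nir_2019[:30, :30] *= 0.5                            # mimic canopy die-back in one corner

decline = ndvi(red_2017, nir_2017) - ndvi(red_2019, nir_2019)
damaged = decline > 0.2                              # illustrative damage threshold
print(f"flagged {damaged.mean():.1%} of pixels as potential canopy loss")

In a real workflow, the toy arrays would be replaced by atmospherically corrected red and near-infrared bands from repeat acquisitions over the same forest stands, and the threshold would be calibrated against ground observations of tree mortality.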

Credit: 
University of Basel

Tuning frontal polymerization for diverse material properties

video: Rapid polymerization using the monomers 1,5-cyclooctadiene and dicyclopentadiene is shown.

Image: 
Courtesy of the Autonomous Materials Systems Group.

Researchers from the University of Illinois at Urbana-Champaign have improved the technique of frontal polymerization, in which a small amount of heat triggers a moving reaction wave that produces a polymeric material. The new method enables the synthesis of a wider range of materials with better control over their thermal and mechanical properties.
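
To picture how such a reaction wave sustains itself, the minimal sketch below couples one-dimensional heat conduction to a simple Arrhenius cure reaction: heat released by curing warms the neighboring monomer, which then cures and releases more heat. Every parameter is an assumed, illustrative value rather than a property of the 1,5-cyclooctadiene/dicyclopentadiene system studied in the paper, so the output is qualitative only.

# Minimal 1D sketch of a frontal polymerization front: explicit finite differences
# for heat conduction coupled to an Arrhenius cure reaction (assumed parameters).
import numpy as np

alpha = 1e-7        # thermal diffusivity, m^2/s (assumed)
dH    = 400.0       # adiabatic temperature rise for full cure, K (assumed)
A     = 1e8         # Arrhenius pre-exponential factor, 1/s (assumed)
Ea    = 8.0e4       # activation energy, J/mol (assumed)
R     = 8.314       # gas constant, J/(mol K)

L_dom, nx = 0.02, 200                  # 2 cm domain, 200 grid points
dx = L_dom / (nx - 1)
dt = 0.2 * dx**2 / alpha               # stable step for the explicit scheme
T = np.full(nx, 300.0)                 # ambient temperature, K
c = np.zeros(nx)                       # degree of cure, 0 (monomer) to 1 (polymer)
T[:5] = 500.0                          # small thermal trigger at one end

for step in range(1, 401):             # roughly 8 s of simulated time
    k = A * np.exp(-Ea / (R * T))                    # cure rate constant, 1/s
    dc = np.minimum(k * (1.0 - c) * dt, 1.0 - c)     # cure increment, capped at full cure
    lap = np.zeros(nx)
    lap[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
    T += alpha * lap * dt + dH * dc                  # conduction + exothermic heat release
    c += dc
    if step % 100 == 0:
        unreacted = np.where(c < 0.5)[0]             # first point still mostly monomer
        pos = unreacted[0] * dx if unreacted.size else L_dom
        print(f"t = {step * dt:5.2f} s, front at {pos * 100:.2f} cm")

Running the sketch prints a cure front that advances steadily from the triggered end, which is the defining behavior of frontal polymerization; quantitative front speeds would require the measured kinetics of the actual monomer mixture and a much finer grid.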

The paper, “Rapid Synthesis of Elastomers and Thermosets with Tunable Thermomechanical Properties,” was published in ACS Macro Letters and selected as an ACS Editors’ Choice.

“Most of the previous research looked at stiffer materials. This paper is the first time frontal polymerization has been used to synthesize a rubbery material,” said Nancy Sottos, a Maybelle Leland Swanlund Chair and head of the Department of Materials Science and Engineering, who also leads the Autonomous Materials Systems Group at the Beckman Institute for Advanced Science and Technology. “The new technique allows us to have more control and makes materials that have good engineering properties in terms of strength and stiffness.”

The researchers used a mixture of two monomers, 1,5-cyclooctadiene and dicyclopentadiene, to create materials tailored for a wide range of applications.

“These materials are chemically similar to what is used in tires,” said Leon Dean, a graduate student in the Sottos Group, which is part of AMS. “Conventionally, the synthesis of rubbers requires an organic solvent, multiple steps, and a lot of energy, which is not environmentally friendly. Our solvent-free manufacturing method speeds up the process and reduces energy consumption.”

Using this technique, the researchers made a shape-memory polymer hand that demonstrates the range of achievable materials. The shape-memory effect occurs when a pre-deformed polymer is heated beyond its glass transition temperature, the point at which the polymer changes from a hard, glassy material to a soft, rubbery material. The sequential change in shape was enabled by the differences in glass transition temperature between the layers.

“We made a layered material in the shape of a hand, where each layer had different amounts of the two monomers and therefore different glass transition temperatures,” said Qiong Wu, a postdoctoral fellow in the Moore Group, which also is part of AMS. “When you heat the polymer above the highest glass transition temperature and then cool it, it forms a fist. As you raise the temperature again, the digits of the fist open sequentially.”
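
The sequential opening can be pictured as each finger's layer switching from glassy to rubbery, and thus relaxing to its programmed open shape, as the temperature passes that layer's glass transition. The toy sketch below uses hypothetical glass transition temperatures, not the paper's measured values, purely to show this ordering logic.

# Toy sketch of sequential shape recovery in a layered shape-memory part.
# The Tg values are hypothetical; in the real material the gradient comes from
# varying the 1,5-cyclooctadiene/dicyclopentadiene ratio in each layer.
layer_tg_c = {"little": 40.0, "ring": 55.0, "middle": 70.0, "index": 85.0, "thumb": 100.0}

def recovered_digits(temperature_c):
    """Digits whose layer has passed its glass transition and relaxed to the open shape."""
    return [name for name, tg in layer_tg_c.items() if temperature_c > tg]

for t in (30, 60, 90, 110):  # reheating ramp after programming the fist
    print(f"{t:>3} deg C -> open digits: {recovered_digits(t)}")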

The researchers hope to further develop this technique by improving their control over the polymer properties. “Although we have demonstrated the tunability of several properties over a wide range, it remains a challenge to adjust each property individually,” Wu said.

“Scaling up the technique will also be a challenge,” Dean said. “Most of our work has been done on a lab scale. However, in larger scale manufacturing, there is a competition between bulk polymerization and frontal polymerization.”

“This study demonstrates the Beckman Institute at its best,” said Jeff Moore, an Ikenberry Endowed Chair, a professor of chemistry, and the director of the Beckman Institute. “It brought together two groups that have different perspectives on a problem, but share a common goal.”

Omar Alshangiti, an undergraduate in the Moore Group, also made significant contributions to the study by investigating suitable monomer combinations, preparing most of the samples, and measuring all the parameters of the frontal polymerization process.

This research was supported by the AFOSR Center for Excellence in Self-Healing, Regeneration and Structural Remodeling Award; the U.S. Department of Energy, Office of Basic Energy Sciences, Division of Materials Sciences and Engineering Award; and a National Science Foundation Graduate Research Fellowship.


The study “Rapid Synthesis of Elastomers and Thermosets with Tunable Thermomechanical Properties” can be found at https://doi.org/10.1021/acsmacrolett.0c00233.

Journal

ACS Macro Letters

DOI

10.1021/acsmacrolett.0c00233

Credit: 
Beckman Institute for Advanced Science and Technology