Tech

Temporal aiming with temporal metamaterials

image: a, An incoming electromagnetic wave travels in an unbounded medium with an isotropic permittivity. b, By rapidly changing the permittivity of the medium in time from isotropic to an anisotropic tensor (with the x component of the permittivity smaller than its z component), a temporal boundary is introduced, producing forward and backward waves (equivalent to transmission and reflection at the interface between two media in the spatial domain). In this situation the direction of the wave vector remains the same while the energy propagation changes direction, following the Poynting vector S and eventually reaching receiver 2 (Rx2). c, A similar approach can be applied by properly engineering the temporal function of the permittivity so that the transmitted electromagnetic wave is re-directed towards Rx1. d, e, Snapshots of simulated power distributions at times before and after the permittivity is changed from isotropic to anisotropic in real time, respectively, demonstrating how the energy changes its direction: temporal aiming.

Image: 
by Victor Pacheco-Peña and Nader Engheta

Tailoring and manipulating electromagnetic wave propagation has been of great interest within the scientific community for many decades. In this context, wave propagation has been engineered by properly introducing spatial inhomogeneities along the path where the wave travels. Antennas and communication systems in general have greatly benefited from this control of wave-matter interactions. For instance, if one needs to re-direct the radiated field (information) from an antenna (transmitter) towards a receiving antenna placed at a different location, one can simply place the transmitter on a translation stage and mechanically steer the propagation of the emitted electromagnetic wave. Such beam steering techniques have greatly contributed to the spatial aiming of targets in applications such as radar and point-to-point communication systems. Beam steering can also be achieved with metamaterials and metasurfaces, by spatially controlling the effective electromagnetic parameters of a designed meta-lens antenna system and/or using reconfigurable metasurfaces. The next question to ask is: could we push the limits of current beam steering applications by controlling the electromagnetic properties of media not only in space but also in time (i.e., 4D metamaterials: x, y, z, t)? In other words, would it be possible to achieve temporal aiming of electromagnetic waves?

In a new paper published in Light: Science & Applications, Victor Pacheco-Peña from the School of Mathematics, Statistics and Physics of Newcastle University in the UK and Nader Engheta from the Department of Electrical and Systems Engineering of the University of Pennsylvania, USA, have answered this question by proposing temporal metamaterials that change from an isotropic to an anisotropic permittivity tensor. In this concept, the authors consider a rapid change of the permittivity of the whole medium in which the wave is travelling and demonstrate, both numerically and analytically, the effects of the temporal boundary caused by this rapid change. In so doing, forward and backward waves are produced, with the wave vector k preserved through the whole process while the frequency changes, depending on the values of the permittivity tensor before and after the temporal change of permittivity.
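
As a rough numerical illustration of how the preserved wave vector and the new permittivity tensor set the direction of energy flow, the minimal sketch below uses the textbook dispersion relation for a uniaxial medium, with an assumed TM polarization and angle convention; it is not the authors' exact formula.

```python
import numpy as np

# Minimal sketch (not the authors' derivation): for a TM-polarised plane wave in
# a medium with relative permittivity tensor diag(eps_x, eps_x, eps_z), the
# dispersion relation k_x**2/eps_z + k_z**2/eps_x = (omega/c)**2 implies that
# the Poynting vector is tilted away from k:
#   tan(theta_S) = (eps_x / eps_z) * tan(theta_k),  angles measured from the z axis.

def poynting_angle(theta_k_deg, eps_x, eps_z):
    """Energy-propagation angle (degrees from z) for a given wave-vector angle."""
    theta_k = np.radians(theta_k_deg)
    return np.degrees(np.arctan((eps_x / eps_z) * np.tan(theta_k)))

# Before the temporal boundary the medium is isotropic (eps_x = eps_z), so S is
# parallel to k; after switching to eps_x < eps_z, the energy bends towards z
# while k itself is unchanged, which is the essence of temporal aiming.
print(poynting_angle(45.0, eps_x=2.0, eps_z=2.0))   # 45.0 deg, S parallel to k
print(poynting_angle(45.0, eps_x=2.0, eps_z=8.0))   # ~14.0 deg, energy re-aimed
```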

Interestingly, the authors' theoretical results also show how the direction of energy propagation (defined by the Poynting vector S) differs from that of the wave vector, leading to real-time beam steering of electromagnetic energy, a phenomenon the authors name temporal aiming, the temporal analogue of spatial beam steering. All the reported numerical calculations are in excellent agreement with the analytical results. As the authors comment:

"In this study we provide an in-depth analysis of the underlying physics behind such temporal aiming approach achieved by rapidly changing the permittivity of the medium containing the wave, from isotropic to anisotropic values. As an exciting result, we were able to extract a closed analytical and simple formula for the new direction of energy propagation of the already present electromagnetic wave."

"We present a detailed study considering monochromatic waves under different oblique incident angles along with more complex incident electromagnetic waves such as Gaussian beams."

"Since this temporal aiming allows us to arbitrarily change the direction of energy propagation in real time, it could open new possibilities for real time beam steering. We provide a numerical example of a single transmitter antenna and three receivers placed at different spatial locations. Our example demonstrates how the transmitted electromagnetic wave can reach any of the three receivers by simply engineering a time-dependent permittivity of the medium following a square function: isotropic - anisotropic - isotropic".

"The presented technique has the potential to open new possibilities for the routing of information in integrated photonic systems by implementing temporal metamaterials that can deflect the guided electromagnetic waves to a desired target/direction on a chip" the scientists forecast.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

New detection method turns silicon cameras into mid-infrared detectors

image: Artistic rendering of the principle of non-degenerate two-photon absorption (NTA) for the detection of mid-infrared (MIR) light by a silicon-based camera. In this detection technique, the sensor is illuminated directly by the MIR light beam, while a second, near-infrared (NIR) beam is also incident on the sensor. The energies of the MIR and NIR photons combine to excite charge carriers in the silicon material, inducing a response in the camera. This method enables fast MIR imaging with regular Si-based cameras.

Image: 
by David Knez, Adam Hanninen, Richard Prince, Eric Potma and Dmitry Fishman

The MIR range of the electromagnetic spectrum, which roughly covers light at wavelengths between 3 and 10 micrometers, coincides with the energies of fundamental molecular vibrations. Utilizing this light for imaging can produce stills with chemical specificity, i.e. images whose contrast derives from the chemical composition of the sample. Unfortunately, detecting MIR light is not as simple as detecting light in the visible regime. Current MIR cameras offer excellent sensitivity but suffer from thermal noise. In addition, the fastest MIR cameras suitable for chemical mapping have sensors with low pixel counts, limiting high-definition imaging. To overcome this problem, several strategies have been developed for shifting the information carried by MIR light into the visible range, followed by efficient detection with a modern Si-based camera. Unlike MIR cameras, Si-based cameras exhibit low noise and high pixel densities, making them attractive candidates for high-performance imaging applications. The required MIR-to-visible conversion scheme, however, can be rather complicated. Presently, the most direct route to the desired color conversion is a nonlinear optical crystal: when the MIR light and an additional near-infrared (NIR) light beam coincide in the crystal, a visible light beam is generated through the process of sum-frequency generation, or SFG for short. Although the SFG up-conversion trick works well, it is sensitive to alignment and requires numerous orientations of the crystal to produce a single MIR-derived image on the Si camera.

In a new paper published in Light: Science & Applications, a team of scientists from the University of California, Irvine, describes a simple method for detecting MIR images with a Si camera. Instead of using the optical nonlinearity of a crystal, they used the nonlinear optical properties of the Si chip itself to enable a MIR-specific response in the camera. In particular, they used the process of non-degenerate two-photon absorption (NTA), which, with the help of an additional NIR 'pump' beam, triggers the generation of photo-induced charge carriers in Si when the MIR light illuminates the sensor. Compared with SFG up-conversion, the NTA method avoids nonlinear up-conversion crystals altogether and is virtually free of alignment artifacts, making MIR imaging with Si-based cameras significantly simpler.
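
The requirement that the two photon energies combine can be checked with simple arithmetic; the wavelengths below are illustrative assumptions rather than the values used in the experiments.

```python
# Photon-energy arithmetic for non-degenerate two-photon absorption (NTA) in Si.
# Wavelengths are hypothetical examples; silicon's indirect bandgap is ~1.12 eV.
HC_EV_NM = 1239.84                      # E [eV] = 1239.84 / wavelength [nm]
E_GAP_SI = 1.12                         # eV

def photon_energy_ev(wavelength_nm):
    return HC_EV_NM / wavelength_nm

e_mir = photon_energy_ev(4000.0)        # ~0.31 eV for a 4 um MIR photon
e_nir = photon_energy_ev(1250.0)        # ~0.99 eV for a 1.25 um NIR pump photon

print(e_nir > E_GAP_SI)                 # False: the pump alone is sub-bandgap
print(e_mir + e_nir > E_GAP_SI)         # True: together they can excite carriers
```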

The team, led by Dr. Dmitry Fishman and Dr. Eric Potma, first established that Si is a suitable material for MIR detection through NTA. Using MIR light with pulse energies in the femtojoule (fJ, 10⁻¹⁵ J) range, they found that NTA in silicon is efficient enough to detect MIR light. This principle enabled them to perform vibrational spectroscopy measurements of organic liquids using just a simple Si photodiode as the detector.

The team then replaced the photodiode with a charge-coupled device (CCD) camera, which also uses silicon as the photosensitive material. Through NTA, they were able to capture MIR-derived images on a 1392 x 1040-pixel sensor at 100 ms exposure times, yielding chemically selective images of several polymer and biological materials as well as living nematodes. Despite using technology not specifically optimized for NTA, the team was able to detect small (10⁻²) changes in optical density (OD) in the image.
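
For scale, an optical-density change of 10⁻² corresponds to only about a 2% change in transmitted intensity; the conversion below is simple arithmetic, not a number from the paper.

```python
# OD = -log10(I / I0), so a change of 0.01 OD corresponds to ~2.3% in intensity.
delta_od = 1e-2
transmission_ratio = 10 ** (-delta_od)      # I / I0 after the OD change
print(1 - transmission_ratio)               # ~0.023, i.e. roughly a 2.3% change
```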

"We are excited to offer this new detection strategy to those who use MIR light for imaging," says David Knez, one of the team members. "We have high hopes that the simplicity and versatility of this approach allows for the broad adoption and development of the technology." Adding that NTA may speed up analysis in a wide variety of fields, such as pharmaceutical quality assurance, geologic mineral sampling, or microscopic inspection of biological samples.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

The immune system facilitates alcohol addiction

The activation of the immune system could eventually perpetuate some of the deleterious effects of alcohol, such as addiction. This is the conclusion of research carried out by an international team led by Dr. Santiago Canals, from the Institute of Neurosciences in Alicante (Spain), a joint center of the Spanish National Research Council and the University Miguel Hernández in Elche, and Dr. Wolfgang Sommer, from the Central Institute of Mental Health of the University of Heidelberg (Germany).

The researchers have observed that alcohol may increase its addictive capacity by changing the geometry of the brain, specifically of the grey matter, as shown by this translational study carried out in rats and humans and published today in the journal Science Advances. "This is a totally new mechanism of addiction," highlights Dr. Santiago Canals of the Institute of Neurosciences in Alicante (Spain).

According to their observations, the immune cells that reside in the brain, called microglia, are responsible for the change in geometry that the grey matter undergoes in the presence of alcohol, clarifies Dr. Silvia de Santis, researcher at the Institute of Neurosciences in Alicante and first author of the article.

Alcohol, as a harmful substance, activates these defense cells, changing their biochemical characteristics and their shape, which shifts from branched to a more rounded or amoeboid form. This change in shape alters the geometry of the extracellular space, allowing a greater diffusion of substances than would occur in the absence of alcohol. One of these substances is dopamine, a neurotransmitter that is particularly important in the processes of reward and addiction.

Increasing the concentration and diffusion range of neurotransmitters such as dopamine, glutamate or neuropeptides can turn the weak rewarding properties of alcohol into powerful effectors in the formation of drinking habits that eventually lead to addiction in some people, the researchers say. Understanding and ultimately reversing these changes can help in the development of more effective treatments.

This translational study shows that there is a higher average diffusivity in the cerebral grey matter of humans and rats that drink regularly. These alterations appear shortly after the onset of alcohol consumption in rats, persist in early abstinence in both rodents and humans, and are associated with a strong decrease in extracellular space barriers, explained by a reaction of the microglia to an aggressor such as alcohol.

In a previous paper, published in April last year in JAMA Psychiatry, the same group showed that alcohol continues to damage the brain even after people stop drinking. That work already revealed an alcohol-related increase in diffusivity in the brain, but the researchers did not yet know why. The new study, published in Science Advances, solves the mystery by showing that the increase in diffusivity is due to the activation of the brain's immune cells.

This study has been developed in the context of a broad European collaboration by researchers from the UMH-CSIC Institute of Neurosciences in Alicante, the Polytechnic University of Valencia, the German Central Institute of Mental Health, the University of Camerino (Italy) and the Charles University of Prague (Czech Republic).

Credit: 
Spanish National Research Council (CSIC)

Researchers identify subject-specific component to perceptual learning ability

image: A. Illustration of the seven training tasks used in the experiment; B. average learning curves for each task; C. weights of contributing factors from the multivariate regression model; D. LASSO regression model

Image: 
Prof. HUANG's Group

Developing expertise usually requires a variety of skills, and some people can become experts while others can't. Common practice in perceptual learning research has treated ubiquitous individual learning differences as random fluctuations or noise and made inferences based on aggregated data from multiple subjects.

Recent research, however, is paying more attention to individual learning differences.

For example, accumulating evidence suggests that individual differences reflect genetic and/or environmental influences on human behavior and can even be predicted from brain structure and neural activity. Nevertheless, researchers have been uncertain whether or not individual differences in perceptual learning reflect a consistent individual variability in learning ability across multiple perceptual learning tasks.

A new Chinese study has begun to answer this question.

Ph.D. candidate YANG Jia and her colleagues, under the guidance of Prof. HUANG Changbing from the Institute of Psychology of the Chinese Academy of Sciences (CAS), and LU Zhonglin from New York University (NYU) and NYU Shanghai, recently collected and analyzed data from a large sample of subjects in seven visual, auditory and working memory training tasks.

They established a multivariate regression model and showed that initial performance, task, and individual differences all contributed significantly to the learning rates across the tasks.

Most importantly, they were able to identify both a task-specific but subject-invariant component of learning and a subject-specific but task-invariant perceptual learning ability. An additional least absolute shrinkage and selection operator (LASSO) regression analysis revealed that a number of personality traits, including intelligence quotient, extraversion, and neuroticism, made significant contributions to individual differences.
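
As a rough illustration of how such a LASSO analysis isolates informative predictors, here is a minimal sketch on synthetic data; the predictors, sample size and regularisation settings are assumptions, not the authors' pipeline.

```python
# LASSO shrinks the coefficients of uninformative predictors to zero, leaving
# only the traits that carry predictive weight for the learning rate.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n_subjects = 100
# Hypothetical predictors: IQ, extraversion, neuroticism, plus two noise traits.
X = rng.normal(size=(n_subjects, 5))
learning_rate = (0.5 * X[:, 0] + 0.3 * X[:, 1] - 0.2 * X[:, 2]
                 + rng.normal(scale=0.5, size=n_subjects))

model = LassoCV(cv=5).fit(X, learning_rate)
print(model.coef_)   # non-zero weights for the first three traits, ~0 for the noise columns
```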

These findings reveal the multifaceted nature of perceptual learning and provide strong evidence for the existence of a consistent pattern of individual differences across multiple training tasks, suggesting that individual differences in perceptual learning are not "noise"; rather, they reflect variability in learning ability across individuals.

"These results could have important implications for selecting potential trainees in occupations that require perceptual expertise and designing better training protocols to improve the efficiency of clinical rehabilitation," said YANG.

Credit: 
Chinese Academy of Sciences Headquarters

World's smallest imaging device has heart disease in focus

image: Ultrathin 3D printed endoscope imaging an artery.

Image: 
Image by Simon Thiele and Jiawen Li.

A team of researchers led by the University of Adelaide and University of Stuttgart has used 3D micro-printing to develop the world's smallest, flexible scope for looking inside blood vessels.

The camera-like imaging device can be inserted into blood vessels to provide high quality 3D images to help scientists better understand the causes of heart attack and heart disease progression, and could lead to improved treatment and prevention.

In a study published in the journal Light: Science & Applications, a multidisciplinary team of researchers and clinicians was able to 3D print a tiny lens onto the end of an optical fibre the thickness of a human hair.

The imaging device is so small that researchers were able to scan inside the blood vessels of mice.

Dr Jiawen Li, co-author and Heart Foundation Postdoctoral Fellow at the Institute for Photonics and Advanced Sensing, University of Adelaide, says in Australia cardiovascular disease kills one person every 19 minutes.

"A major factor in heart disease is the plaques, made up of fats, cholesterol and other substances that build up in the vessel walls," Dr Li said.

"Preclinical and clinical diagnostics increasingly rely on visualising the structure of the blood vessels to better understand the disease.

"Miniaturised endoscopes, which act like tiny cameras, allow doctors to see how these plaques form and explore new ways to treat them," she said.

Dr Simon Thiele, Group Leader, Optical Design and Simulation at the University of Stuttgart, was responsible for fabricating the tiny lens.

"Until now, we couldn't make high quality endoscopes this small," Dr Thiele said.

"Using 3D micro-printing, we are able to print complicated lenses that are too small to see with the naked eye.

"The entire endoscope, with a protective plastic casing, is less than half a millimetre across," he said.

Dr Li explains: "It's exciting to work on a project where we take these innovations and build them into something so useful.

"It's amazing what we can do when we put engineers and medical clinicians together," said Dr Li.

Credit: 
University of Adelaide

Valley-Hall nanoscale lasers

image: a Scanning electron microscope image of the fabricated sample. The false-colour triangle marks the interior of the topological cavity. b Emission energy vs. the pump energy dependence showing a threshold transition to lasing. The sample is optically pumped at a 980 nm wavelength with 8 ns pulses at a 10 kHz repetition rate. c, d Spatial distribution of emission for the pump intensity (c) below and (d) above the lasing threshold.

Image: 
by Daria Smirnova, Aditya Tripathi, Sergey Kruk, Min-Soo Hwang, Ha-Reem Kim, Hong-Gyu Park, and Yuri Kivshar

Topological photonics underpins a promising paradigm for robust light manipulation and smart design of optical devices with improved reliability and advanced functionalities governed by the nontrivial band topology. Nanostructures made of high-index dielectric materials with judiciously designed resonant elements and lattice arrangements show special promise for implementing topological order for light at the nanoscale and for optical on-chip applications. High-index dielectrics such as III-V semiconductors, which can provide strong optical gain further enhanced by topological field localization, form a promising platform for active topological nanophotonics.

In a new paper published in Light: Science & Applications, a team of scientists led by Yuri Kivshar from the Australian National University and Hong-Gyu Park from Korea University, together with co-workers, has implemented nanophotonic cavities in a nanopatterned InGaAsP membrane incorporating III-V semiconductor quantum wells. The nanocavities exhibit a photonic analogue of the valley-Hall effect. The researchers demonstrated room-temperature, low-threshold lasing from a cavity mode hosted within the topological bandgap of the structure.

The SEM image of the fabricated structure and the experimental results are shown in the figure. The cavity is based on a closed valley-Hall domain wall created by inverting the staggered nanohole sizes in a bipartite honeycomb lattice. In the frequency range of the topological bandgap, the cavity supports a quantized spectrum of modes confined to the domain wall. The images show real-space emission profiles below and above the lasing threshold.
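
The threshold transition visible in panel b of the figure is the standard signature of lasing; a common way to quantify it is to fit the above-threshold part of the emission-versus-pump curve with a straight line and take its intercept with the pump axis as the threshold. The sketch below uses synthetic data and an assumed above-threshold cut-off, not the authors' measurements.

```python
import numpy as np

# Synthetic light-in/light-out (L-L) data with a kink at the lasing threshold.
pump = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])            # arbitrary units
emission = np.array([0.02, 0.04, 0.06, 0.10, 1.0, 2.1, 3.2, 4.3])

above = pump >= 5.0                           # assumed lasing regime
slope, intercept = np.polyfit(pump[above], emission[above], 1)
print(-intercept / slope)                     # estimated threshold pump (~4.1)
```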

The scientists explain:

"In experiment, we first observe spontaneous emission from the cavity. The emission profile shows the enhancement along the entire perimeter of the triangular cavity associated with edge states. When increasing a pump power, we observe a threshold transition to lasing with a narrow linewidth where the emission gets confined at the three corners."

When two of the corner spots are isolated, the coherence of the emission is confirmed by interference fringes observed in the measured far-field radiation patterns. An isolated corner emits a donut-shaped beam carrying a singularity. These findings are a step towards topologically controlled ultrathin light sources with nontrivial radiation characteristics.

The researchers forecast:

"The proposed all-dielectric platform holds promise for the versatile design of active topological metasurfaces with integrated light sources. Such topological nanocavities have vast potential for advances in nonlinear nanophotonics, low-power nanolasing and cavity quantum electrodynamics".

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

Skin stem cells shuffle sugars as they age

Tsukuba, Japan - Age shows nowhere better than on the skin. The ravages of time on skin and the epidermal stem cells that differentiate to replenish its outer layer have been hypothesized, but there has been no method to evaluate their aging at the molecular level. Now, researchers at the University of Tsukuba and the National Institute of Advanced Industrial Science and Technology (AIST) have revealed that changes in the complex sugars called glycans that coat the surface of epidermal stem cells can serve as a potential biological marker of aging.

Skin is the largest human organ and a vital barrier against infection and fluid loss. Aging impairs environmental defenses and wound healing, while increasing hair loss and cancer risk. A key process underlying epidermal function in health and disease is cellular glycosylation that mediates cell-cell interactions and cell-matrix adhesions. Glycosylation involves attaching glycans to proteins; the profile of all glycans on and in a cell--collectively 'the cell glycome'--could reflect its functional scope and serve as an index of its age.

The researchers first isolated epidermal stem cells from the skin of young and old laboratory mice, including both hair follicle cells and interfollicular epidermal cells. These cells underwent glycan profiling using the lectin microarray platform; this technique uses lectins--proteins that bind specific glycans--and enables glycome analysis even for cells sparsely dispersed in tissues.

"Our results clearly showed that high mannose-type N-glycans are replaced by ?2-3/6 sialylated complex type N-glycans in older epidermal stem cells," senior author, Professor Hiromi Yanagisawa, explains. "We followed this with gene expression analysis; this revealed up-regulation of a glycosylation-related mannosidase and two sialyltransferase genes, suggesting that this 'glycome shift' may be mediated by age-modulated glycosyltransferase and glycosidase expression."

Finally, to check whether the glycan changes were the cause or merely the result of aging, the research team overexpressed the up-regulated glycogenes in primary mouse epidermal keratinocytes in vitro. The keratinocytes showed decreased mannose and increased sialic acid (Sia) modifications, replicating the in vivo glycosylation pattern of aging epidermal stem cells. In addition, their decreased ability to proliferate suggested that these glycan alterations may contribute to the waning ability of aging epidermal stem cells to proliferate.

Professor Aiko Sada, currently Principal Investigator at Kumamoto University, and Professor Hiroaki Tateno at AIST, co-corresponding authors, explain the implications of their results. "Our work is broadly targeted at investigating stem cell dysfunction specifically in aging skin. Future advances may help manage skin disorders at the stem cell level, including age-related degenerative changes, impaired wound healing and cancer."

Credit: 
University of Tsukuba

Artificial cells produce parts of viruses for safe studies

Scientists searching for better diagnostic tests, drugs or vaccines against a virus must all begin by deciphering the structure of that virus. And when the virus in question is highly pathogenic, investigating, testing or developing these can be quite dangerous. Prof. Roy Bar-Ziv, Staff Scientist Dr. Shirley Shulman Daube, Dr. Ohad Vonshak, a former research student in Bar-Ziv's lab, and current research student Yiftach Divon have an original solution to this obstacle. They demonstrated the production of viral parts within artificial cells.

The cells are micrometer-sized compartments etched into a silicon chip. At the bottom of each compartment, the scientists affixed DNA strands, packing them densely. The edges of the artificial cells were carpeted with receptors that can capture the proteins produced within the cells. To begin with, the scientists flooded the cells with everything needed to make proteins: the molecules and enzymes that read the DNA information and translate it into proteins. Then, with no further human intervention, the receptor carpet trapped one of the proteins produced at the bottoms of the cells, and the rest of the viral proteins bound to one another, producing structures that the scientists had earlier "programmed" into the system. In this case, they created assorted small parts of a virus that infects bacteria (a bacteriophage).

"We discovered," says Bar-Ziv, "that we can control the assembly process - both the efficiency and the final products - through the design of the artificial cells. This included the cells' geometric structure, and the placement and organization of the genes. These all determine which proteins will be produced and, down the line, what will be made from these proteins once they are assembled."

Vonshak adds: "Since these are miniaturized artificial cells, we can place a great many of them on a single chip. We can alter the design of various cells, so that diverse tasks are performed at different locations on the same chip."

The features of the system developed at the Weizmann Institute, including the ability to produce different small parts of a single virus at once, could give scientists around the globe a new tool for evaluating tests, drugs and vaccines against that virus. Adds Divon: "And because the artificial parts, even if they faithfully reproduce parts of the virus, do not involve the use of actual viruses, they would be especially safe from beginning to end." "Another possible application," says Shulman Daube, "might be the development of a chip that could rapidly and efficiently conduct thousands of medical tests all at once."

Credit: 
Weizmann Institute of Science

New material can generate hydrogen from salt and polluted water

Scientists from Tomsk Polytechnic University, together with teams from the University of Chemistry and Technology, Prague, and Jan Evangelista Purkyně University in Ústí nad Labem, have developed a new 2D material for producing hydrogen, the basis of alternative energy. The material efficiently generates hydrogen molecules from fresh, salt and polluted water when exposed to sunlight. The results are published in ACS Applied Materials & Interfaces (IF: 8.758; Q1).

"Hydrogen is an alternative source of energy. Thus, the development of hydrogen technologies can become a solution to the global energy challenge. However, there are a number of issues to solve. In particular, scientists are still searching for efficient and green methods to produce hydrogen. One of the main methods is to decompose water by exposure to sunlight. There is a lot of water on our planet, but only a few methods suitable for salt or polluted water. In addition, few use the infrared spectrum, which is 43% of all sunlight," Olga Guselnikova, one of the authors and a researcher of the TPU Research School of Chemistry & Applied Biomedical Sciences, notes.

The developed material is a three-layer structure 1 micrometer thick. The lower layer is a thin film of gold, the second is a 10-nanometer layer of platinum, and the third is a film of metal-organic frameworks built from chromium compounds and organic molecules.

"During the experiments, we watered material and sealed the container to take periodic gas samples to determine the amount of hydrogen. Infrared light caused the excitation of plasmon resonance on the sample surface. Hot electrons generated on the gold film were transferred to the platinum layer. These electrons initiated the reduction of protons at the interface with the organic layer. If electrons reach the catalytic centers of metal-organic frameworks, the latter were also used to reduce protons and obtain hydrogen," Olga explains.

Experiments have demonstrated that 100 square centimeters of the material can generate 0.5 liters of hydrogen in an hour. It is one of the highest rates recorded for 2D materials.
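
For a sense of scale, that rate can be converted into an average stored chemical power; the conversion below is back-of-the-envelope arithmetic under assumed near-ambient conditions, not a figure from the study.

```python
# 0.5 L of H2 per hour from 100 cm^2 of material, converted to average power.
MOLAR_VOLUME_L = 24.5        # L/mol for an ideal gas at ~25 C and 1 atm (assumed)
HHV_H2_J_PER_MOL = 286e3     # higher heating value of hydrogen

mol_per_hour = 0.5 / MOLAR_VOLUME_L                  # ~0.02 mol of H2 per hour
power_w = mol_per_hour * HHV_H2_J_PER_MOL / 3600.0   # average chemical power
print(power_w)                                       # ~1.6 W from 100 cm^2

# For comparison, full sunlight on 100 cm^2 is roughly 10 W (at ~1000 W/m^2).
```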

"I this case, the metal-organic frame also acted as a filter. It filtered impurities and passed already purified water without impurities to the metal layer. It is very important, because, although there is a lot of water on Earth, its main volume is either salt or polluted water. Thereby, we should be ready to work with this kind of water," she notes.

In the future, the scientists plan to improve the material to make it efficient in both the infrared and visible spectra.

"The material already demonstrates a certain absorption in the visible light spectrum, but its efficiency is slightly lower than in the infrared spectrum. After improvement, it will be possible to say that the material works with 93% of the spectral volume of sunlight," Olga adds.

Credit: 
Tomsk Polytechnic University

Skin cancer treatments could be used to treat other forms of the disease

The creation of a silica nanocapsule could allow treatments that use light to destroy cancerous or precancerous cells in the skin to also be used to treat other types of cancer. Such are the findings of a study by INRS (Institut national de la recherche scientifique) professors Fiorenzo Vetrone and Federico Rosei, in collaboration with an international research team. Their results have been published in an article featured on the cover of the 26th edition of the Royal Society of Chemistry journal Chemical Science.

Photodynamic therapies, referred to as PDT, are treatments that use visible light to destroy cancerous or precancerous cells. Abnormal tissue is brought into contact with a light-activated and light-sensitive medication before it is exposed to light, triggering a chemical reaction that releases reactive oxygen species (ROS) to attack the diseased cells. In current treatments, the medication is injected into unhealthy tissue. But injection only works if tumours are on or under the skin, such as skin cancers.

The research team found a way around this limitation so that phototherapies can be used to treat other types of cancer as well. Thanks to silica nanoparticles, doctors can use near-infrared (NIR) light, which penetrates deeper into tissue. The nanoparticles convert NIR light into visible light, triggering the chemical reaction that releases ROS. "It's like we've reinforced the capsule that transports the treatment for diseased cells and increased the versatility of PDT. This nano superhero is stronger and more effective, even inside the body," explained Professor Vetrone, referring to the cover illustration depicting the team's discovery.

Developed by a team of chemists, magnetic resonance imaging (MRI) experts, and an oncologist, the new way of selectively enveloping light-sensitive medication in a nanocapsule could be beneficial for diagnosis and treatment. "Eventually, nanocapsules will help us expand the scope of application," added the professor.

Next, the team plans to test the nanoparticles in an in-vivo setting.

Credit: 
Institut national de la recherche scientifique - INRS

UK's Modern Slavery Act challenging for universities -- new study

video: Michael Rogerson, researcher at the University of Bath, discusses how the UK's Modern Slavery Act is proving challenging for universities.

Image: 
University of Bath

The UK's universities are struggling to live up to the spirit and ambition of the Modern Slavery Act, hampered by poor oversight of their supply chains, a lack of skills and resource in supply chain management, a focus on reducing costs, and lacklustre engagement from many in senior management, a new study from the University of Bath shows.

The UK Modern Slavery Act 2015 (MSA) obliges organisations to report on the actions they are taking to protect individuals in their organisations and supply chains from slavery - widely defined as the exploitation of a person who is deprived of individual liberty anywhere along the supply chain, from raw material extraction to the final customer, for the purpose of service provision or production.

The study interviewed key personnel at 33 UK universities who were responsible for managing and reporting on modern slavery in supply chains. It showed that universities were struggling to engage with the Act, resorting to a rudimentary level of compliance, and failing to seize an opportunity for leadership in an area which is increasingly identified as a risk to organisational reputation and sustainability.

"The MSA had aimed to create a culture of continuous improvement, where organisations developed new and better ways of addressing modern slavery and shared those innovations with others for the greater good. What we found, however, was that organisations have been reduced to a box-ticking approach to basic compliance, which in many ways reflects the scale of the challenge," Michael Rogerson, researcher at the University's School of Management, said.

Rogerson said around a quarter of UK universities were fully compliant with the Act. The study revealed several major obstacles: pressure to keep costs down and make best use of taxpayer funds meant universities purchased through consortia, so they did not have a clear enough view of their supply chains to conduct effective due diligence. Essentially, the universities' end of the chain was simple procurement, meaning they did not have effective in-house supply chain management skills and were reliant on the guarantees of third-party suppliers.

"The historical focus of procurement was cost - the higher education sector has used its collective consumption, through regional purchasing consortia, to negotiate lower prices for high-volume, repeat goods from IT equipment, stationery and furniture to laboratory chemicals. MSA however requires a depth of knowledge about, and active management of, supply chains which our study suggests is incompatible with this structure," Rogerson said.

In turn, the study found procurement teams were hampered by a lack of focus, and engagement, from senior university management. Rogerson found that procurement was not treated as a strategic function at board level, and that resources and time allocated to complying with MSA were not sufficient to be effective.

Rogerson said the resourcing issue had actually led universities to collaborate, but not in the spirit of sharing innovation and competition envisaged by the original Act. Rather, they were helping each other to achieve a minimum level of compliance, often by sharing templated mission statements or knowledge to help struggling organisations achieve a basic level of pro-forma compliance. Several universities have near-identical statements, he said.

"There is a desire to do the right thing, help others in the sector, and demonstrate organisational responsibility - the failure to engage is not necessarily intentional. But the lack of supply chain management skills has led to the adoption of a rudimentary level of assurance - one example would be updating the terms of contracts to reflect the need for modern slavery statements. But more has to be done," Rogerson said.

Rogerson said the desire to live up to the MSA was not universal across university departments and there was a perception that tackling modern slavery was too big a challenge in a sector fighting on many fronts simultaneously.

"Executive focus is so poor that some procurement teams have had trouble complying even when they've done the work. Cases include a marketing department refusing to put the statement on the website's homepage (one of three actions required for full compliance) because 'there's too much stuff on there already'). At several universities, no executive wanted to sign the document (another of the three requirements)," he said.

Rogerson said universities should work more closely with their private sector suppliers to learn about supply chain processes and share supply chain data with other educational institutions. He said universities should look to the management accounting profession to find a way forward and ensure they live up to the ambitions of the Modern Slavery Act.

"Modern slavery is a serious reputational challenge for universities with international profiles. With proliferating demands for reporting and disclosure on social management practices in the higher education sector, it urgently needs to address its reporting and disclosure issues in such a way as to inspire confidence in their practices. Management accountants are well placed to help in this regard," he said.

Credit: 
University of Bath

Droplet biosensing method opens the door for faster identification of COVID-19

image: From left: Jiangtao Cheng and Wei Zhou.

Image: 
Virginia Tech

Mechanical engineering associate professor Jiangtao Cheng and electrical and computer engineering assistant professor Wei Zhou have developed an ultrasensitive biosensing method that could dramatically shorten the amount of time required to verify the presence of the COVID-19 virus in a sample. Their peer-reviewed research was published in ACS Nano on June 29.

There’s significant room to improve the pace of coronavirus testing, Cheng and Zhou have found. Current COVID-19 verification tests require a few hours to complete, as verification of the presence of the virus requires the extraction and comparison of viral genetic material, a time-intensive process requiring a series of steps. The amount of virus in a sampling is also subject to error, and patients who have had the virus for a shorter period of time may test negative because there is not enough of the virus present to trigger a positive result.

In Cheng and Zhou’s method, all of the contents of a sampling droplet can be detected, and there is no extraction or other tedious procedures. The contents of a microdroplet are condensed and characterized in minutes, drastically reducing the error margin and giving a clear picture of the materials present.

The key to this method is in creating a surface over which water containing the sample travels in different ways. On surfaces where drops of water may “stick” or “glide,” the determining factor is friction. Surfaces that introduce more friction cause water droplets to stop, whereas less friction causes water droplets to glide over the surface uninhibited.

The method starts by placing a collected sample into liquid. The liquid is then introduced into an engineered substrate surface with both high and low friction regions. Droplets containing sample will move more quickly in some areas but anchor in other locations thanks to a nanoantenna coating that introduces more friction. These stop-and-go waterslides allow water droplets to be directed and transported in a programmable and reconfigurable fashion. The “stopped” locations are very small because of an intricately placed coating of carbon nanotubes on etched micropillars.

These prescribed spots with nanoantennae are established as active sensors. Cheng and Zhou’s group heats the substrate surface so that the anchored water droplet starts to evaporate. In comparison with natural evaporation, this so-called partial Leidenfrost-assisted evaporation provides a levitating force which causes the contents of the droplet to float toward the nanoantenna as the liquid evaporates. The bundle of sample particles shrinks toward the constrained center of the droplet base, resulting in a rapidly-produced package of analyte molecules.

For fast sensing and analysis of these molecules, a laser beam is directed onto the spot with the packed-in molecules to generate their vibrational fingerprint light signals, a description of the molecules expressed in waveforms. This method of laser-enabled feedback is called surface-enhanced Raman spectroscopy.
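
Such Raman fingerprints are conventionally reported as a wavenumber shift between the excitation laser and the scattered light; the wavelengths in the sketch below are arbitrary examples, not values from this study.

```python
# Raman shift in wavenumbers (cm^-1) from excitation and scattered wavelengths.
def raman_shift_cm1(laser_nm, scattered_nm):
    return 1e7 / laser_nm - 1e7 / scattered_nm   # 1 cm = 1e7 nm

# Example: light from a 785 nm laser scattered at ~853 nm -> ~1015 cm^-1 shift.
print(raman_shift_cm1(785.0, 853.0))
```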

All of this happens in just a few minutes, and the fingerprint spectrum and frequency of the coronavirus can be quickly picked out of a lineup of the returned data.

Cheng and Zhou's team is pursuing a patent on the method and is also seeking funding from the National Institutes of Health to bring the method into widespread use.

A full summary and description of this research is available in the June 26, 2020, publication of ACS Nano.

In immediate response to the COVID-19 pandemic, Virginia Tech faculty, staff, and students have initiated numerous research projects with local and global salience. Learn more from the Office of the Vice President for Research and Innovation.

Credit: 
Virginia Tech

New drug discoveries are closely linked to the quality of lab procedures

In their quest to find new drugs to treat deadly diseases, scientists study millions of molecules at high speed at the same time. Often it is enzymes that are investigated as targets in these 'high-throughput' screenings.

New research from the University of Bath in the UK suggests the quality of the lab procedure (or assay) used for these screenings (measured by the "Z' value") has a much bigger impact on the ability to identify effective new molecules than was previously thought. The Z'-factor - which can never be greater than 1.0 - is a statistical measure of the researchers' ability to see the required signal. It is used to judge whether the response in a particular assay is large enough to warrant further attention.
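
For readers unfamiliar with the metric, the Z'-factor is conventionally computed from the means and standard deviations of an assay's positive and negative controls; the control values in the sketch below are made up for illustration and are not taken from the paper.

```python
import numpy as np

def z_prime(positive_controls, negative_controls):
    """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|; 1.0 is the ideal."""
    p = np.asarray(positive_controls, dtype=float)
    n = np.asarray(negative_controls, dtype=float)
    return 1.0 - 3.0 * (p.std(ddof=1) + n.std(ddof=1)) / abs(p.mean() - n.mean())

rng = np.random.default_rng(0)
pos = rng.normal(loc=100.0, scale=5.0, size=32)   # e.g. uninhibited enzyme signal
neg = rng.normal(loc=10.0, scale=2.0, size=32)    # e.g. fully inhibited background
print(z_prime(pos, neg))                          # ~0.75-0.8: a high-quality assay window
```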

As a result of the new study, pharmaceutical companies and other labs around the world will be under pressure to refine their techniques for investigating new drug candidates.

In recent years, there has been an explosion of studies involving enzymes. These studies aim to identify molecules that can be developed into new drugs for treating cancers, infectious diseases and neurodegenerative diseases, amongst other conditions.

"There are a lot of diseases out there for which there is no treatment or the treatments aren't very good," said Dr Matthew Lloyd, who led the study from the University's Department of Pharmacy and Pharmacology. "This explains why there is such a big drive to develop new treatments using high-throughput screening."

In a paper published this month in the Journal of Medicinal Chemistry, Dr Lloyd identifies 75 examples of 'hit' molecules that went on to the next stage of early drug discovery. This is the first time high-throughput screening involving enzymes has been subject to such a focused review and analysis. Dr Lloyd examined scientific papers published between 2002 and 2020 and found that hit frequency was closely linked to assay quality, as measured by the Z'-factor.

Dr Lloyd found that a Z'-factor of 0.65 had an average hit rate of 0.22% whereas a Z'-factor of 0.8 had an average hit rate of 0.83%, clearly demonstrating the significance of an optimised assay.

"These findings underline how important it is to make sure your assay is the best possible quality it can be," said Dr Lloyd. "A high Z' factor, indicative of high-quality lab procedures, enables more hits to be found and ultimately should increase the chances of new treatments being developed.

"Some studies are currently using assays that are not very good in terms of the Z'-factor. It was thought that 0.5 was acceptable but this review shows a level between 0.75 and 0.8 is the minimum that should be aimed for.

He added: "I suspect some researchers don't realise there is such a pronounced effect, which is why they settle for assay with a Z' of 0.7. But in the future, people in industry will need to be mindful of the results of my analysis when they are

Credit: 
University of Bath

Study strengthens calls to clear cancer diagnosis backlog

Delays to cancer referral through reduced use of the urgent GP referral pathway during the coronavirus pandemic could result in more than a thousand additional deaths in England, a new study reports.

New modelling suggests that delays in patients presenting and being referred with suspected cancer by their GP, and resulting bottlenecks in diagnostic services, are likely to have had a significant adverse effect on cancer survival.

During lockdown, urgent so-called two-week wait GP referrals in England for suspected cancer have dropped by up to 84% - raising fears that undiagnosed cancers could be progressing from early-stage tumours to advanced, incurable disease.

The researchers suggest the NHS would need to ramp up diagnostic capacity rapidly to avoid further unnecessary deaths - and might prioritise certain tumour types in which avoidance of delay is particularly impactful, such as bladder, kidney and lung cancer.

Scientists at The Institute of Cancer Research, London, used 10-year cancer survival estimates for England for 20 common tumour types to create models estimating the impact of reduced patient referrals through urgent GP pathways linked to the Covid-19 pandemic.

The researchers modelled the impact of three different scenarios of lockdown-accumulated backlog - reflecting a 25, 50 or 75 per cent reduction in people across England coming forward with symptoms and receiving urgent GP referrals over the three-month lockdown period.

Their modelling indicated that if all these patients presented and were referred for diagnostic investigation promptly after the end of lockdown in mid-June, the delay in presentation would result in an estimated 181, 361 or 542 excess deaths, respectively.

Since extra diagnostic services like scans and biopsies are unlikely to be immediately available to address fully the backlog, the researchers also estimated the additional lives that might be lost due to consequent diagnostic delays.

The researchers estimated that in a good-case scenario, which assumes all patients present in the month post lockdown and the necessary additional diagnostic capacity is made fully available spread across the three months post lockdown, delays in diagnosis would result in up to another 276 additional deaths. If the additional capacity were delayed and only provided spread across months three to eight post lockdown, there could be up to 1,231 additional deaths, their modelling suggested.

The study was published in The Lancet Oncology and was largely funded by the ICR itself, with support from Breast Cancer Now and Cancer Research UK.

The research team found that the impact of a delayed referral and diagnosis depended on factors such as cancer type and stage, aggressiveness of the disease and patient age.

The number of cases that progress from urgent GP referral to diagnosis of an aggressive but treatable cancer varies widely by tumour type. Reflecting this, their modelling suggests that avoiding delays for suspected bladder, kidney and lung cancers, especially in younger patients for whom there is less risk from hospital-acquired Covid-19 infection, would have most impact on lives and life-years lost.

These findings suggest that strategies that prioritise referral of particular groups of patients would achieve the best outcomes and limit the number of additional deaths from cancer linked to the pandemic. However, the estimates are based on limited evidence and results need to be interpreted with caution when considering strategies for specific groups.

Researchers estimated that across all 20 cancer types, a uniform per-patient delay of one month in diagnosis just via the urgent referral pathway would result in 1,412 lives lost and 25,812 life-years lost if these disruptions lasted a full year - while a six-month delay would result in 9,280 lives and 173,540 life-years lost.

In addition to delays in diagnosing and treating cancer, crucial cancer research aiming to find new treatments for patients has also been disrupted. The ICR, a charity and research institute, has launched a major fundraising appeal to kick-start its research and make up for the time lost to the coronavirus crisis.

Study leader Professor Clare Turnbull, Professor of Cancer Genomics at The Institute of Cancer Research, London, said:

"We have shown that delays in presenting to GPs with symptoms and subsequently to accessing diagnostic tests could cause more than a thousand additional deaths if sufficient extra capacity isn't provided promptly to deal with the backlog.

"It's vital that we do everything we can to ensure cancer patients are not left further behind by the disruptions to care caused by the Covid-19 pandemic. That means ramping up capacity as quickly as possible to allow cancer diagnostic services to clear the backlog. Our data indicate prioritisation of particular patient groups may be effective in mitigating the extent of excess deaths and lost life years."

Professor Paul Workman, Chief Executive of The Institute of Cancer Research, London, said:

"It has become clear that the Covid-19 pandemic is taking a heavy toll on people with cancer - by delaying their diagnosis, disrupting access to surgery and other aspects of care, and pausing vital research into new treatments.

"We know that cases of cancer have remained hidden during the pandemic, because patients have missed out on GP referral and access to diagnostics, and this study reveals the likely impact on survival rates.

"It adds to evidence of the vital need to get cancer diagnostic and treatment services fully back up and running, and research programmes moving forward once again, to minimise the impact on patients today and in the future."

Credit: 
Institute of Cancer Research

New cosmic magnetic field structures discovered in galaxy NGC 4217

Spiral galaxies such as our Milky Way can have sprawling magnetic fields. There are various theories about their formation, but so far the process is not well understood. An international research team has now analysed the magnetic field of the Milky Way-like galaxy NGC 4217 in detail on the basis of radio astronomical observations and has discovered as yet unknown magnetic field structures. The data suggest that star formation and star explosions, so-called supernovae, are responsible for the visible structures.

The team led by Dr. Yelena Stein from Ruhr-Universität Bochum, the Centre de Données astronomiques de Strasbourg and the Max Planck Institute for Radio Astronomy in Bonn together with US-American and Canadian colleagues, published their report in the journal Astronomy and Astrophysics, released online on 21 July 2020.

The analysed data had been compiled in the project "Continuum Halos in Nearby Galaxies", in which radio waves were used to measure 35 galaxies. "Galaxy NGC 4217 is of particular interest to us," explains Yelena Stein, who began the study at the Chair of Astronomy at Ruhr-Universität Bochum under Professor Ralf-Jürgen Dettmar and who currently works at the Centre de Données astronomiques de Strasbourg. NGC 4217 is similar to the Milky Way and, at only about 67 million light years away in the constellation Ursa Major, is relatively close to it. The researchers therefore hope to successfully transfer some of their findings to our home galaxy.

Magnetic fields and origins of star formation

When evaluating the data from NGC 4217, the researchers found several remarkable structures. The galaxy has an X-shaped magnetic field structure, which has also been observed in other galaxies, extending far outwards from the galactic disk, to more than 20,000 light years.

In addition to the X-shape, the team found a helix structure and two large bubble structures, also called superbubbles. The latter originate from places where many massive stars explode as supernovae, but also where stars are formed that emit stellar winds in the process. Researchers therefore suspect a connection between these phenomena.

"It is fascinating that we discover unexpected phenomena in every galaxy whenever we use radio polarisation measurements," points out Dr. Rainer Beck from the MPI for Radio Astronomy in Bonn, one of the authors of the study. "Here in NGC 4217, it is huge magnetic gas bubbles and a helix magnetic field that spirals upwards into the galaxy's halo."

The analysis moreover revealed large loop structures in the magnetic fields along the entire galaxy. "This has never been observed before," explains Yelena Stein. "We suspect that the structures are caused by star formation, because at these points matter is ejected outward."

Image shows magnetic field structures

For their analysis, the researchers combined different methods that enabled them to visualise the ordered and chaotic magnetic fields of the galaxy both along the line of sight and perpendicular to it. The result was a comprehensive image of the structures.
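
As general context for what such polarisation measurements involve, the plane-of-sky field orientation and the degree of order are commonly derived from Stokes I, Q and U maps; the sketch below is a generic illustration with assumed map values, not the team's actual pipeline, and for synchrotron emission the magnetic-field orientation is rotated 90 degrees from the electric-vector angle computed here.

```python
import numpy as np

def polarisation(stokes_i, stokes_q, stokes_u):
    """Electric-vector position angle (degrees) and polarised fraction."""
    angle = 0.5 * np.degrees(np.arctan2(stokes_u, stokes_q))
    fraction = np.hypot(stokes_q, stokes_u) / stokes_i
    return angle, fraction

# Example with arbitrary pixel values from a radio map (assumed units).
print(polarisation(1.0, 0.02, 0.05))   # ~(34 deg, 0.05): a moderately ordered field
```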

To optimise the results, Yelena Stein combined the radio data with an image of NGC 4217 taken in the visible light range. The image is available for download on the website. "Visualising the data was important to me," stresses Stein. "Because when you think about galaxies, magnetic fields are not the first thing that comes to mind, although they can be gigantic and display unique structures. The image is supposed to shift the magnetic fields more into focus."

Credit: 
Ruhr-University Bochum