Tech

Breaking records like baking bread

image: A new technique called thermal interdiffusion alloying can help to improve electronics and photonics devices while reducing carbon emissions and improving health and safety.

Image: 
© 2021 KAUST; Heno Hwang

Alloying, the process of mixing metals in different ratios, has been a known method for creating materials with enhanced properties for thousands of years, ever since copper and tin were combined to form the much harder bronze. Despite its age, this technology remains at the heart of modern electronics and optics industries. Semiconducting alloys, for instance, can be engineered to optimize a device's electrical, mechanical and optical properties.

Alloys of oxygen with group III elements, such as aluminum, gallium, and indium, are important semiconductor materials with vast applications in high-power electronics, solar-blind photodetectors and transparent devices. The defining property of a semiconductor is its bandgap, a barrier over which only electrons with the required energy can pass. Beta-phase aluminum gallium oxides are notable because of their relatively large bandgap, but most III-O alloys are expensive to make and of unsatisfactory quality.

Che-Hao Liao and co-workers in Xiaohang Li's group have invented a technique similar to bread baking to create high-quality aluminum gallium oxides in a common furnace. "We have demonstrated a simple and efficient method called thermal interdiffusion alloying (TIA) to achieve high-quality thin films while also being able to control the composition with the temperature and time," says Liao.

Liao and the team started by making the "dough": common gallium oxide templates on sapphire substrates. They then heated the samples in the furnace to temperatures between 1,000 and 1,500 degrees Celsius. In bread baking, the heating process hardens the gluten and solidifies the dough. In their study, heating causes aluminum atoms to diffuse slowly from the sapphire into the gallium oxide and gallium atoms to move in the opposite direction, mixing to create an aluminum gallium oxide alloy. Higher temperatures and a longer process result in more interdiffusion, producing alloys with higher aluminum composition.
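
The release gives no equations, but the temperature and time dependence Liao describes matches the standard textbook picture of solid-state interdiffusion; the relation below is that generic picture, not a formula from the paper.

```latex
L \approx \sqrt{D t}, \qquad D = D_0 \exp\!\left(-\frac{E_a}{k_B T}\right)
```

Here L is the characteristic interdiffusion depth after annealing time t, D is the diffusion coefficient with material-dependent prefactor D_0, E_a is the activation energy, k_B is Boltzmann's constant and T is the absolute temperature. Because D grows exponentially with temperature, both hotter and longer anneals drive more aluminum into the film, consistent with the composition control the team reports.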

The choice of annealing temperature and time enabled fine control over the aluminum composition, which could be varied between 0 and 81 percent, a record high for this alloy, corresponding to a very wide bandgap range. "We have demonstrated that TIA is an excellent new technique to control the essential thin film properties, including the composition, bandgap and thickness," explains Liao.

The KAUST researchers see TIA as a pathway to improved electronics and photonics devices that can help address challenges such as reducing carbon emissions and improving health and safety. "We are now applying TIA for next-generation device research," says Liao.

Credit: 
King Abdullah University of Science & Technology (KAUST)

New early warning system for self-driving cars

A team of researchers at the Technical University of Munich (TUM) has developed a new early warning system for vehicles that uses artificial intelligence to learn from thousands of real traffic situations. A study of the system was carried out in cooperation with the BMW Group. The results show that, if used in today's self-driving vehicles, it can warn of potentially critical situations that the cars cannot handle alone seven seconds in advance - with over 85% accuracy.

To make self-driving cars safe in the future, development efforts often rely on sophisticated models aimed at giving cars the ability to analyze the behavior of all traffic participants. But what happens if the models are not yet capable of handling some complex or unforeseen situations?

A team working with Prof. Eckehard Steinbach, who holds the Chair of Media Technology and is a member of the Board of Directors of the Munich School of Robotics and Machine Intelligence (MSRM) at TUM, is taking a new approach. Thanks to artificial intelligence (AI), their system can learn from past situations where self-driving test vehicles were pushed to their limits in real-world road traffic. Those are situations where a human driver takes over - either because the car signals the need for intervention or because the driver decides to intervene for safety reasons.

Pattern recognition through RNN

The technology uses sensors and cameras to capture surrounding conditions and records status data for the vehicle, such as the steering wheel angle, road conditions, weather, visibility and speed. The AI system, based on a recurrent neural network (RNN), learns to recognize patterns in the data. If the system spots a pattern in a new driving situation that the control system was unable to handle in the past, the driver is warned in advance of a possible critical situation.
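
TUM has not published its implementation here, but the setup described maps naturally onto a small sequence classifier. Below is a minimal, hypothetical sketch in PyTorch; the feature count, window length and sampling rate are illustrative assumptions, not values from the study.

```python
# Hypothetical sketch, not TUM's implementation: a recurrent network that
# flags upcoming takeover situations from a window of vehicle status data.
import torch
import torch.nn as nn

class TakeoverWarning(nn.Module):
    def __init__(self, n_features: int = 6, hidden: int = 64):
        super().__init__()
        # GRU reads a sequence of status vectors (steering angle, speed, ...)
        self.rnn = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, time_steps, n_features)
        _, h = self.rnn(x)                       # final hidden state
        return torch.sigmoid(self.head(h[-1]))  # risk score in [0, 1]

# Toy usage: 8 windows of 70 time steps (e.g. 7 s at an assumed 10 Hz),
# each step holding 6 status features; warn if the score crosses a threshold.
model = TakeoverWarning()
risk = model(torch.randn(8, 70, 6))
print(risk.shape)  # torch.Size([8, 1])
```

Training such a model would use the fleet's recorded takeover events as positive examples, which is exactly the data-collection loop the article goes on to describe.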

"To make vehicles more autonomous, many existing methods study what the cars now understand about traffic and then try to improve the models used by them. The big advantage of our technology: we completely ignore what the car thinks. Instead we limit ourselves to the data based on what actually happens and look for patterns," says Steinbach. "In this way, the AI discovers potentially critical situations that models may not be capable of recognizing, or have yet to discover. Our system therefore offers a safety function that knows when and where the cars have weaknesses."

Warnings up to seven seconds in advance

The team of researchers tested the technology with the BMW Group and its autonomous development vehicles on public roads and analyzed around 2500 situations where the driver had to intervene. The study showed that the AI is already capable of predicting potentially critical situations with better than 85 percent accuracy - up to seven seconds before they occur.

Collecting data with no extra effort

For the technology to function, large quantities of data are needed. After all, the AI can only recognize and predict experiences at the limits of the system if it has seen such situations before. With the large number of development vehicles on the road, the data practically generates itself, says Christopher Kuhn, one of the authors of the study: "Every time a potentially critical situation comes up on a test drive, we end up with a new training example." The central storage of the data makes it possible for every vehicle to learn from all of the data recorded across the entire fleet.

Credit: 
Technical University of Munich (TUM)

Groundwater discharge affects water quality in coastal waters

image: Groundwater discharge can drive nitrogen in coastal waters. In the picture, researchers are walking on groundwater discharging to the ocean at low tide.

Image: 
Isaac Santos

Water quality management in the ocean often targets visible pollution sources such as sewage, rivers or ships. A new global study, led by researchers at the University of Gothenburg, reveals that invisible groundwater discharges may be just as important in driving nitrogen into coastal waters.

As we enter the United Nations' Decade of the Oceans, a new research study sheds light on an often overlooked source of impact on coastal ecosystems.

The study, which examined groundwater discharges at more than 200 locations worldwide, showed that groundwater is the major source of nitrogen and phosphorus to the ocean at many locations, including some areas in the Baltic Sea.

"Groundwater is essentially invisible and difficult to investigate. That is why coastal water quality managers often overlook groundwater discharges to the oceans," says Isaac Santos, professor in marine chemistry at the University of Gothenburg, who led the study in collaboration with thirteen worldwide universities.

"Nitrogen pollution is a major threat to marine biodiversity and a worldwide concern. Surprisingly, our global analysis revealed that groundwater nitrogen discharge exceeds river nitrogen discharge at 60 percent of the sites where both sources have been quantified."

Groundwater accumulates nitrogen from fertilisers used on crops, and may take decades to release this nitrogen to the ocean. When the nitrogen reaches the ocean, it increases algal biomass and decreases marine biodiversity and eventually fisheries.

Many lakes and rivers are connected to groundwater aquifers, geological formations that store groundwater. This high connectivity has prompted legislation to protect those groundwater-dependent ecosystems at the national and European level.

"However, this study shows that the coastal ocean is also highly connected to aquifers, so we need to consider groundwater aquifers as well when managing coastal water quality. For example, the Baltic Sea and many other coastal areas have suffered from nitrogen pollution for decades," says Stefano Bonaglia, a marine chemist at the University of Gothenburg who also participated in the study

They both emphasise that the management of groundwater discharges to the coastal ocean is challenging and may require decades of work. At the University of Gothenburg, marine researchers will continue to investigate submarine groundwater discharge through a number of international research projects.

"Climate change, sea level rise and land use change will modify the chemistry of coastal aquifers, and we are now trying to understand how this will have long term impacts on submarine groundwater discharge", says Isaac Santos.

Credit: 
University of Gothenburg

Water splitting for solar energy conversion

image: Pt-modified BaTaO2N photocatalysts

Image: 
Cited from Wang, Z., Luo, Y., Hisatomi, T. et al. Sequential cocatalyst decoration on BaTaO2N towards highly-active Z-scheme water splitting. Nat Commun 12, 1005 (2021). Copyright © 2021, The Authors.

To enable large-scale hydrogen production using solar energy, particulate photocatalysts are being researched as a simple and cost-effective route to splitting water into hydrogen and oxygen. This requires a photocatalyst that can efficiently use visible light, which accounts for a large part of solar energy, in the water-splitting reaction. Barium tantalum oxynitride (BaTaO2N) is an oxynitride semiconductor that absorbs visible light at wavelengths up to 650 nm and has a band structure capable of splitting water into hydrogen and oxygen. Until now, however, it had not been possible to load BaTaO2N particles with well-adhered, highly dispersed cocatalyst nanoparticles, which serve as the reaction's active sites.

In this study, led by the Research Initiative for Supra-Materials of Shinshu University, cocatalyst fine particles were found to be highly dispersed on the surface of single-crystal BaTaO2N particles synthesized by the flux method when the impregnation-reduction method and the photodeposition method were applied sequentially (Fig. 1). As a result, the efficiency of the hydrogen evolution reaction using the BaTaO2N photocatalyst improved to nearly 100 times that of the conventional material, and the efficiency of the two-step excitation (Z-scheme) water-splitting reaction in combination with an oxygen evolution photocatalyst also improved. Transient absorption spectroscopy revealed that the Pt cocatalyst particles loaded by the new method are less likely to induce recombination of electrons and holes because they efficiently extract electrons from the BaTaO2N photocatalyst (Fig. 2).

Loading a small amount of Pt cocatalyst by the impregnation-reduction method in advance promotes the reduction reaction on the photocatalyst without agglomeration of the Pt particles. Subsequent photodeposition then distributes additional Pt cocatalyst particles evenly over the BaTaO2N particles. As a result, the extraction of electrons by the Pt cocatalyst particles is thought to proceed efficiently.

The study also confirmed that using BaTaO2N synthesized with an appropriate flux, which has a low density of defects, is important for supporting a highly dispersed Pt cocatalyst. This work dramatically improved the activity of the BaTaO2N photocatalyst and clarified the underlying mechanism. The results are expected to lead to the development of long-wavelength-responsive photocatalysts that drive the water-splitting reaction with high efficiency.

Credit: 
Shinshu University

Sussex scientists develop ultra-thin terahertz source

image: Ultrafast lasers at the University of Sussex EPic Lab are an essential ingredient to realise ultra-thin THz sources

Image: 
EPic Lab, University of Sussex

Physicists from the University of Sussex have developed an extremely thin, large-area semiconductor surface source of terahertz radiation, composed of just a few atomic layers and compatible with existing electronic platforms.

Terahertz sources emit brief light pulses oscillating trillions of times per second. At this scale, they are too fast to be handled by standard electronics and, until recently, were too slow to be handled by optical technologies. This has great significance for the evolution of ultra-fast communication devices above the 300 GHz limit - such as those required for 6G mobile phone technology - something that is still fundamentally beyond the reach of current electronics.

Researchers in the Emergent Photonics (EPic) Lab at Sussex, led by its director, Professor Marco Peccianti, are leaders in surface terahertz emission technology, having achieved the brightest and thinnest surface semiconductor sources demonstrated so far. The emission region of their new semiconductor terahertz source is 10 times thinner than previously achieved, with comparable or even better performance.

The thin layers can be placed on top of existing objects and devices, meaning a terahertz source could sit in places that would have been inconceivable otherwise, including everyday objects such as a teapot or even a work of art - opening up huge potential for anti-counterfeiting and the 'internet of things' - as well as previously incompatible electronics, such as a next-generation mobile phone.

Dr Juan S. Totero Gongora, Leverhulme Early Career Fellow at the University of Sussex, said:

"From a physics perspective, our results provide a long-sought answer that dates back to the first demonstration of terahertz sources based on two-colour lasers.

"Semiconductors are widely used in electronic technologies but have remained mostly out of reach for this type of terahertz generation mechanism. Our findings therefore open up a wide range of exciting opportunities for terahertz technologies."

Dr Luke Peters, Research Fellow of the European Research Council project TIMING at the University of Sussex, said:

"The idea of placing terahertz sources in inaccessible places has great scientific appeal but in practice is very challenging. Terahertz radiation can have a superlative role in material science, life science and security. Nevertheless, it is still alien to most of the existing technology, including devices that talk to everyday objects as part of the rapidly expanding 'internet of things'.

" This result is a milestone in our route to bring terahertz functions closer to our everyday lives."

Lying between microwaves and infrared in the electromagnetic spectrum, terahertz waves are a form of radiation highly sought in research and industry. They have a natural ability to reveal the material composition of an object by easily penetrating common materials like paper, clothes and plastic in the same way X-rays do, but without being harmful.

Terahertz imaging makes it possible to 'see' the molecular composition of objects and distinguish between different materials. Previous developments from Prof Peccianti's team showcased the potential applications of terahertz cameras, which could be transformative in airport security, and medical scanners - such as those used to detect skin cancers.

One of the biggest challenges faced by scientists working in terahertz technology is that what is commonly accepted as an 'intense terahertz source' is faint and bulky when compared with, for example, a light bulb. In many cases, the need for very exotic materials, such as nonlinear crystals, makes them unwieldy and expensive. This requirement poses logistical challenges for integration with other technologies, such as sensors and ultrafast communications.

The Sussex team have overcome these limitations by developing terahertz sources from extremely thin materials (about 25 atomic layers). By illuminating an electronic-grade semiconductor with two types of laser light, each oscillating at a different frequency, or colour, they were able to elicit the emission of short bursts of terahertz radiation.

This scientific breakthrough has been long-sought by scientists working in the field since the first demonstration of terahertz sources based on two-colour lasers in the early 2000s. Two-colour terahertz sources based on special mixtures of gas, such as nitrogen, argon or krypton, are among the best performing sources available today. Semiconductors, widely used in electronic technologies, have remained mostly out of reach for this type of terahertz generation mechanism.

Credit: 
University of Sussex

Screams of 'joy' sound like 'fear' when heard out of context

People are adept at discerning most of the different emotions that underlie screams, such as anger, frustration, pain, surprise or fear, finds a new study by psychologists at Emory University. Screams of happiness, however, are more often interpreted as fear when heard without any additional context, the results show.

PeerJ published the research, the first in-depth look at the human ability to decode the range of emotions tied to the acoustic cues of screams.

"To a large extent, the study participants were quite good at judging the original context of a scream, simply by listening to it through headphones without any visual cues," says Harold Gouzoules, Emory professor of psychology and senior author of the study. "But when participants listened to screams of excited happiness they tended to judge the emotion as fear. That's an interesting, surprising finding."

First author of the study is Jonathan Engelberg, an Emory Ph.D. student of psychology. Emory alum Jay Schwartz, who is now on the faculty of Western Oregon University, is co-author.

The acoustic features that seem to communicate fear are also present in excited, happy screams, the researchers note. "In fact, people pay good money to ride roller coasters, where their screams no doubt reflect a blend of those two emotions," Gouzoules says.

He adds that the bias towards interpreting both of these categories as fear likely has deep, evolutionary roots.

"The first animal screams were probably in response to an attack by a predator," he says. "In some cases, a sudden, loud high-pitched sound might startle a predator and allow the prey to escape. It's an essential, core response. So mistaking a happy scream for a fearful one could be an ancestral carryover bias. If it's a close call, you're going to err on the side of fear."

The findings may even provide a clue to the age-old question of why young children often scream while playing.

"Nobody has really studied why young children tend to scream frequently, even when they are happily playing, but every parent knows that they do," Gouzoules says. "It's a fascinating phenomenon."

While screams can convey strong emotions, they are not ideal as individual identifiers, since they lack the more distinctive and consistent acoustic parameters of an individual's speaking voice.

"It's just speculative, but it may be that when children scream with excitement as they play, it serves the evolutionary role of familiarizing a parent to the unique sound of their screams," Gouzoules says. "The more you hear your child scream in a safe, happy context, the better able you are to identify a scream as belonging to your child, so you will know to respond when you hear it."

Gouzoules first began researching the screams of non-human primates decades ago. Most animals scream only in response to a predator, although some monkeys and apes also use screams to recruit support when they are in a fight with other group members. "Their kin and friends will come to help, even if some distance away, when they can recognize the vocalizer," he says.

In more recent years, Gouzoules has turned to researching human screams, which occur in a much broader context than those of animals. His lab has collected screams from Hollywood movies, TV shows and YouTube videos. They include classic performances by "scream queens" like Jamie Lee Curtis, along with the screams of non-actors reacting to actual events, such as a woman shrieking in fear as aftershocks from a meteor that exploded over Russia shake a building, or a little girl's squeal of delight as she opens a Christmas present.

In previous work, the lab has quantified tone, pitch and frequency for screams from a range of emotions: anger, frustration, pain, surprise, fear and happiness.

For the current paper, the researchers wanted to test the ability of listeners to decode the emotion underlying a scream based solely on its sound. A total of 182 participants listened through headphones to 30 screams from movies, each associated with one of the six emotions. All of the screams were presented six times, although never in sequence. After hearing a scream, the listeners rated how likely it was to be associated with each of the six emotions, on a scale of one to five.
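
The paper's analysis is not reproduced here, but the rating design lends itself to a simple aggregation: average each emotion's one-to-five ratings by the scream's true context. The sketch below uses made-up random data purely to illustrate the shape of such an analysis; the column names and per-listener counts are assumptions, not the study's data.

```python
# Illustration only: aggregating 1-5 emotion ratings into a mean-rating
# matrix, with random data standing in for the study's real responses.
import numpy as np
import pandas as pd

emotions = ["anger", "frustration", "pain", "surprise", "fear", "happiness"]
rng = np.random.default_rng(0)

# One row per (listener, scream context, rated emotion). For brevity each
# listener hears one scream per context here; the study used 30 screams.
rows = [
    {"listener": i, "true_emotion": true_e,
     "rated_emotion": rated_e, "rating": int(rng.integers(1, 6))}
    for i in range(182)
    for true_e in emotions
    for rated_e in emotions
]
df = pd.DataFrame(rows)

# Mean rating of each emotion, grouped by the scream's true context.
# In the study, screams of happiness scored high in the "fear" column.
matrix = df.pivot_table(index="true_emotion", columns="rated_emotion",
                        values="rating", aggfunc="mean")
print(matrix.round(2))
```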

The results showed that the participants most often matched a scream to its correct emotional context, except in the case of screams of happiness, which participants more often rated highly for fear.

"Our work intertwines language and non-verbal communication in a way that hasn't been done in the past," Gouzoules says.

Some aspects of non-verbal vocal communication are thought to be precursors for language. The researchers hypothesize that it may be that the cognitive underpinnings for language also built human capacity in the non-verbal domain. "It's probably language that gives us this ability to take a non-verbal vocalization and discern a wide range of meanings, depending on the acoustic cues," Gouzoules says.

Credit: 
Emory Health Sciences

Environmental antimicrobial resistance driven by poorly managed urban wastewater

Researchers from Newcastle University, UK, working with colleagues at King Mongkut's University of Technology Thonburi (KMUTT) in Thailand and the Institute of Urban Environment of the Chinese Academy of Sciences, analysed samples of water and sediment taken from aquaculture ponds and nearby canals at five locations in central Thailand's coastal region.

The research, which was part-funded by an institutional links grant awarded by the Newton Fund via the British Council, and which has been published in the Journal of Hazardous Materials, found that the highest prevalence of antimicrobial resistance (AMR) genes was in water from the Hua Krabue canal, originating in Bangkok. Faecal pollution markers were also high in these samples.

In comparison, they found a low number of AMR genes in all of the water and sediment samples collected from the aquaculture ponds.

Aquaculture is the fastest growing animal food production sector globally, and over 91% of global aquaculture is now produced in Asia. The worldwide increase in demand for farmed fish, shrimp and other shellfish has led to the widespread use of antibiotics in aquaculture, and there have been concerns that this is driving environmental AMR, threatening global food production systems.

In recent years, the Thai government has introduced measures aimed at tackling AMR in aquaculture including reducing the amount of antibiotics used in the industry and routinely monitoring antibiotic residues in aquaculture produce.

Dr David Werner, from Newcastle University, said: "We found no evidence that aquaculture is driving environmental AMR. In fact, the data suggests that small-scale aquaculture farmers are complying with Thai government One Health policies to reduce antimicrobial use in aquaculture.

"Wide and regular monitoring of environmental antibiotic resistance with high-throughput diagnostic tools can identify pollution hot-spots and sources to pinpoint the most effective countermeasures. This study provides a further line of evidence for the importance of safely managed sanitation for combatting antibiotic resistance. Currently only around half of total domestic wastewater in Thailand is treated, and our findings have identified an urgent need to improve urban sanitation in the country's coastal aquaculture region, for the protection of global food production systems."

The global spread of AMR is one of the greatest health threats to human, animal and environmental health. Without effective sanitation and adequate treatment of wastewater, bacteria can evolve quickly, increasing resistance to antibiotic medicines.

This has led to fears that so-called superbugs - bacteria that are resistant to all antibiotics - will compromise our ability to combat many new biological infections.

Reducing the spread of AMR is a World Health Organization (WHO) top five priority, and guidance published by the WHO in 2020 provides a framework for countries to create their own national action plans that suit their own particular regional setting. The guidance included contributions from Professor David Graham, also from Newcastle University, and reflects growing evidence, including research by Professor Graham, which suggests that the spread of AMR will not be solved by prudent antibiotic use alone and that environmental factors may be of equal or greater importance.

Professor Graham, who was also part of the team involved with this aquaculture study, said: "The only way we are going to win the fight against antibiotic resistance is to understand and act on all of the pathways that accelerate its spread. Although the types and drivers of resistance are diverse and vary by region and country, there are common roots to its spread - excess antibiotic use, pollution, poor water quality, and poor sanitation.

"This new work is crucial because it exemplifies how inadequate sanitation can affect the food supply, and may be among the strongest drivers of AMR spread."

The work in Thailand is just one example of how experts from Newcastle University are working with scientists from countries including China, Malaysia, India, Ethiopia, Tanzania, and Nepal to track down the sources of waterborne hazards in rivers and their associated food production systems. By working together to carry out comprehensive water quality assessments, they are helping to address the global health challenges of safe water, safe food, and controlling AMR and infectious disease.

Credit: 
Newcastle University

Materials scientists use frontal polymerization to mimic biology, reimagine manufacturing

image: An optical image of the surface shows ridges generated spontaneously during free-surface frontal polymerization of dicyclopentadiene. The samples are imaged under UV light to enhance visualization.

Image: 
Image courtesy the Autonomous Materials Systems Group, Beckman Institute

A simple plastic water bottle isn't so simple when it comes to the traditional manufacturing process. To appear in its final form, it has to go through a multi-step journey of synthetic procedure, casting, and molding. But what if materials scientists could tap into the same biological mechanisms that create the ridges on our fingertips or the spots on a cheetah in order to manufacture something like a water bottle?

A research paper titled "Spontaneous Patterning during Frontal Polymerization," published in the American Chemical Society's journal ACS Central Science, marks a tipping point in how materials scientists can use biological principles to manufacture materials in a more sustainable way.

Evan M. Lloyd, a Ph.D. graduate of the Department of Chemical and Biomolecular Engineering and Elizabeth Feinberg, a former postdoctoral research associate with the Department of Chemistry, led the project seeking answers on how to fabricate functionally useful patterns in ways inspired by developmental biology. While at the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign, they were both members of the Autonomous Materials Systems Group.

"When you build a house, you have to build every room in the house. But when you're making a body, nobody's putting the arms and the legs in the right place --it just happens," Feinberg said. "We wanted to know if we could do things more like nature, rather than how we typically do it ourselves."

In general, the complex patterns integral to the structure and function of biological materials arise spontaneously during morphogenesis, the biological process that causes a cell, tissue or organism to develop its shape. In sharp contrast, functional patterns in synthetic materials are typically created through multistep manufacturing processes, making it difficult to change how materials are patterned.

"It's very hard to get patterns into materials but throughout biology, we see patterns with a large number of uses, from mechanical performance to camouflage," Lloyd said. "We wanted to see if we could look at ways that patterns could emerge spontaneously."

Step one for the researchers was to use a relatively new manufacturing technique known as frontal polymerization, a reaction-thermal diffusion system that utilizes the diffusion of heat to promote chemical reactions. Under certain conditions, the chemical reaction produces regions with varying degrees of heat. The team took advantage of these properties to change polymer microstructure and mechanical properties, and subsequently fine-tuned the reactions based on the application of heat.
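
The article does not reproduce the team's model, but frontal polymerization is commonly described by a coupled thermo-chemical reaction-diffusion system of the following standard form (a generic sketch; the symbols are defined here, not taken from the paper):

```latex
\rho c_p \frac{\partial T}{\partial t} = \kappa \nabla^2 T + \rho H_r \frac{\partial \alpha}{\partial t},
\qquad
\frac{\partial \alpha}{\partial t} = A \, e^{-E/RT} \, (1-\alpha)^n
```

Here T is temperature, alpha the degree of cure, kappa the thermal conductivity, H_r the heat of reaction, and A, E and n Arrhenius kinetic parameters. Heat released by the reaction diffuses ahead of the front and triggers fresh reaction there, sustaining a self-propagating wave; under the right conditions, instabilities in this coupling are what imprint spontaneous patterns.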

"We can incorporate alternative chemistry that is thermally sensitive within that temperature difference regime to generate changes in color, morphology, mechanical property," Lloyd said. "It's really all about how we can translate changes in reaction temperature and see lasting material properties."

The result achieved in this paper is essentially the ability to vary the stiffness of materials, explained faculty author and Beckman Institute Director Jeff Moore, a Swanlund Endowed Chair of chemistry.

"The research was made possible by interdisciplinary collaboration. The team brought together chemists, materials scientists, and computational modelers," Moore said. "The diversity of thinking helped us realize a vision for polymer fabrication that may someday be important in manufacturing."

Nancy Sottos, Swanlund Endowed Chair and head of the Department of Materials Science and Engineering, and Philippe Geubelle, Bliss Professor of aerospace engineering, also collaborated on the paper.

Manufacturing materials created with this method could be implemented on a large scale within a few decades, Feinberg predicts, imagining these materials being used in mundane, everyday items like desks, or in massive products like windmills or airplanes. And because of the nature of these reactions, the approach is a major step towards sustainable manufacturing.

"When you cure a polymer, you have to thermally cure it at 100 degrees or more, and it has to bake for hours. That takes a lot of energy," Feinberg said.

When it comes to wind turbine blades or an airplane fuselage, massive amounts of energy must be applied to reach a suitable end result.

"But here, because the way frontal polymerization works, you apply energy to just one small spot and it propagates or releases the latent energy in the chemical precursor," she said. "It requires a lot less energy to get it going."

Additionally, by looking toward a future of multi-functional materials, it's also possible to reduce the number of single-use materials needed, Lloyd explained.

"The research is just the start of a more sustainable approach to manufacturing," Lloyd said. "As we become better as materials scientists, we want to start pushing towards more sustainable manufacturing processes. And so by eliminating the need for these multi step processes to get to this final form, we can improve the efficiency of manufacturing itself."

Credit: 
Beckman Institute for Advanced Science and Technology

Why are optical refractive indices so small?

image: Schematic illustration of the optical response of a dense atomic medium seen by traditional theories vs. the RG theory

Image: 
ICFO

The cover of Pink Floyd's Dark Side of the Moon, voted the greatest classic rock album of all time, was intended to portray the prism and the dispersion of light into a rainbow as a symbol of the band's celebrated light shows. Little did the band know that this image would be used by many to help illustrate the concept of refractive index and how light changes speed and direction when it encounters a different medium.

Although the drawing was not conceptually accurate, it conveyed the message that light changes its speed when it moves into another medium, and that the different speeds of different colors cause white light to disperse into its components. This change in speed is related to the refractive index, a unitless number representing the ratio of the speed of light in vacuum to the speed of light in a medium.
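
In symbols, the definition and the refraction law it feeds into are (standard optics, not specific to this study):

```latex
n = \frac{c}{v}, \qquad n_1 \sin\theta_1 = n_2 \sin\theta_2
```

Here c is the speed of light in vacuum and v its speed in the medium; water has an index of about 1.33 and common glass about 1.5. Because n varies slightly with wavelength, Snell's law (right) bends each color by a slightly different angle, which is what spreads white light into a rainbow at a prism face.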

In general, all materials with positive refractive indices have values close to 1 for visible light. Whether this is just a coincidence or reflects some deeper physics has never been explained.

Now, in a recent study published in Physical Review X and highlighted by the editors, ICFO researcher Francesco Andreoli and ICREA Professor Darrick Chang at ICFO, in collaboration with researchers from Princeton University, the University of Chicago and the Institut d'Optique, have investigated and explained why the refractive index of a dilute atomic gas can only reach a maximum value of 1.7, regardless of how high the density of atoms becomes.

This result is in contrast with conventional textbook theories, which predict that the more material there is, the larger the optical response and refractive index can be. The challenge in properly understanding the problem lies in the multiple scattering of light - all the complex paths that light can traverse inside a medium - and the resulting interference. This can cause each individual atom to see a local intensity of light that is very different from the intensity sent in, and which varies with the geometry of the atoms surrounding it. Instead of dealing with the complex microscopic details of this granularity, textbooks typically assume that the granularity and its effects on light can be smoothed out.

In contrast, the team makes use of a theory called the strong-disorder renormalization group (RG), which enables them to capture granularity and multiple-scattering effects in a simple way. This theory shows that the optical response of any given atom is disproportionately affected by its single nearest neighbor because of near-field interactions, which is why typical smoothing theories fail. The physical effect of the near-field interactions is to produce an inhomogeneous broadening of atomic resonance frequencies, where the amount of broadening grows with density. Thus, no matter how high the physical density of atoms is, incoming light of any frequency will only see about one near-resonant atom per cubic wavelength to efficiently scatter off, which limits the refractive index to its maximum value of 1.7.

More broadly, this study suggests that the RG theory could constitute a new versatile tool for understanding the challenging problem of multiple scattering of light in near-resonant disordered media, including in the nonlinear and quantum regimes. It also shows the promise of trying to understand the limits of refractive index of real materials, starting bottom-up from the individual atoms of which they are composed.

Credit: 
ICFO-The Institute of Photonic Sciences

Herpesvirus triggers cervical cancer affecting nearly 1 in 4 adult sea lions

image: California sea lion Moonunit was one of 258 sea lions examined in necropsy, or animal autopsy, for cancer at The Marine Mammal Center in Sausalito, CA, in a newly released study published in Animals as part of the Special Issue Oncogenic Viruses in Animals. The findings support the conclusion that genital herpesvirus, specifically otarine herpesvirus-1 (OtHV1), plays an integral role in sea lion urogenital carcinoma, and suggest there is an underlying trigger or event that causes the virus to induce cancer in some infected sea lions and not others.

Image: 
Credit © The Marine Mammal Center

Sausalito, Calif. (March 30, 2021) - After more than three decades of research, scientists have proven that the cancer affecting up to one in four adult California sea lions necropsied at The Marine Mammal Center in Sausalito, CA, is caused by a sexually transmitted herpesvirus. The cancer, known as sea lion urogenital carcinoma, has clear parallels to cervical cancer in humans and provides a helpful model for human cancer study.

Scientists have long suspected this cancer was associated with a virus, but this is the first study to prove this theory. The study, which was published in Animals, an open-access, peer-reviewed journal, concluded that genital herpesvirus is a driving factor in the development of sea lion urogenital carcinoma. The research also suggests there is an underlying trigger or event that causes the virus to induce cancer in some infected sea lions and not others. Wild California sea lions have among the highest prevalence of a single type of cancer in any mammal, including humans.

A second recently published paper led by the same team ("Sea lions are dying from a mysterious cancer," Los Angeles Times) showed that pollutants such as PCBs and DDT play a significant role as co-factors in the development of this cancer. This is particularly relevant to Southern California, where there is a large DDT dumpsite in the Southern California Bight, which is also where the majority of the sea lion population gathers each year to give birth and raise pups ("How the waters off Catalina became a DDT dumping ground," Los Angeles Times).

"The confirmation that this is a virally induced cancer combined with the knowledge that contaminants play a significant role in the cancer's development means that we can use these sea lions as a naturally occurring disease model to better understand how cancer develops and spreads in all species, including humans," says Dr. Alissa Deming, the lead author of the study who completed this work during her Ph.D. studies at University of Florida in Gainesville, FL., while she was a Research Fellow at The Marine Mammal Center in Sausalito, CA. (Dr. Deming is now Director of Clinical Medicine at the Pacific Marine Mammal Center in Laguna Beach, CA.)

The Marine Mammal Center is the world's largest marine mammal hospital and has been at the forefront of researching and understanding cancer in California sea lions and its connection to both ocean and human health. Since cancer in sea lions was first discovered in 1979, researchers have found that between 18 and 23 percent of adult sea lions admitted to the Center's hospital have died of the fatal disease. In 2010, the Center brought together an array of international researchers to form the Sea Lion Cancer Consortium to further investigate this disease, many of whom helped co-author the paper.

"This research is critical as these sea lions may hold the key to understanding virally induced cancers as well as how cancer metastasizes, or spreads through the body," says Dr. Pádraig Duignan, Director of Pathology at The Marine Mammal Center and a co-author on the study. "This knowledge is an important link that could help scientists better understand various cancers in people."

Most cancers are caused by an accumulation of several factors, making it challenging to study cancer in traditional laboratory models. However, wild sea lions experience multiple layers of stressors including infectious agents, exposure to pollutants, nutrition, and environmental influences, all of which are much more representative of how cancer develops in the "real world."

According to Duignan, "the cancer begins in the sea lion's genital tract and aggressively spreads throughout the sea lion's body, resulting in death, often from kidney failure." Because of the advanced state of cancer by the time these patients strand on beaches and are rescued by rehabilitation centers, euthanasia is the only humane option. "This cancer is devastating to see in California sea lions. They come to the hospital in end-stage disease," says Dr. Deming.

The paper was the result of an international, cross-discipline effort, combining multiple techniques from a variety of specialists to unlock the mysteries of this disease. The research relied on novel techniques using RNAscope® and BaseScope™ technology, tools that allow researchers to pinpoint high viral gene expression within tumor tissue but not in surrounding healthy tissue.

"Our study was the first time that this revolutionary technique has been used on a marine mammal species," says Dr. Kathleen Colegrove, Clinical Professor of the Zoological Pathology Program at the University of Illinois Urbana-Champaign, and a key researcher on the study. "This proved that the virus was integral to cancer development and was not just being detected in the reproductive tracts or tissue as a bystander."

Credit: 
The Marine Mammal Center

Forests on caffeine: coffee waste can boost forest recovery

image: Coffee pulp delivery (Day 1)

Image: 
Rebecca Cole

A new study finds that coffee pulp, a waste product of coffee production, can be used to speed up tropical forest recovery on post agricultural land. The findings are published in the British Ecological Society journal Ecological Solutions and Evidence.

In the study, researchers from ETH-Zurich and the University of Hawai`i spread 30 dump truck loads of coffee pulp on a 35 × 40m area of degraded land in Costa Rica and marked out a similar sized area without coffee pulp as a control.

"The results were dramatic" said Dr Rebecca Cole, lead author of the study. "The area treated with a thick layer of coffee pulp turned into a small forest in only two years while the control plot remained dominated by non-native pasture grasses."

After only two years the coffee pulp treated area had 80% canopy cover compared to 20% in the control area. The canopy in the coffee pulp area was also four times taller than that of the control area.

The addition of the half-metre-thick layer of coffee pulp eliminated the invasive pasture grasses that dominated the land. These grasses are often a barrier to forest succession, and their removal allowed native pioneer tree species, arriving as seeds through wind and animal dispersal, to recolonise the area quickly.

The researchers also found that after two years, nutrients including carbon, nitrogen and phosphorus were significantly elevated in the coffee pulp treated area compared to the control. This is a promising finding given that former tropical agricultural land is often highly degraded and that poor soil quality can delay forest succession for decades.

Dr Cole said: "This case study suggests that agricultural by-products can be used to speed up forest recovery on degraded tropical lands. In situations where processing these by-products incurs a cost to agricultural industries, using them for restoration to meet global reforestation objectives can represent a 'win-win' scenario."

As a widely available waste product that's high in nutrients, coffee pulp can be a cost-effective forest restoration strategy. Such strategies will be important if we are to achieve ambitious global objectives to restore large areas of forest, such as those agreed in the 2015 Paris Accords.

The study was conducted in Coto Brus county in southern Costa Rica on a former coffee farm that is being restored to forest for conservation. In the 1950s, the region underwent rapid deforestation and land conversion to coffee agriculture and pasture, with forest cover reduced to 25% by 2014.

In 2018, the researchers set out two areas of roughly 35 × 40m, spreading coffee pulp in a half-metre-thick layer on one area and leaving the other as a control.

The researchers analysed soil samples for nutrients immediately prior to the application of the coffee pulp and again two years later. They also recorded the species present, the size of woody stems and the percentage of forest ground cover, and used drones to record canopy cover.

Dr Cole warns that as a case study with two years of data, further research is needed to test the use of coffee pulp to aid forest restoration. "This study was done at only one large site so more testing is needed to see if this strategy works across a broader range of conditions. The measurements we share are only from the first two years. Longer-term monitoring would show how the coffee pulp affected soil and vegetation over time. Additional testing can also assess whether there are any undesirable effects from the coffee pulp application."

A limitation of using coffee pulp or other agricultural by-products is that their use is mostly limited to relatively flat and accessible areas where the material can be delivered and where the risk of the added nutrients washing into nearby watersheds can be managed.

On further research into the use of coffee pulp, Dr Cole said: "We would like to scale up the study by testing this method across a variety of degraded sites in the landscape. Also, this concept could be tested with other types of agricultural non-market products like orange husks.

"We hope our study is a jumping off point for other researchers and industries to take a look at how they might make their production more efficient by creating links to the global restoration movement."

Credit: 
British Ecological Society

How AI beats spreadsheets in modelling future volumes for city waste management

image: Growing cities tend to run out of land for waste management and new landfill sites. Machine learning can help city managers create more powerful long-term forecasts of solid waste volumes and landfill requirements, even with missing or inaccurate data, researchers from the University of Johannesburg have shown.

Image: 
Therese van Wyk, University of Johannesburg

Growing cities tend to run out of land for waste management and new landfill sites.

Artificial Intelligence can help city managers create more powerful long-term forecasts of solid waste volumes and landfill requirements, even with missing or inaccurate data.

UJ researchers found that a 10-neuron model produced the best 30-year forecast for municipal solid waste in a growing city.

All over the world, large cities are running out of space for municipal solid waste. Existing landfill sites are rapidly filling up and no-one wants a new site anywhere near their homes or businesses. Meanwhile, taxpayers aren't interested in higher costs for quality waste management either.

One way of significantly extending the working life of existing waste management sites is recycling. Recycling can also reduce unemployment and help to establish a circular economy or move towards zero waste.

But often, households are highly resistant to recycling, or recycling more.

A recent study shows how Artificial Intelligence (AI) can give city waste managers a more powerful way of forecasting landfill requirements for a city in the long term.

The researchers used machine learning to forecast municipal solid waste in a large African city. The forecast shows how much waste there will be in 30 years' time, if levels of recycling stay the same.

Dr Olusola Olaitan Ayeleru and Mr Lanrewaju Ibrahim Fajimi published their research in the Journal of Cleaner Production. Both are at the Department of Chemical Engineering at the University of Johannesburg.

Planning for waste with spreadsheets

Predicting when a city's landfill sites will run out of space is hard, even when lots of accurate information is available. However, conventional statistical forecasting using a spreadsheet may be good enough to plan 30 years ahead.

At the same time, spreadsheets with lots of manually adjusted formulas and macros are hard to understand. These can also be time-consuming and difficult to maintain.

But forecasting for different recycling scenarios may not be possible on spreadsheets. Taking population growth, types of waste, weather and other datasets into account in such a forecast may not be possible either.

Machine learning can beat macros

In developing countries, information about the waste generated in a city is often missing or inaccurate. Here, spreadsheets are unlikely to give city managers forecasts for long-term planning.

However, machine learning can "learn" by itself from the data that is available, and from more data added later.

Also, machine learning is better suited to take advantage of multiple datasets, in different formats.

A rapidly growing city

Johannesburg is the economic hub of South Africa and the biggest city in the country. It attracts people from other provinces and foreign nationals in search of jobs.

For this study, only the City of Johannesburg Metropolitan Municipality was included. This spans from Diepsloot and Midrand in the north to Ennerdale/Orange Farm in the south, and from Doornkop/Soweto in the west to Bruma in the east.

The neighbouring cities of Ekurhuleni, Tshwane, Mogale, Merafong, Rand West, Emfuleni, Midvaal and Lesedi were excluded from the study.

Between 1996 and 2001 the City of Johannesburg population grew from 2.59 million to 3.22 million. By 2011 the city's population was 4.43 million, according to the national census data.

The same year, 90% of an estimated 59 million tonnes of general waste produced in South Africa ended up in landfills, while 10% was recycled. Nationally, 12.9% of metropolitan households self-reported that they recycled, followed by 10.8% of households across urban areas.

For 2021 the city's population was forecasted at 5.3 million, according to its 2019/2020 Integrated Development Plan.

Just a few years left at landfill

The city currently operates four landfill sites.

In September 2020, the COO of Pikitup, the city's waste management company, told media that four and a half years of capacity were left at these sites.

In 2018, the city started a separation-at-source recycling programme. Plastic, paper, glass and cans as well as household-generated garden waste are recycled.

In February 2021, Pikitup announced a co-production programme with 48 companies. The goal is to increase waste picking, street cleaning, and recycling awareness and education in the city. Fifteen new Pikitup staff per ward will coordinate the programme.

Data plugged into AI

Ayeleru and Fajimi used machine learning to forecast Johannesburg's municipal solid waste 30 years into the future, using a standard notebook computer with an i7 processor.

The researchers used census data from 2011 indicating the population, the numbers of formally employed and unemployed people, and the number of family units. The data was supplied by StatsSA, the national statistics agency.

They combined this with data about total annual solid municipal waste at the city's four landfill sites, from 1996 to 2008. This data was supplied by the City of Johannesburg.

Different kinds of AI

In this study, Fajimi used two kinds of machine learning to generate 30-year forecasts of total solid waste generated in the city. Both algorithms are known for accurate predictions and consistency.

The first type is called artificial neural networks (ANNs). Fajimi used 5-, 10-, 20-, 30- and 40-neuron models to create five forecasts. This type of model can learn by itself. He created the models in MATLAB software, which has a robust ANN neural fitting toolbox.

The second type is called support vector machines (SVMs). He used linear, quadratic, cubic, fine gaussian, medium gaussian and coarse gaussian methods in MATLAB software to create another six forecasts.

The 10-neuron model produced the best ANN forecast. Among the SVMs, the linear model produced the best forecast.
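
The study itself was done in MATLAB; as a rough, hypothetical analogue, the scikit-learn sketch below compares the same two model families on synthetic data. The feature set and numbers are placeholders, not the study's inputs.

```python
# Rough Python analogue (not the study's MATLAB code): compare ANN models
# with different neuron counts and SVM kernels on synthetic forecast data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
X = rng.random((200, 4))   # stand-ins for population, employment, etc.
y = X @ np.array([1.0, 0.6, -0.3, 0.4]) + 0.05 * rng.standard_normal(200)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# ANN forecasts with the neuron counts the study tried.
for n in (5, 10, 20, 30, 40):
    ann = MLPRegressor(hidden_layer_sizes=(n,), max_iter=5000,
                       random_state=0).fit(X_tr, y_tr)
    print(f"ANN, {n:2d} neurons: R^2 = {r2_score(y_te, ann.predict(X_te)):.3f}")

# SVM forecasts with kernels analogous to MATLAB's linear/quadratic/cubic/
# gaussian options (the three gaussian variants differ only in kernel scale).
for name, svm in [("linear", SVR(kernel="linear")),
                  ("quadratic", SVR(kernel="poly", degree=2)),
                  ("cubic", SVR(kernel="poly", degree=3)),
                  ("gaussian", SVR(kernel="rbf"))]:
    svm.fit(X_tr, y_tr)
    print(f"SVM, {name}: R^2 = {r2_score(y_te, svm.predict(X_te)):.3f}")
```

In the study, the best-performing candidates were then used to project waste volumes out to 2050.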

The AI bottom line

The 10-neuron model forecasted that the population in the City of Johannesburg is likely to increase from 5.3 million in 2021 to 6.4 million in 2031; and to 8.4 million in 2050.

In contrast, the model didn't forecast the same increase in municipal solid waste. Instead, it forecasted an increase in total annual waste from 1.61 million tonnes in 2021 to 1.72 million tonnes in 2031; and to 1.95 million tonnes in 2050.

"One may expect that waste generation ought to increase as population increases, but this is also dependent on factors like low or high purchasing power or source of income," says Ayeleru.

"When citizens lose their source of income or the purchasing power is low, the amount of waste generated would be reduced since they would be doing cooking of food at home compared to buying ready-made food at restaurant, for example."

Next steps

In follow-up research, Ayeleru and Fajimi are investigating how to use AI to forecast the waste types and how much income the city could generate from each of those.

"The City of Johannesburg is currently doing much better in its waste management compared to other large cities on the continent. This AI forecast can help facilitate the city's design of future waste management infrastructure," says Ayeleru.

"In the short term, the first step the city can take is educating people, so they start recycling more. Secondly, the city may need to look beyond what they are doing at the moment to generate income from solid waste."

Credit: 
University of Johannesburg

Carried with the wind: mass migration of Larch Budmoth to the Russian High Arctic

video: Live Larch Budmoth walking on tundra, Vize Island, air temperature +3°C, 30.07.2020. Scientists believe that this moth arrived on the island two weeks earlier after travelling with the winds some 1200 km across the Arctic Ocean.

Image: 
Dr Maria Gavrilo

Arctic habitats have fascinated biologists for centuries. Their species-poor insect faunas, however, provide little reward for entomologists - scientists who study insects - to justify spending several weeks or even months in the hostile environments of tundra or polar deserts. As a result, data on insects from the High Arctic islands are often based on occasional collecting and remain scarce.

Vize Island, located in the northern part of the Kara Sea, is one of the least studied islands of the Russian High Arctic in terms of its biota. Scientists Dr Maria V. Gavrilo of the Arctic and Antarctic Research Institute in Russia and Dr Igor I. Chupin of the Institute of Systematics and Ecology of Animals in Russia visited this ice-free lowland island in the summer of 2020.

"Our expedition studied the ecology of Ivory Gull", Maria Gavrilo says, "but we also looked for other wildlife." Because of the lack of data, scientists appreciate any observation on insects they can get from the High Arctic.

On the island, the team found hundreds of small moths. They were identified by Dr Mikhail V. Kozlov of the University of Turku, Finland, as Larch Budmoths - the first and only terrestrial invertebrate to ever be observed and collected on Vize Island. Their observations are published in the open-access, peer-reviewed journal Nota Lepidopterologica.

The scientists first observed live and freshly dead moths on the sandy banks of a pond near the meteorological station. Then, they saw hundreds of them at the sandy bottom of a river valley with shallow streams. Moths, single or in groups, were mostly found at the water's edge, along with some fine floating debris. Despite extremely low daily temperatures (+2-5°C), flying moths were also spotted on several occasions.

The larvae of Larch Budmoth feed on the needles of different coniferous trees. Because Vize Island is located 1000 km north of the tree limit, the scientists can be sure about the migratory origin of the moths observed on Vize Island. They were likely transported there on 12-14 July 2020 by strong winds coming from the continent. The nearest potential source population of Larch Budmoth is located in the northern part of the Krasnoyarsk Region, which means they travelled at least 1200 km.

Importantly, some moths remained alive and active for at least 20 days after their arrival, which means that long-distance travel did not critically deplete resources stored in their bodies. The current changes in climate are making it easier for more southerly insects to invade species-poor areas in the High Arctic islands - provided they can reach them and survive there.

"The successful arrival of a large number of live moths from continental Siberian forests to Vize Island has once more demonstrated the absence of insurmountable barriers to initial colonisation of High Arctic islands by forest insects", concludes Mikhail Kozlov, who has studied Arctic insects for decades. "The Arctic islands will be colonised by forest insects as soon as changing environmental conditions allow the establishment of local populations."

Credit: 
Pensoft Publishers

Contact lenses poised to detect cancer, treat disease and replace digital screens

image: Contact Lens Technologies of the Future offers one of the most comprehensive reviews of advancements to come, catapulting the commonly used medical device to applications well beyond refractive error correction, including disease detection, treatment and digital screen replacement.

Image: 
Centre for Ocular Research & Education (CORE)

WATERLOO, Ontario, March 29, 2021--A newly published paper represents one of the most comprehensive reviews of advancements to come in contact lenses, catapulting the commonly used medical device to applications well beyond refractive error correction.

Contact Lens Technologies of the Future (Jones L, et al.) is now in press at Contact Lens and Anterior Eye, the peer-reviewed journal of the British Contact Lens Association (BCLA). It joins nine other papers appearing in next month's special edition as part of the BCLA-led Contact Lens Evidence-based Academic Reports (CLEAR) series.

"There are a range of diverse technologies that are shaping the future of contact lenses, in some cases already showing their potential in late-stage development initiatives and even commercially-available products," said Lyndon Jones, director of the Centre for Ocular Research & Education (CORE) and the paper's lead author. "Novel biomaterials, nanotechnology progress, unique optical designs, biosensing discoveries, antibacterial agents and even battery miniaturization and power transfer are coalescing like never before. The next several years will see incredible advancements and growth for an expanded contact lens category."

Dr. Jones is considered one of the world's foremost authorities on contact lens advancements. He has held Expertscape's top contact lens ranking for 2010-2021 and is currently the 13th most influential optometric researcher across all disciplines, as determined by OptomRankings.com.

The extensively referenced paper explores several areas in which innovations are anticipated to make an impact. The presence of biomarkers in the tear film will give rise to diagnostic contact lenses that help detect and monitor systemic and ocular diseases, including diabetes, cancer and dry eye disease. Progress in integrated circuits may enable in-lens intraocular pressure monitoring for glaucoma and even retinal vasculature imaging for early detection of diseases such as hypertension, stroke and diabetes.

Ocular disease treatment and management may likewise benefit from progress in fluid dynamics, materials science and microelectronics. Dehydration-resistant materials combined with electro-osmotic flow and reactive oxygen species-scavenging materials--when integrated into lenses--could offer alternative dry eye disease therapies. Liquid crystal cells could replicate the functionality of the pupil and iris arrangement, autonomously filtering incoming light to overcome physiological defects. Embedded, tunable spectral filtering has the potential to mitigate color vision deficiencies.

Drug-delivering contact lenses may offer more accurate dosing than traditional eye drops, increasing a drug's residence time on the ocular surface while reducing losses to blinking and non-productive conjunctival absorption, and with them many of the drugs' known side effects. Delivery might come from in vitro uptake and release, incorporation of drug-containing nanoparticles into contact lens materials during manufacturing, or even molecular imprinting to imbue polymers with memory characteristics that aid dispensation. These techniques and related advancements will open up opportunities for contact lenses in theranostics, the multidisciplinary medical field that combines therapeutics and diagnostics. Uniting sensing technology and microfabrication, theranostic lenses would release appropriate therapeutics based on continuous monitoring inputs, replacing more invasive procedures.
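
To make that sensing-and-release loop concrete, here is a deliberately toy sketch in Python. It is not from the paper: the biomarker, threshold and dose are hypothetical placeholders chosen only to show the control flow a theranostic lens implies.

    from dataclasses import dataclass

    @dataclass
    class TheranosticLens:
        """Toy sense-decide-release loop; all values are assumptions."""
        trigger_level: float = 0.35   # hypothetical biomarker threshold, mmol/L
        dose_ul: float = 0.02         # hypothetical dose per release, microliters

        def step(self, biomarker_mmol: float) -> float:
            """One sensing cycle: return the dose to release, if any."""
            return self.dose_ul if biomarker_mmol > self.trigger_level else 0.0

    lens = TheranosticLens()
    for reading in [0.20, 0.31, 0.42, 0.38, 0.25]:  # simulated sensor readings
        print(f"reading {reading:.2f} mmol/L -> release {lens.step(reading):.2f} uL")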

While "smart" lenses have become associated with on-eye head-up displays, the authors write that optical enhancements extend well beyond those manifestations. Customized optics could address aberrated eyes, with the front surface of a lens shaped to reduce measured aberrations based on each person's unique corneal shape. Embedded microelectronics might continuously monitor gaze direction, controlling optical elements to correct presbyopia in real time. Myopia control lenses are already slowing axial growth in children, responding to one of the most pressing issues in eye health today. And discoveries in optics and digital displays hold the potential to assist people with low vision, and could then extend to the general population to replace or supplement traditional screens.

The paper concludes with an overview of packaging and storage case material and design developments that may offer improved hygiene and reduced wearer-induced contamination.

Joining Dr. Jones as one of this paper's 14 authors is Chau-Minh Phan, a research assistant professor with CORE. Six other CORE scientists contributed to the BCLA CLEAR Reports series, serving as co-authors on papers regarding anatomy and physiology, complications, evidence-based practice, and the effect of lens materials and design.

Credit: 
McDougall Communications

Procedures identify Barrett's esophagus patients at risk for cancer progression

image: A combination of esophageal brushing and extensive genetic sequencing of the sample collected can detect chromosomal alterations in people with Barrett's esophagus, identifying patients at risk for progressing to esophageal cancer.

Image: 
Elizabeth Cook

A combination of esophageal brushing and extensive genetic sequencing of the sample collected can detect chromosomal alterations in people with Barrett's esophagus, identifying patients at risk for progressing to esophageal cancer, according to a new study by researchers at the Johns Hopkins Kimmel Cancer Center and Case Western Reserve University.

In Barrett's esophagus (BE), chronic acid reflux from the stomach damages the cells lining the lower esophagus, causing them to become more like cells of the lower digestive system. Cells in the lower esophagus progress through several precancerous stages before sometimes developing into esophageal adenocarcinoma, a cancer with a five-year survival rate below 20 percent. BE is the only known precursor to esophageal adenocarcinoma.

Clinicians can detect these progressive states in BE by looking for chromosomal alterations known as aneuploidy--a common feature in most cancer cells--but until now the process has involved multiple biopsies.

A single esophageal brushing paired with the sequencing technique called RealSeqS is sensitive and specific enough to identify aneuploidy at several stages of BE progression, and can even match specific types of aneuploidy with specific stages of the disease, the researchers say.

"Aneuploidy has long been implicated in the development, initiation or progression of esophageal cancer but the assays or experimental methods to detect this have not been as easily achieved or as high throughput as we would have wanted to allow for clinical implementation," says senior study author Chetan Bettegowda, M.D., Ph.D., Jennison and Novak Families Professor of Neurosurgery at the Kimmel Cancer Center.

The method could help clinicians identify people at higher risk of having BE progress to esophageal adenocarcinoma, "instituting more intensive follow-up for those patients or instituting treatments at earlier stages of the disease," Bettegowda says.

Results were published online Jan. 21 in the journal Gastroenterology.

Esophageal brushing uses a soft brush attachment for an endoscope to collect cells from the esophageal lining. The method can sample cells from a larger surface area of the esophagus than a usual biopsy. Endoscopy, often with brushing, "is part of the standard of care for people who have Barrett's," Bettegowda says, so incorporating the aneuploidy detection method would not substantially change clinical practice.

Only a few cells out of hundreds sampled through brushing may demonstrate aneuploidy, however, so combining brushing with massively parallel sequencing is key to finding these cells. The researchers used RealSeqS to look across more than 350,000 regions of the genome and identify specific chromosome arm alterations. RealSeqS can typically detect aneuploidy at the level of about 1 in 200 cells, which is necessary when large portions of the esophageal brushing are derived from normal cells, says study lead author Christopher Douville, the Kimmel Center researcher who designed RealSeqS.
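
The arm-level logic is easy to illustrate, although the sketch below is not the RealSeqS pipeline itself: it shows only the generic idea of collapsing region-level read counts into per-arm z-scores against aneuploidy-free controls, with an assumed flagging cutoff.

    import numpy as np

    def arm_zscores(read_counts, arm_labels, ref_mean, ref_std):
        """Collapse per-region read counts into per-arm z-scores.

        read_counts        : reads observed in each genomic region of one sample
        arm_labels         : chromosome arm (e.g. '8q') each region belongs to
        ref_mean, ref_std  : per-arm read-fraction statistics from normal controls
        """
        fractions = np.asarray(read_counts, dtype=float)
        fractions /= fractions.sum()              # normalize for sequencing depth
        scores = {}
        for arm in set(arm_labels):
            mask = np.array([label == arm for label in arm_labels])
            arm_fraction = fractions[mask].sum()
            scores[arm] = (arm_fraction - ref_mean[arm]) / ref_std[arm]
        return scores

    def is_aneuploid(scores, z_cutoff=5.0):       # cutoff is an assumption
        """Flag a sample if any arm deviates strongly from controls."""
        return any(abs(z) > z_cutoff for z in scores.values())

Detecting roughly one aneuploid cell in 200 then comes down to how tight the control distributions are, which is where averaging over several hundred thousand regions helps.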

"The method is quick and can process hundreds of samples cost effectively, making it an ideal choice for the BE research," says Douville.

The researchers obtained esophageal brushings from patients without BE, patients with BE and no abnormal cells, patients with BE and either low- or high-grade cell abnormalities, and patients with esophageal adenocarcinoma. Using samples from 79 patients, they trained their identification system to distinguish brushings from BE patients without cell abnormalities from brushings from patients with adenocarcinoma. The research team then looked at samples from 268 patients to test whether the method could distinguish different stages of BE progression.

Bettegowda, Douville and their colleagues identified a threshold of aneuploidy that distinguished patients with high-grade cell abnormalities and adenocarcinoma from BE patients with no cell abnormalities. At this threshold, the method identified high-risk patients in 232 cases (86.7%).

The method also identified specific chromosomal arm changes at each stage of BE progression toward adenocarcinoma. Some of these specific changes have already been linked to disease stages by previous research, "but they were not comprehensive enough to be used in some diagnostic or prognostic way," says Douville.

The researchers used these changes to develop an assessment tool they called BAD (Barrett's Aneuploidy Decision) for distinguishing stages of BE progression. The tool could be especially useful for identifying high-risk individuals when cell biopsies look benign, because it flags molecular changes linked to more aggressive disease, Bettegowda says.
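
The release does not spell out which chromosome arms and cutoffs BAD uses, so the sketch below is hypothetical in every particular; it only shows the shape such a decision rule could take, building on per-arm z-scores like those above.

    def bad_style_score(arm_z):
        """Toy BAD-like score: weighted sum of selected arm z-scores.
        The arms and signs here are invented, not the published tool."""
        watched = {"1q": +1, "8q": +1, "9p": -1, "17p": -1}
        return sum(sign * arm_z.get(arm, 0.0) for arm, sign in watched.items())

    def stage_call(score, low=4.0, high=12.0):    # assumed cutoffs
        if score > high:
            return "high risk: pattern resembling high-grade dysplasia or cancer"
        if score > low:
            return "intermediate risk: pattern resembling low-grade dysplasia"
        return "low risk: non-dysplastic BE pattern"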

"Aneuploidy is as universal a biomarker as we can find at the moment for cancer, so we are excited to see how we can employ this technology for a multitude of cancers," he says. "We are optimistic that given the versatility of the RealSeqS assay that we could study other disease processes, other cancer types and potentially find other ways to intervene earlier."

The work was supported by grants CA150964, CA163060, CA152756 and UH3 CA205105-03 from the National Institutes of Health; grants RA37 CA230400-01, U01CA230691 and Oncology Core CA 06973; a Burroughs Wellcome Career Award for Medical Scientists; Earlier Detection of Cancers Using Non-Plasma Liquid Biopsies; The Virginia and D.K. Ludwig Fund for Cancer Research; The Sol Goldman Sequencing Facility at Johns Hopkins; and The Conrad R. Hilton Foundation.

Credit: 
Johns Hopkins Medicine