Tech

Demonstration of unconventional transverse thermoelectric generation

image: (a) Schematic diagram showing the concept of Seebeck-driven transverse thermoelectric generation (STTG). The charge current induced by the Seebeck effect in the thermoelectric material generates a large thermoelectric voltage in the magnetic material in the direction perpendicular to a temperature gradient. (b) Transverse thermoelectric voltage as a function of the size ratio between the thermoelectric and magnetic materials. The solid curves were calculated using our phenomenological models describing the STTG and the square symbols were measured in the experiments. (c) Schematic of the sample structure. (d) Comparison between the transverse thermopower for plain Co2MnGa, in which only the anomalous Nernst effect (ANE) appears, and the Co2MnGa-Si hybrid structure, in which both STTG and ANE appear simultaneously.

Image: 
NIMS

A NIMS research team devised a new thermoelectric generation mechanism with a hybrid structure composed of thermoelectric and magnetic materials. The team then actually fabricated this structure and observed the record-high thermopower appearing in the direction perpendicular to a temperature gradient (i.e., transverse thermoelectric generation). These results may offer insights into new mechanisms and structural designs applicable to the development of versatile energy harvesting technologies and highly sensitive heat flux sensors.

The Seebeck effect is a phenomenon in which a temperature gradient across a metal or semiconductor is converted into a thermoelectric voltage. Because this effect can be used to convert waste heat into electrical energy, its potential applications (e.g., autonomous power sources for IoT devices) have been extensively studied. However, Seebeck-effect-driven thermoelectric generation has disadvantages: a thermopower is generated along the direction of a temperature gradient (i.e., longitudinal thermoelectric generation). Because of this parallel relationship, a thermoelectric material needs to be extended in the direction of a temperature gradient to create large temperature differences and a resultant large thermoelectric voltage. Furthermore, in conventional Seebeck devices, a complex structure composed of a serial connection of many pairs of two different thermoelectric materials is necessary to enhance the thermoelectric voltage. However, these arrangements increase production cost, make the material/structure less durable, and limit its practical applicability. In contrast, the anomalous Nernst effect, a thermoelectric phenomenon that occurs only in magnetic materials, can generate a thermoelectric voltage perpendicular to the direction of a temperature gradient. This effect may thus enable generation of a thermopower in a transverse direction, and the thermoelectric voltage can be enhanced simply by increasing the length of the material in the direction perpendicular to a temperature gradient. Transversely extended thermoelectric materials are expected to have significantly greater flexibility when integrated into modules and to offset the aforementioned disadvantages related to the Seebeck effect. However, the anomalous Nernst effect has been shown to generate only a very small thermopower (less than 10 μV/K near room temperature), making its practical application difficult.

In this research project, the research team devised and demonstrated a new thermoelectric generation mechanism in which a longitudinal thermopower induced by the Seebeck effect in a thermoelectric material can be converted into a transverse thermopower in a magnetic material via the anomalous Hall effect. The team then simulated this mechanism based on phenomenological model calculations and found it potentially capable of generating a very high thermopower, beyond 100 μV/K, perpendicular to the direction of a temperature gradient when materials and structures are optimized. To experimentally verify this result, the team fabricated a hybrid structure composed of Co2MnGa, a magnetic compound exhibiting a large anomalous Hall effect, and semiconducting Si, which exhibits a large Seebeck effect. This structure generated record-high positive and negative transverse thermopowers (+82 μV/K and -41 μV/K). The magnitude and sign of the measured thermopowers are well reproduced by the predictions of the model calculations. The thermoelectric generation capability of the composite can be further improved by material and structural optimization.
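
As a rough illustration of how such a hybrid can exceed the anomalous Nernst effect alone, the sketch below implements a simple series-circuit estimate of the transverse thermopower as a function of the size ratio between the two legs. This is our own back-of-the-envelope model with placeholder material parameters, not the team's published phenomenological calculation or their measured values.

```python
# Back-of-the-envelope series-circuit model of STTG (a simplification made for this
# article, not the team's published model). The Seebeck voltage of the thermoelectric
# (TE) leg drives a charge current through the magnetic (M) leg, and the anomalous Hall
# effect converts that current into a transverse field that adds to the anomalous Nernst
# contribution. All material parameters below are illustrative placeholders.

S_ANE = 6e-6                     # V/K, anomalous Nernst thermopower of the magnetic leg (assumed)
S_TE, S_M = 1.0e-3, 3.0e-5       # V/K, Seebeck coefficients of the TE and M legs (assumed)
rho_M, rho_TE = 1.0e-6, 1.0e-4   # Ohm*m, longitudinal resistivities (assumed)
rho_AH = 1.0e-7                  # Ohm*m, anomalous Hall resistivity of the magnetic leg (assumed)

def transverse_thermopower(r):
    """r = size (cross-section) ratio between the TE and M legs."""
    return S_ANE + rho_AH * (S_TE - S_M) / (rho_M + rho_TE / r)

for r in (1, 10, 100, 1000):
    print(f"size ratio {r:>4}: {transverse_thermopower(r) * 1e6:6.1f} uV/K")
# The transverse thermopower grows with the size ratio and saturates at
# S_ANE + (rho_AH / rho_M) * (S_TE - S_M), mirroring the trend sketched in panel (b).
```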

The thermopower observed in this project was more than 10 times larger than the previously recorded highest thermopower generated by the anomalous Nernst effect. This result is expected to significantly advance R&D efforts aiming to put transverse thermoelectric generation into practical use. In future studies, we plan to research and develop effective magnetic and thermoelectric materials, create composite structures using these materials, and optimize their structures. We will then use these hybrid materials to develop energy harvesting technologies capable of powering IoT devices and heat flux sensors that can be used for energy-saving purposes.

Credit: 
National Institute for Materials Science, Japan

Aging cells in abdominal fluid cause increased peritoneal dissemination of gastric cancer

image: A: Researchers investigated cellular senescence and the secretion of senescence-related proteins in the ascites of gastric cancer patients and found that CAFs secrete more senescence-related proteins, and promote cancer progression more, than other (cancer and blood) cells.
B: Senescent CAFs secrete more senescence-related proteins than non-aging CAFs.

Image: 
Dr. Takatsugu Ishimoto

Through an analysis of cellular components (cell fractions) from malignant ascites (fluid buildup in the abdomen caused by gastric cancer), a research collaboration based at Kumamoto University (Japan) has demonstrated that cellular senescence of cancer-associated fibroblasts (CAFs) plays an important role in the peritoneal dissemination of gastric cancer foci (cells different from surrounding cells). This understanding should enable the development of new treatments for cancer dissemination in the peritoneum by targeting cancer cells at focal sites and CAFs in patients with gastric cancer.

Peritoneal dissemination is a type of cancer metastasis in which cancer cells break through the walls of internal organs, scatter throughout the abdomen, and grow into the peritoneum. Highly malignant gastric cancers, such as scirrhous gastric cancer, are prone to peritoneal dissemination, which makes them very difficult to treat. Furthermore, the development of disseminated foci is known to be closely related to prognosis. Thus, controlling peritoneal dissemination is an important issue for improving gastric cancer treatment.

CAFs are fibroblasts that make up the cancer microenvironment, the fluids and tissues surrounding cancer cells, and are known to secrete a variety of factors involved in malignant transformation. In this study, researchers used CAFs derived from post-operative specimens of scirrhous gastric cancer patients to mimic the inflammatory environment and found that CAFs undergo cellular senescence due to the release of inflammatory substances (cytokines) from cancer cells and surrounding cells.

Senescent cells continue to secrete senescence-associated proteins that aid in cancer cell progression and are known to cause chronic inflammation. Thus, researchers focused on changes in the epigenome of CAFs as a mechanism for the chronic secretion of these proteins. A detailed genetic analysis revealed that CAFs that underwent cellular senescence showed characteristic changes in histone modifications resulting in the activation of genes that form cellular senescence-related substances. Senescent CAFs were found to continuously secrete senescence-related proteins that promote cancer progression. Furthermore, analysis of various cell fractions from malignant ascites from patients with scirrhous gastric cancer also revealed the presence of aging CAFs.

"Our study has revealed the significance of CAFs in ascites due to peritoneal dissemination of gastric cancer," said study leader, Dr. Takatsugu Ishimoto. "Based on our findings, a new therapeutic strategy for gastric cancer that targets cancer cells at peritoneal dissemination sites and senescent CAFs could be possible for preventing peritoneal dissemination."

Credit: 
Kumamoto University

Words conveyed with gesture

image: Prof. Przemysław Żywiczyński from the Institute of Linguistics at the Faculty of Humanities at the NCU in Toruń

Image: 
Andrzej Romański/NCU in Toruń

The question of the origin of language is one of the most important and, at the same time, one of the most difficult to solve. It was formulated in antiquity and has inspired religion and philosophy ever since, in some periods, above all the Enlightenment, becoming the axis of reflection on other fundamental issues, such as human nature. In the last few decades, research in this field has intensified, drawing on evolutionism and taking on an interdisciplinary character, involving linguists, psychologists, primatologists and neuroscientists. The study of language evolution is currently considered one of the most intensively developing fields of communication research, and publications in this field appear in the most important scientific journals in the world. - I once made a review of various world creation myths and in all of them there is a smaller or larger element of reflection on the origin of language and the diversification of languages, says Prof. Przemysław Żywiczyński from the Institute of Linguistics at the Faculty of Humanities at the Nicolaus Copernicus University in Toruń. - In the Western, Judeo-Christian tradition, we have the biblical story of how Adam named the animals, creating the first - Adamic - language, then the myth of the Tower of Babel. This indirectly shows that the problem of the origin of language has always been important to us, because it says something important about our humanity.

The modern study of the evolution of language is a scientific emanation of the need to answer the question of where language came from.

Language is a system of communication specific to humans. - While in relation to other species we can talk about complex communication - for instance, monkeys have a number of so-called calls and extensive gestural communication - we cannot talk about language in their case, explains Prof. Żywiczyński. - If we are to define language, we must remember that it is based on meaning. There is a form - vocal in the case of spoken languages, or a gesture or sign in the case of sign languages - and something distinct which that form means. This kind of relationship between form and content is basically absent in the world of animals.

The second thing that distinguishes language is its compositionality, which means that units of meaning can be combined and new meanings can be constructed. For instance, if we have the words "red" and "car", we don't always have to use them separately; we can combine them, and then there is a new content, i.e., a car that is red. This kind of ability doesn't actually occur in the animal world either.

What we recognise to be the beginning of language depends on how narrow a definition of language we want to use. If we refer to the origins of the relationship between the form and the meaning, most researchers, considering the size of the brain, the technological advances or the size of the groups, would say that around 2 million years ago there was a breakthrough in human evolution, when a new system of communication must have emerged. However, when it comes to language in the strict sense, some researchers date its origins to more than half a million years ago, when we had a common ancestor with the Neanderthal. Research into the genes responsible for language vocalisations, which our relative species also had, provides support here. This is the upper limit of the origin of language, while the lower limit is connected by most researchers with the origin of the human species, which occurred about 200 thousand years ago.

The researcher from Toruń points out that although animals do not have a language, they can obviously communicate. - The basic definition of communication comes from Krebs and Dawkins, who believed that the primary function of communication was manipulation, explains Prof. Żywiczyński. - Generally speaking, it is a behaviour of an organism A that causes changes in the behaviour of an organism B. This is how, for example, sexual selection works, or the interaction between the hunter and the prey it hunts. However, in the case of humans, we also have meanings, and communication is about making these meanings consistent - so that the sender of the message is able to make what is in his mind consistent with what is in the mind of the receiver of the message.

If you put people in experimental conditions and forbid them to use language, you can observe what means they will use to start communicating. Research unanimously shows that iconic strategies work best here. - If I say the word "key", there is nothing in this form itself that suggests what I am talking about, whereas if I draw a "key", we are dealing with iconicity - explains the researcher from the Nicolaus Copernicus University. - Drawing through similarity reflects the things we want to draw our attention to. If we talk about communication in interaction, when you can't always draw, gestures - or more broadly, what we call pantomime - is definitely a better strategy to start communicating than, for example, vocal communication, which is based on more arbitrary forms than gestural communication. It is easier to draw something with your hands than to do it with your voice.

The study of the origins of language and communication as a research project became possible when empirical data allowed us to go beyond speculation. Developments in ape communication research, neuroscience, psycholinguistics and genetics were all elements that, taken together, made it possible to talk about how language might have evolved. - In contemporary language evolution research, we mainly study detailed problems, e.g., which modality is best suited to establish communication, explains the linguist from Toruń. - We solve a research problem, which sometimes helps us to say something more general about the origin of language. This is how the various 'puzzles' are put together, but they will never form a perfect whole, because the study of the origin of language, and language itself, is a multi-layered process. We have talked about the brain and communication, but language also relates to social expression. What we say and how we say it is determined by our social relations, the group we live in, its size and even the natural conditions we live in. We have a very large catalogued description of languages - in fact, of almost all of the roughly 7,000 languages in existence today. The distribution of language features can be correlated with features of the environment in which their speakers live. For example, tonal languages are more common in humid climates than in dry ones, because humid air is better for the vocal cords. All this shows that the picture is quite complex. By looking at one thing, we don't see another, but we still know a little more at the end.

The latest issue of the scientific journal Philosophical Transactions of the Royal Society B, entitled "Reconstructing prehistoric languages", is devoted to the topic of the origins of language. Among the papers published in the journal is the article "Pantomimic fossils in modern human communication" by Prof. Żywiczyński and dr. hab. Sławomir Wacewicz, NCU Prof. from the Centre for Language Evolution Studies, written in collaboration with Dr Casey Lister from the University of Western Australia.

The authors present a number of arguments for the pantomimic origin of language. We usually associate pantomime with a theatrical genre or a game of charades. However, in the study of language evolution, the view that pantomime was an important link in the evolutionary development of language is gaining popularity. - A well-known and popular view says that it is likely that pantomime played an important role in the early stages of language formation, but in fact it is with us all the time, and what we have focused on are the contexts in which pantomime appears in the communication of people today, explains Prof. Żywiczyński. - It usually occurs when people cannot use a language, because they have been experimentally forbidden to do so, or they don't have a common language, or they are unable to use it.

The researchers studied records from the period of the great discoveries, when the sailors did not know what awaited them in the new land, and yet somehow managed to communicate with indigenous people. The observations on newly emerging sign languages are also interesting, because there you can see how a language is born from the moment when people have no system for communicating with each other, to a fully formed system. Since the 1990s, there have been several cases where researchers have managed to study such a language from its very formation. The best-known case is Nicaraguan sign language. In Nicaragua, there was no language for deaf children to learn and no schools for them. The Americans came and set up a school to teach them American sign language. It turned out that before they did so, the little ones had created their own language. Studies on emerging languages show that at the beginning communication is based on pantomime, on whole-body expression, only later do the proper signs for a given language take shape and the grammar of the sign language develops.

The use of pantomime is also linked to language deficits - for example, in the case of aphasia, when language expression becomes limited due to brain damage, but gestures or pantomime are largely unaffected. This shows that on a cerebral level, pantomime, gestures and vocal language are - at least partly - separate, and people who cannot speak start using pantomime spontaneously, and very often this becomes the basis for language therapy.

Another important feature of human communication highlighted by the researchers of language evolution is multimodality. When we speak, we also gesture, and we often use, for example, graphic signs in our communication. Very often, traditional narratives creatively combine voice, pantomime and drawing.

The researchers assume that the primary communication system was an iconic one, with pantomime at its core, which made it much easier to convey meaning. However, elements such as drawing and vocalisation complemented it. The development of language was based on increasing the role of vocalisation at the expense of pantomime. Why is it that today we mainly talk and not show? When we start communicating, the iconic form is great, because it's easy to explain something by showing or drawing, but when we have more and more meanings, iconicity becomes an obstacle. - For example, it's difficult to clearly separate kicking the ball from the footballer when using pantomime alone, explains the linguist from the Nicolaus Copernicus University. - If we have a general notion of kicking the ball, it's easy to show it, but if we want to show 'footballer' and 'kick the ball' separately, it's not so easy. When we have to convey a growing number of increasingly subtle meanings, the iconicity of pantomime becomes a limitation for the development of communication, and then we need something more arbitrary, and that is the spoken word.

Credit: 
Nicolaus Copernicus University in Torun

Recurrent neural network advances 3D fluorescence imaging


Image: 
by Ozcan Lab, UCLA

Rapid 3D microscopic imaging of fluorescent samples has gained increasing importance in numerous applications in physical and biomedical sciences. Given the limited axial range that a single 2D image can provide, 3D fluorescence imaging often requires time-consuming mechanical scanning of samples using a dense sampling grid. In addition to being slow and tedious, this approach also introduces additional light exposure on the sample, which might be toxic and cause unwanted damage, such as photo-bleaching.

By devising a new recurrent neural network, UCLA researchers have demonstrated a deep learning-enabled volumetric microscopy framework for 3D imaging of fluorescent samples. This new method only requires a few 2D images of the sample to be acquired for reconstructing its 3D image, providing ~30-fold reduction in the number of scans required to image a fluorescent volume. The convolutional recurrent neural network that is at the heart of this 3D fluorescence imaging method intuitively mimics the human brain in processing information and storing memories, by consolidating frequently appearing and important object information and features, while forgetting or ignoring some of the redundant information. Using this recurrent neural network scheme, UCLA researchers successfully incorporated spatial features from multiple 2D images of a sample to rapidly reconstruct its 3D fluorescence image.
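
To make the described idea concrete, here is a minimal, hypothetical sketch of a convolutional recurrent network that fuses a few 2D wide-field scans into a multi-plane (3D) estimate. It is not the published UCLA architecture; the ConvGRU-style fusion, layer sizes and number of output planes are assumptions made purely for illustration.

```python
# Hypothetical sketch: fuse a few 2D scans into a z-stack with a convolutional
# recurrent network (NOT the published architecture; all sizes are assumptions).
import torch
import torch.nn as nn

class ConvGRUCell(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.update = nn.Conv2d(2 * channels, channels, 3, padding=1)
        self.reset = nn.Conv2d(2 * channels, channels, 3, padding=1)
        self.cand = nn.Conv2d(2 * channels, channels, 3, padding=1)

    def forward(self, x, h):
        z = torch.sigmoid(self.update(torch.cat([x, h], dim=1)))  # how much to overwrite
        r = torch.sigmoid(self.reset(torch.cat([x, h], dim=1)))   # how much memory to reuse
        n = torch.tanh(self.cand(torch.cat([x, r * h], dim=1)))   # candidate update
        return (1 - z) * h + z * n

class RecurrentVolumeNet(nn.Module):
    def __init__(self, feat=32, out_planes=30):
        super().__init__()
        self.feat = feat
        self.encode = nn.Sequential(nn.Conv2d(1, feat, 3, padding=1), nn.ReLU(),
                                    nn.Conv2d(feat, feat, 3, padding=1), nn.ReLU())
        self.gru = ConvGRUCell(feat)
        self.decode = nn.Conv2d(feat, out_planes, 3, padding=1)

    def forward(self, scans):                      # scans: (batch, n_scans, H, W)
        b, n, h_px, w_px = scans.shape
        h = scans.new_zeros(b, self.feat, h_px, w_px)
        for i in range(n):                         # fold in the sparse 2D scans one by one
            h = self.gru(self.encode(scans[:, i:i + 1]), h)
        return self.decode(h)                      # (batch, out_planes, H, W) ~ a z-stack

volume = RecurrentVolumeNet()(torch.rand(1, 3, 64, 64))  # e.g., only 3 input scans
print(volume.shape)                                       # torch.Size([1, 30, 64, 64])
```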

Published in Light: Science and Applications, the UCLA team demonstrated the success of this volumetric imaging framework using fluorescent C. elegans samples, which are widely used as a model organism in biology and bioengineering. Compared with standard wide-field volumetric imaging that involves dense scanning of samples, this recurrent neural network-based image reconstruction approach provides a significant reduction in the number of required image scans, which also lowers the total light exposure on the sample. These advances offer much higher imaging speed for observing 3D specimens, while also mitigating the photo-bleaching and phototoxicity related challenges that are frequently observed in 3D fluorescence imaging experiments on live samples.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

Discovery of non-toxic semiconductors with a direct band gap in the near-infrared

image: Crystal structure of the inverse perovskite Ca3SiO semiconductor

Image: 
NIMS

NIMS and the Tokyo Institute of Technology have jointly discovered that the chemical compound Ca3SiO is a direct transition semiconductor, making it a potentially promising infrared LED and infrared detector component. This compound, composed of calcium, silicon and oxygen, is cheap to produce and non-toxic. Many of the existing infrared semiconductors contain toxic chemical elements, such as cadmium and tellurium. Ca3SiO may be used to develop less expensive and safer near-infrared semiconductors.

Infrared wavelengths have been used for many purposes, including optical fiber communications, photovoltaic power generation and night vision devices. Existing semiconductors capable of emitting infrared radiation (i.e., direct transition semiconductors) contain toxic chemical compounds, such as mercury cadmium telluride and gallium arsenide. Infrared semiconductors free of toxic chemical elements are generally incapable of emitting infrared radiation (i.e., indirect transition semiconductors). It is desirable to develop high-performance infrared devices using non-toxic, direct transition semiconductors with a band gap in the infrared range.

Conventionally, the semiconducting properties of materials, such as the energy band gap, have been controlled by combining two chemical elements located on either side of the group IV elements, such as groups III and V or groups II and VI. In this conventional strategy, the energy band gap becomes narrower when heavier elements are used; consequently, this strategy has led to the development of direct transition semiconductors composed of toxic elements, such as mercury cadmium telluride and gallium arsenide. To discover infrared semiconductors free of toxic elements, this research group took an unconventional approach: they focused on crystalline structures in which silicon atoms behave as tetravalent anions rather than in their normal tetravalent cation state. The group ultimately chose oxysilicides (e.g., Ca3SiO) and oxygermanides with an inverse perovskite crystalline structure, synthesized them, evaluated their physical properties and conducted theoretical calculations. These investigations revealed that the compounds exhibit a very small band gap of approximately 0.9 eV, corresponding to a wavelength of 1.4 μm, indicating their great potential to serve as direct transition semiconductors. Compounds with such a small direct band gap may be effective in absorbing, detecting and emitting infrared light even when processed into thin films, making them very promising near-infrared semiconductor materials for use in infrared sources (e.g., LEDs) and detectors.
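
The quoted band gap and wavelength are consistent with the standard photon-energy relation λ [μm] ≈ 1.2398 / E [eV]; the short check below only reproduces that arithmetic and assumes nothing beyond the 0.9 eV figure reported above.

```python
# Quick check of the band gap / wavelength relation: lambda [um] ~ 1.2398 / E [eV].
h_times_c = 1.23984                  # Planck constant times speed of light, in eV*um
E_gap = 0.9                          # eV, the reported Ca3SiO band gap
print(round(h_times_c / E_gap, 2))   # ~1.38 um, i.e., the ~1.4 um wavelength mentioned above
```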

In future research, we plan to develop high-intensity infrared LEDs and highly sensitive infrared detectors by synthesizing these compounds in the form of large single-crystals, developing thin film growth processes and controlling their physical properties through doping and transforming them into solid solutions. If these efforts bear fruit, toxic chemical elements currently used in existing near-infrared semiconductors may be replaced with non-toxic ones.

Credit: 
National Institute for Materials Science, Japan

Taking microelectronics to a new dimension


Image: 
by Erik Hagen Waller, Julian Karst, and Georg von Freymann

Metallic microstructures are key components in almost every current or emerging technology. For example, with the next wireless communication standard (6G) being established, the demand for advanced components, and especially antennas, remains strong. The drive to ever higher frequencies and deeper integration goes hand in hand with miniaturization and with fabrication technologies that have on-chip capability. Via direct laser writing - an additive manufacturing technology that offers sub-micron precision and feature sizes - highly sophisticated and integrated components come into reach. One big advantage of direct laser writing is that it is not limited to the fabrication of planar structures but enables almost arbitrary 3D microstructures. This dramatically increases the options available to component or device designers and offers vast potential for, e.g., antenna performance improvement: gain, efficiency and bandwidth are higher, at lower feeding losses, for 3D antennas compared to their planar counterparts. These advantages become even more pronounced the higher the frequency gets.

In a recent paper published in Light: Advanced Manufacturing, a team of scientists from the Fraunhofer ITWM, the Technische Universität Kaiserslautern and the University of Stuttgart has developed a novel photosensitive material that enables the direct fabrication of highly conductive microcomponents via direct laser writing.

"Not only are the resulting structures made of almost 100 % silver, but they also have above 95% material density. Furthermore, almost arbitrary structure geometries are possible while onchip compatibility is maintained with this approach.", says Erik Waller, the lead scientist of the project.

The feasibility and strength of the approach was demonstrated by the fabrication of a polarizer based on an array of helical antennas working in the infrared spectral region.

"This material and technology are well suited for the fabrication of conductive three dimensional micrometer-sized components. Next, we want to show the integration of thus fabricated components on conventionally fabricated chips. We then indeed take microelectronics to another dimension.", says Georg von Freymann, head of department at the Fraunhofer ITWM and professor at the Technische Universität Kaiserslauten.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

New treatment can reduce facial pressure injuries from PPE in frontline healthcare workers

image: Pictured is Natalie McEvoy, Clinical Research Nurse in Anaesthesia and Critical Care at RCSI, wearing her personal protective equipment. A study from RCSI has found that a new 'care bundle' can reduce the incidence of facial pressure injuries in frontline COVID-19 healthcare workers caused by the prolonged wearing of Personal Protective Equipment.

Image: 
RCSI

A study has found that a new 'care bundle' can reduce the incidence of facial pressure injuries in frontline COVID-19 healthcare workers caused by the prolonged wearing of Personal Protective Equipment (PPE).

The study, led by researchers from RCSI University of Medicine and Health Sciences Skin Wounds and Trauma (SWaT) Research Centre, is published in the current edition of the Journal of Wound Care.

The research took place over a two-month period amongst healthcare workers in a large acute hospital in Ireland. In the study, approximately 300 frontline staff were provided with a care bundle, which was designed in line with international best practice and consisted of face cleansing material (WaterWipes® baby wipes), moisturising balm (Eucerin Aquaphor Soothing Skin Balm™) and protective tape (Mepitac Tape™).

Results showed that prior to using the care bundle 29% of respondents developed a facial pressure injury, whereas after using the care bundle only 8% of respondents developed such an injury. The analysis revealed that when using the care bundle, staff were almost five times less likely to develop a facial pressure injury. In a secondary finding, respondents reported the bundle as easy to use, safe and effective.
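
The "almost five times" figure is consistent with an odds ratio computed from the two proportions above; the snippet below shows that arithmetic as one plausible reading (an assumption on our part, since the study's exact statistical model is not described here).

```python
# Hypothetical reading of the "almost five times less likely" figure: an odds ratio
# computed from the before/after incidence proportions reported above.
before, after = 0.29, 0.08
odds_ratio = (before / (1 - before)) / (after / (1 - after))
print(round(odds_ratio, 1))   # ~4.7
```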

Professor Zena Moore, Director of the SWaT Research Centre and Head of the RCSI School of Nursing and Midwifery, was lead researcher on the study. Prof Moore said, "We are acutely aware of the facial injuries, such as pressure ulcers, bruises and skin tears that healthcare workers are experiencing due to the prolonged wearing of protective equipment during the pandemic and especially the wearing of medical face masks. These injuries can be painful for staff and injuries in some cases can put them at increased risk of infection. This study is the first of its kind carried out at the height of the pandemic in an effort to help mitigate the occurrence of Facial Pressure Injuries. The results tell us that when skincare is prioritised, and a systematic preventative care bundle approach is adopted, there are clear benefits for the frontline workers and the workplaces involved."

Credit: 
RCSI

Identifying banknote fingerprints can stop counterfeits on streets

image: A demonstration of the feature area from a £10 polymer banknote (top). Three snapshots at the bottom show the random variation of the patterns for three different £10 banknotes.

Image: 
University of Warwick

In 2016 the Bank of England introduced plastic (polymer) banknotes, alongside 50 other countries that use polymer banknotes

Counterfeit polymer banknotes on the streets have increased over the last few years, therefore the need to prevent and identify counterfeit banknotes has increased

Using a technique called Polymer Substrate Fingerprinting, researchers from the University of Warwick are able to identify each banknote's own fingerprint, which is unique and unclonable

Since the introduction of plastic (polymer) banknotes in 2016, the number of counterfeit notes on the streets has increased. However, researchers from the Department of Computer Science at the University of Warwick have developed a novel technique, called Polymer Substrate Fingerprinting, which identifies every banknote's unique and unclonable fingerprint.

Polymer £5 notes were introduced to the UK in 2016, with the £10 note following in 2017 and the £20 note in 2020. However, as polymer banknotes have been introduced to the UK, the number of counterfeit banknotes has also increased.

Researchers from the Department of Computer Science at the University of Warwick and their collaborator from Durham University have had their paper, 'Anti-Counterfeiting for Polymer Banknotes Based on Polymer Substrate Fingerprinting', published in the journal IEEE Transactions on Information Forensics and Security, in which they propose a novel technique called Polymer Substrate Fingerprinting that can identify each banknote's own unique, unclonable fingerprint.

Researchers have found that every polymer banknote has a unique "fingerprint", which is caused by the inevitable imperfection in the physical manufacturing process, whereby the opacity coating, a critical step during the production of polymer notes, leaves an uneven coating layer with a random dispersion of impurities in the ink. This imperfection results in random translucent patterns when a polymer banknote is back-lit by a light source.

The researchers have further presented a low-cost method to reliably capture these patterns using a commodity negative-film scanner and process them into a compact 2048-bit feature vector (fingerprint) that uniquely identifies each individual banknote with extremely high accuracy.
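
To make the idea of a compact binary fingerprint concrete, here is a toy sketch that turns a back-lit feature-area image into a 2048-bit code and compares codes by fractional Hamming distance, in the spirit of iris codes. The binarization scheme and thresholds are our own illustrative assumptions, not the filtering pipeline used in the paper.

```python
import numpy as np

# Toy sketch only: binarize a scanned feature-area patch into a 2048-bit code and
# compare two codes by fractional Hamming distance (0.0 = identical, ~0.5 = unrelated).

def fingerprint_bits(patch, n_bits=2048):
    # patch: 2D array of translucency values from the back-lit feature area
    cells = np.array_split(patch.ravel(), n_bits)        # one cell per output bit
    means = np.array([c.mean() for c in cells])
    return (means > np.median(means)).astype(np.uint8)   # 1 = brighter than a typical cell

def hamming_distance(code_a, code_b):
    return np.mean(code_a != code_b)

rng = np.random.default_rng(0)
note = rng.random((128, 256))                             # stand-in for a scanned patch
rescan = np.clip(note + 0.02 * rng.standard_normal(note.shape), 0, 1)  # same note, re-scanned
other = rng.random((128, 256))                            # a different note
print(hamming_distance(fingerprint_bits(note), fingerprint_bits(rescan)))  # small
print(hamming_distance(fingerprint_bits(note), fingerprint_bits(other)))   # near 0.5
```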

The image analysis focuses on a small feature area where the random translucent patterns from the opacity coating layer are directly exposed without being obstructed by security printing. For the £10 note, for example, the feature area is chosen between the 'ten' hologram and the see-through window. The random patterns extracted from the opacity coating layer form the unique 'fingerprint', which is further protected by a veneer finish applied on both sides of the polymer note and is hence robust against rough handling in daily usage, similar to how the human iris is protected by the clear cornea in iris recognition.

Using 340 banknotes, the researchers collected an extensive dataset of 6,200 sample images and have proven that their method can identify each banknote's fingerprint successfully despite rough daily handling.

Sheng Wang, a PhD student from the Department of Computer Science who has been designing, prototyping and testing this technique for the last two and a half years, comments:

"Although card and contactless payments tend to be more popular today banknotes still play a crucial role in society. In fact, there are 500 billion banknotes in circulation in the world, meaning counterfeit notes are a major threat to society and economy.

"Polymer banknotes have existing anti-counterfeit features, however we've found a new way to identify counterfeit notes without changing the manufacturing process, as the negative-film scanner can pick up each banknote's unique fingerprint which cannot be cloned.

"We have also found the extracted fingerprints contain around 900 bits of entropy, which is dramatically higher than 249-bit entropy for iris-codes used in iris recognition. This high entropy makes our method extremely scalable to identify every polymer note globally."

Professor Feng Hao, from the Department of Computer Science comments:

"Like every human has unique biometric features, we have found every polymer banknote has its own 'bio-metric', which is unique, naturally occurring, and can't be physically cloned. This new finding gives us the basis to design a completely new anti-counterfeiting method for banknotes.

"It is universally believed that once counterfeiters have access to essentially the same printing equipment and ink as used by legitimate government to print fake banknotes the game is over, as there will be no way to distinguish genuine and fake banknotes. Contrary to this belief our research shows, perhaps surprisingly, that there is still hope to defeat counterfeiting even in that extreme scenario."

Credit: 
University of Warwick

Plantwise plant clinics help promote sustainable crop pest management in Rwanda and Zambia

image: Farmers receive advice on how to manage crop pests and diseases at a plant clinic in Zambia run as part of the CABI-led Plantwise programme.

Image: 
CABI

CABI-led Plantwise plant clinics can help promote more sustainable ways to fight crop pests and diseases in Rwanda and Zambia - such as the fall armyworm - with the judicious use of pesticides within Integrated Pest Management (IPM) plans.

Dr Justice Tambo, lead author of the study published in the journal Food Policy, surveyed 1,474 farm households in Rwanda and Zambia and found that although farmers who visit plant clinics show a higher probability of opting for pesticides for pest control, they do not use them intensively and are more likely to adopt safer and more sustainable alternatives.

The scientists, who include researchers from the Rwanda Agriculture and Animal Resources Development Board (RAB) and the Zambia Agriculture Research Institute (ZARI), also revealed that plant clinic users are significantly more likely than non-users to wear protective clothing while working with pesticides.

Crop pests are a major limiting factor of agricultural productivity growth worldwide, and in Sub-Saharan Africa in particular, crop losses due to pests are predicted to be between 40% and 60%. The fall armyworm, for example, was confirmed on the continent in 2016 and has the potential to cause annual maize losses of about 8 to 20 million tonnes in just 12 countries.

Dr Tambo said, "In this study, we investigated whether plant clinics can enhance rational use of pesticides, in terms of reduced use of highly toxic or banned pesticides, adoption of alternative and environmentally friendly pest management practices, proper disposal of pesticide wastes, use of personal protective equipment (PPE), and the implications for incidence of pesticide-related ill-health.

"The IPM paradigm aims to reduce reliance on pesticides by encouraging the use of a combination of sustainable pest control practices. IPM techniques include judicious use of pesticides, as well as non-chemical pest management practices, such as intensive monitoring, resistant varieties, cultural control, physical or mechanical control and biological control."

Plant clinics, which are set up at public places such as markets and village centres, allow any farmer to bring a sample of any 'sick' crop to a plant doctor, who will identify and diagnose the problem and recommend treatment for the farmer to consider.

In collaboration with national partners, the Plantwise programme has established about 4,500 plant clinics in 34 countries across Africa, Asia and Latin America. In Rwanda, there are 66 active plant clinics that are manned by 230 trained plant doctors. Similarly, Zambia has 121 plant clinics that are staffed by 352 plant doctors.

At the time of this study, these plant clinics had attended to 16,130 and 12,000 farmers' queries on about 100 crops in Rwanda and Zambia respectively, signifying their growing popularity and importance as a source of plant health information.

Mr Mathews Matimelo, of ZARI, said, "Pesticides can provide rapid means of controlling pests and prevent crop losses, but they can also pose a major risk to humans, animals and the environment, if not used judiciously.

"Plant clinic users are more likely than non-users to use pesticides in combination with other non-chemical methods of pest control, including cultural and physical methods such as crop rotation and handpicking of egg masses and larvae.

"In particular, our impact estimates indicated that clinic users adopt significantly more FAW management options than their matched non-clinic user counterparts - 15% and 20% more in Zambia and Rwanda, respectively."

The researchers conclude that the plant clinic extension approach would benefit from scaling so as to promote IPM adoption in smallholder agriculture.

In addition, they suggest that further training could be given to the small percentage of farmers concerned, to help them steer clear of banned, restricted or inappropriate pesticides and to promote the proper use of PPE.

Credit: 
CABI

Lymph node collection kit may improve long-term survival after lung cancer surgery

Denver, March 26, 2021 - A lymph node collection kit can help surgeons attain complete resection and improve long-term survival after curative-intent lung cancer surgery, according to a study published in the Journal of Thoracic Oncology Clinical and Research Reports. The journal is published by the International Association for the Study of Lung Cancer.

Surgical resection is the most important curative treatment for non-small cell lung cancer (NSCLC). With successful implementation of lung cancer screening programs, the proportion of patients with NSCLC who undergo surgery is likely to increase significantly.

"However, poor surgical quality reduces the survival benefit of curative-intent surgery and suboptimal pathologic nodal evaluation is the most prevalent NSCLC surgery quality deficit. The problem is global, and prevalent across institutions of different characteristics," said co-author Matthew Smeltzer, PhD, University of Memphis School of Public Health, Memphis, Tenn.

The IASLC has proposed a revision of the residual disease (R-factor) classification, from complete (R0), microscopic incomplete (R1) and grossly incomplete (R2) to R0, 'R-uncertain', R1 and R2. The adverse prognostic impact of R-uncertainty has been independently validated, with the majority caused by poor nodal evaluation.

"We previously demonstrated longer survival after surgical resection with a lymph node specimen collection kit, published in the Journal of Thoracic Oncology in February, and now evaluate R-factor redistribution as the mechanism of its survival benefit," Dr. Smeltzer said.

The kit includes 12 anatomically pre-labelled specimen containers and a checklist to indicate certain lymph node stations mandated for examination.

"We designed it to improve the intraoperative retrieval of lymph nodes compatible with evidence-based guidelines, the secure transfer of lymph node specimens between surgery and pathology teams, and the accurate identification of the anatomic provenance of lymph node specimens to encourage thorough and accurate pathologic evaluation," he said.

An ongoing NIH-funded population-based Dissemination and Implementation project (2 R01 CA172253), the Mid-South Quality of Surgical Resection (MSQSR) project involves 15 hospitals in Eastern Arkansas, North and Central Mississippi and Western Tennessee. The principal investigator, Ray Osarogiagbon, MD, of Baptist Memorial Health Care Corporation in Memphis, Tenn., is a member of the IASLC's Staging and Prognostic Factors Committee. In this particular analysis, Dr. Smeltzer and his co-researchers analyzed 3,505 lung resections performed between 2009 and 2019 in the MSQSR cohort.

Of the 3,505 resections, 34% were R0, 60% R-uncertain, and 6% R1/2. The R0 percentage increased from 9% in 2009 to 56% in 2019. Kit cases were 66% R0 and 29% R-uncertain, compared to 14% R0 and 79% R-uncertain in non-kit cases. Kit cases also had lower percentages of non-examination of lymph nodes (i.e., pNX): 1% vs. 14%. The adjusted hazard ratio (aHR) for kit cases versus non-kit cases was 0.75 (CI 0.66-0.85). Dr. Smeltzer pointed out that a more carefully controlled trial is planned to corroborate these results.

"Ultimately, the main limitation of this study is that it was not a randomized controlled trial. We propose to conduct such a trial to further evaluate the lymph node kit," he said.

Credit: 
International Association for the Study of Lung Cancer

Moiré effect: How to twist material properties

image: Two layers of a 2D-material are stacked, which influences the properties of the material.

Image: 
Erik Zumalt, Lukas Linhart

The discovery of the material graphene, which consists of only one layer of carbon atoms, was the starting signal for a global race: today, so-called "2D materials" made of different types of atoms are being produced - atomically thin layers that often have very special material properties not found in conventional, thicker materials.

Now another chapter is being added to this field of research: If two such 2D layers are stacked at the right angle, even more new possibilities arise. The way in which the atoms of the two layers interact creates intricate geometric patterns, and these patterns have a decisive impact on the material properties, as a research team from TU Wien and the University of Texas (Austin) has now been able to show. Phonons - the lattice vibrations of the atoms - are significantly influenced by the angle at which the two material layers are placed on top of each other. Thus, with tiny rotations of such a layer, one can significantly change the material properties.

The Moiré Effect

The basic idea can be tried out at home with two fly screen sheets - or with any other regular meshes that can be placed on top of each other: If both grids are perfectly congruent on top of each other, you can hardly tell from above whether it is one or two grids. The regularity of the structure has not changed.

But if you now turn one of the grids by a small angle, there are places where the gridpoints of the meshes roughly match, and other places where they do not. This way, interesting patterns emerge - that is the well-known moiré effect.

"You can do exactly the same thing with the atomic lattices of two material layers," says Dr. Lukas Linhart from the Institute for Theoretical Physics at TU Wien. The remarkable thing is that this can dramatically change certain material properties - for example, graphene becomes a superconductor if two layers of this material are combined in the right way.

"We studied layers of molybdenum disulphide, which, along with graphene, is probably one of the most important 2D materials," says Prof Florian Libisch, who led the project at TU Wien. "If you put two layers of this material on top of each other, so-called Van der Waals forces occur between the atoms of these two layers. These are relatively weak forces, but they are strong enough to completely change the behaviour of the entire system."

In elaborate computer simulations, the research team analysed the quantum mechanical state of the new bilayer structure caused by these weak additional forces, and how this affects the vibrations of the atoms in the two layers.

The angle of rotation matters

"If you twist the two layers a little bit against each other, the Van der Waals forces cause the atoms of both layers to change their positions a little bit," says Dr Jiamin Quan, from UT Texas in Austin. He led the experiments in Texas, which confirmed the results of the calculations: The angle of rotation can be used to adjust which atomic vibrations are physically possible in the material.

"In terms of materials science, it is an important thing to have control over phonon vibrations in this way," says Lukas Linhart "The fact that electronic properties of a 2D material can be changed by joining two layers together was already known before. But the fact that the mechanical oscillations in the material can also be controlled by this now opens up new possibilities for us. Phonons and electromagnetic properties are closely related. Via the vibrations in the material, one can therefore intervene in important many-body effects in a controlling way." After this first description of the effect for phonons, the researchers are now trying to describe phonons and electrons combined, hoping to learn more about important phenomena like superconductivity.

The material-physical Moiré effect thus makes the already rich research field of 2D materials even richer: it increases the chances of finding new layered materials with previously unattainable properties and enables the use of 2D materials as an experimental platform for studying quite fundamental properties of solids.

Credit: 
Vienna University of Technology

Artificial neurons help decode cortical signals

image: Alexey Ossadtchi, director of the HSE Center for Bioelectric Interfaces

Image: 
Alexey Ossadtchi

Russian scientists have proposed a new algorithm for automatic decoding and interpreting the decoder weights, which can be used both in brain-computer interfaces and in fundamental research. The results of the study were published in the Journal of Neural Engineering.

Brain-computer interfaces are needed to create robotic prostheses and neuroimplants, rehabilitation simulators, and devices that can be controlled by the power of thought. These devices help people who have suffered a stroke or physical injury to move (in the case of a robotic chair or prostheses), communicate, use a computer, and operate household appliances. In addition, in combination with machine learning methods, neural interfaces help researchers understand how the human brain works.

Most frequently brain-computer interfaces use electrical activity of neurons, measured, for example, with electro- or magnetoencephalography. However, a special decoder is needed in order to translate neuronal signals into commands. Traditional methods of signal processing require painstaking work on identifying informative features--signal characteristics that, from a researcher's point of view, appear to be most important for the decoding task.

Initially, the authors focused on electrocorticography (ECoG) data - an invasive recording of neural activity with electrodes located directly on the cortical surface under the dura mater, a membrane that encapsulates the brain - and developed an artificial neural network architecture that automates the extraction of interpretable features.

As conceived by the scientists, the neural network algorithm should not be too complicated in terms of the number of parameters. It should be automatically tuned and enable one to interpret the learned parameters in physiologically meaningful terms. The last requirement is especially important: if it is met, the neural network can be used not only to decode signals, but also to gain new insights into the neuronal mechanisms, the dream come true for neuroscientists and neurologists. Therefore, in addition to a new neural network for signal processing, the authors proposed (and theoretically justified) a method for interpreting the parameters of the broad class of neural networks.

The neural network proposed by the researchers consists of several similarly structured branches, each of which is automatically tuned to analyze the signals of a separate neural population in a certain frequency range while being tuned away from interference. To do this, each branch uses convolutional layers similar to those found in neural networks designed for image analysis, which act as spatial and frequency filters. Knowing the weights of the spatial filter, it is possible to determine where the neural population is located, and the temporal convolution weights show how the neuronal activity changes over time, in addition to indirectly indicating the size of the neuronal population.
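
A minimal sketch of one such branch is shown below, assuming a spatial 1x1 convolution over electrodes followed by a temporal convolution acting as a band-pass-like filter. The layer sizes, the envelope step and the overall simplicity are our own assumptions; the published architecture is not reproduced here.

```python
import torch
import torch.nn as nn

# Hedged sketch of one interpretable decoding branch: a spatial filter (learned weighting
# of ECoG channels) followed by a temporal convolution acting as a frequency filter.
class DecodingBranch(nn.Module):
    def __init__(self, n_channels=64, kernel_len=65):
        super().__init__()
        self.spatial = nn.Conv1d(n_channels, 1, kernel_size=1, bias=False)   # channel weights
        self.temporal = nn.Conv1d(1, 1, kernel_size=kernel_len,
                                  padding=kernel_len // 2, bias=False)        # band-pass-like filter

    def forward(self, x):                  # x: (batch, n_channels, n_samples)
        virtual_channel = self.spatial(x)  # one "virtual sensor" per branch
        filtered = self.temporal(virtual_channel)
        return torch.abs(filtered)         # crude envelope as the decoded feature

branch = DecodingBranch()
ecog = torch.randn(2, 64, 1000)            # 2 trials, 64 electrodes, 1000 samples
print(branch(ecog).shape)                  # torch.Size([2, 1, 1000])
```

Inspecting the learned `spatial` weights hints at where the contributing neural population sits on the electrode grid, while the `temporal` kernel reveals the frequency band the branch has tuned itself to.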

To assess the performance of their neural network in combination with a new method for interpreting its parameters, the scientists first generated a set of realistic model data, or 20 minutes of activity from 44 neuron populations. Noise was added to the data to simulate interference when recording signals in real conditions. The second set of data to check was a dataset from the BCI Competition IV, containing the ECoG data of several subjects who periodically moved their fingers spontaneously. Another set of ECoG data was collected by the scientists themselves at the Moscow State University of Medicine and Dentistry, which serves as the clinical base of the Centre for Bioelectric Interfaces of HSE University. Unlike previous data, the records collected by the scientists contained complete geometric information about the location of the ECoG electrodes on the surface of each patient's cerebral cortex. This made it possible to interpret the weights of the spatial filters learned by the neural network and to discern somatotopy (i.e., the relationship between the neural population's position on the cerebral cortex and the body part it functionally corresponds to) in the location of the neuron populations pivotal for decoding the movement of each finger.

The neural network performed nicely: with the BCI Competition IV dataset, it worked on par with the solution proposed by the competition winners, but, unlike the solution, it used automatically selected features. While working with both real and model data, the researchers proved that it is possible to interpret the scale parameters correctly and in detail, and the interpretation gives physiologically plausible results. The researchers also applied a new technique to the classification of imaginary movements based on non-invasive (obtained from the surface of the head, without implanting electrodes) EEG data. As in the case of ECoG, the neural network provided high decoding accuracy and feature interpretability.

'We are already using this approach to build invasive brain-computer interfaces, as well as to solve issues of preoperative cortex mapping, which is necessary to ensure that key behavioral functions are preserved after brain surgery,' says the scientific lead of the study and director of the HSE Center for Bioelectric Interfaces, Alexey Ossadtchi. 'In the near future, the developed technique will be used to automatically extract knowledge about the principles according to which the brain implements a broad range of behavioral functions.'

Credit: 
National Research University Higher School of Economics

Mussel sensors pave the way for new environmental monitoring tools

image: Researchers at North Carolina State University have designed and demonstrated a new system that allows them to remotely monitor the behavior of freshwater mussels. A prototype of the system is shown here. The system could be used to alert researchers to the presence of toxic substances in aquatic ecosystems.

Image: 
James Reynolds, NC State University

Researchers at North Carolina State University have designed and demonstrated a new system that allows them to remotely monitor the behavior of freshwater mussels. The system could be used to alert researchers to the presence of toxic substances in aquatic ecosystems.

"When mussels feed, they open their shells; but if there's something noxious in the water, they may immediately close their shells, all at once," says Jay Levine, co-author of a paper on the work and a professor of epidemiology at NC State. "Folks have been trying to find ways to measure how widely mussels or oysters open their shells off and on since the 1950s, but there have been a wide variety of challenges. We needed something that allows the animals to move, can be placed in streams and collects data - and now we have it."

"We've basically designed a custom Fitbit to track the activities of mussels," says Alper Bozkurt, corresponding author of the paper and a professor of electrical and computer engineering at NC State.

The fundamental idea for the research stems from the fact that feeding behavior in mussels is generally asynchronous - it's not a coordinated affair. So, if a bunch of mussels close their shells at once, that's likely a warning there's something harmful in the water.

One of the things the researchers are already doing with the new sensor system is monitoring mussel behavior to determine if there are harmless circumstances in which mussels may all close their shells at the same time.

"Think of it as a canary in the coal mine, except we can detect the presence of toxins without having to wait for the mussels to die," Levine says. "At the same time, it will help us understand the behavior and monitor the health of the mussels themselves, which could give us insights into how various environmental factors affect their health. Which is important, given that many freshwater mussel species are threatened or endangered."

"To minimize costs, all the components we used to make this prototype sensor system are commercially available - we're just using the technologies in a way nobody has used them before," Bozkurt says.

Specifically, the system uses two inertial measurement units (IMUs) on each mussel. Each of the IMUs includes a magnetometer and an accelerometer - like the ones used in smartphones to detect when you are moving the phone. One IMU is attached to the mussel's top shell, the other to its bottom shell. This allows the researchers to compare the movement of the shell halves relative to each other. In other words, this allows the researchers to tell if the mussel is closing its shell, as opposed to the mussel being tumbled in the water by a strong current.
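
As a simplified illustration of this relative-angle idea (not the authors' processing pipeline, which also has the magnetometers and proper calibration available), the gravity vectors reported by the two accelerometers can be compared directly:

```python
import numpy as np

# Illustrative sketch: estimate the shell-opening angle from the accelerometer (gravity)
# vectors of the two shell-mounted IMUs. Real processing involves sensor fusion and
# filtering; this only shows how a relative angle falls out of two orientation vectors.
def opening_angle_deg(accel_top, accel_bottom):
    a = np.asarray(accel_top, dtype=float)
    b = np.asarray(accel_bottom, dtype=float)
    cos_angle = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

closed = opening_angle_deg([0.0, 0.0, 9.81], [0.0, 0.0, 9.81])   # shell halves parallel
gaping = opening_angle_deg([0.0, 0.0, 9.81], [0.0, 1.2, 9.74])   # slight gape
print(round(closed, 2), round(gaping, 2))                        # 0.0 and ~7.0 degrees
```

Because both halves experience the same gravity and the same water current, comparing the two sensors against each other rather than against a fixed reference is what separates genuine shell movement from the whole animal being tumbled.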

Wires from the IMUs are designed to run to a data acquisition system that would be mounted on a stake in the waterway. When placed in a natural setting, the data acquisition system is powered by a solar cell and transmits data from the sensors wirelessly via a cellular network. The current prototype has four mussels connected to the system, but it could handle dozens.

The researchers did more than 250 hours of testing with live mussels in a laboratory fish tank, and found that the sensors were exceptionally accurate - measuring the angle of the mussel's shell opening to within less than one degree.

"You can definitely tell when it's closed, when it's open and by how much," Bozkurt says.

"Our aim is to establish an 'internet-of-mussels' and monitor their individual and collective behavior," Bozkurt says. "This will ultimately enable us to use them as environmental sensors or sentinels."

The researchers are now continuing their testing to better understand the robustness of the system. For example, how long might it last in practical use under real-life conditions? The team plans to begin field testing soon.

"In addition to exploring its effectiveness as an environmental monitor, we're optimistic that the technology can help us learn new things about the mussels themselves," Levine says. "What prompts them to filter and feed? Does their behavior change in response to changes in temperature? While we know a lot about these animals, there is also a lot we don't know. The sensors provide us with the opportunity to develop baseline values for individual animals, and to monitor their shell movement in response to environmental changes."

Credit: 
North Carolina State University

Making molecular movies of a biological process of energy conversion

image: Sunlight begins the process of energy generation in a marine bacterium when it jump-starts a pigment called "chloride-ion pumping rhodopsin" that moves the chloride ions (Cl-) unidirectionally into the cells. Scientists have been able to see this process for the first time, using an advanced imaging technique called time-resolved serial femtosecond crystallography, which produces molecular movies.

Image: 
PNAS

Many organisms use sunlight to fuel cellular functions. But exactly how does this conversion of solar energy into chemical energy unfold?

In a recent experiment, an international team of scientists, including two researchers from UWM, sought answers using an advanced imaging technique called time-resolved serial femtosecond crystallography to watch a pigment found in some marine bacteria as it was exposed to sunlight outside the cell.

For this experiment, the researchers documented, for the first time, the dynamics of the "chloride ion-pumping rhodopsin," an atomic "pump," which is jump-started by sunlight and moves chloride ions unidirectionally into the bacterial cells.

These pumps may also serve as a kind of molecular solar cell for energy conversion.

Once inside, the chloride ions create a negatively charged environment, while the outside of the cell remains neutral. This results in an electric field that the bacteria could use to move, grow and maintain other vital functions.

The imaging method is capable of capturing sequential pictures of the molecular changes of biological macromolecules at work.

How chloride is pumped and transported within the rhodopsin has been elusive, said Marius Schmidt, a UWM professor of physics and one of the corresponding authors of the paper, which appeared March 22 in the journal Proceedings of the National Academy of Sciences. Recent UWM graduate and postdoctoral researcher Suraj Pandey also contributed substantially to this research.

"We intended to find out how sunlight drives the pump," Schmidt said. "The rhodopsin's response to light starts the mechanism. The response is extremely fast - at picoseconds, a trillionth of a second. So, we needed to make a molecular movie on extremely fast time scales to follow the 'pumping' action."

The movie shows the mechanics of the pump: the movement of a piston within a cylinder that displaces the chloride ions. This piston, however, is one billion times smaller than a piston in the macroscopic world. The movie shows in atomic detail how it pushes the chloride ions into the bacterial cell.

Imaging was done with X-ray free-electron laser (XFEL) equipment at the Linac Coherent Light Source in California, which uses ultra-short, intense X-ray pulses that repeat rapidly to capture the frames of the molecular movie. The X-ray pulses strike protein molecules held in tiny crystals and are diffracted into characteristic patterns, revealing where the atoms are at a single instant in time.

With so many pulses per second, the equipment's camera can work at blistering speed, churning out a multitude of "snapshots." Those are then mathematically reconstructed into 3D moving images that depict the arrangement of atoms over time - the structural changes that occur when proteins accomplish a task.

The work is relevant to human biology, Schmidt said, because the structure of the chloride pump is very similar to that of rhodopsin and the photopsins in the human eye. While it does not pump chloride, rhodopsin enables the human eye to absorb light and convert it into electrical signals that are relayed to the brain.

The structure of this chloride-ion pump also resembles that of other G-protein coupled receptors (GPCRs) in humans, which regulate functions such as blood pressure and hormone levels. GPCRs are of interest to the pharmaceutical industry because they are prime drug targets.

Although the imaging is being used to advance fundamental biological science, this mechanism could also be applied in the future, Schmidt said, to design light-sensitive molecular pumps. "The chloride pump could be inserted into other organisms, and you'd be able to manipulate their behavior with light through a method called optogenetics," he said.

Credit: 
University of Wisconsin - Milwaukee

BioRescue develops ethical risk assessment for northern white rhino rescue programme

image: Ethical Risk Assessment prior to an oocyte collection at Ol Pejeta Conservancy, a BioRescue procedure.

Image: 
BioRescue/Jan Zwilling

The BioRescue consortium develops and applies new technological approaches as a last resort for saving critically endangered species such as the northern white rhinoceros. Advanced Assisted Reproductive Technologies (aART) push the boundaries of what can be done to create new offspring. Consequently, new ethical questions about the application of these tools arise and need to be answered, and relevant animal welfare issues need to be addressed. To ensure that the ethical risk assessment keeps pace with the technological breakthroughs achieved with aART, the BioRescue consortium developed and applies "ETHAS", an ethical self-assessment tool explicitly designed for assessing the oocyte collections, in vitro fertilizations, embryo transfers and other procedures of the BioRescue consortium. ETHAS was developed under the leadership of Padua University and is described in detail in a new paper published in the scientific journal "Animals".

From the very beginning of conceptualising the BMBF-sponsored BioRescue programme, it was obvious that - if successful - the newly developed technologies for assisted reproduction would require a solid ethical and animal welfare framework to be applied for species conservation. "If we have new things we can do, it is our duty to also consider what we should do and how we can apply them in a way that truly respects animal welfare, risk considerations, the safety of the people involved and the quality of the procedures", says BioRescue head Prof Thomas Hildebrandt from the German Leibniz Institute for Zoo and Wildlife Research (Leibniz-IZW). These ethical considerations have been a pivotal pillar of the programme's work. Prof Barbara de Mori and her team from the Ethics Laboratory for Veterinary Medicine, Conservation and Animal Welfare at the University of Padua in Italy have continuously monitored and evaluated every procedure the BioRescue team has conducted since the beginning of the scientific rescue programme in 2019. "We soon realised that for assisted reproduction technologies as they are developed and applied in the BioRescue programme no ethical assessment tools existed", explains de Mori. "As the reproduction specialists had to design new intervention approaches, we were able to develop and apply in parallel a new robust and solid ethical risk assessment framework." ETHAS is the result, aiming to raise the ethical standards for the application of aART in general and to set the bar as high as possible for the BioRescue procedures.

ETHAS is an ethical self-assessment tool explicitly designed to assess ARTs in mammalian conservation breeding programmes. It consists of two checklists, the "Ethical Evaluation Sheet" and the "Ethical Risk Assessment". The ETHAS checklists merge risk analysis - based on a combination of traditional risk assessment, aspects of animal welfare and the assessment of specific ethical risks - with an ethical analysis to judge the ethical acceptability of the procedures under review. "ETHAS underwent several applications in different conditions (zoos and semi-captive management) that allowed the review, improvement and refinement of the tool in an iterative procedure via the shared work between ethicists and reproduction experts", de Mori says of the development process, which also involved specialists from the Kenya Wildlife Service and the Avantea Laboratory in Italy. "We are committed to the development of high-level endangered species welfare and ethical standards even as we work against endangered species extinction and for their recovery", says Dr Patrick Omondi from Kenya Wildlife Service.

The framework not only ensures that procedures are conducted to the highest standards of animal welfare and safety, it is also a crucial tool for determining how valuable and risky future procedures that build on already established ones might be. "For example, all the risks that have been taken to create the northern white rhino embryos now accumulate in these embryos", explains Pierfrancesco Biasetti, a member of the Padua University Ethics Laboratory and of the Leibniz-IZW. "This means they have an exceptionally high conservation value, and decisions about transferring them to create new offspring must reflect this value."

Having a robust risk assessment tool at hand allows the consortium to make evidence-based, well-reasoned decisions. "We are well aware of the fact that pushing technologies for conservation purposes to the limit can raise questions with more than one valid answer", Hildebrandt and de Mori sum up. "Depending on personal perspectives there can be different views as to what is an acceptable risk for a procedure and what is an ethically flawless approach for species conservation. With ETHAS we respect these challenges, consider every ethical and risk-related aspect of our procedures and ultimately work towards our goal of preventing the extinction of the northern white rhino".

The accompanying ethical scientific work is supported by Merck and also focuses on questions from stakeholders and the general public. "Applying these new technologies inevitably raises new questions among the public and in stakeholder-driven discourses on saving biodiversity", says Steven Seet, science communicator at the Leibniz-IZW. "These include questions such as 'is it worth spending public money to save single wildlife species?' or 'if we have technologies and methods to save species from the brink of extinction, do we still have to care about nature?'."

Credit: 
Leibniz Institute for Zoo and Wildlife Research (IZW)