
Introducing the world's thinnest technology -- only two atoms thick

A scientific breakthrough: Researchers from Tel Aviv University have engineered the world's thinnest technology, with a thickness of only two atoms. According to the researchers, the new technology offers a way to store electric information in the thinnest unit known to science, using one of the most stable and inert materials in nature. Quantum-mechanical tunneling of electrons through the atomically thin film could make reading out the stored information far faster than in current technologies.

The research was performed by scientists from the Raymond and Beverly Sackler School of Physics and Astronomy and Raymond and Beverly Sackler School of Chemistry. The group includes Maayan Vizner Stern, Yuval Waschitz, Dr. Wei Cao, Dr. Iftach Nevo, Prof. Eran Sela, Prof. Michael Urbakh, Prof. Oded Hod, and Dr. Moshe Ben Shalom. The work is now published in Science magazine.

"Our research stems from curiosity about the behavior of atoms and electrons in solid materials, which has generated many of the technologies supporting our modern way of life," says Dr. Ben Shalom. "We (and many other scientists) try to understand, predict, and even control the fascinating properties of these particles as they condense into an ordered structure that we call a crystal. At the heart of the computer, for example, lies a tiny crystalline device designed to switch between two states indicating different responses - 'yes' or 'no', 'up' or 'down', etc. Without this dichotomy, it is not possible to encode and process information. The practical challenge is to find a mechanism that would enable switching in a small, fast, and inexpensive device."

Current state-of-the-art devices consist of tiny crystals containing about a million atoms (roughly a hundred atoms in height, width, and thickness), so that about a million such devices can be squeezed, a million times over, into the area of a single coin, with each device switching at a speed of about a million times per second.
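As a quick sanity check of those round numbers, here is a back-of-the-envelope calculation; the atomic spacing and coin area below are rough assumptions for illustration, not figures from the study:

```python
# Back-of-the-envelope check of the figures quoted in the article; the
# atomic spacing and coin size are rough assumptions, not measured values.

atoms_per_side = 100                    # ~100 atoms in height, width, thickness
atoms_per_device = atoms_per_side ** 3  # a cube of 100x100x100 atoms
print(atoms_per_device)  # 1000000 -- about a million atoms per crystal

spacing_nm = 0.3                        # assumed interatomic spacing, ~0.3 nm
footprint_nm2 = (atoms_per_side * spacing_nm) ** 2  # ~900 nm^2 per device
coin_area_nm2 = 3.0e14                  # ~3 cm^2 coin face, in nm^2
devices_per_coin = coin_area_nm2 / footprint_nm2
print(devices_per_coin)  # ~3e11, of the order of "a million, a million times"
```

The point is only that the quoted figures are mutually consistent in order of magnitude.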

With this breakthrough, the researchers were able, for the first time, to reduce the thickness of the crystalline devices to only two atoms. Dr. Ben Shalom emphasizes that such a thin structure enables memories based on the quantum ability of electrons to hop quickly and efficiently through barriers that are just several atoms thick. Thus, it may significantly improve electronic devices in terms of speed, density, and energy consumption.
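The exponential sensitivity of tunneling to barrier thickness can be sketched with a simple WKB-style estimate; the barrier height and thicknesses below are hypothetical values chosen only to illustrate the scaling, not parameters from the study:

```python
import math

# Illustrative WKB estimate of electron tunnelling through a thin barrier:
# T ~ exp(-2 * kappa * d), with kappa = sqrt(2 * m * (V - E)) / hbar.
# The 1 eV barrier height and the thicknesses are hypothetical, chosen only
# to show the exponential dependence on thickness the article alludes to.

hbar = 1.054571817e-34   # reduced Planck constant, J*s
m_e = 9.1093837015e-31   # electron mass, kg
eV = 1.602176634e-19     # electron volt, J

def transmission(thickness_nm, barrier_eV=1.0):
    kappa = math.sqrt(2 * m_e * barrier_eV * eV) / hbar  # decay constant, 1/m
    return math.exp(-2 * kappa * thickness_nm * 1e-9)

# A two-atom-thick film (~0.7 nm) vs a ~5 nm barrier in a thicker device:
print(transmission(0.7))  # appreciable tunnelling probability
print(transmission(5.0))  # suppressed by many orders of magnitude
```

Shaving the barrier from a few nanometers down to two atomic layers raises the tunnelling probability by roughly twenty orders of magnitude in this toy model, which is why thickness matters so much for readout speed.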

In the study, the researchers used a two-dimensional material: one-atom-thick layers of boron and nitrogen arranged in a repeating hexagonal structure. In their experiment, they were able to break the symmetry of this crystal by artificially assembling two such layers. "In its natural three-dimensional state, this material is made up of a large number of layers placed on top of each other, with each layer rotated 180 degrees relative to its neighbors (antiparallel configuration)," says Dr. Ben Shalom. "In the lab, we were able to artificially stack the layers in a parallel configuration with no rotation, which would hypothetically place atoms of the same kind in perfect overlap despite the strong repulsive force between them (resulting from their identical charges). In practice, however, the crystal prefers to slide one layer slightly relative to the other, so that only half of each layer's atoms are in perfect overlap, and those that do overlap are of opposite charges, while all the others sit above or below an empty space - the center of the hexagon. In this artificial stacking configuration the layers are quite distinct from one another. For example, if in the top layer only the boron atoms overlap, in the bottom layer it's the other way around."

Dr. Ben Shalom also highlights the work of the theory team, who conducted numerous computer simulations. "Together we established a deep understanding of why the system's electrons arrange themselves just as we had measured in the lab. Thanks to this fundamental understanding, we expect fascinating responses in other symmetry-broken layered systems as well," he says.

Maayan Vizner Stern, the PhD student who led the study, explains: "The symmetry breaking we created in the laboratory, which does not exist in the natural crystal, forces the electric charge to reorganize itself between the layers and generate a tiny internal electrical polarization perpendicular to the layer plane. When we apply an external electric field in the opposite direction, the system slides laterally to switch the polarization orientation. The switched polarization remains stable even when the external field is shut off. In this respect the system is similar to thick three-dimensional ferroelectric systems, which are widely used in technology today."

"The ability to force a crystalline and electronic arrangement in such a thin system, with unique polarization and inversion properties resulting from the weak van der Waals forces between the layers, is not limited to the boron and nitrogen crystal," adds Dr. Ben Shalom. "We expect the same behaviors in many layered crystals with the right symmetry properties. The concept of interlayer sliding as an original and efficient way to control advanced electronic devices is very promising, and we have named it Slide-Tronics."

Maayan Vizner Stern concludes: "We are excited about discovering what can happen in other states we force upon nature and predict that other structures that couple additional degrees of freedom are possible. We hope that miniaturization and flipping through sliding will improve today's electronic devices, and moreover, allow other original ways of controlling information in future devices. In addition to computer devices, we expect that this technology will contribute to detectors, energy storage and conversion, interaction with light, etc. Our challenge, as we see it, is to discover more crystals with new and slippery degrees of freedom."

Credit: 
Tel-Aviv University

Slowing the sugar rush to yield better grapes

image: Researcher Pietro Previtali collecting sample Cabernet Sauvignon grapes for the study, San Joaquin Valley, California.

Image: 
Image supplied by University of Adelaide.

One of the many challenges for grape growers posed by climate change is the accelerated rate at which grapes ripen in warmer climates, which can result in poor colour and aroma development.

In a new study published in the Journal of Agricultural and Food Chemistry, researchers from the University of Adelaide found it is possible to increase the flavour potential of Cabernet Sauvignon grapes by slowing down the ripening process with strategies including crop load manipulation and irrigation management.

Lead author, Pietro Previtali from the University of Adelaide's School of Agriculture, Food and Wine, said: "Advanced maturation due to warmer temperatures is a key issue for grape growers in most wine regions worldwide and especially in warm and dry areas such as Australia and California."

"It leads to faster sugar accumulation in grapes, which means the targeted sugar levels are reached while the concentrations of colour and aroma compounds are still below their maximum values.

"Growers therefore have to compromise between harvesting when sugar is ready but the desired flavours are missing, and prolonging grape maturation until an optimal composition of colour, mouthfeel and aroma compounds is achieved.

"The problem with prolonging maturation is that grapes undergo shrivelling and yields are reduced, with negative consequences for profitability, and that higher sugar levels lead to high-alcohol wines."

While earlier research found that techniques such as thinning vines and intense irrigation late in the growing season can change wine composition, the new study examined how these techniques specifically affect the development of aroma compounds in the grapes themselves.

The researchers grew Cabernet Sauvignon wine grapes at a commercial vineyard in the San Joaquin Valley in California. The vines were either thinned, or irrigated late in the growing season, or both, with grapes collected throughout the ripening period. These were compared with grapes grown in the same block where neither technique was applied.

The researchers found that delaying ripening slowed down sugar accumulation, which led to a decrease in green aroma compounds (unwanted in winemaking) and an increase in the fruity aromas, colour and mouthfeel compounds associated with red wine quality.

They also observed that the composition of grape quality traits is not dependent on a single strategy.

"Rather, groups of compounds were responsive to different factors, including crop load, irrigation, ripening rate and in some cases an interaction of these," Mr Previtali said.

Using the strategies available, the researchers sought to achieve the longest delay possible to study the relationship between sugar accumulation and flavour development. For example, a delay of three weeks was achieved through a 35 per cent reduction of crop load and late season irrigation of 50 per cent additional water.

"While representing a valuable experimental tool, this approach may not be practical due to the availability and high cost of irrigation, particularly as water becomes a scarcer resource," said project lead and co-author Associate Professor Christopher Ford, of the University of Adelaide.

"Tailoring the management of these strategies seems to be the way to achieve the targeted levels of aroma compounds, colour and mouthfeel in wines."

The researchers say replication of these vineyard trials over future seasons is necessary to fully understand the implications of year-to-year variation, and develop a broader understanding of combining crop load and late season irrigation to delay sugar accumulation.

Credit: 
University of Adelaide

Thermal waves observed in semiconductor materials

image: Amplified frequency-domain thermoreflectance setup used to study the existence of second sound in germanium. Two different lasers are focused onto the surface of the samples using a microscope objective. A rather large combination of optical elements allows the spot size and shape, as well as the power and harmonic modulation of the lasers, to be controlled and modified. Cold nitrogen gas is used for better visualization of the lasers' optical path.

Image: 
ICMAB, CSIC

A study published in Science Advances reports the unexpected observation of thermal waves in germanium, a semiconductor material, for the first time. This phenomenon may allow a significant improvement in the performance of our electronic devices in the near future. The study is led by researchers from the Institute of Materials Science of Barcelona (ICMAB, CSIC) in collaboration with researchers from the Universitat Autònoma de Barcelona and the University of Cagliari.

Heat, as we know it, originates from the vibration of atoms and is transferred by diffusion at ambient temperatures. Unfortunately, it is rather difficult to control, and the strategies available to manipulate it are simple and inefficient. This is why, for example, large amounts of residual heat can accumulate in our computers, mobile phones and, in general, most electronic devices.

However, if heat were transported through waves, as light is, it would offer new alternatives for controlling it, especially through the unique and intrinsic properties of waves.

Thermal waves have been observed to date in only a few materials, such as solid helium or, more recently, graphite. Now, the new study reports the observation of thermal waves in solid germanium, a semiconductor material typically used in electronics, similar to silicon, and at room temperature. "It was not expected to encounter these wave-like effects, known as second sound, in this type of material, and under these conditions," says Sebastián Reparaz, ICMAB researcher at the Nanostructured Materials for Optoelectronics and Energy Harvesting (NANOPTO) Group and leader of the study.

The observation occurred while studying the thermal response of a germanium sample under the effect of lasers that produce a high-frequency oscillating heating wave on its surface. The experiments showed that, contrary to what was previously believed, heat did not dissipate by diffusion but propagated into the material through thermal waves.

Beyond the observation itself, the study unveils an approach that could unlock the observation of thermal waves in potentially any material system.

What is second sound, and how can it be observed in any material?

First observed in the 1960s in solid helium, thermal transport through waves, known as second sound, has been a recurrent subject for researchers, who have repeatedly tried to demonstrate its existence in other materials. Recent successful demonstrations of this phenomenon in graphite have revitalized its experimental study.

"Second sound is the thermal regime where heat can propagate in the form of thermal waves, instead of the frequently observed diffusive regime. This type of wave-like thermal transport has many of the advantages offered by waves, including interference and diffraction," says ICMAB researcher Sebastián Reparaz.

"Wave-like effects can be unlocked by driving the system with a rapidly varying temperature field. In other words, a rapidly varying temperature field forces heat to propagate in the wave-like regime," explains Reparaz, adding, "The interesting conclusion of our work is that these wave-like effects could potentially be observed in most materials at a sufficiently large modulation frequency of the temperature field. And, what is even more interesting, the observation is not restricted to a few specific materials."

Applications of second sound in the near future

"The possible applications of second sound are limitless," says Sebastián Reparaz. Achieving these applications, however, will require a deep understanding of the ways to unlock this thermal propagation regime in any given material. Being able to control heat propagation through the properties of waves opens new ways to design the upcoming generations of thermal devices, much as has already been done for light. "Specifically, the second sound thermal regime could be used to rethink how we deal with waste heat," he adds.

From a theoretical point of view, "these findings allow unifying the current theoretical model, which until now considered that materials where this type of wave-like behavior was observed (such as graphite) were very different from the semiconductor materials currently used in the manufacture of electronic chips (such as silicon and germanium)," says F. Xavier Álvarez, researcher at the UAB. "Now all these materials can be described using the same equations. This observation establishes a new theoretical framework that may allow in the not too distant future a significant improvement in the performance of our electronic devices," adds Álvarez.

Credit: 
Universitat Autònoma de Barcelona

Prehistoric homes would have failed modern air quality tests

Domestic burning of wood and dung fuels in Neolithic homes would have exceeded modern internationally-agreed standards for indoor air quality, exposing inhabitants to unsafe levels of particulates.

Working with environmental engineers, archaeologists at Newcastle University, UK, used modern air quality monitoring methods to assess the impact of domestic fuel burning inside buildings at Çatalhöyük, in Turkey, one of the world's earliest settlements.

A typical house at Çatalhöyük, a UNESCO World Heritage site, had a domed oven set against the south wall, located beneath an opening in the roof. In the 1990s, a replica of one of these houses was built at Çatalhöyük to show visitors what they may have looked like during the time of occupation.

Although previous studies have shown that burning biofuels has significant negative consequences on health, especially in enclosed spaces with poor ventilation, the relationship between fuel use and health in prehistory has never been explored.

The research team, which included experts from Northumbria, Durham and Copenhagen universities, burned different types of fuel in the hearth of the replica house and measured pollution levels to test how living in these buildings may have exposed the inhabitants to fine particulate matter and impacted on their respiratory health.

The research, which was funded by the Wellcome Trust, found that the average levels of fine particulate matter (PM2.5) over a two-hour period were extremely high, and that concentrations remained high for up to 40 minutes after the fires had burnt out.

The results indicated greater exposure directly in front of the oven, although similar levels were also detected to the side of the hearth, suggesting that a person's position relative to the fire would have had only a minimal impact on exposure.

Dr Lisa-Marie Shillito, Senior Lecturer in Landscape Archaeology, explained: "At Çatalhöyük, the lack of a proper chimney, and the fact that buildings consist of a single, small room that combined living space and the hearth, means that anyone inside the building would have been exposed to unsafe levels of particulates as a result of everyday domestic activities. This would almost certainly have had a negative health impact on these communities, due to a combination of an open fire and lack of ventilation."

Studying air pollution and respiratory health in the past can be challenging because human remains do not always provide clear signs, due to inadequate preservation. Small particles of PM2.5 can travel deep into the lungs, where they become embedded in the tissue and can even enter the blood stream, triggering an inflammatory response beyond the lungs. The remains of many of the inhabitants of Çatalhöyük show signs of osteoperiostitis, or bone lesions, which can be a response to infection, and the research team suggests that this may be explained by the chronic exposure to PM2.5 that this community would have had.

Professor Anil Namdeo, Professor of Air Quality Management, Northumbria University, said: "This work has important implications for the current era. Many communities all around the world still use biomass for cooking and heating, and in poorly ventilated houses, resulting in more than four million deaths each year associated with indoor air pollution. Our study highlights this issue and could pave the way for developing mitigation measures to minimise this."

Credit: 
Newcastle University

Ivermectin treatment in humans for reducing malaria transmission

Malaria still kills hundreds of thousands of people every year. Researchers are excited by a new intervention: giving people a drug that kills the mosquitoes that bite them. Remarkably, this is already a reality, as the drug ivermectin, widely used for the control of parasitic infections such as lymphatic filariasis and onchocerciasis, appears to do exactly this. With some mosquitoes now resistant to the insecticides used in treated bed nets, this is a potentially important new control measure.

LSTM's Dr Rebecca Thomas and Dr Joseph Okebe, together with Dr de Souza from the Noguchi Memorial Institute for Medical Research (NMIMR), University of Ghana, first examined the experimental evidence that giving the drug to people kills the mosquitoes that bite them.

All included studies showed large effects of ivermectin on mosquito mortality. The authors then went on to seek studies that evaluated whether this made a difference to malaria transmission and to the amount of malaria in people living in malarial areas. They found that only one such study has been published to date. When the original report of that trial was published in 2019, it attracted considerable press coverage stating that ivermectin reduced malaria in children.

Unfortunately, although this paper was published in the Lancet, the statistical methods used in the analysis were not standard and did not adjust adequately for clustering, so the effect may have been overestimated.
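The clustering issue can be made concrete with the standard design-effect formula, DEFF = 1 + (m - 1) * ICC, where m is the cluster size and ICC the intra-cluster correlation; the numbers below are hypothetical illustration values, not figures from the Burkina Faso trial:

```python
# Sketch of why ignoring clustering overstates evidence in a
# cluster-randomized trial. Outcomes within a village are correlated, so
# the design effect DEFF = 1 + (m - 1) * ICC inflates the variance
# relative to an individually randomized trial of the same size.
# Cluster size and ICC here are hypothetical, not trial data.

def design_effect(cluster_size, icc):
    return 1 + (cluster_size - 1) * icc

def effective_sample_size(n_total, cluster_size, icc):
    # Number of independent observations the trial is "worth"
    return n_total / design_effect(cluster_size, icc)

# e.g. 800 children in 8 villages of 100, with a modest ICC of 0.05:
print(design_effect(100, 0.05))               # 5.95
print(effective_sample_size(800, 100, 0.05))  # ~134, far fewer than 800
```

An analysis that treats the 800 children as independent therefore produces confidence intervals that are far too narrow, which is the sense in which the original effect may have been overestimated.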

After the researchers who conducted the original study kindly supplied the data, it was reanalysed by the London School of Hygiene and Tropical Medicine's Dr Clémence Leyrat, an expert in issues of small samples and non-compliance in cluster-randomized trials, and Dr John Bradley, an expert in cluster-randomized trials, both of whom are part of the Cochrane review author team.

Dr Leyrat and Dr Bradley said: "To date there has been one small pilot trial of ivermectin. The statistical analysis of the trial was unsuitable, and the original publication overstated the amount of evidence it provided. With this reanalysis we show that the trial did not produce strong evidence that mass administration of ivermectin is useful in preventing malaria, leaving the question of its efficacy open. Fortunately, there are some larger trials in progress which will provide further evidence."

The one included study enrolled eight villages in Burkina Faso, which were randomly assigned to receive ivermectin or a control. All villages received ivermectin as part of the scheduled control of lymphatic filariasis; in addition, the treatment villages received five more doses of ivermectin, one every three weeks. The effect of ivermectin on malaria was measured in children younger than five years of age. In these children, the treatment did not show a notable difference in the presence of malaria between the treatment and control groups. Following their review of the data from this study, the team concluded that it is uncertain whether community administration of ivermectin influences malaria transmission.

Dr de Souza from the NMIMR commented, "There may be varied reasons for the observed lack of evidence. However, challenges to the use of ivermectin could include the duration of the reproductive cycle in mosquitoes, mosquito behaviour, and the incubation period of malaria parasites in the mosquito. While the conditions for laboratory experiments are carefully controlled, there may be wide variations in the natural environment. As such, a short-term increase in mosquito mortality alone through ivermectin administration could slightly reduce the risk of malaria transmission but may not be sufficient for a sustained impact. Given ivermectin's half-life in the blood, maintaining a high enough ivermectin concentration over several days and weeks to kill any blood-feeding, infective mosquito will be key."

Senior author, Dr Joseph Okebe agreed: "Unfortunately, it is currently not possible to say if treating an entire community with ivermectin reduces malaria. However, several research studies are in progress and, as such, we anticipate they will provide more answers to this important question in the future".

Credit: 
Liverpool School of Tropical Medicine

NIST laser 'comb' systems now measure all primary greenhouse gases in the air

image: NIST researchers used a laser frequency-comb instrument (illustration at lower right) to simultaneously measure three airborne greenhouse gases -- nitrous oxide, carbon dioxide and water vapor -- plus the major air pollutants ozone and carbon monoxide over two round-trip paths (arrows) from a NIST building in Boulder, Colo., to a reflector on a balcony of another building, and another reflector on a nearby hill.

Image: 
N. Hanacek/NIST

Researchers at the National Institute of Standards and Technology (NIST) have upgraded their laser frequency-comb instrument to simultaneously measure three airborne greenhouse gases -- nitrous oxide, carbon dioxide and water vapor -- plus the major air pollutants ozone and carbon monoxide.

Combined with an earlier version of the system that measures methane, NIST's dual comb technology can now sense all four primary greenhouse gases, which could help in understanding and monitoring emissions of these heat-trapping gases implicated in climate change. The newest comb system can also help assess urban air quality.

These NIST instruments identify gas signatures by precisely measuring the amount of light absorbed at each color in the broad laser spectrum as specially prepared beams trace a path through the air. Current applications include detecting leaks from oil and gas installations and measuring emissions from livestock. The comb systems can measure a larger number of gases than conventional sensors, which sample air at specific locations, and they offer greater precision and longer range than similar techniques using other sources of light.
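The principle behind turning absorption along an open-air path into a gas concentration can be sketched with the Beer-Lambert law; the absorption cross-section, path length and air density below are rough illustration values, not NIST's calibration data:

```python
import math

# Beer-Lambert sketch of how absorption along the open-air path maps to a
# gas concentration. The cross-section and conditions are hypothetical
# illustration values, not NIST's calibration data.

def transmitted_fraction(sigma_cm2, number_density_cm3, path_cm):
    """I/I0 after traversing the path (Beer-Lambert law)."""
    return math.exp(-sigma_cm2 * number_density_cm3 * path_cm)

AIR_DENSITY = 2.5e19          # molecules/cm^3 of air near sea level (approx.)
co2 = 420e-6 * AIR_DENSITY    # ~420 ppm CO2
sigma = 1e-22                 # cm^2, hypothetical line cross-section
path = 2e5                    # 2 km path, in cm

frac = transmitted_fraction(sigma, co2, path)
print(frac)  # ~0.81: roughly 19% of the light at this color is absorbed

# Inverting the same relation recovers the concentration from a measured dip:
recovered = -math.log(frac) / (sigma * path)
print(recovered / AIR_DENSITY * 1e6)  # ~420 ppm
```

Repeating this inversion at every comb tooth, across the absorption lines of several species at once, is what lets one instrument report all of the gases simultaneously.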

NIST's latest advance, described in a new paper, shifts the spectrum of light analyzed from the near-infrared into the mid-infrared, enabling the identification of more and different gases. The older, near-infrared comb systems can identify carbon dioxide and methane but not nitrous oxide, ozone or carbon monoxide.

Researchers demonstrated the new system over round-trip paths with lengths of 600 meters and 2 kilometers. The light from two frequency combs was combined in optical fiber and transmitted from a telescope located at the top of a NIST building in Boulder, Colorado. One beam was sent to a reflector located on a balcony of another building, and a second beam to a reflector on a hill. The comb light bounced off the reflector and returned to the original location for analysis to identify the gases in the air.

A frequency comb is a very precise "ruler" for measuring exact colors of light. Each comb "tooth" identifies a different color. To reach the mid-infrared part of the spectrum, the key component is a specially engineered crystal material, known as periodically poled lithium niobate, that converts light between two colors. The system in this experiment split the near-infrared light from one comb into two branches, used special fiber and amplifiers to broaden and shift the spectrum of each branch differently and to boost power, then recombined the branches in the crystal. This produced mid-infrared light at a lower frequency (longer wavelength) that was the difference between the original colors in the two branches.
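The difference-frequency step can be sketched numerically; the two near-infrared wavelengths below are hypothetical inputs chosen only to show that their difference frequency lands in the mid-infrared:

```python
# Difference-frequency generation in outline: mixing two beams in a
# nonlinear crystal yields light at the difference of their optical
# frequencies. The input wavelengths are hypothetical illustration values,
# not the actual wavelengths used in the NIST system.

C = 299_792_458.0  # speed of light, m/s

def difference_wavelength_nm(lambda1_nm, lambda2_nm):
    f1 = C / (lambda1_nm * 1e-9)  # optical frequency of branch 1, Hz
    f2 = C / (lambda2_nm * 1e-9)  # optical frequency of branch 2, Hz
    return C / abs(f1 - f2) * 1e9  # wavelength of the difference, nm

# Two near-infrared branches at ~1064 nm and ~1550 nm:
print(difference_wavelength_nm(1064, 1550))  # ~3393 nm, i.e. mid-infrared
```

This shows why subtracting two near-infrared colors lands at a longer wavelength: the difference of two large optical frequencies is a much smaller frequency, deep in the mid-infrared where the target gases absorb.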

The system was precise enough to capture variations in atmospheric levels of all of the measured gases and agreed with results from a conventional point sensor for carbon monoxide and nitrous oxide. A major advantage in detecting multiple gases at once is the ability to measure correlations between them. For example, measured ratios of carbon dioxide to nitrous oxide agreed with other studies of emissions from traffic. In addition, the ratio of excess carbon monoxide versus carbon dioxide agreed with similar urban studies but was only about one-third the levels predicted by the U.S. National Emissions Inventory (NEI). These levels provide a measure of how efficiently fuel combusts in emissions sources such as cars.

The NIST measurements, echoing other studies suggesting there is less carbon monoxide in the air than the NEI predicts, put the first hard numbers on the reference levels, or "inventories", of pollutants in the Boulder-Denver area.

"The comparison with the NEI shows how hard it is to create inventories, especially that cover large areas, and that it is critical to have data to feed back to the inventories," lead author Kevin Cossel said. "This isn't something that will directly impact most people on a day-to-day basis -- the inventory is just trying to replicate what is actually happening. However, for understanding and predicting air quality and pollution impacts, modelers do rely on the inventories, so it is critical that the inventories be correct."

Researchers plan to further improve the new comb instrument. They plan to extend its reach to longer distances, as already demonstrated for the near-infrared system, and to boost detection sensitivity by increasing the light power and making other tweaks, enabling detection of additional gases. Finally, they are working on making the system more compact and robust. These advances may help improve understanding of air quality, specifically the interplay of factors influencing ozone formation.

Credit: 
National Institute of Standards and Technology (NIST)

COVID-19 mRNA vaccine induces good immune response against coronavirus variants

A new Finnish study shows that 180 health care workers who had received two doses of the Pfizer-BioNTech vaccine had very good antibody responses against the SARS-CoV-2 virus. The immune response was just as strong against the alpha variant (formerly the UK variant) but was somewhat decreased against the beta variant (formerly the South Africa variant).

Finnish researchers from the University of Turku and the University of Helsinki, together with Turku University Hospital, Helsinki University Hospital, and the Finnish Institute for Health and Welfare, studied the immune response induced by the coronavirus vaccinations, which started in Finland in December. The researchers analysed vaccine responses in 180 health care workers, each of whom received two doses of the Pfizer-BioNTech mRNA vaccine.

All vaccinated subjects were found to have an excellent antibody level against the original virus after two vaccine doses. The immune response was just as strong against the alpha variant of the virus. Although the immune response against the beta variant was weaker, the vaccinated subjects did have neutralising antibodies that give relatively good protection against the variant.

"The study demonstrates the efficiency of the covid-19 vaccine and its ability to induce antibody responses in the working-age population regardless of age or sex. The vaccine is one of the most effective I have ever studied," says Professor of Virology Ilkka Julkunen, whose research group focuses on studying the immune response created by coronavirus vaccines and natural infections.

"After two doses, the immune response created by the covid-19 vaccine was even better than after a coronavirus infection with mild symptoms," describes Doctoral Candidate Pinja Jalkanen from the University of Turku.

"It is also very promising that nearly all of the vaccinated subjects had at least a small amount of neutralising antibodies against the beta variant," adds Professor Anu Kantele from Helsinki University Hospital and the Meilahti Vaccine Research Center (MeVac) of the University of Helsinki.

Credit: 
University of Turku

Eel products in the EU and the UK need better regulation

Growing in popularity, unagi kabayaki - grilled freshwater eel in soy sauce - can be found on the menu of many Japanese restaurants, and is stocked by Asian shops and in specialist supermarkets. But new research tracing the DNA of eel fillets used for this dish has found that fraudulent food labelling is rife, with a third of the products violating EU regulations on the provision of food information. With certain species of eels now endangered, the researchers say that accurate labelling on these products is vital if the global eel trade is to be sustainable.

The European eel is a critically endangered species with trade strictly regulated, and import and export banned across the EU's external borders. While the researchers found little evidence of illegal trade in European eel in the products they examined, the prevalence of fraudulent labelling suggests that EU, and current UK, labelling requirements are insufficient.

"Only through DNA analysis were we able to demonstrate that more than ten percent of the unagi kabayaki fillets were prepared from species other than that indicated on the label," said Florian Stein, the lead author, from the Technische Universität Braunschweig in Germany. He added, "In times when eel trafficking is considered to be one of the biggest wildlife crimes and consumer awareness regarding the source of products in general is increasing, the level of evident labelling fraud is alarming."

The researchers investigated the origins and labelling of 108 unagi kabayaki products for sale in the Netherlands, Germany, Belgium, France and the UK. Because the products are prepared fillets covered in sauce, the species could not be identified without molecular analysis. By extracting DNA and cross-referencing the sequences against a global database, the researchers could pinpoint the species involved and check the accuracy of each product label.
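The label-checking logic described above can be sketched in miniature. This is only an illustration of the idea: the species names are real eel species, but the "barcode" sequences, the identity metric, and the reference table are invented stand-ins for the actual DNA-barcoding pipeline and global sequence database the researchers used.

```python
# Toy sketch of DNA-based label checking: match a fillet's "barcode"
# sequence against reference sequences and flag label mismatches.
# Sequences here are invented placeholders, not real barcode data.

def identity(a: str, b: str) -> float:
    """Fraction of aligned positions at which two sequences agree."""
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / min(len(a), len(b))

REFERENCES = {  # hypothetical reference "barcodes"
    "Anguilla anguilla": "ACGTACGTACGTACGT",   # European eel
    "Anguilla japonica": "ACGTACGTTTGTACGT",   # Japanese eel
    "Anguilla rostrata": "ACGAACGTACGTTCGT",   # American eel
}

def identify(sample_seq: str) -> str:
    """Return the reference species with the highest percent identity."""
    return max(REFERENCES, key=lambda sp: identity(sample_seq, REFERENCES[sp]))

def check_label(sample_seq: str, declared_species: str) -> bool:
    """True if the declared label matches the best DNA match."""
    return identify(sample_seq) == declared_species

# A fillet labelled Japanese eel whose DNA best matches American eel
# would be flagged as mislabelled:
print(check_label("ACGAACGTACGTTCGT", "Anguilla japonica"))  # False
```

In the actual study the comparison was against curated global sequence databases rather than a three-entry dictionary, but the decision step - best sequence match versus declared label - is the same.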

The researchers found that none of the products purchased in Europe were produced in Europe; all were imported from China and Taiwan. Each product contained the fillets of a single eel species, but four species were detected across the sample group, only one of which is found in Chinese waters: 73 samples were American eel and 33 were Japanese eel, with one sample each of European eel and Indian shortfin eel.

"The presence of eels originating from various parts of the world points at the global nature of the eel trade", said Vincent Nijman, Professor in Anthropology at Oxford Brookes University in the UK, and one of the authors of the paper.

"American eel is transported from the east coast of the US to southern China, where it is turned into eel fillets, these are then exported to the Netherlands from where they end up in UK supermarkets. At the same time, in another shop, also in the UK, you can buy similar looking fillets that are actually Indian eels imported from Germany, that also were processed in China but which originated from perhaps the Philippines".

The research paper says it is vital that the EU and the UK achieve straightforward labelling requirements that include the scientific name of the species, and that this is made mandatory for prepared and preserved fish products.

Andrew Kerr, Chairman of the Sustainable Eel Group in Brussels, who was not involved in the research, noted, "Eel is essentially a wild fish and finite - control is therefore an essential requirement for a sustainable global trade. Accurate and fully traceable labelling are in everyone's long term interests."

Credit: 
Oxford Brookes University

SLAS Discovery's July special edition 'Drug discovery targeting COVID-19' now available

Oak Brook, IL - The July edition of SLAS Discovery is a Special Edition featuring the cover article, "Development of a High-Throughput Screening Assay to Identify Inhibitors of the SARS-CoV-2 Guanine-N7-Methyltransferase Using RapidFire Mass Spectrometry" by Lesley-Anne Pearson, Charlotte J. Green, Ph.D., De Lin, Ph.D., Alain-Pierre Petit, Ph.D., David W. Gray, Ph.D., Victoria H. Cowling, Ph.D., and Euan A. F. Fordyce, Ph.D., (Drug Discovery Unit, School of Life Sciences, University of Dundee, Dundee, UK).

In January 2021, a survey of immunologists, infectious-disease researchers and virologists found that 90% of respondents believe SARS-CoV-2 will become endemic, continuing to circulate in pockets of the global population for years to come. Even as vaccines are becoming more widely available, there are people who either do not respond to the treatment or are not suitable for vaccination. There is a critical need to develop small molecule inhibitors for this pathogen.

The cover article highlights the work of the Drug Discovery Unit at the University of Dundee (Dundee, Scotland, UK) reporting on the development of a high-throughput biochemical assay to assess the impact of small molecules on the methyltransferase activity of SARS-CoV-2 nonstructural protein 14 (nsp14). This enzyme is responsible for the N7-methylation of the cap at the 5' end of viral RNA and is critical in helping coronaviruses evade host defenses. The label-free MS-based assay developed was used to screen a library of 1771 FDA-approved drugs. The chemical hits that were identified may serve as starting points for drug discovery programs aimed at delivering therapeutics for the SARS-CoV-2 virus.

The July issue of SLAS Discovery includes four articles of original research.

These include:

Development of a Novel Label-Free and High-Throughput Arginase-1 Assay Using Self-Assembled Monolayer Desorption Ionization Mass Spectrometry

A Solid Supported Membrane-Based Technology for Electrophysical Screening of B0AT1-Modulating Compounds

Characterization of Transport Activity of SLC11 Transporters in Xenopus laevis Oocytes by Fluorophore Quenching

High-Throughput Phenotypic Assay for Compounds That Influence Mitochondrial Health Using iPSC-Derived Human Neurons

Other articles include:

Development of a High-Throughput Screening Assay to Identify Inhibitors of the SARS-CoV-2 Guanine-N7-Methyltransferase Using RapidFire Mass Spectrometry

A High-Throughput Radioactivity-Based Assay for Screening SARS-CoV-2 nsp10-nsp16 Complex

Label-Free Screening of SARS-CoV-2 NSP14 Exonuclease Activity Using SAMDI Mass Spectrometry

A Quantitative Bioassay to Determine the Inhibitory Potency of NGF-TrkA Antagonists

Access to July's SLAS Discovery issue is available at https://journals.sagepub.com/toc/jbxb/current.

For more information about SLAS and its journals, visit https://www.slas.org/publications/slas-discovery/. Access a "behind the scenes" look at the latest issue with the SLAS Discovery Author Insights podcast. Tune in by visiting https://www.buzzsprout.com/1099559.

Credit: 
SLAS (Society for Laboratory Automation and Screening)

Antibiotic-resistant bacteria found in cattle

image: New research from the University of Georgia uncovered antimicrobial-resistant salmonella in cows.

Image: 
Andrew Davis Tucker/UGA

Growing resistance to our go-to antibiotics is one of the biggest threats the world faces. As common bacteria like strep and salmonella become resistant to medications, what used to be easily treatable infections can now pose difficult medical challenges.

New research from the University of Georgia shows that there may be more antimicrobial-resistant salmonella in our food animals than scientists previously thought.

Using technology developed by UGA researcher Nikki Shariat, she and Amy Siceloff, a first-year doctoral student in UGA's Department of Microbiology, found that traditional culturing methods used to test livestock for problematic bacteria often miss drug-resistant strains of salmonella. This finding has implications for treating sick food animals and for the people who get infected by eating contaminated meat.

The study, published in Antimicrobial Agents and Chemotherapy, showed that 60% of cattle fecal samples contained multiple strains of salmonella that traditional testing methods missed. More alarmingly, Shariat found that about one out of every 10 samples tested positive for a drug-resistant strain of salmonella called Salmonella Reading. In addition to being antibiotic resistant, Salmonella Reading can cause severe illness in people.

A new technology emerges

Developed by Shariat in 2015, CRISPR-SeroSeq enables researchers to analyze all the types of salmonella present in a given sample. Traditional methods only examine one or two colonies of bacteria, potentially missing some strains of salmonella altogether. Shariat's technology identifies molecular signatures in salmonella's CRISPR regions, a specialized part of the bacteria's DNA. It also helps researchers identify which strains of the bacteria are most abundant.

In the current study, Shariat and colleagues found multiple salmonella strains in cattle feces before the animals were treated with the antibiotic tetracycline. After treatment, several of the dominant salmonella strains in the sample were wiped out, allowing Salmonella Reading to flourish.

Traditional culturing methods missed the antibiotic-resistant strain in the original samples. It was only once the antibiotic eliminated the more abundant strains that conventional methods were able to detect Salmonella Reading in the samples.

"This suggests that traditional tests have underestimated the amount of antibiotic-resistant bacteria in the past," said Shariat, an assistant professor of population health in the College of Veterinary Medicine.

But CRISPR-SeroSeq is a much more sensitive tool. It flagged the Salmonella Reading before antibiotic treatment.

"We need to know the antimicrobial resistance profiles of the bacteria that are present in animals," Shariat said. "That knowledge could make us change our choice of the type of antibiotic we use to treat ill animals. It can also help us select the best antibiotic for people who get sick from eating contaminated meat."

Missing the mark

Shariat's research shows that current surveillance efforts are likely underestimating the amount of antimicrobial resistance that exists.

Agencies that track antimicrobial resistance, like the FDA, USDA and CDC, among others, still rely on traditional sampling methods, which means they may be missing reservoirs of drug-resistant bacteria.

"The problem is you have hundreds of salmonella colonies in a given sample, but you only pick one or two of them to test," Shariat said. "It becomes a numbers game where researchers only pick the most abundant ones, and this means that they underestimate the different types of salmonella that are present."

Using CRISPR-SeroSeq can help fill that knowledge gap, giving researchers a better idea of how much antibiotic-resistant bacteria exists. This information can help livestock farmers reduce and control outbreaks and guide policy on how to fight back against a growing public health threat.

Credit: 
University of Georgia

Autonomous excavators ready for around the clock real-world deployment

video: Baidu's Autonomous Excavator System (AES) is among the world's first uncrewed excavation systems to have been deployed in real-world scenarios and continuously operating for over 24 hours.

Image: 
Baidu Research

Researchers from Baidu Research Robotics and Auto-Driving Lab (RAL) and the University of Maryland, College Park, have introduced an autonomous excavator system (AES) that can perform material loading tasks for a long duration without any human intervention while offering performance closely equivalent to that of an experienced human operator.

AES is among the world's first uncrewed excavation systems to have been deployed in real-world scenarios and to have operated continuously for over 24 hours, bringing industry-leading benefits in terms of enhanced safety and productivity.

The researchers described their methodology in a research paper published on June 30, 2021, in Science Robotics.

"This work presents an efficient, robust, and general autonomous system architecture that enables excavators of various sizes to perform material loading tasks in the real world autonomously," said Dr. Liangjun Zhang, corresponding author and the Head of Baidu Research Robotics and Auto-Driving Lab.

Excavators are vital for infrastructure construction, mining, and rescue applications. The global market size for excavators was $44.12 billion in 2018 and is expected to grow to $63.14 billion by 2026.

Given this projected market increase, construction companies worldwide are facing hiring shortages for skilled heavy machinery operators, particularly excavators. Additionally, COVID-19 continues to exacerbate the labor shortage crisis. Another contributing factor is the hazardous and toxic work environments that can impact the health and safety of on-site human operators, including cave-ins, ground collapses, or other excavation accidents that cause approximately 200 casualties per year in the U.S. alone.

The industry is therefore taking a scientific approach and looking to create excavator robots that can provide groundbreaking solutions to meet these needs, making the development of systems such as AES a growing trend alongside the implementation of other robots in manufacturing, warehouses, and autonomous vehicles.

While most industry robots are comparatively smaller and function in more predictable environments, excavator robots are required to operate in an extensive range of hazardous environmental conditions. They must be able to identify target materials, avoid obstacles, handle uncontrollable environments, and continue running under difficult weather conditions.

AES uses accurate and real-time algorithms for perception, planning, and control alongside a new architecture to incorporate these capabilities for autonomous operation. Multiple sensors - including LiDAR, cameras, and proprioceptive sensors - are integrated for the perception module to perceive the 3D environment and identify target materials, along with advanced algorithms such as a dedusting neural network to generate clean images.

With this modular design, the AES architecture can be effectively utilized by excavators of all sizes - including 6.5 and 7.5 metric ton compact excavators, 33.5 metric ton standard excavators, and 49 metric ton large excavators - and is suitable for diverse applications.

To evaluate the efficiency and robustness of AES, researchers teamed up with a leading equipment manufacturing company to deploy the system at a waste disposal site, a toxic and harmful real-world scenario where automation is in strong demand. Despite the challenging assignment, AES was able to operate continuously for more than 24 hours without any human intervention. AES has also been tested in winter weather conditions, where vaporization can pose a threat to the sensing performance of LiDAR. The excavation rate, for material in both wet and dry form, was 67.1 cubic meters per hour for a compact excavator, which is in line with the performance of a traditional human operator. "AES performs consistently and reliably over a long time, while the performance of human operators can be uncertain," said Dr. Zhang.

Researchers also set up ten different scenarios at a closed testing field to see how the system performed in numerous real-world tasks. After testing a variety of large, medium-sized, and compact excavators, AES was ultimately proven to match the average efficiency of a human operator in terms of the amount of materials excavated per hour.

"This represents a key step moving towards deploying robots with long operating periods, even in uncontrolled indoor and outdoor environments," said Dr. Dinesh Manocha, Distinguished University Professor of Computer Science and Electrical and Computer Engineering at the University of Maryland, College Park.

Going forward, Baidu Research RAL will continue to refine core modules of AES and further explore scenarios where extreme weather or environmental conditions may be present.

Baidu has been collaborating with several of the world's leading construction machinery companies to automate traditional heavy construction machinery with AES. "We aim to leverage our robust and secure platform, infused with our powerful AI and cloud capabilities to transform the construction industry," said Dr. Haifeng Wang, CTO of Baidu.

Credit: 
Baidu Inc

Study finds breast cancer's response to tumor stiffness may predict bone metastasis

In cases of breast cancer, bone metastasis - when cancer cells spread to new sites in the bone - causes the most breast cancer-related harm and is often incurable in advanced disease. A new study by University of Arizona Health Sciences researchers found that cancer cells become more aggressive when exposed to tissue stiffening and that these changes persist over time.

Tumor stiffening, which develops as diseased breast tissue becomes fibrotic, plays a major role in how breast cancer cells spread throughout the body. The paper, "Breast tumor stiffness instructs bone metastasis via maintenance of mechanical conditioning," published today in the journal Cell Reports, found that the stiffness of the breast tumor microenvironment can cause changes to cancer cells that make them more aggressively spread to the bone. The resulting changes are maintained as "mechanical memory," which instructs the cancer cells to send signals that lead to the breakdown of bone. Once this happens, patients often suffer debilitating complications like spontaneous fractures.

"Unfortunately, bone metastasis is normally not identified until an advanced state when it's not reversible," said senior author Ghassan Mouneimne, PhD, associate professor of cellular and molecular medicine and cancer biology in the UArizona College of Medicine - Tucson. "What's really exciting is one day being able to take a sample from the patient's primary tumor and predict who is at high risk for bone metastasis. Then we could intervene with a prevention strategy that we are now validating in the lab."

The study, which is the first to demonstrate the concept of mechanical memory during cancer metastasis, developed a novel mechanical conditioning, or "MeCo," score to quantify the cellular changes. Eventually, researchers hope the MeCo score can be used to help identify breast cancer patients who might benefit from repurposed antifibrotic treatments to prevent bone metastasis.

"The higher the patient's breast tumor MeCo score, the higher the likelihood they would go on to have bone metastasis and poorer outcomes," said Casey Romanoski, PhD, assistant professor of cellular and molecular medicine and a member of the BIO5 Institute and UArizona Cancer Center. "This stiffness signature could have incredible clinical utility."

To further explore the clinical application, Dr. Mouneimne and Adam Watson, PhD, a former graduate student and postdoctoral fellow at the UArizona Cancer Center, worked with Tech Launch Arizona, the office of the university that commercializes inventions stemming from research, to launch a startup, MeCo Diagnostics, LLC. The company is working toward maturing the technology and bringing it to the marketplace where it can impact the lives of breast cancer patients everywhere.

It was previously known that tumor stiffness induces cellular changes that lead to a more aggressive cancer, but according to Dr. Watson, lead author on the paper, the concept of "stiffness" was misleading.

"Most early-stage breast tumors are stiffer than surrounding tissue, yet most don't spread to bone," he said. "It's not about tumor stiffness but rather stiffness responsiveness of the cancer cells, which we call mechanical conditioning."

To study this phenomenon, the team created a laboratory environment that mimicked the stiff or soft tumor environments encountered in the body and assessed how breast cancer cells responded. They found that cells grown in a stiff environment had a "mechanoresponse" characterized by cell spreading, invasion and turning on genes linked with both bone development and disease. And these gene changes endured even after the cells were moved to a soft environment.

Next, researchers looked at what genes were turned on and off in breast cancer cells in response to the stiff environments. Based on these gene expression changes, they developed the MeCo score, which was validated and refined using data from thousands of patients with breast cancer.
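To give a flavor of how a score can be built from gene expression changes, here is a deliberately simplified sketch: each sample is scored by the average z-score of a panel of stiffness-responsive genes, so samples with uniformly elevated panel expression score highest. The gene panel (apart from RUNX2, which the article names), the expression values, and the scoring formula are all invented for illustration; the actual MeCo score derivation is more involved and was validated on data from thousands of patients.

```python
# Toy illustration of a gene-expression-based score: per-sample score is
# the mean z-score across a hypothetical stiffness-responsive gene panel.
# Gene names (except RUNX2) and all values are invented for illustration.
from statistics import mean, pstdev

COHORT = {
    "RUNX2":  [2.1, 1.0, 3.2, 0.8],
    "GENE_A": [5.0, 4.1, 6.2, 3.9],
    "GENE_B": [1.2, 0.9, 1.8, 0.7],
}

def zscores(values):
    """Standardize one gene's expression across the cohort."""
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma for v in values]

def panel_scores(cohort):
    """Per-sample score = mean z-score over the panel genes."""
    per_gene = {g: zscores(v) for g, v in cohort.items()}
    n_samples = len(next(iter(cohort.values())))
    return [mean(per_gene[g][i] for g in cohort) for i in range(n_samples)]

scores = panel_scores(COHORT)
# The third sample has the highest expression of every panel gene,
# so it receives the highest score.
print(max(range(len(scores)), key=scores.__getitem__))  # 2
```

In this framing, a "high-MeCo" sample is simply one whose stiffness-responsive genes are coordinately elevated relative to the cohort.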

"This is the culmination of a lot of work by researchers from many different fields," Dr. Mouneimne said. "It highlights the environment we have at the University of Arizona Health Sciences, and how working together can make progress in this challenging area of breast cancer metastasis."

Future investigations could focus on how cancer cells maintain the gene expression changes that drive metastasis, based on additional findings that identified a transcription factor called RUNX2 that was activated by fibrotic-like stiffness. RUNX2 stays attached to the DNA as the cell divides and "bookmarks" which genes remain turned on, which includes the genes that drive bone metastasis and the breakdown of bone.

Credit: 
University of Arizona Health Sciences

Knowledge of nurses for pain management of patients on maintenance hemodialysis

In hospitals, nurses are the first to deal with patients in pain. For any ailment, including in patients receiving hemodialysis, nurses' knowledge of pain management allows them to provide optimal care.

This qualitative study aims to explore the experiences, perceptions, and beliefs of nurses serving in hemodialysis units with respect to pain management practices. The study helped identify the educational needs of nurses to improve pain management in practice. A purposive sample of 16 nurses working in four outpatient hemodialysis units in Amman, Jordan, was recruited for the research. Data were collected using semi-structured interviews.

The study identified five themes and fifteen subcategories through which nurses' knowledge of pain management can be analyzed. The findings will be useful for evaluating pain management practices for patients on maintenance hemodialysis, developing educational programs for nurses serving in hemodialysis units, focusing on improving pain management, and providing knowledge regarding these issues.

Credit: 
Bentham Science Publishers

Conservation aquaculture could bring more native oysters to west coast

Ten estuaries on the West Coast of North America have been identified as priority locations for expanding the use of conservation aquaculture in a study led by the Native Olympia Oyster Collaborative and funded by the Science for Nature and People Partnership (SNAPP). SNAPP is a research collaboration supported by the National Center for Ecological Analysis & Synthesis (NCEAS) at UC Santa Barbara.

The study, published in PLOS ONE, recommends locations and methods for the strategic expansion of conservation aquaculture to bring back Olympia oyster populations -- both to local estuaries where they have most declined, and into more local restaurants for oyster lovers to dine on. The authors propose using aquaculture in these estuaries -- seven of which are in California -- in a win-win scenario that supports severely declined Olympia oyster populations, while also benefiting people, including local shellfish growers and Tribal communities.

"If you've eaten oysters on the half shell anywhere on the West Coast of North America, chances are good that you've been eating one of a few species introduced to the region for just that purpose," said April Ridlon, SNAPP postdoctoral scholar, collaborative lead of the Native Olympia Oyster Collaborative (NOOC) and lead author of the study. That's partly because the oyster native to this coast, the Olympia oyster (Ostrea lurida), was overfished in the Gold Rush era, and some populations -- faced with other stressors like habitat changes and sedimentation -- never recovered.

Humans have used aquaculture -- growing aquatic animals and plants to produce food -- for millennia. "The oysters served at your local oyster bar have likely been grown first in a hatchery in tanks, then put out into a bay or estuary until they reach an appropriate size to be harvested and make it onto your plate," Ridlon said. "This same process can also be used to restore declining wild populations, similar to captive breeding programs for endangered species like California Condors and Hawaiian Monk Seals. Native oysters that are raised in a hatchery can be added to local estuaries permanently to help boost their numbers where populations have severely declined. Using aquaculture techniques to support wild populations of native species is what we call conservation aquaculture."

Restoration of Olympia oyster populations doesn't always require the use of aquaculture; many projects have been implemented across the species' range from British Columbia to Baja California, without using aquaculture at all (see the NOOC Restoration Site Map). Aquaculture techniques can also pose risks, including unintended negative ecological and genetic outcomes resulting from releasing hatchery-raised oysters into wild populations, many of which aren't well understood.

This new study thoroughly evaluates the risks and rewards, and recommends aquaculture only at ten priority estuaries where using it is critical to restore oyster populations, and where the benefits clearly outweigh the risks. Two priority estuaries are in the Puget Sound region of Washington: Northern Puget Sound and Whidbey Basin; one is in Oregon: Netarts Bay. The remaining seven are in California: Humboldt Bay, Tomales Bay, Richardson Bay, Elkhorn Slough, Morro Bay, Carpinteria Marsh, and Mugu Lagoon. Aquaculture can be used to support conservation efforts at all of these sites, without growing native oysters to harvest or sell them.

At some of the priority estuaries with good water quality and a nearby hatchery, commercial aquaculture and harvest also are possible, making this tool a unique way to support both oysters and people. In these estuaries, oyster offspring produced by commercial aquaculture may be swept into the bay and settle in local wild populations, increasing their numbers. Growers may also benefit from adding a new oyster species to those that they grow -- one that is more resilient to the diseases or climate-related events that can wipe out other oyster species entirely.

In Puget Sound, where the Olympia oyster once supported the shellfish industry, many farmers are already growing the native oyster, supported by a niche market of "foodies" looking for a different flavor and the "tide-to-table" experience. "Olys -- the industry nickname for Olympia oysters -- have a unique flavor profile. They're smaller than most other oysters and pack a punch of flavor," said Shina Wysocki, owner of Chelsea Farms Oyster Bar in Olympia, Wash. "They also provide an authentic connection to the traditional foodways of the West Coast." Puget Sound Restoration Fund (PSRF) has thoughtfully engaged with commercial growers to explore conservation aquaculture of Olympia oysters in the region, and hopes to continue this approach at the two priority estuaries identified by the study: Northern Puget Sound and Whidbey Basin.

"This model of partnership between growers and restoration organizations could be used in other priority estuaries to expand the toolkit for bringing native oysters back to mudflats and menus in the future," said Betsy Peabody, executive director of PSRF.

And the Olympia oyster isn't the only marine species that can benefit from using aquaculture as a tool to restore its populations. "There is a growing interest in using conservation aquaculture for marine species globally -- to support native populations of everything from kelp to abalone to giant clams," said Tiffany Waters, co-author and global aquaculture manager for The Nature Conservancy. "This kind of collaborative research is so exciting, as it can be used as a model for other species and to provide win-win opportunities for both nature and people."

Credit: 
University of California - Santa Barbara

Did your plastic surgeon really turn back the clock? Artificial intelligence may be able to quantify how young you actually look after facelift surgery

June 30, 2021 - For most patients, the reasons for having a facelift are simple: to "turn back the clock" for a younger and more attractive appearance. Even during the pandemic year 2020, more than 234,000 patients underwent facelift surgery, according to American Society of Plastic Surgeons (ASPS) statistics.

When considering facelift surgery, patients may ask, "How much younger will I look?" For plastic surgeons, that has been a difficult question to answer. Typically, the cosmetic outcomes of facelifting have been judged on a case-by-case basis, or with the use of subjective ratings.

Now research suggests a new, objective approach to assessing the reduction in apparent age after facelift surgery: artificial intelligence (AI) networks trained to estimate age based on facial photos. "Our study shows that currently available AI algorithms can recognize the success of facelifting, and even put a number on the reduction in years of perceived age," comments senior author James P. Bradley, MD, Vice Chairman of Surgery, Zucker School of Medicine at Hofstra/Northwell. The study is published in the July issue of Plastic and Reconstructive Surgery®, the official medical journal of the ASPS.

The study used a type of AI algorithm called convolutional neural networks. "By training on datasets containing millions of public images, these neural networks can learn to discern facial features with much higher 'experience' than a typical person," Dr. Bradley explains.

Four different, publicly available neural networks were used to make objective age estimates of facial age for 50 patients who underwent facelifting. The AI estimates were made using standardized photos taken before and at least one year after facelift surgery. The results were compared with patients' subjective ratings of their appearance, along with responses to a standard patient-rated evaluation (FACE-Q questionnaire).

The patients were all women, average age 58.7 years. The AI algorithms used in the study were 100 percent accurate in identifying the patients' age, based on "before" photos.

In the "after" photos, the neural networks recognized a 4.3-year reduction in age after facelift surgery. That was substantially less than the 6.7-year reduction, as rated by patients themselves. "Patients may tend to overestimate how much younger they look after facelift surgery - perhaps reflecting their emotional and financial investment in the procedure," Dr. Bradley comments.

On the FACE-Q questionnaire, patients were highly satisfied with the results of their facelift surgery: average scores (on a 0-to-100 scale) were 75 for facial appearance and over 80 for quality of life. Neural network estimates of age reduction were directly correlated with patient satisfaction. "The younger the AI program perceives a patient's age, the greater their satisfaction with the results of their facelift," says Dr. Bradley.

Artificial intelligence algorithms can provide an objective and reliable estimate of the apparent reduction in age after facelift surgery, the new findings suggest. These age estimates also seem to provide an indicator of patient satisfaction scores - even if the reduction in years doesn't quite match the patient's own subjective rating.

"Together with powerful image analysis tools used in modern plastic surgery, neural networks may play a useful role in counseling patients and demonstrating successful results of facial rejuvenation procedures," Dr. Bradley adds. "We think that AI algorithms could also play a useful role for plastic surgeons in assessing their own results and comparing the outcomes of different techniques."

Credit: 
Wolters Kluwer Health