
Vapers show chemical changes in their genome linked to cancer

Biologically important changes in DNA seen in smokers are also being found in people who vape, according to a new study published in the journal Epigenetics.

A team of scientists at the Keck School of Medicine of USC has found that people who vape exhibit chemical modifications in their overall genome, and in specific parts of their DNA, similar to those found in people who smoke cigarettes.

These specific chemical alterations, also known as epigenetic changes, can cause genes to malfunction -- and are commonly found in nearly all types of human cancer as well as other serious diseases.

The findings add to a growing list of health concerns associated with vaping, which is perceived by many as a safer alternative to smoking. E-cigarette use has soared among youth, with more than 25 percent of high school students now using the products, according to the CDC.

The new study, led by Ahmad Besaratinia, PhD, associate professor at the Keck School of Medicine of USC, examined a group of people matched for age, gender and race, divided equally into three categories: vapers only, smokers only and a control group of people who neither vaped nor smoked.

Blood was drawn from each of the participants and tested for changes in levels of two specific chemical tags attached to DNA that are known to impact gene activity and/or function. These chemical tags include: (1) methyl groups in a specific DNA sequence, named Long Interspersed Nucleotide Element 1 (LINE-1); and (2) hydroxymethyl groups in the genome overall. Changes in the levels of these chemical tags, which are important for genomic stability and regulation of gene expression, occur in various stages of development, as well as in diseases such as cancer.

Among the 45 study participants, both vapers and smokers showed significant reductions in the levels of both chemical tags compared with the control group. This is the first study to show that vapers, like smokers, have these biologically important changes detectable in their blood cells.

"That doesn't mean that these people are going to develop cancer," said Besaratinia. "But what we are seeing is that the same changes in chemical tags detectable in tumors from cancer patients are also found in people who vape or smoke, presumably due to exposure to cancer-causing chemicals present in cigarette smoke and, generally at much lower levels, in electronic cigarettes' vapor."

This is the newest study Besaratinia's team has done on vapers and smokers. Their earlier study published last year (IJMS, 2019) examined changes in gene expression in epithelial cells taken from the mouths of vapers and smokers compared to a control group. In that study, both vapers and smokers showed abnormal gene expression in a large number of genes linked to cancer.

"Our new study adds an important piece to that puzzle by demonstrating that epigenetic mechanisms, specifically changes in chemical tags attached to the DNA, may contribute to the abnormal expression of genes in vapers and smokers alike," said Besaratinia.

He and his team plan to continue their research. The next step is to look at the whole genome and identify all the genes targeted by these two chemical changes in vapers versus smokers.

"Considering the established role many genes play in human diseases, this investigation should provide invaluable information, which may have immediate public health and policy implications, said Besaratinia. "The epidemic of teen vaping and the recent outbreak of vaping-related severe lung injury and deaths in the U.S. underscore the importance of generating scientific evidence on which future regulations for electronic cigarette manufacturing, marketing, and distribution can be based."

Credit: 
Keck School of Medicine of USC

Fossilized insect from 100 million years ago is oldest record of primitive bee with pollen

image: 100-million-year-old Discoscapa apicula. The bee is carrying four beetle triungulins.

Image: 
Image provided by George Poinar Jr., OSU College of Science.

CORVALLIS, Ore. - Beetle parasites clinging to a primitive bee 100 million years ago may have caused the flight error that, while deadly for the insect, is a boon for science today.

The female bee, which became stuck in tree resin and thus preserved in amber, has been identified by Oregon State University researcher George Poinar Jr. as a new family, genus and species.

The mid-Cretaceous fossil from Myanmar provides the first record of a primitive bee with pollen and also the first record of the beetle parasites, which continue to show up on modern bees today.

The findings, published in BioOne Complete, shed new light on the early days of bees, a key component in evolutionary history and the diversification of flowering plants.

Insect pollinators aid the reproduction of flowering plants around the globe and are also ecologically critical as promoters of biodiversity. Bees are the standard bearer because they're usually present in the greatest numbers and because they're the only pollinator group that feeds exclusively on nectar and pollen throughout their life cycle.

Bees evolved from apoid wasps, which are carnivores. Not much is known, however, about the changes wasps underwent as they made that dietary transition.

Poinar, professor emeritus in the OSU College of Science and an international expert in using plant and animal life forms preserved in amber to learn more about the biology and ecology of the distant past, classified the new find as Discoscapa apicula, in the family Discoscapidae.

The fossilized bee shares traits with modern bees - including plumose hairs, a rounded pronotal lobe, and a pair of spurs on the hind tibia - and also those of apoid wasps, such as very low-placed antennal sockets and certain wing-vein features.

"Something unique about the new family that's not found on any extant or extinct lineage of apoid wasps or bees is a bifurcated scape," Poinar said, referring to a two-segment antennae base. "The fossil record of bees is pretty vast, but most are from the last 65 million years and look a lot like modern bees. Fossils like the one in this study can tell us about the changes certain wasp lineages underwent as they became palynivores - pollen eaters."

Numerous pollen grains on Discoscapa apicula show the bee had recently been to one or more flowers.

"Additional evidence that the fossil bee had visited flowers are the 21 beetle triungulins - larvae - in the same piece of amber that were hitching a ride back to the bee's nest to dine on bee larvae and their provisions, food left by the female," Poinar said. "It is certainly possible that the large number of triungulins caused the bee to accidently fly into the resin."

Credit: 
Oregon State University

Preliminary evidence suggests that new coronavirus cannot be passed from mother to child late in pregnancy

Evidence of intrauterine vertical transmission was assessed by testing for the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2, formerly the 2019 novel coronavirus or 2019-nCoV) in amniotic fluid, cord blood, and neonatal throat swab samples from six pregnancies. All samples were negative.

The study also evaluated the clinical characteristics of the 2019 novel coronavirus disease (COVID-19) infection in nine pregnant women.

There is currently no evidence that the 2019 novel coronavirus disease (COVID-19) causes severe adverse outcomes in neonates or that it can pass to the child while in the womb, according to a small observational study of women from Wuhan, China, who were in the third trimester of pregnancy and had pneumonia caused by COVID-19.

In the study, published in The Lancet, there were two cases of fetal distress, but all nine pregnancies resulted in livebirths. The study also finds that symptoms from COVID-19 infection in pregnant women were similar to those reported in non-pregnant adults, and no women in the study developed severe pneumonia or died.

The authors of the new study caution that their findings are based on a limited number of cases, over a short period of time, and only included women who were late in their pregnancy and gave birth by caesarean section. The effects of mothers being infected with the virus during the first or second trimester of pregnancy and the subsequent outcomes for their offspring remain unclear, as well as whether the virus can be passed from mother to child during vaginal birth.

The new study comes after the news of a newborn (born to a mother infected with COVID-19) testing positive for COVID-19 infection within 36 hours of birth, which prompted questions about whether the virus could be contracted in the womb.

Talking about this case, lead author of the study Professor Yuanzhen Zhang, Zhongnan Hospital of Wuhan University, China, says: "It is important to note that many important clinical details of this case are missing, and for this reason, we cannot conclude from this one case whether intrauterine infection is possible. Nonetheless, we should continue to pay special attention to newborns born to mothers with COVID-19 pneumonia to help prevent infections in this group." [1]

Co-author, Prof Huixia Yang, Peking University First Hospital, China, adds: "Existing studies into the effects of COVID-19 apply to the general population, and there is limited information about the virus in pregnant women. This is important to study because pregnant women can be particularly susceptible to respiratory pathogens and severe pneumonia, because they are immunocompromised and because of pregnancy-related physiological changes which could leave them at higher risk of poor outcomes. Although in our study no patients developed severe pneumonia or died of their infection, we need to continue to study the virus to understand the effects in a larger group of pregnant women." [1]

In the study, the medical records of nine pregnant women who had pneumonia caused by COVID-19 infection were retrospectively reviewed. Infection was lab-confirmed for all women in the study, and the authors studied the nine women's symptoms.

In addition, samples of amniotic fluid, cord blood, neonatal throat swabs and breast milk were taken for six of the nine cases [2] and tested for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). Importantly, the samples of amniotic fluid, cord blood, and neonatal throat swabs were collected in the operating room at the time of birth to guarantee that samples were not contaminated and best represented intrauterine conditions.

All mothers in the study were aged between 26 and 40 years. None of them had underlying health conditions, but one developed gestational hypertension from week 27 of her pregnancy, and another developed pre-eclampsia at week 31. Both patients' conditions were stable during pregnancy.

The nine women in the study had typical symptoms of COVID-19 infection, and were given oxygen support and antibiotics. Six of the women were also given antiviral therapy.

All nine pregnancies resulted in livebirths, and there were no cases of neonatal asphyxia. Four women had pregnancy complications (two had fetal distress and two had premature rupture of membrane), and four women had preterm labour which was not related to their infection and occurred after 36 gestational weeks. Two of the prematurely born newborns had a low birthweight.

The authors note that their findings are similar to observations of the severe acute respiratory syndrome (SARS) virus in pregnant women, where there was no evidence of the virus being passed from mother to child during pregnancy or birth.

They also explain that future follow-up of the women and children in the study will be necessary to determine their long-term safety and health.

They note some limitations in their study, including that the risk of infection in pregnant women and the effects of the time or mode of delivery on pregnancy outcomes were not evaluated. They say that future research is needed to determine whether COVID-19 could damage the placenta as this could increase risk of vertical transmission.

Writing in a linked Comment, Dr Jie Qiao (who was not involved in the study), Peking University Third Hospital, China, notes that this new research helps to understand the clinical characteristics, pregnancy outcomes, and vertical transmission potential of COVID-19, and notes that this is valuable for preventive and clinical practice in China and elsewhere under such emergent circumstances.

She also compares the effects of the virus to those of SARS, and says: "Previous studies have shown that SARS during pregnancy is associated with a high incidence of adverse maternal and neonatal complications, such as spontaneous miscarriage, preterm delivery, intrauterine growth restriction, application of endotracheal intubation, admission to the intensive care unit, renal failure, and disseminated intravascular coagulopathy. However, pregnant women with COVID-19 infection in the present study had fewer adverse maternal and neonatal complications and outcomes than would be anticipated for those with SARS-CoV-1 infection. Although a small number of cases was analysed and the findings should be interpreted with caution, the findings are mostly consistent with the clinical analysis done by Zhu and colleagues of ten neonates born to mothers with COVID-19 pneumonia."

Credit: 
The Lancet

New study shows Deepwater Horizon oil spill larger than the Obama administration claimed

image: On April 20, 2010, the Deepwater Horizon oil rig exploded, releasing 210 million gallons of crude oil into the Gulf of Mexico for a total of 87 days, making it the largest oil spill in U.S. history. Oil slicks from the blowout covered an estimated area of 57,000 square miles (149,000 square kilometers).

Image: 
US Coast Guard

MIAMI--Toxic and invisible oil spread well beyond the known satellite footprint of the Deepwater Horizon oil spill, according to a new study led by scientists at the University of Miami (UM) Rosenstiel School of Marine and Atmospheric Science. These new findings have important implications for environmental health during future oil spills.

The UM Rosenstiel School-led research team combined oil-transport modeling techniques with remote sensing data and in-water sampling to provide a comprehensive look at the oil spill. The findings revealed that a fraction of the spill was invisible to satellites, yet toxic to marine wildlife.

"We found that there was a substantial fraction of oil invisible to satellites and aerial imaging," said the study's lead author Igal Berenshtein, a postdoctoral researcher at the UM Rosenstiel School. "The spill was only visible to satellites above a certain oil concentration at the surface leaving a portion unaccounted for."

On April 20, 2010, the Deepwater Horizon oil rig exploded, releasing 210 million gallons of crude oil into the Gulf of Mexico for a total of 87 days, making it the largest oil spill in U.S. history. Oil slicks from the blowout covered an estimated area of 57,000 square miles (149,000 square kilometers).

These new findings, published in Science Advances, showed a much wider extent of the spill beyond the satellite footprint, reaching the West Florida shelf, the Texas shores, the Florida Keys and along the Gulf Stream towards the East Florida shelf.

"Our results change established perceptions about the consequences of oil spills by showing that toxic and invisible oil can extend beyond the satellite footprint at potentially lethal and sub-lethal concentrations to a wide range of wildlife in the Gulf of Mexico," said Claire Paris, senior author of the study and professor of ocean sciences the UM Rosenstiel School. "This work added a 3rd dimension to what was previously seen as just surface slicks. This additional dimension has been visualized with more realistic and accurate oil spill models developed with a team of chemical engineers and more efficient computing resources."

The new framework developed by the researchers can assist emergency managers and decision makers in better managing the impacts of future potential oil spills, said the authors.

Credit: 
University of Miami Rosenstiel School of Marine, Atmospheric, and Earth Science

Increasing number of grocery stores in some areas could reduce food waste up to 9%

Key takeaways from the study in the INFORMS journal Manufacturing & Service Operations Management:

An increase in the number of stores directly decreases consumer waste due to improved access to groceries.

Too many stores could increase store waste due to distribution of inventory and price competition.

The optimal solution to reduce food waste is to increase the number of stores in a given area, but not too much. The perfect balance will ensure a drop in food waste.

CATONSVILLE, MD, February 12, 2020 - Food waste is a big problem in the United States. According to the U.S. Department of Agriculture, food waste is estimated at between 30-40 percent of the food supply. New research in the INFORMS journal Manufacturing & Service Operations Management finds that increasing the number of grocery stores in certain areas can drastically decrease waste.

The study, "Grocery Store Density and Food Waste," was conducted by Elena Belavina of Cornell University and examined grocery industry, economic, and demographic data, finding that store density in most American cities is well below the optimal level, and modest increases in store density substantially reduces waste.

The research finds that areas like Manhattan have a lot of grocery store options and a lot of people. A high number of stores in this area reduces food waste by consumers and retailers. With more stores, households have to travel less to visit a store, which means consumers make more frequent trips with smaller purchases per trip. In turn, these smaller basket sizes imply less food waste because it's less likely the food will expire before it's used.

Thus, increasing grocery store density allows households more flexibility to decide how often and how many groceries to purchase to accommodate their needs. The author found that in a city like Chicago, results show even modest increases in store density can lead to substantial decreases in food waste.

"Just 3-4 more stores in the Chicago area can lead to 6-9% reduction in waste. This is accompanied by a 1-4% decrease in grocery expenses for households," said Belavina, an associate professor in the SC Johnson College of Business at Cornell. "What's more, increasing the number of grocery stores in a given area also works to combat emissions, while reducing consumer food expenditures, achieving two goals that are often considered competing."

"The key is finding the right number of stores for each area. An increase in the number of stores decreases consumer waste due to improved access to groceries, but too many store options increase retail waste due to relocating inventory, price competition and diminished demand by customers."

Credit: 
Institute for Operations Research and the Management Sciences

Study: Diet makes a difference in fight against hospital-acquired infection

LAS VEGAS - February 11, 2020 - Popular diets low in carbs and high in fat and protein might be good for the waistline, but a new UNLV study shows that just the opposite may help to alleviate the hospital-acquired infection Clostridioides difficile.

In a study published in mSystems, an open access journal of the American Society for Microbiology, UNLV scientists found that an interaction between antibiotic use and a high-fat/high-protein diet exacerbates C. diff infections in mice. Conversely, they found that a high-carbohydrate diet - which was correspondingly low in fat and protein - nearly eliminated symptoms.

C. diff, an intestinal infection designated as an urgent threat by the U.S. Centers for Disease Control and Prevention, is often acquired when antibiotics have wiped out the "good" bacteria in the gut. Hundreds of thousands of people are diagnosed with C. diff infections each year and more than 10,000 die.

"Every day, we are learning more about the human microbiome and its importance in human health," said Brian Hedlund, a UNLV microbiologist and study co-author. "The gut microbiome is strongly affected by diet, but the C. diff research community hasn't come to a consensus yet on the effects of diet on its risk or severity. Our study helps address this by testing several diets with very different macronutrient content. That is, the balance of dietary carbohydrate, protein, and fat were very different." 

Though studies suggest dietary protein exacerbates C. diff, there's little or no existing research exploring the interaction of a high-fat/high-protein diet with the infection. Hedlund and study co-author Ernesto Abel-Santos, a UNLV biochemist, caution that the study was conducted using an animal model, and more work is underway to begin to establish a link between these diets and infections in people.

"Extreme diets are becoming very popular but we do not know the long-term effects on human health and specifically on the health of the human gut flora," Abel-Santos said. "We have to look at humans to see if it correlates."

Recent studies suggest that because antibiotics kill bacterial species indiscriminately, the medications decimate populations of organisms that compete for amino acids, leaving C. diff free to propagate.

But Hedlund said the story is even more complex. "It's clear that it's not just a numbers game," he said. The new work suggests that diet may promote microbial groups that can be protective, even after antibiotics. For an infection to flourish, he said, "you might need this combination of wiping out C. diff competitors with antibiotics and then a diet that promotes overgrowth and disease."

The new study raised other questions as well. For example: The high-carb diet, which was protective against C. diff infection, gave rise to the least diverse community of microbes.

"Lots of papers say that a lower microbial diversity is always a bad thing, but in this case, it had the best disease outcome," said Abel-Santos. However, he cautions that a high-carb diet could lead to animals becoming asymptomatic carriers that can disseminate the infection to susceptible subjects.

Credit: 
University of Nevada, Las Vegas

Faith-centered tattoos are analyzed in study of university students

image: Religion-based tattoo on college student in Baylor University study.

Image: 
Kevin Dougherty

With more than a quarter of U.S. adults now having tattoos -- and nearly half of millennials sporting them -- only a handful of studies have focused on religious tattoos. But a new study by researchers at Baylor University and Texas Tech University analyzes faith-centered tattoos and is the first to use visual images of them.

The study, published in the journal Visual Studies, analyzed 752 photos of tattoos taken at a Christian university in the United States and found that nearly 20% of those were overtly religious in content.

"The embrace of tattoos in the United States reflects a generational shift toward greater individualism and self-expression," said lead author Kevin D. Dougherty, Ph.D., associate professor of sociology at Baylor University. "Americans born since the 1970s have increasingly embraced tattoos as an acceptable means to communicate identity and belonging, whereas previous generations of Americans largely did not. Today, men and women in the United States are equally likely to have tattoos."

A 2016 Harris Poll showed that 29% of American adults had at least one tattoo -- up from 14% in 2008.

"An interesting discovery in our research is that the religious tattoos of college students are more likely than non-religious ones to face inward, toward the owner," Dougherty said. An example is a tattoo on the inner wrist.

"We speculate that religious tattoos may serve a different purpose than do tattoos of favorite sports teams, occupations or hobbies. While any visible tattoo is a public proclamation, tattoos oriented toward the owner represent a personal reminder of identity or affiliation. In this way, religious tattoos are personal but not private. They may encourage individuals to live in accordance with their religious beliefs."

The study also found some evidence that a generally visible tattoo may be conceptually different from tattoos hidden by clothing, said co-author Jerome R. Koch, Ph.D., professor of sociology at Texas Tech University. He has studied body art on college campuses for more than a decade.

"Generally visible tattoos seem intended more toward stories of life and remembrance, which the wearer may be willing to openly discuss," Koch said. "Tattoos which are only visible, say, to someone else with whom they are intimately involved may be more closely tied to sense of self, private memories and/or emotional conflicts."

Photos used in the study were taken by sociology students as part of a semester-long research project. Researchers analyzed 752 photos by owners' gender; whether the tattoos were religious in nature; and tattoo size -- small (1 inch by 1 inch or smaller), medium (3 inches by 3 inches) or large (larger than 3 inches or more than a quarter of an arm or leg). The study also examined whether the tattoo faced the owner or faced out; and whether those with religious content featured an image, text or both image and text.

The analysis found that:

Overt religious content appeared in 145 photos (19% of total sample).

More men in the photos (23%) had religious tattoos than women (17%).

Of the religious tattoos on women, most (69%) were small and in more easily concealed locations. The most frequent sites of their religious tattoos were the wrist (23%), foot (18%) and back (18%).

Men's religious tattoos were more likely to be large than non-religious ones (61% compared to 44%). Most prevalent sites for men's religious tattoos were upper arm (26%), forearm (21%) and back (19%).

Half of the religious tattoos were images -- the most common being the cross. More than one quarter were text, often Bible references, with a slight majority being New Testament references. But the Old Testament book of Psalms was most popular. Images with text comprised 21% of religious tattoos.

Religious tattoos were more likely than non-religious ones to face the owner, with 26% facing inward, in contrast to 18% of non-religious tattoos.

Researchers said they have no way of knowing if these findings apply to all students at the university or to students at other universities. They also say it is probable that they undercounted religious tattoos -- in part because tattoos may have religious or spiritual connotations but not be recognized as such.

Dougherty and Koch are expanding their research to a national level with random samples.

"So far, all our work has involved college students as respondents," Koch said. "Since we know tattoos tell life stories, broadening our respondent base is the next logical step. How might life stories expressed through body art -- religious and otherwise -- differ by wider differences in the race, age and social class?"

"We have a study in progress on religion and tattoos in a national sample of U.S. adults," Dougherty said. "Our research question is: Do religious people in the United States today get tattoos? We also have plans for a national survey on religious tattoos. This will allow us to determine the percentage of Americans with religious tattoos and how they differ from other Americans without a tattoo or with tattoos that have no religious significance."

Future research also might examine how tattoos are viewed in other parts of the globe, Koch said.

"It would be interesting to compare and contrast the path toward legitimation of tattoos in different parts of the developed Western world," he said. "We have some information from other scholars that, for example, conservative Catholicism in Latin America may continue to stigmatize tattoo wearers. So broader religious/folk culture may be in play where there is greater antipathy or stricter cultural norms against body art. Conversely, some of our students have reported that the fact their tattoo was religious lent legitimation with their families more than a tattoo of another type might."

Credit: 
Baylor University

Researchers find test to ID children at higher risk for cystic fibrosis liver disease

AURORA, Colo. (Feb. 12, 2020) - A major multi-center investigation of children with cystic fibrosis has identified a test that allows earlier identification of those at risk for cystic fibrosis liver disease.

The study, which includes 11 clinical sites in North America, was led by Michael Narkewicz, MD, professor of pediatrics from the University of Colorado School of Medicine and Children's Hospital Colorado. The findings of the study are published today ahead of print in The Journal of Pediatrics.

Cystic fibrosis is a genetic disorder that primarily affects the lungs and the pancreas, but can also create problems in the liver and other organs. Advanced cystic fibrosis liver disease refers to advanced scarring of the liver, which occurs in about 7 percent to 10 percent of patients with cystic fibrosis. The exact cause of advanced cystic fibrosis liver disease is not known, but only individuals with cystic fibrosis develop it.

"As we develop new therapies for cystic fibrosis and for other liver diseases, it is critical that we better understand which patients with cystic fibrosis are at higher risk for cystic fibrosis liver disease," said Narkewicz. "Our study is a vital first step in identifying those patients and should help in the creation of targets for therapies that could prevent cystic fibrosis liver disease."

The results reported in the article show that researchers could identify children at higher risk for advanced liver disease by using research-based ultrasound screening with consensus grading by four radiologists. The report is a four-year interim analysis conducted as part of a projected nine-year study.

The study covers 722 participants recruited between January 2010 and February 2014. Children from 3 years to 12 years of age who had been diagnosed with cystic fibrosis were eligible to enroll in the study. Using research-based ultrasound screening, researchers sought to determine whether the participants had a heterogeneous pattern on their livers that could indicate a higher risk for advanced cystic fibrosis liver disease. The study found that participants with the heterogeneous pattern were at a nine-fold higher risk of developing advanced cystic fibrosis liver disease than participants with a normal pattern.

"This is the first large multicenter study of ways to predict which children with cystic fibrosis are at higher risk for advanced liver disease," Narkewicz said. "The findings show that we have found an important way to project which patients might develop cystic fibrosis liver disease. It also gives us clinical therapeutic targets for interventional therapies that could prevent the development of this liver disease. "

These research-based ultrasound exams relied on multiple radiologists reviewing the results. Narkewicz said clinically based liver ultrasound exams may not have the same predictive value. The research-based ultrasound tests are a next step after liver enzyme monitoring and could indicate an increased risk for developing advanced cystic fibrosis liver disease. Researchers are continuing to study the magnitude of the risk.

Credit: 
University of Colorado Anschutz Medical Campus

Machine learning implemented for quantum optics by Skoltech scientists

image: The theoretical beam is the goal scientists wished to achieve.

Image: 
https://doi.org/10.1038/s41534-020-0248-6

As machine learning continues to surpass human performance in a growing number of tasks, scientists at Skoltech have applied deep learning to reconstruct quantum properties of optical systems.

Through a collaboration between the quantum optics research laboratories at Moscow State University, led by Sergey Kulik, and members of Skoltech's Deep Quantum Laboratory of CPQM, led by Jacob Biamonte, the scientists have successfully applied machine learning to the state reconstruction problem.

Their findings have been reported in the Nature Partner Journal, npj Quantum Information, and are the first to show that machine learning can reconstruct quantum states from experimental data in the presence of noise and detector errors.

Skoltech PhD student Adriano Macarone Palmieri, lead author of the study, described the findings as "a new open door towards deeper insights." Adriano has a Master's Degree in Physics from Bologna and joined Skoltech from Italy, where he worked as a data scientist.

Working closely with MSU's PhD student, Egor Kovlakov, Adriano reached out to his former colleague and a current postdoctoral fellow at Bocconi University, Federico Bianchi. Federico, a machine learning expert with a recent PhD from the University of Milano-Bicocca, describes the findings as "a sound example of data driven discovery which combines machine learning and quantum physics." While Federico didn't have experience with quantum mechanics prior to joining this study, he viewed the problem in terms of information and helped create a novel model of the system based on deep feed forward neural networks.

Both Adriano and Federico worked tirelessly and in close collaboration with many members of the Deep Quantum Laboratory, including Dmitry Yudin, who describes the findings as an important first step towards the practical use of neural network architectures in the lab for improving quantum tomography with noisy experimental data from available quantum setups. Such quantum information processing is used ubiquitously in paradigmatic quantum devices for quantum computation and optimization. In the forthcoming future, the researchers plan to address further challenges of upscaling quantum information devices, and expect this work to be foundational in their further research.

These results wouldn't have been possible without the experimental research of Egor Kovlakov, supported by Stanislav Straupe and Sergey Kulik, from MSU. Over the last several years, they have applied a wide range of techniques to the state reconstruction problem. To the surprise of the coauthors, deep learning outperformed these state-of-the-art methods in a real experiment.

The MSU team generated data with an experimental platform based on spatial states of photons to prepare and measure high-dimensional quantum states. Experimental errors in state preparation and measurement inevitably plague the results, and the situation becomes worse with increasing dimensionality. At the same time, extending the dimensionality of accessible quantum states is extremely important for quantum communication protocols and, especially, quantum computing. This is where machine learning techniques come in useful. The Skoltech team implemented a deep neural network to analyze the noisy experimental data and efficiently learn to perform denoising, significantly improving the quality of quantum state reconstruction.
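
To make the general idea concrete, the minimal sketch below trains a small feed-forward network to map noise-corrupted measurement statistics back to clean outcome distributions. The dimensions, noise model, and architecture here are illustrative assumptions only; they are not the network, data, or training procedure used in the npj Quantum Information study.

    import torch
    import torch.nn as nn

    torch.manual_seed(0)
    dim = 16  # hypothetical number of measurement outcomes

    def random_probs(n):
        p = torch.rand(n, dim)
        return p / p.sum(dim=1, keepdim=True)

    # Synthetic "clean" outcome distributions and crudely noise-corrupted copies
    clean = random_probs(4096)
    noisy = (clean + 0.05 * torch.randn_like(clean)).clamp(min=1e-6)
    noisy = noisy / noisy.sum(dim=1, keepdim=True)

    # Small feed-forward denoiser: noisy distribution in, cleaned distribution out
    net = nn.Sequential(
        nn.Linear(dim, 128), nn.ReLU(),
        nn.Linear(128, 128), nn.ReLU(),
        nn.Linear(128, dim), nn.Softmax(dim=1),
    )
    optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

    for step in range(200):
        optimizer.zero_grad()
        loss = nn.functional.mse_loss(net(noisy), clean)
        loss.backward()
        optimizer.step()

    # A denoised distribution could then replace the raw measured frequencies
    # in a conventional reconstruction routine (for example, maximum likelihood).
    with torch.no_grad():
        denoised = net(noisy[:1])
    print(denoised.sum().item())  # still a valid probability vector (sums to 1)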

Skoltech's Deep Quantum Laboratory team believes that machine learning techniques will play an essential role in the future development of quantum technologies. As the available quantum devices become more and more complex, it gets harder and harder to control all the parameters at the desired level of precision. This came out as a very natural field of application for deep learning and machine learning techniques in general.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

Computer estimate claims a third of plant and animal species could be gone in 50 years

image: The common giant tree frog from Madagascar is one of many species impacted by recent climate change.

Image: 
John J. Wiens

Accurately predicting biodiversity loss from climate change requires a detailed understanding of what aspects of climate change cause extinctions, and what mechanisms may allow species to survive.

A new study by University of Arizona researchers presents detailed estimates of global extinction from climate change by 2070. By combining information on recent extinctions from climate change, rates of species movement and different projections of future climate, they estimate that one in three species of plants and animals may face extinction. Their results are based on data from hundreds of plant and animal species surveyed around the globe.

Published in the Proceedings of the National Academy of Sciences, the study likely is the first to estimate broad-scale extinction patterns from climate change by incorporating data from recent climate-related extinctions and from rates of species movements.

To estimate the rates of future extinctions from climate change, Cristian Román-Palacios and John J. Wiens, both in the Department of Ecology and Evolutionary Biology at the University of Arizona, looked to the recent past. Specifically, they examined local extinctions that have already happened, based on studies of repeated surveys of plants and animals over time.

Román-Palacios and Wiens analyzed data from 538 species and 581 sites around the world. They focused on plant and animal species that were surveyed at the same sites over time, at least 10 years apart. They generated climate data from the time of the earliest survey of each site and the more recent survey. They found that 44% of the 538 species had already gone extinct at one or more sites.

"By analyzing the change in 19 climatic variables at each site, we could determine which variables drive local extinctions and how much change a population can tolerate without going extinct," Román-Palacios said. "We also estimated how quickly populations can move to try and escape rising temperatures. When we put all of these pieces of information together for each species, we can come up with detailed estimates of global extinction rates for hundreds of plant and animal species."

The study identified maximum annual temperatures -- the hottest daily highs in summer -- as the key variable that best explains whether a population will go extinct. Surprisingly, the researchers found that average yearly temperatures showed smaller changes at sites with local extinction, even though average temperatures are widely used as a proxy for overall climate change.

"This means that using changes in mean annual temperatures to predict extinction from climate change might be positively misleading," Wiens said.

Previous studies have focused on dispersal -- or migration to cooler habitats -- as a means for species to "escape" from warming climates. However, the authors of the current study found that most species will not be able to disperse quickly enough to avoid extinction, based on their past rates of movement. Instead, they found that many species were able to tolerate some increases in maximum temperatures, but only up to a point. They found that about 50% of the species had local extinctions if maximum temperatures increased by more than 0.5 degrees Celsius, and 95% if temperatures increased by more than 2.9 degrees Celsius.
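
As a back-of-the-envelope illustration of this threshold logic (not the authors' statistical method), the sketch below draws hypothetical per-species tolerances roughly consistent with the 0.5 C and 2.9 C figures above and counts how many species would exceed their tolerance under different amounts of warming. The tolerance distribution is an assumption made only for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical per-species tolerances (degrees C of maximum-temperature increase
    # each species can withstand), drawn so that roughly half are lost above 0.5 C
    # and roughly 95% above 2.9 C, echoing the figures reported above.
    tolerances = rng.lognormal(mean=np.log(0.5), sigma=1.0, size=538)

    for warming in (0.5, 1.5, 2.9, 4.0):
        lost = np.mean(tolerances < warming)
        print(f"+{warming:.1f} C in local maximum temperature -> "
              f"{lost:.0%} of species exceed their tolerance")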

Projections of species loss depend on how much climate will warm in the future.

"In a way, it's a 'choose your own adventure,'" Wiens said. "If we stick to the Paris Agreement to combat climate change, we may lose fewer than two out of every 10 plant and animal species on Earth by 2070. But if humans cause larger temperature increases, we could lose more than a third or even half of all animal and plant species, based on our results."

The paper's projections of species loss are similar for plants and animals, but extinctions are projected to be two to four times more common in the tropics than in temperate regions.

"This is a big problem, because the majority of plant and animal species occur in the tropics," Román-Palacios said.

Credit: 
University of Arizona

Autonomous vehicle technology may improve safety for US Army convoys, report says

U.S. Army convoys could be made safer for soldiers by implementing autonomous vehicle technology to reduce the number of service members needed to operate the vehicles, according to a new study from the RAND Corporation.

"The Army is interested in autonomous technology because if they can reduce the number of soldiers needed to run a convoy, they can keep soldiers safe," said Shawn McKay, lead author of the study and a senior engineer with RAND, a nonprofit research organization.

McKay and his colleagues examined three different autonomous vehicle concepts: the fully autonomous employment concept, where all the vehicles are unmanned; the partially unmanned employment concept, featuring a lead truck with soldiers followed by unmanned vehicles in a convoy; and minimally manned, a "bridging" concept featuring a soldier in the driver's seat of each of the follower trucks to monitor the automated system and driving environment.

A minimally manned Army convoy would put 28 percent fewer soldiers at risk compared to current practices. A partially unmanned convoy would put 37 percent fewer soldiers at risk, and a fully autonomous convoy would put 78 percent fewer soldiers at risk.

The technology to make an Army convoy fully autonomous doesn't exist yet. McKay said part of the challenge for the Army is that current automated technology is still limited and has mainly been tested in settings with well-manicured infrastructure, including standardized road markings and signs.

"We're looking at a combat environment - it's very complex," McKay said. "An Army convoy could be operating in a Third World environment where road markings and road conditions are very poor, there's open terrain, there's herds of animals, and you're under combat situations."

"With current technology, human 'operators' are still required to monitor the driving environment and regain control when the autonomous systems are unable to handle the situation. When you have a convoy of several vehicles driving autonomously and one halts due to an obstacle the autonomous system cannot handle, you have a situation where the convoy becomes vulnerable. The bridging concept mitigates this risk while still reducing soldier risk."

Partially unmanned technology won't be available for highway driving for several more years, but minimally manned is currently ready for Army adaptation and deployment in urban and highway environments, McKay said.

The study recommends the Army implement the minimally manned concept as a necessary bridging strategy to achieve the partially unmanned capability. The Army also should develop clear and practical technical requirements to reduce key development risks, such as from cyberattack. Pressure to leverage automated trucks to reduce the number of soldiers at risk may build before the Army has worked out all the problems with these systems, so the Army will need to prepare accurate assessments of system readiness and the risks associated with implementation.

Credit: 
RAND Corporation

What is the best way to encourage innovation? Competitive pay may be the answer

image: UC San Diego research reveals companies may be able to encourage radical novel thinking by changing how they pay their employees.

Image: 
erhui1979

Economists and business leaders agree that innovation is a major force behind economic growth, but many disagree on what is the best way to encourage workers to produce the "think-outside-of-the-box" ideas that create newer and better products and services. New research from the University of California San Diego indicates that competitive "winner-takes-all" pay structures are most effective in getting the creative juices flowing that help fuel economic growth.

The findings are based on a study authored by professor of economics Joshua Graff Zivin and assistant professor of management Elizabeth Lyons, who partnered with Thermo Fisher Scientific, one of the globe's largest biotech companies, in creating a contest for the company's Baja California office. Participants in the competition, which was open to all non-management employees of Thermo Fisher and other tech companies in the region, were asked to design digital solutions to help share medical equipment across small healthcare clinics in the region.

The competition was created to test which of two common compensation models produced more novel ideas. Those who signed up were randomly selected to compete in either the "winner-takes-all" category, in which there was one prize of $15,000 awarded to first place, or the "top 10" category, in which the same amount of prize money was spread out among the top 10 entries.

"Participants under the winner-takes-all compensation scheme submitted proposals that were significantly more novel than their counterparts in the other scheme," said the authors of National Bureau of Economic Research (NBER) working paper, who both hold appointments with the UC San Diego School of Global Policy and Strategy. "While the two groups did not statistically differ from one another on their overall scores, the risk taking encouraged by the competition with a single prize resulted in innovators pursuing more creative solutions."

They added, "These findings are significant because the 21st century economy is one that prizes novelty. Firms view it as an important source of comparative advantage. It is also an essential ingredient in the development of technological breakthroughs that transform markets with major impacts to consumers and producers."

How firms can produce more creative ideas with a limited amount of resources

Most modern-day mechanisms created by chief technology officers (CTOs) and management gurus to spark innovation rely on performance-based pay and hinge on assumptions regarding the ability and ambitions of employees and their risk preferences. However, the results from the NBER paper show that with identical groups of innovators, companies can increase the innovative output of employees just by changing how they pay them.

"Those competing for one big prize had to push further in making their results creative; however what is most surprising is that this is a relatively low-cost way for companies to induce more radical innovation," Lyons noted.

Though there was more risk versus reward in the "winner-takes-all" category, both categories produced about the same number of submissions (20 in the "top 10" category and 22 in the "winner-takes-all" category), indicating that having less of a chance of winning a monetary award did not have an impact on the amount of work output.

The entries were judged by a panel of six experts. Half of the judges were from industry (Thermo Fisher and Teradata) and the other half were from academia (computer sciences professors from local universities in the Baja California region).

The novelty of the submissions was evaluated on a scale from one to five, relative to what is currently and/or soon to be available on the market. The lowest possible score of one was given for proposed solutions already on the market and the highest score of five was awarded to submissions in which no one else has thought of a similar idea.

Those who entered could work as individuals or in teams. The results of team versus individual entries in both categories are consistent with other studies, showing that teams with diversified skill sets and deeper professional experience produced better entries than individuals did. However, the team entries in the "winner-takes-all" category were again more novel than the group work in the "top 10" category.

Subsequently, participants in both categories were surveyed on their risk preferences. Not surprisingly, those less averse to risk performed better in the "winner-takes-all" category.

The results also revealed that women who submitted entries in the contest performed better than average in both categories of the competition.

In conclusion, the authors noted that genius is not created by incentives, but empowered by them.

"It is important to recognize that incentives alone are insufficient to spark creativity," they wrote. "More work is required to understand the raw ingredients that shape the relationship between creativity and compensation."

Credit: 
University of California - San Diego

NIST researchers link quartz microbalance measurements to international measurement system

video: This animation demonstrates a new method for linking mass measurements made using quartz crystal microbalances directly to the SI. Ensuring the accuracy of these tiny sensors could provide a common reference for the microelectronic fabrication industry, among other applications.

Image: 
NIST

Researchers at the National Institute of Standards and Technology (NIST) have found a way to link measurements made by a device integral to microchip fabrication and other industries directly to the recently redefined International System of Units (SI, the modern metric system). That traceability can greatly increase users' confidence in their measurements because the SI is now based entirely on fundamental constants of nature.

The device, a dime-size disk called a quartz crystal microbalance (QCM), is critically important to businesses that rely on precision control of the formation of thin films. Very thin: They range from micrometers (millionths of a meter) to a few tens of nanometers (billionths of a meter, or about 10,000 times thinner than a human hair) and are typically produced in a vacuum chamber by exposing a target surface to a meticulously regulated amount of chemical vapor that sticks to the surface and forms the film. The greater the exposure, the thicker the film.

Thin films are essential components in electronic semiconductor devices, optical coatings for lenses, LEDs, solar cells, magnetic recording media for computing, and many other technologies. They are also employed in technologies that measure the concentration of microbial contaminants in air, pathogens in the water supply, and the number of microorganisms that attach themselves to biological surfaces in the course of infection.

All those uses demand extremely accurate measurements of the film's thickness. Because that is difficult to measure directly, manufacturers frequently use QCMs, which have a valuable property: When an alternating current is applied to them, they vibrate at a resonant frequency unique to each disk and its mass.

To determine exactly how much film material is being deposited, they place a QCM disk in the vacuum chamber and measure its resonant frequency. Then the disk is exposed to a chemical vapor. The more vapor that adheres to the QCM, the greater its mass -- and the slower it vibrates. That change in frequency is a sensitive measure of the added mass.
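
The textbook description of this frequency-to-mass conversion is the Sauerbrey relation, which the short sketch below applies using standard quartz constants for a hypothetical 5 MHz crystal. This is a generic illustration of how users typically convert a frequency shift into deposited mass, not the specific mathematical method the NIST team identified as most accurate in their analysis.

    import math

    def sauerbrey_mass_change(delta_f_hz, f0_hz, area_m2,
                              rho_q=2648.0,     # quartz density, kg/m^3
                              mu_q=2.947e10):   # quartz shear modulus, Pa
        """Mass added to the crystal (kg) implied by a resonant-frequency shift."""
        # Sauerbrey relation: delta_f = -(2 * f0**2 / (A * sqrt(rho_q * mu_q))) * delta_m
        sensitivity = 2.0 * f0_hz**2 / (area_m2 * math.sqrt(rho_q * mu_q))
        return -delta_f_hz / sensitivity

    # Hypothetical example: a 5 MHz crystal with 1 cm^2 active area slows by 10 Hz
    dm = sauerbrey_mass_change(delta_f_hz=-10.0, f0_hz=5e6, area_m2=1e-4)
    print(f"deposited mass ~ {dm * 1e9:.0f} ng")  # roughly 180 nanograms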

"But despite ubiquitous implementation of QCMs throughout industry and academia," said NIST physicist and lead researcher Corey Stambaugh, "a direct link to the SI unit of mass has not existed." The relationship between the SI unit of mass (the kilogram) and resonance frequency is assumed to be well characterized after decades of QCM measurements. But over the years, industry has made inquiries to NIST regarding the absolute mass accuracy of these frequency measurements. The new results presented by Stambaugh and colleagues are in large measure a response to those queries.

"We expect that our findings will enable a new, higher level of assurance in QCM measurements by providing traceability to the new SI," said NIST physicist Joshua Pomeroy, who with Stambaugh and others report their findings today in the journal Metrologia. The redefinition of the SI units in May 2019 eliminated the previous metal prototype kilogram as a standard and instead defined the kilogram in terms of a quantum constant.

In the new SI, mass at the kilogram level will be realized in the United States using that constant in NIST's Kibble balance.

NIST has also developed a standard instrument, called the electrostatic force balance (EFB), that provides extremely accurate measurements of masses in the milligram range and lower, which are directly linked to the SI by way of a quantum constant. The EFB provided the team with reference milligram-sized masses with a precision on the order of a fraction of a microgram (one millionth of a gram, or about one millionth the mass of an average paper clip).

Stambaugh and colleagues carefully weighed an uncoated quartz disk, then suspended it in a vacuum chamber and measured its resonant frequency. About 0.5 meters (20 inches) below the disk was a furnace that heated a quantity of gold to 1480 C (2700 F). Gold vapor from the furnace rose and attached itself to the lower surface of the QCM, increasing its mass and thus lowering its resonant frequency. The researchers deposited gold vapor over different time intervals, recorded the resulting changes in resonant frequency, and then weighed the disk again using the same EFB reference masses. This provided an accurate measurement of the change in mass, and thus an exact measure of the amount of gold deposited.

In the course of the work, the team also performed a complete assessment of the uncertainties in the QCM measurements. They identified the most accurate mathematical method of correlating the addition of mass to the change in the QCM's resonant frequency.

"This work provides a key step in a technique for traceably tracking -- and thus correcting for -- mass changes over time," said NIST physicist Zeina Kubarych.

In that regard, the new findings could help improve the way mass is disseminated following the new SI definition. The new kilogram is "realized" -- converted from an abstract definition to a physical reality -- through highly controlled laboratory measurements in a vacuum chamber. But the working standards of the kilogram will be disseminated -- physically delivered to measurement-science laboratories -- in the form of metal masses in the open air. That means that water vapor and whatever else is in the air can adsorb onto the surface of a kilogram working standard, causing inaccurate measurement of its mass.

Because humidity and air contaminants differ substantially around the world, measurements of a carefully calibrated mass standard can differ appreciably from place to place at the levels of accuracy needed for industrial and scientific metrology. If, however, a calibrated QCM were to accompany each standard, it could provide an accurate measure of the amount of material adsorbed in transit and at the destination, helping the labs to receive more accurate definitions of the new kilogram while taking environmental conditions into account.

Credit: 
National Institute of Standards and Technology (NIST)

Answers to microbiome mysteries in the gills of rainbow trout

While many immunologists use mouse models to conduct their research, J. Oriol Sunyer of Penn's School of Veterinary Medicine has made transformational scientific insights using a very different creature: rainbow trout.

In a paper featured on the cover of the journal Science Immunology, Sunyer and colleagues developed a method to manipulate the trout immune system to reveal a new understanding of how the animals defend against infection while promoting a healthy microbiome. The work addresses a decades-old question of whether mucosal antibodies--those present on mucosal surfaces of the body such as the gut, or in the case of fish, the gills--evolved to fight pathogens, or to preserve a healthy microbiome. As it turns out, mucosal immunoglobulins coevolved both roles from very early on during vertebrate evolution.

"You might be thinking, 'Rainbow trout? We fish for them; we eat them,'" says Sunyer. "But it turns out they can also tell us a lot about some fundamental biomedical, evolutionary, and immunological questions."

Specifically, Sunyer and colleagues found that a mucosal antibody, an immunoglobulin known as IgT, is critical both in controlling pathogens and in regulating the microbiome of fish gills, a tissue type that shares similarities with several mucosal surfaces of mammals, such as the intestines.

"We found that IgT is playing two paradoxical roles--on the one hand reducing bad microbes, and on the other hand promoting the presence of certain beneficial bacteria," says Sunyer. "Fish are the earliest bony vertebrates to possess a mucosal immune system, and so the fact that fish possess a specialized immunoglobulin that does both jobs suggests that these two processes are so fundamentally important for vertebrate survival that they arose concurrently, early on in evolution."

For nearly 20 years, Sunyer's lab has contributed a steady stream of discovery regarding the evolution and roles of the immune system using fish as model species. In 2010, a seminal paper in Nature Immunology featured on the journal cover identified the role of IgT. It was the first time that fish were shown to have a form of mucosal immunity--a more specialized response to pathogens that enter the body from the environment; in this case, through the gills, skin, and fish gut.

"Before that we thought only four-legged animals, or tetrapods, had mucosal immunity," Sunyer says. That study demonstrated the induction of potent IgT responses upon infection with a mucosal pathogen.

The group also showed that IgT coats a large portion of the bacteria that are part of the fishes' microbiome, the community of bacteria and other microbes that dwell on various tissues of the animals' bodies. That got the researchers thinking about which function arose first for vertebrate mucosal immunoglobulins: fighting pathogens or preserving a healthy microbiome.

"In mammals, the immunoglobulin IgA seems to have analogous function to IgT in fish," Sunyer explains. "In the last few years there have been some key studies showing that IgA is required to keep the mammalian microbiome in check. In mice and humans lacking IgA, their microbiome changes: The beneficial bacteria go down and the potentially disease-causing bacteria go up."

A weakness of these studies in mammals lacking IgA, Sunyer notes, is the inability to tease apart the precise role of IgA in preserving a balanced microbiome, since the lack of IgA from birth precludes the establishment of a healthy microbiota in these animals.

To better understand the roles of mucosal immunoglobulins in preserving a healthy microbiome, Sunyer and colleagues developed a model in which adult fish could be temporarily depleted of IgT, an effect lasting about two months.

By doing so they could study the role of IgT in preserving, rather than establishing, a healthy microbiome, while also evaluating the susceptibility to pathogens of fish lacking IgT.

When they depleted IgT, the researchers found that levels of a mucosal parasite greatly increased, underscoring the immunoglobulin's role in defending against harmful invaders. But they also saw a dramatic impact on the microbiome composition: IgT-depleted fish lost the IgT coating on the bacterial community in their gills and had more bacteria "escape" from gill surfaces and enter the tissue layer beneath, leading to tissue damage and inflammation.

Looking closely at the bacteria coated by IgT in normal animals, the research team found that IgT targeted specific species over others. These species included bacteria associated with both health and disease states in fish--similar to what had been found with IgA in mammals.

Critically, the authors found that the overall microbiome in IgT-depleted fish was significantly altered, a shift known as dysbiosis. The overall diversity of bacteria present decreased; numbers of beneficial bacteria, such as those producing short-chain fatty acids--critical for the maintenance of tissue integrity and immune homeostasis--also fell, while disease-associated species increased.
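For readers curious how a statement like "overall diversity decreased" is typically quantified in microbiome work, one common measure is the Shannon diversity index. The sketch below is generic; the abundance numbers are invented for illustration and are not data from this study.

import math

def shannon_diversity(counts):
    # Shannon index H' = -sum(p_i * ln(p_i)) over the taxa observed in a sample
    total = sum(counts)
    proportions = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in proportions)

# Invented example: a community dominated by one taxon scores lower
balanced  = [25, 25, 25, 25]   # evenly distributed taxa
dysbiotic = [85, 5, 5, 5]      # one taxon dominates after a perturbation
print(shannon_diversity(balanced), shannon_diversity(dysbiotic))   # ~1.39 vs ~0.59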

"We see that there seems to be specific microbes that have to be controlled," says Sunyer. "Either they are harmful and tend to escape and cause problems in the nearby tissue in the absence of IgT, or perhaps they are beneficial but require IgT to colonize the mucosal surfaces. In both fish and mammals, it now seems apparent that their respective mucosal immunoglobulins do these jobs."

One great benefit of the researchers' IgT depletion technique is that it is temporary and performed in adult animals. After several weeks of depletion, the fish's IgT levels return to normal. The researchers were thus able to track the microbiome as IgT came back, observing what amounted to a recovery: the microbes in the gills regained their IgT coating, the microbiome returned to its initial composition, and the tissue damage and inflammation that had been seen around the gills were reversed.

"In microbiome studies, recovery is a very important point," Sunyer says. "When you take an antibiotic, you can perturb your microbiome to the extent that recovery may take a very long time, but the perturbation we used, of removing IgT, had a profound but transient effect on the microbiome composition, which underwent a speedy recovery."

As more and more scientific studies identify links between the microbiome and various aspects of health, from maintaining a healthy weight to the risk of cancer or even neurological conditions like Alzheimer's and schizophrenia, Sunyer is hopeful that his fish model will find even more applications.

"Studying only mammalian models is not going to be enough to understand the role of the microbiome in all of these physiological processes," says Sunyer.

Because the symbiotic relationship between vertebrates and their microbiome is very ancient, and one which first flourished with the emergence of mucosal immunoglobulins in fish, Sunyer says that "rainbow trout will help us discover the underlying mechanisms by which the interactions between immunoglobulins and the microbiome influence immunity, metabolism, cancer, and much more."

These studies, Sunyer adds, will have a crucial impact on the potential use of specific species of fish bacteria as probiotic agents that may stimulate the immune system to protect against pathogens. With roughly half of the fish we eat coming from fish farming, an industry plagued by emerging pathogens, novel therapies such as probiotics are urgently needed.

Credit: 
University of Pennsylvania

New Argonne etching technique could advance the way semiconductor devices are made

image: Argonne chemists Jeff Elam (left) and Anil Mane (right) and colleagues have developed molecular layer etching, a technique that may help advance microelectronics and point the way beyond Moore's Law. Not shown are Matthias Young, Angel Yanguas-Gil, Devika Choudhury and Steven Letourneau.

Image: 
Argonne National Laboratory

Microelectronics like semiconductor devices are at the heart of the technologies we use each day. As we move into an era where we are stretching the limits of Moore’s Law, it is essential to find new ways to continue to pack more circuitry into each individual device in order to increase the speed and capability of our computers.

Researchers at the U.S. Department of Energy’s (DOE) Argonne National Laboratory have developed a new technique that could potentially help make these increasingly small but complex devices. The technique, known as molecular layer etching, is detailed in a new paper published in Chemistry of Materials.

“MLE has the potential to help usher in new pathways for fabricating and controlling material geometries at the nanoscale, which could open new doors in microelectronics and extend beyond traditional Moore’s Law scaling.” — Jeff Elam, Argonne chemist

To make microelectronics smaller, manufacturers have to cram in more and more circuitry onto smaller films and 3D structures. Today, this happens by using thin film deposition and etching, techniques to grow or remove films one layer at a time.

“Our ability to control matter at the nanoscale is limited by the kinds of tools we have to add or remove thin layers of material. Molecular layer etching (MLE) is a tool to allow manufacturers and researchers to precisely control the way thin materials, at microscopic and nanoscales, are removed,” said lead author Matthias Young, an assistant professor at the University of Missouri and former postdoctoral researcher at Argonne.

Together with molecular layer deposition (MLD), a deposition technique, MLE can be used to design microscopic architectures. These approaches are analogs of atomic layer deposition (ALD) and atomic layer etching (ALE), the more commonly applied techniques for fabricating microelectronics. However, unlike atomic layering techniques, which deal exclusively with inorganic films, MLD and MLE can be used to grow and remove organic films as well.

How it works

In principle, MLE works by exposing thin films, several nanometers or micrometers thick, to pulses of gas inside a vacuum chamber. The process starts with one gas (Gas A) which, upon entry, reacts with the surface of the film. Next, the film is exposed to a second gas (Gas B), which reacts with and removes the surface layer modified by Gas A. This AB process is repeated until the desired thickness is removed from the film.

“The net effect of A and then B is the removal of a molecular layer from your film,” said Argonne chemist Jeff Elam, a co-author of the study. “If you do that process sequentially, over and over again, you can reduce the thickness of your film to achieve the desired final thickness.”
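To make the cycle arithmetic concrete, here is a minimal sketch of the pulse-and-repeat logic. The per-cycle etch amount, the gas labels, and the "chamber" controller object are placeholders for illustration, not details from the paper.

import math

def mle_cycles_needed(start_thickness_nm, target_thickness_nm, etch_per_cycle_nm):
    # Number of complete A/B cycles needed to thin a film to a target thickness.
    to_remove = start_thickness_nm - target_thickness_nm
    return max(0, math.ceil(to_remove / etch_per_cycle_nm))

def run_mle(chamber, cycles):
    # Illustrative pulse sequence for a hypothetical reactor controller.
    for _ in range(cycles):
        chamber.pulse("gas_A")   # Gas A reacts with the film surface
        chamber.purge()
        chamber.pulse("gas_B")   # Gas B removes the modified surface layer
        chamber.purge()

# Example: remove 6 nm from a 50 nm film at an assumed 0.5 nm per cycle
print(mle_cycles_needed(50.0, 44.0, 0.5))   # -> 12 cycles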

A key aspect of MLE is that the A and B surface reactions are self-limiting: they proceed only until all of the available reactive surface sites are consumed, and then naturally terminate. This self-limiting behavior is extremely helpful in manufacturing because it makes the process relatively easy to scale up to larger substrate sizes.
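The practical consequence of self-limiting reactions is that extending a gas pulse past saturation changes nothing, which is what makes the process forgiving on large substrates. A toy first-order saturation model, with a made-up rate constant, illustrates the point:

import math

def surface_coverage(pulse_seconds, rate_constant=2.0):
    # Fraction of reactive surface sites consumed during one gas pulse.
    # Coverage approaches 1 and then stops changing, so overdosing to ensure
    # uniformity across a large substrate costs nothing in etch depth.
    return 1.0 - math.exp(-rate_constant * pulse_seconds)

for t in (0.5, 1.0, 2.0, 5.0):
    print(f"{t:>4.1f} s pulse -> {surface_coverage(t):.3f} of sites reacted")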

Researchers tested their approach using alucone, an organic material similar to silicone rubber that has potential applications in flexible electronics. Gas A in their experiment was a lithium-containing salt, and Gas B was trimethyl aluminum (TMA), an organometallic aluminum-based compound.

During the etching process, the lithium compound reacted with the surface of the alucone film in a way that caused the lithium to stick onto the surface and disrupt the chemical bonding in the film. Then, when the TMA was introduced and reacted, it removed the layer of film containing lithium.  The lithium serves a sacrificial role — it is deposited on the surface temporarily to break chemical bonds but is then removed by the TMA.

“The process can go on layer by layer like that and you can remove the whole material if you wanted to,” Young said.

Opening new doors in microelectronics

Using this technique can help manufacturers and researchers develop new ways of making nanostructures. The process may also be a safer option because it is free of halogens, harsh chemical components common in other etching processes. It also has the advantage of being selective: the etching technique can selectively remove MLD layers without affecting nearby ALD layers.

“MLE has the potential to help usher in new pathways for fabricating and controlling material geometries at the nanoscale, which could open new doors in microelectronics and extend beyond traditional Moore’s Law scaling,” Elam said.

Credit: 
DOE/Argonne National Laboratory