Tech

Flash graphene rocks strategy for plastic waste

image: Flash graphene made from plastic by a Rice University lab begins as post-consumer plastic received from a recycler. It is then mixed with carbon black and processed into turbostratic graphene via timed pulses of AC and DC electricity.

Image: 
Tour Group/Rice University

HOUSTON - (Oct. 29, 2020) - Plastic waste comes back in black as pristine graphene, thanks to ACDC.

That's what Rice University scientists call the process they employed to make efficient use of waste plastic that would otherwise add to the planet's environmental woes. In this instance, the lab of Rice chemist James Tour modified its flash graphene method to better suit it to recycling plastic into graphene.

The lab's study appears in the American Chemical Society journal ACS Nano.

Simply put, instead of raising the temperature of a carbon source with direct current, as in the original process, the lab first exposes plastic waste to around eight seconds of high-intensity alternating current, followed by the DC jolt.

The products are high-quality turbostratic graphene, a valuable and soluble substance that can be used to enhance electronics, composites, concrete and other materials, and carbon oligomers, molecules that can be vented away from the graphene for use in other applications.

"We produce a considerable amount of hydrogen, which is a clean fuel, in our flashing process," said Rice graduate student and lead author Wala Algozeeb.

Tour estimated that at industrial scale, the ACDC process could produce graphene for about $125 in electricity costs per ton of plastic waste.

"We showed in the original paper that plastic could be converted, but the quality of the graphene wasn't as good as we wanted it to be," Tour said. "Now, by using a different sequence of electrical pulses, we can see a big difference."

He noted most of the world's plastic recycling technologies are ineffective, and that only about 9% of produced plastic is recycled. Most notorious, Tour said, is an island of plastic waste the size of Texas that has formed in the Pacific Ocean.

"We have to deal with this," he said. "And there's another problem: Microbes in the ocean that convert carbon dioxide into oxygen are being hindered by plastic breakdown products and they're reversing the process, taking oxygen and converting it to carbon dioxide. That's going to be really bad for humans."

Tour noted flash joule conversion eliminates much of the expense associated with recycling plastic, including sorting and cleaning that require energy and water. "Rather than recycling plastic into pellets that sell for $2,000 a ton, you could be upcycling to graphene, which has a much higher value," he said. "There's an economic as well as an environmental incentive."

Despite the overwhelming amount of plastic feedstock, having too much graphene won't be a problem, Tour said. "Whatever you do with carbon, once you've taken it up out of the ground from oil or gas or coal, it ends up in the carbon dioxide cycle," he said. "The nice thing about graphene is its biological degradation under many conditions is very slow, so in most cases it doesn't reenter the carbon cycle for hundreds of years."

He noted the researchers are working to refine the flash graphene process for other materials, especially for food waste. "We're working toward generating a good pulse sequence to convert food waste into very high-quality graphene with as little emission as possible," he said. "We're employing machine learning programs to help us know where to go."

The new study follows another recent paper that characterizes flash graphene produced from carbon black via direct current joule heating. That paper, also in ACS Nano, combined microscopy and simulations to show two distinct morphologies: turbostratic graphene and wrinkled graphene sheets. The study described how and why the rearranged carbon atoms would take one form or the other, and that the ratio can be controlled by adjusting the duration of the flash.

Credit: 
Rice University

Smart solution to detect seafood spoilage

Importantly, Flinders University's Professor of Aquaculture Jian Qin - who led the study with Flinders colleague Professor Youhong Tang - says this simple device could become commercially viable and enable "real-time" monitoring of spoilage in seafood to ensure food safety for consumers.

The first author of this publication was Professor Yonghua Jiang, a visiting scholar from Jimei University, China. She estimates that this device can be a major cost saver for the seafood industry and retailers, as spoilage accounts for at least 10% of all seafood production.

The core of the new spoilage analysis technology is the understanding that biogenic amines serve important physiological functions in living cells, but high levels of biogenic amines in seafood have an adverse impact on human health and can cause food poisoning.

Therefore, biogenic amines have become important indicators for the evaluation of food freshness and edibility - and these amines can be read by a simple and cost-effective method using filter papers loaded with an AIEgen, such as a dihydroquinoxaline derivative (H + DQ2), to monitor salmon spoilage.

The research found that as spoilage in the salmon samples increased, triggering more amine vapours, so too did the intensity of the readings on the treated filter papers.
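The "semi-quantitative" aspect of such a test can be illustrated with a toy grading function that buckets a normalized fluorescence reading into freshness categories. The thresholds below are hypothetical placeholders for illustration only, not calibrated values from the study:

```python
# Toy illustration of semi-quantitative spoilage grading from a fluorescence
# reading. The threshold values are hypothetical placeholders, NOT calibrated
# figures from the Food Chemistry paper.

def grade_freshness(relative_intensity):
    """Map a normalized fluorescence intensity (0 = no amine response,
    1 = saturated response) to a coarse freshness grade."""
    if relative_intensity < 0.2:
        return "fresh"
    elif relative_intensity < 0.6:
        return "consume soon"
    else:
        return "spoiled"

for reading in (0.1, 0.4, 0.8):
    print(reading, grade_freshness(reading))
```

A deployed version would replace the cut-offs with intensities calibrated against measured amine concentrations.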

Results from the study - Semi-quantitative Evaluation of Seafood Spoilage Using Filter-paper Strips Loaded With an Aggregation-induced Emission Luminogen, by Yonghua Jiang, Zhaowei Zhong, Weixin Ou, Haoming Shi, Parvej Alam, Ben Zhong Tang, Jian Qin and Youhong Tang - have been published in the journal Food Chemistry (DOI: 10.1016/j.foodchem.2020.127056).

"This study provides a quick and simple way for testing amine vapour from fish and provides baseline information for developing an easy-to-use, on-site method to evaluate seafood quality for customers," says Professor of Materials Engineering Youhong Tang, from Flinders University's Institute of NanoScale Science and Technology and Medical Device Research Institute.

The research team will now do further optimisation tests on the paper strips and the AIEgen loading, to provide a more robust solution for daily usage towards commercial applications.

The team also wants to align the AIEgen-loaded paper strips with smartphone apps to transfer information for quantitative evaluation.

Credit: 
Flinders University

Researchers find source of breast tumor heterogeneity and pathway that limits emergence

LEBANON, NH - A key hurdle in treating breast cancers is intratumoral heterogeneity, or the presence of multiple different cell populations within the same tumor that have distinct characteristics such as gene expression, metabolism and ability to divide, spread and grow. These cells can also respond with varying levels of sensitivity to standard therapies, and ultimately, are a contributing factor to therapeutic resistance.

A team of researchers led by Dartmouth's and Dartmouth-Hitchcock's Norris Cotton Cancer Center has gained a new understanding of how tumor heterogeneity arises and how it can be curtailed to render tumors less metastatic and more sensitive to therapy. The team identified that the mammary basal cell lineage contributes to breast cancer heterogeneity, fueling the outgrowth of multiple aggressive tumor subpopulations.

"Through the activation of a signaling pathway driven by protein kinase A (PKA), we are able to limit the self-renewal potential of basal cells, which impedes the outgrowth of metastatic, therapy-resistant tumor cell subpopulations," says Principal Investigator and cancer biologist Diwakar Pattabiraman, PhD, who is also the corresponding author of the study.

The team's findings, "Limiting Self-Renewal of the Basal Compartment by PKA Activation Induces Differentiation and Alters the Evolution of Mammary Tumors," are newly published in Developmental Cell.

Pattabiraman notes that developing an understanding of how to tackle this heterogeneity is crucial for successful therapeutic intervention. "While there are no approaches to targeting PKA currently, there could be some therapeutic utility in pursuing the inhibition of its substrates such as Sox4. We plan to study the PKA-Sox4 connection in further detail, specifically to explore the possibility of targeting the transcriptional ability of Sox4."

Credit: 
Dartmouth Health

International study uncovers secret surfing life of remoras hitchhiking on blue whales

image: A remora attached to the skin of a blue whale

Image: 
Stanford University & Cascadia Research Collective. Image collected under NMFS permit #16111.

Sticking to the bodies of sharks and other larger marine life is a well-known specialty of remora fishes (Echeneidae), thanks to the super-powered suction disks on their heads. But a new study has now fully documented the "suckerfish" in hitchhiking action below the ocean's surface, uncovering a much more refined skillset that the fish use to navigate the intense hydrodynamics that come with trying to ride aboard a 100-ft. blue whale (Balaenoptera musculus).

In a study published Oct. 28 in the Journal of Experimental Biology, an international team of researchers studying the unique fluid environments of blue whales traveling off the coast of Palos Verdes and San Diego, CA has reported capturing the first-ever continuous recording of remora behavior on a host organism, using advanced biosensing tags with video recording capabilities.

The study shows the secrets behind the remora fish's success in hitchhiking aboard baleen whales more than 30 times their size to safely traverse the ocean -- they select the most flow-optimal regions on the whale's body to stick to, such as behind the whale's blowhole, where drag resistance for the fish is reduced by as much as 84%. The team's findings also show that remoras can freely move around to feed and socialize on their ride even as their whale host hits burst speeds of more than 5 meters per second, by utilizing previously unknown surfing and skimming behaviors along special low-drag traveling lanes that exist just off the surface of the whale's body.

Researchers say the study represents the highest-resolution whole-body fluid dynamic analysis of whales to date, the insights from which could potentially be used as a basis to better understand the behavior, energy use and overall ecological health of the species, as well as improve tagging and tracking of whales and other migratory animals in future studies.

"Whales are like their own floating island, basically like their own little ecosystems. ...To get a look into the flow environment of blue whales within a millimeter resolution through this study is extremely exciting," said Brooke Flammang, assistant professor of biology at New Jersey Institute of Technology and the study's corresponding author. "Through lucky coincidence, our recordings captured how remoras interact in this environment and are able to use the distinct flow dynamics of these whales to their advantage. It is incredible because we've really known next to nothing about how remoras behave on their hosts in the wild over any prolonged period of time."

Until now, scientists studying the symbiotic relationships between remoras and their hosts in their natural ocean habitat have predominantly relied on still images and anecdotal evidence, leaving much of how they go about their renowned sticking behavior beneath the surface a mystery.

In their recent investigation, the researchers employed multi-sensor biologging tags with dual cameras that they attached to the whales via four 2-inch suction disks. The tags were able to calculate various measurements inside the whale's ecosystem, such as surface pressure and complex fluid forces around the whales, as well as GPS location and traveling speeds through tag vibrations, all while video recording the remoras at 24 frames per second and 720p resolution.

"Fortunately, the drag on dimple-shaped airplane cockpits has been measured many times and we were able to apply this knowledge to help figure out the drag these remoras were experiencing," said Erik Anderson, co-author, biofluid dynamics researcher at Grove City College and guest investigator at the Woods Hole Oceanographic Institution. "But our study still required calculating, for the first time ever, the flow over a blue whale using computational fluid dynamics ... it took an international team of biologists, programmers, engineers and a supercomputer to do that."

The team's 211 minutes of video footage and whale tag data processed by researchers at the Barcelona Supercomputing Center captured a total of 27 remoras at 61 locations on the whales overall, finding that the remoras were most often podding and traveling between three of the most hydrodynamically beneficial spots where separating flow and wakes are caused by the whale's distinct topographical features: directly behind the blowhole, next to and behind the dorsal fin, and the flank region above and behind the pectoral fin.

According to the team's measurements, Anderson says that the shear force experienced by an average-sized remora in the wake behind the blowhole of a whale swimming at a casual speed of 1.5 m/s can be as low as 0.02 Newtons, half the drag force in the free stream above. However, Anderson notes that the average remora's suction force of 11-17 Newtons is more than a match for even the most intense parking spot on the whale, its tail, where the remora experiences roughly 0.14 Newtons of shear force. And though the forces are greater, the same is true even for large remoras riding on whales swimming at much higher speeds.
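The reported forces can be compared directly; a minimal back-of-envelope sketch using only the figures quoted above:

```python
# Back-of-envelope comparison of the forces reported in the study:
# shear force on a remora vs. the suction its disk can generate.

SHEAR_BEHIND_BLOWHOLE_N = 0.02          # whale at 1.5 m/s (reported)
SHEAR_AT_TAIL_N = 0.14                  # harshest spot on the whale (reported)
SUCTION_MIN_N, SUCTION_MAX_N = 11.0, 17.0  # average remora's suction (reported)

# Safety factor: how many times stronger the suction is than the shear,
# taking the weakest suction against the harshest attachment site.
worst_case_margin = SUCTION_MIN_N / SHEAR_AT_TAIL_N
print(f"Even at the tail, suction exceeds shear by ~{worst_case_margin:.0f}x")
```

Even in this worst case, the suction disk holds with a margin of well over an order of magnitude, which is why the remoras' preference for low-drag spots is about saving energy rather than avoiding detachment.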

"We learned that the remora's suction disk is so strong that they could stick anywhere, even the tail fluke where the drag was measured strongest, but they like to go for the easy ride," said Erik Anderson. "This saves them energy and makes life less costly as they hitchhike on and skim over the whale surface like a NASA probe over an asteroid or some mini-world."

Remoras Go Surf's Up

The tags showed that to conserve energy while getting about on their floating island, the remoras take advantage of the whale's physics by surfing inside a thin layer of fluid surrounding the whale's body, known as a boundary layer, where the team found drag force is reduced by up to 72% compared to the much more forceful free stream just above. Flammang says the fish can lift up to 1 cm off their host within this layer to feed or join their mates at other low-drag social spots on the whale, occasionally changing direction by skimming, or by repeatedly attaching and releasing their suction disks on the whale's body.

Flammang suspects that remoras are able to move freely without being completely peeled from their speedy hosts, which can move nearly seven times faster than the remora, through something called the Venturi effect.

"The skimming and surfing behavior is amazing for many reasons, especially because we think that by staying about a centimeter off the whale body, they are taking advantage of the Venturi effect and using suction forces to maintain their close proximity," explained Flammang. "In this narrow space between the remora and whale, when fluid is funneled into a narrow space it moves at a higher velocity but has lower pressure, so it is not going to push the remora away but can actually suck it toward the host. They can swim up into the free stream to grab a bite of food and come back down into the boundary layer, but it takes a lot more energy to swim in the free stream flow."
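The Venturi reasoning in the quote can be sketched with textbook Bernoulli arithmetic: flow funneled through the narrow gap between remora and whale speeds up, so its pressure drops below that of the free stream. The gap geometry and flow numbers below are illustrative assumptions, not measurements from the study:

```python
# Illustrative Bernoulli/Venturi estimate. The gap ratio and speed are
# assumptions for demonstration, not values measured in the study.

RHO_SEAWATER = 1025.0  # kg/m^3, typical seawater density

def venturi_pressure_drop(v_free, gap_ratio):
    """Pressure deficit (Pa) in the gap relative to the free stream.

    Continuity: v_gap = v_free / gap_ratio  (gap_ratio < 1 narrows the channel)
    Bernoulli:  p_free + 0.5*rho*v_free**2 = p_gap + 0.5*rho*v_gap**2
    """
    v_gap = v_free / gap_ratio
    return 0.5 * RHO_SEAWATER * (v_gap**2 - v_free**2)

# Whale at a casual 1.5 m/s; suppose the gap halves the effective flow area.
dp = venturi_pressure_drop(1.5, 0.5)
print(f"Pressure deficit in the gap: {dp:.0f} Pa")
```

A positive deficit means the gap is at lower pressure than the surrounding flow, producing a net force pulling the remora toward the whale rather than pushing it away.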

Along with uncovering new details of the remora's hitchhiking prowess, the team says they will continue to explore both the flow environments around whales and the mechanisms by which specifically adapted organisms like remoras successfully attach to hosts in order to improve animal tag technologies and designs for extended periods of behavioral and ecological monitoring. The team is also using their new insights into the remora's preferred low-drag attachment locations to better inform where they might tag whales in studies to come.

"It's an extremely arduous process to study whales what with permitting, research regulations and the game of chance of finding animals, all for the tags to usually fall off within 48 hours," said Flammang. "If we can come up with a better way to collect longer term data through better tag placement or better technologies, it could really advance our learning of the species, and many other animals that remoras attach to."

Credit: 
New Jersey Institute of Technology

Exposure to suboptimal doses of antimalarial drugs could, under certain circumstances, increase malaria transmission

image: Parasite exposure to suboptimal doses of artemisinin can lead to a considerable increase in sexual conversion rates and gametocyte numbers. This effect depends on the stage of the parasite's cycle (it was observed with trophozoites but not with the previous ring stage).

Image: 
ISGlobal

Barcelona, October 28, 2020. Exposure to suboptimal doses of the antiparasitic drug artemisinin could increase the sexual conversion rate of the malaria parasite Plasmodium falciparum, thereby increasing the probability of transmission, according to a study led by the Barcelona Institute for Global Health (ISGlobal), an institution supported by "la Caixa" Foundation. The findings, published in eLife, may have public health implications, particularly in the context of mass antimalarial drug administration campaigns.

The malaria parasite P. falciparum replicates asexually in human blood every 48 hours, causing the typical clinical symptoms of the disease. At each replication cycle, a small number of parasites take a different pathway: that of sexual conversion to generate gametocytes. This sexual form of the parasite is the only one that can be transmitted to the mosquito. Sexual conversion is a highly regulated process, since the parasite needs to maintain a balance between asexual replication within the host and transmission between hosts. "From an evolutionary point of view, the parasite's capacity to adjust its sexual conversion rate in response to the host's conditions is clearly advantageous," explains Alfred Cortés, ICREA researcher at ISGlobal and study coordinator. One factor that clearly decreases the parasite's "comfort" within the host is exposure to parasite-killing drugs.

In order to determine whether artemisinin (the first-choice drug for treating falciparum malaria) or artemisinin-related drugs can affect the parasite's sexual conversion rate, Cortés and his team used a transgenic parasite line that makes it possible to quantify sexual conversion in a dish, under different experimental conditions. They found that parasite exposure to artemisinin can lead to a considerable increase in sexual conversion rates and gametocyte numbers. But this effect was only seen with suboptimal drug doses and depended on the stage of the parasite's cycle (it was observed with trophozoites but not with the previous ring stage).

"Our results show there is a complex interaction between antimalarial drugs and sexual conversion, which depends on the parasite stage, its metabolic state and drug doses," says Harvie Portugaliza, first author of the study.

"It is possible that patients in which most parasites are at the trophozoite stage at the time of treatment may have a peak in gametocyte production ten days later (the time required for their maturation), if the drug did not manage to kill all the parasites," says Cortés. Exposure to suboptimal drug concentrations could result from using poor quality drugs or poor adherence to the treatment. Currently, the team led by Cortés is performing epidemiological studies to determine whether sexual conversion is indeed higher among patients treated with artemisinin.

Reference

Portugaliza HP, Miyazaki S, Geurten F, et al. Artemisinin exposure at the ring or trophozoite stage impacts Plasmodium falciparum sexual conversion differently. eLife. 2020. DOI: 10.7554/eLife.60058

About ISGlobal

The Barcelona Institute for Global Health, ISGlobal, is the fruit of an innovative alliance between the "la Caixa" Foundation and academic and government institutions to contribute to the efforts undertaken by the international community to address the challenges in global health. ISGlobal is a consolidated hub of excellence in research that has grown out of work first started in the world of health care by the Hospital Clínic and the Parc de Salut MAR and in the academic sphere by the University of Barcelona and Pompeu Fabra University. The pivotal mechanism of its work model is the transfer of knowledge generated by scientific research to practice, a task undertaken by the institute's Education and Policy and Global Development departments. ISGlobal has been named a Severo Ochoa Centre of Excellence and is a member of the CERCA programme of the Generalitat de Catalunya.

DOI

10.7554/eLife.60058

Credit: 
Barcelona Institute for Global Health (ISGlobal)

RUDN University chemist suggested increasing the biofuel production efficiency with silica-supported heteropolyacids

image: A chemist from RUDN University developed a silica-supported heteropolyacid system to produce ethers from waste products of the wood and paper industry and agriculture. Ethers can be used as biofuels, and the new method increases the efficiency of their production 4 to 10 times, thus reducing energy consumption and making the manufacturing of biofuels cheaper.

Image: 
RUDN University

A chemist from RUDN University developed a silica-supported heteropolyacid system to produce ethers from waste products of the wood and paper industry and agriculture. Ethers can be used as biofuels, and the new method increases the efficiency of their production 4 to 10 times, thus reducing energy consumption and making the manufacturing of biofuels cheaper. The results of the study were published in the Molecular Catalysis journal.

The production of biofuel from non-edible feedstocks is one of the main goals for a more sustainable future. As a rule, one of the most relevant feedstocks for it is lignocellulose--dry waste from the wood and paper industry or agriculture. Among other components, lignocellulose contains hydroxymethylfurfural, or HMF. From it one can obtain ethers that are used as eco-friendly fuel. A chemist from RUDN University developed a molecular matrix (xerogel) containing Preyssler heteropolyacids that increases the efficiency of ether production from HMF 4 to 10 times.

"HMF is a small molecule that has attracted a lot of attention in industry. Its etherification is an important field of research because the products of this reaction are used as fuels and precursors of complex molecules. We tried to optimize HMF etherification with a silica xerogel containing heteropolyacids," said Rafael Luque, PhD, the head of the Molecular Design and Synthesis of Innovative Compounds for Medicine Science Center at RUDN University.

To produce ethers, HMF has to react with alcohols, so the researchers used butanol in their studies. The so-called Preyssler acids acted as a catalyst. They could have worked on their own, but the team found a way to modulate their activity. The chemists developed a xerogel from silicon dioxide and immobilized the Preyssler acids on it. Acid molecules were distributed across the molecular matrix of the xerogel, thus increasing the contact area with HMF. As a result, the xerogel increased both the conversion of the reaction (i.e. the amount of reacting HMF) and its selectivity (the quantity of produced ether as compared to other products).

Having conducted a series of experiments, the team identified the optimum reaction parameters: a temperature of 100 °C and an HMF to butanol ratio of 1 to 3. Under these conditions, conversion reached 89% and selectivity 73%. Therefore, using the catalytic system, one can obtain a given amount of ether from considerably less starting material. This would reduce energy consumption and make the production process cheaper. Moreover, the xerogel can be treated with ethanol and reused up to 5 times, with conversion and selectivity dropping only to 50% and 60% respectively after all 5 cycles.
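As a rough sanity check on these figures, the overall ether yield is simply conversion multiplied by selectivity; the short sketch below restates the reported numbers as arithmetic:

```python
# Overall yield from the two figures reported in the study:
# yield = conversion (fraction of HMF that reacted)
#       x selectivity (fraction of reacted HMF that became the target ether)

def ether_yield(conversion, selectivity):
    return conversion * selectivity

fresh = ether_yield(0.89, 0.73)     # freshly prepared xerogel
recycled = ether_yield(0.50, 0.60)  # after 5 reuse cycles (reported drop)

print(f"Fresh catalyst yield: {fresh:.0%}")   # ~65%
print(f"After 5 cycles:       {recycled:.0%}")  # 30%
```

So even after five ethanol regenerations the catalyst still converts roughly a third of the HMF feed into the target ether.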

"The parameters that we identified can be applied to similar reactions of HMF with other alcohols to obtain ethers with different structures. Such ethers can further be used as fuel or precursors for complex molecules," added Rafael Luque, PhD, the head of the Molecular Design and Synthesis of Innovative Compounds for Medicine Science Center at RUDN University.

Credit: 
RUDN University

Knotting semimetals in topological electrical circuits

image: Imaging nodal knots in momentum space through topolectrical circuits

Image: 
SUTD

Invented more than 15,000 years ago, knots represent one of the earliest technological breakthroughs at the dawn of human history, one that kick-started the subsequent rise of human civilisation. Even today, we still rely on knots in our daily life. Shoelace knots, for instance, have played a critical role in keeping shoes firmly on our feet for generations. Although knots are ancient inventions, their scientific and mathematical significance was only discovered about 200 years ago. Famed mathematicians such as Carl Friedrich Gauss and Peter Guthrie Tait developed general recipes for constructing different knots, and the mathematical rules that govern their classification. Today, knot theory forms one of the central pillars in many areas, including computer science, molecular biology, protein folding, DNA engineering, and drug discovery.

Intriguingly, the electronic properties of a peculiar type of metals, known as the nodal knot semimetals, can also exhibit complex behaviours that mathematically mimic knots. These peculiar knots are known as the momentum space knot, which arises when several electronic bands are intertwined and entangled together. Simply put, the concept of electronic bands provides a powerful physics picture which is particularly useful for describing the electronic properties of solids. Momentum space is the "landscape" that hosts such electronic bands.

For instance, electrically insulating solids typically have pockets of bands that are well separated by empty voids - these empty voids in momentum space serve as a "no-man's-land" that forbids the flow of electricity, rendering such materials electrically insulating. On the other hand, the relatively large abundance of electronic bands and the absence of voids in metals allow electricity to flow through them more effortlessly, making them good conductors.

What makes nodal knot semimetals especially unusual when compared to normal metals is that the electronic bands intertwine and entangle to form knotted structures in momentum space. This is mathematically equivalent to the knots we encounter in everyday life.

Although nodal knot metals have been predicted to exist in several crystals, synthesising these exotic crystals and probing the subtle momentum space knots remains a formidable task. To remedy such difficulties, physicists from Singapore and Germany came up with a new class of designer electrical system in 2018, based entirely on an electrical circuit board. Such designer electrical circuits, dubbed topolectrical circuits, can emulate the complex physical behaviour of crystalline solid materials using ubiquitous electrical components such as resistors, capacitors, inductors and operational amplifiers. Leveraging their enormous design flexibility, topolectrical circuits have been widely used to illustrate exotic physics phenomena in recent years.

Reporting in Nature Communications, physicists from Singapore (National University of Singapore and Singapore University of Technology and Design), Germany (University of Würzburg) and China (Sun Yat-sen University) have achieved a breakthrough in the synthesis and the measurement of momentum space nodal knots using topolectrical circuits.

"The research community has come a long way in the discovery of exotic phases of matter. More than a decade ago, the first topological insulator was synthesised, marking the first time robust topologically protected phenomena were detected in a real material. Today, we have not only engineered a sophisticated topological system based on knotted structures, but also realised it with low-cost, ubiquitous electrical components," said Dr Ching Hua Lee, Assistant Professor at the National University of Singapore, who led the international research team and pioneered the approach of using topolectrical circuits to study fundamental physics phenomena.

A rather unusual aspect of the momentum space knots is the existence of a smoking-gun electrical signature at the boundary of the nodal knot metal, commonly known as the "drumhead states". Measuring drumhead states in solid materials is, however, highly challenging, and typically requires state-of-the-art instruments, such as high-energy synchrotron X-rays and ultrahigh-vacuum environments. In contrast, probing drumhead states in topolectrical circuits requires only simple electrical measurements which can be readily carried out in most labs.

"Topological effects require very precise values of inductor/capacitor components. To counteract this difficulty, we used machine learning to find variations of the circuit design which displayed the same topological phenomena but could be constructed using less precisely made parts," said Amanda Sutrisno, a research team member from the Singapore University of Technology and Design.

Aided by machine learning algorithms, the team has designed topolectrical circuits operating at "sweet spots" that are particularly robust against electrical noise. This novel design allows the elusive electrical signatures of drumhead states to be unambiguously identified.
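The sensitivity to component precision can be seen from the resonance of a single LC node, whose frequency shifts with any tolerance in L or C. The component values below are illustrative, not those of the published circuit:

```python
import math

def resonant_frequency(L, C):
    """Resonant frequency f = 1 / (2*pi*sqrt(L*C)) of an LC node, in Hz."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

# Illustrative component values: 10 uH inductor, 100 nF capacitor.
L0, C0 = 10e-6, 100e-9
f0 = resonant_frequency(L0, C0)

# A +5% tolerance on both parts pulls the resonance down by roughly 5%,
# which is enough to detune a circuit designed around a precise frequency.
f_hi_tol = resonant_frequency(L0 * 1.05, C0 * 1.05)
print(f"Nominal: {f0/1e3:.1f} kHz, with +5% parts: {f_hi_tol/1e3:.1f} kHz")
```

Finding operating points where the topological signature survives such detuning is the kind of robustness search the team delegated to machine learning.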

"The ability to control electrical circuits using topology may offer a new route towards electrical signal processing, remote sensing, and digital information processing using inexpensive and low power components. These aspects could be tremendously important for future technologies such as IoT and beyond 5G networks," said Assistant Professor Yee Sin Ang from the Singapore University of Technology and Design.

Credit: 
Singapore University of Technology and Design

Weak equivalence principle violated in gravitational waves

The Weak Equivalence Principle (WEP) is a key aspect of classical physics. It states that when particles are in freefall, the trajectories they follow are entirely independent of their masses. However, it is not yet clear whether this property also applies within the more complex field of quantum mechanics. In new research published in EPJ C, James Quach at the University of Adelaide, Australia, proves theoretically that the WEP can be violated by quantum particles in gravitational waves - the ripples in spacetime caused by colossal events such as merging black holes.

As well as resolving a long-standing debate in quantum theory, Quach's findings could lead to the development of advanced new materials, including fluids with infinite conductivity and zero viscosity. These could be used as advanced gravitational wave detectors and may even lead to devices which can mirror gravitational waves and harvest their energy. Quach based his approach around a principle named 'Fisher information' - a way of measuring how much information an observable random variable carries about a particular unknown parameter. Here, the random variable describes the position of a quantum particle in a gravitational field, while the unknown parameter is its mass. If the WEP were obeyed, the Fisher information should be zero in this case.
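The logic of the Fisher-information test can be made explicit. For a particle whose measured position follows a distribution f(x; m) that may depend on its mass m, the classical Fisher information about the mass is (standard textbook definition, not notation taken from Quach's paper):

```latex
I(m) = \mathbb{E}\!\left[\left(\frac{\partial \ln f(x;m)}{\partial m}\right)^{2}\right]
     = \int \frac{1}{f(x;m)}\left(\frac{\partial f(x;m)}{\partial m}\right)^{2}\,dx
```

If the WEP holds, the trajectory, and hence f, does not depend on m, so the derivative vanishes and I(m) = 0; any nonzero Fisher information therefore signals a mass-dependent trajectory, and thus a WEP violation.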

Through his calculations, Quach rewrote an equation describing the WEP for freely falling quantum particles, to incorporate their Fisher information. He showed that while these particles obey the WEP in static gravitational fields, their trajectories can indeed give away information about their mass when they pass through gravitational waves. For the first time, the calculation precisely characterises how the WEP can be violated by quantum particles, and provides key insights for future studies searching for the violation through real experiments.

Credit: 
Springer

Direct observation of a single electron's butterfly-shaped distribution in titanium oxide

image: Figure 1. (a) Distribution of a butterfly-shaped 3d electron orbital. (b) Valence electron density distribution around the titanium (Ti3+) ion at the centre of the titanium oxide (TiO6) octahedron obtained by the CDFS analysis developed by the research team for this project.

Image: 
Shunsuke Kitou

The functions and physical properties of solid materials, such as magnetic order and unconventional superconductivity, are greatly influenced by the orbital state of the outermost electrons (valence electrons) of the constituent atoms. In other words, it could be said that the minimal unit that determines a solid material's physical properties consists of the orbitals occupied by the valence electrons. Moreover, an orbital can also be considered a minimal unit of "shape," so the orbital state in a solid can be deduced from observing the spatially anisotropic distribution of electrons (in other words, from how the electron distribution deviates from spherical symmetry).

The orbital states in elements are basic knowledge that can be found in quantum mechanics or quantum chemistry textbooks. For example, it is known that the 3d electrons in transition elements such as iron and nickel have characteristic butterfly-type or gourd-type shapes (see Fig. 1(a)). However, until now, it has been extremely difficult to observe the real-space distribution of such electron orbitals directly.

Now, a research collaboration between Nagoya University, University of Wisconsin-Milwaukee, Japan's RIKEN and Institute for Molecular Science, the University of Tokyo, and the Japan Synchrotron Radiation Research Institute (JASRI), has observed the spatial distribution of a single valence electron at the centre of an octahedron-shaped titanium oxide molecule, using synchrotron X-ray diffraction (see Fig. 1(b)).

To analyse the X-ray diffraction data from the titanium oxide sample, the team developed a Fourier synthesis method in which data from each titanium ion's inner shell electrons - which do not contribute to the compound's physical properties - are subtracted from the total electron distribution of each ion, leaving only the butterfly-shaped valence electron density distribution. The method is called core differential Fourier synthesis (CDFS).
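The subtraction at the heart of CDFS can be mimicked in one dimension. Because the Fourier transform is linear, subtracting the core-electron structure factors from the observed ones and inverse-transforming yields the valence density alone; the sharp Gaussian "core" and two-lobed "valence" profiles below are hypothetical shapes chosen purely to illustrate the idea.

```python
import numpy as np

def cdfs_1d(F_total, F_core):
    """Toy 1D core differential Fourier synthesis: subtract the
    structure factors of the core electrons from the total, then
    inverse-transform to recover the valence density alone."""
    return np.fft.ifft(np.asarray(F_total) - np.asarray(F_core)).real

# Model densities on a 1D grid (hypothetical shapes, for illustration)
x = np.linspace(-5, 5, 512)
core = 10.0 * np.exp(-x**2 / 0.1)          # sharp core peak at the centre
valence = np.exp(-(np.abs(x) - 1.5)**2)    # two lobes away from the centre
total = core + valence

rho_val = cdfs_1d(np.fft.fft(total), np.fft.fft(core))
```

Since the transform is linear, the recovered profile matches the model valence density to floating-point precision; in the real experiment the total structure factors come from measured X-ray diffraction intensities rather than a known model.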

Furthermore, a closer look at the butterfly-shaped electron density revealed that high density remained in the central region (see Fig. 2(a)), in contrast with bare titanium in which electrons do not exist at the centre because of the node of the 3d orbital (see Fig. 1(a)). After careful data analysis, it was found that the electron density at the centre consists of the valence electrons occupying the hybridized orbital generated by the bond between titanium and oxygen. First-principles calculations confirmed this non-trivial orbital picture and reproduced the results of the CDFS analysis very well (see Fig. 2(b)). The image directly demonstrates the well-known Kugel-Khomskii model of the relationship between the magnetic and orbital-ordered states.

The CDFS method can determine the orbital states in materials regardless of their physical properties and can be applied to almost all elements, without the need for difficult experiments or analytical techniques: the method requires neither quantum-mechanical nor informatic models, so bias introduced by analysts is minimized. The results may signal a breakthrough in the study of orbital states in materials. The CDFS analysis will provide a touchstone for a complete description of the electronic state by first-principles or other theoretical calculations.

Credit: 
Nagoya University

Smart fluorescent molecular switches based on boron-based compounds

A molecular switch is a molecule that can be reversibly shifted between two or more stable states in response to external stimuli, such as a change in pH, light or electric current. These molecules are of interest in the field of nanotechnology for application in molecular computers or responsive drug delivery systems. If the molecule is fluorescent in one of its states (on/off), the compound is called a fluorescent molecular switch, and its applications are even more interesting in the field of life sciences, especially if it can operate in small spaces. For example, such switches can be used for biosensing and as imaging probes inside cells.

Scientists from the Institute of Materials Science of Barcelona (ICMAB, CSIC) and the Universitat Autònoma de Barcelona (UAB) have developed a set of extremely stable fluorescent molecular switches that can be controlled electrochemically by applying a potential. This is possible due to the presence of a very particular redox-active anion - a negatively charged molecule that oxidizes and reduces very quickly. In this case, the anion is the so-called COSAN anion (its full name is cobaltabisdicarbollide, and its chemical formula is [3,3'-Co(C2B9H11)2]-), a boron cluster-based complex with a Co(III) center, which has the unusual property of self-assembling into vesicles and micelles.

These systems are the first examples of smart redox-controlled fluorescent molecular switches obtained from boron cluster-based compounds. Due to the presence of the COSAN, they are extremely stable, soluble in a large number of organic solvents, and show a large reversible fluorescence modulation. Additionally, these molecules can form gels with 1D nanostructures by self-assembly, which can preserve in some cases the luminescent behavior.

This research work is the result of a collaboration between Dr. Rosario Núñez from the Inorganic Materials and Catalysis Laboratory (LMI) at ICMAB-CSIC and Dr. Jordi Hernando from the Group of Electrochemistry, Photochemistry and Organic Reactivity (GEFRO) at the Department of Chemistry of the UAB. The LMI group's experience in the chemistry, electrochemistry and photoluminescence of boron cluster-based materials, and the GEFRO group's experience in the luminescent and electrochemical properties of fluorescent dyes such as perylene derivatives, combined in a productive synergy that coupled the strengths of both research areas to produce these new smart molecules with outstanding electro-optical behavior.

"Owing to the presence of the COSAN, the properties of these compounds unambiguously demonstrate their capacity to behave as redox-induced fluorescent switches, which could be of use for the design of molecular memories and information processing devices, biosensing and imaging probes, or electrofluorochromic displays," says Rosario Núñez, researcher at the ICMAB.

"Moreover, these systems surpass the performance of previous systems based on conjugates of perylenediimides with other metal-based redox units such as ferrocene: on one hand, they display larger reversible fluorescence modulation with minimal degradation, while their solubility in polar media is dramatically enhanced, an essential requirement for future applications in biological systems," explains Jordi Hernando, researcher at the UAB.

Credit: 
Universitat Autonoma de Barcelona

Racial disparities in treatment for common lung cancer persist despite gains

PHILADELPHIA -- Lung cancer continues to be the deadliest solid cancer in the world, even though survival rates have been improving over the past decade. However, African American patients have worse outcomes and shorter lifespans after being diagnosed with lung cancer. A new study examines more recent data and shows that although Black patients are now more likely than a decade ago to receive the most effective treatment, the disparity persists.

"We wanted to take a closer look at surgery - the most effective therapy - and how the two other second-line treatment options might affect the disparities in long-term outcomes across populations," says senior author Olugbenga Okusanya, MD, an Assistant Professor of Surgery at Thomas Jefferson University and researcher at the Sidney Kimmel Cancer Center - Jefferson Health. "There has been concern that these second-line treatments have been contributing to the disparity in outcomes."

The results were published in the Journal of Surgical Oncology.

The most effective treatment for early-stage non-small cell lung cancer is surgery to remove a portion of the lung. However, two types of radiation therapy are also used as second-line therapies, with stereotactic ablative radiotherapy (SABR) shown to be more effective than external beam radiation therapy (EBRT) for early-stage disease.

The researchers examined data from 192,415 patients in the National Cancer Data Base who were diagnosed with early-stage (stage 1) non-small cell lung cancer between 2004 and 2015. Of these patients, 91% were white and 9% were Black. "Few reports have included this many patients and looked at both surgery and radiation therapy," says Dr. Okusanya.

The biggest disparity was in the use of surgery, which is the most effective form of therapy for early-stage lung cancer, a difference that persisted in every year of the 11-year study period. Of note, although the utilization of surgery increased over time for both white and Black patients, the rate of increase in Black patients was faster than in white patients. "This indicates that some work is being done to close the disparity in the utilization of surgery in Black patients," says Dr. Okusanya.

Dr. Okusanya and colleagues showed that the use of SABR increased from 2004 to 2015, and the rate of EBRT decreased, as expected, based on reports that EBRT was less effective for these patients. There was no disparity across racial groups in these two second-line therapies.

"Lung cancer incidence and mortality in the Greater Philadelphia region is profound--far outstripping national averages. Studies such as these are critical for understanding factors that contribute to cancer disparities, and are part of the Sidney Kimmel Cancer Center's overarching mission to improve the lives of all cancer patients and their families," said Dr. Karen E. Knudsen, executive vice president of oncology services for Jefferson Health and enterprise director of the Sidney Kimmel Cancer Center - Jefferson Health. "Variances in care across demographics are simply unacceptable. Raising awareness to this issue through Dr. Okusanya's work is the first step toward meaningful change."

In addition, other studies have suggested that comorbidities in Black patients were one of the drivers for worse outcomes, rather than the utilization of surgery. "In contrast, we found that when Black patients get surgery there is actually a trend for them to have better survival than their white counterparts," says Dr. Okusanya.

"We need to continue to reduce barriers to successful treatments for Black cancer patients," says Dr. Okusanya. "We know these disparities exist across cancer types and treatments and understanding some of the drivers of these inequities is key to fixing them."

Credit: 
Thomas Jefferson University

How hard is it to vote in your state?

image: A graphic ranking of ease of vote across the United States.

Image: 
Northern Illinois University

UPDATE: After publication, data input errors were discovered in the 2020 Cost of Voting Index (COVI), affecting the state rankings. The corrected COVI values (and rankings) for each presidential election year from 1996 through 2020 can be found here – scroll down to download the file titled, “COVI Values 1996-2020old and new (xlsx).”

DeKalb, Ill. — With the Nov. 3 presidential election just around the corner, a new analysis identifies U.S. states that make it easiest, and those that make it more challenging, to register and vote.

Political scientist Scot Schraufnagel of Northern Illinois University, along with colleagues Michael J. Pomante II of Jacksonville University and Quan Li of Wuhan University in China, recently updated their Cost of Voting Index to reflect a host of new state laws in place for the 2020 election. The analysis is published online ahead of print in Election Law Journal: Rules, Politics, and Policy.

Voting laws vary widely from one state to another and change frequently. The index uses an assemblage of dozens of current election laws to rank each state according to the time and effort it takes to vote in U.S. presidential elections. Importantly, both stages of the voting process—registering to vote and casting a ballot—are combined into a single index value.
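The idea of folding many heterogeneous laws into a single value can be sketched as follows. This is a toy, equal-weight version, not the published index, which combines dozens of laws with statistically derived weights; the states, measures and scores below are invented for illustration.

```python
import numpy as np

def toy_voting_cost_index(law_scores):
    """Toy cost-of-voting index: z-score each restriction measure
    across states, then average into one cost value per state.
    law_scores: (n_states, n_measures) array, higher = more
    restrictive. Returns per-state costs and an easiest-first
    ranking of state indices."""
    z = (law_scores - law_scores.mean(axis=0)) / law_scores.std(axis=0)
    cost = z.mean(axis=1)
    ranking = np.argsort(cost)  # index 0 = easiest state to vote in
    return cost, ranking

# Three hypothetical states scored on two measures:
# registration deadline (days before Election Day) and strict ID (0/1).
scores = np.array([[0.0, 0.0],    # same-day registration, no strict ID
                   [15.0, 0.0],
                   [30.0, 1.0]])  # 30-day deadline, strict ID law
cost, ranking = toy_voting_cost_index(scores)
```

Standardizing each measure before averaging keeps a single law (say, a deadline measured in days) from dominating the binary ones; the real index faces the same scaling problem across its many inputs.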

 


“Voting and elections are at the heart of our democracy, and voting should be easy,” says Schraufnagel, the study’s lead author. “One characteristic that helps define the competency of an electoral system and the legitimacy of governing institutions is the ease in which you can cast a ballot.”

States with more restrictions have higher cost-of-voting scores and lower rankings for ease of vote. Improving the ease of voting nationwide potentially could bump presidential election turnout from an average of about 55 percent to more than 65 percent, Schraufnagel says.

“State policies that impact the cost of voting are not the only factors that influence turnout, but they are arguably the most efficient way to alter the overall difficulty of voting for citizens, which is a major reason why we see variations in turnout from state to state,” adds Michael Pomante, who specializes in voter behavior.

As for those who claim that easing restrictions would result in widespread voter fraud, Schraufnagel says there is no systematic evidence from research that suggests ballot-stuffing or voter impersonation are major concerns.

“Because voter registration and balloting administration in a presidential election is so decentralized, it would be very difficult to sway a national election in our country by stuffing ballot boxes or casting fake votes,” he says. “A bigger issue is laws that disfranchise people, which cause democracy to be compromised. Research shows that politicians pay less attention to those who don’t participate in elections.”

Oregon takes top honors for making it easy on voters in 2020—followed by Washington, Utah, Illinois and Maryland. Factors that help make voting convenient include online voter registration, early voting, mail-in voting, being able to register as late as Election Day and automatic voter registration of citizens who are eligible to vote.

“Notably, among the top states, Oregon, Utah and Washington all have permanent vote-by-mail processes,” Schraufnagel says. “Illinois is among the group of states that make absentee voting very easy, although it’s not the same as mail-in voting.”

On the flipside, Texas is the state with the most restrictive voting processes, followed by Georgia, Missouri, Mississippi and Tennessee. Texas maintains an in-person voter registration deadline 30 days prior to Election Day, has reduced the number of polling stations in some parts of the state by more than 50 percent and has the most restrictive pre-registration law in the country, according to the analysis.

Other factors that make voters jump through more hoops in some states include strict voter ID laws, cumbersome absentee voting processes and a lack of early voting options.

The initial Cost of Voting Index, which examined two decades of data, was published in 2018. In recent years, the researchers have seen dynamic changes in voting laws. States that moved up considerably on the ease-of-vote ranking, by making concerted efforts to make the voting process more hassle-free, include Virginia, Michigan, Hawaii, Vermont, Ohio, New York, South Dakota, Washington, Illinois and Rhode Island.

Virginia’s move up the rankings was particularly dramatic. The state climbed 37 places and is now the 12th easiest state in which to vote. Specifically, Virginia’s state legislature approved an automatic voter registration law, got rid of the in-person registration deadline and made Election Day a state holiday, among other considerations.

Similarly, Michigan jumped up by 32 places in the ranking. In November 2018, voters there overwhelmingly approved eight changes to the state constitution making it easier to register and cast a ballot. “What is abundantly clear from the examples of Virginia and Michigan is that if a state wishes to make voting more accessible, it is entirely possible to do so,” Schraufnagel says.

States that saw the biggest declines in their ease-of-vote rankings were West Virginia, Missouri, Iowa, Georgia, Alaska, Nebraska, Maine, Florida, Delaware and Arkansas.

“The primary reason why these states have seen the largest declines in rankings,” Pomante explains, “is they lack a willingness to modernize their policies to ease the difficulty of voting and stay current with election law trends we see in many other states.”

The authors note that some states made temporary changes to state election law in response to the COVID-19 pandemic. These provisional laws were not factored into the 2020 Cost of Voting Index, but they are taken into consideration in a separate index in the newly published report.

Because of concerns over possible illness transmission at polling places, four states (CA, NV, NJ and VT) and the District of Columbia adopted provisional vote-by-mail processes. Additionally, 15 states made absentee voting processes less restrictive. Had the pandemic-related changes been made permanent, some of those states would have improved their ease-of-vote rankings. Notably, California would have moved up four notches in the rankings, and Vermont would have moved up two places.

The researchers also note that some reforms—such as online voter registration and automatic voter registration—are found to come with a reduced monetary cost for states.

“Overall, our research offers a panoramic summation of how various institutional designs at the state level affect citizens’ chances of voting at the individual level,” says co-author Quan Li. “We believe our research not only sheds light on future electoral reforms, in particular, but also contributes to democracy studies in general.”

Media Contact: Tom Parisi

About NIU

Northern Illinois University is a student-centered, nationally recognized public research university, with expertise that benefits its region and spans the globe in a wide variety of fields, including the sciences, humanities, arts, business, engineering, education, health and law. Through its main campus in DeKalb, Illinois, and education centers for students and working professionals in Chicago, Hoffman Estates, Naperville, Oregon and Rockford, NIU offers more than 100 areas of study while serving a diverse and international student body.

Journal

Election Law Journal: Rules, Politics, and Policy

DOI

10.1089/elj.2020.0666

Credit: 
Northern Illinois University

Artificial intelligence-based algorithm for the early diagnosis of Alzheimer's

image: Network activation map from the output of second temporal convolution layer mapped onto MNI brain atlas. From doi 10.1117/1.JMI.7.5.056001.

Image: 
H. Parmar et al. doi 10.1117/1.JMI.7.5.056001.

Alzheimer's disease (AD) is a neurodegenerative disorder that affects a significant proportion of the older population worldwide. It causes irreparable damage to the brain and severely impairs the quality of life in patients. Unfortunately, AD cannot be cured, but early detection can allow medication to manage symptoms and slow the progression of the disease.

Functional magnetic resonance imaging (fMRI) is a noninvasive diagnostic technique for brain disorders. It measures minute changes in blood oxygen levels within the brain over time, giving insight into the local activity of neurons. Despite its advantages, fMRI has not been used widely in clinical diagnosis. The reason is twofold. First, the changes in fMRI signals are so small that they are overly susceptible to noise, which can throw off the results. Second, fMRI data are complex to analyze. This is where deep-learning algorithms come into the picture.

In a recent study published in the Journal of Medical Imaging, scientists from Texas Tech University employed machine-learning algorithms to classify fMRI data. They developed a type of deep-learning algorithm known as a convolutional neural network (CNN) that can differentiate among the fMRI signals of healthy people, people with mild cognitive impairment, and people with AD.

CNNs can autonomously extract features from input data that are hidden to human observers. They obtain these features through training, for which a large amount of pre-classified data is needed. CNNs are predominantly used for 2D image classification, which means that four-dimensional fMRI data (three spatial and one temporal) present a challenge. fMRI data are incompatible with most existing CNN designs.

To overcome this problem, the researchers developed a CNN architecture that can appropriately handle fMRI data with minimal pre-processing. The first two layers of the network extract features based solely on temporal changes, without regard for 3D structural properties. The three subsequent layers then extract spatial features at different scales from the previously extracted temporal features. This yields a set of spatiotemporal characteristics that the final layers use to classify the input fMRI data as coming from a healthy subject, one with early or late mild cognitive impairment, or one with AD.
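The temporal-then-spatial ordering described above can be sketched in plain NumPy. This is a structural illustration only: the actual network is a trained CNN with learned kernels, whereas here the temporal stage is a fixed bank of 1D filters applied voxel-wise and the spatial stage is simple block-averaging, with made-up sizes.

```python
import numpy as np

def temporal_conv(fmri, kernels):
    """Convolve each voxel's time series with a bank of 1D kernels.
    fmri: (X, Y, Z, T) array; kernels: (K, k) array.
    Returns (K, X, Y, Z, T-k+1) temporal feature maps."""
    K, k = kernels.shape
    out = np.empty((K,) + fmri.shape[:3] + (fmri.shape[3] - k + 1,))
    for i, ker in enumerate(kernels):
        # valid-mode convolution along the time axis only
        out[i] = np.apply_along_axis(
            lambda ts: np.convolve(ts, ker, mode="valid"), -1, fmri)
    return out

def spatial_pool(features, factor=2):
    """Downsample the three spatial axes by block-averaging,
    standing in for the spatial stages that follow the temporal ones."""
    K, X, Y, Z, T = features.shape
    f = factor
    return features[:, :X//f*f, :Y//f*f, :Z//f*f].reshape(
        K, X//f, f, Y//f, f, Z//f, f, T).mean(axis=(2, 4, 6))
```

The key design point survives the simplification: the 4D volume is never flattened, so no structural information is discarded before the network sees it, and the time axis is reduced first.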

This strategy offers many advantages over previous attempts to combine machine learning with fMRI for AD diagnosis. Harshit Parmar, doctoral student at Texas Tech University and lead author of the study, explains that the most important aspect of their work lies in the qualities of their CNN architecture. The new design is simple yet effective for handling complex fMRI data, which can be fed as input to the CNN without any significant manipulation or modification of the data structure. In turn, this reduces the computational resources needed and allows the algorithm to make predictions faster.

Can deep learning methods improve the field of AD detection and diagnosis? Parmar thinks so. "Deep learning CNNs could be used to extract functional biomarkers related to AD, which could be helpful in the early detection of AD-related dementia," he explains.

The researchers trained and tested their CNN with fMRI data from a public database, and the initial results were promising: the classification accuracy of their algorithm was as high as or higher than that of other methods.

If these results hold up for larger datasets, their clinical implications could be tremendous. "Alzheimer's has no cure yet. Although brain damage cannot be reversed, the progression of the disease can be reduced and controlled with medication," according to the authors. "Our classifier can accurately identify the mild cognitive impairment stages which provide an early warning before progression into AD."

Credit: 
SPIE--International Society for Optics and Photonics

Researchers break magnetic memory speed record

image: A microscope image of the structures used to initiate the magnetization switching. In the experiment, an optical pump is directed at the photoconductive switch (light grey square), which converts the light into 6-picosecond electric pulses. The structure guides these pulses toward the magnet, which is located in the narrow region between the two grey lines. When the pulses reach the magnet, they trigger the magnetization switching.

Image: 
Image by K. Jhuria

Berkeley -- Spintronic devices are attractive alternatives to conventional computer chips, providing digital information storage that is highly energy efficient and also relatively easy to manufacture on a large scale. However, these devices, which rely on magnetic memory, are still hindered by their relatively slow speeds, compared to conventional electronic chips.

In a paper published in the journal Nature Electronics, an international team of researchers has reported a new technique for magnetization switching -- the process used to "write" information into magnetic memory -- that is nearly 100 times faster than state-of-the-art spintronic devices. The advance could lead to the development of ultrafast magnetic memory for computer chips that would retain data even when there is no power.

In the study, the researchers report using extremely short, 6-picosecond electrical pulses to switch the magnetization of a thin film in a magnetic device with great energy efficiency. A picosecond is one-trillionth of a second.

The research was led by Jon Gorchon, a researcher at the French National Centre for Scientific Research (CNRS) working at the University of Lorraine's L'Institut Jean Lamour in France, in collaboration with Jeffrey Bokor, professor of electrical engineering and computer sciences at the University of California, Berkeley, and Richard Wilson, assistant professor of mechanical engineering and of materials science and engineering at UC Riverside. The project began at UC Berkeley when Gorchon and Wilson were postdoctoral researchers in Bokor's lab.

In conventional computer chips, the 0s and 1s of binary data are stored as the "on" or "off" states of individual silicon transistors. In magnetic memory, this same information can be stored as the opposite polarities of magnetization, which are usually thought of as the "up" or "down" states. This magnetic memory is the basis for magnetic hard drive memory, the technology used to store the vast amounts of data in the cloud.

A key feature of magnetic memory is that the data is "non-volatile," which means that information is retained even when there is no electrical power applied.

"Integrating magnetic memory directly into computer chips has been a long-sought goal," said Gorchon. "This would allow local data on-chip to be retained when the power is off, and it would enable the information to be accessed far more quickly than pulling it in from a remote disk drive."

The potential of magnetic devices for integration with electronics is being explored in the field of spintronics, in which tiny magnetic devices are controlled by conventional electronic circuits, all on the same chip.

State-of-the-art spintronics is done with the so-called spin-orbit torque device. In such a device, a small area of a magnetic film (a magnetic bit) is deposited on top of a metallic wire. A current flowing through the wire leads to a flow of electrons with a magnetic moment, which is also called the spin. That, in turn, exerts a magnetic torque -- called the spin-orbit torque -- on the magnetic bit. The spin-orbit torque can then switch the polarity of the magnetic bit.

State-of-the-art spin-orbit torque devices developed so far have required current pulses of at least a nanosecond, or a billionth of a second, to switch the magnetic bit, while the transistors in state-of-the-art computer chips switch in only 1 to 2 picoseconds. This leaves the speed of the overall circuit limited by the slow magnetic switching.

In this study, the researchers launched the 6-picosecond-wide electrical current pulses along a transmission line into a cobalt-based magnetic bit. The magnetization of the cobalt bit was then demonstrated to be reliably switched by the spin-orbit torque mechanism.

While heating by electric currents is a debilitating problem in most modern devices, the researchers note that, in this experiment, the ultrafast heating aids the magnetization reversal.

"The magnet reacts differently to heating on long versus short time scales," said Wilson. "When heating is this fast, only a small amount can change the magnetic properties to help reverse the magnet's direction."

Indeed, preliminary energy usage estimates are incredibly promising; the energy needed in this "ultrafast" spin-orbit torque device is almost two orders of magnitude smaller than in conventional spintronic devices that operate at much longer time scales.
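The two-orders-of-magnitude figure is at least dimensionally plausible from Joule heating alone. The arithmetic below is a back-of-the-envelope sketch, not the paper's analysis: it assumes, purely for illustration, pulses of equal power at the two time scales, so the deposited energy scales linearly with pulse duration.

```python
def joule_energy(power_w, duration_s):
    """Energy deposited by a current pulse of constant power (joules)."""
    return power_w * duration_s

# Hypothetical equal-power pulses at the two time scales discussed:
E_conventional = joule_energy(1e-3, 1e-9)   # ~1 ns conventional pulse
E_ultrafast = joule_energy(1e-3, 6e-12)     # 6 ps pulse used in this study
ratio = E_conventional / E_ultrafast        # ~167x less energy per pulse
```

In practice the saving also depends on how the magnet responds to ultrafast heating, as Wilson notes above, so the equal-power assumption understates the physics involved.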

"The high energy efficiency of this novel, ultrafast magnetic switching process was a big, and very welcome, surprise," said Bokor. "Such a high-speed, low-energy spintronic device can potentially tackle the performance limitations of current processor level memory systems, and it could also be used for logic applications."

The experimental methods used by the researchers also offer a new way of triggering and probing spintronic phenomena at ultrafast time scales, which could help better understand the underlying physics at play in phenomena like spin-orbit torque.

Credit: 
University of California - Berkeley

Election 2020 chatter on Twitter busy with bots, conspiracy theorists, USC study finds

image: USC scientists analyzed tweets from June 20 to Sept. 9, 2020, and found that right-leaning bots were responsible for 20% of the tweets on certain conspiracy theories, including QAnon, so-called "gate" conspiracies, as well as misinformation about COVID-19.

Image: 
USC Institute for Information Sciences

Bots and conspiracy theorists have infested the Twitter chatter around the upcoming U.S. presidential election, USC researchers have found.

Looking at more than 240 million election-related tweets, the study found that thousands of automated accounts, known as bots, had posted tweets about President Donald Trump, his Democratic opponent former Vice President Joe Biden and both their campaigns.

Most of these bots were promoting right-leaning political conspiracies like QAnon. Though the bots are believed to have been responsible for only a few million of the tweets, they likely reached hundreds of thousands of Twitter users.

The USC study was published Wednesday by the journal First Monday, just days before the Nov. 3 election. The research focused on election-related tweets from June 20, 2020, to Sept. 9, 2020, and other data from Twitter during that period. The USC researchers wrote that their analysis was an attempt to chart the landscape of social media manipulation in the 2020 election.

"I think the most important finding here is that bots exacerbate the consumption of content within the same political chamber, so it increases the effect of echo chambers or the salience of those tweets," said the study's lead author Emilio Ferrara, associate professor of computer science at the USC Viterbi School of Engineering and associate professor of communication at the USC Annenberg School for Communication and Journalism. He also is a research team leader at the USC Information Sciences Institute.

Bots almost exclusively retweeted original posts by Twitter users who are human, the scientists noted. In turn, many humans retweeted the bots' messages that aligned with their political leanings, which then led to additional retweets and replies.

The bots were identified by Botometer, a machine learning-based tool developed by USC and Indiana University scientists to identify likely bots based on their tweeting behavior and several other characteristics.
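Botometer's actual model is a trained classifier over many account features, and nothing below should be read as its implementation. As a purely hypothetical stand-in, behavior-based bot scoring of the kind described can be sketched as a logistic combination of a few rate-style features; the features and weights are invented for illustration.

```python
import math

def toy_bot_score(tweets_per_day, retweet_ratio, account_age_days):
    """Toy behavioral bot score (NOT Botometer's model): a logistic
    combination of hypothetical features often associated with
    automation -- high posting rate, high share of retweets, and a
    young account. Returns a value in (0, 1)."""
    z = (0.05 * tweets_per_day       # hyperactive posting
         + 3.0 * retweet_ratio      # mostly amplifying others
         - 0.002 * account_age_days  # older accounts look more human
         - 2.0)                      # bias term
    return 1.0 / (1.0 + math.exp(-z))
```

A hyperactive, retweet-heavy, days-old account scores near 1, while a long-lived account posting a few original tweets a day scores near 0; real systems learn such weights from labeled accounts rather than fixing them by hand.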

Election 2020: A significant partisan divide on Twitter

Throughout their analysis, the research team identified significant differences between bots and humans and the type of election content they tweet and retweet on the social media platform.

In addition to studying the bots, the researchers examined the political leanings of real human users and common hashtags in those 240 million tweets, as well as any tweets containing stories or other content from partisan media and traditional news outlets.

Highlights of the study included:

Right-leaning accounts significantly outnumbered their left-leaning counterparts by 4-to-1 among bots and by 2-to-1 among humans.

Users that shared or retweeted news from right-leaning media platforms were almost 12 times more likely to share conspiracy narratives than users that shared or retweeted content from left-leaning media (25% vs. 2%).

The conspiracy theories that the researchers traced in tweets included the far-right conspiracy QAnon, as well as conspiracies such as "pizzagate," a debunked claim linking Democratic Party officials and U.S. restaurants with child sex trafficking. They also studied which users were likely to share politically biased COVID-19 conspiracy narratives about the origins of the coronavirus or unsupported claims about treatments for the disease.

About 13% of all users sharing conspiracy narratives were suspected bots. Users tweeting about QAnon were most concentrated in certain states: Alaska, Idaho, Kentucky, Mississippi, Montana and Oklahoma.

Just 4% of the bots shared news from left-leaning and centrist media outlets, including The Washington Post, The New York Times, Los Angeles Times, ABC News, BBC, CNN and others.

About 20% of users that shared content from right-leaning media (e.g. Breitbart, OANN and Infowars) were likely bots.

Foreign interference continues to threaten election integrity

Ferrara led a study in 2016 just before President Trump was elected that revealed the prevalence of what turned out to be foreign-operated bots, mostly Russian, on Twitter seeking to distort the election chatter. Ongoing U.S. investigations since have shown a consistent pattern of election cybersecurity threats from Russia, China and most recently Iran.

The latest study noted some foreign countries engaging in manipulation that previously were not.

After analyzing Twitter's data on recently banned users, the researchers noted that Ghana and Nigeria had launched information campaigns to target left-leaning users about the Black Lives Matter movement. Saudi Arabia and Turkey also had high engagement with right-leaning users; Russia and China mostly targeted left-leaning fringe groups and conservative groups.

"In short, the state of social media manipulation during the 2020 election is no better than it was in 2016. We are very concerned by the proliferation of bots used to spread political conspiracies and the widespread appeal that those conspiracy narratives seem to have on the platform," Ferrara said. "The combination of automated disinformation and distortion on social media continues to threaten the integrity of U.S. elections."

Credit: 
University of Southern California