
Investigating dense plasmas with positron waves

The investigation of Electron-Positron-Ion (EPI) plasma, a fully ionised gas of electrons, positrons, and ions found in astrophysical environments such as solar winds, has attracted a great deal of attention over the last twenty years. A new study published in EPJ D by Garston Tiofack, Faculty of Sciences, University of Maroua, Cameroon, and colleagues assesses the dynamics of positron acoustic waves (PAWs) in EPI plasmas under the influence of magnetic fields, or magnetoplasmas.

The authors studied the changes in PAWs using a framework of Korteweg-de Vries (KdV) and modified Korteweg-de Vries (mKdV) equations, finding that the former led to compressive positron acoustic solitary waves (PASWs), whilst the latter produced both these and additional rarefactive PASWs. Mathematical models and numerical simulations also allowed the researchers to consider the effect of various other factors on the magnetoplasma, including the ratio of hot electron to positron concentrations and the nonthermal parameters applied.
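
For context, the generic reduced forms of these two evolution equations, written here in standard notation for a perturbed potential $\phi(\xi,\tau)$, are given below; the coefficients $A$, $B$, and $C$ depend on the plasma parameters and are not taken from the paper:

$$\frac{\partial \phi}{\partial \tau} + A\,\phi\,\frac{\partial \phi}{\partial \xi} + B\,\frac{\partial^{3} \phi}{\partial \xi^{3}} = 0 \qquad \text{(KdV)}$$

$$\frac{\partial \phi}{\partial \tau} + C\,\phi^{2}\,\frac{\partial \phi}{\partial \xi} + B\,\frac{\partial^{3} \phi}{\partial \xi^{3}} = 0 \qquad \text{(mKdV)}$$

The quadratic nonlinearity of the KdV equation supports solitary waves of a single polarity, while the cubic nonlinearity of the mKdV equation admits both compressive and rarefactive solutions, consistent with the behaviour described above.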

The team discovered that the transition to chaos in the magnetoplasma depends strongly on the frequency and strength of external periodic perturbations.

The study thus serves as a useful guide to understanding the changes that occur in magnetoplasmas in Auroral Acceleration Regions (AAR) as they apply to PAWs. The team's results could also help develop research into astrophysical plasmas, which include solar flares and interstellar plasmas, thus giving physicists a window into the processes that take place in extreme environments like active galactic nuclei and supernova explosions.

Bringing the team's research down to earth somewhat, it could also assist teams that generate plasmas across the globe. These plasmas play a major role in a new generation of nuclear fusion reactors, which aim to generate clean power by replicating the processes that occur in stars.

These plants control their plasmas with powerful magnetic fields, making an understanding of such magnetic influences critically important to future clean energy production.

Credit: 
Springer

Considering disorder and cooperative effects in photon escape rates from atomic gases

Whilst a great deal of research has studied the rates of photons escaping from cold atomic gases, these studies have used a scalar description of light, leaving some of its properties untested. In a new paper published in EPJ B, Louis Bellando, a post-doctoral researcher at LOMA, University of Bordeaux, France, and his coauthors, Aharon Gero and Eric Akkermans, Technion-Israel Institute of Technology, Israel, and Robin Kaiser, Université Côte d'Azur, France, numerically investigate the roles of cooperative effects and disorder in photon escape rates from a cold atomic gas, constructing a model that considers the vectorial nature of light. The study thus accounts for properties of light previously neglected.

"Our study focuses on light propagation in cold atomic gases, in which atoms hardly move. On their way out of the gas, photons undergo multiple scattering by the atoms," Bellando says. "Roughly speaking, the greater the number of these scattering events?--?the longer it takes the photons to leave the gas, and thus the smaller their escape rates. This classical description fits the so-called radiation trapping, which occurs, for example, when light undergoes a random walk in a glass of milk."

When taking into account interference and quantum mechanical effects, two mechanisms affect these escape rates: Anderson localisation, arising from interference effects in the presence of disorder, and Dicke superradiance, cooperative effects stemming from light-mediated interactions between the atoms.

Numerically studying photon escape rates from a three-dimensional cloud of cold atoms allowed the team to consider whether there were any marked differences between the behaviour in the simple scalar case, which assigns a single value to every point in a region, and the more complex vector case, which assigns a magnitude and direction to every point.
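
As an illustration of the scalar case only, the sketch below shows the coupled-dipole ("Green's matrix") construction commonly used in this literature, in which the escape rate of each collective mode is read off from the imaginary part of an eigenvalue. This is a generic textbook-style sketch, not the authors' code; the atom number and cloud size are arbitrary assumptions.

```python
import numpy as np

# Minimal sketch of the scalar coupled-dipole ("Green's matrix") model often used
# for photon escape rates from cold atomic clouds. Generic construction, not the
# authors' code; k0 is the resonant wavenumber, gamma0 the single-atom decay rate.
def escape_rates(positions, k0=1.0, gamma0=1.0):
    """Return the escape rate of each collective mode, in units of gamma0."""
    diffs = positions[:, None, :] - positions[None, :, :]
    r = np.linalg.norm(diffs, axis=-1)
    np.fill_diagonal(r, 1.0)                 # dummy value; diagonal overwritten below
    green = np.exp(1j * k0 * r) / (k0 * r)   # off-diagonal atom-atom couplings
    np.fill_diagonal(green, 1j)              # single-atom (diagonal) term
    eigvals = np.linalg.eigvals(green)
    return gamma0 * eigvals.imag             # superradiant modes: > gamma0; subradiant: < gamma0

# Example: 200 atoms placed uniformly in a sphere of radius k0*R = 10
rng = np.random.default_rng(0)
n_atoms, radius = 200, 10.0
directions = rng.normal(size=(n_atoms, 3))
directions /= np.linalg.norm(directions, axis=1, keepdims=True)
points = directions * radius * rng.uniform(size=(n_atoms, 1)) ** (1 / 3)
rates = escape_rates(points)
print(f"fastest mode: {rates.max():.2f} gamma0, slowest mode: {rates.min():.3f} gamma0")
```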

One of the biggest surprises encountered by the researchers as they collected their results was how well vector field observations agreed with scalar field tests. "Surprisingly, we found no significant difference between the scalar and vectorial models, and in both cases, the dominant mechanism was cooperativity," says Bellando. "Now we know that the scalar model constitutes an excellent approximation when considering photon escape rates from atomic gases."

Because the scalar model is much simpler than the vectorial one, the similarity between the two means that, in the case of photon escape rates, models can use scalar fields rather than vector fields without the risk of losing substantial information.

"Light-matter interaction is an exciting field of research, both theoretically and experimentally," Bellando concludes. "Advances in this area may have a significant impact on other emerging fields, such as quantum computing."

Credit: 
Springer

Using neutron scattering to better understand milk composition

Neutron scattering is a technique commonly used in physics and biology to understand the composition of complex multicomponent mixtures and is increasingly being used to study applied materials such as food. A new paper published in EPJ E by Gregory N Smith, Niels Bohr Institute, University of Copenhagen, Denmark, shows an example of neutron scattering in the area of food science. Smith uses neutron scattering to better investigate casein micelles in milk, with the aim of developing an approach for future research.

Smith, also a researcher at the ISIS Neutron and Muon Source in the UK, explains why better modelling of how neutrons are scattered by structures in colloid materials is important. "How well you can understand the structure of a system from scattering data depends on how good your model is, and the better and more realistic your model, the better your understanding," the researcher says. "This is true for food as for any material. A better understanding of the structure of casein in milk can help better understand dairy products."

Neutron scattering can be used to investigate fluids by swapping the water solvent within them for heavy water, in which hydrogen is replaced with deuterium, an isotope of hydrogen whose nucleus contains a proton and a neutron rather than just a proton.
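
To illustrate why this solvent swap is useful, the sketch below computes the heavy-water fraction at which the solvent's neutron scattering length density (SLD) matches that of a chosen component, rendering that component effectively invisible. The SLD values are commonly quoted approximations, not figures from Smith's analysis.

```python
# Minimal sketch of neutron contrast variation: tune the H2O/D2O ratio until the
# solvent's scattering length density (SLD) matches a component's SLD.
# The values below are approximate literature figures, not from Smith's paper.
SLD_H2O = -0.56e-6   # 1/Angstrom^2, approximate
SLD_D2O = 6.36e-6    # 1/Angstrom^2, approximate

def solvent_sld(d2o_fraction):
    """SLD of an H2O/D2O mixture with the given D2O volume fraction."""
    return d2o_fraction * SLD_D2O + (1.0 - d2o_fraction) * SLD_H2O

def match_point(component_sld):
    """D2O volume fraction at which the solvent SLD equals the component's SLD."""
    return (component_sld - SLD_H2O) / (SLD_D2O - SLD_H2O)

# A protein-like SLD of roughly 2.4e-6 1/Angstrom^2 is matched near 43% D2O
print(f"match point: {match_point(2.4e-6):.0%} D2O")
```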

"I set out to see if the model that I had developed for casein micelles in milk could also be applied to existing neutron scattering data. The particular set of data that I looked at was extensive and had measurements from a large number of backgrounds, with different water to heavy water ratios," Smith continues. "This meant that I would not only be able to see if the model worked with different measurements, which would support its wider application, but also meant that I would be able to better quantify the composition of milk."

Smith further explains that he was pleased to see his model agreed well when compared with existing data, something that is not always guaranteed when testing out new models with scattering experiments. What surprised the researcher, however, was just how much scattering occurred even in skimmed milk, with its fewer fat droplets.

"Even common and everyday materials, such as food, have a complex structure on the nanoscale," Smith concludes. "You might look at milk and just see a cloudy liquid, but inside there are proteins that self-assemble into colloids, proteins that are free in solution, large droplets of fat, and many other components as well.

"By using a technique like scattering to study such a system, you can get beneficial information about all these constituents."

Credit: 
Springer

Republican and Democratic voters agree on one thing: the need for generous COVID-19 relief

image: Bright Line Watch asked experts to rate the severity of the threats posed to democracy. More than 90 percent of the polled experts viewed the items that scored highest across the (ab)normality-and-importance dimensions as either a "moderate," "serious," or "grave" threat.

Image: 
Bright Line Watch

Both Democrats and Republicans overwhelmingly favor politicians who support generous COVID-19 relief spending, yet remain deeply polarized over the legitimacy of the 2020 presidential election results and former President Donald Trump's second impeachment. Meanwhile, political experts find that the former president's actions and those taken by congressional supporters in the aftermath of the election represent serious departures from American democratic norms.

Those are among the most recent findings of Bright Line Watch, the political science research project cofounded by Gretchen Helmke, a professor of political science at the University of Rochester, and her colleagues at the University of Chicago and Dartmouth College. The watchdog group started regular surveys about the health of US democracy in February 2017.

Read Bright Line Watch's latest (February 2021) survey, "American democracy at the start of the Biden presidency."

The team found strong bipartisan support for a new COVID-19 relief package, with Republican voters favoring a hypothetical candidate who supports a $500 billion pandemic appropriation over one who opposes it by 11 points, independents by 12 points, and Democrats by 18 points.

COVID relief has proven to be extremely popular with supporters of both parties, says Bright Line Watch cofounder Brendan Nyhan, a professor of government at Dartmouth College. "We've seen Democrats and Republicans in Congress at times compete to provide more generous offers of aid and assistance. The public seems to largely agree that the government should provide more help given the economic circumstances Americans currently face."

Meanwhile, the legitimacy of the election result remains a polarizing issue: while 42 percent of Republican public policymakers expressed confidence in the integrity of the election results at the national level, only 22 percent of Republicans in the public sample felt the same way. To Helmke, the public's continued partisan view of the election is troubling.

"In a democracy people basically have to trust that the rules are fair and that if their party or their team loses, the stakes of that loss won't be intolerable, that in the future they'll be able to contest an election again, and that they'll have a chance of winning. That keeps everyone committed to democracy and to playing by the rules," Helmke says. "Once you break that faith--that elections actually determine who the winner is--people's allegiance to democracy wanes."

While the latest survey provides a snapshot of the state of democracy in the early days of the new Biden administration, it's also a look in the rearview mirror. The Bright Line Watch team found that loyalties and antipathy toward the former president--whose Senate impeachment trial began immediately after the surveys were conducted--continue to shape the views of citizens and government officials alike.

As a result, the "country still lives in the shadow of the Trump legacy," the team writes.

As they had done throughout the project, the group fielded two parallel surveys--one to political experts and one to a representative sample of the US population--between January 28 and February 8.

Among the key findings in the survey of the public:

Partisan differences in confidence in the 2020 election and on legal and political accountability for former President Trump are profound. Democrats trust the election, support disqualifying Trump from holding future office, and believe he should face criminal prosecution. Republicans distrust the election results and favor moving on without consequences for Trump. Independents are split.

While there is cross-party consensus on government spending on pandemic relief, stark polarization over the certification of the presidential election and impeachment continues, with Republicans punishing Republican candidates for crossing the party line on either issue.

Among the key findings in the survey of political experts:

The experts overwhelmingly favor a set of reform proposals to expand voting participation, tighten campaign finance regulation, and modify how electoral districts are configured and votes are cast. They also favor abolishing the Senate filibuster and imposing term limits on Supreme Court justices. The only reform the experts reject is compulsory voting.

Experts rate the January 6 insurrection and President Trump's pressure on state-level officials to overturn the election as among the most abnormal and important events of the Trump presidency. They overwhelmingly regard these events and the votes by a majority of Republican lawmakers in Congress not to certify the presidential election results as grave or serious threats to American democracy.

Thinking of secession?

The specter of secession entered into the group's battery of questions after legislators at the local and state level started mentioning it publicly. For the first time Bright Line Watch asked its public sample about the prospect of breaking up the United States into more than one country--a genuinely radical proposition, the team acknowledges.

"Until recently, we would have regarded it as too marginal to include in a survey. But state legislators in Mississippi and Texas and state GOP leaders in Texas and Wyoming have openly advocated secession in recent months, prompting us to design two survey items to gauge perceptions of this idea," they write.

Notably, when presented with a proposal for their region to secede from the United States, almost one in three Americans polled (29 percent) is willing to entertain the prospect. Republicans (33 percent) support secession more than Democrats (21 percent); but Democrats are more amenable to secession than Republicans in areas where they tend to hold power.

Yet the researchers caution against reading too much into the data: the results reflect initial reactions by respondents to an issue they are very unlikely to have considered carefully.

Credit: 
University of Rochester

New study highlights importance of context to physical theories

A Swansea University scientist's research into the geometrical characteristics of physical theories is highlighted in a new paper.

Physicist Dr Farid Shahandeh said: "Imagine a physical theory whose explanation for the trajectory of an apple falling from a tree differs for Gala and Pink Lady. We know that the apple's variety has nothing to do with how it falls. A theory like this is overcomplicated.

"Any seemingly unnecessary and nonsensical parameter like this adds context to a theory's description of a physical phenomenon.

"Luckily, classical theories are not contextuality. But, we know that if we try to interpret quantum mechanics in classical terms, it becomes contextual, and consequently counterintuitive."

In his study, Contextuality of General Probabilistic Theories, Dr Shahandeh examines what structural property of a theory like quantum mechanics makes it prone to contextuality.

It has just been published in PRX Quantum, a prestigious journal from the American Physical Society.

He said: "In this work we determine the geometrical characteristics that give rise to either noncontextuality or contextuality of a physical theory.

"In doing so, we consider the framework of general probabilistic theories, a unified tool that comes very handy for constructing generic physical theories."

He added: "The techniques and results obtained in this work can be extended and applied to quantum information processing tasks that aim for non-classical advantages."

Dr Shahandeh is the recipient of a Royal Commission for the Exhibition of 1851 research fellowship, the annual awards designed to give early career scientists or engineers of exceptional promise the opportunity to conduct a research project of their own instigation.

His research at the College of Science examines the characterization of fundamental resources required for quantum computation algorithms.

Credit: 
Swansea University

Quantum quirk yields giant magnetic effect, where none should exist

image: Rice University theoretical physicists (from left) Hsin-Hua Lai, Qimiao Si and Sarah Grefe worked with experimental collaborators at Vienna University of Technology to understand how topological features of a nonmagnetic Weyl-Kondo semimetal allowed it to produce a giant Hall effect in the absence of a magnetic field.

Image: 
Photo by Jeff Fitlow/Rice University

HOUSTON - (Feb. 26, 2021) - In a twist befitting the strange nature of quantum mechanics, physicists have discovered the Hall effect -- a characteristic change in the way electricity is conducted in the presence of a magnetic field -- in a nonmagnetic quantum material to which no magnetic field was applied.

The discovery by researchers from Rice University, Austria's Vienna University of Technology (TU Wien), Switzerland's Paul Scherrer Institute and Canada's McMaster University is detailed in a paper in the Proceedings of the National Academy of Sciences. Of interest are both the origins of the effect, which is typically associated with magnetism, and its gigantic magnitude -- more than 1,000 times larger than one might observe in simple semiconductors.

Rice study co-author Qimiao Si, a theoretical physicist who has investigated quantum materials for nearly three decades, said, "It's really topology at work," referring to the patterns of quantum entanglement that give rise to the unorthodox state.

The material, an exotic semimetal of cerium, bismuth and palladium, was created and measured at TU Wien by Silke Bühler-Paschen, a longtime collaborator of Si's. In late 2017, Si, Bühler-Paschen and colleagues discovered a new type of quantum material they dubbed a "Weyl-Kondo semimetal." The research laid the groundwork for empirical investigations, but Si said the experiments were challenging, in part because it wasn't clear "which physical quantity would pick up the effect."

In April 2018, Bühler-Paschen and TU Wien graduate student Sami Dzsaber, the study's first author, dropped by Si's office while attending a workshop at the Rice Center for Quantum Materials (RCQM). When Si saw Dzsaber's data, he was dubious.

"Upon seeing this, everybody's first reaction is that it is not possible," he said.

To appreciate why, it helps to understand the nature of the effect and its 1879 discovery by Edwin Hall, a doctoral student who found that applying a magnetic field at a 90-degree angle to a conducting wire produced a voltage difference across the wire, in the direction perpendicular to both the current and the magnetic field. Physicists eventually discovered the source of the Hall effect: The magnetic field deflects the motion of passing electrons, pulling them toward one side of the wire. The Hall effect is a standard tool in physics labs, and devices that make use of it are found in products as diverse as rocket engines and paintball guns. Studies related to the quantum nature of the Hall effect captured Nobel Prizes in 1985 and 1998.
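
For reference, the textbook relation for the ordinary Hall effect in a conducting slab (a standard result, not specific to the Weyl-Kondo material studied here) reads

$$V_H = \frac{I B}{n q t},$$

where $I$ is the current, $B$ the perpendicular magnetic field, $n$ the carrier density, $q$ the carrier charge, and $t$ the sample thickness. The surprise in the new work is a Hall-like voltage with $B = 0$, which this classical relation cannot produce.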

Dzsaber's experimental data clearly showed a characteristic Hall signal, even though no magnetic field was applied.

"If you don't apply a magnetic field, the electron is not supposed to bend," Si said. "So, how could you ever get a voltage drop along the perpendicular direction? That's why everyone didn't believe this at first."

Experiments at the Paul Scherrer Institute ruled out the presence of a tiny magnetic field that could only be detected on a microscopic scale. So the question remained: What caused the effect?

"In the end, all of us had to accept that this was connected to topology," Si said.

In topological materials, patterns of quantum entanglement produce "protected" states, universal features that cannot be erased. The immutable nature of topological states is of increasing interest for quantum computing. Weyl semimetals, which manifest a quasiparticle known as the Weyl fermion, are topological materials.

So are the Weyl-Kondo semimetals Si, Bühler-Paschen and colleagues discovered in 2018. Those feature both Weyl fermions and the Kondo effect, an interaction between the magnetic moments of electrons attached to atoms inside the metal and the spins of passing conduction electrons.

"The Kondo effect is the quintessential form of strong correlations in quantum materials," Si said in reference to the correlated, collective behavior of billions upon billions of quantum entangled particles. "It qualifies the Weyl-Kondo semimetal as one of the rare examples of a topological state that's driven by strong correlations.

"Topology is a defining characteristic of the Weyl-Kondo semimetal, and the discovery of this spontaneous giant Hall effect is really the first detection of topology that's associated with this kind of Weyl fermion," Si said.

Experiments showed that the effect arose at the characteristic temperature associated with the Kondo effect, indicating the two are likely connected, Si said.

"This kind of spontaneous Hall effect was also observed in contemporaneous experiments in some layered semiconductors, but our effect is more than 1,000 times larger," he said. "We were able to show that the observed giant effect is, in fact, natural when the topological state develops out of strong correlations."

Si said the new observation is likely "a tip of the iceberg" of extreme responses that result from the interplay between strong correlations and topology.

He said the size of the topologically generated Hall effect is also likely to spur investigations into potential uses of the technology for quantum computation.

"This large magnitude, and its robust, bulk nature presents intriguing possibilities for exploitation in topological quantum devices," Si said.

Si is the Harry C. and Olga K. Wiess Professor in Rice's Department of Physics and Astronomy and director of RCQM. Bühler-Paschen is a professor at TU Wien's Institute for Solid State Physics.

Credit: 
Rice University

Cerium sidelines silver to make drug precursor

image: A mild process discovered by Rice University chemists could replace difficult, silver-based catalysis to create valuable fluoroketones, a precursor in the design and manufacture of drugs.

Image: 
Illustration by Renee Man/@chemkitty

HOUSTON - (Feb. 26, 2021) - Save your silver! It's better used for jewelry than as a catalyst for drugs.

Rice University scientists have developed a greatly simplified method to make fluoroketones, precursors for drug design and manufacture that typically require a silver catalyst.

Rice chemist Julian West and graduate students Yen-Chu Lu and Helen Jordan introduced a process for the rapid and scalable synthesis of fluoroketones that have until now been challenging and expensive to make.

Their open-access work graces the cover of the Feb. 21 issue of the Royal Society of Chemistry journal ChemComm.

The lab's new process replaces silver with cerium-based ceric ammonium nitrate (CAN), which produces functional precursors under mild conditions in about 30 minutes.

"We could make batches of this in a bathtub," West said.

Cerium has demonstrated such potential in other labs, and the fact that it's 800 times more abundant in the Earth's crust than silver made it of great interest to the Rice team.

"Ketones are a gateway functional group in molecules that you can use to make different things, like anti-cancer compounds," said West, who came to Rice in 2019 with funding from the Cancer Prevention and Research Institute of Texas and was named a Forbes 30 Under 30 science "game changer" last year.

"They're a great foothold to turn into an alkene or an aromatic ring," he said. "The important part of this paper is that we're incorporating fluorine into these fragments. Fluorine is an interesting element and quite abundant, but it's barely used in biology.

"Fluorine has some extreme properties: It's incredibly electronegative, so it holds onto its electrons," West said. "That makes it hard for enzymes in biological processes to deal with them in pharmaceuticals like anti-cancer molecules."

Hydrogen atoms in drug molecules are easy for the liver to process, but replacing them with fluorines "is like armor plating at that position," he said. "That helps drugs last far longer in the body, so you don't have to take as much. That's desirable for chemotherapeutics." He noted that atorvastatin (aka Lipitor), one of the most commonly prescribed drugs in the United States, incorporates fluorine for the same purpose.

"We want to put fluorine in specific places in the molecule where we know it will make a difference, and this ketone functional group allows us to do it," West said. "People have been using a silver catalyst, but the process requires a lot of silver, it takes a long time at high temperature and it has to be done under a carefully controlled nitrogen or argon atmosphere.

"Our process is cheap bucket chemistry, and we think the reaction is done in about five minutes," he said. "But we leave it for 30, just to be safe."

The process is highly scalable. "When Yen-Chu tripled the initial recipe, he got the exact same result," West said. "That's rare in these kinds of reactions."

Credit: 
Rice University

Scientists investigate Walker breakdown in 3D magnetic nanowires more thoroughly

image: FEFU spin nano lab, research equipment

Image: 
FEFU press office

Physicists from Russia, Chile, Brazil, Spain, and the UK have studied how the magnetic properties of 3D nanowires, promising materials for various magnetic applications, change depending on the shape of their cross-section. In particular, they probed more deeply into the Walker breakdown phenomenon, whose understanding underpins the successful implementation of future electronic devices. The research outcome appears in Scientific Reports.

The cross-sectional geometry of a three-dimensional nanowire affects the domain wall (DW) dynamics and is therefore crucial for their control. In turn, managing the DW dynamics under various external conditions is necessary to realize future electronic and computing devices that operate on new physical principles. Such equipment will be faster, more reliable, smaller, and more energy-efficient. Examples include magnetic memory, magnetic signal generators, and magnetic logic devices.

The domain wall dynamics in magnetic nanowires is curbed by the Walker breakdown phenomenon: the loss of the linear dependence of the domain wall velocity on the magnitude of the external magnetic field once the field exceeds a critical value known as the Walker field.
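
In the simplest one-dimensional description (a generic textbook model, not the nanowire-specific theory developed in the paper), the wall moves steadily below the Walker field and its velocity grows linearly with the applied field,

$$v \approx \frac{\gamma \Delta}{\alpha} H \quad (H < H_W), \qquad H_W \sim \frac{\alpha H_{K\perp}}{2},$$

where $\gamma$ is the gyromagnetic ratio, $\Delta$ the wall width, $\alpha$ the Gilbert damping, and $H_{K\perp}$ the hard-axis anisotropy field; above $H_W$ the wall begins to precess and its time-averaged velocity drops and oscillates.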

"We managed to find out that the oscillatory behavior of the DW in a nanowire with a polygonal cross-section comes from energy changes due to deformations of the DW shape during the rotation around the nanowire. Thus, a deeper understanding of the Walker breakdown phenomenon is provided," says research participant Yuri Ivanov, a docent at the Department of Computer Systems, Far Eastern Federal University School of Natural Sciences. "We have studied 3D nanostructures in which domain walls can oscillate not only along the nanowire but also around it. This double oscillation can be considered as a basis, when designing, for example, the sources of radiofrequency electromagnetic radiation (nano-oscillators) for smartphones of the new generation."

The production of 3D magnetic nanowires is a fast-growing area of research, and these materials hold a special position among prospective magnetic nanostructures. The different cross-sectional shapes and curvatures of nanowires determine their dynamic and static magnetic properties. However, these properties are extremely difficult to study because of the three-dimensional structure of the nano-objects. Scientists see an additional complication in scaling up the production of 3D nanowires and making it compatible with existing engineering solutions, for example in nanoelectronics.

Next, the scientists plan to develop a theoretical model to predict how the dynamic magnetic properties change in 3D nanowires of various cross-sections and curvatures.

Credit: 
Far Eastern Federal University

Identifying patient-specific differences to treat HCM with precision medicine

Hypertrophic cardiomyopathy (HCM) is a cardiovascular disease characterized by thickening of the left ventricle, otherwise known as the main squeezing chamber of the heart. HCM is best known for causing sudden death in athletes but can occur in persons of any age, often without symptoms. While frequently discussed in the context of genetics, most patients with HCM do not have a known genetic variant. Investigators from Brigham and Women's Hospital uncovered a means to study the complexity of this disease beyond the identification of individual genes. This new approach offers a path toward treating HCM using individualized medicine. In a recent study, investigators analyzed the role protein-protein interactions (PPIs) play in differentiating individual cases of HCM. Their results are published in Nature Communications.

"While genes play a role in HCM, there is more information surrounding this condition that can't be explained with genetics alone," said corresponding author Bradley Maron, MD, a cardiologist in the Division of Cardiovascular Medicine at the Brigham. "This raises the question of whether there are other important components of the disease. With this project, we aim to provide an expanded view of the pathobiology of HCM in a way that doesn't hinge on understanding specific gene mutations."

The team collected tissue from 18 HCM patients recently recruited to receive myectomies, surgical procedures involving the excision of a portion of the heart muscle wall. To identify individual PPIs among the HCM cohort, Maron, co-lead author Ruisheng Wang, PhD, and colleagues analyzed tissue contents using RNA-seq, a technique that allows researchers to identify patterns of where and when genes are active. Through a series of steps, they identified patient-specific PPI networks (known as reticulotypes) corresponding to the unique biological characteristics of each patient's disease profile.
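
A hypothetical sketch of this kind of pipeline step is shown below: a reference interactome is restricted to the genes a given patient expresses, yielding a patient-specific subnetwork. The file names, column names, and expression cutoff are illustrative placeholders, not details of the published method.

```python
import networkx as nx
import pandas as pd

# Hypothetical sketch of deriving a patient-specific PPI subnetwork ("reticulotype")
# by restricting a reference interactome to the genes expressed in one patient's
# RNA-seq data. Files, columns, and the cutoff are illustrative stand-ins only.
reference_ppi = nx.read_edgelist("reference_interactome.tsv", delimiter="\t")
expression = pd.read_csv("patient_01_rnaseq.csv", index_col="gene")  # column: "tpm"

expressed_genes = set(expression.index[expression["tpm"] > 1.0])     # illustrative cutoff
patient_network = reference_ppi.subgraph(expressed_genes).copy()

print(patient_network.number_of_nodes(), "proteins,",
      patient_network.number_of_edges(), "interactions in this patient's network")
```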

The group discovered that they could distinguish individualized protein networks in each patient in the HCM cohort.

"These findings represent a major step forward for precision medicine," said Maron. "With the identification of patient-specific biological wiring maps, researchers may one day develop personalized treatments informed by patients' protein networks."

The study is unique in that researchers studied affected tissue collected directly from HCM patients, allowing for a more robust, accurate way of studying patients' pathobiology than has been performed previously. Maron states, however, that in the future, he hopes to develop less invasive ways to perform this same test, whether it be through the collection of blood samples or through other biomarkers in the clinic. The team additionally aspires to apply this same procedure to other diseases as well, hoping to expand the number of opportunities for precision medicine.

"This study illustrates the complexity of HCM but also offers a clearer path forward for understanding the disease pathobiology with the promise of opportunity for precision medicine in this disease," said Maron.

Credit: 
Brigham and Women's Hospital

Nuclear physicists on the hunt for squeezed protons

image: A new experiment used high-energy electrons to knock out protons from within a carbon nucleus in search of "squeezed protons". These are protons that are "squeezed" such that their constituent quarks are in a small size configuration, allowing them to slip out of the nucleus without interacting with other protons or neutrons, an effect called color transparency. The new experiment pushed the measurements to the highest speeds ever explored with electrons, but found that the knocked-out protons behave just like ordinary protons.

Image: 
DOE's Jefferson Lab

While protons populate the nucleus of every atom in the universe, sometimes they can be squeezed into a smaller size and slip out of the nucleus for a romp on their own. Observing these squeezed protons may offer unique insights into the particles that build our universe.

Now, researchers hunting for these squeezed protons at the U.S. Department of Energy's Thomas Jefferson National Accelerator Facility have come up empty handed, suggesting there's more to the phenomenon than first thought. The result was recently published in Physical Review Letters.

"We were looking to squeeze the proton such that its quarks are in a small-size configuration. And that's a pretty tough thing to do," said Holly Szumila-Vance, a Jefferson Lab staff scientist.

Protons are made of three quarks bound up by the strong force. In an ordinary proton, the strong force is so strong that it leaks out, making the proton stick to other protons and neutrons around it in the nucleus. That's according to quantum chromodynamics, or QCD, the theory that describes how quarks and the strong force interact. In QCD, the strong force is also referred to as the color force.

However, QCD also predicts that the proton can be squeezed such that the quarks become more tightly knit - essentially wrapping themselves up so tightly in the color force that it no longer leaks out of the proton. When that happens, the proton no longer sticks to other particles and can move freely through the nucleus. This phenomenon is called "color transparency," since the proton has become invisible to the color force of the particles around it.

"It's a fundamental prediction of quantum chromodynamics, the theory that describes these particles," Szumila-Vance explained.

An earlier experiment showed color transparency in pions, simpler particles made of quarks. Where protons have three quarks, pions have just two. In addition, another experiment conducted with protons had suggested that protons, too, may exhibit color transparency at energies well within reach of the recently upgraded facility at Jefferson Lab.

"We expected to find the protons squeezed just like the pions," said Dipangkar Dutta, a professor at Mississippi State University and a spokesperson for the experiment. "But we went to higher and higher energies and are still not finding them."

The experiment was one of the first to run in the Continuous Electron Beam Accelerator Facility, a DOE Office of Science User Facility, following its 12 GeV upgrade. In the experiment, the nuclear physicists directed high-energy electrons from CEBAF into the nuclei of carbon atoms. They then measured the outgoing electrons and any protons that came out.
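
The quantity typically extracted in such searches (described here in generic terms, not as the paper's exact definition) is the nuclear transparency,

$$T(Q^2) = \frac{\sigma_{\mathrm{measured}}}{\sigma_{\mathrm{PWIA}}},$$

the ratio of the measured yield of knocked-out protons to the yield expected if the struck proton left the nucleus without final-state interactions (the plane-wave impulse approximation). Color transparency would show up as $T$ rising toward unity with increasing momentum transfer $Q^2$; a transparency that stays flat signals the "ordinary proton" behaviour the team reports.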

"This was an exciting experiment to be a part of. It was the first experiment to run in Experimental Hall C after we upgraded the hall for 12 GeV running," said Szumila-Vance. "These were the highest-momentum protons measured at Jefferson Lab, and the highest-momentum protons ever produced by electron scattering."

"At the energies we are probing, the proton is usually decimated, and you're looking at the debris of the proton," Dutta explained. "But in our case, we want the proton to stay a proton, and the only way that that can happen is if the quarks kind of squeeze together, hold each other much more tightly so that they can escape together from the nucleus."

While the nuclear physicists observed several thousand protons in the experiment, they did not find the tell-tale signs of color transparency in the new data.

"I think this tells us that the proton is more complicated than we expected," said Szumila-Vance. "This is a fundamental prediction of the theory. We know that it has to exist at some high energy, but just don't yet know where that will happen."

The researchers said the next step is to better understand the phenomenon in simpler particles where it has already been observed, so that improved predictions can be made for more complex particles, such as protons.

Credit: 
DOE/Thomas Jefferson National Accelerator Facility

Study uncovers flaws in process for maintaining state voter rolls

States regularly use administrative records, such as motor-vehicle data, in determining whether people have moved to prune their voter rolls. A Yale-led study of this process in Wisconsin shows that a significant percentage of registered voters are incorrectly identified as having changed addresses, potentially endangering their right to vote.

The study, published in the journal Science Advances, found that at least 4% of people listed as suspected "movers" cast ballots in 2018 elections using addresses that were wrongly flagged as out of date. Minority voters were twice as likely as white voters to cast their ballot with their original address of registration after the state marked them as having moved, the study showed.

The findings suggest that states should more clearly communicate the processes they use to update voter-registration files and that a more robust effort is required to confirm whether individuals have moved before they are removed from the voter rolls, said Yale political scientist Gregory A. Huber, the study's lead author.

"The process of maintaining states' voter-registration files cries out for greater transparency," said Huber, the Forst Family Professor of Political Science in the Faculty of Arts & Sciences. "Our work shows that significant numbers of people are at risk of being disenfranchised, particularly those from minority groups.

"Unfortunately, we don't know enough about the process used to prune voter rolls nationwide to understand why mistakes occur and how to prevent them."

Regularly updating voter rolls prevents registration files from becoming bloated with individuals who have died, moved away, or are otherwise no longer eligible to vote. When these rolls swell with ineligible voters, it raises concerns about potential fraud (although there is little evidence it causes unlawful voting, Huber says) and creates headaches for political campaigns, which rely on accurate registration records to reach potential voters.

Americans are not obligated to inform local election officials when they move to a new address, but federal law mandates that states identify changes in residence among registered voters. To better accomplish this task, 30 states, including Wisconsin, and the District of Columbia have formed the Electronic Registration Information Center (ERIC), a non-profit organization that assists them in improving the accuracy of their voter rolls.

ERIC uses various administrative records, including motor vehicle data, change of address information from the U.S. Postal Service, and the Social Security Administration's master death file, to flag registrations that are potentially out of date. It provides states a "movers list" of people who likely have changed residences. The states contact listed individuals, often by sending them postcards they can use to confirm their address. If people do not return the postcards, their registration can be inactivated, starting the process for removal.
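
The list-maintenance logic described above can be sketched, purely for illustration, as a simple record-matching step; the file names, columns, and rules below are hypothetical stand-ins and do not reflect ERIC's actual data or schema.

```python
import pandas as pd

# Hypothetical sketch of the list-maintenance logic described above: flag registrants
# whose administrative-record address differs from their registered address, then
# inactivate those who never return the confirmation postcard. All names are invented.
rolls = pd.read_csv("voter_rolls.csv")             # columns: voter_id, registered_address
admin = pd.read_csv("admin_records.csv")           # columns: voter_id, latest_address
postcards = pd.read_csv("postcard_responses.csv")  # columns: voter_id, confirmed

merged = rolls.merge(admin, on="voter_id", how="left")
merged["flagged_mover"] = (
    merged["latest_address"].notna()
    & (merged["latest_address"] != merged["registered_address"])
)
merged = merged.merge(postcards, on="voter_id", how="left")
merged["inactivated"] = merged["flagged_mover"] & merged["confirmed"].isna()

print(merged["inactivated"].sum(), "registrations would be marked for removal")
```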

Federal privacy protections and ERIC's agreements with member states prohibit the organization from disclosing who is marked as having moved and on what basis they were flagged as such, making it difficult to examine its process. However, after submitting a Wisconsin Freedom of Information Act request, Huber and his co-authors obtained special "movers poll books" from the state which list all people who were marked as suspected movers and who did not respond to the postcard notification. Individuals in the books who showed up to vote in 2018 signed their names in these books, providing evidence that they voted at addresses that had been flagged as invalid.

The researchers collected movers poll books from a representative sample of election wards and matched their contents against voting records for 2018 local, state, and federal elections. They found that at least 9,000 people -- about 4% of those listed in the poll books -- voted in 2018 using the address of registration that ERIC had marked as invalid. Minority voters were twice as likely to be incorrectly identified as having moved.

The study likely undercounts the number of registered voters incorrectly listed as having moved, the researchers said, explaining that a significant number of people who did not respond to the postcard might have nonetheless renewed their voting registration before the poll books were published. In addition, the study examined low-turnout elections, making it likely that many people wrongly listed in the poll books weren't covered in the analysis because they didn't vote, Huber said.

The researchers are not suggesting that ERIC intentionally targeted minorities.

"There's no malice here," Huber said. "ERIC wants to help states, but relying on administrative records inevitably produces mistakes for any number of reasons. This makes the process used to validate having moved, such as mailed postcards, even more important. Without more information, we can't be certain why the process disparately affects minorities."

A potential reason for the disparity is that minorities are more likely than whites to live in apartment buildings and large households, which may increase the risk of errors in administrative records, the researchers suggest. In addition, residents of apartment buildings also may be less likely to confirm their address using the postcard since mail service can be spottier in multi-unit buildings than single-family homes.

Huber credits Wisconsin for taking steps to protect people's voting rights.

"The poll books are a great way to identify mistakes and prevent people from being disenfranchised," he said. "The state also has same day voter registration, which is another safety valve that doesn't exist in many states. We suggest that states expend more effort on contacting people at risk of losing their registration."

Credit: 
Yale University

Imaging space debris in high resolution

image: From left to right: Space debris modeled as a cluster of six reflective objects, an image developed of the debris without accounting for the objects' rotation, and an image developed after accounting for the objects' rotation. Accounting for the rotation produces a much clearer image.

Image: 
Figure courtesy of Matan Leibovich, George Papanicolaou, and Chrysoula Tsogka.

Litter is not only a problem on Earth. According to NASA, there are currently millions of pieces of space junk in the range of altitudes from 200 to 2,000 kilometers above the Earth's surface, which is known as low Earth orbit (LEO). Most of the junk consists of objects created by humans, like pieces of old spacecraft or defunct satellites. This space debris can reach speeds of up to 18,000 miles per hour, posing a major danger to the 2,612 satellites that currently operate at LEO. Without effective tools for tracking space debris, parts of LEO may even become too hazardous for satellites.

In a paper publishing today in the SIAM Journal on Imaging Sciences, Matan Leibovich (New York University), George Papanicolaou (Stanford University), and Chrysoula Tsogka (University of California, Merced) introduce a new method for taking high-resolution images of fast-moving and rotating objects in space, such as satellites or debris in LEO. They created an imaging process that first utilizes a novel algorithm to estimate the speed and angle at which an object in space is rotating, then applies those estimates to develop a high-resolution picture of the target.

Leibovich, Papanicolaou, and Tsogka used a theoretical model of a space imaging system to construct and test their imaging process. The model depicts a piece of fast-moving debris as a cluster of very small, highly reflective objects that represent the strongly reflective edges of an item in orbit, such as the solar panels on a satellite. The cluster of reflectors all move together with the same speed and direction and rotate about a common center. In the model, multiple sources of radiation on the Earth's surface--such as the ground control stations of global navigation satellite systems--emit pulses that are reflected by target pieces of space debris. A distributed set of receivers then detects and records the signals that bounce off the targets.

The model focuses on sources that produce radiation in the X-band, or from frequencies of 8 to 12 gigahertz. "It is well known that resolution can be improved by using higher frequencies, such as the X-band," Tsogka said. "Higher frequencies, however, also result in distortions to the image due to ambient fluctuations from atmospheric effects." Signals are distorted by turbulent air as they travel from the target to receivers, which can make the imaging of objects in LEO quite challenging. The first step of the authors' imaging process was thus to correlate the data taken at different receivers, which can help reduce the effects of these distortions.

The diameter of the area encompassed by the receivers is called the physical aperture of the imaging system -- in the model, this is about 200 kilometers. Under normal imaging conditions, the physical aperture's size determines the resolution of the resulting image; a larger aperture begets a sharper picture. However, the quick movement of the imaging target relative to the receivers can create an inverse synthetic aperture, in which the signals that were detected at multiple receivers as the target moved throughout their field of view are synthesized coherently. This configuration can effectively improve the resolution, as if the imaging system had a wider aperture than the physical one.
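
A rough sense of the numbers involved (a standard diffraction-limited estimate under assumed values, not the paper's resolution analysis): the achievable cross-range resolution scales as

$$\rho \approx \frac{\lambda L}{a},$$

where $\lambda$ is the wavelength, $L$ the range to the target, and $a$ the aperture. For an X-band wavelength of about 3 cm, a target assumed to be roughly 500 km away, and the model's 200 km physical aperture, this gives $\rho \approx 0.03 \times 5\times10^{5} / 2\times10^{5} \approx 7.5$ cm; a larger synthetic aperture shrinks this further.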

Objects in LEO can spin on timescales that range from a full rotation every few seconds to every few hundred seconds, which complicates the imaging process. It is thus important to know--or at least be able to estimate--some details about the rotation before developing the image. The authors therefore needed to estimate the parameters related to the object's rotation before synthesizing the data from different receivers. Though simply checking all of the possible parameters to see which ones yield the sharpest image is technically feasible, doing so would require a lot of computational power. Instead of employing this brute force approach, the authors developed a new algorithm that can analyze the imaging data to estimate the object's rotation speed and the direction of its axis.
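
For comparison, the brute-force baseline mentioned above, checking every candidate rotation and keeping the parameters that give the sharpest image, could look like the sketch below. The `form_image` function and the sharpness metric are hypothetical placeholders; the authors' actual estimation algorithm avoids this exhaustive search.

```python
import numpy as np

# Illustrative brute-force baseline: try every candidate rotation speed and axis
# angle, form an image for each, and keep the parameters giving the sharpest image.
# `form_image` and the focus metric are placeholders, not the paper's implementation.
def sharpness(image):
    """Simple normalized focus metric: larger values mean a more concentrated image."""
    power = np.sum(image ** 2)
    return np.sum(image ** 4) / (power ** 2)

def brute_force_rotation(data, form_image, rotation_speeds, axis_angles):
    best_score, best_params = -np.inf, None
    for omega in rotation_speeds:        # candidate rotation speeds
        for theta in axis_angles:        # candidate axis orientations
            image = form_image(data, omega, theta)
            score = sharpness(image)
            if score > best_score:
                best_score, best_params = score, (omega, theta)
    return best_params
```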

After accounting for the rotation, the next step in the authors' imaging process was to analyze the data to develop a picture of the space debris that would hopefully be as accurate and well-resolved as possible. One method that researchers often employ for this type of imaging of fast-moving objects is the single-point migration of cross correlations. Though atmospheric fluctuations do not usually significantly impair this technique, it does not have a very high resolution. A different, commonly-used imaging approach called Kirchhoff migration can achieve a high resolution, as it benefits from the inverse synthetic aperture configuration; however, the trade-off is that it is degraded by atmospheric fluctuations. With the goal of creating an imaging scheme that is not too heavily affected by atmospheric fluctuations but still maintains a high resolution, the authors proposed a third approach: an algorithm whose result they call a rank-1 image. "The introduction of the rank-1 image and its resolution analysis for fast-moving and rotating objects is the most novel part of this study," Leibovich said.

To compare the performance of the three imaging schemes, the authors gave simulated data of a rotating object in LEO to each one and compared the images that they produced. Excitingly, the rank-1 image was much more accurate and well-resolved than the result of single-point migration. It also had similar qualities to the output of the Kirchhoff migration technique. But this result was not entirely surprising, given the problem's configuration. "It is important to note that the rank-1 image benefits from the rotation of the object," Papanicolaou said. Though a rotating object generates more complex data, one can actually incorporate this additional information into the image processing technique to improve its resolution. Rotation at certain angles can also increase the size of the synthetic aperture, which significantly improves the resolution for the Kirchhoff migration and rank-1 images.

Further simulations revealed that the rank-1 image is not easily muddled by errors in the new algorithm for the estimation of rotation parameters. It is also more robust to atmospheric effects than the Kirchhoff migration image. If receivers capture data for a full rotation of the object, the rank-1 image can even achieve optimal imaging resolution. Due to its good performance, this new imaging method could improve the accuracy of imaging LEO satellites and space debris. "Overall, this study shed light on a new method for imaging fast-moving and rotating objects in space," Tsogka said. "This is of great importance for ensuring the safety of the LEO band, which is the backbone of global remote sensing."

Credit: 
Society for Industrial and Applied Mathematics

New sustainable building simulation method points to the future of design

ITHACA, N.Y. - A team from Cornell University's Environmental Systems Lab, led by recent graduate Allison Bernett, has put forth a new framework for injecting as much information as possible into the pre-design and early design phases of a project, potentially saving architects and design teams time and money down the road.

"(Our framework) allows designers to understand the full environmental impact of their building," said Bernett, corresponding author of "Sustainability Evaluation for Early Design (SEED) Framework for Energy Use, Embodied Carbon, Cost, and Daylighting Assessment" which published Jan. 10 in the Journal of Building Performance Simulation.

Principal investigators are Timur Dogan, assistant professor of architecture in the College of Architecture, Art and Planning; and Katharina Kral, a licensed architect and lecturer in the Department of Architecture.

"How we look at this is, there's the cost of change in the design process, and then the opportunity of impact," Dogan said. "In the very beginning, changing something doesn't cost anything, but if you're a month into the project, changing something is really expensive, because now you have to rehire consultants and redesign things.

"And then the other thing is the potential of impact," he said. "In the very beginning, just with a simple nudge in the right direction, you can change a project from being an energy hog to something that's very sustainable, and integrates well into the environment."

In 2018, according to the International Energy Agency, the construction sector accounted for 39% of energy and process-related greenhouse gas emissions. That included 11% originating from the manufacturing of building materials and products.

The Sustainability Evaluation for Early Design (SEED) Framework is a decision-making tool that can dynamically and concurrently simulate several variables: building energy performance; embodied carbon (carbon emissions generated by construction and materials); construction cost; and daylighting (the use of natural light to illuminate indoor spaces).

The framework will allow architects and design teams to rapidly trial and rank tens of thousands of design iterations, using as few as four inputs.
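
To make the idea concrete, the toy sketch below enumerates a few design variables and ranks every combination with a single placeholder score; the variables, surrogate model, and weights are invented for illustration and are not the published SEED framework or its simulation engines.

```python
from itertools import product

# Toy sketch of an early-design sweep in the spirit of SEED: enumerate combinations
# of a few design inputs, score each option on energy, embodied carbon, cost, and
# daylighting, and rank the options. Every number below is a placeholder.
window_wall_ratios = [0.2, 0.4, 0.6]
insulation_levels = [1, 2, 3]                      # arbitrary index: higher = more insulation
structure_types = ["steel", "timber", "concrete"]
glazing_types = ["double", "triple"]

def score(wwr, insulation, structure, glazing):
    """Lower is better; every coefficient here is illustrative, not real data."""
    energy = 100 - 10 * insulation + 20 * wwr                  # operational energy proxy
    carbon = {"steel": 80, "concrete": 70, "timber": 40}[structure] + 5 * insulation
    cost = 1000 + 150 * insulation + (200 if glazing == "triple" else 100)
    daylight = 50 + 60 * wwr                                   # higher is better
    return energy + carbon + 0.1 * cost - 0.5 * daylight

options = list(product(window_wall_ratios, insulation_levels, structure_types, glazing_types))
ranked = sorted(options, key=lambda option: score(*option))
print("most promising option:", ranked[0])
```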

Using publicly available data and a suite of available design simulation programs - including Rhino/Grasshopper (a CAD program); ClimateStudio, developed by Dogan, for daylight simulation and building energy modeling; and engineering software Karamba3D - Bernett and the team tested SEED in a case study of a hypothetical mid-sized office building modeled in Boston, Washington, D.C., and Phoenix.

The SEED Framework generated thousands of design options based on variables specific to the three cities in the case study, offering designers the flexibility of many options early in the process, before changing course would get too expensive.

"The idea is, you run this analysis," Dogan said, "and you get a few options that already make a lot of sense, and some options that you can completely forget about. ... [It] always comes down to this lack of information in the decision-making process.

"In that sense, the construction industry is super inefficient," he said. "There's too many players who don't know the full picture and then make decisions that are not always rational. This framework that Allison worked on is geared to help bring the information to the table. Every stakeholder in the design process can then form their own opinion about design goal priorities."

SEED's greatest asset, Bernett said, is amassing a tranche of data on multiple factors in one place, and involving architects early in the design and pre-design phases.

"It takes a lot of time to gather all that data, and we have that prepackaged. So there's definitely a hunger for that," said Bernett, who presented the SEED Framework in September 2019 at the International Building Performance Simulation Conference, in Rome.

"Right now, we rely heavily on energy modelers and consultants to do this work," she said. "And if we can involve architects more readily and more early on, I think that we're going to see a lot of improvement and cost-effectiveness to these early design decisions."

In addition to the publicly available design simulations, the team used AutoFrame, a new procedure developed by Kral for automatically computing structural systems. AutoFrame helps improve the precision of embodied carbon assessments and daylight simulations.

The Cornell Atkinson Center for Sustainability's Small Grants Program provided pivotal support for this work, Bernett said.

"That funding really gave it the push it needed," she said. "It allowed me to present a first iteration [of SEED] at the conference in Rome, and then to really flesh out the research more after that."

Credit: 
Cornell University

Scientists use Doppler to peer inside cells

image: David Nolte works with the Doppler apparatus to peer inside living cells, giving him insight into intracellular activity, metabolism, and pathogenicity

Image: 
Purdue University photo/Rebecca McElhoe

WEST LAFAYETTE, Ind. -- Doppler radar improves lives by peeking inside air masses to predict the weather. A Purdue University team is using similar technology to look inside living cells, introducing a method to detect pathogens and treat infections in ways that scientists never have before.

In a new study, the team used Doppler to sneak a peek inside cells and track their metabolic activity in real time, without having to wait for cultures to grow. Using this ability, the researchers can test microbes found in food, water, and other environments to see if they are pathogens, or help them identify the right medicine to treat antibiotic-resistant bacteria.

David Nolte, Purdue's Edward M. Purcell Distinguished Professor of Physics and Astronomy; John Turek, professor of basic medical sciences; Eduardo Ximenes, research scientist in the Department of Agricultural and Biological Engineering; and Michael Ladisch, Distinguished Professor of Agricultural and Biological Engineering, adapted this technique from their previous study on cancer cells in a paper released this month in Communications Biology.

Using funding from the National Science Foundation as well as Purdue's Discovery Park Big Idea Challenge, the team worked with immortalized cell lines -- cells that will live forever unless you kill them. They exposed the cells to different known pathogens, in this case salmonella and E. coli. They then used the Doppler effect to spy out how the cells reacted. These living cells are called "sentinels," and observing their reactions is called a biodynamic assay.

"First we did biodynamic imaging applied to cancer, and now we're applying it to other kinds cells," Nolte said. "This research is unique. No one else is doing anything like it. That's why it's so intriguing."

This strategy is broadly applicable when scientists have isolated an unknown microbe and want to know if it is pathogenic -- harmful to living tissues -- or not. Such cells may show up in the food supply, water sources or even in recently melted glaciers.

"This directly measures whether a cell is pathogenic," Ladisch said. "If the cells are not pathogenic, the Doppler signal doesn't change. If they are, the Doppler signal changes quite significantly. Then you can use other methods to identify what the pathogen is. This is a quick way to tell friend from foe."

Being able to quickly discern whether a cell is harmful is incredibly helpful in situations where people encounter an unknown living microorganism, allowing scientists to know what precautions to take. Once it is known that a microbe is harmful, they can begin established protocols to identify the specific cell and select an effective antibiotic against the microorganism.

Another benefit is the ability to quickly and directly diagnose which bacteria respond to which antibiotics. Antibiotic resistance can be a devastating problem in hospitals and other environments where individuals with already compromised bodies and immune systems may be exposed to and infected by increasing numbers of antibiotic-resistant bacteria. Sometimes this results in a potentially fatal condition called bacterial sepsis, or septicemia. This is different from the viral sepsis that has been discussed in connection with COVID-19, though the scientists say their next steps will include investigating viral sepsis.

Treating sepsis is challenging. Giving the patient broad-spectrum antibiotics, which sounds like a good idea, might not help and could make the situation worse for the next patient. Letting bacteria come into close contact with antibiotics that do not kill them only makes them more resistant to that antibiotic and more difficult to fight next time.

Culturing the patient's tissues and homing in on the correct antibiotic to use can take time the patient does not have, usually eight to 10 hours. This new biodynamic process allows scientists to put the patient's bacterial samples in an array of tiny petri dishes containing the tissue sentinels and treat each sample with a different antibiotic. Using Doppler, they can quickly notice which bacterial samples have dramatic metabolic changes. The samples that do are the ones that have reacted to the antibiotic -- the bacteria are dying, being defeated and beaten back by antibiotics.
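
Conceptually, the readout comes down to comparing each well's Doppler fluctuation spectrum before and after an antibiotic is added, and flagging the wells whose spectra shift strongly. The Python sketch below is a hypothetical illustration of that decision step only; the spectra, the spectral-change metric, and the threshold are invented for illustration and do not reproduce the published assay.

    import numpy as np

    # Hypothetical Doppler fluctuation spectra (power vs. frequency) for sentinel
    # tissue wells, recorded before and after adding a different antibiotic to each.
    # In practice these spectra would come from the imaging instrument.

    def spectral_change(before, after):
        """Relative L1 difference between two normalised spectra (illustrative metric)."""
        b = before / before.sum()
        a = after / after.sum()
        return float(np.abs(a - b).sum())

    rng = np.random.default_rng(0)
    freqs = np.linspace(0.01, 10, 200)              # Hz, assumed Doppler band
    baseline = 1.0 / (1.0 + (freqs / 1.0) ** 2)     # toy Lorentzian-like spectrum

    wells = {
        "ampicillin":    baseline * (1.0 + 0.02 * rng.standard_normal(freqs.size)),  # no response
        "ciprofloxacin": 1.0 / (1.0 + (freqs / 0.3) ** 2),                           # strong slowdown
    }

    THRESHOLD = 0.1   # invented cutoff for calling a "dramatic" metabolic change
    for drug, after in wells.items():
        change = spectral_change(baseline, after)
        verdict = "sentinels responding (candidate antibiotic)" if change > THRESHOLD else "no clear response"
        print(f"{drug:>13}: change = {change:.3f} -> {verdict}")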

"When we treat with antibiotics, the bacteria don't have to multiply much before they start to affect the tissue sentinels," Nolte explained. "There are still too few bacteria to see or to measure directly, but they start to affect how the tissues behaves, which we can detect with Doppler."

In less than half the time a traditional culture and diagnosis takes, doctors could tell which antibiotic to administer, bolstering the patient's chances for recovery. The researchers worked closely with the Purdue Research Foundation Office of Technology Commercialization to patent and license their technologies. They plan to further explore whether this method would work for tissue samples exposed to nonliving pathogenic cells or dried spores, and to test for and treat viral sepsis.

Credit: 
Purdue University

Nanomedicine activation profile determines efficacy depending on tumor c-Myc expression

image: Fig. 1: Drug release profiles depend on the linker used in the block copolymers of the nano-micelles. FR-JQ1H/m, with an aliphatic aldehyde linker, releases the drug quickly as acidity gradually increases; SR-JQ1H/m, with an aromatic aldehyde linker, releases it slowly. The figure also shows the drug release curves of the two micelles as a function of tumor-tissue pH.

Image: 
2021 Innovation Center of NanoMedicine

February 26, 2021 - Kawasaki, Japan: The Innovation Center of NanoMedicine (Director General: Prof. Kazunori Kataoka; Location: Kawasaki, Japan; Abbreviation: iCONM), together with the group of Prof. Yu Matsumoto in Otorhinolaryngology and Head and Neck Surgery (Prof. Tatsuya Yamasoba) and the group of Prof. Horacio Cabral in the Department of Bioengineering (Prof. Ryo Miyake) at the University of Tokyo, reported in ACS Nano (Impact Factor: 14.588 in 2019) that it has developed polymeric nano-micelles whose efficacy depends, according to their drug activation profiles, on the expression level of c-Myc, a major proto-oncogene. See: https://pubs.acs.org/doi/10.1021/acsnano.1c00364

c-Myc is known to be involved in cancer cell proliferation and angiogenesis: it alters the cell cycle, suppresses normal cell differentiation, and promotes cancer metastasis. It is a typical proto-oncogene that regulates many growth factor-related genes and is implicated in the development of many cancers, for example through chromosomal translocation in Burkitt lymphoma. Drug discovery research is therefore being conducted worldwide on anticancer drugs that target this transcription factor and can directly attack cancer stem cells. However, since c-Myc knockout mice are embryonically lethal, c-Myc is considered an essential gene for living cells, and selective delivery to cancer tissue is a key requirement for developing its inhibitors. c-Myc is also known as one of the factors needed for the initial induction of iPS cells, so c-Myc inhibition may in the future also be applied to suppress carcinogenesis derived from iPS cells.

In this study, JQ1H, a structural analogue of JQ1 (a typical indirect c-Myc inhibitor), was encapsulated inside functional nano-micelles and the micelles' efficacy was evaluated. JQ1 binds to a bromodomain protein called BRD4, which is involved in activating the RNA polymerase II that drives c-Myc expression, and strongly inhibits this pathway. As a result, RNA polymerase activity is weakened and c-Myc expression is down-regulated. Although JQ1 was expected to be a promising epigenome drug because of its strong inhibition of gene expression, it has an extremely short half-life in vivo owing to fast kidney excretion and rapid clearance after administration, and it is almost insoluble in water. These properties were major obstacles to developing it as an effective drug. The polymeric nano-micelles developed for anticancer therapy at the Innovation Center of NanoMedicine (iCONM) provide (1) stabilization of encapsulated drugs, (2) suppression of kidney excretion, (3) tumor accumulation via the EPR effect (selective drug delivery to cancer tissue), and (4) drug release triggered by tumor acidosis. In the present work, JQ1H-loaded nano-micelles showed good antitumor activity in mice transplanted with tongue cancer, melanoma, and pancreatic cancer.

After systemic administration, nano-micelles containing JQ1H leak from blood vessels into tumor tissue through the so-called EPR effect. Tumor tissue is rich in lactic acid because of its enhanced glycolysis and is therefore more acidic than normal tissue. In this work, two types of nano-micelle were prepared: one in which hydrophobic JQ1H was linked to an amphiphilic block copolymer, composed of a hydrophilic polyethylene glycol block and a hydrophobic poly(amino acid) block, via a 3-aminopropionaldehyde (aliphatic aldehyde) linker, and another in which JQ1H was linked to the polymer via a p-aminomethylbenzaldehyde (aromatic aldehyde) linker. The amphiphilic block copolymer was synthesized and used as the base material for the nano-micelles (Fig. 1). When it self-assembled in water into a micellar structure and was administered to cancer-bearing mice, the antitumor activity shown in Fig. 2 was achieved. The drug release pattern varies greatly with acidity depending on whether the linker is an aliphatic or an aromatic aldehyde: the former releases the drug rapidly and the latter slowly, so the former nano-medicine was named FR-JQ1H/m and the latter SR-JQ1H/m. The antitumor activity of these nano-micelles differs greatly with the expression level of c-Myc: FR-JQ1H/m is more effective for tumors with high c-Myc expression, while SR-JQ1H/m is more effective for tumors with low c-Myc expression.
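
One simple way to picture the two activation profiles is to treat linker cleavage as first-order hydrolysis whose rate increases as the pH drops from blood (about 7.4) toward tumor acidity. The Python sketch below is a minimal, hypothetical model of that idea; the rate constants, the pH sensitivity, the assumed tumor pH of 6.7, and the six-hour time point are all invented to illustrate the fast-release versus slow-release contrast and do not reproduce the measured curves in Fig. 1.

    import math

    # Toy first-order release model (illustrative only): fraction of drug released
    # after t hours at a given pH. Rate constants and pH dependence are invented.

    def release_fraction(t_hours, pH, k_ref, sensitivity):
        """First-order release whose rate grows as pH falls below blood pH (7.4)."""
        k = k_ref * 10 ** (sensitivity * (7.4 - pH))   # acid-accelerated linker cleavage
        return 1.0 - math.exp(-k * t_hours)

    for pH in (7.4, 6.7):                              # blood vs. assumed tumor acidity
        fast = release_fraction(6, pH, k_ref=0.02,  sensitivity=1.5)  # FR-like (aliphatic linker)
        slow = release_fraction(6, pH, k_ref=0.003, sensitivity=1.5)  # SR-like (aromatic linker)
        print(f"pH {pH}: fast-release micelle {fast:.0%}, slow-release micelle {slow:.0%} after 6 h")

With these placeholder parameters both micelles stay largely intact at blood pH, while at tumor pH the fast-release micelle unloads most of its cargo and the slow-release micelle does so gradually, mirroring the qualitative contrast described above.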

We believe that, in the future, selecting nano-micelles according to the expression level of biomarkers will be an important step toward realizing personalized medicine and in-body hospitals.

Credit: 
Innovation Center of NanoMedicine