Tech

Researchers surprised to find bacterial parasites behind rise of 'super bugs'

image: Vaughn Cooper, professor of microbiology and molecular genetics, University of Pittsburgh School of Medicine.

Image: 
Vaughn Cooper

PITTSBURGH, July 16, 2021 - For the first time ever, researchers from the University of Pittsburgh School of Medicine discovered that phages--tiny viruses that attack bacteria--are key to initiating rapid bacterial evolution leading to the emergence of treatment-resistant "superbugs." The findings were published today in Science Advances.

The researchers showed that, contrary to a dominant theory in the field of evolutionary microbiology, the process of adaptation and diversification in bacterial colonies doesn't start from a homogenous clonal population. They were shocked to discover that the cause of much of the early adaptation wasn't random point mutations. Instead, they found that phages, which we normally think of as bacterial parasites, are what gave the winning strains the evolutionary advantage early on.

"Essentially, a parasite became a weapon," said senior author Vaughn Cooper, Ph.D., professor of microbiology and molecular genetics at Pitt. "Phages endowed the victors with the means of winning. What killed off more sensitive bugs gave the advantage to others."

When it comes to bacteria, a careful observer can track evolution in the span of a few days. Because of how quickly bacteria grow, it only takes days for bacterial strains to acquire new traits or develop resistance to antimicrobial drugs.

The researchers liken the way bacterial infections present in the clinic to a movie played from the middle. Just as late-arriving moviegoers struggle to mentally reconstruct the events that led to the scene unfolding in front of their eyes, physicians are forced to make treatment decisions based on a static snapshot taken when a patient presents at a hospital. And just like at a movie theater, there is no way to rewind the film and check whether their guess about the plot or the origin of the infection was right or wrong.

The new study shows that bacterial and phage evolution often go hand in hand, especially in the early stages of bacterial infection. This is a multilayered process in which phages and bacteria are joined in a chaotic dance, constantly interacting and co-evolving.

When the scientists tracked changes in genetic sequences of six bacterial strains in a skin wound infection in pigs, they found that jumping of phages from one bacterial host to another was rampant--even clones that didn't gain an evolutionary advantage had phages incorporated in their genomes. Most clones had more than one phage integrated in their genetic material--often there were two, three or even four phages in one bug.

"It showed us just how much phages interact with one another and with new hosts," said Cooper. "Characterizing diversity in early bacterial infections can allow us to reconstruct history and retrace complex paths of evolution to a clinical advantage. And, with growing interest in using phages to treat highly resistant infections, we are learning how to harness their potency for good."

Credit: 
University of Pittsburgh

The paradox of a free-electron laser without the laser

A new way of producing coherent light in the ultraviolet spectral region, which points the way to developing brilliant table-top X-ray sources, has been demonstrated in research led by the University of Strathclyde.

The scientists have developed a type of ultra-short wavelength coherent light source that does not require laser action to produce coherence. Common electron-beam based light sources, known as fourth-generation light sources, are based on the free-electron laser (FEL), which uses an undulator to convert electron beam energy into X-rays.

Coherent light sources are powerful tools that enable research in many areas of medicine, biology, material sciences, chemistry and physics.

This new way of producing coherent radiation could revolutionise light sources, as it would make them highly compact, essentially table-top size, and capable of producing ultra-short duration pulses of light, much shorter than can be produced easily by any other means.

Making ultraviolet and X-ray coherent light sources more widely available would transform the way science is done; a university could have one of the devices in a single room, on a table top, for a reasonable price.

The group is now planning a proof-of-principle experiment in the ultraviolet spectral range to demonstrate this new way of producing coherent light. If successful, it should dramatically accelerate the development of even shorter wavelength coherent sources based on the same principle. The Strathclyde group has set up a facility to investigate these types of sources: the Scottish Centre for the Application of Plasma-based Accelerators (SCAPA), which hosts one of the highest power lasers in the UK.

The new research has been published in Scientific Reports, one of the Nature family of journals.

Professor Dino Jaroszynski, of Strathclyde's Department of Physics, led the research. He said: "This work significantly advances the state-of-the-art of synchrotron sources by proposing a new method of producing short-wavelength coherent radiation, using a short undulator and attosecond duration electron bunches.

"This is more compact and less demanding on the electron beam quality than free-electron lasers and could provide a paradigm shift in light sources, which would stimulate a new direction of research. It proposes to use bunch compression - as in chirped pulse amplification lasers - within the undulator to significantly enhance the radiation brightness.

"The new method presented would be of wide interest to a diverse community developing and using light sources."

In FELs, as in all lasers, the intensity of light is amplified by a feedback mechanism that locks the phases of individual radiators, which in this case are "free" electrons. In the FEL, this is achieved by passing a high energy electron beam through the undulator, which is an array of alternating polarity magnets.

Light emitted from the electrons as they wiggle through the undulator creates what is known as the ponderomotive force, which bunches the electrons - some are slowed down and some are sped up - similar to traffic on a motorway periodically slowing and speeding up.

Electrons passing through the undulator radiate incoherent light if they are uniformly distributed - for every electron that emits light, there is another electron that partially cancels out the light because they radiate out of phase. An analogy of this partial cancelling out is rain on the sea: it produces many small ripples that partially cancel each other out, effectively quelling the waves - reducing their amplitude. In contrast, steady or pulsating wind will cause the waves to amplify through the mutual interaction of the wind with the sea.

In the FEL, electron bunching causes amplification of the light and the increase in its coherence, which usually takes a long time - thus very long undulators are required. In an X-ray FEL, the undulators can be more than a hundred metres long. The accelerators driving these X-ray FELs are kilometres long, which makes these devices very expensive and some of the largest instruments in the world.

However, using a free-electron laser to produce coherent radiation is not the only way; a "pre-bunched" beam or ultra-short electron bunch can also be used to achieve exactly the same coherence in a very short undulator that is less than a metre in length. As long as the electron bunch is shorter than the wavelength of the light produced by the undulator, it will automatically produce coherent light - all the light waves add up, or interfere constructively, which leads to very brilliant light with exactly the same properties as light from a laser.
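To see why a short bunch radiates coherently, one can sum the light phasors of N electrons directly. The short Python sketch below is our own illustration, with arbitrarily chosen numbers rather than values from the study: when the electrons are spread over many wavelengths the summed intensity grows only as N, but when the bunch is much shorter than the wavelength the fields add constructively and the intensity grows as N squared.

    import numpy as np

    # Toy model: N point electrons, each contributing a phasor
    # exp(2j*pi*z/lam); the radiated intensity is |sum of phasors|^2.
    rng = np.random.default_rng(0)
    N = 10_000
    lam = 100e-9  # illustrative 100 nm radiation wavelength

    # Electrons spread over many wavelengths -> random phases -> incoherent
    z_long = rng.uniform(0, 1000 * lam, N)
    I_incoh = abs(np.exp(2j * np.pi * z_long / lam).sum()) ** 2

    # Bunch much shorter than the wavelength -> aligned phases -> coherent
    z_short = rng.uniform(0, 0.05 * lam, N)
    I_coh = abs(np.exp(2j * np.pi * z_short / lam).sum()) ** 2

    print(f"incoherent intensity ~ N:   {I_incoh:.2e}")  # of order 1e4
    print(f"coherent intensity  ~ N^2:  {I_coh:.2e}")    # of order 1e8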

The researchers have demonstrated theoretically that this can be achieved using a laser-plasma wakefield accelerator, which produces electron bunches that can have a length of a few tens of nanometres. They show that if these ultra-short bunches of high energy electrons pass through a short undulator, they can produce as many photons as a very expensive FEL. Moreover, they have also shown that by producing an electron bunch that has an energy "chirp", they can ballistically compress the bunch to a very short duration inside the undulator, which provides a unique way of reaching even shorter electron bunches and therefore producing even shorter wavelength light.

Credit: 
University of Strathclyde

New optimisation method for computational design of industrial applications

image: New optimisation method for computational design of industrial applications

Image: 
University of Malaga

In the field of industrial engineering, using simulations to model, predict and even optimise the response of a system or device is widespread, as it is less expensive, less complex and sometimes less dangerous than fabricating and testing several prototypes.

These simulation studies use numerical methods that, depending on the problem to be addressed - for example, reducing the aerodynamic forces on an aircraft by changing its shape, or using the minimum possible amount of material in load-bearing elements without them breaking - require simulating a wide variety of possible parameter combinations, which entails high computational costs.

Researchers Francisco Javier Granados Ortiz and Joaquín Ortega Casanova, from the School of Industrial Engineering of the University of Malaga, have taken this a step further by developing a novel computational design optimisation method that uses artificial intelligence to reduce these simulation costs.

Faster and cost-efficient designs

They have developed a new methodology with Machine Learning algorithms to predict whether a combination of the design parameters of a problem will be useful or not, based on the objective pursued, and thus guide the design process.

"This method enables us to obtain faster optimised designs by discarding simulations of little or no interest, thus saving not only physical prototype fabrication costs, but also those related to simulation", explain the researchers of the Area of Fluid Mechanics.

Credit: 
University of Malaga

New discoveries and insights into the glass transition

image: DSC traces of the La(Ce)NiAl system; the arrows indicate the calorimetric glass-transition temperature (Tg) (left). Temperature dependence of the loss modulus of the La(Ce)NiAl system, normalized by the maximum peak value; the arrows indicate the α-relaxation temperature (Tα) (right).

Image: 
Kato laboratory, IMR, Tohoku University

A collaborative group from Tohoku University and Johns Hopkins University has provided valuable insights into the glass transition.

When a liquid is cooled rapidly, it gains viscosity and eventually becomes a rigid solid glass. The point at which it does so is known as the glass transition.

But the exact physics behind the glass transition, and the nature of glass in general, still pose many questions for scientists.

Metallic Glasses (MGs) are highly sought after since they combine the flexibility of plastic with the strength of steel. They are amorphous materials with a disordered atomic structure and exhibit unique and divergent thermodynamic and dynamic characteristics, especially when approaching the glass-transition temperature.

The glass transition in MGs is usually determined by calorimetric and dynamical measurements. The calorimetric glass transition detects the temperature at which the specific heat shows an abrupt jump, whereas the dynamical transition looks at the diverse relaxation responses that emerge with increasing temperature.

Generally, the calorimetric glass-transition temperature follows the same trend as the dynamic α-relaxation temperature.

However, the collaborative group discovered that high configuration entropy significantly influences the glass transition of MGs and leads to the decoupling between calorimetric and dynamical glass transitions of high entropy metallic glasses.

The results of their research were published in the journal Nature Communications on June 22, 2021.

Their study presents a new glass-forming system that uses high configurational entropy, named high entropy metallic glasses (HEMGs).

The group featured Specially Appointed Professor Jing Jiang and Professor Hidemi Kato from the Institute for Materials Research at Tohoku University and Professor Mingwei Chen from Johns Hopkins University.

"We are excited about this discovery and believe this work furthers our understanding of the fundamental mechanism behind the glass transition," said members of the research group.

Credit: 
Tohoku University

Non-genetic photoacoustic stimulation of single neurons by a tapered fiber optoacoustic emitter

image: a, Schematic of TFOE enabling single neuron stimulation. A 3-nanosecond pulsed laser is delivered into the TFOE to generate an acoustic signal via the photoacoustic effect. b, Fabrication of TFOE. A multiwall CNT/PDMS mixture serving as the coating material is cast on a metal mesh, followed by a punch-through method to coat the tapered fiber tip. Optical images show that the TFOE tip has an overall diameter of 20 microns. c, Characterization of the pressure generated by TFOE and successful stimulation shown by calcium imaging. Left: acoustic pressure generated as a function of the distance from the TFOE tip. Middle left: TFOE-induced stimulation of a GCaMP6f-expressing single neuron. Middle right: TFOE selective stimulation of the axon (red) and dendrites (yellow and green) of a multipolar neuron. Right: TFOE integrated with whole-cell patch clamp. Excitatory and inhibitory neurons are genetically coded in dark and red.

Image: 
by Linli Shi, Ying Jiang, Fernando R. Fernandez, Guo Chen, Lu Lan, Heng-ye Man, John A. White, Ji-Xin Cheng, Chen Yang

Neuromodulation at high spatial resolution has been an invaluable approach for treating neurological diseases and advancing fundamental knowledge in the field of neuroscience, as firing of a small population or even single neurons can specifically alter animal behavior or brain state. Optogenetics is a powerful method capable of modulating population neural activity in rodents, yet its requirement for viral transfection limits its applications in nonhuman primates and humans. As a rapidly growing modality, focused ultrasound has been harnessed in a myriad of brain neuromodulation applications. However, conventional piezo-based transducers offer a spatial resolution of several millimeters. It is also challenging to directly measure electrophysiological response of cells under ultrasound stimulation using whole-cell patch-clamp electrophysiology, which is the gold standard technique for high-fidelity analysis of the biophysical mechanisms of neuromodulation. New strategies with essential capabilities, including single and subcellular precision and integration of single cell electrophysiology recording, are still sought to enable the understanding of mechanical stimulation at the single cell level and to offer high precision for potential clinical applications.

In a new paper published in Light: Science & Applications, a team of scientists led by Professors Chen Yang and Ji-Xin Cheng from Boston University has developed a tapered fiber optoacoustic emitter (TFOE), which exploits the optoacoustic effect and generates an acoustic field localized within 40 μm, enabling photoacoustic neural stimulation at the single-cell and subcellular level. The significant advances of the TFOE in both spatial resolution and optoacoustic conversion efficiency were achieved through fiber engineering, material modification and a new deposition method. Spatially, the team demonstrated acoustic stimulation with unprecedented precision. Temporally, a single sub-microsecond acoustic pulse generated by the TFOE successfully activated neurons - the shortest acoustic stimulus so far reported for successful neuromodulation. Importantly, the near-field acoustic wave generated by the TFOE allowed optoacoustic stimulation while simultaneously monitoring the cell's response using whole-cell patch clamp recording. These studies revealed cell-type-specific responses of excitatory and inhibitory neurons to acoustic stimulation.

These findings show the exciting potential of TFOE as a platform technology for non-genetic stimulation of the neural system with high spatial and temporal precision, opening up many new research opportunities. For example, by unveiling the cell-type-specific thresholds of excitatory and inhibitory neurons to acoustic stimulation, different acoustic pressures and durations can be applied to achieve cell-type selectivity across brain regions at multiple scales. Meanwhile, the sub-microsecond acoustic pulses can be further fine-tuned to shape the temporal profile of the stimulus, which will allow controlling neural activity patterns to mimic natural neural codes. Furthermore, acoustic stimulation of neurons with pharmacologically or genetically modified ion channels, integrated with patch clamp, provides new insight into the electrophysiological mechanisms of mechanical neuromodulation. Without any metal components, the TFOE is immune to electromagnetic interference and is compatible with functional magnetic resonance imaging (fMRI), which holds promise for future studies toward understanding behavior and disease in human patients. Given the increasing popularity of ultrasound neuromodulation, the compactness, cost-effectiveness and versatility of the TFOE open broad opportunities to utilize the optoacoustic effect in the field of neuroscience, the scientists forecast.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

Future information technologies: Topological materials for ultrafast spintronics

image: Snapshots of the electronic structure of Sb acquired with femtosecond time-resolution. Note the changing spectral weight above the Fermi energy (EF).

Image: 
HZB/Nature Communication Physics (2021)

The laws of quantum physics rule the microcosm. They determine, for example, how easily electrons move through a crystal and thus whether the material is a metal, a semiconductor or an insulator. Quantum physics may lead to exotic properties in certain materials: In so-called topological insulators, only the electrons that can occupy some specific quantum states are free to move like massless particles on the surface, while this mobility is completely absent for electrons in the bulk. What's more, the conduction electrons in the "skin" of the material are necessarily spin polarized, and form robust, metallic surface states that could be utilized as channels in which to drive pure spin currents on femtosecond time scales (1 fs = 10⁻¹⁵ s).

These properties open up exciting opportunities to develop new information technologies based on topological materials, such as ultrafast spintronics, by exploiting the spin of the electrons on their surfaces rather than the charge. In particular, optical excitation by femtosecond laser pulses in these materials represents a promising alternative for realizing highly efficient, lossless transfer of spin information. Spintronic devices utilizing these properties have the potential for superior performance, as they would allow the speed of information transport to be increased up to frequencies a thousand times higher than in modern electronics.

However, many questions still need to be answered before spintronic devices can be developed - for example, exactly how the bulk and surface electrons of a topological material respond to the external stimulus, i.e. the laser pulse, and to what degree their collective behaviors overlap on ultrashort time scales.

A team led by HZB physicist Dr. Jaime Sánchez-Barriga has now brought new insights into such mechanisms. The team, which has also established a Helmholtz-RSF Joint Research Group in collaboration with colleagues from Lomonosov State University, Moscow, examined single crystals of elemental antimony (Sb), previously suggested to be a topological material. "It is a good strategy to study interesting physics in a simple system, because that's where we can hope to understand the fundamental principles," Sánchez-Barriga explains. "The experimental verification of the topological property of this material required us to directly observe its electronic structure in a highly excited state with time, spin, energy and momentum resolutions, and in this way we accessed unusual electron dynamics," adds Sánchez-Barriga.

The aim was to understand how fast excited electrons in the bulk and on the surface of Sb react to the external energy input, and to explore the mechanisms governing their response. "By controlling the time delay between the initial laser excitation and the second pulse that allows us to probe the electronic structure, we were able to build up a full time-resolved picture of how excited states leave and return to equilibrium on ultrafast time scales. The unique combination of time and spin-resolved capabilities also allowed us to directly probe the spin-polarization of excited states far out-of-equilibrium", says Dr. Oliver J. Clark.

The data show a "kink" structure in transiently occupied energy-momentum dispersion of surface states, which can be interpreted as an increase in effective electron mass. The authors were able to show that this mass enhancement plays a decisive role in determining the complex interplay in the dynamical behaviors of electrons from the bulk and the surface, also depending on their spin, following the ultrafast optical excitation.

"Our research reveals which essential properties of this class of materials are the key to systematically control the relevant time scales in which lossless spin-polarised currents could be generated and manipulated," explains Sánchez-Barriga. These are important steps on the way to spintronic devices which based on topological materials possess advanced functionalities for ultrafast information processing.

Credit: 
Helmholtz-Zentrum Berlin für Materialien und Energie

Simplified method for calibrating optical tweezers

image: A microparticle held with optical tweezers in the microscope. Inset: Illustration of the held particle (magnified); shown in red is the light of the infrared laser used.

Image: 
Pascal Runde

Measurements of biomechanical properties inside living cells require minimally invasive methods. Optical tweezers are particularly attractive as a tool: they use the momentum of light to trap and manipulate micro- or nanoscale particles. A team of researchers led by Prof. Dr. Cornelia Denz from the University of Münster (Germany) has now developed a simplified method to perform the necessary calibration of optical tweezers in the system under investigation. Scientists from the University of Pavia in Italy were also involved. The results of the study have been published in the journal Scientific Reports.

The calibration ensures that measurements of different samples and with different devices are comparable. One of the most promising techniques for calibrating optical tweezers in a viscoelastic medium is the so-called active-passive calibration. This involves determining the deformability of the sample under investigation and the force of the optical tweezers. The research team has now further improved this method so that the measurement time is reduced to just a few seconds. The optimized method thus offers the possibility of characterizing dynamic processes of living cells. These cannot be studied with longer measurements because the cells reorganize themselves during the measurement and change their properties. In addition, the shortening of the measurement time also helps to reduce the risk of damage to the biological samples due to light-induced heating.

In simplified terms, the underlying calibration procedure works as follows: the micro- or nanometer-sized particles are embedded in a viscoelastic sample held on the stage of a microscope. Rapid and precise nanometer-scale displacements of the specimen stage cause the optically trapped particle to oscillate. By measuring the refracted laser light, changes in the trapped particle's position can be recorded, and in this way conclusions can be drawn about the sample's properties, such as stiffness. This is usually done sequentially at different oscillation frequencies. The team led by Cornelia Denz and Randhir Kumar, a doctoral student in the Münster research group, now performed the measurement at several frequencies simultaneously, covering a wide frequency range. This multi-frequency method shortens the measurement time to a few seconds. The scientists used solutions of methyl cellulose in water at different concentrations as samples; these have a viscoelasticity similar to that of living cells.
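A toy numerical sketch of the multi-frequency idea (our own illustration with made-up numbers, not the group's code): the stage is driven with a sum of sinusoids, and a single Fourier transform of one short recording yields the trapped particle's response at every drive frequency at once, instead of one sequential measurement per frequency.

    import numpy as np

    fs = 50_000                      # sampling rate in Hz (illustrative)
    t = np.arange(0, 2.0, 1 / fs)   # a few seconds of data suffice
    drive_freqs = [10, 32, 104, 331, 1057]  # Hz, spanning a wide range

    # Simulated stage drive and a toy particle response
    # (attenuated, phase-lagged, plus detection noise)
    stage = sum(np.sin(2 * np.pi * f * t) for f in drive_freqs)
    response = sum(np.sin(2 * np.pi * f * t - 0.3) / (1 + f / 200)
                   for f in drive_freqs)
    response += 0.1 * np.random.default_rng(1).standard_normal(t.size)

    # One FFT per signal gives the transfer function at all drive frequencies
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    spec_in, spec_out = np.fft.rfft(stage), np.fft.rfft(response)
    for f in drive_freqs:
        k = np.argmin(abs(freqs - f))
        print(f"{f:5d} Hz: |response/drive| = {abs(spec_out[k] / spec_in[k]):.3f}")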

Background: Biomechanical properties such as stiffness, viscosity and viscoelasticity of living cells and tissues play a crucial role in many vital cellular functions such as cell division, cell migration, cell differentiation and tissue patterning. These properties of living cells could also serve as indicators of disease progression. For example, the onset and development of cancer is typically accompanied by changes in cell stiffness, viscosity, and viscoelasticity.

Credit: 
University of Münster

Individual protected areas in Amazonia differ greatly in how effectively they help to fight deforestation and carbon emissions

image: Individual protected areas showed substantial variation in their impact, i.e. estimated amount of prevented deforestation. The average impact over all areas, and also within each protection category, was positive.

Image: 
Teemu Koskimäki et al.

As tropical forests remain threatened and their future is uncertain, it is increasingly important to understand how well individual protected areas prevent deforestation. Researchers from the University of Turku and the University of Helsinki, Finland, have investigated this question in a newly published study that focuses on the State of Acre in Brazilian Amazonia.

Tropical forests are unique environments that have huge species diversity and also act as important reservoirs of organic carbon, thereby counteracting climate change. However, their area is diminishing due to deforestation, which gives reason to worry both about the survival of their biodiversity and about the increasing carbon emissions. To help to optimise conservation efforts, it is important to understand how well conservation areas succeed in safeguarding tropical forests.

A group of researchers from the Amazon Research Team of the University of Turku and from the University of Helsinki have now compared deforestation rates between protected and environmentally similar non-protected areas in the Acre state of Amazonian Brazil.

- We found that most protected areas have been effective against deforestation and the associated carbon emissions. In total, we estimated that each year the network of protected areas in Acre helps to avoid the same amount of carbon emissions that is produced by more than 120,000 Europeans, explains the lead author of the study, Doctoral Candidate Teemu Koskimäki.

Carrying out this kind of analysis requires massive amounts of data and sophisticated analytical methods. The computer software needed to do this was developed by Postdoctoral Researcher Johanna Eklund and colleagues in an earlier project.

- To quantify the effect of protection, we had to take into account many other variables to find and match protected and non-protected areas that are similar in terms of deforestation threat. For example, the closer an area is to a big city and the easier it is to reach, the more deforestation pressure it faces whether it is protected or not, explains Eklund.
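The following is a highly simplified sketch of such covariate matching - our own illustration with synthetic numbers and invented covariates, not the software used in the study: each protected grid cell is paired with the most similar unprotected cell, and the impact estimate is the difference in deforestation between the matched groups.

    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(7)
    # Covariates per cell: [distance to city (km), distance to road (km), slope (deg)]
    protected = rng.uniform([0, 0, 0], [300, 50, 30], size=(500, 3))
    unprotected = rng.uniform([0, 0, 0], [300, 50, 30], size=(5000, 3))
    defor_unprot = rng.binomial(1, 0.08, size=5000)  # observed deforestation flags
    defor_prot = rng.binomial(1, 0.03, size=500)

    # Standardise covariates, then match each protected cell to its
    # nearest unprotected neighbour in covariate space
    mu, sd = unprotected.mean(axis=0), unprotected.std(axis=0)
    tree = cKDTree((unprotected - mu) / sd)
    _, idx = tree.query((protected - mu) / sd, k=1)

    # Impact = deforestation in matched controls minus in protected cells
    avoided = defor_unprot[idx].mean() - defor_prot.mean()
    print(f"estimated avoided deforestation: {avoided:.3f} per cell")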

Identifying Differences Helps to Plan Future Conservation Actions

Interestingly, the researchers discovered significant variation among the protected areas.

- Some of the protected areas were very effective, whereas others seemed to suffer from even more severe deforestation than similar non-protected forests. Recognising these differences and their causes could make the management of protected areas more efficient and help to allocate resources to areas where they are most needed, Koskimäki says.

In the case of indigenous lands, the primary objective is to safeguard space for the local traditional peoples to live in rather than to protect nature. Nevertheless, these areas were found to be at least as effective in preventing deforestation as other categories of protected area. It seems that indigenous peoples have not been called guardians of the forest without good reason.

- We are now starting a new project to assess how climate change is affecting biodiversity and the livelihoods of indigenous communities in central Amazonia. This is done in collaboration with the local people. They are worried, because the seasonal patterns of rains and river floods seem to be changing and becoming less predictable, says Professor Hanna Tuomisto.

In the future, it would be interesting to clarify what the components of a successful protected area are, in order to identify and spread good practices. The results of this new study could be used to identify potential study subjects for future on-the-ground research at the local level. Such research could also focus on factors other than the prevention of deforestation that contribute to conservation success, such as how well protected areas prevent selective logging or unsustainable levels of hunting.

The research article has been published in the journal Environmental Conservation.

Credit: 
University of Turku

No sign of COVID-19 vaccine in breast milk

Messenger RNA vaccines against COVID-19 were not detected in human milk, according to a small study by UC San Francisco, providing early evidence that the vaccine mRNA is not transferred to the infant.

The study, which analyzed the breast milk of seven women after they received the mRNA vaccines and found no trace of the vaccine, offers the first direct data of vaccine safety during breastfeeding and could allay concerns among those who have declined vaccination or discontinued breastfeeding due to concern that vaccination might alter human milk. The paper appears in JAMA Pediatrics.

Research has demonstrated that vaccines with mRNA inhibit transmission of the virus that causes COVID-19. The study analyzed the Pfizer and Moderna vaccines, both of which contain mRNA.

The World Health Organization recommends that breastfeeding people be vaccinated, and the Academy of Breastfeeding Medicine has said there is little risk of vaccine nanoparticles or mRNA entering breast tissue or being transferred to milk, which theoretically could affect infant immunity.

"The results strengthen current recommendations that the mRNA vaccines are safe in lactation, and that lactating individuals who receive the COVID vaccine should not stop breastfeeding," said corresponding author Stephanie L. Gaw, MD, PhD, assistant professor of Maternal-Fetal Medicine at UCSF.

"We didn't detect the vaccine associated mRNA in any of the milk samples tested," said lead author Yarden Golan, PhD, a postdoctoral fellow at UCSF. "These findings provide an experimental evidence regarding the safety of the use of mRNA-based vaccines during lactation."

The study was conducted from December 2020 to February 2021. The mothers' mean age was 37.8 years and their children ranged in age from one month to three years. Milk samples were collected prior to vaccination and at various times up to 48 hours after vaccination.

Researchers found that none of the samples showed detectable levels of vaccine mRNA in any component of the milk.

The authors noted that the study was limited by the small sample size and said that further clinical data from larger populations were needed to better estimate the effect of the vaccines on lactation outcomes.

Credit: 
University of California - San Francisco

From genes to memes: Algorithm may help scientists demystify complex networks

UNIVERSITY PARK, Pa. -- From biochemical reactions that produce cancers, to the latest memes virally spreading across social media, simple actions can generate complex behaviors. For researchers trying to understand these emergent behaviors, however, the complexity can tax current computational methods.

Now, a team of researchers has developed a new algorithm that can serve as a more effective way to analyze models of biological systems, which in turn allows a new path to understanding the decision-making circuits that make up these systems. The researchers add that the algorithm will help scientists study how relatively simple actions lead to complex behaviors, such as cancer growth and voting patterns.

The modeling framework used consists of Boolean networks, which are a collection of nodes that are either on or off, said Jordan Rozum, doctoral candidate in physics at Penn State. For example, a Boolean network could be a network of interacting genes that are either turned on -- expressed -- or off in a cell.

"Boolean networks are a good way to capture the essence of a system," said Rozum. "It's interesting that these very rich behaviors can emerge out of just coupling little on and off switches together -- one switch is toggled and then it toggles another switch and that can lead to a big cascade of effects that then feeds back into the original switch. And we can get really interesting complex behaviors out of just the simple couplings."

"Boolean models describe how information propagates through the network," said Réka Albert, distinguished professor of physics and biology in the Penn State Eberly College of Science and an affiliate of the Institute for Computational and Data Sciences. Eventually, the on/off states of the nodes fall into repeating patterns, called attractors, which correspond to the stable long-term behaviors of the system, according to the researchers, who report their findings in the current issue of Science Advances.

Even though these systems are based on simple actions, the complexity can scale up dramatically as nodes are added to the system, especially in the case when events in the system are not synchronous. A typical Boolean network model of a biological process with a few dozen nodes, for example, has tens of billions of states, according to the researchers. In the case of a genome, these models can have thousands of nodes, resulting in more states than there are atoms in the observable universe.

The researchers use two transformations -- parity and time reversal -- to make the analysis of Boolean networks more efficient. The parity transformation offers a mirror image of the network, switching nodes that are on to off and vice versa, which helps identify which subnetworks have combinations of on and off values that can sustain themselves over time. Time reversal runs the dynamics of the network backward, probing which states can precede an initial input state.
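A minimal Python sketch (our own toy example, not the published algorithm) of a three-node Boolean network and its parity transform; iterating the synchronous update rules makes the repeating attractor pattern described above visible:

    # Three coupled switches: A activates B, B activates C, C represses A
    rules = {
        "A": lambda s: 1 - s["C"],
        "B": lambda s: s["A"],
        "C": lambda s: s["B"],
    }

    def step(state, rules):
        """Synchronously apply every node's update rule once."""
        return {node: rule(state) for node, rule in rules.items()}

    def parity(rules):
        """Mirror-image network in which every node's value is negated."""
        return {n: (lambda s, r=r: 1 - r({k: 1 - v for k, v in s.items()}))
                for n, r in rules.items()}

    state = {"A": 1, "B": 0, "C": 0}
    for _ in range(7):              # the states repeat with period 6:
        print(state)                # an attractor of the network
        state = step(state, rules)

    mirrored = parity(rules)        # trajectories of `mirrored` are the
                                    # on/off negations of the original ones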

The team tested their methods on a collection of synthetic Boolean networks called random Boolean networks, which have been used for more than 50 years as a way to model how gene regulation determines the fate of a cell. The technique allowed the team to find the number of attractors in networks of more than 16,000 nodes, which, according to the researchers, are sizes larger than ever before analyzed in such detail.

According to the team, the technique could help medical researchers.

"For example, you might want a cancer cell to undergo apoptosis (programmed cell death), and so you want to be able to make the system pick the decisions that lead towards that desired outcome," said Rozum. "So, by studying where in the network these decisions are made, you can figure out what you need to do to make the system choose those options."

Other possibilities exist for using the methods to study issues in the social sciences and information technology.

"The propagation of information would also make an interesting application," said Albert. "For example, there are models that describe a society in which people have binary opinions on a matter. In the model people interact with each other, forming a local consensus. Our methods could be used to map the repertoire of consensus groups that are possible, including a global consensus."

She added that uses could extend to any area where researchers are trying to find ways to eliminate pathological behaviors, or drive the system into more normal behaviors.

"To do this, the theory existed, methodologies existed, but the computational expense was a limiting factor," said Albert. "With this algorithm, that has to a large part been eliminated."

The researchers have developed a publicly available software library and the algorithms have already been used in studies carried out by her group, according to Albert.

Computations for the study were performed using Penn State's Roar supercomputer.

Albert and Rozum worked with Jorge Gómez Tejeda Zañudo, postdoctoral associate at Broad Institute and Dana-Farber Cancer Institute; Xiao Gan, postdoctoral researcher at the Center for Complex Network Research; and Dávid Deritei, graduate research fellow at Semmelweis University.

Credit: 
Penn State

Invention: The Storywrangler

image: UVM scientists have invented a new tool: the Storywrangler. It visualizes the use of billions of words, hashtags and emoji posted on Twitter. In this example from the tool's online viewer, three global events from 2020 are highlighted: the death of Iranian general Qasem Soleimani; the beginning of the COVID-19 pandemic; and the Black Lives Matter protests following the murder of George Floyd by Minneapolis police. The new research was published in the journal Science Advances.

Image: 
UVM

For thousands of years, people looked into the night sky with their naked eyes -- and told stories about the few visible stars. Then we invented telescopes. In 1840, the philosopher Thomas Carlyle claimed that "the history of the world is but the biography of great men." Then we started posting on Twitter.

Now scientists have invented an instrument to peer deeply into the billions and billions of posts made on Twitter since 2008 -- and have begun to uncover the vast galaxy of stories that they contain.

"We call it the Storywrangler," says Thayer Alshaabi, a doctoral student at the University of Vermont who co-led the new research. "It's like a telescope to look -- in real time -- at all this data that people share on social media. We hope people will use it themselves, in the same way you might look up at the stars and ask your own questions."

The new tool can give an unprecedented, minute-by-minute view of popularity, from rising political movements to box office flops; from the staggering success of K-pop to signals of emerging new diseases.

The story of the Storywrangler -- a curation and analysis of over 150 billion tweets--and some of its key findings were published on July 16 in the journal Science Advances.

EXPRESSIONS OF THE MANY

The team of eight scientists who invented Storywrangler -- from the University of Vermont, Charles River Analytics, and MassMutual Data Science -- gather about ten percent of all the tweets made every day, around the globe. For each day, they break these tweets into single words, as well as pairs and triplets, generating frequencies for more than a trillion words, hashtags, handles, symbols and emoji, like "Super Bowl," "Black Lives Matter," "gravitational waves," "#metoo," "coronavirus," and "keto diet."
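The bookkeeping behind those frequencies can be pictured with a few lines of Python - a bare-bones illustration of n-gram counting on stand-in text, not the Storywrangler pipeline, which additionally handles 150+ languages, hashtags, handles and emoji:

    from collections import Counter

    tweets = [
        "Black Lives Matter protests continue",
        "the Super Bowl was wild",
        "Black Lives Matter marches grow",
    ]

    counts = {1: Counter(), 2: Counter(), 3: Counter()}
    for tweet in tweets:
        tokens = tweet.lower().split()
        for n in (1, 2, 3):                      # 1-, 2- and 3-grams
            for i in range(len(tokens) - n + 1):
                counts[n][" ".join(tokens[i:i + n])] += 1

    total = sum(counts[2].values())
    for phrase, c in counts[2].most_common(3):   # top two-word phrases
        print(f"{phrase!r}: relative frequency {c / total:.3f}")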

"This is the first visualization tool that allows you to look at one-, two-, and three-word phrases, across 150 different languages, from the inception of Twitter to the present," says Jane Adams, a co-author on the new study who recently finished a three-year position as a data-visualization artist-in-residence at UVM's Complex Systems Center.

The online tool, powered by UVM's supercomputer at the Vermont Advanced Computing Core, provides a powerful lens for viewing and analyzing the rise and fall of words, ideas, and stories each day among people around the world. "It's important because it shows major discourses as they're happening," Adams says. "It's quantifying collective attention." Though Twitter does not represent the whole of humanity, it is used by a very large and diverse group of people, which means that it "encodes popularity and spreading," the scientists write, giving a novel view of discourse not just of famous people, like political figures and celebrities, but also the daily "expressions of the many," the team notes.

In one striking test of the vast dataset on the Storywrangler, the team showed that it could be used to potentially predict political and financial turmoil. They examined the percent change in the use of the words "rebellion" and "crackdown" in various regions of the world. They found that the rise and fall of these terms was significantly associated with change in a well-established index of geopolitical risk for those same places.

WHAT'S HAPPENING?

The global story now being written on social media brings billions of voices -- commenting and sharing, complaining and attacking -- and, in all cases, recording -- about world wars, weird cats, political movements, new music, what's for dinner, deadly diseases, favorite soccer stars, religious hopes and dirty jokes.

"The Storywrangler gives us a data-driven way to index what regular people are talking about in everyday conversations, not just what reporters or authors have chosen; it's not just the educated or the wealthy or cultural elites," says applied mathematician Chris Danforth, a professor at the University of Vermont who co-led the creation of the StoryWrangler with his colleague Peter Dodds. Together, they run UVM's Computational Story Lab.

"This is part of the evolution of science," says Dodds, an expert on complex systems and professor in UVM's Department of Computer Science. "This tool can enable new approaches in journalism, powerful ways to look at natural language processing, and the development of computational history."

How much a few powerful people shape the course of events has been debated for centuries. But, certainly, if we knew what every peasant, soldier, shopkeeper, nurse, and teenager was saying during the French Revolution, we'd have a richly different set of stories about the rise and reign of Napoleon. "Here's the deep question," says Dodds, "what happened? Like, what actually happened?"

GLOBAL SENSOR

The UVM team, with support from the National Science Foundation, is using Twitter to demonstrate how chatter on distributed social media can act as a kind of global sensor system -- of what happened, how people reacted, and what might come next. But other social media streams, from Reddit to 4chan to Weibo, could, in theory, also be used to feed Storywrangler or similar devices: tracing the reaction to major news events and natural disasters; following the fame and fate of political leaders and sports stars; and opening a view of casual conversation that can provide insights into dynamics ranging from racism to employment, emerging health threats to new memes.

In the new Science Advances study, the team presents a sample from the Storywrangler's online viewer, with three global events highlighted: the death of Iranian general Qasem Soleimani; the beginning of the COVID-19 pandemic; and the Black Lives Matter protests following the murder of George Floyd by Minneapolis police. The Storywrangler dataset records a sudden spike of tweets and retweets using the term "Soleimani" on January 3, 2020, when the United States assassinated the general; the strong rise of "coronavirus" and the virus emoji over the spring of 2020 as the disease spread; and a burst of use of the hashtag "#BlackLivesMatter" on and after May 25, 2020, the day George Floyd was murdered.

"There's a hashtag that's being invented while I'm talking right now," says UVM's Chris Danforth. "We didn't know to look for that yesterday, but it will show up in the data and become part of the story."

Credit: 
University of Vermont

Organic electronics possibly soon to enter the GHz-regime

image: Image of a 5-stage complementary ring-oscillator composed of organic permeable base transistors.

Image: 
Erjuan Guo

Physicists at the Technische Universität Dresden introduce the first implementation of a complementary vertical organic transistor technology, which is able to operate at low voltage, with adjustable inverter properties, and with fall and rise times of less than 10 nanoseconds demonstrated in inverter and ring-oscillator circuits. With this new technology, they are just a stone's throw away from the commercialization of efficient, flexible and printable electronics of the future. Their groundbreaking findings are published in the renowned journal "Nature Electronics".
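For a sense of what such switching times imply, a back-of-envelope estimate (ours, not a figure from the paper): an N-stage ring oscillator toggles at roughly f = 1/(2·N·t_d), where t_d is the per-stage delay, so delays near the reported 10 nanoseconds put a 5-stage ring in the megahertz range, while GHz operation would require pushing the stage delay toward about 100 picoseconds.

    def ring_oscillator_freq(stages: int, stage_delay_s: float) -> float:
        """Ideal ring-oscillator frequency: f = 1 / (2 * N * t_d)."""
        return 1.0 / (2 * stages * stage_delay_s)

    print(f"{ring_oscillator_freq(5, 10e-9) / 1e6:.1f} MHz at 10 ns/stage")
    print(f"{ring_oscillator_freq(5, 100e-12) / 1e9:.2f} GHz at 100 ps/stage")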

Poor performance is still impeding the commercialization of flexible and printable electronics. Hence, the development of low-voltage, high-gain, and high-frequency complementary circuits is seen as one of the most important targets of research. High-frequency logic circuits, such as inverter circuits and oscillators with low power consumption and fast response times, are the essential building blocks for the large-area, low-power, flexible and printable electronics of the future. The research group "Organic Devices and Systems" (ODS) at the Institute of Applied Physics (IAP) at TU Dresden, headed by Dr. Hans Kleemann, is working on the development of novel organic materials and devices for high-performance, flexible and possibly even biocompatible electronics and optoelectronics. Increasing the performance of organic circuits is one of the key challenges in their research. Only a few months ago, PhD student Erjuan Guo announced an important breakthrough with the development of efficient, printable, and adjustable vertical organic transistors.

Now, building on their previous findings, the physicists demonstrate for the first time vertical organic transistors (organic permeable base transistors, OPBTs) integrated into functional circuits. Dr. Hans Kleemann and his team succeeded in proving that such devices possess reliable performance, long-term stability, as well as unprecedented performance measures.

"In previous publications, we found that the second control electrode in the vertical transistor architecture enables wide-range threshold-voltage controllability, which makes such devices ideal for efficient, fast and complex logic circuits. In the recent publication, we add a vital feature to the technology by demonstrating complementary circuits such as integrated complementary inverters and ring-oscillators. Using such complementary circuits, the power-efficiency and speed of operation can be improved by more than one order of magnitude and might possibly allow organic electronics to enter the GHz-regime," explains Erjuan Guo, who meanwhile received a Ph.D. with distinction from Technische Universität Dresden.

The complementary inverters and ring-oscillators developed at the IAP represent a milestone towards flexible, low-power GHz-electronics, as would be needed, for example, in wireless communication applications. "Furthermore, our findings might inspire the entire research community to envision alternative vertical organic transistor designs, as they seem to enable high-frequency operation and low-cost integration at the same time," adds Erjuan Guo enthusiastically.

Credit: 
Technische Universität Dresden

On the internet, nobody knows you're a dog -- or a fake Russian Twitter account

BUFFALO, N.Y. - Many legacy media outlets played an unwitting role in the growth of the four most successful fake Twitter accounts hosted by the Russian Internet Research Agency (IRA) that were created to spread disinformation during the 2016 U.S. presidential campaign, according to a study led by a University at Buffalo communication researcher.

In roughly two years beginning in late 2015, these accounts went from obscurity to microcelebrity status, growing from about 100 to more than 100,000 followers. With its heavily populated follower base ready to spread the word -- like all heavily engaged Twitter audiences -- the IRA could strategically deploy messages and provide visible metrics, creating an illusion of authority and authenticity that often escaped the scrutiny of casual consumers and professional journalists.

The frantic retweets, by what the study showed to be extreme ideological enclaves, certainly fueled the accumulation of followers, but Yini Zhang, PhD, an assistant professor of communication at UB, says that mainstream and hyperpartisan news media also significantly amplified IRA messaging and contributed to that follower growth by unknowingly embedding IRA tweets in their content.

Zhang says there was an ideological asymmetry to the study's results. Of the four puppet accounts in the study, @TEN_GOP and @Pamela_Moore13 posed as conservative trolls, while @Crystal1Johnson and @glod_up imitated liberals.

"We did not observe the same effect on the liberal and conservative accounts," she says. "The two conservative accounts received a huge boost from mainstream media and hyperconservative media quoting tweets in their news stories, but we did not see mainstream media and hyperprogressive media doing the same thing for the two liberal accounts."

The findings, published in the Journal of Communication, reveal how large social media followings can often depend on a combination of the dynamics within a particular platform and the news media's treatment of the messages emerging from those platforms. The evidence revealed in the study provides insights into the ecology of the 21st century political communication environment, suggesting that people's tendency to seek confirmation and engage with pro-attitudinal information, as well as the media's drive for audience attention, can work in favor of successful political disinformation actors.

In this case, constructive attempts to provide new information by integrating digital and legacy content ironically resulted in the unintended spread of disinformation, which Zhang defines as fabricated information that's intended to cause harm in ways that benefit its agents.

"Examining how and why these accounts grew so quickly and to such astounding proportions allows us to understand the mechanisms of influence accrual in the digital era," says Zhang, the study's corresponding author and an expert in social media and political communication. "None of this was intentional. It's about operational realties.

"But with this knowledge, we can begin to address and curtail the problem of disinformation."

The research team started their work with 2,700 puppet accounts released in 2017 by the House Intelligence Committee, which received the information from Twitter. From that group, the researchers identified the four most retweeted English-speaking accounts: two conservative accounts and two liberal accounts. They collected data from Twitter about the tweets and retweets of the IRA accounts. They then searched more than 200 media outlets across the ideological spectrum to determine where the uptake of IRA tweets was occurring.
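The first selection step can be pictured with a short sketch (our own illustration on stand-in records, not the team's code): tally retweets per account across the released dataset and keep the most retweeted handles.

    from collections import Counter

    # Stand-in records: (account handle, retweets earned by one tweet)
    records = [("@TEN_GOP", 5), ("@Crystal1Johnson", 3), ("@TEN_GOP", 12),
               ("@Pamela_Moore13", 7), ("@glod_up", 2), ("@Crystal1Johnson", 9)]

    retweets = Counter()
    for handle, n in records:
        retweets[handle] += n

    for handle, total in retweets.most_common(4):  # the four top accounts
        print(f"{handle}: {total} retweets")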

"Strong social media posts can validate content," says Zhang. "But in their effort to turn heads, these legacy outlets were contributing to the growth of Russian sock puppet accounts."

The process of incorporating digital content into mainstream media makes sense, but requires careful consideration, according to Zhang.

"Social media content looks very attractive given the cost cutting realities in mainstream media and lost advertising revenue," says Zhang. "But it also demonstrates a vulnerability within the current media economy.

"Turning heads might also mean unintentionally contributing to the growth of fake accounts, which should be subject to the same questions of credibility as any other news source: Is this account in fact what it actually claims to be?"

Credit: 
University at Buffalo

Study examines the role of deep-sea microbial predators at hydrothermal vents

image: A view of the Apollo Vent Field at the northern Gorda Ridge, where samples were collected by the ROV Hercules for studying microbial predators

Image: 
Image OET/Nautilus Live

The hydrothermal vent fluids from the Gorda Ridge spreading center in the northeast Pacific Ocean create a biological hub of activity in the deep sea. There, in the dark ocean, a unique food web thrives not on photosynthesis but rather on chemical energy from the venting fluids. Among the creatures having a field day feasting at the Gorda Ridge vents is a diverse assortment of microbial eukaryotes, or protists, that graze on chemosynthetic bacteria and archaea.

This protistan grazing, which is a key mechanism for carbon transport and recycling in microbial food webs, exerts a higher predation pressure at hydrothermal vent sites than in the surrounding deep-sea environment, a new paper finds.

"Our findings provide a first estimate of protistan grazing pressure within hydrothermal vent food webs, highlighting the important role that diverse deep-sea protistan communities play in deep-sea carbon cycling," according to the paper, Protistan grazing impacts microbial communities and carbon cycling ad deep-sea hydrothermal vents published in the Proceedings of the National Academy of Sciences (PNAS).

Protists serve as a link between primary producers and higher trophic levels, and their grazing is a key mechanism for carbon transport and recycling in microbial food webs, the paper states.

The research found that protists consume 28-62% of the daily stock of bacteria and archaea biomass within discharging hydrothermal vent fluids from the Gorda Ridge, which is located about 200 kilometers off the coast of southern Oregon. In addition, the researchers estimate that protistan grazing could account for consuming or transferring up to 22% of the carbon that is fixed by the chemosynthetic population in the discharging vent fluids. Though the fate of all of that carbon is unclear, "protistan grazing will release a portion of the organic carbon into the microbial loop as a result of excretion, egestion, and sloppy feeding," and some of the carbon will be taken up by larger organisms that consume protistan cells, the paper states.

After collecting vent fluid samples from the Sea Cliff and Apollo hydrothermal vent fields in the Gorda Ridge, researchers conducted grazing experiments, which presented some technical challenges that needed to be overcome. For instance, "prepping a quality meal for these protists is very difficult," said lead author Sarah Hu, a postdoctoral investigator in the Marine Chemistry and Geochemistry Department at the Woods Hole Oceanographic Institution (WHOI).

"Being able to do this research at a deep-sea vent site was really exciting because the food web there is so fascinating, and it's powered by what's happening at this discharging vent fluid," said Hu, who was onboard the E/V Nautilus during the May-June 2019 cruise. "There is this whole microbial system and community that's operating there below the euphotic zone outside of the reach of sunlight. I was excited to expand what we know about the microbial communities at these vents."

Hu and co-author Julie Huber said that quantitative measurements are important to understand how food webs operate at pristine and undisturbed vent sites.

"The ocean provides us with a number of ecosystem services that many people are familiar with, such as seafood and carbon sinks. Yet, when we think about microbial ecosystem services, especially in the deep sea, we just don't have that much data about how those food webs work," said Huber, associate scientist in WHOI's Marine Chemistry and Geochemistry Department.

Obtaining baseline measurements "is increasingly important as these habitats are being looked at for deep-sea mining or carbon sequestration. How might that impact how much carbon is produced, exported, or recycled?" she said.

"We need to understand these habitats and the ecosystems they support," Huber said. "This research is connecting some new dots that we weren't able to connect before."

Credit: 
Woods Hole Oceanographic Institution

Private-public partnership helps to evaluate satellite observations of atmospheric CO2 over oceans

Hiroshi Tanimoto, Director of the Earth System Division at the National Institute for Environmental Studies (NIES), Japan, and Astrid Müller, together with their international research team, have developed a new method to evaluate satellite observations of XCO2 over open ocean areas, which are currently inaccessible through established validation network sites. In the new approach, a reference CO2 dataset is constructed by combining cargo ship and passenger aircraft observations conducted in cooperation with operators in the private sector.

(Background)

After the Paris Agreement entered into force, commitments to reduce greenhouse gas emissions are being expedited. CO2 is the most important anthropogenically produced greenhouse gas. Emissions from fossil fuel combustion and cement production have driven an accelerating increase of atmospheric CO2 since the 1950s, reaching more than 410 ppm in 2020 (Dlugokencky and Tans, 2021). High-quality and high-density measurements are needed to estimate changes in anthropogenic and natural emissions, both for the implementation of the Paris Agreement and for achieving the goal of net-zero greenhouse gas emissions. Research and development activities have expanded so that CO2 is now monitored at more than 200 locations on the Earth's surface (in 1958, there were only two sites) and by a growing fleet of space-based satellites with global coverage. Among these satellites, the Greenhouse Gases Observing Satellite (GOSAT) and the Orbiting Carbon Observatory 2 (OCO-2) were launched in 2009 and 2014, respectively. The advantage of space-based observations is their high spatial and temporal coverage, even over inaccessible areas of the globe, albeit at a lower accuracy than in situ measurements. Ocean surfaces are among these difficult-to-access areas. They cover 70% of the Earth and play the most reliable role in the removal (~2.5 PgC/yr) of the CO2 emitted by human activities into the atmosphere (~10 PgC/yr) (Friedlingstein et al., 2019).

Satellite XCO2 data products require validation, which is usually performed against land-based XCO2 data products from the Total Carbon Column Observing Network (TCCON) (Wunch et al., 2011), a network of ground-based Fourier transform infrared spectrometers. However, validation sites observing the atmosphere over the ocean are limited to some coastal and island sites. Therefore, the accuracy of the satellite XCO2 data products over the ocean cannot be fully verified by using TCCON.

Recent studies pointed out that the satellite observations apparently have a greater bias over oceans than over land surfaces. TCCON observations of XCO2 at Burgos in the Philippines indicated a bias of satellite XCO2 of -0.8 ppm in the vicinity of the tropical Pacific (Velazco et al., 2017). A bias of -0.7 ppm was also seen in observations of vertical CO2 profiles using aircraft (Kulawik et al., 2019). However, such aircraft campaigns are limited and expensive. Until now, there has been no effective method to systematically evaluate ocean biases over long temporal periods and wide spatial areas.

In cooperation with the private sector, the National Institute for Environmental Studies has conducted long-term atmospheric observations from cargo ships operating between Japan and North America, Australia, and Southeast Asia, and from passenger aircraft flying from Japan to various parts of the world. Taking advantage of these regular and cost-efficient observations with wide geographical coverage, the new approach is an effective way to evaluate satellite observations over oceans for which no reference data were previously available. In this study, we applied the method to CO2 over the western Pacific Ocean.

(Method)

In our study, we combined cargo ship (Ship Of Opportunity - SOOP) and passenger aircraft (Comprehensive Observation Network for Trace gases by Airliner - CONTRAIL) observations of CO2 and, with the aid of model calculations, constructed CO2 profiles from which we obtained "observation-based column-averaged mixing ratios of CO2" (obs. XCO2) over the Pacific Ocean. We analyzed the consistency of the obs. XCO2 dataset with satellite estimates from GOSAT (Greenhouse Gases Observing Satellite: NIES v02.75, National Institute for Environmental Studies; ACOS v7.3, Atmospheric CO2 Observations from Space) and OCO-2 (Orbiting Carbon Observatory 2, v9r).
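Conceptually, a column-averaged mixing ratio is a pressure-weighted mean of the CO2 profile over the atmospheric column. The sketch below is our own simplified illustration with invented numbers (the actual dataset construction involves far more careful profile assembly and comparison against the satellite retrievals):

    import numpy as np

    # Illustrative profile: ship data anchor the marine boundary layer,
    # aircraft data cover cruise altitudes, a model fills the remaining gaps.
    p = np.array([1000, 900, 700, 500, 300, 200, 100, 50], float)  # hPa
    co2 = np.array([412.0, 411.5, 410.8, 410.2, 409.5, 408.8, 407.0, 405.5])  # ppm

    dp = -np.diff(p)                        # layer thickness ~ air mass
    layer_co2 = 0.5 * (co2[:-1] + co2[1:])  # mean CO2 within each layer
    xco2 = (layer_co2 * dp).sum() / dp.sum()
    print(f"XCO2 = {xco2:.2f} ppm")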

(Results)

Our analysis revealed that the new dataset accurately captures seasonal and interannual variations of CO2 over the western Pacific Ocean. In the comparison of satellite XCO2 from GOSAT and OCO-2 with the obs. XCO2 dataset, we found a negative bias of about 1 ppm in northern midlatitudes. This bias was substantially reduced for newer satellite products (ACOS v9, OCO-2 v10). The differences between the obs. XCO2 and satellite XCO2 could be attributed to remaining uncertainties in the satellite data, introduced by limitations in the retrieval algorithms due to the lack of validation data over open oceans. With our new approach, these uncertainties can be identified.

(Expectations for the future)

Advances in retrieval algorithms are made rapidly, with almost one new version each year. To evaluate the improvements in these algorithms, our new approach is of great importance. With the help of the private sector, we can rapidly extend the spatial and temporal coverage of reference data as a complement to established validation networks. We expect that the new dataset will contribute to further improvements of the satellite data and, therefore, to a better understanding of changes in the carbon cycle in response to climate change. In the future, our new method of combining cargo ship and passenger aircraft observations will be spatially and temporally extended and applied to other important trace gases. Specifically, we plan to use the new dataset for the evaluation of GOSAT-GW, which is scheduled to be launched in 2023.

Credit: 
National Institute for Environmental Studies