Tech

Research moves closer to brain-machine interface autonomy

image: University of Houston professor of biomedical engineering Joe Francis is reporting work that represents a significant step forward for prosthetics that perform more naturally.

Image: 
University of Houston

A University of Houston engineer is reporting in eNeuro that a brain-computer interface, a form of artificial intelligence, can sense when its user is expecting a reward by examining the interactions between single-neuron activities and the information flowing to these neurons, called the local field potential.

Professor of biomedical engineering Joe Francis reports that his team's findings allow for the development of an autonomously updating brain-computer interface (BCI) that improves on its own, learning about its subject without having to be programmed.

The findings potentially have applications for robotic prosthetics, which would sense what a user wants to do (pick up a glass, for example) and do it. The work represents a significant step forward for prosthetics that perform more naturally.

"This will help prosthetics work the way the user wants them to," said Francis. "The BCI quickly interprets what you're going to do and what you expect as far as whether the outcome will be good or bad." Francis said that information drives scientists' abilities to predict reward outcome to 97%, up from the mid-70s.

To understand the effects of reward on the brain's primary motor cortex activity, Francis used implanted electrodes to investigate brainwaves and spikes in brain activity while tasks were performed to see how interactions are modulated by conditioned reward expectations.

"We assume intention is in there, and we decode that information by an algorithm and have it control either a computer cursor, for example, or a robotic arm," said Francis. Interestingly even when the task called for no movement, just passively observing an activity, the BCI was able to determine intention because the pattern of neural activity resembled that during movement.

"This is important because we are going to have to extract this information and brain activity out of people who cannot actually move, so this is our way of showing we can still get the information even if there is no movement," said Francis. This process utilizes mirror neurons, which fire when action is taken and action is observed.

"This examination of reward motivation in the primary motor cortex could be useful in developing an autonomously updating brain machine interface," said Francis.

Credit: 
University of Houston

Plot twist: Straightening single-molecule conductors improves their performance

image: (Left) Chemical structures of completely insulated molecular wires derived from oligothiophene with nanometer length scales. The upper figure shows the flat conformation, while the lower figure displays the twisted form. (Right) Results of single-molecule conductance measurements, in which the conductivity is plotted as a function of oligomer length. The crossover from the tunneling regime to the hopping regime occurs at a shorter chain length in the flat nanowires owing to their increased conductivity.

Image: 
Osaka University

Osaka, Japan - A team at Osaka University has created single-molecule nanowires, complete with an insulation layer, up to 10 nanometers in length. When they measured the electrical properties of these nanowires, the researchers found that forcing the ribbon-like chains to be flat significantly improved their conductivity compared with a twisted conformation. The findings may allow for a new generation of inexpensive high-tech devices, including smartphone screens and photovoltaics.

Carbon-based polymers, which are long molecular chains made of repeating units, can be found everywhere, from the rubber in the soles of your shoes to the proteins that make up your body. We used to think that these molecules could not conduct electricity, but that all changed with the discovery of conducting polymers. These are a small subset of carbon-based molecules that can act like tiny wires owing to their alternating single and double chemical bonds, also called conjugated bonds. Since carbon-based conductors are much easier and cheaper to make and customize than conventional electronics, they have seen rapid adoption in OLED TVs, iPhone screens, and solar panels, drastically reducing the cost of these devices.

Now, researchers at Osaka University have synthesized chains of oligothiophene of various lengths, with up to 24 repeat units. This means that single nanowires could be up to 10 nanometers in length. Insulation of the wires was needed to avoid interwire currents, so that the intrinsic conductivity of a single molecule could be measured accurately. On the basis of the rules of quantum mechanics, electrons in molecules behave more like spread-out waves than localized particles. The overlapping bonds in oligothiophene allow electrons to be entirely spread out over the polymer backbone, so they can easily traverse the molecule to create an electrical current.

This charge transport can occur in two very different ways. "Over short distances, electrons rely on their wave-like nature to 'tunnel' directly through barriers, but over long distances, they hop from site to site to reach their destination," first author Dr. Yutaka Ie explained. The team at Osaka University found that changing the oligothiophene chain from twisted to flat led to much greater overlap of the conjugated backbone of oligothiophene, which in turn meant a larger overall conductivity. As a result, the crossover from tunneling to hopping conduction took place with flat chains at shorter chain lengths, compared with those with the twisted conformation.
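The length dependence behind this crossover can be summarized with the standard textbook scalings for single-molecule conduction (a generic sketch, not equations taken from the paper): tunneling conductance decays exponentially with molecular length L, while hopping conductance falls off only roughly as 1/L.

```latex
% Generic scalings for single-molecule conduction (textbook picture, not from the paper)
\[
G_{\mathrm{tunnel}}(L) \approx G_{0}\,e^{-\beta L},
\qquad
G_{\mathrm{hop}}(L) \approx \frac{A}{L},
\qquad
\text{crossover at } L^{*}:\ G_{0}\,e^{-\beta L^{*}} = \frac{A}{L^{*}}
\]
```

In this picture, a flatter backbone with better orbital overlap makes hopping more efficient (a larger A), which shifts the crossover length L* to shorter chains, consistent with the earlier tunneling-to-hopping transition the team observed in the flat nanowires.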

The researchers believe that this work can open a whole new world of devices. "This study demonstrates that our insulated nanowires have the potential to be used in novel 'single-molecule' electronics," lead author Dr. Yoshio Aso said. The work is published in The Journal of Physical Chemistry Letters as "Highly Planar and Completely Insulated Oligothiophenes: Effects of π-Conjugation on Hopping Charge Transport." (DOI: 10.1021/acs.jpclett.9b00747)

Credit: 
Osaka University

An innovative electron microscope overturning 88 years of common knowledge

image: The newly developed magnetic objective-lens system is installed. Combined with a higher-order aberration corrector (shown above in the objective-lens system), this system can focus an electron beam to the atomic scale.

Image: 
JST

Under the JST-SENTAN program (Development of System and Technology for Advanced Measurement and Analysis, Japan Science and Technology Agency), the joint development team of Prof. Naoya Shibata at the University of Tokyo and JEOL Ltd. has developed a revolutionary electron microscope that incorporates newly designed magnetic objective lenses, achieving direct, atom-resolved imaging of materials with sub-Å spatial resolution while keeping the residual magnetic field at the sample position below 0.2 mT. To the best of the team's knowledge, this is the first time that such a goal has been achieved.

In the 88 years since the seminal invention of the transmission electron microscope (TEM) in 1931, researchers have continually pursued better spatial resolution. The design of magnetic objective lenses with smaller lens-aberration coefficients has been necessary, and aberration-correcting lens systems for scanning TEM (STEM) have achieved sub-Å spatial resolution.

One critical disadvantage of current magnetic condenser-objective-lens systems for atomic-resolution TEMs/STEMs is that the samples must be inserted into very high magnetic fields of up to 2-3 T. Such high fields can severely hamper atomic-resolution imaging of many important soft/hard magnetic materials, such as silicon steel, because the strong field can greatly alter--or even destroy--the material's magnetic and sometimes physical structure. Recently, the development of new magnetic materials has advanced rapidly. As atomic-scale structural analysis is key to the abovementioned technology, a solution to this problem has long been required.

The joint team has developed a new magnetic-field-free objective-lens system, containing two round lenses positioned in an exact mirror-symmetric configuration with respect to the sample plane. This new lens system produces extremely small residual magnetic fields at the sample position while placing the strongly excited front/back objective lenses close enough to the sample to obtain the short focal-length condition indispensable for atomic-resolution imaging. Consequently, the residual magnetic field generated near the sample position is reduced to less than 0.2 mT, far below the 2-3 T fields of conventional objective lenses.
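The cancellation follows from the symmetry of the arrangement: if the two lenses are excited so that their axial fields oppose each other, the contributions are equal and opposite at the midplane where the sample sits. The toy calculation below illustrates only this geometric idea (it is not the actual JEOL/University of Tokyo lens design); it models each lens with Glaser's bell-shaped axial field profile and assumes opposite polarity for the mirrored pair.

```python
# Toy illustration of the mirror-symmetric objective-lens idea described above.
# Not the real lens design: each lens is modelled by Glaser's bell-shaped axial
# field, and the mirrored pair is assumed to be excited with opposite polarity
# so the fields cancel at the sample plane (z = 0).
import numpy as np

def bell_field(z, z0, B0=2.0, a=2.0):
    """Axial field of one round lens centred at z0 (tesla); a is the half-width in mm."""
    return B0 / (1.0 + ((z - z0) / a) ** 2)

z = np.linspace(-10, 10, 2001)       # position along the optical axis in mm
d = 4.0                              # each lens sits 4 mm from the sample plane
B_front = bell_field(z, -d)          # lens in front of the sample
B_back = -bell_field(z, +d)          # mirror lens, opposite polarity
B_total = B_front + B_back

i0 = np.argmin(np.abs(z))            # index of the sample plane z = 0
print(f"field of one lens alone at the sample: {bell_field(0.0, -d):.3f} T")
print(f"net field of the mirrored pair at the sample: {B_total[i0]:.6f} T")
```

In this idealized geometry the net field at the sample plane is exactly zero; in the real instrument, small asymmetries leave the residual field reported above, below 0.2 mT.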

The joint team used this new system to observe the atomic structure of a grain-oriented silicon-steel sheet, one of the most important soft magnetic engineering materials. This sheet is used as a core material for electric transformers and motors, and atomic-resolution characterization of its individual defects has long been sought. Using the newly developed lens system, the atomic structure of the silicon steel was clearly resolved, realizing direct, atom-resolved imaging in a magnetic-field-free environment for electron microscopy and enabling unprecedented atomic-level structural characterization of magnetic materials.

The newly developed electron microscope can be operated in the same manner as conventional TEMs/STEMs. It is expected to promote substantial further research and development in various nanotechnology fields.

Credit: 
Japan Science and Technology Agency

Marine oil snow

image: Andrew Wozniak monitored tanks filled with seawater from the Gulf of Mexico and mixed the seawater with oil, plankton and a chemical dispersant used during the Deepwater Horizon oil spill to conduct a chemical analysis of the particles of marine oil snow that fell to the bottom of the tanks.

Image: 
Andrew Wozniak

If you were able to stand on the seafloor and look up, you would see flakes of falling organic material and biological debris cascading down the water column like snowflakes in a phenomenon known as marine snow.

Recent disasters like the Deepwater Horizon oil spill in the Gulf of Mexico, however, have added a new element to this natural process: oil.

During these events, the natural marine snow interacts with oil and dispersants to form what's known as marine oil snow as it sinks from the surface through the water column to the seafloor sediments.

The danger with marine oil snow is that it transfers oil and its negative impacts from the water column to the seafloor sediments, delivering a more diverse suite of oxygenated compounds to sediments and deep-sea ecosystems. These oxygenated forms of many oil compounds are more toxic to organisms in the sediments than are the non-oxygenated forms.

While this may lessen the impact on near-surface organisms such as fish, birds and shellfish, it transfers the oil to the deep ocean, where it affects deep-sea fauna, corals and fish; such adverse impacts were documented after the Deepwater Horizon oil spill.

The University of Delaware's Andrew Wozniak conducted research to investigate the fate and accumulation of marine oil snow in the Gulf of Mexico, the results of which were recently published in the Environmental Science and Technology journal.

Wozniak, assistant professor in the School of Marine Science and Policy in UD's College of Earth, Ocean and Environment, conducted the research while a research faculty member at Old Dominion University. He said that to recreate the conditions of the Gulf of Mexico, he and his collaborators used 100-liter glass tanks filled with seawater collected from the Gulf.

In addition to the seawater, they added plankton collected from coastal waters directly before the initiation of the experiment. They also added the kind of oil spilled during the Deepwater Horizon disaster, along with the chemical dispersant used to break it up, and monitored the tanks for four days.

Particles formed at the surface and in the water column of the tanks, and some of them sank to the bottom. Wozniak collected the particles that sank and isolated the oil component to conduct a chemical analysis.

When they performed the chemical analysis and compared it to the initial oil, the samples differed in a way that could be attributed to microbial degradation.

Wozniak said this occurred as the marine oil snow sank through the water column.

When an event like an oil spill occurs, the phytoplankton and bacteria in the ocean interact with the oil -- which is bad for them -- and release extracellular polymeric substances (EPS), which collect the oil.

"It's kind of a defense mechanism and because that EPS is sticky, it gets that oil aggregated and hopefully protects them from the oil," said Wozniak.

The result of the EPS protection is a base particle for other substances to glom onto.

"If something with enough density like minerals form on it, then they'll sink and that's when you get that marine oil snow," said Wozniak.

By looking at the degraded material at the bottom of the mesocosms, Wozniak could see that as the oil sank through the water column, it provided a microhabitat for microbes, and those microbes that prefer hydrocarbons and oil-like compounds proliferated.

In addition to supporting that community of bacteria, it also keeps a portion of oil that has been changed -- potentially for the worse -- in the ocean.

"It may have consequences for the toxicity of the oil because it oxygenates compounds," said Wozniak. "The oxygenated forms of some of the compounds, like Polycyclic Aromatic Hydrocarbons, tend to be more toxic and so it may have important implications for future study for what's happening in sediments or deep coral reefs."

Credit: 
University of Delaware

New pathogens in beef and cow's milk products: More research required

According to the DKFZ, the novel infectious agents known as Bovine Meat and Milk Factors (BMMF) have been detected up to now in cow's milk, cow's milk products and the blood serum of healthy cattle. On the basis of the scientific findings made so far, an indirect connection between the consumption of various foods originating from cattle and the occurrence of several cancer types in humans appears possible.

According to the DKFZ, BMMF are a new type of pathogen with similarities to both viruses and bacteria. Because they are related to plasmids, they are currently being called "plasmidomes". As far as the DKFZ researchers can establish, the BMMF do not occur as "naked" genetic material but rather together with proteins.

The German Federal Institute for Risk Assessment (BfR) and the Max Rubner-Institut (MRI) jointly conclude that, owing to the inadequate data available so far, it has not yet been possible to assess the possible risks posed by so-called BMMF as potential cancer risk factors. The presumed connection between BMMF and the incidence of cancer in humans should be examined further.

The DKFZ assumes that infants, whose immune system has not yet fully matured, become infected with BMMF during their first year of life through supplementary feeding with cow's milk. It therefore concludes that infants should not be given cow's milk too early.

In line with the latest available information on nutrition, the BfR and MRI agree with the following recommendation: on the basis of the epidemiological studies published to date on the connection between the consumption of red and processed meat and an increased risk of colon cancer, and in concurrence with the German Nutrition Society (DGE), meat consumption should be limited to a maximum of 600 grams per week. By contrast, the consumption of cow's milk without any restriction is still recommended on the basis of the latest available knowledge. Breastfeeding in order to prevent various diseases is also fundamentally advocated.

Credit: 
BfR Federal Institute for Risk Assessment

Inducing seizures to stop seizures

image: Epilepsy patients with inserted electrodes often undergo cortical stimulation, a procedure that applies electrical current to the brain to map brain function but also to induce seizures for better understanding of the epileptic network. A new study finds that inducing seizures before surgery may be a convenient and cost-effective way to determine the brain region where seizures are coming from.

Image: 
The Neuro

Surgery is the only way to stop seizures in 30 per cent of patients with focal drug-resistant epilepsy. A new study finds that inducing seizures before surgery may be a convenient and cost-effective way to determine the brain region where seizures are coming from.

Epilepsy patients awaiting surgery often stay in hospital for one to two weeks under medical observation for the recording of seizures. By recording the source of the seizure, doctors can know what part of the brain to operate on to stop future seizures. This stay can be extremely inconvenient for patients and expensive for health care systems.

In approximately 20 per cent of patients, electrodes have to be inserted directly into the brain. Patients with inserted electrodes often undergo cortical stimulation, a procedure that applies electrical current to the brain to map brain function but also to induce seizures for a better understanding of the epileptic network. Until now, no study had systematically addressed whether relying on induced seizures to plan the surgery is as effective as relying on spontaneous seizures.

Looking at data from 103 epilepsy patients in Montreal, Canada and Grenoble, France, a research team led by Dr. Birgit Frauscher at The Neuro (Montreal Neurological Institute and Hospital) used statistical methods to test how the presence of stimulated seizures and their onset zone related to patient outcome. They found that patients who had induced seizures had better outcomes than patients in whom no seizures could be induced. There was also strong agreement between the seizure onset zones identified by induced and spontaneous seizures.

This finding suggests that induced seizures are as effective as spontaneous seizures for determining the origin of seizures in the brain. Using induced seizures in this way could mean much shorter hospital stays for patients awaiting surgery, and cost savings for hospitals that perform these operations.

Dr. Frauscher says her clinic has changed its standard practice, performing stimulation early after electrode insertion, and expects other clinics will follow as a result of this study.

"I think it would be a huge advantage if this procedure was done in the first days of a patient's stay," says Dr. Frauscher. "It's not a new procedure, but the approach is new in the sense that now we know it's very similar to a spontaneous seizure, so we can reduce hospital time. Instead of being in hospital for two weeks, patients can maybe be there for 48 or 72 hours and we only need to record maybe one additional spontaneous seizures and not several, and that is a huge difference."

Credit: 
McGill University

Cause of hardening of the arteries -- and potential treatment -- identified

image: False colour image of calcium phosphate deposits on bone.

Image: 
Melinda Duer/Cathy Shanahan

A team of UK scientists have identified the mechanism behind hardening of the arteries, and shown in animal studies that a generic medication normally used to treat acne could be an effective treatment for the condition.

The team, led by the University of Cambridge and King's College London, found that a molecule once thought only to exist inside cells for the purpose of repairing DNA is also responsible for hardening of the arteries, which is associated with dementia, heart disease, high blood pressure and stroke.

There is no current treatment for hardening of the arteries, which is caused by build-up of bone-like calcium deposits, stiffening the arteries and restricting blood flow to organs and tissues.

Supported by funding from the British Heart Foundation, the researchers found that poly(ADP ribose), or PAR, a molecule normally associated with DNA repair, also drives the bone-like calcification of arteries.

Additionally, using rats with chronic kidney disease, the researchers found that minocycline - a widely-prescribed antibiotic often used to treat acne - could treat hardening of the arteries by preventing the build-up of calcium in the circulatory system. The study, the result of more than a decade of fundamental research, is published in the journal Cell Reports.

"Artery hardening happens to everyone as they age, and is accelerated in patients on dialysis, where even children develop calcified arteries. But up until now we haven't known what controls this process and therefore how to treat it," said Professor Melinda Duer from Cambridge's Department of Chemistry, who co-led the research as part of a long-term collaboration with Professor Cathy Shanahan from King's College London.

"This hardening, or biomineralisation, is essential for the production of bone, but in arteries it underlies a lot of cardiovascular disease and other diseases associated with ageing like dementia," said Shanahan. "We wanted to find out what triggers the formation of calcium phosphate crystals, and why it seems to be concentrated around the collagen and elastin which makes up much of the artery wall."

In earlier research, Duer and Shanahan had shown that PAR - normally associated with the repair of DNA inside the cell - can in fact exist outside the cell and is the engine of bone production. This led the researchers to hypothesise that PAR may also play a role in biomineralisation. In addition, PARP1 and PARP2, the dominant PAR-producing enzymes, are expressed in response to DNA damage and oxidative stress, processes which are associated with both bone and vascular calcification.

"We could see signals from bone that we couldn't explain, so we looked for molecules from first principles to figure it out," said Duer.

"I'd been thinking for years that hardening of the arteries was linked to DNA damage, and that DNA damage is a pathway switched on by many agents including smoking and lipids," said Shanahan. "When this pathway is switched on, it drives the pathologies associated with ageing. If enough damage is present, the arteries will eventually reflect it."

Using NMR spectroscopy, the researchers found that when the cells become stressed and die, they release PAR, which binds very strongly to calcium ions. Once released, the PAR starts mopping up calcium into larger droplets which stick onto the components in artery walls that give the artery its elasticity, where they form ordered crystals and solidify, hardening the arteries.

"We never would have predicted that it was caused by PAR," said Duer. "It was initially an accidental discovery, but we followed it up - and it's led to a potential therapy."

Having discovered the links between DNA damage, PAR, bone and artery calcification, the researchers then looked into a way of blocking this pathway through the use of a PARP inhibitor.

"We had to find an existing molecule that is cheap and safe, otherwise, it would be decades before we would get a treatment," said Shanahan. "If something has already been shown to be safe in humans, the journey to the clinic can be much faster."

Working together with Cycle Pharmaceuticals, a Cambridge-based company, the researchers identified six known molecules that they thought might inhibit the PARP enzymes. Detailed experiments with these showed that the antibiotic minocycline was highly effective in preventing hardening of the arteries.

"It's been 12 years of basic research to get to this point," said Duer. "We set out with absolutely no expectation of finding a potential treatment - there is no treatment currently and nobody would have believed us if we had said at that point we were going to cure hardening of the arteries."

The technology has been patented and has been licensed to Cycle Pharmaceuticals by Cambridge Enterprise, the University's commercialisation arm. The researchers are hoping to carry out a proof of principle trial in patients in the next 12 to 18 months.

"Blood vessel calcification is a well-known risk factor for several heart and circulatory diseases, and can lead to high blood pressure and ultimately, a life-threatening heart attack," said Professor Jeremy Pearson, Associate Medical Director at the British Heart Foundation. "Now, researchers have shown how calcification of the walls of blood vessels takes place, and how the process differs from normal bone formation. By doing so, they have been able to identify a potential treatment to reduce blood vessel calcification without any adverse effects on bone. This type of treatment would benefit many people, and we eagerly await the results of the anticipated clinical trials looking at whether this drug lives up to its early promise."

Credit: 
University of Cambridge

Beewolves use a gas to preserve food

image: A female beewolf carries a paralyzed honey bee to its nest.

Image: 
Gudrun Herzner

Food stored in warm and humid conditions gets moldy very quickly and thus becomes inedible or even toxic. To prevent this, we use refrigerators and freezers as well as various other methods of preservation. Animals do not have such technical appliances and therefore need to find other ways to preserve food. The European beewolf Philanthus triangulum, a solitary wasp species whose females hunt honey bees, has evolved a successful method of food preservation. A female takes up to five honey bees into its brood cells, where they serve as food for a young beewolf. Female beewolves prefer to build their nests in sunlit and sandy places. The nests are deep, and therefore the brood cells are warm and humid. Such conditions are favorable for the development of the beewolf larvae; however, they also foster the growth of mold fungi. In fact, bees stored under such conditions in the lab were overgrown by mold within one to three days. Surprisingly, the mold risk for bees was much lower in the nests of beewolves, so that most beewolf larvae were able to complete their eight- to ten-day development and spin a cocoon.

Researchers from the University of Regensburg and the Johannes Gutenberg University in Mainz (previously at the Max Planck Institute for Chemical Ecology in Jena) have discovered an amazing mechanism that beewolves have evolved to make sure that their larvae's food does not get moldy. "Shortly after oviposition, the brood cells of beewolves smell strikingly like 'swimming pool'. This smell comes from the egg itself," explains Prof. Dr. Erhard Strohm, the leader and main author of the study. Bioassays showed that beewolf eggs emit a gas that efficiently kills mold fungi. A chemical analysis revealed the surprising result that the gas is nitric oxide (NO). The eggs produce nitric oxide in large quantities and release it into the air, where it reacts with atmospheric oxygen to form nitrogen dioxide (NO2). The measured NO2 concentrations in the brood cells exceed both the occupational exposure limits for NO and NO2 and the EU maximum permissible values for cities.
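The oxidation step mentioned here is the familiar gas-phase reaction of nitric oxide with oxygen (standard chemistry, not a reaction scheme reported in the paper):

```latex
% Gas-phase oxidation of the emitted nitric oxide (standard chemistry, not from the paper)
\[
2\,\mathrm{NO} + \mathrm{O_2} \longrightarrow 2\,\mathrm{NO_2}
\]
```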

Both NO and NO2 are very reactive and have a strong oxidizing effect. It is therefore not surprising that high concentrations of the gases kill mold fungi. But how can beewolf eggs synthesize such amounts of NO? The scientists hypothesize that a crucial precondition for the evolution of this mechanism is the fact that NO plays an extremely important role in many biochemical processes in almost all organisms, from bacteria to mammals. At low doses, and owing to its high diffusibility and reactivity, NO functions as a signaling molecule and is, for example, involved in the adjustment of blood pressure and in developmental processes. Higher concentrations are used by many animals as an immune response to kill pathogens. Although beewolf eggs produce enormous amounts of NO, they use the same enzymes, NO synthases, that are also used by other organisms. The responsible NO synthase gene in beewolves does not have any special characteristics either. However, the researchers found a modification in the translation of the gene into the protein, which may be responsible for the unusually high synthesis rate of NO in beewolf eggs. "Due to so-called alternative splicing, the enzyme in the beewolf eggs lacks a segment which may be responsible for regulation. This may have led to the significant increase in enzyme activity," says Dr. Tobias Engl, the second main author of the publication.

The use of a reactive gas to control mold on food supplies has improved the survival of beewolf offspring considerably and represents a key evolutionary innovation. This novel defense mechanism against microorganisms is a fascinating example of how existing processes are modified in the course of evolution in such a way that completely new functions are generated. The discovery of NO as a key defense component against mold fungi broadens the spectrum of known natural antimicrobial strategies and adds a surprising and intriguing facet to our understanding of this biologically important molecule.

The most amazing aspect of the defense strategy of beewolf eggs is the fact that the eggs are obviously able to survive the extremely toxic conditions they produce themselves. Which mechanism the eggs deploy is the subject of current investigations. The results may not only be interesting for basic research, but also for possible applications in human medicine. A harmful overproduction of NO may also be the result of certain diseases or acute infections. The mechanisms that beewolf eggs use to protect themselves from NO may help to find new therapeutic approaches.

Credit: 
Max Planck Institute for Chemical Ecology

Citizen scientists re-tune Hubble's galaxy classification

image: Spiral structure in the Pinwheel Galaxy (Messier 101), as observed by the Hubble Space Telescope.

Image: 
NASA, ESA, CXC, SSC, and STScI

Hundreds of thousands of volunteers have helped to overturn almost a century of galaxy classification, in a new study using data from the longstanding Galaxy Zoo project. The new investigation, published in the journal Monthly Notices of the Royal Astronomical Society, uses classifications of over 6000 galaxies to reveal that "well known" correlations between different features are not found in this large and complete sample.

Almost 100 years ago, in 1927, astronomer Edwin Hubble wrote about the spiral galaxies he was observing at the time, and developed a model to classify galaxies by type and shape. Known as the "Hubble Tuning Fork" due to its shape, this model takes account of two main features: the size of the central region (known as the 'bulge'), and how tightly wound any spiral arms are.

Hubble's model soon became the authoritative method of classifying spiral galaxies, and is still used widely in astronomy textbooks to this day. His key observation was that galaxies with larger bulges tended to have more tightly wound spiral arms, lending vital support to the 'density wave' model of spiral arm formation.

Now though, in contradiction to Hubble's model, the new work finds no significant correlation between the sizes of the galaxy bulges and how tightly wound the spirals are. This suggests that most spirals are not static density waves after all.

Galaxy Zoo Project Scientist and first author of the new work, Professor Karen Masters from Haverford College in the USA explains: "This non-detection was a big surprise, because this correlation is discussed in basically all astronomy textbooks - it forms the basis of the spiral sequence described by Hubble."

Hubble was limited by the technology of the time, and could only observe the brightest nearby galaxies. The new work is based on a sample 15 times larger from the Galaxy Zoo project, where members of the public assess images of galaxies taken by telescopes around the world, identifying key features to help scientists to follow up and analyse in more detail.

"We always thought that the bulge size and winding of the spiral arms were connected", says Masters. "The new results suggest otherwise, and that has a big impact on our understanding of how galaxies develop their structure."

There are several proposed mechanisms for how spiral arms form in galaxies. One of the most popular is the density wave model - the idea that the arms are not fixed structures, but caused by ripples in the density of material in the disc of the galaxy. Stars move in and out of these ripples as they pass around the galaxy.

New models, however, suggest that at least some arms could be real structures, not just ripples. These may consist of collections of stars that are bound by each other's gravity, and physically rotate together. This dynamic explanation for spiral arm formation is supported by state-of-the-art computer models of spiral galaxies.

"It's clear that there is still lots of work to do to understand these objects, and it's great to have new eyes involved in the process", adds Brooke Simmons, Deputy Project Scientist for the Galaxy Zoo project.

"These results demonstrate that, over 170 years after spiral structure was first observed in external galaxies, we still don't fully understand what causes these beautiful features."

Credit: 
Royal Astronomical Society

Tracking major sources of energy loss in compact fusion facilities

image: Physicist Walter Guttenfelder.

Image: 
Elle Starkman/PPPL Office of Communications

A key obstacle to controlling on Earth the fusion that powers the sun and stars is leakage of energy and particles from plasma, the hot, charged state of matter composed of free electrons and atomic nuclei that fuels fusion reactions. At the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL), physicists have been focusing on validating computer simulations that forecast energy losses caused by turbulent transport during fusion experiments.

Researchers used codes developed at General Atomics (GA) in San Diego to compare theoretical predictions of electron and ion turbulent transport with findings of the first campaign of the laboratory's compact -- or "low-aspect ratio" -- National Spherical Torus Experiment-Upgrade (NSTX-U). GA, which operates the DIII-D National Fusion Facility for the DOE, has developed codes well-suited for this purpose.

Low-aspect ratio tokamaks are shaped like cored apples, unlike the more widely used conventional tokamaks that are shaped like doughnuts.

State-of-the-art codes

"We have state-of-the-art codes based on sophisticated theory to predict transport," said physicist Walter Guttenfelder, lead author of a Nuclear Fusion paper that reports the findings of a team of researchers. "We must now validate these codes over a broad range of conditions to be confident that we can use the predictions to optimize present and future experiments."

Analysis of the transport observed in NSTX-U experiments found that a major factor behind the losses was turbulence that caused the transport of electrons to be "anomalous," meaning that they spread rapidly, similar to the way that milk mixes with coffee when stirred by a spoon. The GA codes predict the cause of these losses to be a complex mix of three different types of turbulence.

The observed findings opened a new chapter in the development of predictions of transport in low-aspect ratio tokamaks -- a type of fusion facility that could serve as a model for next-generation fusion reactors that combine light elements in the form of plasma to produce energy. Scientists around the world are seeking to replicate fusion on Earth for a virtually inexhaustible supply of power to generate electricity.

Researchers at PPPL now aim to identify the mechanisms behind the anomalous electron transport in a compact tokamak. Simulations predict that such energy loss stems from the presence of three distinct types of complex turbulence -- two types with relatively long wavelengths and a third with wavelengths a fraction of the size of the larger two.

The impact of one of the two long-wave types, which is typically found in the core of low-aspect ratio tokamaks as well as in the edge of the plasma in conventional tokamaks, must be fully taken into account when predicting low-aspect ratio transport.

Challenge to simulate

However, the combined impact of all three types of turbulence is a challenge to simulate since scientists normally study the different wavelengths separately. Physicists at the Massachusetts Institute of Technology (MIT) have recently performed multi-scale simulations and their work highlights the significant supercomputer time such simulations require.

Researchers must now test additional simulations to achieve more complete agreement between predictions of transport and experiments on plasmas in low-aspect ratio tokamaks. Included in these comparisons will be measurements of turbulence taken by University of Wisconsin-Madison coauthors of the Nuclear Fusion paper that will better constrain predictions. Improved agreement will provide assurance of energy-loss predictions for present and future facilities.

Credit: 
DOE/Princeton Plasma Physics Laboratory

How can governments fight antimicrobial resistance with policy?

image: Governments have a wide variety of policy options at their disposal to respond to the growing threat of antimicrobial resistance, but many of these approaches have not been rigorously evaluated.

Image: 
NIAID, Flickr

Governments have a wide variety of policy options at their disposal to respond to the growing threat of antimicrobial resistance, but many of these approaches have not been rigorously evaluated, according to a new study published this week in PLOS Medicine by Susan Rogers Van Katwyk of the University of Ottawa, Canada, and colleagues.

Antimicrobial resistance (AMR) is on the rise due to persistent misuse and overuse of antimicrobials, and has already rendered some infections untreatable with existing drugs. In the new work, researchers systematically searched seven global databases for studies that clearly described and evaluated a government policy intervention aimed at reducing human antimicrobial misuse. They identified 69 published evaluations of such interventions carried out around the world.

Described in these 69 studies were 17 different types of policies that governments have deployed and tested to reduce antimicrobial use, including public awareness campaigns, antimicrobial guidelines, vaccination, and tailored regulations for prescribing and reimbursement. Unfortunately, most existing policy options have not been rigorously evaluated, which limits their usefulness in planning future policy interventions. Of the studies, only 4 had a randomized controlled design, the gold standard for medical interventions, while 35 used rigorous quasi-experimental designs and the remaining 30 were uncontrolled and descriptive. The current systematic review was unable to directly investigate the impact of the different interventions on AMR, but reductions in antimicrobial use are likely to lead to lower levels of resistance over time.

"To avoid future waste of public resources, and in line with WHO recommendations for national action on AMR, governments should ensure that AMR policy interventions are evaluated using rigorous study designs and that study results are published," the authors say.

Credit: 
PLOS

Catalog of north Texas earthquakes confirms continuing effects of wastewater disposal

A comprehensive catalog of earthquake sequences in Texas's Fort Worth Basin, from 2008 to 2018, provides a closer look at how wastewater disposal from oil and gas exploration has changed the seismic landscape in the basin.

In their report published in the Bulletin of the Seismological Society of America, Louis Quinones and Heather DeShon of Southern Methodist University and colleagues confirm that seismicity rates in the basin have decreased since 2014, a trend that appears to correspond with a decrease in wastewater injection.

However, their analysis also notes that new faults have become active during this period, and that seismicity continues at a greater distance from injection wells over time, suggesting that "far-field" changes in seismic stress will be important for understanding the basin's future earthquake hazard potential.

"One thing we have come to appreciate is how broadly injection in the basin has modified stress within entire basin," said DeShon.

The first thing researchers noted with wastewater injection into the basin "was the reactivation of individual faults," she added, "and what we're now starting to see is essentially the leftover energy on all sorts of little faults being released by the cumulative volume that's been put into the basin."

The earthquake catalog published in BSSA reports all seismicity recorded by networks operated by SMU between 2008 and 2018. Some seismic sequences in the catalog--such as the 2008 Dallas Fort Worth Airport earthquakes--are well-known and well-studied, while others such as the 2018 west Cleburne sequence are reported in the paper for the first time.

DeShon said publishing the complete catalog was important in part to help people recognize that "there are earthquakes throughout the basin, not just on these three or four sequences that have garnered a lot of press attention."

The researchers found that overall seismicity in the Fort Worth Basin has been strongly correlated in time and space with wastewater injection activities, with most seismicity occurring within 15 kilometers of disposal wells.

Wastewater disposal volume began to decrease from its peak in 2014, mostly as a result of lower oil and gas prices, and the study shows "tapering off of seismicity along the faults that were near high-injection wells," said Quinones.

There are exceptions to this pattern, including the 2015 Irving-Dallas and 2017 Lake Lewisville sequences that have no wells within 15 kilometers.

Induced earthquakes occur when wastewater injected back into the ground increases the pore pressure within the rocks and affects stress along faults in surrounding rock layers. In the Fort Worth Basin, these stress changes may propagate far--more than 10 kilometers--from the injection wells, the researchers suggest.
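The mechanism described here is usually expressed as a Coulomb failure criterion, a textbook relation rather than a result of this study: the pore pressure p counteracts the normal stress clamping a fault, so raising p brings the fault closer to slipping.

```latex
% Coulomb failure stress with pore pressure p (textbook form, not a result of this study)
\[
\mathrm{CFS} = \tau - \mu\,\left(\sigma_{n} - p\right)
\]
```

Here τ is the shear stress resolved on the fault, σ_n the normal stress and μ the friction coefficient; injection raises p, which pushes CFS toward the failure threshold and can bring already-stressed faults to slip even where the transmitted pressure change is modest.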

"Injection rates peaked in 2014, but we still don't understand how spatially extensive the modification of pore pressure is at depth, so we still don't understand how the hazard is going to reduce with time," said DeShon.

There are still far fewer induced earthquakes in the Fort Worth Basin compared to regions such as Oklahoma, which also has experienced a dramatic increase in seismicity in the past decade as the result of wastewater disposal from oil and gas production. The volumes of injected wastewater are much higher in Oklahoma, and the faults there tend to be much closer together, DeShon said.

By contrast, Quinones said, faults in the Fort Worth Basin are more widely spaced, and there are few instances of earthquakes jumping between faults.

However, the dense population of the Dallas-Fort Worth metropolitan area makes it critical to continue monitoring the region's induced earthquake risk, comparing seismic data with more information on wastewater injection.

For the moment, DeShon said, researchers only have access to monthly cumulative volume and average pressure at injection wellheads, in a report that is updated once a year. "It would be best if injection data were provided in a more timely fashion in Texas, and if more detailed daily information on injection rates and volumes and some measurements of downhole pressure were provided," she said.

Credit: 
Seismological Society of America

The new technology will significantly enhance energy harvest from PV modules

image: Optiverter, photo by Dmitri Vinnikov.

Image: 
Dmitri Vinnikov

The whole world is inevitably moving towards a more sustainable lifestyle. Environmental sustainability requires changes in the current way of life and the introduction of new, more sustainable solutions in our everyday consumption.

The TalTech Power Electronics Research Group led by Researcher-Professor Dmitri Vinnikov has been working on improving the efficiency of alternative energy generation units for over a decade. "In the early years of alternative energy deployment, it was outrageously expensive for an ordinary consumer, but the developments of recent years, the triumph of materials technology and the efforts of power electronics engineers have made the price much more affordable for consumers," Professor Vinnikov says.

The research group led by Dmitri Vinnikov focuses on solar photovoltaic energy production. Under ideal conditions, any photovoltaic system (solar power plant) would supply consumers with electricity without any problems. Unfortunately, at our latitude there are no ideal operating conditions for PV modules. The environmental and natural factors that affect the performance of such systems most are (apart from Nordic sunlight) the deposition of dirt (snow, soil, leaves) on PV module surfaces and the long shadows created by a low sun angle.

"In order to convert energy, which comes from the renewable energy sources, into electricity for the consumers, a grid converter must be used, which transforms the output of the renewable energy source into a current suitable for home appliances. In addition to a converter, a special devices called power optimizer must be used to maximize energy harvest so that it would not be influenced by weather and would provide maximum benefit for the consumer," Dmitri Vinnikov explains. The power electronics researchers of Tallinn University of Technology have taken a step further to solve this problem - they have developed a hybrid technology Optiverter? that combines the key advantages of photovoltaic power optimizers and grid converters. It is a novel power semiconductor converter technology used in the power systems of small and medium-sized PV installations, and possibly for building integrated PV.

The first prototype of the Optiverter was created back in 2016. After three years of comprehensive R&D activities, the research group is planning, in cooperation with the Estonian company Ubik Solutions OÜ, to start mass production of Optiverters, which are indispensable in residential solar PV systems, in the near future.

"Thanks to the patented multimode control, the input voltage range is up to three times wider compared to commercial competitors. Like the power optimizers of PV modules, the Optiverter? ensures maximum energy harvest even if a PV module is under heavy or opaque shade, which usually blocks energy production with conventional PV microinverters. This is an advantage that distinguishes it from the current technology available on the market," Professor Vinnikov says.

The scientists estimate the lifespan of the Optiverter to be approximately 25 years (the same as the lifespan of a top-grade solar panel). The Optiverter offers invaluable benefits compared with current, fossil-fuel-based energy production. In the long term, the Optiverter technology is not only environmentally friendly but also sustainable.

"It is obvious that renewable energy, be it wind energy, biofuel, natural gas or solar energy, is the future," Professor Vinnikov says. "Even if there weren't increasingly stringent EU requirements (applied to fuel prices of motor vehicles, but also for instance to construction work and energy production, etc.) to be complied with, life itself would force us to use more sustainable alternative technologies. All this makes the current, conventional energy production increasingly costly, while the PV systems are experiencing incredible cost reduction for the last five years."

Credit: 
Estonian Research Council

'Shield' of sea creature inspires materials that can handle their own impact

video: The mantis shrimp, one of the ocean's most ornery creatures, can take on attacks from its own species without getting injured. Its strategy could solve a big manufacturing problem: Creating lighter materials that absorb a lot of energy from a sharp impact within a limited amount of space.

Image: 
Purdue University video/Erin Easterling

WEST LAFAYETTE, Ind. -- The mantis shrimp, one of the ocean's most ornery creatures, can take on attacks from its own species without getting injured. Its strategy could solve a big manufacturing problem: Creating lighter materials that absorb a lot of energy from a sharp impact within a limited amount of space.

Think precious cargo. What if there were a material that could prevent car ceilings from caving in on passengers during an accident, or fragile objects from breaking when transported over long distances?

The mantis shrimp's secret is its tail appendage, called a telson. Engineers have now discovered what allows the telson to absorb the blows of its feisty self, with the goal of applying these lessons to protective gear.

The work, published in the journal Advanced Functional Materials, was performed by a team of researchers including David Kisailus' lab at the University of California, Riverside and Pablo Zavattieri's lab at Purdue University.

A telson can be shaped either as a territorial shield for "smasher" species or as a burrowing shovel for "spearer" species that also stab prey. The researchers found out why the telson of the smasher is better than that of the spearer at protecting the mantis.

Their findings reveal that the smasher telson has curved ridges called carinae on the outside and a helicoidal structure shaped like a spiral staircase on the inside. UC Riverside ran tests on both the mantis shrimp itself and 3D-printed replicas of the telson, showing that the carinae both stiffen a smasher's shield and allow it to flex inward.

Together with the helicoidal structure, which prevents cracks from growing upon impact, the shield absorbs significant amounts of energy during a strike without falling apart.

Purdue researchers validated the role of carinae through computational models, simulating the attacks of one mantis against the telson of another. They even "invented" species with features between the smasher and spearer to evaluate which telson offered the best protection for the animal.

"We started with the telson of the spearer and gradually added features that start looking like the smasher," said Zavattieri, a professor of civil engineering at Purdue. A YouTube video is available at https://youtu.be/bXEfWqXyvfA.

"The smasher shield is clearly more ideal for preventing impact from reaching the rest of the body, which makes sense because the mantis has organs all the way to its tail," he said.

Zavattieri and Kisailus, a professor of chemical and environmental engineering and the Winston Chung Endowed Chair of Energy Innovation at UC Riverside, had previously observed the same helicoidal structure in the dactyl club appendage of the smasher mantis, which strikes a telson with the speed of a .22 caliber bullet.

"We realized that if these organisms were striking each other with such incredible forces, the telson must be architected in such a way to act like the perfect shield," Kisailus said. "Not only did the telson of the smasher contain the helicoid microstructure, but there were significantly more energy-absorbing helicoidal layers in the smashing type than the spearing type."

Zavattieri's group has already begun incorporating the crack propagation mechanisms of arthropod exoskeletons into 3D-printed cement paste, a key ingredient of the concrete and mortar used to build various elements of infrastructure. His lab plans to also try out advantageous structures from the mantis shrimp.

But there are still more clues to uncover about all that carinae and helicoidal structures have to offer, the researchers say, as well as how to manufacture them into new materials.

"The dactyl club is bulky, while the telson is very lightweight. How do we make protective layers, thin films and coatings for example, that are both stronger and lighter?" Zavattieri said.

Credit: 
Purdue University

Promising treatment option for Complex Regional Pain Syndrome

A study, published today in PNAS, has found a potential treatment for patients with Complex Regional Pain Syndrome (CRPS).

CRPS is a severe post-traumatic pain condition affecting one or more limbs and is associated with regional pain and sensory, bone and skin changes. The causes of CRPS, however, are yet to be fully understood.

Approximately 15 percent of patients with CRPS still have symptoms one year after onset that severely impact their quality of life. For these patients, prognosis is often poor and drug therapy for pain relief is rarely effective.

A team of international researchers, led by Dr Andreas Goebel from the University of Liverpool's Pain Research Institute, conducted a study to better understand the immunological causes for CRPS.

The researchers examined antibodies in the serum of these patients to ascertain the potential role of these proteins in causing the condition; they were particularly interested in assessing 'neuroinflammation' - antibody-induced raised levels of inflammatory mediators such as Interleukin 1 (IL-1) in either peripheral tissues or the brain.

IL-1 normally induces local and systemic responses aimed at eliminating microorganisms and repairing tissue damage. However, an increasing number of clinical conditions have been identified in which IL-1 production is considered inappropriate and IL-1 is part of the cause of the disease.

The researchers transferred the antibodies from patients with long-lasting CRPS to mice and found that these antibodies consistently caused a CRPS-like condition. An important element of 'transferred CRPS' was glial cell activation, a type of 'neuroinflammation', in pain-related parts of the mouse brains. The team then discovered that 'blocking' IL-1 with a clinically available drug, anakinra, helped to both prevent and reverse all of these changes in the animals.

Researchers from the University of Pécs (Hungary), University of Budapest (Hungary), University of Manchester, University of Sheffield and The Walton Centre National Health Service Foundation Trust in Liverpool were also involved in the study.

Dr Andreas Goebel, said: "Our results support previous clinical observations that patients with persistent CRPS should respond to immune treatments with a reduction of at least some of their disease features.

"This approach has attractive therapeutic potential and could also have a real impact on the treatment of other unexplained chronic pain conditions; we plant now to apply for funds funds to test the effect of this and similar drugs in patients with CRPS."

Credit: 
University of Liverpool