Tech

Ozone threat from climate change

image: Increasing global temperatures will impact air quality.

Image: 
Photo illustration by Jeffrey C. Chase

Increasing temperatures due to climate change will shift climatic conditions and worsen air quality by increasing the number of days with high concentrations of ozone, according to a new journal article on air quality throughout the Mid-Atlantic region from researchers at the University of Delaware's College of Earth, Ocean and Environment (CEOE).

Cristina Archer led a team from CEOE that compiled nearly 50 years' worth of data from Delaware Department of Natural Resources and Environmental Control (DNREC) air monitoring and climate models to analyze climatic trends. The team found that rising temperatures will increase the number of days per year when ozone levels in Earth's lower atmosphere become dangerous.

Archer said DNREC, which funded her study, is concerned with near-ground ozone levels for two main reasons: impacts on human health and compliance with federal and state regulations limiting high-ozone concentrations.

"Ozone has large negative impacts on health, especially affecting the cardiopulmonary and respiratory systems," Archer said. "It is especially bad if you already have a respiratory condition, asthma, for example, or an infection. In Delaware, we are barely in attainment or slightly in non-attainment (of ozone regulations). When we are not in attainment, the Environmental Protection Agency has to act. That is the relevance. That is why we need to know now there is a problem, so we can act on it."

The study, titled "Global Warming Will Aggravate Ozone Pollution in the U.S. Mid-Atlantic," was recently published in the Journal of Applied Meteorology and Climatology.

Archer is a professor in CEOE with a joint appointment between the Physical Ocean Science and Engineering (POSE) program of the School of Marine Science and Policy and the Department of Geography. Collaborators in the research and writing were Sara Rauscher, an associate professor in the Department of Geography, and Joseph Brodie, a former graduate student and postdoctoral researcher at CEOE who is currently director of atmospheric research at the Rutgers University Center for Ocean Observing Leadership.

Ozone in the upper atmosphere is beneficial, blocking harmful ultraviolet (UV) rays from the sun. However, ozone closer to the surface of the Earth -- the focus of the study -- can cause pulmonary complications. Near-ground ozone can lead to coughing, irritation of the throat and chest, exacerbation of asthma, inflammation of lung cells and aggravation of chronic lung diseases, and it even reduces the disease-fighting capabilities of the immune system. On days when ozone levels are high enough, prolonged exposure can even lead to permanent lung damage. Because of its hazardous nature, ozone is regulated as a pollutant by the EPA.

Near-ground ozone forms as a result of photochemical reactions between nitrogen oxides (NOx) and volatile organic compounds (VOCs). Intense UV rays from the sun are the catalyst for the reactions between NOx emissions and the VOCs. NOx emissions occur when cars or power plants burn fossil fuels such as coal and gasoline. VOCs are also man-made and derive from a variety of sources, including cars and gasoline-burning engines, paints, insecticides, cleaners, industrial solvents, and chemical manufacturing.

According to Archer, limiting ozone is difficult because it is a secondary pollutant.

"There are primary pollutants that are emitted and there are secondary pollutants that form in the air," said Archer. "Ozone is one of these [secondary pollutants]. You can't go to a smokestack and measure the ozone coming out. You'll get precursors or other compounds that form it but never ozone itself."

Most of the time, near-ground ozone is not an issue for Delaware. As outlined in Archer's paper, during the 1980s the average number of high-ozone days in Delaware was about 75, whereas by 2015 it was less than 20, decreasing by about two days every year due to stricter air quality regulations.
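The decline described above can be checked with simple arithmetic. The sketch below uses approximate endpoint values taken from the article's summary (not the study's actual yearly measurements) to recover the roughly two-day-per-year rate:

```python
# Rough consistency check of the high-ozone-day trend quoted above.
# Endpoint values are approximations from the article's summary,
# not the study's yearly data.
start_year, start_days = 1985, 75   # 1980s average number of high-ozone days
end_year, end_days = 2015, 20       # approximate count by 2015

slope = (end_days - start_days) / (end_year - start_year)
print(f"Average change: {slope:.2f} high-ozone days per year")
```

At about -1.8 days per year, this is consistent with the article's figure of "about two days every year."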

However, the team of researchers found that increasing temperatures due to climate change are threatening to reverse the decrease in near-ground ozone pollution and increase the number of days where surface ozone levels become dangerous.

Conditions that lead to high-ozone days are typical of hot summer days.

As global temperatures increase, summers will continue to get hotter, leading to more days with high ozone concentrations. Archer said high-ozone days could also occur during the fall and spring, since increasing global temperatures will make those seasons warmer on average. According to the Intergovernmental Panel on Climate Change, global temperatures have increased by one degree Celsius as of 2019 and will increase by another degree Celsius by the end of the 21st century. Archer added that high-ozone days themselves may become more intense due to increased ozone concentrations.

The increase in the number and intensity of high-ozone days is troubling because the adverse health effects impact anyone who spends ample time outdoors, including children and people who exercise outside. More people go outside more often during the summer, potentially increasing human exposure to dangerous levels of near-ground ozone.

In the article, Archer said that a "business as usual" approach will inevitably lead to a dangerous increase in high-ozone days. Archer said that the country needs stricter regulations if it is to limit the number of high-ozone days.

Credit: 
University of Delaware

Anonymizing personal data 'not enough to protect privacy,' shows new study

With the first large fines for breaching the EU General Data Protection Regulation (GDPR) upon us, and the UK government about to review GDPR guidelines, researchers have shown how even anonymised datasets can be traced back to individuals using machine learning.

The researchers say their paper, published today in Nature Communications, demonstrates that allowing data to be used - to train AI algorithms, for example - while preserving people's privacy, requires much more than simply adding noise, sampling datasets, and other de-identification techniques.

They have also published a demonstration tool that allows people to understand just how likely they are to be traced, even if the dataset they are in is anonymised and only a small fraction of it is shared.

They say their findings should be a wake-up call for policymakers on the need to tighten the rules for what constitutes truly anonymous data.

Companies and governments both routinely collect and use our personal data. Our data and the way it's used is protected under relevant laws like GDPR or the US's California Consumer Privacy Act (CCPA).

Data is 'sampled' and anonymised, which includes stripping the data of identifying characteristics like names and email addresses, so that individuals cannot, in theory, be identified. After this process, the data's no longer subject to data protection regulations, so it can be freely used and sold to third parties like advertising companies and data brokers.

The new research shows that once bought, the data can often be reverse engineered using machine learning to re-identify individuals, despite the anonymisation techniques.

This could expose sensitive information about personally identified individuals, and allow buyers to build increasingly comprehensive personal profiles of individuals.

The research demonstrates for the first time how easily and accurately this can be done - even with incomplete datasets.

In the research, 99.98 per cent of Americans were correctly re-identified in any available 'anonymised' dataset by using just 15 characteristics, including age, gender, and marital status.

First author Dr Luc Rocher of UCLouvain said: "While there might be a lot of people who are in their thirties, male, and living in New York City, far fewer of them were also born on 5 January, are driving a red sports car, and live with two kids (both girls) and one dog."

To demonstrate this, the researchers developed a machine learning model to evaluate the likelihood that a given set of individual characteristics is precise enough to describe only one person in a population of billions.

They also developed an online tool, which doesn't save data and is for demonstration purposes only, to help people see which characteristics make them unique in datasets.

The tool first asks you to enter the first part of your postcode (UK) or ZIP code (US), your gender, and your date of birth, before giving a probability that your profile could be re-identified in any anonymised dataset.

It then asks for your marital status, number of vehicles, home ownership status, and employment status, before recalculating. With each added characteristic, the likelihood that a match is correct increases dramatically.
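The intuition can be illustrated with a toy model. The attribute frequencies below are hypothetical, and the independence assumption is a strong simplification (the actual study fitted a copula-based statistical model to real data), but the multiplicative effect of adding characteristics is the same:

```python
# Toy uniqueness model: probability that no one else in a population of N
# people shares a given combination of attribute values. Assumes independent
# attributes with hypothetical frequencies -- an illustration only, not the
# paper's method.
N = 330_000_000  # rough US population

# Hypothetical population frequencies of each attribute value
freqs = {
    "male": 0.49,
    "age 30-39": 0.13,
    "lives in NYC": 0.026,
    "born Jan 5": 1 / 365,
    "drives red sports car": 0.002,
}

p_match = 1.0
for attr, f in freqs.items():
    p_match *= f  # probability a random person matches all attributes so far
    # P(no one else in the population matches) = (1 - p_match)^(N - 1)
    p_unique = (1.0 - p_match) ** (N - 1)
    print(f"+ {attr:<22} P(unique) = {p_unique:.4f}")
```

With only a few broad attributes the uniqueness probability is essentially zero; each rarer characteristic added multiplies it up sharply, which is exactly the behaviour the demonstration tool exposes.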

Senior author Dr Yves-Alexandre de Montjoye, of Imperial's Department of Computing, and Data Science Institute, said: "This is pretty standard information for companies to ask for. Although they are bound by GDPR guidelines, they're free to sell the data to anyone once it's anonymised. Our research shows just how easily - and how accurately - individuals can be traced once this happens."

He added: "Companies and governments have downplayed the risk of re-identification by arguing that the datasets they sell are always incomplete.

"Our findings contradict this and demonstrate that an attacker could easily and accurately estimate the likelihood that the record they found belongs to the person they are looking for."

Re-identifying anonymised data is how journalists exposed Donald Trump's 1985-94 tax returns in May 2019.

Co-author Dr Julien Hendrickx from UCLouvain said: "We're often assured that anonymisation will keep our personal information safe. Our paper shows that de-identification is nowhere near enough to protect the privacy of people's data."

The researchers say policymakers must do more to protect individuals from such attacks, which could have serious ramifications for careers as well as personal and financial lives.

Dr Hendrickx added: "It is essential for anonymisation standards to be robust and account for new threats like the one demonstrated in this paper."

Dr de Montjoye said: "The goal of anonymisation is so we can use data to benefit society. This is extremely important but should not and does not have to happen at the expense of people's privacy."

Credit: 
Imperial College London

MicroRNAs from human fat cells can impair macrophage ability to eliminate cholesterol

IMAGE: Effect on THP-1 cells of adipocyte-derived extracellular vesicles (EVs) from lean and obese subjects on cholesterol efflux gene expression and cholesterol efflux to media. THP-1 cells were incubated with obese...

Image: 
"Cholesterol efflux alterations in adolescent obesity: role of adipose-derived extracellular vesical microRNAs, " published online July 23, 2019 in Journal of Translational Medicine. Matthew D. Barberio, Lora J. Kasselman, Martin P....

WASHINGTON-(July 23, 2019)-A multi-institutional team led by research faculty at Children's National in Washington, D.C., finds that extracellular vesicles (EVs) derived from kids' fat can play a pivotal role in ratcheting up risk for atherosclerotic cardiovascular disease well before any worrisome symptoms become visible. What's more, EVs found in the body's fat stores can disrupt disposal of cholesterol in kids ranging from lean to obese, the team reports online July 22, 2019, in the Journal of Translational Medicine.

In atherosclerosis, blood vessels that carry oxygen-rich blood throughout the body become inflamed, and macrophages settle in the vessel wall and become overloaded with cholesterol. A plaque forms that restricts blood flow. But it remains a mystery how fat cells residing in one place in the body can trigger mayhem in cells and tissues located far away. EVs seemed likely troublemakers since they enable intercellular communication.

"We found that seven specific small sequences of RNA (microRNA) carried within the extracellular vesicles from human fat tissue impaired the ability of white blood cells called macrophages to eliminate cholesterol," says Robert J. Freishtat, M.D., MPH, senior scientist at the Center for Genetic Medicine Research at Children's National and the study's senior author. "Fat isn't just tissue. It can be thought of as a metabolic organ capable of communicating with types of cells that predispose someone to develop atherosclerotic cardiovascular disease, the leading cause of death around the world."

Research scientists and clinicians from Children's National, the George Washington University, NYU Winthrop Hospital and the National Heart, Lung and Blood Institute collaborated to examine the relationship between the content of EVs and their effect on macrophage behavior. Their collaborative effort builds on previous research that found microRNA derived from fat cells becomes pathologically altered by obesity, a phenomenon reversed by weight-loss surgery.

Because heart disease can have its roots in adolescence, they enrolled 93 youths aged 12 to 19 with a range of body mass indices (BMIs): a "lean" group of 15 whose BMI was lower than 22 and an "obese" group of 78 whose BMI was in the 99th percentile for their age. The median age was 17, and 71 were young women. The researchers collected visceral adipose tissue during abdominal surgeries and visited each other's respective labs to perform the experiments.

"We were surprised to find that EVs could hobble the macrophage cholesterol outflow system in adolescents of any weight," says Matthew D. Barberio, Ph.D., the study's lead author, a former Children's National scientist who now is an assistant professor at the George Washington University's Milken Institute School of Public Health. "It's still an open question whether young people who are healthy can tolerate obesity--or whether there are specific differences in fat tissue composition that up kids' risk for heart disease."

The team plans to build on the current findings to safeguard kids and adults against future cardiovascular risk.

"This study was a huge multi-disciplinary undertaking," adds Allison B. Reiss, M.D., of NYU Winthrop Hospital and the study's corresponding author. "Ultimately, we hope to learn which properties belonging to adipose tissue EVs make them friendly or unfriendly to the heart, and we hope that gaining that knowledge will help us decrease morbidity and mortality from heart disease across the lifespan."

Credit: 
Children's National Health System

Airborne lidar system poised to improve accuracy of climate change models

WASHINGTON -- Researchers have developed a laser-based system that can be used for airborne measurement of important atmospheric gases with unprecedented accuracy and resolution. The ability to collect this data will help scientists better understand how these atmospheric gases affect the climate and could help improve climate change predictions.

In the Optical Society journal Applied Optics, researchers from Deutsches Zentrum für Luft- und Raumfahrt e.V. (DLR) -- Germany's national center for aerospace, energy and transportation research -- describe how their lidar instrument was used aboard an aircraft to acquire the first simultaneous measurements of the vertical structure of water vapor and ozone in the tropopause region of the atmosphere. The researchers say that the new system might even be useful for monitoring atmospheric gases from space.

The tropopause separates the surface-based troposphere layer where weather takes place from the overlying stratosphere that contains the ozone layer that protects life on Earth from harmful radiation. Scientists want to study water vapor and ozone in the tropopause because the distribution of these atmospheric gases in this layer plays a crucial role in the Earth's climate.

"The ability to detect the vertical structure of water vapor and ozone is critical for understanding the exchange of these atmospheric gases between the troposphere and the stratosphere," said Andreas Fix, who led the research team. "These measurements could help us identify errors and uncertainties in climate models that would help improve predictions of the future climate, which is one of the central challenges for our society and economy."

Gaining a 3D perspective

Atmospheric gases can be assessed with instruments flown into the atmosphere or with data acquired from satellites. However, these methods haven't been able to provide a full picture of atmospheric gas distribution because they either lack the vertical component or don't provide high enough resolution. Although instruments carried with balloons -- known as balloon sondes -- can provide highly resolved vertical profiles, they don't offer detailed temporal resolution and can only be used at selected sites.

To solve these problems, the researchers developed a lidar system that uses laser light to measure both ozone and water vapor at the same time. Their approach, called differential absorption lidar (DIAL), uses a pair of slightly different UV wavelengths to measure each gas. The UV radiation at one wavelength is strongly absorbed by the gas molecules, while the other passes through largely unabsorbed and serves as a reference. Measuring the ratio of the UV signals returning from the atmosphere allows calculation of a detailed gas profile.
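The quantitative step behind that ratio measurement is the standard DIAL retrieval: the gas number density in a range bin follows from the log ratio of the absorbed and reference returns across the bin. Below is a minimal sketch with made-up signal values and an assumed absorption cross-section difference; real retrievals also handle background subtraction, overlap corrections and statistical error:

```python
import math

# Simplified DIAL retrieval: gas number density from backscatter signals
# at two wavelengths. All numeric values here are illustrative assumptions,
# not figures from the DLR instrument.
delta_sigma = 2.0e-22  # m^2, differential absorption cross-section (assumed)
dz = 250.0             # m, range-bin size (matches the paper's vertical resolution)

# Hypothetical returned powers at two adjacent range bins
P_on = [1.00, 0.80]    # strongly absorbed ("online") wavelength
P_off = [1.00, 0.95]   # weakly absorbed ("offline") reference wavelength

# n = ln( (P_on[z] * P_off[z+dz]) / (P_on[z+dz] * P_off[z]) ) / (2 * dsigma * dz)
n = math.log((P_on[0] * P_off[1]) / (P_on[1] * P_off[0])) / (2 * delta_sigma * dz)
print(f"Retrieved number density: {n:.3e} molecules per m^3")
```

Because the two wavelengths are close together, scattering and instrument effects largely cancel in the ratio, leaving the differential absorption by the target gas as the dominant signal.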

The gas profiles created using the new lidar system exhibit a vertical resolution of around 250 meters and a horizontal resolution of about 10 kilometers below the aircraft's flight track.

"This vertical capability is a significant advancement in studying exchange processes at the tropopause," said Fix. "It helps overcome significant shortcomings in resolving the fine-scale distribution that have made it difficult to understand processes responsible for exchange at the tropopause."

Achieving energy efficiency

To perform this method aboard a plane, the researchers used a highly efficient optical parametric oscillator (OPO) they previously developed to convert the laser output to the UV wavelengths needed to measure water vapor and ozone. "The conversion needs to be very energy efficient to generate UV radiation with adequate pulse energies and high average power from the limited energy available on board an aircraft," explained Fix.

Tests of the new lidar system showed that its accuracy matched well with that of balloon sondes. In 2017, the researchers flew the new system aboard the wave-driven isentropic exchange (WISE) mission, which involved multiple long-range flights over the North Atlantic and Northern Europe. They found that the instrument worked remarkably well, remained stable during use and could measure characteristic ozone and water vapor distributions at the tropopause.

The researchers plan to analyze the new vertical-component data acquired during WISE and integrate it into climate models. They expect to use the instrument to collect atmospheric gas data aboard future flights.

Credit: 
Optica

New mechanism moving droplets at record-high speed and long distance without extra power

video: A novel mechanism to transport droplets at record-high velocity and distance without extra energy input.

Image: 
City University of Hong Kong

Transporting droplets on solid surfaces at high speed and over long distances, even against gravity, without additional force has been a formidable task. But a research team comprising scientists from City University of Hong Kong (CityU) and three other universities and research institutes has recently devised a novel mechanism to transport droplets at record-high velocity and distance without extra energy input; the droplets can even be moved upward along a vertical surface, which has never been achieved before. The new strategy for controlling droplet motion can open up new potential applications in microfluidic devices, bio-analytical devices and beyond.

The conventional methods for transporting droplets include leveraging the wetting gradient on the surface to induce a driving force and move the droplet from hydrophobic to hydrophilic surface. However, the fundamental trade-off underpinning droplet hydrodynamics imposes limitations: transporting droplets at high speed necessitates a large wetting gradient and in turn is limited to a short distance, while long transport distance demands a small wetting gradient to reduce the adhesive force between the liquid and solid surface, and the transport velocity is then constrained.

To overcome these challenges, Professor Wang Zuankai of the Department of Mechanical Engineering at CityU cooperated with Professor Xu Deng from the University of Electronic Science and Technology of China (UESTC) and Professor Hans-Jürgen Butt from the Max Planck Institute for Polymer Research (MPI-P) in Germany, as well as researchers from the University of Science and Technology of China (USTC). They have devised a new strategy that achieves unidirectional, self-propelled liquid droplet transport on diverse substrates. Their work demonstrates unprecedented performance: the highest transport velocity (1.1 m/s), ten times higher than any previously reported in the literature, and the longest transport distance, in principle unlimited.

Manipulation of surface charge density

The key to this breakthrough lies in the manipulation of surface charge via liquid contact, which was realized for the first time. The research team first dropped a chain of water droplets on the specially-designed superamphiphobic (both super water- and oil-repellent) surface that they had developed before. Upon impact on the surface, the droplets immediately spread, retracted and rebounded from the surface. This resulted in the separation of electrons from the droplets, and the impacted surface became negatively charged.

By adjusting the height from which the droplets fell on the surface, the surface charge density on the surface changed gradually, forming a gradient.

When a droplet was subsequently placed on that surface, the surface charge density gradient acted as a driving force: the droplet self-propelled and moved toward the direction of higher charge density.

Unlike the chemical or morphological gradients which are difficult to change once they are created, the charge density gradient can be easily changed, enabling the reprogramming of droplet motion paths. The research demonstrates that high velocity and ultra-long transport of droplets can be stimulated at room temperature and does not require extra energy.

Such droplet transport manifests not only on flat surfaces, but also on flexible and vertically placed ones. In addition, various liquids can be transported, including those with low surface tension or low dielectric constant, as well as blood and salt solutions.

Application potential in microfluidic devices

"We envision that our innovation in using surface charge density gradient to program droplet transport, which was not explored before, will open up a new research direction and potential in applications. For example, in bio-medicine, the design of surfaces with preferential charge density gradient may influence cell migration and other behaviours," said Professor Wang. Professor Deng also said that this strategy could be applied in microfluidic lab-on-a-chip devices and bio-analytical devices, as well as in the fields of materials science, fluid dynamics and beyond.

Credit: 
City University of Hong Kong

A torque on conventional magnetic wisdom

image: When a charge current is applied parallel with the magnetization, spin-orbit interaction generates a flow of transversely polarized spin current that gives rise to anomalous spin-orbit torque (ASOT), tilting the magnetization out of plane on the left and right surfaces. This is detected via a change in laser polarization upon reflection.

Image: 
Jose Vazquez, ITG, Beckman Institute, University of Illinois at Urbana-Champaign

Physicists at the University of Illinois at Urbana-Champaign have observed a magnetic phenomenon called the "anomalous spin-orbit torque" (ASOT) for the first time. Professor Virginia Lorenz and graduate student Wenrui Wang, now graduated and employed as an industry scientist, made this observation, demonstrating that there exists competition between what is known as spin-orbit coupling and the alignment of an electron spin to the magnetization. This can be thought of as analogous to the anomalous Hall effect (AHE).

For a long time now, physicists have known about interesting phenomena such as the AHE, in which spins of a certain species accumulate on a film edge. Their accumulations are detectable with electric measurements. This type of experiment requires the magnetization of the film to point perpendicular to the plane of the film. In fact, the Hall effect and similar experiments such as the AHE have always used an applied magnetic field (for non-magnetic samples) or the magnetization of the film (for magnetic samples) oriented perpendicular to the plane of the film.

Effects like the AHE had not been found for magnetizations that point in-plane, until now.

By taking advantage of the magneto-optic Kerr effect (MOKE), which can probe the magnetization near the surface of a magnetic sample, Wang and Lorenz demonstrated that an electrical current modifies the magnetization near the surface of a ferromagnetic sample to point in a direction different from the magnetization of the interior of the sample. It is not necessarily strange that the magnetization near the surface can differ from that in the interior, as evidenced by previous experiments in spin-orbit torque. However, the Illinois researchers used a purely ferromagnetic film, whereas past experiments in spin-orbit torque combined ferromagnets with metals that have a property called "spin-orbit coupling."

This discovery has implications for energy-efficient magnetic-memory technology.

The team's findings are published in the July 22, 2019 issue of the journal Nature Nanotechnology.

Magnetism & conventional spin-orbit torque

Magnetism is ubiquitous--we use it every day, for example, to stick papers to a refrigerator door or to ensure that our phone chargers do not detach prematurely.

Microscopically, magnetism arises from a collection of electrons, which all have a property known as spin. Spin is one source of angular momentum for electrons and its "movement" can be likened to how toy tops spin--though in actuality, in quantum mechanics, the motion of spin does not resemble anything in classical mechanics. For electrons, spin comes in two species, formally called up spin and down spin. Depending on how the spins collectively point, a material might be ferromagnetic, having neighboring electron spins all pointing in the same direction, or antiferromagnetic, having neighboring electron spins pointing in opposite directions. These are just two of several types of magnetism.

But what happens when magnetism is combined with other phenomena such as spin-orbit coupling?

Lorenz notes, "There is an entire family of effects that are generated from simply running an electric current through a sample and having the spins separate. The anomalous Hall effect occurs in thin ferromagnetic films and is seen as the accumulation of spins on the edges of the sample. If the magnetization points out of the plane of the film--that is, perpendicular to the plane of the sample surface--and a current flows perpendicular to the magnetization, then accumulations of spins can be seen. But this happens only if the ferromagnetic film also has spin-orbit coupling."

Spin-orbit coupling causes the spin species--up or down--to move strictly in certain directions. As a simplistic model, from the point of view of electrons moving through a film, they can scatter to the left or right if something interrupts their movement. Interestingly, the spins are sorted based on the direction that an electron moves. If the left-scattered electrons have spin up, then the right-scattered electrons must have spin down and vice versa.

Ultimately, this leads to up spins accumulating on one edge of the film and down spins accumulating on the opposite edge.

Conventional spin-orbit torque (SOT) has been found in bilayer structures of a ferromagnetic film adjacent to a metal with spin-orbit coupling.

Lorenz points out, "In the past, this has always happened with two layers. You need not just a ferromagnet, but also some source for the spins to separate to induce a change in the ferromagnet itself."

If a current flows through the spin-orbit coupled metal, the up and down spins separate like in the AHE. One of those spin species will accumulate at the interface where the ferromagnet and the metal meet. The presence of those spins affects the magnetization in the ferromagnet near the interface by tilting the spins there.

Lorenz continues, "It was always assumed--or at least not investigated heavily--that we need these metals with a strong spin-orbit coupling to even see a change in the ferromagnet."

The results of Wang and Lorenz's experiment now directly challenge this assumption.

Observation of an anomalous spin-orbit torque

Wang and Lorenz found that it was unnecessary to place a metal with spin-orbit coupling adjacent to the ferromagnetic film in order to generate a SOT and observe an out-of-plane magnetization.

Wang comments, "Our work reveals a long-overlooked spin-orbit phenomenon, the anomalous spin-orbit torque, or ASOT, in well-studied metallic ferromagnetic materials such as permalloy. The ASOT not only complements the physics picture of electrical current-induced spin-orbit effects such as the anomalous Hall effect, but also opens the possibility of more efficient control of magnetism in spin-based computer memories."

The researchers ran a current from one edge of the film to its opposite and additionally forced the magnetization of the film to point in the same direction.

The physics here is complicated by the fact that there are two phenomena that are competing--magnetization and spin-orbit coupling. Magnetization is working to align the spin with itself; the electron spins like a top, but over time it aligns with the magnetization and stops its precession. Without spin-orbit coupling, this would mean that the magnetization on all edges would point in the same direction. However, spin-orbit coupling is working to maintain the spin's direction with the movement of the electron. When spin-orbit coupling and magnetization compete, the outcome is a compromise: the spin is halfway between the two effects.

Professor David Cahill, who also collaborated on the experiments at the University of Illinois, explains: "Ultimately, spins that accumulate on the surface of the film end up pointing partially out of the surface plane and spins that accumulate on the oppositely facing surface point partially out of the surface plane in the opposite direction."

Unlike the AHE, the ASOT cannot be detected electrically, so Wang and Lorenz employed MOKE measurements, shooting lasers at two exposed surfaces to show that the magnetization pointed out of the plane of the surface.

Lorenz credits her collaborator, Professor Xin Fan of the University of Denver, with conceiving of this experiment.

Fan explains, "MOKE is an effect to describe the change in polarization as the light is reflected from the surface of a magnetic material. The polarization change is directly correlated to the magnetization and light has a small penetration depth into the sample, which makes it popular to use as a surface probe for magnetization."

But that's not all. The researchers noted that the exchange interaction can suppress the effects of ASOT, so they carefully chose a sample that was thick enough that the spins on the two sides of the sample could not force each other to point in the same direction.

Wang and Lorenz demonstrated that on the two surfaces of the film where spins accumulate, the same Kerr rotation is observed. Technically, the Kerr rotation refers to how the reflected light changes its polarization, which is directly correlated with how the magnetization is rotated out of the plane of the permalloy film. This is indisputable evidence of ASOT.

Additional confirmation of the research findings comes from theoretical work. The researchers ran simulations using their phenomenological model and found strong agreement with their data. Theorist collaborators have also used density functional theory--a type of modeling that looks microscopically at atoms rather than assuming the properties of objects--to show qualitative agreement with experiment.

Lorenz notes that Stanford University Adjunct Professor and Lawrence Lab Staff Scientist Hendrick Ohldag made seminal contributions to the conception of the experiment. Lorenz says the experiment also benefited from contributions of collaborators at the Illinois Materials Research Science and Engineering Center, the University of Denver, the University of Delaware, and the National Institute of Standards and Technology in Maryland and Colorado.

Lorenz emphasizes, "What we've shown now is that a ferromagnet can induce a change in its own magnetization. This could be a boon to the research and development of magnetic memory technology."

Fan adds, "While spin-orbit torque in ferromagnet/metal bilayers has been demonstrated to have great potential in future-generation magnetic memories, because of the electric control of magnetization, our result shows that the ferromagnet can generate very strong spin-orbit torque on itself. If we can properly harness the spin-orbit coupling of the ferromagnet itself, we may be able to build more energy-efficient magnetic memories."

Credit: 
University of Illinois Grainger College of Engineering

August's SLAS technology cover article announced

image: SLAS Technology August cover.

Image: 
David James Group

Oak Brook, IL - The August edition of SLAS Technology features the cover article, "Technologies for the Directed Evolution of Cell Therapies," a review originally published in the journal's March 2019 edition. The research, led by Dino Di Carlo, Ph.D. (University of California, Los Angeles), highlights how the next generation of therapies is moving beyond the use of small molecules and proteins to using whole cells.

In the review, Di Carlo highlights the importance of automation tools in the selection of cells with unique properties and therapeutically beneficial traits to help drive the rise of cell therapies in the clinic. "Directed evolution of proteins with unique functions and stability was recognized with the Nobel Prize in 2018, and many molecular therapeutics, such as therapeutic antibodies, rely on evolutionary processes for their development," says Di Carlo. "Cell therapies are rising as a third pillar of modern medicine along with molecular and gene therapies, with exciting opportunities given the ability of cells to sense and respond to the environment in both chemical and physical manners."

The article also discusses the current lack of fundamental knowledge needed to engineer complex traits within cells. Di Carlo argues that researchers should use directed-evolution processes of mutagenesis (epigenetic or genetic) and selection to evolve complex traits of importance for new cell therapies that can, for example, target and kill solid tumors more effectively. He identifies traits such as cell secretion, the ability to apply force, deformability, and adhesiveness as ones that would be important for enabling infiltration and killing of solid tumors.

Emerging tools such as image-activated cell sorters and cell secretion-based sorting using droplet microfluidics are featured as ways to potentially enable high-throughput automation technologies. Di Carlo indicates that with high-throughput selection tools and directed evolution processes applied to cells, researchers could accelerate the development of cellular products with extreme therapeutic benefit.

Credit: 
SLAS (Society for Laboratory Automation and Screening)

When you spot 1 driving hazard, you may be missing another

image: When people notice one traffic hazard, they are less likely to see a simultaneous second hazard, according to new research from NC State. The study used images like the one shown here. Viewers were less likely to notice the jogger on the left when the bicyclist was also in the image.

Image: 
Robert Sall, NC State University

When people notice one traffic hazard, they are less likely to see a simultaneous second hazard, according to new research from North Carolina State University. The finding has potential applications for both driver training and the development of automated, in-vehicle safety technologies.

"This is a phenomenon called a subsequent search miss (SSM), which was first described in the context of doctors evaluating medical images - their ability to spot a problem was hindered if they had already found another problem in the same image," says Jing Feng, corresponding author of a paper on the research and an associate professor of psychology at NC State. "We wanted to determine whether SSMs might impact driving safety. What we've found suggests that SSMs may play an important role."

To test this, researchers conducted three studies. Each study asked participants to evaluate 100 traffic images and identify any potential hazards that would prevent them from driving in a given direction. Each image contained between zero and two hazards. Some hazards were "high-salience" targets, meaning they were glaringly obvious - like a red sports car. Other hazards were low-salience targets, such as drably dressed pedestrians.

In the first study, researchers gave 20 participants approximately one second to identify any hazards. The participants detected 70% of low-salience targets when those were the only hazard in the scene, but only 30% of low-salience targets were identified when there were two hazards in the scene. In other words, low-salience hazards were 40 percentage points less likely to be identified when they appeared in the same scene as a high-salience hazard.

In the second study, researchers gave 29 participants up to five seconds to spot any hazards. In this study, participants did a better job of identifying both high-salience and low-salience targets - but low-salience targets were still 15% less likely to be identified in scenes where there were two hazards. In other words, while performance improved with extra time, SSMs were still present.
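The arithmetic behind those detection-rate gaps is worth making explicit, since an absolute drop in percentage points differs from a relative decline. Using the figures quoted in the text for the first study:

```python
def percentage_point_drop(single_hazard_rate, dual_hazard_rate):
    """Absolute drop in detection rate, in percentage points."""
    return single_hazard_rate - dual_hazard_rate

# Study 1 figures from the text: 70% detection of low-salience
# hazards alone vs. 30% when a second hazard was present.
drop = percentage_point_drop(70, 30)
relative_drop = drop / 70 * 100  # the same gap as a relative decline

print(drop)                  # 40 (percentage points)
print(round(relative_drop))  # 57 (percent, relative to the 70% baseline)
```

So a 40-point gap corresponds to roughly a 57% relative decline in detection of low-salience hazards.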

In the final study, researchers gave 30 participants up to five seconds to identify any hazards - but there was a twist. Scenes were introduced as having either a high risk or a low risk of containing multiple targets.

"Here, we found that participants spent more time evaluating traffic scenes after being told the scenes were high risk," says Robert Sall, first author of the paper and a Ph.D. student at NC State. "However, there was still a distinct pattern of performance that could be attributed to SSMs."

When told scenes were low-risk, low-salience targets were 18% less likely to be identified in two-hazard scenes. When given high-risk instructions, low-salience targets were 31% less likely to be identified in two-hazard scenes.

"This work gives us a much better understanding of why people miss certain hazards when driving," Sall says. "It could help us modify driver training to reduce accidents, and inform the development of in-vehicle technologies that focus on accident reduction."

"Our findings will also likely be useful for those whose work involves traffic accident diagnostics," Feng says. "It's now clear that SSMs have the potential to prevent drivers from noticing important pieces of visual information, which may contribute to lapses in driving performance. A great deal of work now needs to be done to determine the scope of the problem and what we can do about it."

Credit: 
North Carolina State University

Tourist photographs are a cheap and effective way to survey wildlife

image: Photograph of a spotted hyena.

Image: 
Megan Claase, Rafiq et al./ <em>Current Biology</em>, 2019

Tourists on safari can provide wildlife monitoring data comparable to traditional surveying methods, suggests research appearing July 22 in the journal Current Biology. The researchers analyzed 25,000 photographs from 26 tour groups to survey the population densities of five top predators (lions, leopards, cheetahs, spotted hyenas, and wild dogs) in northern Botswana; theirs is one of the first studies to use tourist photographic data for this purpose.

The idea came to lead author Kasim Rafiq (@Kasim21) after hours with his Land Rover grill-deep in an abandoned warthog burrow. Rafiq, then a Ph.D. candidate at Liverpool John Moores University, had been following the tracks of a one-eared leopard named Pavarotti that he had been seeking for months.

"Eventually I got out of the hole and spoke with the safari guides who I met on the road nearby, and who were laughing," says Rafiq, who is about to begin a Fulbright Fellowship to expand the project further at UC Santa Cruz. "They told me that they'd seen Pavarotti earlier that morning. At that point, I really began to appreciate the volume of information that the guides and tourists were collecting and how it was being lost."

Traditionally, animal population surveys in Africa are done using one of three methods: camera traps, track surveys, and call-in stations. Each has advantages and disadvantages. Camera traps, for example, are particularly useful to understand the variety and densities of species in an area, but they also have an immense up-front cost with no guaranteed lifespan. "For one of my other projects, I had an elephant knock down one of the camera traps, and then lion cubs ran away with the camera. When I collected it, it just had holes in it," Rafiq says.

To test whether tourist photographs could be used for wildlife surveying, the researchers provided participating tourists with small GPS trackers, originally designed for tracking pet cats. These allowed researchers to later tag the wildlife photographs with location data. The photographs were then filtered not only by the species identified, but also by the individual animal, for the top predators, and then analyzed using computer modeling to estimate densities.
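One practical step in that pipeline is matching each photograph to a location from the GPS tracker carried in the tourist's vehicle. The paper does not publish its processing code; a minimal sketch of one plausible approach--pairing each photo's timestamp with the nearest-in-time GPS fix--is shown below, with all names and data hypothetical:

```python
from bisect import bisect_left

def nearest_fix(gps_track, photo_time):
    """Return the GPS fix closest in time to a photo timestamp.
    gps_track: list of (unix_time, lat, lon) tuples, sorted by time."""
    times = [t for t, _, _ in gps_track]
    i = bisect_left(times, photo_time)
    # The nearest fix is either just before or just after photo_time.
    candidates = gps_track[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda fix: abs(fix[0] - photo_time))

# Hypothetical short track and a photo taken at t=125.
track = [(100, -19.01, 23.40), (160, -19.02, 23.41), (220, -19.03, 23.42)]
print(nearest_fix(track, 125))  # (100, -19.01, 23.4)
```

In practice one would also discard photos whose nearest fix is too far away in time.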

Rafiq and his team manually identified animals by their coloration patterns or, in the case of lions, by their whisker spots. The tourist photograph method was carried out alongside camera trap, track, and call-in station surveys to compare the wildlife density estimates obtained from each and the costs to get this information.
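The density estimates rest on capture-recapture-style models fit to repeated sightings of identified individuals. The study's actual models are more sophisticated, but the core idea can be illustrated with the classic two-survey Lincoln-Petersen estimator (Chapman's bias-corrected form), using entirely hypothetical numbers:

```python
def lincoln_petersen(n1, n2, recaptures):
    """Chapman's bias-corrected Lincoln-Petersen abundance estimate:
    n1 individuals identified in survey 1, n2 in survey 2,
    'recaptures' = individuals seen in both surveys."""
    return (n1 + 1) * (n2 + 1) / (recaptures + 1) - 1

# Hypothetical: 20 leopards identified in tourist photos, 15 in a
# camera-trap survey, with 10 individuals common to both.
print(lincoln_petersen(20, 15, 10))  # ~29.5 animals
```

The fewer individuals the two surveys share, the larger the inferred population--which is why reliable individual identification (coat patterns, whisker spots) matters so much.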

"The results suggest that for certain species and within areas with wildlife tourism, tourist-contributed data can accomplish a similar goal as traditional surveying approaches but at a much lower cost, relative to some of these other methods," says Rafiq.

For example, the tourist-photograph method was the only approach to identify cheetahs in the study area, and it provided density estimates for many of the other carnivore species that were largely comparable to those from the other methods. Most of the method's costs stemmed from the manual processing of images--a task that could in the future be outsourced to artificial intelligence to reduce survey costs further.

"If we could combine advances in artificial intelligence and automated image classification with a coordinated effort to collect images, perhaps by partnering with tour operators, we would have a real opportunity for continuous, rapid assessment of wildlife populations in high-value tourism areas," he says.

This method of surveying animal populations is most applicable when studying the charismatic megafauna that tourists are usually interested in, and in areas with established tourism programs.

"There isn't one silver bullet that will be useful in every situation," Rafiq says. "Instead, as conservationists and researchers, we have a toolkit of different techniques that we can dip into depending on our project's requirements and needs. This study adds to the growing body of evidence that citizen science is a powerful tool for conservation. This approach provides the opportunity to not only aid the monitoring of charismatic megafauna highly valued by society, but also has the potential to shape how we can meaningfully participate in conservation efforts."

Credit: 
Cell Press

NASA sees outside winds affecting new tropical Eastern Pacific depression

image: On July 22 at 4:50 a.m. EDT (0850 UTC), the MODIS instrument that flies aboard NASA's Aqua satellite showed that the strongest storms in Tropical Depression 5E were south of the elongated center, where cloud top temperatures were as cold as minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius, shown in red).

Image: 
NASA/NRL

A new tropical depression formed in the Eastern Pacific Ocean, far enough away from the coast so that no coastal warnings are needed. Infrared imagery from NASA's Aqua satellite shows that Tropical Depression 5E's strongest storms were southwest of its center of circulation because of outside winds.

NASA's Aqua satellite used infrared light to analyze the strength of storms and found the bulk of them in the southern quadrant. Infrared data provides temperature information, and the strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

On July 22 at 4:50 a.m. EDT (0850 UTC), the Moderate Resolution Imaging Spectroradiometer, or MODIS, instrument that flies aboard NASA's Aqua satellite gathered infrared data on Tropical Depression 5E. The strongest thunderstorms had cloud top temperatures as cold as minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius). Cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall. Those strongest storms were southwest of the center of circulation because of vertical wind shear (winds blowing at different speeds at different levels of the atmosphere). The National Hurricane Center noted, "It appears that northeasterly shear is keeping much of the convection displaced to the west of the center of circulation."

At 11 a.m. EDT (1500 UTC) on July 22, the National Hurricane Center (NHC) said the center of Tropical Depression Five-E was located near latitude 15.9 degrees North and longitude 116.3 degrees West. That's about 640 miles (1,025 km) southwest of the southern tip of Baja California, Mexico.
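The unit conversions quoted in this item can be double-checked with a short script (the Fahrenheit figure converts to about minus 56.7 degrees Celsius, which the release truncates to minus 56.6; the distance matches to within rounding):

```python
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

def miles_to_km(mi):
    """Convert statute miles to kilometers."""
    return mi * 1.609344

print(round(f_to_c(-70), 1))    # -56.7
print(round(miles_to_km(640)))  # 1030
```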

The depression is moving toward the north near 9 mph (15 kph) and this motion is expected to continue for the next day or so, with a gradual turn to the northwest by midweek. Maximum sustained winds are near 35 mph (55 kph) with higher gusts. The estimated minimum central pressure is 1006 millibars.

NHC noted, "Although convection [and thunderstorm development] has increased this morning, the [wind] shear is preventing the inner core of the depression from becoming better established." Some slight strengthening is possible over the next couple of days, and the depression is expected to become a tropical storm later today or tonight.

Credit: 
NASA/Goddard Space Flight Center

Helicopter transport for stroke patients decreases time to surgery, new study finds

Miami Beach, FL--The sooner that a severe stroke patient can access thrombectomy, the more likely they are to experience a good outcome. A new study shows that using emergency helicopter ambulance services to transfer a patient to a hospital that can perform a stroke thrombectomy--a minimally invasive surgery which removes the blood clot in the brain causing the stroke--ensures faster access to potentially life-saving care.

The study--An Analysis of Stroke Thrombectomy Interhospital Transportation Modality--took place between January 2015 and March 2018 and was released today at the 16th Annual Meeting of the Society of NeuroInterventional Surgery. The researchers behind the study aimed to determine if ground or air ambulance transportation methods allowed a patient to access thrombectomy the fastest.

The researchers analyzed 133 patients who were transferred to Rush University Medical Center in Chicago, a comprehensive stroke center, for thrombectomy. Patients who were transported to Rush University by helicopter had a significantly shorter time to surgery start. Furthermore, air transport reduced time more as distances between hospitals increased, getting patients to surgery on average 42 minutes faster when transported more than 30 miles. However, all transport of less than 10 miles was done by ground ambulance.

"We know that when a patient can receive earlier appropriate stroke care, they experience better outcomes. We should consider this study as evidence that transferring a patient by helicopter ambulance to a hospital that can perform a thrombectomy will likely provide the patient the best chance of recovering from a major stroke," says Dr. Hormuzdiyar Dasenbrock, first author of the study from Rush University Medical Center. "However, patient safety is the first concern, and sometimes air transport cannot be used due to weather; also, air transport may not be faster if the distance between hospitals is less than 10 miles."

This research reinforces a recent report from the Centers for Disease Control and Prevention's Division for Heart Disease and Stroke Prevention, which found that patients had faster access to crucial stroke medications (intravenous thrombolytics) when they were transported by air to an appropriate facility. The new study extends these findings to patients undergoing thrombectomy.

Credit: 
Society of NeuroInterventional Surgery

Study looks at melatonin use, sleep patterns in school-age kids

What The Study Did: This observational study used a group of children from the Netherlands to examine how common melatonin use is among school-age children and how it is associated with their sleep patterns.

Author: Henning Tiemeier, M.D., Ph.D., of the Harvard T.H. Chan School of Public Health in Boston, is the corresponding author.

(doi:10.1001/jamapediatrics.2019.2084)

Editor's Note: The article contains funding/support and conflict of interest disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Racial disparity in fatal officer-involved shootings

High profile shootings of black Americans by police officers continue to make headlines and punctuate political debates, raising questions about whether white officers are responsible for a disproportionate amount of fatal shootings of minority citizens.

David Johnson, a postdoctoral researcher in the Lab for Applied Social Science Research at the University of Maryland, along with colleagues at Michigan State University, created the first comprehensive database of fatal officer-involved shootings (FOIS) in the United States during 2015.

The research team's analysis of the database, published July 22, 2019 in the Proceedings of the National Academy of Sciences, found white officers were not more likely to shoot minority citizens than black or Hispanic officers.

Further, Johnson and colleagues discovered that the strongest factor in predicting the race of a person fatally shot by a police officer was the violent crime rate where the shooting took place. In counties where white individuals committed more violent crime, a person fatally shot by police was more likely to be white. The same relationship held true for blacks and Hispanics. Once racial differences in crime rates were controlled for, civilians fatally shot were not more likely to be black or Hispanic than white.

"We want to stress that our national-level findings cannot exonerate or incriminate officers in any specific case," Johnson said. "It's not our goal to argue that there are no racial disparities in all policing outcomes. However, our data do not support the idea that, at the national level, white officers are more responsible for fatal shootings of minority civilians."

The federal government's databases on officer-involved shootings are far from complete, so Johnson and colleagues used more complete lists collected by The Washington Post and The Guardian as a starting point. The research team spent roughly 1,500 hours contacting police departments and scouring department websites, case reports, legal documents and news stories to gather information on officers involved in more than 900 FOIS nationwide in 2015, the first year in which data was collected by these news organizations. Researchers examined officer race, sex, and experience.

The findings have important implications for the development of policies that promote diversity in the hiring process as a means of reducing fatal shootings. While such policies have merit by increasing public trust in law enforcement, Johnson said, they are unlikely to affect racial disparities in fatal shootings.

"If we want to reduce the rates at which people from minority racial groups are shot by police, we need to address differences in crime rates between races," Johnson said. "That involves considering what causes those differences, like racial disparities in wealth, unemployment and education. I'm not saying that's easy. We asked a difficult question, and the answer ended up being difficult."

Johnson and colleagues hope to expand their research to look at non-fatal shootings and other forms of police force. However, they say before any meaningful research can take place, the federal government needs more comprehensive data on police use of force. This year, the FBI launched the National Use of Force Data Collection, designed to gather more detailed information on a voluntary basis from law enforcement agencies about civilians, officers and circumstances involved in use of force incidents. While it's a start, Johnson says, the larger problem remains--data will only be collected from agencies that voluntarily report it.

"We have better information on how many people die of shark bites every year than we do people who are fatally shot by police," Johnson said. "That's a problem."

Credit: 
University of Maryland

Targeting old bottleneck reveals new anticancer drug strategy

The enzyme ribonucleotide reductase is a bottleneck for cancer cell growth. Scientists at Winship Cancer Institute of Emory University have identified a way of targeting ribonucleotide reductase that may avoid the toxicity of previous approaches, informing focused drug discovery efforts.

The results were published on July 19 in Nature Communications.

Ribonucleotide reductase controls the supply of DNA building blocks, which cancer cells need in abundance for fast growth. Cancer researchers have long had an interest in ribonucleotide reductase, which converts RNA components (ribonucleotides) into DNA building blocks. Several more traditional chemotherapy drugs, such as hydroxyurea, fludarabine, cladribine and gemcitabine, inhibit ribonucleotide reductase by a different mechanism.

Researchers led by Xingming Deng, MD, PhD, found that one of ribonucleotide reductase's two parts (RRM2) is regulated by a tag, called acetylation, and identified another enzyme (Kat7) that adds that tag. Acetylation at a particular site inactivates RRM2 by preventing individual molecules of RRM2 from pairing up.

"Based on our findings, we will develop novel anticancer agents that inhibit ribonucleotide reductase activity by directly regulating RRM2 acetylation in cancer cells," says Deng, who is professor of radiation oncology at Emory University School of Medicine and director of the discovery theme in the Discovery and Developmental Therapeutics research program at Winship.

In addition, Deng's team observed that Sirt2, an enzyme that removes acetylation from RRM2 and activates it, is more abundant in samples from lung cancer patients. Sirt2 could be a prognostic biomarker for lung cancer, the authors suggest.

Sirt2 was already a hot target for anticancer researchers, but the Winship results provide new insight into how Sirt2 inhibitors preferentially affect cancer cells. Sirt2 has been a difficult enzyme to develop inhibitors for, because it is part of a family (the sirtuins) and many compounds hit more than one family member.

Sirt2 has other substrates besides RRM2, Deng notes. Also, RRM2 becomes deacetylated after DNA damage, so Sirt2 inhibitors could sensitize cancer cells to chemotherapy or radiation.

Credit: 
Emory Health Sciences

ORNL scientists make fundamental discovery to creating better crops

image: Laccaria bicolor is fruiting above ground and colonizing the Populus deltoides plant root system below ground in a greenhouse setting.

Image: 
Annegret Kohler/INRA-Grand Est, France

OAK RIDGE, Tenn., July 22, 2019--A team of scientists led by the Department of Energy's Oak Ridge National Laboratory has discovered the specific gene that controls an important symbiotic relationship between plants and soil fungi, and successfully facilitated the symbiosis in a plant that typically resists it.

The discovery could lead to the development of bioenergy and food crops that can withstand harsh growing conditions, resist pathogens and pests, require less chemical fertilizer and produce larger and more plentiful plants per acre.

Scientists in recent years have developed a deeper understanding of the complex relationship plants have with mycorrhizal fungi. When they are united, the fungi form a sheath around plant roots with remarkable benefits. The fungal structure extends far from the plant host, increasing nutrient uptake and even communicating with other plants to "warn" of spreading pathogens and pests. In return, plants feed carbon to the fungus, which encourages its growth.

These mycorrhizal symbioses are believed to have supported the ancient colonization of land by plants, enabling successful ecosystems such as vast forests and prairies. An estimated 80% of plant species have mycorrhizal fungi associated with their roots.

"If we can understand the molecular mechanism that controls the relationship between plants and beneficial fungi, then we can start using this symbiosis to acquire specific conditions in plants such as resistance to drought, pathogens, improving nitrogen and nutrition uptake and more," said ORNL molecular geneticist Jessy Labbe. "The resulting plants would grow larger and need less water and fertilizer, for instance."

Finding the genetic triggers in a plant that allow the symbiosis to occur has been one of the most challenging topics in the plant field. The discovery, described in Nature Plants, came after 10 years of research at ORNL and partner institutions exploring ways to produce better bioenergy feedstock crops such as Populus, or the poplar tree. The work was accomplished by improvements over the past decade in genomic sequencing, quantitative genetics and high-performance computing, combined with experimental biology.

The scientists were studying the symbiosis formed by certain species of Populus and the fungus Laccaria bicolor (L. bicolor). The team used supercomputing resources at the Oak Ridge Leadership Computing Facility, a DOE Office of Science user facility at ORNL, along with genome sequences produced at the DOE Joint Genome Institute, a DOE Office of Science user facility at Lawrence Berkeley National Laboratory, to narrow down the search to a particular receptor protein, PtLecRLK1. Once they had identified the likely candidate gene, the researchers took to the lab to validate their findings.

"Experimental validation is the key to this discovery as genetic mapping revealed statistical associations between the symbiosis and this gene, but experimental validations provided a definitive answer that it is this particular gene that controls the symbiosis," said ORNL plant molecular biologist Jay Chen.

The researchers chose Arabidopsis, a plant that traditionally does not interact with the fungus L. bicolor, and even considers it a threat, for their experiments. They created an engineered version of the plant that expresses the PtLecRLK1 protein, and then inoculated the plants with the fungus. L. bicolor completely enveloped the plant's root tips, forming the fungal sheath indicative of symbiosis.

"We showed that we can convert a non-host into a host of this symbiont," said ORNL quantitative geneticist Wellington Muchero. "If we can make Arabidopsis interact with this fungus, then we believe we can make other biofuel crops like switchgrass, or food crops like corn also interact and confer the exact same benefits. It opens up all sorts of opportunities in diverse plant systems. Surprisingly, one gene is all you need."

Scientists from the University of Wisconsin-Madison, the Universite de Lorraine in France, and the HudsonAlpha Institute for Biotechnology in Alabama also contributed to the project. The work was supported by DOE's Office of Science, DOE's Center for Bioenergy Innovation (CBI) and its predecessor the BioEnergy Science Center (BESC). One of CBI's key goals is to create sustainable biomass feedstock crops using plant genomics and engineering. Both BESC and CBI have developed experimental and computational approaches that accelerate the identification of gene function in plants.

"This is a remarkable achievement that could lead to the development of bioenergy crops with the ability to survive and thrive on marginal, non-agricultural lands," said CBI director Jerry Tuskan. "We could target as much as 20-40 million acres of marginal land with hardy bioenergy crops that need less water, boosting the prospects for successful rural, biobased economies supplying sustainable alternatives for gasoline and industrial feedstocks."

Credit: 
DOE/Oak Ridge National Laboratory