Tech

Slip layer dynamics reveal why some fluids flow faster than expected

image: As indicated by the dark red arrows, fluid flowing through a narrow cylindrical pipe moves at different speeds: faster near the center of the tube than at the edges (Poiseuille flow). The layer in contact with the internal surface of the pipe is known as slip layer or depletion layer, and allows the bulk fluid to 'slip' past the walls more efficiently. The IBS team developed a new technique (STED-anisotropy) to experimentally measure what happens directly at the slip layer, and characterized changes to the depletion layer dimension and composition as a function of flow rate. Careful analysis of polymer relaxation times shows that above a critical flow rate, shear forces lead to the elongation and alignment of the polymer (chain with white beads) along the direction parallel to the flow.

Image: 
IBS

Whether it is oil gushing through pipelines or blood circulating through arteries, how liquids flow through tubes is perhaps the most fundamental problem in hydrodynamics. The challenge is to maximize transport efficiency by minimizing the loss of energy to friction between the moving liquid and the stationary tube surfaces. Counterintuitively, adding a small amount of large, slow-moving polymers to the liquid, thus forming a 'complex liquid', leads to faster, more efficient transport. This phenomenon was speculated to arise from the formation of a thin layer around the internal wall of the tube, known as the depletion layer or slip layer, in which the polymer concentration is significantly lower than in the bulk solution. However, given the inherent thinness of this layer, which is only a few nanometers thick, on the order of the polymer size, direct experimental observation was difficult, and so progress in the field relied heavily on bulk measurements and computer simulations.
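
To make the intuition concrete, the rough sketch below (not taken from the IBS study; the two-layer viscosity model, channel size, layer thickness and viscosities are all illustrative assumptions) estimates how much a thin, low-viscosity wall layer boosts the volumetric flow rate in pressure-driven pipe flow.

```python
import numpy as np

# Illustrative two-layer model of pressure-driven (Poiseuille) pipe flow.
# A thin near-wall annulus of thickness delta is assumed to hold almost pure
# solvent (low viscosity), letting the polymer-rich core slip past the wall.
# All numbers are assumptions for illustration only.

R     = 15e-6    # channel radius, m (a 30-micrometer-wide channel)
delta = 50e-9    # assumed depletion-layer thickness, m
mu_b  = 5e-3     # assumed bulk polymer-solution viscosity, Pa*s
mu_w  = 1e-3     # assumed near-wall (solvent-like) viscosity, Pa*s
G     = 1e5      # pressure gradient along the tube, Pa/m (arbitrary)

r = np.linspace(0.0, R, 20001)

def trapz(y, x):
    """Simple trapezoidal integral."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def flow_rate(mu_of_r):
    # Axisymmetric pipe flow: u(r) = (G/2) * integral_r^R s / mu(s) ds,
    # with no slip at the wall, then Q = integral of 2*pi*r*u(r) dr.
    integrand = r / mu_of_r
    steps = 0.5 * (integrand[1:] + integrand[:-1]) * np.diff(r)
    cum = np.concatenate(([0.0], np.cumsum(steps)))     # integral from 0 to r
    u = 0.5 * G * (cum[-1] - cum)                       # integral from r to R
    return trapz(2.0 * np.pi * r * u, r)

Q_uniform = flow_rate(np.full_like(r, mu_b))             # no depletion layer
Q_layered = flow_rate(np.where(r > R - delta, mu_w, mu_b))
print(f"flow enhancement from a {delta*1e9:.0f} nm wall layer: {Q_layered / Q_uniform:.3f}x")
```

Even a layer only tens of nanometers thick produces a measurable enhancement in this toy model, and the relative effect grows as the tube gets narrower, consistent with the century-old observation described below.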

Researchers at the Center for Soft and Living Matter, within the Institute for Basic Science (IBS, South Korea), made a significant advance in the field by successfully imaging the depletion layer in polymer solutions flowing through microchannels. Their study, published in the Proceedings of the National Academy of Sciences, USA, relied on the development of a novel super-resolution microscopy technique that allowed the researchers to see this layer with unprecedented spatial resolution.

The first observation of this phenomenon was made nearly a century ago. Experimental studies on high molecular weight polymer solutions revealed a puzzling observation: there was an apparent discrepancy between the measured viscosity of the polymer solution and the rate at which it flowed through a narrow tube. The polymer solution would always flow faster than expected. Furthermore, the narrower the tube, the larger this discrepancy. This sparked an interest which persists to this day.

"Depletion layer dynamics was a problem we found very interesting, but it was challenging to make progress with current experimental techniques," says John T. King, the corresponding author on the study. "We knew the first step needed to be the development of a technique that could provide new information."

Using his expertise in super-resolution microscopy, Seongjun Park, the first author of the study, developed a novel adaptation of stimulated emission depletion (STED) microscopy that has sufficient spatial resolution and contrast sensitivity to directly observe depletion layers. At the same time, Anisha Shakya, the co-author of the study, applied her knowledge of polymer physics to optimize a suitable imaging system. The team decided that the best approach would be to apply the newly developed STED-anisotropy imaging to a solution of high molecular weight polymer, polystyrene sulfonate (PSS), flowing through 30 μm-wide silica microfluidic channels.

PSS' behaviour was tracked with the help of fluorescent dyes. Transient interactions between the side-chains of PSS and the dye slow the rotational movement of the dye molecule. These small changes reveal PSS position and concentration with a spatial resolution of tens of nanometers.

The researchers first confirmed the formation of depletion layers at the wall and measured that the dimensions of the depletion layer were consistent with PSS size. They then observed that the depletion layer became thinner when the solution started to flow. Interestingly, changes to the depletion layer dimension set in only above a critical flow rate that corresponds to known changes in polymer conformation. This was the first direct experimental confirmation of this phenomenon, which was predicted from molecular dynamics simulations years ago.

Surprisingly, changes to the depletion layer composition were also observed to occur at unexpectedly low flow rates. In particular, polymer segments are pulled away from the wall, leaving almost pure, polymer-free solvent close to the wall. This can be attributed to hydrodynamic lift forces, analogous to aerodynamic lift on airplane wings, that arise from asymmetric flow at the wall. While hydrodynamic lift has been well characterized in computer simulations and observed in macroscopic systems (for instance, flounders resist this lift better than other animals because of their flatter shape), direct experimental observations on nanoscopic length scales have remained elusive.

It is anticipated that this promising approach can provide new information on complex fluids under flow in different regimes, such as turbulent flow, like what is seen in swiftly flowing rivers, or flow through nanofluidic devices.

Credit: 
Institute for Basic Science

Barn owls may hold key to navigation and location

image: This is a split-gate transistor for mimicking the neurobiological algorithm that barn owls use to localize sound.

Image: 
Jennifer McCann & Sarbashis Das, Penn State

The way barn owl brains use sound to locate prey may be a template for electronic directional navigation devices, according to a team of Penn State engineers who are recreating owl brain circuitry in electronics.

"We were already studying this type of circuitry when we stumbled across the Jeffress model of sound localization," said Saptarshi Das, assistant professor of engineering science and mechanics.

The Jeffress model, developed by Lloyd Jeffress in 1948, explains how biological hearing systems can register and analyze small differences in the arrival time of sound to the ears and then locate the sound's source.

"Owls figure out which direction the sound is coming from to within one to two degrees," said Saptarshi Das. "Humans are not that precise. Owls use this ability for hunting especially because they hunt at night and their eyesight isn't all that good."

The ability to use sound to locate objects relies on the distance between the ears. In barn owls, that distance is quite small, but the brain's circuitry has adapted to be able to discriminate this small difference. If the owl is facing the sound source, then both ears receive the sound simultaneously. If the sound is off to the right, the right ear registers the sound slightly before the left.

However, locating objects by sound is not that simple. Sound travels faster than the owl's nerves can respond, so after the owl's brain converts the sound to an electrical pulse, the pulse is slowed down. The brain's circuitry then uses a lattice of nerves of different lengths, fed with inputs from both ends, to determine the length at which the two signals coincide, or arrive at the same time. That coincidence point provides the direction.
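
As a rough illustration of the Jeffress idea (a toy software model with made-up numbers, not the Penn State transistor circuit), the sketch below delays copies of a pulse against each other and picks the delay at which the two ear signals coincide best:

```python
import numpy as np

# Toy Jeffress-style coincidence detector (illustrative software model only;
# the Penn State device implements this idea with split-gate transistors).
# A sound pulse reaches the right ear a few samples before the left ear;
# scanning over candidate delays finds the one at which the two copies
# coincide, which encodes the direction of the source.

fs  = 100_000                          # sample rate, Hz
t   = np.arange(0, 0.005, 1 / fs)      # 5 ms of signal
itd = 12                               # true interaural time difference, in samples

pulse = np.exp(-((t - 0.002) / 1e-4) ** 2) * np.sin(2 * np.pi * 4000 * t)
right = pulse
left  = np.roll(pulse, itd)            # left ear hears the pulse slightly later

delays = range(-25, 26)
scores = [np.sum(left * np.roll(right, k)) for k in delays]   # coincidence per delay
best   = delays[int(np.argmax(scores))]

c, ear_sep = 343.0, 0.05               # speed of sound (m/s), assumed ear spacing (m)
angle = np.degrees(np.arcsin(np.clip(c * best / fs / ear_sep, -1.0, 1.0)))
print(f"estimated delay: {best} samples (true: {itd}), source at about {angle:.0f} degrees")
```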

Saptarshi Das and his team have created an electronic circuit that can slow down the input signals and determine the coincidence point, mimicking the working of the barn owl brain.

The researchers, who include Saptarshi Das; Akhil Dodda, graduate student in engineering science and mechanics; and Sarbashis Das, graduate student in electrical engineering, note today (Aug. 1) in Nature Communications that "the precision of the biomimetic device can supersede the barn owl by orders of magnitude."

The team created a series of split-gate molybdenum sulfide transistors to mimic the coincidence nerve network in the owl's brain. Split-gate transistors only produce output when both sides of the gate match, so only the gate tuned to a specific length will register the sound. The biomimetic circuitry also uses a time-delay mechanism to slow down the signal.

While this proof-of-concept circuit uses standard substrates and device types, the researchers believe that using 2D materials for the devices would make them more accurate and also more energy efficient, because the number of split-gate transistors could be increased, providing more precise coincidence times. The reduction in power consumption would benefit devices working in the low-power domain.

"Millions of years of evolution in the animal kingdom have ensured that only the most efficient materials and structures have survived," said Sarbashis Das. "In effect, nature has done most of the work for us. All we have to do now is adapt these neurobiological architectures for our semiconductor devices."

"While we are trying to make energy-efficient devices, mammalian computing backed by natural selection has necessitated extreme energy-efficiency, which we are trying to mimic in our devices," said Dodda.

However, having only the direction will not provide the location of the sound source. To actually navigate or locate, a device would need to know the height of the sound source as well. Saptarshi Das noted that height can be inferred from the intensity of the sound, and the researchers are working on this aspect of the problem.

"There are several animals that have excellent sensory processing for sight, hearing and smell," said Saptarshi Das. "Humans are not the best at these."

The team is now looking at other animals and other sensory circuitry for future research. While existing research in the field of neuromorphic computing focuses on mimicking the intellectual capacity of the human brain, this work sheds light on an alternate approach by replicating the super sensors of the animal kingdom. Saptarshi Das considers this a paradigm change in this field.

Credit: 
Penn State

Hidden chemistry in flowers shown to kill cancer cells

video: Researchers at the University of Birmingham explain how it's possible to produce a compound with anti-cancer properties directly from feverfew -- a common flowering garden plant.

Image: 
University of Birmingham

Researchers at the University of Birmingham have shown that it's possible to produce a compound with anti-cancer properties directly from feverfew - a common flowering garden plant.

The team was able to extract the compound from the flowers and modify it so it could be used to kill chronic lymphocytic leukaemia (CLL) cells in the laboratory.

Feverfew is grown in many UK gardens, and also commonly sold in health food shops as a remedy for migraine and other aches and pains.

The compound the Birmingham team were investigating is called parthenolide and was identified by scientists as having anti-cancer properties several years ago. Although available commercially, it is extremely expensive with poor "drug-like" properties and has not progressed beyond basic research.

The Birmingham team were able to show a method not only for producing the parthenolide directly from plants, but a way of modifying it to produce a number of compounds that killed cancer cells in in vitro experiments. The particular properties of these compounds make them much more promising as drugs that could be used in the clinic.

The parthenolide compound appears to work by increasing the levels of reactive oxygen species (ROS) in cells. Cancer cells already have higher levels of these unstable molecules and so the effect of the parthenolide is to increase levels of these to a critical point, causing the cell to die.

The study, published in MedChemComm, was a multidisciplinary programme, drawing together researchers from the University's Institute of Cancer and Genomic Studies, the School of Chemistry and the drug discovery services companies, Sygnature Discovery and Apconix. The University of Birmingham's Winterbourne Botanic Garden oversaw the cultivation of the plants in sufficient volume for the drug screen to take place.

It was initiated by Dr Angelo Agathanggelou, of the Institute of Cancer and Genomic Studies, who is investigating new ways to treat chronic lymphocytic leukaemia (CLL), a type of cancer which typically affects older people.

Dr Agathanggelou explains: "There are several effective treatments for CLL, but after a time the disease in some patients becomes resistant. We were interested in finding out more about the potential of parthenolide. With expertise from colleagues in the School of Chemistry we've been able to demonstrate that this compound shows real promise and could provide alternative treatment options for CLL patients."

Professor John Fossey, of the University's School of Chemistry, says: "This research is important not only because we have shown a way of producing parthenolide that could make it much more accessible to researchers, but also because we've been able to improve its "drug-like" properties to kill cancer cells. It's a clear demonstration that parthenolide has the potential to progress from the flowerbed into the clinic."

Lee Hale, Head of Winterbourne Botanic Garden and Abigail Gulliver, Winterbourne's Horticultural Adviser oversaw the cultivation and harvesting of the plants.

Hale explains: "After trials on related plant species within the Asteraceae family it soon became apparent that Tanacetum parthenium - feverfew - provided the optimum levels of parthenolide."

"Feverfew is a short lived perennial plant which we sowed on an annual basis for the trial to ensure continuity of supply. This was necessary as winter weather can result in plant losses," adds Abigail Gulliver.

Credit: 
University of Birmingham

Is it time for another contraception revolution?

(Boston)--In an effort to protect the planet and preserve its natural treasures for future generations, another contraception revolution that provides options for populations not currently being served by modern contraception may be the answer according to a Perspective in this week's New England Journal of Medicine.

The expanding human population is stressing the planet. Dangerous levels of greenhouse gases produced by humans are causing global warming and climate disruption. Rapid depletion of resources from forests and oceans is destroying natural habitats and further contributing to climate change. As the population continues to grow, these pressures will increase and become more critical. Currently there are 7.7 billion people on earth and the United Nations (UN) predicts that the human population will reach 9 billion by 2050 and probably 11.2 billion by the end of the century.

"With approximately 40 percent of pregnancies being unplanned, the time seems ripe for another contraception revolution to provide options for the diverse populations that are not currently being served by modern contraception," said corresponding author Deborah Anderson, PhD, professor of obstetrics/gynecology, microbiology and medicine at Boston University School of Medicine.

Anderson points out that while the contraception revolution of the 20th century produced several effective birth-control methods that reshaped society, there is a need for more contraception options. "New contraception concepts are emerging that could help fill the remaining gap including male contraception being tested in clinical trials. Also, there is a new approach called multipurpose prevention technology (MPT) that offers dual protection against unintended pregnancy and highly prevalent sexually transmitted infections (STIs), such as human immunodeficiency virus type-1 and herpes simplex virus type 2, which has been enthusiastically endorsed by women and is currently under development," she said.

Anderson believes appropriate leadership and an infusion of funding could reignite contraception research, education and services. "This investment would be quickly offset by savings in health care and other costs attributable to pollution and global warming, which in the U.S. currently total $240 billion per year and are expected to increase to $350 billion per year in the next decade if drastic mitigation steps are not taken."

Credit: 
Boston University School of Medicine

NASA finds tropical storm Wipha whipped up

image: On July 31, 2019 at 2:30 a.m. EDT (0630 UTC) the MODIS instrument that flies aboard NASA's Aqua satellite showed strongest storms in Tropical Storm Wipha were southeast of Hainan Island, China, in the South China Sea. Cloud top temperatures were as cold as minus 80 degrees Fahrenheit (minus 62.2 Celsius).

Image: 
NASA/NRL

Tropical Storm Wipha formed quickly in the South China Sea. It was affecting Hainan Island, China when NASA's Aqua satellite passed overhead on July 31.

NASA's Aqua satellite used infrared light to analyze the strength of storms and found the bulk of them in the southern quadrant. Infrared data provides temperature information, and the strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

On July 31 at 2:30 a.m. EDT (0630 UTC), the Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Aqua satellite gathered infrared data on Tropical Storm Wipha. Strong thunderstorms circled the center, where cloud top temperatures were as cold as minus 80 degrees Fahrenheit (minus 62.2 Celsius). Those storms were over the South China Sea, just southeast of Hainan Island, China. Another area of storms that strong was in a fragmented band to the northeast of the center.

Cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall. Those strongest storms were south and southeast of the center of the elongated circulation.

At 11 a.m. EDT (1500 UTC) on July 31, Tropical Storm Wipha had maximum sustained winds near 35 knots (40 mph/64 kph). It was located near 19.4 degrees north latitude and 112.2 degrees east longitude, about 207 miles south-southwest of Hong Kong, China. Wipha was moving to the northwest.

The Joint Typhoon Warning Center expects that Wipha will move northwest towards southern China. After passing the Leizhou Peninsula, the system will turn west and after crossing the Gulf of Tonkin will make landfall near Hanoi, Vietnam.

Credit: 
NASA/Goddard Space Flight Center

Faint foreshocks foretell California quakes

New research mining data from a catalog of more than 1.8 million southern California earthquakes found that nearly three-fourths of the time, foreshocks signalled a quake's readiness to strike from days to weeks before the mainshock hit, a revelation that could advance earthquake forecasting.

"We are progressing toward statistical forecasts, though not actual yes or no predictions, of earthquakes," said Daniel Trugman, a seismologist at Los Alamos National Laboratory and coauthor of a paper out today in the journal Geophysical Research Letters. "It's a little like the history of weather forecasting, where it has taken hundreds of years of steady progress to get where we are today."

The paper, titled "Pervasive foreshock activity across southern California," notes foreshocks preceded nearly 72 percent of the "mainshocks" studied (the largest quakes in a particular sequence), a percentage that is significantly higher than was previously understood.

Many of these foreshocks are so small, with magnitudes less than 1, that they are difficult to spot through visual analysis of seismic waveforms. Detecting such small events requires advanced signal-processing techniques and is a huge, data-intensive problem. Significant computing capability was key to extracting these new insights from the Southern California Quake Template Matching Catalog, recently produced by Trugman and coauthor Zachary Ross, an assistant professor in seismology at Caltech. The template matching took approximately 300,000 GPU-hours on an array of 200 NVIDIA P100 GPUs, involving 3-4 weeks of computing time for the final run. GPUs are processors optimized for massively parallel problems: each GPU has thousands of cores, and each core can handle its own computational thread, whereas a standard laptop processor has only two or four cores. The earthquake catalog is archived by the Southern California Earthquake Data Center (scedc.caltech.edu/).
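
To give a flavor of what template matching involves, the schematic example below runs on synthetic data (all numbers are hypothetical; this is not the actual QTM catalog pipeline): a known earthquake waveform is slid along a continuous record, and windows with high normalized cross-correlation are flagged as candidate repeats.

```python
import numpy as np

# Schematic waveform template matching on synthetic data (not the actual
# QTM catalog pipeline). A known "template" waveform is slid along a
# continuous record; windows whose normalized cross-correlation exceeds a
# threshold are flagged as candidate small, repeating events.

rng = np.random.default_rng(0)
fs = 100                                              # samples per second
template = np.sin(2 * np.pi * 5 * np.arange(0, 1, 1 / fs)) * np.hanning(fs)

record = 0.2 * rng.standard_normal(60 * fs)           # one minute of noise
true_onsets = [12 * fs, 41 * fs]
for onset in true_onsets:                             # bury two weak copies of the template
    record[onset:onset + fs] += 0.5 * template

def normalized_xcorr(data, tmpl):
    """Normalized cross-correlation of tmpl against every window of data."""
    n = len(tmpl)
    tmpl = (tmpl - tmpl.mean()) / (tmpl.std() * n)
    out = np.empty(len(data) - n + 1)
    for i in range(len(out)):
        win = data[i:i + n]
        out[i] = np.sum(tmpl * (win - win.mean())) / (win.std() + 1e-12)
    return out

cc = normalized_xcorr(record, template)
hits = np.flatnonzero(cc > 0.5)                       # clusters of hits around each buried event
print("candidate detection times (s):", np.round(hits / fs, 2))
```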

The small foreshocks may be too difficult to discern in real time to be of use in earthquake forecasting. Another important issue is that quakes run in packs: they cluster in both space and time, so sorting the foreshocks of a particular quake out from the family of preliminary, main and aftershock rumbles of its fellow earth adjustments is no simple task.

An earthquake prediction tool is still far off, Trugman explains, and for humans who like a yes or no answer, a statistical analysis that suggests a quake's probability is frustrating. But the potential insights and early warnings are improving, quake by quake.

Credit: 
DOE/Los Alamos National Laboratory

NASA finds heavy rain in hurricane Erick

image: The GPM core satellite passed over Hurricane Erick at 7:46 a.m. EDT (1146 UTC) on July 31. GPM found the heaviest rainfall (pink) was around the northern eyewall of the center of circulation. There, rain was falling at a rate of over 40 mm (about 1.6 inches) per hour.

Image: 
NASA/JAXA/NRL

NASA provided forecasters with a look at Hurricane Erick's rainfall rates and cloud temperatures with data from the GPM and Aqua satellites, as the storm headed to Hawaii.

Erick is a major hurricane in the Eastern Pacific Ocean, rated category 3 on the Saffir-Simpson Hurricane Wind Scale.

NASA's Aqua satellite analyzed Erick on July 30 at 7:17 a.m. EDT (1117 UTC) using the Atmospheric Infrared Sounder or AIRS instrument. The stronger the storms, the higher they extend into the troposphere and the colder their cloud top temperatures. AIRS found cloud top temperatures as cold as or colder than minus 63 degrees Fahrenheit (minus 53 degrees Celsius) in a large area around the center.

The Global Precipitation Measurement mission or GPM core satellite passed over Hurricane Erick at 7:46 a.m. EDT (1146 UTC) on July 31. GPM found the heaviest rainfall was around the northern eyewall of the center of circulation. There, rain was falling at a rate of over 40 mm (about 1.6 inches) per hour. GPM is a joint mission between NASA and the Japan Aerospace Exploration Agency, JAXA.

The National Hurricane Center (NHC) said at 11 a.m. EDT (5 a.m. HST/1500 UTC), the center of Hurricane Erick was located near latitude 14.5 North, longitude 147.5 West. Erick is moving toward the west near 13 mph (20 kph) and this motion is expected to continue for the next 48 hours. Maximum sustained winds are near 120 mph (195 kph) with higher gusts. Some weakening is forecast during the next 48 hours. The estimated minimum central pressure is 958 millibars.

NHC noted "Swells generated by Erick will arrive in the Hawaiian Islands over the next couple of days, potentially producing dangerous surf conditions, mainly along east facing shores. Moisture associated with Erick will spread over the Hawaiian Islands by Thursday afternoon [Aug. 1] and produce heavy rainfall. Rainfall is expected to be heaviest over the east and southeast slopes of the Big Island of Hawaii."

Credit: 
NASA/Goddard Space Flight Center

You can't squash this roach-inspired robot

video: A new insect-sized robot created by researchers at the University of California, Berkeley, scurries at the speed of a cockroach and can withstand the weight of a human.

Image: 
UC Berkeley video by Stephen McNally

If the sight of a skittering bug makes you squirm, you may want to look away -- a new insect-sized robot created by researchers at the University of California, Berkeley, can scurry across the floor at nearly the speed of a darting cockroach.

And it's nearly as hardy as a cockroach, too. Try to squash this robot under your foot, and more than likely, it will just keep going.

"Most of the robots at this particular small scale are very fragile. If you step on them, you pretty much destroy the robot," said Liwei Lin, a professor of mechanical engineering at UC Berkeley and senior author of a new study that describes the robot. "We found that if we put weight on our robot, it still more or less functions."

Small, durable robots like these could be advantageous in search and rescue missions, squeezing and squishing into places where dogs or humans can't fit, or where it may be too dangerous for them to go.

"For example, if an earthquake happens, it's very hard for the big machines, or the big dogs, to find life underneath debris, so that's why we need a small-sized robot that is agile and robust," said Yichuan Wu, first author of the paper, who completed the work as a graduate student in mechanical engineering at UC Berkeley through the Tsinghua-Berkeley Shenzhen Institute partnership. Wu is now an assistant professor at the University of Electronic Science and Technology of China.

The study appears today (Wednesday, July 31) in the journal Science Robotics.

The robot, which is about the size of a large postage stamp, is made of a thin sheet of a piezoelectric material called polyvinylidene fluoride, or PVDF. Piezoelectric materials are unique, in that applying electric voltage to them causes the materials to expand or contract.

The researchers coated the PVDF in a layer of an elastic polymer, which causes the entire sheet to bend, instead of to expand or contract. They then added a front leg so that, as the material bends and straightens under an electric field, the oscillations propel the device forward in a "leapfrogging" motion.

The resulting robot may be simple to look at, but it has some remarkable abilities. It can sail along the ground at a speed of 20 body lengths per second, a rate comparable to that of a cockroach and reported to be the fastest pace among insect-scale robots. It can zip through tubes, climb small slopes and carry small loads, such as a peanut.

Perhaps most impressively, the robot, which weighs less than one tenth of a gram, can withstand a weight of around 60 kg - about the weight of an average human - which is approximately 1 million times the weight of the robot.

"People may have experienced that, if you step on the cockroach, you may have to grind it up a little bit, otherwise the cockroach may still survive and run away," Lin said. "Somebody stepping on our robot is applying an extraordinarily large weight, but [the robot] still works, it still functions. So, in that particular sense, it's very similar to a cockroach."

The robot is currently "tethered" to a thin wire that carries an electric voltage that drives the oscillations. The team is experimenting with adding a battery so the robot can roam independently. They are also working to add gas sensors and are improving the design of the robot so it can be steered around obstacles.

Credit: 
University of California - Berkeley

Clearing up the 'dark side' of artificial leaves

image: This is a schematic depiction of an artificial leaf with a membrane that reduces the release of carbon dioxide back into the atmosphere.

Image: 
Aditya Prajapati and Meng Lin

While artificial leaves hold promise as a way to take carbon dioxide -- a potent greenhouse gas -- out of the atmosphere, there is a "dark side to artificial leaves that has gone overlooked for more than a decade," according to Meenesh Singh, assistant professor of chemical engineering in the University of Illinois at Chicago College of Engineering.

Artificial leaves work by converting carbon dioxide to fuel and water to oxygen using energy from the sun. The two processes take place separately and simultaneously on either side of a photovoltaic cell: the oxygen is produced on the "positive" side of the cell and fuel is produced on the "negative" side.

Singh, who is the corresponding author of a new paper in ACS Applied Energy Materials, says that current artificial leaves are wildly inefficient. They wind up converting only 15% of the carbon dioxide they take in into fuel, releasing the other 85%, along with oxygen gas, back to the atmosphere.

"The artificial leaves we have today aren't really ready to fulfill their promise as carbon capture solutions because they don't capture all that much carbon dioxide, and in fact, release the majority of the carbon dioxide gas they take in from the oxygen-evolving 'positive' side," Singh said.

The reason artificial leaves release so much carbon dioxide back to the atmosphere has to do with where the carbon dioxide goes in the photoelectrochemical cell.

When carbon dioxide enters the cell, it travels through the cell's electrolyte. In the electrolyte, the dissolved carbon dioxide turns into bicarbonate anions, which travel across the membrane to the "positive" side of the cell, where oxygen is produced. This side of the cell tends to be very acidic due to splitting of water into oxygen gas and protons. When the bicarbonate anions interact with the acidic electrolyte at the anodic side of the cell, carbon dioxide is produced and released with oxygen gas.

Singh noted that a similar phenomenon of carbon dioxide release occurring in the artificial leaf can be seen in the kitchen when baking soda (bicarbonate solution) is mixed with vinegar (acidic solution) to release a fizz of carbon dioxide bubbles.

To solve this problem, Singh, in collaboration with Caltech researchers Meng Lin, Lihao Han and Chengxiang Xiang, devised a system that uses a bipolar membrane that prevents the bicarbonate anions from reaching the "positive" side of the leaf while neutralizing the proton produced.

The membrane placed in between the two sides of the photoelectrochemical cell keeps the carbon dioxide away from the acidic side of the leaf, preventing its escape back into the atmosphere. Artificial leaves using this specialized membrane turned 60% to 70% of the carbon dioxide they took in into fuel.

"Our finding represents another step in making artificial leaves a reality by increasing utilization of carbon dioxide," Singh said.

Earlier this year, Singh and colleagues published a paper in ACS Sustainable Chemistry & Engineering, where they proposed a solution to another problem with artificial leaves: current models use pressurized carbon dioxide from tanks, not the atmosphere.

He proposed another specialized membrane that would allow the leaves to capture carbon dioxide directly from the atmosphere. Singh explains that this idea, together with the findings reported in the current publication on using more of the captured carbon dioxide, should help make artificial leaf technology fully implementable.

Credit: 
University of Illinois Chicago

Actively swimming gold nanoparticles

Bacteria can actively move towards a nutrient source--a phenomenon known as chemotaxis--and they can move collectively in a process known as swarming. Chinese scientists have redesigned collective chemotaxis by creating artificial model nanoswimmers from chemically and biochemically modified gold nanoparticles. The model could help understand the dynamics of chemotactic motility in a bacterial swarm, concludes the study published in the journal Angewandte Chemie.

What causes swarming, and whether such collective behavior can be translated into artificial intelligent systems, is currently a topic of intensive scientific research. It is known that bacteria swimming in a dense pack feel the surrounding fluid differently from a sole swimmer. But to what degree swimmers are sped up in a swarm, and what other factors play a role, is still unclear. Colloidal chemist Qiang He at the Harbin Institute of Technology, China, and his colleagues, have now constructed a simple artificial model of bacteria-like nanoswimmers. They observed active chemotactic behavior and formation of the swimmers into a distinctly moving swarm.

He and his colleagues constructed their artificial swimmers from tiny spheres of gold. With a size 40 times smaller than a usual bacterium, the gold nanoparticles were below the detection limit of the microscope. However, thanks to a light-scattering phenomenon called the Tyndall effect, the scientists could observe larger changes in the solution containing swimmers, even with the naked eye. Using other analytical techniques, they also resolved the speed, orientation, and concentration of the particles in finer detail.

Scientists enjoy working with gold nanoparticles because the tiny spheres form a stable, disperse solution, are readily observed with an electron microscope, and molecules can be attached to them relatively easily. He and his team first loaded the surface of large silica spheres with gold particles. Then they attached polymer brushes on the exposed side of the gold spheres. These brushes were made of polymer chains, and with a length of up to 80 nanometers, they rendered the gold particles highly asymmetric.

The researchers dissolved the silica carrier and tethered an enzyme on the exposed side of the gold spheres so that the resulting nanoparticles were covered with long and thick polymer brushes on one side and with the enzyme on the other. In the presence of oxygen, the glucose oxidase enzyme decomposes glucose into a compound called gluconic acid.

To determine if the nanoswimmers would actively swim in a given direction, the authors placed them at one end of a small channel and placed a permanent glucose source at the other end. Similarly to living bacteria, the model swimmers actively traveled along the glucose gradient towards the glucose source. This fact alone was not surprising as enzymatically driven, self-propelling swimmers are known from experiment and theory. But the authors could also detect swarming behavior. The asymmetric nanoparticles condensed into a separate phase that moved collectively along the nutrient gradient.

The authors imagine that the nanoswimmers could be further developed as valuable and easily accessible physical models to study the chemotactic and swarming behavior of living or non-living things on the nanoscale.

Credit: 
Wiley

NASA casts a double eye on hurricane Flossie

image: On July 31, 2019 at 6:35 a.m. EDT (1035 UTC), the MODIS instrument that flies aboard NASA's Terra satellite showed strongest storms in Hurricane Flossie were around the center and south of the center where cloud top temperatures were as cold as minus 80 degrees Fahrenheit (minus 62.2 Celsius).

Image: 
NASA/NRL

NASA's Aqua and Terra satellites provided infrared views of Flossie before and after it became a hurricane while moving through the Eastern Pacific Ocean. Both satellites analyzed Flossie's cloud top temperatures and structure as the storm strengthened.

On July 30 at 5:41 a.m. EDT (0941 UTC), infrared data and cloud top temperatures were obtained for then-Tropical Storm Flossie using the Atmospheric Infrared Sounder or AIRS instrument. AIRS found cloud top temperatures as cold as or colder than minus 63 degrees Fahrenheit (minus 53 degrees Celsius) around the center of circulation and in a large area west of the center. The stronger the storms, the higher they extend into the troposphere and the colder their cloud temperatures.

Flossie continued to strengthen after Aqua passed overhead, and by 5 p.m. EDT (11 a.m. HST/2100 UTC) on July 30, it became a hurricane.

On July 31 at 6:35 a.m. EDT (1035 UTC), the Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Terra satellite gathered infrared data on Flossie after it became a hurricane. As in the AIRS image from the previous day, there were two areas of strongest storms. In the Terra imagery, the strongest storms were also colder, and their cloud tops reached higher into the troposphere, than in the July 30 AIRS imagery.

On July 31, the strongest storms were located around the center and in a band of thunderstorms southwest of the center where cloud top temperatures were as cold as minus 80 degrees Fahrenheit (minus 62.2 Celsius).  Cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall.

At 11 a.m. EDT (5 a.m. HST/1500 UTC) the center of Hurricane Flossie was located near latitude 14.0 degrees north and longitude 125.8 degrees west. Flossie is moving toward the west-northwest near 15 mph (24 kph). A west-northwestward to westward motion at a similar forward speed is anticipated for the next several days.

Maximum sustained winds have decreased to near 75 mph (120 kph) with higher gusts.  Flossie is expected to weaken to tropical-storm strength later today.  Hurricane-force winds extend outward up to 30 miles (45 km) from the center and tropical-storm-force winds extend outward up to 105 miles (165 km).

The National Hurricane Center said that re-strengthening is possible later this week, and Flossie is forecast to become a hurricane again in a few days.

Credit: 
NASA/Goddard Space Flight Center

Walkability is key: A look at greenspace use

If city planners want more people to visit community greenspaces, they should focus on "putting humans in the equation," according to a new study from University of Arizona researchers.

The main finding, lead researcher Adriana Zuniga-Teran says, is simple: the easier and safer it is to get to a park, the more likely people are to visit the park frequently.

The paper is available online and set for publication in October in Landscape and Urban Planning.

Zuniga-Teran, assistant research scientist at the College of Architecture, Planning and Landscape Architecture and the Udall Center for Studies in Public Policy, studies greenspace in cities. She says walkability -- or how easy and safe it is for someone to walk from home to a greenspace -- is a deciding factor in how often people visit parks.

Tucson was an ideal location for the study because it is "almost surrounded by protected land" and features hundreds of parks scattered throughout the city, she said. Researchers gathered data from people in parks as well as from people in their homes, which Zuniga-Teran says is significant, as most similar previous efforts she could find focused exclusively on one group or the other.

Results From Residents

The data from those surveyed in their homes show that several factors that play into a neighborhood's walkability can significantly increase how often people visit greenspaces. For example, higher levels of perceived traffic safety and surveillance - or how well people inside nearby buildings can see pedestrians outside - corresponded with more frequent visits.

The research also suggests that people who travel to greenspaces by walking or biking are three-and-a-half times more likely to visit daily than those who get there by other means. Residents who have to drive are more likely to go only monthly.

Proximity to a park, though, played no significant role in how often people visited a park, Zuniga-Teran said.

"This was surprising because oftentimes we assume that people living close to a park are more likely to visit the park and benefit from this use."

Different levels of walkability may explain this result.

"Let's say you live in front of a huge park, but there's this huge freeway in the middle," Zuniga-Teran explained. "You're very close, but just crossing the major street, you might need to take the car and spend a long time in that intersection."

In situations like that, she said, a person probably won't visit that park frequently despite living close to it.

Results From Greenspace Users

The team of researchers, all from the UA, gathered data from more than 100 people visiting Rillito River Park and found only one walkability factor was significantly linked to more frequent visits: traffic safety. Those in the park who indicated their neighborhoods have fewer traffic-related safety concerns were one-and-a-half times more likely to visit greenspaces daily than those who reported concerns about traffic-related safety.

Unlike the people surveyed in their homes, those surveyed at greenspaces indicated that proximity is a major factor in how often they visit, with those who live close to greenspaces being six times more likely to go daily.

Moving Forward

It's important to gather and use this kind of information for the sake of human and environmental health, Zuniga-Teran says. Greenspaces clean the air and water, which benefits every resident of a community, she said. And when people use parks, that greenspace is more likely to be preserved.

It's up to community planners to use the research to shape policy, so that neighborhoods are developed in ways that connect residents more easily and safely with public greenspaces. For example, she said, the continuing emergence of gated communities can interrupt the flow to greenspaces. Cul-de-sac-heavy neighborhoods can do the same thing. Developers of those types of neighborhoods, Zuniga-Teran suggested, could work with city planners to "open a door to the park" by creating pathways that enhance connectivity.

Developers also could use the findings as a springboard into looking into whether their perceptions of walkability match those of the residents living in their communities, she says.

"We might think we are designing walkable neighborhoods," Zuniga-Teran says, "but people might not feel like that."

The next step, she hopes, is that researchers will take a deeper dive into what amenities or design features can draw new people into parks. Those could range from additional lighting and separate bike lanes to more accessibility for people with disabilities. Her team is continuing the effort with more detailed surveys in Tucson this summer.

Philip Stoker, co-author and assistant professor of planning and landscape architecture, says he hopes other research teams follow suit.

"I would like to see researchers across the country replicate this study to add external validity to our case study of Tucson. It is an interesting line of research that connects how people see their world with their own behaviors," he said. "In our context, we hope to see further evidence to support which perceptions influence the probability of visiting urban parks."

Zuniga-Teran says she hopes this and future research into greenspace use will show community leaders that, when it comes to improving public and environmental health by getting people into parks, "urban planning and architecture matter."

Credit: 
University of Arizona

Quantum computers to clarify the connection between the quantum and classical worlds

image: White crosses represent solutions to a simple quantum problem analyzed with a new quantum computer algorithm developed at the Los Alamos National Laboratory.

Image: 
LANL

Los Alamos National Laboratory scientists have developed a new quantum computing algorithm that offers a clearer understanding of the quantum-to-classical transition, which could help model systems on the cusp of quantum and classical worlds, such as biological proteins, and also resolve questions about how quantum mechanics applies to large-scale objects.

"The quantum-to-classical transition occurs when you add more and more particles to a quantum system," said Patrick Coles of the Physics of Condensed Matter and Complex Systems group at Los Alamos National Laboratory, "such that the weird quantum effects go away and the system starts to behave more classically. For these systems, it's essentially impossible to use a classical computer to study the quantum-to-classical transition. We could study this with our algorithm and a quantum computer consisting of several hundred qubits, which we anticipate will be available in the next few years based on the current progress in the field."

Answering questions about the quantum-to-classical transition is notoriously difficult. For systems of more than a few atoms, the problem rapidly becomes intractable. The number of equations grows exponentially with each added atom. Proteins, for example, consist of long strings of molecules that may become important biological components or sources of disease, depending on how they fold up. Although proteins can be comparatively large molecules, they are small enough that the quantum-to-classical transition, and algorithms that can handle it, become important when trying to understand and predict how proteins fold.

In order to study aspects of the quantum-to-classical transition on a quantum computer, researchers first need a means to characterize how close a quantum system is to behaving classically. Quantum objects have characteristics of both particles and waves. In some cases, they interact like tiny billiard balls; in others, they interfere with each other in much the same way that waves on the ocean combine to make larger waves or cancel each other out. The wave-like interference is a quantum effect. Fortunately, when there is no interference, a quantum system can be described using intuitive classical probabilities rather than the more challenging methods of quantum mechanics.
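
One standard, textbook-style way to put a number on "how classical" a state looks in a given basis is to measure the size of those interference terms, the off-diagonal elements of the density matrix. The minimal sketch below uses that generic coherence measure for a single qubit (an illustrative assumption on my part, not the Los Alamos algorithm itself) and shows the interference disappearing as dephasing is turned up:

```python
import numpy as np

# Generic illustration (not the LANL algorithm): the off-diagonal terms of a
# density matrix encode quantum interference. As they decay under dephasing,
# the state becomes describable by ordinary classical probabilities sitting
# on the diagonal.

def l1_coherence(rho):
    """Sum of magnitudes of the off-diagonal density-matrix elements."""
    return float(np.sum(np.abs(rho)) - np.trace(np.abs(rho)))

plus = np.array([1.0, 1.0]) / np.sqrt(2)      # equal superposition state |+>
rho0 = np.outer(plus, plus.conj())            # its pure-state density matrix

for gamma in [0.0, 0.5, 0.9, 1.0]:            # increasing dephasing strength
    rho = rho0.copy()
    rho[0, 1] *= (1 - gamma)                  # interference terms shrink...
    rho[1, 0] *= (1 - gamma)
    print(f"dephasing {gamma:.1f}: coherence = {l1_coherence(rho):.2f}")
# coherence 0 means the qubit behaves like a classical coin flip between 0 and 1
```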

The LANL group's algorithm determines how close a quantum system is to behaving classically. The result is a tool they can use to search for classicality in quantum systems and understand how quantum systems, in the end, seem classical to us in our everyday life.

Credit: 
DOE/Los Alamos National Laboratory

Postpartum transfusions on the rise, carry greater risk of adverse events

(WASHINGTON, July 31, 2019) -- Women who receive a blood transfusion after giving birth are twice as likely to have an adverse reaction related to the procedure, such as fever, respiratory distress, or hemolysis (destruction of red blood cells), compared with non-pregnant women receiving the same care, according to a new study published today in Blood Advances. Women with preeclampsia, a condition marked by high blood pressure during pregnancy, were found to be at the greatest risk for problems.

The study--the first to investigate the overall risk of transfusion reactions in pregnant women--also suggests that postpartum hemorrhages that require blood transfusion are becoming more common. In this study, the number of postpartum transfusions increased by 40 percent during the 20-year study period. Generally, up to 3 percent of all pregnant women receive a blood transfusion postpartum.

Although the exact reason for the increase in postpartum transfusions is not fully known, pregnant women today tend to be older, have higher body fat, more often conceive by in vitro fertilization, and are more likely to undergo cesarean delivery or to have placental complications or multiple pregnancies, all of which are thought to be contributing factors, according to the researchers.

"Blood transfusions are on the rise among postpartum women. There seems to be something about pregnancy that makes transfusion-related adverse events and complications more likely," said Lars Thurn, PhD, of Karolinska Institutet in Stockholm, Sweden, and the study's senior author. "Based on our findings, obstetricians and clinicians need to be more aware of these potentially harmful reactions when evaluating pregnant women for blood transfusion, especially if they have preeclampsia, induced labor, or preterm delivery (before 34 weeks)."

This retrospective, population-based study included all women who gave birth in Stockholm County, Sweden, between 1990 and 2011. Researchers linked data from the Swedish National Birth Registry to the Stockholm Transfusion Database, which included information on blood components administered and whether a transfusion reaction occurred in women who received blood transfusions postpartum. These data were compared with the outcomes of non-pregnant women who received blood transfusions during the same study period.

Of the 517,854 women included, 12,183 (2.4 percent) received a blood transfusion. A total of 96 postpartum transfusion-related reactions were recorded, with a prevalence of 79 versus 40 occurring for every 10,000 pregnant and non-pregnant women, respectively. The risk of a transfusion-related reaction was also doubled among women diagnosed with preeclampsia compared with pregnant women who didn't have the condition.

Previous cesarean delivery also heightened the risk of placental complications and bleeding in subsequent pregnancies. In this study, 26 percent of all women who received more than 10 units of blood postpartum had a previous cesarean delivery compared with 8 percent of women who had no blood transfusion postpartum. The likelihood of problems was also significantly increased when a combination of all three types of blood component (red blood cells, plasma and platelets) was administered.

"Most blood transfusions are safe and many are life-saving, but adverse transfusion reactions and transfusion transmitted infections are a concern," said Dr. Thurn. "In this population blood products should only be administered when necessary and when alternative options have been considered."

During pregnancy, a woman's immune system changes to accept and protect the growing fetus. In addition, red cell, leukocyte, and platelet antibodies are often elevated in women who have had previous pregnancies, which have been found to be associated with a higher risk of transfusion reactions. These and other changes, including the rise in blood flow and workload on the heart, may play a role in transfusion-related complications, added Dr. Thurn.

He and his team are hopeful that this study will prompt research to understand the causes and/or risk factors associated with different types of transfusion reactions in pregnancy to help make transfusions safer.

Credit: 
American Society of Hematology

Targeting a blood stem cell subset shows lasting, therapeutically relevant gene editing

image: This is a microscopy image of blood stem cells that will go through CRISPR/Cas9 gene editing in a lab dish.

Image: 
Kiem lab / Fred Hutchinson Cancer Research Center

SEATTLE -- July 31, 2019 -- In a paper published in the July 31 issue of Science Translational Medicine, researchers at Fred Hutchinson Cancer Research Center used CRISPR-Cas9 to edit long-lived blood stem cells to reverse the clinical symptoms observed with several blood disorders, including sickle cell disease and beta-thalassemia.

It's the first time that scientists have specifically edited the genetic makeup of a specialized subset of adult blood stem cells that are the source of all cells in the blood and immune system.

The proof-of-principle study suggests that efficient modification of targeted stem cells could reduce the costs of gene-editing treatments for blood disorders and other diseases while decreasing the risks of unwanted effects that can occur with a less discriminating approach.

"By demonstrating how this select group of cells can be efficiently edited for one type of disease, we hope to use the same approach for conditions such as HIV and some cancers," said senior author Dr. Hans-Peter Kiem, director of the Stem Cell and Gene Therapy Program and a member of the Clinical Research Division at Fred Hutch.

"Targeting this portion of stem cells could potentially help millions of people with blood diseases," said Kiem, who holds the Stephanus Family Endowed Chair for Cell and Gene Therapy.

For this preclinical study, which is expected to lead to human trials, the researchers picked a gene related to sickle cell disease and beta-thalassemia, which are caused by a genetic defect in how hemoglobin is made. Other studies have shown that symptoms are reversed by reactivating a version of hemoglobin that works during fetal development but is then turned off by our first birthdays.

The Fred Hutch researchers used CRISPR-Cas9 gene editing to remove a piece of genetic code that normally turns off fetal hemoglobin. Snipping this control DNA with CRISPR enables red blood cells to continuously produce elevated levels of fetal hemoglobin.

The edits were taken up efficiently by the targeted cells: 78% took up the edits in the lab dish before they were infused. Once infused, the edited cells settled in ("engrafted"), multiplied and produced blood cells, 30% of which contained the edits. This resulted in up to 20% of red blood cells with fetal hemoglobin, the type of hemoglobin that reverses disease symptoms in sickle cell disease and thalassemia.

"Not only were we able to edit the cells efficiently, we also showed that they engraft efficiently at high levels, and this gives us great hope that we can translate this into an effective therapy for people," Kiem said. "Twenty percent of red blood cells with fetal hemoglobin --what we saw with this method -- would be close to a level sufficient to reverse symptoms of sickle cell disease."

The scientists also believe that carrying out genetic fixes on the smaller pool of cells required for therapeutic benefit will lessen safety concerns and reduce the risk of off-target effects.

"Since the CRISPR technology is still in early stages of development, it was important to demonstrate that our approach is safe. We found no harmful off-target mutations in edited cells and we are currently conducting long-term follow-up studies to verify the absence of any undesired effect," said first author Dr. Olivier Humbert, a staff scientist in the Kiem Lab.

This was the first study to specifically edit a small population of blood cells that Kiem's team identified in 2017 as solely responsible for regrowing the entire blood and immune system. His team distinguished this select group as CD90 cells, named for a protein marker that sets them apart from the rest of the blood stem cells (known by another protein marker, CD34).

The self-renewing properties of this population of stem cells make them a powerful potential candidate to deliver gene therapy because they can provide long-term production of these genetically modified blood cells and thus could cure a disease for an entire lifetime. Since they represent a mere 5% of all blood stem cells, targeting them with gene-editing machinery would require fewer supplies and potentially be less costly.

Credit: 
Fred Hutchinson Cancer Center