Tech

Optimizing neural networks on a brain-inspired computer

image: Left to right:
The experiment was performed on a prototype of the BrainScaleS-2 chip;
Schematic representation of a neural network;
Results for simple and complex tasks

Image: 
Heidelberg University

Many computational properties are maximized when the dynamics of a network are at a "critical point", a state where systems can quickly change their overall characteristics in fundamental ways, transitioning, for example, between order and chaos or between stability and instability. Therefore, the critical state is widely assumed to be optimal for any computation in recurrent neural networks, which are used in many AI applications.

Researchers from the HBP partner Heidelberg University and the Max Planck Institute for Dynamics and Self-Organization challenged this assumption by testing the performance of a spiking recurrent neural network on a set of tasks of varying complexity at, and away from, critical dynamics. They instantiated the network on a prototype of the analog neuromorphic BrainScaleS-2 system. BrainScaleS is a state-of-the-art brain-inspired computing system with synaptic plasticity implemented directly on the chip. It is one of two neuromorphic systems currently under development within the European Human Brain Project.

First, the researchers showed that the distance to criticality can be easily adjusted in the chip by changing the input strength, and then demonstrated a clear relation between criticality and task performance. The assumption that criticality is beneficial for every task was not confirmed: whereas the information-theoretic measures all showed that network capacity was maximal at criticality, only the complex, memory-intensive tasks profited from it, while simple tasks actually suffered. The study thus provides a more precise understanding of how the collective network state should be tuned to different task requirements for optimal performance.
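The link between input strength and distance to criticality can be illustrated with a toy branching-network model. This is not the BrainScaleS implementation; the homeostatic target rate and all numbers below are invented for illustration. The key observation: if homeostasis pins the mean activity at a fixed target, stationarity forces the recurrent branching ratio to m = 1 - h/target, so a stronger external input h pushes the network further below the critical point m = 1.

```python
import math
import random

def poisson(rng, lam):
    # Knuth's method; adequate for the modest rates used here
    if lam <= 0:
        return 0
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def simulate(h, target_rate=20.0, steps=20000, seed=1):
    """Branching network with homeostasis-style tuning.

    Stationarity requires target = m * target + h, so the recurrent
    branching ratio is m = 1 - h / target: stronger external input h
    places the network further below the critical point m = 1.
    """
    m = 1.0 - h / target_rate
    rng = random.Random(seed)
    a = int(target_rate)              # start at the homeostatic target
    total = 0
    for _ in range(steps):
        a = poisson(rng, m * a + h)   # recurrent drive plus external input
        total += a
    return m, total / steps

for h in (1.0, 4.0, 10.0):
    m, rate = simulate(h)
    print(f"h={h:4.1f}  m={m:.2f}  mean activity={rate:.1f}")
```

In each case the mean activity stays near the target, while the branching ratio m, and with it the distance to criticality, is set entirely by the input strength h.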

Mechanistically, the optimal working point for each task can be set very easily under homeostatic plasticity by adapting the mean input strength. The theory behind this mechanism was developed very recently at the Max Planck Institute. "Putting it to work on neuromorphic hardware shows that these plasticity rules are very capable of tuning network dynamics to varying distances from criticality", says senior author Viola Priesemann, group leader at MPIDS. In this way, tasks of varying complexity can be solved optimally within that dynamic range.

The finding may also explain why biological neural networks operate not necessarily at criticality, but in the dynamically rich vicinity of a critical point, where they can tune their computational properties to task requirements. Furthermore, it establishes neuromorphic hardware as a fast and scalable avenue to explore the impact of biological plasticity rules on neural computation and network dynamics.

"As a next step, we now study and characterize the impact of the spiking network's working point on classifying artificial and real-world spoken words", says first author Benjamin Cramer of Heidelberg University.

Credit: 
Human Brain Project

Antioxidant-rich powders from blueberry, persimmon waste could be good for gut microbiota

Feeding the world's growing population in a sustainable way is no easy task. That's why scientists are exploring options for transforming fruit and vegetable byproducts -- such as peels or pulp discarded during processing -- into nutritious food ingredients and supplements. Now, researchers reporting in ACS' Journal of Agricultural and Food Chemistry have shown that blueberry and persimmon waste can be made into antioxidant-rich powders that might have beneficial effects on gut microbiota.

In recent years, fruit and vegetable powders have become popular as a way to add beneficial compounds, such as polyphenols and carotenoids (two types of antioxidants), to the diet, either by consuming the powders directly or as an ingredient in food products. In many cases, these healthful compounds are present at similar or even higher levels in byproducts than in other parts of the fruit or vegetable. Noelia Betoret, María José Gosalbes and colleagues wanted to obtain powders from persimmon and blueberry wastes, and then study how digestion could affect the release of antioxidants and other bioactive compounds. They also wanted to determine the effects of the digested powders on gut bacterial growth.

The researchers obtained powders from persimmon peels and flower parts, and from the solids left behind after making blueberry juice. The type of powder, drying method, fiber content and type of fiber determined the release of antioxidants during a simulated digestion. For example, freeze-drying preserved more anthocyanins, but these were more easily degraded during digestion than those in air-dried samples. Then, the team added the powders to a fecal slurry and conducted a mock colonic fermentation, sequencing the bacteria present before and after fermentation. Incubation with the fruit powders resulted in an increase in several types of beneficial bacteria, and some bacteria grew better with one powder compared to the other. These findings indicate that persimmon and blueberry waste powders could be included in food formulations to boost the content of carotenoids and anthocyanins, which could have a positive impact on human health, the researchers say.

Credit: 
American Chemical Society

More flowers and pollinator diversity could help protect bees from parasites

image: Having more flowers and maintaining diverse bee communities could help reduce the spread of bee parasites, according to a new study.

Image: 
Peter Graystock

Having more flowers and maintaining diverse bee communities could help reduce the spread of bee parasites, according to a new study.

The research, conducted on more than 5,000 flowers and bees, reveals how bee parasites spread and what measures could help control them.

Bees can be infected with a cocktail of parasites that can cause a range of symptoms from reduced foraging ability to dysentery and death. Though parasites contribute to bee declines, scientists are unsure how they spread between bee species.

Flowers are essential for bee health, but may also act as transmission hubs for bee diseases. Over a growing season the diversity and abundance of bees and flowers change but little is known about how this may be linked to the risk of parasite transmission.

The new study, published in Nature Ecology and Evolution, suggests having more flowers and a more diverse bee community could help dilute the load of parasites, and that this may be particularly important in areas with high densities of social bees, such as honeybees and bumblebees.

Most studies of bee parasites focus on social bee species that often live in farmed colonies. Little is therefore known about the interactions between parasites and wild solitary bee species, or how parasites are transferred between them. The team behind the new paper studied how parasites are spread across diverse bee and flower communities, including solitary bee species.

Lead author Dr Peter Graystock, who completed the work at Cornell University and now works in the Department of Life Sciences at Imperial College London, said: "We found that when bee communities are at their most diverse, the proportion of infected bees was at its lowest, and when flowers were at their most abundant, fewer were likely to be acting as transmission hubs.

"There are two things potentially occurring here. In diverse bee communities, parasites are more likely to end up in a species they are not compatible with, meaning they can't replicate and spread further. The second thing is by having more flowers, bees aren't all visiting and contaminating the same few flowers with high concentrations of parasites.

"It's a little like if subway cars are sites of transmission between humans - if there are more subway cars, there are fewer people in each and less chance for transmission. Furthermore, if some of the 'people' riding the subway cars were different animal species that were not susceptible to the parasite, that too reduces the risk of transmission."

The team screened more than 5,000 wildflowers and bees across a 24-week growing season, capturing changes as different flowers bloomed and different species of bee dominated.

Over 110 bee species and 89 flower species were screened, revealing 42% of bee species (12.2% individual bees) and 70% of flower species (8.7% individual flowers) had at least one parasite in or on them.
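The gap between the species-level and individual-level figures comes from how screening records are aggregated: a species counts as positive if any individual in it carries a parasite. A minimal sketch of that bookkeeping, using made-up records (the species names and counts here are purely illustrative, not the study's data):

```python
# Hypothetical screening records: (species, parasite detected) per individual.
records = [
    ("Bombus impatiens", True), ("Bombus impatiens", False),
    ("Apis mellifera", True), ("Apis mellifera", True),
    ("Andrena carlini", False), ("Andrena carlini", False),
    ("Halictus ligatus", False),
]

def prevalence(records):
    # Individual-level prevalence: infected individuals / all individuals.
    individual = sum(inf for _, inf in records) / len(records)
    # Species-level prevalence: a species is positive if any member is.
    by_species = {}
    for sp, inf in records:
        by_species[sp] = by_species.get(sp, False) or inf
    species = sum(by_species.values()) / len(by_species)
    return species, individual

species_prev, individual_prev = prevalence(records)
print(f"{species_prev:.0%} of species, {individual_prev:.1%} of individuals")
```

Because one positive individual flags its whole species, the species-level percentage is always at least as high as the individual-level one, which is why 42% of species can correspond to only 12.2% of individual bees.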

Bees had the highest prevalence of parasites late in the season, when social bees formed the majority of screened bees and overall bee diversity was lowest. This suggests keeping bee diversity high, with a variety of social and solitary species present, could help reduce the spread of parasites.

Since social bees are likely to come from farmed colonies, the researchers also say their research points to the importance of keeping hives healthy, to avoid infecting wild bees.

The study is the first time researchers have screened wildflowers and bees for parasites over the season, and as well as the abundance of flowers affecting transmission, the team also found that the species of flowers played a role.
For example, the species Lychnis flos-cuculi, commonly known as 'ragged robin', often had multiple parasite species on it, whereas Lythrum salicaria, or 'purple loosestrife', had few.

Dr Graystock added: "The power of this study is the number of bees and flowers screened over time, allowing us to see if the patterns fit with parasite transmission theory. We next want to dig deeper and understand some of the underlying mechanisms - such as why some flowers are more likely to harbour parasites than others."

Credit: 
Imperial College London

Study: Novel PFAS comprise 24% of those measured in blood of Wilmington, N.C. residents

In a new paper detailing findings from North Carolina State University's GenX Exposure Study, researchers detected novel per- and polyfluoroalkyl substances (PFAS) called "fluoroethers" in blood from residents of Wilmington, North Carolina. The fluoroethers - Nafion byproduct 2, PFO4DA and PFO5DoA - represented 24% of the total PFAS detected in the blood of Wilmington residents and appear to leave the body faster than legacy PFAS. These are the first measurements of these chemicals in humans.

The GenX Exposure Study began in 2017 after NC State researchers found a chemical called GenX in Wilmington residents' drinking water. An upstream chemical facility had been releasing several PFAS (including GenX) into the Cape Fear River, the city's primary drinking water source, since 1980. The study aimed to answer community questions about GenX and other PFAS, including whether the chemicals were detectable in residents' blood.

NC State and East Carolina University researchers collected blood samples from 344 Wilmington residents (289 adults and 55 children) across two sampling efforts in November 2017 and May 2018. Additionally, 44 of the November participants had a second blood sample collected in May 2018, six months after their first one. All blood samples were collected after the chemical facility stopped releasing GenX into the Cape Fear River.

The first sample determined which PFAS were present and the second sample was used to see how the levels changed in six months. The study looked for 10 PFAS unique to the chemical facility (called "fluoroethers"), and 10 legacy PFAS (such as PFOA and PFOS), in the samples.

Ten PFAS were found in most of the blood samples. Three of these were the fluoroethers Nafion byproduct 2, PFO4DA and PFO5DoA, all of which were detected in over 85% of samples. Two other fluoroethers, PFO3OA and NVHOS, were infrequently detected. Nearly all samples had at least one fluoroether present, but GenX was not found in any samples.

In total, 24% of the measured PFAS in blood samples came from these novel fluoroethers. In participants with repeated samples, the median decrease in fluoroether levels ranged from 28% for PFO5DoA to 65% for PFO4DA in six months once exposure to contaminants in drinking water ceased.
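A percent decline over a fixed interval can be translated into an apparent elimination half-life if one assumes simple first-order (exponential) decay; that kinetic assumption is this sketch's own simplification, not a claim from the study:

```python
import math

def half_life_from_decline(decline_frac, interval_months=6.0):
    """Apparent half-life implied by a fractional decline over an interval,
    assuming first-order (exponential) elimination:
    remaining = 2 ** (-t / t_half)  =>  t_half = t * ln 2 / -ln(remaining).
    """
    remaining = 1.0 - decline_frac
    return interval_months * math.log(2) / -math.log(remaining)

# Median six-month declines reported in the study
for name, decline in [("PFO5DoA", 0.28), ("PFO4DA", 0.65)]:
    print(f"{name}: ~{half_life_from_decline(decline):.1f} months")
```

Under this assumption, a 65% drop in six months corresponds to a half-life of roughly four months, while a 28% drop corresponds to roughly a year, both far shorter than the multi-year half-lives typically reported for legacy PFAS.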

Four legacy PFAS (PFHxS, PFOA, PFOS, PFNA) were detected in most participants (97% or more). The levels of legacy PFAS in the participants' blood were higher than U.S. national levels from the 2015-2016 National Health and Nutrition Examination Survey and did not change in the repeated samples.

"Accurately measuring the levels of PFAS in blood is the first step toward understanding what effects these novel chemicals may have on human health," says Jane Hoppin, professor of biological sciences, deputy director of NC State's Center for Human Health and the Environment (CHHE), and corresponding author of the paper describing the work.

Credit: 
North Carolina State University

Digitizing chemistry with a smart stir bar

image: The Smart Stirrer (above) can transmit data on the color, viscosity and other attributes of the solution it is stirring.

Image: 
Dmitry Isakov

Miniaturized computer systems and wireless technology are offering scientists new ways to keep tabs on reactions without the need for larger, cumbersome equipment. In a proof-of-concept study in ACS Sensors, researchers describe an inexpensive new device that functions like a conventional magnetic stir bar, but that can automatically measure and transmit information on a solution's color, viscosity and a variety of other attributes to a smart phone or computer.

Automatic, remote data collection can make chemical processes more reliable, as well as less labor intensive and safer. This technology has begun to make inroads into chemistry labs, but options for scientists looking to remotely mix their reactions and monitor several parameters while those reactions occur remain scarce and expensive. Nikolay Cherkasov, Dmitry Isakov and colleagues wanted to develop a device capable of simultaneously detecting numerous parameters using freely available open-source software and inexpensive, readily available components.

The team made two versions of the Smart Stirrer based on different integrated circuits, one a microcontroller and the other an all-in-one system on a chip. To customize the Smart Stirrer for different applications, the team used a modular design, in which sensors can be added as needed. In experiments, they confirmed they could detect color, electrical conductivity and, with careful calibration, viscosity. Additional low-cost commercially available sensors, as well as custom ones, could be used to adapt the device for many new applications, the researchers say. They note one limitation: the narrow range of temperatures within which the Smart Stirrer can function, a restriction that they say is inherent to most digital electronics. Ultimately, they predict that this approach could provide a platform for digitizing chemistry in research labs and industrial manufacturing. 
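The modular design described above can be pictured as a registry of sensors that a polling loop iterates over. This is a hypothetical Python sketch of that architecture, not the device's actual firmware; the sensor names and readings are invented:

```python
class SmartStirrer:
    """Toy model of a modular sensor platform: each sensor registers a
    name and a read() callable; one poll cycle gathers all readings."""

    def __init__(self):
        self._sensors = {}

    def add_sensor(self, name, read_fn):
        # New sensors plug in without changing the polling logic.
        self._sensors[name] = read_fn

    def poll(self):
        return {name: read() for name, read in self._sensors.items()}

stirrer = SmartStirrer()
stirrer.add_sensor("color_rgb", lambda: (120, 45, 200))   # stand-in values
stirrer.add_sensor("conductivity_uS", lambda: 1450.0)
print(stirrer.poll())
```

The point of the pattern is that adding a viscosity or temperature sensor is just another `add_sensor` call; the data-collection loop never changes.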

Credit: 
American Chemical Society

Shrinking (ultra)violet

image: A scientist places a water sample onto a custom-made platform before a test. Each water sample contains microorganisms such as the parasite Giardia and adenoviruses, both of which can make humans sick.

Image: 
T. Larason/NIST

While awaiting full access to their labs due to COVID-19 restrictions, scientists at the National Institute of Standards and Technology (NIST) have taken this rare opportunity to report the technical details of pioneering research they conducted on the disinfection of drinking water using ultraviolet (UV) light.

Back in 2012, the NIST scientists and their collaborators published several papers on some fundamental findings with potential benefits to water utility companies. But these articles never fully explained the irradiation setup that made the work possible.

Now, for the first time, NIST researchers are publishing the technical details of the unique experiment, which relied on a portable laser to test how well different wavelengths of UV light inactivated different microorganisms in water. The work appears today in the Review of Scientific Instruments (RSI).

"We've been wanting to formally write this up for years," said NIST's Tom Larason. "Now we have time to tell the world about it."

One reason for the urgency in publishing a full description of the NIST system is that researchers envision using this UV setup for new experiments that go beyond the study of drinking water and into disinfection of solid surfaces and air. The potential applications could include better UV disinfection of hospital rooms and even studies of how sunlight inactivates the coronavirus responsible for COVID-19.

"As far as I know, no one has duplicated this work, at least not for biological research," Larason said. "That's why we want to get this paper out now."

Good Enough to Drink

Ultraviolet light has wavelengths that are too short for the human eye to see. UV ranges from about 100 nanometers (nm) to 400 nm, whereas humans can see a rainbow of color from violet (about 400 nm) to red (about 750 nm).

One way to disinfect drinking water is to irradiate it with UV light, which breaks down harmful microorganisms' DNA and related molecules.

At the time of the original study, most water irradiation systems used a UV lamp that emitted most of its UV light at a single wavelength, 254 nm. For years, though, water utility companies had shown increasing interest in a different type of disinfection lamp that was "polychromatic," meaning it emitted UV light at multiple different wavelengths. But the effectiveness of the new lamps was not well defined, said Karl Linden, a University of Colorado Boulder (CU Boulder) environmental engineer who was a principal investigator on the 2012 study.

"We discovered in the mid-2000s that polychromatic UV sources were more effective for virus inactivation -- specifically because these lamps produced UV light at low wavelengths, under 230 nm," Linden said. "But it was hard to quantify how much more effective and what the mechanisms of that effectiveness were."

In 2012, a group of microbiologists and environmental engineers led by CU Boulder was interested in adding to the knowledge base that water utility companies had regarding UV disinfection. With funding from the Water Research Foundation, a nonprofit organization, the scientists were looking to methodically test how sensitive various germs were to different wavelengths of UV light.

Normally, the light source for these experiments would have been a lamp that generates a wide range of UV wavelengths. To narrow the band of wavelengths as much as possible, the researchers' plan was to shine the light through filters. But that still would have produced relatively wide, 10-nm bands of light, and unwanted wavelengths would have bled through the filter, making it difficult to determine exactly which wavelengths were inactivating each microorganism.

The microbiologists and engineers wanted a cleaner, more controllable source for the UV light. So, they called on NIST to help.

NIST developed, built and operated a system to deliver a well-controlled UV beam onto each sample of microorganisms being tested. The setup involved putting the sample in question -- a petri dish filled with water with a certain concentration of one of the specimens -- into a light-tight enclosure.

What makes this experiment unique is that NIST designed the UV beam to be delivered by a tunable laser. "Tunable" means it can produce a beam of light with an extremely narrow bandwidth -- less than a single nanometer -- over a wide range of wavelengths, in this case from 210 nm to 300 nm. The laser was also portable, allowing scientists to bring it to the lab where the work was being conducted. Researchers also used a NIST-calibrated UV detector to measure the light hitting the petri dish before and after each measurement, to make sure they really knew how much light was hitting each sample.

There were a lot of challenges to get the system to work. Researchers ferried the UV light to the petri dish with a series of mirrors. However, different UV wavelengths require different reflective materials, so NIST researchers had to design a system that used mirrors with various reflective coatings that they could swap out between test runs. They also had to procure a light diffuser to take the laser beam -- which has a higher intensity in the center -- and spread it out so that it was uniform across the entire water sample.

The end result was a series of graphs that showed how different germs responded to UV light of different wavelengths -- the first data for some of the microbes -- with greater precision than ever before. And the team found some unexpected results. For example, the viruses exhibited increased sensitivity as wavelengths decreased below 240 nm. But for other pathogens such as Giardia, UV sensitivity was about the same even as the wavelengths got lower.

"The results from this study have been used quite frequently by water utility companies, regulatory agencies and others in the UV field working directly on water -- and also air -- disinfection," said CU Boulder environmental engineer Sara Beck, first author on three papers produced from this 2012 work. "Understanding which wavelengths of light inactivate different pathogens can make disinfection practices more precise and efficient," she said.

I, UV Robot

The same system that NIST designed for delivering a controlled, narrow band of UV light to water samples can also be used for future experiments with other potential applications.

For example, researchers hope to explore how well UV light kills germs on solid surfaces such as those found in hospital rooms, and even germs suspended in the air. In an effort to reduce hospital-acquired infections, some medical centers have been blasting rooms with a sterilizing beam of UV radiation carried in by robots.

But there are no real standards yet for use of these robots, the researchers said, so although they can be effective, it's hard to know how effective, or to compare the strengths of different models.

"For devices that irradiate surfaces, there are a lot of variables. How do you know they're working?" Larason said. A system like NIST's could be useful for developing a standard way to test different models of disinfection bots.

Another potential project could examine the effect of sunlight on the novel coronavirus, both in the air and on surfaces, Larason said. And the original collaborators said they hope to use the laser system for future projects related to water disinfection.

"The sensitivity of microorganisms and viruses to different UV wavelengths is still very much relevant for current water and air disinfection practices," Beck said, "especially given the development of new technologies as well as new disinfection challenges, such as those associated with COVID-19 and hospital-acquired infections, for example."

Credit: 
National Institute of Standards and Technology (NIST)

We are mutating SARS-CoV-2, but it is evolving back

Scientists investigating the evolution of the virus that causes COVID-19 say that its mutation seems to be directed by human proteins that degrade it, but natural selection of the virus enables it to bounce back. The findings could help in the design of vaccines against the virus.

All organisms mutate. You were, for example, born with between 10 and 100 new mutations in your DNA. Mutation is usually a random process, often owing to mistakes made when DNA is copied. Recent work from researchers at the Universities of Bath and Edinburgh suggests that in the case of SARS-CoV-2, mutation may well not be a random process, and that instead humans are mutating it as part of a defence mechanism to degrade the virus.

The team looked at over 15,000 virus genomes from all of the sequencing efforts around the world and identified over 6,000 mutations. They looked at how often each of the four letters that make up the virus' genetic code (A, C, U and G) was mutating and discovered that the virus had a very high rate of mutations generating U residues.

Senior author Professor Laurence Hurst, Director of the Milner Centre for Evolution at the University of Bath, said: "I have looked at mutational profiles for many organisms and they all show some sort of bias, but I've never seen one as strong and strange as this."

In particular, they found that mutation very commonly generated neighbouring UU pairs, mutated from original CU and UC sequences. They noted this is a fingerprint of the mutational profile of a human protein, called APOBEC, that can mutate viruses. Professor Hurst commented: "It looks like mutation isn't random, but instead we are attacking the virus by mutating it."

But what are these mutations doing to the virus? Are they helping or hindering it? Looking at the actual composition of the virus and by comparing between different sorts of sites within the virus they found evidence that natural selection - survival of the fittest - is allowing the virus to fight back against the mutational process.

From the mutational profile the team predicts, for example, that 65% of the residues should be a U and 40% should be UU pairs, but in practice U content is much lower and UU content is just about a quarter of that predicted.
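The U and UU figures above rest on simple sequence bookkeeping. A minimal sketch of that computation, run here on a short made-up fragment rather than the actual SARS-CoV-2 genome:

```python
def composition_stats(seq):
    """Fraction of U residues and fraction of adjacent UU dinucleotides."""
    seq = seq.upper().replace("T", "U")   # accept DNA-style input too
    u_frac = seq.count("U") / len(seq)
    pairs = [seq[i:i + 2] for i in range(len(seq) - 1)]
    uu_frac = sum(p == "UU" for p in pairs) / len(pairs)
    return u_frac, uu_frac

# Short invented fragment, just to show the bookkeeping
u, uu = composition_stats("AUUGCUUCAUUACGGUUA")
print(f"U content: {u:.1%}, UU pairs: {uu:.1%}")
```

Applied genome-wide, the same two numbers are what the team compared against the mutational-profile prediction: observed U content falls well short of the predicted 65%, and observed UU content is only about a quarter of the predicted 40%.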

Professor Hurst said: "This could be because the viruses that have too much U in them simply don't survive well enough to reproduce. We estimate that for every 10 mutations that we see, there are another six we never get to see because those mutant viruses are too poor at propagating."

And there are several reasons why this might be. The team found that U-rich versions of the virus' genes are less stable and are present at lower levels. Humans also have other proteins that attack sequences rich in U residues, which might also force the destruction of some versions of the virus.

These results suggest that we are attacking the virus, mutating it in a manner that degrades it. This also has implications for some vaccine designs. Several research groups are currently trying to make synthetic versions of the virus that are viable, but only just: so-called attenuated viruses.

Professor Hurst said: "Knowing what selection favours and disfavours in the virus is really helpful in understanding what an attenuated version should look like.

"We suggest for example that increasing U content, as APOBEC does within our cells, would be a sensible strategy."

Credit: 
University of Bath

NASA sees record-breaking new Tropical Storm Gonzalo strengthening

image: On July 22 at 12:11 a.m. EDT (0411 UTC), NASA's Aqua satellite analyzed Tropical Storm Gonzalo using the Atmospheric Infrared Sounder, or AIRS, instrument. AIRS found the coldest cloud top temperatures (purple) were as cold as or colder than minus 63 degrees Fahrenheit (minus 53 degrees Celsius).

Image: 
NASA JPL/Heidar Thrastarson

The seventh named tropical cyclone of the North Atlantic Ocean has formed, and like some others this season, it has broken a record. NASA's Aqua satellite provided a look at the small record-breaker.

The National Hurricane Center (NHC) reports that Gonzalo is the earliest seventh named storm on record in the Atlantic basin, beating Gert of 2005 by two days. Tropical Depression Seven formed by 5 p.m. EDT on July 21 in the Central North Atlantic Ocean, and by 8:50 a.m. EDT on July 22 it strengthened into a tropical storm and was renamed Gonzalo.

One of the ways NASA researches tropical cyclones is using infrared data that provides temperature information. The AIRS instrument aboard NASA's Aqua satellite captured a look at those temperatures in Gonzalo and gave insight into the size of the storm and its rainfall potential.

Cloud top temperatures provide information to forecasters about where the strongest storms are located within a tropical cyclone. Tropical cyclones do not always have uniform strength; some sides are stronger than others. The stronger the storms, the higher they extend into the troposphere, and the colder the cloud temperatures. NASA provides that data to forecasters at NOAA's National Hurricane Center, or NHC, so they can incorporate it in their forecasting.

On July 22 at 12:11 a.m. EDT (0411 UTC), NASA's Aqua satellite analyzed the storm using the Atmospheric Infrared Sounder, or AIRS, instrument. AIRS found the coldest cloud top temperatures were as cold as or colder than minus 63 degrees Fahrenheit (minus 53 degrees Celsius). NASA research has shown that cloud top temperatures that cold indicate strong storms that have the capability to create heavy rain.

By 11 a.m. EDT, the NHC noted, "Satellite imagery indicates that the cyclone has become significantly better organized since the last advisory. Visible imagery shows a well-defined central dense overcast with a hint of an eye and an outer convective band in the western semicircle, while a microwave overpass showed a well-defined inner convective ring feature."

At 11 a.m. EDT (1500 UTC), NHC reported the center of Tropical Storm Gonzalo was located near latitude 9.9 degrees north and longitude 43.6 degrees west. That is about 1,205 miles (1,935 km) east of the southern Windward Islands. Gonzalo is moving toward the west near 14 mph (22 kph). A general westward motion at a faster forward speed is expected during the next few days. The estimated minimum central pressure is 1000 millibars.

Maximum sustained winds have increased to near 50 mph (85 kph) with higher gusts. Gonzalo is a small tropical cyclone, as tropical-storm-force winds extend outward only up to 25 miles (35 km) from the center. Additional strengthening is forecast during the next couple of days, and Gonzalo is expected to become a hurricane by Thursday, July 23.

Interests in the southern Windward Islands should monitor the progress of this system.

The AIRS instrument is one of six instruments flying on board NASA's Aqua satellite, launched on May 4, 2002.

For more than five decades, NASA has used the vantage point of space to understand and explore our home planet, improve lives and safeguard our future. NASA brings together technology, science, and unique global Earth observations to provide societal benefits and strengthen our nation. Advancing knowledge of our home planet contributes directly to America's leadership in space and scientific exploration.

For updated forecasts, visit: http://www.nhc.noaa.gov

By Rob Gutro
NASA's Goddard Space Flight Center

Credit: 
NASA/Goddard Space Flight Center

NASA infrared confirms Douglas still a tropical storm

image: On July 22 at 2:15 a.m. EDT (0615 UTC), the MODIS instrument aboard NASA's Terra satellite gathered temperature information about Tropical Storm Douglas' cloud tops. MODIS found powerful thunderstorms (red) where temperatures were as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 Celsius) around the center and in a thick band of thunderstorms north of the center.

Image: 
NASA/NRL

Infrared data from NASA's Terra satellite showed that dry air around Tropical Storm Douglas has been inhibiting it from strengthening into a hurricane.

On July 22 at 2:15 a.m. EDT (0615 UTC), the Moderate Resolution Imaging Spectroradiometer or MODIS instrument aboard NASA's Terra satellite gathered temperature information about Tropical Storm Douglas' cloud tops. Infrared data provides temperature information, and the strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

MODIS found powerful thunderstorms where temperatures were as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 Celsius) around the center and in a thick band of thunderstorms north of the center. Cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall.

Forecasters noted that Douglas' overall appearance has changed little over the early morning hours on July 22. Douglas has not intensified to hurricane status because dry air is affecting the eastern portion of the cyclone's circulation and preventing thunderstorm development. A tropical cyclone consists of hundreds of thunderstorms.

At 5 a.m. EDT (0900 UTC) on July 22, the National Hurricane Center (NHC) noted the center of Tropical Storm Douglas was located near latitude 11.9 degrees north and longitude 128.0 degrees west. Douglas is centered about 1,875 miles (3,020 km) east-southeast of Hilo, Hawaii.

Douglas is moving toward the west near 15 mph (24 km/h). A turn toward the west-northwest along with an increase in forward speed is forecast to occur by late Wednesday. The west-northwestward motion is forecast to continue at least through Saturday. Maximum sustained winds are near 65 mph (100 km/h) with higher gusts. The estimated minimum central pressure is 998 millibars.

Strengthening is forecast during the next couple of days, and Douglas is expected to become a hurricane on Thursday, July 23. NHC forecaster Andrew Latto noted in the morning discussion, "The system is forecast to remain in a favorable environment for intensification for the next day or so. Beyond 36 hours, the combination of cooler SSTs and dry air should cause Douglas to slowly weaken."

Tropical cyclones/hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

For updated forecasts, visit: http://www.nhc.noaa.gov

By Rob Gutro
NASA's Goddard Space Flight Center

Credit: 
NASA/Goddard Space Flight Center

Getting under the skin of psoriasis

Psoriasis, a chronic skin condition that causes itchy, red, scaly patches, afflicts more than 8 million Americans and 125 million people worldwide. Small molecule-based drugs like steroids can penetrate the skin to treat the condition, but they can cause skin irritation and thinning and their efficacy can decrease over time. Antibodies that target specific inflammation-related molecules associated with psoriasis have been developed, but because they cannot be delivered via the skin, they are injected using needles and syringes, which limits their acceptance and can have negative systemic side effects.

A team of researchers at Harvard's Wyss Institute for Biologically Inspired Engineering and John A. Paulson School of Engineering and Applied Sciences (SEAS) has circumvented these limitations by using an ionic liquid (IL) combination to successfully deliver a small interfering RNA (siRNA)-based treatment directly to the skin in a mouse model of psoriasis, significantly reducing levels of inflammatory cytokines and symptoms of psoriasis without systemic side effects. The research is published today in Science Advances.

"Compared to other technologies that have demonstrated delivery of nucleic acids to the skin, our IL platform offers unique opportunities in terms of tunability, an excellent safety profile, and economical scale-up," said first author Abhirup Mandal, Ph.D., a former Postdoctoral Fellow at the Wyss Institute and SEAS who is now a Senior Research Scientist at CAGE Bio. "We think that effective topical delivery of macromolecules will revolutionize the treatment options for debilitating dermatological disorders like psoriasis."

Running simulations to predict real success

Synthetic siRNAs are non-coding double-stranded RNA molecules that are routinely used in biological research to "silence" a target gene by destroying the gene's RNA transcripts. This ability also makes them very attractive candidates for treating diseases and disorders without modifying the DNA in a patient's cells. However, their use in medicine has been hampered because RNAs are large, hydrophilic molecules, and therefore have a hard time crossing cells' hydrophobic membranes.

The team at the Wyss Institute and SEAS tackled that challenge using a recently discovered class of materials called ionic liquids (ILs), which are essentially salts that are liquid at room temperature. Based on earlier research investigating the interactions of ILs with lipids, the researchers had a hunch that ILs could stabilize siRNAs and improve their penetration across lipid-based cell membranes, enabling localized gene silencing.

The team first created a library of different ILs, then tested combinations of them to see which had the physical and chemical properties they were looking for. They settled on a mixture of two -- CAGE (choline and geranic acid) and CAPA (choline and phenylpropanoic acid) -- that helped the siRNA molecules associated with them retain their structural integrity and led to increased siRNA penetration into pig skin in vitro. When they applied the CAGE+CAPA mixture as a thick topical liquid to the skin of living mice, they observed no inflammation or irritation, indicating that it was non-toxic.

Because ILs are a fairly new material, predicting their interactions with the cargoes they are meant to deliver is challenging. The researchers collaborated with co-author Charles Reilly, Ph.D., a Senior Staff Scientist in the Bioinspired Therapeutics & Diagnostics platform at the Wyss Institute, to perform molecular dynamics simulations to model and understand how the CAGE+CAPA solution would interact with siRNA and cell membranes at the molecular level. The simulations predicted that the IL-siRNA complex owes its superior stability to the component ions' strong chemical interactions with the RNA base pairs. The model also suggested that the complex achieves higher penetration of cell membranes because the ions in the IL are able to pack closely together, forming aggregates that augment the complex's ability to disrupt the membrane and allow the siRNA's entry.

Breaking down barriers

Armed with an effective delivery vehicle, the team then coupled it with a specific siRNA designed to silence a gene called NFKBIZ, which has been implicated in the upregulation of a number of inflammatory molecules that are involved in psoriasis. They applied the CAGE+CAPA mixture along with the siRNA to the skin of mice with a psoriasis-like condition for four days, then compared those mice to others that had received CAGE+CAPA with a control siRNA, CAGE+CAPA alone, or no treatment.

The mice that were given the NFKBIZ siRNA treatment had reduced epidermal thickening, skin discoloration, and keratin overgrowth compared to the other experimental groups, as well as less redness and scaling. They also displayed a significant reduction in the expression of NFKBIZ and other psoriasis-related gene products in their skin cells, demonstrating for the first time that IL-siRNA complexes can induce a therapeutic effect at both molecular and macroscopic levels by silencing a target gene in vivo following topical administration.

"Topical creams have been used to treat skin conditions for hundreds of years, but the skin is a very effective barrier against most substances, which limits their effectiveness. Being able to bridge that barrier to deliver nucleic acid therapeutics directly to skin cells is a huge accomplishment in the quest for targeted, effective therapeutics," said corresponding author Samir Mitragotri, Ph.D., who is a Core Faculty member at the Wyss Institute and the Hiller Professor of Bioengineering and Hansjörg Wyss Professor of Biologically Inspired Engineering at SEAS.

This IL-based delivery platform can be easily scaled up and tuned to interface with a variety of therapeutic molecules, including DNA and antibodies. It could also enable transdermal drug delivery for the treatment of other skin conditions, including eczema, and improve the long-term efficacy of therapies by targeting genes that mediate multiple disease pathways.

Based on the encouraging results from this study, Mitragotri's lab is initiating new collaborations with researchers at various institutions focusing on understanding local and systemic mechanisms associated with autoimmune and inflammatory diseases in the skin.

"Many of the innovations that biologists have been using in research for years have significant clinical potential, but most haven't achieved it because of fundamental limiting factors such as, in this case, the barrier posed by the skin. This creative solution to this drug delivery problem holds great promise for enabling a new class of effective treatments that are long overdue," said the Wyss Institute's Founding Director and co-author of the paper Donald Ingber, M.D., Ph.D., who is also the Judah Folkman Professor of Vascular Biology at Harvard Medical School and Boston Children's Hospital, and Professor of Bioengineering at SEAS.

Credit: 
Wyss Institute for Biologically Inspired Engineering at Harvard

Visual working memory is hierarchically structured

image: According to the hierarchical encoding framework, it will be easier to recall the items in the first picture than in the third one, due to the lower range of triangle orientations.

Image: 
Utochkin, I. S., & Brady, T. F. (2020). Individual representations in visual working memory inherit ensemble properties. Journal of Experimental Psychology: Human Perception and Performance, 46(5), 458-473.

Researchers from HSE University and the University of California San Diego, Igor Utochkin and Timothy Brady, have found new evidence of hierarchical encoding of images in visual working memory. It turns out that the precision of remembering and recalling individual objects in a group is affected by ensemble representations--the mean and standard deviation of all objects in the group. The study has been published in the Journal of Experimental Psychology: Human Perception and Performance. https://doi.apa.org/doiLanding?doi=10.1037%2Fxhp0000727

Visual working memory stores information about a limited number of perceived objects during a short period of time while we are involved in a task exploiting this information. For example, you will be using visual working memory if you are asked to memorize circles on the screen and then recall the size of one of them.

It is known that working memory capacity is limited: we are capable of remembering about three or four objects on average. Many theories assume that each object is memorized, stored and forgotten in working memory independently of the other objects. In contrast, proponents of the hierarchical encoding framework do not agree with this assumption. According to this framework, memory encodes not only the information on each item separately, but also the information on the group of objects together. This generalized representation about the group is coded in ensemble summary statistics. The visual system can calculate the mean and standard deviation of all features of the items present. For example, we can easily assess and memorize the average size of the apples on a tree, as well as how similar all the apples are to this average apple.
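As an illustrative sketch (not the authors' analysis code), the ensemble statistics described here -- the mean and standard deviation of a feature across a set of items -- can be computed as follows, with hypothetical apex orientations given in degrees:

```python
import statistics

# Hypothetical displays of four triangle apex orientations (degrees).
# Orientations are circular quantities; a simple linear mean/SD is a
# reasonable approximation only when the values span a narrow range.
low_variability = [10, 12, 14, 16]      # pointing in about the same direction
high_variability = [10, 100, 190, 280]  # pointing in very different directions

for name, display in [("low", low_variability), ("high", high_variability)]:
    mean = statistics.mean(display)       # ensemble mean orientation
    spread = statistics.stdev(display)    # ensemble variability (sample SD)
    print(f"{name}-variability display: mean = {mean:.1f} deg, SD = {spread:.1f} deg")
```

Under the hierarchical encoding framework, the low-variability display yields a tight, precise ensemble summary that supports recall of individual items, while the high-variability display yields a much less informative one.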

The researchers carried out a series of experiments that demonstrate a strong impact of ensemble statistics on individual items' memory. In one of the experiments, the participants were shown a group of four isosceles triangles with different apex orientations. The range of apex orientations varied: they were pointing in about the same direction, or in completely different ones. The higher the variability, the more difficult it is to calculate the mean orientation.

The participants were asked to memorize the triangle orientations and then recall one randomly chosen triangle. If the hierarchical encoding theories are correct, the variety of apex orientations in a group would affect the quality (precision) of the ensemble statistics calculation (the mean orientation of all triangles) and, accordingly, the precision of recall for an individual triangle.

The researchers found that the precision of an individual orientation report depended on the variability of all the orientations. Moreover, participants recalled the items' orientations more accurately when the to-be-remembered orientations were more tightly clustered.

'This says for a fact that even as we try to memorize items individually, our working memory also stores the summary of the whole group,' commented Igor Utochkin, Professor at the HSE School of Psychology (https://www.hse.ru/en/staff/utochkin). If precise information about a specific item isn't in memory, one uses the ensemble statistics to recall the approximate characteristics of the object. The more precise these statistics are, the more precise the response concerning that object is.

Credit: 
National Research University Higher School of Economics

Triple negative breast cancer meets its match

image: Qing Zhang, Ph.D.

Image: 
UT Southwestern Medical Center

DALLAS - July 22, 2020 - One member of a larger family of oxygen sensing enzymes could offer a viable target for triple negative breast cancer (TNBC), UTSW researchers report in a new study. The findings, published online this week in Cancer Discovery, might offer hope to this subset of patients who have few effective treatment options and often face a poor prognosis.

TNBC - so called because it lacks estrogen receptors, progesterone receptors, and overexpression of the growth-promoting protein HER2 - makes up only 15 to 20 percent of all breast cancers. However, explains Qing Zhang, Ph.D., associate professor in the department of pathology at UTSW and a Cancer Prevention and Research Institute of Texas (CPRIT) Scholar in Cancer Research, it's the deadliest of all breast cancers, with a five-year survival rate of 77 percent compared with 93 percent for other types.

Unlike breast cancers that are hormone receptor- or HER2-positive, TNBC has no targeted treatments, so patients must rely solely on surgery, chemotherapy, and radiation, which are less effective than targeted treatments and can harm healthy tissue.

Zhang's lab studies how cancers can thrive in low-oxygen environments. Looking for viable drug targets for TNBC, Zhang and his colleagues zeroed in on 2-oxoglutarate (2OG)-dependent enzymes, a family of 70 enzymes including some that function as oxygen sensors in cells. To determine their role in TNBC, the researchers used a library of short interfering RNAs - snippets of genetic material that can shut off the expression of specific genes - to individually turn off each member of the 2OG-dependent family in different TNBC and healthy breast cell lines.

Their focus quickly narrowed to a specific 2-OG-dependent member called gamma-butyrobetaine hydroxylase 1 (BBOX1) that's known to facilitate cell synthesis of carnitine, a molecule that plays a key role in energy metabolism. When the researchers turned off the gene responsible for producing BBOX1, TNBC cells stopped dividing and eventually died, although turning off this gene in healthy breast cell lines had no effect.

The opposite was also true: Overexpressing the gene for BBOX1 caused TNBC cell lines to wildly proliferate. Further investigation showed that it wasn't carnitine - the enzyme's end product - that caused this effect. The enzyme itself appeared to be key for TNBC cells' survival and growth.

To investigate how BBOX1 exerts this effect, Zhang and his colleagues searched for which proteins this enzyme interacts with in cells. Their experiments showed that BBOX1 uniquely binds to a protein called IP3R3, which studies have previously linked to other malignancies. IP3R3 is important for helping mitochondria - organelles that serve as cells' energy factories - extract energy from sugar. By binding to this protein, BBOX1 stops it from being degraded, giving TNBC cells the energy boost they need to grow, Zhang explains.

On the flip side, eliminating BBOX1 could have the potential to stop TNBC tumors in their tracks. The researchers demonstrated this in mice by injecting the animals with TNBC cells that were modified so that the gene for BBOX1 could be turned off either directly or by feeding the animals an antibiotic called doxycycline. Turning off the BBOX1 gene directly in these cancer cells stopped primary tumor growth. In another strategy to simulate the breast cancer growth and treatment in patients, the researchers injected tumor cells into the mice and let the TNBC cells grow unabated into sizeable tumors. Then, the mice received doxycycline to turn off the BBOX1 gene. They found that these tumors stopped growing and shrank.

Zhang's team was able to replicate these results by dosing mice with pharmaceuticals that inhibit BBOX1. These drugs effectively fought TNBC tumors but had no negative effects on normal breast tissue or on the animals as a whole, with no detectable toxicity.

Zhang points out that while two of these drugs are still in development, one of them - known as Mildronate - is already prescribed in some European countries for increasing oxygen to tissues to treat coronary artery disease. This, or other BBOX1 inhibitors, could eventually be the targeted treatments for TNBC that patients have been waiting for, he says.

"Right now, TNBC patients have limited options, which leads to dismal clinical consequences," says Zhang, the study's senior author. "We think BBOX1 inhibitors could be a powerful new weapon in the arsenal to treat these cancers."

Credit: 
UT Southwestern Medical Center

Wireless, optical cochlear implant uses LED lights to restore hearing in rodents

video: A 3D x-ray visualization of an implanted cochlea in a rat. This material relates to a paper that appeared in the Jul. 22, 2020, issue of Science Translational Medicine, published by AAAS. The paper, by D. Keppeler at University Medical Center Göttingen in Göttingen, Germany; and colleagues was titled, "Multichannel optogenetic stimulation of the auditory pathway using microfabricated LED cochlear implants in rodents."

Image: 
D. Keppeler et al., Science Translational Medicine (2020)

Scientists have created an optical cochlear implant based on LED lights that can safely and partially restore the sensation of hearing in deaf rats and gerbils. Their design's light-based approach allowed it to deliver more accurate and pinpointed signals to auditory nerves compared with current implants based on electricity, which often suffer from poor sound quality. The technology also offers some improvements over previous experimental optical implants, suggesting it could boost the clinical viability of cochlear implants to treat hearing impairment.

Cochlear implants are biomedical devices that can partially restore hearing in patients with hearing loss, which affects approximately 5% of people worldwide. Most cochlear implants reproduce sound through the use of electrode contacts, but the resulting electrical stimulation is not very specific and tends to spread over a large area of nerves, leading to less detailed sounds and lower-quality hearing.

Daniel Keppeler and colleagues instead turned to optical cochlear implants, an alternative class of designs that use light instead of electricity to stimulate sound-sensing neurons after genetically modifying those neurons to respond to light. Unlike other implants, their wireless device uses multiple stimulation channels, and integrates power-efficient chips of blue LED lights to activate modified neurons inside the cochlea.

The scientists confirmed that the device generated more selective signals than prior designs when implanted into deafened rats and gerbils, and the animals successfully navigated sound-based behavioral tests over the course of several weeks. Keppeler et al. note that more work is needed to address the device's large size and broad spread of light before it can be tested in clinical studies.

Credit: 
American Association for the Advancement of Science (AAAS)

2,000 years of storms in the Caribbean

image: Aerial photograph of the Blue Hole, a flooded karst sinkhole on Lighthouse Reef, Belize, where the research team from Frankfurt was able to tap into 2,000-year-old sediment layers.

Image: 
(Photo: Gischler)

FRANKFURT. Hurricanes in the Caribbean became more frequent, and their force varied noticeably, around the same time that classical Mayan culture in Central America suffered its final demise. These and other insights come from a climate archive built under the leadership of geoscientists from Goethe University and now presented in an article in the Nature Research journal Scientific Reports on 16 July.

Tropical cyclones in the Atlantic (hurricanes) are a substantial threat for the lives and property of the local population in the Caribbean and neighboring regions, such as the south-east of the USA. The storms' increasing force, described in Chapter 15 of the report by the Intergovernmental Panel on Climate Change (IPCC Report), raises the probability of ecological and social catastrophes, as the occurrence of such cyclones over the past 20 years, which caused devastating damage, has shown. The climate models used to date, which could help to estimate the danger better, are, however, based on data that are lacking in spatial and temporal depth. Instrumental climate data, such as regular measurement of sea surface temperatures and reliable chronicling of hurricanes, date back only to the 19th century, at most.

In the framework of a research project (Gi 222/31) funded by the German Research Foundation, the Biosedimentology Working Group at the Department of Geosciences of the Faculty of Geosciences and Geography (Professor Eberhard Gischler) of Goethe University has now been able to build up and analyze a sedimentary "storm archive" that covers almost the entire Common Era (2,000 years) with annual resolution. The archive comprises fine-grained annual layers of sediments from the 125-meter-deep bottom of the Blue Hole, a flooded karst sinkhole on the Lighthouse Reef Atoll off the coast of Belize (Central America). There, 2.5 mm of lime mud, composed of shell debris from organisms in the reef lagoon along with changing amounts of organic matter, collects year after year. Coarser layers up to several centimeters thick that constitute tempestites (storm sediments) are intercalated in these fine-grained sediments. They mostly consist of shell debris from reef organisms living on the edge of the atoll. The almost 9-meter-long drill core from the bottom of the Blue Hole, which was recovered with the help of an electrical vibracorer, spans the last 1,885 years with a total of 157 storm layers.
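A rough consistency check on the figures in this paragraph (an illustration only, not the study's age-depth model): at about 2.5 mm of background mud per year, 1,885 annual layers account for roughly 4.7 m of the nearly 9-m core, with the remaining thickness contributed by the 157 intercalated storm layers:

```python
# Figures quoted in the article
years = 1885                   # annual layers spanned by the core
background_mm_per_year = 2.5   # fine-grained lime mud deposited per year
core_length_m = 9.0            # "almost 9-meter-long" drill core (approximate)
n_storm_layers = 157           # tempestites intercalated in the mud

background_m = years * background_mm_per_year / 1000.0
tempestite_m = core_length_m - background_m
avg_tempestite_cm = tempestite_m / n_storm_layers * 100.0

print(f"background mud:   ~{background_m:.1f} m")
print(f"storm layers:     ~{tempestite_m:.1f} m in total")
print(f"average layer:    ~{avg_tempestite_cm:.1f} cm thick")
```

The average tempestite thickness that falls out of this back-of-the-envelope calculation (a few centimeters) is consistent with the article's description of coarser layers "up to several centimeters thick."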

In the framework of extensive studies conducted by doctoral researcher Dominik Schmitt and collaboration between the Biosedimentology Working Group and colleagues at the University of Bern (Switzerland), it has become apparent that both short-term and long-term climate phenomena, such as the El Niño Southern Oscillation (ENSO), the North Atlantic Oscillation (NAO) and the Atlantic Multidecadal Oscillation (AMO), have influenced storm activity over the last 2,000 years and are mirrored in the new climate archive. The beginning of the Medieval Warm Period (approx. AD 900-1100) constitutes an important transition period when the activity of tropical cyclones changed substantially, presumably in conjunction with the shift of the Intertropical Convergence Zone (the low-pressure zone where northern and southern trade winds converge) towards the south: From AD 100-900, storm activity in the region tended to be more stable and weaker, while since AD 900 up until today it has been more variable and more vigorous. Interestingly, this change in the increase of cyclone frequency goes hand in hand with the occurrence of a few, very thick, coarse-grained storm layers and coincides with the final demise of the classical Mayan culture in Central America. It is possible that the increased impact of hurricanes on the Central American mainland, combined with extensive flooding of cultivated land in the Mayan lowlands and rainfall-induced erosion in the backlands of the Mayan Mountains of Belize - apart from the recurring periods of drought already known - was another environmental factor that influenced the end of the Maya's high culture.

Credit: 
Goethe University Frankfurt

Narcissists don't learn from their mistakes because they don't think they make any

image: Researcher Satoris (Tori) Howes at Oregon State University - Cascades with the OSU College of Business.

Image: 
Courtesy OSU-Cascades

BEND, Ore. -- When most people find that their actions have resulted in an undesirable outcome, they tend to rethink their decisions and ask, "What should I have done differently to avoid this outcome?"

When narcissists face the same situation, however, their refrain is, "No one could have seen this coming!"

In refusing to acknowledge that they have made a mistake, narcissists fail to learn from those mistakes, a recent study from Oregon State University - Cascades found.

Counterfactual thinking is the mental process of imagining a different outcome or scenario from what actually occurred. Analyzing past actions to see what one should have done differently is a specific form of this known as "should counterfactual thinking."

All of us engage in some level of self-protective thinking, said study author Satoris Howes, a researcher at OSU-Cascades with the OSU College of Business. We tend to attribute our successes to our own efforts but blame our failures on outside forces -- while often blaming other people's failures on their personal shortcomings.

"But narcissists do this way more because they think they're better than others," Howes said. "They don't take advice from other people; they don't trust others' opinions. ... You can flat-out ask, 'What should you have done differently?' And it might be, 'Nothing, it turned out; it was good.'"

Narcissism is typically defined as a belief in one's superiority and entitlement, with narcissists believing they are better and more deserving than others.

The study, published recently in the Journal of Management, consisted of four variations on the same experiment with four different participant groups, including students, employees and managers with significant experience in hiring. One of the four was conducted in Chile with Spanish-speaking participants.

Participants first took a test that ranked their narcissism by having them choose among pairs of statements ("I think I am a special person" versus "I am no better or worse than most people"). In the first of the four variations, they then read the qualifications of hypothetical job candidates and had to choose whom to hire. After choosing, they were given details about how this hypothetical employee fared in the job, and were assessed regarding the extent they engaged in "should counterfactual thinking" about whether they made the right decision.

The four variations employed different methods to analyze how counterfactual thinking was affected by hindsight bias, which is the tendency to exaggerate in hindsight what one actually knew in foresight. The researchers cite the example of President Donald Trump saying in 2004 that he "predicted the Iraq war better than anybody."

The authors note that prior research has shown that hindsight bias is often reversed as a form of self-protection when a prediction proves to be inaccurate -- e.g., Trump saying in 2017 that "No one knew health care could be so complicated" after failing to put forth a successful alternative to the Affordable Care Act.

In the OSU study, researchers found that when narcissists predicted an outcome correctly, they felt it was more foreseeable than non-narcissists did ("I knew it all along"); and when they predicted incorrectly, they felt the outcome was less foreseeable than non-narcissists did ("Nobody could have guessed").

Either way, the narcissists didn't feel they needed to do something differently or engage in self-critical thinking that might have positive effects on future decisions.

"They're falling prey to the hindsight bias, and they're not learning from it when they make mistakes. And when they get things right, they're still not learning," Howes said.

Narcissists often rise in the ranks within organizations because they exude total confidence, take credit for the successes of others and deflect blame from themselves when something goes wrong, Howes said.

However, she said, over time this can be damaging to the organization, both because of low morale of employees who work for the narcissist and because of the narcissist's continuing poor decisions.

To avoid the trap of hindsight bias, Howes said individuals should set aside time for reflection and review after a decision, even if the outcome is positive. Whether the decision was favorable or unfavorable, they should ask themselves what they should have done differently. And because narcissists don't engage in this process, Howes said it would be wise to have advisory panels provide checks and balances when narcissists have decision-making authority.

Credit: 
Oregon State University