Tech

Buying local? Higher price means higher quality in consumers' minds

BLOOMINGTON, Ind. -- Why are we willing to pay much more for a six-pack of craft beer, a locally produced bottle of wine or a regional brand item, often choosing them over national brands?

It's because when people prefer to "buy local," they more frequently base their decisions on price as a perception of quality, according to research from the Indiana University Kelley School of Business and three other universities.

The study, published in the Journal of Marketing, suggests that marketers can use this understanding of local identity versus global identity to shape consumers' price perceptions and behavior.

"Consumers tend to use price to judge a product's quality when their local identity is most important to them," said Ashok Lalwani, associate professor of marketing at Kelley. "When promoting high-priced or branded products, marketers can situationally activate consumers' local identity. To accomplish this objective, businesses can encourage consumers to 'think local' or employ local cultural symbols in advertising and other promotional material.

The researchers also suggested that the opposite was true for low-price products.

"Discount stores, such as dollar stores, should discourage consumers from using the price of a product to infer its quality," Lalwani said. "They would be better served by temporarily making consumers' global identity more prominent. Cues in advertisements that focus on a product's global appeal would help achieve that goal."

Many companies find it difficult to set and increase prices in the digital marketplace because of the pricing transparency of the internet, consumers' deal-seeking attitudes and global product availability.

For their study, Lalwani and his colleagues conducted in-depth interviews, two field studies and seven experiments, and reviewed secondary data. In their interviews with 15 senior-level managers from Fortune 500 companies, they found that while the executives considered local or global communities in their pricing decisions, none knew when such strategies were effective or why.

For example, an executive at a snack food maker told them, "It is important to have a reasonably high price since it communicated 'premium-ness' and then reinforce it with advertising and packaging. But we don't know for sure why such consumers prefer premium brands." A pet products manager said, "In dog sweaters, it is difficult to judge quality, so I am sure that my pet parents use price, in addition to other factors, to choose."

Through the field studies, experiments and secondary data, the researchers found that when consumers choose to identify more with others around them, they perceive greater variance among brands, which increases their reliance on price as a cue to judge quality.

Past research has found that consumers from more globalized countries and communities, such as the United States and its larger cities, often have a stronger global mindset because they interact with many types of people and cultures and hear news from abroad. In contrast, those living in smaller population areas or from isolated or insular nations often have a stronger local identity because they have less access to other cultures.

This paper provides useful guidelines for firms to adapt strategies for different regions and address whether companies should be more locally or globally oriented.

"For products to be marketed to the places where people tend to have a more local identity (such as rural areas), local flavors and ingredients can be used in the products. As these consumers are more likely to make price-quality associations, marketers may not need to allocate much ad budget to convince consumers about price-quality associations," Lalwani and his co-authors wrote.

The opposite is true as well, according to the authors, indicating that in more metropolitan areas, consumers most often don't have an established connection between price and quality. For marketers, this means that putting additional effort into differentiating their brand will help consumers associate a higher price with higher quality.

Lalwani is in the process of reviewing results of a large-scale national survey of the U.S. that measures which states tend to have more of a local identity versus a global one, for a follow-up study.

Credit: 
Indiana University

How to restore a coral reef

image: Genetic differences among these bleached and healthy corals of the same species contributed to their fate. New guidelines for coral restoration outline a concrete plan for collecting, raising, and replanting corals that prioritizes genetic diversity, maximizing the potential for corals to adapt to their changing environment.
https://www.usgs.gov/media/images/coral-bleaching-fl-keys-national-marin...

Image: 
Ilsa Kuffner, US Geological Survey

New guidelines drafted by a consortium of concerned experts could enable corals to adapt to changing environments and help restore declining coral populations in the Caribbean. The guidelines provide a definitive plan for collecting, raising, and replanting corals that maximizes their potential for adaptation.

A new paper outlining the guidelines, authored by the restoration genetics working group of the Coral Restoration Consortium, a group of scientists, restoration practitioners, educators, and concerned members of the public, appears online July 22, 2019 in the journal Ecological Applications.

"The Caribbean has experienced tremendous coral loss over the last few decades, and coral restoration has become an urgent issue in the region," said Iliana Baums, professor of biology at Penn State and chair of the Coral Restoration Consortium restoration genetics working group. "But few of the traditional guidelines for conservation, which tend to focus on vertebrates or plants, apply to corals. In this paper, we provide concrete guidelines for restoring coral populations, using the best available data."

Corals serve as the foundation for reefs, which protect coastal communities, provide food and medicinal compounds, and generate an estimated $9.9 trillion per year in goods and services around the globe. But reefs worldwide face a variety of threats--foremost among them rising ocean temperatures--and are declining, particularly in the Caribbean.

A recent National Oceanic and Atmospheric Administration-commissioned report from the National Academies of Sciences, Engineering, and Medicine provides a broad overview of 23 coral restoration strategies, though most are largely untested and not ready for implementation.

"The guidelines in this new paper are among those that can be implemented immediately and are grounded in the idea that coral populations can naturally respond to change if they have enough genetic diversity," said Baums. "We are focusing on maintaining or increasing the genetic diversity of coral populations, which will provide more options for the corals to adapt to their changing environments."

Coral populations grow in a variety of environments, covering a range of temperatures, depths, and light conditions, and they tend to adapt to local conditions. Thus, individuals in different environments should have differences in their genetic code that allow them to thrive. The consortium recommends collecting corals from these different environments to capture as much genetic diversity as possible. Then corals should be raised in a nursery, where they can quickly grow, and replanted on reefs.

"Corals can reproduce both asexually and sexually," said Baums. "We can break off a small piece of a colony and replant it, essentially yielding a clone of the original coral. But sexual reproduction is key to naturally producing genetic diversity, and rates of sexual reproduction on reefs are dropping dramatically, especially for true reef-building corals. By replanting diverse corals in small groups, we enable the corals to sexually reproduce with each other."

Collected corals could be replanted in locations similar to their original environment, or in locations that may soon become similar to their original environment.

"By taking advantage of improved climate models, we can anticipate where these traits may be beneficial in the future," said Baums.

"We hope these guidelines for collecting, raising, and replanting corals will help to establish self-sustaining, sexually reproducing coral populations," said Baums. "The situation surrounding coral reef decline is certainly dire, but we have a tremendous community of people that is dedicated to solving the problem. We have made enormous progress in figuring out how to do coral restoration, and we can make a difference in coral populations today. But for every minute that passes, it gets harder. With every missed opportunity to curb carbon emissions, which contribute to rising ocean temperatures, it gets even harder. Coral reefs are the world's most diverse ecosystems and they provide incredibly important ecosystem services, so we really cannot afford to lose them."

Support for this work was provided by the Coral Restoration Consortium, the National Oceanic and Atmospheric Administration (NOAA), the Penn State Institute for Sustainability, the Penn State Institute for Energy and the Environment, the Penn State Center for Marine Science and Technology, and the National Science Foundation.

Credit: 
Penn State

Hidden world of stream biodiversity revealed through water sampling for environmental DNA

CORVALLIS, Ore. - For the first time, researchers have used a novel genomics-based method to detect the simultaneous presence of hundreds of organisms in a stream.

Scientists at Oregon State University and the U.S. Forest Service Pacific Northwest Research Station recently published the results of their findings in the journal Environmental DNA.

For the study, the collaborators extracted genetic material from an assortment of physical matter left behind in a stream by a wide range of organisms - from fish to flies - including skin cells and excrement. Using this method, they detected microscopic species as well.

Although they weren't found in this study, the method has the potential to detect potent plant-damaging water molds responsible for root and stem rot diseases, as well as pathogens such as the chytrid fungus, which causes a disease that is killing amphibians all over the world.

Some of the key applications for the method include monitoring disease, invasive species, and rare or endangered species, said Tiffany Garcia, an aquatic ecologist in Oregon State's College of Agricultural Sciences and co-author on the study.

"This is like sampling the air in a terrestrial environment and getting airborne cues from all the different species, which is currently impossible. But with water, it's possible," Garcia said.

The new method could offer an alternative to electrofishing, which sends an electric current through water to temporarily stun fish and has long been the standard method for sampling fish populations in rivers and streams. The new method used by the Oregon State and Forest Service researchers, which relies on a microfluidic device, has several advantages over electrofishing, according to the researchers.

Collecting environmental DNA is less labor-intensive than electrofishing, it doesn't stress the organisms, and it doesn't require animal handling permits, they said.

"Single species eDNA work has been around for a while, but our use of a microfluidic platform greatly expands the approach," said Laura L. Hauck, a molecular biologist with the Pacific Northwest Research Station and co-lead author on the study. "We can still hone in on a single species of interest but at the same time we are capturing biodiversity and ecosystem health data from hundreds of organisms - all from that same single sample."

For the study, the research team collected water samples from five sites in Fall Creek in the Oregon Coast Range in 2017. Water samples were collected immediately prior to electrofishing surveys. They filtered three-liter samples through fine mesh filters to collect biological particles in the water.

The filters were then brought back to a laboratory to extract and analyze the DNA. Using computer programs, the researchers classified 3.2 million DNA sequences into 828 predicted taxonomic groups by comparing them to sequences contained in GenBank, the international genetic sequence database maintained by the U.S. National Institutes of Health.
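To make that classification step concrete, here is a minimal sketch of how sequence reads matched against GenBank might be tallied into taxonomic groups. It assumes a tab-separated table of best BLAST-style hits (read ID, best-match taxon, percent identity); the file name, the 97 percent identity threshold and the column layout are illustrative assumptions, not details taken from the study.

```python
# Minimal sketch of the taxonomic tally step for eDNA reads.
# Assumes a tab-separated file of best BLAST-style hits against GenBank with
# columns: read_id, best_match_taxon, percent_identity. The file name, the
# 97% identity threshold and the column layout are illustrative assumptions,
# not details from the study.
from collections import Counter
import csv

def summarize_taxa(hits_path: str, min_identity: float = 97.0) -> Counter:
    """Count reads assigned to each taxon, keeping only confident hits."""
    taxa = Counter()
    with open(hits_path, newline="") as handle:
        for read_id, taxon, identity in csv.reader(handle, delimiter="\t"):
            if float(identity) >= min_identity:
                taxa[taxon] += 1
    return taxa

if __name__ == "__main__":
    counts = summarize_taxa("fall_creek_blast_hits.tsv")  # hypothetical file
    print(f"{len(counts)} taxonomic groups detected")
    for taxon, n_reads in counts.most_common(10):
        print(f"{taxon}\t{n_reads}")
```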

"When we compared our water samples to the electrofishing results, we found the same species of fish, amphibians and crayfish, but in addition we detected the whole community of organisms that were inhabiting that stream," said Kevin Weitemier, a research associate at OSU and co-lead author on the study.

The water samples, it turned out, contained DNA from 647 species, including 307 insects.

Credit: 
Oregon State University

Finding alternatives to diamonds for drilling

image: Diamond is one of the only materials hard and tough enough for the job of constant grinding without significant wear, but diamonds are pricey. High costs drive the search for new hard and superhard materials. However, the experimental trial-and-error search is expensive. A simple, reliable way to predict new material properties is needed to facilitate modern technology development. Using a computational algorithm, Russian theorists have published a predictive tool in the Journal of Applied Physics. This image shows an Ashby plot showing materials with the best combination of high hardness and fracture toughness.

Image: 
Kvashnin, Skoltech

WASHINGTON, D.C., July 23, 2019 -- Diamonds aren't just a girl's best friend -- they're also crucial components for hard-wearing industrial components, such as the drill bits used to access oil and gas deposits underground. But a cost-efficient method to find other suitable materials to do the job is on the way.

Diamond is one of the only materials hard and tough enough for the job of constant grinding without significant wear, but as any imminent proposee knows, diamonds are pricey. High costs drive the search for new hard and superhard materials. However, the experimental trial-and-error search is itself expensive.

A simple and reliable way to predict new material properties is needed to facilitate modern technology development. Using a computational algorithm, Russian theorists have published just such a predictive tool in the Journal of Applied Physics, from AIP Publishing.

"Our study outlines a picture that can guide experimentalists, showing them the direction to search for new hard materials," said the study's first author Alexander Kvashnin, from the Skolkovo Institute of Science and Technology and Moscow Institute of Physics and Technology.

As fiber optics, with its fast transmission rate, replaced copper wire communications, so too do materials scientists search for new materials with desirable properties to support modern technology. When it comes to the mining, space and defense industries, it's all about finding materials that don't break easily, and for that, the optimal combination of hardness and fracture toughness is required. But it's tricky to theoretically predict hardness and fracture toughness. Kvashnin explained that although many predictive models exist, he estimates they are off the mark by 10%-15% at best.

The Russian team recently developed a computational approach that considers all possible combinations of elements in Dmitri Mendeleev's periodic table -- christened "Mendelevian search." They've used their algorithm to search for optimal hard and tough materials.

By combining their toughness prediction model with two well-known models for material hardness, the scientists' algorithm learned which regions of chemical space of compounds were most promising for tough, hard phases that could be easily synthesized.

Results were plotted on a "treasure map" of toughness vs. hardness, and the scientists were impressed by what they saw. All known hard materials were predicted with more than 90% accuracy. This proved the search's predictive power, and the newly revealed combinations are potential treasures for industry.
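As a rough illustration of what such a hardness-versus-toughness "treasure map" looks like, the sketch below plots a handful of materials on those two axes with matplotlib. The numbers are approximate, order-of-magnitude values chosen only for illustration; they are not the paper's computed results, and the WB5 point in particular is a placeholder.

```python
# Illustrative "treasure map" of fracture toughness vs. hardness, in the
# spirit of the paper's Ashby-style plot. The values below are rough,
# order-of-magnitude numbers chosen only for illustration; they are not the
# study's computed results.
import matplotlib.pyplot as plt

materials = {
    # name: (Vickers hardness, GPa; fracture toughness, MPa*m^0.5) -- approximate
    "Diamond": (90, 5.0),
    "cubic BN": (48, 5.0),
    "WC": (20, 8.0),
    "TiN": (21, 3.5),
    "WB5 (predicted)": (42, 4.0),
}

fig, ax = plt.subplots(figsize=(5, 4))
for name, (hardness, toughness) in materials.items():
    ax.scatter(hardness, toughness)
    ax.annotate(name, (hardness, toughness),
                textcoords="offset points", xytext=(5, 3), fontsize=8)

ax.set_xlabel("Hardness (GPa)")
ax.set_ylabel("Fracture toughness (MPa m$^{0.5}$)")
ax.set_title("Hardness vs. toughness (illustrative values)")
plt.tight_layout()
plt.show()
```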

Kvashnin explained he is part of an industrial project devoted to new materials for drilling bits, where experimentalists are now synthesizing one of these hard material treasures -- tungsten pentaboride (WB5).

"This computational search is a potential way to optimize the search for new materials, much cheaper, faster and quite accurately," said Kvashnin, who hopes that this new approach will enable the speedy development of new materials with enhanced properties.

But they aren't stopping there with the theory. They want to use their modern methods and approaches to pin down the general rules for what makes hard and superhard materials among the elements to better guide researchers of the future.

Credit: 
American Institute of Physics

Type of stent affects immediate and long-term outcomes

Miami Beach, FL--A new study comparing the outcomes of different types of stents used to treat cerebral aneurysms shows that the type of stent used affects a patient's immediate and long-term health outcomes. The study was presented at the Society of NeuroInterventional Surgery's (SNIS) 16th Annual Meeting.

Endovascular stent-assisted coiling is a new, minimally invasive procedure in which a stent is placed across the neck of a wide-necked aneurysm to anchor the tiny coils that protect the damaged blood vessel wall. While the procedure has become more widely used, there has been little research on how each type of stent affects safety and outcomes.

The study, "Stent-assisted Coiling of Cerebral Aneurysms: Multi-center Analysis of Radiographic and Clinical Outcomes in 659 Patients," compared the outcomes of endovascular coiling using three types of stents - Neuroform (NEU), Enterprise (EP), and Low-profile Visualized Intraluminal Support (LVIS).

"While all the stents were effective, we did find that the LVIS was associated with superior rates of angiographic occlusion in the treatment of cerebral aneurysms," said Dr. Maxim Mokin, lead author of the study, neurointerventionalist, and associate professor in the Department of Neurosurgery and Brain Repair at the University of South Florida. "This study's findings show that randomized trials to study the outcomes of different types of stents would be a good next step to further improve clinical outcomes and safety."

The study's researchers analyzed 659 patients with 670 cerebral aneurysms and considered factors such as patient characteristics, clinical outcomes, and complications. Researchers found a significant difference in complete occlusion among the three stents during angiographic follow-ups: LVIS 84%, NEU 78%, and EP 67%.

Credit: 
Society of NeuroInterventional Surgery

New map outlines seismic faults across DFW region

image: A simplified version of the fault map created by the team of researchers. The map includes faults that are visible at the surface (green) and faults that are underground (black). The solid line indicates underground faults that researchers were able to map at a high resolution. The dotted line indicates faults that were mapped at a medium resolution. According to the research, in the presence of wastewater injection activity, the majority of the faults in the area are as susceptible to slipping as those faults that have already produced earthquakes. The map also marks earthquake locations and waste-water injection well locations and amounts.

Image: 
The University of Texas at Austin Bureau of Economic Geology

DALLAS (SMU) - Scientists from SMU, The University of Texas at Austin and Stanford University found that the majority of faults underlying the Fort Worth Basin are as sensitive to forces that could cause them to slip as those that have hosted earthquakes in the past.

The new study, published July 23rd by the journal Bulletin of the Seismological Society of America (BSSA), provides the most comprehensive fault information for the region to date.

Fault slip potential modeling explores two scenarios: a model based on subsurface stress on the faults prior to high-volume wastewater injection, and a model reflecting the increase in fluid pressure on those faults due to injection.
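Fault slip potential modeling of this kind rests on the standard Coulomb failure idea: a fault moves toward slipping as the shear stress on it approaches the frictional resistance, which shrinks as pore pressure rises. The sketch below illustrates that criterion with made-up numbers; the stresses, friction coefficient and pressure increase are illustrative assumptions, not values from the Fort Worth Basin study.

```python
# Minimal sketch of the Coulomb slip criterion underlying "fault slip
# potential": a fault moves toward slipping when shear stress approaches
# mu * (normal stress - pore pressure). All numbers below (stresses, friction
# coefficient, pressure increase) are made up for illustration; they are not
# the study's modeled values for the Fort Worth Basin.

def slip_ratio(shear_mpa: float, normal_mpa: float,
               pore_pressure_mpa: float, friction: float = 0.6) -> float:
    """Ratio of shear stress to frictional resistance; values near or above 1 imply slip."""
    return shear_mpa / (friction * (normal_mpa - pore_pressure_mpa))

shear, normal = 15.0, 50.0        # MPa, hypothetical stresses resolved on a fault
natural_pp = 20.0                 # MPa, pore pressure before injection
injected_pp = natural_pp + 8.0    # MPa, pore pressure raised by wastewater injection

print(f"Pre-injection slip ratio:  {slip_ratio(shear, normal, natural_pp):.2f}")   # ~0.83
print(f"Post-injection slip ratio: {slip_ratio(shear, normal, injected_pp):.2f}")  # ~1.14
# Raising pore pressure shrinks the effective normal stress, so the same fault
# moves closer to (or past) its slip threshold.
```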

None of the faults shown to have the highest potential for an earthquake are located in the most populous Dallas-Fort Worth urban area or in the areas where there are currently many wastewater disposal wells.

Yet, the study also found that the majority of faults underlying the Fort Worth Basin are as sensitive to forces that could cause them to slip and cause an earthquake as those that have hosted earthquakes in recent years.

Though the majority of the faults identified on this map have not produced an earthquake, researchers are still investigating why some faults have slipped while others with similar slip potential have not, said SMU seismologist and study co-author Heather DeShon, who has been the lead investigator of a series of other studies exploring the cause of the North Texas earthquakes.

Earthquakes were virtually unheard of in North Texas until slightly more than a decade ago. But more than 200 earthquakes have occurred in the region since late 2008, ranging in magnitude from 1.6 to 4.0. A series of studies have linked these events to the disposal of wastewater from oil and gas operations by injecting it deep into the earth at high volumes, triggering "dead" faults nearby.

A total of 251 faults have been identified in the Fort Worth Basin, but the researchers suspect that more exist that haven't been identified.

The study found that the faults remained relatively stable if left undisturbed. However, wastewater injection, if not managed properly, sharply increased the chances of these faults slipping.

"That means the whole system of faults is sensitive," said the lead author of the study Peter L. Hennings, a research scientist from UT Austin's Bureau of Economic Geology and the principal investigator at the Center for Integrated Seismicity Research (CISR).

DeShon said the new study provides fundamental information regarding earthquake hazard to the Dallas-Fort Worth region.

"The SMU earthquake catalog and the Texas Seismic Network catalog provide necessary earthquake data for understanding faults active in Texas right now," she said. "This study provides key information to allow the public, cities, state and federal governments and industry to understand potential hazard and design effective public policies, regulations and mitigation strategies."

"Industrial activities can increase the probability of triggering earthquakes before they would happen naturally, but there are steps we can take to reduce that probability," added co-author Jens-Erik Lund Snee, a doctoral student at Stanford University.

Earthquake rates, like wastewater injection volumes, have decreased significantly since a peak in 2012. But as long as earthquakes occur, earthquake hazard remains. Dallas-Fort Worth remains the highest risk region for earthquakes in Texas because of population density.

Even after the earthquakes died away, North Texas residents have wondered about the region's vulnerability to future earthquakes - especially since no map was available to pinpoint the existence of all known faults in the region. The new data, while still incomplete, benefited from information gleaned from newly released reflection seismic data held by oil and gas companies, reanalysis of publicly available well logs, and geologic outcrop information.

UT Austin and Stanford University provided the fault data and calculated fault slip potential. SMU, meanwhile, has been tracking seismic activity--recording when and where the earth shakes--since people in the Dallas-Fort Worth area felt the first tremors near DFW International Airport in 2008. A catalog of all those tremors was published in June in the journal BSSA.

SMU seismologists have also been the lead or co-authors of a series of studies on the North Texas earthquakes. SMU research showed that many of the Dallas-Fort Worth earthquakes were triggered by increases in pore pressure--the pressure of groundwater trapped within tiny spaces inside rocks in the subsurface. An independent study led by SMU seismologist Beatrice Magnani found that wastewater injection reactivated faults near Dallas that had been dormant for the last 300 million years.

DeShon said any future plan to drill for oil or natural gas in the Fort Worth Basin should proceed with an understanding that the basin contains several faults that are highly sensitive to pore-pressure changes. The study noted that rates of injection have dropped sharply in the Fort Worth Basin, but the practice still continues. Most of the injection has been concentrated in Johnson, Tarrant, and Parker counties, near areas of continued seismic activity.

"The largest earthquake the Dallas-Fort Worth region experienced was a magnitude 4 in 2015" DeShon said. "The U.S. Geological Survey and Red Cross provide practical preparedness advice for your home and work places. Just as we prepare for tornado season in north Texas, it remains important for us to have a plan for experiencing earthquake shaking."

Credit: 
Southern Methodist University

NASA analyzes new Atlantic depression's tropical rainfall

image: The Global Precipitation Measurement mission or GPM core satellite passed over Tropical Depression 3 at 5:21 a.m. EDT (0921 UTC) on July 23. GPM found the heaviest rainfall (orange) was northeast of the center of circulation. There, rain was falling at a rate of 25 mm (about 1 inch) per hour.

Image: 
NASA/JAXA/NRL

Tropical Depression 3 has formed off the eastern coast of central Florida. NASA analyzed the rainfall that the new depression was generating using the Global Precipitation Measurement mission or GPM core satellite.

The third depression of the Atlantic Ocean hurricane season developed around 5 p.m. EDT on July 22 about 120 miles (195 km) southeast of West Palm Beach, Florida.

The Global Precipitation Measurement mission or GPM core satellite passed over Tropical Depression 3 at 5:21 a.m. EDT (0921 UTC) on July 23. GPM found the heaviest rainfall was northeast of the center of circulation. There, rain was falling at a rate of 25 mm (about 1 inch) per hour. The National Hurricane Center noted in their discussion, "Although deep convection has redeveloped near and to the northeast of the low-level center, the overall convective appearance is somewhat ragged."

On July 23, the National Hurricane Center or NHC noted at 5 a.m. EDT (0900 UTC), the center of Tropical Depression Three was located near latitude 27.0 degrees north and longitude 79.5 degrees west. That puts the center of Tropical Depression 3 (TD3) about 40 miles (70 km) east-northeast of West Palm Beach, Florida, and about 55 miles (90 km) northwest of Freeport, Grand Bahama Island.

Maximum sustained winds had increased to near 35 mph (55 kph) with higher gusts. No significant increase in strength is anticipated and the depression is forecast to dissipate on Wednesday, July 24.

The depression is moving toward the north near 12 mph (19 kph). A motion toward north-northeast with an increase in forward speed is expected tonight, followed by a turn toward the northeast on Wednesday.

On the NHC forecast track, the center of the depression should remain offshore the coast of the southeastern United States through Wednesday.

GPM is a joint mission between NASA and the Japan Aerospace Exploration Agency, JAXA.

For forecast updates on TD3, visit: http://www.nhc.noaa.gov.

Credit: 
NASA/Goddard Space Flight Center

Is Instagram behavior motivated by a desire to belong?

image: Cyberpsychology, Behavior, and Social Networking is an authoritative peer-reviewed journal that explores the psychological and social issues surrounding the Internet and interactive technologies.

Image: 
Mary Ann Liebert, Inc., publishers

New Rochelle, NY, July 23, 2019--Do a desire to belong and perceived social support drive a person's frequency of Instagram use? The relationship between these motivating factors as predictors of Instagram use is examined in a new study in Cyberpsychology, Behavior, and Social Networking, a peer-reviewed journal from Mary Ann Liebert, Inc., publishers. The full-text article is available free on the Cyberpsychology, Behavior, and Social Networking website through August 23, 2019.

"Desire to Belong Affects Instagram Behavior and Perceived Social Support" was coauthored by Dorothy Wong, Krestina Amon, and Melanie Keep, University of Sydney (Australia). The researchers found that a desire to belong was a significant positive predictor of more frequent Instagram use and perceived social support in general and from friends and significant others. However, frequency of Instagram use did not predict perceived social support, and therefore it did not mediate the relationship between motivation and social support.

"In his well-known 'Hierarchy of Needs,' Abraham Maslow found the need to belong is one of the five innate human needs," says Editor-in-Chief Brenda K. Wiederhold, PhD, MBA, BCB, BCN, Interactive Media Institute, San Diego, California and Virtual Reality Medical Institute, Brussels, Belgium. "Understanding how Instagram and other image-based SNS may help individuals fulfill this need is important as more of our lives are played out online."

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

Stretch-sensing glove captures interactive hand poses accurately

image: The stretch-sensing soft glove captures hand poses in real time and with high accuracy. It functions in diverse and challenging settings.

Image: 
© ETH Zurich.

Capturing interactive hand poses in real time and with realistic results is a well-examined problem in computing, particularly human-centered computing and motion capture technology. Human hands are complex--an intricate system of flexors, extensors, and sensory capabilities serving as our primary means to manipulate physical objects and communicate with one another. The accurate motion capture of hands is relevant and important for many applications, such as gaming, augmented and virtual reality domains, robotics, and biomedical industries.

A global team of computer scientists from ETH Zurich and New York University has further advanced this area of research by developing a user-friendly, stretch-sensing data glove to capture real-time, interactive hand poses with much more precision.

The research team, including Oliver Glauser, Shihao Wu, Otmar Hilliges, and Olga Sorkine-Hornung of ETH Zurich and Daniele Panozzo of NYU, will demonstrate their innovative glove at SIGGRAPH 2019, held 28 July-1 August in Los Angeles. This annual gathering showcases the world's leading professionals, academics, and creative minds at the forefront of computer graphics and interactive techniques.

The main advantage of their stretch-sensing gloves, say the researchers, is that they do not require a camera-based set-up--or any additional external equipment--and can begin tracking hand poses in real time with only minimal calibration.

"To our best knowledge, our gloves are the first accurate hand-capturing data gloves based solely on stretch sensors," says Glauser, a lead author of the work and a PhD student at ETH Zurich. "The gloves are soft and thin, making them very comfortable and unobtrusive to wear, even while having 44 embedded sensors. They can be manufactured at a low cost with tools commonly available in fabrication labs."

Glauser and collaborators set out to overcome some persisting challenges in the replication of accurate hand poses. In this work, they addressed hurdles such as capturing the hand motions in real time in a variety of environments and settings, as well as using only user-friendly equipment and an easy-to-learn approach for set-up. They demonstrate that their stretch-sensing soft gloves are successful in accurately computing hand poses in real-time, even while the user is holding a physical object, and in conditions such as low lighting.

The researchers utilized a silicone compound in the shape of a hand equipped with 44 stretch sensors and attached this to a glove made of soft, thin fabric. To reconstruct the hand pose from the sensor readings, the researchers use a data-driven model that exploits the layout of the sensors. The model is trained only once; to gather training data, the researchers use an inexpensive, off-the-shelf hand pose reconstruction system.
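As a rough picture of what a data-driven sensor-to-pose mapping can look like, the sketch below fits a ridge regression from 44 simulated stretch-sensor readings to a 20-value joint-angle vector. The synthetic data, the pose dimensionality and the choice of ridge regression are illustrative assumptions; the team's actual model exploits the sensor layout and is more sophisticated than this.

```python
# Rough sketch of a data-driven sensor-to-pose mapping: ridge regression from
# 44 stretch-sensor readings to a vector of hand joint angles. The synthetic
# training data, the 20-angle pose vector and the choice of ridge regression
# are illustrative assumptions; the team's actual model exploits the sensor
# layout and is more sophisticated than this.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_samples, n_sensors, n_angles = 5000, 44, 20

# Stand-in for training pairs recorded with an off-the-shelf hand pose
# reconstruction system (sensor readings paired with joint angles).
true_map = rng.normal(size=(n_sensors, n_angles))
sensor_readings = rng.normal(size=(n_samples, n_sensors))
joint_angles = sensor_readings @ true_map + 0.05 * rng.normal(size=(n_samples, n_angles))

model = Ridge(alpha=1.0).fit(sensor_readings, joint_angles)  # trained once

# At run time, each frame of 44 readings is mapped to a pose estimate.
new_frame = rng.normal(size=(1, n_sensors))
predicted_pose = model.predict(new_frame)
print(predicted_pose.shape)  # (1, 20)
```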

For the study, they compared the accuracy of their sensor gloves to two state-of-the-art commercial glove products. For all but one hand pose, the researchers' novel stretch-sensing gloves produced the lowest error.

In future work, the team intends to explore how a similar sensor approach could be used to track a whole arm to get the global position and orientation of the glove, or perhaps even a full body suit. Currently the researchers have fabricated medium-sized gloves, and they would like to expand to other sizes and shapes.

"This is an already well-studied problem but we found new ways to address it in terms of the sensors employed in our design and our data-driven model," notes Glauser. "What is also exciting about this work is the multidisciplinary nature of working on this problem. It required expertise from various fields, including material science, fabrication, electrical engineering, computer graphics, and machine learning."

Credit: 
Association for Computing Machinery

Novel powdered milk method yields better frothing agent

image: Researchers were able to generate a powder that can be used as a natural foaming agent. A potential application is the use in vending machines for cappuccino coffee, as the foam is abundant and lasting. Below is a comparison of two cappuccino coffees ... one made with standard skim-milk powder and the other using powder obtained with high-pressure jet spraying and drying.

Image: 
Federico Harte, Penn State

A novel method of processing -- using high-pressure jets to spray milk and then quickly drying the spray -- yields skim milk powders with enhanced properties and functionality, according to Penn State researchers, who say the discovery may lead to "cleaner" labels on foods.

"Food manufacturers know consumers would like to see products that have ingredients that they can recognize," said Federico Harte, professor of food science. "The hope offered by our work is that we will be able to use milk proteins as emulsifiers or as foaming agents in food products in which a clean label is important, such as ice cream."

Milk proteins yielded by his new processing method could replace food emulsifying and foaming agents such as carrageenan, agar, albumin, alginates, glycerol monostearate, polysorbate, saccharides and lecithin, Harte pointed out.

"On the label, it would just say, 'milk proteins' -- that is something all consumers can recognize, nothing is synthetic," he said. "Concerns about 'clean labels' are growing in the food industry -- these are definitely buzzwords. There is no legal definition for what a clean label is, but the best way I can define it is a label that my grandmother can recognize all the ingredients."

There is nothing wrong with most of these unfamiliar ingredients, so far as we know, Harte added. But, increasingly, consumers do not want them, so the food industry hopes to remove synthetic ingredients such as emulsifiers and foaming agents from the labels of foods, using this novel processing technology.

Among the most promising properties researchers saw in the skim milk powder created by high-pressure jet spraying and drying milk were marked increases in foam expansion and foam-volume stability. That means the skim milk powder is a great candidate for use in lattes, Harte explained.

"The thing that we found most attractive was the enhanced foaming properties, and we may be able to develop vending machine mixing powders consisting of just milk and coffee that will create a long-lasting foam," he said. "How long do we need foam to last in hot lattes? Perhaps not for hours, but think about a bottled cold cappuccino coffee -- that is where we need a longer-lasting foam."

High-pressure jet processing of food is a completely new concept, Harte pointed out, and he has been experimenting with the idea for about four years at Penn State. His latest research, recently published in the Journal of Food Engineering, was conducted in a pilot plant in the Rodney A. Erickson Food Science Building. The study focused on a device that pressurized pasteurized skim milk using an intensifier pump and then sprayed the milk through a diamond or sapphire nozzle.

The liquid exits the nozzle as a jet of fine droplets that collide with the air, forming an aerosol. Then the spray is quickly dried to obtain skim milk powders. In comparison to liquids, Harte noted, powders possess a broader spectrum of applications due to the inherent shelf-life stability and lower cost associated with their transportation and storage.

The challenge now is to scale up the process for industry, Harte explained. And it will not be easy.

"The flow-through of these pumps is relatively low, that is the number one difficulty," he said. "We need to achieve a throughput that is attractive to industry. We are talking about a few liters per minute now, and industry needs hundreds of liters per minute. How do we get from here to there? We are discussing with the manufacturers of the pumps ways to scale this up."

Credit: 
Penn State

Daily e-cigarette use may help smokers quit regular cigarettes

BOSTON-- A new study from the Massachusetts General Hospital's (MGH) Tobacco Research and Treatment Center provides critical population-level evidence demonstrating that using e-cigarettes daily helps U.S. smokers to quit smoking combustible (i.e. regular) cigarettes.

The report, published in Nicotine and Tobacco Research online, provides the first longitudinal data about the effectiveness of e-cigarettes for cessation from a survey that is representative of the U.S. population. The MGH team analyzed data from the first three years of the Population Assessment of Tobacco and Health (PATH) study, a survey representative of the U.S. adult population that interviews the same individuals each year. The survey allowed the researchers to measure an individual's change in tobacco use over time.

Using data from more than 8,000 adult smokers, the investigators measured how likely a smoker was to quit smoking and stay quit, comparing daily and non-daily e-cigarette users with those who smoked only regular cigarettes. They found that smokers who used e-cigarettes every day, compared to e-cigarette non-users, were more likely to quit combustible cigarettes within one year and to stay quit for at least another year. They also found that smokers who used e-cigarettes were no more likely to relapse back to smoking regular cigarettes than smokers not using e-cigarettes.

At the start of the study, 3.6% of smokers were current daily e-cigarette users, 18% were current non-daily e-cigarette users, and 78% did not use e-cigarettes at all. By the second and third years of data gathering, daily e-cigarette users reported a higher rate of prolonged abstinence from cigarette smoking (11%) than non-users (6%). Smokers who used e-cigarettes, but not daily, were not more likely than non-users to demonstrate prolonged abstinence from combustible cigarettes.

"This finding suggests that smokers who use e-cigarettes to quit smoking need to use them regularly - every day--for these products to be most helpful," says lead author Sara Kalkhoran, MD, MAS, MGH physician and assistant professor of Medicine at Harvard Medical School.

"Smokers who plan to stop smoking should still be encouraged to first use FDA-approved therapies rather than e-cigarettes," says Nancy Rigotti, MD, senior author of the paper and director of the MGH Tobacco Research and Treatment Center. FDA-approved therapies for smoking cessation include varenicline, bupropion, or nicotine patches, gum, or lozenges. "But, this study suggests e-cigarettes may be helpful for some smokers who are not able to quit with these existing treatments," she added.

E-cigarettes contain nicotine but do not burn tobacco, which is responsible for many of the health problems associated with smoking combustible cigarettes. "For a smoker, e-cigarettes are less harmful to their health than continuing to smoke cigarettes," says Rigotti, who is also a professor of Medicine at Harvard Medical School. "But e-cigarettes have become popular so quickly that many questions remain about how they can best be used to help smokers to quit and minimize any harm." The third member of this MGH research team was Yuchiao Chang, PhD.

Although the rate of smoking in this country has been falling, the U.S. Centers for Disease Control and Prevention (CDC) reports that more than 34 million Americans currently smoke cigarettes. Smoking causes more than 480,000 deaths per year in this country alone, including more than 41,000 deaths from secondhand smoke exposure.

Credit: 
Massachusetts General Hospital

NASA finds depression strengthened into Tropical Storm Dalila

image: On July 23 at 5:35 a.m. EDT (0935 UTC), the MODIS instrument that flies aboard NASA's Aqua satellite showed strongest storms (yellow) in Tropical Storm Dalila had cloud top temperatures as cold as minus 80 degrees Fahrenheit (minus 62.2 Celsius).

Image: 
NASA/NRL

Satellite imagery on July 22 showed that wind shear was preventing the Eastern Pacific Ocean's Tropical Depression 5 from consolidating and strengthening. Infrared imagery from NASA's Aqua satellite on July 23 showed that the wind shear eased and the storm was able to strengthen.

NASA's Aqua satellite used infrared light to analyze the strength of storms within the system and found that the strongest storms circled the center of circulation, a change from 24 hours before, when wind shear pushed them away from the center. Now, the storm appears much more circular.

Infrared data provides temperature information, and the strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

On July 23 at 5:35 a.m. EDT (0935 UTC), the Moderate Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Aqua satellite gathered infrared data on the strengthened Tropical Storm Dalila. Strongest thunderstorms had cloud top temperatures as cold as minus 80 degrees Fahrenheit (minus 62.2 Celsius). Cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall.

At 5 a.m. EDT (0900 UTC) on July 23, the National Hurricane Center (NHC) reported that the center of newly formed Tropical Storm Dalila was located near latitude 18.0 degrees north and longitude 117.3 degrees west. That is about 585 miles (945 km) southwest of the southern tip of Baja California, Mexico. There are no coastal watches or warnings in effect.

Maximum sustained winds have increased to near 40 mph (65 kph) with higher gusts. The estimated minimum central pressure is 1005 millibars. Dalila is moving toward the north-northwest near 7 mph (11 kph). A turn to the northwest is anticipated on Wednesday, followed by a movement more to the west-northwest on Thursday and Friday.

NHC noted, "Some weakening is forecast to begin on Wednesday, and Dalila could degenerate into a remnant low on Thursday."

For updated forecasts, visit: https://www.nhc.noaa.gov

Credit: 
NASA/Goddard Space Flight Center

Cancer lab on chip to enable widespread screening, personalized treatment

image: Pathology labs mounted on chips are set to revolutionize the detection and treatment of cancer by using devices as thin as a human hair to analyze bodily fluids. The technology, known as microfluidics, promises portable, cheap devices that could enable widespread screening for early signs of cancer and help to develop personalized treatments for patients, said Ciprian Iliescu, a co-author of a review of microfluidic methods for cancer analysis published in the journal Biomicrofluidics. This image shows circulating tumor cells trapping on a porous membrane using microfluidics (scale bar is 10 micrometers).

Image: 
Florina Silvia Iliescu

WASHINGTON, D.C., July 23, 2019 -- A new generation of pathology labs mounted on chips is set to revolutionize the detection and treatment of cancer by using devices as thin as a human hair to analyze bodily fluids.

The technology, known as microfluidics, promises portable, cheap devices that could not only enable widespread screening for early signs of cancer but also help to develop personalized treatments for patients, said Ciprian Iliescu, a co-author of a review of microfluidic methods for cancer analysis published in the journal Biomicrofluidics, from AIP Publishing.

"If you isolate some cells and expose them to drug candidates, you can predict the response of the patient in advance," said Iliescu, a researcher at IMT-Bucharest in Romania. "Then you can track how the tumor is evolving in response to treatment."

The devices scan blood, saliva or urine for certain cells, proteins or tissue that are produced by tumors and then spread throughout the body.

The use of fluids as a liquid biopsy, instead of a conventional solid biopsy from a tumor, has many advantages. It is less invasive, reducing patient discomfort, and also provides information about hard-to-access tumors, such as in unborn children.

Because the biological clues, or biomarkers, of cancer end up in the bloodstream, a liquid biopsy can give insights into the genomic state of all the cancer in the body, including at its primary site and wherever it has spread. The authors call these insights understanding the "global molecular status of the patient."

The biggest challenge is the diversity of cancer. Each of the more than 100 known cancers has its own biomarkers, which the authors classify into four categories: cellular aggregates (circulating tumor microemboli); free cells (circulating tumor cells, circulating endothelial progenitor cells and cancer stem cells); platelets and cellular vesicles (exosomes); and macro- and nanomolecules (nucleic acids and proteins).

A wide range of microfluidic devices are being designed to isolate these biomarkers, leveraging the boom in nanofabrication in recent decades. Complex structures, such as forked flow channels, pillars, spirals and pools, precisely sieve and control flow rates, while surfaces are lined with molecules that attract specific species. Some devices also use electrical, magnetic or acoustic fields to help select the biomarker target and even have smart, built-in electronic circuits for data processing.

There are already devices on the market, such as CellSearch, which isolate circulating tumor cells. However, more sensitive and faster systems are being developed for many different cancer biomarkers.

Combining more than one method may help with accuracy, although at the cost of speed. Sensitivity can also be improved by culturing the biomarkers to increase their concentration. Iliescu said the field has potential but is still in its infancy.

"We need more and more clinical tests to bring this technology to maturity," he said.

Credit: 
American Institute of Physics

A new concept for self-assembling micromachines

image: Wheel mounting in seconds: as soon as a non-uniform electric field is switched on, the chassis of a microvehicle pulls its own wheels into wheel pockets. After just over a second, all the wheels are in place.

Image: 
MPI for Intelligent Systems / Nature Materials 2019

In the future, designers of micromachines can utilize a new effect. A team led by researchers from the Max Planck Institute for Intelligent Systems in Stuttgart has presented a concept that enables the components of microvehicles, microrotors and micropumps to assemble themselves in an electric field. The new concept may help to construct medical microrobots for use in the human body or to fit laboratory devices on a microchip.

Approximately half the thickness of a human hair, microvehicles could in the future deliver drugs directly to the source of disease, help with diagnosis and take minimally invasive surgery to the next level. However, miniaturization is also of interest for medical, biological and chemical laboratories. With a laboratory on a microchip, medical or environmental chemistry analyses that currently require a room full of equipment could also be performed on the move.

Researchers have long used self-assembly methods to build tiny machines whose components find each other: magnetic particles that come together in a magnetic field, for example, or components that dock to each other thanks to chemical reactions. They now have an additional principle for the self-assembly of micromachines in their toolbox. Scientists working under Metin Sitti, Director at the Max Planck Institute for Intelligent Systems, achieve this using "dielectrophoresis". A non-uniform electric field polarizes an electrically insulating plastic frame along with further plastic or quartz glass components. The polarized components, in turn, modify the non-uniform electric field. This effect depends on their shape and can be modeled theoretically on a computer. "If we change the shape of the components, we can control how the components attract each other," explains Yunus Alapan, who was instrumental in developing the concept. By carefully designing the components, a field is formed in which the parts position themselves precisely alongside each other as required for the construction.
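For the simplest textbook case of a small sphere of radius r in a non-uniform field, the time-averaged dielectrophoretic force takes the standard form below. The micromachine components described here are deliberately shaped and are not spheres, so this expression is only the idealized starting point, not the team's shape-dependent model.

```latex
% Textbook point-dipole dielectrophoretic force on a small sphere of radius r
% suspended in a medium of permittivity \varepsilon_m (idealized case only):
F_{\mathrm{DEP}} = 2\pi\,\varepsilon_m\, r^{3}\,
    \operatorname{Re}\!\left[K(\omega)\right]\, \nabla\!\left|E_{\mathrm{rms}}\right|^{2},
\qquad
K(\omega) = \frac{\varepsilon_p^{*} - \varepsilon_m^{*}}
                 {\varepsilon_p^{*} + 2\,\varepsilon_m^{*}}
```

Here K(ω) is the Clausius-Mossotti factor comparing the complex permittivities of the particle and the medium. Because the force scales with the gradient of the field intensity, a non-uniform field is essential, which is why the shapes of the components, and the field gradients they create, determine where the parts attract each other.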

A self-assembling magnetically propelled microvehicle

The technique allowed the researchers to design a microvehicle with a non-magnetic chassis and magnetic beads as wheels. "We designed the chassis with wheel pockets because, structurally, this generates forces that are ideal for attracting the magnetic wheels," says Alapan. "Only seconds after we turned on the electric field, the wheels were pulled into the wheel pockets."

For the vehicle to drive, the wheels need to be able to freely rotate. And this is precisely one of the advantages of the approach pursued by the Stuttgart researchers. "The components of our micromachines are not tightly bound," says Berk Yigit, who was involved in the research for his doctorate. "Rather than forming rigid connections, each part can move independently." The researchers were, therefore, able to drive the microvehicle using a rotating magnetic field that, likewise, rotated the wheels.

Utilizing the concept of dielectrophoretic self-assembly, the scientists from Stuttgart were able to assemble many other types of micromachines, including a micropump that could be deployed in a laboratory on a chip. They also designed machines that assemble themselves from several larger and smaller components into a more complex structure. And, using the electric field, they repositioned a microsphere to form a type of miniaturized bumper car. In one position they could propel the vehicle, while in two others they could turn it to the left or right. "Micromachines that have a high degree of mobility could in the future be used to deliver drugs or to manipulate individual cells - currently, constructing machines of this size is a huge challenge," says Metin Sitti. "Our new approach has the potential to reduce the complexity of such construction."

Credit: 
Max-Planck-Gesellschaft

Monsoon rains have become more intense in the southwest in recent decades

image: Individual, isolated monsoon rainstorms have gotten more intense and are happening more often in the Southwest US, according to an ARS study.

Image: 
ARS-USDA

TUCSON, ARIZONA, July 23, 2019--Monsoon rain storms have become more intense in the southwestern United States in recent decades, according to a study recently published by Agricultural Research Service scientists.

Monsoon rains--highly localized bursts of rain--have become 6 to 11 percent more intense since the 1970s, meaning the same amount of rain falls in a shorter amount of time. In addition, the number of rainfall events per year increased on average 15 percent during the 1961-2017 period.
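As a sketch of how per-event intensity trends of this kind can be computed from a gauge record, the example below summarizes each storm as depth divided by duration and fits a linear trend across years. The toy events are invented; the study relied on the dense Walnut Gulch Experimental Watershed gauge network, not a record like this.

```python
# Sketch of how per-event monsoon rain intensity trends can be computed from
# a gauge record: each event is summarized as depth / duration (mm per hour)
# and a linear trend is fit across years. The toy events below are invented;
# the study used the dense Walnut Gulch Experimental Watershed gauge network,
# not a record like this.
import numpy as np

# (year, rainfall depth in mm, duration in hours) for hypothetical events
events = [
    (1961, 12.0, 0.90), (1975, 14.0, 0.80), (1990, 15.0, 0.70),
    (2005, 16.0, 0.60), (2017, 17.0, 0.55),
]

years = np.array([year for year, _, _ in events], dtype=float)
intensities = np.array([depth / duration for _, depth, duration in events])  # mm/hr

slope, intercept = np.polyfit(years, intensities, 1)
print(f"Intensity trend: {slope:+.3f} mm/hr per year")
```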

Monsoon rain events are usually the result of strong convection, or upwelling air currents, caused by the difference in temperature between the earth's hot surface and the cooler atmosphere. They are characterized by intense downpours that last less than an hour.

"We attribute these monsoon rain increases to climate change in the southwest, which the General Circulation Models (GCMs) predicted would happen if the atmosphere gets warmer. What is unique about our study is that we have validated the GCM simulations with observed rainfall data," explained hydrologist/meteorologist Eleonora M. C. Demaria with the ARS Southwest Watershed Research Center in Tucson, Arizona, who co-led the study.

Temperatures in the Southwest have increased by 0.4 degrees F (0.22 degrees Celsius) on average per decade, which is likely a result of global climate change.

While the storms were, on average, each more intense, they do not appear to have become larger or to cover more territory.

"It is crucial that we track changes in individual rain storm intensities, especially in regions like the Southwest, where high-intensity, short duration storms are responsible for the majority of the annual rain fall. Such changes can have important impacts on the ecology and are more likely to cause problems such as flash floods," Demaria added. "These results also mean rangeland producers will also need more robust soil conservation plans to protect soils from erosion."

For transportation departments and developers, designs of bridges, culverts, and overall storm water drainage infrastructure must also be more robust, and more expensive, to handle the more intense rains.

This study is the first to measure the intensity of individual, very localized monsoon rain storms in the Southwest. Before now, analyses of the impact of a warmer atmosphere on monsoon rainfall intensities were contradictory. Some studies reported increases in rainfall intensities over time and others found decreases. These discrepancies stem from analyses that used too few rain gauges, spaced too far apart to capture the variability of monsoon storms, or climate models with grid cells too large to represent intense but isolated thunderstorms.

While as many as 90 percent of the individual rain gauges, which are now as close as 2,100 feet (640 m) apart, showed an increase in rainfall intensity, there were still some rain gauges that showed either a decrease or no change in intensity, which reflects the wide variability in where and how monsoon rains fall.

"But the Walnut Gulch Experimental Watershed rain gauge network in the southwest, now part of the nationwide Long Term Agro-ecosystems Research network, developed by ARS in the 1950s, made the precision measurements possible to give us a definitive answer. It was designed to be as spatially uniform as possible to be able to capture summer storms that are short-lived and localized to a small area," said ARS research hydraulic engineer David Goodrich, co-leader of the study.

Credit: 
US Department of Agriculture - Agricultural Research Service