With a zap of light, system switches objects' colors and patterns

image: A new system uses UV light projected onto objects coated with light-activated dye to alter the reflective properties of the dye, creating images in minutes.

Image: 
Image courtesy of Michael Wessely, Stefanie Mueller, et al

When was the last time you repainted your car? Redesigned your coffee mug collection? Gave your shoes a colorful facelift?

You likely answered: never, never, and never. You might consider these arduous tasks not worth the effort. But a new color-shifting "programmable matter" system could change that with a zap of light.

MIT researchers have developed a way to rapidly update imagery on object surfaces. The system, dubbed "ChromoUpdate," pairs an ultraviolet (UV) light projector with items coated in light-activated dye. The projected light alters the reflective properties of the dye, creating colorful new images in just a few minutes. The advance could accelerate product development, enabling product designers to churn through prototypes without getting bogged down with painting or printing.

ChromoUpdate "takes advantage of fast programming cycles -- things that wouldn't have been possible before," says Michael Wessley, the study's lead author and a postdoc in MIT's Computer Science and Artificial Intelligence Laboratory.

The research will be presented at the ACM Conference on Human Factors in Computing Systems this month. Wessely's co-authors include his advisor, Professor Stefanie Mueller, as well as postdoc Yuhua Jin, recent graduate Cattalyya Nuengsigkapian '19, MNG '20, visiting master's student Aleksei Kashapov, postdoc Isabel Qamar, and Professor Dzmitry Tsetserukou of the Skolkovo Institute of Science and Technology.

ChromoUpdate builds on the researchers' previous programmable matter system, called PhotoChromeleon. That method was "the first to show that we can have high-resolution, multicolor textures that we can just reprogram over and over again," says Wessely. PhotoChromeleon used a lacquer-like ink comprising cyan, magenta, and yellow dyes. The user covered an object with a layer of the ink, which could then be reprogrammed using light. First, UV light from an LED was shone on the ink, fully saturating the dyes. Next, the dyes were selectively desaturated with a visible light projector, bringing each pixel to its desired color and leaving behind the final image. PhotoChromeleon was innovative, but it was sluggish. It took about 20 minutes to update an image. "We can accelerate the process," says Wessely.

They achieved that with ChromoUpdate by fine-tuning the UV saturation process. Rather than using an LED, which uniformly blasts the entire surface, ChromoUpdate uses a UV projector that can vary light levels across the surface. So, the operator has pixel-level control over saturation levels. "We can saturate the material locally in the exact pattern we want," says Wessely. That saves time -- someone designing a car's exterior might simply want to add racing stripes to an otherwise completed design. ChromoUpdate lets them do just that, without erasing and reprojecting the entire exterior.
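
The release describes this pixel-level control only in prose. As a rough illustration (not MIT's actual software; the array shapes, function name, and tolerance value are all assumptions), a per-pixel UV exposure mask could be computed by comparing the texture already on the object with the new target design and saturating only where they differ:

```python
import numpy as np

def uv_exposure_mask(current_texture: np.ndarray,
                     target_texture: np.ndarray,
                     tolerance: float = 0.02) -> np.ndarray:
    """Return a per-pixel mask (1.0 = project UV here, 0.0 = leave alone).

    Both textures are H x W x 3 arrays of cyan/magenta/yellow dye values
    in [0, 1]. Only pixels whose target differs from the current texture
    are re-saturated, which is what lets a designer add racing stripes
    without erasing and reprojecting the whole surface.
    """
    changed = np.abs(target_texture - current_texture).max(axis=-1) > tolerance
    return changed.astype(float)

# Hypothetical usage: add one stripe to an otherwise finished 480 x 640 design.
current = np.zeros((480, 640, 3))           # the already-programmed texture
target = current.copy()
target[200:240, :, :] = [0.0, 1.0, 1.0]     # a stripe in magenta + yellow
mask = uv_exposure_mask(current, target)
print(f"UV projector needs to touch only {mask.mean():.1%} of the surface")
```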

This selective saturation procedure allows designers to create a black-and-white preview of a design in seconds, or a full-color prototype in minutes. That means they could try out dozens of designs in a single work session, a previously unattainable feat. "You can actually have a physical prototype to see if your design really works," says Wessely. "You can see how it looks when sunlight shines on it or when shadows are cast. It's not enough just to do this on a computer."

That speed also means ChromoUpdate could be used for providing real-time notifications without relying on screens. "One example is your coffee mug," says Wessely. "You put your mug in our projector system and program it to show your daily schedule. And it updates itself directly when a new meeting comes in for that day, or it shows you the weather forecast."

Wessely hopes to keep improving the technology. At present, the light-activated ink is specialized for smooth, rigid surfaces like mugs, phone cases, or cars. But the researchers are working toward flexible, programmable textiles. "We're looking at methods to dye fabrics and potentially use light-emitting fibers," says Wessely. "So, we could have clothing -- t-shirts and shoes and all that stuff -- that can reprogram itself."

The researchers have partnered with a group of textile makers in Paris to see how ChromoUpdate can be incorporated into the design process.

Credit: 
Massachusetts Institute of Technology

Being around children makes adults more generous

Adults are more compassionate and are up to twice as likely to donate to charity when children are present, according to a new study from psychologists.

The research, conducted by social psychologists at the University of Bath and Cardiff University and funded by the Economic and Social Research Council (ESRC), examined how the presence of children influences adults' compassionate motivations and behaviours.

Across eight experiments and more than 2,000 participants, the researchers asked adults to describe what typical children are like. After focusing on children in this way, participants subsequently indicated higher motivations towards compassionate values, such as helpfulness and social justice, and they reported greater empathy with the plight of other adults.

In a field study, which built on these findings, the researchers found that adult passers-by on a shopping street in Bath were more likely to donate to charity when a greater proportion of the people around them were children rather than adults.

When no children were present and all passers-by were adults, a student research team from the University of Bath observed roughly one donation every ten minutes. But when children and adults were equally present on the shopping street, adult passers-by made two donations every ten minutes.

These effects could not be accounted for by higher footfall during busy times or whether donors were accompanied by a child or not. Instead, they suggest that the presence of children can nudge adults to behave more generously and donate more often. The on-street donations were made to 'Bath Marrow', a charity which supports people with blood cancer.

Interestingly, these findings point to a widely applicable effect. The researchers observed that the 'child salience effect' was evident among both parents and non-parents, men and women, younger and older participants, and even among those who had relatively negative attitudes towards children. The researchers involved suggest these effects could also have widespread implications.

Lead researcher Dr Lukas Wolf from the Department of Psychology at Bath explains: "While previous evidence has shown that we are typically more helpful and empathetic towards children, no research has been done to date to examine whether the presence of children alone encourages us to be more pro-social towards others in general. Our research addresses this gap by showing that the presence of children elicits broad pro-social motivation and donation behaviour towards causes not directly related to children."

Dr Wolf says that this potential for widespread effect is important because it indicates society needs to consider new ways to involve children more directly in various aspects of life.

"Our findings showing the importance of children for compassionate behaviour in society provides a glimpse of a much bigger impact," he says.

"Children are indirectly dependent on how adults behave towards each other and towards the planet. Yet, children are also separated from many adult environments, such as workplaces and from political bodies where important decisions affect their futures."

He adds: "The finding that the presence of children motivates adults to be more compassionate towards others calls for more integration of children in contexts where adults make important long-term decisions, such as on climate change."

Various initiatives over recent years have been established to increase the prominence of young voices, for example Children's Parliaments. Future work from the researchers involved in this study will look in more detail at the nature of the child salience effect and its ramifications for society and the planet.

Credit: 
University of Bath

New understanding of ovarian follicle development may lead to novel reproductive therapies

BOSTON -- For the first time, researchers have shown how Mullerian inhibiting substance (MIS), also known as anti-Mullerian hormone, a key reproductive hormone, suppresses follicle development and prevents ovulation in females. "Understanding the mechanism of follicle development by MIS opens the door to creating novel approaches to contraception, preserving the eggs of young girls undergoing chemotherapy, enhancing the success of fertility treatment, and potentially delaying menopause," says David Pépin, PhD, an associate molecular biologist in the Department of Surgery at Massachusetts General Hospital (MGH) and senior author of new research published in the Proceedings of the National Academy of Sciences (PNAS). (In press at PNAS, DOI 10.1073/pnas.2100920118.)

Follicles are like small cocoons within the ovary that house eggs; when activated, they nurture the growth of an egg and secrete hormones that influence stages of the menstrual cycle. Women are born with all the ovarian follicles and immature eggs they will ever have--about a million--which are continuously used until they are depleted at menopause. Nearly all these follicles will never reach maturity, instead mostly degenerating during growth, leaving only the best to ovulate. As a result, only a few hundred will ever reach ovulation starting at puberty. "Even in utero, primordial--or immature--follicles start to activate and most are lost even before puberty is reached," says Pépin. Still, some primordial follicles can stay dormant for decades until they are roused and grow large enough to release an egg, a process that can take as long as a year. "One role of MIS is to slow the development of primordial follicles so that they last throughout the entire reproductive lifespan," says Pépin. "But until now, we didn't know how primordial follicles responded to MIS to stay dormant."

In a series of experiments using mice, the researchers conclusively showed that there is an MIS receptor on the primordial follicles' granulosa cells, which guide egg development--an area of previous debate--and that the hormone inhibits their growth, keeping them dormant. Surprisingly, MIS treatment also inhibited almost every cell type within the ovary and interfered with the communication between germ cells and granulosa cells, which is necessary to coordinate follicle growth. The researchers injected mice with a gene-therapy virus, which caused them to produce elevated levels of MIS.

"We discovered that high amounts of the hormone will shut down the ovaries, putting them in a kind of hibernation and preventing follicles from growing normally," says Marie-Charlotte Meinsohn, PhD, a research fellow in the Department of Surgery at MGH and lead author of the study. The researchers then identified the genes that were regulated by MIS in dormant follicles.

The research has multiple applications that are currently being investigated. Targeting the MIS receptor with a drug, for example, may help preserve the primordial follicles in girls undergoing chemotherapy for cancer, thus avoiding infertility. Learning how primordial follicles remain dormant could teach us how to slow the aging process in the ovaries by maintaining a greater reserve of follicles, thereby maintaining production of hormones such as estrogen. Delaying menopause may not only expand a woman's reproductive life, but it might also delay some of the health problems women encounter with the loss of estrogen after menopause, leading to healthier aging.

Therapeutic treatment with MIS may also enhance the success of in vitro fertilization. "One of the challenges of IVF is to synchronize the development of multiple follicles so one can retrieve more eggs," says Pépin. "If we can temporarily suppress follicle advancement with MIS, more synchronized follicles will be available to be stimulated with fertility treatment, resulting in the retrieval of many more eggs," as the team recently showed in a publication in the Journal of the Endocrine Society.

The researchers are also studying MIS as a novel hormonal contraceptive. "Other hormone contraceptives interfere with ovulation, which occurs at a late stage of follicle development," says Pépin. "We are interested in developing a contraceptive that blocks primordial follicles from maturing at an earlier stage so ovulation can't occur."

A contraceptive targeted at early follicle development could also prevent cycling and menstruation, which would be beneficial in disorders such as endometriosis and the heavy bleeding that can occur with uterine fibroids. The researchers are also investigating whether an injection of a gene-therapy virus that elevates levels of MIS can provide permanent contraception for feral cats and dogs.

The next step for the researchers is determining which of the identified genes regulated by MIS play the most important role in preventing primordial follicle activation. "Some of the pathways identified in this study could represent new drug targets which would allow us to translate MIS for the benefit of women's health," says co-author Patricia K. Donahoe, MD, director of the Pediatric Surgical Research Laboratories.

Credit: 
Massachusetts General Hospital

As wildfires increase in severity, experts call for coordinated federal response

image: Wildland Fire Workshop Report from the American Thoracic Society shares recommendations for the federal government ahead of wildfire season.

Image: 
ATS

(New York, NY) - May 3, 2021 - In advance of a wildfire season projected to be among the worst, the American Thoracic Society has released a report that calls for a unified federal response to wildfires that includes investment in research on smoke exposure and forecasting, health impacts of smoke, evaluation of interventions, and a clear and coordinated communication strategy to protect public health.

The report, Respiratory Impacts of Wildland Fire Smoke: Future Challenges and Policy Opportunities, was published online ahead of print in the Annals of the American Thoracic Society on May 3, 2021.

The report comes at a time when the U.S. is experiencing an increasing frequency of very large, destructive wildfires, due to years of fire suppression, population expansion, and a lengthening of the fire season because of climate change. For example, the Camp fire in California in 2018 is believed to be the deadliest in the state's history, and parts of the Midwest have also experienced record-breaking fire devastation. Inhalation of wildfire smoke can have serious health consequences, particularly for children, older adults, and those with lung diseases such as asthma and COPD. Less appreciated is that smoke plumes can travel hundreds or even thousands of miles, affecting regions far from the fire, where people may not even realize that distant smoke has degraded their air quality.

The report highlighted the need for more research "on the effects of long-term and repeated wildland fire smoke pollution on respiratory, cardiovascular, neurological and psychological health across life-stages, including developing fetuses and children" as well as firefighters.

"Wildland fires release complex mixtures of particles and gases into the air that, when inhaled, can harm the lungs and cardiovascular system in a number of ways," said Mary Rice, MD, MPH, the lead author of the report, a pulmonologist who studies the respiratory health effects of pollution. "We see the health effects of wildfire pollution in terms of more frequent asthma symptoms, emergency visits and hospitalization for asthma and COPD, and higher mortality during smoke events."

Children, whose lungs are still developing, and adults with lung disease are among those at highest risk, especially those in areas that experience fires season after season. Firefighters are repeatedly exposed to high levels of smoke, often with little respiratory protection. The report emphasizes that little is known about the long-term consequences of repeated smoke exposure, or the potential benefits of interventions to mitigate health effects.

The authors of the report maintain that a coordinated approach across agencies is essential to assessing and ultimately developing strategies to manage the health risk of wildland fires.

"Air pollution resulting from wildfires crosses borders and economic lines," said Dan Costa, ScD, a co-author of the report. "Therefore, dedicated research funding and ongoing support from the federal government is needed to fill knowledge gaps, including developing models for fire management as well as assessing toxicity levels and interventions."

Despite the public health impact, the authors found that the public's and even many physicians' appreciation of the health risks and consequences of wildfire smoke exposure was lacking.

"The air quality problem of wildfire smoke is only going to get worse," says Dr. Rice. "The medical community is in need of evidence-based guidance to protect patients and their families from the harms of smoke inhalation."

The report is the culmination of a workshop convened in 2019 that brought together a multidisciplinary group of 19 experts, including pediatric and adult pulmonologists, toxicologists, epidemiologists, public health officials and wildland fire managers.

Credit: 
American Thoracic Society

Equipping crop plants for climate change

Biologists at Ludwig-Maximilians-Universitaet (LMU in Munich) have significantly enhanced the tolerance of blue-green algae to high light levels - with the aid of artificial evolution in the laboratory. 

Sunlight, air and water are all that cyanobacteria (more commonly known as blue-green algae), true algae and plants need for the production of organic (i.e. carbon-based) compounds and molecular oxygen by means of photosynthesis. Photosynthesis is the major source of building blocks for organisms on Earth. However, too much sunlight reduces the efficiency of photosynthesis because it damages the 'solar panels', i.e. the photosynthetic machineries of cyanobacteria, algae and plants. A team of researchers led by LMU biologist Dario Leister has now used "artificial laboratory evolution" to identify mutations that enable unicellular cyanobacteria to tolerate high levels of light. The long-term aim of the project is to find ways of endowing crop plants with the ability to cope with the effects of climate change.

The cyanobacteria used in the study were derived from a strain of cells that had been grown at low levels of light. "To enable them to emerge from the shadows, so to speak, we exposed these cells to successively higher light intensities," says Leister. In an evolutionary process based on mutation and selection, the cells adapted to the progressive alteration in lighting conditions - and because each cell divides every few hours, the adaptation process proceeded at a far higher rate than would have been possible with green plants. To help the process along, the researchers increased the natural mutation rate by treating cells with mutagenic chemicals and irradiating them with UV light. By the end of the experiment, the surviving blue-green algae were capable of tolerating light intensities higher than the maximal levels that can occur on Earth under natural conditions.

To the team's surprise, most of the over 100 mutations that could be linked to increased tolerance to bright light resulted in localized changes in the structures of single proteins. "In other words, the mutations involved primarily affect the properties of specific proteins rather than altering the regulatory mechanisms that determine how much of any given protein is produced," Leister explains. As a control, the team then introduced the genes for two of the altered proteins, which affect photosynthesis in different ways, into non-adapted strains. In each case, they found that the change indeed enabled the altered cells to tolerate higher light intensities than the progenitor strain.

Enhancing the tolerance of crop plants to higher or fluctuating light intensities potentially provides a means of increasing productivity, and is of particular interest against the background of ongoing global climate change. "Application of genetic engineering techniques to plant breeding has so far concentrated on quantitative change - on making more or less of a specific protein," says Leister. "Our strategy makes qualitative change possible, allowing us to identify new protein variants with novel functions. Insofar as these variants retain their function in multicellular organisms, it should be possible to introduce them into plants."

Credit: 
Ludwig-Maximilians-Universität München

Newly identified saber-toothed cat is one of largest in history

image: Image of the humerus bone excavated from north central Oregon, which is now on display in the University of Oregon Museum of Natural and Cultural History.

Image: 
Photo courtesy of John Orcutt

COLUMBUS, Ohio - A giant saber-toothed cat lived in North America between 5 million and 9 million years ago, weighing up to 900 pounds and hunting prey that likely weighed 1,000 to 2,000 pounds, scientists reported today in a new study.

The researchers completed a painstaking comparison of seven uncategorized fossil specimens with previously identified fossils and bone samples from around the world to describe the new species. Their finding makes a case for the use of the elbow portion of the humerus - in addition to teeth - to identify fossils of large saber-toothed cats whose massive forearms enabled them to subdue their prey.

The newly identified cat weighed an average of around 600 pounds and may have been capable of killing prey weighing up to 6,000 pounds, the scientists estimate, suggesting that their findings are evidence of another giant cat, one of the largest in Earth's history.

"We believe these were animals that were routinely taking down bison-sized animals," said study co-author Jonathan Calede, an assistant professor of evolution, ecology and organismal biology at The Ohio State University's Marion campus. "This was by far the largest cat alive at that time."

Calede completed the study with John Orcutt, assistant professor of biology at Gonzaga University, who initiated the project. Orcutt found a large upper arm bone specimen that had been labeled as a cat in the University of Oregon Museum of Natural and Cultural History collection when he was a graduate student, and collaborated with Calede on the years-long effort to figure out what kind of cat it could be.

They have determined that the new species is an ancient relative of the best-known saber-toothed cat Smilodon, the famous fossil found in the La Brea Tar Pits in California that went extinct about 10,000 years ago.

The Oregon specimen was excavated on the traditional lands of the Cayuse, a tribe joined with the Umatilla and Walla Walla in the Confederated Tribes of the Umatilla Indian Reservation. In recognition of its origin, Calede and Orcutt collaborated with the Tamástslikt Cultural Institute to name the new species Machairodus lahayishupup. Machairodus is a genus of large saber-toothed cats that lived in Africa, Eurasia and North America, and in the Old Cayuse language, Laháyis Húpup means "ancient wild cat."

The study is published today (May 3, 2021) in the Journal of Mammalian Evolution.

Orcutt and Calede found similar uncategorized upper arm fossil specimens at the Idaho Museum of Natural History, where a big cat forearm was accompanied by teeth - generally considered the gold standard for identifying new species - as well as at the University of California Museum of Paleontology and Texas Memorial Museum.

"One of the big stories of all of this is that we ended up uncovering specimen after specimen of this giant cat in museums in western North America," Orcutt said. "They were clearly big cats. We started with a few assumptions based on their age, in the 5 1/2 to 9 million-year-old range, and based on their size, because these things were huge.

"What we didn't have then, that we have now, is the test of whether the size and anatomy of those bones tells us anything - and it turns out that yes, they do."

The largest of the seven Machairodus lahayishupup humerus fossils available for the analysis was more than 18 inches long and 1.7 inches in diameter. By comparison, the average modern adult male lion's humerus is about 13 inches long.

The researchers hypothesized that if an isolated forearm bone were useful in telling species apart, that would be true among the big cat species alive today. Calede and Orcutt visited numerous museums in the U.S., Canada and France to photograph forearm specimens of lions, pumas, panthers, jaguars and tigers, as well as fossils of previously identified extinct big cats.

Calede used software to place landmark points on each digitized sample that, when drawn together, would create a model of each elbow.

"We found we could quantify the differences on a fairly fine scale," Calede said. "This told us we could use the elbow shape to tell apart species of modern big cats.

"Then we took the tool to the fossil record - these giant elbows scattered in museums all had a characteristic in common. This told us they all belonged to the same species. Their unique shape and size told us they were also very different from everything that is already known. In other words, these bones belong to one species and that species is a new species."

The researchers calculated estimates of the new species' body size based on the association between humerus size and body mass in modern big cats, and speculated about the cat's prey based on its size and animals known to have lived in the region at that time: rhinoceros were particularly abundant, as well as giant camels and giant ground sloths.
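
The release does not give the regression behind these body-size estimates, but such estimates typically rest on an allometric (log-log) relationship between a limb-bone measurement and body mass fitted to living cats. The snippet below illustrates the general form only; the slope and intercept are placeholder values, not the study's fitted coefficients.

```python
import math

def estimate_body_mass(humerus_length_cm: float,
                       slope: float = 2.6,
                       intercept: float = -1.8) -> float:
    """Predict body mass (kg) via log10(mass) = slope * log10(length) + intercept.

    The slope and intercept here are placeholders; a real analysis would fit
    them to humerus measurements of modern big cats of known body mass.
    """
    return 10 ** (slope * math.log10(humerus_length_cm) + intercept)

# Hypothetical comparison: an ~18-inch (45.7 cm) fossil humerus vs. a ~13-inch (33 cm) lion humerus.
for label, length_cm in [("fossil humerus", 45.7), ("modern lion humerus", 33.0)]:
    print(f"{label}: ~{estimate_body_mass(length_cm):.0f} kg")
```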

The teeth from the Idaho Museum of Natural History came from the lower part of the jaw and did not include the saber-shaped canines, but provided additional evidence that the fossil belonged to the Machairodus genus, which gave its name to the machairodontines - the technical name for a saber-toothed cat, Orcutt said.

"We're quite confident it's a saber-toothed cat and we're quite confident it's a new species of the Machairodus genus," he said. "The problem is, in part because we haven't necessarily had a clear image in the past of how many species were out there, our understanding of how all these saber-toothed cats are related to each other is a little fuzzy, particularly early in their evolution."

Establishing that the humerus alone can be analyzed to identify a fossil cat has important implications for the field - saber-toothed cats' "big, beefy" forearm bones are the most common specimens of fossil cats found in excavations, he said.

Only a reconstruction of the evolutionary history of saber-toothed cats can determine where this new species fits in, but Orcutt and Calede believe Machairodus lahayishupup existed early in the evolution of the group.

A discovery that this giant cat in North America existed at the same time similar animals lived around the world also raises another evolutionary question, Calede said.

"It's been known that there were giant cats in Europe, Asia and Africa, and now we have our own giant saber-toothed cat in North America during this period as well," he said. "There's a very interesting pattern of either repeated independent evolution on every continent of this giant body size in what remains a pretty hyperspecialized way of hunting, or we have this ancestral giant saber-toothed cat that dispersed to all of those continents.

"It's an interesting paleontological question."

Credit: 
Ohio State University

Consumers make decisions based on how and why products are recommended online

UNIVERSITY PARK, Pa. -- As more people go online for shopping, understanding how they rely on e-commerce recommendation systems to make purchases is increasingly important. Penn State researchers now suggest that it's not just what is recommended, but how and why it's recommended, that helps to shape consumers' opinions.

In a study, the researchers investigated how people reacted to two product recommendation systems. The first system generated recommendations based on the user's earlier purchases -- often referred to as content-based recommendation systems. The second provided recommendations based on what other people bought -- called collaborative recommendation systems.
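
For readers unfamiliar with the two families of systems, here is a minimal sketch of the difference. The purchase data, product features, and scoring rules are toy inventions, not the recommenders used in the study: the content-based picker looks at what this user already bought, while the collaborative picker looks at what other users bought.

```python
import numpy as np

# Toy purchase matrix: rows = users, columns = products (1 = bought).
purchases = np.array([
    [1, 1, 0, 0],   # user 0
    [1, 0, 1, 0],   # user 1
    [0, 1, 1, 1],   # user 2
])
# Toy product feature vectors (e.g., category, price tier).
product_features = np.array([
    [1.0, 0.2], [0.9, 0.3], [0.1, 0.8], [0.2, 0.9],
])

def content_based(user: int) -> int:
    """Recommend the unbought product most similar to what this user already bought."""
    profile = product_features[purchases[user] == 1].mean(axis=0)
    scores = product_features @ profile
    scores[purchases[user] == 1] = -np.inf        # never re-recommend owned items
    return int(scores.argmax())

def collaborative(user: int) -> int:
    """Recommend the unbought product most popular among other users ("others also bought")."""
    others = np.delete(purchases, user, axis=0)
    popularity = others.sum(axis=0).astype(float)
    popularity[purchases[user] == 1] = -np.inf
    return int(popularity.argmax())

print("content-based pick for user 0:", content_based(0))
print("collaborative pick for user 0:", collaborative(0))
```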

The researchers, who report their findings in the Journal of Advertising, found that people who like to think and solve problems for themselves -- a personality type the researchers describe as "high need for cognition" -- find content-based recommendations more persuasive. However, those who are low in their need for cognition are more persuaded by collaborative recommendation systems, which may serve as a signal that other buyers have already vetted the product for them.

The nature of the recommendation system and its degree of confidence in suggesting the right products can be very important in guiding people when making online purchases, said S. Shyam Sundar, James P. Jimirro Professor of Media Effects in the Donald P. Bellisario College of Communications and co-director of the Media Effects Research Laboratory.

"In the pre-Internet era, before artificial intelligence, we would ask another person at a cocktail party, 'I heard you went to Italy, can you give me some recommendations, I'm going there next month,' as a way of gathering information for making our decisions," said Sundar, who is also an affiliate of Penn State's Institute for Computational and Data Sciences. "Now, we go online and can access information from just about everybody who has gone to Italy last month, not just the friend you ran into at the cocktail party. You are now able to get that information about the collective experience of others, as well as how it squares with your own background and prior travels."

According to Mengqi Liao, a doctoral student in mass communication and first author of the paper, a subtle "bandwagon effect" may be persuading people.

"From a layperson's perspective, we might not know that these are actually two different recommendation systems," said Liao. "One system might just tell the customer that the recommendation is based on what they bought before. But the collaborative recommendation system conveys that a lot of other people bought this product, which adds another layer of persuasive appeal."

The researchers also found that the effectiveness of the recommendation systems was tied to the type of product that the system recommended. When making decisions about experiences, such as movies, travel destinations and dining, consumers with a high need for cognition were more likely to respond to information about the extent to which the recommended product reflects their personal preferences -- expressed in terms of percentage match of products recommended by content-based filtering systems.

However, consumers with low need for cognition preferred collaborative filtering because they were more persuaded by the percentage of other people who purchased the recommended item, which also promoted their intentions to buy the item.

Such differences were not found for recommendations of "search products," information about which can be obtained by searching online. Both personality types preferred collaborative recommendation systems.

"You can think of it as a sort of cognitive outsourcing," said Sundar. "A customer might see the ad for a smart watch, for example, and see the features, but think, 'I'm not going to do the hard work of examining all the details and coming to a conclusion of which is better, I'll just outsource this to others.' If they say it's a good smart watch, then they'll buy it."

According to Liao, most research into recommendation systems focuses on optimizing the suggestions of these systems. These findings suggest that developers may need to consider other factors, such as personality types and product types, to improve the user experience of their systems, rather than focusing solely on the accuracy of their algorithm's suggestions.

"A lot may depend on how users receive the information on the recommendations provided by the systems," said Liao. "It matters why these systems are providing the recommendations for products and experiences."

The researchers recruited 469 people on an online crowdsourced microtask site for the study and randomly assigned them to an experimental website that either used a collaborative or content filtering algorithm.

For collaborative systems, the researchers used a percentage range to indicate how many similar people used the recommended product -- or percentage match -- and serve as a cue for the bandwagon effect. For content-based systems, the same percentage numbers were used to suggest the extent to which the recommended product matched the consumer's personal characteristics based on their user profile. There were three levels of percentage match indicators -- low, medium and high.

In testing the two different types of products -- search and experience -- the researchers used a smart watch recommendation as an example of a search product and a tourism destination recommendation to explore participants' reactions to experience products.

Before they browsed the e-commerce site, all participants responded to a series of questions to determine whether they were high need for cognition, or low need for cognition, personality types.

Because the researchers only tested two products and two common recommendation systems, future research could look at the psychological effects of other systems and investigate other types of products. The researchers said this could help verify the validity of their findings.

Credit: 
Penn State

CO2 catalysis made more accessible

Many industrial processes emit carbon dioxide into the atmosphere. Unfortunately, current electrochemical separation methods are expensive and consume large amounts of power. They also require expensive and rare metals as catalysts. A study in the journal Angewandte Chemie describes a new aerogel electrocatalyst formed from an inexpensive metal alloy, which enables highly efficient electrochemical conversion of carbon dioxide. The main product is formic acid, a nontoxic basic chemical.

Capturing and chemically fixing carbon dioxide from industrial processes would be a huge step towards carbon neutrality. To prevent the notorious greenhouse gas from escaping into the air, it can be compressed and stored. Another option is electrochemical conversion to give other carbon compounds.

However, due to high power consumption and the cost of catalysts, electrochemical separation methods cannot be used on an industrial scale. This prompted Tianyi Ma of Swinburne University of Technology in Hawthorn, Australia, and colleagues to investigate replacement materials. The electrocatalysts currently used are made from precious metals such as platinum and rhenium. They catalyze electrochemical carbon fixation processes very efficiently, but they are also very expensive.

The authors discovered that the nonprecious metals tin and bismuth can form aerogels, which are incredibly light materials with particularly promising catalyst properties. Aerogels contain an ultraporous network that promotes electrolyte transport. They also offer up abundant sites where the electrochemical processes can take place.

To produce the aerogels, the team mixed a solution of bismuth and tin salts with a reducing agent and a stabilizer. Simply stirring this mixture led to a stable hydrogel of a bismuth-tin alloy after six hours at room temperature. A straightforward freeze-drying process produced the aerogel, formed of loosely interwoven and branched nanowires.

The authors found the bimetallic aerogel performed outstandingly well for carbon dioxide conversion. Compared to pure bismuth, pure tin, or the non-freeze-dried alloy, a significantly higher current density was observed. The conversion took place with an efficiency of 93%, at least as efficient as, if not more efficient than, the standard materials currently used, indicating a low-waste process.
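
The release does not spell out how the 93% figure was measured; in electrochemical CO2 reduction such numbers usually refer to Faradaic efficiency, the fraction of the electrical charge passed that ends up in the desired product. Assuming that reading, the calculation looks like this (the two-electron stoichiometry of formic acid and the Faraday constant are standard chemistry; the run numbers are invented):

```python
# Faradaic efficiency sketch with illustrative numbers (not taken from the paper).
FARADAY = 96485.0          # coulombs per mole of electrons
ELECTRONS_PER_FORMATE = 2  # CO2 + 2 H+ + 2 e- -> HCOOH

def faradaic_efficiency(moles_formate: float, total_charge_coulombs: float) -> float:
    """Fraction of the passed charge that went into making formic acid."""
    charge_into_product = moles_formate * ELECTRONS_PER_FORMATE * FARADAY
    return charge_into_product / total_charge_coulombs

# Hypothetical electrolysis run: 1.5 mmol of formate produced from 311 C of charge.
print(f"Faradaic efficiency: {faradaic_efficiency(1.5e-3, 311.0):.1%}")
```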

The process showed "excellent selectivity and stability for the production of formic acid under normal pressure at room temperature." The only byproducts were carbon monoxide and hydrogen, formed in minuscule amounts. The authors explain that this selectivity and stability was a result of the energy conditions at the surface of the alloy. Here, the carbon dioxide molecules accumulate in such a way that the carbon atom is free to bind hydrogen atoms from water molecules. This gives formic acid as the favored product.

This research hints at positive future prospects for other combinations of metals. It is likely that other nonprecious metals would convert to aerogels, forming inexpensive, nontoxic, and highly efficient catalysts for electrochemical carbon dioxide reduction.

Credit: 
Wiley

Wildfire smoke trends worsening for Western US

image: Changes in daily average particulate matter in the month of August from 2000 to 2019. Points outlined in black indicate statistical significance.

Image: 
Kai Wilmot

From the Pacific Northwest to the Rocky Mountains, summers in the West are marked by wildfires and smoke. New research from the University of Utah ties the worsening trend of extreme poor air quality events in Western regions to wildfire activity, with growing trends of smoke impacting air quality clear into September. The work is published in Environmental Research Letters.

"In a big picture sense, we can expect it to get worse," says Kai Wilmot, lead author of the study and doctoral student in the Department of Atmospheric Sciences. "We're going to see more fire area burned in the Western U.S. between now and in 2050. If we extrapolate our trends forward, it seems to indicate that a lot of urban centers are going to have trouble in meeting air quality standards in as little time as 15 years."

Drawing the connection

Many of the West's inhabitants have seen smoky summer skies in recent years. Last year, dramatic images of an orange-tinted San Francisco Bay Area called attention to the far-reaching problem of wildfire smoke. Wilmot, a native of the Pacific Northwest, has seen the smoke as well and, with his colleagues, looked at trends of extreme air quality events in the West from 2000 to 2019 to see if they correlated with summer wildfires.

Using air measurements of PM2.5, or the amount of particulate matter in the air with diameters less than 2.5 microns, from the Environmental Protection Agency and the IMPROVE monitoring network, along with measurements of fire area burned and the PM2.5 emitted from those fires, the researchers found consistent trends in air quality that correlated with wildfire activity--but that had different spatial patterns in August than in September.
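
The statistical details are in the paper rather than the release, but the core operation, estimating a year-over-year trend in August-mean PM2.5 at each monitoring site and checking its significance (the "points outlined in black" in the figure caption above), can be sketched with synthetic data. This is a generic illustration, not the study's code, and it omits the fire-area and fire-emission terms.

```python
import numpy as np
from scipy import stats

def august_trend(years: np.ndarray, august_mean_pm25: np.ndarray):
    """Fit a linear trend (ug/m3 per year) to August-mean PM2.5 at one site.

    Returns the slope and its p-value; in a map of many sites, those with
    p < 0.05 would be the statistically significant ones.
    """
    slope, _, _, p_value, _ = stats.linregress(years, august_mean_pm25)
    return slope, p_value

# Synthetic example: one Pacific Northwest-like site, 2000-2019.
rng = np.random.default_rng(0)
years = np.arange(2000, 2020)
pm25 = 6 + 0.15 * (years - 2000) + rng.normal(0, 1.5, years.size)
pm25[[17, 18]] += 25   # a couple of extreme smoke years late in the record pull the mean up
slope, p = august_trend(years, pm25)
print(f"trend: {slope:+.2f} ug/m3 per year (p = {p:.3f})")
```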

Trends in August and September

Over the years studied, the researchers noticed that mean air quality in a typical August was worsening across the Pacific Northwest, driven by days when sensors indicated wildfire smoke events.

"That's pretty dramatic," Wilmot says, "that extreme events are strong enough to pull the mean up so that we're seeing an overall increase in particulate matter during August across much of the Pacific Northwest and portions of California. The Pacific Northwest seems like it's just really getting the brunt of it."

The reason for that, he says, is that the regions around the Pacific Northwest, in British Columbia and Northern California, both experience wildfires around August. The mountainous Pacific Northwest, Wilmot says, sits in the middle.

But by September, the researchers found, wildfire activity slows in British Columbia and shifts to the Rocky Mountains. The smoke shifts too--the researchers saw emerging trends correlating wildfire smoke with declines in September air quality in Wyoming and Montana. "We see the PM2.5 trends start to pick up a bit more in the Rockies and they become more statistically significant, a little bit stronger and more spatially coherent," Wilmot says.

What about Utah? The study findings show that the magnitude and significance of air quality trends increases as you go from the southern states of Arizona and New Mexico toward the Pacific Northwest. In Utah, Wilmot says, air quality trends are near the edge of statistical significance, with evidence for impact from wildfires, but evidence that's less robust than in the Pacific Northwest and California. "Thinking about events like the smoke transport from fires in the Bay Area this past summer," Wilmot says, "I would not be surprised to see trends in Utah become increasingly convincing with additional data."

Looking to the future

Researchers in other studies have suggested that the future will bring more fire area burned in the Western U.S., with an accompanying increase in wildfire smoke exposure throughout the West and in the impacts of that smoke on human health.

Wilmot notes that the trends the researchers see in the Pacific Northwest in August are "pretty robust," while the September trends in Montana and Wyoming are still "emerging."

"I think the concern is that, given more time, those emerging trends are going to start looking a lot more like what we're seeing in August," he says. "I hope that's not the case, but it seems entirely within the realm of possibility."

His next step is to develop simulation models to more precisely link smoke in urban centers to the regions where the wildfire emissions originate.

"The big picture," he says, "is aiming to help forest management in terms of identifying wildfire emissions hotspots that are particularly relevant to air quality in the Western U.S., such that if we had funding to spend on some sort of intervention to limit wildfire emissions, we would know where to allocate those funds first to get the most out of it."


Credit: 
University of Utah

Molecular biologists travel back in time 3 billion years

image: Suparna Sanyal is a Professor at the Department of Cell and Molecular Biology, Uppsala University.

Image: 
David Naylor

A research group working at Uppsala University has succeeded in studying 'translation factors' - important components of a cell's protein synthesis machinery - that are several billion years old. By studying these ancient 'resurrected' factors, the researchers were able to establish that they had much broader specificities than their present-day, more specialised counterparts.

In order to survive and grow, all cells contain an in-house protein synthesis factory. This consists of ribosomes and associated translation factors that work together to ensure that the complex protein production process runs smoothly. While almost all components of the modern translational machinery are well known, until now scientists did not know how the process evolved.

The new study, published in the journal Molecular Biology and Evolution, took the research group led by Professor Suparna Sanyal of the Department of Cell and Molecular Biology on an epic journey back into the past. A previously published study used a special algorithm to predict DNA sequences of ancestors of an important translation factor called elongation factor thermo-unstable, or EF-Tu, going back billions of years. The Uppsala research group used these DNA sequences to resurrect the ancient bacterial EF-Tu proteins and then to study their properties.
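
The release does not say which reconstruction algorithm the earlier study used; such work is typically done with maximum-likelihood methods over a phylogenetic tree. Purely to convey the idea of inferring an ancestral sequence from modern descendants, here is a much simpler parsimony-style reconstruction of a single sequence position; the tree, species labels, and residues are invented.

```python
def fitch_sets(tree, states):
    """Bottom-up pass of Fitch parsimony for one alignment column.

    tree   : dict mapping an internal node name to its (left, right) children
    states : dict mapping each leaf name to its observed residue
    Returns a dict of candidate ancestral residue sets for every node.
    """
    sets = {}

    def visit(node):
        if node not in tree:                         # leaf: just the observed residue
            sets[node] = {states[node]}
            return sets[node]
        left, right = [visit(child) for child in tree[node]]
        # Keep the intersection of the children's sets if possible, else their union.
        sets[node] = (left & right) or (left | right)
        return sets[node]

    visit("root")
    return sets

# Toy tree of four modern sequences; one amino-acid position is reconstructed.
tree = {"root": ("anc1", "anc2"), "anc1": ("E. coli", "T. aquaticus"),
        "anc2": ("B. subtilis", "M. tuberculosis")}
observed = {"E. coli": "A", "T. aquaticus": "A",
            "B. subtilis": "S", "M. tuberculosis": "A"}
print(fitch_sets(tree, observed)["root"])   # candidate residues for the deep ancestor
```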

The researchers looked at several nodes in the evolutionary history of EF-Tu. The oldest proteins they created were approximately 3.3 billion years old.

"It was amazing to see that the ancestral EF-Tu proteins matched the geological temperatures prevailing on Earth in their corresponding time periods. It was much warmer 3 billion years ago and those proteins functioned well at 70°C, while 300 million year old proteins were only able to withstand 50°C," says Suparna Sanyal.

The researchers were able to demonstrate that the ancient elongation factors are compatible with various types of ribosome and therefore can be classified as 'generalists', whereas their modern descendants have evolved to fulfil 'specialist' functions. While this makes them more efficient, they require specific ribosomes in order to function properly. The results also suggest that ribosomes probably evolved their RNA core before the other associated translation factors.

"The fact that we now know how protein synthesis evolved up to this point makes it possible for us to model the future. If the translation machinery components have already evolved to such a level of specialisation, what will happen in future, for example, in the case of new mutations?" ponders Suparna Sanyal.

The fact that researchers have demonstrated that it is possible to recreate such ancient proteins, and that extremely old translation factors work well with many different types of ribosome, indicates that the process is of potential interest for protein pharmaceuticals research. If it turns out that other ancient components of protein synthesis were also generalists, it might be possible to use these ancient variants to produce therapeutic proteins in future with non-natural or synthetic components.

Credit: 
Uppsala University

Time for a mass extinction metrics makeover

New Haven, Conn. -- Researchers at Yale and Princeton say the scientific community sorely needs a new way to compare the cascading effects of ecosystem loss due to human-induced environmental change to major crises of the past.

For too long, scientists have relied upon metrics that compare current rates of species loss with those characterizing mass extinctions in the distant past, according to Pincelli Hull, an assistant professor of Earth and planetary sciences at Yale, and Christopher Spalding, an astrophysicist at Princeton.

The result has been projections of extinction rates in the next few decades that are on the order of a hundred times higher than anything observed in the last few million years of the fossil record.

"The problem with using extinction rates this way is that their assessment is riddled with uncertainty," said Hull, who has conducted extensive research on mass extinctions of marine life in the ancient world. "We need a better thermometer for biodiversity crises."

Furthermore, the researchers said, mass extinction predictions do not fully convey the severity of damage done to an ecosystem when species are depleted but not entirely wiped out.

In a new study in the journal Proceedings of the Royal Society B, Spalding and Hull point out deep flaws in the way mass extinctions are being projected and propose a new model for assessing biodiversity loss.

Part of the problem, they said, has to do with comparing extinctions found in the fossil record over millions of years with human-influenced extinctions from only the past century. Mass extinctions in the ancient world were typically characterized by "pulses" of extinctions, preceded and followed by quieter periods; the longer time frame reduces the historic average because it includes the surrounding quiet periods.
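
A back-of-the-envelope calculation, with numbers invented purely to show the arithmetic, makes the dilution problem concrete:

```python
# Toy illustration of how a long averaging window dilutes an extinction pulse.
pulse_extinctions = 500            # species lost in a short, intense pulse
pulse_duration_yr = 10_000         # length of the pulse itself
window_duration_yr = 1_000_000     # length of the fossil-record bin that contains it

rate_during_pulse = pulse_extinctions / pulse_duration_yr     # 0.05 species per year
rate_over_window = pulse_extinctions / window_duration_yr     # 0.0005 species per year

print(f"rate within the pulse:   {rate_during_pulse:.4f} species per year")
print(f"rate over the whole bin: {rate_over_window:.4f} species per year")
print(f"dilution factor:         {rate_during_pulse / rate_over_window:.0f}x")
```

A modern rate measured over a single century is therefore being compared against ancient rates that are averaged down by the quiet intervals surrounding each pulse.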

What's more, there are large gaps in the ancient fossil record. For example, it is well documented that frog species today are at high risk of extinction -- yet frogs are only rarely found in the fossil record. In addition, certain habitats with many extinctions today -- such as islands -- are also not represented in the ancient fossil record. Rather, the fossil record tends to be dominated by larger species and geographically larger habitats.

"It's difficult to confidently deduce whether today's rates are objectively higher than those of the fossil record," Spalding said. "Meanwhile, we know that ecosystems may be totally decimated, yet suffer very few extinctions. In that sense, extinction rates may even underestimate our influence upon the biosphere."

Spalding and Hull took pains to describe the perilous state of the natural world today, beyond the numbers of species extinctions. According to an Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) report in 2019, nearly 75% of all freshwater resources on Earth are used by crop and livestock production; human activities have significantly altered 75% of all ice-free terrestrial environments and 66% of marine environments.

Spalding and Hull's proposal is to change the metric from species loss to changes in the rocks beneath their feet.

"Humans change the rock record as soon as they enter an area, whether it is agrarian societies, beaver trapping, or the damming of rivers," Hull said. "We completely change the way the Earth forms itself and this can be seen in the rocks left behind."

The researchers said a variety of measurable metrics -- such as the chemical composition of sediments and grains of rocks -- are more readily comparable to ancient timescales.

"Historical comparisons offer the hope that we might begin to understand the relative scope and the eventual ramifications of our modification of the biosphere," Spalding said. "If we think these comparisons are important, we need to get them right."

Credit: 
Yale University

Suppressing the impact of COVID-19 using controlled testing and isolation

The outbreak of COVID-19 has revealed the widespread effects a pandemic can have on all spheres of life, from health, to social life, to the economy. The main thrust of efforts to control the spread has been to decrease the reproduction rate to flatten the curve of the total number of infected individuals per day in order to reduce overload on the health system. The most widely implemented response to the exponential growth of the infection has been widespread quarantine and lockdown. While isolation is an effective tool to decelerate the spread, repeatedly imposing complete quarantine for a relatively long period of time until the virus is suppressed has negative effects on people's lives and on the economy. Moreover, if proper population monitoring is not enforced, further waves of the disease may ensue. Therefore, early detection of positive cases is of paramount importance to suppress the spread of COVID-19 as soon as possible.

A second component in fighting the pandemic is to devise a better testing policy. Today, health systems around the world prioritize the administration of tests to people who have had direct contact with an infected person or who are symptomatic. Some countries are devoting extensive resources to testing people widely and randomly to detect areas where the pandemic is likely to spread. Nevertheless, these strategies do not use resources effectively.

A recent study, published in the journal Scientific Reports by Prof. Amir Leshem, from the Kofkin Faculty of Engineering at Bar-Ilan University, and Dr. Kobi Cohen, of Ben-Gurion University, offers a more effective method for the prioritization of tests using a feedback methodology. The key to the proposed method is to test individuals with a high probability of being infected and identify them before symptoms appear. The probability of individuals being infected acts as an input to selecting additional individuals for testing. Those with the highest probability of infection are selected. Their test results then act as feedback to update the probability of other individuals being infected, and so on.
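
The release describes the method only at this high level. The toy simulation below is a schematic paraphrase, not the authors' algorithm: it ranks people by estimated infection probability, tests the top few each day, and feeds the results back into their contacts' probabilities. The update rule, contact model, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200                                             # toy population size
true_status = rng.random(n) < 0.05                  # hidden ground truth (simulation only)
contacts = rng.random((n, n)) < 0.02                # toy contact graph: contacts[i, j] = True if i met j
p = np.full(n, 0.05)                                # estimated infection probability per person
already_tested = np.zeros(n, dtype=bool)
daily_capacity = 6

for day in range(5):
    # Rank untested people by estimated probability and test the highest-risk few.
    candidates = np.where(~already_tested)[0]
    order = candidates[np.argsort(p[candidates])[::-1]]
    tested = order[:daily_capacity]
    already_tested[tested] = True

    # Feedback step: results update the probabilities of each tested person's contacts.
    for person in tested:
        if true_status[person]:                     # positive result
            p[contacts[person]] = np.maximum(p[contacts[person]], 0.5)
            p[person] = 1.0
        else:                                       # negative result
            p[person] = 0.0

print(f"confirmed positives after 5 days of targeted testing: {int((p == 1.0).sum())}")
```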

The researchers demonstrated that by reducing the lapse in time between contact with an infected person and taking a COVID-19 test, the proposed method, when combined with contact tracing, may reduce the rates of isolation and mortality by up to 50% compared to existing methods. The key is to test individuals with a high probability of being infected before symptoms appear. These probabilities are updated based on contact tracing and test results. Proper use of the controlled testing methodology significantly improves the outcome for any testing capacity, but the most significant gains occur when daily testing capacity is between 0.3% and 3% of the population.

The authors also demonstrated that for small, new outbreaks, controlled testing can prevent the large spread of new waves. "Tracing and quarantining only those at risk -- for instance, those in close contact with an infected individual, depending on the duration of contact and the distance -- can contain epidemics through surveillance rather than aggressive lockdowns," says Prof. Leshem. "There are two elements to halting epidemics: quarantine and testing," he explains. "One approach is to test only those who have symptoms, as was done during Israel's first lockdown, but this proved ineffective. Another approach is to randomly perform as many tests as possible. According to calculations, random tests will only be effective if 5% of the population is tested. In our article we propose accelerating the process by initiating tests rather than waiting for people to seek them. Technological means, such as contact tracing, can facilitate this." If people suspected of being infected were called in for testing within a day, based on data regarding their proximity to someone known to have contracted the virus, infection rates and morbidity would be reduced, unnecessary tests would not be performed, and less unnecessary isolation would be imposed.

Prof. Leshem and Dr. Cohen propose the use of technology that is not based on GPS and does not involve tracking geographical locations, but rather depends on proximity between devices that transmit signals over short distances, such as Bluetooth. With this less-intrusive method, the individual's identity is random and stored in the device, and is "exposed" only if they have been in the vicinity of a person who has tested positive within their Bluetooth range. People will be called in for a test based on the number of infected people who have been around them and the length of time they were in the same vicinity. The risk will be calculated using a dedicated algorithm.
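
The "dedicated algorithm" is not described in the release. The snippet below is only one plausible shape for such a score, weighting each logged encounter with a later-confirmed case by its duration and, inversely, by its estimated distance; the function name, weighting, and thresholding idea are all assumptions.

```python
def exposure_risk_score(exposures: list[tuple[float, float]],
                        distance_scale_m: float = 2.0) -> float:
    """Toy risk score from Bluetooth-style proximity logs.

    `exposures` is a list of (duration_minutes, estimated_distance_m) pairs,
    one per encounter with a person who later tested positive. Longer and
    closer encounters contribute more; the exact weighting is invented.
    """
    return sum(duration / (1.0 + (distance / distance_scale_m) ** 2)
               for duration, distance in exposures)

# Hypothetical day of logged encounters: 30 min at ~1 m, 10 min at ~4 m, 2 min at ~0.5 m.
score = exposure_risk_score([(30, 1.0), (10, 4.0), (2, 0.5)])
print(f"risk score: {score:.1f}  -> call in for testing if above a chosen threshold")
```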

Furthermore, the adoption of controlled testing methods may prevent new outbreaks (resulting from variants or other viruses) in their early stages, as well as help during widespread outbreaks. Consider, for example, a scenario of a single outbreak in a population of 100,000 people, assuming that only 70% of the population complies with the isolation provisions and that the tests are only 80% reliable. Existing methods were found to require more than a million cumulative days of isolation until the spread was completely eradicated, with mortality reaching about 2,000 people. In contrast, the method proposed by Leshem and Cohen required only half a million days of isolation until the spread was completely eradicated, while mortality reached only about 1,000 people. Whereas existing policies adopted by governments around the world have failed to control the outbreak and led to full lockdowns, the proposed method could save around 8 percent of GDP for each month of full lockdown avoided.

Credit: 
Bar-Ilan University

Digital mental health interventions for young people are perceived as promising, but are they effective?

April 29, 2021 - An increasing number of digital mental health interventions are designed for adolescents and young people with a range of mental health issues, but the evidence on their effectiveness is mixed, according to research by Columbia University Mailman School of Public Health and Spark Street Advisors.

Computerized cognitive behavioral therapy was found effective for anxiety and depression in adolescents and young people, holding promise for increasing access to mental health treatment for these conditions. However, the effectiveness of other digital interventions, including therapeutic video games, mobile apps, and social networking sites, and of interventions addressing a range of other mental health outcomes, remains inconclusive. The findings are published online in the journal JMIR Mental Health.

According to UNICEF, nearly 1 in 5 adolescents experience a mental health disorder each year but because of barriers to accessing and seeking care, most remain undiagnosed and untreated.

"While there is evidence that some interventions can be effective when delivered digitally, it is still somewhat of a wild west when it comes to digital mental health apps," said Nina Schwalbe, adjunct assistant professor of Population and Family Health at Columbia Mailman School.

The researchers conducted an analysis of 18 systematic reviews and meta-analyses of digital health interventions. Beyond the findings on computerized cognitive behavioral therapy, digital interventions in some therapeutic areas improved outcomes relative to waitlist controls, suggesting that such interventions can be used to supplement, and in some cases supplant, traditional mental health treatment where access to care is limited or wait times are long.

The investigators point out that the vast majority of interventions studied (over 90 percent) were implemented in high-income countries, with very little information reported about participants' backgrounds. Therefore, the generalizability of the findings to young people from different socioeconomic, cultural, racial, or other communities is weak. "It is critical to assess the effectiveness among different racial and ethnic groups and across geographies," observed Susanna Lehtimaki of Spark Street Advisors.

"There was also no indication of costs of developing the tools or long-term benefits," noted Susanna Lehtimaki of Spark Street Advisors. "Moving forward with effective digital health interventions, it will be important to understand how they fit within the public health ecosystem and to what extent they are effective across a range of settings with different resources or populations."

According to the research, digital mental health interventions were well accepted by those 10 to 24 years of age; however, dropout was common and adherence weak. Engaging a health professional, peer, or parent as part of the digital intervention was found to strengthen effectiveness.

Schwalbe notes, "In the spirit of "do no harm" it is really important that the excitement over the promise of digital mental health interventions does not cloud the need for high quality effectiveness studies in a range of settings and with a diverse group of youth." She also notes, "it should go without saying that adolescents also need to be consulted in every stage of the design process and while it may be assumed that young people prefer digital services, we need to continually challenge whether this is true."

Credit: 
Columbia University's Mailman School of Public Health

Study: Older adults found resilience during pandemic through community, human connection

Older adults were significantly affected by isolation and stress during Oregon's initial COVID-19 lockdown last spring, but they were also able to find connection and meaning in community, new hobbies and time for themselves, a recent Oregon State University study found.

If resilience is understood as the ability to see positives in the midst of a negative situation, then many of the study's participants demonstrated resilience during that time, the researchers said.

"A lot of times we think about resilience as a personality trait, and it's true that there are some qualities that may help people experience that. But in the end, resilience is something that is shared," said Heidi Igarashi, first author on the study and a recent doctoral graduate of OSU's College of Public Health and Human Sciences. "One of the things that came out in our study was the degree to which the people-connection was really significant."

The study, published in the Journals of Gerontology: Psychological Sciences, surveyed 235 adults ages 51 to 95 about their experiences from April 28-May 4, 2020, when Oregon's statewide stay-at-home order had been in place for about a month.

The online survey asked participants about recent and ongoing difficulties in their lives caused by COVID-19, as well as recent positive experiences.

People shared experiences at the personal, interpersonal and societal levels. Personal difficulties included the stress of constant vigilance around ensuring safety in everyday activities, as well as fear of death and uncertainty about the future. Interpersonal challenges included social isolation, lack of physical contact and fear for loved ones' health. Societal stressors were centered on lack of scientific leadership and concerns for the community at large.

While 94% of participants listed difficulties, roughly 63% shared positive experiences. At the personal level, these included things like trying new projects -- gardening, cooking -- and increased gratitude for the simpler, slower pace of life. Interpersonal joys were found in new friendships or reconnecting with old friends, and in people caring for one another. At the societal level, some noted the benefit to the environment from people driving less and the sense of increased community solidarity.

Older adults took comfort in seeing neighbors and friends taking care of each other, while simultaneously adding to community resilience by looking after friends and neighbors themselves and joining group efforts like mask-sewing drives, said co-author Carolyn Aldwin, the Jo Anne Leonard Endowed Director of the Center for Healthy Aging Research at OSU.

"It's a mistake to think of older adults as just being sort of victims during COVID," Aldwin said. "They're a lot more resilient than we think they are, and they're important for the community."

Many of the survey respondents engaged in Zoom calls with family and friends, enjoyed time spent in nature and finally finished projects that had been sitting in the closet or garage.

Retired folks had a harder time than those who were employed, because the lockdown was more disruptive to their routines, including closing off regular volunteer opportunities due to older adults' high-risk status. But some respondents reported feeling relief at being able to focus on themselves for a change, with pursuits like meditation and journaling, rather than spending all their time caring for other people.

The study was conducted via internet survey, which affected who responded; the majority of participants were white, female, retired and highly educated, in contrast to the racial, ethnic and socioeconomic groups that have been hardest hit by COVID-19 infections and deaths, the researchers said.

But Aldwin cautions against assumptions about resilience among less-advantaged groups. While they may have experienced more loss and financial distress, a key factor in resilience is being able to find purpose in life, which can occur through helping others.

"There's this meaning that's found in caregiving, a reason for living, where our study group often didn't have these demands on them, and they were feeling a lack of sense of meaning," she said. "If you're the person who's holding the family together during this crisis, that's a source of meaning. Clearly we would have seen more loss and more difficulty, but we also might have seen sources of resilience that we didn't see in the study group."

Credit: 
Oregon State University

Kratom use rare, but more common among people with opioid use disorder

Less than one percent of people in the United States use kratom, a plant-based substance commonly used to manage pain and opioid withdrawal, according to a study published in the American Journal of Preventive Medicine. However, the use of kratom--which is legal but carries the risk of addiction and harmful side effects--is more prevalent among people who use other drugs, particularly those with opioid use disorder.

Derived from a tree native to Southeast Asia, kratom can be taken as a pill, capsule, or extract, or brewed as a tea. It acts on the brain's opioid receptors; at low doses, kratom is a stimulant, while at higher doses, it can relieve pain. Some people report using kratom as a substitute for opioids in an effort to limit their opioid use and ameliorate withdrawal. Others use kratom recreationally for relaxation or to self-treat pain, anxiety, or depression.

Kratom is legal at the federal level and in most states, but safety concerns have led to multiple warnings from the U.S. Food and Drug Administration and the Drug Enforcement Administration identifying it as a "drug of concern." Long-term or frequent use can lead to dependence, and adverse events, ranging from mild to severe, have been reported. Kratom has been linked to thousands of poisonings and hundreds of deaths in the U.S., although most involve the use of other drugs, especially opioids.

"Few national studies have examined kratom use among the general population, and such studies can give us a better idea regarding who has been using the substance," said study author Joseph Palamar, PhD, MPH, an associate professor of population health at NYU Grossman School of Medicine and an affiliated researcher with the Center for Drug Use and HIV/HCV Research (CDUHR) at NYU School of Global Public Health.

Using the 2019 National Survey on Drug Use and Health, which captured data on 56,136 adolescents and adults in the U.S., Palamar examined how many people use kratom and what other substances they use. He found that an estimated 0.7 percent of adults and adolescents used kratom in the past year.

Kratom use was more prevalent among people who use other drugs, including cannabis, stimulants, and cocaine, and was particularly common among those who misuse prescription opioids, with 10.3 percent of people with opioid use disorder reporting kratom use.

Men, white people, and those with depression and serious mental illness were also more likely to report using kratom. Teens and adults over 50 were less likely to report use.

"This study adds to our understanding of kratom's prevalence and its connection to opioid misuse," said Palamar. "More research is needed to determine the substance's effectiveness in treating opioid withdrawal, and more research is needed to determine how safe this substance is when combined with other drugs."

Credit: 
New York University