'Swiss Army knife' catalyst can make natural gas burn cleaner

Image: Reza Shahbazian-Yassar, professor of mechanical and industrial engineering at the University of Illinois Chicago, and Zhennan Huang, a Ph.D. student in Shahbazian-Yassar's lab and co-first author of the paper. (Photo: Jenny Fontaine/UIC)

Shahbazian-Yassar and colleagues facilitated the development of a cutting-edge "Swiss Army knife" catalyst made up of 10 different elements - each of which on its own can reduce the combustion temperature of methane - plus oxygen. This unique catalyst can cut the combustion temperature of methane roughly in half, from above 1,400 kelvin down to 600 to 700 kelvin.

Their findings are reported in the journal Nature Catalysis.

In previously published research, Shahbazian-Yassar and colleagues demonstrated the ability to create multi-element nanoparticle catalysts, known as high-entropy alloys, using a unique shock-wave technique. Before this, materials scientists made no serious attempts to create nanoparticles out of more than three elements because each element's atoms tend to separate from the others, rendering the particle useless.

Taking advantage of the unique real-time, high-temperature electron microscopy system at UIC, Shahbazian-Yassar's team showed that high-entropy nanoparticles made up of 10 metal oxides were highly stable at temperatures up to 1,073 kelvin, and that the individual elements were distributed evenly throughout each nanoparticle, forming a single, stable, solid-state crystalline structure.

Their metal oxide alloy contained various mixtures of transition metals, rare-earth elements, and noble metals, plus oxygen.

"It is almost impossible to maintain a perfect mix of these elements in a solid phase due to the differences in atomic radius, crystal structure, oxidation potential, and electronic properties of the elements," said Zhennan Huang, a Ph.D. student in Shahbazian-Yassar's lab and co-first author of the paper. "But we were able to show that this is possible."

"Among multiple alloys with multiple elements that we created, the particles made of 10 elements not only were most effective in reducing the combustion point of methane gas but also the most stable at those temperatures," said Shahbazian-Yassar, who is a corresponding author on the paper.

The researchers believe the catalyst could be used to reduce the output of harmful greenhouse gases produced by burning natural gas in individual households, to power turbines and even in cars that run on compressed natural gas.

Credit: 
University of Illinois Chicago

Study finds new evidence of health threat from chemicals in marijuana and tobacco smoke

Image: Marijuana plant (Pexels)

Scientists at Dana-Farber Cancer Institute and the Centers for Disease Control and Prevention have uncovered new evidence of the potential health risks of chemicals in tobacco and marijuana smoke.

In a study published online today by EClinicalMedicine, the researchers report that people who smoked only marijuana had several smoke-related toxic chemicals in their blood and urine, but at lower levels than those who smoked both tobacco and marijuana or tobacco only. Two of those chemicals, acrylonitrile and acrylamide, are known to be toxic at high levels. The investigators also found that exposure to acrolein, a chemical produced by the combustion of a variety of materials, increases with tobacco smoking but not marijuana smoking and contributes to cardiovascular disease in tobacco smokers.

The findings suggest that high acrolein levels may be a sign of increased risk of cardiovascular disease and that reducing exposure to the chemical could lower that risk. This is particularly important for people infected with HIV, the virus that causes AIDS, given high rates of tobacco smoking and the increased risk of heart disease in this group.

"Marijuana use is on the rise in the United States with a growing number of states legalizing it for medical and nonmedical purposes - including five additional states in the 2020 election. The increase has renewed concerns about the potential health effects of marijuana smoke, which is known to contain some of the same toxic combustion products found in tobacco smoke," said the senior author of the study, Dana Gabuzda, MD, of Dana-Farber. "This is the first study to compare exposure to acrolein and other harmful smoke-related chemicals over time in exclusive marijuana smokers and tobacco smokers, and to see if those exposures are related to cardiovascular disease."

The study involved 245 HIV-positive and HIV-negative participants in three studies of HIV infection in the United States. (Studies involving people with HIV infection were used because of high tobacco and marijuana smoking rates in this group.) The researchers collected data from participants' medical records and survey results and analyzed their blood and urine samples for substances produced by the breakdown of nicotine or the combustion of tobacco or marijuana. Combining these datasets enabled them to trace the presence of specific toxic chemicals to tobacco or marijuana smoking and to see if any were associated with an increased risk of heart disease.

The investigators found that participants who exclusively smoked marijuana had higher blood and urine levels of several smoke-related toxic chemicals such as naphthalene, acrylamide, and acrylonitrile metabolites than non-smokers did. However, the concentrations of these substances were lower in marijuana-only smokers than in tobacco smokers.

Investigators also found that acrolein metabolites - substances generated by the breaking down of acrolein - were elevated in tobacco smokers but not marijuana smokers. This increase was associated with cardiovascular disease regardless of whether individuals smoked tobacco or had other risk factors.

"Our findings suggest that high acrolein levels may be used to identify patients with increased cardiovascular risk," Gabuzda said, "and that reducing acrolein exposure from tobacco smoking and other sources could be a strategy for reducing risk."

Credit: 
Dana-Farber Cancer Institute

Understanding origins of Arizona's Sunset Crater eruption of 1,000 years ago

Image: U.S. Geological Survey

Around 1085 AD, along the southern rim of Northern Arizona's elevated Colorado Plateau, a volcano erupted, forever changing ancient Puebloan fortunes and all nearby life. Among the 600 or so volcanoes that dot the landscape of the San Francisco volcanic fields, this one blew. It was the very first (and last) eruption for what came to be known as Sunset Crater, aptly named for its multi-hued, 1,000-foot-tall cinder cone.

Today, ASU School of Earth and Space Exploration scientist Amanda Clarke and her team have been working to solve the mysterious root cause of the Sunset Crater eruption and any lessons learned to better understand the threats similar volcanoes may pose around the world today.

"This is a common thing in volcanology, to reconstruct past eruptions to try to understand what the volcano or region might do in the future," said Clarke. "We did the field work and we combined data from a previous study and used some modern techniques to put the story together."

Working alongside several collaborators, they have painstakingly mapped every fissure, eruption deposit, and ancient lava flow of Sunset Crater to reconstruct the complete spatter patterns and geochemical compositions of all ejected material, or tephra, from the eruption.

An explosive past

"When you visit the site, there are these lava flows that are obvious, but also this big tephra blanket that extends far beyond the volcanic edifice itself, way beyond the vent," said Clarke. "My interest was first piqued when I learned on a field trip many years ago with former ASU professor Stephen Self, that Sunset Crater had an explosive past."

In a previous study, Clarke's group first showed that the volcanic activity developed in seven or eight distinct phases: initial fissure phases, followed by highly explosive phases, and finally, low-explosivity, waning phases. "It's not clear how this happens, but eventually, the eruption settled on this single pipeline to the surface, and that's where a lot of our work picks up the story," said Clarke.

At several points during the explosive phase, the sky was filled with basaltic, cindery ash reaching 20 to 30 km high, making it one of the most explosive volcanic eruptions of its kind ever documented.

"People in Winslow [100 km away] would have been able to see it," said Clarke. To give an idea of the eruption's size, the team measured the total volume of erupted material at 0.52 km3 dense rock equivalent (DRE) - which, by comparison, turned out to be similar to the volume of the infamous 1980 Mount St. Helens eruption.

"It was very similar to Mt. St. Helens in terms of height and volume," said Clarke. "You think these things that are cinder cones are going to be something like Stromboli in Italy - a fire fountain of a couple of hundred meters that people might be able to watch from their terrace - but this peak phase was St. Helens scale."

Mysterious magma

But why it erupted has remained a mystery until now. "The science question is how these more liquidy magmas behave like viscous magmas," said Clarke. The study, published in the journal Nature Communications, was the result of a collaboration between SESE PhD alumna Chelsea Allison (now at Cornell University) and research scientist Kurt Roggensack. "Chelsea was a graduate student who did some innovative analysis, and Kurt has expertise in petrology and small-scale analysis, while I am more of a physical volcanologist; that's where we came together," said Clarke.

Measuring the factors that led to the Sunset Crater eruption 1,000 years later is extremely difficult because the gases dissolved in the magma usually escape into the sky during the eruption, forever lost in time. To reconstruct the past, the group took advantage of extensive microanalyses of the tiniest blobs and bubbles, known as melt inclusions, which are the best surviving record of the composition of Sunset Crater's magma before the eruption. Roggensack is recognized as a world expert in innovative melt inclusion analysis, especially in basaltic magmas.

How tiny? Melt inclusions are less than a thousandth of an inch across. They become embedded within growing crystals in the magma plumbing system that forms before a volcano erupts. "They've been liberated from the magma in the explosion," said Clarke.

They are like a fizzy, soda concoction of trapped gas, frozen in time from the surrounding magma as they crystalize, yet able to reveal the gas composition and secret history of an eruption so long ago.

Think of the basaltic Sunset Crater having more of a maple syrup consistency versus the peanut butter variety of the rhyolite magma of Mt. St. Helens. "Those are viscous magmas that can have a lot of water stuffed in them," said Clarke.

What were the conditions and ingredients that could lead to the Sunset Crater eruption?

"That leads to the big questions of what is the volatile content of the magma because that is going to control the explosivity," said Clarke. "To answer the questions, you have to dig down deep into the plumbing system, and that's what we did."

Clarke's group is among the first to show the importance of carbon dioxide in volcanic eruptions, partly because it wasn't an easy thing to measure in the first place. "We think this eruption could have pumped a fair amount of carbon dioxide and also sulfur dioxide into the atmosphere," said Clarke.

"Water is usually the main component [as in Mt. St. Helens] but what we are finding at Sunset is that carbon dioxide is very abundant and that tends to be more critical in the deeper part of the system to get the magma moving toward the surface. We think that played a big role in this. And the carbon dioxide is probably coming from deep in the mantle within the source area."

The melt inclusions (MIs) were specifically chosen to provide a representative sample of textural features observed in the Sunset Crater eruption (e.g., varying bubble volumes, sizes and shapes). Some of the tools of the trade used were microscopes to bring the details of crystallization and bubble formation for each tiny melt inclusion to life, as well as sensitive instruments to measure the amount of volatiles trapped in the quenched glass.

"That can tell us some of the details of the last moments of the magma before it was quenched."

Tiny bubbles

Using a custom-built Raman spectrometer at ASU in the LeRoy Eyring Center for Solid State Science (LE-CSSS), Chelsea Allison set up the melt inclusion analysis in which samples are first excited using a blue sapphire laser. High-quality melt inclusions were polished and imaged with a petrographic microscope in preparation for Raman analysis.

Like a Russian doll, nestled inside the little crystal is this little melt inclusion (now glass), and then inside the melt inclusion is a bubble, and inside the bubble is carbon dioxide.

"Raman spectroscopy can be used to measure the density of carbon dioxide, and then from the volume and density of the bubble, you can use that to calculate a mass," said Clarke. "Allison had to do all kinds of stuff including creating standards to ensure what she was measuring was accurate. She used known amounts of carbon dioxide inside little glass tubes to make a calibration curve."
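The calculation Clarke describes is direct: mass is density times volume. As a rough illustration (the radius and density below are invented placeholders for the example, not measured values from the study), it can be sketched as:

```python
# Illustrative only: compute the CO2 mass in a melt-inclusion bubble
# from its density (measured via Raman) and its volume (from imaging).
# The numbers below are made-up placeholders, not data from the study.
import math

def bubble_co2_mass(radius_um: float, co2_density_g_cm3: float) -> float:
    """Mass of CO2 (grams) in a spherical bubble of given radius (micrometers)."""
    radius_cm = radius_um * 1e-4                    # 1 um = 1e-4 cm
    volume_cm3 = (4.0 / 3.0) * math.pi * radius_cm ** 3
    return co2_density_g_cm3 * volume_cm3           # mass = density * volume

# Hypothetical bubble: 5 um radius, CO2 density 0.2 g/cm^3
mass_g = bubble_co2_mass(5.0, 0.2)
print(f"{mass_g:.3e} g")   # a tiny mass, on the order of 1e-10 g
```

Summed over the many inclusions in an eruption's crystals, these tiny masses add up to the bubble contribution to the magma's total CO2 budget.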

"People used to ignore the bubbles, thinking there was nothing important inside, but it turns out it was almost all carbon dioxide," said Clarke. "We've added that carbon dioxide inside the bubble to the total carbon dioxide budget of the magma."

"That all ties together, because once you have the volumes of the eruption, and the total volatile content of the magma, you can start understanding how much got ejected into the atmosphere, and what does that look like compared to other eruptions."

It came from the deep

The carbon dioxide gas phase played a critical role in driving the explosive eruption, with the gas stored in the magma of Sunset Crater as deep as 15km below the surface.

"We think that magma was bubbling already at 15km deep, and that's not what people typically think about magma systems with these volcanoes. It has been demonstrated before that you have a bubble phase. And if you have a system that is already bubbly and that deep, it means you might have a really rapid ascent."

Although the impact of basaltic volcanism on the global atmospheric system is largely unknown, the high carbon dioxide and sulfur output from this eruption could have had a large impact on the atmosphere at the time.

They also compared the magmatic volatiles at Sunset Crater to those in explosive caldera-forming silicic eruptions such as the Bishop Tuff to highlight differences in their abundance and composition. This comparison suggested that the carbon dioxide rich phase is a critical pre-eruptive condition that drives highly explosive basaltic eruptions.

Explosive silicic eruptions, although still much larger in terms of erupted volume, are better analogies to the dynamics of the Sunset Crater eruption. Two such historical eruptions, the 1991 eruption at Pinatubo (Philippines) and the 1815 eruption of trachyandesite at Tambora (Indonesia), resulted in profound atmospheric impacts.

The Pinatubo eruption, which significantly affected global climate for three years afterward, ejected 10 times the magma (5 km3 DRE) of Sunset Crater (0.5 km3 DRE) but released just ~3 times the mass of sulfur dioxide. The Tambora eruption was responsible for the "year without a summer"; while it ejected ~60 times the magma (30 km3 DRE) of Sunset Crater, it released only ~9 times the mass of sulfur dioxide.
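Those ratios imply that Sunset Crater's magma was unusually sulfur-rich per unit of erupted volume. The back-of-the-envelope arithmetic, using only the ratios quoted above, looks like this:

```python
# Back-of-the-envelope: SO2 released per unit of erupted magma (DRE),
# normalized to Sunset Crater, using only the ratios quoted in the text.
eruptions = {
    # name: (magma volume ratio vs Sunset Crater, SO2 mass ratio vs Sunset Crater)
    "Sunset Crater": (1.0, 1.0),
    "Pinatubo 1991": (10.0, 3.0),
    "Tambora 1815": (60.0, 9.0),
}

for name, (magma, so2) in eruptions.items():
    per_unit = so2 / magma   # SO2 per unit magma, relative to Sunset Crater
    print(f"{name}: {per_unit:.2f}x Sunset Crater's SO2 yield per unit DRE")
# Sunset Crater released roughly 3x (vs Pinatubo) to ~7x (vs Tambora)
# more SO2 per unit of erupted magma.
```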

The lessons learned from Sunset Crater and its type of basaltic volcanism could still inform us today.

"Now we can ask, are the conditions that led to the Sunset Crater eruption really that unusual?" said Clarke. "How common is it for us to see a basaltic cinder cone that we think should be a gentle, observable eruption turn into something that is much more hazardous to aircraft flying overhead or to the people around it? We can start to apply these concepts to active systems."

"And remember, though the vent at Sunset Crater is not going to erupt again, the San Francisco field is still active. There will probably be another eruption there. It could be anywhere, and probably in the eastern sector, but we don't know where and when. It could be on a scale of thousands of years."

Credit: 
Arizona State University

New statistical method exponentially increases ability to discover genetic insights

Pleiotropy analysis, which provides insight on how individual genes result in multiple characteristics, has become increasingly valuable as medicine continues to lean into mining genetics to inform disease treatments. Privacy stipulations, though, make it difficult to perform comprehensive pleiotropy analysis because individual patient data often can't be easily and regularly shared between sites. However, a statistical method called Sum-Share, developed at Penn Medicine, can pull summary information from many different sites to generate significant insights. In a test of the method, published in Nature Communications, Sum-Share's developers were able to detect more than 1,700 DNA-level variations that could be associated with five different cardiovascular conditions. If patient-specific information from just one site had been used, as is the norm now, only one variation would have been determined.

"Full research of pleiotropy has been difficult to accomplish because of restrictions on merging patient data from electronic health records at different sites, but we were able to figure out a method that turns summary-level data into results that are exponentially greater than what we could accomplish with the individual-level data currently available," said one of the study's senior authors, Jason Moore, PhD, director of the Institute for Biomedical Informatics and a professor of Biostatistics, Epidemiology and Informatics. "With Sum-Share, we greatly increase our abilities to unveil the genetic factors behind health conditions that range from those dealing with heart health, as was the case in this study, to mental health, with many different applications in between."

Sum-Share is powered by bio-banks that pool de-identified patient data, including genetic information, from electronic health records (EHRs) for research purposes. For their study, Moore, co-senior author Yong Chen, PhD, an associate professor of Biostatistics, lead author Ruowang Li, PhD, a postdoctoral fellow at Penn, and their colleagues used eMERGE to pull seven different sets of EHRs to run through Sum-Share in an attempt to detect the genetic effects shared among five cardiovascular-related conditions: obesity, hypothyroidism, type 2 diabetes, hypercholesterolemia, and hyperlipidemia.

With Sum-Share, the researchers found 1,734 different single-nucleotide polymorphisms (SNPs, which are differences in the building blocks of DNA) that could be tied to the five conditions. Then, using results from just one site's EHR, only one SNP was identified that could be tied to the conditions.

Additionally, they determined that their findings were identical whether they used summary-level data or individual-level data in Sum-Share, making it a "lossless" system.
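The "lossless" property has a simple analogue in linear models. The sketch below is our illustration of that general idea, not the actual Sum-Share algorithm: for ordinary least squares, if each site shares only its summary matrices X'X and X'y, combining them reproduces exactly the estimate that would be obtained from pooled individual-level data.

```python
# Illustration of lossless summary-level integration (NOT the actual
# Sum-Share algorithm): for ordinary least squares, summing each site's
# sufficient statistics X'X and X'y reproduces the pooled-data fit.
import numpy as np

rng = np.random.default_rng(0)
sites = []
for _ in range(7):  # seven sites, echoing the seven EHR sets in the study
    X = rng.normal(size=(100, 3))
    y = X @ np.array([0.5, -1.0, 2.0]) + rng.normal(size=100)
    sites.append((X, y))

# Summary-level: each site shares only X'X (3x3) and X'y (length 3)
xtx = sum(X.T @ X for X, _ in sites)
xty = sum(X.T @ y for X, y in sites)
beta_summary = np.linalg.solve(xtx, xty)

# Individual-level: pool all raw data and fit once
X_all = np.vstack([X for X, _ in sites])
y_all = np.concatenate([y for _, y in sites])
beta_pooled, *_ = np.linalg.lstsq(X_all, y_all, rcond=None)

assert np.allclose(beta_summary, beta_pooled)  # identical estimates
```

No individual records leave any site, yet nothing is lost relative to pooling the raw data, which is the property the Sum-Share authors report for their method.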

To determine the effectiveness of Sum-Share, the team then compared their method's results with those of the previous leading method, PheWAS. PheWAS operates best when it pulls whatever individual-level data has been made available from different EHRs. But on a level playing field, with both methods using individual-level data, Sum-Share was statistically determined to be more powerful than PheWAS. And since Sum-Share's summary-level findings proved as insightful as its individual-level ones, it appears to be the better method for determining genetic characteristics.

"This was notable because Sum-Share enables loss-less data integration, while PheWAS loses some information when integrating information from multiple sites," Li explained. "Sum-Share can also reduce the multiple hypothesis testing penalties by jointly modeling different characteristics at once."

Currently, Sum-Share is mainly designed to be used as a research tool, but there are possibilities for using its insights to improve clinical operations. And, moving forward, there is a chance to use it for some of the most pressing needs facing health care today.

"Sum-Share could be used for COVID-19 with research consortia, such as the Consortium for Clinical Characterization of COVID-19 by EHR (4CE)," Chen said. "These efforts use a federated approach where the data stay local to preserve privacy."

Credit: 
University of Pennsylvania School of Medicine

Unravelling the mystery that makes viruses infectious

image: Capsid protein pentamers (subunits colour-coded) being recruited to the growing protein shell (brown) during virion assembly by formation of sequence-specific contacts between the genome (packaging signals shown as orange space-filled models) and the Enterovirus-E capsid.

Image: 
University of Leeds

Researchers have for the first time identified the way viruses like the poliovirus and the common cold virus 'package up' their genetic code, allowing them to infect cells.

The findings, published today (Friday, 8 January) in the journal PLOS Pathogens by a team from the Universities of Leeds and York, open up the possibility that drugs or anti-viral agents can be developed that would stop such infections.

Once a cell is infected, a virus needs to spread its genetic material to other cells. This is a complex process involving the creation of what are known as virions - newly-formed infectious copies of the virus. Each virion is a protein shell containing a complete copy of the virus's genetic code. The virions can then infect other cells and cause disease.

What has been a mystery until now is a detailed understanding of the way the virus assembles these daughter virions.

Professor Peter Stockley, former Director of the Astbury Centre for Structural Molecular Biology at Leeds, who part supervised the research with Professor Reidun Twarock from York, said: "This study is extremely important because of the way it shifts our thinking about how we can control some viral diseases. If we can disrupt the mechanism of virion formation, then there is the potential to stop an infection in its tracks."

"Our analysis suggests that the molecular features that control the process of virion formation are genetically conserved, meaning they do not mutate easily - reducing the risk that the virus could change and make any new drugs ineffective."

The research at Leeds and York brings together experts in the molecular structure of viruses, electron microscopy and mathematical biology.

The study focuses on Enterovirus-E, a harmless bovine virus that is non-infectious in people and is the universally adopted surrogate for the poliovirus - a dangerous human pathogen that causes polio and is the target of an eradication initiative by the World Health Organization.

The enterovirus group also includes the human rhinovirus, which causes the common cold.

The study published today details the role of what are called RNA packaging signals, short regions of the RNA molecule which together with proteins from the virus's casing ensure accurate and efficient formation of an infectious virion.

Using a combination of molecular and mathematical biology, the researchers were able to identify possible sites on the RNA molecule that could act as packaging signals. Using advanced electron microscopes at the Astbury Biostructure Laboratory at the University of Leeds, scientists were able to directly visualise this process - the first time that has been possible with any virus of this type.

Professor Twarock added: "Understanding in detail how this process works, and the fact that it appears conserved in an entire family of viral pathogens, will enable the pharmaceutical industry to develop anti-viral agents that can block these key interactions and prevent disease."

Credit: 
University of Leeds

Fatal health threat to young African children reduced by innovative artistic intervention

Please note that, due to production issues, publication of the PLOS Medicine paper cited in this release has been delayed. It will be included in this release when a new publication date is confirmed.

The fatal threat from diarrhoea and pneumonia to young children in the world's poorer countries can be drastically reduced by using traditional performing arts to encourage mothers to provide youngsters with safe food and water, a new study reveals.

The Gambia, like many other Low- and Middle-income Countries (LMICs) faces high rates of under-five deaths due to diarrhoea and pneumonia - the two highest causes of death in this age group in this country and globally.

Children transitioning from breastfeeding to eating food are at greatest risk, as complementary food becomes contaminated. Researchers working in The Gambia discovered that mothers' food safety and hygiene behaviours were markedly improved by a low-cost behaviour-change community programme trialled in rural villages.

After six months, researchers observed that hospital admissions for diarrhoea had fallen by 60% and those for respiratory infection by 30%. After 32 months, the mothers continued to practise improved food safety and hygiene, informing and encouraging new mothers to do the same.

Led by experts from the University of Birmingham, the international research team has now published its findings in PLOS Medicine.

Lead researcher, Dr Semira Manaseki-Holland, Clinical Senior Lecturer in Public Health at the University of Birmingham, commented: "We developed a low-cost, but seemingly effective, community health intervention that if replicated in countries around the globe could save thousands, if not millions, of lives in the years ahead.

"Gambian rural villages are similar to thousands in sub-Saharan Africa, and these methods can be used in many countries across Africa and Asia. We saw the food hygiene practices of Gambian mothers with weaning-age children improve dramatically. Although we could not measure death rates, since diarrhoea and pneumonia are leading causes of death in young children, we can deduce that the programme can result in fewer young children dying from diarrhoea and pneumonia."

The research programme used a randomised trial across 30 Gambian villages (15 got the program and 15 instead got messages about household gardens) to identify and correct behaviour around critical points in food preparation and handling when contamination can occur.

Researchers translated food safety and hygiene information into stories and songs with a central figure called 'MaaChampian' - a role-model mother with behaviours that mothers and families strived to achieve. Five community visits included performances and music - honouring the achievements of mothers and other community members as they worked towards becoming mentoring figures themselves.

Dr Buba Manjang, Gambian lead researcher and Director of Public Health Directorate of the Ministry of Health of the Gambia, commented: "Communities and mothers know most of the correct behaviours, but for some reason they don't do them, even if the means are available. This research offers a low-cost, effective solution for The Gambia and other countries to use cultural performing arts in similar behavioural change interventions. This will help to reduce the fatal impact of diarrhoea and pneumonia by involving whole communities to support the mothers and improve child health."

Historically, expensive and resource-intensive water, sanitation, and hygiene (WASH) interventions were - and still are - the main accepted way of addressing diarrhoea and pneumonia, involving building toilets, providing safe water, and creating sewage systems.

However, changing the behaviour of communities is as important as these large infrastructure programmes. Without involving communities and local people, such developments can be ignored or fail to suit the lives of the people they are intended for.

Many of these programmes rely on home visits that inform and encourage the mother to change her practices, without adequately addressing the community support she needs to do so.

Credit: 
University of Birmingham

Sleep is irreplaceable for the recovery of the brain

Sleep is ubiquitous in animals and humans and vital for healthy functioning. Sleep after training, for example, improves performance on various tasks compared with equal periods of active wakefulness. However, it has been unclear whether this is due to an active refinement of neural connections or merely to the absence of novel input during sleep. Now researchers at the Medical Center - University of Freiburg have shown that sleep is more than rest for improving performance. The findings, published in the journal SLEEP on January 6, 2021, provide important information for planning periods of intensive learning or training.

"Sleep is irreplaceable for the recovery of the brain. It cannot be replaced by periods of rest for improved performance. The state of the brain during sleep is unique," says Prof. Dr. Christoph Nissen, who headed the study as research group leader at the Department of Psychiatry and Psychotherapy at the Medical Center - University of Freiburg and is now working at the University of Bern, Switzerland. In earlier studies, Nissen and his team provided evidence for the notion that sleep has a dual function for the brain: Unused connections are weakened and relevant connections are strengthened.

In the current study, the researchers conducted a visual learning experiment with 66 participants. First, all participants were trained to distinguish certain patterns. Afterwards, one group stayed awake watching videos or playing table tennis. The second group slept for one hour, and the third group stayed awake but was kept in a darkened room without external stimuli, under controlled sleep laboratory conditions. Not only did the group that slept perform significantly better than the group that was awake and active, but the sleep group also performed significantly better than the group that was awake but deprived of any external stimuli. The improvement in performance was linked to typical deep-sleep activity of the brain, which has an important function for the connectivity of nerve cells. "This shows that it is sleep itself that makes the difference," says co-study leader Prof. Dr. Dieter Riemann, head of the sleep laboratory at the Department of Psychiatry and Psychotherapy at the Medical Center - University of Freiburg. In control experiments, the Freiburg researchers ensured that fatigue and other general factors had no influence on the results.

The study shows that sleep cannot be replaced by rest during phases of intensive performance demands at work or in everyday life.

Credit: 
University of Freiburg

What happens when your brain can't tell which way is up or down?

TORONTO, January 7, 2021- What feels like up may actually be some other direction depending on how our brains process our orientation, according to psychology researchers at York University's Faculty of Health.

In a new study published in PLoS One, researchers at York University's Centre for Vision Research found that an individual's interpretation of the direction of gravity can be altered by how their brain responds to visual information. Laurence Harris, a professor in the Department of Psychology in the Faculty of Health and Meaghan McManus, a graduate student in his lab, found, using virtual reality, that people differ in how much they are influenced by their visual environment.

Harris and McManus say that this difference can help us better understand how individuals use visual information to interpret their environment and how they respond when performing other tasks.

"These findings may also help us to better understand and predict why astronauts may misestimate how far they have moved in a given situation, especially in the microgravity of space," says Harris.

In this virtual-reality-based study, McManus and Harris had their participants lie down in a virtual environment that was tilted so that the visual "up" was above their head and not aligned with gravity. They found that the participants could be divided into two groups: one group who perceived they were standing up vertically (aligned with the visual scene) even though they were actually lying down, and a second group who maintained a more realistic idea of their lying position.

The researchers called the first group "Visual Reorientation Illusion vulnerable" (VRI-vulnerable). The two groups of participants, while in the same physical orientation and seeing the same scene, experienced simulated self-motion through the environment differently. Those who were VRI-vulnerable reported feeling that they were moving faster and further than those who were not. "Not only did the VRI-vulnerable group rely more on vision to tell them how they were oriented, but they also found visual motion to be more powerful in evoking the sensation of moving through the scene," added Harris.

"On Earth, the brain has to constantly decide whether a given acceleration is due to a person's movements or to gravity. This decision is helped by the fact that we normally move at right angles to gravity. But if a person's perception of gravity is altered by the visual environment or by removing gravity, this distinction becomes much harder."

"The findings reported in this paper could be helpful when we land people on the Moon again, on Mars, or on comets or asteroids, as low-gravity environments might lead some people to interpret their self-motion differently - with potentially catastrophic results," says Harris. The findings could also be helpful for virtual reality game designers, as certain virtual environments may lead to differences in how players interpret and move through the game. Researchers say that the findings may also inform models of how aging may affect the ability to move around and to balance.

Credit: 
York University

Study reveals structure of protein and permits search for drugs against neglected diseases

image: Discovery paves the way for the study of more potent molecules capable of directly destroying parasites with fewer adverse side-effects

Image: 
CQMED

Brazilian researchers have managed to decipher the structure of a protein found in parasites that cause neglected tropical diseases, paving the way to the development of novel medications. Thanks to the discovery, it will be possible to seek more potent molecules capable of destroying the pathogens directly, with fewer adverse side-effects for patients.

The study detailed the structural characteristics of the protein deoxyhypusine synthase (DHS), found in Brugia malayi, one of the mosquito-borne parasites that cause elephantiasis, and in Leishmania major, the protozoan that causes cutaneous leishmaniasis.

Elephantiasis, also known as lymphatic filariasis, is an infection of the lymph system that can lead to swelling of the legs, arms, and genitalia. It may also harden and thicken the skin, limiting movement, and hindering normal activities.

Cutaneous leishmaniasis produces skin lesions weeks or months after the insect bite that transfers the parasite. Lesions may persist for years and leave scars similar to those caused by burns. Over 300,000 cases were reported in Brazil between 2003 and 2018, according to data from the Ministry of Health.

The study is reported in an article published in PLOS Neglected Tropical Diseases. The first authors are Suélen Silva and Angélica Klippel, PhD candidates at São Paulo State University (UNESP) in Araraquara. Silva is in the Biotechnology Program, and Klippel is in Bioscience and Biotech Applied to Pharmacy, with a scholarship from FAPESP, supervised by Cleslei Zanelli.

The research is conducted under the aegis of the National Institute of Science and Technology (INCT) for Open-Access Medicinal Chemistry hosted by the University of Campinas’s Center for Medicinal Chemistry (CQMED-UNICAMP). It receives funding from FAPESP, and federally from the National Council for Scientific and Technological Development (CNPq) at the Ministry of Science and Technology and from CAPES, the Ministry of Education’s Coordination for the Improvement of Higher Education Personnel.

The group achieved two major advances with regard to DHS: standardization of a yeast-based platform to study the enzyme from Leishmania major, and determination of the three-dimensional structure of the molecule found in the elephantiasis parasite.

The identification of this novel target will now be followed by more research to develop or find molecules that inhibit DHS-mediated biochemical processes and stop the disease from progressing. If specific inhibitors are identified as the basis for drug development, it will be possible to reduce or completely avoid the adverse side-effects of current treatments, such as fever, nausea, and insomnia. In the case of elephantiasis, some medications are not even capable of killing adult worms.

“At CQMED we try to elucidate proteins that haven’t been studied in depth and to determine their crystal structure. The study revealed the structure of DHS in these parasites for the first time. We began with these two parasites [Brugia and Leishmania], but we want to go on to investigate four more organisms including Plasmodium, which causes malaria,” Katlin Massirer, principal investigator of the INCT, told Agência FAPESP. Massirer is affiliated with the University of Campinas’s Center for Molecular Biology and Genetic Engineering (CBMEG-UNICAMP). 

Crystallography is an essential tool in research on the three-dimensional structure of proteins. Scientists use the technique to analyze the positions of the proteins’ atoms and their interactions, to understand how they act in the human organism, and to investigate how a prospective drug will have to bind to a given protein. Analyzing crystal structure also helps understand how each drug works so as to enhance its efficacy.

The study included in vitro synthesis of the active enzyme, the form in which DHS acts on the parasite, and assays to test its activity while screening the molecules that could become novel medications. 

“The structure of the protein serves as a basis on which to identify the differences and try to find an inhibitor that will fit into the parasite’s enzyme without affecting the similar protein found in humans. It’s a big challenge,” Klippel said.

Progress

According to Massirer, it takes between 12 and 15 years on average to research, develop and bring novel medications to market, but in the case of neglected tropical diseases (NTDs) lead times can be even longer because so little research is done on these health problems. “You could say that the stage we’re at is equivalent to two years of this process,” she said.

NTDs are estimated to affect some 1.5 billion people in more than 150 countries, mainly in the world’s poorest regions. Besides leishmaniasis and elephantiasis, the list of NTDs published by the World Health Organization (WHO) includes Chagas disease, schistosomiasis, dengue fever, and chikungunya.

More than 350 research institutions, civil society organizations, companies, and governments of countries around the world signed up to celebrate the first-ever World NTD Day on January 30, 2020. The date was chosen because it was the anniversary of the 2012 London Declaration on NTDs, which redoubled global efforts to eradicate NTDs. Pharmaceutical companies, donors, endemic countries, and NGOs committed to control, eliminate or eradicate ten diseases by 2020 and improve the lives of over a billion people. Not all the goals have been met, however.

Scant interest in NTDs was one of the reasons Klippel chose to work on the problem. “For my PhD research I wanted to leverage my training as a pharmacist and to focus on a neglected area, to participate in something important but not highly valued,” she said. “Every stage is a victory. We’ve built a tool, and now we must take the next step.”

At CQMED, Massirer explained, the next step will consist of acquiring DHS from other species and at the same time searching repositories for molecules at home and abroad. The research has been proceeding more slowly because of the COVID-19 pandemic. “Our center lets groups from anywhere in Brazil propose innovative projects for collaborative research by the network,” she said.

The institute’s mission includes fostering partnerships among researchers at different centers to investigate little-studied proteins and increase the impact of discoveries.

CQMED is a unit of the Brazilian Agency for Industrial Research and Innovation (EMBRAPII). It was created with support from FAPESP through its Research Partnership for Technological Innovation Program (PITE), in cooperation with the Structural Genomics Consortium (SGC), a collaborative network of academics, pharmaceutical companies, and funding agencies working together to accelerate drug development. 

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Research confirms increase in river flooding and droughts in US, Canada

image: Blue boxes indicate change in high flow frequency during each season. High-flow seasons are not decreasing in any region in the U.S. and Canada.

Red boxes indicate where low-flow events are increasing significantly. This is especially prevalent in the drought-prone Southwest and Southeast U.S.

Image: 
Figure courtesy of Evan Dethier.

HANOVER, N.H. - January 7, 2021 - The number of "extreme streamflow" events observed in river systems has increased significantly across the United States and Canada over the last century, according to a study from Dartmouth College.

In regions where water runoff from snowmelt is a main contributor to river streamflow, the study found a rise in extreme events, such as flooding.

In drought-prone regions in the western and southeastern U.S., the study found that the frequency of extreme low-flow events has also become more common, particularly during summer and fall.

The research, published in Science Advances, analyzed records dating back to 1910 to confirm the effects of recent changes in precipitation levels on river systems.

"Floods and droughts are extremely expensive and often life-threatening events," said Evan Dethier, a postdoctoral researcher at Dartmouth and the lead author of the paper. "It's really important that we have good estimates of how likely extreme events are to occur and whether that likelihood is changing."

Although changes in precipitation and extreme streamflows have been observed in the past, there has been no research consensus on whether droughts and floods have actually increased in frequency.

Past research efforts have mostly focused on annual peak flows, potentially missing important seasonal changes to extreme low-flow events that can be pulled from daily streamflow records. Those efforts have also been hampered by the mixing of data from regions that have different precipitation patterns and natural seasonal cycles.

According to the research paper, the results demonstrate that "increases in the frequency of both high- and low-flow extreme streamflow events are, in fact, widespread."

"Previous attempts to analyze regional patterns in streamflow were usually based on fixed geographical regions and were largely unsuccessful," said Carl Renshaw, a professor of earth sciences at Dartmouth. "The novel clustering approach used in this research defines regions based on the hydrology--not geographical or political boundaries--to better reveal the significant shifts occurring for both high and low streamflows."

The Dartmouth study combined 541 rivers in the U.S. and Canada into 15 hydrological regions organized by seasonal streamflow characteristics, such as whether streams flood due to tropical storms or rain falling on melting snow. This grouping allowed for more sensitive detection of trends in extreme flow events on both an annual and seasonal basis.
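As an illustration of this kind of grouping, the toy sketch below clusters a handful of invented gauges by the shape of their seasonal flow profile rather than by location. All station names, flow values, and the plain k-means routine are made up for this example; the study's actual clustering method and data are more sophisticated.

```python
import numpy as np

# Toy monthly mean flows (m^3/s) for six hypothetical gauges.
# Rivers A-C peak in late spring (snowmelt-dominated); rivers D-F peak
# in early autumn (tropical-storm rainfall). All values are invented.
profiles = {
    "A": [10, 12, 30, 80, 120, 90, 40, 20, 15, 12, 11, 10],
    "B": [ 8, 10, 25, 70, 110, 85, 35, 18, 14, 11, 10,  9],
    "C": [12, 14, 35, 90, 130, 95, 45, 22, 16, 13, 12, 11],
    "D": [20, 18, 15, 14, 13, 15, 25, 60, 110, 90, 40, 25],
    "E": [22, 20, 16, 15, 14, 16, 28, 65, 120, 95, 45, 28],
    "F": [18, 16, 14, 13, 12, 14, 22, 55, 100, 85, 38, 22],
}
names = list(profiles)

# Normalize each profile so clustering keys on seasonal *shape*, not size.
X = np.array([profiles[n] for n in names], dtype=float)
X /= X.sum(axis=1, keepdims=True)

def kmeans(X, init_idx, iters=20):
    """Plain k-means with fixed initial centroids for reproducibility."""
    centroids = X[list(init_idx)].copy()
    for _ in range(iters):
        # Squared distance from every profile to every centroid.
        d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for k in range(len(centroids)):
            if (labels == k).any():
                centroids[k] = X[labels == k].mean(axis=0)
    return labels

labels = kmeans(X, init_idx=(0, 3))
regions = {n: int(l) for n, l in zip(names, labels)}
print(regions)  # A-C land in one "hydro-region", D-F in the other
```

Grouping by hydrologic behavior in this way keeps snowmelt-driven and storm-driven rivers from being averaged together, which is what lets opposing seasonal trends show up rather than cancel out.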

Out of the 15 "hydro-regions" created, 12 had enough rivers to be analyzed in the study. The rivers studied were judged to be minimally affected by human activity and included extensive records that span 60 or more years.

"The shifts toward more extreme events are especially important given the age of our dams, bridges, and roads. The changes to river flows that we found are important for those who manage or depend on this type of infrastructure," said Dethier.

According to the study, in the regions where streamflow changes were found to be statistically significant, floods and droughts have, on average, doubled in frequency relative to the period of 1950 to 1969.

Significant changes in the frequency of floods were found to be most common in the Canadian and northern U.S. regions where annual peak flows are consistently associated with spring snowmelt runoff.

The increase in flooding has come despite a reduction in snowpack caused by warming winter temperatures. The research team believes that increases in extreme precipitation during the high-flow season may make up for the reduction in snowpack storage.

Changes in drought and extreme low-flow frequency were found to be more variable.

While floods were found to be more localized, droughts were found to be "generally reflective of large-scale climatic forcing" and more likely to be widespread across a region.

Credit: 
Dartmouth College

NYUAD study informs research of child development and learning in conflict-affected areas

Abu Dhabi, UAE, January 6, 2021: To provide effective aid to children who live in areas of conflict it is necessary to understand precisely how they have been impacted by the crises around them. One area of importance is the effect of conflict and trauma on a child's development and education.

In a new paper, Global TIES for Children researchers J. Lawrence Aber, Carly Tubbs Dolan, Ha Yeon Kim, and Lindsay Brown, present a review of opportunities and challenges they have encountered in designing and conducting rigorous research that advances our understanding of this effect. Global TIES for Children, an international research center based at NYU Abu Dhabi and NYU New York, generates evidence to support the most effective humanitarian and development aid to promote children's academic and socio-emotional learning.

This review focuses on their efforts to test the effectiveness of educational programming that incorporates skill-targeted social and emotional learning (SEL) programs. SEL programs are designed to help participants apply knowledge and skills towards managing their stress and feelings, establishing positive relationships, achieving goals, and making responsible decisions.

The results of the paper titled, Children's Learning and Development in Conflict- and Crisis- Affected Countries: Building a Science for Action, published in the Cambridge University Press journal Development and Psychopathology, indicated positive impacts of remedial education and social and emotional learning programs on academic skills, and presented key themes to be addressed when designing future refugee education programming and related research.

Aber and colleagues note the importance of long-term partnerships between researchers, practitioners, policymakers, and donors to provide higher quality evidence for decision making about programs and policies. Additionally, context-relevant measures and research methods are needed to enable the study of under-resourced, crisis-affected communities.

The researchers also argue for a global research effort on building cumulative and revisable developmental science that is based on the children's lived experience in their own culture and context and grounded in ethical principles and practical goals.

The paper's findings will also guide the development of effective research that can better study various communities and conditions.

"It is our hope that the findings of this paper can spark a conversation about how best to assess and meet the needs of children in conflict-affected areas, and can allow for the development of more effective aid programs," said Aber.

Credit: 
New York University

Researchers synthesize bio-based Methylcyclopentadiene with 3-Methylcyclopent-2-enone

image: Direct hydrodeoxygenation of MCP to MCPD on the partially reduced Zn-Mo oxide catalyst.

Image: 
DICP

Methylcyclopentadiene (MCPD) is an important monomer in the production of RJ-4 fuel, a high-energy-density rocket fuel, and various valuable products.

Currently, MCPD is mainly obtained from the by-products of petroleum cracking tar, at a very low yield of ~0.7 kg per ton and a high price of ~10,000 USD per ton. Energy and environmental concerns have stimulated the exploration of highly efficient processes to convert renewable biomass to MCPD.

Recently, a group led by Prof. LI Ning and Prof. ZHANG Tao from the Dalian Institute of Chemical Physics (DICP) of the Chinese Academy of Sciences (CAS) synthesized bio-based MCPD via direct hydrodeoxygenation of 3-methylcyclopent-2-enone (MCP) derived from cellulose.

Their study was published in Nature Communications on Jan. 4.

The researchers found that selective hydrodeoxygenation of MCP to MCPD could be achieved on the partially reduced Zn-Mo oxide catalyst.

During the reduction of ZnMoO4, the Zn-Mo oxide catalyst formed ZnMoO3 species, which might preferentially adsorb the C=O bond in the presence of the C=C bond. In the vapor-phase hydrodeoxygenation of MCP, this led to highly selective formation of MCPD, with a carbon yield of 70%.

"This work follows our previous report on the synthesis of 2,5-hexanedione by the direct hydrogenolysis of cellulose and the intramolecular aldol condensation of 2,5-hexanedione to MCP," said Prof. Li.

This study opens up a new route for the production of dienes from unsaturated ketones by a direct hydrodeoxygenation process.

Credit: 
Dalian Institute of Chemical Physics, Chinese Academy of Sciences

New hard disk write head analytical technology can increase hard disk capacities

image: Overview of the developed analysis technology

Image: 
Toshiba

Using synchrotron radiation at SPring-8 - a large-scale synchrotron radiation facility - Tohoku University, Toshiba Corporation, and the Japan Synchrotron Radiation Research Institute (JASRI) have successfully imaged the magnetization dynamics of a hard disk drive (HDD) write head for the first time, with a precision of one ten-billionth of a second. The method makes possible precise analysis of write head operations, accelerating the development of the next-generation write heads and further increasing HDD capacity.

Details of the research were published in the Journal of Applied Physics on October 6 and presented at the 44th Annual Conference on Magnetics in Japan, on December 14.

International Data Corporation predicts a five-fold increase in the volume of data generated worldwide in the seven years between 2018 and 2025. HDDs continue to serve as the primary data storage devices in use, and in 2020 the annual total capacity of shipped HDDs is expected to exceed one zettabyte (10^21 bytes), with sales reaching $20 billion. Securing further increases in HDD capacity and higher data transfer rates with logical write head designs requires an exhaustive and accurate understanding of write head operations.

There are, however, high barriers to achieving this: current write heads have a very fine structure, with dimensions of less than 100 nanometers. Magnetization reversal occurs in less than a nanosecond, rendering experimental observations of write head dynamics difficult. Instead, the write head analysis has been conducted by simulations of magnetization dynamics, or done indirectly by evaluating the write performance on the magnetic recording media. Both approaches have their drawbacks, and there is clear demand for a new method capable of capturing the dynamics of a write head precisely.

Tohoku University, Toshiba, and JASRI used the scanning soft X-ray magnetic circular dichroism microscope installed on the BL25SU beamline at SPring-8 to develop a new analysis technology for HDD write heads.

The new technology realizes time-resolved measurements through synchronized timing control, in which a write head is operated at an interval of one-tenth of the cycle of the periodic X-ray pulses generated from the SPring-8 storage ring. Simultaneously, focused X-rays scan the medium-facing surface of a write head, and magnetic circular dichroism images temporal changes in the magnetization. This achieves temporal resolution of 50 picoseconds and spatial resolution of 100 nanometers, enabling analyses of the fine structures and fast write head operation. This method has the potential to achieve even higher resolutions by improving the focusing optics for the X-rays.
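One way to picture this synchronized timing is as equivalent-time (stroboscopic) sampling: if each X-ray pulse probes the head waveform one-tenth of a head cycle later than the previous pulse, ten pulses together cover the full cycle. The normalized periods below are invented for illustration and are not the facility's actual timing parameters.

```python
from fractions import Fraction

# Illustrative equivalent-time sampling arithmetic (not SPring-8's real
# timing). X-ray pulses arrive with period T_p; the head drive period
# T_h is chosen so that each successive pulse lands one-tenth of a head
# cycle later than the previous one.
T_p = Fraction(1)             # normalized pulse period
T_h = T_p / Fraction(11, 10)  # head period: T_p mod T_h = T_h / 10

phases = []
t = Fraction(0)
for _ in range(10):
    phase = (t / T_h) % 1     # fraction of the head cycle at this pulse
    phases.append(phase)
    t += T_p                  # wait for the next X-ray pulse

print(sorted(phases))  # ten equally spaced phases: 0, 1/10, ..., 9/10
```

Because the phase step is exact, a slow detector can reconstruct a sub-nanosecond waveform from many repetitions, which is the essence of the time-resolved measurement described above.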

The development team used the new technology to obtain the time evolution of the magnetization images during reversal of the write head. The imaging revealed that magnetization reversal of the main pole is completed within a nanosecond and that spatial patterns from magnetization appear in the shield area in response to the main pole reversal. No previous research into write head operations has achieved such high spatial and temporal resolutions, and the use of this approach is expected to support high-precision analyses of write head operations, contributing to the development of the next-generation write heads and the further improvements in HDD performance.

Toshiba is currently developing energy-assisted magnetic recording technologies for next-generation HDD and aims to apply the developed analysis method and the knowledge obtained about write head operations to the development of a write head for energy-assisted magnetic recording.

Credit: 
Tohoku University

Israel can expect a major earthquake of 6.5 on the Richter scale in the coming years

image: Prof. Shmuel Marco.

Image: 
Tel Aviv University

A first-of-its-kind study conducted under the bed of the Dead Sea reveals that a devastating earthquake measuring 6.5 on the Richter scale is expected to hit our region in the coming years. The study showed that an earthquake of this magnitude occurs in the land of Israel on an average cycle of between 130 and 150 years, but there have been cases in history where the lull between one earthquake and another was only a few decades long.

The last earthquake with a magnitude of 6.5 on the Richter scale was felt in the Dead Sea valley in 1927, when hundreds of people were injured in Amman, Jerusalem, Bethlehem and even Jaffa. Now, in the wake of the findings of the study, the researchers are warning that another earthquake is very likely to occur in our lifetime, in the coming years or decades.

The research was carried out by an international team of researchers, including Prof. Shmuel Marco, Head of Tel Aviv University's Porter School of the Environment and Earth Sciences, and his fellow researchers Dr. Yin Lu (postdoc at TAU), Prof. Amotz Agnon (Hebrew University), Dr. Nicolas Waldmann (Haifa University), Dr. Nadav Wetzler (Israel Geological Survey) and Dr. Glenn Biasi (US Geological Survey). The results of the groundbreaking study were published in the prestigious journal Science Advances.

The research was carried out under the auspices of the International Continental Scientific Drilling Program (ICDP), which conducts deep drilling in lakebeds all over the world with the aim of studying Earth's ancient climate and other environmental changes. In 2010, a rig was placed in the center of the Dead Sea and began drilling to a depth of hundreds of meters, enabling an analysis of some 220,000 years of Dead Sea geology.

According to Prof. Marco, because the Dead Sea is the lowest place on earth, the flood waters that flow into it every winter carry sediment that accumulates at the bottom of the lake in distinct layers. A dark layer of about one millimeter represents the winter flash-flood sediment, and a lighter layer, also about a millimeter thick, represents the increased evaporation of water during the summer months, with each pair of layers representing one year.
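These numbers allow a quick consistency check: at roughly two one-millimeter layers per year, a 220,000-year record implies a core on the order of 440 meters, which squares with "drilling to a depth of hundreds of meters." The sketch below is only this back-of-the-envelope arithmetic; real cores have variable layer thickness and compact over time.

```python
# Back-of-the-envelope check of the sediment record's length
# (illustrative only: real layers vary in thickness and compact).
mm_per_layer = 1.0    # ~1 mm dark winter layer, ~1 mm light summer layer
layers_per_year = 2   # one flood layer + one evaporation layer per year
years = 220_000

core_depth_m = years * layers_per_year * mm_per_layer / 1000.0
print(core_depth_m)   # ~440 m, i.e. "a depth of hundreds of meters"
```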

When an earthquake occurs, the sediments swirl together: the layers that had previously settled in perfect sequence blend into one another and resettle in a different arrangement. Using equations and computational models developed specifically for this study, the researchers were able to understand the physics of the process and reconstruct the history of earthquakes from the geological record.

An analysis of the record - the longest of its kind in the world - shows that the frequency of earthquakes in the Dead Sea valley is not fixed over time: there were periods of thousands of years with more earthquake activity and thousands of years with less. Moreover, the researchers found that the frequency of earthquakes in Israel had been significantly underestimated.

Whereas researchers previously thought that the Dead Sea rift produced an earthquake of magnitude 7.5 or higher on the Richter scale every 10,000 years on average, it now appears that such destructive earthquakes are much more frequent, with an average cycle ranging from 1,300 to 1,400 years. The researchers estimate that the last earthquake of this magnitude struck the region in 1033 - that is, almost a thousand years ago. This means that in the next few centuries, we can expect another earthquake of a magnitude of 7.5 or higher.
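For a feel of what a 1,300-1,400-year average cycle means, the sketch below applies a memoryless (Poisson) recurrence model. This is purely illustrative arithmetic, not the authors' method: the study itself emphasizes that the earthquake rate varies over time, so a constant-rate model is a deliberate simplification.

```python
import math

# Probability of at least one M >= 7.5 event within a given horizon,
# assuming a constant-rate (Poisson) process -- a simplification the
# study's own findings caution against, used here only to show scale.
mean_recurrence_yr = 1350  # midpoint of the reported 1,300-1,400-yr cycle

for horizon in (50, 100, 500):
    p = 1 - math.exp(-horizon / mean_recurrence_yr)
    print(f"P(>=1 event in next {horizon} yr) ~ {p:.1%}")
```

Even under this crude model, the chance over a century is only a few percent, which is why the researchers frame the 6.5-magnitude events, with their 130-to-150-year cycle, as the nearer-term concern.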

In contrast, the researchers found that earthquakes with a magnitude of 6.5 occur in the region every 130 to 150 years on average, but that the frequency between earthquakes varies; while there were cases in which the lapse between one earthquake and another lasted hundreds of years, there were also cases in which powerful earthquakes occurred within only a few decades of each other.

"I don't want to cause alarm," concludes Prof. Marco, "but we are living in a tectonically active period. The geological record does not lie and a major earthquake in Israel will come. Of course, we have no way of predicting exactly when the earth will shake under our feet - this is a statistical projection - but unfortunately, I can say that an earthquake that will cause hundreds of casualties will hit in the coming years. It could be in ten years or in several decades, but it could also be next week, and we need to constantly be prepared for that."

Credit: 
Tel-Aviv University

Why we use our smartphone at cafés

Maybe you're like us. We're the folks who are on our smartphones almost all the time, even when we're with others. We know it annoys a lot of people, but we do it anyway. Why?

Researchers at the Norwegian University of Science and Technology (NTNU) have looked at why people in cafés pull out their phones, and how this affects café life. Three main reasons they identified are: to delay or pause a conversation (interaction suspension); to get out of a conversation (deliberately shielding interaction); and to share something with others (accessing shareables).

But what does that actually mean?

The smartphone is the world's most ubiquitous personal tech gizmo. The vast majority of adults have one.

"This makes the smartphone important, both socially and sociologically," says postdoctoral fellow and first author Ida Marie Henriksen.

She is affiliated with NTNU's Department of Interdisciplinary Studies of Culture, and has written an article with Professor Aksel Tjora and PhD candidate Marianne Skaar from the Department of Sociology and Political Science.

The use of smartphones is connected to so many of our activities, both ones we do alone and ones we do with others. We look for tempting cafés online, pay for the bus ticket to the café with it, invite friends to come join us, use the phone to identify the music that the café is playing, and lots of other things.

Smartphones give us even better opportunities to be social. But they also enable us to distance ourselves from others.

The researchers visited 52 people at cafés in Trondheim, and interviewed them in depth about their mobile phone use and how they interacted with other people.

"We focused exclusively on people who seemed to know each other from before and who met to socialize. In addition, we observed 108 other meetings at a distance, kind of like research flies on the wall," says Skaar.

By design, cafés are a place where you can be especially social with others. But some people use it instead as a place to hide away with a good drink for a while and keep a suitable distance from people, or as a workplace, preferably with a laptop or tablet in addition to the ubiquitous mobile phone.

So what do the three main types of cell phone use involve?

Delaying interaction is what happens when we interrupt a conversation with our café partner to check an email, a phone conversation, a picture on Snap or just to make sure we haven't missed anything on social media the last few minutes.

This is also called "phubbing" (from phone + snubbing), when the phone gets your attention instead of the live person you're with.

How this behaviour is perceived depends on how the conversation partners understand the situation. You can get annoyed about it and see it as rude. But that's not necessarily the case.

"On the one hand, how you suspend your interaction plays a role. If you explain to the person you're with why you have to postpone your physical interaction, it's perceived as more polite than if you just disappear and start "phubbing," that is, phoning someone else and ignoring the person who's physically present. At the same time, some people may appreciate a short break from a longer conversation, and using the phone can also be a natural, interwoven part of the social interaction that takes place in the café," says Tjora.

Deliberately shielding interaction is a slightly different, more subtle use of the phone than interaction suspension.

"When the person you're with gets busy on their smartphone, the other person in the social setting can pick up their smartphone to demonstrate that they're busy too and not being involuntarily left to themselves. Or if you're in a group, you can pick up your phone to avoid a conversation topic by signalling that you are busy. The smartphone offers a break from face-to-face social situations," says Henriksen.

Some of us take this a step further by keeping our cell phone in silent mode. Then we can pretend we've received an important message or call that we have to hurry to answer, maybe even leaving the company we're sitting with.

You can escape a lot of boring meetings this way. But it's not exactly pleasant.

Content sharing is the more pleasant or useful side of phone use, and can sometimes be almost the opposite of taking a break from interacting.

"When you take a selfie together, or show pictures of your new girlfriend or kids, or of the house you want to bid on, or the map of where you were on holiday, you're sharing content," says Tjora.

Maybe you have an email, an SMS or a document that you want to show your café partner. That belongs in this category, too. Or when you disagree about an actor's or musician's name, and a quick web search can settle which one of you was right.

Content sharing in a café setting often raises new conversation topics and can enrich the interaction. Sharing is probably also the most socially accepted use of mobile phones.

Of course, there are overlapping and grey areas too.

"For example, a mutual understanding can allow those who are meeting to take pictures of the coffee cup at the very beginning of the conversation and perhaps share the picture on social media for uninterested acquaintances. But then they put away their phones, either until a message appears, or perhaps even until the physical meeting comes to an end. If you go to a café to be social, the person with you in real life is the focus," says Henriksen.

Sometimes there are also situations where café partners jointly agree to check this and that on their phones for a short while, but then put them away and concentrate on each other.

Others use their phone while the café partner is ordering at the counter or going to the toilet, simply to have something to do while the other person is away. This is almost like a kind of addiction, where we constantly have to be doing something and fill every pause. The phone is immediately available, willing and able to satisfy this aversion to silence.

The smartphone is a tool for signalling interest or distance, but it can also enrich conversations and be used to share experiences with people beyond those who are physically present.

"The study dispels the myth that everyone is constantly staring at their screens no matter the occasion, and shows that a form of courtesy with the phone has been established, at least in situations where the social aspect is prioritized," says Tjora.

"Whatever the reasons, one thing seems certain: smartphones have changed how we behave socially, for better or for worse. But maybe socializing has just become different in a way we need to become conscious of.

Credit: 
Norwegian University of Science and Technology