Earth

How volcanoes explode in the deep sea

Most volcanic eruptions take place unseen at the bottom of the world's oceans. In recent years, oceanography has shown that this submarine volcanism not only deposits lava but also ejects large amounts of volcanic ash.

"So even under layers of water kilometers thick, which exert great pressure and thus prevent effective degassing, there must be mechanisms that lead to an 'explosive' disintegration of magma," says Professor Bernd Zimanowski, head of the Physical-Volcanological Laboratory of Julius-Maximilians-Universität (JMU) Würzburg in Bavaria, Germany.

Publication of an international research group

An international research group led by Professors James White (New Zealand), Pierfrancesco Dellino (Italy) and Bernd Zimanowski (JMU) has now demonstrated such a mechanism for the first time. The results have been published in the journal Nature Geoscience.

The lead author is Dr. Tobias Dürig from the University of Iceland, a JMU alumnus and former Röntgen Award winner of the JMU Institute of Physics. Before he went to Iceland, Dürig was a member of the research groups of Professor Zimanowski and Professor White.

Diving robot sent to a depth of 1,000 metres

The team did its research at the Havre Seamount volcano, which lies northwest of New Zealand at a depth of about 1,000 metres below the sea surface. The volcano erupted in 2012, drawing the attention of the scientific community.

The eruption created a floating carpet of pumice particles that expanded to about 400 square kilometres - roughly the size of the city of Vienna. A diving robot was then used to examine the ash deposits on the seabed. From the observational data, James White's group identified more than 100 million cubic metres of volcanic ash.

The diving robot also took samples from the seafloor, which were then used in joint experimental studies in the Physical-Volcanological Laboratory of JMU.

Experiments in the Physical-Volcanological Laboratory

"We melted the material and brought it into contact with water under various conditions. Under certain conditions, explosive reactions occurred which led to the formation of artificial volcanic ash," explains Bernd Zimanowski. The comparison of this ash with the natural samples showed that processes in the laboratory must have been similar to those that took place at a depth of 1,000 meters on the sea floor.

Zimanowski describes the decisive experiments: "In the process, the molten material was placed under a layer of water in a crucible with a diameter of ten centimeters and then deformed with an intensity that can also be expected when magma emerges from the sea floor. Cracks are formed and water shoots abruptly into the vacuum created. The water then expands explosively. Finally, particles and water are ejected explosively. We led them through a U-shaped tube into a water basin to simulate the cooling situation under water." The particles created in this way, the "artificial volcanic ash", corresponded in shape, size and composition to the natural ash.

Possible effects on the climate

"With these results, we now have a much better understanding of how explosive volcanic eruptions are possible under water," says the JMU professor. Further investigations should also show whether underwater volcanic explosions could possibly have an effect on the climate.

"With submarine lava eruptions, it takes quite a long time for the heat of the lava to be transferred to the water. In explosive eruptions, however, the magma is broken up into tiny particles. This may create heat pulses so strong that the thermal equilibrium currents in the oceans are disrupted locally or even globally." And those very currents have an important impact on the global climate.

Info Box: Volcanoes on the ocean floor

There are around 1,900 active volcanoes on land or as islands. The number of submarine volcanoes is estimated to be much higher. Exact numbers are not known because the deep sea is largely unexplored. Accordingly, most submarine volcanic eruptions go unnoticed. Submarine volcanoes grow slowly upwards by recurring eruptions. When they reach the water surface, they become volcanic islands - like the active Stromboli near Sicily or some of the Canary Islands.

Credit: 
University of Würzburg

Engineered immune cells recognize, attack human and mouse solid-tumor cancer cells

image: U. of I. postdoctoral researcher Preeti Sharma and her colleagues engineered a molecule that targets both human and mouse solid-tumor cancer cells.

Image: 
Photo by Fred Zwicky

CHAMPAIGN, Ill. -- A method known as CAR-T therapy has been used successfully in patients with blood cancers such as lymphoma and leukemia. It modifies a patient's own T-cells by adding a piece of an antibody that recognizes unique features on the surface of cancer cells. In a new study, researchers report that they have dramatically broadened the potential targets of this approach - their engineered T-cells attack a variety of solid-tumor cancer cells from humans and mice.

They report their findings in the Proceedings of the National Academy of Sciences.

"Cancer cells express on their surface certain proteins that arise because of different kinds of mutations," said Preeti Sharma, a postdoctoral researcher at the University of Illinois at Urbana-Champaign who led the research with biochemistry professor David Kranz, a member of the Cancer Center at Illinois and an affiliate of the Carl R. Woese Institute for Genomic Biology, also at the U. of I. "In this work, we were looking at protein targets that have short sugar chains attached to them."

The abnormally short sugar chains on some types of cancer cells result from mutations that disrupt the molecular pathway that attaches these sugars to proteins, Sharma said. Drugs that bind to the aberrant sugars preferentially recognize cancer cells and spare healthy cells.

CAR-T therapy is a promising treatment for patients with certain types of blood cancers. But identifying binding sites in solid tumors has been more difficult, Kranz said.

"A major challenge in the field has been to identify targets that exist on cancer cells in solid tumors that are not present on normal tissue," he said.

The team started with a piece of an antibody that could serve as a receptor. The antibody was known to interact with a specific type of abnormally formed sugar attached to a protein on solid-tumor cancer cells in mice.

"We realized that because this receptor binds both to the protein and the sugar on the surface of the cancer cell, there might be room to change the antibody so that it can bind to more than one protein attached to the short sugar," Sharma said. "This could make it broadly reactive to different kinds of cancers."

Study co-author Qi Cai, another postdoctoral researcher in the Kranz lab, tested whether changes in the sequence of amino acids in the vicinity of the abnormal sugar affected the receptor's binding to the site. This allowed the team to determine if the antibody could be slightly changed to accommodate other sugar-linked cancer targets.

They conducted a series of mutation experiments focused on the essential parts of the antibody, Sharma said.

"We generated almost 10 million mutant versions of our receptor, and then we screened those to find the property we wanted," she said. "In this case, we wanted to broaden the specificity of that antibody so that it reacts not only to the mouse target but also to human targets."

Once they found the antibodies with the desirable traits, the researchers engineered them into T-cells and tested them with mouse and human cancer cell lines.

"Our engineered T-cells are showing activity against both human and mouse cancer cell lines," Sharma said. "And the T-cells can now recognize several different proteins that have short sugars attached to them. This is really important because in cancer therapy, most of the time you are going after a single target on a cancer cell. Having multiple targets makes it very difficult for the cancer to evade the treatment."

"Although these engineered cells are early in development, we are particularly excited that we can use the same T-cell product to study efficacy and safety against cancers in mice and humans," Kranz said.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Newly designed ligands for a catalytic reaction to synthesize drugs and useful compounds

image: Tetrasubstituted chromanones are useful for drug design, but an effective strategy for their generation has been lacking until now.

Image: 
Gwangju Institute of Science and Technology

Many therapeutic compounds on the market, such as proteins, enzymes, and amino acids, are "chiral compounds"--molecules that exist in two structures that are mirror images of each other but cannot be superimposed. Although the two variants of a molecule, called "enantiomers" and designated (S) or (R) according to their configuration, are built from the same atoms, their spatial orientation (their "chirality") can make them functionally different. Medicinal drugs can be sold either as a single enantiomer or as a racemic mixture consisting of both enantiomers. The two enantiomers often have distinct biological activities: one enantiomer of a pharmaceutical may be far more effective, or far more harmful, than its counterpart--thalidomide, sold as a racemic mixture, caused severe birth defects in children. Thus, synthesizing chiral compounds in an effective, selective manner is crucial to the field of drug design.

In a new study published in Chemical Science, a group of scientists, led by Prof Sukwon Hong of Gwangju Institute of Science and Technology and Prof Brian M. Stoltz of California Institute of Technology, designed a novel catalytic method that can generate useful chiral compounds. Prof Hong explains, "Chiral molecules have played a key role in modern chemistry, especially in medicinal chemistry. Their development can provide an effective synthetic way to design pharmaceutical products."

To begin with, the scientists focused on designing novel chiral "ligands"--molecules that bind to a metal to form a catalyst and can, in this case, facilitate the generation of chiral products called chromanones. Previous studies have reported different types of reactions that can produce chromanones, but they focused on trisubstituted chromanones (with three functional groups or substituent atoms in the molecule). In this study, the scientists designed chiral ligands called "pyridine-dihydroisoquinoline (PyDHIQ) ligands." They used these ligands in a catalytic reaction called "asymmetric conjugate addition," in which the ligands bind to palladium to form the active catalyst, generating tetrasubstituted chromanones (those with four functional groups). Not only did the reaction with the novel ligands generate useful chiral compounds with numerous bioactivities in a single step, but the products were also obtained in good yield and with high enantioselectivity--making the process efficient and cost-effective.

The scientists then tested these ligands in reactions with a variety of substrates, which resulted in the efficient generation of tetrasubstituted chromanones. This was the first method for the highly enantioselective synthesis of chromanone products containing tetrasubstituted stereocenters. Prof Hong says, "Ligand design is the most important concept of this research. We have introduced a new 'moiety' called the dihydroisoquinoline moiety in the ligand structure, which helped to create an optimal steric environment to generate tetrasubstituted chromanones."

The development of novel drugs and useful compounds is an important aspect of advancements in the field of medicine. This study offers a novel one-step strategy to develop bioactive compounds, which have a myriad of applications in drug development. Prof Hong optimistically concludes, "Our newly developed catalytic reaction paves the way for synthesizing novel drugs and natural products."

Credit: 
GIST (Gwangju Institute of Science and Technology)

Historic floods reveal how salt marshes can save lives in the future

image: These are salt marshes in the Westerschelde (near Rilland).

Image: 
Edwin Paree

YERSEKE (THE NETHERLANDS), 29 JUNE 2020 - Coastal wetlands like salt marshes are increasingly recognized as valuable natural defenses that protect coasts against strong wave attack. Yet their performance during real-world, extreme storms has rarely been documented. By digging into major historic records of flood disasters, a research team led by scientists from the Royal Netherlands Institute for Sea Research (NIOZ), Delft University of Technology, Deltares and Antwerp University reveals, in a publication this week in Nature Sustainability, that the value of nature for flood defense has been evident for hundreds of years.

Salt marshes reduced the number of dike breaches during the well-known 1717 flood disaster. More interestingly, the 1953 flood disaster shows that salt marshes are not only 'wave absorbers' that ease wave attack on a dike, but also 'flood fighters' that lower the flood depth by limiting the size of breaches should a dike fail during a severe storm. Having smaller and shallower breaches because of salt marsh protection can save many lives.

Salt marshes have made dikes more stable during severe historic storms

Rising sea levels and stronger storms raise coastal flood risks and inspire the development of a new flood-defense strategy: supplementing engineered structures with coastal wetlands like salt marshes. Although we have learnt from experiments and models that these natural buffers are 'wave absorbers' that reduce storm impact, it has been unclear whether and how they can indeed add considerable safety to engineered defenses during severe, real-world storms. 'Evidence from two notorious flood disasters that killed thousands of people after dike breaching - the 1717 Christmas flood and the 1953 North Sea flood - shows that salt marshes have been playing their role of 'flood fighter' for hundreds of years', says Zhenchang Zhu, the lead author of this paper, who conducted this research at NIOZ but is currently working at Guangdong University of Technology, China. 'Salt marshes not only reduced the number and total width of dike breaches during the 1717 Christmas flood, but were also found to confine the breach depth during the 1953 North Sea flood. Especially the latter, previously unknown function of natural defenses can greatly reduce flood damage by lowering inundation depth', Zhu continues.

Protected by salt marshes during dike breaches: how does it work
https://www.nioz.nl/en/expertise/wadden-delta-research-centre/news-media/videos/coastal-protection/protected-by-salt-marshes-during-dike-breaches-how-does-it-work

Hidden value of natural defense inspires novel flood protection designs

What can we learn from these historic lessons? 'Flood defenses combining green and gray features are actually more beneficial than previously thought. Beyond wave attenuation, salt marshes can lower flood impacts simply by limiting the size of a dike breach, and they continue to do so under sea level rise', Zhu adds. This generally overlooked function of salt marshes is more widely applicable than wave dissipation, as it is not limited to wave-exposed locations. To harness natural defense, marshes ideally have to be preserved or developed on the seaward side of the dike to buffer the waves. This may, however, not always be possible. The study implies that even in this situation, it may still be possible to enhance coastal safety by creating salt marshes in between double dikes, where a secondary, more landward dike is present and the most seaward, primary dike is opened to allow natural processes to ensure marsh development. Though no longer useful for wave reduction, such marshes are still very helpful for flood protection, making the landward dike more stable during extreme storms and buffering the effects of the rising sea in the long run. 'Overall this research enables novel designs of nature-based coastal defenses by smartly harnessing different natural flood defense functions', says Zhenchang Zhu.

For more information about the future use of double dikes in Zeeland, see also:

Double dikes for flood safety
https://www.nioz.nl/en/expertise/wadden-delta-research-centre/news-media/videos/coastal-protection/double-dykes-for-flood-safety

Credit: 
Royal Netherlands Institute for Sea Research

Study shows antibiotic resistance genes persist in E. coli through "genetic capitalism"

We have known for some time that over-use of antibiotics is causing a frightening increase in antibiotic resistance in bacteria, through the rapid spread of antibiotic resistance genes. What may be behind this is not just the spread of these genes, but a fundamental change in the way evolution is driving the economy of gene content among microbes.

Normally, according to evolutionary theory, genes that become prevalent in a population are chosen through natural selection, where survival of organisms carrying a specific gene is determined by an economic no-nonsense cost-benefit analysis. Now, however, it appears that excessive human interference is turning bacteria into rapacious gene-hoarding "genetic capitalists," driving a more unexpected evolutionary process.

A new study, published in the current issue of the journal Cladistics, analyzes a massive genetic data set involving genomes of 29,255 strains of the bacterium Escherichia coli (E. coli) collected between 1884 and 2018 to examine the evolution of 409 different genes that enable various strains of bacteria to resist various antibiotics. The researchers examined whether the genes that confer antibiotic resistance, once acquired, tended to persist massively in the bacterial lineage -- a phenomenon known as "genetic capitalism" -- or disappear once they are no longer required for survival, through a normal evolutionary process known as "stabilizing selection."

In a normal, non-disrupted world, evolution course-corrects changes to bacterial genomes to account for "cost." When an extra gene adds to a bacterium's metabolic workload, natural selection weighs that cost against longer-term survival benefits such as more rapid growth and reproduction - and a "stabilizing selection" process should theoretically dominate by favoring the elimination of genes whose costs are no longer offset by a benefit.

A wide variety of genes that help bacteria resist antibiotic compounds (antibiotics released by other microorganisms and now co-opted in human medicine and agriculture) have probably been around for more than a billion years, but were never so necessary to bacterial survival as to be widespread in bacterial genomes. The expectation, following evolutionary theory, was that these genes, like bulky, high-maintenance tools being carried in a bacterial toolkit, would tend to disappear when they were no longer needed.

The study found that "stabilizing selection" is no longer the evolutionary rule for antibiotic resistance genes.

"Bacteria are under constant competitive pressure from other microorganisms, battling for resources and space or defending against attack," noted Daniel Janies, the Carol Grotnes Belk Distinguished Professor of Bioinformatics and Genomics at the University of North Carolina at Charlotte, and the study's corresponding author. "The energy budget of E. coli is pretty tight -- it's been said that even adding an extra base to a gene will make the bacterial lineage less fit.

"In the absence of selective forces of antibiotics, the bacterial lineage would evolve to lose genes that confer antibiotic resistance - anything that's unnecessary. That's stabilizing selection - the bacterial lineages should come back to the wild type through selective forces or be outcompeted," he said.

And yet instead, in the evolutionary history of E. coli over the past 134 years, the study found that preserving genetic changes that conferred antibiotic resistance was more likely to occur than losing them through long-term selection.

"Most of the genes we examined show gains in a bacterial lineage, but rarely show losses," Janies said. "Imagine how carrying all these genes - sometimes up to 30 of them -- should impact the evolutionary fitness of a bacterium."
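The contrast the study draws - costly genes purged when unneeded versus retained under constant antibiotic pressure - can be sketched with a toy one-locus selection model. All parameter values here are illustrative, not taken from the study:

```python
def next_freq(p, w_carrier, w_other):
    # One generation of haploid selection: the frequency of lineages
    # carrying a resistance gene changes in proportion to relative fitness.
    mean_w = p * w_carrier + (1 - p) * w_other
    return p * w_carrier / mean_w

def simulate(p0, cost, antibiotic_benefit, generations):
    # Carriers pay a small metabolic cost for the extra gene; under
    # antibiotic exposure they also gain a large survival benefit.
    p = p0
    w_carrier = (1 - cost) * (1 + antibiotic_benefit)
    for _ in range(generations):
        p = next_freq(p, w_carrier, 1.0)
    return p

# No antibiotics: stabilizing selection slowly purges the costly gene.
no_drug = simulate(0.5, cost=0.02, antibiotic_benefit=0.0, generations=300)
# Constant antibiotic pressure: the same gene sweeps toward fixation.
with_drug = simulate(0.5, cost=0.02, antibiotic_benefit=0.1, generations=300)
print(f"without antibiotics: {no_drug:.4f}, with antibiotics: {with_drug:.4f}")
```

Under sustained antibiotic pressure the cost term is overwhelmed by the survival benefit, which is the regime the authors label "genetic capitalism"; the model is only a sketch of that qualitative asymmetry, not of the study's phylogenetic analysis.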

While increasing antibiotic resistance in bacteria is hardly a new discovery, Janies notes that the scope of its occurrence shows that massive human interference is driving a widespread hoarding of antibiotic resistance genes that would have been unlikely under normal evolutionary pressure.

"Since the industrialization of antibiotics we've seen that the cost of thriving or just surviving for E. coli requires one or more -- or upwards of thirty -- antibiotic resistance genes," Janies said. "The forces that we are applying through industrialization of antibiotics are very strong."

However, the study also shows that not all genes that confer antibiotic resistance in E. coli are affected to the same extent by the new evolutionary pressures. The study tracks five different ways that genes can confer resistance to antibiotics and measures, across these broad antibiotic classes, whether resistance genes are pushed from stabilizing selection to genetic capitalism or not.

"What we wanted to do is to look at the history of these processes through the lens of a very large data set collected over 134 years and see if there were qualitative differences and functional differences in the genes that behave by the principles of stabilizing selection and those that exhibit genetic capitalism."

The study did find that antibiotic resistance genes that work through mechanisms of "replacement" (replacing bacterial cell molecules that are targets of antibiotic compounds with different molecules) or "efflux" (transporting antibiotic compounds out of the cell) are still more likely to be eliminated through stabilizing selection than to persist as currency in genetic capitalism - probably because these two mechanisms are extremely costly to the routine functioning of the bacterial cell.

Nevertheless, all the other antibiotic resistance mechanisms behave as if they are under the principle of "genetic capitalism," favoring the persistence of genes, showing that, overall, the tendency to retain these costly resistance genes has become the new rule for bacterial lineages.

"This study really helps to stratify the severity or risk of different types of resistance," noted UNC Charlotte bioinformatician Colby Ford, the paper's first author. "In other words, we can better pinpoint the antibiotics to which bacteria are at higher risk of developing a more permanent form of resistance, and which should therefore be avoided."

The researchers note that some antibiotic resistance genes (the types that still show strong effects of stabilizing selection) may still be reduced in bacterial populations by "antibiotic cycling" - taking certain antibiotics out of use for a while until stabilizing selection reduces the presence of the resistance gene in bacterial populations.

"It's an alarming finding, but I didn't want to write the 'doom and gloom paper' - I think there is some hope for managing some kinds of antibiotic resistance," Janies said. "If there is a take-home message that can be used for antibiotic stewardship, it's that some classes of antibiotics, those that work via target replacement and efflux, are subject to stabilizing selection, if we have the will and organization to invoke antibiotic cycling."

Credit: 
University of North Carolina at Charlotte

Analysis of complex geometric models made simple

image: Carnegie Mellon University researchers have shown that complex shapes need not be divided into intricate meshes, left, to perform geometric analysis. Instead of spending 14 hours creating a mesh, they use Monte Carlo methods to get initial results on the amount of heat radiated from an ant's body in less than a minute, center. Additional computation further refines the results, right.

Image: 
Carnegie Mellon University

PITTSBURGH--Researchers at Carnegie Mellon University have developed an efficient new way to quickly analyze complex geometric models by borrowing a computational approach that has made photorealistic animated films possible.

Rapid improvements in sensor technology have generated vast amounts of new geometric information, from scans of ancient architectural sites to the internal organs of humans. But analyzing that mountain of data, whether it's determining if a building is structurally sound or how oxygen flows through the lungs, has become a computational chokepoint.

"The data has become a monster," said Keenan Crane, assistant professor of computer science and robotics. "Suddenly, you have more data than you can possibly analyze -- or even care about."

Crane and Rohan Sawhney, a Ph.D. student in the Computer Science Department, are taming the monster by using so-called Monte Carlo methods to simulate how particles, heat and other things move through or within a complex shape. The process eliminates the need to painstakingly divide shapes into meshes -- collections of small geometric elements that can be computationally analyzed. The researchers will present their method at the SIGGRAPH 2020 Conference on Computer Graphics and Interactive Techniques, which will be held virtually in July.

"Building meshes is a minefield of possible errors," said Sawhney, the lead author. "If just one element is distorted, it can throw off the entire computation. Eliminating the need for meshes is pretty huge for a lot of industries."

Meshing was also a tough problem for filmmakers trying to create photorealistic animations in the 1990s. Not only was meshing laborious and slow, but the results didn't look natural. Their solution was to add randomness to the process by simulating light rays that could bounce around a scene. The result was beautifully realistic lighting, rather than flat-looking surfaces and blocky shadows.

Likewise, Crane and Sawhney have embraced randomness in geometric analysis. They aren't bouncing light rays through structures, but they are using Monte Carlo methods to imagine how particles, fluids or heat randomly interact and move through space. First developed in the 1940s and 1950s for the U.S. nuclear weapons program, Monte Carlo methods are a class of algorithms that use randomness in an ordered way to produce numerical results.

Crane and Sawhney's work revives a little-used "walk on spheres" algorithm that makes it possible to simulate a particle's long, random walk through a space without determining each twist and turn. Instead, they calculate the size of the largest empty space around the particle -- in the lung, for instance, that would be the width of a bronchial tube -- and make that the diameter of each sphere. The program can then just jump from one random point on each sphere to the next to simulate the random walk.

While it might take a day just to build a mesh of a geometric space, the CMU approach allows users to get a rough preview of the solution in just a few seconds. This preview can then be refined by taking more and more random walks.
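The "walk on spheres" idea described above can be sketched in a few lines. The sketch below solves a steady-state heat (Laplace) problem on the unit square with boundary temperature g(x, y) = x, whose exact solution is u(x, y) = x; the domain, tolerance and walk counts are illustrative choices, not details from the CMU paper:

```python
import math
import random

def walk_on_spheres(x, y, boundary_value, eps=1e-3, rng=random):
    # Jump straight to a uniformly random point on the largest empty
    # circle around the current position -- no need to simulate every
    # twist and turn of the underlying random walk.
    while True:
        r = min(x, 1 - x, y, 1 - y)  # largest empty circle in [0,1]^2
        if r < eps:                  # close enough to the boundary: stop
            return boundary_value(x, y)
        theta = rng.uniform(0, 2 * math.pi)
        x += r * math.cos(theta)
        y += r * math.sin(theta)

def estimate(x, y, n_walks=5000, seed=0):
    # Averaging many independent walks gives the solution of the Laplace
    # equation with the given boundary values at the point (x, y).
    rng = random.Random(seed)
    g = lambda bx, by: bx            # boundary data g(x, y) = x
    total = sum(walk_on_spheres(x, y, g, rng=rng) for _ in range(n_walks))
    return total / n_walks

# u(x, y) = x is harmonic, so the estimate at (0.25, 0.5) approaches 0.25
print(estimate(0.25, 0.5))
```

Because each walk takes only a handful of jumps, a rough answer appears almost immediately, and averaging more walks shrinks the error roughly in proportion to one over the square root of the number of walks - which is what makes the preview-then-refine workflow possible.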

"That means one doesn't have to sit around, waiting for the analysis to be completed to get the final answer," Sawhney said. "Instead, the analysis is incremental, providing engineers with immediate feedback. This translates into more time doing and less time banging one's head against the wall trying to understand why the analysis isn't working."

Sawhney and Crane are working with industry partners to expand the kinds of problems that can be solved with their methods. The National Science Foundation, Packard Fellowship, Sloan Foundation, Autodesk, Adobe, Disney and Facebook provided support for this work.

Credit: 
Carnegie Mellon University

Study: Gay and bisexual youth more likely to abandon churchgoing as they reach adulthood

Religious beliefs have shaped societal attitudes toward sexual minorities, with many religious denominations vocally opposing expanded sexual minority rights. Because of this stigmatization, lesbian, gay and bisexual individuals are less likely to affiliate with a religious group -- but research from the University of Nebraska-Lincoln and Old Dominion University suggests they are not abandoning their faith altogether.

In a new study, sociologists Brandi Woodell and Philip Schwadel found that emerging adults -- in the transition from adolescence to early adulthood -- with same-sex attraction are twice as likely to disaffiliate from organized religion as their heterosexual peers, but there was little change in prayer.

"I think that is something we expected, that there'd be a difference between affiliation on one hand and prayer on the other," said Schwadel, Happold Professor of Sociology at Nebraska. "In the previous research on adolescent religion, in particular, and in later adolescence or early emerging adulthood, we see a lot of declines in the organized aspects of religion, but we see less of a decline in prayer. Prayer is something people can often do on their own at home or wherever they want."

And not in an environment that may be stigmatizing toward sexual minorities, the authors wrote in the paper.

The scholars used two longitudinal surveys, the National Longitudinal Study of Adolescent to Adult Health and the National Study of Youth and Religion, to examine -- for the first time -- these declines in religiosity over time for sexual minorities in emerging adulthood.

"Almost all previous research was cross-sectional, only looking at, 'do people who identify as gay or lesbian -- are their religious activities and beliefs different?'" Schwadel said. "It didn't look at how they change over time, especially during this stage of the life course, when individuals are really figuring out who they are."

The study also showed a significant difference in religiosity declines between gay and bisexual individuals, further demonstrating that sexual minorities are not a monolithic group.

Woodell, a 2018 Nebraska alumna and assistant professor of sociology at Old Dominion University in Norfolk, Virginia, said this study joins a novel line of research examining the differences between bisexual, gay and lesbian individuals.

"Past research has most often combined sexual minorities into one group, and that was largely due to a lack of data that separated them, but some newer research has suggested there are differences, which led us to separate the groups out," Woodell said. "We found that those who identify as bisexual show a greater decline in their religious attendance than gay and lesbian individuals."

This difference could be explained by some research that has found bisexuals are less likely to be accepted than their gay counterparts, even in affirming denominations, Woodell said.

"There is newer research showing that bisexuals have experienced stigmatization in their congregation because their sexuality is viewed as a choice," Woodell said.

While the study found little change in prayer among the sexual minority groups, there was a small decline among bisexuals. Schwadel and Woodell said they are pursuing this research further, breaking down differences by gender.

"We're currently looking at how these things differ for men and women," Schwadel said. "We know that gender is strongly related to religiosity, and we expect that gender plays a role in terms of how sexuality is related to religious change."

Further research is also needed, they said, to examine how these declines in religiosity among lesbian, gay and bisexual individuals continue to change in later adulthood.

Credit: 
University of Nebraska-Lincoln

Older adults share fewer memories as they age

By the time people reach a certain age, they've accumulated enough life experience to have plenty of stories to tell about life "back in their day."

However, a new study suggests that the older a person is, the less likely they are to share memories of their past experiences. And when they do share memories, they don't describe them in as much detail as younger people do.

The results of the study, conducted by researchers at the University of Arizona and published in the journal Frontiers in Human Neuroscience, echo previous findings from lab-based research suggesting that memory sharing declines with age.

The UArizona study came to the conclusion in a new way: by "eavesdropping" on older adults' conversations "in the wild."

Most research on memory takes place in a laboratory setting, where participants often are asked to memorize lists or recall and describe specific memories from the past. The UArizona researchers wanted to know how often older adults spontaneously bring up memories in the course of their daily conversations - outside of a controlled laboratory setting.

"This study really gives us one of the first glimpses of people sharing these memories in their day-to-day life," said senior study author Matthew Grilli, an assistant professor in the UArizona Department of Psychology.

Over the course of four days, the daily conversations of 102 cognitively healthy older adults, ages 65 to 90, were monitored with the EAR, or electronically activated recorder - a smartphone app that lets researchers record random samples of study participants' conversations.

Participants kept their phones on them for the duration of the study, and the EAR captured 30-second snippets every six to 18 minutes each day. The participants didn't know at which points the recordings started or ended.
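The sampling scheme described above can be sketched in a few lines of Python. Only the 30-second snippet length and the six-to-18-minute spacing come from the study; the 12-hour recording window and the function names are assumptions for illustration.

```python
import random

def ear_schedule(hours=12, min_gap=6 * 60, max_gap=18 * 60, snippet=30):
    """Generate random snippet start times (in seconds) over a day,
    mimicking the EAR's sampling: one 30-second recording every
    6 to 18 minutes, at times unknown to the participant."""
    total = hours * 3600
    t, starts = 0, []
    while True:
        t += random.randint(min_gap, max_gap)  # random gap, 6-18 min
        if t + snippet > total:
            break
        starts.append(t)
    return starts

starts = ear_schedule()
# Consecutive snippets are always 6-18 minutes apart
gaps = [b - a for a, b in zip(starts, starts[1:])]
```

Because the gaps are drawn at random, participants cannot anticipate when a snippet begins, which is what makes the resulting conversations naturalistic.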

The researchers then analyzed the audio and tallied the number of times participants shared autobiographical memories - or memories about their past experiences.

"We found that the older individuals in our study shared fewer memories," said lead study author Aubrey Wank, a UArizona graduate student in psychology. "Additionally, we found that the level of detail also decreased with older age as people were describing these memories."

It's important for people to recall and share memories, Grilli said. Doing so can help them connect with others. It can also guide planning and decision-making and help people find meaning in other life events and circumstances.

The reason memory sharing declines with age is not entirely clear, but it may be linked to age-related changes in the brain, Grilli and Wank said.

"There are a number of regions in the brain that seem to play an important role in how often we think about our personal past or future," Grilli said. "These brain areas tend to show change with older age, and the idea is that because of these changes, older adults might reflect less on their personal past and future when they're talking with other people."

While the study focused specifically on older adults, future research might consider how that population compares with a younger sample, and if the audience to whom a person is speaking affects how often memories are shared, Wank said.

'Eavesdropping' on the brain

The study's use of the EAR app could have implications for how researchers study memory and cognition in the future.

Developed by UArizona psychology professor and study co-author Matthias Mehl, the EAR started as a standalone recording device designed to help researchers obtain more natural observations of people's everyday lives. It has since evolved into a mobile app that has proven to be a valuable tool for psychologists who study social interactions. The memory study suggests that the EAR could also benefit neuropsychology researchers like Grilli and Wank, who are interested in the relationship between the brain and behavior.

"Assessing cognition on a smartphone is sort of like having a mobile neuropsychologist," Grilli said. "It follows you around and collects a bunch of data on your cognition, and that might give us a better chance not only to get a more precise estimate of your learning and memory, but also to be able to track changes in cognition over time."

Being able to track those changes could help researchers better understand how cognition evolves in aging adults, as well as other populations, such as those with depression or risk factors for Alzheimer's disease.

"One of the reasons we're really interested in better tracking cognitive decline is because we're learning that diseases like Alzheimer's are impacting cognition probably decades before obvious symptoms arise," Grilli said. "The idea that we can develop tools that can track change earlier is intriguing, and it will be important to see if smartphone apps can do that."

Credit: 
University of Arizona

Researchers look for answers as to why western bumblebees are declining

image: Christy Bell, a Ph.D. student in the University of Wyoming Department of Zoology and Physiology, observes a Western bumblebee. Bell and Lusha Tronstad, lead invertebrate zoologist with the Wyoming Natural Diversity Database, are co-authors of a paper about Western bumblebees.

Image: 
Christy Bell

A University of Wyoming researcher and her Ph.D. student have spent the last three years studying the decline of the Western bumblebee. The two have been working with a group of bumblebee experts to fill in gaps of missing information from previous data collected in the western United States. Their goal is to provide information on the Western bumblebee to the U.S. Fish and Wildlife Service while it considers listing this species under the U.S. Endangered Species Act.

"The decline of the Western bumblebee is likely not limited to one culprit but, instead, due to several factors that interact such as pesticides, pathogens, climate change and habitat loss," says Lusha Tronstad, lead invertebrate zoologist with the Wyoming Natural Diversity Database (WYNDD). "Western bumblebees were once the most abundant bumblebees on the West Coast of the U.S., but they are much less frequently observed there now. Pathogens (or parasites) are thought to be a major reason for their decline."

Tronstad and Christy Bell, her Ph.D. student in the Department of Zoology and Physiology, from Laramie, are co-authors of a paper, titled "Western Bumble Bee: Declines in the United States and Range-Wide Information Gaps," that was published online June 26 in Ecosphere, a journal that publishes papers from all subdisciplines of ecological science, as well as interdisciplinary studies relating to ecology.

The two are co-authors because they are members of the Western Bumble Bee Working Group and serve as experts on the Western bumblebee in Wyoming, Tronstad says.

Other contributors to the paper are from the U.S. Geological Survey; U.S. Fish and Wildlife Service; Canadian Wildlife Service; Xerces Society for Invertebrate Conservation in Portland, Ore.; British Columbia Ministry of Environment and Climate Change Strategy; University of Hawaii-Hilo; U.S. Department of Agriculture; The Institute for Bird Populations; University of Vermont; Utah State University; Ohio State University; Denali National Park and Preserve; and the Royal Saskatchewan Museum.

This paper is the result of the Western Bumble Bee Working Group, a group of experts who came together to assemble the state of knowledge on the species in the United States and Canada, Tronstad says. The paper shows both what is known and where knowledge gaps remain, particularly regions with little or no sampling. Prime examples of these spatial gaps include most of Alaska, northwestern Canada and the southwestern United States.

"Some areas in the U.S. have less bumblebee sampling in the past and present," Tronstad explains. "This could be for a variety of reasons such as lack of funding for such inventories, lack of bee expertise in that state, etc."

Occupancy modeling showed that the probability of detecting the Western bumblebee decreased by 93 percent from 1998-2018, Tronstad says. Occupancy modeling is a statistical approach that estimates how often the Western bumblebee was detected across sampling events between 1998 and 2018 in the western United States.
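To make the reported figure concrete: a 93 percent decline in detection probability means the 2018 value is only 7 percent of the 1998 value. The baseline probability below is hypothetical; real occupancy models (which jointly estimate occupancy and detection from repeated surveys) are far more involved than this arithmetic.

```python
# Illustrative arithmetic only -- the 93% decline is the figure
# reported in the paper; the 1998 baseline value is invented.
p_1998 = 0.50            # hypothetical detection probability in 1998
decline = 0.93           # reported proportional decline, 1998-2018
p_2018 = p_1998 * (1 - decline)

print(f"Detection probability fell from {p_1998:.2f} to {p_2018:.3f}")
```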

"The data we assembled will be used by the U.S. Fish and Wildlife Service to inform its decision on whether or not to protect the Western bumblebee under the U.S. Endangered Species Act," Tronstad says. "At WYNDD, we collect data, and that data is used by managers. Our mission is to provide the most up-to-date data on which management decisions can be based."

Tronstad says there are several things that homeowners or landowners can do to help this species of bumblebee survive and thrive. These include:

Plant flowers that bloom throughout the summer. Make sure these flowers have pollen and produce nectar, and are not strictly ornamental.

Provide a water source for bees. Tronstad says she adds a piece of wood to all of her stock tanks so bees can safely get a drink.

Provide nesting and overwintering habitat. Most bumblebees nest in the ground, so leaving patches of bare ground covered with litter or small mammal holes will benefit these bees. Be sure not to work these areas until after you see large bumblebees (queen bees) buzzing around in the spring, usually in April for much of Wyoming, so you can find out where they are nesting.

Tronstad says Bell's research will continue this summer, as Bell will investigate pathogens in the Rocky Mountains of Wyoming that affect Western bumblebees there. Max Packebush, a UW sophomore majoring in microbiology and molecular biology, from Littleton, Colo.; and Matt Green, a 2018 UW graduate from Camdenton, Mo., will assist Bell in her research. NASA and the Wyoming Research Scholars Program will fund Packebush to conduct his work.
The U.S. Geological Survey and the U.S. Fish and Wildlife Service funded the research for this paper.

Credit: 
University of Wyoming

New treatment for common form of muscular dystrophy shows promise in cells, animals

image: University of Alberta medical geneticist Toshifumi Yokota led a research team that created a potential new treatment for one of the most common forms of muscular dystrophy.

Image: 
University of Alberta

Researchers have designed a potential new treatment for one of the most common forms of muscular dystrophy, according to a new study published today in the Proceedings of the National Academy of Sciences.

Toshifumi Yokota, professor of medical genetics at the University of Alberta, led a team from Canada and the U.S. to create and test synthetic DNA-like molecules that interfere with the production of a toxic protein that destroys the muscles of people who have facioscapulohumeral muscular dystrophy (FSHD).

FSHD occurs in one in 8,000 people and causes progressive weakness in the muscles of the face, shoulders and limbs. Onset is usually in the teens or early adulthood. Some patients have trouble breathing; many use a wheelchair. All face lifelong disability.

"There is no cure for FSHD at the moment," said Yokota, who has devoted his career to searching for treatments for all forms of muscular dystrophy.

"This paper shows the potential for this new type of therapy and makes progress towards finding a treatment candidate."

There are dozens of types of muscular dystrophy, almost all involving different genetic mutations that lead to weak muscles. FSHD, the third most common form of muscular dystrophy, causes patients to produce the protein DUX4, which damages muscle cells and causes the cells to die.

"Our goal is to knock down the production of DUX4 so their muscle cells can survive," said Yokota.

His team designed the treatment molecules, technically known as locked nucleic acid (LNA) gapmer antisense oligonucleotides (AOs), or "gapmers" for short. They specifically target the location in the gene that causes DUX4 production.
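An antisense oligonucleotide works through Watson-Crick base pairing with its target transcript, which the sketch below illustrates. The ten-base target is invented for illustration and is not the actual DUX4 site; real gapmers are also DNA/LNA chimeras rather than plain RNA.

```python
# Complementarity rules for RNA base pairing
COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def antisense(mrna: str) -> str:
    """Return the antisense sequence (reverse complement) that would
    base-pair with the given stretch of mRNA."""
    return "".join(COMPLEMENT[base] for base in reversed(mrna))

target = "AUGGCUUACG"        # hypothetical stretch of target mRNA
print(antisense(target))     # -> CGUAAGCCAU
```

In a gapmer, the central "gap" of DNA bases recruits RNase H to cut the bound mRNA, while the locked-nucleic-acid wings increase binding strength and stability.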

The researchers tested the treatment in patient-derived cells in the laboratory and in mice.

"We used a very low concentration of the treatment and it knocked down more than 99 per cent of the DUX4 production, so this is extremely efficient," Yokota said.

The researchers found the muscle cells were larger and more functional after treatment.

Yokota noted that gapmer therapy has been developed for diseases such as inherited high cholesterol, Huntington's disease and even some cancers. None has yet been approved for muscle diseases such as muscular dystrophy.

Next steps for the research team include testing better delivery methods, studying safety and side-effects, and determining how long the drug's benefits last. The researchers have applied for a patent and are seeking a pharmaceutical company partner to conduct a human trial.

"We are not ready to start clinical trials but it's a significant first step towards future drug development," Yokota said.

Credit: 
University of Alberta Faculty of Medicine & Dentistry

NUS researchers uncover a novel protein which drives cancer progression

Cancers arise when the genetic code of normal cells is altered, causing excessive growth. Researchers from the Cancer Science Institute of Singapore (CSI Singapore) at the National University of Singapore (NUS) have discovered a protein that drives the growth of cancers of the esophagus or liver by altering the genetic code in a novel way.

The protein, death associated protein 3 (DAP3), represses a process called adenosine-to-inosine (A-to-I) RNA editing that normally corrects the genetic code to ensure that genes are expressed correctly. By inhibiting RNA editing, DAP3 acts as an oncogene -- a gene that has the potential to cause cancer. This discovery offers the potential of developing novel drugs that target DAP3 for cancer treatment.

The study was led by Assistant Professor Polly Chen, a Principal Investigator at CSI Singapore, and the findings were published in the scientific journal Science Advances on 17 June 2020.

Understanding A-to-I RNA editing

RNAs are one of the most important classes of molecules in cells. They not only convert the genetic information stored in DNA to proteins, but also play critical regulatory roles in various biological processes. RNA editing is a process in which RNA is changed after it is made from DNA, resulting in an altered gene product. In humans, the most common type of RNA editing is A-to-I editing, which is mediated by ADAR proteins (ADAR1 and ADAR2). In the past decade, many studies have reported that the accumulation of deleterious changes in A-to-I RNA editing can trigger a cell to develop into cancer. However, the current knowledge of how the A-to-I RNA editing process is regulated in cancer is still limited.
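The consequence of A-to-I editing can be shown with a minimal sketch: the ribosome reads inosine as guanosine, so editing an adenosine is effectively an A-to-G substitution that can change the encoded amino acid. The codon and edit position below are chosen purely for illustration.

```python
# Tiny excerpt of the standard codon table, enough for this example
CODON = {"AAA": "Lys", "GAA": "Glu"}

def edit_a_to_i(rna: str, pos: int) -> str:
    """Replace the adenosine at `pos` with G, which is how the
    inosine produced by ADAR enzymes is decoded."""
    assert rna[pos] == "A", "A-to-I editing targets adenosines"
    return rna[:pos] + "G" + rna[pos + 1:]

codon = "AAA"                    # encodes lysine
edited = edit_a_to_i(codon, 0)   # "GAA", which encodes glutamate
print(CODON[codon], "->", CODON[edited])   # Lys -> Glu
```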

The CSI Singapore research team therefore conducted a research study to understand how DAP3 -- the interacting protein of the A-to-I RNA editing catalytic enzymes (ADAR1 and ADAR2) -- regulates this process in cancer cells.

Promising drug target for cancer treatment

The team demonstrated that DAP3 can disrupt the binding of ADAR2 protein to its target RNAs, thereby inhibiting A-to-I RNA editing in cancer cells. This suppression is likely one of the pathways by which DAP3 promotes cancer development.

Their analysis also revealed that the expression of DAP3 is elevated in 17 types of cancer. Further experiments demonstrated that DAP3 acted as an oncogene in esophageal cancer and liver cancer cells. Interestingly, they also identified the gene PDZD7, one of the editing targets inhibited by DAP3, and discovered that altered editing of PDZD7 generated a new PDZD7 protein product that contributed to DAP3-driven tumor growth.

Overall, these observations shed light on the complexity of the regulation of the A-to-I RNA editing process in cancer cells, and suggest that DAP3 could be a promising target for future cancer drug development.

"With this new knowledge, we can now look into how we can intervene in the interactions between DAP3 and ADAR proteins in order to interfere with cancer-promoting processes mediated by RNA editing in the cell," said research leader Asst Prof Chen.

Credit: 
National University of Singapore

Wrapping up hydrophobic hydration

In water, hydrophobic molecules are surrounded by two different water populations: the inner shell forms a two-dimensional network of water molecules, while the next layer is formed by a second, almost bulk-like population that forms slightly stronger hydrogen bonds to the bulk water. The assumption to date was that tetrahedral, "ice-like" water dominates in the innermost hydration shell of hydrophobic molecules. The opposite is the case. These new findings were published by the team headed by Professor Martina Havenith, Chair of Physical Chemistry II and Speaker of the Ruhr Explores Solvation Cluster of Excellence at Ruhr-Universität Bochum (RUB), in The Journal of Physical Chemistry Letters on 18 June 2020.

Insights by THz spectroscopy and simulations

In their study, the researchers investigated the hydrogen bond network around the hydrophobic solvated alcohol tert-butanol, as researchers use alcohols as prototype models for hydrophobic molecules. The team combined results from terahertz (THz) spectroscopy and simulations.

In THz spectroscopy, researchers measure the absorption of THz radiation in a sample. The absorption spectrum provides a fingerprint of the water network.

A thin layer

In their study they obtained a detailed picture of the water layers surrounding the molecule. "We refer to the innermost layer as 'HB-wrap', where HB stands for water hydrogen bond," explains Martina Havenith. The top layer is called 'HB-hydration2bulk', as it describes the interface to the bulk water. Combined, both layers of the coating are sometimes no thicker than a single layer of water molecules. "Occasionally, a single water molecule may be part of both layers," says Havenith.

Inner layer stays stable longer

When the temperature is increased, the outer layer melts first; the HB-wrap layer remains intact longer. "The inner layer also has less freedom to form distinct configurations due to the hydrophobicity of the solute," elaborates the researcher. "As individual water molecules must always turn away from the alcohol, they form a two-dimensional, loose network." Water molecules in the outer layer have more freedom to move and therefore more possibilities to connect with other water molecules; researchers refer to this phenomenon as greater entropy.

This type of interaction is relevant for the folding processes of proteins as well as for biomolecular recognition between a drug and its target molecule. Understanding the role of water is crucial to both processes.

Credit: 
Ruhr-University Bochum

Soft coral garden discovered in Greenland's deep sea

A deep-sea soft coral garden habitat has been discovered in Greenlandic waters by scientists from UCL, ZSL and Greenland Institute of Natural Resources, using an innovative and low-cost deep-sea video camera built and deployed by the team.

The soft coral garden, presented in a new Frontiers in Marine Science paper, is the first habitat of this kind to have been identified and assessed in west Greenland waters.

The study has direct implications for the management of economically important deep-sea trawl fisheries, which are immediately adjacent to the habitat. The researchers hope that a 486 km² area will be recognised as a 'Vulnerable Marine Ecosystem' under UN guidelines, to ensure that it is protected.

PhD researcher Stephen Long (UCL Geography and ZSL (Zoological Society London)), first author on the study, said: "The deep sea is often overlooked in terms of exploration. In fact, we have better maps of the surface of Mars than we do of the deep sea.

"The development of a low-cost tool that can withstand deep-sea environments opens up new possibilities for our understanding and management of marine ecosystems. We'll be working with the Greenland government and fishing industry to ensure this fragile, complex and beautiful habitat is protected."

The soft coral garden discovered by the team exists in near total darkness, 500m below the surface at a pressure 50 times greater than at sea level. This delicate and diverse habitat features abundant cauliflower corals as well as feather stars, sponges, anemones, brittle stars, hydrozoans, bryozoans and other organisms.

Dr Chris Yesson (ZSL), last author on the study, said: "Coral gardens are characterised by collections of one or more species (typically non-reef-forming corals) that sit on a wide range of hard and soft bottom habitats, from rock to sand, and support a diversity of fauna. There is considerable diversity among coral garden communities, which have previously been observed in areas such as northwest and southeast Iceland."

The discovery is particularly significant given that the deep sea is the most poorly known habitat on earth, despite being the biggest and covering 65% of the planet. Until very recently, very little was known about Greenland's deep-sea habitats, their nature, distribution and how they are impacted by human activities.

Surveying the deep sea has typically proved difficult and expensive. One major factor is that ocean pressure increases by one atmosphere (the average atmospheric pressure at sea level) with every 10 metres of descent. Deep-sea surveys have therefore often only been possible using expensive remotely operated vehicles and manned submersibles, like those seen in Blue Planet, which can withstand deep-sea pressure.
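The rule of thumb above reproduces the "50 times greater than at sea-level" figure quoted earlier: one atmosphere at the surface plus one more per 10 metres of seawater gives roughly 51 atmospheres at the coral garden's 500 m depth.

```python
def pressure_atm(depth_m: float) -> float:
    """Approximate absolute pressure (in atmospheres) at a given
    seawater depth: 1 atm at the surface, plus 1 atm per 10 m."""
    return 1.0 + depth_m / 10.0

print(pressure_atm(500))   # -> 51.0
```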

The UK-Greenland research team overcame this challenge by developing a low-cost towed video sled, which uses a GoPro video camera, lights and lasers in special pressure housings, mounted on a steel frame.

The lasers, which were used to add a sense of scale to the imagery, were made by combining high-powered laser pointers with DIY housings made at UCL's Institute of Making, with help from UCL Mechanical Engineering.

The team placed the video sled - which is about the size of a Mini Cooper - on the seafloor for roughly 15 minutes at a time across 18 different stations. Stills were taken from the video footage, with 1,239 images extracted for further analysis.

A total of 44,035 annotations of the selected fauna were made. The most abundant were anemones (15,531) and cauliflower corals (11,633), with cauliflower corals observed at a maximum density of 9.36 corals per square metre.
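A quick check of the survey arithmetic shows that the two most abundant taxa together account for roughly 60 percent of all annotations. The counts come straight from the text; per-station densities would require the imaged area per station, which the press release does not give.

```python
# Annotation counts reported in the study
total = 44_035
counts = {"anemones": 15_531, "cauliflower corals": 11_633}

for taxon, n in counts.items():
    share = n / total
    print(f"{taxon}: {n:,} annotations ({share:.1%} of all annotations)")
```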

Long said: "A towed video sled is not unique. However, our research is certainly the first example of a low-cost DIY video sled being used to explore deep-sea habitats in Greenland's 2.2 million km² of sea. So far, the team has managed to reach an impressive depth of 1,500m. It has worked remarkably well and led to interest from researchers in other parts of the world."

Dr Yesson added: "Given that the ocean is the biggest habitat on earth and the one about which we know the least, we think it is critically important to develop cheap, accessible research tools. These tools can then be used to explore, describe and crucially inform management of these deep-sea resources."

Dr Martin Blicher (Greenland Institute of Natural Resources) said: "Greenland's seafloor is virtually unexplored, although we know it is inhabited by more than 2,000 different species that together contribute to complex and diverse habitats and to the functioning of the marine ecosystem. Despite knowing so little about these seafloor habitats, the Greenlandic economy depends on a small number of fisheries which trawl the seabed. We hope that studies like this will increase our understanding of ecological relationships, and contribute to sustainable fisheries management."

Credit: 
University College London

Ecosystem degradation could raise risk of pandemics

Environmental destruction may make pandemics more likely and less manageable, new research suggests.

The study, by the University of the West of England and the Greenpeace Research Laboratories at the University of Exeter, presents the hypothesis that disease risks are "ultimately interlinked" with biodiversity and natural processes such as the water cycle.

Using a framework designed to analyse and communicate complex relationships between society and the environment, the study concludes that maintaining intact and fully functioning ecosystems and their associated environmental and health benefits is key to preventing the emergence of new pandemics.

The loss of these benefits through ecosystem degradation - including deforestation, land use change and agricultural intensification - further compounds the problem by undermining water and other resources essential for reducing disease transmission and mitigating the impact of emerging infectious diseases.

Lead author Dr Mark Everard, of the University of the West of England (UWE Bristol), said: "Ecosystems naturally restrain the transfer of diseases from animals to humans, but this service declines as ecosystems become degraded.

"At the same time, ecosystem degradation undermines water security, limiting availability of adequate water for good hand hygiene, sanitation and disease treatment.

"Disease risk cannot be dissociated from ecosystem conservation and natural resource security."

Dr David Santillo, of the Greenpeace Research Laboratories at Exeter, added: "The speed and scale with which radical actions have been taken in so many countries to limit the health and financial risks from COVID-19 demonstrate that radical systemic change would also be possible in order to deal with other global existential threats, such as the climate emergency and collapse of biodiversity, provided the political will is there to do so."

The researchers say the lesson from the COVID-19 pandemic is that societies globally need to "build back better", including protecting and restoring damaged ecosystems (in line with the goals of the 2021-2030 UN Decade on Ecosystem Restoration) and keeping the many values of nature and human rights at the very forefront of environmental and economic policy-making.

Credit: 
University of Exeter

Researchers print, tune graphene sensors to monitor food freshness, safety

image: Researchers are using aerosol-jet-printing technology to create these graphene biosensors that can detect histamine, an allergen and indicator of spoiled fish and meat.

Image: 
Image courtesy of Jonathan Claussen/Iowa State University

AMES, Iowa - Researchers dipped their new, printed sensors into tuna broth and watched the readings.

It turned out the sensors - printed with high-resolution aerosol jet printers on a flexible polymer film and tuned to test for histamine, an allergen and indicator of spoiled fish and meat - can detect histamine down to 3.41 parts per million.

The U.S. Food and Drug Administration has set histamine guidelines of 50 parts per million in fish, making the sensors more than sensitive enough to track food freshness and safety.
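Putting the two figures side by side shows the headroom the sensor has: its 3.41 ppm detection limit sits roughly 15-fold below the FDA's 50 ppm guideline, so any sample approaching the guideline is comfortably within its measurable range. The helper function below is illustrative, not part of the published work.

```python
# Figures taken from the text; the pass/fail helper is hypothetical.
DETECTION_LIMIT_PPM = 3.41    # sensor's reported detection limit
FDA_GUIDELINE_PPM = 50.0      # FDA histamine guideline for fish

def fish_flagged(histamine_ppm: float) -> bool:
    """Would a histamine reading meet or exceed the FDA guideline?"""
    return histamine_ppm >= FDA_GUIDELINE_PPM

margin = FDA_GUIDELINE_PPM / DETECTION_LIMIT_PPM
print(f"Sensor is ~{margin:.0f}x more sensitive than the guideline requires")
print(fish_flagged(3.41), fish_flagged(62.0))   # False True
```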

Making the sensor technology possible is graphene, a supermaterial that's a carbon honeycomb just an atom thick and known for its strength, electrical conductivity, flexibility and biocompatibility. Making graphene practical on a disposable food-safety sensor is a low-cost, aerosol-jet-printing technology that's precise enough to create the high-resolution electrodes necessary for electrochemical sensors to detect small molecules such as histamine.

"This fine resolution is important," said Jonathan Claussen, an associate professor of mechanical engineering at Iowa State University and one of the leaders of the research project. "The closer we can print these electrode fingers, in general, the higher the sensitivity of these biosensors."

Claussen and the other project leaders - Carmen Gomes, an associate professor of mechanical engineering at Iowa State; and Mark Hersam, the Walter P. Murphy Professor of Materials Science and Engineering at Northwestern University in Evanston, Illinois - have recently reported their sensor discovery in a paper published online by the journal 2D Materials. (See sidebar for a full listing of co-authors.)

The National Science Foundation, the U.S. Department of Agriculture, the Air Force Research Laboratory and the National Institute of Standards and Technology have supported the project.

The paper describes how graphene electrodes were aerosol jet printed on a flexible polymer and then converted to histamine sensors by chemically binding histamine antibodies to the graphene. The antibodies specifically bind histamine molecules.

The histamine blocks electron transfer and increases electrical resistance, Gomes said. That change in resistance can be measured and recorded by the sensor.
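The read-out principle described above can be sketched as a calibration inversion: bound histamine impedes electron transfer, so resistance rises with concentration, and a calibrated sensor maps the measured resistance back to a histamine level. The linear model and both constants below are invented for illustration; real electrochemical sensors are calibrated empirically and are often log-linear rather than linear.

```python
# Hypothetical calibration constants -- not from the published sensor
BASELINE_OHMS = 1_000.0      # assumed resistance with no histamine bound
SLOPE_OHMS_PER_PPM = 12.0    # assumed resistance increase per ppm

def concentration_ppm(measured_ohms: float) -> float:
    """Invert the (assumed linear) calibration: convert a resistance
    reading into an estimated histamine concentration."""
    return (measured_ohms - BASELINE_OHMS) / SLOPE_OHMS_PER_PPM

print(concentration_ppm(1_600.0))   # -> 50.0 ppm, at the FDA guideline
```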

"This histamine sensor is not only for fish," Gomes said. "Bacteria in food produce histamine. So it can be a good indicator of the shelf life of food."

The researchers believe the concept will work to detect other kinds of molecules, too.

"Beyond the histamine case study presented here, the (aerosol jet printing) and functionalization process can likely be generalized to a diverse range of sensing applications including environmental toxin detection, foodborne pathogen detection, wearable health monitoring, and health diagnostics," they wrote in their research paper.

For example, by switching the antibodies bonded to the printed sensors, they could detect salmonella bacteria, or cancers or animal diseases such as avian influenza, the researchers wrote.

Claussen, Hersam and other collaborators (see sidebar) have demonstrated broader application of the technology by modifying the aerosol-jet-printed sensors to detect cytokines, or markers of inflammation. The sensors, as reported in a recent paper published by ACS Applied Materials & Interfaces, can monitor immune system function in cattle and detect deadly and contagious paratuberculosis at early stages.

Claussen, who has been working with printed graphene for years, said the sensors have another characteristic that makes them very useful: They don't cost a lot of money and can be scaled up for mass production.

"Any food sensor has to be really cheap," Gomes said. "You have to test a lot of food samples and you can't add a lot of cost."

Claussen and Gomes know something about the food industry and how it tests for food safety. Claussen is chief scientific officer and Gomes is chief research officer for NanoSpy Inc., a startup company based in the Iowa State University Research Park that sells biosensors to food processing companies.

They said the company is in the process of licensing this new histamine and cytokine sensor technology.

That, after all, is what they're looking for in a commercial sensor. "This," Claussen said, "is a cheap, scalable, biosensor platform."

Credit: 
Iowa State University