A new synthesis method for three-dimensional nanocarbons

image: A new synthesis method creates curved octagonal structures by linking benzene rings.

Image: 
Issey Takahashi

A team of scientists led by Kenichiro Itami, Professor and Director of the Institute of Transformative Bio-Molecules (WPI-ITbM), has developed a new method for the synthesis of three-dimensional nanocarbons with the potential to advance materials science.

Three-dimensional nanocarbons, next-generation materials with superior physical characteristics that are expected to find uses in fuel cells and organic electronics, have thus far been extremely challenging to synthesize in a precise and practical fashion. The new method uses a palladium catalyst to connect polycyclic aromatic hydrocarbons into an octagonal structure, enabling the successful synthesis of three-dimensional nanocarbon molecules.

Nanocarbons, such as the fullerene (a sphere, whose discovery was recognized by the 1996 Nobel Prize), the carbon nanotube (a cylinder, discovered in 1991) and graphene (a sheet, recognized by the 2010 Nobel Prize), have attracted a great deal of attention as functional molecules with a variety of different properties. Since Mackay et al. put forward their theory in 1991, a variety of periodic three-dimensional nanocarbons have been proposed; however, these have been extraordinarily difficult to synthesize. A particular challenge is the eight-membered ring structure, which appears periodically in these proposed structures and therefore demands an efficient synthesis route. To meet this need, Dr Itami's research team developed a new method for connecting polycyclic aromatic hydrocarbons using a palladium catalyst to produce eight-membered rings via cross-coupling, the first reaction of its type in the world.

The success of this research represents a revolutionary achievement in three-dimensional nanocarbon molecule synthesis. It is expected to lead to the discovery and elucidation of further novel properties and the development of next-generation functional materials.

Credit: 
Institute of Transformative Bio-Molecules (ITbM), Nagoya University

The behavior of coral reefs is simulated in order to optimize space in industrial plants

Many factors must be kept in mind when designing a hospital, a factory, a shopping center or any industrial plant, and many questions can arise before deciding on the floor plans. What is the best placement for each different space? What distribution is the most appropriate in order to improve efficiency in these large areas? University of Cordoba researchers Laura García and Lorenzo Salas are trying to provide an answer to these questions, and to do so, they have turned to the marine world to simulate the behavior of coral reefs.

Within these picturesque underwater structures, home to a wide range of species, there is a constant battle for space in which available hollows are fully exploited in pursuit of survival. It is precisely this natural distribution model that has guided this research team, which over the last few years has been working to answer the following question: what is the best solution when designing the layout of an industrial plant?

The first to incorporate the behavior of these coral reefs into a computer algorithm was researcher Sancho Salcedo, at the University of Alcalá de Henares, in 2013. Since then, and stemming from a partnership, the team established a line of research inspired by the life and reproduction of these living beings in order to make the most of space. Recently, the group published a new article that improves upon this bio-inspired algorithm. "Instead of simulating a flat coral reef, as we had done previously, we were able to replicate the structure in three dimensions, making it possible to come up with many more solutions and offer better results", explains lead author of the research, Laura García.

In the real world, the algorithm offers novel designs that had not been assessed before - new floor plans for how an industrial plant could look when space is optimized to the utmost - saving money and improving the efficiency of these buildings. Validated in different industrial settings such as a slaughterhouse, paper and plastic recycling plants, and buildings of up to 60 departments, the algorithm can factor in variables such as distribution, the amount of material, the cost of moving that material from one place to another, noises to avoid, and necessary parameters of proximity and remoteness.
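The article does not include the team's code, but the coral reefs optimization (CRO) idea it builds on can be sketched on a toy facility-layout problem: candidate layouts are corals competing for limited space on a reef, new larvae are produced by broadcast spawning (recombination) and brooding (mutation), and a larva settles only if it finds a free cell or beats the current occupant. The Python below is a minimal, illustrative sketch under those assumptions; the instance, parameters and operators are not the UCO team's three-dimensional implementation.

```python
import random

# Toy facility-layout instance: assign N departments to N grid slots so that
# total material-handling cost = sum(flow between departments * slot distance)
# is minimized. All names and parameters here are illustrative.
N = 8
random.seed(1)
flow = [[random.randint(0, 9) if i != j else 0 for j in range(N)] for i in range(N)]
coords = [(i % 4, i // 4) for i in range(N)]        # slots on a 4x2 grid

def cost(perm):
    """Material-handling cost of placing department perm[s] in slot s."""
    total = 0
    for a in range(N):
        for b in range(a + 1, N):
            d = abs(coords[a][0] - coords[b][0]) + abs(coords[a][1] - coords[b][1])
            total += flow[perm[a]][perm[b]] * d
    return total

def random_coral():
    p = list(range(N))
    random.shuffle(p)
    return p

# The reef has a fixed number of cells, some initially empty -- a key CRO
# ingredient that mimics the battle for space described above.
reef = [random_coral() if random.random() < 0.6 else None for _ in range(30)]

def settle(larva):
    """A larva tries a few cells and settles if the cell is free or it is better."""
    for _ in range(3):
        k = random.randrange(len(reef))
        if reef[k] is None or cost(larva) < cost(reef[k]):
            reef[k] = larva
            return

for _ in range(200):
    corals = [c for c in reef if c is not None]
    # Broadcast spawning: order-crossover of two parent layouts.
    p1, p2 = random.sample(corals, 2)
    c1, c2 = sorted(random.sample(range(N), 2))
    child = [None] * N
    child[c1:c2] = p1[c1:c2]
    rest = [g for g in p2 if g not in child]
    for i in range(N):
        if child[i] is None:
            child[i] = rest.pop(0)
    settle(child)
    # Brooding: mutate one coral by swapping two departments.
    brood = random.choice(corals)[:]
    i, j = random.sample(range(N), 2)
    brood[i], brood[j] = brood[j], brood[i]
    settle(brood)
    # (A full CRO also periodically removes the worst corals -- depredation.)

best = min((c for c in reef if c is not None), key=cost)
print("best layout:", best, "cost:", cost(best))
```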

An algorithm that includes subjective preferences

In this respect, over the last few months the team has published other work that delves deeper into the same line of research in highly regarded scientific journals. Recently, the group was able to incorporate an interactive tool into the algorithm that includes subjective preferences in the design. "By means of a device that analyzes the way the person in charge of designing the project looks at the floor plans and the degree to which their pupil is dilated, their opinion can be transmitted to the floor plans being proposed", underscores Laura García.

The research carried out over the last few months, in which other UCO professors such as José Antonio García, Carlos Carmona and Adoración Antolí also participated, made it possible to establish partnerships with universities in Portugal, Saudi Arabia and the United States, with contributions from José Valente de Oliveira (at the University of Algarve), Sancho Salcedo Sanz (at the University of Alcalá de Henares) and Ajith Abraham (at Machine Intelligence Research Labs).

Credit: 
University of Córdoba

Using games to study laws of motion in the mind

image: The link between game refinement (GR) theory and flow theory. Force is normalized over varying mass, alongside momentum and potential energy; these measures were derived from the underlying mechanisms of game progress in Go, Chess, and Shogi. Acceleration provides a reasonable indicator of the flow state, measuring the balance between challenge and ability. The cross point at m=1/2 is vital as a border between competitive and non-game (e.g., educational) contexts, while the potential-energy peak at m=1/3 is prominent in non-game settings (e.g., maximal engagement).

Image: 
JAIST

Researchers at the Japan Advanced Institute of Science and Technology have established relationships between games and laws of motion in the mind, through an analogy between physics and game refinement theory.

Establishing several physical quantities (such as mass, speed, and acceleration) relative to the game progress model allowed a player's entertainment experience of a specific game to be characterized through Newtonian laws of motion, specifically force, momentum, and potential energy. Such laws of motion reveal how a player feels in their mind, mapping games that originate from different cultures to states of the human mind: a measure of sophistication that corresponds to a natural yet pleasurable experience.

Uncovering the fundamental mechanisms of game playing has been the primary goal of the IIDA laboratory. Game refinement theory, the fruit of several years of labor, captures the relationship between game progress and entertainment experience from the perspective of game design. Several sub-branches of the theory have been explored through board games (e.g., Chess and Go), sports (e.g., basketball and table tennis), and video games (e.g., action games). Non-game contexts have also been explored (such as business, education, and loyalty programs). Interestingly, all of those studies found that the game refinement measure converges to approximately the same "zone" of values (a region named the noble uncertainty).

Based on the notion of the uncertainty of the game outcome and of gamified experience, several models have been introduced to capture the essence of game playing in a variety of contexts. A move-selection model and a scoring model were established for board games and sports, respectively, via the ratio of solved uncertainty to the game's length. From these, the game refinement (GR) measure can be interpreted as the magnitude of a gravitational acceleration felt in the mind. The notion of speed in a game was then established, and the difficulty of resolving the outcome uncertainty defines a mass in the mind. With mass, speed, and acceleration, the Newtonian laws of motion can be applied by analogy to reflect motion in the mind. Through games, levels of engagement, thrill, and competitiveness can be illustrated according to the interplay of momentum, energy, and force in the game. The third derivative of the game progress model, the jerk (the derivative of acceleration, an essential measure in mechanical engineering), influences the force quantity and establishes the notions of effort, achievement, and discomfort felt in the mind.
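The release does not spell out the formulas, but the special values of m quoted in the figure caption (m=1/2 and m=1/3) fall out of one common reading of the model, in which the velocity is v = 1 - m, the momentum is p = mv, and the potential energy is U = mv^2. The short Python sketch below verifies those peaks; these variable definitions are our assumption for illustration, not the laboratory's reference implementation.

```python
import numpy as np

# Motion-in-mind quantities as functions of the "mass" m in [0, 1].
# Assumed reading: velocity v = 1 - m (rate at which outcome uncertainty
# is solved), momentum p = m * v, potential energy U = m * v**2.
m = np.linspace(0.0, 1.0, 1001)
v = 1.0 - m
p = m * v          # peaks at m = 1/2, the quoted competitive/non-game border
U = m * v**2       # peaks at m = 1/3, the quoted engagement peak

print("momentum peak at m =", m[p.argmax()])          # -> 0.5
print("potential-energy peak at m =", m[U.argmax()])  # -> 0.333
```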

"It is exciting to understanding how people think and feel inside (mind and body) when playing game, and it is especially curious as to why most game revolves around the established 'zone' value."

Hiroyuki Iida, Trustee and Vice President for Educational and Student Affairs,
Head of IIDA Laboratory, and Director of the Research Center for Entertainment Science,
Japan Advanced Institute of Science and Technology

Under the guidance of Dr. Mohd Nor Akmal Khalid and Prof. Hiroyuki Iida, colleagues at the Research Center for Entertainment Science, and through various interactions between current and former students, research frameworks built on game refinement theory have been established as an emerging multidisciplinary field for the design and analysis of games.

At present, the IIDA laboratory opens its arms to international students from multiple backgrounds. Current design and development focus on expanding the notion of game refinement theory to a variety of game types and to related fields such as education, business, engineering, system design, artificial intelligence in games, search algorithms, and many more.

"The establishment of the link between game refinement theory and flow theory is a start, where we hope the current framework will open-up more opportunity for collaboration and at the same time generalizes as a cross-disciplinary field while contributes to the society at large in a more meaningful ways. At present, most work is still fundamental in nature."

Mohd Nor Akmal Khalid
Assistant Professor, Research Center for Entertainment Science,
Japan Advanced Institute of Science and Technology

Credit: 
Japan Advanced Institute of Science and Technology

Tinkering with roundworm proteins offers hope for anti-aging drugs

image: VRK-1 visualized by tagging with green fluorescent protein (GFP) in C. elegans.

Image: 
Seung-Jae V. LEE, KAIST.

KAIST researchers have been able to dial creatures' lifespans up and down by altering the activity of proteins found in roundworm cells that tell them to convert sugar into energy when cellular energy is running low. Humans also have these proteins, offering up intriguing possibilities for developing longevity-promoting drugs. These new findings were published on July 1 in Science Advances.

The roundworm Caenorhabditis elegans (C. elegans), a millimeter-long nematode commonly used in lab testing, enjoyed a boost in its lifespan when researchers tinkered with a couple of proteins involved in monitoring the energy use by its cells.

The proteins VRK-1 and AMPK work in tandem in roundworm cells, with the former telling the latter to get to work by sticking a phosphate molecule, composed of one phosphorus and four oxygen atoms, on it. AMPK's role, in turn, is to monitor energy levels in cells and respond when cellular energy runs low. In essence, VRK-1 regulates AMPK, and AMPK regulates the cellular energy status.

Using a range of different biological research tools, including introducing foreign genes into the worm, a group of researchers led by Professor Seung-Jae V. Lee from the Department of Biological Sciences at KAIST were able to dial up and down the activity of the gene that tells cells to produce the VRK-1 protein. This gene has remained pretty much unchanged throughout evolution. Most complex organisms have this same gene, including humans.

Lead author of the study Sangsoon Park and his colleagues confirmed that the overexpression, or increased production, of the VRK-1 protein boosted the lifespan of the C. elegans, which normally lives just two to three weeks, and the inhibition of VRK-1 production reduced its lifespan.

The research team found that the activity of the VRK-1-to-AMPK cellular-energy monitoring process is increased in low cellular energy status by reduced mitochondrial respiration, the set of metabolic chemical reactions that make use of the oxygen the worm breathes to convert macronutrients from food into the energy "currency" that cells spend to do everything they need to do.

It is already known that mitochondria, the energy-producing engine rooms in cells, play a crucial role in aging, and declines in the functioning of mitochondria are associated with age-related diseases. At the same time, the mild inhibition of mitochondrial respiration has been shown to promote longevity in a range of species, including flies and mammals.

When the research team performed similar tinkering with cultured human cells, they found they could also replicate this ramping up and down of the VRK-1-to-AMPK process that occurs in roundworms.

"This raises the intriguing possibility that VRK-1 also functions as a factor in governing human longevity, and so perhaps we can start developing longevity-promoting drugs that alter the activity of VRK-1," explained Professor Lee.

At the very least, the research points us in an interesting direction for investigating new therapeutic strategies to combat metabolic disorders by targeting the modulation of VRK-1. Metabolic disorders involve the disruption of chemical reactions in the body, including diseases of the mitochondria.

But before metabolic disorder therapeutics or longevity drugs can be contemplated by scientists, further research still needs to be carried out to better understand how VRK-1 works to activate AMPK, as well as figure out the precise mechanics of how AMPK controls cellular energy.

Credit: 
The Korea Advanced Institute of Science and Technology (KAIST)

The Lancet Public Health: UK and US healthcare workers report higher rates of COVID-19 compared to general population in early pandemic period

The authors say that health-care systems should ensure adequate availability of PPE and develop additional strategies to protect health-care workers from COVID-19, particularly those from Black, Asian, and minority ethnic backgrounds.

Frontline healthcare workers may have substantially higher risk of reporting a positive test for COVID-19 than people from the general population, according to an observational study of almost 100,000 healthcare workers in the UK and USA published today in The Lancet Public Health journal.

The study, based on self-reported data from users of the COVID Symptom Study smartphone app between 24 March and 23 April 2020, found the prevalence of COVID-19 was 2,747 per 100,000 app users among frontline care workers compared with 242 per 100,000 app users from the general community. After accounting for differences in testing for healthcare workers compared with the general community, the researchers estimate frontline workers are around 3.4 times more likely to test positive for COVID-19.

Professor Andrew Chan, senior author, from Massachusetts General Hospital, USA, said: "Previous reports from public health authorities suggest that around 10-20% of COVID-19 infections occur among health workers. Our study provides a more precise assessment of the magnitude of increased infection risk among healthcare workers compared to the general community. Many countries, including the US, continue to face vexing shortages of PPE. Our results underscore the importance of providing adequate access to PPE and also suggest that systemic racism associated with inequalities to access to PPE likely contribute to the disproportionate risk of infection among minority frontline healthcare workers." [1]

Gloves, gowns, and face masks are recommended for those caring for COVID-19 patients, but surging demand and supply chain disruptions have resulted in global shortages. Some areas have attempted to conserve PPE by reusing items or using them for longer periods of time, but data on the safety of such practices is scarce.

The latest study is based on data collected from the COVID Symptom Study smartphone app between 24 March and 23 April 2020. App users were asked to provide background information about themselves, such as age, race and whether they already have any medical conditions.

Participants were also asked if they worked in health care and, if yes, whether they had direct patient contact in their job. For the purposes of the study, frontline healthcare workers were defined as participants with direct patient contact, and this group was further subdivided according to whether they cared for patients with suspected or confirmed COVID-19 and the frequency with which they used PPE (always, sometimes, never). They were also asked to report if they had enough PPE when needed, if they had to reuse PPE, or if they did not have enough because of shortages. In addition, they were asked if they worked in inpatient care, nursing home, outpatient, home health, ambulatory clinic, or other, but they were not asked to give their specific role.

All participants were asked if they felt physically well at the outset of the study and again with daily reminders. If they reported not feeling well, they were asked about their symptoms. They were also asked if they had been tested for COVID-19 and what the result had been.

Some 2.6 million people from the UK (2,627,695) and 182,408 people from the USA were enrolled in the study. The researchers excluded 670,298 people who used the app for less than 24 hours and 4,615 people who tested positive for COVID-19 at the outset. This left 2,135,190 participants, of whom 99,795 people identified themselves as frontline healthcare workers. The participants reported symptoms for an average of 19 days.

Of those included in the study, there were 5,545 reports of a positive COVID-19 test over 34,435,272 person-days [2].

In secondary analyses, after accounting for pre-existing medical conditions, frontline healthcare workers who reported having inadequate PPE were 1.3 times more likely to report a positive COVID-19 test than those who said they had adequate equipment to protect themselves (inadequate PPE: 157 positive COVID-19 tests in 60,916 person days; adequate PPE: 592 positive tests in 332,901 person days). The increase in risk was similar for healthcare workers who reported reusing PPE, who were almost 1.5 times more likely to report a positive COVID-19 test than those with adequate PPE (reuse of PPE: 146 positive COVID-19 tests in 80,728 person days; adequate PPE: 592 positive tests in 332,901 person days).

Risks were highest for healthcare workers caring for patients with confirmed COVID-19 without adequate PPE, at almost six times higher than healthcare workers with adequate PPE who were not exposed to COVID-19 patients (inadequate PPE caring for COVID-19 patients: 83 positive tests in 11,675 person days; adequate PPE not exposed to COVID-19 patients: 186 positive tests in 227,654 person days).

Even with adequate PPE, however, the risk of testing positive for SARS-CoV-2 was almost 2.4 times greater for those caring for people with suspected COVID-19 and around 5 times greater for healthcare workers caring for people with a confirmed COVID-19 diagnosis, compared with those who were not exposed to COVID-19 patients (adequate PPE caring for suspected COVID-19 patients: 126 positive tests in 54,676 person days; adequate PPE caring for confirmed COVID-19: 280 positive tests in 50,571 person days; adequate PPE, no exposure to patients with COVID-19: 186 positive tests in 227,654 person days.)
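The crude incidence rates implied by the person-day counts above are easy to recompute. A minimal Python sketch follows; note that the paper's quoted risk ratios come from multivariable-adjusted models, so these crude ratios are close to, but not identical with, the published estimates.

```python
# Positive tests and person-days as reported in the paragraphs above.
groups = {
    "adequate PPE, no COVID-19 contact": (186, 227_654),
    "adequate PPE, suspected COVID-19":  (126, 54_676),
    "adequate PPE, confirmed COVID-19":  (280, 50_571),
    "inadequate PPE (overall)":          (157, 60_916),
    "reused PPE (overall)":              (146, 80_728),
}

base_cases, base_pd = groups["adequate PPE, no COVID-19 contact"]
base_rate = base_cases / base_pd    # baseline incidence per person-day

for name, (cases, person_days) in groups.items():
    rate = cases / person_days
    print(f"{name:36s} {rate * 100_000:6.1f} per 100k person-days, "
          f"crude ratio {rate / base_rate:.2f}")
```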

However, the data was collected at a time of global shortages of PPE, so it is not clear if the risks would be the same for people working on the front lines of the COVID-19 medical response today. The researchers also note that they did not ask about the type of PPE used and the study was carried out at a time when disinfection protocols for PPE were not yet established. They caution that their findings relating to PPE reuse should not be extended to reflect the risk of PPE reuse after disinfection protocols, which have now been implemented in various settings.

In post-hoc analyses, healthcare workers from Black, Asian and minority ethnic (BAME) backgrounds had greater risk of testing positive for COVID-19 compared with non-Hispanic white healthcare workers. After accounting for pre-existing medical conditions, healthcare workers from BAME backgrounds were almost five times more likely to report a positive COVID-19 result than somebody from the general community (98 positive COVID-19 tests in 72,556 person days for BAME healthcare workers vs 1,498 positive tests in 23,941,092 person days for the general community). In comparison, non-Hispanic white healthcare workers were around 3.5 times more likely to report a positive COVID-19 test (726 positive tests in 935,860 person days).

There were also differences in the reported adequacy of PPE according to race. Around one in three BAME healthcare workers reported reuse of PPE or use of inadequate PPE (36.7%), compared with around one in four non-Hispanic white care workers (27.7%).

The prevalence of positive COVID-19 tests was higher in US healthcare workers compared with the UK (US: 461 positive tests per 100,000 app users; UK: 227 positive cases per 100,000 app users). However, after accounting for differences in access to testing, the researchers found the risk of predicted COVID-19 based on symptoms reported through the app was double that of the general population for UK-based healthcare workers compared with 1.3 times greater for healthcare workers in the US. The researchers say this higher infection rate could be the result of differences in the availability of PPE between the two countries.

Dr Erica Warner, a co-author of the study, from Harvard Medical School and Massachusetts General Hospital, said: "Our findings highlight structural inequities in COVID risk. BAME healthcare workers were more likely to work in high-risk clinical settings, with known or suspected COVID patients, and had less access to adequate PPE. Ensuring access to, and appropriate use of, high-quality PPE across care settings would help mitigate these disparities." [1]

The study did not ask people to give their specific job type, their level of experience or the frequency of their exposure to patients with COVID-19. This means that those identifying as frontline healthcare workers might include roles with limited but some contact with patients, such as receptionists, hospital porters or cleaners, and this might affect the findings. In addition, they were not asked about the type of PPE used and whether or not they received appropriate training. The researchers say they are planning additional questionnaires to probe these topics in more detail.

Writing in a linked Comment article, Prof Linda McCauley from Emory University, USA, who was not involved in the study, said: "If we are ever to outpace COVID-19, there must be accountability at every level, from the community to top government officials. By combining a centralised mechanism for supply chain oversight, with universal masking and data transparency at local levels, it is possible to afford health-care workers the protection they deserve."

Credit: 
The Lancet

HudsonAlpha scientists help identify important parts of the human genome

July 31, 2020 (Huntsville, Ala.) - Although the Human Genome Project sequenced a nearly complete human genome almost 20 years ago, only about two percent of the genome has been extensively studied and identified as protein-coding genes. Proteins form the basis of living tissues and play a central role in biological processes.

Through a decades-long, worldwide collaborative effort with the Encyclopedia of DNA Elements (ENCODE) Project, researchers at the HudsonAlpha Institute for Biotechnology, along with collaborators at other institutions, have been trying to elucidate the importance of the remaining 98 percent of the genome. The goal of the Project is to understand how the human genome functions and to identify its important parts.

ENCODE Project's third phase offers new insights into the organization and regulation of our genes and genome

Results from the latest phase of the project (ENCODE 3) include multiple contributions from the Myers lab at HudsonAlpha, and the laboratories of Barbara Wold, PhD (CalTech); Ross Hardison, PhD (Penn State); Ali Mortazavi, PhD (UC Irvine); and Eric Mendenhall, PhD (University of Alabama in Huntsville and HudsonAlpha); and were published on July 30 as a 9-manuscript compendium in Nature, accompanied by 21 additional in-depth studies published in other major journals.

ENCODE is funded by the National Human Genome Research Institute, part of the National Institutes of Health. A detailed press release describing the flagship Nature paper can be found on both the NHGRI and Nature websites.

During the third phase, ENCODE consortium researchers drew closer to their goal of developing a comprehensive map of the functional elements of human and mouse genomes by adding to the ENCODE database millions of candidate DNA switches that regulate when and where genes are turned on. ENCODE 3 also offers a new registry that assigns some of the DNA switches to useful biological categories.

HudsonAlpha researchers perform the largest study of transcription factors expressed at physiological levels to date

As a part of their contribution to ENCODE 3, project manager Mark Mackiewicz, PhD, oversaw HudsonAlpha's Myers and Mendenhall labs in a study of transcription factor biology, using genome-wide experiments to expand knowledge of DNA elements in both the human and mouse genomes. Transcription factors are the largest class of proteins in the human genome; they are involved in making the RNA copies of genes encoded in DNA, which are then used to make the proteins that are so important in cells.

All of our cells (barring a few small exceptions) have the same two full copies of our genome. However, in order for the more than 200 different cell types in our body to perform different functions, not all of the genes can be active (or expressed) in every cell type. Gene expression is regulated by the binding of transcription factor proteins to short stretches of DNA, referred to as regulatory elements, that serve as on/off switches for gene expression. The ability of genes to be turned on and off allows for unique combinations of genes to be expressed in each different cell type.

The HudsonAlpha researchers, along with collaborators Mendenhall, Wold, Mortazavi and Hardison, studied about a quarter of all of the expressed transcription factors in a human liver cancer cell line, making it the largest study of transcription factors expressed at physiological, or normal, levels to date.

"Understanding the genomic targets of transcription factors is vitally important to understand many aspects of biology, including gene regulation, development, and to help identify the biological mechanisms of many diseases and disorders," said Chris Partridge, PhD, senior scientist at HudsonAlpha and co-first author with PhD student Surya Chhetri on the study.

By analyzing such a large group of transcription factors, the researchers were able to identify novel associations between transcription factors, elaborate on their spatial interactions on DNA, and distinguish between those that interact with promoters and those that interact with enhancers in the genome. All of these findings bring researchers one step closer to understanding how the human genome functions, both in normal biology and disease states.

ENCODE Project highlights the importance of collaborative research, a hallmark of HudsonAlpha's mission

In addition to the ambitious efforts to understand the human genome, a key hallmark of the Project is the complete and rapid open-access availability of data generated by members of the ENCODE Consortium, which has led to more than 2,000 publications from non-ENCODE researchers who used data generated by the ENCODE Project. Rick Myers, PhD, President, Science Director, and M. A. Loya Chair in Genomics at HudsonAlpha, whose lab has been a member of the ENCODE Consortium since its inception in 2003, says that he is proud of the long-term and wide-reaching collaborative nature of the Project.

"One thing we learned working on the Human Genome Project is that huge endeavors like the ENCODE Project work much more efficiently when research groups coordinate their efforts, particularly because ENCODE's charge is to generate a resource of data, materials and results that are meant to be used by the entire research and biomedical community," said Myers. "Another thing we recognized early on is that making the data freely available to everyone on a weekly basis, prior to publication and with no strings attached, allows researchers everywhere to make advances in their research much faster than would otherwise be possible."

Credit: 
HudsonAlpha Institute for Biotechnology

Differences between discs of active and non-active galaxies detected for the first time

image: Image illustrating the comparison between an active spiral galaxy (orange box) and its non-active twin (blue box)

Image: 
Gabriel Pérez Díaz, SMM (IAC).

This study, just accepted for publication in Astronomy & Astrophysics Letters, provides the first evidence for large-scale dynamical differences between active and non-active galaxies in the local universe. The participating astronomers are from the Instituto de Astrofísica de Canarias (IAC) and the University of La Laguna (ULL), as well as the National Autonomous University of Mexico (UNAM), the Complutense University of Madrid (UCM) and the Instituto de Astrofísica de Andalucia (IAA).

There is now evidence that the supermassive black holes at the centres of the majority of galaxies have a basic influence on their evolution. In some of them, the black hole is ingesting the material surrounding it at a very high rate, emitting a large quantity of energy. In those cases we say that the galaxy has an active galactic nucleus (AGN). The material which feeds the AGN must initially be quite distant from the nucleus, in the disc of the galaxy, rotating around its centre. This gas must, one way or another, have been "braked" in order to fall into the central zone, a process known as loss of angular momentum.

"Studying the mechanisms which control the relation between the active nucleus and the rest of the galaxy -explains Ignacio del Moral Castro, a doctoral student in the IAC and the University of La Laguna (ULL) and first author of the article- is necessary to understand how these objects form and evolve, and to be able to throw light on this question we need to compare active and non-active galaxies. With this purpose, the main idea of my doctoral thesis is centred on the study and comparison of galaxies which are almost twin, but with the difference being nuclear activity".

The work has consisted of comparing the dynamics of the galactic discs of various active/non-active pairs. The researchers used data from the CALIFA survey (Calar Alto Legacy Integral Field Area). This contains spectroscopic data over complete 2D fields for more than 600 galaxies, taken at the Calar Alto Observatory in Almería, which allow observations of virtually the whole of each galaxy, so that its global characteristics can be studied.

Novel methodology

Previously, in the majority of studies the procedure used was the identification of a sample of active galaxies within a large survey, which were then compared to the rest of the galaxies in the survey having similar properties which do not show nuclear activity. However, this time, the researchers used a novel method: they performed one-to-one comparisons. Firstly, they identified active spiral galaxies in the CALIFA sample, and for each of them they looked for a non-active galaxy which had equivalent global properties, i.e. with the same mass, brightness, orientation and so on, and very similar in appearance.

Using this method the team put forward two scenarios to explain the dynamical differences between active and non-active galaxies. In the first, the explanation would be that it is the trace of the angular momentum transfer between the gas which has fallen into the centre and the material which remains in the disc. The second attributes the difference to the infall of gas from outside, via the capture of small nearby satellite galaxies, in which case, this capture should occur more frequently in the active galaxies. Both scenarios are compatible with this result and they are not mutually exclusive.

"The result surprised us; we really didn't expect to find this type of differences on a large scale, give that the duration of the active phase is very short in comparison with the lifetime of a galaxy, and with the time needed to produce morphological and dynamical changes", says Begoña García Lorenzo, and IAC researcher, and a coauthor of the article.

"Up to now we thought that all galaxies go through active phases during their lifetimes, but this result could mean that this is not the case, which would imply a major change to current models", adds Cristina Ramos Almeida, also an IAC researcher and coauthor of the article.

Credit: 
Instituto de Astrofísica de Canarias (IAC)

NASA sun data helps new model predict big solar flares

image: An X-class solar flare flashes on the edge of the Sun on March 7, 2012. This image was captured by NASA's Solar Dynamics Observatory and shows a type of light that is invisible to human eyes, called extreme ultraviolet light.

Image: 
NASA's Goddard Space Flight Center/SDO

Using data from NASA's Solar Dynamics Observatory, or SDO, scientists have developed a new model that successfully predicted seven of the Sun's biggest flares from the last solar cycle, out of a set of nine. With more development, the model could be used to one day inform forecasts of these intense bursts of solar radiation.

As it progresses through its natural 11-year cycle, the Sun transitions from periods of high to low activity, and back to high again. The scientists focused on X-class flares, the most powerful kind of these solar fireworks. Compared to smaller flares, big flares like these are relatively infrequent; in the last solar cycle, there were around 50. But they can have big impacts, from disrupting radio communications and power grid operations, to - at their most severe - endangering astronauts in the path of harsh solar radiation. Scientists who work on modeling flares hope that one day their efforts can help mitigate these effects.

Led by Kanya Kusano, the director of the Institute for Space-Earth Environmental Research at Japan's Nagoya University, a team of scientists built their model on a kind of magnetic map: SDO's observations of magnetic fields on the Sun's surface. Their results were published in Science on July 30, 2020.

It's well understood that flares erupt from hot spots of magnetic activity on the solar surface, called active regions. (In visible light, they appear as sunspots, dark blotches that freckle the Sun.) The new model works by identifying key characteristics in an active region, characteristics the scientists theorized are necessary for setting off a massive flare.

The first is the initial trigger. Solar flares, especially X-class ones, unleash huge amounts of energy. Before an eruption, that energy is contained in twisting magnetic field lines that form unstable arches over the active region. According to the scientists, highly twisted rope-like lines are a precursor for the Sun's biggest flares. With enough twisting, two neighboring arches can combine into one big, double-humped arch. This is an example of what's known as magnetic reconnection, and the result is an unstable magnetic structure - a bit like a rounded "M" - that can trigger the release of a flood of energy, in the form of a flare.

Where the magnetic reconnection happens is important too, and one of the details the scientists built their model to calculate. Within an active region, there are boundaries where the magnetic field is positive on one side and negative on the other, just like a regular refrigerator magnet.

"It's similar to an avalanche," Kusano said. "Avalanches start with a small crack. If the crack is up high on a steep slope, a bigger crash is possible." In this case, the crack that starts the cascade is magnetic reconnection. When reconnection happens near the boundary, there's potential for a big flare. Far from the boundary, there's less available energy, and a budding flare can fizzle out - although, Kusano pointed out, the Sun could still unleash a swift cloud of solar material, called a coronal mass ejection.

Kusano and his team looked at the seven active regions from the last solar cycle that produced the strongest flares on the Earth-facing side of the Sun (they also focused on flares from the part of the Sun that is closest to Earth, where magnetic field observations are best). SDO's observations of the active regions helped them locate the right magnetic boundaries and calculate instabilities in the hot spots. In the end, their model predicted seven out of nine total flares, with three false positives. The two that the model didn't account for, Kusano explained, were exceptions to the rest: unlike the others, the active regions they exploded from were much larger and didn't produce a coronal mass ejection along with the flare.

"Predictions are a main goal of NASA's Living with a Star program and missions," said Dean Pesnell, the SDO principal investigator at NASA's Goddard Space Flight Center in Greenbelt, Maryland, who did not participate in the study. SDO was the first Living with a Star program mission. "Accurate precursors such as this that can anticipate significant solar flares show the progress we have made towards predicting these solar storms that can affect everyone."

While it takes a lot more work and validation to get models to the point where they can make forecasts that spacecraft or power grid operators can act upon, the scientists have identified conditions they think are necessary for a major flare. Kusano said he is excited to have a promising first result.

"I am glad our new model can contribute to the effort," he said.

Credit: 
NASA/Goddard Space Flight Center

Satellite survey shows California's sinking coastal hotspots

image: Coastal elevation in California. Coastal zones, which are defined to be those with elevations less than 10 m, are shown in red. Segments of the coast with elevations higher than 10 m are colored by a yellow gradient.

Image: 
Source: USGS NED.

A majority of the world's population lives on low-lying lands near the sea, some of which are predicted to be submerged by the end of the 21st century due to rising sea levels.

The most relevant quantity for assessing the impacts of sea-level change on these communities is relative sea-level rise - the change in elevation between the Earth's surface and the sea surface. For an observer standing on the coast, relative sea-level rise is the net change in sea level, which includes the rise and fall of the land beneath the observer's feet.
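As a simple worked example of that bookkeeping: relative sea-level rise is the sea-surface trend minus the vertical land motion, so subsidence (negative land motion) adds to it. The numbers below are illustrative only - the subsidence rate echoes the Bay Area figure quoted later in this article, while the sea-surface trend is a hypothetical value, not a result of the study.

```python
# Relative sea-level rise = sea-surface rise - vertical land motion.
sea_surface_rise_mm_yr = 3.3     # hypothetical regional sea-surface trend
land_motion_mm_yr = -5.9         # subsidence (negative = land moving down)

relative_rise_mm_yr = sea_surface_rise_mm_yr - land_motion_mm_yr
print(f"relative sea-level rise: {relative_rise_mm_yr:.1f} mm/yr")  # 9.2 mm/yr
```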

Now, using precise measurements from state-of-the-art satellite-based interferometric synthetic aperture radar (InSAR) that can detect the land surface rise and fall with millimeter accuracy, an Arizona State University research team has, for the first time, tracked the entire California coast's vertical land motion.

They've identified local hotspots of the sinking coast, in the cities of San Diego, Los Angeles, Santa Cruz and San Francisco, with a combined population of 4 to 8 million people exposed to rapid land subsidence, who will be at a higher flooding risk during the decades ahead of projected sea-level rise.

"We have ushered in a new era of coastal mapping at greater than 1,000 fold higher detail and resolution than ever before," said Manoochehr Shirzaei, who is the principal investigator of the NASA-funded project. "The unprecedented detail and submillimeter accuracy resolved in our vertical land motion dataset can transform the understanding of natural and anthropogenic changes in relative sea-level and associated hazards."

The results were published in this week's issue of Science Advances.

The research team included graduate student and lead author Em Blackwell, and faculty Manoochehr Shirzaei, Chandrakanta Ojha and Susanna Werth, all from the ASU School of Earth and Space Exploration (Werth has a dual appointment in the School of Geography and Urban Planning).

Em Blackwell had a keen interest in geology, and as Blackwell began graduate school, the applications of InSAR drew them to pursue this project. InSAR uses radar to measure the change in distance between the satellite and the ground surface, producing highly accurate maps of deformation of the Earth's surface at a resolution of tens of meters over spatial extents of hundreds of kilometers.

Land subsidence can occur due to natural and anthropogenic processes or a combination of them. The natural processes comprise tectonics, glacial isostatic adjustment, sediment loading, and soil compaction. The anthropogenic causes include groundwater extraction and oil and gas production.

As of 2005, approximately 40 million people were exposed to a 1 in 100-year coastal flooding hazard, and by 2070 this number will grow more than threefold. The value of property exposed to flooding will increase to about 9% of the projected global Gross Domestic Product, with the U.S., Japan, and the Netherlands being the countries with the most exposure. These exposure estimates often rely only on projections of global average sea level rise and do not account for vertical land motion.

The study measured the entire 1,350-kilometer-long coast of California from 2007 to 2018, compiling thousands of satellite images over time to produce a 35-million-pixel vertical land motion map at roughly 80 m resolution, capturing a wide range of coastal uplift and subsidence rates. Coastal communities' policymakers and the general public can freely download the data (link in supplemental data).

The four major metropolitan areas affected were San Francisco, Monterey Bay, Los Angeles, and San Diego.

"The vast majority of the San Francisco Bay perimeter is undergoing subsidence with rates reaching 5.9 mm/year," said Blackwell. "Notably, the San Francisco International Airport is subsiding with rates faster than 2.0 mm/year. The Monterey Bay Area, including the city of Santa Cruz, is rapidly sinking without any zones of uplift. Rates of subsidence for this area reach 8.7 mm/year. The Los Angeles area shows subsidence along small coastal zones, but most of the subsidence is occurring inland."

Areas of land uplift included the region north of the San Francisco Bay Area and Central California, both at 3 to 5 mm/year.

In the decades ahead, the coastal population is expected to grow to over 1 billion people by 2050, due to coastward migration. The future flood risk that these communities will face is mainly controlled by the rate of relative sea-level rise, namely, the combination of sea-level rise and vertical land motion. It is therefore vital to include land subsidence in the regional projections used to identify areas of potential flooding along the urbanized coast.

Beyond the study, the ASU research team is hopeful that others in the scientific community can build on their results to measure and identify coastal hazards more broadly in the U.S. and around the world.

Credit: 
Arizona State University

To improve students' mental health, Yale study finds, teach them to breathe

New Haven, Conn. -- When college students learn specific techniques for managing stress and anxiety, their wellbeing improves across a range of measures and leads to better mental health, a new Yale study finds.

The research team evaluated three classroom-based wellness training programs that incorporate breathing and emotional intelligence strategies, finding that two led to improvements in aspects of wellbeing. The most effective program led to improvements in six areas, including depression and social connectedness.

The researchers, who reported findings in the July 15 edition of Frontiers in Psychiatry, said such resiliency training programs could be a valuable tool for addressing the mental health crisis on university campuses.

"In addition to academic skills, we need to teach students how to live a balanced life," said Emma Seppälä, lead author and faculty director of the Women's Leadership Program at Yale School of Management. "Student mental health has been on the decline over the last 10 years, and with the pandemic and racial tensions, things have only gotten worse."

Researchers at the Yale Child Study Center and the Yale Center for Emotional Intelligence (YCEI) conducted the study, which tested three skill-building training programs on 135 undergraduate subjects for eight weeks (30 hours total), and measured results against those of a non-intervention control group.

They found that a training program called SKY Campus Happiness, developed by the Art of Living Foundation, which relies on a breathing technique called SKY Breath Meditation, yoga postures, social connection, and service activities, was most beneficial. Following the SKY sessions, students reported improvements in six areas of wellbeing: depression, stress, mental health, mindfulness, positive affect, and social connectedness.

A second program called Foundations of Emotional Intelligence, developed by the YCEI, resulted in one improvement: greater mindfulness -- the ability for students to be present and enjoy the moment.

A third program called Mindfulness-Based Stress Reduction, which relies heavily on mindfulness techniques, resulted in no reported improvements.

In all, 135 Yale undergraduate students participated in the study.

Across college campuses, there has been a significant rise in student depression, anxiety, and demand for mental health services. From 2009 to 2014, the number of students seeking treatment from campus counseling centers rose by 30%, though enrollment increased by just 6% on average. Fifty-seven percent of counseling center directors indicated that their resources are insufficient to meet students' needs.

The researchers say resiliency training tools can directly address the overburdening of campus counseling centers. "Students learn tools they can use for the rest of their lives to continue to improve and maintain their mental health," said co-first author Christina Bradley '16 B.S., currently a Ph.D. student at the University of Michigan.

Researchers administered the training sessions in person, but the courses can also be taken remotely.

"Continually adding staff to counseling and psychiatric services to meet demand is not financially sustainable -- and universities are realizing this," Seppälä said. "Evidence-based resiliency programs can help students help themselves."

Davornne Lindo '22 B.A., a member of the Yale track team who participated in the SKY Campus Happiness program, said practicing breathing techniques helped her to manage stress from both academics and athletics. "Now that I have these techniques to help me, I would say that my mentality is a lot healthier," Lindo said. "I can devote time to studying and not melting down. Races have gone better. Times are dropping."

Another participant in the SKY program, Anna Wilkinson '22 B.A., said she was not familiar with the positive benefits of breathing exercises before the training, but now uses the technique regularly. "I didn't realize how much of it was physiology, how you control the things inside you with breathing," Wilkinson said. "I come out of breathing and meditation as a happier, more balanced person, which is something I did not expect at all."

Credit: 
Yale University

Mandatory country-wide BCG vaccination found to correlate with slower growth rates of COVID-19 cases

Scientists have found that countries with mandatory Bacillus Calmette-Guérin (BCG) vaccination until at least the year 2000 tended to exhibit slower infection and death rates during the first 30 days of the outbreak of COVID-19 in their country. By applying a statistical model based on their findings, the researchers further estimated that only 468 people would likely have died from COVID-19 in the U.S. as of March 29, 2020 - which is 19% of the actual figure of 2,467 deaths by that date - if the U.S. had instituted mandatory BCG vaccination several decades ago.

They established this correlation via statistical analyses that controlled for several potential biases, including differences in the availability of tests, how cases were reported, and the timing of outbreak onset across countries. Their findings suggest that national policies for universal BCG vaccination can be effective in the fight against COVID-19 - an association that merits clinical investigation, the authors say.

Available evidence demonstrates that BCG vaccination, typically given at birth or during childhood to prevent tuberculosis, can also help strengthen immunity against various other infectious diseases - perhaps including COVID-19. However, investigating a potential relationship between universal BCG vaccination and the spread of the SARS-CoV-2 virus requires accounting for the effects of several biases and variables across countries, which previous studies have not done, the authors say. For example, some past efforts focused on the total number of infections and deaths, which have varied considerably among countries based on when the disease took hold.

Martha Berg and colleagues instead focused on changes in the growth rates of COVID-19 cases and deaths, while controlling for variables including diagnostic test availability, median age, per capita GDP, population size and density, net migration rate, and various cultural differences such as individualism. They analyzed the day-by-day rate of increase of confirmed cases in 135 countries and deaths in 134 countries over the first 30-day period of each country's outbreak. Mandatory BCG vaccination correlated with a flattening of the curve in the spread of COVID-19, the analysis showed. However, the authors caution that their results do not portray BCG as a "magic bullet": they found substantial variation in COVID-19 growth rates even among BCG-mandated countries, suggesting that additional societal variables likely modulate the effect of mandatory BCG vaccination on the spread of COVID-19. The authors note that this variation must be addressed in future work.
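The release describes the analysis only at a high level. As a hedged illustration of what estimating a "day-by-day rate of increase" can look like, the sketch below fits an exponential growth rate to each country's first 30 days of case counts by log-linear regression; the data and country labels are synthetic, and the actual study additionally controlled for the covariates listed above.

```python
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(30)

def growth_rate(cases):
    """Slope of log(cases) over time = per-day exponential growth rate."""
    slope, _intercept = np.polyfit(days, np.log(cases), 1)
    return slope

# Two synthetic countries: slower growth (0.08/day) vs faster (0.15/day),
# with multiplicative reporting noise.
bcg_mandate    = 100 * np.exp(0.08 * days) * rng.lognormal(0, 0.05, 30)
no_bcg_mandate = 100 * np.exp(0.15 * days) * rng.lognormal(0, 0.05, 30)

print(f"BCG-mandate country:  {growth_rate(bcg_mandate):.3f} /day")
print(f"no-mandate country:   {growth_rate(no_bcg_mandate):.3f} /day")
```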

Credit: 
American Association for the Advancement of Science (AAAS)

DNA metabarcoding detects ecological stress within freshwater species

image: Sample sites at the rivers A) Emscher, B) Ennepe and C) Sieg.

Image: 
Vera Zizka

Metabarcoding allows scientists to identify species from DNA extracted from the environment (known as environmental DNA, or eDNA) - for example, from river water - or, as in this study by a team from the University of Duisburg-Essen (Essen, Germany) within the German Barcode of Life project (GBOL II): Vera Zizka, Dr Martina Weiss and Prof Florian Leese, from individuals in bulk samples. In this way, they can detect which species inhabit a particular habitat.

However, while the method is already known to be of great use in obtaining an approximate picture of the local fauna, and hence in facilitating conservation prioritisation, few studies have looked into its applicability below the species level: that is, to how the populations of a particular species fare in the environment of interest, also referred to as intraspecific diversity. Yet the latter could actually be a lot more efficient for ecosystem monitoring and, consequently, for mitigating biodiversity loss.

The potential of the method is confirmed in a new study published in the peer-reviewed scholarly journal Metabarcoding & Metagenomics. The researchers surveyed populations of macroinvertebrate species (macrozoobenthos) in three German rivers - the Emscher, the Ennepe and the Sieg - each subject to a different level of ecological disturbance. They looked specifically at species reported at all of the survey sites, studying the number of different haplotypes (a set of DNA variations usually inherited together from the maternal parent) in each sample. The researchers point out that macrozoobenthos play a key role in freshwater ecosystem functionality and include a wide range of taxonomic groups, often with narrow and specific demands with respect to habitat conditions.
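At its core, this part of the analysis reduces to counting distinct sequence variants (haplotypes) per species at each site. The Python sketch below shows that bookkeeping on toy records; the field names and sequences are illustrative assumptions, not the study's actual pipeline.

```python
from collections import defaultdict

# Toy metabarcoding records: (site, species, denoised sequence variant).
records = [
    ("Sieg",    "Baetis rhodani",  "ACGTTACA"),
    ("Sieg",    "Baetis rhodani",  "ACGTTACG"),
    ("Ennepe",  "Baetis rhodani",  "ACGTTACA"),
    ("Emscher", "Tubifex tubifex", "TTGACGTC"),
    ("Emscher", "Tubifex tubifex", "TTGACGTT"),
]

haplotypes = defaultdict(set)
for site, species, seq in records:
    haplotypes[(site, species)].add(seq)

for (site, species), seqs in sorted(haplotypes.items()):
    print(f"{site:8s} {species:16s} {len(seqs)} haplotype(s)")
```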

"As the most basal level of biodiversity, genetic diversity within species is typically the first to decrease, and the last to regenerate, after stressor's impact. It consequently provides a proxy for environmental impacts on communities long before, or even if never visible on species diversity level," explain the scientists.

Emscher is an urban stream in the Ruhr Metropolitan Area that has been used as an open sewage channel for the past hundred years, and is considered to be a very disturbed environment. Ennepe - regarded as moderately stressed - runs through both rural and urban sites, including ones with sewage treatment plant inflow. Meanwhile, Sieg is considered as a stable, near-natural river system with a good ecological and chemical status.

As a result, despite their original assumption that the Sieg would support the most prominent diversity within populations of species sensitive to organic pollution, such as mayflies, stoneflies and caddisflies, the scientists found no significant difference from the moderately stressed river Ennepe; this was also true for overall biodiversity. On the other hand, in the heavily disturbed Emscher the team discovered higher intraspecific diversity for species resilient to ecological disturbance, such as small worms and specialised crustaceans. The latter phenomenon may be explained by the low competition pressure these species face, their ability to use organic compounds as resources and, consequently, increased population growth.

"[T]his pioneer study shows that the extraction of intraspecific genetic variation, so-called 'haplotypes' from DNA metabarcoding datasets is a promising source of information to assess intraspecific diversity changes in response to environmental impacts for a whole metacommunity simultaneously," conclude the scientists.

However, the researchers also note limitations of their study, including the exclusion of specialist species that only occurred at single sites. They encourage future studies to carefully control for the number of specimens per species in order to quantify genetic diversity change specifically.

Credit: 
Pensoft Publishers

A rebranding of 'freedom'?

According to recent Gallup polls, socialism is now more popular than capitalism among Democrats and young people, and support for "some form of socialism" among all Americans is at 43% (compared to 25% in 1942). Policies that went unmentioned or were declared out-of-bounds in elections four years ago--a federal jobs guarantee, single-payer health care, free college, massive tax hikes on the rich, and the Green New Deal--are commonplace in Democrats' 2020 campaigns.

However, in "Freedom Now," a new paper published by Alex Gourevitch and Corey Robin in Polity's May Symposium on the Challenges Facing Democrats, there is still no clear, unifying idea behind this political shift. "One has not heard anything on the order of Franklin Roosevelt's Commonwealth Club speech or Reagan's story of the free market," the authors write. If these policies are to have a chance of breaking through, they will need a grounding principle, or ideology name the enemy, organize the policies, orient the actions, state the destination, and provide the fuel for the movement.

Gourevitch and Robin propose that that idea is freedom. "While the left once understood freedom as emancipation from the economy, the right spent the twentieth century neutralizing and appropriating the idea of freedom by reinventing the economy as the true site of freedom."

To reclaim freedom as a value of the left, the authors believe the first place to start is the unfreedom of the workplace. "In nearly every capitalist country, one of the leading elements of the legal definition of employment is subordination to the will of a superior." That can mean that employees must urinate--or are forbidden to urinate. It can mean that they should be sexually appealing--or must not be sexually appealing. They may be told how to speak, what to say, whom to say it to, where to be, where to go, how to dress, when to eat, and what to read--all in the name of the job. "But isn't the worker free to leave a bad boss? Formally speaking, yes," the authors write. "But even if they are free to exit this workplace, they are not free to exit the workplace."

Reclaiming "freedom" names the problem that an increasing number of people face today: systemic unfreedom in the neoliberal economy. By confronting that unfreedom, the left can do more than identify, in a coherent and cohesive way, the myriad problems that individuals are currently facing. The authors find the seeds of that idea in Bernie Sanders's rhetoric about being "organizer in chief," and in proposals from the Warren and Sanders camps that would strengthen workers' right to strike and organize.

However, they note, "A real politics of freedom posits a belief in the capacity of people to revise the terms of their existence and a commitment to the institutions that make these collective revisions possible." In other words, freedom is best realized not through tending our own gardens but through disciplined commitment and collective struggle, in activities like mass strikes and party politics. "These democratic struggles are not simply expressions and experiences of freedom, though they are that. They are also the means to the freedoms people deserve."

Credit: 
University of Chicago Press Journals

The need for progressive national narratives

The recent rise of authoritarian nationalist movements has reinforced the tendency of many on the left, and some on the right, to reject all forms of nationalism, writes Rogers M. Smith in "Toward Progressive Narratives of American Identity," published in Polity's May Symposium on the Challenges Facing Democrats.

Nationalism, Smith argues, is seen by some as prone to the repression of minorities and other vulnerable groups within states, and as hostile to the concerns of all outsiders, as well as to free movement of goods and people. "Liberals and progressives in America and elsewhere have failed to counter with politically resonant narratives of national identity that champion greater and more egalitarian inclusiveness."

Some argue chauvinistic forms of nationalism must be opposed by developing better forms, not by rejecting nationalism outright. "In light of the recent surge of virulent nationalisms, better rather than worse national stories are part of what liberal parties like the Democrats need now," Smith says.

The paper offers three criteria for devising such narratives: resonance, respectfulness, and reticulation, which Smith defines as recognizing the reality that public policies in every society treat persons and citizens in ways that are enormously variegated. "When a society's variegated rights and duties form a logically explicable and practically useful (and always evolving) network of statuses, they are suitably 'reticulated.'" While "separate but equal" policies are to be avoided, it is necessary to accept that in politics we must seek to create not absolute uniformity, but appropriately egalitarian reticulated civic statuses.

"We must have stories of peoplehood that not only permit but valorize efforts to protect and expand opportunities for all by recognizing and accommodating, not ignoring or suppressing, many human differences," Smith says, to enact policies that resonate desirably with the different values and identities of the multiple groups in all modern societies; that display respect for as many of those values and identities as possible; that militate against harsh treatment of ethnocultural minorities and outsiders, and against reinforcement of the advantages of dominant groups.

The paper also contends that America's democratic traditions, its constitutional traditions aimed at achieving a more perfect union without effacing legitimate diversity, and its quest to realize the goals of the Declaration of Independence over time all provide rich resources for stories of American peoplehood that can meet these criteria and, perhaps, defeat narrower nationalist visions. Smith says it is hard to see Trump's America First agenda as genuinely respectful toward all Americans, much less all persons. "Yet the political potency of Trump's nationalism cannot be gainsaid." Smith suggests it is wise to explore whether there are better stories of American peoplehood that might check these features of Trump's vision, while also responding to legitimate concerns to which he has spoken powerfully. "But rather than insisting one narrative is 'the' American story, the quest must be to identify a variety of inclusive, egalitarian stories of American peoplehood that can serve to build progressive coalitions among those with overlapping yet distinguishable values and identities."

Smith posits perhaps the best story of American peoplehood today is one first advanced by anti-slavery constitutionalists, including Lysander Spooner and Frederick Douglass, and then made prominent in modified form by Abraham Lincoln and the new Republican Party of the 1850s. "All these figures presented the story of America as a collective historical endeavor to fulfill the principles of the Declaration of Independence--to secure basic rights, including the rights to life, liberty, and the pursuit of happiness, for, in Lincoln's words, 'all people, of all colors, everywhere.'"

Telling the story of American peoplehood as a quest to realize more fully the goals and values defined in the Declaration of Independence is the best way to elaborate a resonant, respectful, and suitably reticulated conception of American nationality today, Smith says. It defines a sense of purpose that is more demanding, but also more elevating, than "America First." "Toxic forms of nationalism represent the world's worst political poisons. They must be countered by nationalist antitoxins that can serve as balms for the festering wounds visible in all too many bodies politic today."

Credit: 
University of Chicago Press Journals

To distinguish contexts, animals think probabilistically, study suggests

image: A maze in the Wilson lab at MIT. A rodent must infer that this is a different context than, say, a maze that used different shape cues or one that had an additional arm. A new study suggests they weigh probabilities in doing so.

Image: 
Image by Peter Goldberg

Among the many things rodents have taught neuroscientists is that in a region called the hippocampus, the brain creates a new map for every unique spatial context - for instance, a different room or maze. But scientists have so far struggled to learn how animals decide when a context is novel enough to merit creating, or at least revising, these mental maps. In a study in eLife, MIT and Harvard researchers propose a new understanding: The process of "remapping" can be mathematically modeled as a feat of probabilistic reasoning by the rodents.

The approach offers scientists a new way to interpret many experiments that depend on measuring remapping to investigate learning and memory. Remapping is integral to that pursuit, because animals (and people) associate learning closely with context, and hippocampal maps indicate which context an animal believes itself to be in.

"People have previously asked 'What changes in the environment cause the hippocampus to create a new map?' but there haven't been any clear answers," said lead author Honi Sanders. "It depends on all sorts of factors, which means that how the animals define context has been shrouded in mystery."

Sanders is a postdoc in the lab of co-author Matthew Wilson, Sherman Fairchild Professor in The Picower Institute for Learning and Memory and the departments of Biology and Brain and Cognitive Sciences at MIT. He is also a member of the Center for Brains, Minds and Machines. The pair collaborated with Samuel Gershman, a professor of psychology at Harvard, on the study.

Fundamentally, a problem that has frequently led labs to report conflicting, confusing, or surprising remapping results is that scientists cannot simply assure their rats that they have moved from experimental Context A to Context B, or that they are still in Context A, even if some ambient condition, like temperature or odor, has inadvertently changed. It is up to the rat to explore and infer whether conditions such as the maze shape, smell, lighting, the position of obstacles and rewards, or the task it must perform have changed enough to trigger a full or partial remapping.

So rather than trying to understand remapping measurements based on what the experimental design is supposed to induce, Sanders, Wilson and Gershman argue that scientists should predict remapping by mathematically accounting for the rat's reasoning using Bayesian statistics, which quantify the process of starting with an uncertain assumption and then updating it as new information emerges.

"You never experience exactly the same situation twice. The second time is always slightly different," Sanders said. "You need to answer the question: 'Is this difference just the result of normal variation in this context or is this difference actually a different context?' The first time you experience the difference you can't be sure, but after you've experienced the context many times and get a sense of what variation is normal and what variation is not, you can pick up immediately when something is out of line."

The trio call their approach "hidden state inference" because to the animal, the possible change of context is a hidden state that must be inferred.

In the study the authors describe several cases in which hidden state inference can help explain the remapping, or the lack of it, observed in prior studies.

For instance, in many studies it has been difficult to predict how changing some of the cues that a rodent navigates by in a maze (e.g. a light or a buzzer) will influence whether it makes a completely new map or partially remaps the current one, and by how much. Mostly, the data have shown there isn't an obvious one-to-one relationship between cue change and remapping. But the new model predicts how, as more cues change, a rodent can transition from being uncertain about whether an environment is novel (and therefore partially remapping) to becoming sure enough of that to fully remap, as in the sketch below.
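To make that logic concrete, here is a toy numerical sketch in Python. It is not the authors' actual model: the cue values, noise level, and prior probability of a new context are invented for illustration. A familiar context predicts certain cue values; as more of the observed cues deviate from those predictions, the posterior probability of being in a new context climbs from near zero, through intermediate values consistent with partial remapping, toward certainty and a full remap.

import numpy as np

def posterior_new_context(cues, familiar_mean, cue_sd=0.2, p_new=0.05):
    # Likelihood of the observed cues under the familiar context:
    # independent Gaussians around the learned cue values.
    resid = (cues - familiar_mean) / cue_sd
    log_lik = -0.5 * np.sum(resid**2) - len(cues) * np.log(cue_sd * np.sqrt(2 * np.pi))
    lik_familiar = np.exp(log_lik)
    # Under the "new context" hypothesis, cues are assumed uniform on [0, 1].
    lik_new = 1.0
    return p_new * lik_new / (p_new * lik_new + (1 - p_new) * lik_familiar)

familiar = np.array([0.5, 0.5, 0.5, 0.5])   # learned values of four cues
for n_changed in range(5):
    cues = familiar.copy()
    cues[:n_changed] = 0.9                  # substantially change n cues
    p = posterior_new_context(cues, familiar)
    print(f"{n_changed} cue(s) changed -> P(new context) = {p:.3f}")

With these made-up numbers, the posterior rises from about 0.003 with no cues changed to roughly 0.9 with all four changed, tracing the gradual-to-certain transition the model describes.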

In another case, the model offers a new prediction to resolve a remapping ambiguity that has arisen when scientists have incrementally "morphed" the shape of rodent enclosures. Multiple labs, for instance, found different results when they familiarized rats with square and round environments and then tried to measure how and whether they remap when placed in intermediate shapes, such as an octagon. Some labs saw complete remapping while others observed only partial remapping. The new model predicts how both could be true: rats exposed to the intermediate environment after longer training would be more likely to fully remap than those exposed to the intermediate shape earlier in training, because with more experience they would be more sure of their original environments and therefore more certain that the intermediate one was a real change.
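The same toy framework can illustrate this training-duration effect. In the sketch below (again with invented numbers, not the authors' model), the animal's estimate of the normal variation within a context starts broad and tightens as exposures accumulate, so the same intermediate "octagon" observation becomes more surprising, and more likely to trigger a full remap, after longer training.

import numpy as np

def p_new_after_training(obs, mean, n_train, true_sd=0.1,
                         prior_sd=0.5, prior_strength=2, p_new=0.05):
    # The estimate of normal variation starts broad (prior_sd) and shrinks
    # toward the true trial-to-trial variation as exposures accumulate.
    var_hat = (prior_strength * prior_sd**2 + n_train * true_sd**2) \
              / (prior_strength + n_train)
    sd_hat = np.sqrt(var_hat)
    # Gaussian likelihood of the observation under the familiar context.
    lik_familiar = np.exp(-0.5 * ((obs - mean) / sd_hat)**2) \
                   / (sd_hat * np.sqrt(2 * np.pi))
    return p_new / (p_new + (1 - p_new) * lik_familiar)

square, octagon = 0.0, 0.5   # a single "shape" cue: 0 = square, 1 = circle
for n in (2, 10, 50):
    p = p_new_after_training(octagon, square, n_train=n)
    print(f"{n:3d} training exposures -> P(new | octagon) = {p:.3f}")

Here the posterior probability that the octagon is a genuinely new context grows from about 0.11 after two exposures to about 0.92 after fifty, mirroring the prediction that longer-trained animals remap more completely.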

The math of the model even includes a variable that can account for differences between individual animals. Sanders is looking at whether rethinking old results in this way could allow researchers to understand why different rodents respond so variably to similar experiments.

Ultimately, Sanders said, he hopes the study will help fellow remapping researchers adopt a new way of thinking about surprising results - by considering the challenge their experiments pose to their subjects.

"Animals are not given direct access to context identities, but have to infer them," he said. "Probabilistic approaches capture the way that uncertainty plays a role when inference occurs. If we correctly characterize the problem the animal is facing, we can make sense of differing results in different situations because the differences should stem from a common cause: the way that hidden state inference works."

Credit: 
Picower Institute at MIT