Study highlights sex-specific variability in mouse features

Scientists have shown that sex-specific differences in variability depend on individual physical and physiological features in mice, debunking competing theories that either males or females are more variable.

The study, originally posted on bioRxiv and published today in eLife, has important implications for both medical and evolutionary research. It suggests that both males and females should be included in studies, but that statistical adjustments are necessary to account for variation in individual traits among both sexes.

The theory that females are more variable than males because of their reproductive cycles has led to them being excluded from many medical studies, making it difficult to know whether the results can also apply to females. To correct this, many organisations now require females to be included in studies, but this has raised questions about how to account for sex-specific differences in these studies. In evolutionary science, by contrast, scientists commonly believe that males are more variable because of competition for mates.

"Differences in variability have far-reaching consequences in eco-evolutionary predictions, which may include sex-dependent responses to climate change, as well as statistical considerations that are important for the robust design of biomedical studies," says lead author Susanne Zajitschek, who completed the study during her postdoctoral studies at the University of New South Wales in Sydney, Australia, and is now a Senior Lecturer at Liverpool John Moores University, UK.

To test the competing ideas about variability between males and females, Zajitschek and her colleagues analysed variability in 218 traits using a database containing information on more than 26,900 mice. They found that the data did not support either of the competing theories about whether males or females are more variable. Instead, the data showed that sex-specific differences varied by individual features. For example, male mice varied more in size, while female mice showed greater variation in their immune systems. "These findings highlight how important it is to pay attention to sex-specific variability," Zajitschek says.
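
A minimal sketch of this kind of trait-by-trait comparison, using synthetic data and the coefficient of variation as the variability measure (the numbers below are made up for illustration; the study's actual meta-analysis is more involved):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for two traits (means and spreads are invented;
# the study analysed 218 real traits in more than 26,900 mice).
traits = {
    "body_mass": {
        "male": rng.normal(30.0, 4.5, 500),    # larger relative spread
        "female": rng.normal(25.0, 2.5, 500),
    },
    "immune_cells": {
        "male": rng.normal(10.0, 1.0, 500),
        "female": rng.normal(10.0, 1.8, 500),  # larger relative spread
    },
}

def cv(x):
    """Coefficient of variation: sample std / mean (relative variability)."""
    return np.std(x, ddof=1) / np.mean(x)

for trait, groups in traits.items():
    cv_m, cv_f = cv(groups["male"]), cv(groups["female"])
    winner = "males" if cv_m > cv_f else "females"
    print(f"{trait}: CV male={cv_m:.3f}, female={cv_f:.3f} -> {winner} more variable")
```

The direction of the difference flips between traits, which is the pattern the study reports: neither sex is uniformly more variable.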

To help account for these differences, the team created an interactive tool that scientists can use to examine sex-based variability across a wide range of traits and procedures in mice.

"Our tool can help researchers make adjustments in their experimental design to ensure the results are robust in both males and females," explains senior author Shinichi Nakagawa, Professor of Evolutionary Biology at the University of New South Wales. "We hope this contribution will help improve the optimal design of experiments in biomedical research, resulting in greater clarity in how the sexes differentially respond to medical interventions."

Credit: 
eLife

Holes in Greenland ice sheet are larger than previously thought, study finds

image: University of Arkansas geosciences associate professor Matt Covington flies a drone on the Greenland ice sheet.

Image: 
Jason Gulley

FAYETTEVILLE, Ark. - Holes that carry surface meltwater to the base of the Greenland ice sheet, called moulins, are much larger than previously thought, according to a new study based on observation and first-hand exploration by a team including a geologist from the University of Arkansas.

The extra volume could influence the stability of the Greenland ice sheet and how quickly it slides toward the sea.

The team studied the relationship between the size of the moulins and the daily variation of water depth in them during the summer melt season. Scientists believe increased water depth, and therefore pressure, inside moulins lubricates the base of the ice sheet and increases the speed of its movement toward the sea, the way an ice cube slides easily on a thin film of water. But until now, little was known about the actual size of moulins and how much water they can hold.

"We compared our models with in-the-field observations of the water levels and it seemed like we would need really huge volumes inside moulins to produce the relatively smaller water variations that we were seeing," said Matt Covington, associate professor of geosciences and first author of the study published in the journal Geophysical Research Letters. "Then when we went back in the following year and explored a moulin, it was giant. It was a case where the model made the prediction, and we went out in the field and it turned out to be right."

The team made two trips to the Greenland ice sheet in October 2018 and October 2019. During each trip, they used ropes and other climbing equipment to rappel 100 meters into two separate moulins, almost reaching the water level.

"It's intimidating," said Covington, an experienced cave explorer. "You back over the edge and you just see blueish ice going down as far as you can see, and then it's blackness and there also are occasional sounds of crashing ice, which is pretty unnerving."

Scientists have long observed that Greenland's ice sheet moves and theorized that warmer summer melt seasons due to climate change could speed up that movement. But researchers have little data to help them understand the interaction between meltwater and the base of the ice sheet. The team's findings add to the knowledge of how water interacts with the base of the ice sheet.

"We're trying to understand the way the meltwater is interacting with the ice motion, and the main thing that we found is that the water pressure within these moulins is not as variable as was previously observed, and that this seems to result from really large volumes in the moulins," Covington said.

Credit: 
University of Arkansas

NASA model reveals how much COVID-related pollution levels deviated from the norm

video: Pandemic-related shutdowns have affected how people act, so scientists began monitoring how that's affected the planet -- specifically nitrogen dioxide emissions. How do COVID-19 pollution patterns play into NASA computer models? NASA's GEOS atmospheric composition model shows us the answer.

Watch on YouTube: https://www.youtube.com/watch?v=OWRxa5eQTUw

Download in HD: https://svs.gsfc.nasa.gov/13753

Image: 
NASA's Goddard Space Flight Center

Since the COVID-19 pandemic began, space- and ground-based observations have shown that Earth's atmosphere has seen significant reductions in some air pollutants. However, scientists wanted to know how much of that decline can be attributed to changes in human activity during pandemic-related shutdowns, versus how much would have occurred in a pandemic-free 2020.

Using computer models to generate a COVID-free 2020 for comparison, NASA researchers found that since February, pandemic restrictions have reduced global nitrogen dioxide concentrations by nearly 20%. The results were presented at the 2020 International Conference for High Performance Computing, Networking, Storage, and Analysis.

Nitrogen dioxide is an air pollutant that is primarily produced by the combustion of fossil fuels used by industry and transportation--both of which were significantly reduced during the height of the pandemic to prevent the novel coronavirus from spreading.

"We all knew the lockdowns were going to have an impact on air quality," said lead author Christoph Keller with Universities Space Research Association (USRA) at NASA's Goddard Space Flight Center in Greenbelt, Maryland. Keller works in Goddard's Global Modeling and Assimilation Office (GMAO), which uses high-tech computer models to help track the chemistry of the ocean and the atmosphere, and forecast future climate scenarios. He says, "it was also soon clear that it was going to be difficult to quantify how much of that change is related to the lockdown measures, versus general seasonality or variability in pollution."

No two years are exactly alike. Normal variations in weather and atmospheric circulation change the make-up and chemistry of Earth's atmosphere. Comparing 2020 nitrogen dioxide concentrations with data from 2019 or 2018 alone would not account for year-to-year differences. But, because the NASA model projections account for these natural variations, scientists can use them to parse how much of the 2020 atmospheric composition change was caused by the COVID-19 containment measures.

Even with models, there was no predicting the sudden, drastic shifts in human behavior as the novel coronavirus--and the regulations attempting to control it--spread rapidly. Instead of trying to re-program their model with this unexpected event, Keller and his colleagues accounted for COVID-19 by having the model ignore the pandemic altogether.

The model simulation and machine learning analysis took place at the NASA Center for Climate Simulation. Its "business as usual" scenario showed an alternate reality version of 2020--one that did not experience any unexpected changes in human behavior brought on by the pandemic.

From there it is simple subtraction. The difference between the model-simulated values and the measured ground observations represents the change in emissions due to the pandemic response. The researchers received data from 46 countries--a total of 5,756 observation sites on the ground--relaying hourly atmospheric composition measurements in near-real time. At the city level, 50 of the 61 analyzed cities showed nitrogen dioxide reductions of 20-50%.
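
That subtraction can be sketched as follows, with made-up numbers standing in for the model's "business as usual" values and the ground observations:

```python
# Percent change in NO2 relative to a no-pandemic model baseline,
# for hypothetical city-level values (arbitrary units, not NASA data).
cities = {
    # city: (model-simulated baseline NO2, observed NO2)
    "city_a": (40.0, 16.0),
    "city_b": (25.0, 20.0),
}

for city, (simulated, observed) in cities.items():
    pct_change = 100.0 * (observed - simulated) / simulated
    print(f"{city}: {pct_change:+.0f}% vs. no-pandemic baseline")
```

The point of the model baseline is that it absorbs seasonality and weather-driven variation, so the residual difference can be attributed to the pandemic response rather than to an unusually clean year.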

"In some ways I was surprised by how much it dropped," said Keller. "Many countries have already done a very good job in lowering their nitrogen dioxide concentrations over the last decades due to clean air regulations, but what our results clearly show is that there is still asignificant human behavior-driven contribution."

Wuhan, China, was the first municipality to report an outbreak of COVID-19. It was also the first to show reduced nitrogen dioxide emissions: 60% lower than the simulated values predicted. A 60% decrease in Milan and a 45% decrease in New York followed shortly after, as their local restrictions went into effect.

"You could, at times, even see the decrease in nitrogen dioxide before the official policies went into place," said co-author Emma Knowland with USRA at Goddard's GMAO. "People were probably reducing their transit because the talk of the COVID-19 threat was already happening before we were actually told to shut down." Once restrictions were eased, the decreases in nitrogen dioxide lessened, but remained below expected "business as usual" values.

Keller compared his estimates of the nitrogen dioxide decreases to reported economic numbers, namely, the gross domestic products, of the nations included in the study. According to Keller, they lined up shockingly well. "We would expect them to be somewhat related because nitrogen dioxide is so closely linked to economic activities, like people who travel and factories running," he said. "It looks like our data captures this very well."

Credit: 
NASA/Goddard Space Flight Center

NIST sensor experts invent supercool mini thermometer

image: Two of NIST's superconducting thermometers for measuring cryogenic temperatures are glued to the lower left and upper right of this amplifier. The miniature thermometers, made of niobium on a layer of silicon dioxide, measure the temperature of the amplifier or other device based on a frequency signal.

Image: 
Wheeler/NIST

Researchers at the National Institute of Standards and Technology (NIST) have invented a miniature thermometer with big potential applications such as monitoring the temperature of processor chips in superconductor-based quantum computers, which must stay cold to work properly.

NIST's superconducting thermometer measures temperatures below 1 kelvin (minus 272.15 °C or minus 457.87 °F), down to 50 millikelvin (mK) and potentially 5 mK. It is smaller, faster and more convenient than conventional cryogenic thermometers for chip-scale devices and could be mass produced. NIST researchers describe the design and operation in a new journal paper.

Just 2.5 by 1.15 millimeters in size, the new thermometer can be embedded in or stuck to another cryogenic microwave device to measure its temperature when mounted on a chip. The researchers used the thermometer to demonstrate fast, accurate measurements of the heating of a superconducting microwave amplifier.

The technology is a spinoff of NIST's custom superconducting sensors for telescope cameras, specifically microwave detectors delivered for the BLAST balloon telescope and the TolTEC camera (http://toltec.astro.umass.edu/).

"This was a fun idea that quickly grew into something very helpful," group leader Joel Ullom said. "The thermometer allows researchers to measure the temperature of a wide range of components in their test packages at very little cost and without introducing a large number of additional electrical connections. This has the potential to benefit researchers working in quantum computing or using low-temperature sensors in a wide range of fields."

The thermometer consists of a superconducting niobium resonator coated with silicon dioxide. The coating interacts with the resonator to shift the frequency at which it naturally vibrates. Scientists suspect this is due to atoms "tunneling" between two sites, a quantum-mechanical effect.

The NIST thermometer is based on a new application of the principle that the natural frequency of the resonator depends on the temperature. The thermometer maps changes in frequency, as measured by electronics, to a temperature. By contrast, conventional thermometers for sub-Kelvin temperatures are based on electrical resistance. They require wiring routed to room-temperature electronics, adding complexity and potentially causing heating and interference.
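
In principle, such a readout reduces to inverting a calibration curve; here is a sketch with an entirely hypothetical calibration table (the real device's frequency-temperature relation is measured and differs):

```python
import numpy as np

# Hypothetical calibration table: resonator frequency shift (kHz)
# recorded at known reference temperatures (mK). Values are invented
# solely to illustrate the mapping; the actual curve is device-specific.
cal_temps_mk = np.array([50.0, 100.0, 200.0, 400.0, 800.0])
cal_shift_khz = np.array([0.0, 3.1, 8.4, 16.0, 27.5])  # monotonic example

def shift_to_temperature(shift_khz):
    """Map a measured frequency shift back to temperature by interpolation."""
    return np.interp(shift_khz, cal_shift_khz, cal_temps_mk)

print(shift_to_temperature(8.4))   # exactly on a calibration point: 200.0 mK
print(shift_to_temperature(12.0))  # interpolated between 200 and 400 mK
```

Because the readout is a microwave frequency rather than a resistance, it can share the same signal lines as the device under test, which is why no extra wiring to room temperature is needed.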

The NIST thermometer measures temperature in about 5 milliseconds (thousandths of a second), much faster than most conventional resistive thermometers at about one-tenth of a second. The NIST thermometers are also easy to fabricate in only a single process step. They can be mass produced, with more than 1,200 fitting on a 3-inch (approximately 75-millimeter) silicon wafer.

Credit: 
National Institute of Standards and Technology (NIST)

Reversal of glial scar tissue back to neuronal tissue through neuroregenerative gene therapy

image: Converting Reactive Astrocytes Into Neurons After Glial Scar Formation

Image: 
Jinan University

Brain or spinal cord injury often results in glial scar tissue that is associated with loss of neural function. The glial scar is a well-known obstacle to neural regeneration due to its dense glial cell composition and lack of functional neurons. A research team led by Prof. Gong Chen at Jinan University, Guangzhou, China, published an article on November 5th, 2020 in the current issue of Frontiers in Cellular Neuroscience, demonstrating that glial scar tissue can be reverted to neuronal tissue through NeuroD1-based neuroregenerative gene therapy.

While the glial scar may serve an initial protective function, it also inhibits neuronal growth and functional recovery. For decades, even centuries, doctors and scientists have tried to tackle glial scar tissue through surgical removal or molecular ablation, but those hopes have been dashed one after another. The difficulty lies in the fact that the glial scar is a double-edged sword: on one hand, it serves as a defense barrier that prevents the injury from spreading to neighboring healthy tissue; on the other, it blocks interactions between the scar tissue inside and the healthy tissue outside, hampering functional recovery. The dilemma is that one wants to get rid of the glial scar tissue, yet one cannot kill the glial cells inside it.

Now, Prof. Chen and his team have designed a clever way to solve this century-old problem: turning glial cells inside the scar tissue directly into functional new neurons, essentially reversing glial scar tissue back into functional neural tissue. The key technology Chen and his team invented is what he coined "neuroregenerative gene therapy", a gene therapy that can regenerate functional new neurons in the adult mammalian nervous system. Chen's team has recently published a series of articles demonstrating that such neuroregenerative gene therapy not only regenerates functional new neurons from internal glial cells but also promotes functional recovery in mouse models of ischemic stroke and Huntington's disease. Coincidentally, on the same day, November 5th, 2020, Chen's team also published the first non-human primate study, showing that this neuroregenerative gene therapy can convert astrocytes, one subtype of glial cells in the brain and spinal cord, into neurons in the brains of adult monkeys with ischemic stroke, a step further toward future clinical applications.

"Our neuroregenerative gene therapy is to use AAV vector to deliver neural transcription factors such as NeuroD1 into glial cells in the injury site and directly convert reactive glial cells into functional new neurons. While delivering AAV vector to express a transgene falls into the category of gene therapy, the outcome is regeneration of new neurons, which is typically the result of cell therapy. Therefore, our neuroregenerative gene therapy can be viewed as a gene therapy-mediated cell therapy, and belongs to a more broad category of regenerative medicine", explained by Prof. Chen about this unique technology.

"In this study, we delivered a neural transcription factor NeuroD1 into reactive astrocytes induced by brain injury and demonstrated that NeuroD1 expression reduced toxic reactive astrocytes and generated functional new neurons. Accompanying astrocyte-to-neuron conversion, the remaining astrocytes were found dividing and repopulated themselves. Unexpectedly, we also found that following conversion, neuroinflammation was reduced, new blood vessels emerged, and blood-brain-barrier was restored, leading to new neural tissue composed of new neurons, new astrocytes, and new blood vessels in the injury site", said the first author Dr. Lei Zhang excitingly about her work.

"It is worth to mention that we designed two sets of experiments in this study to demonstrate that NeuroD1 not only can convert reactive astrocytes into neurons shortly after brain injury, but also can convert scar-forming astrocytes into neurons long after injury, suggesting that such in vivo astrocyte-to-neuron conversion approach may have broad time windows for therapeutic interventions", added the co-first author Dr. Zhuo-Fan Lei.

"This in vivo astrocyte-to-neuron conversion technology is an one-stone-two-birds approach that not only generates new neurons but also reduces reactive glial cells in the injury areas, turning glial scar tissue back to neuron-enriched tissue, which should be an effective solution long-sought to solve the glial scar problem", concluded Prof. Chen.

Credit: 
Guangdong-Hongkong-Macau Institute of CNS Regeneration, Jinan University

'The global built environment sector must think in new, radical ways, and act quickly'

image: Aerial photo of Gothenburg, home city of Chalmers University of Technology.

Image: 
Per Pixel Petersson

The construction sector, the real estate industry and city planners must give high priority to the same goal: drastically reducing their climate impact. Powerful, combined efforts are absolutely crucial to achieving the UN's sustainability goals. And what's more, everything has to happen very quickly. These are the cornerstones of the roadmap presented at the Beyond 2020 World Conference.

Today, 55 percent of the world's population lives in cities. By 2050, that figure is estimated to have risen to 68 percent, according to the UN. Cities already produce 70 percent of the world's greenhouse gases. Buildings and construction account for 40 percent of energy-related carbon dioxide emissions. Rapid urbanisation is bringing new demands that need to be met in ecologically, economically and socially sustainable ways.

"If we continue as before, we have no chance of even getting close to the climate goals. Now we need to act with new radical thinking and we need to do it fast, and increase the pace at which we work to reduce cities' climate impact. We must look for innovative ways to build our societies so that we move towards the sustainability goals, and not away from them", says Colin Fudge, Visiting Professor of urban futures and design at Chalmers University of Technology, Sweden.

As an outcome of the Beyond 2020 World Conference, Colin Fudge and his colleague Holger Wallbaum have established a "Framework for a Transformational Plan for the Built Environment". The framework aims to lay the foundation for regional strategies that can guide the entire sector in working towards sustainable cities and communities, and the goals of the UN Agenda 2030.

"The conference clearly demonstrated the growing awareness of sustainability issues among more and more actors in the sector. But it's not enough. Achieving the sustainability goals will require a common understanding among all actors of how they can be achieved - and, not least, real action. That is what we want to contribute to now", says Holger Wallbaum, Professor in Sustainable Building at Chalmers University of Technology, and host of Beyond 2020.

Helena Bjarnegård, chair of Sweden's Council for Sustainable Cities, welcomes the initiative.

"We are aware that we have to deliver change to address the climate, biodiversity, lack of resources and segregation. We need to develop sustainable living environments, not least for the sake of human health. The framework of a transformational plan for the built environment provides a provocative but necessary suggestion on concrete actions to achieve the United Nations Sustainable Development Goals for one of the most important sectors", says Helena Bjarnegård, National architect of Sweden.

In the framework, Wallbaum and Fudge have added a detailed action plan for northwestern Europe that contains 72 concrete proposals for measures - intended as an inspiration for the rest of the world.

The proposals cover everything from energy efficiency improvements, research into new building materials, digital tools and renovation methods, to free public transport, more green spaces and cycle paths. They involve all actors from the entire sector - such as architects, builders, real estate companies, material producers and urban planners.

Several of the high-priority measures in northwestern Europe are under direct governmental responsibility:

Higher taxes on carbon dioxide emissions and utilisation of land and natural resources - lower taxes on labour

State support for energy-efficient renovation works

A plan for large-scale production of sustainable, affordable housing

Increased pace in the phasing out of fossil fuels in favour of electric power from renewables

"Here, governments, in collaboration with towns, cities and other sectors, have a key role, as it is political decisions such as taxation, targeted support and national strategies that can pave the way for the radical changes we propose. But all actors with influence over the built environment must contribute to change. In other parts of the world, it may be the business community that plays the corresponding main role", says Holger Wallbaum.

Wallbaum and Fudge are clear that their proposed measures are specifically intended for the countries of northwestern Europe, and that their work should be seen as an invitation to discussion. Different actors around the world are best placed to propose which measures are most urgent and relevant in their respective regions, based on local conditions, they claim.

"Key people and institutions in different parts of the world have accepted the challenge of establishing nodes for the development of regional strategies. From Chalmers' side, we have offered to support global coordination. Our proposal is that all these nodes present their progress for evaluation and further development at a world conference every three years - next in Montreal, in 2023", says Colin Fudge.

A thousand participants followed the Beyond 2020 conference, which was arranged by Chalmers on 2-4 November in collaboration with Johanneberg Science Park, Rise (Research Institutes of Sweden), and the City of Gothenburg. As a result of the coronavirus pandemic, it was held online. The conference discussed methods for reducing climate footprints, lowering resource consumption, digital development and innovative transport. Among the speakers were authorities in sustainable construction, digitisation and financing from around the world.

Beyond 2020 has the status of a World Sustainable Built Environment Conference (WSBE). Organisers are appointed by iiSBE, a worldwide non-profit organisation whose overall goal is to actively work for initiatives that can contribute to a more sustainable built environment. The next WSBE will be held in Montreal in 2023.

More about: A roadmap for the built environment

In their newly established framework, Wallbaum and Fudge establish a general approach that each individual region in the world can use to identify the measures that are most urgent and relevant to achieving the goals of the UN Agenda 2030, based on local conditions. They identify the key questions that must be answered by all societal actors, the obstacles that need to be overcome and the opportunities that will be crucial for the sector over the next decade.

The work has been carried out in dialogue with prominent researchers and city planners around the world. Read more about the framework and download the material here: Framework document on a Transformational Plan for the Built Environment

More about: Action plan for the built environment sector in northwestern Europe

Wallbaum and Fudge have specified 72 acute sustainability measures in northwestern Europe (Germany, Sweden, Denmark, Finland, the Netherlands, the United Kingdom, Ireland, Norway, Belgium, Switzerland). A selection:

Establish renovation plans that focus on energy efficiency for all existing property by 2023. Avoid demolition and new construction when renovation is possible.

Halve emissions from production of building materials by 2025. The transition to greater usage of materials with lower climate impact needs to accelerate.

Accelerate the phase out of fossil fuels in the transport sector in favour of electric power - with, for example, a ban on new petrol and diesel cars by 2030.

Double the amount of pedestrian and cycle paths in cities by 2030.

Offer free municipal public transport for all school children and for everyone over the age of 70.

Introduce the climate perspective as a mandatory element of the architectural industry's ethical guidelines.

Increase the proportion of green spaces by 20 percent in all cities by 2030.

Concentrate research on the development of new building materials with lower carbon footprints, digital tools for the built environment and new energy-efficient renovation methods.

Read the entire action plan on pages 20-23 of the Framework document on a Transformational Plan for the Built Environment

Credit: 
Chalmers University of Technology

Key source of memories

image: In the top-most layer of neocortex, memory-related information is relayed by synapses from the thalamus (orange), which are in turn controlled by local 'gatekeeper' neurons (blue).

Image: 
Source: M. Belén Pardi

The brain encodes information collected by our senses. In order to perceive and interact with the environment, however, these sensory signals must be interpreted in the context of past experiences stored in the brain and the individual's current aims. A team led by Prof. Dr. Johannes Letzkus, Professor at the Faculty of Medicine of the University of Freiburg and Research Group Leader at the Max Planck Institute for Brain Research in Frankfurt am Main, has now identified a key source of this experience-dependent, so-called top-down information. The scientists have published their results in the journal Science.

The neocortex is the largest and most powerful area of the human brain. All of its important cognitive functions are made possible by the convergence of two distinct streams of information: a "bottom-up" stream, which represents signals from the environment, and a "top-down" stream, which transmits internally generated information about past experiences and current aims. The question of how and where exactly this internally generated information is processed is still largely unexplored, says Letzkus. This motivated him and his team to search for the sources of these top-down signals. The scientists succeeded in identifying a region of the thalamus, a brain area embedded deep within the forebrain, as a key candidate region for such internal information.

Based on this, Dr. M. Belén Pardi, postdoctoral fellow in the Letzkus laboratory, developed a strategy to measure the responses of individual thalamic synapses in the neocortex of the mouse before and after a learning paradigm. "Whereas neutral stimuli without relevance were encoded by small and transient responses in this pathway, learning strongly boosted their activity and made the signals both faster and more sustained over time," explains Pardi. "We were really convinced when we compared the strength of the acquired memory with the change in thalamic activity: This revealed a strong positive correlation, indicating that inputs from the thalamus prominently encode the learned behavioral relevance of stimuli," says Letzkus.

In further experiments and computer modeling, which were conducted together with the team of Dr. Henning Sprekeler from the Technische Universität Berlin, the researchers discovered a previously unknown mechanism that can finely regulate this information and identified a specialized type of neuron in the outermost layer of the neocortex that dynamically controls the flow of these top-down signals. This confirms the scientists' assumption that the thalamus' projections to the sensory neocortex act as a key source of information about previous experiences associated with sensory stimuli. "Such top-down signals are disrupted in a number of brain disorders like autism and schizophrenia," Letzkus explains. "Our hope is that the present results will also enable a deeper understanding of the maladaptive changes underlying these severe conditions."

Credit: 
University of Freiburg

Actively speaking two languages protects against cognitive decline

In addition to enabling us to communicate with others, languages are our instrument for conveying our thoughts, identity, knowledge, and how we see and understand the world. Having a command of more than one enriches us and offers a doorway to other cultures, as discovered by a team of researchers led by scientists at the Open University of Catalonia (UOC) and Pompeu Fabra University (UPF). Using languages actively provides neurological benefits and protects us against cognitive decline associated with ageing.

In a study published in the journal Neuropsychologia, the researchers conclude that regularly speaking two languages -and having done so throughout one's life- contributes to cognitive reserve and delays the onset of the symptoms associated with cognitive decline and dementia.

"We have seen that the prevalence of dementia in countries where more than one language is spoken is 50% lower than in regions where the population uses only language to communicate", asserts researcher Marco Calabria, a member of the Speech Production and Bilingualism research group at UPF and of the Cognitive NeuroLab at the UOC, and professor of Health Sciences Studies, also at the UOC.

Previous work had already found that the use of two or more languages throughout life could be a key factor in increasing cognitive reserve and delaying the onset of dementia, and that it entailed advantages in memory and executive functions.

"We wanted to find out about the mechanism whereby bilingualism contributes to cognitive reserve with regard to mild cognitive impairment and Alzheimer's, and if there were differences regarding the benefit it confers between the varying degrees of bilingualism, not only between monolingual and bilingual speakers", points out Calabria, who led the study.

Thus, unlike other studies, the researchers defined a scale of bilingualism: from people who speak one language but are passively exposed to another, to individuals who have an excellent command of both and use them interchangeably in their daily lives. To construct this scale, they took several variables into account, such as the age of acquisition of the second language, the use made of each, and whether the two were alternated in the same context, among others.

The researchers focused on the population of Barcelona, where there is strong variability in the use of Catalan and Spanish, with some districts that are predominantly Catalan-speaking and others where Spanish is mainly spoken. "We wanted to make use of this variability and, instead of comparing monolingual and bilingual speakers, we looked at whether within Barcelona, where everyone is bilingual to varying degrees, there was a degree of bilingualism that presented neuroprotective benefits", Calabria explains.

Bilingualism and Alzheimer's

At four hospitals in Barcelona and its metropolitan area, they recruited 63 healthy individuals, 135 patients with mild cognitive impairment, such as memory loss, and 68 people with Alzheimer's, the most prevalent form of dementia. They recorded their proficiency in Catalan and Spanish using a questionnaire and established the degree of bilingualism of each subject. They then correlated this degree with the age at which the subjects' neurological diagnosis was made and the onset of symptoms.

To better understand the origin of the cognitive advantage, they asked the participants to perform various cognitive tasks, focusing primarily on the executive control system, since the previous studies had suggested that this was the source of the advantage. In all, participants performed five tasks over two sessions, including memory and cognitive control tests.

"We saw that people with a higher degree of bilingualism were given a diagnosis of mild cognitive impairment later than people who were passively bilingual", states Calabria, who suggests that speaking two languages and frequently switching between them amounts to lifelong brain training. According to the researcher, this linguistic gymnastics is related to other cognitive functions such as executive control, which is triggered when we perform several actions simultaneously, for example filtering relevant information while driving.

The brain's executive control system is related to the system that controls the two languages: it must alternate between them, focusing the brain on one and then on the other so that one language does not intrude into the other when speaking.

"This system, in the context of neurodegenerative diseases, might offset the symptoms. So, when something does not work properly as a result of the disease, the brain has efficient alternative systems to solve it thanks to being bilingual", Calabria states, continuing: "we have seen that the more you use two languages and the better your language skills, the greater the neuroprotective advantage. Active bilingualism is, in fact, an important predictor of delayed onset of the symptoms of mild cognitive impairment, a preclinical phase of Alzheimer's disease, because it contributes to cognitive reserve".

Now, the researchers wish to verify whether bilingualism is also beneficial for other diseases, such as Parkinson's or Huntington's disease.

Credit: 
Universitat Pompeu Fabra - Barcelona

Highly sensitive detection of circularly polarized light without a filter

image: Visualized images obtained by detection of polarized light and conventional polarization image sensor

Image: 
Ayumi Ishii

Under JST Strategic Basic Research Programs, PRESTO researcher Ayumi Ishii, (Toin University of Yokohama, specially appointed lecturer) has developed a photodiode using a crystalline film composed of lead perovskite compounds with organic chiral molecules to detect circularly polarized light without a filter.

A technology to detect "polarization", the oscillation direction of light, can visualize surface damage, foreign objects, and distortions on objects. Furthermore, detection of "circularly polarized light", whose electric field rotates as it propagates, makes it possible to identify the intensity and distribution of stress in objects. Conventional photodiodes for camera or sensor applications cannot detect the polarization of light directly, and therefore various types of filters must be attached on top of the device to separate the polarization information spatially. These structures cause substantial losses of sensitivity and resolution, and detection of circularly polarized light in particular has been considered difficult. A sensor that detects circularly polarized light without any filters has therefore been much desired.

In the present study, Dr. Ishii prepared an organic-inorganic hybrid chiral crystalline film consisting of lead perovskite compounds and organic molecules with chirality, meaning they cannot be superposed on their mirror image, like right and left hands. The study showed that the hybrid film forms a helical one-dimensional (1D) chain structure whose spiral direction allows for selective absorption of left- or right-handed circularly polarized light. The photodiode based on this 1D chiral crystalline film successfully detected the rotational direction of circularly polarized light without a filter. The ratio of sensitivities for left- versus right-handed circularly polarized light reached 25 or higher, the world's highest value for a filterless circular polarization detector.
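A common figure of merit for such detectors is the photocurrent dissymmetry factor g = 2(R_L − R_R)/(R_L + R_R), where R_L and R_R are the responses to left- and right-handed light. The sketch below uses that standard definition; the ratio of 25 is the only number taken from the text, and a ratio that high corresponds to a dissymmetry close to the theoretical maximum of 2.

```python
def dissymmetry_factor(ratio):
    """Photocurrent dissymmetry g = 2(R_L - R_R)/(R_L + R_R),
    expressed via the sensitivity ratio r = R_L / R_R (r >= 1)."""
    return 2 * (ratio - 1) / (ratio + 1)

# The reported ratio of 25 between left- and right-handed responses:
g = dissymmetry_factor(25)
print(round(g, 2))  # -> 1.85
```

For comparison, an ordinary (achiral) photodiode responds equally to both handednesses (ratio 1, g = 0).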

Direct detection of circularly polarized light without a filter, as shown in the present result, allows for higher sensitivity and miniaturization of photodetectors. It is anticipated to become a new sensor technology enabling the acquisition of previously inaccessible information, such as the recognition of stress.

Credit: 
Japan Science and Technology Agency

Solar cells: Mapping the landscape of caesium-based inorganic halide perovskites

image: Nine samples with mixtures from CsPbBr2I (ink 1, left) to pure CsPbI3

Image: 
H. Näsström/HZB

Scientists at HZB have printed and explored different compositions of caesium-based halide perovskites (CsPb(BrxI1−x)3, 0 ≤ x ≤ 1). In a temperature range between room temperature and 300 degrees Celsius, they observed structural phase transitions that influence the electronic properties. The study provides a quick and easy method to assess new compositions of perovskite materials in order to identify candidates for applications in thin film solar cells and optoelectronic devices.

Hybrid halide perovskites (ABX3) have risen in only a few years to become highly efficient new materials for thin film solar cells. The A stands for a cation, either an organic molecule or an alkali metal; the B is a metal, most often lead (Pb); and the X is a halide element such as bromine or iodine. Currently, some compositions achieve power conversion efficiencies above 25%. What is more, most perovskite thin films can easily be processed from solution at moderate processing temperatures, which is very economical.

World record efficiencies have been reached with organic molecules such as methylammonium (MA) as the A cation and Pb and iodine or bromine on the other sites. But those organic perovskites are not yet very stable. Inorganic perovskites with caesium at the A-site promise higher stability, but simple compounds such as CsPbI3 or CsPbBr3 are either not very stable or do not provide the electronic properties needed for applications in solar cells or other optoelectronic devices.

Now, a team at HZB has explored compositions of CsPb(BrxI1−x)3, which provide tunable optical band gaps between 1.73 and 2.37 eV. This makes these mixtures very interesting for multi-junction solar cell applications, in particular for tandem devices.
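The tunability can be pictured with a simple linear (Vegard-like) interpolation between the end-point band gaps quoted above. Real halide alloys deviate from strict linearity (so-called bowing), so this is only an illustrative sketch, not the study's fitted relation:

```python
def band_gap_ev(x):
    """Rough band gap of CsPb(BrxI1-x)3 in eV, assuming a linear
    interpolation between the iodide end (x=0, 1.73 eV) and the
    bromide end (x=1, 2.37 eV)."""
    if not 0.0 <= x <= 1.0:
        raise ValueError("Br fraction x must lie in [0, 1]")
    return 1.73 + x * (2.37 - 1.73)

# Mid-composition estimate under the linear assumption:
print(round(band_gap_ev(0.5), 3))  # -> 2.05
```

Gaps in roughly the 1.7-1.9 eV range are the ones usually sought for the top cell of a tandem device, which is why continuous tunability over this window matters.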

For the production, they used a newly developed method for printing combinatorial perovskite thin films, producing systematic variations of CsPb(BrxI1−x)3 thin films on a substrate. To achieve this, two print heads were filled with either CsPbBr2I or CsPbI3 and then programmed to print the required number of liquid droplets onto the substrate to form a thin film of the desired composition. After annealing at 100 degrees Celsius to drive out the solvent and crystallise the sample, they obtained thin stripes with different compositions (shown in the picture).
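The two-ink mixing step amounts to a lever rule on the bromide fraction: ink 1 (CsPbBr2I) has Br on 2/3 of the halide sites and ink 2 (CsPbI3) has none, so a target composition x fixes the droplet ratio. The function below is a hypothetical illustration of that arithmetic, assuming ideal linear mixing of the two inks:

```python
INK1_BR = 2.0 / 3.0  # Br fraction of the halide sites in CsPbBr2I
INK2_BR = 0.0        # Br fraction in CsPbI3

def ink1_fraction(x_target):
    """Fraction of droplets that must come from ink 1 so the mixed
    film has Br fraction x_target, assuming ideal linear mixing."""
    if not INK2_BR <= x_target <= INK1_BR:
        raise ValueError("target x must lie between the two ink compositions")
    return (x_target - INK2_BR) / (INK1_BR - INK2_BR)

# A composition halfway between the inks (x = 1/3) needs equal droplet counts:
print(ink1_fraction(1.0 / 3.0))  # -> 0.5
```

Note that with these two inks the printable range is 0 ≤ x ≤ 2/3; compositions richer in bromide would require a different ink pair.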

With a special high-intensity X-ray source, the liquid metal jet in the LIMAX lab at HZB, the crystalline structure of the thin films was analysed at different temperatures, ranging from room temperature up to 300 degrees Celsius. "We find that all investigated compositions convert to a cubic perovskite phase at high temperature", Hampus Näsström, PhD student and first author of the publication, explains. Upon cooling down, all samples transition to metastable tetragonal and orthorhombic distorted perovskite phases, which makes them suitable for solar cell devices. "This has proven to be an ideal use case of in-situ XRD with the lab-based high-brilliance X-ray source", Roland Mainz, head of the LIMAX laboratory, adds.

Since the transition temperatures into the desired phases are found to decrease with increasing bromide content, processing temperatures for inorganic perovskite solar cells could be lowered.

"The interest in this new class of solar materials is huge, and the possible compositional variations are nearly infinite. This work demonstrates how to systematically produce and assess a wide range of compositions", says Dr. Eva Unger, who heads the Young Investigator Group Hybrid Materials Formation and Scaling. Dr. Thomas Unold, head of the Combinatorial Energy Materials Research group, agrees and suggests that "this is a prime example of how high-throughput approaches in research could vastly accelerate discovery and optimization of materials in future research".

Credit: 
Helmholtz-Zentrum Berlin für Materialien und Energie

Understanding astrophysics with laser-accelerated protons

image: A material sample is placed in the target chamber of GSI's high-performance laser PHELIX. With the aid of the high-intensity laser beam, protons are then accelerated out of its back surface.

Image: 
Photo: V. Bagnoud, edit: P. Boller / GSI

Bringing huge amounts of protons up to speed in the shortest distance in fractions of a second -- that's what laser acceleration technology, greatly improved in recent years, can do. An international research team from the GSI Helmholtzzentrum für Schwerionenforschung and the Helmholtz Institute Jena, a branch of GSI, in collaboration with the Lawrence Livermore National Laboratory, USA, has succeeded in using protons accelerated with the GSI high-power laser PHELIX to split other nuclei and to analyze them. The results have now been published in the journal Nature Scientific Reports and could provide new insights into astrophysical processes.

For less than one picosecond (one trillionth of a second), the PHELIX laser shines its extremely intense light pulse onto a very thin gold foil. This is enough to eject about one trillion hydrogen nuclei (protons), which are only weakly bound to the gold, from the back surface of the foil, and accelerate them to high energies. "Such a large number of protons in such a short period of time cannot be achieved with standard acceleration techniques," explains Pascal Boller, who is researching laser acceleration in the GSI research department Plasma Physics/PHELIX as part of his graduate studies. "With this technology, completely new research areas can be opened that were previously inaccessible".
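The scale of "one trillion protons in under a picosecond" can be put in perspective with a crude back-of-envelope estimate of the equivalent peak current, treating the figures in the text purely as orders of magnitude:

```python
E_CHARGE = 1.602e-19  # proton charge in coulombs

n_protons = 1e12      # ~one trillion protons, from the text
duration_s = 1e-12    # ~one picosecond, from the text

charge_c = n_protons * E_CHARGE
peak_current_a = charge_c / duration_s

# Roughly 1.6e-7 C delivered within ~1 ps: an equivalent peak current
# on the order of a hundred kiloamperes, far beyond the bunch currents
# of conventional accelerators.
print(f"{peak_current_a:.3g} A")  # -> 1.6e+05 A
```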

These include the generation of nuclear fission reactions. For this purpose, the researchers let the freshly generated fast protons impinge on uranium samples. Uranium was chosen as a case-study material because of its large reaction cross-section and the availability of published data for benchmarking purposes. The samples have to be close to the proton source to guarantee a maximum yield of reactions. The protons generated by the PHELIX laser are fast enough to induce the fission of uranium nuclei into smaller fission products, which then remain to be identified and measured. However, the laser impact has unwanted side effects: it generates a strong electromagnetic pulse and a gamma-ray flash that interfere with the sensitive measuring instruments used for this detection.

At this stage, the researchers are assisted by the expertise of another GSI research group. For the chemical investigation of superheavy elements, a transport system has long been in use that can carry the desired particles over long distances from the reaction area to the detector. The reaction chamber is flushed through by a gas which, in the case of fission experiments, carries the fission products with it and, within only a few seconds, transports them via small plastic tubes to the measuring apparatus several meters away. In this way, generation and measurement can be spatially separated and interference can be prevented.

For the first time, it was possible in these experiments to combine the two techniques: to generate a variety of cesium, xenon and iodine isotopes via the fission of uranium, to reliably identify them via their emitted gamma radiation, and to observe their short lifetimes. This provides a methodology for studying fission reactions in high-density plasma-state matter. Comparable conditions can be found, for example, inside stars, in stellar explosions, or in neutron star mergers. "Understanding the reaction processes of nuclei interacting with each other in plasma can give us insights into the origin of atomic nuclei, the so-called nucleosynthesis, in our universe. Nucleosynthesis processes such as the s-process or r-process take place in exactly such media," explains Boller. "The role fission reactions play in these processes has not yet been researched in detail. Here, the laser-accelerated protons can provide new information".

Further measurements with the methods are planned for future experiments of the PHELIX laser at GSI as well as at other research centers around the world. The investigation of highly dense matter with ion and laser beams will also be one of the topics pursued at the future research facility FAIR. FAIR is currently being built at GSI in international cooperation. With its motto "The Universe in the Laboratory", it is intended to reproduce conditions as they occur in astrophysical environments on Earth, thus expanding the knowledge about our cosmos.

Credit: 
GSI Helmholtzzentrum für Schwerionenforschung GmbH

Better than money? In-kind payments incentivize farmers to conserve agrobiodiversity

image: Farmers' rewards included mattresses, farm supplies, and other useful items.

Image: 
A.Drucker

What if you received a new mattress in exchange for planting diverse crops? It may sound unusual, but tangible non-monetary incentives - anything from fertilizer to furniture - may hold significant potential in encouraging farmers to conserve their local agrobiodiversity, which includes a suite of increasingly rare crops and varieties that are often found nowhere else in the world.

"It turns out that a good conservation farmer is a well-rested conservation farmer," said Adam Drucker, a researcher at the Alliance of Bioversity International and CIAT.

Drucker and co-author Marleni Ramirez recently assessed eight years of programs that use incentives and competitive tenders in which farmers receive in-kind payments in exchange for cultivating threatened varieties of important crops such as quinoa and maize.

In their article published in Land Use Policy, Drucker and Ramirez analyzed payments for agrobiodiversity conservation services, or PACS, in four Latin American countries between 2010 and 2018.

Their conclusion: these schemes are very affordable, attractive to farmers and policymakers, and can successfully conserve crop diversity on farms. The programs have been very well-received in Peru, a megadiverse Andean nation with world-famous cuisine and a long tradition of innovation in cultivation.

Award ceremonies for PACS programs are regularly attended by ministers and other high-level officials, and attract media attention. Due to the success of the programs, PACS are also a part of government policy to conserve biodiversity in the country.

The right incentives

Payment for ecosystem services (PES) is not a new concept. With over 550 PES programs active worldwide, the model offers incentives for beneficiaries to voluntarily commit to sustainably manage land and natural resources. However, high-priority ecosystem services such as water provision have generally eclipsed biodiversity protection.

The article tracks some of the first applications of PES to agrobiodiversity conservation, with schemes encouraging farmers to conserve 130 varieties of crops (including a colorful diversity of quinoa, amaranth, beans, maize and others) in Bolivia, Peru, Guatemala and Ecuador.

Ramirez explained that PES "really fills a gap" by investing in rural communities and holding them collectively responsible. Rather than handing cash to individuals, the tender process arranges in-kind payments to groups that bid for conservation contracts.

The farmers obtain the necessary seeds and are subject to monitoring visits to provide extension support and verify successful cultivation, after which they receive their award in a handover ceremony. Farmers keep what they grow, minus a small amount of seed that is returned to the project for distribution to other farmers during the following planting season. "This is a fair and just way to work with communities for participation, equality and social justice," said Ramirez.

Because the programs use awards that are requested by the communities themselves, they create conditions that incentivize extremely high compliance. Monitoring in Peru suggests that five years after the intervention, and without further incentives in the interim, between 30% and 50% of participating farmers still maintained the threatened varieties that had been re-introduced. Some 83% of farmers declared willingness to participate in future schemes, even without rewards.

Seed-saving policy

The researchers emphasize that accessing the seeds, which are threatened and rare, is a persistent challenge. While many farmers were willing to participate simply in exchange for seeds, building up a depleted genetic resource base often means years of work.

An important aspect of the PACS model is the prioritization of threatened crop varieties based not only on their diversity value but also on their value to farmers for food security, nutrition, climate change adaptation and cultural uses. In what is becoming a common theme in global conversations about biodiversity conservation, "We can't protect everything, so we need to decide how to conserve the most that we can," said Drucker.

Following successful small-scale piloting with an indigenous people's NGO (UNORCAC), Ecuador has also considered a plan and consulted with the authors. Work with another indigenous people's NGO in Guatemala (ASOCUCH) has shown the important role community seed bank institutions can play in facilitating seed access and exchange.

Drucker is confident that the schemes have potential in other countries. Ethiopia, Madagascar and Zambia have all explored PACS at some level. "PACS provide an opportunity for a whole range of institutions including different levels of governments, universities, scientists, national and international NGOs and farmer organizations to partner in implementation," he said.

Beyond Latin America, Ethiopia and Madagascar are exploring possibilities to apply PACS in protected area buffer zones; and Zambia has looked into its use in the conservation of the wild relatives of crops. At the broader level, the model may be able to lay further groundwork for setting global conservation goals, additional monitoring, market development, and school meal programs.

Drucker and Ramirez conclude that, besides being cost-effective and socially equitable, the PACS platform has shown that many farmers are more than willing to cultivate and conserve threatened crops, and the material reward is only an extra bonus.

"This study reveals that farmers are willing to cultivate traditional and endangered varieties even in absence of any reward. They just want the seeds - and once they have them, they keep sowing them," said Carlo Fadda, who leads the Alliance's research area on biodiversity for food and agriculture. "Compared to the $570 billion a year that governments spend to support farmers - mostly at an industrial scale - the investment in PACS is comparatively small and offers a huge return on investment in terms of conservation and livelihoods. I hope Peru's approach is adopted in many more countries."

Credit: 
The Alliance of Bioversity International and the International Center for Tropical Agriculture

New technique isolates brain cells associated with Parkinson's disease

PITTSBURGH--Carnegie Mellon University researchers have developed a new technique for isolating a type of brain cell associated with Parkinson's disease symptoms, enabling them to study that cell type in detail.

The technique, which works only in specially bred mice, costs less than previous methods for isolating these brain cells, said Alyssa Lawler, a Ph.D. student in biological sciences. By using it, she and her colleagues already have detected previously undiscovered changes to how the diseased neurons sense and use oxygen.

The researchers describe the technique and their findings in a research paper published online by the journal JNeurosci.

"Even a small chunk of brain tissue can have dozens of different cell types," said Andreas Pfenning, an assistant professor in CMU's Computational Biology Department. "Each of these cell types has different roles in the behavior of an animal and also in disease." Separating cells of a certain type from their neighbors is thus a critical first step for researchers who want to study them.

In this case, the research team focused on parvalbumin-expressing (PV+) neurons, which have been implicated in Parkinson's disease by the lab of Aryn Gittis, associate professor of biological sciences. Mice with Parkinson's symptoms regain motor control and their ability to run around when these cells are stimulated.

Lab mice have been bred with PV+ cells that contain a protein called Cre, which activates a green fluorescent protein. That fluorescence makes it possible for cell-sorting machines to isolate the cells from others in a mixture. But cell-sorting machines are extremely expensive, so Lawler developed a cheaper method, called Cre-Specific Nuclear Anchored Independent Labeling, or cSNAIL.

The new technique uses a virus commonly employed by researchers to deliver DNA to brain cells. When the virus enters PV+ cells, Cre causes the tag to fluoresce. The tag, anchored to the cell nucleus, can hold on even when the tissues are chopped up, Lawler said. Researchers then use antibodies to detect the tag and pull the PV+ nuclei away from the others.

"The technique turned out to be really specific, really efficient," Lawler said, noting that it can be adapted to other mouse models that use the Cre protein.

In a subsequent analysis of the PV+ neurons, the researchers found that those from sick mice produced more RNA involved in the expression of genes that sense or use oxygen. Further study also showed that the DNA in the nucleus unwound in ways indicating that the oxygen-sensing genes were more active.

"Oxygen-sensing pathways have been implicated in other, earlier aspects of Parkinson's disease, but not previously in PV+ cells," Lawler said. These pathways are involved in both protecting and killing cells during neurodegeneration.

Pfenning noted that datasets from this study are part of a larger effort to build machine learning models that will help researchers interpret disease mechanisms by looking at how particular DNA sequences respond to different conditions across types of cells.

"We're learning how to talk to cells, to speak their language," Lawler said.

Credit: 
Carnegie Mellon University

Chronic alcohol use reshapes the brain's immune landscape, driving anxiety and addiction

LA JOLLA, CA--Deep within the brain, a small almond-shaped region called the amygdala plays a vital role in how we exhibit emotion, behavior and motivation. Understandably, it's also strongly implicated in alcohol abuse, making it a long-running focus of Marisa Roberto, PhD, professor in Scripps Research's Department of Molecular Medicine.

Now, for the first time, Roberto and her team have identified important changes to anti-inflammatory mechanisms and cellular activity in the amygdala that drive alcohol addiction. By countering this process in mice, they were able to stop excessive alcohol consumption--revealing a potential treatment path for alcohol use disorder. The study is published in Progress in Neurobiology.

"We found that chronic alcohol exposure compromises brain immune cells, which are important for maintaining healthy neurons," says Reesha Patel, PhD, a postdoctoral fellow in Roberto's lab and first author of the study. "The resulting damage fuels anxiety and alcohol drinking that may lead to alcohol use disorder."

Roberto's study looked specifically at an immune protein called Interleukin 10, or IL-10, which is prevalent in the brain. IL-10 is known to have potent anti-inflammatory properties, which ensures that the immune system doesn't respond too powerfully to disease threats. In the brain, IL-10 helps to limit inflammation from injury or disease, such as stroke or Alzheimer's. But it also appears to influence key behaviors associated with chronic alcohol use.

In mice with chronic alcohol use, IL-10 was significantly reduced in the amygdala and didn't signal properly to neurons, contributing to increased alcohol intake. By boosting IL-10 signaling in the brain, however, the scientists could reverse the aberrant effects. Notably, they observed a stark reduction in anxiety-like behaviors and motivation to drink alcohol.

"We've shown that inflammatory immune responses in the brain are very much at play in the development and maintenance of alcohol use disorder," Roberto says. "But perhaps more importantly, we provided a new framework for therapeutic intervention, pointing to anti-inflammatory mechanisms."

Alcohol use disorder is widespread, affecting some 15 million people in the United States, and few effective treatments exist. By examining how brain cells change with prolonged exposure to alcohol, Roberto's lab has uncovered many possible new therapeutic approaches for those with alcohol addiction.

In the latest study, Roberto's lab collaborated with Silke Paust, PhD, associate professor in the Department of Immunology and Microbiology. Paust and her team determined precisely which immune cells throughout the whole brain are affected by chronic alcohol use. The findings revealed a large shift in the brain immune landscape, with increased levels of immune cells known as microglia and T-regulatory cells, which produce IL-10.

Despite a higher number of IL-10-producing cells in the whole brain of mice with prolonged alcohol use, the amygdala told a different story. In that region, levels of IL-10 were lower and its signaling function was compromised--suggesting that the immune system in the amygdala responds uniquely to chronic alcohol use.

This study complements recent findings by the Roberto lab demonstrating a causal role for microglia in the development of alcohol dependence.

Future studies will build on these findings to identify exactly how and when IL-10 signals to neurons in the amygdala and other addiction-related brain circuits to alter behavior.

Credit: 
Scripps Research Institute

Analysis paves way for more sensitive quantum sensors

Quantum sensors can measure extremely small changes in an environment by taking advantage of quantum phenomena like entanglement, where entangled particles can affect each other, even when separated by great distances.

Researchers ultimately hope to create and use these sensors to detect and diagnose disease, predict volcanic eruptions and earthquakes, or explore underground without digging.

In pursuit of that goal, theoretical researchers at the Pritzker School of Molecular Engineering (PME) at the University of Chicago have found a way to make quantum sensors exponentially more sensitive.

By harnessing a unique physics phenomenon, the researchers have calculated a way to develop a sensor that has a sensitivity that increases exponentially as it grows, without using more energy. The results were published October 23 in Nature Communications.

"This could even help improve classical sensors," said Prof. Aashish Clerk, co-author of the paper. "It's a way to build more efficient, powerful sensors for all kinds of applications."

Harnessing physics phenomena

Quantum sensors use atoms and photons as measurement probes by manipulating their quantum state. Increasing the sensitivity of these sensors - and traditional sensors - often means building a bigger sensor or using more sensing particles. Even so, such moves increase the sensitivity of quantum sensors only in proportion to the number of particles added.

But the researchers, led by graduate student Alexander McDonald, wondered if there was a way to increase the sensitivity even more. They imagined creating a string of photonic cavities, where photons can be transported to adjacent cavities. Such a string could be used as a quantum sensor, but the researchers wanted to know: If they created a longer and longer chain of cavities, would the sensitivity of the sensor be greater?

In systems like this, photons could dissipate - leak out of the cavities and disappear. But by harnessing a physics phenomenon called non-Hermitian dynamics, where dissipation leads to interesting consequences, the researchers were able to calculate that a string of these cavities would increase the sensitivity of the sensor by much more than the number of cavities added. In fact, it would increase the sensitivity exponentially with system size.
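The contrast with conventional scaling can be made concrete with toy numbers: a standard sensor's signal grows linearly with the number of probes N, whereas the non-Hermitian chain's grows like e^(αN) for some per-cavity rate α. The value α = ln 2 below is an arbitrary placeholder for illustration, not a figure from the paper:

```python
import math

ALPHA = math.log(2)  # arbitrary illustrative gain rate per cavity

def linear_gain(n):
    """Standard scaling: sensitivity proportional to probe number."""
    return n

def exponential_gain(n):
    """Non-Hermitian chain scaling: sensitivity ~ exp(alpha * n)."""
    return math.exp(ALPHA * n)

# With alpha = ln 2, each added cavity doubles the gain, so the
# exponential scheme pulls away quickly from the linear one:
for n in (1, 5, 10):
    print(n, linear_gain(n), round(exponential_gain(n)))
# -> 1 1 2
# -> 5 5 32
# -> 10 10 1024
```

Whatever the actual value of α, the point is qualitative: exponential growth eventually dominates any linear (or polynomial) scaling in the number of probes.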

Not only that, it would do so without using any extra energy and without increasing the inevitable noise from quantum fluctuations. That would be a huge win for quantum sensors, Clerk said.

"This is the first example of a scheme like this - that by stringing these cavities together in the right way, we can gain an enormous amount of sensitivity," Clerk said.

Improving all kinds of quantum sensors

To prove the theory, Clerk is working with a group of researchers who are building a network of superconducting circuits. These circuits could move photons between cavities in the same manner Clerk described in the research paper. That could create a sensor that could improve how quantum information is read out from quantum bits, or qubits.

Clerk also hopes to examine how to construct analogous quantum sensing platforms by coupling spins instead of photonic cavities, with possible implementations based on arrays of quantum bits.

"We want to know if we can use this physics to improve all kinds of quantum sensors," Clerk said.

Credit: 
University of Chicago