Tech

Mind over body: The search for stronger brain-computer interfaces

image: Emily Oby is a bioengineering postdoctoral research associate at the University of Pittsburgh. She and her Pitt and Carnegie Mellon University colleagues have been researching how the brain learns tasks.

Image: 
Aimee Obidzinski/University of Pittsburgh

When people suffer debilitating injuries or illnesses of the nervous system, they sometimes lose the ability to perform tasks normally taken for granted, such as walking, playing music or driving a car. They can imagine doing something, but the injury might block that action from occurring.

Brain-computer interface systems exist that can translate brain signals into a desired action to regain some function, but they can be a burden to use because they don't always operate smoothly and need readjustment to complete even simple tasks.

Researchers at the University of Pittsburgh and Carnegie Mellon University are working on understanding how the brain works when learning tasks with the help of brain-computer interface technology. In a set of papers, the second of which was published today in Nature Biomedical Engineering, the team is advancing brain-computer interface technology intended to improve the lives of amputee patients who use neural prosthetics.

"Let's say during your work day, you plan out your evening trip to the grocery store," said Aaron Batista, associate professor of bioengineering in Pitt's Swanson School of Engineering. "That plan is maintained somewhere in your brain throughout the day, but probably doesn't reach your motor cortex until you actually get to the store. We're developing brain-computer interface technologies that will hopefully one day function at the level of our everyday intentions."

Batista, Pitt postdoctoral research associate Emily Oby and the Carnegie Mellon researchers have collaborated on developing direct pathways from the brain to external devices. They use electrodes smaller than a hair that record neural activity and make it available for control algorithms.

In the team's first study, published last June in the Proceedings of the National Academy of Sciences, the group examined how the brain changes with the learning of new brain-computer interface skills.

"When the subjects form a motor intention, it causes patterns of activity across those electrodes, and we render those as movements on a computer screen. The subjects then alter their neural activity patterns in a manner that evokes the movements that they want," said project co-director Steven Chase, a professor of biomedical engineering at the Neuroscience Institute at Carnegie Mellon.

In the new study, the team designed technology whereby the brain-computer interface readjusts itself continually in the background to ensure the system is always in calibration and ready to use.

"We change how the neural activity affects the movement of the cursor, and this evokes learning," said Pitt's Oby, the study's lead author. "If we changed that relationship in a certain way, it required that our animal subjects produce new patterns of neural activity to learn to control the movement of the cursor again. Doing so took them weeks of practice, and we could watch how the brain changed as they learned."

In a sense, the algorithm "learns" how to adjust to the noise and instability that is inherent in neural recording interfaces. The findings suggest that the process for humans to master a new skill involves the generation of new neural activity patterns. The team eventually would like this technology to be used in a clinical setting for stroke rehabilitation.
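
The idea of background self-recalibration can be sketched in a toy simulation (an illustration of the concept only, not the authors' algorithm): a linear decoder maps neural activity to cursor velocity, the neural-to-movement mapping slowly drifts to mimic recording instabilities, and the decoder quietly refits itself from a rolling window of recent activity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sketch of background self-recalibration (illustrative, not the
# authors' method).  A linear decoder maps neural activity u to cursor
# velocity v = W @ u; the true brain-to-cursor mapping drifts slowly,
# modeling recording instabilities.
n_units, n_dims, window = 20, 2, 100

true_map = rng.standard_normal((n_dims, n_units))  # brain-to-cursor mapping
W_adaptive = true_map.copy()   # decoder that recalibrates in the background
W_frozen = true_map.copy()     # decoder calibrated once, then left fixed

buf_u, buf_v = [], []
for t in range(1, 1001):
    true_map += 0.01 * rng.standard_normal(true_map.shape)  # slow instability
    u = rng.standard_normal(n_units)   # neural activity sample
    v = true_map @ u                   # movement the subject intends
    buf_u.append(u)
    buf_v.append(v)
    buf_u, buf_v = buf_u[-window:], buf_v[-window:]
    if t % 25 == 0 and len(buf_u) >= n_units:
        # background recalibration: least-squares refit on the recent window
        W_adaptive = np.linalg.lstsq(np.array(buf_u), np.array(buf_v),
                                     rcond=None)[0].T

err = lambda W: np.linalg.norm(W - true_map) / np.linalg.norm(true_map)
print(f"frozen decoder error: {err(W_frozen):.2f}, "
      f"self-recalibrating: {err(W_adaptive):.2f}")
```

The frozen decoder drifts steadily out of calibration, while the refitted decoder tracks the instability without the user ever pausing to recalibrate.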

Such self-recalibration procedures have been a long-sought goal in the field of neural prosthetics, and the method presented in the team's studies is able to recover automatically from instabilities without requiring the user to pause and manually recalibrate the system.

"Let's say that the instability was so large such that the subject was no longer able to control the brain-computer interface," said Yu. "Existing self-recalibration procedures are likely to struggle in that scenario, whereas in our method, we've demonstrated it can in many cases recover from even the most dramatic instabilities."

Credit: 
University of Pittsburgh

Researchers unveil electronics that mimic the human brain in efficient learning

image: A graphic depiction of protein nanowires (green) harvested from microbe Geobacter (orange) facilitate the electronic memristor device (silver) to function with biological voltages, emulating the neuronal components (blue junctions) in a brain.

Image: 
UMass Amherst/Yao lab

AMHERST, Mass. - Only 10 years ago, scientists working on what they hoped would open a new frontier of neuromorphic computing could only dream of a device using miniature tools called memristors that would function like real brain synapses.

But now a team at the University of Massachusetts Amherst has discovered, while on their way to better understanding protein nanowires, how to use these biological, electricity-conducting filaments to make a neuromorphic memristor, or "memory transistor," device. It runs extremely efficiently on very low power, as brains do, to carry signals between neurons. Details are in Nature Communications.

As first author Tianda Fu, a Ph.D. candidate in electrical and computer engineering, explains, one of the biggest hurdles to neuromorphic computing, and one that made it seem unreachable, is that most conventional computers operate at over 1 volt, while the brain sends signals called action potentials between neurons at around 80 millivolts - roughly one-twelfth as much. Today, a decade after early experiments, memristors have reached voltages in a range similar to that of conventional computers, but getting below that seemed improbable, he adds.

Fu reports that using protein nanowires developed at UMass Amherst from the bacterium Geobacter by microbiologist and co-author Derek Lovley, he has now conducted experiments where memristors have reached neurological voltages. Those tests were carried out in the lab of electrical and computer engineering researcher and co-author Jun Yao.

Yao says, "This is the first time that a device can function at the same voltage level as the brain. People probably didn't even dare to hope that we could create a device that is as power-efficient as the biological counterparts in a brain, but now we have realistic evidence of ultra-low power computing capabilities. It's a concept breakthrough and we think it's going to cause a lot of exploration in electronics that work in the biological voltage regime."

Lovley points out that Geobacter's electrically conductive protein nanowires offer many advantages over expensive silicon nanowires, which require toxic chemicals and high-energy processes to produce. Protein nanowires also are more stable in water or bodily fluids, an important feature for biomedical applications. For this work, the researchers shear nanowires off the bacteria so only the conductive protein is used, he adds.

Fu says that he and Yao had set out to put the purified nanowires through their paces, to see what they are capable of at different voltages, for example. They experimented with a pulsing on-off pattern of positive-negative charge sent through a tiny metal thread in a memristor, which creates an electrical switch.

They used a metal thread because protein nanowires facilitate metal reduction, changing metal ion reactivity and electron transfer properties. Lovley says this microbial ability is not surprising, because in the wild, bacteria use such nanowires to chemically reduce metals for energy, much the way we breathe oxygen.

As the on-off pulses create changes in the metal filaments, new branching and connections are created in the tiny device, which is 100 times smaller than the diameter of a human hair, Yao explains. It creates an effect similar to learning - new connections - in a real brain. He adds, "You can modulate the conductivity, or the plasticity of the nanowire-memristor synapse so it can emulate biological components for brain-inspired computing. Compared to a conventional computer, this device has a learning capability that is not software-based."
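
That plasticity can be caricatured with a toy state model (purely illustrative, not the UMass Amherst device physics): a conductance variable that grows under positive pulses and decays under negative ones, saturating at physical bounds, loosely analogous to synaptic weight change.

```python
# Toy memristor-like plasticity model (illustration only, not the real
# device physics).  Conductance g rises with positive pulses (filament
# growth) and falls with negative pulses (filament dissolution), with the
# change slowing as g approaches its physical bounds.
g_min, g_max, rate = 0.01, 1.0, 0.1
g = g_min
history = []

pulses = [+1] * 5 + [-1] * 3 + [+1] * 4   # on-off pattern of +/- charge
for p in pulses:
    if p > 0:
        g += rate * (g_max - g)   # potentiation, saturating near g_max
    else:
        g -= rate * (g - g_min)   # depression, saturating near g_min
    history.append(round(g, 4))

print(history)
```

Positive pulses strengthen the artificial "synapse", negative pulses weaken it, so the device state encodes its pulse history, which is the learning-like behavior described above.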

Fu recalls, "In the first experiments we did, the nanowire performance was not satisfying, but it was enough for us to keep going." Over two years, he saw improvement until one fateful day when his and Yao's eyes were riveted by voltage measurements appearing on a computer screen.

"I remember the day we saw this great performance. We watched the computer as current voltage sweep was being measured. It kept doing down and down and we said to each other, 'Wow, it's working.' It was very surprising and very encouraging."

Fu, Yao, Lovley and colleagues plan to follow up this discovery with more research on mechanisms, and to "fully explore the chemistry, biology and electronics" of protein nanowires in memristors, Fu says, plus possible applications, which might include a device to monitor heart rate, for example. Yao adds, "This offers hope in the feasibility that one day this device can talk to actual neurons in biological systems."

Credit: 
University of Massachusetts Amherst

New therapeutic options for multiple sclerosis in sight

image: For the transcriptome analysis Alexander Mildner controls sorted monocytes under the microscope.

Image: 
Alexander Mildner, MDC

Multiple sclerosis (MS) is known as "the disease with a thousand faces" because symptoms and progression can vary dramatically from patient to patient. But every MS patient has one thing in common: Cells of their body's own immune system migrate to the brain, where they destroy the myelin sheath - the protective outer layer of the nerve fibers. As a result, an electrical short circuit occurs, which prevents the nerve signals from being transmitted properly.

Many MS medications impair immune memory

Researchers don't yet know exactly which immune cells are involved in stripping away the myelin sheath. Autoreactive T and B cells, which wrongly identify the myelin sheath as a foreign body, travel to the brain and initiate the disease. "Up until now, MS drugs have essentially targeted these T and B cells, both of which are part of the acquired immune system," says Dr. Alexander Mildner, a scientist at the Max Delbrück Center for Molecular Medicine in the Helmholtz Association (MDC) and the senior author of the paper now published in Nature Immunology.

Mildner is currently conducting externally funded research as a DFG Heisenberg fellow in Professor Achim Leutz's lab at the MDC, which focuses on cell differentiation and tumorigenesis. "But by attacking the acquired immune system, the MS drugs adversely affect the body's immune memory, thus making patients more susceptible to infections in the long run," the scientist says.

MS symptoms improved in mice by reducing monocytes

As a result, Mildner has been pursuing a different strategy for a couple of years now. He wants to find out what role immune cells - particularly those that are part of innate immunity - play in the development of MS and whether they represent a promising target structure for therapy of MS patients. "In an earlier study with a mouse model of MS, we were able to show that disease symptoms in the mice declined significantly within a few days after their monocytes were selectively destroyed by antibodies," the researcher reports. This result came as a big surprise to him and to many of his colleagues. "Apparently, it is not only T and B cells that are involved in causing tissue damage in MS," Mildner says.

The monocytes he studied are a special type of white blood cell that circulates briefly in the blood before migrating into tissue. Once there, they transform into effector cells (phagocytes) and destroy tissue in the central nervous system (CNS) that is foreign - or that, in MS, they wrongly identify as such. "This process," Mildner says, "leads to inflammation and tissue damage in the brain."

The team discovered unknown types of monocytes

In the current study published in Nature Immunology, which he conducted in collaboration with an Israeli team led by Professor Ido Amit from the Department of Immunology at the Weizmann Institute of Science, Mildner and his team also focused on monocytes. "In recent years, we have realized that several types of these immune cells exist, which might carry out different functions," the researcher says. "We therefore wanted to use single-cell sequencing to examine the monocytes in our mouse model of MS in greater detail, and to find out which monocyte subsets are present in the brain in MS and are responsible for tissue damage."

He and his colleagues identified six different monocyte subtypes, four of which were previously unknown. As in his earlier study, Mildner injected the mice with antibodies against a specific monocyte surface protein. As expected, the cells died and the MS symptoms in the mice decreased within a short period of time. "But what surprised us was that the antibodies did not destroy all monocyte subsets in the brain that have this surface protein," Mildner says.

Not all monocytes destroy the protective myelin sheath

"Only a certain type of monocyte, the Cxcl10+ cells, was destroyed by the antibody treatment," Mildner says. "These are apparently the cells that are primarily responsible for causing MS tissue damage in the brain." With the help of single-cell sequencing, he and his team also discovered that this cell type differs from other monocytes in two essential ways: First, Cxcl10+ cells have a particularly large number of receptors for a signal protein secreted by T cells that induces tissue damaging properties in monocytes. Second, these cells produce large amounts of interleukin-1-beta, a substance that opens the blood-brain barrier, enabling immune cells to more easily pass from the blood to the brain and exacerbate the symptoms. "Our research suggests that T cells, as disease initiators, travel to the CNS in order to lure there the monocytes that are responsible for the primary tissue damage," Mildner explains.

The other monocyte subsets that were identified, he speculates, are perhaps even involved in repair processes in which the body tries to rebuild the damaged myelin. In light of the study's findings, he thinks it is also possible that the T and B cells are not even directly involved in stripping away the myelin sheath, but only indirectly in that they prompt the Cxcl10+ monocytes to attack the protective layer of the axons.

Many side effects may be preventable

"If that is the case, in the future most forms of MS could be treated by specifically deactivating the Cxcl10+ monocytes instead of targeting the T or B cells of the immune system," Mildner says. "This would protect the body's immune memory and prevent many side effects of current MS therapies." The researcher and his team next plan to investigate whether the Cxcl10+ monocytes are also present outside the CNS. "If they exist in the body's periphery, for example, in the lymph nodes," he says, "there they would be easier to target with therapeutics than in the brain."

Credit: 
Max Delbrück Center for Molecular Medicine in the Helmholtz Association

Changes in snowmelt threaten farmers in western US

COLUMBUS, Ohio -- Farmers in parts of the western United States who rely on snowmelt to help irrigate their crops will be among the hardest hit in the world by climate change, a new study reveals.

In an article published today in Nature Climate Change, an interdisciplinary team of researchers analyzed monthly irrigation water demand together with snowmelt runoff across global basins from 1985 to 2015. The goal was to determine where irrigated agriculture has depended on snowmelt runoff in the past and how that might change with a warming climate.

They then projected changes in snowmelt and rainfall runoff if the Earth warms by 2 or 4 degrees Celsius (about 3 ½ or 7 degrees Fahrenheit), which will potentially put snow-dependent basins at risk.

The findings pinpointed basins globally most at risk of not having enough water available at the right times for irrigation because of changes in snowmelt patterns. Two of those high-risk areas are the San Joaquin and Colorado river basins in the western United States.

"In many areas of the world, agriculture depends on snowmelt runoff happening at certain times and at certain magnitudes," said Yue Qin, assistant professor of geography and a core faculty of the Sustainability Institute at The Ohio State University.

"But climate change is going to cause less snow and early melting in some basins, which could have profound effects on food production."

Qin, lead author of the study, designed the research with Nathaniel Mueller, assistant professor at Colorado State University, and Steven Davis, associate professor at the University of California, Irvine.

Under the 4-degree Celsius warming scenario, the researchers project that the share of irrigation water demand met by snowmelt in the San Joaquin Basin decreases from 33 percent to 18 percent. In the Colorado Basin, the share of water demand met by snowmelt decreases from 38 percent to 23 percent.
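
As back-of-envelope arithmetic (using only the shares quoted above), both basins lose 15 percentage points of snowmelt-met demand, which corresponds to an even larger relative decline:

```python
# Restatement of the projected snowmelt shares quoted above (illustrative only).
snowmelt_share = {
    "San Joaquin": {"historical": 0.33, "warming_4C": 0.18},
    "Colorado":    {"historical": 0.38, "warming_4C": 0.23},
}

for basin, s in snowmelt_share.items():
    drop_pp = 100 * (s["historical"] - s["warming_4C"])       # percentage points
    drop_rel = 100 * (1 - s["warming_4C"] / s["historical"])  # relative decline
    print(f"{basin}: -{drop_pp:.0f} pp, a {drop_rel:.0f}% relative decline")
```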

Other basins in which agriculture is at particular risk because of changes in snowmelt are located in southern Europe, western China and Central Asia.

Depending on both the magnitude and the timing, rainfall runoff may be able to compensate for declines in snowmelt runoff in meeting irrigation water demand - but only for some basins, the researchers calculated.

"In many basins, future changes in rainfall do not compensate for the lost snowmelt in crops' growing seasons," Mueller said.

The San Joaquin, for example, is one basin where increases in rainfall runoff won't be able to make up for snowmelt decline when irrigation is most needed.

The researchers evaluated the potential availability of reservoir storage and groundwater to help satisfy the additional irrigation need created by less snowmelt and early melting. In some basins, those additional requirements would pose great challenges in trying to make up for changing snowmelt patterns.

"Irrigation demands not met by rainfall or snowmelt currently already represent more than 40 percent of reservoir water storage in many Asian and North American basins," Davis said.

"And in a warming world, agriculture won't be the only added demand on reservoirs and other alternative water supplies like groundwater."

In the San Joaquin Basin, findings suggested that 14 percent of irrigation water demand must be met by new alternative sources under a 4-degree Celsius warming scenario. In the Colorado Basin, the figure would be 9 percent.

The study also examined which crops globally were most at risk because of snowmelt changes resulting from climate change. Findings showed that rice and cotton in Northern Hemisphere summer, and wheat and managed grassland in spring, were particularly snow-dependent.

For the study, researchers used data on monthly rainfall and snowmelt runoff globally from 1985 to 2015 from a dataset called TerraClimate. They then calculated monthly irrigation water consumption for a variety of crops.

Comparing historical runoff and total surface water consumption, they estimated monthly snowmelt and rainfall runoff consumption as well as demand for alternative water sources in each major river basin.

They then used climate models to project snowmelt and rainfall runoff in each basin if global mean temperatures rise 2 degrees or 4 degrees Celsius above pre-industrial conditions.

Qin said the results of the study could be used to prioritize and inform methods to minimize the impact of changing snowmelt on water supplies for agriculture. In some cases, policymakers may have to consider extra groundwater pumping and reservoir development.

Limits may also need to be placed on water demand, such as by increasing crop water productivity, Qin noted.

"We need to find ways to help those basins that will most need to adapt to the coming changes," she said.

Credit: 
Ohio State University

Corona and air pollution: How does nitrogen dioxide impact fatalities?

Elevated levels of nitrogen dioxide in the air may be associated with a high number of deaths from Covid-19. A new study by Martin Luther University Halle-Wittenberg (MLU) provides concrete data that back this assumption for the first time. The paper combines satellite data on air pollution and air currents with confirmed deaths related to Covid-19 and reveals that regions with permanently high levels of pollution have significantly more deaths than other regions. The results were published in the journal Science of the Total Environment.

Nitrogen dioxide is an air pollutant that damages the human respiratory tract. For many years it has been known to cause many types of respiratory and cardiovascular diseases in humans. "Since the novel coronavirus also affects the respiratory tract, it is reasonable to assume that there might be a correlation between air pollution and the number of deaths from Covid-19," says Dr Yaron Ogen from the Institute of Geosciences and Geography at MLU. Until now, however, there has been an absence of reliable data to further investigate this.

In his latest study, the geoscientist combined three sets of data. This included the levels of regional nitrogen dioxide pollution measured by the European Space Agency's (ESA) Sentinel 5P satellite, which continuously monitors air pollution on Earth. Based on this data, he produced a global overview of regions with high and prolonged nitrogen dioxide pollution. "I looked at the values for January and February of this year, before the corona outbreaks in Europe began," explains Ogen. He combined this data with data from the US weather agency NOAA on vertical air flows. His premise: If air is in motion, the pollutants near the ground are also more disseminated. However, if the air tends to stay near the ground, this will also apply to the pollutants in the air, which are then more likely to be inhaled by humans in greater amounts and thus lead to health problems. Using this data, the researcher was able to identify hotspots around the world with high levels of air pollution and simultaneously low levels of air movement.
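
The hotspot logic - persistently high nitrogen dioxide where vertical air exchange is weak - can be sketched on gridded fields. This is a toy example with synthetic data, not the study's actual Sentinel-5P/NOAA pipeline:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins for gridded NO2 columns and vertical air motion
# (hypothetical values; the study used Sentinel-5P and NOAA data).
no2 = rng.gamma(shape=2.0, scale=1.0, size=(60, 60))
vertical_flow = rng.gamma(shape=2.0, scale=1.0, size=(60, 60))

# Flag grid cells in the top NO2 quartile whose vertical air exchange is
# in the bottom quartile: polluted air that tends to stay near the ground.
hotspot = (no2 > np.quantile(no2, 0.75)) & \
          (vertical_flow < np.quantile(vertical_flow, 0.25))

print(f"hotspot cells: {hotspot.sum()} of {hotspot.size}")
```

Working per grid cell rather than per country is what lets the analysis capture the regional variation the study emphasizes.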

He then compared these with the data on deaths related to Covid-19, specifically analysing the data from Italy, France, Spain and Germany. It turned out that the regions with a high number of deaths also had particularly high levels of nitrogen dioxide and a particularly low amount of vertical air exchange. "When we look at Northern Italy, the area around Madrid, and Hubei Province in China, for example, they all have something in common: they are surrounded by mountains. This makes it even more likely that the air in these regions is stable and pollution levels are higher," Ogen continues. The advantage of his analysis is that it is based on individual regions and does not only compare countries. "Even though we can obtain a country's average value for air pollution, this figure could vary greatly from region to region and therefore not be a reliable indicator," says Ogen.

The geoscientist suspects that this persistent air pollution in the affected regions could have led to overall poorer health in the people living there, making them particularly susceptible to the virus. "However, my research on the topic is only an initial indication that there might be a correlation between the level of air pollution, air movement and the severity of the course of the corona outbreaks," says Ogen. This correlation should now be examined for other regions and put into a broader context.

Credit: 
Martin-Luther-Universität Halle-Wittenberg

Scientists uncover principles of universal self-assembly

video: 3 nm CdTe quantum dots, 500 nm polystyrene spheres, and 5 μm S. cerevisiae yeast cells, 0.7 μm M. Luteus and 1 μm x 2 μm E. Coli bacterial cells, and 15 μm human mammary gland cells all collect when the laser is on and dissolve when it is off.

Image: 
Authors

For years, researchers have searched for the working principles of self-assembly that can build a cell (a complex biological structure) and a crystal (a far simpler inorganic material) in the same way.

Now, a team of scientists in Turkey has demonstrated the fundamental principles of a universal self-assembly process acting on a range of materials, from quantum dots just a few atoms in size up to human cells of nearly 100 trillion atoms. Their method is highlighted in Nature Physics.

"To initiate self-assembly, either you force the system to deliver a specific outcome, or you use its inner dynamics to your advantage for universal outcomes. We followed the second approach," says Dr. Serim Ilday of Bilkent University-UNAM, who lead the study.

The researchers demonstrated the self-assembly of constituents, simple and complex alike, that differ by more than four orders of magnitude in size and mass; all of them come together following a sigmoid function, also known as the S-curve. Curiously, they also observed that individual deviations from the S-curves follow the statistics of the Tracy-Widom distribution, which manifests in diverse social, economic, and physical systems.
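
The S-curve behavior can be illustrated with a quick fit on synthetic data (hypothetical parameters chosen for illustration; this is not the authors' analysis code):

```python
import numpy as np

def sigmoid(t, k, t0, nmax):
    """Logistic S-curve: number of particles aggregated by time t."""
    return nmax / (1.0 + np.exp(-k * (t - t0)))

rng = np.random.default_rng(1)

# Synthetic aggregation trace with hypothetical parameters.
t = np.linspace(0.0, 20.0, 200)
counts = sigmoid(t, k=0.8, t0=10.0, nmax=500.0) + rng.normal(0.0, 5.0, t.size)

# The logit transform linearizes the curve: log(nmax/N - 1) = -k*(t - t0),
# so an ordinary least-squares line fit recovers k and t0.  Only the
# central part of the curve is used, where the transform is noise-robust.
nmax = counts.max()
mask = (counts > 0.1 * nmax) & (counts < 0.9 * nmax)
slope, intercept = np.polyfit(t[mask], np.log(nmax / counts[mask] - 1.0), 1)
k_fit, t0_fit = -slope, intercept / -slope
print(f"fitted k = {k_fit:.2f}, t0 = {t0_fit:.1f}")
```

The residuals of such fits, pooled over many runs, are what the team compared against the Tracy-Widom distribution.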

"Serim and I were attending a math seminar, where we first saw this particular distribution. Serim looked at me and said, 'that's it; we need to search for it,' and we did." says Dr. F. Ömer Ilday of Bilkent University-UNAM, Physics, and Electrical and Electronics Engineering, co-author of the paper. The seminar was given by Dr. Gökhan Y?ld?r?m of Bilkent University-Mathematics, another co-author, says, "After the seminar, they approached me, the idea was fascinating, and we started working on it immediately." Dr. Ghaith Makey, the first author of the paper, adds, "What motivates us nowadays is to predict and study new examples of systems in Tracy-Widom universality, and understand why it manifests in very many different systems.".

The team further demonstrated how their method could be useful for practical applications. "The possibilities are endless. For instance, an unknown infection is detected at a hospital's ICU. Instead of waiting for hours or days to identify the culprit pathogen, within minutes, you can take a sample, add candidate drugs, and see if they put an end to the infection," says Dr. E. Doruk Engin of Ankara University-Biotechnology Institute. "It was unbelievable to see how easily living organisms could be manipulated within seconds," says Dr. Özgür Şahin of Bilkent University-Molecular Biology and Genetics, and also of the University of South Carolina. Dr. Hilmi Volkan Demir of Bilkent University-UNAM, Physics, and Electrical and Electronics Engineering and also of NTU Singapore, a co-author of the paper, adds, "Such instant control over tiny, speedy quantum matter is in many ways beyond the capabilities of current technology. It can have a real impact on nanotechnology."

Researchers emphasize that this is by no means the end of the story, and that self-assembly research has a long and rough road ahead in discovering and putting into practice nature's principles. Dr. Serim Ilday adds, "Possible practical uses aside, our method is a great tool for exploring the physics of how driven systems evolve far from equilibrium. This includes epidemics; in fact, our preliminary analysis of COVID-19 data suggests that its fluctuations may be following the Tracy-Widom statistics just like our system." Dr. F. Ömer Ilday adds, "More interestingly, our analytical model fits COVID data better than the S-curve. Seeing all these, we decided to investigate the physics of the pandemic promptly."

Credit: 
Bilkent University Faculty of Science

Rare video captures humpback whale nursing behaviors in UH Mānoa research

image: UH researchers tag a humpback whale calf off of Maui  (NOAA permit #21476)

Image: 
UH Mānoa Marine Mammal Research Program

WHAT: An exciting new project that aims to quantify the nursing behavior of humpback whale calves in the Maui breeding grounds.

The project is a collaboration between the University of Hawaiʻi at Mānoa Marine Mammal Research Program, the Goldbogen Lab at Stanford University's Hopkins Marine Station and the Friedlander Lab at University of California, Santa Cruz. 

WHO: UH Mānoa Marine Mammal Research Program Director Lars Bejder, PhD candidates Martin van Aswegen and Will Gough.

HOW: Researchers deployed non-invasive suction-cup tags with cameras, acoustic recorders, depth sensors and accelerometers onto seven humpback whale calves. The camera recordings are providing researchers with seldom-seen views of nursing behavior (including nursing bout frequency and durations) and social interactions between individuals. The accelerometer data allow them to quantify the fine-scale behavior, movement and breathing patterns of tagged whales. The fieldwork also included flying drones over the tagged whales, allowing researchers to calculate their overall length, body condition and health.

WHEN: The project took place over 10 days in February 2020.

WHY: The data collected will provide important insights into the needs of humpback mothers and calves in the Maui breeding grounds.

OTHER FACTS:

Every winter, about 10,000 humpback whales migrate to Hawaiʻi, mainly to breed. The period that adult females and their newborn calves spend on the Hawaiian breeding grounds (typically January - March) is a critical time. 
No feeding occurs during the breeding season, so the whales are reliant on energy stored from the earlier feeding season in Alaska. 
The tag deployments were made possible with the generous support of Marc Lammers from the Hawaiian Islands Humpback Whale Sanctuary, Stephanie Stack and Jens Currie from the Pacific Whale Foundation and the Oceanwide Science Institute.
Molokai Ocean Tours, PacWhale Eco-Adventures and Rachel and John Sprague were all instrumental in helping to retrieve the tags once they were off of the whales.
Keep up to date through the MMRP website and social media platforms (Twitter and Instagram: @MMRP_UH, Facebook: MMRPUH, YouTube: MMRP UH). To help achieve its mission, the program is also accepting donations to fund research initiatives and student scholarships. All donations are tax-exempt.
All research activities were conducted in accordance with NOAA permit #21476 and institutional animal care and use committee approval. All drone activities were conducted in accordance with FAA Part 107 regulations.

Credit: 
University of Hawaii at Manoa

North Pole soon to be ice free in summer

image: Polar bears on Arctic sea ice.

Image: 
Dirk Notz

The Arctic Ocean in summer will very likely be ice free before 2050, at least temporarily. The efficacy of climate-protection measures will determine how often and for how long. These are the results of a new research study involving 21 research institutes from around the world, coordinated by Dirk Notz from the University of Hamburg, Germany.

The research team has analyzed recent results from 40 different climate models. Using these models, the researchers considered the future evolution of Arctic sea-ice cover in a scenario with high future CO2 emissions and little climate protection. As expected, Arctic sea ice disappeared quickly in summer in these simulations. However, the new study finds that Arctic summer sea ice also disappears occasionally if CO2 emissions are rapidly reduced.

"If we reduce global emissions rapidly and substantially, and thus keep global warming below 2 °C relative to preindustrial levels, Arctic sea ice will nevertheless likely disappear occasionally in summer even before 2050. This really surprised us" said Dirk Notz, who leads the sea-ice research group at University of Hamburg, Germany.

Currently, the North Pole is covered by sea ice year round. Each summer the area of the sea-ice cover decreases; in winter it grows again. In response to ongoing global warming, the overall area of the Arctic Ocean that is covered by sea ice has rapidly been reduced over the past few decades. This substantially affects the Arctic ecosystem and climate: The sea-ice cover is a hunting ground and habitat for polar bears and seals, and keeps the Arctic cool by reflecting sunlight.

How often the Arctic will lose its sea-ice cover in the future critically depends on future CO2 emissions, the study shows. If emissions are reduced rapidly, ice-free years only occur occasionally. With higher emissions, the Arctic Ocean will become ice free in most years. Hence, humans still have an impact on how often the Arctic loses its year-round sea-ice cover.

Technical details: The simulations used in this study are based on so-called SSP Scenarios (shared socio-economic pathways), which will also be used for the next IPCC report. Scenarios SSP1-1.9 and SSP1-2.6 are used to simulate a rapid reduction of future CO2 emissions, while scenario SSP5-8.5 is used to simulate largely unchanged future CO2 emissions. The study is based on simulations from the most recent generation of climate models, collected within the Coupled Model Intercomparison Project Phase 6 (CMIP6).

Credit: 
University of Hamburg

Self-aligning microscope smashes limits of super-resolution microscopy

image: A T cell with precise localisation of T cell receptors (pink) and CD45 phosphatase (green).

Image: 
Single Molecule Science, UNSW Sydney

UNSW medical researchers have achieved unprecedented resolution capabilities in single-molecule microscopy to detect interactions between individual molecules within intact cells.

The 2014 Nobel Prize in Chemistry was awarded for the development of super-resolution fluorescence microscopy technology that afforded microscopists the first molecular view inside cells, a capability that has provided new molecular perspectives on complex biological systems and processes.

Now the limit of detection of single-molecule microscopes has been smashed again, and the details are published in the current issue of Science Advances.

While individual molecules could already be observed and tracked with super-resolution microscopy, interactions between these molecules occur at a scale at least four times smaller than that resolved by existing single-molecule microscopes.

"The reason why the localisation precision of single-molecule microscopes is around 20-30 nanometres normally is because the microscope actually moves while we're detecting that signal. This leads to an uncertainty. With the existing super-resolution instruments, we can't tell whether or not one protein is bound to another protein because the distance between them is shorter than the uncertainty of their positions," says Scientia Professor Katharina Gaus, research team leader and Head of UNSW Medicine's EMBL Australia Node in Single Molecule Science.

To circumvent this problem, the team built autonomous feedback loops inside a single-molecule microscope that detects and re-aligns the optical path and stage.

"It doesn't matter what you do to this microscope, it basically finds its way back with precision under a nanometre. It's a smart microscope. It does all the things that an operator or a service engineer needs to do, and it does that 12 times per second," says Professor Gaus.
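The re-alignment idea described above can be sketched as a simple closed-loop correction: repeatedly measure the residual misalignment and command the stage to cancel most of it, until the error sits below a nanometre. This is a hedged illustration only, not the authors' code; `measure_drift` and `move_stage` are hypothetical stand-ins for the real sensor and stage-control calls.

```python
# Hedged sketch of an autonomous re-alignment loop (not the UNSW implementation):
# measure the drift of the optical path/stage, then apply a proportional
# correction, repeating until the residual error is under 1 nm. In the real
# instrument this cycle runs about 12 times per second.

position = [50.0, -30.0]  # current stage misalignment in nanometres (made up)

def measure_drift(pos):
    """Pretend sensor: reports the current misalignment (noiseless here)."""
    return list(pos)

def move_stage(pos, correction, gain=0.8):
    """Apply a proportional correction; gain < 1 keeps the loop stable."""
    return [p - gain * c for p, c in zip(pos, correction)]

steps = 0
while max(abs(p) for p in position) >= 1.0:  # target: sub-nanometre residual
    drift = measure_drift(position)
    position = move_stage(position, drift)
    steps += 1
```

With a gain of 0.8, each pass shrinks the error fivefold, so even a 50 nm disturbance is pulled below a nanometre within a few cycles; a real loop would also contend with sensor noise and new drift arriving between corrections.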

Measuring the distance between proteins

As outlined in the paper, the feedback system designed by the UNSW team is compatible with existing microscopes and affords maximum flexibility for sample preparation.

"It's a really simple and elegant solution to a major imaging problem. We just built a microscope within a microscope, and all it does is align the main microscope. That the solution we found is simple and practical is a real strength as it would allow easy cloning of the system, and rapid uptake of the new technology," says Professor Gaus.

To demonstrate the utility of their ultra-precise feedback single-molecule microscope, the researchers used it to perform direct distance measurements between signalling proteins in T cells. A popular hypothesis in cellular immunology is that these immune cells remain in a resting state when the T cell receptor is next to another molecule that acts as a brake.

Their high precision microscope was able to show that these two signalling molecules are in fact further separated from each other in activated T cells, releasing the brake and switching on T cell receptor signalling.

"Conventional microscopy techniques would not be able to accurately measure such a small change as the distance between these signalling molecules in resting T cells and in activated T cells only differed by 4-7 nanometres," says Professor Gaus.

"This also shows how sensitive these signalling machineries are to spatial segregation. In order to identify regulatory processes like these, we need to perform precise distance measurements, and that is what this microscope enables. These results illustrate the potential of this technology for discoveries that could not be made by any other means."

Postdoctoral researcher, Dr Simao Pereira Coelho, together with PhD student Jongho Baek - who has since been awarded his PhD degree - led the design, development, and building of this system. Dr Baek also received the Dean's Award for Outstanding PhD Thesis for this work.

Credit: 
University of New South Wales

Photonic microwave generation using on-chip optical frequency combs

image: Photograph of the silicon nitride photonic chips used for frequency comb and photonic microwave generation.

Image: 
Junqiu Liu and Jijun He (EPFL)

In our information society, the synthesis, distribution, and processing of radio and microwave signals are ubiquitous in wireless networks, telecommunications, and radars. The current tendency is to use carriers in higher frequency bands, especially with looming bandwidth bottlenecks due to demands from, for example, 5G and the "Internet of Things". "Microwave photonics", a combination of microwave engineering and optoelectronics, might offer a solution.

A key building block of microwave photonics is the optical frequency comb, which provides hundreds of equidistant and mutually coherent laser lines. Frequency combs are ultrashort optical pulses emitted at a stable repetition rate that corresponds precisely to the frequency spacing of the comb lines. Photodetection of the pulse train produces a microwave carrier.
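As a rough numerical sketch of this relation (the offset frequency and comb index below are illustrative, not measurements from the paper): each comb line sits at f_n = f_ceo + n·f_rep, so beating any two adjacent lines on a photodetector yields a microwave tone at exactly the repetition rate f_rep.

```python
# Illustrative comb arithmetic. F_REP matches the ~10 GHz X-band rate from the
# study; F_CEO (carrier-envelope offset) is a hypothetical placeholder value.

F_CEO = 35e6   # carrier-envelope offset frequency, Hz (assumed for illustration)
F_REP = 10e9   # repetition rate / line spacing, Hz (X-band, as in the study)

def comb_line(n: int) -> float:
    """Frequency of the n-th comb line in Hz: f_n = f_ceo + n * f_rep."""
    return F_CEO + n * F_REP

# Find the comb line nearest a 1550 nm telecom carrier (~193.4 THz):
target = 193.4e12
n = round((target - F_CEO) / F_REP)

# Photodetecting adjacent lines gives the microwave carrier at the line spacing:
beat = comb_line(n + 1) - comb_line(n)
```

The key point is that the microwave purity inherits directly from the uniformity of the line spacing, which is why low-noise soliton microcombs translate into low-noise microwave signals.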

In recent years there has been significant progress on chip-scale frequency combs generated from nonlinear microresonators driven by continuous-wave lasers. These frequency combs rely on the formation of dissipative Kerr solitons, which are ultrashort coherent light pulses circulating inside optical microresonators. Because of this, these frequency combs are commonly called "soliton microcombs".

Generating soliton microcombs requires nonlinear microresonators, which can be built directly on-chip using CMOS nanofabrication technology. Co-integration with electronic circuitry and integrated lasers paves the way to comb miniaturization, enabling a host of applications in metrology, spectroscopy and communications.

Publishing in Nature Photonics, an EPFL research team led by Tobias J. Kippenberg has now demonstrated integrated soliton microcombs with repetition rates as low as 10 GHz. This was achieved by significantly lowering the optical losses of integrated photonic waveguides based on silicon nitride, a material already used in CMOS micro-electronic circuits, and which has also been used in the last decade to build photonic integrated circuits that guide laser light on-chip.

The scientists were able to manufacture silicon nitride waveguides with the lowest loss in any photonic integrated circuit. Using this technology, the generated coherent soliton pulses have repetition rates in both the microwave K- (~20 GHz, used in 5G) and X-band (~10 GHz, used in radars).

The resulting microwave signals feature phase noise properties on par with or even lower than commercial electronic microwave synthesizers. The demonstration of integrated soliton microcombs at microwave repetition rates bridges the fields of integrated photonics, nonlinear optics and microwave photonics.

The EPFL team achieved a level of optical losses low enough to allow light to propagate nearly 1 meter in a waveguide that is only 1 micrometer in diameter, about 100 times thinner than a human hair. This loss level is still more than three orders of magnitude higher than that of optical fibers, but it represents the lowest loss in any tightly confining waveguide for integrated nonlinear photonics to date.

Such low loss is the result of a new manufacturing process developed by EPFL scientists - the "silicon nitride photonic Damascene process". "This process, when carried out using deep-ultraviolet stepper lithography, gives truly spectacular performance in terms of low loss, which is not attainable using conventional nanofabrication techniques," says Junqiu Liu, the paper's first author, who also led the fabrication of silicon nitride nanophotonic chips at EPFL's Center of MicroNanoTechnology (CMi). "These microcombs, and their microwave signals, could be critical elements for building fully integrated low-noise microwave oscillators for future architectures of radars and information networks."

The EPFL team is already working with collaborators in the US to develop hybrid-integrated soliton microcomb modules that incorporate chip-scale semiconductor lasers. These highly compact microcombs could impact many applications, e.g. transceivers in datacenters, LiDAR, compact optical atomic clocks, optical coherence tomography, microwave photonics, and spectroscopy.

Credit: 
Ecole Polytechnique Fédérale de Lausanne

IKBFU and Chinese scientists invented a new way of creating meat analogs

image: This is Prof. Dr. Olga Babich.

Image: 
Immanuel Kant Baltic Federal University

Worldwide interest in meat analogs keeps increasing, with the aim of producing cholesterol-free vegetable-protein products containing essential amino acids. Extrusion is the best way to texture vegetable proteins: it is a method of processing raw materials in which secondary bonds in protein molecules are broken down, thus increasing their digestibility.

In recent years, high-humidity extrusion technology has become widely used as it makes it possible to obtain a fibrous meat-like structure from vegetable proteins.

The characteristics of vegetable meat analogs can be changed or improved by adding food additives, such as wheat gluten. Despite a large number of studies aimed at optimizing the extrusion process to adjust the taste and texture of meat analogs, research on adjusting the taste of extruded meat analogs with a high moisture content is not yet complete.

IKBFU scientist Olga Babich, together with Chinese colleagues, has studied the influence of the mass fraction of moisture and the wheat gluten content on characteristics of meat analogs obtained by extrusion, such as retention of volatile aromatic substances, microstructure, moisture distribution, and secondary protein structure. The results of the study were published in the article "Effects of material characteristics on the structural characteristics and flavor substances retention of meat analogs" in the scientific journal Food Hydrocolloids.

The results of the study will be applied in the food industry. The regularities identified during the study will make it possible to control the retention of aromatic substances (esters, aldehydes, alkanes, alkenes, phenols, alcohols) in the technological process for producing extruded meat analogs by creating a suitable microenvironment in the products.

IKBFU Scientist, Olga Babich:

"A decrease in meat consumption has been observed worldwide, due to health concerns, as well as environmental, ethical and social reasons. Therefore, the practical value of our study is undeniable. Switching to a diet based on vegetable proteins means decreasing body weight, cholesterol, and blood pressure, thus reducing the risk of stroke, cardiovascular and oncological diseases. In this regard, there is increasing interest in meat analogs (vegetable meat, or soy meat) obtained by extrusion. Since the second half of the 20th century, low moisture extrusion technology (with

Credit: 
Immanuel Kant Baltic Federal University

With shrinking snowpack, drought predictability melting away

On April 1, water managers across the West use the amount of snowpack present as a part of a simple equation to calculate the available water supply for a given region that year. Historically, this method has accurately predicted whether large areas of the western U.S. will experience drought and to what degree. But new research from CU Boulder suggests that during the 21st century, our ability to predict drought using snow will literally melt away.
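The "simple equation" described above is typically a statistical regression of seasonal water supply on April 1 snowpack. The sketch below illustrates the idea with entirely hypothetical numbers; real forecasts use calibrated, region-specific records of snow-water equivalent (SWE) and streamflow.

```python
# Hedged illustration of a snowpack-based water supply forecast: an ordinary
# least-squares fit of seasonal streamflow on April 1 snow-water equivalent.
# All values are made up for demonstration.

swe = [10.0, 14.0, 8.0, 20.0, 12.0]           # April 1 SWE, inches (hypothetical)
supply = [300.0, 410.0, 250.0, 560.0, 350.0]  # Apr-Jul flow, thousand acre-feet

n = len(swe)
mean_x = sum(swe) / n
mean_y = sum(supply) / n

# Least-squares slope and intercept:
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(swe, supply)) / \
        sum((x - mean_x) ** 2 for x in swe)
intercept = mean_y - slope * mean_x

def forecast(swe_apr1: float) -> float:
    """Predicted seasonal water supply from April 1 snowpack alone."""
    return intercept + slope * swe_apr1
```

The study's point is that as less precipitation falls as snow, the predictor on the right-hand side carries less information, so a fit like this explains less of the year-to-year variation in supply no matter how carefully it is calibrated.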

By mid-century, over two-thirds of western U.S. states that depend on snowmelt as a water source will see a significant reduction in their ability to predict seasonal drought using snowpack, according to the new study out today in Nature Climate Change. As we approach 2100, this area impacted by reduced drought prediction ability will increase to over 80%.

While measurements of soil moisture, rainfall and temperature can all help assess the chances of coming drought, even when those are taken into consideration, two-thirds of western states are projected to lose much of their ability to predict it.

"Although these other measurements increase a forecast's accuracy, the loss of snow is something that we're not going to be able to compensate for easily," said Ben Livneh, author of the paper and a Fellow in the Cooperative Institute for Research in Environmental Sciences (CIRES).

Snowpack is a crucial source of water for the western U.S., where as much as 75% of freshwater originates as snow. It is also the most relied upon element of annual drought prediction in the region.

Coastal areas that receive water from nearby snowy mountains, like northern California, and regions at lower elevations, like the Washington Cascade Mountains, will be most affected. This is because less precipitation will fall as snow in these areas, and they will lose their snow sooner as temperatures warm.

Higher elevations, including the Colorado and Northern Rocky Mountains, will keep their snowpack for longer and be able to continue relying on it as part of their predictive equations. But by the end of the century, even Colorado will not be immune to losing significant snowpack, and therefore, losing accuracy in its seasonal drought prediction.

"If you don't accurately predict a year without drought, there's less impact," said Livneh, an assistant professor of Civil, Environmental and Architectural Engineering. "But there is so much to lose in a drought year by not being prepared for it."

The point of prediction

The paper is the first to assess what vanishing snowpack might mean for future drought predictability.

Using 28 climate models covering critical water-producing areas of the mountainous western U.S., Livneh and co-author Andrew Badger, formerly at CIRES and now an associate scientist in the Hydrological Sciences Lab at the NASA Goddard Space Flight Center, simulated snowpack, meltwater, streamflow, water storage and evaporation. They calibrated these models more than 20 times against historical data from 1950 to the present, to see whether they could accurately predict how snowpack impacted streamflow in the past before applying the models to the future. Once satisfied with the models, they ran them out to 2100.

The researchers found that further into the future, snowpack alone became less and less accurate at predicting drought due to the reduction, and eventually the complete loss, of snowpack at many lower elevations. Between 2035 and 2065, 69% of the western U.S. will see a reduction in accurate seasonal drought prediction on the basis of snow information, with the affected area increasing to 83% of the greater West between 2070 and 2099.

This reduction in drought prediction ability will affect everything from agriculture and drinking water supplies, to hydropower and flood control. It might increase our reliance on reservoirs, which could fill at different times of year and complicate how cities and states receive their water.

Regions that rely primarily on snow for drought prediction should look not only to other methods, but also to nearby places that observe snow at higher elevations, Livneh recommends.

The researchers hope to directly work with regional water managers in Colorado, which will be less affected, as well as those in the Pacific Northwest - which may see some of the biggest impacts of lost snowpack on drought predictability - to plan and adjust to this quickly changing equation.

"This is one way in which the connection to climate change is very clear, and the changing snow landscape has a major impact. Our drinking water, our water supply, for example, is something we take for granted," said Livneh. "That's something people should think about: Is that always going to be the case?"

Credit: 
University of Colorado at Boulder

Almost half of all postpartum psychoses are isolated cases

A new research result from iPSYCH shows that forty per cent of the women who suffer a psychosis after giving birth - known as postpartum psychosis - do not subsequently become ill again.

Out of every thousand mothers, one or two will suffer a postpartum psychosis, but psychological vulnerability in connection with childbirth does not necessarily follow them through the rest of their lives. This is shown in a research project involving partners from iPSYCH.

"Almost half of the women who suffer a postpartum psychosis don't become ill again, unless they give birth again. That's to say that these women have a psychological vulnerability that is precisely related to the birth of a child, but at other times in their lives they don't have symptoms of psychiatric disorders and therefore don't require medicinal treatment outside the postpartum period," says Trine Munk-Olsen, who is one of the researchers behind the study.

The researchers refer to these cases as isolated postpartum psychoses.

The new study is based on a systematic literature review and a meta-analysis of published articles within the field, and the results have just been published in the international journal, Journal of Clinical Psychiatry.

Good news for these women

According to Trine Munk-Olsen, women with isolated postpartum psychosis could probably do without treatment for psychiatric disorders - though, of course, with the exception of the period immediately after childbirth.

"The results are also particularly interesting for psychiatrists who are planning the treatment of women with postpartum psychosis, once the acute phase of the disorder is over. This is when decisions about more long-term treatment must be made, and if we're able to identify the women who have an isolated postpartum psychosis, it's possible that these women will be able to discontinue medicinal treatment," she says.

The researcher would like to follow up the study by identifying specific genetic characteristics for isolated postpartum psychosis.

"If we can learn more about why some women have a psychological vulnerability that is particularly associated with childbirth, then we can move closer to finding the cause of psychiatric disorders for this group of women, and thus learning more about the causes of psychiatric disorders in general," explains Trine Munk-Olsen.

Credit: 
Aarhus University

Wind turbine noise affects dream sleep and perceived sleep restoration

image: This is Kerstin Persson Waye, Professor of Environmental Medicine at Sahlgrenska Academy, University of Gothenburg.

Image: 
Photo by University of Gothenburg

Wind turbine noise (WTN) influences people's perception of the restorative effects of sleep, and also has a small but significant effect on dream sleep, otherwise known as REM (rapid eye movement) sleep, a study at the University of Gothenburg, Sweden, shows. A night of WTN resulted in delayed and shortened REM sleep.

Knowledge of how sleep is affected by WTN has been limited to date. Research involving physiological study of its impact using polysomnography, the top-ranking method of sleep recording, is lacking.

Studies carried out in the Sound Environment Laboratory at the Department of Occupational and Environmental Medicine in Gothenburg are adding new knowledge in the field. Polysomnography involves using electrodes attached to the head and chest to record brain activity, eye movement, heart rate, etc. during sleep.

Of the 50 participants in the new study, 24 had been living within one kilometer of one or more wind turbines for at least one year. The other 26, the reference group, did not live near wind turbines.

Kerstin Persson Waye, Professor of Environmental Medicine at Sahlgrenska Academy, University of Gothenburg, is the corresponding author in the study, published in the journal Sleep.

"We wanted to find out whether people exposed to noise from wind turbines over time become more sensitive or more habituated to WTN, so that their sleep may be affected differently than someone who doesn't live near any turbines," she says.

The participants spent three nights in the Sound Environment Laboratory, one for acclimatization and then, in a random order, one quiet night and one with four separate periods of WTN. The sounds used were modeled on outdoor measurements from several wind turbines and were filtered to correspond to the sound insulation of a typical Swedish wooden house. Exposure was further modeled to correspond to sleeping with a closed window and with a window ajar, respectively.

The sounds were chosen to represent relatively unfavorable conditions, with a slightly higher average outdoor noise level than is currently permitted in Sweden. This level corresponded, however, with a low indoor noise level -- below the levels at which sleep had previously been found to be affected by, for example, traffic noise.

During the night with WTN, according to the physiological measures, the participants spent an average of 11.1 minutes less in REM sleep, which they entered 16.8 minutes later than during the quiet night. The proportion of time they spent in REM sleep was 18.8% for the night with WTN, compared with 20.6% for the quiet night -- a small but statistically significant difference that, moreover, was independent of habituation to WTN.

There were no statistically significant differences in other sleep parameters, such as number of awakenings, total sleep time, time in deeper (non-REM) sleep stages or fragmentation of deep sleep, and heart rate. However, rhythmic sound variations appeared to disturb sleep, especially with closed windows.

Besides the physiologically based measurements, participants filled out a questionnaire on their sleep quality and how tired or rested they felt. Both groups reported that they slept worse during nights with WTN.

The study gave no indication of a habituation effect or of increased sensitivity in the participants exposed to wind turbines in their home environment. However, the group living close to wind turbines reported worse sleep overall, even during the quiet night.

"Sleep disturbance, a negative health effect according to the World Health Organization (WHO), can in itself contribute to chronic diseases. However, we can't draw conclusions from this study on long-term health impact. Further studies should, if possible, investigate sleep in people's home environments and include longer exposure time," Kerstin Persson Waye concludes.

Credit: 
University of Gothenburg

Aquaculture at the crossroads of global warming and antimicrobial resistance

image: The index is calculated as the ratio between the number of antibiotics to which an isolate is resistant and the total number of antibiotics to which it was exposed (an index >0.2 indicates high antibiotic contamination). The map shows high antimicrobial resistance indexes for 40 countries that account for 93% of global aquacultural production. This is the case for Indonesia (index: 0.355) and China (index: 0.325), for instance.

Image: 
© Miriam Reverter et al., Nature Communications.

Aquaculture - rearing aquatic organisms such as fish and shellfish - plays a vital role in food security in many countries (it supplies more than half of the aquatic animals consumed by humans worldwide). It is particularly important for developing countries, for instance in Asia, which accounts for 90% of global output.

Fish farmers use large quantities of antimicrobials to treat or prevent disease on their farms. However, when used inappropriately, antimicrobials are ineffective and foster the development of resistant bacteria.

An index to assess the risks of antimicrobial resistance in aquaculture

Researchers from IRD and CIRAD belonging to the Institute of Evolution Sciences of Montpellier's Fish Diversity and Aquaculture team (DIVA, UMR ISEM) examined data from more than 400 scientific articles referring to over 10,000 bacteria of aquacultural origin from 40 countries. This meta-analysis allowed them to study the effect of temperature on the mortality rate of aquatic animals infected with pathogenic bacteria commonly found in aquaculture. They then conducted a systematic review of the abundance of resistant bacteria found on fish farms and calculated the Multi-Antibiotic Resistance (MAR) index for 40 countries.
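The MAR index itself is a simple ratio, as the figure caption describes: the number of antibiotics an isolate resists divided by the number it was tested against, with values above 0.2 read as indicating high antibiotic contamination. A minimal sketch, using hypothetical isolate counts (the per-country indexes quoted elsewhere in the article come from the study's own data, not from these numbers):

```python
# Multi-Antibiotic Resistance (MAR) index, per the formula in the figure
# caption. The isolate counts used below are hypothetical examples.

def mar_index(n_resistant: int, n_tested: int) -> float:
    """MAR index = antibiotics resisted / antibiotics tested against."""
    if n_tested <= 0 or not 0 <= n_resistant <= n_tested:
        raise ValueError("need 0 <= n_resistant <= n_tested and n_tested > 0")
    return n_resistant / n_tested

# A hypothetical isolate resistant to 5 of the 14 antibiotics it was exposed to:
idx = mar_index(5, 14)
high_contamination = idx > 0.2  # threshold used in the study
```

An index aggregated over many isolates from one country then gives the country-level values shown on the study's map.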

Global warming is partly responsible

"Our results show that global warming promotes the development of pathogenic bacteria, hence disease development on fish farms", Rodolphe Gozlan, an IRD specialist in biodiversity-health relations, explains.

Aquatic bacteria are indeed temperature-sensitive. "Global warming will therefore push up mortality rates on fish farms, which is likely to mean increased antibiotic use", says Miriam Reverter, a post-doctoral researcher at IRD. As the study showed, antimicrobial resistance is already a reality in several countries that are highly vulnerable to climate change.

A threat to animal and human health

The study's authors raise the alarm about the consequences of inappropriate antibiotic use, for both the sustainability of aquaculture and human health. "Resistant bacteria in aquaculture can either spread or transmit their resistance genes to non-resistant bacteria that infect humans, thus causing diseases that are difficult to treat in both animals and humans", Samira Sarter, a microbiologist with CIRAD, explains.

These health risks linked to antibiotic use are not restricted to aquaculture. They also apply to terrestrial farms. "Some 60% of the infectious diseases that currently affect humans are of animal origin. If a resistant bacterium or its genes were to be transmitted to humans, and existing antibiotics were ineffective, we could face a steep rise in mortality rates as a result of antimicrobial resistance."

Finding alternatives to antibiotics

"We urgently need to help producers in the global South find alternatives to treat and prevent disease on fish farms. This means encouraging research that makes use of the One Health or EcoHealth approaches, i.e. that is multi-disciplinary and multi-sector", Rodolphe Gozlan stresses. Work has shown that certain plants are highly effective for boosting disease immunity in fish. Their use on fish farms could help reduce antibiotic use. Alongside this, researchers are also working to develop more resilient aquaculture systems based on the principles of agroecology, with the aim of reducing disease rates.

Credit: 
Institut de recherche pour le développement