Ion-selective smart porous membranes

image: The Langmuir-Blodgett technique for preparing a polymer nanosheet - a precursor for smart porous membranes

Image: 
Yuya Ishizaki

A research group has developed an ion-selective smart porous membrane that can respond to external stimuli, potentially paving the way for new applications in molecular separation and sensing.

Porous thin films have attracted the attention of scientists because of their potential use in sensors, energy harvesting, and ion/molecular separation.

Nanostructure properties, such as pore size, thickness, and film density, affect molecular selectivity and molecular permeability. Surface properties also have a significant impact on molecular selectivity.

Thus it is important to be able to control both the 3D nanostructures and surface properties of ultrathin porous films.

Previous research shed light on smart porous membranes, which are covered with molecules that can respond to external stimuli such as light, temperature, and pH. Yet applying this approach to porous thin films less than 10 nm thick has proved immensely challenging for scientists.

"In our study, we succeeded in developing responsive porous SiO2 thin films with an extremely thin film thickness of 8 nm and a surface uniformly covered with a pH-responsive silane coupling agent," said Yuya Ishizaki from the Graduate School of Engineering at Tohoku University and co-author of the study. "The responsive porous thin film can adjust its surface charge depending on the pH of the solution, resulting in selective ion permeation."

To prepare porous films with structures controlled to nanometer-scale accuracy, the research group focused on polymer thin films containing silsesquioxanes, which have unique cage structures.

The polymer films were fabricated using the Langmuir-Blodgett technique, chosen because it provides molecular-scale control over film thickness. Langmuir-Blodgett polymer nanosheets also make it possible to fabricate porous SiO2 thin films with controlled nanostructures by simple UV-light irradiation under ambient conditions.

"We plan to develop highly efficient separation membranes and sensing materials that take advantage of the extremely thin film thickness and controlled surface properties in the future," added Ishizaki.

Credit: 
Tohoku University

Antarctic ice sheet retreat could trigger chain reaction

image: Today, Antarctic winds usually blow from the continent out to the sea

Image: 
Svein Østerhus

The Antarctic ice sheet was even more unstable in the past than previously thought, and at times possibly came close to collapse, new research suggests.

The findings raise concerns that, in a warmer climate, exposing the land underneath the ice sheet as it retreats will increase rainfall on Antarctica, and this could trigger processes that accelerate further ice loss.

The research is based on climate modelling and data comparisons for the Middle Miocene (13-17 million years ago) when atmospheric carbon dioxide and global temperatures reached levels similar to those expected by the end of this century.

The study was carried out by the Met Office, the universities of Exeter, Bristol, Cardiff and Stockholm, NORCE and the Bjerknes Centre for Climate Research.

"When an ice sheet melts, the newly exposed ground beneath is less reflective, and local temperatures become warmer," said lead author Dr Catherine Bradshaw, of the Met Office and the Global Systems Institute at the University of Exeter.

"This can dramatically change weather patterns.

"With a big ice sheet on the continent like we have today, Antarctic winds usually blow from the continent out to the sea.

"However, if the continent warms this could be reversed, with the winds blowing from the cooler sea to the warmer land - just as we see with monsoons around the world.

"That would bring extra rainfall to the Antarctic continent, causing more freshwater to run into the sea.

"Freshwater is less dense than saltwater and so it can sit on the sea surface, rather than sinking and circulating as saltwater does.

"This effectively breaks the connection between the deep ocean and the surface ocean, causing warmer water to accumulate at depth."

The study suggests that the processes triggered by increasing rainfall would reduce the ability of the climate system to maintain a large Antarctic ice sheet.

"Essentially, if more land is exposed in Antarctica, it becomes harder for a large ice sheet to reform, and without favourable orbital positions in the Middle Miocene playing a role, perhaps the ice sheet would have collapsed at that time," Dr Bradshaw said.

During the warm Middle Miocene period, unusually large swings back and forth in deep-sea temperatures were recorded.

The study shows that fluctuations in the area covered by the ice sheet were a major factor in causing deep-sea temperatures to change so dramatically. Fluctuations in the volume of ice were found to be of much less importance.

Variations in the positioning of the Earth relative to the Sun caused the ice sheet to advance and retreat, and this altered weather patterns - triggering processes that can accelerate ice loss or gain.

Rain falling on the ice sheet can cause fracturing, surface melt and extra freshwater running off the continent, which, in turn, can cause deep-sea temperatures to rise - potentially influencing Antarctic ice from beneath.

The findings of the new study suggest that the Antarctic ice sheet retreated significantly during the Middle Miocene, then stabilised when the warm period ended.

Co-author Associate Professor Agatha De Boer, from the University of Stockholm, said: "When the Middle Miocene climate cooled, the link we have found between the area of the ice sheet and the deep-sea temperatures via the hydrological cycle came to an end.

"Once Antarctica was fully covered by the ice sheet, the winds would always blow from the land to the sea, and as a result rainfall would have been reduced to the low levels we see today, falling as snow over the continent."

Dr Petra Langebroek, a Senior Researcher from NORCE and the Bjerknes Centre for Climate Research, another co-author, added: "These findings imply that a shift in ocean sensitivity to ice sheet changes occurs when ice sheet retreat exposes previously ice-covered land."

Professor Carrie Lear, from Cardiff University, who first devised the project, concluded: "This study suggests that during a warm period about 15 million years ago, the Miocene Antarctic ice sheet was capable of major advance and retreat across the continent.

"This is concerning, but further research is needed to determine exactly what this means for the long-term future of the modern Antarctic ice sheet."

Dr Bradshaw stressed that conditions now are not identical to those in the Middle Miocene, and the model used in the study does not include the impact of feedbacks from the carbon cycle or the ice sheet itself.

Credit: 
University of Exeter

A new polarized fluorescent probe for revealing architectural dynamics of living cells

video: Time-lapse movies of polarized fluorescence imaging during the first cleavage of a starfish embryo expressing POLArIS, which binds specifically to F-actin in a rotationally constrained manner (the same egg as in Fig. 2A). In the left movie, magenta indicates horizontally aligned actin filaments and green indicates vertically aligned ones. Right: the same movie with enhanced contrast.

Image: 
Department of Neuroanatomy and Cellular Neurobiology, TMDU

Researchers from Tokyo Medical and Dental University (TMDU), collaborating with scientists from the Marine Biological Laboratory (MBL) and RIKEN, have developed a novel technique for live-cell fluorescent imaging, leading them to discover a new actin structure in early starfish embryos.

Tokyo, Japan - Monitoring the alignment of the building blocks of cells is important for understanding how cells are built. By collaborating with imaging scientists at the MBL, researchers from Japan have developed a new probe, called POLArIS, that allows real-time imaging of molecular orientations in living cells.

A fluorophore emits polarized light as it glows. The orientation of polarized fluorescence is closely related to the orientation of the fluorophore. If a molecule of interest is rigidly connected with a fluorescent tag such as Green Fluorescent Protein (GFP), the polarized fluorescence from the fluorophore reports the orientation of the molecule.

"In previous approaches for monitoring the orientation of a protein of interest, researchers needed to develop effective constrained GFP-tagging methods, which might differ for each protein of interest," says one of the lead authors of the study, Ayana Sugizaki. "POLArIS uses an antibody-like binder that is rigidly connected with GFP, allowing both specific and versatile constrained labeling," adds another lead author, Keisuke Sato.

The team used a commercially available Adhiron molecule (now rebranded as "Affimer") as the binder linking GFP to a target protein, and developed POLArIS by connecting Adhiron and GFP in a rotationally constrained manner (Figure 1). Because an Adhiron that specifically binds a molecule of interest can easily be selected from a library through phage display screening, POLArIS can be designed for any biological molecule of interest. POLArIS can be expressed in specific cell types and organelles, and will be useful for studying the architectural dynamics of molecular assemblies in a broad range of cell cultures, tissues, and whole organisms.

"From the point of view of fluorescence polarization imaging, POLArIS has significant advantages because of its genetically encoded nature," says Tomomi Tani, a Senior Researcher at the National Institute of Advanced Industrial Science and Technology (AIST) in Japan, who has been part of this project since his time as an Associate Scientist at the MBL.

By using the probe for actin, the team uncovered the transient emergence and dissolution of a highly ordered F-actin architecture, which they named the FLARE structure, in dividing cells of starfish embryos (Figure 2 and Movie 1). "We found that the structure extends up to the cell cortex in association with the astral microtubules," says corresponding author Sumio Terada, who had frequently visited the MBL from Tokyo together with his colleagues at TMDU. "The astral microtubules are responsible for connecting the spindle to the cell cortex and orienting it correctly, controlling the plane of cell division." The mechanism that determines the cell division plane is key to controlling many aspects of development, yet remains a crucial mystery. The discovery of radially aligned actin architectures will shed light on one of the most fundamental unanswered questions of cell biology.

Credit: 
Tokyo Medical and Dental University

Scientists show immune cells change behavior unexpectedly to instigate psoriasis lesions

Millions of people suffer from psoriasis, a chronic, autoimmune disorder that causes scaly patches on the skin and often precedes psoriatic arthritis. While no cure exists, treatments range from topical creams to injected medications that block inflammation. To improve treatment options, scientists need to better understand the dysregulation of the immune system that leads to these lesions.

Using advanced computational genomic analysis of immune cells from mouse models, a researcher at the Pritzker School of Molecular Engineering (PME) at the University of Chicago and her collaborators discovered that, when exposed to a trigger, certain kinds of immune cells change their behavior in unexpected ways to produce the protein signals that cause lesions.

The research, co-led by Asst. Prof. Samantha Riesenfeld, reveals new pathways underlying immune responses and ultimately could lead to better treatment for the disease.

The results were published recently in the journal Nature.

Understanding how immune cells behave

The researchers, who include collaborators at Yale University and the Broad Institute of MIT and Harvard, set out to better understand innate lymphoid cells (ILCs), immune cells that reside in barrier tissues such as the skin and the lining of the gut. Though not as numerous as T-cells - which play a central role in the body's adaptive immune response - ILCs rapidly sense, integrate, respond to, and propagate signals, thereby modulating downstream immune responses.

Previous studies observed a specific type of ILC in skin lesions and proposed its importance in driving psoriasis, but the origins of these ILCs and their role remained unclear. To pin down that role, the group used a combination of advanced experimental and computational approaches.

Experimentally, the researchers stimulated skin inflammation in mouse models using interleukin-23 (IL-23) - a cytokine implicated in causing psoriatic lesions. Even in mice lacking all T-cells, ILCs could still drive psoriasis. The scientists isolated thousands of ILCs from the skin of mice over the course of disease induction and profiled the gene expression of these cells individually using single-cell RNA-sequencing.

Riesenfeld and her collaborators then used machine learning techniques on this high-dimensional gene expression data to quantitatively model the behavior of ILCs before and in response to IL-23 treatment.

They found that ILCs were engaged in a spectrum of activities, unconstrained by previously identified roles. Their models suggest that ILCs across this spectrum have a plasticity that allows them to respond to IL-23 signaling by changing programmed actions, normally considered a stable part of their identities, to produce the pathogenic cytokines that induce skin lesions. The scientists experimentally validated these predictions using genetically altered mice that allowed the tracking of ILC fates.

"These findings tell us something new both about how psoriasis arises and about how an inflammatory trigger can change the behavior of immune cells," Riesenfeld said. "We now think about the identities of ILCs as more flexible and less predetermined than we used to. That is, cells that are predisposed to play one role may do something very different under duress."

A new approach to understanding the immune system

The combined experimental and computational approach can be used to characterize not just one gene or protein, but whole transcriptional programs of individual immune cells, which can, with the appropriate analysis, offer valuable insights into immune response patterns.

"Understanding how heterogeneous cells integrate and are transformed by immune signals is central to addressing fundamental health-related questions, such as why one person responds to a stimulus with an inflammatory reaction, while another tolerates it," Riesenfeld said.

Delving into a basic understanding of how these cells work, this research suggests that better psoriasis treatment could ultimately involve blocking these early responder cells from becoming pathogenic.

Credit: 
University of Chicago

Researchers 3D print complex micro-optics with improved imaging performance

image: In tests of the new lenses, the reference lens (left) shows color seams due to chromatic aberrations. The 3D printed achromat lenses (middle) reduced these drastically while images taken with the apochromat (right) completely eliminated the color distortion.

Image: 
Michael Schmid, University of Stuttgart

WASHINGTON -- In a new study, researchers have shown that 3D printing can be used to make highly precise and complex miniature lenses with sizes of just a few microns. The microlenses can be used to correct color distortion during imaging, enabling small and lightweight cameras that can be designed for a variety of applications.

"The ability to 3D print complex micro-optics means that they can be fabricated directly onto many different surfaces such as the CCD or CMOS chips used in digital cameras," said Michael Schmid, a member of the research team from University of Stuttgart in Germany. "The micro-optics can also be printed on the end of optical fibers to create very small medical endoscopes with excellent imaging quality."

In The Optical Society (OSA) journal Optics Letters, researchers led by Harald Giessen detail how they used a type of 3D printing known as two-photon lithography to create lenses that combine refractive and diffractive surfaces. They also show that combining different materials can improve the optical performance of these lenses.

"3D printing of micro-optics has improved drastically over the past few years and offers a design freedom not available from other methods," said Schmid. "Our optimized approach for 3D printing complex micro-optics opens many possibilities for creating new and innovative optical designs that can benefit many research fields and applications."

Pushing the limits of 3D printing

Two-photon lithography uses a focused laser beam to solidify, or polymerize, a liquid light-sensitive material known as photoresist. The optical phenomenon known as two-photon absorption allows cubic micrometer volumes of the photoresist to be polymerized, which enables fabrication of complex optical structures on the micron scale.

The research team has been investigating and optimizing micro-optics made with two-photon lithography for the past 10 years. "We noticed that color errors known as chromatic aberrations were present in some of the images created with our micro-optics, so we set out to design 3D printed lenses with improved optical performance to reduce these errors," said Schmid.

Chromatic aberrations occur because the way that light bends, or refracts, when it enters a lens depends on the color, or wavelength, of the light. This means that without correction, red light will focus to a different spot than blue light, for example, causing fringes or color seams to appear in images.
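To make this dispersion effect concrete, here is a minimal sketch - not the authors' design code, and with purely hypothetical Cauchy coefficients and lens curvatures for a polymer-like material - combining Cauchy's dispersion formula with the thin-lens lensmaker's equation to show blue light focusing shorter than red:

```python
# Illustrative sketch of chromatic aberration (all values hypothetical).
A, B = 1.54, 0.0045  # assumed Cauchy coefficients; B in um^2

def refractive_index(wavelength_um):
    """Cauchy approximation: n(lambda) = A + B / lambda^2."""
    return A + B / wavelength_um**2

def focal_length_mm(wavelength_um, R1_mm=1.0, R2_mm=-1.0):
    """Thin-lens lensmaker's equation: 1/f = (n - 1)(1/R1 - 1/R2)."""
    n = refractive_index(wavelength_um)
    return 1.0 / ((n - 1.0) * (1.0 / R1_mm - 1.0 / R2_mm))

f_blue = focal_length_mm(0.45)  # 450 nm
f_red = focal_length_mm(0.65)   # 650 nm
print(f_blue, f_red)  # blue focuses shorter than red: the color seams
```

The micron-scale separation between these focal points is exactly what the achromatic and apochromatic designs described below are built to cancel.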

The researchers designed miniature versions of lenses traditionally used to correct for chromatic aberrations. They began with an achromatic lens, which combines a refractive and diffractive component to limit the effects of chromatic aberration by bringing two wavelengths into focus on the same plane. The researchers used a commercially available two-photon lithography instrument made by NanoScribe GmbH to add a diffractive surface to a printed smooth refractive lens in one step.

They then took this a step further, designing an apochromatic lens by combining the refractive-diffractive lens with another lens made from a different photoresist with different optical properties. Topping the two-material lens with the refractive-diffractive surface reduces chromatic aberrations even further, improving imaging performance. The design was performed by Simon Thiele from the Institute of Technical Optics in Stuttgart, who recently spun out the company PrintOptics, which gives customers access to the entire value chain, from design through prototyping to series production of micro-optical systems.

Testing the micro-optics

To show that the new apochromatic lens could reduce chromatic aberration, the researchers measured the focal spot location for three wavelengths and compared them to a simple refractive lens with no color correction. While the reference lens with no chromatic correction showed focal spots separated by many microns, the apochromatic lenses exhibited focal spots that aligned within 1 micron.

The researchers also used the lenses to acquire images. Images taken using the simple reference lens showed strong color seams. The 3D printed achromat reduced these drastically, but only images taken with the apochromat eliminated the color seams completely.

"Our test results showed that the performance of 3D printed micro-optics can be improved and that two-photon lithography can be used to combine refractive and diffractive surfaces as well as different photoresists," said Schmid.

The researchers point out that fabrication time will become faster in the future, which makes this approach more practical. It currently can take several hours to create one micro-optical element, depending on size. As the technology continues to mature, the researchers are working to create new lens designs for different applications.

Credit: 
Optica

Mixing massive stars

Astronomers commonly refer to massive stars as the chemical factories of the Universe. They generally end their lives in spectacular supernovae, events that forge many of the elements on the periodic table. How elemental nuclei mix within these enormous stars has a major impact on our understanding of their evolution prior to their explosion. It also represents the largest uncertainty for scientists studying their structure and evolution.

A team of astronomers led by May Gade Pedersen, a postdoctoral scholar at UC Santa Barbara's Kavli Institute for Theoretical Physics, has now measured the internal mixing within an ensemble of these stars using observations of waves from their deep interiors. While scientists have used this technique before, this paper marks the first time it has been applied to such a large group of stars at once. The results, published in Nature Astronomy, show that the internal mixing is very diverse, with no clear dependence on a star's mass or age.

Stars spend the majority of their lives fusing hydrogen into helium deep in their cores. However, the fusion in particularly massive stars is so concentrated at the center that it leads to a turbulent convective core similar to a pot of boiling water. Convection, along with other processes like rotation, effectively removes helium ash from the core and replaces it with hydrogen from the envelope. This enables the stars to live much longer than otherwise predicted.

Astronomers believe this mixing arises from various physical phenomena, like internal rotation and internal seismic waves in the plasma excited by the convecting core. However, the theory has remained largely unconstrained by observations as it occurs so deep within the star. That said, there is an indirect method of peering into stars: asteroseismology, the study and interpretation of stellar oscillations. The technique has parallels to how seismologists use earthquakes to probe the interior of the Earth.

"The study of stellar oscillations challenges our understanding of stellar structure and evolution," Pedersen said. "They allow us to directly probe the stellar interiors and make comparisons to the predictions from our stellar models."

Pedersen and her collaborators from KU Leuven, the University of Hasselt, and the University of Newcastle have been able to derive the internal mixing for an ensemble of such stars using asteroseismology. This is the first time such a feat has been achieved, and it was possible only thanks to a new sample of 26 slowly pulsating B-type stars with identified stellar oscillations from NASA's Kepler mission.

Slowly pulsating B-type stars are between three and eight times more massive than the Sun. They expand and contract on time scales of the order of 12 hours to 5 days, and can change in brightness by up to 5%. Their oscillation modes are particularly sensitive to the conditions near the core, Pedersen explained.

"The internal mixing inside stars has now been measured observationally and turns out to be diverse in our sample, with some stars having almost no mixing while others reveal levels a million times higher," Pedersen said. The diversity turns out to be unrelated to the mass or age of the star. Rather, it's primarily influenced by the internal rotation, though that is not the only factor at play.

"These asteroseismic results finally allow astronomers to improve the theory of internal mixing of massive stars, which has so far remained uncalibrated by observations coming straight from their deep interiors," she added.

The precision at which astronomers can measure stellar oscillations depends directly on how long a star is observed. Increasing the time from one night to one year results in a thousand-fold increase in the measured precision of oscillation frequencies.
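The scaling behind that statement is the Rayleigh resolution criterion, which says the smallest frequency difference a time series can resolve is roughly the inverse of its length, Δf ≈ 1/T. A minimal sketch (assuming roughly eight hours of usable observing in one night, a number chosen for illustration):

```python
# Rayleigh criterion: frequency resolution ~ 1 / (observing time span).
SECONDS_PER_HOUR = 3600.0

def frequency_resolution_hz(observing_time_s):
    """Smallest resolvable frequency spacing for a time series of this length."""
    return 1.0 / observing_time_s

one_night = 8 * SECONDS_PER_HOUR              # assumed ~8 h of observing
one_year = 365.25 * 24 * SECONDS_PER_HOUR     # continuous year-long light curve

gain = frequency_resolution_hz(one_night) / frequency_resolution_hz(one_year)
print(round(gain))  # ~1096: roughly the thousand-fold improvement cited
```

This is why Kepler's four uninterrupted years on one field (mentioned below) made such precise oscillation measurements possible.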

"May and her collaborators have really shown the value of asteroseismic observations as probes of the deep interiors of stars in a new and profound way," said KITP Director Lars Bildsten, the Gluck Professor of Theoretical Physics. "I am excited to see what she finds next."

The best data currently available for this comes from the Kepler space mission, which observed the same patch of the sky for four continuous years. The slowly pulsating B-type stars were the highest mass pulsating stars that the telescope observed. While most of these are slightly too small to go supernova, they do share the same internal structure as the more massive stellar chemical factories. Pedersen hopes insights gleaned from studying the B type stars will shed light on the inner workings of their higher mass, O type counterparts.

She plans to use data from NASA's Transiting Exoplanet Survey Satellite (TESS) to study groups of oscillating high-mass stars in OB associations. These groups comprise 10 to more than 100 massive stars between 3 and 120 solar masses. Stars in OB associations are born from the same molecular cloud and share similar ages, she explained. The large sample of stars, and constraint from their common ages, provides exciting new opportunities to study the internal mixing properties of high-mass stars.

In addition to unveiling the processes hidden within stellar interiors, research on stellar oscillations can also provide information on other properties of the stars.

"The stellar oscillations not only allow us to study the internal mixing and rotation of the stars, but also to determine other stellar properties such as mass and age," Pedersen explained. "While these are two of the most fundamental stellar parameters, they are also some of the most difficult to measure."

Credit: 
University of California - Santa Barbara

Study finds low sugar metabolite associates with disability, neurodegeneration in MS

image: Michael Demetriou, MD, PhD, FRCP(C), professor of neurology, microbiology and molecular genetics at UCI School of Medicine, is senior author on a new study that found that low serum levels of the sugar N-acetylglucosamine (GlcNAc) are associated with progressive disability and neurodegeneration in multiple sclerosis (MS).

Image: 
UCI School of Medicine

Irvine, CA - May 13, 2021 - A new University of California, Irvine-led study finds that low serum levels of the sugar N-acetylglucosamine (GlcNAc) are associated with progressive disability and neurodegeneration in multiple sclerosis (MS).

The study, done in collaboration with researchers from Charité - Universitätsmedizin Berlin, Germany, and the University of Toronto, Canada, is titled "Association of a Marker of N-Acetylglucosamine With Progressive Multiple Sclerosis and Neurodegeneration" and was published this week in JAMA Neurology.

The study suggests that GlcNAc, which has previously been shown to promote remyelination and suppress neurodegeneration in animal models of MS, is reduced in the serum of patients with progressive MS and in those with worse clinical disability and neurodegeneration.

"We found that serum levels of a marker of GlcNAc were markedly reduced in progressive MS patients compared to healthy controls and patients with relapsing-remitting multiple sclerosis," explained Michael Demetriou, MD, PhD, FRCP(C), professor of neurology, microbiology and molecular genetics at UCI School of Medicine, and senior author on the paper.

First author of the study, Alexander Brandt, MD, adjunct associate professor of neurology at the UCI School of Medicine and previously associated with the Experimental and Clinical Research Center, Charité - Universitätsmedizin Berlin and Max Delbrueck Center for Molecular Medicine, Germany, added, "Lower GlcNAc serum marker levels correlated with multiple measures of neurodegeneration in MS, namely worse expanded disability status scale scores, lower thalamic volume, and thinner retinal nerve fiber layer. Also, low baseline serum levels correlated with a greater percentage of brain volume loss at 18 months," he said.

GlcNAc regulates protein glycosylation, a fundamental process that decorates the surface of all cells with complex sugars. Previous preclinical, human genetic and ex vivo human mechanistic studies revealed that GlcNAc reduces proinflammatory immune responses, promotes myelin repair, and decreases neurodegeneration. Combined with the new findings, the data suggest that GlcNAc deficiency may promote progressive disease and neurodegeneration in patients with MS. However, additional human clinical studies are required to confirm this hypothesis.

"Our findings open new potential avenues to identify patients at risk of disease progression and neurodegeneration, so clinicians can develop and adjust therapies accordingly," said Michael Sy, MD, PhD, assistant professor in residence in the Department of Neurology at UCI and a co-author of the study.

MS is characterized by recurrent episodes of neurologic dysfunction resulting from acute inflammatory demyelination. Progressive MS is distinguished by continuous inflammation, failure to remyelinate, and progressive neurodegeneration, causing accrual of irreversible neurologic disability. Neurodegeneration is the major contributor to progressive neurological disability in MS patients, yet mechanisms are poorly understood and there are no current treatments for neurodegeneration.

Credit: 
University of California - Irvine

Immunocompromised pediatric patients showed T-cell activity and humoral immunity against SARS-CoV-2

According to data from a cohort of adult and pediatric patients with antibody deficiencies, patients who often fail to make protective immune responses to infections and vaccinations showed robust T-cell activity and humoral immunity against SARS-CoV-2 structural proteins. The new study, led by researchers at Children's National Hospital, is the first to demonstrate a robust T-cell response against SARS-CoV-2 in immunocompromised patients.

"If T-cell responses to SARS-CoV-2 are indeed protective, then it could suggest that adoptive T-cell immunotherapy might benefit more profoundly immunocompromised patients," said Michael Keller, M.D., director of the Translational Research Laboratory in the Program for Cell Enhancement and Technologies for Immunotherapy (CETI) at Children's National. "Through our developing phase I T-cell immunotherapy protocol, we intend to investigate if coronavirus-specific T-cells may be protective following bone marrow transplantation, as well as in other immunodeficient populations."

The study, published in the Journal of Clinical Immunology, showed that patients with antibody deficiency disorders, including inborn errors of immunity (IEI) and common variable immunodeficiency (CVID), can mount an immune response to SARS-CoV-2. The findings suggest that vaccination may still be helpful for this population.

"This data suggests that many patients with antibody deficiency should be capable of responding to COVID-19 vaccines, and current studies at the National Institutes of Health and elsewhere are addressing whether those responses are likely to be protective and lasting," said Dr. Keller. The T-cell responses in all the COVID-19 patients were similar in magnitude to healthy adult and pediatric convalescent participants.

Kinoshita et al. call for additional studies to further define the quality of the antibody response and the longevity of immune responses against SARS-CoV-2 in immunocompromised patients compared with healthy donors. Currently, there is also very little data on adaptive immune responses to SARS-CoV-2 in these vulnerable populations.

The study sheds light on the antibody and T-cell responses to SARS-CoV-2 spike proteins based on a sample of six patients, including a family group of three children and their mother. All have antibody deficiencies and developed mild COVID-19 symptoms, except for one child who remained asymptomatic. Control participants were the father of the same family, who tested positive for COVID-19, and an unrelated adult who experienced mild COVID-19 symptoms. The researchers took blood samples to test the T-cell response in cell cultures and performed comprehensive statistical analysis of the adaptive immune responses.

"This was a small group of patients, but given the high proportion of responses, it does suggest that many of our antibody deficient patients are likely to mount immune responses to SARS-CoV-2," said Dr. Keller. "Additional studies are needed to know whether other patients with primary immunodeficiency develop immunity following COVID-19 infection and will likely be answered by a large international collaboration organized by our collaborators at the Garvan Institute in Sydney."

Credit: 
Children's National Hospital

Scientists invent a method for predicting solar radio flux for two years ahead

video: Europe's space freighter Automated Transfer Vehicle Jules Verne burning up over an uninhabited area of the Pacific Ocean at the end of its mission

Image: 
ESA

Scientists at the Skolkovo Institute of Science and Technology (Skoltech) and their colleagues from the University of Graz and the Kanzelhöhe Observatory (Austria) and the ESA European Space Operations Centre developed a method and software called RESONANCE to predict solar radio flux activity 1-24 months ahead. RESONANCE will serve to improve the specification of satellite orbits, re-entry services, modeling of space debris evolution, and collision avoidance maneuvers. The research results were published in the Astrophysical Journal Supplement Series.

Since the launch of Sputnik, the Earth's first artificial satellite, in 1957, more than 41,500 tons of manmade objects have been placed in orbit around the Sun, the Earth, and other planetary bodies. Most of these objects, such as rocket bodies and large pieces of space debris, have since re-entered the Earth's atmosphere in an uncontrolled way, posing a potential hazard to people and infrastructure. Predicting the re-entry date and time is a challenging task, as one needs to specify the density of the upper atmosphere, which strongly depends on solar activity, which is in turn hard to predict. During periods of high solar activity, the Earth's atmosphere heats up and expands, so a satellite experiences increased atmospheric drag, decays in its orbit, and eventually falls back to Earth. In addition, there is a lot of space debris, much of it very small; if a spacecraft unexpectedly changes its orbit and strikes even a small piece of debris, the impact energy at orbital speeds is comparable to that of an explosion.

An international group of scientists led by Skoltech professor Tatiana Podladchikova developed a new method and software called RESONANCE ("Radio Emissions from the Sun: ONline ANalytical Computer-aided Estimator"), which predicts the F10.7 and F30 solar radio flux indices with a lead time of 1 to 24 months. The F10.7 and F30 indices represent the flux density of solar radio emission at wavelengths of 10.7 and 30 cm, averaged over an hour, and serve as proxies of the ultraviolet solar emission that heats the Earth's upper atmosphere. The method combines state-of-the-art physics-based models and advanced data assimilation methods; the resulting F10.7 and F30 forecasts are used as solar input in the re-entry prediction tool for further estimation of an object's re-entry time.
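The data-assimilation step at the heart of such forecasting can be illustrated with a toy sketch (this is not the RESONANCE algorithm itself; the gain factor and flux values below are invented for illustration): a model forecast of the F10.7 index is nudged toward the latest observation.

```python
def assimilate(model_forecast, observation, gain=0.3):
    """Kalman-style update: blend a physics-model forecast with the
    latest measurement. gain=0 trusts the model, gain=1 the data."""
    return model_forecast + gain * (observation - model_forecast)

# Hypothetical monthly F10.7 values in solar flux units (sfu).
model_f107 = 120.0     # physics-based model prediction
observed_f107 = 130.0  # latest ground-based radio measurement

corrected = assimilate(model_f107, observed_f107)
print(corrected)  # 123.0
```

Repeating this correction each month keeps the long-range forecast anchored to real solar measurements while the physics model supplies the trend.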

"We systematically evaluated the performance of RESONANCE in providing re-entry predictions on past ESA re-entry campaigns for 602 payloads and rocket bodies as well as 2,344 objects of space debris that re-entered from 2006 to 2019 over the full 11-year solar cycle. The test results demonstrated that the predictions obtained by RESONANCE in general also lead to improvements in the forecasts of re-entry epochs and can thus be recommended as a new operational service for re-entry predictions and other space weather applications," says lead author and Skoltech's MSc graduate Elena Petrova who is currently pursuing her Ph.D. studies at the Centre for Mathematical Plasma Astrophysics, Catholic University of Leuven (KU Leuven).

"The number of re-entered objects is closely related to the solar activity level: the majority of objects return during the maximum solar activity phase within the 11-year cycle. Interestingly, the space debris re-entry time closely follows the evolution of the cycle, reacting immediately to changes in solar activity. At the same time, payloads and rocket bodies also show a large number of re-entries during the declining phase of the cycle, which may be related to the time delay between solar activity and re-entry for large objects", says professor Astrid Veronig, a co-author of the study and director of Kanzelhöhe Observatory at the University of Graz.

"It is very important to monitor and predict solar activity for orbit prediction needs. For example, Skylab which was intended to perform a controlled re-entry in the 1970s dropped on Earth in an uncontrolled way due to inaccurate calculations of the atmospheric drag due to solar activity. Another example is the most recent launch of the Chinese Long March 5B rocket on May 9, 2021: the remnants from its second stage that carried China's first space station module made an uncontrolled re-entry and landed in the Indian Ocean. Thus the development of robust and reliable space weather operational services bringing together the forefront of research with engineering applications is of prime importance for the protection of space and ground-based infrastructures and advancement of space exploration. And whatever storms may rage, we wish everyone a good weather in space," says Tatiana Podladchikova, assistant professor at the Skoltech Space Center (SSC) and a research co-author.

Currently, the team is preparing RESONANCE for operational use as part of a new space weather service for continuous prediction of solar radio flux activity.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

Study of nitinol deformations to enrich understanding of materials with targeted properties

image: Graphical abstract

Image: 
Kazan Federal University

The work was sponsored by Russian Science Foundation; the project, headed by Professor Anatolii Mokshin, is titled "Theoretical, simulating and experimental research of physico-mechanical traits of amorphous-producing systems with heterogeneous local visco-elastic properties".

"We performed calculations for porous nitinol," shares first co-author, Associate Professor Bulat Galimzyanov. "It's widely used in various industries thanks to its unique physico-mechanical properties, such as low volume weight, high corrosion resistance, high biocompatibility and shape memory. Obtaining nitinol as amorphous foam is very labor-intensive, it requires high temperatures and extremely high melt cooling rate (over 1,000,000 K per second). Obviously, traditional experiments in this case are very costly and complex. We used computer modelling based on molecular dynamics."

As Galimzyanov explains, amorphous metallic foams are promising materials.

"Their cell structure comprises a solid metallic frame with gas-filled pores. Pores can be either hermetic or conjoined. The volume ratio of pores and their hermeticity determine the primary physico-chemical properties of the metallic foam, among which are low heat conductivity, high plasticity, and good noise absorption. Thanks to that, metallic foams can find wide applications in the automotive industry, shipbuilding, and aerospace industry," says the interviewee.

As the research shows, amorphous porous nitinol can sustain mechanical loads significantly higher than those sustained by crystalline nitinol.

Apart from the aforementioned applications, amorphous porous nitinol can also be used in prosthetics and biocompatible materials, because it is much more resistant to tension and compression than bone while having similar porosity.

Credit: 
Kazan Federal University

Politically polarized brains share an intolerance of uncertainty

PROVIDENCE, R.I. [Brown University] -- Since the 1950s, political scientists have theorized that political polarization -- increased numbers of "political partisans" who view the world with an ideological bias -- is associated with an inability to tolerate uncertainty and a need to hold predictable beliefs about the world.

But little is known about the biological mechanisms through which such biased perceptions arise.

To investigate that question, scientists at Brown University measured and compared the brain activity of committed partisans (both liberals and conservatives) as they watched real political debates and news broadcasts. In a recent study, they found that polarization was indeed exacerbated by intolerance of uncertainty: liberals with this trait tended to be more liberal in how they viewed political events, while conservatives with this trait tended to be more conservative.

Yet the same neural mechanism was at work, pushing the partisans into their different ideological camps.

"This is the first research we know of that has linked intolerance to uncertainty to political polarization on both sides of the aisle," said study co-author Oriel FeldmanHall, an assistant professor of cognitive, linguistic and psychological sciences at Brown. "So whether a person in 2016 was a strongly committed Trump supporter or a strongly committed Clinton supporter, it doesn't matter. What matters is that an aversion to uncertainty only exacerbates how similarly two conservative brains or two liberal brains respond when consuming political content."

Jeroen van Baar, study co-author and a former post-doctoral researcher at Brown, said the findings are important because they show that factors other than political beliefs themselves can influence individuals' ideological biases.

"We found that polarized perception -- ideologically warped perceptions of the same reality -- was strongest in people with the lowest tolerance for uncertainty in general," said van Baar, who is now a research associate at Trimbos, the Netherlands Institute of Mental Health and Addiction. "This shows that some of the animosity and misunderstanding we see in society is not due to irreconcilable differences in political beliefs, but instead depends on surprising -- and potentially solvable -- factors such as the uncertainty people experience in daily life."

The study was published online in the journal PNAS on Thursday, May 13.

To examine whether and how intolerance for uncertainty shapes how political information is processed in the brain, the researchers recruited 22 committed liberals and 22 conservatives. They used fMRI technology to measure brain activity while participants watched three types of videos: a neutrally worded news segment on a politically charged topic, an inflammatory debate segment and a non-political nature documentary.

After the viewing session, participants answered questions about their comprehension and judgment of the videos and completed an extensive survey with five political and three cognitive questionnaires designed to measure traits like intolerance of uncertainty.

"We used relatively new methods to look at whether a trait like intolerance of uncertainty exacerbates polarization, and to examine if individual differences in patterns of brain activity synchronize to other individuals that hold like-minded beliefs," FeldmanHall said.

When the researchers analyzed participants' brain activity while processing the videos, they found that neural responses diverged between liberals and conservatives, reflecting differences in the subjective interpretation of the footage. People who identified strongly as liberal processed political content much in the same way and at the same time -- which the researchers refer to as neural synchrony. Likewise, the brains of those who identified as conservative were also in sync when processing political content.
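Neural synchrony of this kind is commonly quantified as inter-subject correlation: the pairwise Pearson correlation of response time courses across people watching the same footage. A sketch on synthetic data (not the study's recordings; the group size, signal strength and noise level below are made up):

```python
import numpy as np

def mean_pairwise_isc(timecourses):
    """Mean pairwise Pearson correlation across subjects.

    timecourses: (n_subjects, n_timepoints) responses from one
    brain region while all subjects watch the same video.
    """
    r = np.corrcoef(timecourses)       # subject-by-subject matrix
    iu = np.triu_indices_from(r, k=1)  # unique subject pairs only
    return r[iu].mean()

rng = np.random.default_rng(0)
stimulus_signal = rng.standard_normal(200)  # shared, stimulus-driven
# A synthetic "like-minded" group: shared signal plus individual noise.
group = stimulus_signal + 0.5 * rng.standard_normal((5, 200))

print(mean_pairwise_isc(group))  # high synchrony within the group
```

A group whose members process the content in the same way yields a high mean correlation; unrelated responses would hover near zero.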

"If you are a politically polarized person, your brain syncs up with like-minded individuals in your party to perceive political information in the same way," FeldmanHall said.

This polarized perception was exacerbated by the personality trait of intolerance of uncertainty. Those participants -- of any ideology -- who were less tolerant of uncertainty in daily life (as reported in their survey responses) had more ideologically polarized brain responses than those who were better able to tolerate uncertainty.

"This suggests that aversion to uncertainty governs how the brain processes political information to form black-and-white interpretations of inflammatory political content," the researchers wrote in the study.

Interestingly, the researchers did not observe the polarized perception effect during a non-political video or even during a video about abortion presented in a neutral, non-partisan tone.

"This is key because it implies that 'liberal and conservative brains' are not just different in some stable way, like brain structure or basic functioning, as other researchers have claimed, but instead that ideological differences in brain processes arise from exposure to very particular polarizing material," van Baar said. "This suggests that political partisans may be able to see eye to eye -- provided we find the right way to communicate."

Credit: 
Brown University

Congestion pricing could shrink car size

PULLMAN, Wash. - Rush hour will likely return when pandemic lockdowns lift, but a new study suggests that congestion pricing--policies that charge tolls for driving during peak hours--could not only cure traffic jams but also convince motorists it is safe to buy smaller, more efficient cars.

Researchers from Washington State University and the Brookings Institution studied a sample of nearly 300 households in the Seattle area over a six-year period, finding that the more congested their commutes, the more likely they were to buy bigger cars, which they perceived as safer and more comfortable. They then modelled what congestion pricing might do to change car purchase decisions, finding it would reduce the market share of mid- to full-size SUVs by 8%.

Nationally, shrinking the number of large vehicles on the road by that margin would mean a 10% decline in the vehicle fatality rate, saving lives and $25 billion in associated costs, as well as a 3% improvement in fuel efficiency, which amounts to nearly $10 billion in savings.
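The dollar figures follow from simple percentages. The baseline totals below are back-calculated from the article's own numbers and are illustrative only, not sourced statistics:

```python
# Back-of-envelope check of the national estimates above.
# Baselines are implied by the article's percentages and dollar
# figures; they are illustrative, not sourced data.
fatality_cost_baseline = 250e9  # implied: $25B is 10% of this
fuel_spend_baseline = 333e9     # implied: ~$10B is 3% of this

fatality_savings = 0.10 * fatality_cost_baseline
fuel_savings = 0.03 * fuel_spend_baseline

print(f"${fatality_savings / 1e9:.0f}B")  # $25B in avoided crash costs
print(f"${fuel_savings / 1e9:.1f}B")      # ~$10.0B in fuel savings
```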

"We found that congestion pricing can reduce congestion on one side and reduce vehicle size on the other," said Jia Yan, WSU economics professor and corresponding author on the study published in the Journal of Econometrics. "Then the positive impacts of decreasing vehicle size mean that energy consumption and fatality rates can also drop."

In 1980, light trucks and SUVs made up only about 20% of new vehicles sold. By 2017, that figure had risen to 62%. Previous research indicates that traffic jams can lead to an "arms race", with drivers buying bigger and bigger cars for their perceived personal safety on congested highways where accidents are more likely to occur.

"If the highways you are travelling on are very congested, and you are sitting in a small car surrounded by many large SUVs, that may motivate you to purchase a larger car to protect yourself. It's logical reasoning," said Yan. "If the congestion decreases, and drivers can easily travel on a free-flowing highway that self-protection motivation drops."

Drivers are also drawn to the comfort of larger vehicles when they spend more time traveling on a highway, Yan added. When congestion decreases, the two motivations of safety and comfort also decrease, the authors found, and it's less likely that commuters will choose to purchase larger vehicles.

While many people perceive that having a large SUV or light truck will protect them personally, studies indicate that larger vehicles increase the risk of fatalities for the occupants of smaller cars in multi-vehicle accidents.

"There's always a trade-off between your own benefits and the cost to others in society," said Yan. "When people purchase a large vehicle, they don't always take these externalities - these negative impacts - into consideration. They only consider their own self-protection, or whether they are comfortable when they're driving, so this is why we need better policy."

Credit: 
Washington State University

CNIO researchers discover the cause of neuronal death in a large proportion of familial ALS patients

image: Mouse motor neurons, generated from mouse embryonic stem cells exposed (right) or not (left) to ALS-associated peptides. As observed in patients, these peptides are toxic and cause neuronal death.

Image: 
CNIO

In Amyotrophic Lateral Sclerosis (ALS), the progressive death of neurons that control body movement leads to paralysis of muscles in the limbs and gradually of the whole body, which ultimately makes it impossible to breathe. ALS is currently untreatable, and its cause is unknown.

It is known, however, that in 10% of affected individuals there is a strong genetic component, which causes the disease to occur in several members of a single family. In about half of these cases of familial ALS, the origin lies in a gene called C9ORF72. But why do mutations in this gene kill motor neurons?

The answer may have been found by the Genomic Instability Group headed by Óscar Fernández-Capetillo at the Spanish National Cancer Research Centre (CNIO), who discovered a mechanism that explains the toxicity derived from mutations in C9ORF72. The novel mechanism links these mutations to a general problem that blocks all nucleic acids, DNA and RNA, and thus disrupts a multitude of processes that are fundamental to the functioning of cells.

The paper is published this week in The EMBO Journal, with CNIO researchers Vanesa Lafarga and Oleksandra Sirozh as first authors.

Why neurons die in ALS patients

ALS researchers had already observed that many basic cellular processes that use nucleic acids do indeed fail in the neurons of affected patients. Now the CNIO group provides a model that connects them all and thereby explains these widespread problems.

"I think we have a pretty satisfactory model that helps us understand what is going on in the motor neurons of ALS patients, what is killing them," says Fernández-Capetillo. "We are excited, as the key to curing any disease is to understand first what is not working. Only then can you start looking for a treatment."

Although the newly identified toxic mechanism is associated with mutations in a specific gene, C9ORF72, the CNIO group believes it is likely that other ALS-related mutations are acting in a similar way, i.e., by blocking the DNA and RNA of motor neurons.

Too much arginine

Mutations in the C9ORF72 gene are toxic, the CNIO researchers reveal, because they induce the cell to produce small proteins or peptides that are very rich in arginine, an amino acid that, due to its positive charge and chemical nature, binds very avidly to nucleic acids, DNA and RNA.

The CNIO study indicates that, by binding to nucleic acids with such high affinity, these arginine-rich proteins displace all cellular proteins that interact with DNA and RNA in a widespread manner, thus blocking any cellular reaction that involves DNA or RNA. And as a result, with its nucleic acids effectively blocked, the cell dies.

DNA contains the instructions for the cell to make the proteins it needs for proper function. Hundreds of proteins need to anchor themselves to the DNA and RNA to read their instructions and eventually make new components for the cell. But "the presence of arginine-rich peptides hampers any reaction involving nucleic acids," the authors of the new study add.

As Fernández-Capetillo puts it: "What we have seen is that arginine-containing peptides are like a kind of tar that sticks to nucleic acids and decorates them, and in doing so they displace the proteins that are normally bound to the nucleic acids so that nothing that involves DNA or RNA works."

Eureka moment

"In all these decades of ALS research, neuroscience researchers have been publishing all sorts of problems in reactions using nucleic acids: translation, replication, etc. Nothing works! We think our model gives a simple answer to all these observations," continues Fernández-Capetillo.

Fernández-Capetillo's research usually focuses on cancer, but he strives to "keep his eyes open" to any problem to which his knowledge can be applied. In 2014, he started working on ALS, convinced that a technique recently established in his group could help them understand the toxicity of mutations in the C9ORF72 gene. And it was a flash of insight, an idea that came up after having coffee at the CNIO with Nobel laureate Jack Szostak, which put him on the trail of arginine.

"Szostak investigates the chemistry of the origin of life, and he told me that to stop reactions involving nucleic acids what they used in their experiments was precisely synthetic peptides with lots of arginines because of their high affinity for nucleic acids," says Fernández-Capetillo. "So I thought, what if this is what is going on, what if the arginine-rich proteins in ALS patients are blocking DNA and RNA in a generalised way?"

The protein that compacts sperm nucleus does the same thing

This initial hypothesis was supported when the group decided to test whether similar problems also appeared when cells were exposed to a natural protein rich in arginines. There is such a protein, but it is expressed only transiently during the development of sperm cells: protamine.

Consistent with the model now published in The EMBO Journal, the biological function of protamine is to displace histones from the DNA; histones are proteins that facilitate DNA compaction. "By exchanging histones for protamine, which is smaller, sperm DNA can become more compact," explains Fernández-Capetillo.

However, protamine is toxic to any cell that is not a sperm cell. "We think that what happens in ALS patients is equivalent to what would happen if their motor neurons accidentally started to produce protamine."

Indeed, the paper in The EMBO Journal shows that the cellular effects of protamine are identical to those of the arginine-rich peptides found in ALS patients.

How to overcome toxicity

Now that we understand why arginine-rich peptides are toxic, the next step is to find ways to overcome this toxicity. And research along these lines has already begun in the group. As has the work to create animal models in which the problem - the production of toxic peptides - is reproduced, to provide a platform for testing potential therapies.

Learning how to alleviate the toxicity of these peptides may also be useful in addressing non-C9ORF72-associated ALS, that is, the disease as a whole. The authors of the paper believe that the widespread mechanism of nucleic acid blocking is probably what happens in ALS in general.

As Vanesa Lafarga explains, "the vast majority of mutations found in ALS patients are in proteins that bind RNA, and what these mutations generally do is prevent the binding of these proteins to RNA. Moreover, the cells of these patients also have very general problems with their nucleic acids. That is why we believe that, although mutations in C9ORF72 only affect a fraction of ALS patients, the mechanism underlying the toxicity of neurons may not be fundamentally different from what happens in the rest of ALS patients. Trying to show whether this is the case is something we are working on now."

Credit: 
Centro Nacional de Investigaciones Oncológicas (CNIO)

New algorithm to ensure more accuracy in studying the interior of the Earth

image: Arseny Shlykov at work

Image: 
SPbU

An essential preliminary to construction or resource extraction is studying the geological structure of the site. One step in this process is a geophysical investigation, which provides a continuous picture of the geological horizons rather than data at isolated points (boreholes). Standard geophysical methods solve this problem successfully in comparatively simple conditions. Yet the classical direct-current methods may produce serious inaccuracies when investigating geologically complex structures with thin layers of sandy and clayey soils.

Among the most popular methods in geoelectrical surveying is electrical resistivity tomography (ERT), a geophysical method for imaging subsurface structures from electrical resistivity measurements made at the surface or in boreholes. It allows geologists to 'see' various rock formations because they differ in resistivity. Yet electrical resistivity tomography can also seriously misestimate geological layer thickness, and therefore lead to considerable errors in the derived values.

'Errors in estimating the electrical properties of soils may lead to mistakes in pile construction and other problems during building. When we explore a sand deposit, such errors can lead to wrong data about sand reserves. You never know what is below the earth's surface. If we interpret our data following a formal approach only, there is a huge chance of making mistakes,' said Arseny Shlykov, the first author of the research, PhD and senior researcher at the Institute of Earth Sciences at St Petersburg University.

Electrical resistivity tomography is not the only method for investigating the Earth's subsurface. A relatively new radiomagnetotelluric (RMT) method is being developed by geophysicists at St Petersburg University and their colleagues at the Institute of Geophysics and Meteorology (IGM) of the University of Cologne (Germany) and the Indian Institute of Technology Kharagpur (IIT Kharagpur). It uses the electromagnetic fields of remote radio transmitters and provides information about the subsurface at depths from 1 to 30-50 metres. With a controlled source (the CSRMT variant), the subsurface can be studied down to 100-150 metres.
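The depth ranges quoted here follow from the electromagnetic skin depth, the standard geophysics rule of thumb delta ≈ 503·sqrt(rho/f) metres for ground resistivity rho (in ohm-m) and signal frequency f (in Hz); the lower frequencies of a controlled source penetrate deeper. A quick check with a typical (illustrative) soil resistivity:

```python
import math

def skin_depth_m(resistivity_ohm_m, frequency_hz):
    """Electromagnetic skin depth: delta = 503 * sqrt(rho / f) metres."""
    return 503.0 * math.sqrt(resistivity_ohm_m / frequency_hz)

# 100 ohm-m is a typical soil resistivity (illustrative value).
print(round(skin_depth_m(100, 100e3)))  # ~16 m at 100 kHz (RMT band)
print(round(skin_depth_m(100, 1e3)))    # ~159 m at 1 kHz (CSRMT band)
```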

'If we use both methods at one site with a complicated geoelectrical section, we can get different results. This is because of the significantly different structure of the electromagnetic fields used in the CSRMT and ERT methods. But the joint inversion of CSRMT and ERT data allows us to use the benefits of both methods and get more accurate results. This is why we needed an algorithm to join them,' said Arseny Shlykov.
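In practice, a joint inversion stacks the noise-weighted residuals of both datasets into one vector and fits a single earth model to all of it. A toy linear sketch (the real CSRMT and ERT forward models are nonlinear; all matrices and values below are invented for illustration):

```python
import numpy as np

def joint_residuals(model, data_a, data_b, fwd_a, fwd_b, sd_a, sd_b):
    """Stack noise-weighted residuals from two methods so a single
    least-squares fit honours both datasets at once."""
    res_a = (fwd_a @ model - data_a) / sd_a
    res_b = (fwd_b @ model - data_b) / sd_b
    return np.concatenate([res_a, res_b])

# A toy two-parameter "layered model" and linear stand-in forwards.
true_model = np.array([2.0, 1.0])
fwd_a = np.array([[1.0, 0.2], [0.3, 1.0]])  # stand-in for ERT
fwd_b = np.array([[0.8, 0.5], [0.1, 0.9]])  # stand-in for CSRMT
data_a = fwd_a @ true_model
data_b = fwd_b @ true_model

# Solve the stacked system in the least-squares sense (unit noise).
A = np.vstack([fwd_a, fwd_b])
d = np.concatenate([data_a, data_b])
recovered, *_ = np.linalg.lstsq(A, d, rcond=None)
print(np.round(recovered, 3))  # the joint fit recovers the model
```

Because both methods constrain the same model, structure that one method resolves poorly is pinned down by the other.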

The field experiment was carried out at the field testing site of Lomonosov Moscow State University, located in the settlement of Aleksandrovka in the Kaluga region. The international team of geophysicists compared the results obtained using both methods and interpreted the data both separately and jointly. The data obtained using the newly developed algorithm was the closest to the borehole data.

'The newly developed algorithm is one more step toward ensuring greater accuracy in geophysical exploration. This algorithm works within a one-dimensional, horizontally layered, vertically anisotropic model of the Earth. One-dimensional models are the simplest: they represent the subsurface as a puff pastry with multiple horizontal layers. The properties of the rocks in such models can change in one direction only, i.e. downward, which is why such models are called one-dimensional. Obviously, the geological medium is more complex. We are planning to continue developing the algorithm to be able to use it with two- and three-dimensional geological models. Two-dimensional models represent both vertical and lateral changes, yet the lateral changes are in one direction only. Three-dimensional models are the most complex and the closest to what we have in real life, but using them is not an easy task: it is rather resource-intensive and time-consuming,' said Arseny Shlykov.

Credit: 
St. Petersburg State University

How smartphones can help detect ecological change

image: The Flora Incognita mobile app can help identify plants in the field. In addition, by gathering information on the location of identified plant species, valuable datasets are created.

Image: 
Jana Wäldchen / MPI-BGC

Leipzig/Jena/Ilmenau. Mobile apps like Flora Incognita that allow automated identification of wild plants can not only identify plant species, but also uncover large-scale ecological patterns. These patterns are surprisingly similar to those derived from long-term inventory data of the German flora, even though they were acquired over much shorter time periods and are influenced by user behaviour. This opens up new perspectives for the rapid detection of biodiversity changes. These are the key results of a study led by a team of researchers from Central Germany, recently published in Ecography.

With the help of Artificial Intelligence, plant species today can be classified with high accuracy. Smartphone applications leverage this technology to enable users to easily identify plant species in the field, giving laypersons access to biodiversity at their fingertips. Against the backdrop of climate change, habitat loss and land-use change, these applications may serve another use: by gathering information on the locations of identified plant species, valuable datasets are created, potentially providing researchers with information on changing environmental conditions.

But is this information reliable - as reliable as the information provided by data collected over long time periods? A team of researchers from the German Centre for Integrative Biodiversity Research (iDiv), the Remote Sensing Centre for Earth System Research (RSC4Earth) of Leipzig University (UL) and Helmholtz Centre for Environmental Research (UFZ), the Max Planck Institute for Biogeochemistry (MPI-BGC) and Technical University Ilmenau wanted to find an answer to this question. The researchers analysed data collected with the mobile app Flora Incognita between 2018 and 2019 in Germany and compared it to the FlorKart database of the German Federal Agency for Nature Conservation (BfN). This database contains long-term inventory data collected by over 5,000 floristic experts over a period of more than 70 years.

Mobile app uncovers macroecological patterns in Germany

The researchers report that the Flora Incognita data, collected over only two years, allowed them to uncover macroecological patterns in Germany similar to those derived from long-term inventory data of German flora. The data was therefore also a reflection of the effects of several environmental drivers on the distribution of different plant species.

However, directly comparing the two datasets revealed major differences between the Flora Incognita data and the long-term inventory data in regions with a low human population density. "Of course, how much data is collected in a region strongly depends on the number of smartphone users in that region," said last author Dr. Jana Wäldchen from MPI-BGC, one of the developers of the mobile app. Deviations in the data were therefore more pronounced in rural areas, except for well-known tourist destinations such as the Zugspitze, Germany's highest mountain, or Amrum, an island on the North Sea coast.

User behaviour also influences which plant species are recorded by the mobile app. "The plant observations carried out with the app reflect what users see and what they are interested in," said Jana Wäldchen. Common and conspicuous species were recorded more often than rare and inconspicuous species. Nonetheless, the large quantity of plant observations still allows a reconstruction of familiar biogeographical patterns. For their study, the researchers had access to more than 900,000 data entries created within the first two years after the app had been launched.

Automated species recognition bears great potential

The study shows the potential of this kind of data collection for biodiversity and environmental research, which could soon be integrated in strategies for long-term inventories. "We are convinced that automated species recognition bears much greater potential than previously thought and that it can contribute to a rapid detection of biodiversity changes," said first author Miguel Mahecha, professor at UL and iDiv Member. In the future, a growing number of users of apps like Flora Incognita could help detect and analyse ecosystem changes worldwide in real time.

The Flora Incognita mobile app was developed jointly by the research group of Dr. Jana Wäldchen at MPI-BGC and the group of Professor Patrick Mäder at TU Ilmenau. It is the first plant identification app in Germany to use deep neural networks (deep learning) in this context. Trained on thousands of expert-identified plant images, it can already identify over 4,800 plant species.
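The last stage of such a network is typically a softmax that turns raw species scores into probabilities, with the top-scoring species reported to the user. A minimal sketch of that step (the species names and scores below are made up; this is not Flora Incognita's actual model):

```python
import numpy as np

def softmax(logits):
    """Turn raw network scores into class probabilities."""
    z = np.exp(logits - np.max(logits))  # subtract max for stability
    return z / z.sum()

# Hypothetical species labels and last-layer scores from a CNN.
species = ["Bellis perennis", "Taraxacum officinale", "Trifolium repens"]
logits = np.array([0.5, 3.1, 1.2])

probs = softmax(logits)
best = species[int(np.argmax(probs))]
print(best)  # Taraxacum officinale
```

The probability attached to the top species is what such apps display as identification confidence.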

"When we developed Flora Incognita, we realized there was a huge potential and growing interest in improved technologies for the detection of biodiversity data. As computer scientists we are happy to see how our technologies make an important contribution to biodiversity research," said co-author Patrick Mäder, professor at TU Ilmenau.

Credit: 
German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig