
An unusual symbiosis of a ciliate, green alga, and purple bacterium

Dr Sebastian Hess and his team at the University of Cologne's Institute of Zoology have studied a very rare and puzzling tripartite symbiosis. This consortium consists of a ciliate as host and two types of endosymbionts: a green alga and a previously unknown purple bacterium. Through genetic analyses of the pink-green ciliate, the researchers discovered that the endosymbiotic bacterium belongs to the so-called 'purple sulfur bacteria' (family Chromatiaceae), but has lost the ability to oxidize reduced sulfur compounds, a hallmark of the other members of the Chromatiaceae. The genome of the purple bacterium is greatly reduced, suggesting that the bacterium has become specialized mainly in carbon fixation through photosynthesis and is probably no longer able to live outside the host cell. Thus, the new bacterial species "Candidatus Thiodictyon intracellulare" is a notable exception among the known purple sulfur bacteria. In their article 'A microbial eukaryote with a unique combination of purple bacteria and green algae as endosymbionts', published in Science Advances, the researchers report the new findings and explain how the oxygen-sensitive purple bacterium lives together with the oxygenic green alga in the ciliate host.

The pink-green ciliate Pseudoblepharisma tenue inhabits the hypoxic sediments in the ponds of the 'Simmelried', a moorland near the German city of Constance. Dr Martin Kreutz, an amateur microscopist and friend of Dr Hess, has been observing this unusual life form for several years. Since the ciliate has so far resisted cultivation in the laboratory, a lively collaboration has developed between Dr Kreutz and the UoC researchers. Dr Kreutz collected the samples and sent the fresh sediment to Cologne by mail. In Cologne, the researchers isolated single cells of Pseudoblepharisma tenue from the samples and analysed them with various microscopic and genetic techniques.

The first author of the study, Dr Sergio Muñoz-Gómez, reconstructed the genomes of the three symbiotic partners and demonstrated the massive physiological reduction of the endosymbiotic purple bacterium. 'Based on our observations in natural samples, the microscopic details of the symbiotic consortium, and the genomic data, an interesting picture emerges. Symbiosis has given rise to a unique chimeric creature: a motile and voracious cell that, at the same time, uses light energy from anoxygenic photosynthesis to inhabit the deep and oxygen-poor layers of ponds,' the researchers said. 'The green algae seem to play a minor role. Instead, the crucial physiological contribution comes from the oxygen-sensitive purple sulfur bacteria.'

Credit: 
University of Cologne

Pollutant concentration increases in the franciscana dolphin

image: The levels of chromium, copper, iron and nickel increased over the 1953-2015 period in the most threatened cetacean populations of the western Atlantic Ocean

Image: 
Massimiliano Drago, UB-IRBio

The concentration of potentially toxic metals is increasing in the population of the franciscana dolphin --a small cetacean endemic to the Rio de la Plata region and an endangered species-- according to a study led by a team from the Faculty of Biology and the Biodiversity Research Institute (IRBio) of the University of Barcelona, published in the journal Science of the Total Environment.

The impact of human activity in the region could be behind the increase of trace elements such as chromium, copper, iron and nickel in the dolphins' biological tissues, as stated in the study. The paper includes researchers from the National Museum of Natural History of Uruguay and was funded through a project of the research and conservation programme of the Barcelona Zoo Foundation, with Massimiliano Drago (UB-IRBio) as principal investigator.

One of the smallest and most threatened dolphin species worldwide

The franciscana dolphin (Pontoporia blainvillei) is endemic to the coastal waters of Brazil, Uruguay and Argentina and is considered a vulnerable species by the International Union for Conservation of Nature (IUCN). Currently, it is considered the most threatened cetacean in the southwestern Atlantic Ocean; its population declined due to accidental bycatch, which accelerated in the mid-20th century with artisanal shark fishing for vitamin A extraction. Today, the future of the species remains in danger due to incidental capture in fishing gear --which causes between 1,200 and 1,800 dolphin deaths every year, mainly juveniles-- and the progressive degradation of the environment through maritime transport, tourism and environmental pollution.

Rio de la Plata: biological productivity and anthropogenic pollution

The Rio de la Plata estuary, on the western margin of the southwestern Atlantic Ocean, is one of the richest and most productive ecosystems worldwide. It is also a marine region strongly affected by anthropogenic activity (maritime transport, industry, expansion of urban areas, untreated wastewater, etc.), which favours the accumulation of pollutants. In addition, the estuary receives pollutants transported by the Parana and Uruguay rivers and other secondary watercourses. Draining a basin of more than three million square kilometres, this large water system carries great volumes of water that become heavily polluted as they pass through large towns and urbanized regions of South America.

The inner area of the estuary --the most polluted-- receives abundant freshwater from the river branches, together with waste from nearby cities (Buenos Aires, Montevideo, La Plata, etc.). The intermediate area holds freshwater under marine influence and is less polluted, while the outer area contains salty waters along a salinity gradient. Tidal currents in the estuary drive marine water into the intermediate area and push freshwater out towards the outer section.

Analysis of trace elements in dolphin bone remains

Pollutants such as polychlorinated biphenyls, pesticides, hydrocarbons, plasticizers and some metals can act as endocrine disruptors or carcinogens, and can cause adverse reproductive effects or osteoporosis, among other problems. Among the pollutants discharged into the estuary "are trace elements, which are especially worrying, such as certain heavy metals that can be highly toxic for the marine fauna and, indirectly, for humans", notes Odei García-Garín, first author of the article and member of the Research Group on Large Marine Vertebrates, led by Professor Àlex Aguilar.

The paper analyses the concentration of trace elements in franciscana dolphin bone remains from the Rio de la Plata over the 1953-2015 period. According to the results, the concentrations of chromium, copper, iron and nickel increased over those sixty-two years, while the levels of lead in the dolphin bone remains decreased.

Anthropogenic activities could be the origin of the growing concentration of metals in these marine mammals, as stated in the study. Trace elements from leather tanneries, petroleum refineries and boat paints would have accumulated progressively in the sediments of the Rio de la Plata estuary and, eventually, in the tissues of the franciscana dolphins. In contrast, the ban on using lead as an additive in fossil fuels, in force since the 1990s, brought a progressive reduction of the concentration of this metal in the dolphins' bones.

The study also points to an increase over time in the concentrations of aluminium and manganese and, in parallel, a decrease in the concentrations of arsenic and strontium. These temporal trends are hard to relate to anthropogenic pollution and will require more studies to reach conclusions. The results also indicate higher concentrations of aluminium, iron and chromium in females, although the differences are not statistically significant.

The paper published in Science of the Total Environment confirms the suitability of trace element studies on bone remains preserved in museums or private collections for conducting large-scale temporal studies. With this methodology, researchers can analyse both the impact of pollutants on a species over time and the evolution of those compounds in the environment.

Protecting the franciscana dolphin for the conservation of the marine environment

The franciscana dolphin is an apex marine predator and plays an essential role in the marine ecosystem, shaping the abundance of several species --fish, octopuses, prawns, etc.-- that occupy intermediate trophic levels. Therefore, a decline in the franciscana population could alter the whole marine food web of the estuary. "In addition, the franciscana dolphins act as an 'umbrella' species. That is, protecting their populations would benefit many other species whose viability depends on the presence of the franciscana dolphin in the marine ecosystem", notes Odei García-Garín, member of the Department of Evolutionary Biology, Ecology and Environmental Sciences and IRBio.

"In order to improve the survival of the species, the first urgent measure is to reduce accidental bycatch", continues García-Garín. "Since juvenile individuals are the most affected, it would be important to implement fishing bans during the breeding seasons, the most critical periods for the species. Promoting fish farms could also help reduce accidental bycatch, although this measure could bring other negative effects for the marine environment (eutrophication and contamination by fish-farm waste, etc.). It would also be convenient to create or expand marine reserves where the species lives and to reduce the pollution produced by large cities and industries --improving wastewater treatment systems and the quality of the rivers that run into the sea-- also key strategies to improve the conservation of this vulnerable mammal", concludes the researcher.

Credit: 
University of Barcelona

Scientists discover how oxygen loss saps a lithium-ion battery's voltage

image: Scientists at SLAC and Stanford have made detailed measurements of how oxygen seeps out of the billions of nanoparticles that make up lithium-ion battery electrodes, degrading the battery's voltage and energy efficiency over time. In this illustration, the pairs of red spheres are escaping oxygen atoms and purple spheres are metal ions. This new understanding could lead to new ways to minimize the problem and improve battery performance.

Image: 
Greg Stewart/SLAC National Accelerator Laboratory

When lithium ions flow in and out of a battery electrode during charging and discharging, a tiny bit of oxygen seeps out and the battery's voltage - a measure of how much energy it delivers - fades an equally tiny bit. The losses mount over time, and can eventually sap the battery's energy storage capacity by 10-15%.

Now researchers have measured this super-slow process with unprecedented detail, showing how the holes, or vacancies, left by escaping oxygen atoms change the electrode's structure and chemistry and gradually reduce how much energy it can store.

The results contradict some of the assumptions scientists had made about this process and could suggest new ways of engineering electrodes to prevent it.

The research team from the Department of Energy's SLAC National Accelerator Laboratory and Stanford University described their work in Nature Energy today.

"We were able to measure a very tiny degree of oxygen trickling out, ever so slowly, over hundreds of cycles," said Peter Csernica, a Stanford PhD student who worked on the experiments with Associate Professor Will Chueh. "The fact that it's so slow is also what made it hard to detect."

A two-way rocking chair

Lithium-ion batteries work like a rocking chair, moving lithium ions back and forth between two electrodes that temporarily store charge. Ideally, those ions are the only things moving in and out of the billions of nanoparticles that make up each electrode. But researchers have known for some time that oxygen atoms leak out of the particles as lithium moves back and forth. The details have been hard to pin down because the signals from these leaks are too small to measure directly.

"The total amount of oxygen leakage, over 500 cycles of battery charging and discharging, is 6%," Csernica said. "That's not such a small number, but if you try to measure the amount of oxygen that comes out during each cycle, it's about one one-hundredth of a percent."
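The arithmetic behind those figures is easy to check; a quick back-of-the-envelope sketch, using only the two numbers quoted above:

```python
# Back-of-the-envelope check of the quoted oxygen leakage figures:
# roughly 6% total loss accumulated over 500 charge/discharge cycles.
total_loss_pct = 6.0   # total oxygen leakage after 500 cycles, in percent
cycles = 500

per_cycle_pct = total_loss_pct / cycles
print(f"Average leakage per cycle: {per_cycle_pct:.3f}%")
# ~0.012% per cycle -- "about one one-hundredth of a percent", as Csernica
# notes, which is why the signal is too small to measure directly.
```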

In this study, researchers measured the leakage indirectly instead, by looking at how oxygen loss modifies the chemistry and structure of the particles. They tracked the process at several length scales - from the tiniest nanoparticles to clumps of nanoparticles to the full thickness of an electrode.

Because it's so difficult for oxygen atoms to move around in solid materials at the temperatures where batteries operate, the conventional wisdom has been that the oxygen leaks come only from the surfaces of nanoparticles, Chueh said, although this has been up for debate.

To get a closer look at what's happening, the research team cycled batteries for different amounts of time, took them apart, and sliced the electrode nanoparticles for detailed examination at Lawrence Berkeley National Laboratory's Advanced Light Source. There, a specialized X-ray microscope scanned across the samples, making high-res images and probing the chemical makeup of each tiny spot. This information was combined with a computational technique called ptychography to reveal nanoscale details, measured in billionths of a meter.

Meanwhile, at SLAC's Stanford Synchrotron Radiation Lightsource, the team shot X-rays through entire electrodes to confirm that what they were seeing at the nanoscale level was also true at a much larger scale.

A burst, then a trickle

Comparing the experimental results with computer models of how oxygen loss might occur, the team concluded that an initial burst of oxygen escapes from the surfaces of particles, followed by a very slow trickle from the interior. Where nanoparticles glommed together to form larger clumps, those near the center of the clump lost less oxygen than those near the surface.

Another important question, Chueh said, is how the loss of oxygen atoms affects the material they left behind. "That's actually a big mystery," he said. "Imagine the atoms in the nanoparticles are like close-packed spheres. If you keep taking oxygen atoms out, the whole thing could crash down and densify, because the structure likes to stay closely packed."

Since this aspect of the electrode's structure could not be directly imaged, the scientists again compared other types of experimental observations against computer models of various oxygen loss scenarios. The results indicated that the vacancies do persist - the material does not crash down and densify - and suggest how they contribute to the battery's gradual decline.

"When oxygen leaves, surrounding manganese, nickel and cobalt atoms migrate. All the atoms are dancing out of their ideal positions," Chueh said. "This rearrangement of metal ions, along with chemical changes caused by the missing oxygen, degrades the voltage and efficiency of the battery over time. People have known aspects of this phenomenon for a long time, but the mechanism was unclear."

Now, he said, "we have this scientific, bottom-up understanding" of this important source of battery degradation, which could lead to new ways of mitigating oxygen loss and its damaging effects.

Credit: 
DOE/SLAC National Accelerator Laboratory

Research reveals why people pick certain campsites

image: The researchers studied the busy Watchman Campground in Utah's Zion National Park. They used big data from national park reservations to understand how visitors pick campsites.

Image: 
U.S. National Park Service

MISSOULA - Those in love with the outdoors can spend their entire lives chasing that perfect campsite. New University of Montana research suggests what they are trying to find.

Will Rice, a UM assistant professor of outdoor recreation and wildland management, used big data to study the 179 extremely popular campsites of Watchman Campground in Utah's Zion National Park. Campers use an online system to reserve a wide variety of sites with different amenities, and people book the sites an average of 51 to 142 days in advance, providing hard data about demand.

Along with colleague Soyoung Park of Florida Atlantic University, Rice sifted through nearly 23,000 reservations. The researchers found that price and availability of electricity were the largest drivers of demand. Proximity to the adjacent river and ease of access also affected demand. Other factors - such as views of canyon walls or number of nearby neighbors - seemed to have less impact.

The work was published in the Journal of Environmental Management.

"This study demonstrated the power of using the big data of outdoor recreationists' revealed preferences to build models of decision-making, and did so in a setting that is incredibly relatable to many Americans," Rice said. "For instance, anyone who has ever picked a campsite within a campground has certainly dealt with the dilemma of proximity to the restroom. I mean, we want to be close enough to make navigation easy in the middle of the night, but not so close that we're smelling it and listening to the door open and close all night."

He said past studies on recreation decision-making have relied on surveying people about their stated preferences - basically asking them what they like. This study broke new ground by using revealed preferences - observations of people's actual decision-making - made possible by the Recreation Information Database. That database contains facts about all bookings made through the federal Recreation.gov site, which makes reservations for many national parks across America.

The researchers studied these site variables at the Watchman Campground: distance to the nearest dump station; distance to the nearest restroom, trash or recycling station, or water spigot; whether it was a walk-in site; price and electricity; number of neighboring campsites within a 40-meter radius; campsite shading; access to the nearby Virgin River; direct access to canyon walls; and views of canyon walls. These variables were broken into three setting categories: managerial, social and ecological.

Certain amenities influence how early sites are reserved, on average. For instance, good views of the canyon walls increase the average booking window by three days. Price, access to electricity and ease of access also increase how early sites are reserved, demonstrating their popularity.
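As a purely illustrative sketch (not the study's actual model), the revealed-preference idea can be thought of as predicting a site's average booking window from its attributes. Every coefficient below is hypothetical except the three-day canyon-view effect reported above:

```python
# Toy revealed-preference model of campsite demand, measured as the average
# booking window (days in advance a site is reserved). Illustrative only:
# the +3-day canyon-view effect is the one figure reported above; all other
# effect sizes are invented for demonstration.
def booking_window_days(base, canyon_view, electricity, easy_access):
    days = base
    if canyon_view:
        days += 3       # reported: good canyon views add ~3 days on average
    if electricity:
        days += 5       # hypothetical effect size
    if easy_access:
        days += 2       # hypothetical effect size
    return days

# A hypothetical site with all three amenities, against a 100-day baseline:
print(booking_window_days(100, canyon_view=True, electricity=True, easy_access=True))  # 110
```

The study fits effect sizes like these from nearly 23,000 actual reservations rather than positing them, which is what distinguishes revealed from stated preferences.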

Rice said they were surprised that sites with access to the Virgin River were less popular. He suspects this might be because of known struggles with the river's water quality, and Zion National Park has issued a press release urging visitors not to swim or submerge themselves in the river.

Rice said their work and new research model can help park managers make better decisions about campground design and recreation planning.

"Since the 1960s, park managers - in collaboration with researchers - have been trying to figure out how people make decisions when choosing campsites, trails or any number of recreation facilities," he said. "This information is vital for recreation planning, not only for improving visitor experiences but also for ensuring the protection of ecological resources and fair allocation of recreation opportunities."

It also demonstrates the usefulness of a big-data approach for measuring the demand on stretched recreational resources.

"Our findings specific to Zion's Watchman Campground highlight the merit of using these methodologies elsewhere," Rice said. "As campers, we're always in search of the perfect campsite."

Credit: 
The University of Montana

Eco-friendly technology to produce energy from textile waste

image: Dr Yousef and his research group have developed several green/eco-friendly technologies to extract cotton, glucose, and energy products from textile waste.

Image: 
KTU

A team of scientists from Kaunas University of Technology and Lithuanian Energy Institute proposed a method to convert lint-microfibers found in clothes dryers into energy. They not only constructed a pilot pyrolysis plant but also developed a mathematical model to calculate possible economic and environmental outcomes of the technology. Researchers estimate that by converting lint microfibers produced by 1 million people, almost 14 tons of oil, 21.5 tons of gas and nearly 10 tons of char could be produced.

Each year, the global population consumes approximately 80 billion pieces of clothing, and approximately €140 million worth of it ends up in landfill. This is accompanied by large amounts of emissions, causing serious environmental and health problems. One way to lessen the footprint of clothing consumption is to reduce the impact of laundry: machine washing generates around 300 mg of microfibers per kilogram of textile.

"Lint-microfibers are classified as microplastics. Whereas large plastic items can be sorted out and recycled relatively easily, this is not the case with microplastic - tiny plastic pieces, less than 5 mm in diameter. Large quantities of microplastic are being washed down our drains and enter our seas threatening the environment", says Dr Samy Yousef, senior researcher at Kaunas University of Technology (KTU), Faculty of Mechanical Engineering and Design.

Dr Yousef is the leader of the inter-institutional team, which developed an eco-friendly technology to extract energy products from textile waste. For the experiment, lint-microfibers were collected from the filters of the drying machines in KTU dormitories. As the residents of the dormitories come from different cultures in Europe, Africa, Asia and America, the collected samples were very diverse. Using a pilot pyrolysis plant built at the laboratories of the Lithuanian Energy Institute, the scientists extracted three energy products --oil, gas and char-- from the collected lint-microfiber batches. When treated thermally, the lint-microfibers decompose into energy products with around a 70 per cent conversion rate.

"When we think about textile waste, we usually imagine long fabric with high crystallinity, which is contaminated with dye and dirt. Much energy is needed to turn that solid waste into liquid. However, lint-microfiber is a somewhat 'broken fiber' textile waste: it has a uniform size and shape and contains many flammable compounds (from its residual cotton and polyester elements), so its transformation is easier", says Dr Yousef.

Researchers also developed a mathematical model to evaluate the economic and environmental performance of the suggested strategy, based on the lint-microfibers generated by 1 million persons. The study shows that, if applied on an industrial scale, the strategy is profitable and eco-friendly: the energy from the lint-microfiber generated by 1 million people would yield an estimated profit of around €100 thousand and a reduced carbon footprint of 42,039,000 kg CO2-eq/t of lint-microfibers.
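The per-person scale of those estimates can be worked out from the figures above. A small sketch, assuming the reported "tons" are metric tonnes and noting that the release does not state the time basis explicitly:

```python
# Rough per-person arithmetic implied by the figures above. Assumes the
# reported "tons" are metric tonnes (1 t = 1,000,000 g); the time basis
# (presumably per year) is not stated explicitly in the release.
people = 1_000_000
oil_t, gas_t, char_t = 14.0, 21.5, 10.0   # tonnes of products from 1 million people

products_g_per_person = (oil_t + gas_t + char_t) * 1_000_000 / people
print(f"Energy products per person: {products_g_per_person:.1f} g")   # 45.5 g

# At the reported ~70% conversion rate, the implied lint feedstock is:
feedstock_g_per_person = products_g_per_person / 0.70
print(f"Lint-microfiber feedstock per person: {feedstock_g_per_person:.0f} g")  # ~65 g
```

The modest per-person quantity is consistent with the deposit-return collection scheme Dr Yousef proposes below.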

"I believe that the collection system, similar to deposit-return for drink containers, could be developed based on our research. A household would bring the lint-microfiber from their drying machine filters to a collection point and receive some kind of compensation for it. We have proposed the technology and made calculations, which may be developed further", says Dr Yousef.

According to research, lint-microfibers can be considered a renewable energy source that ensures sustainability and accelerates the general transition of the textile industry to a circular economy. In addition to the study described above, Dr Yousef and his research group have developed other green/eco-friendly technologies to extract cotton, glucose, and energy products from textile waste and end-of-life euro banknotes using mechanical, thermal, chemical, and biological treatments.

Credit: 
Kaunas University of Technology

Study finds lightning impacts edge of space in ways not previously observed

Solar flares jetting out from the sun and thunderstorms generated on Earth impact the planet's ionosphere in different ways, which has implications for the ability to conduct long-range communications.

A team of researchers working with data collected by the Incoherent Scatter Radar (ISR) at the Arecibo Observatory, satellites, and lightning detectors in Puerto Rico has, for the first time, examined the simultaneous impacts of thunderstorms and solar flares on the ionospheric D-region (often referred to as the edge of space).

In the first-of-its-kind analysis, the team determined that solar flares and lightning from thunderstorms trigger distinct changes to that edge of space, which is used for long-range communications such as the GPS found in vehicles and airplanes.

The work, led by New Mexico Tech assistant professor of physics Caitano L. da Silva, was published recently in Scientific Reports, a journal of the Nature Publishing Group.

"These are really exciting results," says da Silva. "One of the key things we showed in the paper is that lightning- and solar flare-driven signatures are completely different. The first tends to create electron density depletions, while the second creates enhancements (or ionization)."

While the AO radar used in the study is no longer available because of the collapse of AO's telescope in December of 2020, scientists believe that the data they collected and other AO historical data will be instrumental in advancing this work.

"This study helps emphasize that, in order to fully understand the coupling of atmospheric regions, energy input from below (from thunderstorms) into the lower ionosphere needs to be properly accounted for," da Silva says. "The wealth of data collected at AO over the years will be a transformative tool to quantify the effects of lightning in the lower ionosphere."

Better understanding the impact on the Earth's ionosphere will help improve communications.

da Silva worked with a team of researchers at the Arecibo Observatory (AO) in Puerto Rico, a National Science Foundation facility managed by the University of Central Florida under a cooperative agreement. The co-authors are AO Senior Scientist Pedrina Terra, Assistant Director of Science Operations Christiano G. M. Brum, and Sophia D. Salazar, a student at NMT who spent her 2019 summer at the AO as part of the NSF-supported Research Experiences for Undergraduates (REU) program. Salazar completed the initial analysis of the data as part of her internship, under the senior scientists' supervision.

"The Arecibo Observatory REU is hands down one of the best experiences I've had so far," says the 21-year-old. "The support and encouragement provided by the AO staff and REU students made the research experience everything that it was. There were many opportunities to network with scientists at AO from all over the world, many of which I would likely never have met without the AO REU."

AO's Terra and Brum worked with Salazar, taking her initial data analysis, refining it and providing interpretation for the study.

"Sophia's dedication and her ability to solve problems grabbed our attention from the very first day of the REU program," Brum says. "Her efforts in developing this project resulted in publication in one of the most prestigious journals in our field."

"Another remarkable result of this work is that, for the first time, a mapping of the spatial and seasonal occurrence of lightning strikes over the Puerto Rico archipelago is presented," Brum says. "Also intriguing was the detection of a lightning activity hotspot concentrated in the western part of the Cordillera Central mountain range of Puerto Rico."

Credit: 
University of Central Florida

Nursing shortage affects rural Missourians more, MU study finds

image: Anne Heyen is an assistant teaching professor at the MU Sinclair School of Nursing.

Image: 
Sinclair School of Nursing

COLUMBIA, Mo. -- While the United States faces a nationwide nursing shortage, a recent study at the University of Missouri found rural Missouri counties experience nursing shortages at a greater rate than the state's metropolitan counties. In addition, the study found rural Missouri counties have a higher percentage of older nurses nearing retirement, which could have a severe impact on the future of the state's nursing workforce.

Anne Heyen, an assistant teaching professor in the MU Sinclair School of Nursing, analyzed workforce data of nearly 136,000 licensed Missouri nurses to identify the age and geographical disparities across the state.

"Out of the 114 total counties in Missouri, 97 are designated as health care professional shortage areas, and a majority of these counties are rural," Heyen said. "By identifying the specific areas where there is the greatest need for more nurses, we can better tailor our response to help Missouri have a more balanced nursing workforce."

The study found 31% of all Missouri nurses are older than 54, and rural Missouri counties had higher percentages of nurses over age 54 than their urban counterparts, including three rural counties --DeKalb, Reynolds and Worth-- where more than half of the nurses are over age 54.

"In some of these rural areas where nearly half of the nursing workforce is nearing retirement, now is the time to be proactive and start thinking about who is going to replace them 10 years down the road," Heyen said. "Research has shown nurses tend to stay and work where they are educated, which can influence young nurses to stay in urban areas where there tend to be more educational resources."

Heyen added higher pay and more job opportunities in cities also lead young nursing students to pursue work in the urban areas they are often educated in, which contributes to the geographical disparities for the nursing shortage.

"This research identifies the specific areas in Missouri facing nursing shortages so that potential solutions can be targeted to the areas with the greatest needs," Heyen said.

Institutions of higher education can play a key role in addressing the disparities, according to Heyen.

"Whether it's potentially partnering with community colleges in rural areas or establishing satellite campuses with dual credit options or more outreach programs, universities and their nursing schools can use this information to brainstorm solutions to assist underserved communities and provide more educational and employment opportunities to nursing students in the areas that need it most," Heyen said. "The overall goal of this research is to make sure everyone in Missouri ultimately has access to the health care they need, regardless of where they live, and identifying where the nursing shortages occur is a key first step."

As an assistant teaching professor, Heyen is passionate about educating the next generation of nurses, who will be in high demand as the need for nurses rises.

"It feels rewarding to see the nursing students I have taught go out into the world and make a positive difference at a time when they are so desperately needed," Heyen said. "Mizzou and the University of Missouri System are well poised to help address these challenges going forward, given their influence and impact throughout the state."

To help meet the nursing shortage, the Sinclair School of Nursing's new 64,585-square-foot facility, expected to be completed on MU's campus by spring 2022, will allow the school to increase class sizes and graduate more nurses. In addition, the school is placing an emphasis on recruiting more students from the 25-county service area MU Health Care oversees, as students who come from a rural area are more likely to return there for work after they graduate.

"Show me the nursing shortage: Location matters in Missouri nursing shortage" was recently published in the Journal of Nursing Regulation. Co-authors on the study include Lori Scheidt and Tracy Greever-Rice.

Credit: 
University of Missouri-Columbia

Hope for infertile men; mice could hold the secret

Male infertility affects more than 20 million men globally and contributes to around 50% of infertility cases in couples. Frequently, male infertility results from defects in the sperm tail, the flagellum, which allows the sperm to swim toward an egg. Males with severe infertility can have multiple sperm malformations, including flagella that are shortened, irregular, coiled or even absent, preventing the sperm from swimming.

In humans, several genetic mutations lead to malformed sperm, including those affecting the sheath that covers the sperm; the mitochondria, which power sperm as they swim; and a tiny sac, the acrosomal vesicle, which releases the enzymes that allow one successful sperm to break down the exterior lining of the egg cell to fertilize it.

To understand more about the causes of male infertility, Drs Na Li and Ling Sun, research group leaders at Guangzhou Women and Children's Medical Center, collected sperm samples from infertile men and identified one individual with multiple defects affecting his sperm flagella. Through genetic analysis, they found a mutation in a largely unknown sperm protein, FSIP2 (Fibrous Sheath-Interacting Protein 2), a component of the fibrous sheath. "The fibrous sheath covers the tails of sperm found in humans, mice and other species in which fertilisation occurs within the animal's body", explains Li. "It offers the sperm tails flexibility and strength, which is necessary for sperm to swim in the dense and sticky medium of the human body before they meet the egg. Interestingly, animals whose sperm swim through water because fertilization occurs outside of the body, such as fish, either do not have the FSIP2 protein or, at most, a defective version."

To study the function of FSIP2, Li, Sun and their team generated two sets of mice: one in which they recreated the FSIP2 mutation of the human patient and another in which the animals overproduce the FSIP2 protein. They found that mice with the FSIP2 mutation became infertile; their semen contained fewer live sperm, and over 50% could not swim forward, even though some could still beat their flagella. In contrast, the mice that overproduced the FSIP2 protein remained fertile and, compared to normal mice, had over 7 times more super-long sperm, which could swim faster and were more capable of fertilizing an egg.

To understand the reasons for these changes in the sperm flagella, the researchers looked at the composition of the sperm. They found that the sperm of mice with the FSIP2 mutation had lower amounts of the proteins that make up the sheath surrounding the sperm, the mitochondrial power generators and the acrosomal vesicle. In contrast, the sperm of the mice that were overproducing FSIP2 made more sperm tail proteins, particularly in the fibrous sheath, which could allow sperm to swim more easily through the body. They published this discovery in Development at http://journals.biologists.com/dev.

The findings of Li, Sun and their team offer hope that scientists can begin to develop treatments for infertility, either by finding drugs that restore sperm movement or even by finding ways to correct the debilitating mutation that causes the problems in the first place. Ultimately, such treatments could give men suffering from infertility the chance of becoming fathers.

Credit: 
The Company of Biologists

PCF-based 'parallel reactors' unveil collective matter-light analogies of soliton molecules

image: a. Schematic of the parallel optical-soliton reactors based on a mode-locked ring-fibre laser cavity. The temporal optomechanical (OM) lattice enabled by PCF provides trapping potentials to host parallel soliton interactions, while global and individual manipulations can be applied to control the interaction. b. PCF microstructure. c. Schematic of controlled soliton reactions in parallel trapping potentials. The solitonic elements trapped in each reactor can be transitioned between phase-uncorrelated long-range bound states and phase-locked soliton molecules, corresponding to the synthesis and dissociation of soliton molecules.

Image: 
by Wenbin He, Meng Pang, Dung-Han Yeh, Jiapeng Huang, Philip St.J. Russell

Optical solitons are nonlinear optical wave-packets that maintain their profile during propagation, even in the presence of moderate perturbations, offering useful applications in optical communications, all-optical information processing and ultrafast laser techniques. The interaction between optical solitons exhibits many particle-like properties and has been widely investigated for decades. In particular, bound states of optical solitons in nonlinear dissipative systems, which result from balanced interactions, manifest unique matter-light analogies and are epitomized by "soliton molecules" - compact multi-soliton structures that propagate as invariant single entities. The dynamics of soliton molecules have attracted wide interest, especially their synthesis and dissociation, which are reminiscent of chemical reactions. However, studies of soliton molecules have mostly relied on uncontrolled random excitations and have long plateaued at the single-object level, without exploring the stochastic and statistical properties that involve massive numbers of solitons. This has made higher-level study of multi-soliton dynamics difficult.

In a new paper published in Light: Science & Applications, a team of scientists led by Dr. Wenbin He and Dr. Meng Pang in Prof. Philip Russell's division at the Max Planck Institute for the Science of Light in Germany has developed a unique platform, named "parallel optical-soliton reactors", which can host massive numbers of dynamic soliton-molecule events. Such parallel reactors, resembling chemical reactors, can isolate and host multiple solitons and then manipulate their interactions through various all-optical methods. When hundreds of such parallel reactors are operated simultaneously, with carefully prepared initial states and control techniques, on-demand synthesis and dissociation of soliton molecules can be initiated in massive numbers, unfolding a novel panorama of multi-soliton dynamics that are stochastic in nature. Moreover, statistical rules emerge from the massively parallel reactions that closely resemble classical chemical kinetics, promoting the conventional matter-light analogy to a collective level. These results bring a higher-level insight into soliton dynamics that can benefit both fundamental research on nonlinear systems and practical applications involving massive numbers of optical solitons.

The parallel optical-soliton reactors are based on a unique optomechanical lattice created in an optoacoustically mode-locked fibre laser. The key component is a short piece of photonic crystal fibre (PCF) - a special micro-structured optical fibre with a micro-core surrounded by an array of hollow channels. The scientists summarize the operating principle of their parallel reactors:

"Optoacoustically mode-locked fibre lasers based on micro-core PCFs, which have been developed in our lab for many years, make use of the enhanced optoacoustic interactions in the micro-core PCF. When inserted in a conventional mode-locked fibre laser, the PCF provides an acoustic resonance, typically at GHz rate, through which the meters-long fibre cavity can be effectively divided into hundreds of time-slots, each corresponding to one acoustic vibration cycle, leading to the formation of an optomechanical lattice. Each time-slot, or "lattice cell" can host multiple solitons that are isolated from other time-slots and can be manipulated, functioning as many parallel reactors in which the reactants are optical solitons instead of real atoms and molecules."

"The major breakthrough of this work is the on-demand control of the soliton interactions in each parallel-reactor hosted by the optomechanical lattice. We categorized the methods into two types. One relied on laser cavity perturbations that affect all reactors simultaneously, which is called 'global control'. The other utilize external addressing pulses to induced perturbations upon selected reactors without affecting the others, which is called 'individual control'. Phase-uncorrelated long-range soliton interactions play an important role in such controlled interaction. The controlled synthesis and dissociation of soliton molecules are actually enabled by careful tailoring of the long-range soliton interactions."

"By careful adjustment of the laser cavity, we have successfully initiated hundreds of soliton-molecule synthesis/dissociation events in parallel. We employed the dispersive Fourier transform (DFT) method to capture the transient multi-soliton dynamics in each reactor. By analyzing these massively parallel events recorded in the experiment, which are unavailable in previous studies, we have unveiled many features of multi-soliton dynamics, including a few statistical rules that emulate classic chemical kinetics, suggesting a collective-level matter-light analogy."

"The presented technique offered a series of new possibilities for studying optical solitons. Many phenomena concerning soliton dynamics can possibly be re-examined using such parallel-reactor scheme to gain a collective-level insight. The various control technique, especially the individual control methods that enabled selective editing of multi-soliton states, can be potentially useful in optical information technology that use solitons as bit-carriers. We also expect the concept of parallel reactors to be realized in other platforms, e.g. using a massive array of micro-resonators." the scientists forecast.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

Big data: IPK researchers double accuracy in predicting wheat yields

image: By increasing population sizes, an international team of scientists led by IPK Leibniz-Institute was able to double the prediction accuracy for wheat yield.

Image: 
IPK/ Christoph Martin

The enormous potential of Big Data has already been demonstrated in areas such as financial services and telecommunications. An international team of researchers led by the IPK Leibniz Institute has now tapped the potential of big data for the first time on a large scale for plant research. To this end, data from three projects were used to increase the predictive accuracy for yield in hybrid varieties of wheat.

"We were able to draw on the largest dataset published to date, which contains information from almost a decade of wheat research and development," says Prof. Dr. Jochen Reif, Head of the Breeding Research Department at IPK. The results, which could herald a new era for plant breeding, have now been published in the magazine Science Advances.

In total, data on more than 13,000 genotypes tested in 125,000 yield plots were analysed. For comparison: in a typical breeding programme, plants are tested in 20,000 yield plots every year. "It was clear to us that we would have to increase the population sizes in order to develop robust predictive models for yield," says Prof. Dr. Jochen Reif, "so in this case it really was true that 'a lot goes a long way'." The effort was worth it, he says. "We were able to double the predictive accuracy for yield in our study."

The research team used data from two previous projects, HYWHEAT (funded by the Federal Ministry of Education and Research) and Zuchtwert (funded by the Federal Ministry of Food and Agriculture), as well as from a programme of the seed producer KWS. The fundamental challenge in such studies is to prepare the information to a uniform quality level and thus enable a common analysis. "Since we were responsible for the designs of the experiments from the start, we were able to plan them in such a way that a small proportion of the same genotypes were always tested across the projects, thus enabling an integrated analysis in the first place," says Prof. Dr. Jochen Reif.

The scientist is firmly convinced that it pays off to use Big Data for plant breeding and research. "We have ultimately worked on the future of all of us", says the IPK scientist. "We have succeeded in showing the potential of Big Data for breeding yield-stable varieties in times of climate change."

According to Prof. Dr. Jochen Reif, the current model study has a significance that goes far beyond one crop type and will hopefully herald a cultural change in breeding. "We were able to show the great benefits of Big Data for plant breeding. However, this is only possible through trusting cooperation among all stakeholders, who must share data and master the challenges of the future together."

Ultimately, this is also the entry point for the use of artificial intelligence (AI). "In plant breeding and research, too, the successful use of AI stands and falls with curated and comprehensive data. Our current study is an important door-opener on this path."

Credit: 
Leibniz Institute of Plant Genetics and Crop Plant Research

Factors Associated With Self-reported Symptoms of Depression Among Adults With/Without Previous COVID-19

What The Study Did: This survey study compared features of major depression in people with or without prior COVID-19 illness.

Authors: Roy H. Perlis, M.D., M.Sc., of Massachusetts General Hospital in Boston, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2021.16612)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

New dipping solution turns the whole fish into valuable food

image: After filleting, there are still lots of valuable and nutritious parts of the fish left, such as the backbones, heads and fins. By dipping these side streams into a specially developed solution, containing ingredients such as rosemary extract and citric acid, their shelf life can be extended significantly, giving a useful window of time to process them further. Haizhou Wu is one of the researchers in the project.

Image: 
Pixabay, Haizhou Wu/Chalmers, Xueqing Lei, Freepik

When herring are filleted, more than half their weight becomes a low-value 'side stream' that never reaches our plates - despite being rich in protein and healthy omega-3 fatty acids. Now, scientists from Chalmers University of Technology, Sweden, have developed a special dipping solution, with ingredients including rosemary extract and citric acid, which can significantly extend the side streams' shelf life, and increase the opportunities to use them as food.

Techniques for upgrading these side-streams to food products such as minces, protein isolates, hydrolysates and oils are already available today, and offer the chance to reduce the current practices of using them for animal feed, or, in the worst cases, simply throwing them away.

However, the big challenge is that the unsaturated fatty acids found in fish are very sensitive to oxidative degradation, meaning that the quality starts to decrease after just a few hours. This results in an unpleasant taste, odour, colour and texture in the final product. The reason why side stream parts from the fish such as backbones and heads are so sensitive is because they are rich in blood, which in turn contains the protein haemoglobin, which accelerates the fatty acid degradation process.

"Our new technology offers a valuable window of time for the producer, where the side-streams remain fresh for longer, and can be stored or transported before being upgraded into various food ingredients," says Ingrid Undeland, Professor of Food Science at the Department of Biology and Biological Engineering at Chalmers University of Technology.

The new technology is based on a dipping solution containing ingredients such as rosemary extract and citric acid. Within the framework of a European project called WaSeaBi, and together with colleagues Haizhou Wu and Mursalin Sajib, Ingrid Undeland recently published a scientific study exploring the possibilities of the method.

Recycling the solution up to ten times

The results showed that dipping the side stream parts from the herring filleting process into the solution, prior to storage, significantly extended the time before rancidity developed. At 20 °C, the storage time could be extended from less than half a day to more than three and a half days, and at 0 °C, from less than one day to more than eleven days.

"And because the dipping solution covers the surface of side stream parts with a thin layer of antioxidants, these are carried over to the next stage of the process, providing more high-quality minces, protein or oil ingredients," explains Ingrid Undeland.

To make the technology cost-effective, the possibility of re-using the solution was also investigated. Results showed that even after reusing the solution up to ten times, rancidity was completely inhibited at 0 °C. In addition, it was found that the solution kept the fish haemoglobin in a form that was more stable and less reactive with the fatty acids, which the researchers believe explains the decrease in oxidation.

More on the study, and the possibilities of side-streams

The study, Controlling hemoglobin-mediated lipid oxidation in herring (Clupea harengus) co-products via incubation or dipping in a recyclable antioxidant solution, was published with open access in the journal Food Control. It was based on herring side-streams from Sweden Pelagic; however, results obtained by dipping cod side-streams from Royal Greenland also confirm that rosemary-based antioxidant mixtures protect well against oxidation. This means that the solution can be used to prevent rancidity in different kinds of fish side-streams. The study was made available online in February, ahead of final publication in the July 2021 issue.

Valuable side streams from fish include, for example, the backbones and heads, which are rich in muscle and therefore suitable for fish mince or protein ingredients. As the belly flap and intestines are rich in omega-3 fatty acids, they can be used for oil production. The tail fin has a lot of skin, bones and connective tissue and is therefore well suited for the production of marine collagen, a much sought-after ingredient on the market right now. In addition to food, marine collagen is also used in cosmetics and 'nutraceuticals', with documented good effects on the health of our joints and skin.

Credit: 
Chalmers University of Technology

Anomalous weak values via a single photon detection

image: a, Experimental setup. Photons at 702 nm from a single-mode fiber (SMF) are decoupled and collimated in a free-space Gaussian mode. The Robust Weak Measurement is obtained by means of the n = 7 identical blocks put after the initial polarizing beam splitter (PBS). A 2D spatial resolving detector (an EM-CCD camera working in photon-counting regime) determines the final position of the photons. b, Schematic of each of the n = 7 blocks realizing the RWM: a half-wave plate (HWP) is responsible for the preselection, a pair of birefringent crystals (BCs) implements the weak coupling between the polarization (measured observable) and the transverse momentum (measuring device) of the photon, and finally a polarizing plate (Pol) realizes the postselection. c, Measurement of an anomalous weak value with a single click. The yellow solid lines indicate the boundaries of the eigenvalue spectrum of the measured observable (i.e. the polarization of the photon at n = 7 different times). Upon suitable pre- and postselection, the theoretically expected value (18.7) is highlighted by a green dashed line, while the white pixel and red uncertainty bars show the single-shot experimental result, 21.4 ± 4.5.

Image: 
by Enrico Rebufello, Fabrizio Piacentini, Alessio Avella, Muriel A. de Souza, Marco Gramegna, Jan Dziewior, Eliahu Cohen, Lev Vaidman, Ivo Pietro Degiovanni, and Marco Genovese

In the field of quantum measurement, weak values, introduced in 1988 by Aharonov, Albert and Vaidman (AAV), represent one of the most intriguing and puzzling paradigms, with many properties in sharp contrast with traditional (projective) quantum measurements.

In fact, by weakening the coupling between the measured particle and the measuring device, and exploiting suitable pre- and postselection, AAV showed that it is possible to obtain a value of 100 while (weakly) measuring the spin of a spin-½ particle.

Such a result was obtained by averaging over multiple measurements on identically pre- and postselected particles; hence, a debate started on the single-particle versus statistical nature of weak values, as well as on their "quantumness", within the more general discussion of weak values as a tool for understanding the very foundations of quantum mechanics.
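The weak value itself is defined as A_w = ⟨φ|A|ψ⟩/⟨φ|ψ⟩ for a preselected state |ψ⟩ and a postselected state |φ⟩; when the two states are nearly orthogonal, A_w can land far outside the eigenvalue spectrum. A minimal numerical sketch of AAV's spin-½ effect follows; the particular states chosen below are illustrative, not those of the original paper or of this experiment:

```python
import numpy as np

# Weak value A_w = <phi|A|psi> / <phi|psi> for a spin-1/2 observable
# (sigma_z, eigenvalues +1 and -1). Illustrative choice of states:
# |psi> along +x, |phi> nearly orthogonal to it.
sigma_z = np.diag([1.0, -1.0])

psi = np.array([1.0, 1.0]) / np.sqrt(2.0)     # preselection, |+x>
delta = 0.01                                  # small overlap parameter
beta = 3 * np.pi / 4 + delta                  # angle nearly orthogonal to psi
phi = np.array([np.cos(beta), np.sin(beta)])  # postselection

weak_value = (phi @ sigma_z @ psi) / (phi @ psi)
print(weak_value)  # ~100: far outside the eigenvalue range [-1, +1]
```

Shrinking `delta` makes the postselection rarer and the weak value larger, which is exactly the trade-off behind AAV's anomalous spin reading of 100.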

In a new paper published in Light: Science & Applications, a team of researchers led by Dr. Marco Genovese from the Italian metrological institute INRIM (Turin, Italy), in collaboration with researchers from the Brazilian metrological institute INMETRO (Rio de Janeiro, Brazil), the Max-Planck-Institut für Quantenoptik (Garching, Germany), Bar-Ilan University (Ramat Gan, Israel) and Tel Aviv University (Tel Aviv, Israel), sheds new light on this decades-old debate with a quantum optics experiment that measures, for the first time, an anomalous weak value with a single detection event, without any statistics.

This was obtained by realizing a novel measurement paradigm dubbed Robust Weak Measurement, implemented as an iterative protocol in which the measured particle (a photon, in our case) goes through a sequence of n blocks, each implementing the preselection, weak coupling and postselection mechanisms.

This way, the measured observable is "the sum of polarisation variables of the same photon at n different times, with the spatial degree of freedom of this single photon playing the role of the measuring device."

Regarding the experiment realization and results, the authors write:

"The experimental setup is composed of a set of n = 7 blocks in which a birefringent crystal pair realises the weak interaction, preceded by a half-wave plate and followed by a polarising plate. While the polarising plate performs the postselection, the half-wave plate rotates the polarisation of the photon outgoing the previous block to set the preselection state. The EM-CCD placed at the end of the n = 7 blocks detects the arrival position of the photon."

"We measured an observable with eigenvalues in the range [-7,7]. The weak value of the observable of the pre- and postselected system on which a single-click measurement was performed was 18.7, and our single click yielded 21.4 ± 4.5."

"Our findings stress the non-statistical, single-particle nature of weak values, demonstrating how a single-click measurement can provide a weak value estimate even for anomalous weak values. Furthermore, this experiment suggests a viable possibility for amplification methods effectively reducing the uncertainty contribution associated with the measurement of the pointer. This paves the way for future practical applications of the robust weak measurement paradigm."

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

Olfactory virtual realities show promise for mental health practices and integrative care

image: "The OVR environment is an immersive, three dimensional, 6 DoF (Six Degrees of Freedom) environment in which the subject can freely move and interact with the virtual items presented therein (e.g. campfire, marshmallows, sticks/logs wood, flowers, citronella candles, bacon, tree bark, sodacola), with ambient scent of forest, as well as natural environment sounds such as leaves, etc., to recreate a reality-like environment," the study explains.

Image: 
Courtesy OVR Technology

BURLINGTON, VT -- Findings from a study on the feasibility of addressing anxiety, pain and stress with Olfactory Virtual Reality (OVR) -- a new form of VR that incorporates the sense of smell into its augmented reality -- paint a clearer picture for clinical psychiatrists about how it could be used to safely and effectively help mental health and mood disorders. What's more, it holds promise for improved access and inclusion of patients impacted by physical limitations or constraints, such as patient mobility, comorbidities and safety.

Building on previous research proving VR's effectiveness in "distraction for pain and medical procedures, relaxation and calming, and immersion therapy for trauma, PTSD and phobias," the study -- published by the Journal of Medical Research and Health Sciences this spring -- provides evidence that stimulating the olfactory system via scent in practitioner-administered virtual realities can trigger memory, cognition and emotion, and may improve the therapeutic benefits of augmented realities targeting chronic pain, anxiety and mood disorders.

"The OVR sessions...were focused on creating a more immersive, realistic, evocative, meaningful and emotional [virtual and altered reality] experience," explains the study led by David Tomasi -- a clinical psychologist and psychotherapist at UVM Medical Center (UVMMC) and teacher of integrative health at the University of Vermont -- "by allowing for the subjects enrolled therein to enter a calming and realistic environment, in order to decrease the amount of anxiety, stress and pain experienced."

Tomasi and a team of psychotherapists at UVMMC's Inpatient Psychiatry Department collaborated with OVR Technology, a Burlington, Vermont-based company that specializes in olfactory virtual reality, to design a relaxing virtual forest and campsite that could be independently and fully experienced in an area of just 100 square feet. Using software, scentware and hardware supplied by OVR Technology, the team created a simulation complete with a virtual tent, picnic table, fire pit, logs and other objects to touch, and aromas of fresh bacon and toasted marshmallows.

"At OVR, designing new scents is a collaborative process between what the desired outcome is of the experience, along with what makes sense given the auditory and visual stimuli," says Vice President of Scentware for OVR Technology Sarah Socia, who collaborated on the study. "We focus on the entire experience -- the mix of audio, visual and olfactory stimuli that give rise to the experience and then the feelings follow suit."

Participants -- all inpatient psychiatry patients who voluntarily participated in the study -- were immersed in the forest camp environment for 8-12 minutes in weekly OVR sessions that coincided with their standard clinical treatment plans. Following the OVR sessions, participants reported significant and immediate improvements in their anxiety, stress and pain levels that lasted up to three hours after a session.

Among the most dramatic improvements reported by participants were reduced anxiety levels. Asked throughout the sessions to rate their anxiety on a scale of 1 to 10 (with 1 being the lowest and 10 the highest), nearly half the participants (45.6%) rated their anxiety prior to OVR as either a 9 or 10. Roughly the same share (44.6%) rated their anxiety immediately after the session as either a 1 or 2. Between one and three hours later, half the participants (50%) rated their anxiety as either a 2 or 3. In all, participants' anxiety ratings dropped by a median of 5 points from start to finish.

"OVR allowed patients whose circumstances excluded them from physical activity and exposure to nature to virtually experience physical activity in nature with similar sounds, sights and smells to a real-world scenario," says Tomasi. "Those similar sensations evoked memories and responses that reduced anxiety and improved mood, just as the real experience would."

While the study was years in the making, it reflects data collected over a four-month span between September and December 2020 -- a critical point in the COVID-19 pandemic. The timing certainly was not ideal, Tomasi says, but the unlikely circumstance offered a silver lining, bringing new understanding of the potential of OVR within the context of forced social isolation.

"The added COVID-19 restrictions, on top of an already very limiting situation for many individuals suffering with mental health disorders, presented a very difficult challenge to the research," he says. "However, we can say that precisely because of this situation, we were able to see how important this approach is to help mental health in general."

Credit: 
University of Vermont

When physics meets financial networks

Generally, physics and financial systems are not easily associated in people's minds. Yet, principles and techniques originating from physics can be very effective in describing the processes taking place on financial markets. Modeling financial systems as networks can greatly enhance our understanding of phenomena that are relevant not only to researchers in economics and other disciplines, but also to ordinary citizens, public agencies and governments. And the theory of Complex Networks represents a powerful framework for studying how shocks propagate in financial systems, identifying early-warning signals of forthcoming crises, and reconstructing hidden linkages in interbank systems.

In a review article appearing in Nature Reviews Physics, several scholars in Complex Networks have now teamed up to organize and update the knowledge in the field. The article summarizes over 15 years of truly interdisciplinary research, highlighting how the statistical physics approach has shed light on various key properties of these phenomena. The authors represent some of the most internationally active research groups in the field, based at the IMT School for Advanced Studies Lucca, the University of Leiden, "Ca' Foscari" University of Venice, University of Zurich, "Tor Vergata" University of Rome, University College London and the Bank of England.

The starting point of the analysis is the recognition that financial institutions are linked together in a global web of interactions whose structure can be analyzed quantitatively by means of network theory, the framework that studies the structure and consequences of the relationships connecting different objects in large systems. In fact, the financial system can be viewed as a network whose nodes represent agents - e.g. retail and investment banks, insurance companies, investment funds, central banks but also non-financial firms and households - and whose edges represent dependencies between nodes.

The models traditionally employed by regulators and policymakers rely on far too simple representations of financial systems, describing them either as collections of isolated actors or as a homogeneous "mixture" in which each actor interacts equally with all the others. However, as the 2007-2008 crisis dramatically showed, both representations fail to capture the highly heterogeneous and intertwined structure of these systems, as well as its implications for society. When the crisis struck, banks that broke down could not repay their debts, causing other banks to go bankrupt in a cascading effect whose dynamics strongly depended on the details of the interconnection patterns. Policymakers admitted that they felt abandoned by traditional economic models.

Here is where network theory comes into play, by clarifying the interplay between the structure of the network, the heterogeneity of the individual characteristics of financial actors and the dynamics of risk propagation, in particular contagion, i.e. the domino effect by which the instability of some financial institutions can reverberate to other institutions to which they are connected. The associated risk is indeed "systemic", i.e. both produced and faced by the system as a whole, as in collective phenomena studied in physics. "Each bank determines the interest rate for loans to other banks based on their perceived individual riskiness," explains Diego Garlaschelli, Associate Professor at the IMT School for Advanced Studies Lucca and at Leiden University, the Netherlands. "However if those banks are in turn interconnected via other loans, then the actual risk of a collective default can be much higher. Since the existence of loans is a matter of confidentiality, one has to devise new techniques to guess the key properties of interbank networks from partial information. This is crucial also for central banks that strive to run reliable stress tests on the financial system. A nontrivial generalization of the statistical physics framework allowed us to address this challenge in an original way."
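The reconstruction problem mentioned in the quote can be illustrated with a maximum-entropy-style sketch: each bank's size acts as a "fitness", an unobserved link between two banks is assumed to exist with probability p_ij = z·s_i·s_j / (1 + z·s_i·s_j), and the free parameter z is calibrated so that the expected number of links matches an aggregate estimate. This is a simplified sketch in the spirit of fitness-based reconstruction methods; the sizes, link target, and calibration details below are illustrative assumptions, not the authors' actual procedure.

```python
# Sketch of fitness-based network reconstruction: link probabilities
# from publicly known bank sizes, calibrated to an aggregate link count.
# Sizes and the target link count are hypothetical.

from itertools import combinations

sizes = {"A": 100.0, "B": 40.0, "C": 10.0, "D": 5.0}  # observed "fitnesses"
target_links = 4.0  # assumed aggregate number of links (e.g. from a census)

def expected_links(z):
    """Expected number of links under p_ij = z*s_i*s_j / (1 + z*s_i*s_j)."""
    return sum(z * sizes[i] * sizes[j] / (1 + z * sizes[i] * sizes[j])
               for i, j in combinations(sizes, 2))

# expected_links is increasing in z, so solve for z by bisection.
lo, hi = 1e-12, 1e3
for _ in range(200):
    mid = (lo + hi) / 2
    if expected_links(mid) < target_links:
        lo = mid
    else:
        hi = mid
z = (lo + hi) / 2

# Probability of each possible (undirected) link in the reconstructed ensemble.
p = {(i, j): z * sizes[i] * sizes[j] / (1 + z * sizes[i] * sizes[j])
     for i, j in combinations(sizes, 2)}
```

The output is not a single guessed network but an ensemble of probable networks, which is what makes the approach suitable for stress tests on confidential data.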

The publication in Nature Reviews Physics recognizes that financial networks are one of the new frontiers of modern physics; it also acknowledges the key role played by statistical physics in providing a mathematical description of the relation between the microscopic and macroscopic properties of systems composed of many parts, including social and economic ones.

The authors of the review have been working in the field of financial networks for several years. "Our network reconstruction methods have been tested by various groups worldwide, including one uniting researchers from several central banks, and have been found to systematically outperform the alternative ones," says Tiziano Squartini, Assistant Professor in physics at the IMT School. "In collaboration with the Dutch Central Bank, we even found that, while the 2007-2008 crisis came as a surprise to traditional models, a network analysis accounting for the observed heterogeneity of banks could have predicted it three years in advance."

Today, almost fifteen years after the financial crisis, the role of networks in monitoring financial stability and designing macroprudential regulation is widely recognized. Policymakers and researchers agree that systemic risk must be studied and managed from a network perspective, and that institutions should adopt network models for risk assessment more comprehensively. This is also reflected in the policy actions and discourse of the highest financial authorities, both in the US and in the EU.

Credit: 
IMT School for Advanced Studies Lucca