Catalyzing the conversion of biomass to biofuel

image: Prof. Lercher in his laboratory at the Department of Chemistry at the Technical University of Munich.

Image: 
Andreas Heddergott / TUM

Zeolites are extremely porous materials: Ten grams can have an internal surface area the size of a soccer field. Their cavities make them useful in catalyzing chemical reactions and thus saving energy. An international research team has now made new findings regarding the role of water molecules in these processes. One important application is the conversion of biomass into biofuel.

Fuel made from biomass is considered climate-neutral, although energy is still needed to produce it: the desired chemical reactions require high temperatures and pressures.

"If we are to do without fossil energy sources in the future and make efficient large-scale use of biomass, we will also have to find ways to reduce the energy required for processing the biomass," says Johannes Lercher, professor of Chemical Technology at the Technical University of Munich (TUM) and Director of the Institute for Integrated Catalysis at the Pacific Northwest National Laboratory in Richland, Washington (USA).

Working together with an international research team, Lercher has taken a closer look at the role of water molecules in reactions inside the zeolite's pores, which are less than one nanometer in size.

It all starts with acids

One characteristic of an acid is that it easily donates protons. Thus, when added to water, hydrochloric acid splits into negatively charged chloride anions, like those found in table salt crystals, and positively charged protons, which attach themselves to water molecules. The result is a positively charged hydronium ion, which in turn seeks to pass this proton on, for example to an organic molecule.

When the organic molecule is "forced" to accept a proton, it tries to stabilize itself. Thus, an alcohol can give rise to a molecule with a double bond - a typical reaction step on the path from biomass to biofuel. The zeolite walls stabilize the transitional states that occur during conversion and thus help to minimize the energy required for the reaction to occur.

Zeolites acting as acids

Zeolites contain oxygen atoms in their crystal structure which already carry a proton. Like molecular acids, they form hydronium ions through interaction with water.

However, while hydronium ions disperse in water, they remain closely associated with the zeolite. Chemical pre-treatment can vary the number of these active centers and, thus, establish a certain density of hydronium ions in the pores of the zeolite.

The ideal zeolite for every reaction

By systematically varying the size of the cavities, the density of the active sites and the amount of water, the research team was able to elucidate the pore sizes and concentrations of water which best catalyzed selected example reactions.

"In general, it's possible to increase the reaction rate by making the pores smaller and raising the charge density," Johannes Lercher explains. "However, this increase has its limits: When things get too crowded and the charges are too close to one another, the reaction rate drops again. This makes it possible to find the optimum conditions for every reaction."

"Zeolites are generally suitable as nanoreactors for all chemical reactions whose reaction partners fit into the pores and in which an acid is used as a catalyst," emphasizes Lercher. "We are at the very beginning of a development with the potential to increase the reactivity of molecules even at low temperatures and, thus, to save considerable amounts of energy in the production of fuels or chemicals."

Credit: 
Technical University of Munich (TUM)

Embryo freezing for IVF appears linked to blood pressure problems in pregnancy

30 June 2021: A large cohort study drawn from the national IVF registry of France, which included almost 70,000 pregnancies delivered after 22 weeks' gestation between 2013 and 2018, has found a higher risk of pre-eclampsia and hypertension in pregnancies derived from frozen-thawed embryos. The risk was significantly greater in treatments in which the uterus was prepared for implantation with hormone replacement therapy. The results confirm with real-life data what has been observed in patient sub-groups in other studies.

The results are presented today by Dr Sylvie Epelboin from the Hôpital Bichat-Claude Bernard, Paris, at the online annual meeting of ESHRE. The study was performed on behalf of the Mother & Child Health after ART network of the French Biomedicine Agency. She said the results highlight two important considerations in IVF: the potentially harmful effects on vascular pathologies of the high and prolonged doses of hormone replacement therapies used to prepare the uterus for the implantation of frozen-thawed embryos; and the protective effect of a corpus luteum (1), which is present in natural or stimulated cycles for embryo transfer. The hormone replacement therapy given to prepare the uterus for embryo transfer, explained Dr Epelboin, suppresses ovulation and therefore the formation of the corpus luteum.

The risk of pre-eclampsia and other pregnancy-related disorders has been raised in a growing number of studies of embryo freezing in IVF.(2) However, the overall risks of maternal morbidity are known to be generally lower in pregnancies resulting from frozen embryo transfer than in those from fresh transfers - except in relation to pre-eclampsia. While some studies have observed such risks in frozen embryo transfers, few, said Dr Epelboin, have compared these "maternal vascular morbidities with the two hormonal environments that preside over the early stages of embryonic development".

This study divided the cohort of pregnancies from IVF and ICSI in the French national database into three groups of singletons for comparison: those derived from frozen embryo transfer in a natural "ovulatory" cycle (whether stimulated or not) (n = 9,500); those from frozen embryo transfer with hormone replacement therapy (n = 10,373); and conventional fresh transfers (n = 48,152).

Results showed a higher rate of pre-eclampsia when frozen embryos were transferred in an artificial cycle (ie, one prepared with hormone therapy) (5.3%) than in an ovulatory cycle (2.3%) or in fresh cycles (2.4%). The rates of pregnancy-induced hypertension followed a similar pattern (4.7% vs 3.4% vs 3.3%). These differences remained statistically significant after adjusting for maternal characteristics (age, parity, tobacco use, obesity, history of diabetes, hypertension, endometriosis, polycystic ovaries, premature ovarian failure) to avoid bias.
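For a sense of scale, even a crude, unadjusted comparison of the reported rates is highly significant. The sketch below is purely illustrative: it uses the group sizes given in the article, reconstructs approximate event counts from the reported percentages, and applies a textbook two-proportion z-test (the study itself used adjusted analyses, not this simple test).

```python
# Illustrative only: crude comparison of the reported pre-eclampsia rates.
# Group sizes are taken from the article; event counts are reconstructed
# (approximately) from the reported percentages.
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic and two-sided p-value for H0: p1 == p2 (pooled test)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                    # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    pval = math.erfc(abs(z) / math.sqrt(2))      # two-sided normal p-value
    return z, pval

# Artificial (hormone-prepared) frozen cycle vs ovulatory frozen cycle.
n_art, n_ovu = 10373, 9500
x_art = round(0.053 * n_art)   # ~5.3% pre-eclampsia
x_ovu = round(0.023 * n_ovu)   # ~2.3% pre-eclampsia
z, p = two_proportion_z(x_art, n_art, x_ovu, n_ovu)
print(f"z = {z:.1f}, p = {p:.1e}")  # the crude difference alone gives p << 0.001
```

The adjusted models in the study control for maternal characteristics, so the true effect estimate differs, but the unadjusted gap is far too large to be sampling noise.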

Dr Epelboin and colleagues concluded that the study demonstrates that preparation of the uterus with hormones in an artificial cycle is associated with a significantly higher risk of vascular disorders than cycles with preserved ovulation or fresh embryo transfer.

The use of frozen embryos has increased in IVF in recent years. Frozen-thawed embryo transfers are reported to be as successful as, or more successful than, fresh transfers, and because they appear to reduce the risk of ovarian hyperstimulation, they also have safety advantages; the blood pressure risks observed in this study and others do not appear to outweigh these benefits, said Dr Epelboin.

Moreover, because preserving the ovulatory cycle appears not to reduce the chance of pregnancy, it could be advocated as first-line preparation in frozen embryo transfers whenever the choice is possible.

Presentation 0-182 Wednesday 30 June
Higher risk of preeclampsia and pregnancy-induced hypertension with artificial cycle for Frozen-thawed Embryo Transfer compared to ovulatory cycle or fresh transfer following In Vitro Fertilization

The corpus luteum in pregnancy

The corpus luteum is a naturally developing cluster of cells that forms in the ovary during early pregnancy and pumps out progesterone, a fertility hormone. Progesterone supports the lining of the uterus (endometrium) during pregnancy and improves blood flow.

Embryo freezing and the risk of pre-eclampsia in pregnancy

This is the first large-scale study to identify an association between a hormonally prepared uterus (artificial cycle) and a significantly raised risk of pre-eclampsia in pregnancies following the transfer of a frozen-thawed embryo. Several (but not all) randomised trials of freezing embryos generated from an initial egg collection ("freeze-all") have observed such trends as a secondary endpoint. A substantial review of the literature published in 2018 (Maheshwari et al, Hum Reprod Update 2018) concluded that the evidence in favour of embryo freezing was "reassuring" while adding "a need for caution" over the increased risk of hypertension in pregnancy. Generally, embryo freezing allows several transfers from an initial egg collection treatment (and thereby encourages single embryo transfer and the avoidance of multiple pregnancies) and, in freeze-all protocols, avoids transfer in the same cycle in which the ovaries were stimulated.

Credit: 
European Society of Human Reproduction and Embryology

Machine learning algorithm predicts how genes are regulated in individual cells

image: A schematic overview of the BITFAM machine learning system developed by researchers at UIC. User provided sequencing data ("Normalized scRNA-Seq gene expression") and existing data on transcription factor binding sites ("ChIP-seq TF-Target gene matrix") are analyzed to predict transcription factor activity ("Inferred TF activity") that can be leveraged for a broad range of analyses.

Image: 
Genome Research, Attribution 4.0 International CC BY 4.0 license

A team of scientists at the University of Illinois Chicago has developed a software tool that can help researchers more efficiently identify the regulators of genes. The system leverages a machine learning algorithm to predict which transcription factors are most likely to be active in individual cells.

Transcription factors are proteins that bind to DNA and control which genes are turned "on" or "off" inside a cell. These proteins are relevant to biomedical researchers because understanding and manipulating these signals in the cell can be an effective way to discover new treatments for some illnesses. However, there are hundreds of transcription factors inside human cells, and it can take years of research, often through trial and error, to identify which are most active -- those that are expressed, or "on" -- in different types of cells, and which could be leveraged as drug targets.

"One of the challenges in the field is that the same genes may be turned 'on' in one group of cells but turned 'off' in a different group of cells within the same organ," said Jalees Rehman, UIC professor in the department of medicine and the department of pharmacology and regenerative medicine at the College of Medicine. "Being able to understand the activity of transcription factors in individual cells would allow researchers to study activity profiles in all the major cell types of major organs such as the heart, brain or lungs."

Named BITFAM, for Bayesian Inference Transcription Factor Activity Model, the UIC-developed system works by combining new gene expression profile data gathered from single cell RNA sequencing with existing biological data on transcription factor target genes. With this information, the system runs numerous computer-based simulations to find the optimal fit and predict the activity of each transcription factor in the cell.
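The fitting idea can be illustrated with a toy sketch. This is not the BITFAM code (which uses Bayesian inference over a probabilistic model); it is a hypothetical, minimal analogue in which a cell's expression vector is treated as roughly the product of a known TF-target matrix (from ChIP-seq) and an unknown TF-activity vector, with the activities fitted by plain gradient descent:

```python
# Hypothetical sketch of the fitting idea (NOT the actual BITFAM model):
# expression x ≈ W @ a, where W[g][t] = 1 if gene g is a known target of
# transcription factor t, and a[t] is the unknown activity of TF t.
# BITFAM infers a full posterior; here we just minimize squared error.

def infer_activities(W, x, steps=2000, lr=0.01):
    n_genes, n_tf = len(W), len(W[0])
    a = [0.0] * n_tf
    for _ in range(steps):
        # residual r = W @ a - x
        r = [sum(W[g][t] * a[t] for t in range(n_tf)) - x[g]
             for g in range(n_genes)]
        # gradient step: a -= lr * (W^T @ r)
        for t in range(n_tf):
            a[t] -= lr * sum(W[g][t] * r[g] for g in range(n_genes))
    return a

# Toy data: 4 genes, 2 TFs. TF0 targets genes 0-1; TF1 targets genes 2-3.
W = [[1, 0], [1, 0], [0, 1], [0, 1]]
x = [2.0, 2.0, 0.5, 0.5]          # observed expression in one cell
a = infer_activities(W, x)
print(a)  # converges to roughly [2.0, 0.5]: TF0 looks more active than TF1
```

In the real system the target matrix is far larger and noisier, which is why a probabilistic fit over many simulations, rather than a single least-squares solution, is used.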

The UIC researchers, co-led by Rehman and Yang Dai, UIC associate professor in the department of bioengineering at the College of Medicine and the College of Engineering, tested the system in cells from lung, heart and brain tissue. Information on the model and the results of their tests is reported today in the journal Genome Research.

"Our approach not only identifies meaningful transcription factor activities but also provides valuable insights into underlying transcription factor regulatory mechanisms," said Shang Gao, first author of the study and a doctoral student in the department of bioengineering. "For example, if 80% of a specific transcription factor's targets are turned on inside the cell, that tells us that its activity is high. By providing data like this for every transcription factor in the cell, the model can give researchers a good idea of which ones to look at first when exploring new drug targets to work on that type of cell."
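The intuition in that example can be expressed as a one-line calculation. The snippet below is a hypothetical illustration of the fraction-of-targets idea only (gene names and the threshold are invented; BITFAM itself infers activity probabilistically rather than computing this ratio):

```python
# Hypothetical illustration: fraction of a TF's known target genes that are
# detected as expressed ("on") in a single cell's profile. A high fraction
# suggests high activity of that transcription factor in the cell.

def target_activity_fraction(expression, targets, threshold=0.0):
    """Fraction of `targets` whose expression exceeds `threshold`."""
    detected = [g for g in targets if expression.get(g, 0.0) > threshold]
    return len(detected) / len(targets)

# Toy single-cell expression profile (normalized counts) and toy target set.
cell = {"geneA": 2.1, "geneB": 0.0, "geneC": 0.7, "geneD": 1.3, "geneE": 0.0}
tf_targets = ["geneA", "geneB", "geneC", "geneD", "geneE"]

print(target_activity_fraction(cell, tf_targets))  # 3 of 5 targets on -> 0.6
```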

The researchers say that the new system is publicly available and could be applied widely because users have the flexibility to combine it with additional analysis methods that may be best suited for their studies, such as finding new drug targets.

"This new approach could be used to develop key biological hypotheses regarding the regulatory transcription factors in cells related to a broad range of scientific hypotheses and topics. It will allow us to derive insights into the biological functions of cells from many tissues," Dai said.

Rehman, whose research focuses on the mechanisms of inflammation in vascular systems, says an application relevant to his lab is to use the new system to focus on the transcription factors that drive diseases in specific cell types.

"For example, we would like to understand if there is transcription factor activity that distinguishes a healthy immune cell response from an unhealthy one, as in the case of conditions such as COVID-19, heart disease or Alzheimer's disease, where there is often an imbalance between healthy and unhealthy immune responses," he said.

Credit: 
University of Illinois Chicago

Abnormalities in how the brain reorganises prior experiences identified in schizophrenia

Neuroscientists at UCL have, for the first time, identified abnormalities in the way memories are 'replayed' in the brains of people with schizophrenia; researchers say the pathbreaking study provides an entirely new basis for explaining many of the condition's core symptoms.

Schizophrenia is a serious and debilitating mental disorder characterised by episodes of psychosis. Symptoms include hallucinations (typically hearing voices), delusions, and disorganised thinking. It affects around 20 million people globally, though the exact cause is unknown.

In the study, published in the journal Cell, researchers used state-of-the-art brain imaging, known as magnetoencephalography (MEG), along with machine learning tools, to measure and assess neural activity corresponding to inner states of mind during rest periods when the brain is consolidating its prior experiences.

The research is the first to demonstrate a link between abnormal neural replay and schizophrenia, and the authors suggest that the findings might enable much earlier detection of the disorder as well as provide a basis for examining novel treatment options.

Explaining the study, carried out at the UCL Max Planck Centre for Computational Psychiatry and Ageing, senior author Professor Ray Dolan (UCL Queen Square Institute of Neurology) said: "Every human being carries around a model of the world in their mind and when confronted with new information this model is updated using a process called 'neural replay'.

"We asked whether abnormalities in neural replay are present in people with schizophrenia. To do this we designed a first-of-its-kind study using newly developed decoding tools, based on machine learning, that allow us to track neural replay corresponding to a task-related inference process."

For the study, 55 participants - 28 with schizophrenia (13 unmedicated) and 27 healthy volunteers - were taught an abstract rule, e.g. [A → B → C → D], and then asked to apply it to arrange, in their minds, a series of presented pairs of images into two distinct groups and sequences.

Once participants had completed the tasks they then relaxed for five minutes, enabling the brain to enter a 'rest period'; this is the time when the brain subconsciously replays its prior experiences using neural replay, and this is thought to be an important mechanism for memory consolidation as well as inference and belief formation.

Throughout the task and rest phases, participants were seated and awake, and their brain activity was monitored by MEG.

In analysing the MEG neural activity data, researchers used a machine learning based approach, developed by the authors and their collaborators.

Researchers found participants with schizophrenia were far less able to 'build a structure' of the task. This behavioural impairment was directly linked to an impoverished expression of neural replay measured during a post-task 'rest-period'.

By contrast, healthy volunteers demonstrated a pattern of replay that was consistent with their brains inferring a correct task structure.

Lead author Dr Matthew Nour (UCL Queen Square Institute of Neurology) said: "These findings raise the tantalising possibility that subtle impairments in memory replay might result in alterations in memory consolidation and belief formation, and thus explain previously mysterious aspects of schizophrenia.

"The findings also open up exciting new research avenues that apply similar imaging techniques across a range of mental health conditions, with the aim of developing better early assessments and more targeted treatment tools."

While similar findings have previously been shown in mouse models, this is the first evidence of this kind of memory distortion in humans.

Professor Dolan added: "We are very excited by these findings especially as they build on a set of techniques the group at UCL has developed over the past five years. This is the first demonstration of a link between replay and schizophrenia. Replay itself, and its disruption in schizophrenia, provides a highly plausible neurophysiological rationale for explaining core symptoms of the disorder in a way that has previously proved elusive."

Credit: 
University College London

Thinking in 3D improves mathematical skills

Spatial reasoning ability in small children reflects how well they will perform in mathematics later. Researchers from the University of Basel recently came to this conclusion, making the case for better cultivation of spatial reasoning.

Good math skills open career doors in the natural sciences as well as technical and engineering fields. However, a nationwide study on basic skills conducted in Switzerland in 2019 found that schoolchildren achieved only modest results in mathematics. But it seems possible to begin promoting math skills from a young age, as Dr. Wenke Möhring's team of researchers from the University of Basel reported after studying nearly 600 children.

The team found a correlation between children's spatial sense at the age of three and their mathematical abilities in primary school. "We know from past studies that adults think spatially when working with numbers - for example, representing small numbers to the left and large ones to the right," explains Möhring. "But little research has been done on how spatial reasoning at an early age affects children's later learning and comprehension of mathematics."

The study, which was published in the journal Learning and Instruction, suggests that there is a strong correlation between early spatial skills and the comprehension of mathematical concepts later. The researchers also ruled out the possibility that this correlation is due to other factors, such as socio-economic status or language ability. Exactly how spatial ability affects mathematical skills in children is still unclear, but the spatial conception of numbers might play a role.

The findings are based on the analysis of data from 586 children in Basel, Switzerland. As part of a project on the acquisition of German as a second language, the researchers gave three-year-old children a series of tasks testing cognitive, socio-emotional and spatial abilities. For example, the children were asked to arrange colored cubes in certain shapes. The researchers repeated these tests four times at intervals of about 15 months and compared the results with the children's academic performance in first grade, at age seven.

The researchers also closely examined whether the pace of development, i.e. particularly rapid development of spatial abilities, can predict future mathematical ability. Past studies with a small sample size had found a correlation, but Möhring and her colleagues were unable to confirm this in their own study. Three-year-old children who started out with low spatial abilities improved them faster in the subsequent years, but still performed at a lower level in mathematics when they were seven years old. Despite faster development, by the time they began school these children had still not fully caught up with the children possessing higher initial spatial reasoning skills.

"Parents often push their children in the area of language skills," says Möhring. "Our results suggest how important it is to cultivate spatial reasoning at an early age as well." There are simple ways to do this, such as using "spatial language" (larger, smaller, same, above, below) and toys - e.g. building blocks - that help improve spatial reasoning ability.

Spatial reasoning and gender

The researchers found that boys and girls are practically indistinguishable in terms of their spatial reasoning ability at the age of three, but in subsequent years this develops more slowly in girls. Möhring and her colleagues suspect that boys may hear more "spatial language" and that toys typically designed for boys often promote spatial reasoning, whereas toys for girls focus mainly on social skills. Children may also internalize their parents' and teachers' expectations and then, as they grow up, live up to stereotypes - for example, that women do not perform as well in the areas of spatial reasoning and mathematics as men.

Credit: 
University of Basel

Digging into the molecules of fossilized dinosaur eggshells

Dinosaurs roamed the Earth more than 65 million years ago, and paleontologists and amateur fossil hunters are still unearthing traces of them today. The minerals in fossilized eggs and shell fragments provide snapshots into these creatures' early lives, as well as their fossilization processes. Now, researchers reporting in ACS Earth and Space Chemistry have analyzed the molecular makeup of fossilized dinosaur eggshells from Mexico, finding nine amino acids and evidence of ancient protein structures.

Current research indicates that all dinosaurs laid eggs, though most of them haven't survived the test of time. And because whole eggs and shell fragments are very rare fossils, their mineral composition has not been widely investigated. Previously, Abel Moreno and colleagues reported the micro-architectures of eggshells from several species of dinosaurs found in Baja California. Although other teams have shown that some dinosaur eggshells contain calcium carbonate, carbohydrates and other compounds, no one had done similar analyses on the shells of the species that Moreno's team collected. So, as a next step, the researchers wanted to look at the mineral and organic carbon-based components in fossilized eggshells from species that hatched in the Late Cretaceous.

The researchers collected five fossilized eggshells from dinosaurs in the Theropod (bipedal carnivores) and Hadrosauridae (duck-billed dinosaurs) families and an unidentified ootaxon. They found that calcium carbonate was the primary mineral, with smaller amounts of albite and quartz crystals. Anhydrite, hydroxyapatite and iron oxide impurities were also present in the shells, which the researchers suggest replaced some of the original minerals during fossilization. Then, with Fourier transform infrared spectroscopy (FT-IR), the team found nine amino acids among the five samples, but only lysine was in all of them. In addition, they identified evidence of secondary protein structures, including turns, α-helices, β-sheets and disordered structures, which were preserved for millions of years by being engrained in the minerals. The FT-IR bands corresponding to amino acids and secondary structures could be indicative of ancestral proteins that have not been characterized before, the researchers say.

Credit: 
American Chemical Society

Targeted delivery of therapeutic RNAs only to cancer, no harm caused to healthy cells

video: Tel Aviv University's groundbreaking technology may revolutionize the treatment of a wide range of diseases and medical conditions. In the framework of this study, the researchers were able to create a new method of transporting RNA-based drugs to a subpopulation of immune cells.

Image: 
Tel Aviv University

Tel Aviv University's groundbreaking technology may revolutionize the treatment of cancer and a wide range of diseases and medical conditions. In the framework of this study, the researchers were able to create a new method of transporting RNA-based drugs to a subpopulation of immune cells involved in the inflammation process, and target the disease-inflamed cell without causing damage to other cells.

The study was led by Prof. Dan Peer, a global pioneer in the development of RNA-based therapeutic delivery. He is Tel Aviv University's Vice President for Research and Development, head of the Center for Translational Medicine and a member of both the Shmunis School of Biomedicine and Cancer Research, George S. Wise Faculty of Life Sciences, and the Center for Nanoscience and Nanotechnology. The study was published in the prestigious scientific journal Nature Nanotechnology.

Prof. Peer: "Our development actually changes the world of therapeutic antibodies. Today we flood the body with antibodies that, although selective, damage all the cells that express a specific receptor, regardless of their current form. We have now taken out of the equation healthy cells that can help us, that is, uninflamed cells, and via a simple injection into the bloodstream can silence, express or edit a particular gene exclusively in the cells that are inflamed at that given moment."

As part of the study, Prof. Peer and his team demonstrated this groundbreaking development in animal models of inflammatory bowel diseases such as Crohn's disease and colitis, improving all inflammatory symptoms without manipulating about 85% of the immune system's cells. Behind the innovative development stands a simple concept: targeting a specific receptor conformation.

"On every cell envelope in the body, that is, on the cell membrane, there are receptors that select which substances enter the cell," explains Prof. Peer. "If we want to inject a drug, we have to adapt it to the specific receptors on the target cells, otherwise it will circulate in the bloodstream and do nothing. But some of these receptors are dynamic - they change shape on the membrane according to external or internal signals. We are the first in the world to succeed in creating a drug delivery system that knows how to bind to receptors only in a certain situation, and to skip over the other identical cells, that is, to deliver the drug exclusively to cells that are currently relevant to the disease."

Previously, Prof. Peer and his team developed delivery systems based on fatty nanoparticles - the most advanced system of its kind; this system has already received clinical approval for the delivery of RNA-based drugs to cells. Now, they are trying to make the delivery system even more selective.

According to Prof. Peer, the new breakthrough has possible implications for a wide range of diseases and medical conditions. "Our development has implications for many types of blood cancers and various types of solid cancers, different inflammatory diseases, and viral diseases such as the coronavirus. We now know how to wrap RNA in fat-based particles so that it binds to specific receptors on target cells," he says. "But the target cells are constantly changing. They switch from 'binding' to 'non-binding' mode in accordance with the circumstances. If we get a cut, for example, not all of our immune system cells go into a 'binding' state, because we do not need them all in order to treat a small incision. That is why we have developed a unified protein that knows how to bind only to the active state of the receptors of the immune system cells. We tested the protein we developed in animal models of inflammatory bowel disease, both acute and chronic."

Prof. Peer adds, "We were able to organize the delivery system in such a way that we target only the 14.9% of cells that were involved in the inflammatory condition of the disease, without adversely affecting the other, non-involved cells, which are actually completely healthy. Through specific binding to this cell sub-population, while delivering the RNA payload, we were able to improve all indices of inflammation, from the animal's weight to pro-inflammatory cytokines. We compared our results with those of antibodies that are currently on the market for Crohn's and colitis patients, and found that our results were the same or better, without causing most of the side effects that accompany the introduction of antibodies into the entire cell population. In other words, we were able to deliver the drug 'door-to-door,' directly to the diseased cells."

The study was led by Prof. Peer, together with Dr. Niels Dammes, a postdoctoral fellow from the Netherlands, with the collaboration of Dr. Srinivas Ramishetti, Dr. Meir Goldsmith and Dr. Nuphar Veiga, from Prof. Dan Peer's lab. Professors Jason Darling and Alan Packard of Harvard University in the United States also participated. The study was funded by the European Union, in the framework of the European Research Council (ERC).

Credit: 
Tel-Aviv University

Microbes feast on crushed rock in subglacial lakes beneath Antarctica

image: Photo of core catcher used to extract subglacial sediments.

Image: 
John Priscu

Pioneering research has revealed the erosion of ancient sediments found deep beneath Antarctic ice could be a vital and previously unknown source of nutrients and energy for abundant microbial life.

The study, led by the University of Bristol and published today in Nature's Communications Earth & Environment journal, sheds new light on the many compounds supporting various microbes which form part of a huge subglacial ecosystem.

Lead author Dr Beatriz Gill Olivas, a Post-Doctoral Research Associate at the University of Bristol, said: "Although the study focused on samples obtained from a single subglacial lake, the results could have much wider implications. Subglacial Lake Whillans is part of a large interconnected hydrological system, so erosion taking place upstream could represent a potential source of biologically important compounds to this and other lakes within the system that might harbour thriving communities of microbial life."

The international research team replicated erosion processes in ice sheets by crushing sediments extracted from Lake Whillans, a subglacial lake in Antarctica spanning around 60 km², some 800 m below the ice surface, then wetting these sediments and keeping them at 0 °C without oxygen to approximate subglacial conditions.

Findings showed that a single crushing event of these sediments to a depth of 10 cm, followed by a 41-day incubation period, has the potential to provide up to a quarter (24 per cent) of the methane estimated to be required by methanotrophs, tiny microbes present in these environments that depend on methane as their source of carbon and energy. Substantial concentrations of hydrogen and carbon dioxide were also produced during crushing and incubation. These gases could potentially be used by methane-generating microbes, called methanogens, to produce methane, which would help explain its widely varying levels within Lake Whillans.

Another previously unidentified finding was the detection of measurable concentrations of ammonium in water incubated with crushed sediments. This is of particular significance to Lake Whillans, where there is an abundance of microbial taxa with the potential to derive energy from ammonium oxidation, a process known as nitrification. The study demonstrated that a single high-energy crushing event, followed by the same incubation period, has the potential to supply more than the lake's annual ammonium demand (120 per cent).

"Our previous understanding suggests the structure and function of subglacial ecosystems depends on the presence of redox species and redox gradients within liquid water. This study shows the process of erosion could potentially split water on freshly abraded mineral surfaces and produce hydrogen (a reducing end member species) and hydrogen peroxide (a highly oxidizing compound), creating a redox gradient and providing a previously unrecognised source of energy to microbial ecosystems," Dr Gill Olivas explained.

"Only two previous studies have looked at the potential influence of erosion on subglacial energy and nutrient sources, which involved crushing largely unweathered rock samples. This is the first study to use highly weathered, ancient marine sediments, yet concentrations of gases measured largely agreed with previous results."

Dr Gill Olivas added: "Whilst these experiments do not reflect the true extent of erosion under glaciers, they do hint at the potential for the crushing of sediments to supply important sources of nutrients to subglacial ecosystems. Further study is now needed to identify the complete range of reactions resulting directly from erosion processes and how they may vary with differences in the underlying sediment."

Credit: 
University of Bristol

This crystal impurity is sheer perfection

image: STEM tomography image of a 100-200-nanometer crystalline disc grown in 3D from gold-polystyrene polymer-grafted nanoparticles.

Image: 
Berkeley Lab

Crystallization is one of the most fundamental processes found in nature - and it's what gives minerals, gems, metals, and even proteins their structure.

In the past couple of decades, scientists have tried to uncover how natural crystals self-assemble and grow - and their pioneering work has led to some exciting new technologies, from the quantum dots behind colorful QLED TV displays to peptoids, protein mimics that have inspired dozens of biotech breakthroughs.

Now, a research team led by scientists at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) and UC Berkeley has developed a nanoparticle composite that grows into 3D crystals. The scientists say that the new material - which they call a 3D PGNP (polymer-grafted nanoparticle) crystal in their recently published Nature Communications study - could lead to new technologies that are 3D-grown rather than 3D-printed.

"We've demonstrated a new lever to turn, so to speak, to grow a crystalline material into a composite or structured material for applications ranging from nanoscale photonics for smart buildings to actuators for robotics," said Ting Xu, senior author of the study. Xu is a faculty senior scientist in Berkeley Lab's Materials Sciences Division and professor of chemistry and materials science and engineering at UC Berkeley.

Xu said that their new method is compatible with the demands of mass manufacturing. "Many smart minds have designed elegant chemistries, such as DNAs and supramolecules, to crystallize nanoparticles. Our system is essentially a blend of nanoparticle and polymers - which are similar to the ingredients people use to make airplane wings or automobile bumpers. But what's even more interesting is that we didn't expect our method to be so simple and so fast," Xu said.

A chance discovery

Lead author Yiwen Qian, a Ph.D. student researcher in the Xu Group at UC Berkeley, discovered the 3D PGNP nanocrystals by chance in an ordinary lab experiment.

A couple of days before, she had left a solution of toluene solvent and gold nanoparticles grafted with polystyrene (Au-PS) in a centrifuge tube on a lab counter. When she looked at the sample under a transmission electron microscope (TEM), she noticed something odd. "Nanoparticles had crystallized quickly. That was not a normal thing to expect," she said.

To investigate, Xu collaborated with Peter Ercius, a staff scientist at Berkeley Lab's Molecular Foundry, and Wolfgang Theis and Alessandra DaSilva of the University of Birmingham, all of whom are widely regarded for their expertise in STEM (scanning transmission electron microscopy) tomography, an electron microscopy technique that uses a highly focused beam of electrons to reconstruct images of a material's 3D structure at high resolution.

Using microscopes at the Molecular Foundry, a world-leading user facility in STEM tomography, the researchers first captured crystalline 3D patterns of the Au-PS nanoparticles.

On the hunt for more clues, Xu and Qian then deployed nuclear magnetic resonance spectroscopy experiments at UC Berkeley, where they discovered that a tiny trace of polyolefin molecules from the centrifuge tube lining had somehow entered the mix. Polyolefins, which include polyethylene and polypropylene, are some of the most ubiquitous plastics in the world.

Qian repeated the experiment, adding more polyolefin to the Au-PS solution - and this time, they got bigger 3D PGNP crystals within minutes.

Xu was surprised. "I thought, 'This shouldn't be happening so fast,'" she recalled. "Crystals of nanoparticles usually take days to grow in the lab."

A boon for industry: growing materials at the nanolevel

Subsequent experiments revealed that as the toluene solvent quickly evaporates at room temperature, the polyolefin additive helps the Au-PS nanoparticles form into 3D PGNP crystals, and to "grow into their favorite crystal structure," said Qian.

In another key experiment, the researchers designed a self-assembling 100-200-nanometer crystalline disc that looks like the base of a pyramid. From this stunning demonstration of mastery over matter at the nanolevel, the researchers learned that the size and shape of the 3D PGNP crystals are driven by the kinetic energy of the polyolefins as they precipitate in the solution.

Altogether, these findings "provide a model for showing how you can control the crystal structure at the single particle level," Xu said, adding that their discovery is exciting because it provides new insight into how crystals form during the early stages of nucleation.

"And that's challenging to do because it's hard to make atoms sit next to each other," Ercius said.

The new approach could grant researchers unprecedented control in fine-tuning electronic and optical devices at the nanolevel (billionths of a meter), Xu said. Such nanoparticle-scale precision, she added, could speed up production and eliminate errors in manufacturing.

Looking ahead, Qian would like to use their new technique to probe the toughness of different crystal structures - and perhaps even make a hexagonal crystal.

Xu plans to use their technique to grow bigger devices such as a transistor or perhaps 3D-print nanoparticles from a mix of materials.

"What can you do with different morphologies? We've shown that it's possible to generate a single-component composite from a mineral and a polymer. It's really exciting. Sometimes you just need to be in the right place at the right time," Xu said.

Co-authors on the paper include Alessandra da Silva and Wolfgang Theis at the University of Birmingham in the United Kingdom; Emmy Yu, an undergraduate student researcher in the Xu Group at UC Berkeley; and Christopher L. Anderson and Yi Liu at Berkeley Lab's Molecular Foundry.

The Molecular Foundry is a DOE Office of Science nanoscience user facility at Berkeley Lab.

The work was supported by the DOE Office of Science.

Credit: 
DOE/Lawrence Berkeley National Laboratory

Stock markets becoming increasingly networked due to high-frequency traders

During the last twenty years, the trading in stock markets has undergone significant changes. Researchers from the University of Turku and the University of Palermo have investigated the role of high-frequency traders in the markets.

Technological evolution and innovation, both in the technology used by stock exchanges and in the resources of the traders using their services, have made faster trading possible. As a result, high-frequency trading on the sub-millisecond scale has increased.

However, not everyone has the opportunity to use high-frequency trading, and in general, trading timescales can be anything from microseconds to tens of thousands of seconds. The role of high-frequency traders has given rise to broad debate in recent years.

- On the one hand, people have argued that the presence of high-frequency traders makes the operation of the markets more efficient and lowers the trading expenses. On the other hand, there have been arguments that this type of trading increases the volatility of the markets, meaning strong fluctuation of prices, and makes them more prone to crashes, says University Research Fellow, Docent Jyrki Piilo from the University of Turku.

The researchers used data from the years 2004-2006, 2010-2011, and 2018. In the first decade of this millennium, high-frequency trading was still limited, while in the second decade it became much more widespread due to significant technological development.

- Using statistical analysis of the data and modern methods of network theory, the study allows us to see how the actions of market members and their reactions to other members' actions have changed over the years, says Dr. Federico Musciotto from the University of Palermo.

A market member can trade on its own account or execute orders on behalf of other traders at their request. The study also shows what role high-frequency traders play in this transformation.

- Based on the data, we have constructed the trading networks of market members, which allows us to detect the pairs of market members that prefer to trade with each other and the pairs that avoid doing so. In other words, the results show which types of market members preferentially react to the buy or sell orders of other types, and which types avoid trading with each other, says Professor Rosario N. Mantegna from the University of Palermo.
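The article does not spell out the statistical test behind "preference" and "avoidance", but in this literature such links are often validated by asking whether two members traded together more (or less) often than random matching would predict, using a hypergeometric test. A minimal sketch with illustrative synthetic counts (the function name and all numbers are assumptions, not values from the study):

```python
from math import comb

def preference_pvalue(n_total, n_a, n_b, n_ab):
    """P(X >= n_ab) under a hypergeometric null model: given that member A
    was involved in n_a of n_total transactions and member B in n_b, how
    surprising is it that they were counterparties in n_ab of them?"""
    return sum(
        comb(n_a, k) * comb(n_total - n_a, n_b - k)
        for k in range(n_ab, min(n_a, n_b) + 1)
    ) / comb(n_total, n_b)

# Illustrative synthetic counts (not from the study)
n_total = 1_000        # transactions in the period
n_a, n_b = 60, 50      # transactions involving members A and B
n_ab = 15              # transactions where A and B were counterparties

expected = n_a * n_b / n_total          # 3 co-trades expected at random
p = preference_pvalue(n_total, n_a, n_b, n_ab)
# A tiny p flags the pair as trading together far more often than chance;
# the mirror-image under-expression test (summing the lower tail) would
# flag avoidance.
```

Repeating such a test over all pairs, with a correction for multiple comparisons, yields the kind of statistically validated network of preferred and avoided trading partners the researchers describe.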

The central outcome of the study is that the markets have become significantly more networked as trading has become faster through technological development. The networked character has increased significantly over the years - even though the anonymity in the markets has also increased.

Over the years, there has been a clear increase in the preference of high-frequency traders to trade with other types of traders. By contrast, high-frequency traders avoid trading among themselves, as do other types of traders. In general, the increasing presence of high-frequency traders has led to a more networked market structure in which strong preferential trading patterns between specific pairs of market members can last for several months.

Thanks to their technological edge and faster trading, high-frequency traders can make strategic trading decisions not accessible to other market members or investors, and thereby significantly influence the liquidity of the markets.

The networked structure of the market and the competition among market members are not necessarily the optimal solutions for the best operation of the markets.

- The results emphasise the importance of the debate and further investigations of how we can ensure the fair and efficient operation of the markets, summarises Mantegna.

Credit: 
University of Turku

Reversal speeds creation of important molecule

image: Rice University synthetic chemists have simplified the process to make halichondrin B, top, the parent compound of the successful cancer drug eribulin, bottom. Their reverse synthesis reduced the number of steps required to make the natural product.

Image: 
Jenna Kripal/Nicolaou Research Group

HOUSTON - (June 29, 2021) - The story of halichondrin B, an inspirational molecule obtained from a marine creature, goes back to the molecule's discovery in an ocean sponge in 1986.

Though it has been replicated in the laboratory several times before, new work by Rice University chemists could make halichondrin B and its naturally occurring or designed variations easier to synthesize.

Synthetic chemist K.C. Nicolaou and his lab reported in the Journal of the American Chemical Society their success in simplifying several processes used to make halichondrin B and its variations.

Halichondrin's molecular structure and potent antitumor properties inspired the design and synthesis of variations (also known as analogues). The Rice lab's "reverse approach" to making halichondrin B resulted in the shortest route to what the researchers referred to as a "highly complex and important molecule."

"This total synthesis represents the shortest of the previously reported approaches to this complex natural product," Nicolaou said. "Its importance lies in its potential for further improvement and application to the rapid synthesis of other members of the halichondrin family as well as novel designed analogues as potential drug candidates."

He said the Rice lab's technologies can in principle be applied to the production of eribulin, a simpler and powerful halichondrin B analogue clinically used to treat breast cancer and liposarcoma.

Previous and current syntheses of halichondrin B and its analogues require initial bonding of carbon atoms, and then bonding of carbon and oxygen atoms, to construct cyclic ethers, key building blocks essential to making the molecules.

Nicolaou and his colleagues flipped the sequence to make the carbon-oxygen connections first. Known as the Nicholas etherification, this process was followed by radical cyclization to form the required carbon-carbon bonds, finally coupling them en route to the targeted halichondrin B.

Nicolaou noted other labs have "flipped" the process to synthesize various simpler compounds, but none had tried it on halichondrin B. "Its importance as a biologically active molecule coupled with its synthetically challenging structure served as our motivation to pursue this project," he said.

Their work reduced the number of steps required to make the molecule to 25, starting from commercially available materials. Nicolaou expects that further simplification will not only reduce the number of steps in the synthesis but also improve the overall yield, resulting in a more efficient and cost-effective chemical process for making this type of compound.

Credit: 
Rice University

Making seawater drinkable in minutes

image: Schematic of co-axial electrospinning device.

Image: 
Elsevier

According to the World Health Organization, about 785 million people around the world lack a clean source of drinking water. Despite the vast amount of water on Earth, most of it is seawater and freshwater accounts for only about 2.5% of the total. One of the ways to provide clean drinking water is to desalinate seawater. The Korea Institute of Civil Engineering and Building Technology (KICT) has announced the development of a stable performance electrospun nanofiber membrane to turn seawater into drinking water by membrane distillation process.

Membrane wetting is the most challenging issue in membrane distillation. If a membrane exhibits wetting during operation, it must be replaced, and progressive wetting has been observed especially in long-term operations. If a membrane becomes fully wetted, membrane distillation performance collapses, as the feed flows through the membrane and yields low-quality permeate.

A research team at KICT, led by Dr. Yunchul Woo, has developed co-axial electrospun nanofiber membranes fabricated by electrospinning, an alternative nanotechnology. This new desalination technology shows potential to help solve the world's freshwater shortage. The developed membranes can prevent wetting issues and also improve long-term stability in the membrane distillation process. To achieve higher surface roughness, and hence better hydrophobicity, the nanofibers must form a three-dimensional hierarchical structure in the membrane.

The co-axial electrospinning technique is one of the most favorable and simple options to fabricate membranes with three-dimensional hierarchical structures. Dr. Woo's research team used poly(vinylidene fluoride-co-hexafluoropropylene) as the core and silica aerogel mixed with a low concentration of the polymer as the sheath to produce a co-axial composite membrane and obtain a superhydrophobic membrane surface. In fact, silica aerogel exhibited a much lower thermal conductivity compared with that of conventional polymers, which led to increased water vapor flux during the membrane distillation process due to a reduction of conductive heat losses.

Most studies using electrospun nanofiber membranes in membrane distillation applications have operated for less than 50 hours, although they exhibited high water vapor flux performance. In contrast, Dr. Woo's research team ran the membrane distillation process with the fabricated co-axial electrospun nanofiber membrane for 30 days.

The co-axial electrospun nanofiber membrane achieved 99.99% salt rejection over one month. Based on the results, the membrane operated well without wetting and fouling issues, owing to its low sliding angle and low thermal conductivity. Temperature polarization is one of the significant drawbacks of membrane distillation: it can decrease water vapor flux during operation due to conductive heat losses. The membrane is suitable for long-term membrane distillation applications, as it possesses several important characteristics, such as a low sliding angle, low thermal conductivity, resistance to temperature polarization, and reduced wetting and fouling problems, whilst maintaining high water vapor flux performance.
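The 99.99% figure is the membrane's salt rejection, which is conventionally computed from the salt concentrations of the feed and the permeate. A minimal sketch with illustrative numbers (the concentrations are assumptions, not values from the study):

```python
def salt_rejection(feed_conc, permeate_conc):
    """Salt rejection (%) = (1 - C_permeate / C_feed) * 100,
    with both concentrations in the same units (e.g. mg/L)."""
    return (1.0 - permeate_conc / feed_conc) * 100.0

# Illustrative values (not from the study): a seawater-level feed and a
# near-pure permeate corresponding to 99.99% rejection.
feed = 35_000.0   # mg/L total dissolved solids, typical of seawater
permeate = 3.5    # mg/L in the distillate
print(round(salt_rejection(feed, permeate), 2))  # 99.99
```

Holding this rejection steady for a month, rather than reaching it briefly, is what distinguishes the reported membrane from earlier short-duration tests.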

Dr. Woo's research team noted that a stable process is more important than high water vapor flux in a commercially viable membrane distillation process. Dr. Woo said that "the co-axial electrospun nanofiber membrane has strong potential for the treatment of seawater solutions without suffering from wetting issues and may be the appropriate membrane for pilot-scale and real-scale membrane distillation applications."

Credit: 
National Research Council of Science & Technology

Oregon State graduate student sheds light on better way to study reputedly secretive toad

image: eastern spadefoot

Image: 
Anne Devan-Song

CORVALLIS, Ore. - Research by a graduate student in Oregon State University's College of Science has upended the conventional wisdom that for a century has incorrectly guided the study of a toad listed as endangered in part of its range.

Anne Devan-Song used spotlighting - shining a light in a dark spot and looking for eye reflections - to find large numbers of the eastern spadefoot toad. The study illustrates how confirmation bias - a tendency to interpret new information as ratification of existing theories - can hamper discovery and the development of better ones.

Her findings, which show that the toad spends much more time above ground than commonly believed, were published in the Journal of Herpetology.

Known for bright yellow eyes with elliptical pupils and, as the name suggests, a spade on each hind foot, the eastern spadefoot toad ranges from the southeast corner of the United States up the Atlantic Coast to New England. Known scientifically as Scaphiopus holbrooki, it is a species of conservation concern in the northern reaches of its territory.

Devan-Song, a Ph.D. student in integrative biology, grew up in Singapore, where she learned she could search for reptiles and amphibians by spotlighting. In Rhode Island, where she earned a master's degree and then worked as a university research associate, the eastern spadefoot toad is endangered.

One rainless night while surveying for amphibians during a project in Virginia, Devan-Song's spotlight detected one eastern spadefoot after another. That surprised her because the toads were thought to be detectable only on a few rainy nights every year, when they emerge from underground burrows to mate in wetlands.

She continued looking for eastern spadefoots and kept finding them on dry nights, including in upland forest locales not close to any damp areas. Spadefoots remain still when spotlighted so it was easy for Devan-Song to approach the eye-shines and positively identify the toads.

"They need to get above ground to hunt for insects and build up energy stores for mating," she said. "That's why we were finding them when and where conventional wisdom said we weren't supposed to be finding them."

Back in Rhode Island, she tried spotlighting for spadefoots; it took her just 15 minutes to find one. The success led to a 10-night survey in a pair of locations last summer that produced 42 sightings - nearly double the number of eastern spadefoot toad sightings in Rhode Island over the previous seven decades.

Devan-Song also learned that she wasn't the first to question the notion that the eastern spadefoot was so "secretive" as to almost always avoid detection. As far back as 1944, Devan-Song said, the scientific literature suggested that the toad could be found outside of rain-induced migration and breeding aggregations. And in 1955, researchers used spotlighting to detect huge numbers of eastern spadefoots in Florida; the technique subsequently, and inexplicably, fell into disuse.

"Confirmation bias perpetuated the fallacy of when the eastern spadefoot could be found," Devan-Song said. "No breeding events or migration occurred during our surveys and we detected thousands of toads in Virginia and dozens in Rhode Island. The majority of those were subadults, a demographic category mainly overlooked in the literature. Progress in learning about the toad, its ecology and its conservation has been greatly hindered by a misconception that persisted even when evidence to the contrary was presented."

The ease with which many toads could be found during breeding, combined with a lack of data on toads in upland habitats, helped fuel confirmation bias in this case, she said.

"Everyone assumed they were underground most of the time so no one was really looking for them most of the time," Devan-Song said. "Our research demonstrates that you can detect them year round, though they do remain rare in Rhode Island. But likely not as rare as the scientific community thought."

Credit: 
Oregon State University

Looking at tumors through a new lens

Neoadjuvant immune checkpoint blockade (ICB) is a promising treatment for melanoma and other cancer types, and has recently been shown to provide a modest survival benefit for patients with recurrent glioblastoma. To improve the treatment efficacy, researchers are looking for vulnerabilities in surgically removed glioblastoma tissues, but this has been difficult due to the vast differences within the tumor and between patients.

To address this challenge, researchers at Institute for Systems Biology (ISB) and their collaborators developed a new way to study tumors. The method builds mathematical models using machine learning-based image analysis and multiplex spatial protein profiling of microscopic compartments in the tumor.

The team used the approach to analyze and compare tumor tissues collected from 13 patients with recurrent glioblastoma and 23 patients with high-risk melanoma, with both sets of patients treated with neoadjuvant ICB. Using melanoma to guide the interpretation of glioblastoma analyses, they identified the proteins that correlate with tumor-killing T cells, tumor growth, and immune cell-cell interactions.

"This work reveals similarities shared between glioblastoma and melanoma, immunosuppressive factors that are unique to the glioblastoma microenvironment, and potential co-targets for enhancing the efficacy of neoadjuvant immune checkpoint blockade," said Dr. Yue Lu, co-lead author of the paper describing the research.

"This framework can be used to uncover pathophysiological and molecular features that determine the effectiveness of immunotherapies," added Dr. Alphonsus Ng, co-lead author of the paper.

The work was published today in Nature Communications, and is a collaborative project by ISB, UCLA and MD Anderson. Brain cancer represents one of the most challenging settings for achieving immunotherapy success. The fruitful collaboration between scientists and clinicians provides a tremendous opportunity for improving patient care and achieving an understanding of cancer immunotherapy at the deepest levels.

"We believe that the integrated biological, clinical and methodological insights derived from comparing two classes of tumors widely seen as at the opposite ends of the spectrum with respect to immunotherapy treatments should be of interest to broad scientific and clinical audiences," said ISB President Dr. Jim Heath, corresponding author of the paper.

Credit: 
Institute for Systems Biology

Striking gold: Synthesizing green gold nanoparticles for cancer therapy with biomolecules

image: Figure 1 Nanoparticles from Biomolecules: An Eco-Friendly Synthesis

Image: 
Tokyo Tech

In cancer therapy, the effectiveness of an approach is determined by its ability to spare non-cancerous cells. Simply put, the higher the collateral damage, the greater the side-effects of a therapy. An ideal situation is one where only the cancer cells are targeted and destroyed. In this regard, photothermal therapy--an approach in which cancer cells infused with gold nanoparticles are heated up and destroyed using near-infrared (NIR) light that is strongly absorbed by the nanoparticles--has emerged as a promising strategy due to its minimally invasive nature.

"Because NIR light is able to penetrate biological tissues, it can illuminate the gold nanoparticles within the body and turn them into nano-sized cell heating agents," explains Prof. Masayoshi Tanaka from Tokyo Institute of Technology (Tokyo Tech), Japan, who researches nanomaterials for biomedical applications.

In particular, gold nanoplates (AuNPls) are extremely attractive as photothermal therapeutic agents owing to their efficient absorption of NIR light. However, synthesizing these nanoparticles requires harsh reagents and highly toxic conditions, making the process hazardous. In a new study, Prof. Tanaka and his collaborators from the UK (University of Leeds) and Korea (Chung-Ang University) have now addressed this issue by developing a safer and more eco-friendly protocol for AuNPl synthesis, the results of which are published in Acta Biomaterialia.

The team took inspiration from a process called "biomineralization" that uses biomolecules to generate metal nanoparticles with tunable structures. "Peptides, or short chains of amino acids, are particularly attractive candidates for this purpose because of their relatively small size and stability. However, their use for producing Au nanoparticles with optimized structures for efficient NIR absorption has not yet been reported," says Prof. Tanaka.

Thus motivated, the team began by identifying peptides suitable for the mineralization of AuNPls and, after screening over 100 peptides, decided to examine the potential of a peptide named B3 for synthesizing AuNPls with controllable structures that can serve as photothermal conversion agents.

In a process called "one-pot synthesis", the team mixed a gold salt, HAuCl4, with the B3 peptide and its derivatives at various concentrations in a buffer solution (an aqueous solution resistant to changes in pH) at neutral pH, and synthesized triangular and circular AuNPls whose levels of NIR absorption depended on the peptide concentration.

The team then tested the effect of the AuNPls on cultured cancer cells under irradiated conditions and found them to exhibit the desired therapeutic effects. Furthermore, on characterizing the peptide using B3 derivatives, they found that an amino acid called histidine governed the structure of the AuNPls.

"These findings provide not only an easy and green synthetic method for AuNPls but also insight into the regulation of peptide-based nanoparticle synthesis," comments Prof. Tanaka excitedly. "This could open doors to new techniques for non-toxic synthesis of nanoparticle therapeutic agents."

Indeed, we might have struck gold with gold nanoparticles!

Credit: 
Tokyo Institute of Technology