Tech

Microbes in cow stomachs can break down plastic

Plastic is notoriously hard to break down, but researchers in Austria have found that bacteria from a cow's rumen - one of the four compartments of its stomach - can digest certain types of the ubiquitous material, which could offer a sustainable way to reduce plastic litter.

The scientists suspected such bacteria might be useful since cow diets already contain natural plant polyesters. "A huge microbial community lives in the rumen reticulum and is responsible for the digestion of food in the animals," said Dr Doris Ribitsch, of the University of Natural Resources and Life Sciences in Vienna, "so we suspected that some biological activities could also be used for polyester hydrolysis," a type of chemical reaction that results in decomposition. In other words, these microorganisms can already break down similar materials, so the study authors thought they might be able to break down plastics as well.

Ribitsch and her colleagues looked at three kinds of polyesters. One, polyethylene terephthalate (PET), is a synthetic polymer widely used in textiles and packaging. The other two were a biodegradable plastic often used in compostable plastic bags (polybutylene adipate terephthalate, PBAT) and a biobased material made from renewable resources (polyethylene furanoate, PEF).

To obtain the microorganisms, the researchers sourced rumen liquid from a slaughterhouse in Austria. They then incubated that liquid with each of the three plastics, in both powder and film form, to see how effectively the material would break down.

According to their results, which were recently published in Frontiers in Bioengineering and Biotechnology, all three plastics could be broken down by the microorganisms from cow stomachs, with the plastic powders breaking down more quickly than plastic film. Compared with similar research investigating single microorganisms, Ribitsch and her colleagues found that the rumen liquid was more effective, which might indicate that its microbial community has a synergistic advantage - that the combination of enzymes, rather than any one particular enzyme, is what makes the difference.

While their work has only been done at a lab scale, Ribitsch says, "Due to the large amount of rumen that accumulates every day in slaughterhouses, upscaling would be easy to imagine." However, she cautions that such research can be cost-prohibitive, as the lab equipment is expensive and such studies require preliminary investigation of the microorganisms involved.

Nevertheless, Ribitsch is looking forward to further research on the topic, saying that microbial communities have been underexplored as a potential eco-friendly resource.

Credit: 
Frontiers

Oncotarget: Loss of CPAP in oral cancer

image: CPAP and EGFR expression levels in HNSCC and normal tissues. TCGA data set was analyzed using UALCAN for CPAP mRNA expression and comparisons were made between head and neck normal tissues with primary tumor tissues (A) and different tumor grade subgroups (B). Immunohistochemistry of head and neck cancer TMA slides containing tumor tissue and adjacent normal tissue sections was carried out for CPAP protein expression using anti-CPAP antibody (C). Images of staining examples of normal and tumor tissues (left) and mean staining intensity grades of tumor and adjacent normal epithelia (right) are shown. TCGA data set was also analyzed for EGFR mRNA expression and comparisons were made between normal tissues with primary tumor tissues (D) and different tumor grade subgroups (E). TMA staining was performed using anti-EGFR antibody (F) and the images of staining examples (left) and mean staining intensity grades (right) are shown. All P-values are by two-tailed, unpaired Student t-test (normal vs primary tumor for panels A and D; normal vs specific grade for panels B and E; normal epithelium vs tumor epithelium for panels C and F). *denotes not significant.

Image: 
Correspondence to - Radhika Gudi - gudi@musc.edu and Chenthamarakshan Vasu - vasu@musc.edu

Oncotarget published "Loss of CPAP causes sustained EGFR signaling and epithelial-mesenchymal transition in oral cancer," which reported that abnormal function of microtubules and microtubule-organizing centers such as centrosomes can not only lead to cancer; malignant tissues are also characterized by aberrant centriolar features and amplified centrosomes.

In this study, the authors show that loss of expression of a microtubule/tubulin-binding protein, centrosomal P4.1-associated protein (CPAP), which is critical for centriole biogenesis and normal functioning of the centrosome, increased EGFR levels and signaling, and enhanced the epithelial-mesenchymal transition (EMT) features and invasiveness of oral squamous cell carcinoma (OSCC) cells.

Further, depletion of CPAP enhanced the tumorigenicity of these cells in a xeno-transplant model. Importantly, the CPAP loss-associated EMT features and invasiveness of multiple OSCC cell lines were attenuated upon EGFR depletion.

On the other hand, they found that CPAP protein levels were higher in EGF treated OSCC cells as well as in oral cancer tissues, suggesting that the frequently reported aberrant centriolar features of tumors are potentially a consequence, but not the cause, of tumor progression.

Overall, these novel observations show that, in addition to its known indispensable role in centrosome biogenesis, CPAP also plays a vital role in suppressing tumorigenesis in OSCC by facilitating EGFR homeostasis.

Dr. Radhika Gudi and Dr. Chenthamarakshan Vasu from The Medical University of South Carolina said, "Head and neck squamous cell carcinoma (HNSCC) represents the sixth most common cancer with more than 600,000 new patients diagnosed worldwide and it is linked to more than 300,000 deaths every year."

EGFR is significantly altered in OSCC and its prolonged signaling is mitogenic, driving uncontrolled proliferation of tumor cells.

Despite advances in the understanding of EGFR signaling, the regulatory mechanisms underlying it and their effects on cancer initiation, progression and metastasis are not fully understood.

Recent studies have shown that microtubule inhibition causes EGFR inactivation or increases the sensitivity to EGFR targeting drugs in various cancers including OSCC.

Microtubules and microtubule-organizing centers have multiple roles in cellular functions including homeostasis of cell signaling, formation of cilia, cytoskeletal actin organization, and centrosome/centriole duplication and normal cell division.

Paradoxically, however, the team showed not only that EGFR signaling, which is known to contribute to EMT, upregulates cellular CPAP levels in OSCC cells, but also that CPAP protein levels are higher in OSCC tumors.

The Gudi/Vasu research team concluded in their Oncotarget Research Output that while it has been reported before that EGF stimulation endows OSCC cells with stem cell-like properties, increased invasiveness, and tumorigenic properties, the molecular mechanisms underlying the regulation of EGF-induced EMT and tumorigenicity were not known.

Hence, this study begins to shed light on the molecular mechanisms by which centrosome/MTOC-associated proteins help prevent tumorigenesis.

Nevertheless, additional studies are needed in the future to address the mechanism by which CPAP suppresses EGFR dependent EMT and tumorigenesis.

Credit: 
Impact Journals LLC

Methylglyoxal detoxification deficits cause schizophrenia-like behavioral abnormalities

image: Summary of results

Image: 
TMIMS

Methylglyoxal (MG) is a highly reactive α-ketoaldehyde formed endogenously as a byproduct of the glycolytic pathway. MG accumulates under conditions of hyperglycemia, impaired glucose metabolism, or oxidative stress. An excess of MG formation causes mitochondrial impairment and reactive oxygen species (ROS) production that further increases oxidative stress. It also leads to the formation of advanced glycation end products (AGEs) due to MG reacting with proteins, DNA, and other biomolecules, which can induce aberrant inflammation via binding to receptors for AGEs (RAGE). To remove the toxic MG, various detoxification systems work together in vivo, including the glyoxalase system which enzymatically degrades MG using glyoxalase 1 (GLO1) and GLO2, and the MG scavenging system by vitamin B6 (VB6).
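For reference, the glyoxalase route mentioned above is a two-step enzymatic conversion; the scheme below is standard biochemistry that we summarize for orientation, not a detail taken from this release:

    MG + GSH -> hemithioacetal --(GLO1)--> S-D-lactoylglutathione --(GLO2)--> D-lactate + GSH (recycled)

GSH is glutathione; the net effect is that toxic MG is converted to harmless D-lactate while the glutathione cofactor is regenerated.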

Schizophrenia is a heterogeneous psychiatric disorder characterized by positive symptoms, such as hallucinations and delusions, negative symptoms, such as anhedonia and flat affect, and cognitive impairment. We have reported that several patients with schizophrenia have a novel heterozygous frameshift and a single nucleotide variation (SNV) in GLO1 that results in reductions of enzymatic activity. Furthermore, we have reported that VB6 (pyridoxal) levels in the peripheral blood of patients with schizophrenia are significantly lower than those of healthy controls, and that more than 35% of patients with schizophrenia have low levels of VB6. However, the effects of MG detoxification deficits in vivo remain unclear.

In this study, we created a novel mouse model of MG detoxification deficits by feeding Glo1 knockout mice a VB6-deficient diet (KO/VB6(-)) and evaluated the effects of impaired MG detoxification on brain function. KO/VB6(-) mice accumulated MG in the prefrontal cortex (PFC), hippocampus, and striatum, and displayed behavioral deficits such as impaired social interaction, impaired cognitive memory, and a sensorimotor deficit in the prepulse inhibition (PPI) test. Furthermore, RNA sequencing and weighted gene co-expression network analysis (WGCNA) revealed aberrant expression of genes related to mitochondrial function in the PFC of KO/VB6(-) mice. Finally, we demonstrated abnormal mitochondrial respiratory function and consequently enhanced oxidative stress in the PFC of KO/VB6(-) mice. These findings suggest that MG detoxification deficits may cause the observed behavioral deficits via mitochondrial dysfunction and oxidative stress in the PFC.

This is the first report to show that MG detoxification deficits are involved in the development of schizophrenia-like abnormalities. Considering the molecular mechanism revealed in this study, antioxidants to prevent oxidative stress and VB6 supplementation may be effective as a new therapeutic strategy for patients with MG detoxification deficits such as GLO1 dysfunction and VB6 deficiency.

Credit: 
Tokyo Metropolitan Institute of Medical Science

Energy production at Mutriku remains constant even if the wave force increases

image: Gabriel Ibarra and an image of the dock that houses the Mutriku facilities for transforming wave energy into electrical power

Image: 
Mitxi. UPV/EHU

The Mutriku wave power plant was built on the Mutriku breakwater, a site with great wave energy potential, and has been in operation since 2011. With 14 oscillating water columns to transform wave energy, it is the only wave farm in the world that supplies electricity to the grid on a continuous basis. In general, technologies that harness the power of the waves to produce electricity are in their infancy, and this is precisely what is being explored by the UPV/EHU's EOLO research group, which focuses on meteorology, climate and the environment, among other areas.

Gabriel Ibarra, researcher in the group and lecturer in the UPV/EHU's Department of Energy Engineering, explained that "one thing is the energy the waves produce, the hydraulic energy they have, and another thing is the amount of electrical power obtained from them". This is what they have been working on over the last few years. "After identifying some of the key aspects of the operation of the Mutriku facilities a few years ago, we have now developed a methodology that allows us to find out the impact of climate change on the output at Mutriku. We have used it to reconstruct the daily electrical power that would have been generated if the Mutriku wave farm had been operational during the entire 1979-2019 period, and this will help us to predict what might happen in the future," explained Ibarra.

The researcher affirmed that "we have found that there has been a growing trend in the strength of the waves in the Bay of Biscay as a result of climate change, from 1900 to the present day. The aim was therefore to analyse how the Mutriku facilities responded to this trend. In this respect, while taking into account the evolution of the waves over the last four decades, we developed a methodology that allows us to determine how this increase may affect generation at Mutriku".

Wave energy increases, but not electricity output

The research group found that "in the Mutriku area this upward trend in wave energy is not as high as in other areas of the Bay of Biscay, and that this trend would be dampened and electricity output would remain constant at the Mutriku facilities as a result of the way they function, their regulation system". It follows that the electrical output levels off above a certain threshold and is therefore more stable than the wave energy flow; consequently, they determined that moderate long-term changes in wave energy do not directly affect the output of wave power installations consisting of oscillating water columns. In Ibarra's opinion, much stronger waves would be needed to increase electricity production.

In the study, they identified ten main types of sea state with which a distinctive pattern of electrical power generation has been associated on a daily scale. This has allowed them to reconstruct the daily electrical power that would have been generated if the Mutriku wave farm had been operational during the entire 1979-2019 period and, consequently, to assess the impact that the observed changes in the wave climate and the associated energy flow would have had on electrical power output.
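Conceptually, the reconstruction reduces to classifying each day's sea state and looking up the generation pattern associated with its type. The Python sketch below conveys the idea under loudly hypothetical numbers: the ten sea-state centroids, the per-type daily outputs, and the names are our placeholders, not values from the study.

    import numpy as np

    # Hypothetical centroids of ten sea-state types: (significant wave
    # height in m, peak period in s). Placeholder values only.
    SEA_STATES = np.array([[1.0, 8.0], [1.5, 9.0], [2.0, 10.0], [2.5, 11.0],
                           [3.0, 12.0], [3.5, 12.5], [4.0, 13.0], [4.5, 13.5],
                           [5.0, 14.0], [6.0, 15.0]])

    # Hypothetical mean daily electrical output (kWh) per type. Note the
    # saturation for energetic seas: the plant's regulation system caps
    # output, which is why rising wave energy need not raise generation.
    DAILY_KWH = np.array([150, 220, 300, 380, 450, 500, 530, 545, 550, 550])

    def classify(hs, tp):
        """Assign a day's wave conditions to the nearest sea-state type."""
        return int(np.argmin(np.linalg.norm(SEA_STATES - [hs, tp], axis=1)))

    def reconstruct(hs_series, tp_series):
        """Map a hindcast wave series (e.g. 1979-2019) to daily output."""
        return [DAILY_KWH[classify(h, t)] for h, t in zip(hs_series, tp_series)]

    # Three invented hindcast days:
    print(reconstruct([1.2, 3.1, 5.8], [8.5, 12.1, 14.8]))  # [150, 450, 550]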

So, "the next step is to consider the future that climate change will bring and make a forecast; we believe that this upward trend will continue and we want to see, firstly, whether this trend will be on a large or small scale, and secondly, what impact this will have in the future, over the coming decades, on output at Mutriku. All the research carried out at the Mutriku facilities is hugely useful in advancing this type of technology, as it is the only facility in the world that supplies energy continuously to the grid", said Ibarra. The research has therefore shown that highly reliable feasibility and economic studies of wave power facilities can be carried out, as the future uncertainties of the resource itself will not have a significant impact on the electrical power performance of the installations throughout their life cycles.

Credit: 
University of the Basque Country

Changing consumption of certain fatty acids can lessen severity of headaches

image: Daisy Zamora, PhD

Image: 
UNC School of Medicine

CHAPEL HILL, NC - Migraine is one of the leading causes of disability in the world. Existing treatments are often not enough to offer patients full relief. A new study published in The BMJ demonstrates an additional option patients can use in their effort to experience fewer migraines and headaches - a change in diet.

"Our ancestors ate very different amounts and types of fats compared to our modern diets," said co-first author Daisy Zamora, PhD, assistant professor in the UNC Department of Psychiatry in the UNC School of Medicine. "Polyunsaturated fatty acids, which our bodies do not produce, have increased substantially in our diet due to the addition of oils such as corn, soybean and cottonseed to many processed foods like chips, crackers and granola."

The classes of polyunsaturated fatty acids examined in this study are omega-6 (n-6) and omega-3 (n-3). Both have important functions within our body, but need to be in balance, as n-3 fatty acids have been shown to decrease inflammation and some derivatives of n-6 have been shown to promote pain. However, due to the amount of processed food consumed today, most people in the U.S. are eating substantially more n-6 and fewer n-3 fatty acids.

To see whether the amount of these fatty acids in a person's diet could affect headache pain, 182 patients diagnosed with and seeking treatment for migraines were enrolled in a randomized controlled trial led by Doug Mann, MD, professor of Neurology and Internal Medicine in the UNC School of Medicine. In addition to their current treatments, patients adhered to one of three diets for 16 weeks: a control diet that maintained the average amounts of n-6 and n-3 fatty acids consumed by a person living in the U.S.; a diet that increased n-3 and maintained n-6 fatty acids; and a diet that increased n-3 and decreased n-6 fatty acids. Participants were provided with two-thirds of their daily food requirements and were given an electronic diary to record how many hours of headache pain they had each day.

"Participants seemed highly motivated to follow these diets due to the amount of pain they were experiencing," said Beth MacIntosh, MPH, RD, clinical nutrition manager for the UNC Metabolic and Nutrition Research Core.

"The results are quite promising," Zamora said. "Patients who followed either diet experienced less pain than the control group. Those who followed the diet high in n-3 and low in n-6 fatty acids experienced the biggest improvement."

Participants reported fewer days a month with headaches, and some were able to decrease the amount of medication they needed for their pain. However, participants did not report a change in quality of life.

"I think this modification in diet could be impactful," Zamora said. "The effect we saw for the reduction of headaches is similar to what we see with some medications. The caveat is that even though participants did report fewer headaches, some people did not change their perception of how headaches affected them."

"This study specifically tested n-3 fatty acids from fish and not from dietary supplements," said study co-author Keturah Faurot, assistant professor of Physical Medicine and Rehabilitation, and assistant director of the Program on Integrative Medicine. "Our findings do not apply to supplement use."

Zamora says the biochemical hypothesis of how certain fatty acids affect pain applies to a wide variety of chronic pain. She and her colleagues are currently working on a new study to test diet modification in other pain syndromes.

Credit: 
University of North Carolina Health Care

Air pollution during pregnancy may affect growth of newborn babies

image: Air pollution affects the thyroid gland, and thyroid hormones are essential for regulating foetal growth and metabolism and play an important role in neurological development.

Image: 
Bernat Alberdi

According to studies in recent years, air pollution affects the thyroid. Thyroid hormones are essential for regulating foetal growth and metabolism, and play an important role in neurological development. Thyroxine (T4) is the main circulating thyroid hormone, and TSH is the thyroid-stimulating hormone. At 48 hours of age, newborn babies undergo a heel prick test in which T4 and TSH levels in the blood are measured. If the balance of these thyroid hormones is not right, the risk of developing serious diseases increases. That is why "this study set out to analyse the relationship between atmospheric pollution during pregnancy and the level of thyroxine in the newborn", explained Amaia Irizar-Loibide, a researcher in the UPV/EHU's Department of Preventive Medicine and Public Health.

Nitrogen dioxide (NO2) and fine particulate matter less than 2.5 micrometres in diameter (PM2.5) are two of the main pollutants associated with air pollution and vehicle traffic. PM2.5 particles, for example, are very fine and easily enter the respiratory tract. "In this work we specifically analysed the effect of maternal exposure to these fine particles and to nitrogen dioxide during pregnancy and the link with thyroxine levels in newborn babies. We monitored exposure on a weekly basis, as the development of the foetus varies greatly from one week to the next. So we tried to conduct the most detailed research possible in order to find out which weeks of pregnancy are the most sensitive", added the UPV/EHU researcher.

The researchers analysed the Gipuzkoa sample of the INMA (Environment and Childhood) project, drawing on data collected in the project on the air pollutants PM2.5 and NO2 and on TSH and T4 levels from newborn heel prick samples.

According to Amaia Irizar, "the results obtained in this study have revealed the direct relationship between exposure to fine particles during pregnancy and the level of thyroxine in newborns. However, we have not observed a clear link with exposure to nitrogen dioxide". These results therefore coincide with the limited previous research. "What we have seen in this work," stressed Irizar, "is that exposure during the first months of pregnancy has a direct influence on the balance of thyroid hormones. These babies tend to have a lower level of thyroxine. As the pregnancy progresses, we found that this relationship gradually diminishes, i.e. the mother's exposure gradually becomes less important. In late pregnancy, however, this link becomes apparent again, but displays an opposite effect: as the concentration of these fine particles increases, we have seen that the level of thyroid hormones also increases, which has the opposite effect on the balance". "It is not clear what mechanism lies behind all this. In any case, we have come to the conclusion that the most sensitive periods of pregnancy in terms of atmospheric pollution are the early and late months," the UPV/EHU researcher stressed.

"The next task would be to study the mechanisms by which these fine particles cause opposing effects in early and late pregnancy. In fact, these particles are nothing more than small spheres made up of carbon, and it is not clear whether the effect these spheres exert is because they pass from the placenta to the baby, whether other components attached to the particles are released once they have entered the body...," she explained. "We need to continue to investigate whether exposure during pregnancy affects not only thyroid hormones, but also other aspects such as neuropsychological development, growth, obesity, etc.," explained Amaia Irizar.

Credit: 
University of the Basque Country

How children integrate information

image: When children learn words, they combine different -- sometimes even conflicting -- sources of information.

Image: 
123RF | Olga Yastremska

"We know that children use a lot of different information sources in their social environment, including their own knowledge, to learn new words. But the picture that emerges from the existing research is that children have a bag of tricks that they can use", says Manuel Bohn, a researcher at the Max Planck Institute for Evolutionary Anthropology.

For example, if you show a child an object they already know - say a cup - as well as an object they have never seen before, the child will usually think that a word they never heard before belongs with the new object. Why? Children use information in the form of their existing knowledge of words (the thing you drink out of is called a "cup") to infer that the object that doesn't have a name goes with the name that doesn't have an object. Other information comes from the social context: children remember past interactions with a speaker to find out what they are likely to talk about next.

"But in the real world, children learn words in complex social settings in which more than just one type of information is available. They have to use their knowledge of words while interacting with a speaker. Word learning always requires integrating multiple, different information sources", Bohn continues. An open question is how children combine different, sometimes even conflicting, sources of information.

Predictions by a computer program

In a new study, a team of researchers from the Max Planck Institute for Evolutionary Anthropology, MIT, and Stanford University takes on this issue. In a first step, they conducted a series of experiments to measure children's sensitivity to different information sources. Next, they formulated a computational cognitive model which details the way that this information is integrated.

"You can think of this model as a little computer program. We input children's sensitivity to different information, which we measure in separate experiments, and then the program simulates what should happen if those information sources are combined in a rational way. The model spits out predictions for what should happen in hypothetical new situations in which these information sources are all available", explains Michael Henry Tessler, one of the lead-authors of the study.

In a final step, the researchers turned these hypothetical situations into real experiments. They collected data with two- to five-year-old children to test how well the predictions from the model line up with real-world data. Bohn sums up the results: "It is remarkable how well the rational model of information integration predicted children's actual behavior in these new situations. It tells us we are on the right track in understanding from a mathematical perspective how children learn language."

Language learning as a social inference problem

How does the model work? The algorithm that processes the different information sources and integrates them is inspired by decades of research in philosophy, developmental psychology, and linguistics. At its heart, the model looks at language learning as a social inference problem, in which the child tries to find out what the speaker means - what their intention is. The different information sources are all systematically related to this underlying intention, which provides a natural way of integrating them.

Additionally, the model also specifies what changes as children get older. Over development, children become more sensitive to the individual information sources, and yet the social reasoning process that integrates the information sources remains the same.

"The virtue of computational modeling is that you can articulate a range of alternative hypotheses - alternative models - with different internal wiring to test if other theories would make equally good or better predictions. In some of these alternatives, we assumed that children ignore some of the information sources. In others, we assumed that the way in which children integrate the different information sources changes with age. None of these alternative models provided a better explanation of children's behavior than the rational integration model", explains Tessler.

The study offers several exciting and thought-provoking results that inform our understanding of how children learn language. Beyond that, it opens up a new, interdisciplinary way of doing research. "Our goal was to put formal models in a direct dialogue with experimental data. These two approaches have been largely separated in child development research", says Manuel Bohn. The next steps in this research program will be to test the robustness of this theoretical model. To do so, the team is currently working on experiments that involve a new set of information sources to be integrated.

Credit: 
Max Planck Institute for Evolutionary Anthropology

Spatial patterns of gene transcripts captured across single cells of mouse embryo

image: A depiction of how sci-Space captures the personalities and localities of individual cells as they come together to form organs in a developing mouse embryo.

Image: 
Nigel Sussman

A new technique called sci-Space, combined with data from other technologies, could lead to four-dimensional atlases of gene expression across diverse cells during embryonic development of mammals.

Such atlases would map how the gene transcripts in individual cells reflect the passage of time, cell lineages, cell migration, and location on the developing embryo. They would also help illuminate the spatial regulation of gene expression.

Mammalian embryonic development is a remarkable phenomenon: a fertilized egg divides repeatedly and turns, in a matter of weeks or months, into a complex organism capable of a myriad of physiological processes and composed of a variety of cells, tissues, organs, and anatomical structures.

A better understanding of how mammals form before birth -- particularly the prenatal spatial patterns of gene expression at a single-cell level during embryonic development -- could advance biomedical and veterinary research on a variety of conditions. These range from inherited disorders to congenital malformations and developmental delays. Understanding how organs originate might also assist future regenerative medicine efforts.

An international team led by scientists at UW Medicine, Howard Hughes Medical Institute and the Brotman Baty Institute for Precision Medicine in Seattle demonstrated the proof-of-concept of their sci-Space technique in mouse embryos.

Their results are published in the July 2 edition of Science. The lead authors are Sanjay R. Srivatsan of the Department of Genome Sciences at the University of Washington School of Medicine, and Mary C. Regier of the UW Department of Bioengineering.

The senior authors are Jay Shendure, UW Medicine professor of genome sciences, director of the Brotman Baty Institute, and an investigator at the Allen Discovery Center for Cell Lineage Tracing; Kelly R. Stevens, UW assistant professor of bioengineering; and Cole Trapnell, associate professor of genome sciences. Regier and Stevens are also investigators at the UW Medicine Institute for Stem Cell and Regenerative Medicine Research.

The researchers observed the orchestration of genes in 120,000 cell nuclei. All the body's somatic cells contain the same DNA code. The researchers captured information on which genes were turned on or off in these nuclei as mouse embryos took shape. The scientists also investigated how cells' locations in an embryo affected which genes were activated during development.

This technique builds on previous work in which these scientists and other groups developed ways of conducting whole-organism profiling of gene expression and DNA-code accessibility, in thousands of single cells, during embryonic development. They did so to track the emergence and trajectory of various cell types.

How cells are organized spatially - what physical positions they take as an embryo forms - is critical to normal development. Misplaced or disrupted cells, or cells not appearing at the right time in the right spot, can cause serious problems or even prenatal death.

However, gaining knowledge of spatial patterns of gene expression has been technically difficult. It has been unwieldy to assay the gene transcripts of individual cells over wide swaths of the embryo. This has limited scientific understanding of how spatial organization influences gene expression and, consequently, of why particular cell types form where they do, and how neighboring groups of cells influence each other's future roles.

The scientists on the present study had earlier developed a method to label cell nuclei, a technique they called sci-Plex. They then applied combinatorial indexing to single-cell RNA sequencing, a method called sci-RNA-seq.

Now, with sci-Space, by analyzing spatial coordinates together with cells' gene transcripts, the scientists identified thousands of genes whose expression was anatomically patterned. For example, certain genetic profiles emerged in neurons in the brain and spinal cord, and others in cardiac muscle cells in the heart.

The scientists also used spatial and gene profile information to annotate subtypes of cells. For example, while blood vessel cells and heart muscle cells might both express the gene for a particular growth factor, only the heart muscle cells produced certain growth factor receptors.

The researchers also observed that cell types varied greatly in the extent of their spatial patterning of gene expression. For example, connective tissue progenitor cells showed a relatively large proportion of spatially restricted gene expression. This observation suggests that subtypes of these cells behave in a position-dependent manner throughout the body.

To measure the power of spatial position on a cell type's gene transcript profile, the researchers also calculated the physical distance between cells and the angular distance of their gene expression profiles.

"For many cell types, as the physical distance between cells increased, so did the angular distance between their transcriptomes," the researchers noted in their paper. However, they added that this trend varied considerably. It was most pronounced in certain brain and spinal cord cells.

The genetic transcript profiles of some other cell types were highly influenced by their position in the developing embryo. Among these are certain cartilage cells, which become part of the scaffolding for bones of the head and face.

The researchers also studied the gene expression dynamics underlying brain cell differentiation and migration during mouse embryonic development, examining how various brain cell trajectories were anatomically distributed using the Allen Institute's Anatomical Reference Brain Atlas as a guide.

"Cells from each trajectory overwhelmingly occupied distinct brain regions," the researchers noted. They also observed gradients of developmental maturity in different regions of the brain. These gradients revealed both known and new patterns of migration.

In the future, the researchers hope sci-Space will be further applied to serial sections that span the entire mouse embryo and that cover many points of time.

Credit: 
University of Washington School of Medicine/UW Medicine

Unfinding a split electron

image: Printed circuit board for mounting the nanowire sample.

Image: 
IST Austria

Quantum computers promise great advances in many fields - from cryptography to the simulation of protein folding. Yet which physical system works best for building the underlying quantum bits is still an open question. Unlike regular bits in your computer, these so-called qubits can take not only the values 0 and 1 but also mixtures of the two. While this potentially makes them very useful, it also makes them very unstable.

One approach to solving this problem bets on topological qubits, which encode the information in their spatial arrangement. That could provide a more stable and error-resistant basis for computation than other setups. The problem is that no one has ever definitively found a topological qubit yet.

An international team of researchers from Austria, Denmark, and Spain, led by Marco Valentini from the Nanoelectronics group at IST Austria, has now examined a setup which was predicted to produce the so-called Majorana zero modes - the core ingredient for a topological qubit. They found that a seemingly valid signal for such modes can in fact be a false flag.

Half of an Electron

The experimental setup is built around a tiny wire just a few hundred nanometers - a few ten-thousandths of a millimeter - long, grown by Peter Krogstrup from Microsoft Quantum and the University of Copenhagen. These aptly named nanowires form a free-floating connection between two metal conductors on a chip. They are coated with a superconducting material that loses all electrical resistance at very low temperatures. The coating covers the whole wire except for a tiny uncoated segment at one end, which forms a crucial part of the setup: the junction. The whole contraption is then exposed to a magnetic field.

Theory predicted that Majorana zero modes - the basis for the topological qubit they were looking for - should appear in the nanowire. Majorana zero modes are a strange phenomenon: they started out as a mathematical trick in which one electron in the wire is described as being composed of two halves. Usually, physicists do not think of electrons as something that can be split, but using this nanowire setup it should have been possible to separate these "half-electrons" and to use them as qubits.

"We were excited to work on this very promising material platform," explains Marco Valentini, who joined IST Austria as an intern before becoming a PhD student in the Nanoelectronics group. "What we expected to see was the signal of Majorana zero modes in the nanowire, but we found nothing. First, we were confused, then frustrated. Eventually, and in close collaboration with our colleagues from the Theory of Quantum Materials and Solid State Quantum Technologies group in Madrid, we examined the setup, and found out what was wrong with it."

A False Flag

After attempting to find the signatures of the Majorana zero modes, the researchers began to vary the nanowire setup to check whether any effects from its architecture were disturbing their experiment. "We did several experiments on different setups to find out what was going wrong," Valentini explains. "It took us a while, but when we doubled the length of the uncoated junction from a hundred nanometers to two hundred, we found our culprit."

When the junction was large enough, the following happened: the exposed inner nanowire formed a so-called quantum dot - a tiny speck of matter that shows special quantum mechanical properties due to its confined geometry. The electrons in this quantum dot could then interact with those in the superconducting coating next to it, thereby mimicking the signal of the "half-electrons" - the Majorana zero modes - that the scientists were looking for.

"This unexpected conclusion came after we established the theoretical model of how the quantum dot interacts with the superconductor in a magnetic field and compared the experimental data with detailed simulations performed by Fernando Peñaranda, a PhD student in the Madrid team," says Valentini.

"Mistaking this mimicking signal for a Majorana zero mode shows us how careful we have to be in our experiments and in our conclusions," Valentini cautions. "While this may seem like a step back in the search for Majorana zero modes, it actually is a crucial step forward in understanding nanowires and their experimental signals. This finding shows that the cycle of discovery and critical examination among international peers is central to the advancement of scientific knowledge."

Credit: 
Institute of Science and Technology Austria

Beam steering angle expander with two liquid crystal polymeric diffractive optical elements

image: Illustration of a planar telescope consisting of two layers of flat optics for achieving angle magnification. Both layers are assigned phase profiles following the sum of even order polynomials and they are separated in space by d.

Image: 
by Ziqian He, Kun Yin, and Shin-Tson Wu

Flat optics based on patterned liquid crystals (LCs) has recently received extensive research interest. Compared with dielectric metasurfaces, which are usually fabricated by sophisticated lithography processes, LC polymer-based planar optics, owing to its self-assembly properties, can be fabricated through an all-solution process. During the past decades, a variety of planar optical devices have been demonstrated based on geometric phase (also termed Pancharatnam-Berry phase) manipulation. The total effective thickness of such a device, including the underlying liquid crystal alignment layer and the liquid crystal polymer, is usually on the order of 1 μm. Remarkably, commercial-quality transmissive lenses, gratings, optical vortex processors, etc., have been developed in the past few years. Engineering of their operating spectral/angular bands has been demonstrated in both passive and active devices. For example, a multi-twist structure can be designed to customize the spectral/angular bandwidth as a passive means, while active devices that respond to external stimuli such as mechanical stress, electric fields, and light have also been realized. Nevertheless, existing explorations have focused on optical functionalities that can be fulfilled by a single-layer device. One way to go beyond this limit is to design cascaded flat optics, where more degrees of freedom are involved and thereby more distinct functionalities can be rationally achieved. In the meantime, the cascaded optical elements should still preserve advantages such as high efficiency, compactness, light weight, easy processing, flexibility, and low cost.

In a new paper published in Light: Science & Applications, a team of scientists, led by Prof. Shin-Tson Wu from the College of Optics and Photonics, University of Central Florida, USA, proposed a cascaded LC flat optical element, termed a miniature planar telescope, to achieve steering-angle magnification independent of the incident beam position. Such an angle magnification function cannot be achieved with a single-layer optical device such as a grating or a refractive surface. This miniature planar telescope consists of two flat optical elements, as schematically shown in Fig. 1. Both layers are assigned phase profiles following the sum of even-order polynomials, and they are separated in space by d. Through ray-tracing simulations, the system can be optimized for a specific aperture size and incident angle range, and nearly diffraction-limited performance can be obtained.
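In the paraxial limit the arrangement behaves like a classical two-lens telescope; the following is our hedged summary in standard optics notation, not equations quoted from the paper. Each layer j = 1, 2 carries a phase profile that is a sum of even-order polynomials,

    \phi_j(r) = \sum_{n=1}^{N} a_{j,n} \, r^{2n},

whose leading quadratic terms act as thin lenses of focal lengths f_1 and f_2. If the two layers are separated by d = f_1 + f_2, a collimated beam entering at angle \theta_{in} exits collimated at

    \theta_{out} \approx -\frac{f_1}{f_2} \, \theta_{in},

so the magnification |M| = f_1/f_2 depends only on the ratio of focal lengths, not on where the beam strikes the aperture; the higher-order coefficients are what the ray-tracing optimization tunes to hold near-diffraction-limited performance over the design angle range.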

In experiments, LC diffractive devices of millimeter size and various f-numbers were fabricated through all-solution processing and assembled into two telescope modules with designed magnification factors of 1.67 (module I) and 2.75 (module II). The measured magnifications agreed well with the designed values. Moreover, a reasonably high efficiency (>89.8% for module I and >84.6% for module II) was achieved within the designed incident angle range; error analysis suggests the efficiency could be improved further by optimizing the fabrication process. The team demonstrated that the telescope module is a promising candidate for non-mechanical beam steering to expand the currently limited steering range (also known as the field of regard). For example, for LiDAR (light detection and ranging) applications at λ = 905 nm, a maximum output angle range of ±27° can be expected. Compared with a high-efficiency optical phased array (the most mature electronic beam steerer) with an incident field range of ~±5°, a magnification of 5.4 can be acquired. For a longer operating wavelength, say λ = 1550 nm, the steering range can be expanded to about ±37°, corresponding to a magnification of 7.4. The team also characterized the output beam profile to confirm the high quality of the telescope modules and their compatibility with high-end beam steerers.

With the presented work, Wu and co-workers demonstrated lightweight, cost-effective, miniature planar telescopes for optical angle magnification based on LC polymer flat optics. High efficiency, designable magnification factors, and excellent beam quality make the proposed telescopes highly promising for practical applications requiring advanced laser beam steering. More importantly, this work marks a new milestone for planar LC optics, taking it beyond single-layer functionality.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

A crystal made of electrons

image: A Wigner crystal of electrons (red) inside a semiconductor material (blue/grey).

Image: 
ETH Zurich

Crystals have fascinated people through the ages. Who hasn't admired the complex patterns of a snowflake at some point, or the perfectly symmetrical surfaces of a rock crystal? The magic doesn't stop even if one knows that all this results from a simple interplay of attraction and repulsion between atoms and electrons. A team of researchers led by Atac Imamoglu, professor at the Institute for Quantum Electronics at ETH Zurich, has now produced a very special crystal. Unlike normal crystals, it consists exclusively of electrons. In doing so, they have confirmed a theoretical prediction made almost ninety years ago, which has since been regarded as a kind of holy grail of condensed matter physics. Their results were recently published in the scientific journal "Nature".

A decades-old prediction

"What got us excited about this problem is its simplicity", says Imamoglu. Already in 1934 Eugene Wigner, one of the founders of the theory of symmetries in quantum mechanics, showed that electrons in a material could theoretically arrange themselves in regular, crystal-like patterns because of their mutual electrical repulsion. The reasoning behind this is quite simple: if the energy of the electrical repulsion between the electrons is larger than their motional energy, they will arrange themselves in such a way that their total energy is as small as possible.

For several decades, however, this prediction remained purely theoretical, as those "Wigner crystals" can only form under extreme conditions such as low temperatures and a very small number of free electrons in the material. This is in part because electrons are many thousands of times lighter than atoms, which means that their motional energy in a regular arrangement is typically much larger than the electrostatic energy due to the interaction between the electrons.
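The competition can be made quantitative with a textbook scaling argument (ours, for orientation, not taken from the paper). For electrons of effective mass m at mean separation a in a medium with dielectric constant \varepsilon_r, the Coulomb and quantum kinetic energies scale as

    E_C \sim \frac{e^2}{4\pi\varepsilon_0\varepsilon_r a}, \qquad E_K \sim \frac{\hbar^2}{2 m a^2},

so the ratio E_C / E_K grows in proportion to a. Diluting the electrons (large a) therefore lets repulsion dominate, which is why a very small number of free electrons, together with low temperatures to suppress thermal motion, favours crystallization.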

Electrons in a plane

To overcome those obstacles, Imamoglu and his collaborators chose a wafer-thin layer of the semiconductor material molybdenum diselenide that is just one atom thick and in which, therefore, electrons can only move in a plane. The researchers could vary the number of free electrons by applying a voltage to two transparent graphene electrodes, between which the semiconductor is sandwiched. According to theoretical considerations the electrical properties of molybdenum diselenide should favour the formation of a Wigner crystal - provided that the whole apparatus is cooled down to a few degrees above the absolute zero of minus 273.15 degrees Celsius.

However, just producing a Wigner crystal is not quite enough. "The next problem was to demonstrate that we actually had Wigner crystals in our apparatus", says Tomasz Smolenski, who is the lead author of the publication and works as a postdoc in Imamoglu's laboratory. The separation between the electrons was calculated to be around 20 nanometres, or roughly thirty times smaller than the wavelength of visible light and hence impossible to resolve even with the best microscopes.

Detection through excitons

Using a trick, the physicists managed to make the regular arrangement of the electrons visible despite that small separation in the crystal lattice. To do so, they used light of a particular frequency to excite so-called excitons in the semiconductor layer. Excitons are pairs of electrons and "holes" that result from a missing electron in an energy level of the material. The precise light frequency for the creation of such excitons and the speed at which they move depend both on the properties of the material and on the interaction with other electrons in the material - with a Wigner crystal, for instance.

The periodic arrangement of the electrons in the crystal gives rise to an effect that can sometimes be seen on television. When a bicycle or a car goes faster and faster, above a certain velocity the wheels appear to stand still and then to turn in the opposite direction. This is because the camera takes a snapshot of the wheel every 40 milliseconds. If in that time the regularly spaced spokes of the wheel have moved by exactly the distance between the spokes, the wheel seems not to turn anymore. Similarly, in the presence of a Wigner crystal, moving excitons appear stationary provided they are moving at a certain velocity determined by the separation of the electrons in the crystal lattice.
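In the camera analogy, the standstill condition is simply that the spokes advance by one spacing per frame; written out (our notation),

    v \, \Delta t = d,

with frame interval \Delta t = 40 ms and d the spoke spacing. For the Wigner crystal, the electron separation of roughly 20 nanometres plays the role of d, singling out the particular exciton velocity at which the periodic lattice appears stationary.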

First direct observation

"A group of theoretical physicists led by Eugene Demler of Harvard University, who is moving to ETH this year, had calculated theoretically how that effect should show up in the observed excitation frequencies of the excitons - and that's exactly what we observed in the lab", Imamoglu says. In contrast to previous experiments based on planar semiconductors, in which Wigner crystals were observed indirectly through current measurements, this is a direct confirmation of the regular arrangement of the electrons in the crystal. In the future, with their new method Imamoglu and his colleagues hope to investigate exactly how Wigner crystals form out of a disordered "liquid" of electrons.

Credit: 
ETH Zurich

Healthcare professionals are failing smell loss patients

People who have lost their sense of smell are being failed by healthcare professionals, new research has revealed.

A study by Newcastle University, University of East Anglia and charity Fifth Sense, shows poor levels of understanding and care from GPs and specialists about smell and taste loss in patients.

This is an issue that has particularly come to the forefront during the Covid-19 pandemic as many people who have contracted the virus report a loss of taste and smell as their main symptoms.

Around one in 10 people who experience smell loss as a result of Covid-19 report that their sense of smell has not returned to normal four weeks after falling ill.

The study, published in the journal Clinical Otolaryngology, highlights the difficulties that people with smell and taste disorders experience in accessing treatment.

The research team say that identifying these barriers is vital to help people have better access to healthcare.

More resources needed

Dr Stephen Ball, from Newcastle University's Faculty of Medical Sciences, who led the study, said: "This research highlights that a greater focus needs to be dedicated to patients with smell or taste loss.

"When you contrast the healthcare services funded and available for people with loss of other senses - such as vision or hearing the differences are vast. Our results show this exists for patients through both primary and secondary care.

"More attention and resources need to be provided for this group of patients that has increased significantly following the Covid-19 pandemic."

More than 600 smell loss patients took part in a survey which captured their poor experiences of accessing healthcare.

The survey highlights poor levels of understanding from many GPs and consultants (both in Neurology and Ear, Nose and Throat departments) about the impact of smell and taste disorders on patients.

Over 60 per cent of patients reported suffering from anxiety or depression since their smell loss. And almost all of the patients - 98 per cent - said their quality of life has been affected.

Only around 20 per cent of patients reported that they had experienced an improvement in their symptoms following treatment.

Professor Carl Philpott, from the University of East Anglia's Norwich Medical School, said: "Before the pandemic, smell disorders affected around five per cent of the population. But the huge rise in smell loss caused by Covid-19 has created an unprecedented worldwide demand for treatment.

"Smell disorders cause people to lose their sense of smell or change the way they perceive odours. Some people perceive smells that aren't there at all.

"There are many causes for smell loss - from infections and injury to neurological diseases such as Alzheimer's and as a side-effect of some medications.

"Our research shows that there is an unmet need for smell loss patients in accessing healthcare and a clear need to improve training within healthcare to remove the barriers faced."

Quality of life impacted

This research was conducted before the Covid-19 pandemic, but it identifies many problems that were consistent across patients' experiences.

Further issues reported included repeated ineffective treatments, difficulties getting referrals for further care, and an average personal cost of £421 in seeking advice and treatment.

Duncan Boak, Founder and Chair of Fifth Sense, a charity for people affected by smell and taste disorders, said: "Smell disorders can have a huge impact on people's quality of life in many ways.

"An important part of Fifth Sense's work is giving our beneficiaries a voice and the opportunity to change the way society understands smell and taste disorders, whether through volunteering or participating in research studies like this one.

"The results of this study will be a big help in our ongoing work to improve the lives of those affected."

Credit: 
Newcastle University

Advances in optical engineering for future telescopes

image: Conceptual rendered image of the OASIS space observatory with a 20-meter-diameter inflatable primary aperture.

Image: 
Opto-Electronic Advances

In a new publication in Opto-Electronic Advances (DOI: 10.29026/oea.2021.210040), researchers led by Professor Daewook Kim from the University of Arizona, Tucson, AZ, USA, consider advances in optical engineering for future telescopes.

Astronomical advances are closely coupled with technological improvements: from the invention of the first optical telescope, used by Galileo in 1609, through the foreseeable future, astronomy and optical engineering will remain linked. This paper summarizes several advances that will enable future telescopes to expand scientific understanding of the universe. Significant optical engineering advances are being made at the University of Arizona for the design, fabrication, and construction of next-generation astronomical telescopes. This paper focuses on technological advances in four key areas:

Optical fabrication techniques used for constructing next-generation telescope mirrors.

Advances in ground-based telescope alignment control and instrumentation, including laser truss-based active alignment of the Large Binocular Telescope (LBT) prime focus camera and the MOBIUS (Mask-Oriented Breadboard Implementation for Unscrambling Spectra) cross-dispersion spectroscopy unit used at the prime focal plane of the LBT.

Topological pupil segment optimization.

Future space telescope concepts and enabling technologies. Namely, the Nautilus space observatory requires precise alignment of segmented, multi-order diffractive optical elements. The OASIS (Orbiting Astronomical Satellite for Investigating Stellar Systems) terahertz space telescope presents unique challenges for characterizing the sag of the inflatable primary mirror. The Hyperion space telescope pushes the limits of high spectral resolution, far-UV spectroscopy. The CDEEP (Coronagraphic Debris and Exoplanet Exploring Pioneer) is a SmallSat mission concept for high-contrast imaging of circumstellar disks and exoplanets using a vector vortex coronagraph. These advances in optical engineering technologies will help mankind to survey, explore, and understand the scientific beauty of our universe.

A diverse selection of ground-based and space-based future telescope technologies is actively being conceptualized, designed, prototyped, and demonstrated at the University of Arizona. Associate Professor Daewook Kim has been leading the Large Optics Fabrication and Testing (LOFT) group of researchers, who investigate freeform optical system design, highly aspheric optical surface figure manufacturing challenges, and dynamic metrology system developments. Professor Kim and his LOFT peers have published more than 150 publications as of 2021.

Computer Controlled Optical Surfacing (CCOS) process enhancements by the LOFT group enable efficient production of future optical elements. New engineering technologies will upgrade and expand the capabilities of existing large ground-based or space-based telescopes. This suite of optical technologies serves the next generation of astronomical investigations by offering novel and practical approaches that the wider design and engineering community will benefit from. It is Professor Kim's hope that these contributions in design and instrumentation will not only provide new benchmarks for modern astronomy but will also precipitate the next great insights and questions about our universe.

Professor Kim continues to coordinate with various flagship ground-based and space-based telescope missions using existing infrastructure and is working in close collaboration with the facilities' directors, staff, and scientists. He considers it essential to maintain the LOFT group's international contribution and academic service to the optics community for the next generation of advanced optical system design, manufacturing/testing, and engineering by continuously researching, developing, and applying innovative and advanced optical technologies.

Credit: 
Compuscript Ltd

Good food in a nice setting: wild bees need diverse agricultural landscapes

image: Solitary wild bee on an oilseed rape flower

Image: 
N Beyer

Mass-flowering crops such as oilseed rape or faba bean (also known as broad bean) provide valuable sources of food for bees, which, in turn, contribute to the pollination of both the crops and nearby wild plants when they visit. But not every arable crop that produces flowers is visited by the same bees. A team from the University of Göttingen and the Julius Kühn Institute (JKI) in Braunschweig has investigated how the habitat diversity of the agricultural landscape and the cultivation of different mass-flowering crops affect wild bees. The research shows that diverse agricultural landscapes increase the species richness of wild bees. Flowering arable crops with different flower shapes support different wild bee species. The results of the study have been published in Landscape Ecology.

The research team recorded wild bees in flower-rich, semi-natural habitats such as hedgerows and flower strips in a total of 30 different agricultural landscapes, each covering one square kilometre, near Göttingen, Itzehoe and Leipzig. Researchers counted the number of bees along standardised sections and used a hand net to catch them and identify the species. The landscapes used in the study differed in their diversity and in the proportion of land covered by rapeseed and faba beans.

"The shape of the flower is an important criterion for determining which wild bee species will collect nectar from its flowers," says PhD student Felix Kirsch from the University of Göttingen, who conducted the study as part of his Master's thesis. "For example, the shape of the flower must fit the bee's body size and the length of its tongue. Nectar is easily accessible from rapeseed flowers, while the nectar of faba bean is hidden deep inside the flowers."

"Our study shows that faba beans promote social wild bees, especially long-tongued bumblebees," explains Dr Doreen Gabriel from the JKI in Braunschweig. A different picture emerged in landscapes with large amounts of oilseed rape: here, the study found that the proportion of solitary wild bees, which often have a smaller body size, was higher. "The cultivation of a certain mass-flowering crop is not sufficient to maintain diverse bee communities, which in turn ensure the pollination success of many flowering arable crops and wild plants," says first author Nicole Beyer, a postdoctoral researcher in the Functional Agrobiodiversity Department at Göttingen University. The head of the department, Professor Catrin Westphal, concludes: "Our results show convincingly that diverse, flowering arable crops and especially diverse semi-natural habitats in the agricultural landscape are necessary to support a broad range of wild bee species."

Credit: 
University of Göttingen

New ternary hydrides of lanthanum and yttrium join the ranks of high-temperature superconductors

A team led by Skoltech professor Artem R. Oganov studied the structure and properties of ternary hydrides of lanthanum and yttrium and showed that alloying is an effective strategy for stabilizing otherwise unstable phases YH10 and LaH6, expected to be high-temperature superconductors. The research was published in the journal Materials Today.

Cuprates had long remained record-setters for high-temperature superconductivity until H3S was predicted in 2014. This unusual sulfur hydride was estimated to have high-temperature superconductivity at 191-204 K and was later obtained experimentally, setting a new record in superconductivity.

Following this discovery, many scientists turned to superhydrides, which are abnormally rich in hydrogen, and discovered new compounds that became superconducting at even higher temperatures: LaH10 (predicted and then experimentally shown to be superconducting at 250-260 K at 2 million atmospheres) and YH10 (predicted to be an even higher-temperature superconductor). Despite the similarity between yttrium and lanthanum, YH10 proved to be unstable, and thus far no one has succeeded in synthesizing it in its pure form. Having reached the upper limit of critical temperatures for binary hydrides, chemists turned to ternary hydrides, which appear to be the most promising path towards still higher-temperature superconductivity. Finally, in 2020, after over 100 years of research, scientists were able to synthesize the first room-temperature superconductor - a ternary sulfur and carbon hydride - with a critical temperature of +15 °C.

In their recent work, scientists from Skoltech, the Institute of Crystallography of RAS, and V.L. Ginzburg Center for High-Temperature Superconductivity and Quantum Materials studied ternary hydrides of lanthanum and yttrium with different ratios of these two elements.

"Although lanthanum and yttrium are similar, their hydrides are different: YH6 and LaH10 do exist, while LaH6 and YH10 do not. We found that both structures could be stabilized by adding the other element. For example, LaH6 can be made more stable by adding 30 percent of yttrium, and its critical superconductivity temperature is slightly higher as compared to YH6," professor Oganov says.

In addition, the research has helped to elucidate the general profile of superconductivity in ternary hydrides. "We realized that ternary and quaternary hydrides have progressively less ordered structures and a much greater width of the superconducting transition than binary hydrides. Also, they require more intensive and longer laser heating than their binary counterparts," lead author and Skoltech PhD student Dmitrii Semenok explains.

The scientists believe that the study of ternary hydrides holds much promise for stabilizing unstable compounds and enhancing their superconducting performance.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)