
Oncotarget | SLC25A32 sustains cancer cell proliferation by regulating flavin adenine dinucleotide (FAD) metabolism

image: Genetic alterations of SLC25A32 reduce survival of cancer patients. (A) Representation of SLC25A32 genetic alterations across different cancers (www.cbioportal.org). (B) Spearman's rank correlation between SLC25A32 mRNA expression (RSEM TPM) and somatic copy number in breast cancer (1,075 samples; P < 0.05), ovarian cancer (300 samples; P = 0.05) and liver cancer (364 samples; P = 0.05) in patient samples from TCGA. Each dot represents the tumor sample of one patient. The dotted line is a linear regression fit and the blue area around it shows the 95% confidence interval. (C) Median overall survival data of ovarian carcinoma patients with SLC25A32 amplification (67 cases) and no amplification (241 cases). The median survival difference between the two groups is statistically significant (P = 0.0435). (D) Median overall survival data from breast carcinoma patients with SLC25A32 amplification (407 cases) and no amplification (1,459 cases). The median survival difference between the two groups is statistically significant (P = 0.0000228).

Image: 
Sven Christian - sven.christian@bayer.com

Oncotarget Volume 11, Issue 8 reported that while it is known that cancer cells require one-carbon and FAD-dependent mitochondrial metabolism to sustain cell proliferation, the role of SLC25A32 in cancer cell growth remains unexplored.

siRNA-mediated knock-down and CRISPR-mediated knock-out of SLC25A32 in cancer cells of different origins resulted in the identification of cell lines sensitive and resistant to SLC25A32 inhibition.

Treatment of cells with the FAD precursor riboflavin and with GSH rescues cancer cell proliferation upon SLC25A32 down-regulation.

Dr. Sven Christian from Bayer AG, Drug Discovery, in Berlin, Germany, said, "Altered tumor metabolism is described as a hallmark of tumor biology and is essential for the adaptation of tumor cells to their specific needs, e.g. a higher demand for energy and macromolecules."

"Altered tumor metabolism is described as a hallmark of tumor biology and is essential for the adaptation of tumor cells to their specific needs, e. g. a higher demand for energy and macromolecules."

- Dr. Sven Christian, Bayer AG, Drug Discovery

Due to the glycolytic switch of tumor cells, mitochondrial biology and especially mitochondrial oxidative phosphorylation have been considered of minor importance in cancer biology.

Although the outer mitochondrial membrane was shown to be relatively permeable, the inner mitochondrial membrane is comparatively impermeable and consequently contains several transporter proteins to overcome such a physical barrier.

The SLC25 family consists of 53 members localized at the inner mitochondrial membrane that transport a wide range of molecules involved in essential mitochondrial processes such as redox balance, the urea and citric acid cycles, oxidative phosphorylation, DNA maintenance and iron metabolism.

Uncoupling proteins transport protons across the inner mitochondrial membrane and thus uncouple electron transport from ATP generation.

In support of this, yeast lacking the mitochondrial FAD transporter FLX1 could be rescued by human SLC25A32 expression, suggesting that this transporter may also transport FAD across the inner mitochondrial membrane.

The Christian research team concluded in their Oncotarget Research Article that the data suggest that inhibition of SLC25A32 is anti-proliferative in a subset of tumor cell lines, at least partially through an increase in reactive oxygen species resulting from malfunctioning FAD-dependent enzymes such as SDH, and that resistant cell lines can compensate for the loss thanks to their higher available reducing capacity. The study validates the role of SLC25A32 as a novel cancer target involved in the regulation of FAD-dependent mitochondrial metabolism. Molecular targeting of SLC25A32 with a single agent or in combination with ROS-inducing therapies could be an effective clinical strategy to successfully treat cancer patients.


DOI - https://doi.org/10.18632/oncotarget.27486

Full text - http://www.oncotarget.com/index.php?journal=oncotarget&page=article&op=view&path[]=27486&path[]=89920

Correspondence to - Sven Christian - sven.christian@bayer.com

Keywords - transporter, mitochondria, metabolism, ROS, FAD

About Oncotarget

Oncotarget is a weekly, peer-reviewed, open access biomedical journal covering research on all aspects of oncology.

To learn more about Oncotarget, please visit http://www.oncotarget.com or connect with:

SoundCloud - https://soundcloud.com/oncotarget
Facebook - https://www.facebook.com/Oncotarget/
Twitter - https://twitter.com/oncotarget
LinkedIn - https://www.linkedin.com/company/oncotarget
Pinterest - https://www.pinterest.com/oncotarget/
Reddit - https://www.reddit.com/user/Oncotarget/

Oncotarget is published by Impact Journals, LLC. To learn more, please visit http://www.ImpactJournals.com or connect with @ImpactJrnls

Media Contact
MEDIA@IMPACTJOURNALS.COM
1-800-922-0957, ext. 105


Credit: 
Impact Journals LLC

Reconfigurable chiral microlaser by spontaneous symmetry breaking

image: This is a microsphere.

Image: 
Xiao Yun-Feng

Coherent light sources are one of the most crucial foundations in both scientific disciplines and advanced applications. As a prominent platform, ultrahigh-Q whispering-gallery mode (WGM) microcavities have witnessed significant developments of novel light sources. However, the intrinsic chiral symmetry of the WGM microcavity geometry, and the resulting equivalence between the two directions of laser propagation in a cavity, severely limit further applications of microlasers.

Very recently, a team of researchers led by Professor Xiao Yun-Feng and Professor Gong Qihuang at Peking University, in collaboration with Professor Qiu Cheng-Wei at National University of Singapore and Professor Stefan Rotter at Vienna University of Technology, has demonstrated a spontaneously symmetry-broken microlaser in an ultrahigh-Q WGM microcavity, exhibiting reconfigurable propagation directions of the chiral laser. The work was published online in Nature Communications on February 28, 2020 (DOI: 10.1038/s41467-020-14861-5).

In previous studies, solutions for a chiral microlaser mainly resorted to explicitly breaking the structural symmetry of a WGM microcavity. Unfortunately, the scalability and reconfigurability of these strategies are strongly limited, since the devices, once fabricated, come with a prefixed, non-tailorable laser directionality. In this work, the researchers achieve a reconfigurable chiral microlaser in a symmetric WGM microcavity by exploiting cavity-enhanced optical Kerr nonlinearity.

"We employed microcavity Raman lasers in the experiment, which generally involve a pair of balanced clockwise (CW) and counterclockwise (CCW) waves," said Cao Qi-Tao, a Ph.D. student at Peking University and one of the co-first authors of this work. The Raman laser waves in the two directions are coupled together through linear surface Rayleigh scattering and nonlinear phase modulation by optical Kerr effect. As the power of the microlaser with a particular phase increases and reaches a threshold, the linear coupling is completely compensated by the nonlinear coupling. Above this threshold, the chiral symmetry of the laser field breaks spontaneously, and the Raman wave evolves randomly into a chiral state with a CW or CCW dominated laser propagation. Experimentally, an unprecedented ratio of counter-propagating emission intensities is obtained exceeding 160:1. Furthermore, the directionality of such the chiral microlaser is all-optically and dynamically controlled by the bias in the pump direction, and the symmetry breaking threshold is adjustable using a nanotip scatterer.

"Our results break the perception boundary of how to realize a reconfigurable coherent light source, to enable a powerful reconfigurability of a laser's directionality and chirality, and to extend a long-ranging impact on on-chip nanophotonics and nonlinear processes," said Professor Xiao. "Such a spontaneously chiral emitting laser also can be extended to various microstructures, and is almost free from the material limit due to the ubiquity of the Kerr nonlinearity."

Credit: 
Peking University

Researchers announce progress in developing an accurate, noninvasive urine test for prostate cancer

Researchers at the Johns Hopkins Kimmel Cancer Center have made significant progress toward development of a simple, noninvasive liquid biopsy test that detects prostate cancer from RNA and other specific metabolic chemicals in the urine.

A description of their findings appears in the Feb. 28 issue of the journal Scientific Reports.

The investigators emphasize that this is a proof-of-principle study for the urine test, and it must be validated in additional, larger studies before it is ready for clinical use.

The researchers used RNA deep sequencing and mass spectrometry to identify a previously unknown profile of RNAs and dietary byproducts, known as metabolites, among 126 study participants. The cohort included 64 patients with prostate cancer, 31 with benign prostatic hyperplasia or prostatitis, and 31 healthy people with none of these conditions. RNA alone was not sufficient to positively identify the cancer, but adding a group of disease-specific metabolites separated cancer from the other diseases and from healthy people.
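In machine-learning terms, the comparison described above is a feature-set ablation: train the same classifier on RNA features alone, then on RNA plus metabolite features, and compare cross-validated performance. The sketch below illustrates that pattern only; the feature counts, the choice of classifier, and the random stand-in data are assumptions, not the study's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 126                                  # cohort size reported in the study
rna = rng.normal(size=(n, 50))           # stand-in RNA expression features
metab = rng.normal(size=(n, 20))         # stand-in metabolite features
y = rng.integers(0, 2, size=n)           # 1 = prostate cancer, 0 = not

for name, X in [("RNA only", rna),
                ("RNA + metabolites", np.hstack([rna, metab]))]:
    auc = cross_val_score(RandomForestClassifier(random_state=0),
                          X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean cross-validated AUC = {auc:.2f}")
```

With real expression and metabolite measurements in place of the random stand-ins, a higher AUC for the combined feature set would mirror the improved separation the authors report.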

"A simple and noninvasive urine test for prostate cancer would be a significant step forward in diagnosis. Tissue biopsies are invasive and notoriously difficult because they often miss cancer cells, and existing tests, such as PSA (prostate-specific antigen) elevation, are not very helpful in identifying cancer," says Ranjan Perera, Ph.D., the study's senior author. Perera is also the director of the Center for RNA Biology at Johns Hopkins All Children's Hospital, a senior scientist at the Johns Hopkins All Children's Cancer & Blood Disorders Institute and the Johns Hopkins All Children's Institute for Fundamental Biomedical Research, and an associate professor of oncology at the Johns Hopkins University School of Medicine and Johns Hopkins Kimmel Cancer Center member.

"We discovered cancer-specific changes in urinary RNAs and metabolites that -- if confirmed in a larger, separate group of patients -- will allow us to develop a urinary test for prostate cancer in the future," says Bongyong Lee, Ph.D., the study's first author and a senior scientist at the Cancer & Blood Disorders Institute.

Credit: 
Johns Hopkins Medicine

Rare disease in children: the key role of a protein revealed

image: Lysosomal degradation efficiency after 30 minutes. For cells with normal CLN3 protein (left), the green markers are degraded, and when the protein is absent (right), degradation is less efficient.

Image: 
Stéphane Lefrançois, INRS

Laval, February 28, 2020 -- Professor Stéphane Lefrançois, a researcher at the Institut National de la Recherche Scientifique (INRS), is working on Batten disease, a neurodegenerative genetic disease that primarily affects children. His research focuses on the most common form of the disease, CLN3 Batten disease, which is caused by mutations in the CLN3 gene and for which there is still no cure.

Children affected by Batten disease are born with no symptoms and develop normally, learning to walk, talk, and interact with others. Between 5 and 8 years of age, however, they start to regress. "The first symptom that leads parents to seek medical attention for their child is a loss of vision caused by retinal degeneration. This is followed by cognitive regression characterized by speech and mobility impairment. The life expectancy for people with the disease is usually around 30 years," explains Lefrançois, who has been working on Batten disease for more than ten years.

A key protein

Professor Lefrançois and his team in Laval are delving into the cellular biology of the CLN3 protein, encoded by its namesake gene, in order to better understand the protein's function and identify therapeutic targets. They recently published findings about a key role played by CLN3 in the Journal of Cell Science. In the absence of the disease, CLN3 ensures a constant supply of proteins to the endosome, an intracellular compartment that serves as a sorting centre for proteins within the cell.

"Under this cellular process, a receptor acts as a truck that carries proteins from the Golgi apparatus, the production factory, to the sorting centre. Thanks to CLN3, this truck normally returns to the Golgi to pick up another load of proteins in an ongoing cycle," the researcher explains. "In the presence of the mutations, however, the truck doesn't make the return trip. Instead, it is redirected to the lysosomes, where it's broken down as cellular waste."

Because the receptor is degraded, the proteins vital to lysosome function can't reach their destination. As a consequence, these organelles are no longer able to break down cellular waste, which accumulates and causes cellular degeneration. "We think that children with the disease develop normally in their early years because their cells compensate by making more trucks. It's possible that the cells eventually can't keep up, so the system becomes dysfunctional and starts to degrade," adds Professor Lefrançois.

Professor Lefrançois is working with a team of European researchers to re-establish normal CLN3 function with a promising drug. The aim is to prevent degradation of the receptor so it can continue carrying proteins.

Worldwide, an estimated one person in 100,000 has some form of Batten disease.

Credit: 
Institut national de la recherche scientifique - INRS

Two sides of a coin: Our own immune cells damage the integrity of the blood-brain barrier

image: Microglia express the tight junction molecule CLDN5 to maintain BBB integrity in the early phase of systemic inflammation, and the phagocytic molecule CD68 as they impair BBB function in the late phase. Microglia phagocytose astrocyte end-feet, which are among the components of the BBB.

Image: 
Hiroaki Wake

The blood-brain barrier is a layer of cells that covers the blood vessels in the brain and regulates the entry of molecules from the blood into the brain. Increases in blood-brain barrier "permeability," or the extent to which molecules leak through, are observed in several neurological and psychiatric disorders; therefore, understanding the regulation of blood-brain barrier permeability is crucial for developing better therapies for such disorders.

In a study recently published in Nature Communications, a research team led by Prof. Hiroaki Wake of Nagoya University Graduate School of Medicine shows that microglia -- the resident immune cells of the brain -- initially protect the blood-brain barrier from damage due to "systemic inflammation," a condition of chronic inflammation associated with factors like smoking, ageing, and diabetes, and leading to an increased risk of neurodegenerative disorders. However, these same microglia can change their behavior and increase the blood-brain barrier permeability, thereby damaging it.

"It has long been known that microglia can become activated due to systemic inflammation," remarks Prof. Wake, "so we became interested in the question of whether microglia can regulate blood-brain barrier permeability." To explore this, Prof. Wake's team worked with mice that were genetically engineered to produce fluorescent proteins in the microglia. This "fluorescent labeling" allowed the investigators to use a technique called "two-photon imaging" to study the interactions of microglia and the blood-brain barrier in living mice. The investigators also injected the mice with fluorescent molecules that can pass through the blood-brain barrier only if the barrier is damaged enough to be sufficiently permeable. By observing the locations of these fluorescent molecules and the interactions of microglia, the research team could study microglial interactions with the blood-brain barrier and the permeability of the blood-brain barrier under various conditions.

A key point of interest was the systemic inflammation induced by injecting the mice with an inflammation-inducing substance. Such injections resulted in the movement of microglia to the blood vessels and increased the permeability of the blood-brain barrier within a few days. The microglia initially acted to protect the blood-brain barrier and limit increases in permeability, but as inflammation progressed, they reversed their behavior and attacked the components of the blood-brain barrier, thus increasing the barrier's permeability. The subsequent leakage of molecules into the brain had the potential to cause widespread inflammation in the brain and consequent damage to neurons (nerve cells).

These results clearly show that microglia play a dual role in regulating the permeability of the blood-brain barrier. In describing his team's future research objectives, Prof. Wake comments, "We aim to identify therapeutic targets on the microglia for regulating blood-brain barrier permeability, because drugs designed for such targets can be used to treat neurological and psychiatric diseases by curbing inflammatory responses in the brain."

As the scientists note in their study, uncontrolled inflammatory responses in the brain can cause a range of cognitive disorders and adverse neurological effects, and drugs that target microglia may help patients avoid such problems by preserving the integrity of the blood-brain barrier. More studies are required to understand more about the processes underlying the microglial behaviors observed in this study. Nevertheless, the study's results offer hope for the development of therapies that could "force" microglia to promote blood-brain barrier integrity and prevent microglia from transitioning to behaviors that damage the barrier.

Credit: 
Nagoya University

Stress-relief substrate helps OLED stretch two-dimensionally

image: Photographs of the patterned rigid part of the substrate on a finger joint, demonstrating two-dimensional stretchability, and images of stretchable OLEDs on a finger joint emitting green light.

Image: 
Professor Kyung Cheol Choi, KAIST

Highly functional and free-form displays are critical components to complete the technological prowess of wearable electronics, robotics, and human-machine interfaces.

A KAIST team created stretchable OLEDs (Organic Light-Emitting Diodes) that are compliant and maintain their performance under high-strain deformation. Their stress-relief substrates have a unique structure and utilize pillar arrays to reduce the stress on the active areas of devices when strain is applied.

Traditional intrinsically stretchable OLEDs have commercial limitations due to the low electrical conductivity of their electrodes. In addition, previous geometrically stretchable OLEDs, made by laminating thin-film devices onto elastic substrates, suffered from non-uniform pixel emission caused by buckles of differing peak sizes.

To solve these problems, a research team led by Professor Kyung Cheol Choi designed a stretchable substrate system with surface relief island structures that relieve the stress at the locations of bridges in the devices. Their stretchable OLED devices contained an elastic substrate structure comprising bonded elastic pillars and bridges. A patterned upper substrate with bridges makes the rigid substrate stretchable, while the pillars decentralize the stress on the device.

Although various applications using micropillar arrays have been reported, it has not yet been reported how elastic pillar arrays can affect substrates by relieving the stress applied to those substrates upon stretching. Compared to results using similar layouts with conventional free-standing, flat substrates or island structures, their results with elastic pillar arrays show relatively low stress levels at both the bridges and plates when stretching the devices. They achieved stretchable RGB (red, green, blue) OLEDs and had no difficulties with material selection as practical processes were conducted with stress-relief substrates.

Their stretchable OLEDs were mechanically stable and have two-dimensional stretchability, which is superior to electronics stretchable in only one direction, opening the way for practical applications like wearable electronics and health monitoring systems.

Professor Choi said, "Our substrate design will impart flexibility into electronics technology development, including semiconductor and circuit technologies. We look forward to this new stretchable OLED lowering the barrier to entering the stretchable display market."

Credit: 
The Korea Advanced Institute of Science and Technology (KAIST)

Asteroid impact enriches certain elements in seawater

Tsukuba City, Japan - Asteroid strikes upset the environment and provide clues via the elements they leave behind. Now, University of Tsukuba researchers have linked elements that are enriched in the Cretaceous-Paleogene (KPg) boundary clays from Stevns Klint, Denmark, to the impact of the asteroid that produced the Chicxulub crater at the Yucatán Peninsula, Mexico. This corresponds to one of the "Big Five" mass extinctions, which occurred at the KPg boundary at the end of the Cretaceous, 66 million years ago. The findings provide a better understanding of which processes lead to enrichment of these types of elements--an understanding that may be applied to other geological boundary events as well.

In a study published in the Geological Society of America Bulletin, the researchers analyzed the concentrations of certain elements within the KPg boundary clays--such as copper, silver, and lead--to determine which processes led to the element enrichment after the end-Cretaceous asteroid impact. Two enriched components were found in the boundary layer, each with distinctly different compositions of elements. One component was incorporated in pyrite (FeS2), whereas the other component was not related to pyrite.

"Since the enrichments of elements in these two components of the boundary clay were accompanied by enrichments of iridium," says first author Professor Teruyuki Maruoka, "both two components might have been induced by processes related to the asteroid impact."

Iron oxides/hydroxides acted as a carrier phase that supplied chalcophile elements (elements concentrated in sulfide minerals) to the KPg boundary clays on the sea floor. The vapor cloud of the asteroid impact produced iron oxides/hydroxides, which could have carried chalcophile elements in oceans and been the source of iron in the pyrite grains holding chalcophile elements.

"These could have been incorporated into the pyrite as impurities," explains Professor Maruoka. "Furthermore, both iron oxides/hydroxides and chalcophile elements could have been released to the environment from the rocks that were struck by the asteroid impact."

Additionally, organic matter in the oceans could have accumulated copper and silver. As such matter degraded on the sea floor, it could have released these elements, which then formed copper- or silver-enriched grains in the KPg boundary clays. This, in turn, may have led to the formation of discrete grains that differ from pyrite. Acid rain that occurred after the end-Cretaceous asteroid impact could have supplied elements such as copper, silver, and lead to the ocean, as these elements are typical constituents of acid-soluble sulfides and were enriched in the second chalcophile component not related to pyrite.

These findings will hopefully provide further avenues to increase our understanding of the events around the end-Cretaceous impact, and potentially other major boundary events.

Credit: 
University of Tsukuba

Clinical factors during pregnancy related to congenital cytomegalovirus infection

A group led by researchers from Kobe University has illuminated clinical factors that are related to the occurrence of congenital cytomegalovirus (CMV) infection in newborns. They revealed for the first time in the world that fever or cold-like symptoms (including cough, sore throat and runny nose) during pregnancy, and threatened miscarriage or threatened premature labor (*1) in the second trimester (14-27 gestational weeks) were associated with CMV infection in newborns.

The cross-institutional research group consisted of Kobe University Graduate School of Medicine's Professor YAMADA Hideto (Department of Obstetrics and Gynecology), Nihon University School of Medicine's Professor MORIOKA Ichiro (Department of Pediatrics and Child Health) and Director MINEMATSU Toshio (of Aisenkai Nichinan Hospital's Research Center for Disease Control), among others.

Recent research conducted by this team and others has indicated that the blood tests currently carried out on pregnant women might not be effective in determining the likelihood of congenital CMV infection in newborns. This research has illuminated clinical factors during pregnancy that could be used to predict the occurrence of congenital CMV infection without relying on blood tests. This would allow at-risk newborns to be comprehensively tested and treated immediately when necessary, hopefully reducing the number of children suffering from the aftereffects of congenital CMV infection.

The results of this research were published in the American scientific journal 'Clinical Infectious Diseases' on January 14, 2020.

Main Points:

Cytomegalovirus can infect fetuses, causing mental and physical development issues as well as hearing difficulties in newborns.

Recently, it has been reported that early treatment with antiviral drugs can ameliorate loss of hearing and delayed mental development. Therefore, the early detection of newborns with congenital CMV infection is highly important.

As it was previously thought that the majority of CMV infected newborns were born to mothers who were initially infected during the pregnancy, maternal blood tests (for example, serology tests that detect antibodies) have been used to screen (*2) for the virus. However, there are in fact more newborns with congenital CMV infection whose mothers had the virus prior to the pregnancy as opposed to mothers who contracted the virus during the pregnancy. The severe aftereffects in newborns remain the same in both cases. Serology tests may be unable to predict the occurrence of congenital CMV infection in mothers who were infected prior to the current pregnancy.

The research team looked for clinical factors during pregnancy that could be used to predict congenital CMV infection occurrence without relying on blood tests.

Research Background

Research has focused on CMV because it can cause severe aftereffects if it infects the fetus, including issues with mental and physical development and hearing loss. It is a big issue worldwide; for example, it is estimated that around 1000 babies are born with congenital CMV infections every year in Japan.

Presently there are no effective vaccines or treatments available, therefore screening all pregnant women for CMV has been discouraged. However, it has recently been revealed that prompt treatment of affected newborns with antiviral drugs can improve mental and hearing outcomes. Consequently, the importance of accurate detection of congenital CMV infections in infants prior to birth has been reasserted.

Until recently, it was thought that newborns with congenital CMV infection were born to mothers who initially acquired the infection during pregnancy (primary infection). For this reason, maternal serological screening, such as blood tests for CMV-specific immunoglobulin (Ig) M (*3), IgG antibodies (*4), and CMV IgG avidity tests (*5), was considered effective for detecting pregnancies with a high risk of congenital CMV infection.

However, in recent years, many researchers from around the world have reported that more infected babies were born to pregnant women with chronic CMV infection predating the affected pregnancy than to pregnant women with primary CMV infection. In addition, the severity of the symptoms in newborns was similar regardless of when the mother was infected. This research group also published results indicating this in 'Clinical Infectious Diseases' in 2017. These research studies illuminated the danger of congenital CMV infection being overlooked in some cases due to the ineffectiveness of serological screening.

Ideally, universal screening for CMV-DNA in urine samples of newborns using PCR (*6) would be able to detect all cases of congenital CMV infection; however, no country currently carries out CMV PCR on all newborns. As universal neonatal screening is not practical, it would be more realistic to identify babies at high risk of congenital CMV infection prior to birth and subsequently test their urine after birth.

This research study on pregnant women who gave birth at a primary maternity hospital sought to determine whether there were any clinical factors during pregnancy that were predictive of congenital CMV infection occurrence, without using serological screening.

Research Methodology

The cohort study was carried out on 4,125 low-risk pregnant women who received consultation and gave birth at Nadeshiko Ladies Hospital (a primary maternity hospital affiliated with Kobe University) between March 2009 and November 2019.

CMV PCR tests were conducted using urine samples from all of the infants born during the research period, nine (0.2%) of whom had congenital CMV infection. Among these nine, one newborn had hearing problems. In order to determine the factors that increase the likelihood of congenital CMV infection, the research group collected the following clinical data on all the pregnant women in the study:

Age

Gravidity and parity

BMI prior to pregnancy

Occupation

Smoking history

Fertility treatment history

Presence of fever or cold-like symptoms during pregnancy

History of maternal and obstetric complications, including threatened miscarriage, threatened premature labor, hypertensive disorders and gestational diabetes

Non-reassuring fetal status (*7) during labor

Whether delivery was performed by Caesarean section

Gestational age at delivery

In addition, data on the newborns, including birth weight, sex, and hearing test screening results, was also compiled.

The results from pregnancies affected by CMV and unaffected pregnancies were compared through statistical analysis of the aforementioned clinical factors. It was determined that the percentage of pregnant women who had a fever or cold-like symptoms during pregnancy was higher among those who gave birth to newborns with congenital CMV infection than among those who did not. In addition, threatened miscarriage or threatened premature labor in the second trimester was experienced in a higher percentage of congenital CMV infection cases. Furthermore, the research group statistically demonstrated that these clinical factors were associated with congenital CMV infection in newborns using logistic regression analysis.

Subsequently, the optimal predictive factors for congenital CMV infection occurrence were estimated. The presence of fever or cold-like symptoms during pregnancy yielded a sensitivity (*8) of 78% and a specificity (*9) of 85%. Threatened miscarriage or threatened premature labor in the second trimester had a sensitivity of 78% and a specificity of 61%. Furthermore, if a combination of these two symptoms was experienced, then the sensitivity was 100% with a specificity of 53%.
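Sensitivity and specificity here follow their standard definitions: sensitivity is the fraction of infected newborns whose mothers showed the factor, and specificity is the fraction of uninfected newborns whose mothers did not. A minimal sketch of the arithmetic follows; the 2x2 counts are reconstructed for illustration (9 infected newborns in a cohort of 4,125) and are not taken from the paper:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 contingency table."""
    sensitivity = tp / (tp + fn)   # flagged infected / all infected
    specificity = tn / (tn + fp)   # correctly unflagged / all uninfected
    return sensitivity, specificity

# Fever/cold-like symptoms during pregnancy: 7 of the 9 infected cases
# flagged, and about 85% of the 4,116 uninfected pregnancies not flagged.
sens, spec = sens_spec(tp=7, fn=2, tn=3499, fp=617)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")  # 78%, 85%
```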

Based on these results, it is recommended that CMV PCR tests are conducted on the newborn's urine if the mother experiences any of the above identified factors during pregnancy.

Conclusion

Blood tests have enabled primary infections of CMV during pregnancy to be diagnosed, yet have been unable to predict congenital CMV infection occurrence in pregnant women with chronic infections. This has resulted in numerous cases of congenital CMV infection being overlooked.

This research study was carried out to see if the occurrence of congenital CMV infection could be linked to clinical factors during pregnancy, to allow the likelihood of infection to be predicted without conducting blood tests. The results revealed that fever or cold-like symptoms during pregnancy, and threatened miscarriage or threatened premature labor in the second trimester were factors linked to congenital CMV infection.

If these symptoms are experienced during pregnancy, carrying out PCR tests for CMV in the newborn's urine would allow congenital infection to be diagnosed and treated sooner. It is hoped that this would reduce the number of babies suffering from the effects of congenital CMV.

Credit: 
Kobe University

SUWA: A hyperstable artificial protein that does not denature in high temperatures above 100°C

image: The Onbashira Matsuri is a festival where men climb on and slide down a mountainside on large timber logs, a holy tradition dating back 1,200 years. The lumber is then used to build one of the main shrines of Japan, the Suwa Taisha.

Image: 
Copyright ©2012-2014 Suwa Tourism Association

Proteins denature, or "cook," in heat, irreversibly changing their structure, like how an egg boils or a slab of sirloin turns to steak. This prevents proteins from being used in applications where they would need to withstand heat. Scientists have had high expectations for proteins to be used in nanotechnology and synthetic biology. A new hyperstable artificial protein constructed at Shinshu University in collaboration with Princeton University promises to make some of those aspirations possible with the successful development of SUWA (Super WA20), a nanobuilding block in the shape of a pillar. It is named in honor of the Onbashira Matsuri, also known as "the pillar" festival, where men climb on and slide down a mountainside on large timber logs, a holy tradition dating back 1,200 years. The lumber is then used to build one of the main shrines of Japan, the Suwa Taisha. The hope is that these SUWA nano-pillars will go on to build things just as central to society.

Summary of this research:

A de novo protein SUWA (Super WA20) is significantly more stable than its predecessor WA20.

SUWA does not denature at 100 °C, whereas WA20 denatures at 75 °C. The denaturation midpoint temperature of the SUWA protein was found to be 122 °C, making it an ultra-stable artificial protein.

The characteristic three-dimensional structure of the SUWA dimer, with its bisecting-U topology, was elucidated by X-ray crystallography.

Molecular dynamics simulation suggests that the stabilization of the center of the α-helices contributes to the structural stabilization and high heat resistance in SUWA.

Protein nanobuilding blocks based on SUWA, nanoscale pillars dubbed "nano-onbashira," are expected to be applied to nanotechnology and synthetic biology research in the near future.

Proteins and self-assembling protein complexes perform functions inside the living body like nanomachines, making them a key component in the complex phenomena of life. Artificial design of proteins with desired functions would have many applications in biopharmacy and could provide chemical reactions with low environmental impact. This nanotechnology works at the scale of molecules, one millionth of a millimeter, which makes such proteins difficult to work with, but they have many promising applications.

A research group led by Ryoichi Arai of Shinshu University and Michael H. Hecht of Princeton University solved the crystal structure of the de novo protein WA20 in 2012. The current research builds upon the WA20 structure to make Super WA20, aka SUWA, explored in a paper published in the February issue of ACS Synthetic Biology, an American Chemical Society journal.

Associate Professor Ryoichi Arai, of the Institute for Biomedical Sciences in Shinshu University's Interdisciplinary Cluster for Cutting Edge Research, and Naoya Kimura, a graduate of the Faculty of Textile Science and Technology of Shinshu University, were central figures behind the development of SUWA, a hyperstable artificial protein.

The naming of SUWA is derived from the location of the Onbashira Matsuri, which takes place in the Suwa region of Nagano Prefecture. Nagano is where Shinshu University holds its five campuses.

Credit: 
Shinshu University

High sugar diet may impair metabolic health & maternal care after pregnancy

Rats on a high sugar diet during pregnancy have altered levels of sex steroid hormones (e.g. progesterone) and dopamine in their brains, which may lead to behavioural changes that can affect care of offspring and motivation, as well as increasing the risk of diabetes and liver disease, according to a study published in the Journal of Endocrinology. Pregnant rats on a high sugar diet, equivalent to a typical Western diet, had increased levels of progesterone, a hormone important for healthy pregnancy and lactation, and changes in the dopamine system, a neurotransmitter system key to motivation, reward and mood. They also showed signs of prediabetes and fatty liver disease. The study findings suggest that sugar consumption during pregnancy may have serious, long-term risks for the mental health of both mothers and pups, beyond the established risks for diabetes and heart disease.

The World Health Organisation advises limiting added sugars in our diet to 5-10% of our daily calories. However, the Western diet typically contains 15-25%. It is well established that high sugar consumption increases the risk of diabetes, heart and liver disease. Chronic sugar consumption has also been reported to affect learning, memory and goal-directed behaviours in rats. The mechanisms underlying these brain effects are poorly understood, and the majority of studies have been performed on male rats. Since important hormone and metabolic changes occur during pregnancy and lactation, the current study aimed to investigate how high sugar intake may affect the health of rats after giving birth.

In this study, Dr Daniel Tobiansky and colleagues, working in the lab of Prof Kiran Soma at The University of British Columbia in Canada, investigated the effects of a high sugar diet on hormone levels and markers of metabolic function in female rats. The rats were maintained on a high sugar diet (equivalent to 25% of their total calorie intake) for 10 weeks prior to mating, as well as throughout pregnancy and lactation. Markers of metabolic health indicated that their glucose regulation was impaired and that they had fatty livers, although their body weight was not different from that of rats on a normal diet. Levels of progesterone were increased, whilst markers of dopamine function indicated that its activity in the brain was altered.

Dr Tobiansky states, "Beyond the established metabolic effects of high sugar intake, our data suggest that it may also have long-term harmful effects on mental health and maternal care. Progesterone is important for healthy pregnancy and lactation, whilst dopamine signalling is key for reward, learning and motivational behaviours. Taken together, these findings suggest that maternal behaviour, such as pup grooming and feeding could be negatively affected."

Dr Tobiansky and his team are planning to publish data from the offspring of these female rats. He remarks, "Our yet to be published data also indicate that maternal high sucrose diet can impact offspring behaviours and food preferences in profound ways."

Dr Tobiansky comments, "We suggest that the public and health professionals follow the recommendations from the World Health Organization and the American Heart Association. They both suggest limiting added sugars to no more than 5-10% of our daily calories."

Credit: 
Society for Endocrinology

Omega-3 fats do not protect against cancer

Omega-3 fats do not protect against cancer - according to new research from the University of East Anglia.

Increased consumption of omega 3 fats is widely promoted globally because of a common belief that it will protect against, or even reverse, diseases such as cancer, heart attacks and stroke.

But two systematic reviews published today find that omega 3 supplements may slightly reduce coronary heart disease mortality and events, but slightly increase risk of prostate cancer. Both beneficial and harmful effects are small.

If 1,000 people took omega 3 supplements for around four years, three people would avoid dying from heart disease, six people would avoid a coronary event (such as a heart attack) and three extra people would develop prostate cancer.
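Put differently, these absolute effects can be expressed as a "number needed to treat" (or to harm): how many people must take supplements for one outcome to change. A quick back-of-the-envelope sketch, using only the figures quoted above:

```python
def number_needed(events_per_1000):
    """People who must take omega 3 for ~4 years per one changed outcome."""
    return round(1000 / events_per_1000)

print(number_needed(3))  # ~333 takers per heart-disease death avoided
print(number_needed(6))  # ~167 takers per coronary event avoided
print(number_needed(3))  # ~333 takers per extra prostate cancer caused
```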

The sister systematic reviews are published today in the British Journal of Cancer and the Cochrane Database of Systematic Reviews.

Omega 3 is a type of fat. Small amounts are essential for good health and can be found in the foods that we eat, including nuts, seeds and fatty fish such as salmon.

Omega 3 fats are also readily available as over-the-counter supplements and they are widely bought and used.

The research team looked at 47 trials involving adults who didn't have cancer, who were at increased risk of cancer, or had a previous cancer diagnosis, and 86 trials with evidence on cardiovascular events or deaths.

More than 100,000 participants were randomised to consume more long-chain omega-3 fats (fish oils), or maintain their usual intake, for at least a year for each of the reviews.

They studied the number of people who died, received a new diagnosis of cancer, heart attack or stroke and/or died of any of the diseases.

Lead author Dr Lee Hooper, from UEA's Norwich Medical School, said: "Our previous research has shown that long-chain omega 3 supplements, including fish oils, do not protect against conditions such as anxiety, depression, stroke, diabetes or death.

"These large systematic reviews included information from many thousands of people over long periods. This large amount of information has clarified that if we take omega 3 supplements for several years we may very slightly reduce our risk of heart disease, but balance this with very slightly increasing our risk of some cancers. The overall effects on our health are minimal.

"The evidence on omega 3 mostly comes from trials of fish oil supplements, so health effects of oily fish, a rich source of long-chain omega 3, are unclear. Oily fish is a very nutritious food as part of a balanced diet, rich in protein and energy as well as important micronutrients such as selenium, iodine, vitamin D and calcium - it is much more than an omega 3 source.

"But we found that there is no demonstrable value in people taking omega 3 oil supplements for the prevention or treatment of cancer. In fact, we found that they may very slightly increase cancer risk, particularly for prostate cancer.

"However this risk is offset by a small protective effect on cardiovascular disease.

"Considering the environmental concerns about industrial fishing and the impact it is having on fish stocks and plastic pollution in the oceans, it seems unhelpful to continue to take fish oil tablets that give little or no benefit."

Credit: 
University of East Anglia

Study estimates mental health impact of welfare reform, Universal Credit, in Great Britain

The 2013 Universal Credit welfare reform appears to have led to an increase in the prevalence of psychological distress among unemployed recipients, according to a nationally representative study following more than 52,000 working-age individuals from England, Wales, and Scotland over nine years (2009-2018), published as part of an issue of The Lancet Public Health journal on income and health.

Specifically, the analysis suggests that the introduction of Universal Credit was associated with a 6.6 percentage point increase in the prevalence of psychological distress among recipients, equivalent to an estimated 63,674 unemployed recipients experiencing clinically significant levels of psychological distress between April 2013 and December 2018, of whom over a third (21,760) may have become clinically depressed [1].

Universal Credit was launched in April 2013 in a bid to simplify the benefits system and help more people into work, but has been the subject of controversy from the start, with reports of long delays in payments and increased use of sanctions. Replacing six existing benefits, this major welfare reform has been rolled out in stages, and by the end of 2018, 1.6 million people were receiving Universal Credit in England, Scotland and Wales, including 73% of unemployed people (990,000) [2].

"Our study supports growing calls for Universal Credit to be fundamentally modified to reduce these mental health harms", says Dr Sophie Wickham from the University of Liverpool, UK, who led the research. "So far, the government has only looked at the impact of Universal Credit on the labour market, and there are no plans to assess its effect on health and wellbeing. With a further 5.5 million recipients of existing benefits expected to claim Universal Credit over the next few years, this expanding group may exacerbate pressures on already stretched mental health and social care services." [3]

The study is the first to quantify the possible impact of Universal Credit on mental health, and underscores the importance of evaluating the potential health effects of changes to welfare systems worldwide. However, the authors caution that Universal Credit has been implemented within broader welfare changes that may have contributed to the mental health toll.

Previous qualitative studies suggested that the Universal Credit claims process and sanction threats may exacerbate long-term health problems and negatively affect mental health. Doctors have also raised concerns that the reform is harming the mental health of claimants and increasing the workload of General Practitioners.

In this study, researchers analysed data from 197,111 interviews with 52,187 people of working age (16-64 years) conducted between 2009 and 2018 as part of the Understanding Society UK Longitudinal Household Panel Study--a nationally representative survey. People out of work with a disability were excluded because they claim disability benefits, rather than unemployment benefit, and were not enrolled onto Universal Credit at the time of the study. Mental health was assessed by trained interviewers using the General Health Questionnaire, with a higher score indicating psychological distress. Participants were also asked about their employment status, area of residence, and demographic information.

Modelling was used to compare changes in psychological distress between people who were eligible for Universal Credit and those who were not unemployed (eg, employed, self-employed, retired) before and after the reform was rolled out in their local authority area.
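This before-and-after comparison across eligible and ineligible groups is a difference-in-differences design. As a hedged sketch only (the column names, toy data, and bare OLS specification are assumptions; the published model also adjusts for age, sex, education and other covariates), the core estimate can be read off an interaction term:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Assumed columns: ghq = psychological distress score, eligible = 1 if
# unemployed (Universal Credit-eligible), post = 1 if interviewed after
# the reform was rolled out in the respondent's local authority.
df = pd.DataFrame({
    "ghq":      [2, 2, 3, 2, 3, 2, 3, 5],
    "eligible": [0, 0, 0, 0, 1, 1, 1, 1],
    "post":     [0, 1, 0, 1, 0, 1, 0, 1],
})

# The coefficient on eligible:post is the difference-in-differences
# estimate: the extra change in distress among eligible people.
model = smf.ols("ghq ~ eligible * post", data=df).fit()
print(model.params["eligible:post"])
```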

The analysis suggested that the introduction of Universal Credit was associated with a 6.6 percentage point increase in the prevalence of psychological distress among recipients, compared to those who were employed or retired, after adjusting for a range of influential factors including age, sex, education level, and marital status.

Interestingly, the researchers found that all socioeconomic groups were affected similarly. However, they speculate that the mental health of people with low levels of education will be more affected by the reform because they are more likely to be unemployed, potentially widening health inequalities.

Further analyses suggested that Universal Credit did not appear to affect the physical health of unemployed recipients. They also found no evidence that the welfare reform was associated with more people entering employment.

"Given the mounting evidence of substantial mental health harms related to Universal Credit, it is crucial that the government conducts a robust health impacts assessment of all welfare reforms, including Universal Credit", says co-author Professor Dame Margaret Whitehead from the University of Liverpool, UK. "With nearly two-thirds of households in the UK receiving some kind of welfare benefit, any changes to the welfare system--even those with small individual effects--could have major implications for the nation's health." [3]

According to senior author Professor Ben Barr from the University of Liverpool, UK, "Our findings have international importance for health professionals who are responding to the rising mental health needs of populations and for policymakers weighing up the costs and benefits of changes to welfare policies. Other countries considering similar changes to their welfare systems, such as introducing a digitalised service, paying monthly in arrears rather than prospectively each week, and stricter sanctions, should take into account the potential serious consequences for population mental health." [3]

The authors say that future studies should assess other groups of people who might be affected by Universal Credit (eg, employed people who currently receive working tax credits or those receiving child tax credits); and try to disentangle which elements of this reform might be driving the negative effects on mental health, such as the longer wait for payments (there is a five week delay in payment following a new claim) and stricter sanctions.

The authors acknowledge that their findings show observational associations rather than cause and effect. They note several data and methodological issues that might affect the accuracy of the estimates including that unemployment and mental health were based on self-reported data; and that the negative impact of the reform is likely to be underestimated because not all unemployed people have moved on to Universal Credit, and some participants in the comparison group would have become eligible and moved on to Universal Credit during the study period (eg, employed people on tax credits in some areas).

Commenting on the implications of the findings in a linked Comment, Dr Peter Craig (who was not involved in the study) from the University of Glasgow, UK says that the findings have important policy implications: "Universal Credit may be unique to the UK, but many of its features are common to welfare reforms in other countries, so Wickham and colleagues' findings have much wider implications for how welfare changes are evaluated; an exclusive focus on employment-related outcomes is clearly no longer adequate. Policy makers must take seriously the health effects of reforms of the kind associated with Universal Credit, which newly expose a large swathe of the population to an unfamiliar regime of sanctions and conditionality and to a claim process that for many is worryingly opaque and uncertain."

In a modelling study published at the same time in The Lancet Public Health journal, researchers evaluated how effective certain taxation and welfare policies could be at reducing health inequalities, by comparing how 12 policies might affect a representative sample of Scottish households across various levels of social deprivation.

"Our estimates suggest that the most effective policies for improving health and reducing health inequalities in Scotland would be those that disproportionately increase incomes in the most deprived areas", says Dr Elizabeth Richardson from the Public Health Observatory, NHS Scotland, UK, who led the research. "Policies that increase incomes are likely to result in the biggest reductions in mortality rates and advance the Government's aim of making Scotland a healthier place." [3]

In a third paper published in that same issue, researchers reviewed the existing evidence on the public health impact of income or tax interventions similar to the concept of basic income, in which individuals or families receive a regular, largely unconditional sum of money from the government.

"The evidence on the health impact is mixed, with some strong positive effects on birthweight, infant obesity, nutrition, and mental health, but we found small and inconsistent effects on health service use, work missed due to illness, and chronic health impairment measures", says Dr Marcia Gibson from the University of Glasgow who led the research. "While reductions in employment were mostly small for men, with larger reductions for mothers of young children, more evidence is needed to understand the wider health, economic, and social implications of basic income." [3]

Credit: 
The Lancet

Data centers use less energy than you think

image: Filled with computing and networking equipment, data centers are central locations that collect, store and process data.

Image: 
David Lohner

EVANSTON, Ill. -- If the world is using more and more data, then it must be using more and more energy, right? Not so, says a comprehensive new analysis.

Researchers at Northwestern University, Lawrence Berkeley National Laboratory and Koomey Analytics have developed the most detailed model to date of global data center energy use. With this model, the researchers found that, although demand for data has increased rapidly, massive efficiency gains by data centers have kept energy use roughly flat over the past decade.

This detailed, comprehensive model provides a more nuanced view of data center energy use and its drivers, enabling the researchers to make strategic policy recommendations for better managing this energy use in the future.

"While the historical efficiency progress made by data centers is remarkable, our findings do not mean that the IT industry and policymakers can rest on their laurels," said Eric Masanet, who led the study. "We think there is enough remaining efficiency potential to last several more years. But ever-growing demand for data means that everyone -- including policy makers, data center operators, equipment manufacturers and data consumers -- must intensify efforts to avoid a possible sharp rise in energy use later this decade."

The paper will be published on Feb. 28 in the journal Science.

Masanet is an adjunct professor in Northwestern's McCormick School of Engineering and the Mellichamp Chair in Sustainability Science for Emerging Technologies at the University of California, Santa Barbara. He conducted the research with Ph.D. student and coauthor Nuoa Lei at Northwestern.

Filled with computing and networking equipment, data centers are central locations that collect, store and process data. As the world relies more and more on data-intensive technologies, the energy use of data centers is a growing concern.

"Considering that data centers are energy-intensive enterprises in a rapidly evolving industry, we do need to analyze them rigorously," said study coauthor Arman Shehabi, a research scientist at Lawrence Berkeley National Laboratory. "Less detailed analyses have predicted rapid growth in data center energy use, but without fully considering the historical efficiency progress made by the industry. When we include that missing piece, a different picture of our digital lifestyles emerges."

To paint that more complete picture, the researchers integrated new data from numerous sources, including information on data center equipment stocks, efficiency trends, and market structure. The resulting model enables a detailed analysis of the energy used by data center equipment (such as servers, storage devices and cooling systems), by type of data center (including cloud and hyperscale centers) and by world region.
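At its core, a bottom-up model of this kind multiplies equipment stocks by per-unit power draw and operating hours, then applies a power usage effectiveness (PUE) factor for cooling and other facility overhead. The sketch below shows the shape of that calculation only; every number in it is an assumption for illustration, not an input from the study:

```python
HOURS_PER_YEAR = 8760

# (units in service, average watts per unit) - illustrative assumptions
equipment = {
    "servers": (18e6, 250),
    "storage": (50e6, 10),
    "network": (6e6, 100),
}

it_wh = sum(n * watts for n, watts in equipment.values()) * HOURS_PER_YEAR
pue = 1.6                       # assumed average facility overhead factor
total_twh = it_wh * pue / 1e12  # watt-hours -> terawatt-hours
print(f"estimated facility energy use: {total_twh:.0f} TWh/year")
```

Efficiency gains enter such a model as falling watts per unit (or more work per watt) and falling PUE, which is how rapidly rising demand can coexist with roughly flat energy use.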

The researchers concluded that recent efficiency gains made by data centers have likely been far greater than those observed in other major sectors of the global economy.

"Lack of data has hampered our understanding of global data center energy use trends for many years," said coauthor Jonathan Koomey of Koomey Analytics. "Such knowledge gaps make business and policy planning incredibly difficult."

Addressing these knowledge gaps was a major motivation for the research team's work. "We wanted to give the data center industry, policy makers and the public a more accurate view of data center energy use," said Masanet. "But the reality is that more efforts are needed to better monitor energy use moving forward, which is why we have made our model and datasets publicly available."

By releasing the model, the team hopes to inspire more research into the topic. The researchers also translated their findings into three specific types of policies that can help mitigate future growth in energy use, urging policy makers to act now:

Extend the life of current efficiency trends by strengthening IT energy standards such as ENERGY STAR, providing financial incentives and disseminating best energy efficiency practices;

Increase research and development investments in next generation computing, storage and heat removal technologies to mitigate future energy use, while incentivizing renewable energy procurement to mitigate carbon emissions in parallel;

Invest in data collection, modeling and monitoring activities to eliminate blind spots and enable more robust data center energy policy decisions.

Credit: 
Northwestern University

How the brain separates words from song

The perception of speech and music - two of the most uniquely human uses of sound - is enabled by specialized neural systems in different brain hemispheres, adapted to respond differently to specific features in the acoustic structure of song, a new study reports. Though it's been known for decades that the two hemispheres of our brain respond to speech and music differently, this study used a unique approach to reveal why this specialization exists, showing it depends on the type of acoustical information in the stimulus.

Music and speech are often inextricably entwined, and the ability of humans to recognize and separate words from melodies in a single continuous soundwave represents a significant cognitive challenge. It's thought that the perception of speech strongly relies on the ability to process short-lived temporal modulations, and that of melody on the detailed spectral composition of sounds, such as fluctuations in frequency. Previous studies have proposed a left- and right-hemisphere neural specialization for handling speech and music information, respectively. However, whether this brain asymmetry stems from the different acoustical cues of speech and music or from domain-specific neural networks remains unclear.

By combining ten original sentences with ten original melodies, Philippe Albouy and colleagues created a collection of 100 unique a cappella songs, which contained acoustic information in both the temporal (speech) and spectral (melodic) domains. The nature of the recordings allowed the authors to manipulate the songs and selectively degrade each in either the temporal or spectral domain. Albouy et al. found that degradation of temporal information impaired speech recognition but not melody recognition. On the other hand, perception of melody decreased only with spectral degradation of the song. Concurrent fMRI brain scanning revealed asymmetrical neural activity: decoding of speech content occurred primarily in the left auditory cortex, while melodic content was handled primarily in the right. In a related Perspective, Daniela Sammler discusses the study's findings in more detail.
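Conceptually, the stimulus manipulation removes modulation detail along one spectrogram axis at a time: blurring across time destroys the fast temporal cues speech relies on, while blurring across frequency destroys the fine spectral cues that carry melody. The snippet below is a crude numpy/scipy approximation of that idea, not the authors' actual degradation procedure:

```python
import numpy as np
from scipy.signal import stft, istft
from scipy.ndimage import uniform_filter1d

def degrade(audio, fs, axis):
    """Blur the STFT magnitude along axis=1 (time -> degrades speech cues)
    or axis=0 (frequency -> degrades melodic cues), then resynthesize."""
    _, _, Z = stft(audio, fs=fs, nperseg=1024)
    mag, phase = np.abs(Z), np.angle(Z)
    mag = uniform_filter1d(mag, size=9, axis=axis)  # smooth out modulations
    _, y = istft(mag * np.exp(1j * phase), fs=fs, nperseg=1024)
    return y

# Example with a synthetic 2-second, 440 Hz tone sampled at 16 kHz
fs = 16000
song = np.sin(2 * np.pi * 440 * np.arange(2 * fs) / fs)
temporally_degraded = degrade(song, fs, axis=1)
spectrally_degraded = degrade(song, fs, axis=0)
```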

Credit: 
American Association for the Advancement of Science (AAAS)

A better way to detect underground water leaks

You can delay irrigating the lawn or washing the car all you want, but to really make a big dent in water savings we need to stop water waste long before the precious resource ever reaches our taps.

An estimated 20 to 50 percent of water is lost to leaks in North America's supply system - a major issue as utilities contend with how to sustain a growing population in an era of water scarcity.

"People talk about reducing the time you take showers, but if you think about 50 percent of water flowing through the system being lost, it's another magnitude," said study author Daniel Tartakovsky, a professor of energy resources engineering in Stanford's School of Earth, Energy & Environmental Sciences (Stanford Earth).

In a move that could potentially save money and billions of gallons of water, Tartakovsky, along with Abdulrahman Alawadhi from the University of California, San Diego, has proposed a new way to swiftly and accurately interpret data from pressure sensors commonly used to detect leaks.

In addition to water utilities, Tartakovsky said the method could also be applied to other industries that use pressure sensors for leak detection, such as in oil and natural gas transmission networks that run under the sea and pose additional environmental hazards.

The research was published online Feb. 12 in the journal Water Resources Research.

Water hammer

The new method targets water leaks in transmission mains, which are typically routed out of sight underground. Water transmission networks in North America and much of Europe are fitted with sensors that measure pressure to gauge flow.

The researchers built upon a technique known as the water hammer test - the industry standard for predicting the location of leaks. The test involves suddenly shutting off flow through a pipe and using sensors to gather data about how the resulting shock wave, or "water hammer," propagates. Tartakovsky and Alawadhi propose a new way to assimilate this data into a mathematical model to narrow down the location of a leak.
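The physics behind reflection-timing localization is straightforward even though the authors' data-assimilation model is not: the sudden shut-off launches a pressure wave, a leak partially reflects it, and the reflection's round-trip delay fixes the leak's distance from the sensor. A back-of-the-envelope sketch of that textbook relation (the wave speed and delay values below are illustrative assumptions, and this is not the paper's method):

```python
def leak_distance(wave_speed_m_s, round_trip_delay_s):
    """Distance from the valve/sensor to the leak, from the reflected
    pressure wave's round-trip travel time (wave goes there and back)."""
    return wave_speed_m_s * round_trip_delay_s / 2

# e.g. a ~1,000 m/s pressure wave whose reflection arrives 0.12 s later
print(leak_distance(1000.0, 0.12))  # -> 60.0 metres along the pipe
```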

The current method for detecting leaks is computationally expensive; to reduce the cost, analysts need to make a lot of simplifying assumptions, according to Tartakovsky.

"We proposed a method that is fast enough that you don't need to make these assumptions, and so it's more accurate - you could do it in real time on a laptop," Tartakovsky said. "It's something utilities can use with existing computational resources and the models they already have."

By improving speed and accuracy, the researchers' method saves money, both in terms of time and labor and the cost of wasted water. For example, if you wanted to find a leak in a football field-length pipe, you could dig up the whole field until you hit wet soil, or you could use the new method to constrain the location of the leak to a 10-meter section of the pipe.

"In cities, it's harder because pipes are under buildings and you have to break asphalt and things like that, so the more accurate your prediction of the location, the better," Tartakovsky said.

Cities have the most potential for major water leaks - and the older the urban areas, the bigger the problems, with their complex networks of aging pipes.

"For operators who routinely use water hammer tests, the cost of this is zero - this is just a better way of interpreting these tests," Tartakovsky said. "We are not selling it or patenting it, so people could just use it and see whether they get better predictions."

Credit: 
Stanford's School of Earth, Energy & Environmental Sciences