Tech

Genetic screen offers new drug targets for Huntington's disease

CAMBRIDGE, MA -- Using a type of genetic screen that had previously been impossible in the mammalian brain, MIT neuroscientists have identified hundreds of genes that are necessary for neuron survival. They also used the same approach to identify genes that protect against the toxic effects of a mutant protein that causes Huntington's disease.

These efforts yielded at least one promising drug target for Huntington's: a family of genes that may normally help cells to break down the mutated huntingtin protein before it can aggregate and form the clumps seen in the brains of Huntington's patients.

"These genes had never been linked to Huntington's disease processes before. When we saw them, that was very exciting because we found not only one gene, but actually several of the same family, and also we saw them have an effect across two models of Huntington's disease," says Myriam Heiman, an associate professor of neuroscience in the Department of Brain and Cognitive Sciences and the senior author of the study.

The researchers' new screening technique, which allowed them to assess all of the roughly 22,000 genes found in the mouse brain, could also be applied to other neurological disorders, including Alzheimer's and Parkinson's diseases, says Heiman, who is also a member of MIT's Picower Institute for Learning and Memory and the Broad Institute of MIT and Harvard.

Broad Institute postdoc Mary Wertz is the lead author of the paper, which appears today in Neuron.

Genome-wide screen

For many decades, biologists have been performing screens in which they systematically knock out individual genes in model organisms such as mice, fruit flies, and the worm C. elegans, then observe the effects on cell survival. However, such screens have never been done in the mouse brain. One major reason for this is that delivering the molecular machinery required for these genetic manipulations is more difficult in the brain than elsewhere in the body.

"These unbiased genetic screens are very powerful, but the technical difficulty of doing it in the central nervous system at a genome-wide scale has never been overcome," Heiman says.

In recent years, researchers at the Broad Institute have developed libraries of genetic material that can be used to turn off the expression of every gene found in the mouse genome. One of these libraries is based on short hairpin RNA (shRNA), which interferes with the messenger RNA that carries a particular gene's information. Another makes use of CRISPR, a technique that can disrupt or delete specific genes in a cell. These libraries are delivered by viruses, each of which carries one element that targets a single gene.

The libraries were designed so that each of the approximately 22,000 mouse genes is targeted by four or five shRNAs or CRISPR components, so 80,000 to 100,000 viruses need to make it into the brain to ensure that all genes are hit at least once. The MIT team came up with a way to make their solution of viruses highly concentrated, and to inject them directly into the striatum of the brain. Using this approach, they were able to deliver one of the shRNA or CRISPR elements to about 25 percent of all of the cells in the striatum.

The researchers focused on the striatum, which is involved in regulating motor control, cognition, and emotion, because it is the brain region most affected by Huntington's disease. It is also involved in Parkinson's disease, as well as autism and drug addiction.

About seven months after the injection, the researchers sequenced all of the genomic DNA in the targeted striatal neurons. Their approach is based on the idea that if a particular gene is necessary for neurons' survival, any cell with that gene knocked out will die, and the shRNAs or CRISPR elements targeting it will then turn up at lower rates in the surviving population of cells.
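In code, the logic of that readout reduces to simple counting. The sketch below is a toy in Python on simulated read counts (the study's actual analysis pipeline is not described in this article): each guide is scored by its log2 fold change between the injected library and the surviving cells, and the scores are averaged over the four or five guides per gene so that strongly depleted genes emerge as candidate survival genes.

```python
# Toy depletion analysis for a pooled knockout screen. All counts are
# simulated; gene and guide numbers are scaled down from the real screen
# (~22,000 genes, 4-5 guides each).
import numpy as np

rng = np.random.default_rng(0)
n_genes, guides_per_gene = 100, 5

# Read counts per guide in the injected library and in surviving neurons.
library_counts = rng.poisson(500, size=(n_genes, guides_per_gene)) + 1
surviving_counts = library_counts.copy()
essential = rng.choice(n_genes, size=10, replace=False)
surviving_counts[essential] //= 8   # guides hitting essential genes drop out

# Normalize for sequencing depth, then score depletion per guide.
lib_freq = library_counts / library_counts.sum()
surv_freq = surviving_counts / surviving_counts.sum()
log2_fc = np.log2(surv_freq / lib_freq)

# Average over each gene's guides; the most negative scores flag genes
# that neurons appear to need to survive.
gene_score = log2_fc.mean(axis=1)
print("most depleted genes:", sorted(np.argsort(gene_score)[:10].tolist()))
```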

The study turned up many genes that are necessary for any cell to survive, such as enzymes involved in cell metabolism or copying DNA into RNA. The findings also revealed genes that had been identified in previous studies of fruit flies and worms as being important for neuron function, such as genes involved in the function of synapses (structures that allow neurons to communicate with each other).

However, a novel finding of this study was the identification of genes that hadn't been linked to neuron survival before, Heiman says. Many of those were genes that code for metabolic proteins that are essential in cells that burn a lot of energy.

"What we interpret this to mean is that neurons in the mammalian brain are much more metabolically active and have a much higher dependency on these processes than for example, a neuron in C. elegans," Heiman says.

Promising targets

The researchers then performed the same type of screen on two different mouse models of Huntington's disease. These mouse models express the mutated form of the huntingtin protein, which forms clumps in the brains of Huntington's patients. In this case, the researchers compared the results from the screen of the Huntington's mice to those from normal mice. If any of the shRNA or CRISPR elements were found less frequently in the Huntington's mice, that would suggest that those elements targeted genes that help make cells more resistant to the toxic effects of the huntingtin protein, Heiman says.

One promising drug target that emerged from this screen is the Nme gene family, which has previously been linked to cancer metastasis, but not Huntington's disease. The MIT team found that one of these genes, Nme1, regulates the expression of other genes that are involved in the proper disposal of proteins. The researchers hypothesize that without Nme1, these genes don't get turned on as highly, allowing huntingtin to accumulate in the brain. They also showed that when Nme1 is overexpressed in the mouse models of Huntington's, the Huntington's symptoms appear to improve.

Although this gene hasn't been linked to Huntington's before, there have already been some efforts to develop compounds that target it, for use in treating cancer, Heiman says.

"This is very exciting to us because it's theoretically a druggable compound," she says. "If we can increase its activity with a small molecule, perhaps we can replicate the effect of genetic overexpression."

Credit: 
Massachusetts Institute of Technology

Patterns in the brain shed new light on how we function

The patterns created by neurons in the brain can be used to shine a light on how the brain functions, and take us a step closer to creating intelligent robots, scientists claim.

Publishing their research today in PLoS Computational Biology, the international team from the universities of Newcastle and Zurich, ETH Zurich and the California Institute of Technology show that the way neurons are structured and the patterns they make can be used to explain how they behave and function.

Modelling the neurons in the visual cortex - those responsible for sight - the researchers showed that the seemingly random patterns could be explained by simple developmental rules.

In turn, these recurring patterns can be used to better understand how neurons organise their connections to communicate with one another.

Co-author of the study Dr Roman Bauer, a Research Fellow in the School of Computing, explains:

"At first glance, the network of neurons in the human brain appears so tangled and complex you would think it impossible to start to understand how they all connect together.

"But what we have shown is that certain neurons make particular patterns which follow some quite simple rules.

"If we can spot these patterns in the brain then we can use them to predict how those particular neurons are behaving."

Focussing their work on the connections between the thalamus and cortical regions of the brain, Dr Bauer says that if we can understand how animals sense visual stimuli and recognise objects, it could revolutionise current technology.

"When we change the orientation of an object, the brain still recognises it as the same object and easily adjusts to the changing situation. But current AI has a real problem with that.

"If we can simplify the brain to a few key patterns that can be translated by technology then it might be possible to create artificial intelligence that truly mimics the human brain.

"More importantly, by understanding what healthy network looks like, it will allow us to spot changes or abnormalities and inform new treatments."

Credit: 
Newcastle University

The first roadmap for ovarian aging

image: From left: Concepcion Rodriguez Esteban and Juan Carlos Izpisua Belmonte.

Image: 
Salk Institute

LA JOLLA--(January 30, 2020) Due to the modern tendency to postpone childbirth until later in life, a growing number of women are experiencing issues with infertility. Infertility likely stems from age-related decline of the ovaries, but the molecular mechanisms that lead to this decline have been unclear. Now, scientists from the U.S. and China have discovered, in unprecedented detail, how ovaries age in non-human primates. The findings, published in Cell on January 30, 2020, reveal several genes that could be used as biomarkers and point to therapeutic targets for diagnosing and treating female infertility and age-associated ovarian diseases, such as ovarian cancer, in humans.

"This is the first in-depth analysis of ovarian aging at a single-cell resolution in a non-human primate model," says Juan Carlos Izpisua Belmonte, one of the co-corresponding authors, professor in Salk's Gene Expression Laboratory and holder of the Roger Guillemin Chair. "We found that oxidative stress, the cellular stress that damages cells, is a key player in ovarian aging. This discovery provides valuable insight into the mechanisms by which ovaries age and eventually become infertile."

The ovary is a complex reproductive organ in which an ovarian cell, called an oocyte, undergoes meiosis to become an egg. Current research suggests that women are born with a set number of oocytes that start to become less functional once women turn 35, leading to infertility. A better understanding of the ovarian environment as well as the mechanisms of healthy aging could inform new therapies for women with fertility issues.

"Our goal was to analyze each ovarian cell type along with patterns in gene expression in order to better understand exactly how ovaries age," says Jing Qu, co-corresponding author, professor at the Chinese Academy of Sciences and former Salk research associate. "This systematic approach provides a better understanding of the mechanisms of healthy ovarian aging."

The scientists compared 2,601 ovarian cells from young and old non-human primates, and identified gene activity patterns for every type of primate ovarian cell, including oocytes and granulosa cells, which surround the oocytes as they develop. Similar to previous studies in rodents, the scientists observed changes in gene function related to cellular stress and cell division across the non-human primates. As the oocytes and granulosa cells aged, some of the genes that fight cellular stress became less active, which led to damage and impaired function.

The scientists then compared the primate data with granulosa cells from healthy women ranging in age from 21 to 46 years. They observed age-associated damage from cellular stress as well as cell death in the women's cells. Two key antioxidant genes (IDH1 and NDUFB10) showed decreased function, as seen in the non-human primate cells. To better understand the connection between ovarian aging and the antioxidant genes, the scientists tested what happened to the human cells when the antioxidant genes were made non-functional. They found that without IDH1 or NDUFB10, the cells appeared old and similar to the old non-human primate cells.

The results suggest that IDH1 and NDUFB10 play a critical role in protecting both human and non-human primate ovarian cells from cellular stress during aging. These genes represent promising biomarkers or therapeutic targets for the diagnosis and treatment of age-related decline of the ovaries.

"This study provides a comprehensive understanding of the specific mechanisms of primate ovarian aging at single-cell resolution," says Guang-Hui Liu, co-corresponding author, professor at the Chinese Academy of Sciences and former Salk research associate. "Our results will hopefully lead to the development of new tools to aid in the rejuvenation of aged ovarian cells."

"Our research is enabling the identification of new biomarkers for the diagnosis and treatment of female infertility as well as aging-associated human ovarian disorders," says Concepcion Rodriguez Esteban, an author on the paper and senior staff researcher in the Izpisua Belmonte lab. "These genes could possibly be targeted for the development of therapies to assist with fertility preservation."

Credit: 
Salk Institute

Mechanism for improvement of photoluminescence intensity in phosphor material

image: The crystal structure has a non-integer periodicity of 4.11 times along the b-direction, with a periodic structure in which T, S, and U repeat due to differences in the MO4 gradient.

Image: 
COPYRIGHT (C) TOYOHASHI UNIVERSITY OF TECHNOLOGY. ALL RIGHTS RESERVED.

Overview

A research team from Toyohashi University of Technology, Nagoya Institute of Technology, and the National Institute for Materials Science (NIMS) has clarified how the crystal structure of a red phosphor material, obtained by adding P2O5 and Eu2O3 to a silicate (Ca2SiO4)-based material, changes with heat treatment temperature, and how these structural changes alter the photoluminescence intensity.

Phosphor materials are widely used in everyday items such as vehicles and projectors in addition to LED lighting, and many researchers are pursuing various approaches in search of brighter and more efficient phosphors. Knowledge of the photoluminescence intensity and crystal structure is important for designing brighter phosphor materials. In this study, the researchers succeeded in analyzing, at the atomic level, how the crystal structure changes with heat treatment and with the addition of P5+ and Eu3+ ions, and were able to clarify the relationships between these factors and the photoluminescence intensity.

Details

White LEDs have undergone significant development over the past two decades; the market has continued to expand, and LEDs now lead the lighting industry. Phosphor materials are used in a wide range of applications, such as backlighting for monitors and automotive applications, in addition to lighting. Going forward, many researchers will compete to develop new materials that further diversify applications and realize energy savings, low cost, high color rendering, and long life, and this area has attracted a great deal of attention at academic conferences. Clarifying the characteristics of phosphor materials and the mechanisms behind them is therefore an important contribution to the development of new materials. However, while most phosphor development aims at brighter and more efficient materials, few papers have examined the underlying mechanisms in detail.

With this background, the research team synthesized a red phosphor by adding P5+ and, as an activator, Eu3+ to Ca2SiO4 (silicate) while changing the heat treatment temperature from 1200 to 1500 degrees, because the crystal structure of silicate can easily be changed by heat. Here, the activator is an element (ion) that emits various emission colors from blue to red when added to the crystal.

The research found that the photoluminescence intensity changes depending on the heat treatment temperature and is closely related to changes in the crystal structure. The team noticed that the crystal structure of the phosphor heat-treated at 1500 degrees changed to an incommensurate (IC) structure, which is rare for ceramic materials. Normal crystal structures repeat with a period that is an integer multiple of the unit cell, but the IC phase has a non-integer modulation, and its formation lowered the photoluminescence intensity. By making full use of X-ray diffraction and computational science, the team succeeded in analyzing the crystal structure in detail at the atomic level. The analysis showed that the material has a modulation of 4.110 times along the b-direction and that the structure contains three types of SiO4 tetrahedron gradient (T, U, S), with two further types (T", S") appearing over the long period; on this basis, the team established that the material constitutes an IC phase. Such a complex IC phase is thought to arise because P occupies part of the SiO4 tetrahedra, Eu occupies part of the Ca sites, and the structure is locked in by rapid cooling from the high treatment temperature of 1500 degrees. Based on this precise analysis of the crystal structure, the researchers can now answer questions such as: Which crystal sites should host the activator in order to synthesize a material with brighter photoluminescence? Which crystal structure is better? The team believes this knowledge can be used for new material design.

Development background

Prof. Hiromi Nakano and her team started research on phosphor materials using a silicate matrix doped with P2O5 about five years ago. Since the crystal structure of silicate can be controlled by the heat treatment temperature, she focused on the relationships between the crystal structure and the photoluminescence characteristics from the beginning of the research. When Prof. Nakano showed the data to Professor Fukuda of the Nagoya Institute of Technology because the obtained XRD results did not match previous ones, he advised that an IC phase was possible and that it could also be analyzed by XRD. Prof. Nakano had observed through electron diffraction that there was a crystal structure with a non-integer period, but other means were needed to obtain quantitative data. A collaboration between the group of XRD experts led by Professor Fukuda at Nagoya Institute of Technology and the IC-phase analysis group led by Dr. Michiue at NIMS, Japan's leading analyst of IC phases, resulted in the successful precise analysis of the crystal structure.

Future outlook

This knowledge is important for the development of phosphor materials, and the researchers believe it will be useful in industry in this field. Going forward, the researchers intend to conduct precise crystal structure analyses, further develop new materials, and widely disclose the new knowledge related to the physical properties of phosphor materials.

Credit: 
Toyohashi University of Technology (TUT)

Nanotechnology: Putting a nanomachine to work

A team of chemists at Ludwig-Maximilians-Universitaet (LMU) in Munich has successfully coupled the directed motion of a light-activated molecular motor to a different chemical unit - thus taking an important step toward the realization of synthetic nanomachines.

Molecular motors are chemical compounds that convert energy into directed motions. For example, it is possible to cause a substituent attached to a specific chemical bond to rotate unidirectionally when exposed to light of a certain wavelength. Molecules of this sort are therefore of great interest as driving units for nanomachines. However, in order to perform useful work, these motors must be integrated into larger assemblies in such a way that their mechanical motions can be effectively coupled to other molecular units. So far, this goal has remained out of reach. LMU chemist Dr. Henry Dube is a noted specialist in the field of molecular motors. Now he and his team have taken an important step towards achieving this aim. As they report in the renowned journal Angewandte Chemie, they have succeeded in coupling the unidirectional motion of a chemical motor to a receiver unit, and demonstrated that the motor can not only cause the receiver to rotate in the same direction but also significantly accelerate its rotation.

The molecular motor in Dube's setup is based on the molecule hemithioindigo, which contains a mobile carbon double bond (-C=C-). When the compound is exposed to light of a specific wavelength, this bond rotates unidirectionally. "In a paper published in 2018, we were able to show that this directional double bond rotation could be transmitted by means of a molecular 'cable' to the single carbon bond rotation of a secondary molecular unit," says Dube. "This single bond itself rotates randomly under the influence of temperature fluctuations. But, thanks to the physical coupling between them, the unidirectional motion of the light-driven motor is transmitted to the single bond, which is forced to rotate in the same direction."

To verify that the 'motorized' bond was actively driving the motion of the single bond, and not simply biasing its direction of rotation, Dube and colleagues added a brake to the system that reduced the thermal motion of the single bond. The modification ensured that the motor would have to expend energy to overcome the effect of the brake in order to cause the single bond to rotate. "This experiment enabled us to confirm that the motor really does determine the rate of rotation of the single bond - and in fact increases it by several orders of magnitude," Dube explains.

Taken together, these results provide unprecedentedly detailed insights into the mode of operation of an integrated molecular machine. In addition, the experimental setup allowed the authors to quantify the potential energy available to drive useful work, thus yielding the first indication of how much work can effectively be done by a single molecular motor under realistic conditions. "Our next challenge will be to demonstrate that the energy transmitted in this system can indeed be used to perform useful work on the molecular scale," says Dube.

Credit: 
Ludwig-Maximilians-Universität München

A quantum of solid

image: Scientists from Vienna, Kahan Dare (left) and Manuel Reisenbauer (right) working on the experiment that cooled a levitated nanoparticle to its motional quantum groundstate.

Image: 
© Lorenzo Magrini, Yuriy Coroli/University of Vienna

It is well known that quantum properties of individual atoms can be controlled and manipulated by laser light. Even large clouds of hundreds of millions of atoms can be pushed into the quantum regime, giving rise to macroscopic quantum states of matter such as quantum gases or Bose-Einstein condensates, which nowadays are also widely used in quantum technologies. An exciting next step is to extend this level of quantum control to solid-state objects. In contrast to atomic clouds, the density of a solid is a billion times higher, and its atoms are bound so that they move together with the object's center of mass. In that way, new macroscopic quantum states involving large masses should become possible.

However, entering this new regime is not at all a straightforward endeavour. A first step for achieving such quantum control is to isolate the object under investigation from influences of the environment and to remove all thermal energy - by cooling it down to temperatures very close to absolute zero (-273.15 °C) such that quantum mechanics dominates the particle's motion. To show this, the researchers chose to experiment with a glass bead approximately a thousand times smaller than a typical grain of sand and containing a few hundred million atoms. Isolation from the environment is achieved by optically trapping the particle in a tightly focused laser beam in high vacuum, a trick that was originally introduced by Nobel laureate Arthur Ashkin many decades ago and that is also used for isolating atoms. "The real challenge is for us to cool the particle motion into its quantum ground state. Laser cooling via atomic transitions is well established and a natural choice for atoms, but it does not work for solids", says lead author Uros Delic from the University of Vienna.

For this reason, the team has been working on implementing a laser-cooling method that was proposed by Austrian physicist Helmut Ritsch at the University of Innsbruck and, independently, by study co-author Vladan Vuletic and Nobel laureate Steven Chu. They had recently announced a first demonstration of the working principle, "cavity cooling by coherent scattering"; however, they were still limited to operating far away from the quantum regime. "We have upgraded our experiment and are now able not only to remove more background gas but also to send in more photons for cooling", says Delic. In that way, the motion of the glass bead can be cooled straight into the quantum regime. "It is funny to think about this: the surface of our glass bead is extremely hot, around 300°C, because the laser heats up the electrons in the material. But the motion of the center of mass of the particle is ultra-cold, around 0.00001°C away from absolute zero, and we can show that the hot particle moves in a quantum way."

The researchers are excited about the prospects of their work. Alongside the Vienna team, other groups around the world have been investigating the quantum motion of solids. Thus far, experimental systems have consisted of nano- and micromechanical resonators, in essence drums or diving boards that are clamped to a rigid support structure. "Optical levitation brings in much more freedom: by changing the optical trap - or even switching it off - we can manipulate the nanoparticle motion in completely new ways", says Nikolai Kiesel, co-author and Assistant Professor at the University of Vienna. Several schemes along these lines have been proposed, amongst others by Austrian-based physicists Oriol Romero-Isart and Peter Zoller at Innsbruck, and may now become possible. For example, in combination with the newly achieved motional ground state, the authors expect this to open new opportunities for unprecedented sensing performance, the study of fundamental processes of heat engines in the quantum regime, and the study of quantum phenomena involving large masses. "A decade ago we started this experiment motivated by the prospect of a new category of quantum experiments. We finally have opened the door to this regime."

Credit: 
University of Vienna

Researchers make critical advances in quantifying methane released from the Arctic Ocean

image: The foredeck of the icebreaker Oden with the atmospheric measurement tower, moving through sea ice with many melt ponds (blue areas) in the East Siberian Sea during the SWERUS-C3 project.

Image: 
Brett Thornton/Stockholm University

A new study, led by researchers at Stockholm University and published in Science Advances, demonstrates that the amount of methane presently leaking to the atmosphere from the Arctic Ocean is much lower than previously claimed in recent studies.

Methane is well known as a major contributor to global warming. Understanding the natural sources of this gas, especially in the fast-warming Arctic, is critical for understanding the future climate.

Compared with the amount of methane produced by human activities, the amount from the ocean was long thought to be negligible. Nevertheless, over the past decade there have been reports claiming large amounts of methane emitted from the Arctic Ocean to the atmosphere. The amounts released were sometimes claimed to be catastrophically large and, even though the emissions had not been observed by atmospheric monitoring stations, the reports raised the question of whether scientists had missed something important about the Arctic Ocean's methane cycle. However, measuring small amounts of gas escaping from the sea, and properly scaling the emissions over millions of square kilometers of the remote Arctic Ocean, is not an easy task.

A unique application of an established measurement technique

In their study, the researchers used direct measurements of the methane sea-to-air flux to determine how much methane is leaking from the eastern Arctic Ocean to the atmosphere. They used data from the 2014 SWERUS-C3 project, during which the Swedish icebreaker Oden crossed the eastern Arctic Ocean from Tromsø, Norway.

Although other researchers have calculated the sea-to-air flux before, this study used a unique measurement technique to measure the fluxes directly, and the authors believe their paper is the first to successfully apply this method from a ship. The reason the method has not been used before is that it requires measuring the gas concentration in the atmosphere very rapidly--10 times per second--in addition to even faster measurements of the wind flow in three dimensions around the ship, and the precise location, acceleration and motion of the ship relative to the sea surface. Faster, smaller accelerometers and inertial navigation units, similar to the chips that let smartphones know when you turn them sideways or upside down, as well as faster spectrometers for the methane measurement and a detailed model of airflow around Oden, made this measurement possible.
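The underlying calculation, known as eddy covariance, is conceptually compact: the flux is the time-averaged product of the fluctuations in vertical wind and in gas concentration. The sketch below illustrates the principle on synthetic data; real shipboard processing also involves the motion correction and airflow modelling mentioned above, which are omitted here.

```python
# Eddy-covariance flux on synthetic data: F = mean(w' * c'), where w' and c'
# are the fluctuations of vertical wind and methane mole fraction about
# their block means.
import numpy as np

rng = np.random.default_rng(1)
fs = 10.0                   # 10 measurements per second, as in the text
n = int(30 * 60 * fs)       # one 30-minute averaging block

w = rng.normal(0.0, 0.3, n)                          # vertical wind, m/s
c = 1.9e-6 + 1.5e-7 * w + rng.normal(0.0, 1e-8, n)   # CH4 mole fraction

w_prime = w - w.mean()
c_prime = c - c.mean()
kinematic_flux = np.mean(w_prime * c_prime)          # (m/s) x mole fraction

# Convert with the molar density of air (~41.6 mol/m^3 at 20 degrees C,
# 1 atm) to obtain a molar flux.
print(f"CH4 flux: {kinematic_flux * 41.6:.2e} mol m^-2 s^-1")
```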

"By understanding the airflow over the sea surface, and simultaneously measuring methane concentrations, we can determine how much methane is coming out of the ocean", explains researcher Brett Thornton at the Department of Geological Sciences, Stockholm University.

"This is actually our second paper on the topic of methane emissions from the sea during the SWERUS-C3 expedition. The method used then relied on slower measurements of methane in the surface water, and so we could not detect the largest 'hotspots' of emission as precisely", says Brett Thornton.

This new study shows that "hotspots" of methane emission from the sea can be up to 25 times higher than emissions from on-shore wetlands. These emissions are driven by bubbles coming from the seafloor and reaching the sea surface. This study directly observed very high peak emissions and, for the first time, was able to map their spatial extent.

"The peak emissions are indeed large but at the same time they are also extremely limited in area", says Brett Thornton.

Across the Laptev, East Siberian, and Chukchi seas the authors saw no evidence of widespread emissions at the magnitude of the "hotspots". In fact, their estimates for total methane emission from the eastern Arctic Ocean did not substantially increase even when they included these "hotspots" in the budget calculations.

"What this means is that--at least at the time of our measurements--the shallow eastern Arctic Ocean was not a huge source of methane to the atmosphere, and our understanding of Arctic sea emissions in the methane cycle is still reasonably correct. So this is, I would say, a bit of good news in the global warming story. Yes, there is methane leaking from the Arctic Ocean to the atmosphere. But, at least for now, it is not globally important to atmospheric methane and global warming", Brett Thornton explains.

It's important to realize that this work doesn't give insight into what might happen to these methane emissions in the future Arctic Ocean, with warmer waters and less ice cover. Will they increase or decrease? Will they become important globally? That remains to be determined by future research.

Credit: 
Stockholm University

Wearable health tech gets efficiency upgrade

image: NC State's improved thermoelectric generator demonstrates efficiency and flexibility.

Image: 
Photo courtesy of Mehmet Ozturk, NC State University.

North Carolina State University engineers have demonstrated a flexible device that harvests heat energy from the human body to power health monitoring. The device surpasses all other flexible harvesters that use body heat as the sole energy source.

In a paper published in Applied Energy, the NC State researchers report significant enhancements to the flexible body heat harvester they first reported in 2017. The harvesters use heat energy from the human body to power wearable technologies - think of smart watches that measure your heart rate, blood oxygen, glucose and other health parameters - that never need to have their batteries recharged. The technology relies on the same principles governing rigid thermoelectric harvesters that convert heat to electrical energy.

Flexible harvesters that conform to the human body are highly desired for use with wearable technologies. Mehmet Ozturk, an NC State professor of electrical and computer engineering and corresponding author of the paper, mentioned superior skin contact with flexible devices, as well as the ergonomic and comfort considerations to the device wearer, as the core reasons behind building flexible thermoelectric generators, or TEGs.

The performance and efficiency of flexible harvesters, however, currently trail well behind rigid devices, which have been superior in their ability to convert body heat into usable energy.

"The flexible device reported in this paper is significantly better than other flexible devices reported to date and is approaching the efficiency of rigid devices, which is very encouraging," Ozturk said.

The proof-of-concept TEG originally reported in 2017 employed semiconductor elements that were connected electrically in series using liquid-metal interconnects made of EGaIn - a non-toxic alloy of gallium and indium. EGaIn provided both metal-like electrical conductivity and stretchability. The entire device was embedded in a stretchable silicone elastomer.

The upgraded device employs the same architecture but it significantly improves the thermal engineering of the previous version, while increasing the density of the semiconductor elements responsible for converting heat into electricity. One of the improvements is an improved silicone elastomer - essentially a type of rubber - that encapsulates the EGaIn interconnects.

"The key here is using a high thermal conductivity silicone elastomer doped with graphene flakes and EGaIn," Ozturk said. The elastomer provides mechanical robustness against punctures while improving the device's performance.

"Using this elastomer allowed us to boost the thermal conductivity - the rate of heat transfer - by six times, allowing improved lateral heat spreading," he said.

Ozturk added that one of the strengths of the technology is that it eliminates the need for device manufacturers to develop new flexible, thermoelectric materials because it incorporates the very same semiconductor elements used in rigid devices. Ozturk said future work will focus on further improving the efficiencies of these flexible devices.

Yasaman Sargolzaeiaval, Viswanath P. Ramesh, Taylor V. Neumann, Veena Misra, Michael Dickey and Daryoosh Vashaee co-authored the paper. The group also has a recent patent on the technology.

Credit: 
North Carolina State University

Autonomous microtrap for pathogens

Antibiotics are more efficient when they can act on their target directly at the site of infection, without dilution. In the journal Angewandte Chemie, American scientists describe a synthetic chemical trap that propels itself to its place of action in the body fluid and then lures bacteria into its interior to poison them. One of the main functions of the microdevice is communication with its target, the study says.

The scientists constructed the novel multifunctional weapon to address the common medical issue that most drugs are diluted in body fluids before they can exert their function. It would be more efficient if the drug and its target were brought together so that less medicine is wasted. Researchers working with Joseph Wang at the University of California San Diego have developed a self-propelled chemical trap to corner and destroy pathogens. It works by the sequential release of chemicals from a container-like autonomous microdevice and could be especially useful against gastric pathogens, the authors report.

The scientists developed a microswimming device with an onion-like character. Its core, the engine, was a bead of magnesium metal, partially covered with several polymer coatings--each having its own function. In an acidic environment, such as gastric acid, the magnesium bead reacted with the acid to produce hydrogen bubbles, which drove the microswimmer forward, similar to a submarine run by a jet of gas. The device's journey ended when it became stuck to a wall, such as the stomach lining. Once the magnesium engine was dissolved and exhausted, a hollow structure of about thirty times the size of a bacterium remained, like an empty, multiwalled spherical bag.

The bag worked as a trap. The hollow microdevice lured bacteria into it and then became a toxic cage. Its inner walls were made of an acid-soluble polymer incorporating the amino acid serine--a substance that signals food to the gut bacterium Escherichia coli. The dissolving polymer released the serine, which, through a phenomenon called chemotaxis, caused the bacteria to move towards the source. Under a microscope, the researchers observed accumulation of the bacteria inside the hollow sphere.

In the final step, a toxin was activated. A polymer layer dissolved and released silver ions, which killed the bacteria. This multistage pathway represents a novel approach to making antibiotics more efficient. The authors also see it as a "first step towards chemical communication between synthetic microswimmers and motile microorganisms." They believe that the concept could be expanded to a variety of decontamination applications; for example, in the food and healthcare industries, or for security and environmental remediation.

Credit: 
Wiley

Self-learning heating control system saves energy

image: The "Urban Mining and Recycling" unit in the NEST research building has two student rooms. One of them was equipped with a self-learning heating and cooling control system.

Image: 
Zooey Braun, Stuttgart / Empa

Factory halls, airport terminals and high-rise office buildings are often equipped with automated "anticipatory" heating systems. These work with pre-defined scenarios specially calculated for the building and help save building owners a great deal of heating energy. However, such individual programming is too expensive for individual apartments and private homes.

Last summer, a group of Empa researchers proved for the first time that it could indeed be much simpler than that: intelligent heating and cooling control does not necessarily have to be programmed; the system can just as easily learn to reduce costs by itself, based on the data of past weeks and months. Programming experts are no longer necessary. With this trick, the cost-saving technology will soon also be available for families and singles.

The crucial experiment took place in Empa's research building NEST. The UMAR unit ("Urban Mining and Recycling") offers prime conditions for this test: a large eat-in kitchen is framed on both sides by two student rooms, each 18 square meters. The entire window front looks east-southeast towards the morning sun. In the UMAR unit, heated or pre-cooled water flows through a stainless steel ceiling cladding and ensures the desired room temperature. The energy used for heating and cooling can be calculated for each individual room using the respective valve positions.

Clever cooling - thanks to the weather forecast

Since project leader Felix Bünning and his colleague Benjamin Huber did not want to wait for the heating period, they started a cooling experiment in June 2019. The week from 20 to 26 June began with two sunny but still rather cool days, followed by a cloudy day; finally, the sun burned over Dübendorf and drove the outside temperature to just short of 40 degrees.

In the two sleeping rooms, the temperature was not to exceed 25 degrees during the day; at night, the limit was set to 23 degrees. A conventional thermostatic valve provided the cooling in one room. In the other room, an experimental control system equipped with artificial intelligence (AI), developed by Bünning, Huber and their team, was at work. The AI had been fed with data from the past ten months - and it knew the current weather forecast from MeteoSwiss.

Greater comfort with less energy

The result was crystal-clear: the smart heating and cooling control system adhered much more closely to the pre-set comfort specifications while using around 25% less energy. This was mainly because the system pre-cooled the rooms in the morning, before the sun shining through the windows could heat them up. The conventional thermostat in the second room, on the other hand, could only react once the temperature went through the ceiling - too late, too hectically and at full power. In November 2019, a cool month with little sun, lots of rain and cool winds, Bünning and Huber repeated the experiment, this time heating the two rooms. At the time this issue went to press, the evaluation was still in progress, but Bünning is convinced that his predictive heating control system will prove itself here as well.
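The behaviour that made the difference - pre-cooling ahead of forecast sun and heat rather than reacting after the fact - can be caricatured in a few lines. The sketch below is a hand-written rule on invented numbers and thresholds, not the Empa system, which learns its control strategy from months of recorded data.

```python
# Toy forecast-aware cooling plan: pre-cool whenever the next few hours are
# forecast to be both hot and sunny. Thresholds and data are invented.
def plan_cooling(forecast_temps, solar_forecast, limit=25.0, precool_to=23.5):
    """Return one ceiling-cooling setpoint per forecast hour."""
    setpoints = []
    for hour in range(len(forecast_temps)):
        upcoming = zip(forecast_temps[hour:hour + 3],
                       solar_forecast[hour:hour + 3])
        heat_coming = any(t > limit and s > 0.5 for t, s in upcoming)
        # Pre-cool before the heat arrives; otherwise hold the comfort limit.
        setpoints.append(precool_to if heat_coming else limit)
    return setpoints

# A day like the ones described: cool early morning, then strong sun and heat.
temps = [18, 17, 17, 18, 20, 23, 27, 31, 33, 34]
sun   = [0.0, 0.0, 0.0, 0.0, 0.2, 0.6, 0.9, 1.0, 1.0, 0.9]
print(plan_cooling(temps, sun))   # setpoints drop to 23.5 before the hot hours
```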

The Empa team has already prepared the next step: "In order to test the system in a real-world environment, we have planned a larger field test in a building with 60 apartments. We will equip four of these apartments with our intelligent heating and cooling control system." Bünning is curious about the results. "I think that new controllers based on machine learning offer a huge opportunity. With this method we can construct a good, energy-saving retrofit solution for existing heating systems using relatively simple means and the recorded data."

Credit: 
Swiss Federal Laboratories for Materials Science and Technology (EMPA)

New research shows sustainability can be a selling point for new ingredients

image: Bambara pods growing in rows.

Image: 
University of Nottingham

The first UK consumer study on the use of Bambara Groundnut as an ingredient in products has shown that sharing information on its sustainable features increased consumers' positive emotional connection to food.

Researchers from the University of Nottingham's School of Biosciences tested biscotti and crackers made with Bambara groundnut against standard commercial products. The study, published in Food Research International, showed that the main driver for accepting this new ingredient was how people felt when given information about its sustainability credentials.

Alternative crops

With current global challenges such as population growth, climate change and water scarcity, it is critical to develop strategies to achieve food security. The University of Nottingham's Future Food Beacon of Excellence and its Malaysia campus, along with the Crops for the Future research projects, have been investigating ways to tackle this by researching alternative crops that are resilient to climate change.

They have discovered that the Bambara groundnut, found in Africa, puts limited demands on soil and is capable of growing in nutrient-poor soils where most crops would not thrive. Bambara is high in carbohydrate and protein and is gluten free, so it could offer an alternative to rice and wheat flour.

One hundred participants were invited to the University of Nottingham's Sensory Science Centre to undertake two tasting sessions of biscotti and crackers - two commercial products and two made with Bambara flour. For each sample, participants were asked to rate their overall liking and emotional response based on the sensory properties of the product. The first tasting was done 'blind', with a red light masking the appearance of the products. The participants were then invited back for a second session, where they were informed about global resource challenges and the sustainable features of Bambara, and told which products contained this ingredient.

Dr Qian Yang, Assistant Professor in Sensory and Consumer Sciences at the University of Nottingham led the research and explains what they found: "Under the blind condition, no significant differences in overall liking were observed between standard and Bambara products, suggesting UK consumers accept the sensory properties of products containing Bambara flour. This indicates as long as the products taste good consumers engage with the new sustainable crops. Interestingly, after being given the information about climate and sustainability issues we saw a shift towards more positive emotions towards the Bambara product and people felt guiltier when eating standard products."

Tim Foster, Professor in Food Structure, added: "This study gives an important insight into how emotional response could be used as a way to encourage consumers to eat more sustainable products. Our participants' positive reactions to the information they were given about the sustainable credentials of the products suggest this type of information could help with promotion when a new product is brought to market."

Credit: 
University of Nottingham

New research could aid cleaner energy technologies

image: Guangwen Zhou is a professor of mechanical engineering at Binghamton University, State University of New York.

Image: 
Binghamton University, State University of New York

BINGHAMTON, N.Y. - New research led by faculty at Binghamton University, State University of New York, could aid cleaner energy technologies.

The atomic reaction between gases and oxides is a key piece for many technological puzzles. It can lead to benefits such as better catalysts to enable cleaner energy technologies, or to problems like corrosion.

Understanding those interactions isn't always easy, though, and often doesn't go beyond the surface -- quite literally.

A team from Binghamton University, the Brookhaven National Laboratory and the National Institute of Standards and Technology -- led by Professor Guangwen Zhou from the Thomas J. Watson School of Engineering and Applied Science's Department of Mechanical Engineering -- has a new way to look deeper into how gas molecules affect the atoms beneath the surface of a material.

The material studied is cupric oxide, a copper oxide that many researchers are interested in because it is more abundant and affordable than noble metals such as silver, gold and platinum, and it is used for numerous processes such as methanol production.

For the paper "Surface-reaction induced structural oscillations in the subsurface," published earlier this month in Nature Communications, Zhou and his fellow researchers (including Binghamton PhD students Xianhu Sun, Wenhui Zhu, Dongxiang Wu, Chaoran Li, Jianyu Wang, Yaguang Zhu and Xiaobo Chen) examined the reaction between hydrogen and copper oxide using atomic-scale transmission electron microscopy.

The technique allowed them to see the surface and subsurface simultaneously and in real time, showing that structural oscillations are induced in the subsurface by loss of oxygen from the oxide surface.

"This study shows how the reaction from the surface propagates to deeper atomic layers. We look at it from a cross-section so we can see atoms both in the top layer and subsurface layers more clearly," said Zhou, who teaches as part of the Materials Science and Engineering Program and also is the associate director of Binghamton's Institute for Materials Research.

This new study is funded by the Department of Energy, in the hope that the results can lead to better catalysts, improved batteries, longer-lasting vehicles and other higher-quality products.

"If we know these reaction mechanisms, we can design better materials," Zhou said. "We can't care just about the surface but also the deeper layers if we want to understand the process better."

Credit: 
Binghamton University

New research establishes how first exposure to flu virus sets our immunity for life

image: Matthew Miller, a co-author on the study and an associate professor at the Michael G. DeGroote Institute for Infectious Disease and the McMaster Immunology Research Centre

Image: 
JD Howell, McMaster University

Were you born in an H1N1 year or an H3N2 year? The first type of influenza virus we are exposed to in early childhood dictates our ability to fight the flu for the rest of our lives, according to a new study from a team of infectious disease researchers at McMaster University and Université de Montréal.

The findings, published this week in the journal Clinical Infectious Diseases, provide compelling new evidence to support the phenomenon known as 'antigenic imprinting', which suggests that early exposure to one of the two flu strains that circulate every year imprints itself on our immunity and disproportionately affects the body's lifelong response to the flu.

This could have important implications for pandemic and epidemic planning, allowing public health officials to assess who might be at greater risk in any given year, based on their age and what viruses were dominant at the time of their birth.

"People's prior immunity to viruses like flu, or even coronavirus, can have a tremendous impact on their risk of becoming ill during subsequent epidemics and pandemics," says Matthew Miller, a co-author on the study and an associate professor at the Michael G. DeGroote Institute for Infectious Disease and the McMaster Immunology Research Centre.

"Understanding how their prior immunity either leaves them protected or susceptible is really important for helping us to identify the populations who are most at risk during seasonal epidemics and new outbreaks," he says.

Researchers collected and analyzed data from the 2018-19 flu season, which was highly unusual because both strains of influenza A dominated at different periods of time. Typically, only one strain dominates each flu season and will account for almost all cases.

The researchers found that people who were born when H1N1 was dominant have a much lower susceptibility to influenza during seasons dominated by that virus than during seasons dominated by H3N2. In turn, those born in an H3N2 year are less vulnerable to influenza A during seasons dominated by H3N2.

"We already knew from our previous studies that susceptibility to specific influenza subtypes could be associated with year of birth. This new study goes much further in support of antigenic imprinting. Instead of just showing how specific age patterns are associated with one subtype or the other during a single influenza season, we took advantage of a unique 'natural experiment' to show how the change in subtype dominance during one season appears to lead, practically in real time, to a change in susceptibility by age," explained Alain Gagnon, professor of demography at the University of Montreal and lead author of the study.

Health Canada estimates that influenza causes approximately 12,200 hospitalizations and 3,500 deaths every year.

Researchers hope to further explore transmission dynamics by analyzing how viruses spread within households, where exposure is high and prolonged. In this environment, they can assess how imprinting may or may not affect the transmission of each strain.

Credit: 
McMaster University

Less chemotherapy may have more benefit in rectal cancer

image: Ashley Glode, Pharm.D., and colleagues present preliminary evidence for lowering dosage or reducing schedule of neoadjuvant chemotherapy for locally advanced rectal cancer.

Image: 
University of Colorado Cancer Center

Chemotherapy used to shrink a tumor before surgery, called neoadjuvant chemotherapy, is becoming more common in many cancers, including stage II and III rectal cancer. However, the chemotherapy regimens FOLFOX and CapeOx used in this setting come with significant side effects, to the degree that many patients are unable to complete the recommended schedule. Now a University of Colorado Cancer Center study presented at the 2020 Gastrointestinal Cancers Symposium shows they may not have to: a small study of 48 patients with locally advanced rectal cancer receiving neoadjuvant chemotherapy found that patients receiving lower-than-recommended doses in fact saw their tumors shrink more than patients receiving the full dose.

"I think we need bigger studies to explore less intensive therapy - maybe lower doses, maybe a shorter course of treatment - to see what is the optimal dosing prior to surgery," says Ashley E. Glode, PharmD, assistant professor at the Skaggs School of Pharmacy and Pharmaceutical Sciences, and the study's first author.

In some cancers, a tumor may be entwined with nearby organs and blood vessels to the point that surgery is not initially an option. Most patients with locally advanced rectal cancer are surgical candidates, but chemotherapy used to shrink a tumor prior to surgery has been associated with more successful surgeries and a lower rate of cancer recurrence. As a high-volume center for the treatment of these cancers, University of Colorado Cancer Center oncologists including Christopher Lieu, MD, noticed that patients who were unable to complete the recommended course of neoadjuvant chemotherapy seemed to have similar or even better outcomes than patients receiving the full dose, prompting the current study.

"We do all sorts of supportive care options to help keep patients on these therapies at the recommended, high doses. But based on our observations and on this early study, we're starting to talk about having less hesitancy to drop the drug or at least decrease the dosing," says Lieu, who is CU Cancer Center's interim associate director for clinical research.

Of the 48 patients included in the study, only 12.5 percent were able to tolerate the full dose of chemotherapy. Due to side effects, zero of six patients taking the regimen CapeOx completed the recommended dose.

"CapeOx is a treatment option mostly taken at home as a pill so it's easier for patients - they only have to come in for infusion once every three weeks. But the regimen wasn't tolerated by any patients on this study. It makes us think about not offering the option of CapeOx, and sticking with FOLFOX instead," Glode says.

Of the 42 patients receiving less than the full dose of FOLFOX, 45 percent experienced a complete response, meaning that cancer was undetectable after treatment (negating the need for surgery in eight cases). Of the six patients receiving the full dose of FOLFOX, 33 percent experienced a complete response.

"This is a small, single-institution study, but it certainly gives us pause," Glode says. "Why would patients take more chemotherapy and have more side effects, when less chemotherapy seems equally or even more beneficial?"

Credit: 
University of Colorado Anschutz Medical Campus

Likelihood of e-book purchases increases 31% by combining previews and reviews

Research reveals consumers don't simply rely on other people's opinions in reviews but leverage a combination of reviews and previews when purchasing e-books.

The purchase likelihood escalates to 31% when consumers are exposed to both e-book previews and reviews.

Purchase likelihood is between 7 and 17% when consumers are exposed to either e-book previews only or online reviews only.

CATONSVILLE, MD, January 30, 2020 - New research in the INFORMS journal Information Systems Research finds that the purchasing decision of customers considering buying e-books is significantly influenced through easy access to a combination of e-book previews and reviews, resulting in a staggering 31% increase in a consumer's likelihood to purchase an e-book. When exposed to either previews only or online reviews only, purchase likelihood is between 7 and 17%.

The study, "When Seeing Helps Believing: The Interactive Effects of Previews and Reviews on E-Book Purchases," was conducted by Angela Choi of Florida State University, Daegon Cho and Wonseok Oh of the Korea Advanced Institute of Science and Technology, Dobin Yim of Loyola University and Jae Yun Moon of Korea University. Their work suggests that a combination of e-book previews alongside online reviews positively influence individual purchase decisions.

The researchers analyzed two months of data on more than 270,000 sessions that comprise clickstream data on consumers' exposure to previews and reviews and data on their subsequent purchase behaviors.
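The core comparison can be pictured as a simple cross-tabulation of sessions by exposure. The sketch below uses a handful of invented rows and illustrative column names; the real analysis covers more than 270,000 sessions and controls for far more than this toy does.

```python
# Purchase likelihood by exposure group, on invented data. Column names are
# illustrative; the study's clickstream schema is not given in this article.
import pandas as pd

sessions = pd.DataFrame({
    "saw_preview": [True, True, False, False, True, False, True, False],
    "saw_review":  [True, False, True, False, True, False, False, True],
    "purchased":   [1, 0, 0, 0, 1, 0, 1, 0],
})

# Mean of the 0/1 purchase flag per group = purchase likelihood; sessions
# exposed to both previews and reviews should show the highest rate.
rates = sessions.groupby(["saw_preview", "saw_review"])["purchased"].mean()
print(rates)
```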

"Online reviews have served as critical resources for driving traffic and increasing sales, especially for e-books, but have also suffered from many market imperfections such as fake reviews, grade inflation and deliberate manipulation," said Choi.

To enhance consumers' direct product experiences and help them validate product-taste fit prior to purchase, many platform operators offer online previews. These give consumers the chance to sample the first few pages or chapters of a book ahead of purchase.

The notable difference between previews and online reviews is a matter of distance, meaning first-hand knowledge from direct experience, or a preview, versus second-hand knowledge from indirect experience, or an online review.

"Striking the right balance between direct and indirect product experience is becoming an important strategic challenge," concluded Choi.

Credit: 
Institute for Operations Research and the Management Sciences