Tech

Cell membranes in super resolution

image: Sphingolipid expansion microscopy (ExM) of tenfold-expanded cells infected with chlamydia. The bacterial membranes are stained green. Panel (a) shows confocal laser scanning microscopy and panel (b) structured illumination microscopy (SIM); in (c) the inner and outer membranes of the bacteria can be distinguished. Scale bars: 10 microns, and 2 microns in the small white rectangles.

Image: 
Sauer group / University of Würzburg

Expansion microscopy (ExM) enables the imaging of cells and their components with a spatial resolution far below 200 nanometres. For this purpose, the proteins of the sample under investigation are cross-linked into a swellable polymer. Once the interactions between the molecules have been destroyed, the samples can be expanded many times over with water. This allows detailed insights into their structures.

"This method was previously limited to proteins. In the journal Nature Communications we are now presenting a way of expanding lipids and thus cell membranes," says Professor Markus Sauer, an expert in super-resolution microscopy from the Biocentre of Julius-Maximilians-Universität (JMU) Würzburg in Bavaria, Germany. JMU professors Thomas Rudel (microbiology) and Jürgen Seibel (chemistry) are also involved in the publication.

Synthetic lipids are marked and expanded

Jürgen Seibel's team has synthesised functionalised sphingolipids, which are an important component of cell membranes. If these lipids are added to cell cultures, they are incorporated into the cell membranes. They can then be marked with a dye and expanded four to ten times in a swellable polymer.

The JMU researchers show that this method - in combination with structured illumination microscopy (SIM) - makes it possible for the first time to image different membranes and their interactions with proteins at a resolution of 10 to 20 nanometres: cell membranes, the inner and outer nuclear membranes, and the membranes of intracellular organelles such as mitochondria.
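As a rough guide to the arithmetic behind that figure: physically expanding a specimen by a factor E lets a microscope with native resolution R resolve features of roughly R / E in the original sample. A minimal sketch, assuming a typical SIM lateral resolution of about 120 nanometres (our assumption, not a number from the study):

```python
# Effective-resolution arithmetic for expansion microscopy (illustrative).
SIM_RESOLUTION_NM = 120  # assumed typical lateral resolution of SIM

for expansion_factor in (4, 10):
    effective_nm = SIM_RESOLUTION_NM / expansion_factor
    print(f"{expansion_factor}x expansion: ~{effective_nm:.0f} nm effective resolution")

# Tenfold expansion lands near the 10-20 nm range quoted above.
```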

Focusing on bacteria and viruses

The sphingolipids also integrate highly efficiently into the membranes of bacteria. This means that, for the first time, pathogens such as Neisseria gonorrhoeae, Chlamydia trachomatis and Simkania negevensis can now be visualised in infected cells with a resolution that was previously only achieved using electron microscopy. Even the inner and outer membranes of Gram-negative bacteria can be distinguished from each other.

"With the new super-resolution microscopic methods, we now want to investigate bacterial infection mechanisms and causes of antibiotic resistance. What we learn in the process could possibly be used for improved therapies," says Professor Thomas Rudel, an expert on bacterial infections.

The sphingolipids might also integrate into the membrane of viruses. If this is successful, the interactions of coronaviruses with cells could be studied for the first time with high-resolution light microscopy.

Credit: 
University of Würzburg

Study shows promising material can store solar energy for months or years

As we move away from fossil fuels and shift to renewable energy to tackle climate change, the need for new ways to capture and store energy becomes increasingly important.

Lancaster University researchers studying a crystalline material have discovered it has properties that allow it to capture energy from the sun. The energy can be stored for several months at room temperature, and it can be released on demand in the form of heat.

With further development, these kinds of materials could offer exciting potential as a way of capturing solar energy during the summer months and storing it for use in winter - when less solar energy is available.

This would prove invaluable for applications such as heating in off-grid or remote locations, or as an environmentally friendly supplement to conventional heating in houses and offices. It could potentially also be produced as a thin coating and applied to the surface of buildings, or used on the windscreens of cars, where the stored heat could be used to de-ice the glass on freezing winter mornings.

The material is based on a type of 'metal-organic framework' (MOF). These consist of a network of metal ions linked by carbon-based molecules to form 3-D structures. A key property of MOFs is that they are porous, meaning that they can form composite materials by hosting other small molecules within their structures.

The Lancaster research team set out to discover if a MOF composite, previously prepared by a separate research team at Kyoto University in Japan and known as 'DMOF1', can be used to store energy - something not previously researched.

The MOF pores were loaded with molecules of azobenzene - a compound that strongly absorbs light. These molecules act as photoswitches, which are a type of 'molecular machine' that can change shape when an external stimulus, such as light or heat, is applied.

In tests, the researchers exposed the material to UV light, which causes the azobenzene molecules to change shape to a strained configuration inside the MOF pores. This process stores the energy in a similar way to the potential energy of a bent spring. Importantly, the narrow MOF pores trap the azobenzene molecules in their strained shape, meaning that the potential energy can be stored for long periods of time at room temperature.

The energy is released again when external heat is applied as a trigger to 'switch' its state, and this release can be very quick - a bit like a spring snapping back straight. This provides a heat boost which could be used to warm other materials or devices.

Further tests showed the material was able to store the energy for at least four months. This is an exciting aspect of the discovery as many light-responsive materials switch back within hours or a few days. The long duration of the stored energy opens up possibilities for cross-seasonal storage.

The concept of storing solar energy in photoswitches has been studied before, but most previous examples have required the photoswitches to be in a liquid. Because the MOF composite is a solid, and not a liquid fuel, it is chemically stable and easily contained. This makes it much easier to develop into coatings or standalone devices.

Dr John Griffin, Senior Lecturer in Materials Chemistry at Lancaster University and joint Principal Investigator of the study, said: "The material functions a bit like phase change materials, which are used to supply heat in hand warmers. However, while hand warmers need to be heated in order to recharge them, the nice thing about this material is that it captures 'free' energy directly from the sun. It also has no moving or electronic parts and so there are no losses involved in the storage and release of the solar energy. We hope that with further development we will be able to make other materials which store even more energy."

These proof-of-concept findings open up new avenues of research to see what other porous materials might have good energy storage properties using the concept of confined photoswitches.

Joint investigator Dr Nathan Halcovitch added: "Our approach means that there are a number of ways to try to optimise these materials either by changing the photoswitch itself, or the porous host framework."

Other potential applications for crystalline materials containing photoswitch molecules include data storage - the well-defined arrangement of photoswitches in the crystal structure means that they could in principle be switched one-by-one using a precise light source and therefore store data like on a CD or DVD, but at a molecular level. They also have potential for drug delivery - drugs could be locked inside a material using photoswitches and then released on demand inside the body using a light or heat trigger.

Although the results were promising for this material's ability to store energy for long periods of time, its energy density was modest. Next steps are to research other MOF structures as well as alternative types of crystalline materials with greater energy storage potential.

Credit: 
Lancaster University

Kidney disease leading risk factor for COVID-related hospitalization

An analysis of Geisinger's electronic health records has revealed chronic kidney disease to be the leading risk factor for hospitalization from COVID-19.

A team of Geisinger researchers studied the health records of 12,971 individuals who were tested for COVID-19 within the Geisinger system between March 7 and May 19. Of this group, 1,604 were COVID-positive and 354 required hospitalization. The team analyzed the records for association between specific clinical conditions, including kidney, cardiovascular, respiratory and metabolic conditions, and COVID-19 hospitalization.

Overall, chronic kidney disease was most strongly associated with hospitalization, and COVID-19 patients with end-stage renal disease were 11 times more likely to be admitted to the hospital than patients without kidney disease.
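For a sense of how an association like "11 times more likely" is quantified, here is a minimal sketch of an unadjusted odds ratio from a 2x2 contingency table. The counts are invented for illustration (they only mimic the study's totals); the Geisinger analysis itself used adjusted models over the full records.

```python
# Hypothetical 2x2 table: hospitalization among COVID-positive patients
# with vs. without end-stage renal disease (ESRD). Invented counts.

def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio (a/b) / (c/d) for a 2x2 contingency table."""
    return (a / b) / (c / d)

hosp_esrd, nonhosp_esrd = 40, 15        # assumed ESRD patients
hosp_other, nonhosp_other = 314, 1235   # assumed remaining patients

print(f"odds ratio: {odds_ratio(hosp_esrd, nonhosp_esrd, hosp_other, nonhosp_other):.1f}")
# Prints ~10.5 - the same order as the roughly 11-fold risk reported.
```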

The results were published in PLOS ONE.

"Previous studies have identified a variety of health conditions associated with an increased risk of COVID-related hospitalization, including diabetes, heart failure, hypertension, and chronic kidney disease. What is significant here is the magnitude of the kidney disease-related risk," said Alex Chang, M.D., Geisinger nephrologist and co-director of Geisinger's Kidney Health Research Institute. "These findings highlight the need to prevent COVID-19-related illness in patients with kidney disease and other high-risk conditions."

How underlying medical conditions increase the risk of COVID-19-related complications is not yet fully clear; however, the study suggests that the physiological stress caused by an excessive inflammatory response to COVID-19 infection could destabilize organs already weakened by chronic disease, or that organ injury from the virus could act as a "second-hit" to these organs.

"Consistent with this hypothesis, kidney and heart are among the tissues with the highest expression of ACE2, a SARS-CoV-2 receptor," the team wrote.

While the sample size studied was relatively small, Geisinger's resources as an integrated health system allowed for a fairly comprehensive analysis of available data.

"Our team used a novel approach made possible by our extensive electronic health records, unique demographic data and integrated health system," said Tooraj Mirshahi, Ph.D., associate professor for Geisinger's Department of Molecular and Functional Genomics. "We were able to perform this study despite having a much lower number of COVID-19 cases compared to large hospitals in metropolitan areas."

Credit: 
Geisinger Health System

RUDN University professor suggests how to clean up space debris

image: A spacecraft-collector carrying detachable engine modules could move spent stages and other debris out of low orbits, while a towing spacecraft could transfer debris from the geostationary orbit into the disposal orbit.

Image: 
RUDN University

A specialist in spacecraft movement control analyzed the process of placing vehicle stages, boosters, and other space debris into the so-called disposal orbit and suggested cleaning up lower orbits with a spacecraft that carries modules with engine units on board. These modules will attach to space debris objects and move them away. As for the geostationary orbit, a preferable way to clean it up would be a towing spacecraft that transports space debris objects into the disposal orbit. The research was carried out in collaboration with a team from Bauman Moscow State Technical University, and its results were published in the Advances in Space Research journal.

Besides satellites and the International Space Station, thousands of out-of-service spacecraft, boosters, and other space debris objects move along different orbits around the Earth. Sometimes they collide and break down: for example, over 1,000 new observable fragments appeared in 2018 when eight objects fell to pieces in near-Earth space. The more debris is left in space, the higher the risk that it will damage satellites, leaving us without communication and surveillance systems. Prof. Andrey Baranov from RUDN University, together with his colleagues from Bauman Moscow State Technical University, Dr. Dmitriy Grishko and Prof. Georgii Shcheglov, studied the parameters of space debris in different orbits and came up with the most feasible ways of cleaning it up.

Some 160 vehicle stages (from 1.1 to 9 tons each) are situated in low near-Earth orbits, i.e. at heights from 600 to 2,000 km. As for the geostationary orbit at a height of 35,786 km, the most potentially dangerous objects there are 87 boosters, each weighing from 3.2 to 3.4 tons. The size, weight, and orbital parameters of these objects differ considerably; therefore, they require different equipment to collect them and move them to the so-called disposal orbit, where the debris is safe to store.

A spacecraft-collector suggested by the team to clean up the low near-Earth orbits is 11.5 m long, 3 m in diameter, and weighs just over 4 tons. Such a collector can carry 8 to 12 modules with engine units on board. Moving a light vehicle stage will require 50 to 70 kg of fuel, while the transportation of a Zenit-2 stage that weighs 9 tons will take around 350 kg. The total weight of a spacecraft-collector at launch is expected to be from 8 to 12 tons. Modern-day boosters can easily place a weight like this into any orbit up to 1,000 km high. After a collector runs out of modules, it will attach itself to the last booster stage, descend with it to the top layer of the atmosphere, and burn up.
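For a feel for where fuel figures like these come from, the Tsiolkovsky rocket equation relates propellant mass to the velocity change of a de-orbit burn. The delta-v and specific impulse below are our own illustrative assumptions, not parameters from the study:

```python
import math

G0 = 9.81  # standard gravity, m/s^2

def propellant_mass(stack_mass_kg, delta_v_ms, isp_s):
    """Tsiolkovsky rocket equation: propellant burned to impart delta_v."""
    return stack_mass_kg * (1 - math.exp(-delta_v_ms / (isp_s * G0)))

# Assumed: a ~150 m/s de-orbit burn with storable propellant (Isp ~290 s).
for stage_tons in (1.1, 9.0):
    fuel = propellant_mass(stage_tons * 1000, delta_v_ms=150, isp_s=290)
    print(f"{stage_tons:4.1f} t stage: ~{fuel:.0f} kg of propellant")

# Prints roughly 57 kg and 460 kg - the same order as the figures above.
```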

As for the geostationary orbit, to clean it up the team suggested a spacecraft that is about 3.4 m long, 2.1 m wide, and weighs around 2 tons. According to their calculations, if loaded with modules, such a device would not be extremely efficient, and it would take 3-4 times more collectors to clean the orbit up. Therefore, in this case, the spacecraft-collector should work as a tow for space debris objects. Preliminary calculations suggest that it could operate for up to 15 years and transfer 40 to 45 space debris objects into the disposal orbit.

"Designing a spacecraft-collector for lower orbits is a more complicated task than creating one for the geostationary orbit. Best-case scenario, one spacecraft would be able to remove only 8 to 12 objects from lower orbits, while in the geostationary orbit it could transport 40 to 45. Therefore, cleaning up lower orbits is much more difficult. This factor should be taken into consideration by businesses and space agencies that plan to launch groups of hundreds or thousands of satellites in this area of the near-Earth space," explained Prof. Andrey Baranov, a PhD in Physics and Mathematics from the Department of Mechanics and Mechatronics, RUDN University.

Credit: 
RUDN University

New platform generates hybrid light-matter excitations in highly charged graphene

image: Massive work function-mediated charge transfer in graphene/α-RuCl3 heterostructures provides the necessary conditions for generating plasmon polaritons without electrostatic or chemical doping. The image depicts a characteristic infrared near-field image of such a heterostructure, revealing a host of plasmonic oscillations derived from substantial mutual doping of interfacial graphene/α-RuCl3 layers.

Image: 
Daniel J. Rizzo/Columbia University

New York, NY--December 2, 2020--Graphene, an atomically thin carbon layer through which electrons can travel virtually unimpeded, has been extensively studied since its first successful isolation more than 15 years ago. Among its many unique properties is the ability to support highly confined electromagnetic waves coupled to oscillations of electronic charge--plasmon polaritons--that have potentially broad applications in nanotechnology, including biosensing, quantum information, and solar energy.

However, in order to support plasmon polaritons, graphene must be charged by applying a voltage to a nearby metal gate, which greatly increases the size and complexity of nanoscale devices. Columbia University researchers report that they have achieved plasmonically active graphene with record-high charge density without an external gate. They accomplished this by exploiting novel interlayer charge transfer with a two-dimensional electron-acceptor known as α-RuCl3. The study is available now online as an open access article and will appear in the December 9th issue of Nano Letters.

"This work allows us to use graphene as a plasmonic material without metal gates or voltage sources, making it possible to create stand-alone graphene plasmonic structures for the first time" said co-PI James Hone, Wang Fong-Jen Professor of Mechanical Engineering at Columbia Engineering.

All materials possess a property known as a work function, which quantifies how tightly they can hold on to electrons. When two different materials are brought into contact, electrons will move from the material with the smaller work function to the material with the larger work function, causing the former to become positively charged and the latter to become negatively charged. This is the same phenomenon that generates static charge when you rub a balloon against your hair.

α-RuCl3 is unique among nanomaterials because it has an exceptionally high work function even when it is exfoliated down to one- or few-atom-thick 2D layers. Knowing this, the Columbia researchers created atomic-scale stacks consisting of graphene on top of α-RuCl3. As expected, electrons were removed from the graphene, making it highly conductive and able to host plasmon polaritons -- without the use of an external gate.

Using α-RuCl3 to charge graphene brings two main advantages over electrical gating. First, α-RuCl3 induces much greater charge than can be achieved with electrical gates, which are limited by breakdown of the insulating barrier with the graphene. Second, in gated devices the spacing between the graphene and the underlying gate electrode blurs the boundary between charged and uncharged regions due to "electric field fringing." This prevents realization of the sharp charge features within the graphene and along the graphene edge necessary to manifest novel plasmonic phenomena. In contrast, at the edge of the α-RuCl3, the charge in the graphene drops to zero on nearly the atomic scale.

"One of our major achievements in this work is attaining charge densities in graphene roughly 10 times larger than the limits imposed by dielectric breakdown in a standard gated device," said the study's lead PI Dmitri Basov, professor of physics. "Moreover, since the α-RuCl3--the source of electronic charge--is in direct contact with graphene, the boundaries between the charged and uncharged regions in the graphene are razor-sharp. This allows us to observe mirror-like plasmon reflection from these edges and to create historically elusive one-dimensional edge plasmons that propagate along the graphene edge." The team also observed sharp boundaries at "nano-bubbles," where contaminants trapped between the two layers disrupt charge transfer.

"We were very excited to see how abruptly the graphene charge density can change in these devices," said Daniel Rizzo, a postdoctoral research scientist with Basov and the lead author on the paper. "Our work is a proof-of-concept for nanometer charge control that was previously the realm of fantasy."

The work was carried out in the Energy Frontier Research Center on Programmable Quantum Materials funded by the United States Department of Energy and led by Basov. The research project used shared facilities operated by the Columbia Nano Initiative.

The researchers are now pursuing routes to use etched α-RuCl3 as a platform for generating custom nanoscale charge patterns in graphene to precisely tune the plasmonic behavior according to various practical applications. They also hope to demonstrate that α-RuCl3 can be interfaced with a wide range of 2D materials to access novel material behaviors that require the exceptionally high charge density imparted by interlayer charge transfer demonstrated in their manuscript.

Hone noted, "When our interlayer charge transfer technique is combined with existing procedures for patterning 2D substrates, we can easily generate tailor-made nanoscale charge patterns in graphene. This opens up a wealth of new opportunities for new electronic and optical devices."

Credit: 
Columbia University School of Engineering and Applied Science

Nanomaterials enable dual-mode heating and cooling device

image: A nanomaterial sheet can be used to either bounce heat away or absorb it. Here it is in heating mode (top) and cooling mode (bottom).

Image: 
Po-Chun Hsu, Duke Engineering

DURHAM, N.C. - Engineers at Duke University have demonstrated a dual-mode heating and cooling device for building climate control that, if widely deployed in the U.S., could cut HVAC energy use by nearly 20 percent.

The invention uses a combination of mechanics and materials science to either harness or expel certain wavelengths of light. Depending on conditions, rollers move a sheet back and forth to expose either heat-trapping materials on one half or cooling materials on the other. Specially designed at the nanoscale, one material absorbs the sun's energy and traps existing heat, while the other reflects light and allows heat to escape through the Earth's atmosphere and into space.

"I think we are the first to demonstrate a reversible thermal contact, which allows us to switch between the two modes for heating or cooling," said Po-Chun Hsu, assistant professor of mechanical engineering and materials science at Duke and leader of the team. "This allows the material to be movable while still maintaining a good thermal contact with the building to either bring heat in or let heat out."

The results appeared online November 30 in the journal Nature Communications.

About 15% of energy consumption in the U.S., and more than 30% globally, goes to heating and cooling buildings, which is responsible for about 10% of global greenhouse gas emissions. Yet, up to now, most approaches to minimizing that carbon footprint have addressed only heating or only cooling. That leaves the world's temperate climate zones - which require both heating and cooling during the year, sometimes within a single 24-hour period - out in the cold. In the new paper, Hsu and his team demonstrate a device that could potentially keep us either cozy or cool as the weather changes.

The specially designed sheet starts with a polymer composite as the base that can expand or contract by running electricity through it. This allows the device to maintain contact with the building for transmitting energy while still being able to disengage so that the rollers can switch between modes.

The cooling portion of the sheet has an ultra-thin silver film covered by an even thinner layer of clear silicone, which together reflect the sun's rays like a mirror. The unique properties of these materials also allow them to convert energy into mid-range infrared light and emit it; this light does not interact with the gases in the Earth's atmosphere and passes easily into outer space.

When a change in weather brings the need for heating, the electrical charge releases and the rollers pull the sheet along a track. This swaps the cooling, reflective half of the sheet for the heat-absorbing half.

To heat the building beneath, the engineers used an ultra-thin layer of copper topped by a layer of zinc-copper nanoparticles. By making the nanoparticles a specific size and spacing them a certain distance apart, they interact with the copper beneath them in a way that traps light onto their surface, allowing the material to absorb more than 93% of the sunlight's heat.

Hsu and his team see the device as something that could work with existing HVAC systems, rather than a full replacement.

"Instead of directly heating and cooling the building, we could use a water panel to take hot or cold water to a heat pump or boiler system," said Hus. "I also imagine that with additional engineering, this could also be used on walls, forming a sort of switchable building envelop." said Hsu.

Moving forward, the team is working on several aspects of the design to advance it from a prototype to one scalable for manufacturing. Among these, explained Hsu, are concerns about the long-term wear and tear of the moving parts and costs of the specialized materials. For example, they will investigate whether lower-cost aluminum can substitute for the silver and are also working on a static version that can switch modes chemically rather than mechanically.

Despite the many obstacles, Hsu believes this technology could be an energy-saving boon in the future. And he's not alone.

"We're already working with a company to determine the ideal locations for deploying this technology," said Hsu. "And because almost every climate zone in the United States requires both heating and cooling at some point throughout the year, the advantages of a dual-mode device such as this are obvious."

Credit: 
Duke University

Sensor can detect scarred or fatty liver tissue

CAMBRIDGE -- About 25 percent of the U.S. population suffers from fatty liver disease, a condition that can lead to fibrosis of the liver and, eventually, liver failure.

Currently there is no easy way to diagnose either fatty liver disease or liver fibrosis. However, MIT engineers have now developed a diagnostic tool, based on nuclear magnetic resonance (NMR), that could be used to detect both of those conditions.

"Since it's a noninvasive test, you could screen people even before they have obvious symptoms of compromised liver, and you would be able to say which of these patients had fibrosis," says Michael Cima, the David H. Koch Professor of Engineering in MIT's Department of Materials Science and Engineering, a member of MIT's Koch Institute for Integrative Cancer Research, and the senior author of the study.

The device, which is small enough to fit on a table, uses NMR to measure how water diffuses through tissue, which can reveal how much fat is present in the tissue. This kind of diagnostic, which has thus far been tested on mice, could help doctors catch fatty liver disease before it progresses to fibrosis, the researchers say.

MIT PhD recipient Ashvin Bashyam and graduate student Chris Frangieh are the lead authors of the paper, which appears today in Nature Biomedical Engineering.

Tissue analysis

Fatty liver disease occurs when liver cells store too much fat. This leads to inflammation and eventually fibrosis, a buildup of scar tissue that can cause jaundice, liver cirrhosis, and ultimately liver failure. Fibrosis is usually not diagnosed until the patient begins to experience symptoms, which include not only jaundice but also fatigue and abdominal swelling. A biopsy is needed to confirm the diagnosis, but this is an invasive procedure and may not be accurate if the biopsy sample is taken from a part of the liver that is not fibrotic.

To create an easier way to check for this kind of liver disease, Cima and his colleagues had the idea of adapting a detector that they had previously developed to measure hydration levels before and after patients undergo dialysis. That detector measures fluid volume in patients' skeletal muscle by using NMR to track changes in the magnetic properties of hydrogen atoms of water in the muscle tissue.

The researchers thought that a similar detector could be used for identifying liver disease because water diffuses more slowly when it encounters fatty tissue or fibrosis. Tracking how water moves through tissue over time can reveal how much fatty or scarred tissue is present.

"If you watch how the magnetization changes, you can model how fast the protons are moving," Cima says. "Those cases where the magnetization doesn't go away very fast would be ones where the diffusivity was low, and they would be the most fibrotic."

In a study of mice, the researchers showed that their detector could identify fibrosis with 86 percent accuracy, and fatty liver disease with 92 percent accuracy. It takes about 10 minutes to obtain the results, but the researchers are now working on improving the signal-to-noise ratio of the detector, which could help to reduce the amount of time it takes.

Early detection

The current version of the sensor can scan to a depth of about 6 millimeters below the skin, which is enough to monitor the mouse liver or human skeletal muscle. The researchers are now working on designing a new version that can penetrate deeper below the tissue, to allow them to test the liver diagnosis application in human patients.

If this type of NMR sensor could be developed for use in patients, it could help to identify people in danger of developing fibrosis, or in the early stages of fibrosis, so they could be treated earlier, Cima says. Fibrosis can't be reversed, but it can be halted or slowed down through dietary changes and exercise. Having this type of diagnostic available could also aid in drug development efforts, because it could allow doctors to more easily identify patients with fibrosis and monitor their response to potential new treatments, Cima says.

Another potential application for this kind of sensor is to evaluate human livers for transplant. In this study, the researchers tested the monitor on human liver tissue and found that it could detect fibrosis with 93 percent accuracy.

Credit: 
Massachusetts Institute of Technology

Drug attenuates weight gain in animals fed a high-fat diet

image: Microscopy image of enoxacin-stimulated adipocytes

Image: 
Danilo Ferrucci and Andréa Rocha/INFABiC/UNICAMP

A drug originally developed to treat bacterial infections has proved capable of boosting the metabolism and attenuating the weight gain induced by a fatty diet in tests with mice. The study was conducted at the University of Campinas (UNICAMP) with FAPESP's support, and the findings have just been published in the journal Science Advances.

According to Marcelo Mori, a professor at the university's Biology Institute (IB-UNICAMP) and principal investigator for the project, the treatment made the animals' white adipose tissue, which normally stores surplus energy as fat, behave like brown adipose tissue by burning calories to generate heat (thermogenesis). This effect, often called browning, was observed only when the animals were exposed to cold.

"The purpose of the study was to prove a principle, to show that it's possible to increase energy expenditure and reduce the harmful consequences of obesity by acting pharmacologically on a metabolic pathway of interest," Mori told Agência FAPESP. "We are not proposing use of this specific drug to combat weight gain. Our results point to ways of finding compounds that act on the same pathway more effectively and safely."

The substance used in the experiments was enoxacin, a broad-spectrum antibiotic that has not been used in clinical practice since the emergence of more effective drugs in the same class (fluoroquinolones) some years ago. According to Mori, it was chosen on the basis of studies showing that it can modulate the production of microRNAs in adipose tissue. MicroRNAs (or miRNAs) are short non-coding RNAs that regulate gene expression.

"Previous studies in animal models and humans indicated that interventions capable of extending longevity and improving metabolic health, such as physical exercise and caloric restriction, promote this benefit by increasing adipose tissue's capacity to process mirRNAs. In other words, these practices modulate the pathway responsible for miRNA biogenesis," Mori said. "We showed that it's possible to bring about a metabolic improvement in the organism by stimulating this same pathway with a drug. On the other hand, we showed that inhibiting miRNA biogenesis triggers processes that accelerate aging."

Laboratory experiments

Some of the experiments reported in the article involved cultures of human and mouse preadipocytes, cells that store fat when mature and become part of white adipose tissue. In the laboratory, cellular differentiation takes about eight days and can be induced by a cocktail of chemicals and hormones.

In one of the protocols the researchers added enoxacin to the differentiation cocktail, and at the end of the process compared the mature adipocytes with a control culture not exposed to enoxacin. The analysis showed that in white adipocytes the drug triggered expression of PPARGC1A and UCP1, genes that are significantly active in brown adipose tissue cells under normal circumstances. The proteins encoded by these two genes are considered molecular markers of thermogenesis. The conclusion was that enoxacin induced browning by reprogramming the white adipose cells to dissipate energy in the form of heat instead of storing it as fat.

"The protein UCP1 acts on mitochondria [energy-producing organelles] by uncoupling the electron transport chain used in the synthesis of ATP [adenosine triphosphate, a molecule that serves as fuel for cells>]. This makes the cells dissipate energy as heat," Mori said. "To understand the process better, we can compare it to the damming of a river to produce electricity. If holes are made in the dam, the difference in elevation that should be used to drive the turbine is dissipated without generating the desired energy."

In the second protocol tested in vitro, the differentiated adipocytes were exposed to enoxacin for 24 hours and then analyzed. The researchers found that the cells expressed the same markers of thermogenesis, showing that the white adipocytes now formed had also undergone browning.

Functional tests showed that both antibiotic treatment protocols led the white adipocytes to "breathe" more deeply, i.e. consume more oxygen than the untreated cells even under the same conditions.

Activated by cold

According to Mori, the experiments with mice replicated the thermogenic effect observed in vitro. In this case, the enoxacin was administered by intraperitoneal injection. Oral administration of an antibiotic could be harmful to gut microbiota and interfere with the results.

Several different protocols were tested. In one protocol, the researchers fed a group of rodents a high-fat diet for a month before starting the treatment with enoxacin. The mice were given the drug once a day from Monday to Friday for ten weeks, and the fatty diet was maintained throughout the period. At the end, the control animals (given placebo) displayed a weight gain of 13 grams (g) on average, while those treated with the antibiotic had put on 8 g.

"The weight gain curve began diverging between the two groups as soon as administration of the drug began," Mori said. "The control group's weight gain continued to rise strongly, while the treated group's curve became less steep. At the end we found the treatment also led to enhanced insulin sensitivity and glucose tolerance."

Analysis of adipose tissue evidenced increased expression of thermogenesis marker genes PPARGC1A and UCP1 in both white and brown adipose cells.

However, the beneficial effect on the metabolism was observed only when the mice were exposed to mild cold stress (about 24 °C) or severe cold stress (around 6 °C). When thermoneutrality was maintained (about 30 °C, the ideal temperature for these mice), enoxacin was not found capable of activating thermogenesis, indicating that the drug intensified the action of cold.

To find out whether the effect observed in the mice exposed to mild cold stress depended on the drug's action on gut bacteria, which are known to influence weight gain, the researchers repeated the experiment using germ-free mice bred under controlled laboratory conditions to be totally free of microbiota. Enoxacin also promoted thermogenesis in this experiment.

"The fact that the drug doesn't work in thermoneutrality reduces its potential for use as a sole treatment for obesity, but it can help us understand the mechanisms involved in the process or be used as an adjuvant," Mori said. "In any event, we expect to find a molecule with even more powerful thermogenic action and without the side-effects of an antibiotic."

Substances marketed for this purpose already exist, but according to Mori some excessively stimulate the adrenergic system, which regulates the fight-or-flight response to threats, and can cause cardiac arrhythmia and even a heart attack. Others are even more dangerous, chemically uncoupling the inner mitochondrial membrane and increasing cellular energy expenditure.

"The problem with current interventions is that they have systemic effects, acting on various cell types including even neurons and cardiomyocytes [heart muscle cells]," Mori said. "They can cause cell death and affect important processes in the organism."

Action mechanism

In the last stage of the investigation, the researchers sequenced all the miRNAs expressed in the adipose tissue of the treated mice as well as the control mice. The initial analysis showed that the treatment altered the expression of a few dozen miRNAs.

"We cross-tabulated our results with data from public repositories in search of miRNAs regulated by both enoxacin and other interventions that induce browning. We eventually arrived at miR-34a-5p, which is consistently modulated both by the drug and by caloric restriction and cold stress," Mori said.

Enoxacin may interfere in the processing and stability of miR-34a-5p, he added. Because miR-34a-5p suppresses signaling by hormones, including FGF21, which promotes energy expenditure, a reduction in miR-34a-5p intensifies the thermogenic stimulus.

Researchers at AstraZeneca collaborated with Mori's group in this investigation. On the basis of the findings, UNICAMP and the pharmaceutical company are testing molecules with similar structures to enoxacin but without antibacterial effects. In vitro experiments have had promising results, pointing to compounds with even stronger thermogenic action than the original molecule and without microbicidal effects.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Why does it matter if most Republican voters still think Biden lost?

image: Trump supporters and opponents differ most in their expressed levels of confidence in how votes were counted. Confidence plummeted at all levels among Trump supporters between Bright Line Watch's October survey and its November survey. Most notably, Trump supporters' confidence in the national vote count collapsed from 56 percent to 28 percent.

Image: 
Bright Line Watch

As President-elect Joe Biden and his administrative team officially begin the transition process, only about 20 percent of Republican voters consider him the true winner of the election. Nearly half of all respondents--48 percent--still expect President Donald Trump to be inaugurated for a second term on January 20, despite plenty of indicators to the contrary. Those are among the findings of the latest Bright Line Watch (November) survey--the political science research project of faculty at the University of Rochester, the University of Chicago, and Dartmouth College.

"We were really struck by how much Trump's baseless narrative of voter fraud to explain his loss to Biden has continued to shape the beliefs of his supporters after the election," says Gretchen Helmke, a professor of political science at the University of Rochester and a Bright Line Watch founder.

The watchdog group's latest survey shows a clear increase in the public's partisan divide over the legitimacy of the election compared to their previous survey in October. In the October survey, 33 percent of strong Trump supporters still said they would view a victorious Biden as definitely or probably the "rightful winner." Yet after the election, only 9 percent of strong Trump supporters saw Biden as the rightful winner. Indeed, 67 percent of that group now expressed certainty that Biden was "definitely not the rightful winner," the scientists found.

"Although it's typical that voters' optimism about things like the economy, or the direction of the country shift depending on whether their candidate wins the election, it's really striking just how pervasive the loss of confidence in elections has become among Trump's supporters in the wake of his defeat," says Helmke.

As they had done throughout the project, the group fielded two parallel surveys--one to political experts and one to a representative sample of the US population--between November 12 and 25.

Bright Line Watch asked the public about the legitimacy of the election results, their confidence that votes were cast and counted fairly, their beliefs about voter fraud, and their willingness to condone political violence. Meanwhile, the experts were asked to rate the likelihood of 23 scenarios related to the November election and the transition to a new administration that could lead to political crises. As in previous surveys, the team asked both groups to assess the quality of US democracy overall and to rate the performance on 30 distinct democratic principles.

Read the watchdog group's latest (November 2020) survey, "A Democratic Stress Test -- The 2020 Election and Its Aftermath."

Key findings

Public survey findings

Compared to before the election, confidence in the election process and the legitimacy of the outcome became much more polarized between Trump supporters and opponents.

Most notably, confidence in the national vote count plummeted among Trump supporters, declining from 56 percent before the election to 28 percent afterward.

Belief about voter fraud became even more polarized: larger majorities of Trump supporters now believe that fraud is rampant compared to before the election.

More encouragingly, the willingness to condone political violence declined slightly after the election.

Expert survey findings

Experts in the October survey correctly identified six nightmare scenarios that did transpire (among the eight they had forecast as most likely), but they generally overestimated the probability of outcomes seen as somewhat or not very likely (none of which actually took place).

Experts believe it's highly likely that Trump will continue to refuse to recognize Biden's presidency and to obstruct the transition.

Experts think it's highly likely that the president will take actions to protect himself and those around him from legal exposure after leaving office.

Experts regard problems with the Electoral College and the formal recognition of Biden's presidency as unlikely.

Large majorities of experts regard Trump's attacks on US elections and the press as serious or grave threats to American democracy. By contrast, experts do not consider mail balloting to pose a threat to democracy and are divided over Amy Coney Barrett's Supreme Court appointment.

"We found especially disconcerting that the polarization of beliefs about the legitimacy of election between Trump supporters and opponents, which was already extremely high beforehand, has only grown in the wake of the presidential election," says Helmke.

In the last poll conducted right before the November election, for example, Bright Line Watch found huge gaps between the two groups in terms of concern about voter fraud, but not about whether votes would be counted fairly at the national level. In the postelection survey, however, the gaps in perceived fraud were even larger, with roughly 80 percent of Trump supporters now expressing the belief that fraud of all types (stealing or tampering with ballots, pretending to be someone else, voting by non-US citizens, voting multiple times, or voting with another person's absentee ballot) was rampant in the 2020 election--having occurred thousands of times, or even millions of times. Yet, fewer than 20 percent of Trump opponents believe in this kind of fraud, the scientists found.

"The levels of fraud in which these respondents profess to believe are staggering and would require the complicity of thousands of local electoral officials and volunteers, including numerous Republicans and nonpartisan participants," the political scientists write. Yet, regardless of the fact that all claims of widespread fraud and electoral misconduct have been dismissed by judges in Pennsylvania, Michigan, Nevada, Arizona, and Georgia, the false narrative has become pervasive among Trump supporters, the team notes.

Additionally, the group saw gaps open up in the respondents' confidence that votes are counted fairly, says Helmke. For example, prior to the election, about 60 percent of both Trump opponents and approvers thought that votes would be counted as intended at the level of the national government. After the election, however, confidence increased to more than 80 percent among Trump opponents, but fell to about 25 percent among Trump approvers.

"I'm deeply worried about the divide in confidence in the election system that our data reveal," says Brendan Nyhan, professor of government at Dartmouth College and one of the founders of Bright Line Watch. "Far too many Trump supporters say they lack confidence in the national vote count and don't see Joe Biden as the rightful winner. Even if some of them are expressing their political viewpoint through their responses, those sentiments can be deeply damaging to our democracy, which relies on the losing side accepting the legitimacy of the outcome."

Credit: 
University of Rochester

Once-in-a-lifetime floods to become regular occurrences by end of century

image: Sea level rise scenarios with a 50% and a 5% chance of exceedance in the mid-twenty-first century (panels a and b) and the late-twenty-first century (panels c and d). Dark and light blue indicate flooded areas.

Image: 
Stevens Institute of Technology

Superstorm Sandy brought flood levels to the New York region that had not been seen in generations. Causing an estimated $74.1 billion in damages, it was the fourth-costliest U.S. storm, behind Hurricane Katrina in 2005 and hurricanes Harvey and Maria in 2017, according to the National Oceanic and Atmospheric Administration. Now, due to the impact of climate change, researchers at Stevens Institute of Technology have found that 100-year and 500-year flood levels could become regular occurrences for the thousands of homes surrounding Jamaica Bay, New York, by the end of the century.

The study, led by Reza Marsooli, assistant professor of civil, environmental and ocean engineering at Stevens, can help policymakers and the coastal municipality of Jamaica Bay make decisions on whether to apply coastal flood defenses or other planning strategies or policies for reducing future risk. It also provides an example of the extent of how coastal flooding will increase in the future across the New York region and other areas due to the impacts of climate change.

"While this study was specific to Jamaica Bay, it shows how drastic and costly of an impact that climate change will make," said Marsooli, whose work appears in the Nov. 26 issue of Climatic Change. "The framework we used for this study can be replicated to demonstrate how flooding in other regions will look by the end of the century to help them mitigate risk and best protect communities and assets in impacted areas."

Based on the anticipated greenhouse gas concentrations by the end of the 21st century, Marsooli and his co-author Ning Lin, from Princeton University, conducted high-resolution simulations for different scenarios to find the probability of different flood levels being reached, assuming emissions remain at a high level. They studied how sea level rise and changes in hurricane climatology would affect the area in the future through storm surge and wave hazards.

Marsooli and Lin found that the historical 100-year flood level would become a nine-year flood level by mid-century (2030-2050) and a one-year flood level by the late 21st century (2080-2100). The 500-year flood level, most recently reached by Superstorm Sandy, would become a 143-year flood level by mid-century and a four-year flood level by the end of the century. Additionally, sea level rise would produce larger waves, which could lead to further flood hazards such as erosion and damage to coastal infrastructure.
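To translate those return periods into everyday odds: by the standard textbook convention (not a construct of this study), a T-year flood has a 1/T chance of being exceeded in any given year, so the chance of seeing it at least once over N years is 1 - (1 - 1/T)^N.

```python
def chance_within(return_period_years, horizon_years):
    """Probability of at least one exceedance of a T-year flood within N years."""
    return 1 - (1 - 1 / return_period_years) ** horizon_years

HORIZON = 30  # e.g., the life of a typical mortgage
for label, T in [("historical 100-year", 100), ("late-century one-year", 1),
                 ("historical 500-year", 500), ("late-century four-year", 4)]:
    print(f"{label} flood: {chance_within(T, HORIZON):.0%} chance within {HORIZON} years")

# The historical 100-year flood has a ~26% chance in 30 years; its late-century
# one-year counterpart becomes a near-certainty every single year.
```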

"Future projections of the hurricane climatology suggest that climate change would lead to storms that move more slowly and are more intense than we have ever seen before hitting Jamaica Bay," said Marsooli. "But the increase in these once-in-a-generation or even less frequent floods is so dramatic because the impact of sea-level rise will create greater flooding, even if the storms we were seeing today stayed the same."

Credit: 
Stevens Institute of Technology

ETRI, DGIST develop new electrode structure for all-solid-state secondary battery

image: ETRI researchers are looking at a new type of electrode structure for all-solid-state secondary battery.

Image: 
Electronics and Telecommunications Research Institute (ETRI)

South Korean researchers have developed a new type of electrode structure for all-solid-state secondary batteries. If this technology is adopted, the energy density of the batteries could increase significantly when compared to existing technologies, contributing tremendously to the development of high-performance secondary batteries.

A joint research team from the Electronics and Telecommunications Research Institute (ETRI) and the Daegu Gyeongbuk Institute of Science and Technology (DGIST) announced that it had designed a new electrode structure for all-solid-state secondary batteries after identifying the mechanism of facile lithium-ion diffusion between active materials. The achievement received international recognition when the results were published in ACS Energy Letters, an international online academic journal specializing in the energy sector run by the American Chemical Society (ACS).

Unlike primary cells, which can be used only once, secondary batteries can be recharged and used repeatedly. With the recent advances in electronic devices, the importance of secondary battery technology to robots, electric cars, energy storage systems (ESS), and drones is growing year by year.

The all-solid-state secondary battery is a next-generation energy storage device that uses a solid electrolyte to transport ions within battery electrodes. Solid electrolytes are safer than liquid electrolytes, which can cause fires. Moreover, solid electrolytes can be used in a bipolar-type secondary cell to increase energy density through a simple battery configuration.

The electrode structure of a conventional all-solid-state secondary cell consists of a solid electrolyte responsible for ionic conduction; a conductive additive that provides the means for electron conduction; an active material responsible for storing energy; and a binder that holds these constituent parts together physically and chemically.

Through systematic experiments, however, ETRI researchers discovered that ions are transported even between graphite active material particles. They therefore proposed a new type of electrode structure for the all-solid-state secondary cell consisting of only the active material and the binder, and confirmed that even without a solid electrolyte additive within the electrodes, the performance of an all-solid-state secondary cell could be superior.

The theoretical feasibility of the novel structure proposed by ETRI was verified at DGIST through electrochemical simulations of a virtual model run on a supercomputer, and ETRI researchers then succeeded in demonstrating the structure in an actual experiment. ETRI named this technology the 'diffusion-dependent all-solid-state electrode' and submitted a paper to an international journal.

If ETRI's technology is adopted, solid electrolyte and conductive additives will become unnecessary in the electrode; instead, more active material can be squeezed into the same volume. In other words, the amount of active material in the electrode can increase to up to 98 wt%, and as a result the energy density can be made 1.5 times greater than that of the conventional graphite composite electrode.
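A back-of-envelope check of that 1.5x figure, assuming (our assumption, not ETRI's data) that electrode-level capacity scales roughly with the weight fraction of active material and that a conventional composite electrode is about 65 wt% graphite:

```python
GRAPHITE_CAPACITY_MAH_G = 372      # theoretical capacity of graphite (LiC6)

conventional_fraction = 0.65       # assumed graphite share of a composite electrode
electrolyte_free_fraction = 0.98   # active-material share reported for the new electrode

conventional = GRAPHITE_CAPACITY_MAH_G * conventional_fraction
electrolyte_free = GRAPHITE_CAPACITY_MAH_G * electrolyte_free_fraction
print(f"per gram of electrode: {conventional:.0f} vs {electrolyte_free:.0f} mAh")
print(f"gain: ~{electrolyte_free / conventional:.2f}x")  # ~1.51x, consistent with the quoted 1.5x
```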

The technology offers advantages in manufacturing as well. Sulfide-type solid electrolytes, which have high ionic conductivity and moderate plasticity, are regarded as excellent candidates for the fabrication of all-solid-state batteries. But because of their high chemical reactivity, sulfide-type solid electrolytes leave battery developers with very few options when it comes to solvents and binders. In contrast, with the new ETRI electrode, developers can freely select the type of solvent and binder to use in the battery, because the electrode contains no highly reactive solid electrolytes. This also permits researchers to pursue new approaches for improving the performance of all-solid-state secondary cells.

Dr. Young-Gi Lee, who was involved in this research, said, "We have revealed for the first time that ions can be diffused just with active materials. We are no longer bound to the structure used in existing all-solid-state secondary cells. We plan to develop secondary cells with even higher energy densities using this technology. We will also secure our rights to the core technology and work on a version that could be commercialized."

Although ETRI conducted its research using graphite as the active material, it intends to continue research based on the same concept using various other electrode materials. It is also planning to enhance the technology to increase efficiency, which can be accomplished by eliminating interfacial issues between electrodes and reducing electrode thickness.

Credit: 
National Research Council of Science & Technology

New glue sticks easily, holds strongly, and is a gas to pull apart

image: A temporary adhesive based on molecular solids is strong enough to hold a chemistry PhD candidate, but can be released without force through the use of heat in a vacuum.

Image: 
Photo courtesy of Nicholas Blelloch.

HANOVER, N.H. - December 2, 2020 - Temporary glues may not steal headlines, but they can make everyday life easier.

Sticky office notes, bandage strips and painter's tape are all examples of products that adhere to surfaces but can be removed with relative ease.

There's only one drawback. To remove any of those adhesives, the glued surfaces need to be pulled apart from each other.

Dartmouth research has discovered a class of molecular materials that can be used to make temporary adhesives that don't require force for removal. These non-permanent glues won't be available as home or office supplies, but they can lead to new manufacturing techniques and pharmaceutical design.

"This temporary adhesive works in an entirely different way than other adhesives," said Katherine Mirica, an assistant professor of chemistry at Dartmouth. "This innovation will unlock new manufacturing strategies where on-demand release from adhesion is required."

The Dartmouth research focuses on molecular solids, a special class of adhesive materials that exist as crystals. The molecules in the structures are sublimable, meaning that they shift directly from a solid to a gas without passing through a liquid phase.

The ability to bypass the liquid phase is the key to the new type of temporary adhesives. The adhesive sticks as a solid but then turns to a vapor and releases once it is heated in a vacuum environment.

"The use of sublimation--the direct transition from solid to vapor--is valuable because it offers gentle release from adhesion without the use of solvent or mechanical force," said Mirica.

Previous Dartmouth research was the first to identify how molecular solids can act as temporary adhesives. According to new research, published in the academic journal Chemistry of Materials, the class of molecules that can be used to make these new-generation materials is wider than previously thought.

"We've expanded the list of molecules that can be used as temporary adhesives," said Nicholas Blelloch, a PhD candidate at Dartmouth and first author of the paper. "Identifying more materials to work with is important because it offers expanded design strategies for bonding surfaces together."

The research team says the new temporary adhesives can be useful in technical applications such as semiconductor manufacturing and drug development.

When making computer chips, silicon components need to be temporarily bonded. A strong adhesive that releases through sublimation could allow the development of smaller, more powerful chips, since tapes that must be forcefully pulled off would no longer be needed.

In pharmaceuticals, the design principles highlighted through this work can help the development of smaller, faster-acting pills. The adhesives can also be helpful in the design of nano- and micromechanical devices where the use of tape is not possible.

The finding also gives researchers more flexibility in developing temporary adhesives.

"Identifying more molecules with adhesives properties refines our fundamental understating of the multi-scale and multi-faceted factors that contribute to the adhesive properties of the system," said Blelloch

Most common temporary adhesives used in the home or office are polymers: long chemical chains that create strong bonds but can be difficult to pull from surfaces.

If polymers can be described as long chemical strands that easily tangle, molecular solids are more like individual chemical beads that sit atop each other. Both can be made to adhere, but there are tradeoffs.

Polymers used to make super glues tangle so well that they form exceedingly strong bonds that are difficult to pull apart. Sticky office notes and painter's tape are also polymers, but with much less holding strength. They also require a peeling or ripping action to remove the bond.

The molecular solids being studied by the Dartmouth team can be as strong as temporary, polymer-based adhesives. The advantage of the new glues is that they not only adhere easily but can also be released without force and without disturbing the bonded surfaces.

Credit: 
Dartmouth College

Next step in simulating the universe

image: Figure

Image: 
University of Tsukuba

Tsukuba, Japan - Computer simulations have struggled to capture the impact of elusive particles called neutrinos on the formation and growth of the large-scale structure of the Universe. But now, a research team from Japan has developed a method that overcomes this hurdle.

In a study published this month in The Astrophysical Journal, researchers led by the University of Tsukuba present simulations that accurately depict the role of neutrinos in the evolution of the Universe.

Why are these simulations important? One key reason is that they can set constraints on a currently unknown quantity: the neutrino mass. If this quantity is set to a particular value in the simulations and the simulation results differ from observations, that value can be ruled out. However, the constraints can be trusted only if the simulations are accurate, which was not guaranteed in previous work. The team behind this latest research aimed to address this limitation.
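The constraint logic is simple to state in code. The toy sketch below is purely illustrative: it uses the standard linear-theory rule of thumb that massive neutrinos suppress small-scale matter power by roughly a factor of 8*f_nu (where f_nu is the neutrino fraction of the matter density), and the mock "observation", cosmological parameters, and 2-sigma cut are assumptions for exposition, not the team's actual pipeline.

    # Toy illustration: rule out neutrino masses whose predicted small-scale
    # power suppression disagrees with a mock observation. Illustrative only.
    import numpy as np

    OMEGA_M, H = 0.31, 0.67          # assumed matter density and Hubble parameter

    def suppression(sum_mnu_ev):
        """Linear-theory rule of thumb: small-scale power drops by ~8*f_nu,
        with Omega_nu * h^2 = sum(m_nu) / 93.14 eV."""
        f_nu = (sum_mnu_ev / 93.14) / (H**2 * OMEGA_M)
        return 1.0 - 8.0 * f_nu

    observed, sigma = 0.966, 0.01    # mock measured suppression and its uncertainty

    for mass in np.linspace(0.0, 0.4, 9):        # trial neutrino mass sums in eV
        chi2 = ((suppression(mass) - observed) / sigma) ** 2
        verdict = "allowed" if chi2 < 4.0 else "ruled out"   # ~2-sigma cut
        print(f"sum(m_nu) = {mass:.2f} eV -> {verdict}")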

"Earlier simulations used certain approximations that might not be valid," says lead author of the study Lecturer Kohji Yoshikawa. "In our work, we avoided these approximations by employing a technique that accurately represents the velocity distribution function of the neutrinos and follows its time evolution."

To do this, the research team directly solved a system of equations known as the Vlasov-Poisson equations, which describe how particles move in the Universe. They then carried out simulations for different values of the neutrino mass and systematically examined the effects of neutrinos on the large-scale structure of the Universe.
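For reference, the Vlasov-Poisson system evolves the phase-space distribution function f(x, v, t) of collisionless particles such as neutrinos under the self-consistent gravitational potential. In its schematic form (ignoring the cosmological expansion that the actual simulations account for):

\frac{\partial f}{\partial t} + \mathbf{v}\cdot\nabla_{\mathbf{x}} f - \nabla_{\mathbf{x}}\phi\cdot\nabla_{\mathbf{v}} f = 0, \qquad \nabla^{2}\phi = 4\pi G\rho,

where \rho is the total mass density. Because f lives in six-dimensional phase space, solving these equations directly - rather than sampling them with a finite number of particles - is what allows the velocity distribution to be followed accurately.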

The simulation results demonstrate, for example, that neutrinos suppress the clustering of dark matter--the 'missing' mass in the Universe--and in turn galaxies. They also show that neutrino-rich regions are strongly correlated with massive galaxy clusters and that the effective temperature of the neutrinos varies substantially depending on the neutrino mass.

"Overall, our findings suggest that neutrinos considerably affect the large-scale structure formation, and that our simulations provide an accurate account for the important effect of neutrinos," explains Lecturer Yoshikawa. "It is also reassuring that our new results are consistent with those from entirely different simulation approaches."

Credit: 
University of Tsukuba

Hydrogen-powered heavy duty vehicles could contribute significantly to achieving climate goals

A partial transition of German road transport to hydrogen energy is among the possibilities being discussed to help meet national climate targets. A team of researchers from the Institute for Advanced Sustainability Studies (IASS) has examined the hypothetical transition to a hydrogen-powered transport sector through several scenarios. Their conclusion: A shift towards hydrogen-powered mobility could significantly reduce greenhouse gas emissions and greatly improve air quality - in particular, heavy duty vehicles represent a low-hanging fruit for decarbonization of German road transport.

"Hydrogen fuel cell vehicles offer competitive advantages over battery electric vehicles regarding heavy loads, longer driving ranges and shorter fuelling times - making them particularly attractive to the heavy duty vehicle segment" explains lead author Lindsey Weger: "Moreover, transitioning heavy-duty vehicles to green hydrogen could already achieve a deep reduction in emissions - our results indicate a potential of -57 MtCO2eq annually, which translates to about a 7 percent drop in German greenhouse gas emissions for the current conditions".

Accordingly, heavy duty vehicles (which here include not only trucks but also commercial vehicles and buses) equipped with (green) hydrogen fuel cells are a possibility worth considering on the path to road transport decarbonization.

Road transport is a major source of emissions

Transport is one of the most emission-intensive sectors for both climate and air pollutants. In 2017, for example, Germany's transport sector accounted for 18.4 percent of CO2eq emissions, 96 percent of which derived from road traffic.

While Germany has successfully decreased its emissions considerably in most areas of the economy since 1990, little progress has been made in the transport sector, which is in large part responsible for Germany's failure to meet its target of a (lasting) 40 percent reduction in greenhouse gas emissions by 2020 compared to 1990 levels.

The major reasons for this are:

- the increasing kilometres travelled;

- the continued dominance of fossil fuels in transport;

- and high average vehicular CO2 emissions.

Due to extraordinary circumstances, including the countermeasures adopted to contain the Covid-19 pandemic, Germany is now set to meet its original 2020 emissions reduction target. However, this reduction is not expected to last: emissions from the transport sector had almost returned to their pre-pandemic levels by mid-June 2020.

Green hydrogen: a key to reducing emissions

The overall emissions impact depends on the method of hydrogen production. According to the analysis, a hypothetical full transition to hydrogen-powered road traffic would change annual emissions by between -179 and +95 MtCO2eq. The greatest reduction is afforded by green hydrogen production (i.e., zero-carbon hydrogen based on renewable-powered water electrolysis), while the greatest increase results from electrolysis using the current, fossil fuel-intensive electricity mix. Green hydrogen in particular could therefore contribute significantly towards achieving Germany's future greenhouse gas emission reduction targets.

The green hydrogen scenario also promises to deliver the largest reduction in air pollutants - up to 42 percent for NMVOCs, NOx and CO - compared to emissions from the German energy sector under current conditions. However, producing hydrogen with the current, fossil fuel-intensive electricity mix would increase emissions of some pollutants or have only a minimal effect (i.e., no benefit).

Transitioning only heavy duty vehicles to green hydrogen would already deliver a large reduction in emissions (-57 MtCO2eq). "According to our calculations, if only the HDV segment were to undergo this transition, we would already get nearly a third of the total possible reduction, with only one third of the hydrogen demand that would be needed to fuel the entire vehicle fleet - a clear low-hanging fruit," says Weger. In conclusion, the authors argue that commercial and large vehicles powered by hydrogen could make a rapid and substantial contribution to Germany's overall emissions reduction.
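The arithmetic behind these headline figures is easy to check. The short sketch below uses the -57 and -179 MtCO2eq values quoted above, plus a ballpark figure of 850 MtCO2eq for Germany's total annual greenhouse gas emissions (an assumption for illustration, not a number from the study):

    # Back-of-the-envelope check of the quoted shares. Illustrative only.
    HDV_REDUCTION = 57.0           # MtCO2eq/yr saved by switching heavy duty vehicles
    FULL_FLEET_REDUCTION = 179.0   # MtCO2eq/yr saved by a full green-hydrogen transition
    GERMAN_TOTAL = 850.0           # MtCO2eq/yr, assumed ballpark for national emissions

    print(f"share of full transition: {HDV_REDUCTION / FULL_FLEET_REDUCTION:.0%}")  # ~32%, "nearly a third"
    print(f"share of national total:  {HDV_REDUCTION / GERMAN_TOTAL:.0%}")          # ~7% drop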

Background information on hydrogen:

Hydrogen is a non-toxic, colourless, and odourless gas. It has been safely produced for decades and is used in industry and space research. Hydrogen has the highest energy density by mass among conventional fuels (although not by volume at standard atmospheric pressure) and, crucially, hydrogen refuelling infrastructure is comparable to that used for conventional road fuels.

In addition, hydrogen can be produced from a wide range of energy forms, including renewable electricity. It can be easily stored, compressed or liquefied either in pure form, mixed with natural gas, or bound with larger molecules. Hydrogen is easily transported by pipeline, truck, or ship. It can be safely used to fuel vehicles and is in many respects even safer than petrol and diesel.

Credit: 
Research Institute for Sustainability (RIFS) – Helmholtz Centre Potsdam

Ultrasensitive transistor for herbicide detection in water

image: University of Tokyo researchers have fabricated a tiny electronic sensor that can detect very low levels of a commonly used weed killer in drinking water.

Image: 
Institute of Industrial Science, the University of Tokyo

A new polymer-based, solid-state transistor can more sensitively detect a weed killer in drinking water than existing hydrogel-based fluorescence sensor chips. The details were published in Chemistry-A European Journal.

The sensor is a specially designed organic thin-film transistor based on semiconducting molecules of carboxylate-functionalized polythiophene (P3CPT). What's special about this particular device is that, unlike other conjugated polymer-based sensors, this one is a solid-state device that can conduct an electric current when placed inside a fluid.

The device, designed by Tsuyoshi Minami of The University of Tokyo's Institute of Industrial Science and colleagues, works by adding copper ions, which bind to the P3CPT molecules. When the device is placed in water that contains even the smallest amount of the herbicide glyphosate, the copper ions leave the P3CPT molecules to attach to the glyphosate molecules. This causes a detectable reduction in the flow of electric current through the device. Glyphosate is a commonly used weed killer in agriculture, and there are concerns that its presence in drinking water can be harmful to human health.

The scientists found that the device could detect glyphosate at concentrations as low as 0.26 parts per million in drinking water. The team compared their new device to a conventional fluorescence sensor chip, which was only capable of detecting down to 0.95 parts per million. To put this into perspective, the maximum amount of glyphosate allowed in drinking water by the United States Environmental Protection Agency is 0.7 parts per million.
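Those three numbers tell the practical story: only a sensor whose detection limit sits below the regulatory maximum can flag water approaching the legal limit. The snippet below simply makes that comparison explicit; the sensor labels are descriptive, not identifiers from the study:

    # Compare reported detection limits (ppm) with the EPA maximum for glyphosate.
    EPA_LIMIT_PPM = 0.7

    sensors = {
        "P3CPT transistor (new)": 0.26,
        "fluorescence chip (conventional)": 0.95,
    }

    for name, detection_limit in sensors.items():
        usable = detection_limit < EPA_LIMIT_PPM   # can it see below the legal maximum?
        print(f"{name}: limit {detection_limit} ppm -> "
              f"{'can' if usable else 'cannot'} detect below the EPA limit")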

The scientists believe that the sensitivity of their device boils down to interactions occurring within individual polymer molecules and between neighbouring ones. Commonly used fluorescence sensors depend solely on interactions occurring within individual molecules.

"Our device could be a novel solid-state platform for sensing target molecules in aqueous media," says Minami. The researchers are currently working on further developing their polythiophene-based sensors.

Credit: 
Institute of Industrial Science, The University of Tokyo