Tech

Tel Aviv University scientists develop novel nano-vaccine for melanoma

Researchers at Tel Aviv University have developed a novel nano-vaccine for melanoma, the most aggressive type of skin cancer. Their innovative approach has so far proven effective in preventing the development of melanoma in mouse models and in treating primary tumors and metastases that result from melanoma.

The focus of the research is on a nanoparticle that serves as the basis for the new vaccine. The study was led by Prof. Ronit Satchi-Fainaro, chair of the Department of Physiology and Pharmacology and head of the Laboratory for Cancer Research and Nanomedicine at TAU's Sackler Faculty of Medicine, and Prof. Helena Florindo of the University of Lisbon while on sabbatical at the Satchi-Fainaro lab at TAU; it was conducted by Dr. Anna Scomparin of Prof. Satchi-Fainaro's TAU lab and postdoctoral fellow Dr. João Conniot. The results were published on August 5 in Nature Nanotechnology.

Melanoma develops in the skin cells that produce melanin or skin pigment. "The war against cancer in general, and melanoma in particular, has advanced over the years through a variety of treatment modalities, such as chemotherapy, radiation therapy and immunotherapy; but the vaccine approach, which has proven so effective against various viral diseases, has not materialized yet against cancer," says Prof. Satchi-Fainaro. "In our study, we have shown for the first time that it is possible to produce an effective nano-vaccine against melanoma and to sensitize the immune system to immunotherapies."

The researchers harnessed tiny particles, about 170 nanometers in size, made of a biodegradable polymer. Within each particle, they "packed" two peptides -- short chains of amino acids, which are expressed in melanoma cells. They then injected the nanoparticles (or "nano-vaccines") into a mouse model bearing melanoma.

"The nanoparticles acted just like known vaccines for viral-borne diseases," Prof. Satchi-Fainaro explains. "They stimulated the immune system of the mice, and the immune cells learned to identify and attack cells containing the two peptides -- that is, the melanoma cells. This meant that, from now on, the immune system of the immunized mice will attack melanoma cells if and when they appear in the body."

The researchers then examined the effectiveness of the vaccine under three different conditions.

First, the vaccine proved to have prophylactic effects. The vaccine was injected into healthy mice, and an injection of melanoma cells followed. "The result was that the mice did not get sick, meaning that the vaccine prevented the disease," says Prof. Satchi-Fainaro.

Second, the nanoparticle was used to treat a primary tumor: A combination of the innovative vaccine and immunotherapy treatments was tested on melanoma model mice. The synergistic treatment significantly delayed the progression of the disease and greatly extended the lives of all treated mice.

Finally, the researchers validated their approach on tissues taken from patients with melanoma brain metastases. This suggested that the nano-vaccine can be used to treat brain metastases as well. Mouse models with late-stage melanoma brain metastases had already been established following excision of the primary melanoma lesion, mimicking the clinical setting. Research on image-guided surgery of primary melanoma using smart probes was published last year by Prof. Satchi-Fainaro's lab.

"Our research opens the door to a completely new approach -- the vaccine approach -- for effective treatment of melanoma, even in the most advanced stages of the disease," concludes Prof. Satchi-Fainaro. "We believe that our platform may also be suitable for other types of cancer and that our work is a solid foundation for the development of other cancer nano-vaccines."

Credit: 
American Friends of Tel Aviv University

Long-lasting effects of ironwork on mammal distributions over the last millennium

image: Flying squirrels are among the mammals estimated to be most highly impacted by ironwork. (Photo by Hisashi Yanagawa)

Image: 
NIES

Awareness is growing among scientists about the significance of pre-modern anthropogenic impacts prior to the Industrial Revolution on present-day patterns of biodiversity. In particular, pre-modern energy-intensive industries, such as ironwork, of the sort depicted in the 1997 anime film Princess Mononoke directed by Hayao Miyazaki, were major drivers of ecosystem alteration and have had long-lasting impacts on the distributions of many species. However, the phenomenon remains insufficiently studied and the empirical evidence is quite limited.

Millennial-scale effects of past energy-intensive anthropogenic activities are the subject of a new study led by two Japanese researchers from the National Institute for Environmental Studies and Obihiro University of Agriculture and Veterinary Medicine, published in Scientific Reports.

The researchers used a statistical framework to estimate the impact of pre-modern ironwork during four historical periods in the last millennium on the current distributions of 29 mammalian genera native to Japan, taking into account other potential factors such as paleoclimate and modern-day land use. Past ironwork impacts were quantified using site records from a national archaeological database.

The current distributions of 21 of the 29 mammalian genera were significantly affected by past ironwork activities. In particular, the impacts of ironwork in the Kofun period (about 1700-1300 years ago), when iron production first began in Japan, were significant for 13 genera. Medium-to-large mammals, such as the fox and wild boar, responded positively to the impacts of ironwork, but small mammals, such as the flying squirrel and dormouse, were negatively impacted in many different historical periods. The difference in response between small and medium-to-large mammals could be explained by traits related to body size, such as dispersal ability and habitat generalism, which are important for survival in a disturbed, heterogeneous landscape.
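The article does not reproduce the statistical framework itself; as a rough illustration of the kind of analysis described above (a genus's current presence modeled against historical ironwork intensity together with climate and land-use covariates), the following sketch uses synthetic data and invented variable names:

```python
# Hypothetical sketch: relate current presence/absence of a mammalian genus to
# past ironwork intensity while controlling for climate and land use.
# This is NOT the paper's model; the data and variable names are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_cells = 500  # grid cells across the study area

# Synthetic covariates per grid cell
kofun_ironwork = rng.poisson(2.0, n_cells)       # archaeological site counts, Kofun period
paleo_temp     = rng.normal(10.0, 3.0, n_cells)  # paleoclimate proxy (degrees C)
modern_landuse = rng.uniform(0.0, 1.0, n_cells)  # fraction of developed land today

X = np.column_stack([kofun_ironwork, paleo_temp, modern_landuse])

# Synthetic response: presence (1) / absence (0) of a small forest-dwelling genus,
# generated here so the example runs; the negative ironwork coefficient mimics the
# reported pattern for small mammals such as flying squirrels.
logit = -0.6 * kofun_ironwork + 0.2 * paleo_temp - 1.5 * modern_landuse
presence = (rng.uniform(size=n_cells) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression().fit(X, presence)
print("coefficients (ironwork, paleoclimate, land use):", model.coef_[0])
```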

"Ironwork brought long-term environmental change in multiple ways," says lead author Keita Fukasawa of the National Institute for Environmental Studies. "It required large quantities of charcoal, and the mountains around ironworking sites were often stripped bare due to intensive logging. Moreover, mining of iron sand resulted in soil erosion, which sometimes led to irreversible habitat degradation for small mammals dwelling in old-growth forests. However, such habitat alterations also contributed to the development of the traditional rural landscape in Japan, called satoyama, which consists of patches of various types of habitats such as grassland and secondary forest, which are suitable for medium-to-large mammals."

"Today in Japan , iron production relies on imported iron ore and fossil fuels, so the exploitation of domestic resources for iron production has ended. On a global scale, however, over-exploitation of firewood and mining remain drivers of biodiversity loss. Studies examining the long-lasting effects of pre-industrial Anthropogenic activities on biodiversity will offer insights into the historical background to macro-ecological patterns and provide practical knowledge for the development of sustainable societies in the Anthropocene that mitigate impacts on ecosystems. For example, if we can identify species that may be negatively impacted over the long term by the exploitation of a specific resource, it will help us to establish appropriate zoning for conservation and resource use."

Credit: 
National Institute for Environmental Studies

How light steers electrons in metals

image: This is an illustration of the setup and the interaction of a short laser pulse (red oscillating line) with the lattice of titanium atoms (centre, lower half of figure). The red and blue structures represent the redistribution of the electron density in the vicinity of a titanium atom. A close-up of this change in density is shown on the bottom right.

Image: 
ETH Zurich/D-PHYS Ultrafast Laser Physics group

The distribution of electrons in transition metals, which represent a large part of the periodic table of chemical elements, is responsible for many of their interesting properties used in applications. The magnetic properties of some of the members of this group of materials are, for example, exploited for data storage, whereas others exhibit excellent electrical conductivity. Transition metals also have a decisive role for novel materials with more exotic behaviour that results from strong interactions between the electrons. Such materials are promising candidates for a wide range of future applications.

In their experiment, whose results they report in a paper published today in Nature Physics, Mikhail Volkov and colleagues in the Ultrafast Laser Physics group of Prof. Ursula Keller exposed thin foils of the transition metals titanium and zirconium to short laser pulses. They observed the redistribution of the electrons by recording the resulting changes in optical properties of the metals in the extreme ultraviolet (XUV) domain. In order to follow the induced changes with sufficient temporal resolution, XUV pulses with a duration of only a few hundred attoseconds (10^-18 s) were employed in the measurement. By comparing the experimental results with theoretical models, developed by the group of Prof. Angel Rubio at the Max Planck Institute for the Structure and Dynamics of Matter in Hamburg, the researchers established that the change unfolding in less than a femtosecond (10^-15 s) is due to a modification of the electron localization in the vicinity of the metal atoms. The theory also predicts that in transition metals with more strongly filled outer electron shells an opposite motion -- that is, a delocalization of the electrons -- is to be expected.

Ultrafast control of material properties

The electron distribution defines the microscopic electric fields inside a material, which not only hold a solid together but also determine its macroscopic properties to a large extent. By changing the distribution of electrons, one can thus steer the characteristics of a material as well. The experiment of Volkov et al. demonstrates that this is possible on time scales that are considerably shorter than the oscillation cycle of visible light (around two femtoseconds). Even more important is the finding that these time scales are much shorter than the so-called thermalization time, the time within which the electrons would wash out the effects of an external control of the electron distribution through collisions between themselves and with the crystal lattice.
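The "around two femtoseconds" figure follows directly from the wavelength of visible light; for example, taking red light at roughly 600 nm:

```latex
T = \frac{\lambda}{c} \approx \frac{600 \times 10^{-9}\,\mathrm{m}}{3.0 \times 10^{8}\,\mathrm{m\,s^{-1}}} \approx 2 \times 10^{-15}\,\mathrm{s} = 2\,\mathrm{fs}
```

A change unfolding in a few hundred attoseconds is therefore complete well within a single optical cycle.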

Initial surprise

Initially, it came as a surprise that the laser pulse would lead to increased electron localization in titanium and zirconium. A general trend in nature is that if bound electrons are provided with more energy, they become less localized. The theoretical analysis, which supports the experimental observations, showed that the increased localization of the electron density is a net effect resulting from the stronger filling of the characteristic partially filled d-orbitals of the transition-metal atoms. For transition metals whose d-orbitals are already more than half filled (that is, elements further toward the right in the periodic table), the net effect is the opposite, corresponding to a delocalization of the electronic density.

Towards faster electronic components

While the result now reported is of fundamental nature, the experiments demonstrate the possibility of a very fast modification of material properties. Such modulations are used in electronics and opto-electronics for the processing of electronic signals or the transmission of data. While present components process signal streams with frequencies in the gigahertz (10^9 Hz) range, the results of Volkov and co-workers indicate the possibility of signal processing at petahertz frequencies (10^15 Hz). These rather fundamental findings might therefore inform the development of the next generations of ever-faster components, and through this indirectly find their way into our daily life.

Credit: 
ETH Zurich Department of Physics

Twelve centuries of European summer droughts

image: These are maps showing decadal correlation during the 20th century between instrumental measurements of temperature and precipitation (left), tree-ring reconstructed temperature and drought (middle), and model-simulated temperature and precipitation (right) for the summer season. The stronger the red colour, the more positive (warm = wet) the correlation; the stronger the blue colour, the more negative (warm = dry) the correlation.

Image: 
Fredrik Charpentier Ljungqvist

An international team of researchers has published a study exploring the association between summer temperature and drought across Europe, placing recent droughts in the context of the past 12 centuries. The study reveals that, throughout history, northern Europe has tended to get wetter and southern Europe drier during warmer periods. The researchers also observe that recent changes in drought patterns are not yet unprecedented, and they emphasise that continuing to improve understanding of the relationship between summer heat and drought is critical to projecting flood and drought risks.

The new study, published in Environmental Research Letters, explores the relationship between summer temperature and drought using weather measurements going back to the 18th century and tree-ring reconstructions of temperature and drought going back to the 9th century. The team then compared the picture of past temperature and drought, revealed by the tree-ring records, to simulations from the same climate models that are used to predict future climate.

This comparison revealed that the climate model simulations show too strong a relationship between warm and dry summers, and do not capture the fact that a large part of Europe has received more precipitation, not less, when it has been warm over the past 12 centuries.
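The quantity underlying this comparison is essentially a decadal-scale correlation between summer temperature and moisture, as in the maps above (red for warm = wet, blue for warm = dry). A minimal sketch of such a calculation, using made-up series in place of the instrumental, tree-ring, and model data, might look like this:

```python
# Minimal sketch of a decadal warm-vs-wet correlation, the quantity compared across
# instrumental, tree-ring, and model data in the study. The series here are synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
years = pd.RangeIndex(1900, 2000, name="year")

summer_temp = pd.Series(rng.normal(0, 1, len(years)), index=years)
# A wetter-when-warm region would have precipitation positively tied to temperature.
precip = 0.5 * summer_temp + pd.Series(rng.normal(0, 1, len(years)), index=years)

# Average each series over non-overlapping decades, then correlate.
decade = np.asarray(years) // 10 * 10
temp_dec = summer_temp.groupby(decade).mean()
precip_dec = precip.groupby(decade).mean()

r = temp_dec.corr(precip_dec)
print(f"decadal temperature-precipitation correlation: {r:+.2f}")
# Positive r corresponds to the red (warm = wet) areas in the maps,
# negative r to the blue (warm = dry) areas.
```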

Project leader Dr. Fredrik Charpentier Ljungqvist, Associate Professor at Stockholm University, said these new findings are important because they show for the first time that the relationship between summer temperature and drought found in modern weather measurements has persisted for at least 12 centuries. "We can also see that the wetting trend in northern Europe, and the drying trend in southern Europe, during the 20th century are not unprecedented over this time perspective," he said.

Going on to discuss the climate model results, Dr. Ljungqvist said: "Crucially, our study shows that the very strong link between warm and dry periods being simulated in the climate models could be too simple. It's not a picture backed up by the weather records and tree-ring data. The climate model simulations seem to underestimate how large a part of Europe actually experiences wetter summers when the climate is warmer."

"Our study implies a possible exaggeration in the climate models of temperature-driven drought risk in parts of northern Europe under global warming. But this also means that the models may well underestimate future excessive precipitation, with associated flood risks, in northern Europe," continues Dr. Ljungqvist.

Credit: 
Stockholm University

A novel robotic jellyfish able to perform 3D jet propulsion and maneuvers

video: Jellyfish use jet propulsion to move through the water and have proven to be among the most energetically efficient swimmers on the planet. Their movements have therefore attracted significant interest over the past decade in the context of bioinspired underwater vehicles. Now researchers in Beijing have developed a novel robotic jellyfish able to perform vertical and horizontal jellyfish-like propulsion and maneuvers.

Image: 
©Science China Press

Aquatic creatures such as fish, cetaceans, and jellyfish can inspire innovative designs that improve the ways man-made systems operate in and interact with aquatic environments. Jellyfish propel themselves through their surroundings by radially expanding and contracting their bell-shaped bodies to push water behind them, a mechanism called jet propulsion.

Contrary to the prevailing view of jellyfish as inefficient swimmers, they have proven to be among the most energetically efficient swimmers on the planet, meaning that jellyfish-like swimming offers a remarkable propulsive advantage wherever low-energy propulsion is required. The movements of jellyfish have therefore attracted significant interest over the past decade in the context of bioinspired underwater vehicles.

Recently, researchers from the Institute of Automation, Chinese Academy of Sciences in Beijing have developed a novel robotic jellyfish able to perform three-dimensional jellyfish-like propulsion and maneuvers using a reinforcement learning-based control method.

Combining the latest advances in mechatronic design, materials, electronics, and control methods, researchers have made an integrated effort to develop smart actuators for fabricating robotic jellyfish. In general, such robots are tethered and much slower than those actuated by conventional electric motors. Moreover, most existing robotic jellyfish cannot freely adjust their three-axis attitude, which hampers free-swimming propulsion and limits practical applications.

To solve this problem, the research group led by Prof. Junzhi Yu at the Institute of Automation, Chinese Academy of Sciences investigated how to design and control a bioinspired, motor-driven, jellyfish-like robotic system capable of 3D motion.

The robotic jellyfish is modeled after Aurelia aurita (commonly called the moon jellyfish), which has a relatively large displacement and is especially suited to carrying large payloads. The robot is about 138 mm in height and weighs about 8.2 kg. As illustrated in Figure 1, it is hemispherical in shape and consists of a bell-shaped rigid head, a cylindroid main cavity, four separate six-bar linkage mechanisms, and a soft rubber skin. To enhance maneuverability, a barycenter adjustment mechanism is assembled inside the cavity: by shifting two clump weights vertically, horizontally, or in a combination of the two, the robot regulates its attitude.

"It is very hard to establish a precise dynamic model for jellyfish-like swimming, since it is a highly nonlinear, strong coupling, and time-varying system." said by Prof. Junzhi Yu. "Parametric uncertainties and external disturbances in dynamic aquatic environments, at the same time, cause difficulty in deriving control laws by solving the inverse kinematics problem." Therefore, a reinforcement learning based closed-loop attitude control method is proposed for the robotic jellyfish, which can solve optimal decision control problem through direct interaction with the environment, particularly without the need for dynamic modeling.

Finally, the reinforcement learning-based attitude control method makes autonomous attitude regulation possible. "In comparison with most other robotic jellyfish, the built robot displays a high degree of structural flexibility and yaw maneuverability," pointed out Prof. Junzhi Yu. He also stressed that this self-propelled robotic jellyfish with 3D motion has great implications for the bioinspired design of highly agile jet-propulsion systems.

Credit: 
Science China Press

Accelerating development of STT-MRAM

image: Center for Innovative Integrated Electronic Systems (CIES) at Tohoku University

Image: 
Tohoku University

Researchers at the Center for Innovative Integrated Electronic Systems (CIES) at Tohoku University have successfully observed microscopic chemical bonding states in ultrathin MgO - an important determinant of STT-MRAM performance. The observation was carried out via angle-resolved hard X-ray photoelectron spectroscopy (AR-HAXPES) in collaboration with the Japan Synchrotron Radiation Research Institute (JASRI) at its SPring-8 synchrotron radiation facility.

STT-MRAM, a non-volatile memory, has been intensively researched and developed because of its high performance and low power consumption. STT-MRAM contains magnetic tunnel junctions (MTJs) as its integrated memory elements. An ultrathin MgO film is used as the tunneling barrier of the MTJ, making it a dominant determinant of STT-MRAM performance. It is therefore important to understand the microscopic characteristics of MgO and, in particular, its chemical bonding state.

Researchers at Tohoku University led by Prof. Tetsuo Endoh, director of CIES, and Dr. Tetsuya Nakamura, group leader at JASRI, have successfully observed the chemical bonding state throughout the entire ultrathin MgO layer by means of AR-HAXPES at SPring-8, the world's largest synchrotron radiation facility.

Figure 1 shows the sample structure used in this study. It is the simplest MTJ stack in which the ultra-thin MgO (0.8 nm) is sandwiched between CoFeB films. The chemical bonding state of the ultra-thin MgO in this study was evaluated according to film thickness direction.

Figure 2 shows that the microscopic chemical bonding state of the MgO changes along the film thickness direction. This result shows that the bonding state of MgO, usually considered homogeneous along the film thickness direction, actually changes depending on the distance from the interface.

The successful observation of the ultrathin MgO layer chemical bonding state will lead to an improvement of MgO quality. This in turn will accelerate STT-MRAM development.

Accordingly, a new synchrotron radiation facility (SLiT-J) is now under construction at Tohoku University's Aobayama campus in conjunction with relevant industries. The facility will allow for a better understanding of the microscopic features of lighter elements and will hopefully lead to further prosperity for the industries involved.

Credit: 
Tohoku University

Polyoxometalate-based coordination frameworks for CH4 generation in photoreduction of CO2

image: POMCF for converting CO2 into CH4 in photoreduction system

Image: 
©Science China Press

Excessive CO2 emissions from the continuous burning of fossil fuels have caused global warming and other environmental problems. Artificial conversion of excess CO2 into usable energy products is an important pathway toward sustainable development. Solar-driven photocatalytic reduction of CO2 to carbon-neutral fuels (CO, CH4) and/or value-added chemicals (HCOOH, CH3OH) offers a feasible strategy for this conversion, and implementing it can mitigate the greenhouse effect and the energy crisis simultaneously. However, activating the CO2 molecule is particularly difficult because of its intrinsic chemical inertness and the high cleavage enthalpy of its C=O bonds. To circumvent the highly negative equilibrium potential (versus NHE) of the thermodynamically unfavourable CO2•- intermediate, proton-assisted multi-electron reduction products, including chemicals and/or hydrocarbons, are commonly targeted so as to lower the activation energy of photocatalytic CO2 conversion. Even so, forming products that require the transfer of many protons and electrons still means surmounting a considerable kinetic barrier, and competitive H2 evolution further increases the difficulty of obtaining the target product selectively. For instance, the photosynthesis of CH4, one of the most desirable and valuable hydrocarbon fuels in photoreduction systems, has remained a grand challenge, since accomplishing the eight-electron transfer process requires the photocatalyst to offer both strong reducing capability and, in theory, a sufficient supply of electrons.
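For orientation, the eight-electron process referred to here is the proton-coupled reduction of CO2 to methane, which competes with lower-order products and with hydrogen evolution; the commonly quoted half-reactions and approximate potentials (vs. NHE at pH 7, given here for context rather than taken from the paper) are:

```latex
\begin{aligned}
\mathrm{CO_2} + 8\mathrm{H^+} + 8e^- &\rightarrow \mathrm{CH_4} + 2\mathrm{H_2O} & E^\circ &\approx -0.24\ \mathrm{V}\\
\mathrm{CO_2} + 2\mathrm{H^+} + 2e^- &\rightarrow \mathrm{CO} + \mathrm{H_2O}    & E^\circ &\approx -0.53\ \mathrm{V}\\
2\mathrm{H^+} + 2e^- &\rightarrow \mathrm{H_2}                                   & E^\circ &\approx -0.41\ \mathrm{V}
\end{aligned}
```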

The researchers reasoned that polyoxometalate (POM)-based coordination frameworks (POMCFs), with their well-known structural stability and favorable catalytic performance, should be well suited to photocatalytic CO2 reduction because of the synergistic effect arising from the integration of POM and MCF components. In particular, the Zn-ε-Keggin cluster, a member of the PMo12 "electron sponge" family containing eight MoV atoms, can act as a strong reducing component and, in theory, contribute eight electrons. The Zn-ε-Keggin cluster is a tetrahedral node formed by four Zn(II) ions trapped in the ε-Keggin (PMo12) unit. Unlike most anionic POMs, the Zn-modified ε-Keggin is a cationic cluster, which favors coordination with organic ligands. Consequently, combining the reductive POM cluster with a porphyrin derivative to build a POMCF that provides both visible-light harvesting and photo-excited electron migration would be a promising strategy for selectively photoreducing CO2 to multi-electron reduction products.

Therefore, we developed two POMCFs, NNU-13 and NNU-14, built from the reductive Zn-ε-Keggin cluster and a visible-light-responsive TCPP linker. These POMCFs exhibit high photocatalytic CH4 selectivity (> 96%) and activity that far surpass those of many MCF-based photocatalysts. Theoretical calculations revealed that the photo-generated carriers of the valence and conduction bands are mostly distributed on the TCPP group and the Zn-ε-Keggin cluster, respectively, and that photo-excited electrons flow readily to the POM site through efficient coupling between the reductive Zn-ε-Keggin unit and the TCPP linker. Notably, introducing POM building blocks with potent reducing ability not only endows NNU-13 and NNU-14 with favorable structural rigidity, but also promotes photocatalytic CH4 selectivity by supplying, in principle, the electrons needed to accomplish the eight-electron reduction of the CO2 molecule. We expect that this approach, assembling a strongly reducing component into a visible-light-sensitized photocatalyst architecture, will spur the construction of efficient POMCF photocatalysts for the highly selective reduction of CO2 to CH4 or other high-value hydrocarbons.

Credit: 
Science China Press

Earthquakes, hurricanes and other natural disasters obey same mathematical pattern

Tracking the magnitude of several types of catastrophic natural events and plotting how many episodes of each have occurred throughout history yields a result that cannot be ignored: a well-defined curve showing, fortunately, that the greater an event's capacity to devastate, the less frequently it occurs. For example, very few earthquakes are devastating, while small earthquakes occur frequently; most are so weak that people do not perceive them and they are detected only by highly sensitive sensors. This information is essential when calculating the associated risks.

However, this dependence is not always so obvious, nor does it always follow the same mathematical function, particularly in the case of larger events. Álvaro Corral and Álvaro González, researchers at the Centre for Mathematical Research (CRM) and the UAB Department of Mathematics, have conducted the most precise statistical analysis to date of a whole set of natural phenomena capable of causing disasters: earthquakes, hurricanes, forest fires, meteorite impacts, torrential rains and land subsidence caused by karst phenomena (in which groundwater erodes rock).

After analysing data on thousands of episodes of each of these natural phenomena, the researchers were able to use a single mathematical technique to describe the relationship between the frequency of each phenomenon and its magnitude or size. Most of them are governed by a power law, in which events occur more frequently the smaller they are, with no "normal" or typical size defined.

Nevertheless, the frequency of other events, such as forest fires, follows a different mathematical distribution, known as the lognormal distribution, regardless of whether they are small episodes or devastating fires which can burn up to hundreds of thousands of acres of land.
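For reference, the two size-frequency forms being contrasted, with x the event magnitude or size, are the scale-free power law and the lognormal distribution:

```latex
p(x) \propto x^{-\alpha},
\qquad
p(x) = \frac{1}{x\,\sigma\sqrt{2\pi}}
       \exp\!\left(-\frac{(\ln x - \mu)^2}{2\sigma^2}\right)
```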

The study made it possible to specify exactly how these functions fit each case, and whether they remain valid in limiting cases (e.g., events of extremely large magnitude), with the aim of using the same patterns to describe events of widely varying magnitudes and very different origins.

"Thanks to this study there will be an improvement in the risk estimations of catastrophic events occuring in different parts of the world, depending on the historical registry of each region", Álvaro Corral affirms.

The scientists find it remarkable that such diverse natural phenomena follow a power law distribution. According to Corral, "some interpretations point to this occurring when the phenomenon displays what is called an 'avalanche' behaviour, quickly releasing energy it has accumulated over time; but there is still much to be investigated in this area."

For example, forest fires are an exception to the rule, given that they could also be defined as 'avalanches' suddenly releasing energy accumulated in the form of biomass. "We do not know in detail why some 'avalanche' phenomena follow a lognormal distribution, and this actually contradicts previous studies. Better physical models will be necessary to explain the magnitude reached by these processes", the authors point out.

Credit: 
Universitat Autonoma de Barcelona

Overweight, obesity in children across Europe

What The Study Did: This study (called a systematic review and meta-analysis) combined the results of 103 studies with nearly 478,000 children (ages 2 to 13) to look at how common overweight and obesity are among children across Europe.

Author: Iván Cavero-Redondo, Ph.D., of the Universidad de Castilla-La Mancha, in Cuenca, Spain, was the coauthor.

(doi:10.1001/jamapediatrics.2019.2430)

Editor's Note: The article contains conflict of interest disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.


Credit: 
JAMA Network

Put a charge on it

image: The results, published in Nature Materials, make the US Department of Energy 150ºC challenge for emissions more attainable. Scientists from the López Group propose a dynamic charge and oxidation state for Single-Atom Catalysts. The dynamic charge transfer between metal and oxide is crucial to understanding the nature of the active site in Single-Atom Catalysts.

Image: 
Núria López (ICIQ)

Pollutants coming out of cars' exhausts are harmful to the environment and public health. With the goal of curbing overall car emissions, the US Department of Energy (DOE) issued a challenge to scientists worldwide: catalytically convert 90% of all critical pollutants (hydrocarbons, CO, NOx, etc.) in car exhaust into less harmful substances at 150ºC. However, nanoparticle-based heterogeneous catalysts - like the three-way exhaust catalyst used in cars - work best at high temperatures (between 200 and 400ºC), making the 150ºC DOE challenge seem difficult to attain.

Now, researchers from the López Group have studied in detail the behavior of Pt single atoms supported on CeO2, which the researchers argue would outperform the Pt nanoparticles supported on CeO2 currently employed in the three-way exhaust catalyst. The results, published in Nature Materials, show that the common assumption of a static charge in Single-Atom Catalysis is oversimplified. Instead, the scientists propose a dynamic charge, able to explain the unique reactivity found for activated single platinum atoms on ceria, which in turn can perform CO oxidation meeting the DOE 150ºC challenge for emissions.

Dynamic charge and oxidation state

Since the field of Single-Atom Catalysis flourished, scientists have been working to understand the intimate behavior at the interface between Single-Atom Catalysts and the oxides supporting them, hoping this knowledge will allow their catalytic activity to be tuned. The scientists from the López Group combined Density Functional Theory (DFT) and first-principles Born-Oppenheimer molecular dynamics (BOMD) to elucidate exactly what is going on at the interface.

The simulations revealed a metastable system where the Pt atoms have several overlapping oxidation states, allowing the catalyst to shift from one state to another. These dynamically interconnected oxidation states are "a completely new concept," as Nathan Daelman, first author of the study, explains.

For the scientists, it's clear the dynamic behavior influences the reactivity of the system, and, for the first time, they have been able to explain the Pt activation step needed for three-way exhaust catalysts to function properly under the DOE 150ºC working conditions. The next step will be to build a model of the mechanism that can predict the behavior of the catalytic system as a function of temperature.

Credit: 
Institute of Chemical Research of Catalonia (ICIQ)

In the future, this electricity-free tech could help cool buildings in metropolitan areas

image: The system helps cool its surroundings by absorbing heat from the air inside the box and transmitting that energy through the Earth's atmosphere into outer space.

Image: 
University at Buffalo

BUFFALO, N.Y. -- Engineers have designed a new system that can help cool buildings in crowded metropolitan areas without consuming electricity, an important innovation at a time when cities are working to adapt to climate change.

The system consists of a special material -- an inexpensive polymer/aluminum film -- that's installed inside a box at the bottom of a specially designed solar "shelter." The film helps to keep its surroundings cool by absorbing heat from the air inside the box and transmitting that energy through the Earth's atmosphere into outer space. The shelter serves a dual purpose, helping to block incoming sunlight, while also beaming thermal radiation emitted from the film into the sky.

"The polymer stays cool as it dissipates heat through thermal radiation, and can then cool down the environment," says co-first author Lyu Zhou, a PhD candidate in electrical engineering in the University at Buffalo School of Engineering and Applied Sciences. "This is called radiative or passive cooling, and it's very interesting because it does not consume electricity -- it won't need a battery or other electricity source to realize cooling."

"One of the innovations of our system is the ability to purposefully direct thermal emissions toward the sky," says lead researcher Qiaoqiang Gan, PhD, UB associate professor of electrical engineering. "Normally, thermal emissions travel in all directions. We have found a way to beam the emissions in a narrow direction. This enables the system to be more effective in urban environments, where there are tall buildings on all sides. We use low-cost, commercially available materials, and find that they perform very well."

Taken together, the shelter-and-box system the engineers designed measures about 18 inches (45.7 centimeters) tall and 10 inches (25.4 centimeters) wide and long. To cool a building, numerous units of the system would need to be installed to cover a roof.

The research will be published on Aug. 5 in Nature Sustainability. The study was an international collaboration between Gan's group at UB, Boon Ooi's group at King Abdullah University of Science and Technology (KAUST) in Saudi Arabia, and Zongfu Yu's group at the University of Wisconsin-Madison. Along with Zhou, co-first authors are Haomin Song, PhD, UB assistant professor of research in electrical engineering, and Jianwei Liang at KAUST. The study was funded in part by the National Science Foundation.

A system that works during the day and in crowded environments

The new passive cooling system addresses an important problem in the field: How radiative cooling can work during the day and in crowded urban areas.

"During the night, radiative cooling is easy because we don't have solar input, so thermal emissions just go out and we realize radiative cooling easily," Song says. "But daytime cooling is a challenge because the sun is shining. In this situation, you need to find strategies to prevent rooftops from heating up. You also need to find emissive materials that don't absorb solar energy. Our system address these challenges."

When placed outside during the day, the heat-emanating film and solar shelter helped reduce the temperature of a small, enclosed space by a maximum of about 6 degrees Celsius (11 degrees Fahrenheit). At night, that figure rose to about 11 degrees Celsius (about 20 degrees Fahrenheit).

How innovative architecture can drive radiative cooling

The new radiative cooling system incorporates a number of optically interesting design features.

One of the central components is the polymer/metal film, which is made from a sheet of aluminum coated with a clear polymer called polydimethylsiloxane. The aluminum reflects sunlight, while the polymer absorbs and dissipates heat from the surrounding air. Engineers placed the material at the bottom of a foam box and erected a solar "shelter" atop the box, using a solar energy-absorbing material to construct four outward-slanting walls, along with an inverted square cone within those walls.

This architecture serves a dual purpose: First, it helps to sponge up sunlight. Second, the shape of the walls and cone directs heat emitted by the film toward the sky.

"If you look at the headlight of your car, it has a certain structure that allows it to direct the light in a certain direction," Gan says. "We follow this kind of a design. The structure of our beam-shaping system increases our access to the sky. The ability to direct the emissions improves the performance of the system in crowded areas."

Credit: 
University at Buffalo

Researchers find proteins that might restore damaged sound-detecting cells in the ear

image: Mouse cochlea with hair cells shown in green and auditory nerves shown in red.

Image: 
Doetzlhofer lab

Using genetic tools in mice, researchers at Johns Hopkins Medicine say they have identified a pair of proteins that precisely control when sound-detecting cells, known as hair cells, are born in the mammalian inner ear. The proteins, described in a report published June 12 in eLife, may hold a key to future therapies to restore hearing in people with irreversible deafness.

"Scientists in our field have long been looking for the molecular signals that trigger the formation of the hair cells that sense and transmit sound," says Angelika Doetzlhofer, Ph.D., associate professor of neuroscience at the Johns Hopkins University School of Medicine. "These hair cells are a major player in hearing loss, and knowing more about how they develop will help us figure out ways to replace hair cells that are damaged."

In order for mammals to hear, sound vibrations travel through a hollow, snail shell-shaped structure called the cochlea. Lining the inside of the cochlea are two types of sound-detecting cells, inner and outer hair cells, which convey sound information to the brain.

An estimated 90% of genetic hearing loss is caused by problems with hair cells or damage to the auditory nerves that connect the hair cells to the brain. Deafness due to exposure to loud noises or certain viral infections arises from damage to hair cells. Unlike their counterparts in other mammals and birds, human hair cells cannot regenerate. So, once hair cells are damaged, hearing loss is likely permanent.

Scientists have known that the first step in hair cell birth starts at the outermost part of the spiraled cochlea. Here, precursor cells start transforming into hair cells. Then, like sports fans performing "the wave" in a stadium, precursor cells along the spiral shape of the cochlea turn into hair cells along a wave of transformation that stops when it reaches the inner part of the cochlea. Knowing where hair cells start their development, Doetzlhofer and her team went in search of molecular cues that were in the right place and at the right time along the cochlear spiral.

Of the proteins the researchers examined, the pattern of two proteins, Activin A and follistatin, stood out from the rest. Along the spiral path of the cochlea, levels of Activin A increased where precursor cells were turning into hair cells. Follistatin, however, appeared to have the opposite behavior of Activin A. Its levels were low in the outermost part of the cochlea when precursor cells were first starting to transform into hair cells and high at the innermost part of the cochlea's spiral where precursor cells hadn't yet started their conversion. Activin A seemed to move in a wave inward, while follistatin moved in a wave outward.

"In nature, we knew that Activin A and follistatin work in opposite ways to regulate cells," says Doetzlhofer. "And so, it seems, based on our findings like in the ear, the two proteins perform a balancing act on precursor cells to control the orderly formation of hair cells along the cochlear spiral."

To figure out how exactly Activin A and follistatin coordinate hair cell development, the researchers studied the effects of each of the two proteins individually. First, they increased the levels of Activin A in the cochleas of normal mice. In these animals, precursor cells transformed to hair cells too early, causing hair cells to appear prematurely all along the cochlear spiral. In mice engineered to either overproduce follistatin or not produce Activin A at all, hair cells were late to form and appeared disorganized and scattered across multiple rows inside the cochlea.

"The action of Activin A and follistatin is so precisely timed during development that any disturbance can negatively affect the organization of the cochlea," says Doetzlhofer. "It's like building a house -- if the foundation is not laid correctly, anything built upon it is affected."

Looking more closely at why overproduction of follistatin results in disorganized hair cells, the researchers found that high levels of this protein caused precursor cells to divide more frequently, which in turn made more of them convert into inner hair cells in a haphazard way.

Doetzlhofer notes that her research in hair cell development, although fundamental, has potential applications to treat deafness caused by damaged hair cells: "We are interested in how hair cells evolved because it's an interesting biological question," she says. "But we also want to use that knowledge to improve or develop new treatment strategies for hearing loss."

Credit: 
Johns Hopkins Medicine

Intense look at La Brea Tar Pits explains why we have coyotes, not saber-toothed cats

video: Paleontologist Larisa DeSantis explains her study of ancient predators trapped in the La Brea Tar Pits.

Image: 
Vanderbilt University

The most detailed study to date of ancient predators trapped in the La Brea Tar Pits is helping Americans understand why today we're dealing with coyotes tipping over garbage cans and not saber-toothed cats ripping our arms off.

Larisa DeSantis, a Vanderbilt University paleontologist, grew up visiting the one-of-a-kind fossil site in Los Angeles, which contains fossils of predators that, over the past 50,000 years, tried to eat horses, bison and camels stuck in the tar and became trapped themselves, offering the best opportunity to understand Ice Age animals facing climate change. The Pleistocene Epoch spanned from 2.6 million years ago to about 10,000 years ago, encompassing multiple glacial and interglacial periods and the arrival of humans, one or both of which forced predators to adapt their diets or die.

DeSantis spent the last decade visiting La Brea, studying the teeth of extinct species such as American lions, saber-toothed cats and dire wolves, as well as teeth from ancient animals whose descendants are still alive today, such as gray wolves, cougars and coyotes. Her work revealed that competition for prey among carnivores wasn't a likely cause of the Pleistocene megafaunal extinction as formerly believed, because, like the dogs and cats of today, one group preferred chasing herbivores in open country while the other preferred stalking them in forested areas.

"Isotopes from the bones previously suggested that the diets of saber-toothed cats and dire wolves overlapped completely, but the isotopes from their teeth give a very different picture," said DeSantis, an associate professor of biological sciences at Vanderbilt. "The cats, including saber-toothed cats, American lions and cougars, hunted prey that preferred forests, while it was the dire wolves that seemed to specialize on open-country feeders like bison and horses.  While there may have been some overlap in what the dominant predators fed on, cats and dogs largely hunted differently from one another."

To study these ancient predators, she employs dentistry -- taking molds of the teeth and shaving off tiny bits of enamel for chemical analysis. Information about everything the animal ate lies within the isotopes, she said. Further, the microscopic wear patterns on teeth can clarify who was eating flesh or scavenging on bones.

It's likely that those giant predators went extinct due to climate change, the arrival of humans to their environment or a combination of the two, she said, and her team is working to clarify the cause of the extinction with multiple colleagues across six institutions as part of a separate on-going study.

What they know is predators alive today in the Americas were better able to adapt their diets. Instead of only feeding on large prey, they could effectively hunt small mammals, scavenge what they could from carcasses or do both.

"The other exciting thing about this research is we can actually look at the consequences of this extinction," DeSantis said. "The animals around today that we think of as apex predators in North America -- cougars and wolves -- were measly during the Pleistocene. So when the big predators went extinct, as did the large prey, these smaller animals were able to take advantage of that extinction and become dominant apex-predators."

An even more detailed picture of ancient life at La Brea is contained in the paper "Causes and consequences of Pleistocene megafaunal extinctions as revealed from Rancho La Brea mammals," published today in the journal Current Biology.

Credit: 
Vanderbilt University

Google maps for tissues

video: With the "BigStitcher" software you can reconstruct a sample and then rotate and turn it virtually, get an overview of the big picture or zoom into individual structures. This works both as a user, as illustrated here, and as an algorithm that analyzes the data and cannot load the entire image into RAM. Neurons expressing a specific gene are marked in green. Such data now make it possible for the first time to systematically characterize differences at the single-cell level between normal and genetically modified mice and to draw conclusions about potential behavioral changes that may result.

Image: 
Preibisch Lab / Treier Lab, MDC

Modern light microscopic techniques provide extremely detailed insights into organs, but the terabytes of data they produce are usually nearly impossible to process. New software, developed by a team led by MDC scientist Dr. Stephan Preibisch and now presented in Nature Methods, is helping researchers make sense of these reams of data.

It works almost like a magic wand. With the help of a few chemical tricks and ruses, scientists have for a few years now been able to render large structures like mouse brains and human organoids transparent. CLARITY is perhaps the most well-known of the many different sample clearing techniques, with which almost any object of study can be made nearly as transparent as water. This enables researchers to investigate cellular structures in ways they could previously only dream of.

And that's not all. In 2015 another conjuring trick - called expansion microscopy - was presented in the journal Science. A research team at the Massachusetts Institute of Technology (MIT) in Cambridge discovered that it was possible to expand ultrathin slices of mouse brains to nearly five times their original volume, thereby allowing samples to be examined in even greater detail.

The software brings order to the data chaos

"With the aid of modern light-sheet microscopes, which are now found in many labs, large samples processed by these methods can be rapidly imaged," says Dr. Stephan Preibisch, head of the research group on Microscopy, Image Analysis & Modeling of Developing Organisms at MDC's Berlin Institute for Medical Systems Biology (BIMSB). "The problem, however, is that the procedure generates such large quantities of data - several terabytes - that researchers often struggle to sift through and organize the data."

To create order in the chaos, Preibisch and his team have now developed a software program that, after a complex reconstruction of the data, works somewhat like Google Maps in 3D mode. "One can not only get an overview of the big picture, but can also zoom in to specifically examine individual structures at the desired resolution," explains Preibisch, who has christened the software "BigStitcher." Now, the computer program, which any interested scientist can use, has been presented in the scientific journal Nature Methods.
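The "Google Maps"-like behaviour rests on a multi-resolution representation: the reconstructed volume is stored together with progressively downsampled copies, so a viewer (or an analysis algorithm) only ever loads the level and region it actually needs. The following minimal sketch illustrates only the downsampling idea and is not BigStitcher's own code:

```python
# Minimal illustration of a multi-resolution pyramid, the idea behind
# "Google Maps"-style browsing of huge volumes. Not BigStitcher's own code.
import numpy as np

def build_pyramid(volume, levels=4):
    """Return progressively 2x-downsampled copies of a 3D volume."""
    pyramid = [volume]
    for _ in range(levels - 1):
        v = pyramid[-1]
        # trim to even dimensions, then average 2x2x2 blocks
        v = v[: v.shape[0] // 2 * 2, : v.shape[1] // 2 * 2, : v.shape[2] // 2 * 2]
        v = v.reshape(v.shape[0] // 2, 2, v.shape[1] // 2, 2, v.shape[2] // 2, 2)
        pyramid.append(v.mean(axis=(1, 3, 5)))
    return pyramid

volume = np.random.rand(64, 256, 256).astype(np.float32)  # stand-in image stack
for level, v in enumerate(build_pyramid(volume)):
    print(f"level {level}: shape {v.shape}, ~{v.nbytes / 1e6:.1f} MB")
# A viewer fetches only the level (and the region within it) that the current
# zoom actually needs, which is what keeps terabyte datasets browsable.
```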

A team of twelve researchers from Berlin, Munich, the United Kingdom, and the United States was involved in the development. The paper's two lead authors are David Hoerl, of Ludwig-Maximilians-Universitaet Muenchen and the Berlin Institute for Medical Systems Biology (BIMSB) of the MDC, and MDC researcher Dr. Fabio Rojas Rusak. The researchers show in their paper that algorithms can reconstruct and scale the data acquired by light-sheet microscopy in a way that makes a supercomputer unnecessary. "Our software runs on any standard computer," says Preibisch. "This allows the data to be easily shared across research teams."

Data quality is also determined

The development of BigStitcher began about ten years ago. "At that time, I was still a PhD student and was thinking a lot about how to best handle very large amounts of data," recalls Preibisch. "The frameworks we created back then have helped us to successfully tackle a very current problem." But, of course, he adds, many new algorithms were also incorporated into the software.

BigStitcher can visualize on screen the previously imaged samples in any level of detail desired, but it can also do much more. "The software automatically assesses the quality of the acquired data," says Preibisch. This is usually better in some parts of the object being studied than in others. "Sometimes, for example, clearing doesn't work so well in a particular area, meaning that fewer details are captured there," explains the MDC researcher.

"The brighter a particular region of, say, a mouse brain or a human organ is displayed on screen, the higher the validity and reliability of the acquired data," says Preibisch, describing this additional feature of his software. And because even the best clearing techniques never achieve 100 percent transparency of the sample, the software lets users rotate and turn the image captured by the microscope in any direction on screen. It is thus possible to view the sample from any angle. "This is another new feature of our software," says Preibisch.

Anyone can download the software for free

The zoom function allows biologists to find answers to many questions, such as: Where in the brain is cell division currently taking place? Where is RNA expressed? Or where do particular neuronal projections end? "In order to find all this out, it is first necessary to get an overview of the entire object of study, but then to be able to zoom in to view the smallest of details in high resolution," explains Preibisch. Therefore, many labs today have a need for software like BigStitcher. The program is distributed within the Fiji framework, where any interested scientist can download and use the plug-in free of charge.

Credit: 
Max Delbrück Center for Molecular Medicine in the Helmholtz Association

2015 Volkswagen emissions scandal damaged other German automakers' reputations and profits

image: Car emissions

Image: 
istock/University of Notre Dame

In 2015, Volkswagen was exposed for bypassing U.S. emissions standards by equipping their diesel-engine cars with a so-called defeat device that could detect emissions tests, adjust levels to ensure compliance, then revert to non-compliant levels after test completion. Stepan Family Associate Professor of Economics Rüdiger Bachmann at the University of Notre Dame and his co-authors studied the scandal and found that the fallout from Volkswagen's wrongdoing cost other German car makers billions of dollars in sales.

"We find that the VW scandal reduced the sales of the other German auto manufacturers. ... Specifically, the overall effect on those manufacturers amounted to a decline in sales of 104,661 vehicles valued at $5.2 billion in 2016, based on the list prices in the data," Bachmann and his co-authors write in a working paper released by the National Bureau of Economic Research.

For years, Volkswagen -- which also owns Porsche and Audi -- emphasized the value of "German engineering" in its advertisements. Stefan Gies, Volkswagen's head of chassis development, told Car and Driver that, in his view, "German engineering stands for precision in all that we do -- precision in the design and what you feel in the car. Everything the driver touches and controls must instill this feeling of confidence and precision. We want that person to feel that they have the car under control and that it will do exactly what they desire it to do."

The reputation of German know-how led Volkswagen to top Toyota as the world's most profitable car manufacturer in 2015, listing its 2014 net profit at €2.5 billion in its annual report. The researchers note that, since vehicles by other German makes are substitutes for Volkswagen vehicles, substitution away from Volkswagen should have increased demand for the other German makes. However, they find that it did not do so sufficiently to overcome the negative reputation effect of the scandal. BMW and Mercedes, for example, were not implicated in the emissions scandal, yet their revenue took a hit. Bachmann and his co-authors found that collective reputation matters.

In order to document changes in sentiment toward non-VW German automakers that indicate harm to their collective reputation, Bachmann and his co-authors reviewed Twitter data from before and after the scandal broke. Not surprisingly, the data reflect a sharp spike in negative sentiment toward VW: from an average of 3 percentage points below the company's average in the pre-scandal month to an average of 26 percentage points above. There was also a statistically significant decline of 3.5 percentage points in positive sentiment toward non-VW German auto manufacturers as a result of the scandal.

A review of Google searches after the scandal broke did not reveal a jump in searches about cheating by non-VW German automakers.

"Our paper shows that the scandal's effects on non-VW German automakers were unlikely to be driven by information. Rather, the scandal must have tarnished the reputations of the other German auto manufacturers through their association with Volkswagen, consistent with the notion of a collective reputation," the authors write.

Bachmann and his co-authors' case study has broader implications that apply to other industries. Bachmann, a trained macroeconomist, sees value in incorporating more micro studies into big-picture questions that macroeconomists usually tackle, like inflation, unemployment and the business cycle.

"I am increasingly convinced that we macroeconomists need to learn more from micro studies," he said. "For example, this paper is a great case study about how general economic activity is influenced when a significant event happens to large industrial powerhouse firms."

Credit: 
University of Notre Dame