
Scientists and schoolkids find family soups have antimalarial properties

London schoolchildren have found that some of their families' soup recipes have antimalarial properties, with the help of Imperial scientists.

Researchers from Imperial College London helped the schoolchildren test their family soup broths for activity against the malaria parasite.

Several of the soup broths, collected from traditional family recipes that originated around the world, showed activity against the malaria parasite Plasmodium falciparum, either by curbing its growth or preventing it from maturing. The soup recipes had been passed down through the generations for the treatment of fever.

Five of the broths were able to curb growth of the parasite in its disease-causing stage by more than 50 percent. In two of these, the inhibitory activity was comparable with that of a leading antimalarial drug, dihydroartemisinin.

Four other broths were more than 50 percent effective at blocking the ability of the parasites to mature to a form that infects mosquitoes, potentially blocking the process of transmission.

Dihydroartemisinin is a derivative of artemisinin, which was isolated from a traditional Chinese herbal medicine. The researchers behind the new study hope to discover new antimalarial compounds in a similar way from the work on soups. The next step is to identify the active ingredients responsible.

Lead researcher Professor Jake Baum, from the Department of Life Sciences at Imperial, said: "Malaria kills more than 400,000 people per year and infects more than 200 million, yet resistance to our frontline drugs continues to emerge. We may have to look beyond the chemistry shelf for new drugs, and natural remedies shouldn't be off our watch list, as artemisinin shows."

The study, published today in the Archives of Disease in Childhood, was carried out by students at Eden Primary School in North London alongside researchers from Imperial College London.

The pupils brought in samples of homemade clear soup broths from family recipes from across Europe, North Africa, and the Middle East. Filtered extracts of the broths were then tested against two stages of the parasite: when it can infect mosquitoes, and when it can cause the disease in humans.

The recipes for each of the broths varied. They were vegetarian, chicken, or beef based, with no particular ingredient common to those with the strongest antimalarial activity.

The active ingredients in the broths studied are yet to be identified and tested. To move forwards, the active ingredients would need to be isolated and then assessed for toxicity and effectiveness, first in human cells and later in preclinical trials.

Professor Baum said: "It's really interesting to find potential routes for future drug development in something like your grandmother's soup. In all honesty, however, the true strength of the study was engaging children in the question of what's the difference between a natural remedy and a real medicine - the answer is evidence! The children understood that soups could really become a drug if you test them the right way."

Credit: 
Imperial College London

Study measures impact of agriculture on diet of wild mammals

image: Clockwise: Southern tiger cat (Leopardus guttulus); Brocket deer (Mazama spp.); Greater naked-tailed armadillo (Cabassous tatouay); Red-rumped agouti (Dasyprocta leporina)

Image: 
photo: ICMBio / CENAP)

Margays (Leopardus wiedii), small wild cats living in forest areas fragmented by agriculture near Campinas and Botucatu in São Paulo State, Brazil, prey on animals inhabiting nearby sugarcane plantations, such as birds and small rodents.

The diet of other mammals, such as the herbivorous Wild cavy (Cavia aperea) or the omnivorous Crab-eating fox (Cerdocyon thous), is also influenced by the region's agriculture. They live in areas of native vegetation but often have to seek food in fields of corn, sugarcane or pasture in order to survive. Each in its own way, the Cougar (Puma concolor), Capybara (Hydrochoerus hydrochaeris), Brocket deer (Mazama spp.), Ocelot (Leopardus pardalis) and Crab-eating raccoon (Procyon cancrivorus) have also adapted their diet in comparison with animals living in large areas of well-preserved forest.

These examples, described in an article published in Proceedings of the National Academy of Sciences (PNAS), confirm the hypothesis that in addition to its negative effect on wildlife in terms of species richness, diversity and abundance, agriculture also impacts the diet and habitat use of wild mammals living in areas of fragmented forest near croplands and pasturelands.

The study was performed by researchers affiliated with the University of São Paulo's Luiz de Queiroz College of Agriculture (ESALQ-USP) and Center for Nuclear Energy in Agriculture (CENA-USP), São Paulo State University (UNESP) and the Chico Mendes Institute for Biodiversity Conservation (ICMBio). It was supported by São Paulo Research Foundation - FAPESP via a project led by Katia Maria Paschoaletto Micchi de Barros Ferraz (ESALQ-USP). It also received funding from Fundação Boticário.

"Forest remnants and the agricultural matrix aren't separate. There's an interface between these areas. It's hardly news that animals need to find food in plantations, but this practice hadn't been quantified until now. I should stress that the diet in question isn't ideal. It's a matter of survival," said Marcelo Magioli, who at the time had a PhD scholarship from FAPESP and is first author of the article.

According to the study, the impact of agriculture on conservation relates not just to deforestation and forest fragmentation but also to the alterations brought about by the process in the diet of wild animals. The researchers stress the need for adequate management of human-modified environments to support wildlife survival.

"Our findings point to the need for more favorable agricultural management to support these animals and underscore the importance of the Brazilian Forest Code and of maintaining legal reserves and permanent conservation areas [APPs]," Ferraz said.

Records of feeding habits

To measure how much the diet of these mammals had been altered by the influence of the agricultural matrix, the researchers analyzed stable carbon and nitrogen isotopes in the animals' fur. The method, widely used in trophic studies of marine animals, identifies the type of food consumed in a period of approximately three months and the individual's position in the food chain.
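For illustration, the carbon side of such an analysis is often interpreted with a two-endmember mixing model, which estimates how much of an animal's dietary carbon came from C4 crops (such as sugarcane and corn) versus C3 forest plants. The sketch below uses typical literature endmember values, not data from this study, and ignores diet-to-fur discrimination offsets:

```python
def c4_fraction(delta13c_sample, delta13c_c3=-27.0, delta13c_c4=-12.0):
    """Estimate the fraction of dietary carbon derived from C4 plants
    (e.g., sugarcane, corn) from a tissue delta-13C value, using a linear
    two-endmember mixing model. The endmember values are typical literature
    figures in per mil (VPDB), not measurements from this study."""
    frac = (delta13c_sample - delta13c_c3) / (delta13c_c4 - delta13c_c3)
    return min(max(frac, 0.0), 1.0)  # clamp to the physically meaningful range

# Fur carbon midway between the endmembers implies roughly half the
# animal's carbon came from crop (C4) resources.
print(c4_fraction(-19.5))  # 0.5
```

In practice researchers use more sophisticated Bayesian mixing models with multiple sources and isotopes, but the linear version captures the basic logic of the method.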

Because the Margay is an endangered species and many of the other animals studied are also threatened with extinction, the researchers used noninvasive techniques such as hair traps and collection of droppings. Samples were collected in four areas of São Paulo State - two areas next to croplands in Campinas and Botucatu and two conserved areas in the Serra do Mar and Serra de Paranapiacaba mountain ranges.

Samples were collected from 29 species of mammals, with 194 samples coming from individuals that lived in human-modified areas and 126 from individuals in well-preserved forest areas.

"From previous studies using GPS collars and camera traps, we knew the animals moved through these areas," Magioli said. "However, stable isotope analysis told us where they were feeding and how important each food source was in their diet."

So near yet so far

According to the researchers, while 34.5% of individuals living in forest fragments within human-modified areas fed only on agricultural resources, 67.5% of the animals living in large areas of well-preserved forest fed mainly on forest resources.

"There's a very big difference in the diets of these two groups of mammals. Given the different species compositions of the two types of areas, we grouped the animals according to diet: carnivores, omnivores, herbivores, frugivores and insectivores," Magioli said.

In the comparison, frugivores and insectivores consumed the same resources regardless of where they lived. Herbivores and omnivores inhabiting forest fragments were the most affected and tended to consume agricultural resources. Carnivores in this environment close to croplands consumed a relatively high proportion of prey that feed on agricultural resources.

"We can conclude that in landscapes with scant forest cover, small fragments prove insufficient to supply the resources species need," Magioli said.

Another finding of the research relates to the effects of organic fertilizer on the animals, especially herbivores, and the impact of sugarcane burning on soil nitrogen cycling and hence on the plants consumed by the animals.

"We observed a difference in nitrogen isotope values in the fur of animals living in forest fragments. Because they consume resources from the agricultural matrix, their nitrogen levels are higher, as is the case in the soil, for example. Nitrogen levels typically rise from the bottom to the top of the food chain, so these elevated baseline values make the food chain in modified areas harder to interpret than in preserved areas," Magioli said.
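The nitrogen pattern Magioli describes underlies a standard back-of-the-envelope trophic-position estimate. A minimal sketch, assuming the widely cited average enrichment of about 3.4 per mil in delta-15N per trophic level (a literature value, not a figure from this study):

```python
def trophic_position(d15n_consumer, d15n_base, base_tp=1.0, enrichment=3.4):
    """Estimate trophic position from nitrogen isotopes: delta-15N rises by
    roughly 3.4 per mil per trophic level (a widely used literature average).
    base_tp is the trophic level of the baseline (1 = primary producer).
    Fertilized soils raise the baseline delta-15N itself, shifting consumer
    values without changing true trophic position -- which is why food
    chains in agricultural landscapes are harder to read."""
    return base_tp + (d15n_consumer - d15n_base) / enrichment

# A strict herbivore eating plants whose baseline delta-15N is 3 per mil
# sits about one level above the plants.
print(trophic_position(6.4, 3.0))  # ~2.0
```

The comment in the code is the crux of the complication: unless the local baseline is measured, fertilizer-enriched nitrogen can masquerade as a higher trophic position.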

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Google's new system captures character lighting for virtually any environment

image: Google's novel volumetric capture system captures, for the first time, full-body reflectance of 3D human performances and blends them seamlessly into the real world through AR or into digital scenes in films, games, and more.

Image: 
SIGGRAPH Asia

Even novice photographers and videographers who rely on their handheld devices to snap photos or make videos often consider their subject's lighting. Lighting is critical in filmmaking, gaming, and virtual/augmented reality environments and can make or break the quality of a scene and the actors and performers in it. Replicating realistic character lighting has remained a difficult challenge in computer graphics and computer vision.

While significant progress has been made on volumetric capture systems that focus on 3D geometric reconstruction with high-resolution textures, such as methods to achieve realistic shapes and textures of the human face, much less work has been done to recover the photometric properties needed for relighting characters. Results from such systems lack fine details, and the subject's shading is prebaked into the texture.

Computer scientists at Google are revolutionizing this area of volumetric capture technology with a novel, comprehensive system that is able, for the first time, to capture full-body reflectance of 3D human performances, and seamlessly blend them into the real world through AR or into digital scenes in films, games, and more. Google will present their new system, called The Relightables, at ACM SIGGRAPH Asia, held Nov. 17 to 20 in Brisbane, Australia. SIGGRAPH Asia, now in its 12th year, attracts the most respected technical and creative people from around the world in computer graphics, animation, interactivity, gaming, and emerging technologies.

There have been major advances in this realm of work that the industry calls 3D capture systems. Through these sophisticated systems, viewers have been able to experience digital characters come to life on the big screen, for instance, in blockbusters such as Avatar and the Avengers series and much more.

Indeed, volumetric capture technology has reached a high level of quality, but many of these reconstructions still lack true photorealism. In particular, despite using high-end studio setups with green screens, these systems still struggle to capture high-frequency details of humans, and they recover only a fixed illumination condition. This makes them unsuitable for photorealistic rendering of actors or performers in arbitrary scenes under different lighting conditions.

Google's Relightables system makes it possible to customize lighting on characters in real time or re-light them in any given scene or environment.

They demonstrate this on subjects that are recorded inside a custom geodesic sphere outfitted with 331 custom color LED lights (also called a Light Stage capture system), an array of high-resolution cameras, and a set of custom high-resolution depth sensors. The Relightables system captures about 65 GB per second of raw data from nearly 100 cameras and its computational framework enables processing the data effectively at this scale. A video demonstration of the project can be seen here: https://youtu.be/anBRroZWfzI

Their system captures the reflectance information on a person -- the way lighting interacts with skin is a major factor in how realistic digital people appear. Previous attempts used either flat lighting or required computer-generated characters. Not only can they capture reflectance information on a person, but they can also record while the person moves freely within the capture volume. As a result, they can relight the resulting animation in arbitrary environments.

Historically, cameras record people from a single viewpoint and lighting condition. This new system, note the researchers, allows users to record someone then view them from any viewpoint and lighting condition, removing the need for a green screen to create special effects and allowing for more flexible lighting conditions.

The interactions of space, light, and shadow between a performer and their environment play a critical role in creating a sense of presence. Beyond just 'cutting-and-pasting' a 3D video capture, the system gives the ability to record someone and then seamlessly place them into new environments--whether in their own space for AR experiences--or in the world of a VR, film, or game experience.

At SIGGRAPH Asia, The Relightables team will present the components of their system, from capture to processing to display, with video demos of each stage. They will walk attendees through the ins and outs of building The Relightables, describing the major challenges they tackled in the work and showcasing some cool applications and renderings.

Credit: 
Association for Computing Machinery

Blowing bubbles: PPPL scientist confirms way to launch current in fusion plasmas

image: PPPL physicist Fatima Ebrahimi

Image: 
Elle Starkman / PPPL Office of Communications

An obstacle to generating fusion reactions inside facilities called tokamaks is that producing the current in plasma that helps create confining magnetic fields happens in pulses. Such pulses, generated by an electromagnet that runs down the center of the tokamak, would make the steady-state creation of fusion energy difficult to achieve. To address the problem, physicists have developed a technique known as transient coaxial helicity injection (CHI) to create a current that is not pulsed.

Now, physicist Fatima Ebrahimi of the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL) has used high-resolution computer simulations to investigate the practicality of this technique. The simulations show that CHI could produce the current continuously in larger, more powerful tokamaks than exist today to produce stable fusion plasmas.

"Stability is the most important aspect of any current-drive system in tokamaks," said Ebrahimi, author of a paper reporting the findings in Physics of Plasmas. "If the plasma is stable, you can have more current and more fusion, and have it all sustained over time."

Fusion, the power that drives the sun and stars, is the fusing of light elements in the form of plasma -- the hot, charged state of matter composed of free electrons and atomic nuclei -- that generates massive amounts of energy. Scientists are seeking to replicate fusion on Earth for a virtually inexhaustible supply of power to generate electricity.

The CHI technique replaces an electromagnet called a solenoid that induces current in today's tokamaks. CHI produces the critical current by spontaneously injecting magnetic bubbles, or plasmoids, into the plasma. The new high-resolution simulations confirm that a parade of plasmoids marching through the plasma in future tokamaks could create the current that produces the confining fields. The simulations further showed that the plasmoids would stay intact even when buffeted by three-dimensional instabilities.

In the future, Ebrahimi plans to simulate CHI startup while including even more physics about the plasma, which would provide insights to further optimize the process and to extrapolate toward next-step devices. "That's a little bit harder," she says, "but the news right now is that these simulations show that CHI is a reliable current-drive technique that could be used in fusion facilities around the world as they start to incorporate stronger magnetic fields."

Credit: 
DOE/Princeton Plasma Physics Laboratory

Four ways to curb light pollution, save bugs

image: Both local sources of artificial light (left) and diffuse skyglow (right) can impact the physiology, behavior, and fitness of insects. Read more: https://doi.org/10.1016/j.biocon.2019.108259

Image: 
Image courtesy Biological Conservation. See: https://doi.org/10.1016/j.biocon.2019.108259

Artificial light at night negatively impacts thousands of species: beetles, moths, wasps and other insects that have evolved to use light levels as cues for courtship, foraging and navigation.

Writing in the scientific journal Biological Conservation, Brett Seymoure, the Grossman Family Postdoctoral Fellow of the Living Earth Collaborative at Washington University in St. Louis, and his collaborators reviewed 229 studies to document the myriad ways that light alters the living environment such that insects are unable to carry out crucial biological functions.

"Artificial light at night is human-caused lighting -- ranging from streetlights to gas flares from oil extraction," Seymoure said. "It can affect insects in pretty much every imaginable part of their lives."

Insects and spiders have experienced global declines in abundance over the past few decades -- and it's only going to get worse. Some researchers have even coined a term for it: the insect apocalypse.

"Most of our crops -- and crops that feed the animals that we eat -- need to be pollinated, and most pollinators are insects," Seymoure said. "So as insects continue to decline, this should be a huge red flag. As a society of over 7 billion people, we are in trouble for our food supply."

Unlike other drivers of insect declines, artificial light at night is relatively straightforward to reverse. To address this problem, here are four things that Seymoure recommends:

1. Turn off lights that aren't needed

The evidence on this one is clear.

"Light pollution is relatively easy to solve, as once you turn off a light, it is gone. You don't have to go and clean up light like you do with most pollutants," Seymoure said.

"Obviously, we aren't going to turn off all lights at night," he said. "However, we can and must have better lighting practices. Right now, our lighting policy is not managed in a way to reduce energy use and have minimal impacts on ecosystem and human health. This is not OK, and there are simple solutions that can remedy the problem."

Four characteristics of electric light matter most for insects: intensity (or overall brightness); spectral composition (how colorful and what color it is); polarization; and flicker.

"Depending on the insect species, its sex, its behavior and the timing of its activity, all four of these light characteristics can be very important," Seymoure said.

"For example, overall intensity can be harmful for attracting insects to light. Or many insects rely upon polarization to find water bodies, as water polarizes light. So polarized light can indicate water, and many insects will crash into hoods of cars, plastic sheeting, etc., as they believe they are landing on water."

Because it is impossible to narrow down one component that is most harmful, the best solution is often to just shut off lights when they are not needed, he said.

2. Make lights motion-activated

This is related to the first recommendation: If a light is only necessary on occasion, then put it on a sensor instead of always keeping it on.

3. Put fixtures on lights to cover up bulbs and direct light where it is needed

"A big contributor to attraction of light sources for most animals is seeing the actual bulb, as this could be mistaken as the moon or sun," Seymoure said. "We can use full cut-off filters that cover the actual bulb and direct light to where it is needed and nowhere else.

"When you see a lightbulb outside, that is problematic, as that means animals also see that light bulb," he said. "More importantly, that light bulb is illuminating in directions all over the place, including up toward the sky, where the atmosphere will scatter that light up to hundreds of miles away resulting in skyglow. So the easiest solution is to simply put fixtures on light to cover the light bulb and direct the light where it is needed -- such as on the sidewalk and not up toward the sky."

4. Use different colors of lights

"The general rule is that blue and white light are the most attractive to insects," Seymoure said. "However, there are hundreds of species that are attracted to yellows, oranges and reds."

Seymoure has previously studied how different colors of light sources -- including the blue-white color of LEDs and the amber color of high-pressure sodium lamps -- affect predation rates on moths in an urban setting.

"Right now, I suggest people stick with amber lights near their houses, as we know that blue lights can have greater health consequences for humans and ecosystems," Seymoure said. "We may learn more about the consequences of amber lights. And make sure these lights are properly enclosed in a full cut-off fixture."

Credit: 
Washington University in St. Louis

Potato virus Y is the most serious threat to potato -- some strains more than others

image: Screen-house experiments in Hermiston, Oregon

Image: 
Alexander Karasev

Potato virus Y (PVY) is the most serious problem facing the potato industry in the United States and is the main cause for rejection of seed potato lots. The virus affects potatoes in two ways: It reduces the yield of potato tubers by 70-80% and also negatively affects the quality of the remaining tubers due to necrotic reactions.

PVY encompasses a complex network of strains with a range of symptoms. During the last 10 years, major changes have been observed in the prevalence of different strains. PVYO, a nonrecombinant strain dominant in the United States until 2012, has virtually disappeared, while two recombinant strains (associated with tuber damage) have been on the rise. These strains, PVYN-Wi and PVYNTN, now represent more than 90% of all PVY isolates found in potato.

In the webcast posted August 26th, "Changing Strain Composition of Potato virus Y (PVY) in the U.S. Potato," University of Idaho Professor of Plant Virology Alexander Karasev discusses this strain-prevalence shift, drawing conclusions from screen-house experiments conducted over two growing seasons in 2015 and 2016. His research shows that while three of the four potato types resisted PVYO, none was able to resist PVYN-Wi.

These changes in PVY strain composition in potato fields have important consequences for potato certification, potato breeding programs, and diagnostic laboratories. According to Karasev, more emphasis should be placed on lab tests, since visual symptoms are harder to detect, and commercial diagnostic kits should be able to detect these new strains.

Credit: 
American Phytopathological Society

NASA finds light rain in fading Tropical Depression Fengshen

image: The GPM core satellite passed over Tropical Depression Fengshen in the Northwestern Pacific Ocean on Nov. 18 at 1:56 a.m. EST (0656 UTC) and found a few areas of light rain (blue) falling at a rate of 0.4 inches (10 mm) per hour.

Image: 
NASA/JAXA/NRL

A NASA analysis of rainfall rates shows that the once mighty Fengshen is now a depression devoid of heavy rainfall.

On Nov. 17 at 1 p.m. EST (1800 UTC), the Joint Typhoon Warning Center noted that Tropical Depression Fengshen was located near 24.8 degrees north latitude and 157.9 east longitude, approximately 585 nautical miles northwest of Wake Island, and had tracked southeastward. Maximum sustained surface winds were estimated at 30 knots (34.5 mph/55.5 kph) and weakening.

NASA has the unique capability of peering under the clouds in storms and measuring the rate at which rain is falling. The Global Precipitation Measurement mission's GPM core satellite passed over Fengshen from its orbit in space and measured rainfall rates throughout the storm on Nov. 18 at 1:56 a.m. EST (0656 UTC). GPM found that only a few areas of light rain (blue), falling at a rate of 0.4 inches (10 mm) per hour, remained in the weakening system.

Fengshen is expected to become a remnant low pressure system later on Nov. 18.

Typhoons and hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

Credit: 
NASA/Goddard Space Flight Center

Nitrous oxide, a greenhouse gas, is on the rise

CAMBRIDGE, MD (November 18, 2019)--Most of us know nitrous oxide as "laughing gas," used for its anaesthetic effects. But nitrous oxide (N2O) is actually the third most important long-lived greenhouse gas, after carbon dioxide (CO2) and methane. Nitrous oxide is also one of the main stratospheric ozone-depleting substances -- and we are releasing more of it into the atmosphere than previously thought, according to a new study published this week in Nature Climate Change.

"We see that the N2O emissions have increased considerably during the past two decades, but especially from 2009 onwards," said lead scientist Rona L. Thompson from NILU-Norwegian Institute for Air Research. "Our estimates show that the emission of N2O has increased faster over the last decade than estimated by the Intergovernmental Panel on Climate Change (IPCC) emission factor approach."

Increasing use of nitrogen fertilizers is leading to higher N2O levels in the atmosphere

In the study, Thompson and scientists including Eric Davidson of the University of Maryland Center for Environmental Science found that nitrous oxide in the atmosphere has risen steadily since the mid-20th century. This rise is strongly linked to an increase in nitrogen substrates released to the environment. Since the mid-20th century, the production of nitrogen fertilizers, widespread cultivation of nitrogen-fixing crops (such as clover, soybeans, alfalfa, lupins, and peanuts), and the combustion of fossil fuels and biofuels have enormously increased the availability of nitrogen substrates in the environment.

"The increased nitrogen availability has made it possible to produce a lot more food," Thompson said. "The downside is of course the environmental problems associated with it, such as rising N2O levels in the atmosphere."

Rate of increase has been underestimated

The study authors found that global N2O emissions increased by an amount equivalent to approximately 10% of the global total between 2000-2005 and 2010-2015. This increase is about twice the amount reported to the United Nations Framework Convention on Climate Change based on the amount of nitrogen fertilizer and manure used and the default emission factor specified by the IPCC. The researchers argue that this discrepancy is due to an increase in the emission factor (that is, the amount of N2O emitted relative to the amount of N-fertilizer used) associated with a growing nitrogen surplus. This suggests that the IPCC method, which assumes a constant emission factor, may underestimate emissions when the rate of nitrogen input and the nitrogen surplus are high.
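The emission-factor argument can be made concrete with a toy calculation. All numbers below are illustrative assumptions (the 1% default is only the order of magnitude of the IPCC Tier 1 factor, and the surplus sensitivity `k` is hypothetical), not values from the study:

```python
def n2o_constant_ef(n_input_tg, ef=0.01):
    """IPCC Tier-1-style estimate: a fixed emission factor (a figure on
    the order of 1% of applied nitrogen emitted as N2O-N)."""
    return n_input_tg * ef

def n2o_scaling_ef(n_input_tg, n_surplus_tg, base_ef=0.01, k=0.0001):
    """Hypothetical alternative in the spirit of the study's argument:
    the emission factor itself grows with the nitrogen surplus.
    k is an illustrative sensitivity, not a published value."""
    return n_input_tg * (base_ef + k * n_surplus_tg)

n_input, n_surplus = 120.0, 50.0  # Tg N per year, illustrative only
print(n2o_constant_ef(n_input))            # ~1.2 Tg N2O-N/yr
print(n2o_scaling_ef(n_input, n_surplus))  # ~1.8 Tg N2O-N/yr, 50% higher
```

With the same nitrogen input, a surplus-dependent factor yields substantially higher emissions than the constant-factor bookkeeping, which is the shape of the discrepancy the authors report.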

From scientific methods to practical measures

"This new publication demonstrates both how we can solve a problem of growing greenhouse gas emissions and how current efforts are falling short in some regions of the world," said co-author Eric Davidson of the University of Maryland Center for Environmental Science. "These emissions come primarily from using fertilizers to grow food and increasing livestock herds, but we've learned how to produce more food with less nitrous oxide emission."

"In Europe and North America, we have succeeded in decreasing growth in nitrous oxide emissions, an important contributor to climate change and stratospheric ozone depletion," he added. "Unfortunately, the same can't be said for Asia and South America, where fertilizer use, intensification of livestock production, and the resulting nitrous oxide emissions are growing rapidly.

"The good news is that this problem can be solved, but the less good news is that it will take a global effort, and we are far from there yet," he said.

Credit: 
University of Maryland Center for Environmental Science

Get over it? When it comes to recycled water, consumers won't

If people are educated on recycled water, they may come to agree it's perfectly safe and tastes as good -- or better -- than their drinking water. They may even agree it's an answer to the critical water imbalance in California, where the northern third of the state holds 75% of the water despite 80% of the demand coming from the southern two-thirds.

But that doesn't mean they're going to use recycled water -- and it sure doesn't mean they'll drink it. And the reason lies in the word "disgust."

That's the result of a series of studies by UC Riverside psychology researchers Mary Gauvain and Daniel Harmon published recently in the journal Basic and Applied Social Psychology.

Past research by Harmon and Gauvain explored whether people sense a difference in taste among recycled water, conventional tap water, and commercially bottled water. That study, released in spring 2018, was based on a blind taste test and found people actually preferred the taste of recycled water over conventional tap water.

However, "The idea of recycled wastewater in general evokes disgust reactions," Harmon said at the time.

This idea was addressed in the psychologists' latest research. If people disgusted by the notion of recycled wastewater are educated on its safety and benefits, will their attitudes change? And, will they change their behaviors?

In the research paper, "disgust" is defined as "a strong repulsion to a potentially harmful substance." In addition to disgust, the research considered other factors that dissuade people from using recycled water, including misinformation, ignorance, and people's desire to conform to social norms.

The research involved three separate studies and a total of 886 participants.

In study one, half of the subjects viewed a brief, pro-conservation internet video. The other group watched a short video about water, but not about conservation: it recounted the urban myth that crocodiles live in the sewer system of New York City.

Researchers found both groups failed to budge in their willingness to endorse sustainable water use. Ninety-six percent of participants cited disgust as the reason. Distilling the reasoning even more, the researchers asked whether participants were motivated by cleanliness or fear of illness. Sixty-five percent said cleanliness.

In study two, the videos were used again, but this time an educational video demonstrating that recycled wastewater is contaminant-free was also shown, to address the disgust reaction. The pro-conservation and disgust videos had a "small but unsubstantial effect on people's willingness to use recycled wastewater," the research found.

In the last study, participants viewed all three videos. But this time, after completing a post-video survey, they were offered a bottle of water labeled "SMARTdrop--Pure Recycled Water" and asked to sign a conservation petition.

Researchers hypothesized that participants who watched the video addressing disgust would be more likely to accept the water and sign the petition. In fact, a similar proportion across all three groups (about two-thirds) took the water bottle and signed the petition.

The results of the three studies run counter to previous findings that assert media information can influence people's water conservation attitudes. Instead, they show that internet messages may encourage people to view water sustainability more positively, but they do not encourage more sustainable water behaviors.

The article drawn from the research, "Influence of Internet-Based Messages and Personal Motivations on Water-Use Decisions," discourages relying solely on pro-recycled-wastewater internet videos about water scarcity and conservation. Instead, the researchers urge a focus on the more visceral roadblock of disgust. As an example, they suggest a video stressing the extent of water purification in recycling plants as part of larger campaigns to change behaviors.

"Disgust is an exceptionally robust motivation that may require stronger intervention to overcome," Harmon said.

Credit: 
University of California - Riverside

Hot electrons harvested without tricks

image: This is the setup for ultrafast spectroscopy used in the study.

Image: 
Maxim Pshenichnikov, University of Groningen

Semiconductors convert energy from photons (light) into an electron current. However, some photons carry too much energy for the material to absorb. These photons produce 'hot electrons', and the excess energy of these electrons is converted into heat. Materials scientists have been looking for ways to harvest this excess energy. Scientists from the University of Groningen and Nanyang Technological University (Singapore) have now shown that this may be easier than expected by combining a perovskite with an acceptor material for 'hot electrons'. Their proof of principle was published in Science Advances on 15 November.

In photovoltaic cells, semiconductors will absorb photon energy, but only from photons that have the right amount of energy: too little and the photons pass right through the material, too much and the excess energy is lost as heat. The right amount is determined by the bandgap: the difference in energy levels between the highest occupied molecular orbital (HOMO) and the lowest unoccupied molecular orbital (LUMO).
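The absorption condition described here is easy to express numerically. The sketch below is purely illustrative: the 1.6 eV bandgap is a typical value for a hybrid perovskite rather than a figure from the study, and the 0.5 eV "hot" cutoff is an arbitrary threshold chosen for the example.

```python
# Physical constants (CODATA exact values)
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electronvolt

def photon_energy_ev(wavelength_nm: float) -> float:
    """Photon energy in eV for a given vacuum wavelength in nm."""
    return H * C / (wavelength_nm * 1e-9) / EV

def fate(wavelength_nm: float, bandgap_ev: float, hot_excess_ev: float = 0.5) -> str:
    """Classify what a semiconductor does with a photon of this wavelength."""
    excess = photon_energy_ev(wavelength_nm) - bandgap_ev
    if excess < 0:
        return "transmitted"    # too little energy: passes right through
    if excess > hot_excess_ev:
        return "hot electrons"  # large excess, normally lost as heat
    return "absorbed"           # close to the bandgap: energy fully used

# 1.6 eV is a typical hybrid-perovskite bandgap (illustrative value)
for wl in (1000, 700, 450):
    print(wl, "nm ->", fate(wl, bandgap_ev=1.6))
```

A 1000 nm photon carries about 1.24 eV and passes through, while a 450 nm photon carries about 2.76 eV, leaving more than 1 eV of excess energy in the electron.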

Nanoparticles

'The excess energy of hot electrons, produced by the high-energy photons, is very rapidly absorbed by the material as heat,' explains Maxim Pshenichnikov, Professor of Ultrafast Spectroscopy at the University of Groningen. To fully capture the energy of hot electrons, materials with a larger bandgap must be used. However, this means that the hot electrons should be transported to this material before losing their energy. The current general approach to harvesting these electrons is to slow down the loss of energy, for example by using nanoparticles instead of bulk material. 'In these nanoparticles, there are fewer options for the electrons to release the excess energy as heat,' explains Pshenichnikov.

Together with colleagues from the Nanyang Technological University, where he was a visiting professor for the past three years, Pshenichnikov studied a system in which an organic-inorganic hybrid perovskite semiconductor was combined with the organic compound bathophenanthroline (bphen), a material with a large bandgap. The scientists used laser light to excite electrons in the perovskite and studied the behavior of the hot electrons that were generated.

Barrier

'We used a method called pump-push probing to excite electrons in two steps and study them at femtosecond timescales,' explains Pshenichnikov. This allowed the scientists to produce electrons in the perovskites with energy levels just above the bandgap of bphen, without exciting electrons in the bphen. Therefore, any hot electrons in this material would have come from the perovskite.

The results showed that hot electrons from the perovskite semiconductor were readily absorbed by the bphen. 'This happened without the need to slow down these electrons and, moreover, in bulk material. So, without any tricks, the hot electrons were harvested.' However, the scientists noticed that the energy required was slightly higher than the bphen bandgap. 'This was unexpected. Apparently, some extra energy is needed to overcome a barrier at the interface between the two materials.'

Nevertheless, the study provides a proof of principle for the harvesting of hot electrons in bulk perovskite semiconductor material. Pshenichnikov: 'The experiments were performed with a realistic amount of energy, comparable to visible light. The next challenge is to construct a real device using this combination of materials.'

Credit: 
University of Groningen

Vaping less harmful than smoking for vascular health, major study finds

image: Professor Jacob George, Professor of Cardiovascular Medicine and Therapeutics at Dundee and Chief Investigator of the VESUVIUS trial.

Image: 
Dominic Glasgow / University of Dundee

Study finds significant improvements in vascular health of chronic smokers who transition to e-cigarettes

Women see greater health benefits than men following switch to e-cigarettes

VESUVIUS is the largest study to date on the vascular impact of e-cigarettes versus tobacco cigarettes

Cigarette smokers who switch to nicotine-containing vaporisers could significantly improve their vascular health, a major University of Dundee study has concluded.

A two-year trial hosted by the University's School of Medicine found that smokers who switched to e-cigarettes demonstrated a significant improvement in their vascular health within four weeks, with women experiencing greater gains by switching than men. The study also found that participants who transitioned achieved greater improvement compared to those who continued to use both tobacco cigarettes and e-cigarettes.

Named VESUVIUS, the British Heart Foundation-commissioned study is believed to be the largest undertaken to date in determining the impact of e-cigarettes on heart health, with the findings published in the Journal of the American College of Cardiology.

Professor Jacob George, Professor of Cardiovascular Medicine and Therapeutics at Dundee and Chief Investigator of the trial, said that while e-cigarettes were found to be less harmful, the devices may still carry health risks.

"It is crucial to emphasise that e-cigarettes are not safe, just less harmful than tobacco cigarettes when it comes to vascular health. Smoking of any kind is a preventable risk factor for heart disease.

"They should not be seen as harmless devices for non-smokers or young people to try. However, for chronic tobacco smokers there were significant improvements in vascular function within a month of switching from a tobacco cigarette to an e-cigarette.

"To put into context, each percentage point improvement in vascular function results in a 13% reduction in cardiovascular event rates such as heart attack. By switching from cigarettes to e-cigarettes we found an average percentage point improvement of 1.5 within just one month. This represents a significant improvement in vascular health. We also found that, in the short term at least, regardless of whether the e-cigarette does or does not contain nicotine, a person will see vascular health improvements compared to smoking a traditional cigarette. The longer term impact of nicotine requires further study and follow-up.

"Women benefitted significantly more than men by making the switch to e-cigarettes, and we are still looking into the reasons for this. Our research also revealed that if a person had smoked less than 20 pack years, their blood vessel stiffness also significantly improved compared to those who smoked more than 20 pack years."

The VESUVIUS study recruited 114 long-term cigarette smokers who had smoked at least 15 cigarettes per day for at least two years. Participants were allocated to one of three groups for one month: those who continued smoking tobacco cigarettes, those who switched to e-cigarettes with nicotine, and those who switched to e-cigarettes without nicotine. Participants were monitored throughout the test period and underwent vascular testing before and after the trial.

According to UK Government statistics, approximately 6% of Britain's adult population use a vaporiser, though previously the impact of the devices on vascular health has been unclear. While the majority of e-cigarette liquids contain nicotine, they do not contain tobacco, which is estimated to have caused more than 7.1 million deaths worldwide in 2016. The number of additional chemicals is also typically much lower than in cigarettes.

Professor Jeremy Pearson, Associate Medical Director at the British Heart Foundation, said, "Our hearts and blood vessels are the hidden victims of smoking. Every year in the UK, 20,000 people die from heart and circulatory disease caused by smoking cigarettes - that's more than 50 people a day, or two deaths every hour. Stopping smoking is the single best thing you can do for your heart health.

"This study suggests that vaping may be less harmful to your blood vessels than smoking cigarettes. Within just one month of ditching tobacco for electronic cigarettes, people's blood vessel health had started to recover.

"Just because e-cigarettes may be less harmful than tobacco doesn't mean they are completely safe. We know they contain significantly fewer of the harmful chemicals, which can cause diseases related to smoking, but we still don't know the long term impact on the heart and circulation, or other aspects of health. E-cigarettes and vaping should never be taken up by people who don't already smoke, but could be a useful tool to help people to stop smoking completely."

Meanwhile, Scotland's Public Health Minister, Joe FitzPatrick MSP, said, "I welcome the publication of this report, which contributes to the ongoing debate about the place of e-cigarettes in our communities. It is good to see important and relevant studies like this being produced in Scotland - justifying our reputation as being one of the leading centres for medical research.

"While research shows that switching to e-cigarettes can lead to vascular health benefits for chronic tobacco smokers, access to e-cigarettes needs to be controlled carefully - they are simply not products for children or non-smokers."

Credit: 
University of Dundee

Mapping disease outbreaks in urban settings using mobile phone data

Researchers from EPFL and MIT have shown that human mobility is a major factor in the spread of vector-borne diseases such as malaria and dengue even over short intra-city distances. In a paper published in Scientific Reports, the team compares different mobility models and concludes that having access to mobile phone location data can prove crucial in understanding disease transmission dynamics - and, ultimately, in stopping an outbreak from evolving into an epidemic. Yet, according to the researchers, this kind of information is hard to come by. They recommend bringing in new legislation to fill a legal void and enable scientists, NGOs and political decision-makers to access people's phone location data for public health purposes.

"Urbanization, mobility, globalization and climate change could be all factors in the emergence of vector-borne diseases, even here in Europe," explains Emanuele Massaro, the paper's lead author and a scientist at EPFL's Laboratory for Human-Environment Relations in Urban Systems (HERUS), which is led by Claudia R. Binder. "Until now, most research has examined how mobility affects the spread of infections in larger areas such as countries or regions. In this study, we focused on the same question, but this time in towns and cities. We also wanted to explore when people's mobile phone location data might prove useful."

The authors studied the interplay between human mobility and the 2013 and 2014 dengue outbreaks in Singapore. They found that even low levels of mobility can cause the epidemic to spread, underscoring the need for an effective spatial distribution model.

Dengue is a viral disease carried by the Aedes aegypti mosquito. It occurs in the tropics and subtropics, and is particularly prevalent in rural areas and poor urban communities. Symptoms include headache and fever, and mortality rates vary from 1% when treated to as high as 20% when left untreated. According to the World Health Organization, the incidence of dengue has increased 30-fold worldwide over the past 50 years. Some 3.9 billion people in 128 countries - almost half of the world's population - are exposed to the virus.

Comparing models

The researchers used an agent-based transmission model in which humans and mosquitoes are represented as agents that go through the epidemic stages of dengue. Using digital simulations, they compared how the system responded to an outbreak against actual reported cases from 2013 and 2014 in Singapore, where a further spike in cases has been recorded this year.

The team then compared four different mobility models, each using different datasets: mobile phone location data, census records, random mobility, and theoretical assumptions. In each model, citizens were assigned two locations - home and work - as places they visit daily and could potentially become infected. The mobile phone model was based on anonymized device data sourced from a Singaporean mobile operator, using call, text and other activity records to pinpoint users' home and work addresses.
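The setup described above, agents that cycle through epidemic stages while commuting between two fixed sites, can be sketched in a few lines. This is a toy illustration under invented assumptions (number of sites, infection probability, recovery time are all made up), not the study's calibrated dengue model, and it omits the mosquito agents entirely.

```python
import random

def simulate(n_agents=200, n_sites=20, beta=0.3, recovery_days=7,
             days=60, seed=1):
    """Toy agent-based epidemic on a set of shared sites.

    Each agent is assigned two sites (home and work) and mixes with
    everyone at both sites each day; a susceptible agent visiting a
    site that also hosts an infectious agent is infected with
    probability beta, and infected agents recover after recovery_days.
    Returns the daily count of infectious agents.
    """
    rng = random.Random(seed)
    home = [rng.randrange(n_sites) for _ in range(n_agents)]
    work = [rng.randrange(n_sites) for _ in range(n_agents)]
    state = ["S"] * n_agents  # S = susceptible, I = infected, R = recovered
    timer = [0] * n_agents    # days spent in the infectious state
    state[0] = "I"            # seed a single infection

    history = []
    for _ in range(days):
        # sites visited by at least one infectious agent today
        hot = {home[i] for i in range(n_agents) if state[i] == "I"}
        hot |= {work[i] for i in range(n_agents) if state[i] == "I"}
        for i in range(n_agents):
            if (state[i] == "S" and (home[i] in hot or work[i] in hot)
                    and rng.random() < beta):
                state[i] = "I"
        for i in range(n_agents):
            if state[i] == "I":
                timer[i] += 1
                if timer[i] >= recovery_days:
                    state[i] = "R"
        history.append(sum(s == "I" for s in state))
    return history
```

Swapping the random home/work assignment for locations derived from phone, census or theoretical data is precisely the substitution the four mobility models make.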

Useful during an outbreak

The researchers demonstrated that the mobile phone data and census models were effective at predicting the spatial distribution of dengue cases in Singapore, and that such data could be obtained without infringing on people's privacy. Their findings invite further discussion about the merits and drawbacks of using mobile phone data to model disease outbreaks, as well as other potential applications. "In an emergency, having accurate information makes all the difference," says Massaro. "That's why phone location data is better than annual census records. The problem is that the data is owned by private companies. We need to think seriously about changing the law around accessing this kind of information - not just for scientific research, but for wider prevention and public health reasons."

The team's model could equally be applied to other vector-borne diseases, which, led by malaria, together account for 17% of all infectious diseases. The UN estimates that over 80% of the world's population is exposed to at least one vector-borne disease, with over 50% exposed to two or more.

Credit: 
Ecole Polytechnique Fédérale de Lausanne

Atomically dispersed Ni is coke-resistant for dry reforming of methane

image: DFT calculation of CH4 decomposition.

Image: 
QIAO Botao

Dry reforming of methane (DRM) is the process of converting methane (CH4) and carbon dioxide (CO2) into synthesis gas (syngas). Since CO2 and CH4 are the two most important atmospheric greenhouse gases (GHGs), as well as abundant and low-cost carbon sources, DRM has the potential to mitigate rising GHG emissions and simultaneously realize clean(er) fossil fuel utilization.
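For reference, the overall DRM stoichiometry is the standard textbook reaction (stated here for context, not quoted from the paper):

```latex
\mathrm{CH_4} + \mathrm{CO_2} \;\longrightarrow\; 2\,\mathrm{CO} + 2\,\mathrm{H_2},
\qquad \Delta H^{\circ}_{298} \approx +247~\mathrm{kJ\,mol^{-1}}
```

The reaction is strongly endothermic, which is why DRM must run at the high temperatures at which coking and sintering become pressing problems.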

Ni catalysts are the most promising candidates for DRM due to their low cost and high initial activity. However, in situ catalyst deactivation caused mainly by carbon deposition (coking) has hindered their commercial use.

Scientists at the Dalian Institute of Chemical Physics (DICP) of the Chinese Academy of Sciences have now developed a completely coke-resistant Ni-based single-atom catalyst (SAC). Their findings were published in Nature Communications.

The researchers first developed a hydroxyapatite (HAP)-supported Ni SAC, studied its DRM performance, and found that both the HAP-supported Ni SAC and a HAP-supported Ni nanocatalyst deactivated quickly during high-temperature DRM.

However, characterization of the used samples revealed that the deactivation mechanisms were entirely different: deactivation of the nanocatalyst originated from coke, while deactivation of the Ni SAC stemmed from the sintering of Ni single atoms, without any coke formation. These results implied that a highly stable and coke-resistant Ni SAC could be obtained if the Ni single atoms were effectively stabilized during the reaction.

The scientists then doped HAP with cerium to stabilize Ni single atoms through strong metal-support interaction. The resulting HAP-Ce-supported Ni SAC was highly stable upon reaction, without any coke formation.

Further studies revealed that the Ni SAC is intrinsically coke-resistant. In other words, no coke was formed at all during the reaction, as opposed to coke being formed and then removed. The coke resistance of the Ni SAC derives from the catalyst's unique capacity for selective activation of the first C-H bond in CH4.

Credit: 
Chinese Academy of Sciences Headquarters

Volcanoes under pressure

image: This is the eruption of Mount Merapi on 11 May 2018.

Image: 
Université de Strasbourg/Uppsala University/Technical University of Munich/The University of Leeds/Universitas Gadjah Mada/German Research Center for Geosciences

When will the next eruption take place? Examination of samples from Indonesia's Mount Merapi shows that the explosivity of stratovolcanoes rises when mineral-rich gases seal the pores and microcracks in the uppermost layers of stone. These findings open up new possibilities for predicting an eruption.

Mount Merapi on Java is among the most dangerous volcanoes in the world. To warn the population of a coming eruption in time, geoscientists have usually relied on seismic measurements, which reveal underground movements.

An international team including scientists from the Technical University of Munich (TUM) has now found another indication for an upcoming eruption in the lava from the peak of Mount Merapi: The uppermost layer of stone, the "plug dome", becomes impermeable for underground gases before the volcano erupts.

"Our investigations show that the physical properties of the plug dome change over time," says Prof. H. Albert Gilg from the TUM Professorship for Engineering Geology . "Following an eruption the lava is still easily permeable, but this permeability then sinks over time. Gases are trapped, pressure rises and finally the plug dome bursts in a violent explosion."

Mount Merapi as a model volcano

Using six lava samples (one from the 2006 eruption of Mount Merapi, the others from the 1902 eruption), the researchers were able to ascertain alterations in the stone. Investigation of pore volumes, densities, mineral composition and structure revealed that permeability dropped by four orders of magnitude as stone alteration increased. The cause is newly formed minerals, in particular potassium and sodium aluminum sulfates, which seal the fine cracks and pores in the lava.

The cycle of destruction

Computer simulations confirmed that the reduced permeability of the plug dome was actually responsible for the next eruption. The models show that a stratovolcano like Mount Merapi undergoes three phases: First, after an eruption when the lava is still permeable, outgassing is possible; in the second phase the plug dome becomes impermeable for gases, while at the same time the internal pressure continuously increases; in the third phase the pressure bursts the plug dome.

Photographs of Mount Merapi from the period before and during the eruption of May 11, 2018 support the three-phase model: The volcano first emitted smoke, then seemed to be quiet for a long time until the gas found an escape and shot a fountain of ashes kilometers up into the sky.

"The research results can now be used to more reliably predict eruptions," says Gilg. "A measurable reduction in outgassing is thus an indication of an imminent eruption."

Mount Merapi is not the only volcano where outgassing measurements can help in the timely prediction of a pending eruption. Stratovolcanoes are a frequent source of destruction throughout the Pacific. The most famous examples are Mount Pinatubo in the Philippines, Mount St. Helens in the western USA and Mount Fuji in Japan.

Credit: 
Technical University of Munich (TUM)

NASA identifies new Eastern Pacific tropical storm

image: On Nov. 14, the MODIS instrument that flies aboard NASA's Terra satellite provided a visible image of then Tropical Depression 20 in the Eastern Pacific. TD20 continued organizing and became Tropical Storm Raymond on Nov. 15.

Image: 
NASA Worldview

NASA's Terra satellite captured an image of developing Tropical Storm Raymond in the Eastern Pacific Ocean.

On Nov. 14, the MODIS or Moderate Resolution Imaging Spectroradiometer instrument that flies aboard NASA's Terra satellite provided a visible image of then Tropical Depression 20 (TD20) in the Eastern Pacific. The image showed that the storm was being affected by outside winds, pushing much of the clouds and showers to the east of the center. Despite that wind shear, TD20 continued organizing and became Tropical Storm Raymond on Nov. 15.

In general, wind shear is a measure of how the speed and direction of winds change with altitude. Tropical cyclones are like rotating cylinders of winds. Each level needs to be stacked on top of the others vertically in order for the storm to maintain strength or intensify. Wind shear occurs when winds at different levels of the atmosphere push against the rotating cylinder of winds, weakening the rotation by pushing it apart at different levels.

By Nov. 15 at 10 a.m. EST, TD20 had strengthened and organized into a tropical storm and was named Raymond. At that time, the cyclone was located several hundred miles south of the Baja California peninsula. NOAA's National Hurricane Center (NHC) noted that the system was still being sheared from the west; however, recent microwave and first-light visible imagery indicated that the center of the cyclone might be reforming closer to the stronger developing thunderstorms.

The center of Tropical Storm Raymond was located near latitude 14.1 degrees north and longitude 108.8 degrees west. That is about 610 miles (985 km) south of the southern tip of Baja California, Mexico. Raymond is moving toward the north-northwest near 7 mph (11 kph), and this general motion with a slight increase in forward speed is expected through today. A turn toward the north or north-northeast is forecast by late Saturday, Nov. 16.

The estimated minimum central pressure is 1005 millibars. Maximum sustained winds have increased to near 45 mph (75 kph) with higher gusts. Some additional strengthening is anticipated during the next day or so.

Weakening is forecast to occur by Sunday, and the system is predicted to degenerate into a remnant low by late Sunday or early Monday, Nov. 18.

NHC cautioned that interests in the southern Baja California peninsula should monitor the progress of Raymond. Raymond is expected to produce total rainfall accumulations of 2 to 4 inches across the southern portions of Baja California Sur. This rainfall could produce life-threatening flash floods.

NASA's Terra satellite is one in a fleet of NASA satellites that provide data for hurricane research.

Hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

Credit: 
NASA/Goddard Space Flight Center