Tech

Laser loop couples quantum systems over a distance

image: A loop of laser light connects the oscillations of a nanomechanical membrane and the spin of a cloud of atoms.

Image: 
University of Basel, Department of Physics

Quantum technology is currently one of the most active fields of research worldwide. It takes advantage of the special properties of quantum mechanical states of atoms, light, or nanostructures to develop, for example, novel sensors for medicine and navigation, networks for information processing, and powerful simulators for materials science. Generating these quantum states normally requires a strong interaction between the systems involved, such as between several atoms or nanostructures.

Until now, however, sufficiently strong interactions were limited to short distances. Typically, two systems had to be placed close to each other on the same chip at low temperatures or in the same vacuum chamber, where they interact via electrostatic or magnetostatic forces. Coupling them across larger distances, however, is required for many applications such as quantum networks or certain types of sensors.

A team of physicists, led by Professor Philipp Treutlein from the Department of Physics at the University of Basel and the Swiss Nanoscience Institute (SNI), has now succeeded for the first time in creating strong coupling between two systems over a greater distance across a room-temperature environment. In their experiment, the researchers used laser light to couple the vibrations of a 100-nanometer-thin membrane to the motion of the spin of atoms over a distance of one meter. As a result, each vibration of the membrane sets the spin of the atoms in motion and vice versa.

A loop of light acts as a mechanical spring

The experiment is based on a concept that the researchers developed together with the theoretical physicist Professor Klemens Hammerer from the University of Hanover. It involves sending a beam of laser light back and forth between the systems. "The light then behaves like a mechanical spring stretched between the atoms and the membrane, and transmits forces between the two," explains Dr. Thomas Karg, who carried out the experiments as part of his doctoral thesis at the University of Basel. In this laser loop, the properties of the light can be controlled such that no information about the motion of the two systems is lost to the environment, thus ensuring that the quantum mechanical interaction is not disturbed.

The researchers have now succeeded in implementing this concept experimentally for the first time and used it in a series of experiments. "The coupling of quantum systems with light is very flexible and versatile," explains Treutlein. "We can control the laser beam between the systems, which allows us to generate different types of interactions that are useful for quantum sensors, for example."

A new tool for quantum technologies

In addition to coupling atoms with nanomechanical membranes, the new method might also be used in several other systems; for example, when coupling superconducting quantum bits or solid-state spin systems used in quantum computing research. The new technique for light-mediated coupling could be used to interconnect such systems, creating quantum networks for information processing and simulations. Treutlein is convinced: "This is a new, highly useful tool for our quantum technology toolbox."

Credit: 
University of Basel

Quantum jump tipping the balance

image: Measurements at space-like temperatures: Pentatrap is located in a large superconducting magnet. The inside of the vessel is cooled to a temperature near absolute zero so that the disturbing thermal motion of the atoms is frozen out. Because individuals in the room would influence the measurements by their body temperature, among other things, nobody is allowed to enter the laboratory during the experiment. The system is remote-controlled.

Image: 
MPI for Nuclear Physics

A new door to the quantum world has been opened: when an atom absorbs or releases energy via the quantum leap of an electron, it becomes heavier or lighter. This can be explained by Einstein's theory of relativity (E = mc2). However, the effect is minuscule for a single atom. Nevertheless, the team of Klaus Blaum and Sergey Eliseev at the Max Planck Institute for Nuclear Physics has successfully measured this infinitesimal change in the mass of individual atoms for the first time. In order to achieve this, they used the ultra-precise Pentatrap atomic balance at the Institute in Heidelberg. The team discovered a previously unobserved quantum state in rhenium, which could be interesting for future atomic clocks. Above all, this extremely sensitive atomic balance enables a better understanding of the complex quantum world of heavy atoms.

Astonishing, but true: if you wind a mechanical watch, it becomes heavier. The same thing happens when you charge your smartphone. This can be explained by the equivalence of energy (E) and mass (m), which Einstein expressed in the most famous formula in physics: E = mc2 (c: speed of light in vacuum). However, this effect is so small that it completely eludes our everyday experience. A conventional balance would not be able to detect it.
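
For a rough sense of scale, the mass change follows directly from Einstein's relation: a stored energy E corresponds to an extra mass of E/c2. The short sketch below works through two illustrative cases; the energy values are assumed round numbers, not figures from the experiment.

```python
# Back-of-the-envelope illustration of mass-energy equivalence.
# The energy values are assumed round numbers, not measured results.

C = 299_792_458.0        # speed of light in vacuum, m/s
EV = 1.602176634e-19     # one electronvolt in joules

def mass_gain_kg(energy_joules: float) -> float:
    """Delta m = E / c^2."""
    return energy_joules / C**2

# A fully wound watch spring might store on the order of 1 joule.
print(f"wound watch : {mass_gain_kg(1.0):.2e} kg heavier")        # ~1.1e-17 kg

# A soft X-ray electronic excitation of order 100 eV in a single ion.
print(f"quantum jump: {mass_gain_kg(100 * EV):.2e} kg heavier")   # ~1.8e-34 kg
```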

But at the Max Planck Institute for Nuclear Physics in Heidelberg, there is a balance that can: Pentatrap. It can measure the minuscule change in mass of a single atom when an electron in it absorbs or releases energy via a quantum jump, thus opening a new world for precision physics. Such quantum jumps in the electron shells of atoms shape our world - whether in life-giving photosynthesis and general chemical reactions or in the creation of colour and our vision.

An ant on top of an elephant

Rima Schüssler, now a postdoctoral fellow at the Max Planck Institute for Nuclear Physics, has helped build Pentatrap since completing her Master's thesis in 2014. She is the lead author of a paper on an unexpected discovery made in a collaboration at the Max Planck PTB Riken Centre: In rhenium, there is a previously undiscovered electronic quantum state with special properties. Schüssler uses the following analogy to describe the degree of sensitivity with which Pentatrap can detect the jump of an electron into this quantum state via the mass change of a rhenium atom: "By weighing a six-tonne elephant, we were able to determine whether a ten-milligram ant was crawling on it".

Pentatrap consists of five Penning traps. For such a trap to weigh an atom, the atom must be electrically charged (i.e. it must become an ion). The rhenium was stripped of 29 of its 75 electrons, leaving it highly charged, which dramatically increases the accuracy of the measurement. The trap captures this highly charged rhenium ion in a combination of a magnetic field and a specially shaped electric field. Inside, the ion travels on a circular path that is intricately twisted into itself. In principle, it can be thought of as a ball on a rope being swung through the air: driven with constant force, a heavier ball rotates more slowly than a lighter one.

An extremely long-lived quantum state in rhenium

In Pentatrap, two rhenium ions rotated alternately in the stacked traps. One ion was in the energetically lowest quantum state. When the second ion was generated, an electron was randomly excited into a higher state by supplying energy. In a sense, it was the wound watch. Because of the stored energy, it became marginally heavier and thus circulated slower than the first ion. Pentatrap precisely counts the number of revolutions per time unit. The difference in the number of revolutions yielded the increase in weight.
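
In essence, the measurement compares rotation frequencies: for an ion of charge q circling in a magnetic field B, the revolution (cyclotron) frequency is f = qB/(2πm), so a slightly heavier ion circulates slightly more slowly. The sketch below illustrates this principle with assumed round-number values for the field and the excitation energy; it is a simplified picture, not the actual Pentatrap analysis.

```python
import math

# Illustration of the Penning-trap principle: cyclotron frequency
# f = q*B / (2*pi*m), so a heavier ion circulates more slowly.
# The field strength and excitation energy below are assumptions.

E_CHARGE = 1.602176634e-19     # elementary charge, C
U_MASS   = 1.66053906660e-27   # atomic mass unit, kg
C        = 299_792_458.0       # speed of light, m/s

q = 29 * E_CHARGE              # rhenium ion stripped of 29 electrons
B = 7.0                        # assumed magnetic field, tesla
m_ground = 187.0 * U_MASS      # rhenium ion in the lowest state (approx.)

def cyclotron_freq(mass_kg: float) -> float:
    return q * B / (2.0 * math.pi * mass_kg)

# Assume an excitation energy of ~200 eV stored in the electron shell.
delta_m = 200 * E_CHARGE / C**2
m_excited = m_ground + delta_m

f_ground, f_excited = cyclotron_freq(m_ground), cyclotron_freq(m_excited)
print(f"ground state  : {f_ground:,.3f} Hz")
print(f"excited state : {f_excited:,.3f} Hz")
print(f"relative shift: {(f_ground - f_excited) / f_ground:.1e}")  # ~1e-9
```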

Using this method, the team discovered an extremely long-lived quantum state in rhenium. It is metastable, meaning it decays only after a certain lifetime; according to calculations by theoreticians from the institute, led by Zoltán Harman and Christoph H. Keitel, together with the University of Heidelberg and the Kastler Brossel Laboratory in Paris, this lifetime is 130 days. The energy of the quantum state also agrees quite well with model calculations using state-of-the-art quantum mechanical methods.

Possible application in future atomic clocks

Such excited electronic states in highly charged ions are interesting for basic research as well as for possible application in future atomic clocks as researched by the working group of José Crespo López-Urrutia at the Institute in cooperation with the Physikalisch-Technische Bundesanstalt (PTB). For them, the metastable state in rhenium is attractive for several reasons. First, because of its longevity, it corresponds to a very sharply defined frequency of the electronic transition. Second, the electron can be excited with soft X-ray light to jump into this quantum state. In principle, such a clock could tick faster and therefore even more accurately than the current generation of optical atomic clocks. However, according to Ekkehard Peik, who is in charge of the "Time and Frequency" Department at PTB and who was not involved in the work, it is still too early to speculate whether the discovery could be suitable for a new generation of atomic clocks.

"Nevertheless, this new method for discovering long-lived quantum states is spectacular", stresses the physicist. He imagines that atomic clocks working with such new quantum states could initially offer a new test field for basic research. Because the rhenium ions lack many mutually shielding electrons, the remaining electrons feel the electric field of the atomic nucleus particularly strongly. The electrons therefore race around the nucleus at such high speeds that their motion must be described using Einstein's theory of special relativity. With the new atomic balance, it would also be possible to test with high precision whether special relativity and quantum theory interact as described by this theory.

In general, the new atomic balance offers a novel access to the quantum-like inner life of heavier atoms. Because these consist of many particles - electrons, protons, and neutrons - they cannot be calculated exactly. The atomic models for theoretical calculations are therefore based on simplifications, and these can now be checked extremely accurately. It might be possible to use such atoms as probes in the search for unknown particles, which can be detected only by the extremely weak gravitational force. This dark matter is one of the greatest unsolved mysteries of physics.

Credit: 
Max-Planck-Gesellschaft

2D oxide flakes pick up surprise electrical properties

image: Electrets -- electrons trapped in defects in two-dimensional molybdenum dioxide -- give the material piezoelectric properties, according to Rice University researchers. The defects (blue) appear in the material during formation in a furnace, and generate an electric field when under pressure.

Image: 
Ajayan Research Group/Rice University

HOUSTON - (May 7, 2020) - Rice University researchers have found evidence of piezoelectricity in lab-grown, two-dimensional flakes of molybdenum dioxide.

Their investigation showed the surprise electrical properties are due to electrons trapped in defects throughout the material, which is less than 10 nanometers thick. They characterize these charges as electrets, which appear in some insulating materials and generate internal and external electric fields.

Piezoelectricity is likewise a property of materials that respond to stress by generating an electric voltage across their surfaces or generate mechanical strain in response to an applied electric field. It has many practical and scientific uses, from the conversion of a wiggling guitar string into an electrical signal to scanning microscopes like those used to make the new finding.

The researchers at Rice's Brown School of Engineering found their micron-scale flakes exhibit a piezoelectric response that is as strong as that observed in such conventional 2D piezoelectric materials as molybdenum disulfide.

The report by Rice materials scientist Pulickel Ajayan and collaborators appears in Advanced Materials.

The key appears to be defects that make molybdenum dioxide's crystal lattice imperfect. When strained, the dipoles of electrons trapped in these defects seem to align, as with other piezoelectric materials, creating an electric field leading to the observed effect.

"Super thin, 2D crystals continue to show surprises, as in our study," Ajayan said. "Defect engineering is a key to engineer properties of such materials but is often challenging and hard to control."

"Molybdenum dioxide isn't expected to show any piezoelectricity," added Rice postdoctoral researcher Anand Puthirath, a co-corresponding author of the paper. "But because we're making the material as thin as possible, confinement effects come into the picture."

He said the effect appears in molybdenum dioxide flakes grown by chemical vapor deposition. Stopping the growth process at various points gave the researchers some control over the defects' density, if not their distribution.

Lead author and Rice alumna Amey Apte added the researchers' single-chemical, precursor-based vapor deposition technique "helps in the reproducibility and clean nature of growing molybdenum oxide on a variety of substrates."

The researchers found the piezoelectric effect is stable at room temperature for significant timescales. The molybdenum dioxide flakes remained stable at temperatures up to 100 degrees Celsius (212 degrees Fahrenheit). But annealing them for three days at 250 C (482 F) eliminated the defects and halted the piezoelectric effect.

Puthirath said the material has many potential applications. "It can be used as an energy harvester, because if you strain this material, it will give you energy in the form of electricity," he said. "If you give it voltage, you induce mechanical expansion or compression. And if you want to mobilize something at the nanoscale, you can simply apply voltage and this will expand and move that particle the way you want."

Credit: 
Rice University

Variance in tree species results in the cleanest urban air

image: The simulation was used to model the dispersion of small particles generated by traffic under five different tree boulevard options, also comparing particle concentrations to an entirely treeless option.

Image: 
Sasu Karttunen

What kind of an effect do trees have on aerosol particle concentrations in cities? Modelling carried out at the University of Helsinki revealed that the air was cleanest on the street level with three rows of trees of variable height situated along boulevard-type city street canyons.

A study carried out at the Institute for Atmospheric and Earth System Research (INAR) of the University of Helsinki modelled how different street-tree alternatives and the location of the trees affect air quality on the pedestrian level. The study was carried out in collaboration by the University of Helsinki, the City of Helsinki and the Finnish Meteorological Institute.

In the end, the best option was found to be a row of taller common lime trees (Tilia x vulgaris) in the middle of the street canyon and lower Swedish whitebeams (Sorbus intermedia) lining the sides.

"In that case, the air was one-fifth cleaner compared to the least favourable option, that is the one with four rows of trees of equal height," says Associate Professor Leena Järvi, head of the research group.

Järvi considers the difference surprisingly marked. Another surprising finding was that metre-high hedgerows planted under the tree row closest to the kerb had no practical impact on the air quality of the pavements.

Trees prevent pollutant ventilation

Globally, air pollution kills approximately seven million people every year. Even in a relatively clean country such as Finland, hundreds of people die as a result of air pollution every year (1,560 in 2016), which makes reducing small-particle concentrations important.

The City of Helsinki is currently designing new city boulevards to replace a few of the entry motorways in order to meet the city's densification needs. These boulevards are expected to carry heavy traffic while serving large numbers of residents and pedestrians at the same time. As one means of improving local air quality, the city is interested in vegetation, which also has multiple positive effects on well-being and aesthetics.

The cleanest air in street canyons would be achieved without any trees at all, as street trees reduce wind speed and, thus, impair the airing out of traffic pollutants from the street canyon. At the same time, vegetation is needed, for example, for ensuring wellbeing, which makes a treeless option unrealistic for planning.

In the options investigated, the number of tree rows ranged between two and four, with the four-row option being four metres wider and the two-row option four metres narrower than the three-row option, as stipulated by the rules of urban planning. In the option with three rows of trees, the effect of hedgerows located beneath the trees and that of various tree species was investigated as well.

Particle size also matters

The attachment of particles to vegetation does reduce the particle concentration in the air, but the effect is smaller than the increase brought about by the deceleration of wind, that is, the reduced ventilation of the street canyon. The smallest aerosol particles were more apt to bind with vegetation than the larger ones, which is why increasing the number of trees raised the concentration of the smallest particles in the air less than that of larger particles.

With trees of uniform height in three rows, the concentration of small particles under 10 micrometres in size grew by 88%, and even the concentration of particles under 2.5 micrometres increased by 42% compared to the entirely treeless street canyon. The option with three rows of trees of different heights, which was the cleanest option, increased the concentration of small particles under 10 micrometres by 75% and that of small particles under 2.5 micrometres by 35%.

The study applied the high-quality PALM model designed for air quality modelling, which enables the assessment of air quality differences at a very fine resolution, that is, to a high degree of precision. The modelling was carried out on the CSC supercomputer, as a regular desktop computer would have needed roughly a year to compute each option, altogether some 12 years when all wind directions and tree options are taken into consideration. For the supercomputer, the task only took approximately 14 days to complete.

Credit: 
University of Helsinki

GW survey evaluates influence of social media in attracting patients

WASHINGTON (May 7, 2020) - Patients often do not take social media into consideration when looking for a dermatologist, according to a survey from researchers at the George Washington University. The survey was published recently in the Journal of Drugs in Dermatology.

As of 2019, 79% of Americans have a social media presence on platforms such as Facebook, Twitter, and Instagram. Many dermatologists consider social media to be a useful tool for building their practices and recruiting patients. However, limited data exists about whether a provider's social media presence is a driver in attracting new patients to their practice.

"A rapidly growing number of dermatologists are advocating for the value of social media to promote their practices," said Adam Friedman, MD, interim chair of the Department of Dermatology at the GW School of Medicine and Health Sciences and senior author on the study. "Only one other survey has been conducted on patient perception of social media. There hasn't been enough to show us how effective social media is as a marketing tool for dermatologists."

The GW research team distributed a 10-question online survey to a diverse patient population to evaluate their perceptions of social media and what aspects of a dermatologist's site are the most helpful. Only 25% of respondents aged 18-30 years old thought social media was extremely or very important, suggesting that leaning on social media may not be the best way to grow a practice.

The results also indicated that respondents who did use social media for these purposes were interested in seeing patient education content, patient reviews, and dermatologists' experience levels rather than personal information.

"While patients overall may not rely on social media to select a dermatologist nor be interested in nonmedical content, many of our respondents did express interest in educational content written by their dermatologists on social media," Friedman said. "Practitioners should still count social media as a tool in building their practices and engaging their current patients, however, it should be one of many methods that they rely on to recruit new patients."

The authors say that further research needs to be done to determine whether social media is an effective educational tool for dermatologists.

Credit: 
George Washington University

The great unconformity

The geologic record is exactly that: a record. The strata of rock tell scientists about past environments, much like pages in an encyclopedia. Except this reference book has more pages missing than it has remaining. So geologists are tasked not only with understanding what is there, but also with figuring out what's not, and where it went.

One omission in particular has puzzled scientists for well over a century. First noticed by John Wesley Powell in 1869 in the layers of the Grand Canyon, the Great Unconformity, as it's known, accounts for more than one billion years of missing rock in certain places.

Scientists have developed several hypotheses to explain how, and when, this staggering amount of material may have been eroded. Now, UC Santa Barbara geologist Francis Macdonald and his colleagues at the University of Colorado, Boulder and at Colorado College believe they may have ruled out one of the more popular of these. Their study appears in the Proceedings of the National Academy of Sciences.

"There are unconformities all through the rock record," explained Macdonald, a professor in the Department of Earth Science. "Unconformities are just gaps in time within the rock record. This one's called the Great Unconformity because it was thought to be a particularly large gap, maybe a global gap."

A leading thought is that glaciers scoured away kilometers of rock around 720 to 635 million years ago, during a time known as Snowball Earth, when the planet was completely covered by ice. This hypothesis even has the benefit of helping to explain the rapid emergence of complex organisms shortly thereafter, in the Cambrian explosion, since all this eroded material could have seeded the oceans with tremendous amounts of nutrients.

Macdonald was skeptical of this reasoning. Although analogues of the Great Unconformity appear throughout the world -- with similar amounts of rock missing from similar stretches of time -- they don't line up perfectly. This casts doubt as to whether they were truly eroded by a global event like Snowball Earth.

Part of the challenge of investigating the Great Unconformity is that it happened so long ago, and the Earth is a messy system. "These rocks have been buried and eroded multiple times through their history," Macdonald said.

Fortunately, the team was able to test this hypothesis using a technique called thermochronology. A few kilometers below the Earth's surface, the temperature begins to rise as you get closer to the planet's hot mantle. This creates a temperature gradient of roughly 50 degrees Celsius for every kilometer of depth. And this temperature regime can become imprinted in certain minerals.

As certain radioactive elements in rocks break down, helium-4 is produced. In fact, helium is constantly being generated, but the fraction retained in different minerals is a function of temperature. As a result, scientists can use the ratio of helium to thorium and uranium in certain minerals as a paleo-thermometer. This phenomenon enabled Macdonald and his coauthors to track how rock moved in the crust as it was buried and eroded through the ages.
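
As a rough illustration of how the gradient links temperature to depth: with the roughly 50-degrees-per-kilometer figure above and an assumed surface temperature, a mineral's helium "closure" temperature maps onto a burial depth. The numbers in the sketch below are illustrative assumptions, not values from the study.

```python
# Rough illustration of reading depth from temperature with a linear
# geothermal gradient. The numbers are assumptions, not study values.

SURFACE_TEMP_C = 10.0      # assumed mean surface temperature, deg C
GRADIENT_C_PER_KM = 50.0   # gradient quoted in the article, deg C per km

def depth_for_temperature(temp_c: float) -> float:
    """Depth (km) at which a given temperature is reached."""
    return (temp_c - SURFACE_TEMP_C) / GRADIENT_C_PER_KM

# Zircon (U-Th)/He systems are often quoted with closure temperatures
# of roughly 150-200 deg C; take 180 deg C as an assumed value.
closure_c = 180.0
print(f"{closure_c:.0f} deg C is reached near "
      f"{depth_for_temperature(closure_c):.1f} km depth")   # ~3.4 km

# Rock now at the surface that once sat hotter than this closure
# temperature must therefore have lost kilometres of overburden.
```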

"These unconformities are forming again and again through tectonic processes," Macdonald said. "What's really new is we can now access this much older history."

The team took samples from granite just below the boundary of the Great Unconformity at Pikes Peak in Colorado. They extracted grains of a particularly resilient mineral, zircon, from the stone and analyzed the radiogenic helium contained inside. The technique revealed that several kilometers of rock had been eroded from above this granite between 1,000 and 720 million years ago.

Importantly, this stretch of time definitively came before the Snowball Earth episodes. In fact, it lines up much better with the periods in which the supercontinent Rodinia was forming and breaking apart. This offers a clue to the processes that may have stricken these years from the geologic record.

"The basic hypothesis is that this large-scale erosion was driven by the formation and separation of supercontinents," Macdonald said.

The Earth's cycle of supercontinent formation and separation uplifts and erodes incredible extents of rock over long periods of time. And because supercontinent processes, by definition, involve a lot of land, their effects can appear fairly synchronous across the geologic record.

However, these processes don't happen simultaneously, as they would in a global event like Snowball Earth. "It's a messy process," Macdonald said. "There are differences, and now we have the ability to perhaps resolve those differences and pull that record out."

While Macdonald's results are consistent with a tectonic origin for these great unconformities, they don't end the debate. Geologists will need to complement this work with similar studies in other regions of the world in order to better constrain these events.

The mystery of the Great Unconformity is inherently tied to two of geology's other great enigmas: the rise and fall of Snowball Earth and the sudden emergence of complex life in the Ediacaran and Cambrian. Progress in any one could help researchers finally crack the lot.

"The Cambrian explosion was Darwin's dilemma," Macdonald remarked. "This is a 200-year old question. If we can solve that, we would definitely be rock stars."

Credit: 
University of California - Santa Barbara

Traffic pollution drops in lockdown -- but other risks revealed by Manchester experts

Traffic pollution for most parts of the UK is plummeting thanks to the COVID-19 lockdown but more urban ozone - a dangerous air pollutant which can cause airway inflammation in humans - is probably being generated, say experts from The University of Manchester.

The analysis was led by Hugh Coe, Professor of Atmospheric Composition, plus air pollution expert Dr James Allan from Manchester's Department of Earth and Environmental Sciences. Their findings have been submitted in response to a call for evidence from the government's Department for Environment, Food and Rural Affairs (Defra).

According to the Manchester research, levels of nitrogen oxides have fallen in most locations in the UK during mid-March and April, when lockdown has been in full force - but the level of decline ranges from 20 to 80 percent.

Manchester's city centre, for example, has seen a 70 per cent reduction in nitrogen oxides.

This drop can be attributed to the recent fall in traffic on the nation's roads, both private cars and public transport, as citizens were advised to stay at home to help prevent the spread of COVID-19.

"However, there is considerable site-to-site variability with some locations showing far less reduction than others," said Professor Coe. "In fact, a small number of sites have even shown a modest increase, for example in parts of Edinburgh.

"Whether this is due to changes in the number or type of vehicles now travelling in that particular area, changes in driving patterns or other causes is not clear but the reductions are certainly not uniform."

For example, levels of nitrogen oxides fall less in rural areas than in urban areas, and they are higher in the morning than later in the day. Unlike NO2, there was no evidence of a decrease in PM2.5 - tiny particulates that can make the air appear hazy.

"While these particle are produced by vehicles, they are also known to originate from domestic wood burning and chemical reactions involving emissions from industry and agriculture, so there has been no significant improvement in air quality in that regard," said Professor Coe.

At the same time, the Manchester team speculate that photochemical production of ozone may become more important in urban areas during summertime in these low NOx conditions.

This is an important finding because while ozone is extremely important for screening harmful solar ultraviolet (UV) radiation when present higher up in the atmosphere - it can be a dangerous air pollutant at the Earth's surface. Increasing surface ozone above natural levels is harmful to humans, plants, and other living systems because ozone reacts strongly to destroy or alter many biological molecules.

"Ozone is a strong oxidant and induces a range of health effects such as throat irritation and airway inflammation. It can reduce lung function and as a result worsens diseases such as bronchitis and asthma. In addition to human health impacts, ozone reduces plant growth and hence agricultural yields and chemically ages a wide range of polymers," explained Professor Coe

He added: "Observations in cities across the UK show marked decreases in nitrogen oxides but with corresponding increases in ozone during lockdown."

As nitrogen oxides fall, photochemical production may well become more efficient and can lead to higher ozone concentrations in summertime, as higher temperatures increase emissions of biogenic hydrocarbons from natural sources such as trees. These biogenic hydrocarbons significantly affect urban ozone levels.

As a result of the Manchester research, government and local authorities will need to be alert to the potential increase in urban ozone during lockdown.

The Manchester team used the government's Automatic Urban and Rural Network (AURN) to help gather their nationwide data and the University's own Manchester Air Quality Supersite (MAQS), located in Fallowfield on the University campus. The work is carried out through the Manchester Environment Research Institute, which has a theme dedicated to Pollution, Human Health and Wellbeing.

The AURN is the UK's largest automatic monitoring network and it includes automatic air quality monitoring stations measuring oxides of nitrogen (NOx), sulphur dioxide (SO2), ozone (O3), carbon monoxide (CO) and particulate matter (including PM10, PM2.5).

Nitrogen oxides (NOx) is a generic term that includes nitric oxide (NO) and nitrogen dioxide (NO2) and these gases contribute to air pollution, including the formation of smog and acid rain, as well as affecting tropospheric ozone.

Credit: 
University of Manchester

Prediction tool shows how forest thinning may increase Sierra Nevada snowpack

image: University of Nevada, Reno students perform maintenance on data-collection equipment to help with projects to maintain the Sierra Nevada forest.

Image: 
Photo by Adrian Harpold, University of Nevada, Reno.

RENO, Nev. - The forest of the Sierra Nevada mountains is an important resource for the surrounding communities in Nevada and California. Thinning the forest by removing trees by hand or using heavy machinery is one of the few tools available to manage forests. However, finding the best way to thin forests by removing select trees to maximize the forest's benefits for water quantity, water quality, wildfire risk and wildlife habitat remains a challenge for resource managers. The U.S. Forest Service is leading an effort to balance all these challenges in landscape-scale forest restoration planning as part of the Lake Tahoe West Restoration Partnership.

As part of this effort, University of Nevada, Reno's Adrian Harpold recently led a team in developing a modeling tool to focus on the issue of water quantity. The tool predicts how different approaches to thinning the forest impact snowpack accumulation in Lake Tahoe, which controls how much water is available for downstream communities such as Reno.

"The snowpack we've relied upon is under pressure from years of fire suppression that increased tree density, combined with the effects of climate change and warming temperatures," Harpold, natural resources & environmental science assistant professor with the College of Agriculture, Biotechnology & Natural Resources, said.

He explained that too many trees means less snow reaches the ground. In addition, when many trees are clumped together, they warm up and release heat, which can melt the snow on the ground. However, too few trees means the snowpack is less protected from the sun and wind, which also melt snow.

The tool, developed with funding from the College's Experiment Station and the U.S. Forest Service, was built to specifically model the west shore of Lake Tahoe, which the team felt was a good sample of the Sierra Nevada forest. The team initially created a small-scale high-resolution model using data collected with 3D laser scanners, called "LiDAR."

"The LiDAR data lets us see individual trees, which we use to 'virtually thin' the forest by taking trees out of the model," he said. "As such, it lets us create a thinning experiment that's realistic. We can then represent different management actions, such as removing trees below certain heights."

His team, including the post-doctoral scholar Sebastian Krogh, graduate student Devon Eckberg, undergraduate students Makenzie Kohler and Gary Sterle, the College's Associate Professor of Remote Sensing Jonathan Greenberg and University of Arizona's Patrick Broxton, tested the model's accuracy by conducting thinning experiments and comparing the predictions to measurements in the real forest. Results were discussed in a recently published article on the proof-of-concept for using high-resolution modeling to predict the effect of forest thinning for snow, for which Harpold was the lead author.

Once the team determined the model was working correctly, they increased the model size to represent Lake Tahoe's western shore. Results are discussed in another recently published article on using the model to predict the effects of forest thinning on the northern Sierra Nevada snowpack, led by Krogh with Harpold, Broxton and the Forest Service's Patricia Manley as co-authors. Their experiments showed that overall, more trees removed means more snow maintained. However, there are beneficial ways and detrimental ways to remove trees. The method that appeared to be most effective was removing dense trees that had many leaves and branches and were shorter than about 50 feet, leaving behind taller trees. There were also differences in effectiveness depending on the elevation, the slope and the direction the slope was facing.

Harpold plans to continue expanding the model, testing to see if it will work for Lake Tahoe's eastern shore and in the American River Basin, with the ultimate goal of providing a tool for Forest Service decision-makers and others to inform their forest-thinning plans.

The water-quantity tool is one of many different modeling tools being developed with funding from the Forest Service as part of the Tahoe-Central Sierra Initiative, which aims to quickly restore the forest to improve the health and resilience of Sierra Nevada mountains and maximize the benefits that the forest provides.

"My decision-support tool for water quantity would be a smaller piece in a larger toolkit to help determine how and where to thin the forests," Harpold said.

Other tools being designed to predict forest-thinning impacts include a tool to predict impact on wildfire spread, a tool to predict impact on smoke, a tool to predict impact on endangered and threatened species, a tool to predict sediment flow into Lake Tahoe and a tool to predict economic impact.

Credit: 
University of Nevada, Reno

Scientists measured electrical conductivity of pure interfacial water

image: Graphical abstract

Image: 
Journal of Physical Chemistry Letters

Skoltech scientists, in collaboration with researchers from the University of Stuttgart, the Karlsruhe Institute of Technology and the Russian Quantum Center, achieved the first systematic experimental measurements of the electrical conductivity of pure interfacial water, producing results that significantly extend our knowledge of this form of water.

Interfacial water may be found everywhere around us. Biological systems, electrochemical devices, food preservation methods and climate-related processes, to name a few, all depend on the properties of water near interfaces. However, direct access to the physical-chemical properties of pure interfacial water is arduous, which explains why much remains to be discovered and understood.

The results obtained by the scientists from the Skoltech Center for Energy Science and Technology (CEST) in collaboration with German researchers provide new and detailed insights into complex fluids. The discovery of new electrical properties of interfacial water will clearly have impact on the future development of electrochemical systems, both for electrical power generation and storage.

"We used diamond-based ceramics with open-pore structure filled with water. By consistent reduction of the pore size from 500 nm to 5 nm, we increased the interfacial-to-bulk-water ratio up to its maximum, at which point the interfacial water showed anomalous DC conductivity, five orders of magnitude higher than that of the bulk water. Our analysis shows that this unusual conductivity is a genuine intrinsic property of the interfacial water, as the surface chemistry contribution clearly appears not to be the dominant one," explains Vasily Artemov, Senior Research Scientist in the group of Skoltech professor Henni Ouerdane.

"The topic of interfacial water is of immense interest to a wide audience of physicists, electrochemists, climate researchers, geologists and biologists, and we anticipate that the research we report will be influential across a diverse range of scientific and technological fields, such as electrochemical energy systems, membrane technologies and nanofluidics," said Henni Ouerdane.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

Olanzapine may help control nausea, vomiting in patients with advanced cancer

ROCHESTER, Minn. — Olanzapine, a generic drug used to treat nervous, emotional and mental conditions, also may help patients with advanced cancer successfully manage nausea and vomiting unrelated to chemotherapy. These are the findings of a study published Thursday, May 7 in JAMA Oncology.

Charles Loprinzi, M.D., a Mayo Clinic medical oncologist, played a leadership role in this work in conjunction with Rudolph Navari, M.D., of The University of Alabama at Birmingham.

"It's well-appreciated by most people that patients receiving cancer chemotherapy suffer from nausea and vomiting," explains Dr. Loprinzi. "However, it's less well-appreciated that patients with advanced cancer also have significant problems with nausea and vomiting that are unrelated to chemotherapy."

Drs. Loprinzi and Navari found limited research regarding nausea and vomiting in patients with advanced cancer unrelated to chemotherapy, so they decided to conduct a clinical trial.

The colleagues, along with other collaborators, conducted a randomized, placebo-controlled trial in 30 patients with advanced cancer who had not recently received chemotherapy or radiation therapy but did have substantial trouble with nausea and vomiting. Researchers randomly assigned patients to receive a low dose of olanzapine or a placebo daily. Neither the trial participants nor their clinicians knew whether participants were receiving olanzapine or a placebo.

Prior to starting their medications on the first day of the study, participants rated their nausea over the previous 24 hours on a scale of 0-10, with 0 being none and 10 being as bad as it could be. Participants continued to rate their nausea every day at about the same time of day for the duration of the study.

When the study was unblinded, the research team learned that all 30 participants recorded nausea scores of 8-10 on the first day of the study. After one day and one week, nausea scores in the 15 patients who received a placebo were all still 8-10 out of 10. In contrast, the 15 patients who received olanzapine had scores of 2-3 out of 10 after one day and 0-3 out of 10 after one week. Correspondingly, these patients reported less vomiting, better appetite and better well-being. No patient-reported adverse events were observed among trial participants receiving olanzapine.

"Olanzapine given at 5 milligrams per day for seven days markedly improved patient quality of life with no side effects," says Dr. Navari. "And as a generic drug, it's also relatively affordable, with a one-month supply often costing anywhere from $10 to $15."

"Current guidelines for the management of nausea and vomiting in patients with advanced cancer have not specifically indicated that one drug looks substantially better than a variety of other drugs," says Dr. Loprinzi. "However, we believe the present results may be viewed as a best practice for treating nausea and vomiting in patients with advanced cancer-associated nausea and vomiting."

Credit: 
Mayo Clinic

Telescopes and spacecraft join forces to probe deep into Jupiter's atmosphere

image: These images of Jupiter's Great Red Spot were made using data collected by the Hubble Space Telescope and the Gemini Observatory on April 1, 2018. By combining observations captured at almost the same time from the two different observatories, astronomers were able to determine that dark features on the Great Red Spot are holes in the clouds rather than masses of dark material.

Upper left (wide view) and lower left (detail): The Hubble image of sunlight (visible wavelengths) reflecting off clouds in Jupiter's atmosphere shows dark features within the Great Red Spot.

Upper right: A thermal infrared image of the same area from Gemini shows heat emitted as infrared energy. Cool overlying clouds appear as dark regions, but clearings in the clouds allow bright infrared emission to escape from warmer layers below.

Lower middle: An ultraviolet image from Hubble shows sunlight scattered back from the hazes over the Great Red Spot. The Great Red Spot appears red in visible light because these hazes absorb blue wavelengths. The Hubble data show that the hazes continue to absorb even at shorter ultraviolet wavelengths.

Lower right: A multiwavelength composite of Hubble and Gemini data shows visible light in blue and thermal infrared in red. The combined observations show that areas that are bright in infrared are clearings or places where there is less cloud cover blocking heat from the interior.

The Hubble and Gemini observations were made to provide a wide-view context for Juno's 12th pass (Perijove 12).

Image: 
NASA, ESA, and M.H. Wong (UC Berkeley) and team

NASA's Hubble Space Telescope and the ground-based Gemini Observatory in Hawaii have teamed up with the Juno spacecraft to probe the mightiest storms in the solar system, taking place more than 500 million miles away on the giant planet Jupiter.

A team of researchers led by Michael Wong at the University of California, Berkeley, and including Amy Simon of NASA's Goddard Space Flight Center in Greenbelt, Maryland, and Imke de Pater, also of UC Berkeley, is combining multiwavelength observations from Hubble and Gemini with close-up views from Juno's orbit about the monster planet, gaining new insights into turbulent weather on this distant world.

"We want to know how Jupiter's atmosphere works," said Wong. This is where the teamwork of Juno, Hubble and Gemini comes into play.

Radio 'Light Show'

Jupiter's constant storms are gigantic compared to those on Earth, with thunderheads reaching 40 miles from base to top -- five times taller than typical thunderheads on Earth -- and powerful lightning flashes up to three times more energetic than Earth's largest "superbolts."

Like lightning on Earth, Jupiter's lightning bolts act like radio transmitters, sending out radio waves as well as visible light when they flash across the sky.

Every 53 days, Juno races low over the storm systems detecting radio signals known as "sferics" and "whistlers," which can then be used to map lightning even on the day side of the planet or from deep clouds where flashes are not otherwise visible.

Coinciding with each pass, Hubble and Gemini watch from afar, capturing high-resolution global views of the planet that are key to interpreting Juno's close-up observations. "Juno's microwave radiometer probes deep into the planet's atmosphere by detecting high-frequency radio waves that can penetrate through the thick cloud layers. The data from Hubble and Gemini can tell us how thick the clouds are and how deep we are seeing into the clouds," Simon explained.

By mapping lightning flashes detected by Juno onto optical images captured of the planet by Hubble and thermal infrared images captured at the same time by Gemini, the research team has been able to show that lightning outbreaks are associated with a three-way combination of cloud structures: deep clouds made of water, large convective towers caused by upwelling of moist air -- essentially Jovian thunderheads -- and clear regions presumably caused by downwelling of drier air outside the convective towers.

The Hubble data show the height of the thick clouds in the convective towers, as well as the depth of deep water clouds. The Gemini data clearly reveal the clearings in the high-level clouds where it is possible to get a glimpse down to the deep water clouds.

Wong thinks that lightning is common in a type of turbulent area known as folded filamentary regions, which suggests that moist convection is occurring in them. "These cyclonic vortices could be internal energy smokestacks, helping release internal energy through convection," he said. "It doesn't happen everywhere, but something about these cyclones seems to facilitate convection."

The ability to correlate lightning with deep water clouds also gives researchers another tool for estimating the amount of water in Jupiter's atmosphere, which is important for understanding how Jupiter and the other gas and ice giants formed, and therefore how the solar system as a whole formed.

While much has been gleaned about Jupiter from previous space missions, many of the details -- including how much water is in the deep atmosphere, exactly how heat flows from the interior and what causes certain colors and patterns in the clouds -- remain a mystery. The combined result provides insight into the dynamics and three-dimensional structure of the atmosphere.

Seeing a 'Jack-O-Lantern' Red Spot

With Hubble and Gemini observing Jupiter more frequently during the Juno mission, scientists are also able to study short-term changes and short-lived features like those in the Great Red Spot.

Images from Juno as well as previous missions to Jupiter revealed dark features within the Great Red Spot that appear, disappear and change shape over time. It was not clear from individual images whether these are caused by some mysterious dark-colored material within the high cloud layer, or if they are instead holes in the high clouds -- windows into a deeper, darker layer below.

Now, with the ability to compare visible-light images from Hubble with thermal infrared images from Gemini captured within hours of each other, it is possible to answer the question. Regions that are dark in visible light are very bright in infrared, indicating that they are, in fact, holes in the cloud layer. In cloud-free regions, heat from Jupiter's interior that is emitted in the form of infrared light -- otherwise blocked by high-level clouds -- is free to escape into space and therefore appears bright in Gemini images.

"It's kind of like a jack-o-lantern," said Wong. "You see bright infrared light coming from cloud-free areas, but where there are clouds, it's really dark in the infrared."

Hubble and Gemini as Jovian Weather Trackers

The regular imaging of Jupiter by Hubble and Gemini in support of the Juno mission is proving valuable in studies of many other weather phenomena as well, including changes in wind patterns, characteristics of atmospheric waves and the circulation of various gases in the atmosphere.

Hubble and Gemini can monitor the planet as a whole, providing real-time base maps in multiple wavelengths for reference for Juno's measurements in the same way that Earth-observing weather satellites provide context for NOAA's high-flying Hurricane Hunters.

"Because we now routinely have these high-resolution views from a couple of different observatories and wavelengths, we are learning so much more about Jupiter's weather," explained Simon. "This is our equivalent of a weather satellite. We can finally start looking at weather cycles."

Because the Hubble and Gemini observations are so important for interpreting Juno data, Wong and his colleagues Simon and de Pater are making all of the processed data easily accessible to other researchers through the Mikulski Archives for Space Telescopes (MAST) at the Space Telescope Science Institute in Baltimore, Maryland.

"What's important is that we've managed to collect this huge data set that supports the Juno mission. There are so many applications of the data set that we may not even anticipate. So, we're going to enable other people to do science without that barrier of having to figure out on their own how to process the data," Wong said.

Credit: 
NASA/Goddard Space Flight Center

CCNY physicists shed light on the nanoscale dynamics of spin thermalization

image: Image represents a system of nuclear spins whose interactions are mediated by electron spins.

Image: 
Carlos Meriles Research Group

In physics, thermalization, or the trend of sub-systems within a whole to gain a common temperature, is typically the norm. There are situations, however, where thermalization is slowed down or virtually suppressed; examples are found when considering the dynamics of electron and nuclear spins in solids, where certain sub-groups behave as if isolated from the rest. Understanding why this happens and how it can be controlled is presently at the center of a broad effort, particularly for applications in the emerging field of quantum information technologies.

Reporting in the latest issue of "Science Advances," a group of researchers based at The City College of New York (CCNY) provide new insights on the dynamics of spin thermalization at the nanoscale. The paper is entitled: "Optically pumped spin polarization as a probe of many-body thermalization," and the work was carried out under the supervision of Carlos A. Meriles, the Martin and Michele Cohen Professor of Physics in CCNY's Division of Science.

One of the main hurdles to investigating nanoscale thermalization is the huge disparity between the numbers of thermal and athermal spins, the latter being only a tiny fraction of the total. To show the flow of spin polarization between these groups, experiments must be simultaneously sensitive to both groups, a difficult proposition as most techniques are adapted to one group or the other but ill-suited for both. Working with physicists at the University of California, Berkeley, and Argentina's Universidad Nacional de Cordoba, Meriles' CCNY group developed a technique that circumvents this problem. Further, using this technique they were able to show that, under certain specific conditions, those isolated ('athermal') spins can be made to 'communicate' with the rest.

"In a solid, electron spins typically take the form of impurities or imperfections in the crystal lattice, whereas nuclear spins are associated to the atoms of the crystal itself and thus are way more abundant," said Meriles. "For example, for diamond, the system we studied, electron spins are the 'NV' and 'P1' centers, and nuclear spins are the carbons in the diamond lattice."

Because the electron spin is much stronger than the nuclear spin, carbons close to NVs or P1s experience a local magnetic field, absent for carbons that are farther away. Because of the local field they experience, hyperfine-coupled carbons have been traditionally assumed to be isolated from the rest, in the sense that, if polarized, they cannot pass this polarization to the bulk, i.e., their spin is frozen or 'localized', hence leading to an 'athermal' behavior.

"Our experiments demonstrate that the ideas above are not valid when the concentration of electron spins is sufficiently high. In this limit, we find that hyperfine coupled and bulk nuclei communicate efficiently because groups of electron spins serve as effective linkers to move around otherwise isolated nuclear spin polarization. We find this process can be really effective, leading to fast nuclear spin transport rates, exceeding even those between bulk nuclei," said Meriles.

Overall, the CCNY team's findings could help realize devices that use electron and nuclear spins in solids for quantum information processing or sensing at the nanoscale. Indirectly, it could also help implement states of high nuclear spin polarization that could be applied in MRI and NMR spectroscopy.

Credit: 
City College of New York

Clay layers and distant pumping trigger arsenic contamination in Bangladesh groundwater

image: Workers install a monitoring well near the study site in Bangladesh.

Image: 
Rajib Mozumder

Well water contaminated by arsenic in Bangladesh is considered one of the most devastating public health crises in the world. Almost a quarter of the country's population, an estimated 39 million people, drink water naturally contaminated by this deadly element, which can silently attack a person's organs over years or decades, leading to cancers, cardiovascular disease, developmental and cognitive problems in children, and death. An estimated 43,000 people die each year from arsenic-related illness in Bangladesh.

To avoid arsenic contamination, many Bangladeshi households access water via private wells drilled to 300 feet or less, beneath impermeable clay layers. Such clay layers have been thought to protect groundwater in the underlying aquifers from the downward flow of contaminants. However, a study published in Nature Communications this week suggests that such clay layers do not always protect against arsenic, and could even be a source of contamination in some wells.

Clay layers had previously been suspected of contaminating groundwater with arsenic in parts of Bangladesh, the Mekong delta of Vietnam and the Central Valley of California, but the new paper provides the most direct evidence so far.

"Our findings challenge a widely held view, namely that impermeable clay layers necessarily protect an aquifer from perturbation," said Alexander van Geen, a research professor at Columbia University's Lamont-Doherty Earth Observatory who has been studying arsenic contamination of drinking water for two decades. "In this context, we show from several different angles -- failed attempts to lower local exposure, high-resolution drilling, monitoring, and groundwater dating -- that this is actually not the case for groundwater arsenic, because distant municipal pumping can trigger remotely the release of arsenic below such a clay layer."

The researchers were inspired to conduct the study after two manually pumped community wells drilled to intermediate depths in the vicinity of Dhaka, the capital of Bangladesh, suddenly failed, producing water with elevated concentrations of arsenic after having generated clean water for many months.

Most sand contains arsenic, but it is not a problem until the arsenic is released into drinking water in some way, typically in response to reactive carbon. The sources of this reactive carbon remain poorly understood, despite decades of study. One possibility is that it travels into the sediment with the downward flow of surface water, but the researchers showed with groundwater dating that such flow was not responsible in the case of their study area. Another is that reactive carbon is released as plant matter breaks down underground. The third theory, demonstrated for the first time in the new paper, is that excessive municipal pumping can compress the clay layers, squeezing out reactive carbon, which then releases arsenic from local sediments.

Indeed, the researchers found that the recent changes in arsenic near Dhaka were the result of pumping from deeper aquifers to satisfy the municipal supply of the city. Because of this deep municipal pumping, water levels under Dhaka itself are a hundred meters below what they would naturally be -- the aquifer just doesn't refill fast enough. This depressed area is called the Dhaka "cone of depression," and it extends approximately 20 kilometers around the city.

"In Dhaka, the pumping probably accelerated the release of arsenic and allowed us to document the changes within a decade," said van Geen. "We wouldn't have figured this out without having been there monitoring wells for at least 10 years. Monitoring is not very exciting, but because of the monitoring we discovered something fascinating."

The research team's findings are especially worrisome for local households on the outskirts of Dhaka that have been privately re-installing wells to access relatively shallow aquifers beneath the impermeable clay layer.

Even in the absence of deep pumping for municipal needs, long-term diffusion of dissolved organic carbon from clay layers could explain why private wells screened just below a clay layer in other sedimentary aquifers are more likely to be contaminated with arsenic than deeper wells, according to the paper.

While the geochemical conditions surrounding every aquifer are different, the problem of arsenic and other contaminants leaking into deep aquifer groundwater is not unique to Dhaka. "It's a warning and it means that in some areas you need to probably test wells more frequently than others," said van Geen.

The problem is not unique to Bangladesh, either. With groundwater pumping from aquifers expected to continue throughout the world, more global monitoring for contamination by arsenic from compacting clay layers may be necessary, according to the paper's authors.

The dilemma of how to provide Bangladesh's population with clean water remains. Deep wells are currently supplying some of the safest water in Bangladesh, said Charles Harvey, a professor of civil engineering at MIT who has long studied arsenic in drinking water but did not contribute to this research. "Most of them seem to be fine, but this raises the alarm that maybe they won't stay fine."

The research question van Geen would like to address next occupies the realm of behavioral economics: "How can you encourage people who have wells that are high in arsenic to do something about it?"

Credit: 
Columbia Climate School

Carbon footprint hotspots: Mapping China's export-driven emissions

The coronavirus pandemic has highlighted just how reliant the United States and other countries are on Chinese manufacturing, with widespread shortages of protective medical gear produced there.

But U.S. dependence on China extends far beyond surgical masks and N95 respirators. China is the largest producer of many industrial and consumer products shipped worldwide, and about one-quarter of the country's gross domestic product comes from exports.

It is also the world's largest emitter of climate-altering carbon dioxide gas, generated by the burning of fossil fuels. A new study details the links between China's exports and its emissions by mapping the in-country sources of carbon dioxide emissions tied to products consumed overseas.

University of Michigan researchers and their Chinese collaborators tracked these emissions to a small number of coastal manufacturing hubs and showed that about 1% of the country's land area is responsible for 75% of the export-linked CO2 emissions.

The study, scheduled for publication May 7 in Nature Communications, provides the most detailed mapping of China's export-driven CO2 emissions to date, according to corresponding author Shen Qu of the U-M School for Environment and Sustainability. The findings, which are based on 2012 emissions data, offer insights that can guide policymakers, he said.

"Developing localized climate mitigation strategies requires an understanding of how global consumption drives local carbon dioxide emissions with a fine spatial resolution," said Qu, a Dow Sustainability Postdoctoral Fellow at SEAS who combines the tools of input-output analysis and network analysis to uncover the role of international trade in global environmental impacts.

"The carbon footprint hotspots identified in this study are the key places to focus on collaborative mitigation efforts between China and the downstream parties that drive those emissions," he said.

The study found that the manufacturing hubs responsible for most of the foreign-linked emissions are in the Yangtze River Delta (including Shanghai, China's top CO2-emitting city), the Pearl River Delta (including Dongguan) and the North China Plain (including Tianjin). These cities have, or are close to, ports for maritime shipping.

The modeling study uses data from large-scale emissions inventories derived from 2012 surveys of individual firms in all Chinese industries that generate carbon dioxide emissions. Emissions levels have likely changed in response to recent U.S.-China trade disputes and the COVID-19 pandemic, which has significantly impacted Chinese manufacturing and exports.

Chinese CO2 emissions driven by foreign consumption totaled 1.466 gigatons in 2012, accounting for 14.6% of the country's industry-related carbon dioxide emissions that year. If the Chinese manufacturing hubs identified in the U-M study constituted a separate country, their CO2 emissions in 2012 would have ranked fifth in the world behind China, the United States, India and Russia, according to the authors.

The study also found that:

Exports to the United States, Hong Kong and Japan were responsible for the biggest chunks of Chinese foreign-linked CO2 emissions, contributing about 23%, 10.8% and 9%, respectively.

About 49% of the U.S.-linked CO2 emissions were driven by the production of household consumer goods.

About 42% of the export-driven CO2 emissions in China were tied to electricity generation, with notable hotspots in the cities of Shanghai, Ningbo, Suzhou (Jiangsu Province) and Xuzhou. Much of that electricity is produced at coal-fired power plants.

China is the world's largest steel producer and exporter. Cities that manufacture large amounts of iron and steel--and that use large amounts of coal in the process--were hotspots for export-driven CO2 emissions. Cement plants and petroleum refineries were also big contributors.

In the study, U-M researchers and their collaborators used carbon footprint accounting--i.e., consumption-based accounting--to track greenhouse gas emissions driven by global supply chains. They mapped those emissions at a spatial resolution of 10 kilometers by 10 kilometers, a level of detail that enabled them to identify specific source cities.
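
At its core, consumption-based accounting rests on the standard Leontief input-output calculation: the total sectoral output needed to satisfy a final-demand vector (here, exports) is x = (I - A)^-1 y, and emissions are attributed by multiplying that output by each sector's direct emission intensity. The short Python sketch below illustrates this bookkeeping with made-up sectors and coefficients; it is not the study's data or code, and the 10-kilometer spatial mapping step is not shown.

```python
import numpy as np

# Minimal sketch of consumption-based (environmentally extended input-output)
# accounting. All sectors, coefficients, and demands are illustrative
# placeholders, not data from the study.

# Technical coefficient matrix A: inputs required from each sector (rows)
# per unit of output of each sector (columns).
# Hypothetical sectors: electricity, steel, electronics.
A = np.array([
    [0.10, 0.20, 0.15],
    [0.02, 0.10, 0.10],
    [0.01, 0.05, 0.05],
])

# Direct CO2 emission intensity of each sector (tons CO2 per unit of output).
f = np.array([0.90, 0.60, 0.10])

# Final demand satisfied abroad, i.e., exports.
y_exports = np.array([5.0, 3.0, 12.0])

# Leontief inverse: total (direct plus indirect) output needed per unit of
# final demand, so that x = (I - A)^-1 y.
leontief = np.linalg.inv(np.eye(A.shape[0]) - A)
x = leontief @ y_exports

# Export-driven emissions attributed to each producing sector.
export_emissions = f * x
print("Export-driven emissions by sector:", export_emissions)
print("Total export-driven emissions:", export_emissions.sum())
```

In the study itself, sectoral results of this kind are combined with gridded emissions inventories to produce the 10-kilometer maps described above.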

"Previous studies have linked greenhouse gas emissions to final consumption of products, but primarily at national or regional levels," said study co-author Ming Xu of the U-M School for Environment and Sustainability and the Department of Civil and Environmental Engineering.

"Given the increasing importance of non-state actors--provinces, states, cities and companies--in climate mitigation, it becomes increasingly important to be able to explicitly link the final consumers of products to the subnational actors that have direct control over greenhouse gas emissions."

Credit: 
University of Michigan

Modeling gas diffusion in aggregated soils

image: Navodi Jayarathne prepares samples for gas diffusivity measurement.

Image: 
Courtesy of Timothy Clough

Agricultural soils contribute 16% of total greenhouse gas emissions, particularly nitrous oxide (N2O). The migration of gases through the agricultural subsurface and their emission across the soil-atmosphere interface are primarily diffusion-controlled and are described by soil-gas diffusivity. Because the experimental determination of soil-gas diffusivity demands expensive apparatus and time-consuming, controlled laboratory measurements, predictive models are commonly used to estimate diffusivity from easy-to-measure soil properties such as total porosity and soil-air content.
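
One widely used predictive model of this kind is the classical Millington-Quirk relation, which estimates the relative diffusivity Dp/D0 from air-filled porosity and total porosity alone. The Python sketch below shows this conventional, single-region estimate with illustrative input values; it is not the model introduced in the study.

```python
def millington_quirk(air_filled_porosity: float, total_porosity: float) -> float:
    """Relative soil-gas diffusivity Dp/D0 for a uniform (unimodal) pore network,
    from the classical Millington-Quirk relation eps**(10/3) / phi**2."""
    eps, phi = air_filled_porosity, total_porosity
    return eps ** (10.0 / 3.0) / phi ** 2

# Example: a moderately moist agricultural topsoil (illustrative values only).
print(millington_quirk(air_filled_porosity=0.25, total_porosity=0.50))
```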

In a recently published article in the Soil Science Society of America Journal, researchers introduced a descriptive two-region soil-gas diffusivity model, developed from gas diffusivity data measured on two agricultural soils from Peradeniya, Sri Lanka, under different soil density conditions.

The researchers identified that the pore network in these agricultural soils exhibits two distinct pore regions, inter-aggregate and intra-aggregate, which together constitute a bimodal pore structure. The two-region model developed in the study adequately parameterized and characterized soil-gas diffusivity in the selected bimodal soils, outperforming conventional models.
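
As a rough, generic illustration of the two-region idea (not the authors' fitted parameterization), the relative diffusivity of a bimodal soil can be sketched as the sum of separate power-law contributions from the inter-aggregate and intra-aggregate pore fractions:

```python
def two_region_diffusivity(eps_inter: float, eps_intra: float,
                           x_inter: float = 2.0, x_intra: float = 3.0) -> float:
    """Relative soil-gas diffusivity Dp/D0 as the sum of a term for the large,
    well-connected inter-aggregate pores and a term for the fine intra-aggregate
    pores. The exponents and the porosity split are placeholder assumptions."""
    return eps_inter ** x_inter + eps_intra ** x_intra

# Example: inter-aggregate pores dominate transport; intra-aggregate pores
# add a small additional pathway (illustrative values only).
print(two_region_diffusivity(eps_inter=0.20, eps_intra=0.10))
```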

The two-region model thus provides a tool to accurately estimate gas diffusion in aggregated soils, offering a way to quantify gas exchange between soil and atmosphere under different land-use and water-management practices.

Credit: 
American Society of Agronomy