Tech

Laser trick produces high-energy terahertz pulses

image: From the colour difference of two slightly delayed laser flashes (left), a non-linear crystal generates an energetic terahertz pulse (right).

Image: 
DESY, Lucid Berlin

A team of scientists from DESY and the University of Hamburg has achieved an important milestone in the quest for a new type of compact particle accelerator. Using ultra-powerful pulses of laser light, they were able to produce particularly high-energy flashes of radiation in the terahertz range with a sharply defined wavelength (colour). Terahertz radiation is expected to open the way for a new generation of compact particle accelerators that will fit on a lab bench. The team headed by Andreas Maier and Franz Kärtner from the Hamburg Center for Free-Electron Laser Science (CFEL) is presenting its findings in the journal Nature Communications. CFEL is jointly run by DESY, the University of Hamburg and the Max Planck Society.

The terahertz range of electromagnetic radiation lies between the infrared and microwave frequencies. Air travellers may be familiar with terahertz radiation from the full-body scanners used by airport security to search for objects hidden beneath a person's garments. However, radiation in this frequency range might also be used to build compact particle accelerators. "The wavelength of terahertz radiation is about a thousand times shorter than the radio waves that are currently used to accelerate particles," says Kärtner, who is a lead scientist at DESY. "This means that the components of the accelerator can also be built to be around a thousand times smaller." The generation of high-energy terahertz pulses is therefore also an important step for the AXSIS (frontiers in Attosecond X-ray Science: Imaging and Spectroscopy) project at CFEL, funded by the European Research Council (ERC), which aims to open up completely new applications with compact terahertz particle accelerators.

However, chivvying along an appreciable number of particles calls for powerful pulses of terahertz radiation having a sharply defined wavelength. This is precisely what the team has now managed to create. "In order to generate terahertz pulses, we fire two powerful pulses of laser light into a so-called non-linear crystal, with a minimal time delay between the two," explains Maier from the University of Hamburg. The two laser pulses have a kind of colour gradient, meaning that the colour at the front of the pulse is different from that at the back. The slight time shift between the two pulses therefore leads to a slight difference in colour. "This difference lies precisely in the terahertz range," says Maier. "The crystal converts the difference in colour into a terahertz pulse."

The method requires the two laser pulses to be precisely synchronised. The scientists achieve this by splitting a single pulse into two parts and sending one of them on a short detour so that it is slightly delayed before the two pulses are eventually superimposed again. However, the colour gradient along the pulses is not constant; in other words, the colour does not change uniformly along the length of the pulse. Instead, the colour changes slowly at first and then more and more quickly, producing a curved profile. As a result, the colour difference between the two staggered pulses is not constant either, and only over a narrow stretch of the pulse is the difference right for producing terahertz radiation.

"That was a big obstacle towards creating high-energy terahertz pulses," as Maier reports. "Because straightening the colour gradient of the pulses, which would have been the obvious solution, is not easy to do in practice." It was co-author Nicholas Matlis who came up with the crucial idea: he suggested that the colour profile of just one of the two partial pulses should be stretched slightly along the time axis. While this still does not alter the degree with which the colour changes along the pulse, the colour difference with respect to the other partial pulse now remains constant at all times. "The changes that need to be made to one of the pulses are minimal and surprisingly easy to achieve: all that was necessary was to insert a short length of a special glass into the beam," reports Maier. "All of a sudden, the terahertz signal became stronger by a factor of 13." In addition, the scientists used a particularly large non-linear crystal to produce the terahertz radiation, specially made for them by the Japanese Institute for Molecular Science in Okazaki.

"By combining these two measures, we were able to produce terahertz pulses with an energy of 0.6 millijoules, which is a record for this technique and more than ten times higher than any terahertz pulse of sharply defined wavelength that has previously been generated by optical means," says Kärtner. "Our work demonstrates that it is possible to produce sufficiently powerful terahertz pulses with sharply defined wavelengths in order to operate compact particle accelerators."

Credit: 
Deutsches Elektronen-Synchrotron DESY

Squid could thrive under climate change

Squid will survive, and may even flourish, under the worst-case ocean acidification scenarios, according to a new study published this week.

Dr Blake Spady, from the ARC Centre of Excellence for Coral Reef Studies (Coral CoE) at James Cook University (JCU), led the study. He said squid live on the edge of their environmental oxygen limitations due to their energy-taxing swimming technique. They were expected to fare badly with more carbon dioxide (CO2) in the water, which makes it more acidic.

"Their blood is highly sensitive to changes in acidity, so we expected that future ocean acidification would negatively affect their aerobic performance," said Dr Spady.

Atmospheric CO2 concentrations have increased from 280 parts per million (ppm) before the industrial revolution to more than 400 ppm today. Scientists project atmospheric CO2--and by extension CO2 in the oceans--may exceed 900 ppm by the end of this century unless current CO2 emissions are curtailed.

But when the team tested two-toned pygmy squid and bigfin reef squid at JCU's research aquarium, subjecting them to CO2 levels projected for the end of the century, they received a surprise.

"We found that these two species of tropical squid are unaffected in their aerobic performance and recovery after exhaustive exercise by the highest projected end-of-century CO2 levels," said Dr Spady.

He said it may be an even greater boost for the squid as some of their predators and prey have been shown to lose performance under predicted climate change scenarios.

"We think that squid have a high capacity to adapt to environmental changes due to their short lifespans, fast growth rates, large populations, and high rate of population increase," said Dr Spady.

He said the work is important because it gives a better understanding of how future ecosystems might look under elevated CO2 conditions.

"We are likely to see certain species as being well-suited to succeed in our rapidly changing oceans, and these species of squid may be among them."

"The thing that is emerging with most certainty is that it's going to be a very different world," he said.

Credit: 
ARC Centre of Excellence for Coral Reef Studies

New imaging modality targets cholesterol in arterial plaque

image: Figure 11: (a) Normalized IVUS image of the atherosclerotic artery phantom at 14 MHz. (b) Normalized PAR amplitude and (c) phase images of IV-DPAR, single-ended 980-nm PAR and single-ended 1210-nm PAR modes. The same endoscopic transducer and instrumentation were shared by IVUS and IV-DPAR for coregistration. C1, cholesterol sample 1; C2, cholesterol sample 2.

Image: 
Sung Soo Sean Choi <em>et al</em>.

BELLINGHAM, Washington, USA and CARDIFF, UK - In an article published in the peer-reviewed SPIE publication Journal of Biomedical Optics (JBO), "Frequency-domain differential photoacoustic radar: theory and validation for ultra-sensitive atherosclerotic plaque imaging," researchers demonstrate a new imaging modality that successfully identifies the presence of cholesterol in the arterial plaque.

Cholesterol in plaque, along with fat, calcium, and other blood-transported substances, can lead to atherosclerosis, a disease which can cause heart attacks or strokes. Early detection of cholesterol can lead to earlier treatments and improved health outcomes. Toronto-based researchers have demonstrated a unique detection technique that combines laser photoacoustics, a hybrid optical-acoustic imaging technology, with low-power continuous wave lasers and frequency-domain signal processing, in an approach known as photoacoustic radar. This advanced technology can accurately evaluate plaque-based cholesterol, and allow for more timely treatment of atherosclerosis.

According to JBO Editor-in-Chief, SPIE Fellow, and MacLean Professor of Engineering at the Thayer School of Engineering at Dartmouth College, New Hampshire, Brian Pogue, the findings mark an exciting new direction in imaging: "This is an original direction of imaging research that utilizes an innovative idea of detection based upon differences between wavelengths, and signal analysis based upon radar methods. Photoacoustic imaging has the best potential for imaging through thick tissues or blood: the high-sensitivity detection of cholesterol described in this paper is made possible with a specifically modified, dual wavelength approach."

Credit: 
SPIE--International Society for Optics and Photonics

On your bike?

A James Cook University researcher says a lack of suitable roads is a big reason why cycling participation rates in Australia and Queensland are so low.

JCU lecturer Jemma King said recent estimates suggest that only 34% of Australian adults have cycled in the previous year and 16% in the previous week.

Ms King said a survey of more than 1200 people in Queensland found two thirds of respondents did not cycle at all.

"The majority of non-cyclists reported it was because of ill-health, age or lack of fitness, a lack of interest or enjoyment in riding a bicycle, or safety concerns. Among those who own a bike the reason most commonly given for not using it was a lack of time or opportunity," she said.

Ms King said rural people in Australia differ from urban residents in their reasons for not cycling.

"Rural residents were more likely to cite environmental concerns and preference for other modes of transport or exercise as reasons they had not cycled. Environmental reasons include such things as unsealed shoulders on rural roads, high speed limits and just the sheer distances involved in rural travel."

Ms King said only a small percentage of cyclists seem to be getting the known protective health benefits of cycling 3.5 hours per week.

"Environmental factors appear to be inhibiting cycling participation in rural areas. Government funding for infrastructure development that supports safe cycling across Queensland including outside of metropolitan areas is something that should be looked at," she said.

Ms King said ideally, this would be in the form of infrastructure that would enable separation of cyclists from motorised vehicles.

Key findings:

Cyclists in the sample were predominantly male (61.3%), employed full time (53%) and had high levels of education (52.6%).

Concerns for safety (14.3%) and environmental concerns, including terrain or a lack of suitable 'bicycle-friendly' areas to ride (9.1%), were two other reasons that non-cyclists indicated as influencing their decision not to ride.

For more than half of cyclists, the usual cycling duration was over 30 minutes.

Compared with those cycling for less than 30 minutes, cyclists who cycled for over 30 minutes more frequently reported excellent health, cycled as part of a sport, had 15 or more years of education, and lived in urban Queensland.

Only 7.6% of cyclists reported cycling more than 182 hours per year.

Non-cyclists were typically 55 years of age or older, had no other reported physical activity, had strong religious beliefs and a body mass index above a healthy range.

Cyclists were typically male, engaged in physical activity, believed bicycle injuries were preventable and self-reported 'not good' mental health on at least one day in the past month.

Credit: 
James Cook University

People with mobility issues set to benefit from wearable devices

The lives of thousands of people with mobility issues could be transformed thanks to ground-breaking research by scientists at the University of Bristol.

The FREEHAB project will develop soft, wearable rehabilitative devices with a view to helping elderly and disabled people walk and move from sitting to a standing position in comfort and safety.

Led by University of Bristol Professor of Robotics Jonathan Rossiter, FREEHAB builds on discoveries from his previous Right Trousers project, which saw his team develop new soft materials that could be used like artificial muscles.

Professor Rossiter said: "There are over 10.8 million disabled people living in the UK today. Nearly 6.5 million have mobility impairments. These numbers are growing as the median population age increases and age-related mobility issues due to conditions such as arthritis and stroke become more prevalent."

Rehabilitation is vital for patients, but according to Professor Rossiter, outcomes are hampered by a lack of easy-to-use dynamic tools to help therapists accurately analyse mobility performance and devise effective programmes; and as rehabilitation increasingly takes place in patients' homes in the absence of a therapist, better ways to support in-home mobility and training are needed.

The materials from which the artificial muscles are made include 3D-printable electroactive gel materials, and soft but strong pneumatic chains that change shape when inflated and can exert considerable force.

Professor Rossiter said: "Together with integrated sensing technology, we will make devices that physiotherapists can use to accurately pinpoint limitations in their patients' movements, thus enabling them to plan personalised training programmes.

"We will also make simpler devices that the patient can use to enhance their mobility activities and exercise with confidence when a therapist is not with them."

To develop the project, the researchers will work with physiotherapists in the NHS and private practice, and with people who have undergone physiotherapy for their mobility problems.

Following research and development, the aim is to conduct clinical trials and then bring the devices into the supply chain once the project is over.

The three-year FREEHAB project, due to start in September, has received £1,162,224 funding from the Engineering and Physical Sciences Research Council (EPSRC).

Philippa Hemmings, head of Healthcare Technologies at EPSRC, said: "The work supported within the FREEHAB project will increase the ability of physiotherapists to support people with mobility impairments. It shows the power of engineers and physical scientists working in collaboration with partners, something our Healthcare Impact Partnership awards were set up to support."

Credit: 
Engineering and Physical Sciences Research Council

Perfect diamagnetism observation of high-temperature superconductivity in compressed H2S

image: (a) Simplified flow chart of the magnetic susceptibility measurement set-up for the diamond anvil cell (DAC). LIA and AC denote the lock-in amplifier and alternating-current source, respectively. (b) Left: the sample in the gasket hole at 2 GPa and 200 K, and at 155 GPa and 300 K, respectively. Right: magnetic susceptibility signals of sulfur hydride at various pressures.

Image: 
©Science China Press

The discovery of an extremely high superconducting transition temperature (Tc) of ~200 K in the sulfur hydride system above 100 GPa broke the high-temperature superconductivity record held by the copper oxides. Zero-resistance measurements on the sulfur hydride system have been reported by Eremets et al., but direct and complete Meissner-effect measurements at many pressure points under high pressure are still urgently needed. Motivated by this, the research group of Prof. Tian Cui at Jilin University has made a breakthrough in demonstrating the perfect diamagnetism of the sulfur hydride system under high pressure, using a highly sensitive magnetic susceptibility technique adapted for a megabar-pressure diamond anvil cell (DAC).

Through theoretical calculations and experiments, scientists have found that some hydrogen-rich compounds show very high superconducting transition temperatures under high pressure. For example, the superconducting transition temperature of sulfur hydride at 155 GPa is 203 K, and that of lanthanum hydride at 170 GPa is 250 K. However, experimental studies of superconductivity in hydrogen-rich compounds have focused on the zero-resistance characteristic. Perfect diamagnetism is the other essential characteristic of a superconductor, but measuring it in hydrogen-rich compounds at ultra-high pressure is extremely challenging. Megabar pressures or even higher can be generated in a diamond anvil cell, but the sample is smaller than 0.05×0.05×0.01 mm³. The very small sample and the sensitive metal parts of the cell lead to a very low signal-to-noise ratio in magnetic measurements, making it very difficult to extract the weak sample signal from the large background noise.
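The flow chart in the figure shows why a lock-in amplifier (LIA) sits at the heart of the set-up: phase-sensitive detection is what makes it possible to pull such a weak response out of the noise. The Python sketch below illustrates that principle with purely made-up numbers (a microvolt-level response buried in noise roughly fifty times larger); it is an illustration of the generic technique, not a model of the actual experiment.

```python
# Minimal sketch of phase-sensitive (lock-in) detection, the principle behind the
# AC susceptibility measurement: a weak response at the excitation frequency is
# recovered from much larger noise by multiplying with a reference and averaging.
# All numbers are illustrative, not the values used in the experiment.
import numpy as np

rng = np.random.default_rng(0)
f_ref = 1023.0                      # excitation / reference frequency, Hz
fs = 100_000.0                      # sampling rate, Hz
t = np.arange(0, 10.0, 1 / fs)      # a 10-second record

amp_true, phase_true = 2e-6, 0.6    # weak sample response: amplitude (V), phase (rad)
pickup = amp_true * np.sin(2 * np.pi * f_ref * t + phase_true)
noise = 1e-4 * rng.standard_normal(t.size)    # broadband noise ~50x the signal
measured = pickup + noise

# Demodulate: project onto in-phase and quadrature references, then average.
X = np.mean(measured * 2 * np.sin(2 * np.pi * f_ref * t))    # in-phase component
Y = np.mean(measured * 2 * np.cos(2 * np.pi * f_ref * t))    # quadrature component

print("recovered amplitude: %.2e V (true %.2e V)" % (np.hypot(X, Y), amp_true))
print("recovered phase:     %.2f rad (true %.2f rad)" % (np.arctan2(Y, X), phase_true))
```

Averaging over many cycles suppresses everything that is not synchronous with the reference, which is why the recovered amplitude and phase land close to the true values even though the raw trace is dominated by noise.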

By suppressing noise at the signal source, shielding the transmission path and improving the sensitivity of signal extraction, the authors optimized the DAC-based magnetic measurement method, built a highly sensitive magnetic susceptibility measurement system, and measured the alternating-current magnetic susceptibility of the sulfur hydride sample under high pressure. They first sealed liquid hydrogen sulfide into a DAC using cryogenic techniques. The target sample, H3S, was prepared via a low-temperature compression path, and a superconducting transition was observed at 183 K and 149 GPa. The trend of the superconducting transition temperature with pressure was obtained, and the superconducting phase diagram of hydrogen sulfide was refined.

The results show that it is feasible to measure the alternating-current susceptibility of hydrogen-rich compounds above megabar pressures. The work confirms the high-temperature superconductivity of the sulfur hydride system and determines its superconducting phase diagram from magnetic susceptibility data, opening up broad prospects for experimental research on hydrogen-rich superconductors under ultra-high pressure.

Credit: 
Science China Press

FEFU scientists to broaden ideas about reactive sintering of transparent ceramics

image: Denis Kosyanov, FEFU

Image: 
FEFU press office

The porous structure of green bodies, i.e. their mesostructure, dramatically affects the functional parameters of optical ceramics obtained by reactive sintering. The researchers propose to regulate the characteristics of the mesostructure by pre-annealing the green bodies at temperatures below those of phase formation and consolidation. The approach was presented in an article published in the Journal of the European Ceramic Society.

Developing an advanced technology and creating a new family of optical - in particular, luminescent and laser - ceramics is a fundamental scientific problem and a key task of modern ceramic materials science. Such materials are needed to accurately measure distances (optical ranging), to implement new modes of materials processing, and to create qualitatively new optical information carriers, medical equipment, IR windows, high-power LEDs and thermoelectric elements.

'The homogeneity of the mesostructure (inner structure) of green bodies, i.e. nanopowders compacted into pellets from which the ceramics are obtained, is one of the most important characteristics during sintering. For ceramic materials of optical (laser) quality, the last hundredths and thousandths of a percent of residual porosity must be removed in the final stage of sintering in order to ensure a low degree of light scattering. The homogeneity of the initial compacts can be controlled by varying many technological parameters, for example the pressing technology or the choice and preparation mode of the initial nanopowders. However, a key feature of obtaining transparent ceramics by reactive sintering is that the energy transferred to the multi-component powder system from which the ceramics are obtained is spent on the competing processes of phase transformation and consolidation. For this reason, we proposed to control the initial state of the green bodies by annealing them prior to sintering,' says Denis Kosyanov, head of the FEFU research team and a Senior Researcher at the National Technology Initiative Center for VR and AR (FEFU NTI Center).

The scientist noted that ceramic samples prepared from green bodies pre-annealed under optimized conditions (600°C for 4 hours) exhibit a residual porosity of ≤0.001 vol% and yield efficient laser emission at 1064 nm with a slope efficiency as high as 67% under quasi-continuous pumping at 807 nm, matching the level of the best international counterparts.

The scientists previously described this approach in the journal Ceramics International, using vacuum reactive sintered Y3Al5O12:Nd3+ laser ceramics as an example.

The present work aims to establish a more detailed correlation between the homogeneity of the initial green bodies, their structural-phase state and the functional parameters of the final materials.

Credit: 
Far Eastern Federal University

From rain to flood

image: KITcube mobile measurement facility: with the help of a truck crane, the precipitation radar is installed at Müglitztal/Saxony.

Image: 
Dr. Andreas Wieser, KIT

Extreme weather events, such as thunderstorms or heavy rainfall and the resulting floods, influence Earth and environmental systems in the long term. To holistically study the impacts of hydrological extremes - from precipitation to infiltration into the ground, discharge into rivers and, finally, flow into the ocean - a measurement campaign at Müglitztal/Saxony is about to start under the MOSES Helmholtz Initiative. The measurement campaign is coordinated by Karlsruhe Institute of Technology (KIT).

A single heavy rainfall event may have serious impacts on an entire river system, ranging from land erosion by floods to nutrient and pollutant transport to changes in the ecosystem. The current MOSES measurement campaign studies hydrological extreme events from their source in the atmosphere to the response of biosystems.

MOSES stands for "Modular Observation Solutions for Earth Systems." Within this joint initiative, nine research centers of the Helmholtz Association set up mobile and modular observation systems to study the impacts of temporally and spatially limited dynamic events, such as extreme precipitation and discharge events, on the long-term development of Earth and environmental systems. The current measurement campaign on hydrological extremes coordinated by KIT takes place from mid-May to mid-July 2019 at Müglitztal, Saxony. In this region located in the Eastern Erzgebirge (Ore Mountains), certain weather conditions may result in extreme precipitation and floods, an example being the flood of 2002. Such extreme events are triggered either by depressions which, together with blockage effects by mountains, cause high precipitation, or by small-scale convective precipitation events, i.e. thunderstorms, that may be associated with floods in a limited area, such as a mountain valley.

Apart from the Troposphere Research Division of KIT's Institute of Meteorology and Climate Research (IMK-TRO), the Helmholtz Centre for Environmental Research (UFZ) Leipzig, Forschungszentrum Jülich (FZJ), and the Helmholtz Centre Potsdam - German Research Centre for Geosciences (GFZ) are involved in the current measurement campaign with their measurement systems.

KIT will use its mobile KITcube observatory. It supplies information on the formation and development of strong rainfall, precipitation distribution, and evaporation. Among others, a radar is applied to measure precipitation within a radius of 100 km, a microwave radiometer serves to determine atmospheric temperature and humidity profiles, and a lidar system is used to measure the wind profile with the help of lasers. Radiosondes supply information on the state of the atmosphere up to 18 km height. A network of disdrometers, i.e. systems for continuous monitoring of precipitation intensity and raindrop size, supplies additional information on processes in the observation area.

UFZ scientists will focus on soil moisture, an important variable controlling the discharge of rainwater. If the soil is too wet or extremely dry, rainwater flows off the land surface and floods may develop more quickly. To optimally monitor the development of soil moisture, UFZ will install a mobile, wireless sensor network to measure soil moisture and temperature at variable depths. In contrast to classical systems, the sensor network allows precise adjustment of sensor positions and distribution as well as of scanning rates to local measurement conditions. Apart from the stationary sensor network, mobile cosmic-ray rovers with specially developed neutron sensors will be applied. With them, researchers can observe large-scale variation of soil moisture in the Müglitz catchment area.

Scientists of Forschungszentrum Jülich will launch balloon probes up to 35 km height to determine, among others, how thunderstorms affect climate in the long term. Using water vapor, ozone, and cloud instruments, they study trace gas transport through thunderstorms into the upper troposphere - the bottom layer of the Earth's atmosphere - or even into the stratosphere above.

GFZ researchers will use mobile measurement units to study the influence of stored water on the development of a flood. Apart from cosmic ray sensors to measure water in the upper soil and sensors to measure close-to-surface groundwater, they will also use so-called gravimeters. These systems detect variations of the Earth's gravity due to changing underground water masses, also at larger depths.

Credit: 
Karlsruher Institut für Technologie (KIT)

Environmental oxygen triggers loss of webbed digits

image: This cartoon shows how interdigital cell death and environmental oxygen are correlated in various tetrapods.

Image: 
Cordeiro et al. / Developmental Cell

Free fingers have many obvious advantages on land, such as in locomotion and grasping, while webbed fingers are typical of aquatic or gliding animals. But both amphibians and amniotes--which include mammals, reptiles, and birds--can have webbed digits. In new research from Japan, scientists show for the first time that during embryo development, some animal species detect the presence of atmospheric oxygen, which triggers removal of interdigital webbing. Their research appears June 13 in the journal Developmental Cell.

Amphibians--animals like frogs, toads, salamanders, and newts--form fingers without webbing by differential growth patterns between the digits and the areas between them, or interdigital regions. By comparison, amniotes rely on interdigital cell death, or death of cells in the webbing between digits, a mechanism that contributes to a greater variation of limb shapes.

"We found that the removal of the interdigital membrane by cell death depends on the production of reactive oxygen species (ROS), which only occurs in embryos exposed to sufficient oxygen levels during development," said senior author Mikiko Tanaka of the Tokyo Institute of Technology.

Since high oxygen levels can induce cell death in a frog, the researchers believe this mechanism is likely shared by all tetrapods--both amphibians and amniotes. "But amphibians do not employ cell death to shape their interdigital regions; it is the difference in growth rate between the digital and interdigital regions that will determine their final proportions," she says. "We think that interdigital cell death appeared in amphibians only as a by-product of the high oxygen levels, a first step in this evolutionary process. This new step eventually was integrated into limb development and became essential to shape the limbs of modern amniotes."

In their study, Tanaka's team examined embryos from several species. In chicken embryos, an amniote with interdigital cell death, changing the oxygen levels directly affected the number of dying cells. They also noted that increasing the amount of environmental oxygen induced interdigital cell death in the African clawed frog, an amphibian that typically lacks it. And increasing the density of blood vessels in the limbs of these frogs also induced cell death.

To gain an evolutionary perspective, researchers also studied cell death and ROS in two other amphibian species, the Japanese fire-bellied newt and the coquí frog. Like the African clawed frog, the Japanese fire-bellied newt had no interdigital cell death, but the coquí frogs had dying cells in their interdigital regions. Importantly, unlike the other two amphibians, the coquí frogs grow without a tadpole stage in terrestrial eggs and breathe oxygen from the air. "This way, we show both experimentally and comparatively that interdigital cell death is correlated with life history strategy and oxygen availability in tetrapods (four-legged vertebrates)," says Tanaka.

The researchers explain that the interdigital region is rich in blood vessels, the source of oxygen to the tissues. Part of the oxygen can be converted to ROS. "Paradoxically, ROS are traditionally considered villains such as in aging and infertility," she says, "but it is becoming clear that there are physiological levels of ROS which vary according to each cell and regulate several signaling pathways during the development and in the adult organism."

The team believes there are two main factors that made the interdigital region sensitive to this increase in ROS--active Bmp signaling and blood vessel remodeling. For almost 20 years, Bmps have been described as key for the induction of cell death in amniote limbs. "However, this pathway also plays a role in patterning the number of digits and joints, and is active in the interdigital region of amphibians as well," she says. In the same way, blood vessel remodeling increases the oxygen availability to the limbs and is correlated with interdigital cell death in amniotes but is part of another process: ossification of the fingers.

"But the interesting point is that, while both amniotes and amphibians can have either free or webbed fingers, they are formed in different ways in these two groups," says Tanaka. "The new developmental mechanism acquired by amniotes - interdigital cell death -allowed for the evolution of a great variety of limb shapes, such as the lobed fingers of coots, and even removal of some fingers in horses and camels."

Looking ahead, the team is interested in understanding exactly which pathways are regulated by ROS during development. They hope to uncover how cell death became an integral part of the limb development of amniotes during evolution. And they also hope to gain insight into how drugs that may lead to excessive ROS production, such as ethanol, phenytoin, and thalidomide, may cause developmental defects in humans.

Credit: 
Cell Press

Viruses found to use intricate 'treadmill' to move cargo across bacterial cells

video: The video shows previously unseen processes of giant phage while reproducing inside a bacterial cell. Phage replication begins when a virus particle attaches to the surface of the cell and injects its DNA. A protein shell assembles around replicating phage DNA, establishing the phage nucleus, which is positioned at the center of the cell by filaments. Later, capsids travel along these 'treadmill' filaments to dock on the phage nucleus and initiate viral DNA packaging.

Image: 
Video created by Janet Iwasa. Joe Pogliano and Elizabeth Villa, 2019

Countless textbooks have characterized bacteria as simple, disorganized blobs of molecules.

Now, using advanced technologies to explore the inner workings of bacteria in unprecedented detail, biologists at the University of California San Diego have discovered that in fact bacteria have more in common with sophisticated human cells than previously known.

Publishing their work June 13 in the journal Cell, UC San Diego researchers working in Professor Joe Pogliano's and Assistant Professor Elizabeth Villa's laboratories have provided the first example of cargo within bacterial cells transiting along treadmill-like structures in a process similar to that occurring in our own cells.

"It's not that bacteria are boring, but previously we did not have a very good ability to look at them in detail," said Villa, one of the paper's corresponding authors. "With new technologies we can start to understand the amazing inner life of bacteria and look at all of their very sophisticated organizational principles."

Study first-author Vorrapon Chaikeeratisak of UC San Diego's Division of Biological Sciences and his colleagues analyzed giant Pseudomonas bacteriophage (also known as phage, the term used to describe viruses that infect bacterial cells). Earlier insights from Pogliano's and Villa's labs found that phage convert the cells they have infected into mammalian-type cells with a centrally located nucleus-like structure, formed by a protein shell surrounding the replicated phage DNA. In the new study the researchers documented a previously unseen process that transports viral components called capsids to the DNA at the central nucleus-like structure. They followed as capsids moved from an assembly site on the host membrane, trafficked along a conveyor belt-like path made of filaments, and ultimately arrived at their final destination, the phage DNA.

"They ride along a treadmill in order to get to where the DNA is housed inside the protein shell, and that's critical for the life cycle of the phage," said Pogliano, a professor in the Section of Molecular Biology. "No one has seen this intracellular cargo travelling along a filament in bacterial cells before."

"The way this giant phage replicates inside bacteria is so fascinating," said Chaikeeratisak. "There are a lot more questions to explore about the mechanisms that it uses to take over the bacterial host cell."

Opening the door to the new discovery was the combination of time-lapse fluorescence microscopy, which offered a broad perspective of movement within the cell, similar to a Google Earth view of roadways, with cryo-electron tomography, which let the scientists zoom in to a "street-level" view and analyze components on the scale of individual vehicles and the people within them.

Villa said each technique's perspective helped provide key answers but also brought new questions about the transportation and distribution mechanisms within bacterial cells. Kanika Khanna, a student member of both labs, is trained to use both technologies to gain data and insights from each.

"Zooming in and out allowed us to observe a unique example where things just don't randomly diffuse inside bacterial cells," said Khanna. "These phages have evolved a sophisticated and directed mechanism of transport using filaments to replicate inside their hosts that we could have not seen otherwise."

Phage infect and attack many types of bacteria and are known to live naturally in soil, seawater and humans. Pogliano believes the new findings are important for understanding more about the evolutionary development of phage, which have been the subject of recent attention.

"Viruses like phage have been studied for 100 years but they are now receiving renewed interest because of the potential of using them for phage therapy," said Pogliano.

The type of phage studied in the new paper is the kind that one day could be used in new treatments to cure a variety of infections.

Last year UC San Diego's School of Medicine started the Center for Innovative Phage Applications and Therapeutics (IPATH), which was launched to develop new treatments for infectious disease as widespread resistance to traditional antibiotics continues to grow.

"If we understand how phages operate inside bacteria and what they do, the end goal would be that you might be able to start designing tailor-made phages for particular infections that are resistant," said Villa.

Credit: 
University of California - San Diego

Pollen collected by US honey bees in urban settings shows dramatic seasonal variation

image: Pollen collected in traps.

Image: 
Pierre Lau

The diversity and availability of pollen foraged by honey bees across urban and suburban areas in the US varies drastically with the seasons, according to a study published June 12, 2019 in PLOS ONE by Juliana Rangel from Texas A&M University, USA, and colleagues.

Honey bee (Apis mellifera) colonies require a diversity of protein-rich pollen in order to rear healthy brood and ensure colony survival. During certain seasons, insufficient or poor-quality pollen can limit brood nutrition. In this study, the authors investigated the variation in pollen collected by honey bees across developed landscapes in California, Michigan, Florida, and Texas over the seasons of the year.

The authors tracked a total of 394 sites with at least two hives each in urban and suburban locations across California, Texas, Florida, and Michigan. They placed a pollen trap at each hive entrance, which passively collected pollen from foraging bees, and sampled pollen from the traps in multiple months of 2014 and 2015. The researchers used a light microscope to identify pollen grains to the family, genus, and species level where possible.

The total overall pollen species diversity varied significantly across all four states, with highest diversity in California and lowest diversity in Texas. Nationally, the total pollen diversity was significantly higher in the spring across all locations as compared to other seasons. Top pollen sources across all states included legumes, oaks, roses and daisies. Only a few plant groups provided pollen throughout the year - for example, eucalyptus and palm pollen was consistently available in California and Florida.

Since pollen traps were only in use over limited periods, the assessment of pollen collection was not comprehensive, and the pollen was not quantified to examine the proportion collected of each type. However, these results provide information about honey bee foraging patterns over the year. The authors hope this might help urban planners and gardeners choose plants that can provide appropriate pollen resources to honey bees in developed areas year-round, and plan pesticide treatment regimens around honey bee foraging schedules.

The authors add: "This study describes the seasonal and geographic variation of floral sources of pollen for honey bees in urban and suburban landscapes, giving us for the first time a comprehensive look at some of the most important plants for honey bees in developed areas, and serves as a foundation for studies related to honey bee nutritional ecology in urban settings."

Credit: 
PLOS

Body composition shown to affect energy spent standing versus sitting

A person's body composition could influence the difference between the amount of energy they spend while sitting versus standing, according to new research published in the open-access journal PLOS ONE. Conducted by Francisco J. Amaro-Gahete of the University of Granada, Spain, and colleagues, this work adds to mounting evidence that more energy is expended while standing than while sitting or lying down.

Sedentary lifestyles are linked to increased risk for a variety of health conditions, including diabetes, obesity, and cancer. The difference in energy a person expends while standing versus sitting or lying down may be a key factor influencing health risks, but previous studies have found conflicting results about the actual size of these differences. Also, body composition--the proportion of fat in a person's body--could impact these differences, but its role has been unclear.

To address these issues, Amaro-Gahete and colleagues measured energy expenditure differences between lying, sitting, and standing for 55 young adults aged 18 to 25. In line with previous research, the participants burned significantly more kilocalories per minute while standing than while sitting or lying, while no difference was seen for sitting versus lying.

Notably, the researchers also examined associations between energy expenditure in different positions and body composition of the participants. They found no significant associations when comparing energy spent lying versus sitting or lying versus standing. However, they did find that participants with a higher lean body mass had a smaller difference in energy spent sitting versus standing.

These findings lend new support to the idea that a simple way for a person to increase their energy expenditure is by increasing their time spent standing. The results could also aid efforts to better understand, monitor, and counteract sedentary lifestyles.

Amaro-Gahete adds: "Increasing the time spent standing could be a simple strategy to increase energy expenditure."

Credit: 
PLOS

Protecting coral reefs in a deteriorating environment

WASHINGTON -- Coral reefs around the world face growing danger from a changing climate, on top of the historic threats from local pollution and habitat destruction. In response, scientists are researching new interventions that have the potential to slow coral reef damage from warming and acidifying oceans. The interventions span a wide range of physical and biological approaches for increasing the stability of coral reefs, but they have only been tested at small scales.

A new report from the National Academies of Sciences, Engineering, and Medicine examines these resilience tools and provides decision-makers with a process they can follow in considering whether to use one or more of the novel approaches.

Many of the new interventions seek to amplify natural resilience, such as laboratory breeding of corals that show greater heat resistance. Other methods, some merely on the horizon such as genetic manipulation of corals, might one day introduce new levels of stress tolerance. Ultimately, all interventions alter the reef in some way. These changes will result in benefits that differ across sites, and they may have varying unintended consequences - meaning that the risks and benefits need to be weighed locally, the report says.

"Maintaining the stability of coral reefs in the face of local and climate stressors is a key goal for supporting human well-being around the world," said Stephen Palumbi, chair of the 12-member committee that wrote the report, and Jane and Marshall Steel Jr. Professor in Marine Sciences at Stanford University. "Many new interventions have promise for these efforts, but they differ widely in their readiness levels, and implementing them will require careful attention to regional contexts."

Novel solutions to growing threats

Since the 1980s, tropical coral reef coverage around the world has declined by about 30% to 50%. Pollution, habitat destruction, and overfishing have long been among the culprits in many places, but increasingly, coral reef loss can be attributed to a changing climate. Rising water temperatures are increasing the frequency of mass bleaching events and are making disease outbreaks more common. And as ocean waters become more acidic from absorbing carbon dioxide, it will become harder for corals to grow and maintain their skeletons.

The destruction of coral reefs has serious human costs, because many coastal communities depend on local reefs for fishing and tourism. Coral reefs also absorb energy from the waves that pass over them, buffering shore communities against destructive storms.

In response to these threats, researchers are developing new ways to improve corals' persistence in a changing climate - 23 of which were described in the first report released by the committee last November.

The committee's new, final report includes an assessment of the technical readiness of various interventions. Some are possible to use now - for example, pre-exposing corals to mild warming in order to improve their tolerance of greater heat levels. With more research and testing, others may be available for use in the next two to five years, such as using antibiotics to treat disease, mixing cool water into reef habitats, or shading corals from sunlight and heat. Still other proposed interventions - for example, using tools such as CRISPR/Cas9 to genetically manipulate corals to make them more threat-resistant - need significantly more research and development, and are at least a decade away.

"Though all of these interventions entail some risk, the risk from doing nothing is increasing year by year," said committee member Nancy Knowlton, former holder of the Sant Chair in Marine Science at the Smithsonian Institution.

Choosing the right intervention strategy is a local, stakeholder-driven decision

Whether a specific intervention (or a combination of interventions) is suitable depends not just on its technical readiness, but on each particular ecological and social setting, the report stresses. Local factors such as the level of reef degradation, the quality of the water, and the resources and infrastructure available will determine if an intervention is needed or beneficial.

Equally important is whether the intervention is acceptable to a community, the report says. Throughout the decision-making process, it is important to engage a broad set of stakeholders - both to establish objectives and to choose a course of action that reflects community values. "Stakeholder engagement enables choosing interventions with expected outcomes that align with the goals of the community," said committee member Marissa Baskett, professor of environmental science and policy at the University of California, Davis.

The report also recommends that coral reef managers follow an "adaptive management" approach in making decisions about interventions - an approach that recognizes uncertainty and incorporates learning to adjust and improve strategies over time. "The science of coral reef interventions is still young, and particular environments may respond to them in different ways. So using a structured and adaptive decision-making approach allows managers to make decisions even when there is uncertainty," said Palumbi.

The first step of the adaptive management cycle is to set, together with stakeholders, the goals and objectives against which interventions' effects can be compared, and to outline acceptable courses of action. Locally-tailored models of coral reef dynamics, and how they change with different interventions, are a necessary tool to evaluate and compare the impact of different intervention strategies on coral reef outcomes. Developing a successful modeling framework requires substantial effort, the report concludes, but it pays off in the ability to assess the risks of new interventions compared to the risks posed by taking no action.

As strategies are implemented, efforts must be invested in monitoring the effects of interventions to increase knowledge about their impacts, evaluating their results, and communicating with stakeholders. Based on the observed effects, managers can then continue or alter their approach as needed.

The ability to make informed decisions and effectively implement novel interventions could be improved if remaining gaps in research - both on the interventions and on corals themselves - were filled, the report says. It identifies priority areas for research in basic coral biology, site-based assessments, improvement of interventions, and improvements in risk assessment and modeling. Increasing the ease and scale of use of different interventions, so that communities have a larger toolbox to choose from, is another important research goal.

"We must also grapple with and solve the key problem of greenhouse gas emissions," Palumbi said. "But the wealth of intervention options gives us some hope that we can help coral reefs successfully survive the next century as diverse, productive, and beautiful places in the sea."

The study -- undertaken by the Committee on Interventions to Increase the Resilience of Coral Reefs -- was sponsored by NOAA and the Paul G. Allen Family Foundation. The National Academies are private, nonprofit institutions that provide independent, objective analysis and advice to the nation to solve complex problems and inform public policy decisions related to science, technology, and medicine. They operate under an 1863 congressional charter to the National Academy of Sciences, signed by President Lincoln. For more information, visit nationalacademies.org.

Credit: 
National Academies of Sciences, Engineering, and Medicine

Old ice and snow yields tracer of preindustrial ozone

image: Rice University researchers and collaborators used ice cores, like the one shown here from Antarctica, in combination with atmospheric chemistry models to establish an upper limit for the increase in ozone levels in the lower atmosphere since 1850.

Image: 
Photo by Jeff Fitlow/Rice University

HOUSTON -- (June 12, 2019) -- Using rare oxygen molecules trapped in air bubbles in old ice and snow, U.S. and French scientists have answered a long-standing question: How much have "bad" ozone levels increased since the start of the Industrial Revolution?

"We've been able to track how much ozone there was in the ancient atmosphere," said Rice University geochemist Laurence Yeung, the lead author of a study published online today in Nature. "This hasn't been done before, and it's remarkable that we can do it at all."

Researchers used the new data in combination with state-of-the-art atmospheric chemistry models to establish that ozone levels in the lower atmosphere, or troposphere, have increased by at most 40% since 1850.

"These results show that today's best models simulate ancient tropospheric ozone levels well," said Yeung. "That bolsters our confidence in their ability to predict how tropospheric ozone levels will change in the future."

The Rice-led research team includes investigators from the University of Rochester in New York, the French National Center for Scientific Research's (CNRS) Institute of Environmental Geosciences at Université Grenoble Alpes (UGA), CNRS's Grenoble Images Speech Signal and Control Laboratory at UGA and the French Climate and Environmental Sciences Laboratory of both CNRS and the French Alternative Energies and Atomic Energy Commission (CEA) at the Université Versailles-St Quentin.

"These measurements constrain the amount of warming caused by anthropogenic ozone," Yeung said. For example, he said the most recent report from the Intergovernmental Panel on Climate Change (IPCC) estimated that ozone in Earth's lower atmosphere today is contributing 0.4 watts per square meter of radiative forcing to the planet's climate, but the margin of error for that prediction was 50%, or 0.2 watts per square meter.

"That's a really big error bar," Yeung said. "Having better preindustrial ozone estimates can significantly reduce those uncertainties.

"It's like guessing how heavy your suitcase is when there's a fee for bags over 50 pounds," he said. "With the old error bars, you'd be saying, 'I think my bag is between 20 and 60 pounds.' That's not good enough if you can't afford to pay the penalty."

Ozone is a molecule that contains three oxygen atoms. Produced in chemical reactions involving sunlight, it is highly reactive, in part because of its tendency to give up one of its atoms to form a more stable oxygen molecule. The majority of Earth's ozone is in the stratosphere, which is more than five miles above the planet's surface. Stratospheric ozone is sometimes called "good" ozone because it blocks most of the sun's ultraviolet radiation, and is thus essential for life on Earth.

The rest of Earth's ozone lies in the troposphere, closer to the surface. Here, ozone's reactivity can be harmful to plants, animals and people. That's why tropospheric ozone is sometimes called "bad" ozone. For example, ozone is a primary component of urban smog, which forms near ground level in sunlight-driven reactions between oxygen and pollutants from motor vehicle exhaust. The Environmental Protection Agency considers exposure to ozone levels greater than 70 parts per billion for eight hours or longer to be unhealthy.

"The thing about ozone is that scientists have only been studying it in detail for a few decades," said Yeung, an assistant professor of Earth, environmental and planetary sciences. "We didn't know why ozone was so abundant in air pollution until the 1970s. That's when we started to recognize how air pollution was changing atmospheric chemistry. Cars were driving up ground-level ozone."

While the earliest measurements of tropospheric ozone date to the late 19th century, Yeung said those data conflict with the best estimates from today's state-of-the-art atmospheric chemistry models.

"Most of those older data are from starch-paper tests where the paper changes colors after reacting with ozone," he said. "The tests are not the most reliable -- the color change depends on relative humidity, for example -- but they suggest, nevertheless, that ground-level ozone could have increased up to 300% over the past century. In contrast, today's best computer models suggest a more moderate increase of 25-50%. That's a huge difference.

"There's just no other data out there, so it's hard to know which is right, or if both are right and those particular measurements are not a good benchmark for the whole troposphere," Yeung said. "The community has struggled with this question for a long time. We wanted to find new data that could make headway on this unsolved problem."

Finding new data, however, is not straightforward. "Ozone is too reactive, by itself, to be preserved in ice or snow," he said. "So, we look for ozone's wake, the traces it leaves behind in oxygen molecules.

"When the sun is shining, ozone and oxygen molecules are constantly being made and broken in the atmosphere by the same chemistry," Yeung said. "Our work over the past several years has found a naturally occurring 'tag' for that chemistry: the number of rare isotopes that are clumped together."

Yeung's lab specializes in both measuring and explaining the occurrence of these clumped isotopes in the atmosphere. They are molecules that have the usual number of atoms -- two for molecular oxygen -- but they have rare isotopes of those atoms substituted in place of the common ones. For example, more than 99.5% of all oxygen atoms in nature have eight protons and eight neutrons, for a total atomic mass number of 16. Only two of every 1,000 oxygen atoms are the heavier isotope oxygen-18, which contains two additional neutrons. A pair of these oxygen-18 atoms is called an isotope clump.

The vast majority of oxygen molecules in any air sample will contain two oxygen-16s. A few rare exceptions will contain one of the rare oxygen-18 atoms, and rarer still will be the pairs of oxygen-18s.
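To get a feel for how rare these clumps are, the short back-of-the-envelope calculation below turns the abundances quoted above into expected counts under purely random pairing of atoms. The oxygen-18 fraction follows the article's "two of every 1,000 atoms" figure; the small enrichment applied at the end is a hypothetical number used only to show how a clumping signal is expressed relative to that random baseline, not a value from the paper.

```python
# Back-of-the-envelope sketch of the isotope-clump bookkeeping described above.
# The oxygen-18 fraction follows the article's "two of every 1,000 atoms";
# the per-mil enrichment at the end is a made-up illustration, not the paper's data.
f18 = 0.002                      # fraction of oxygen atoms that are oxygen-18
f16 = 1.0 - f18                  # treat all remaining atoms as oxygen-16

p_1616 = f16 * f16               # ordinary 16O-16O molecules
p_1618 = 2 * f16 * f18           # singly substituted molecules (two orderings)
p_1818 = f18 * f18               # doubly substituted, "clumped" 18O-18O pairs

print("random-pairing expectation per million O2 molecules:")
print("  16O16O: %8.0f   16O18O: %6.0f   18O18O: %4.1f" %
      (1e6 * p_1616, 1e6 * p_1618, 1e6 * p_1818))

# Clumping signals are reported as the deviation of the measured 18O-18O
# abundance from this purely random expectation, usually in per mil.
enrichment_per_mil = 1.5         # hypothetical value, for illustration only
measured_1818 = p_1818 * (1 + enrichment_per_mil / 1000)
print("with a %.1f per-mil enrichment: %.4f per million (random: %.4f)" %
      (enrichment_per_mil, 1e6 * measured_1818, 1e6 * p_1818))
```

Random pairing leaves only about four oxygen-18 pairs in every million oxygen molecules, so the measurable signal is a per-mil-scale shift on top of an already tiny abundance, which is why so few laboratories can make this measurement.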

Yeung's lab is one of the few in the world that can measure exactly how many of these oxygen-18 pairs are in a given sample of air. He said these isotope clumps in molecular oxygen vary in abundance depending on where ozone and oxygen chemistry occurs. Because the lower stratosphere is very cold, the odds that an oxygen-18 pair will form from ozone/oxygen chemistry increase slightly and predictably compared to the same reaction in the troposphere. In the troposphere, where it is warmer, ozone/oxygen chemistry yields slightly fewer oxygen-18 pairs.

With the onset of industrialization and the burning of fossil fuels around 1850, humans began adding more ozone to the lower atmosphere. Yeung and colleagues reasoned that this increase in the proportion of tropospheric ozone should have left a recognizable trace -- a decrease in the number of oxygen-18 pairs in the troposphere.

Using ice cores and firn (compressed snow that has not yet formed ice) from Antarctica and Greenland, the researchers constructed a record of oxygen-18 pairs in molecular oxygen from preindustrial times to the present. The evidence confirmed both the increase in tropospheric ozone and the magnitude of the increase that had been predicted by recent atmospheric models.

"We constrain the increase to less than 40%, and the most comprehensive chemical model predicts right around 30%," Yeung said.

"One of the most exciting aspects was how well the ice-core record matched model predictions," he said. "This was a case where we made a measurement, and independently, a model produced something that was in very close agreement with the experimental evidence. I think it shows how far atmospheric and climate scientists have come in being able to accurately predict how humans are changing Earth's atmosphere -- particularly its chemistry."

Credit: 
Rice University

NASA explores our changing freshwater world

image: Follow the Freshwater: By predicting droughts and floods and tracking blooms of algae, NASA's view of freshwater around the globe helps people manage their water.

Image: 
NASA/ Katy Mersmann

Water is so commonplace that we often take it for granted. But too much - or too little of it - makes headlines.

Catastrophic flooding in the U.S. Midwest this spring has caused billions of dollars in damage and wreaked havoc with crops after rain triggered a mass melting of snow. In California, seven years of drought so debilitating that it led to water rationing came to a close after a wet and snowy winter capped several years of slow rebound and replenished the vital mountain snowpack.

Half a world away, drought in eastern Australia so depleted the wheat crop that the country had to import wheat for the first time in 12 years. In eastern Africa and the Middle East, some of the most severe drought conditions on Earth are contributing to stressed crops across Somalia, Sudan, and Yemen.

NASA's water-management efforts are shaped by local geography and by specific user needs, so that they deliver the freshwater data most valuable to each community. For this reason, NASA supports a number of water-management applications customized to different regions. For example, NASA's Western Water Applications Office works with various entities in the western U.S., including state governments, tribal nations, and private industry, to track the impacts of drought on agriculture and on water supplies in general. Abroad, NASA partners with the U.S. Agency for International Development through the SERVIR program to provide satellite data, computing tools, and training to local partners, improving flood forecasting in Africa and assessing climate impacts on mountain snowpack in the Himalayas, among other efforts.

These programs are but a few examples of many NASA-supported projects. Hundreds of other researchers, government agencies, and non-profits develop their own water-management tools and applications using NASA's free and open datasets.

NASA is improving existing remote-sensing methods and developing new ones that can reveal how much water is stored in mountain and seasonal snowpack - one of the world's most vital sources of freshwater. More than a billion people across multiple continents rely on mountain snow for drinking water, farming, and even hydroelectric power. Snowfall patterns shift over time, however, both year to year from natural variability and over the long term from climate change. With human demands persisting, the ability to accurately measure how much water is held in mountain snowpack becomes all the more critical.

Through the Airborne Snow Observatory program, NASA and California's Department of Water Resources use instruments mounted on airplanes to create high-resolution estimates of snow water content for priority watersheds in the Western U.S. The collected data help determine the timing of the spring melt, which has downstream effects on hydroelectric power generation and on planning how much water can be held in reservoirs.
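
The quantity being estimated here, snow water content (often called snow water equivalent, or SWE), boils down to depth times density. The sketch below shows that bookkeeping with placeholder numbers; it assumes a lidar-style depth measurement and a modeled density, and is not the Airborne Snow Observatory's actual processing.

```python
# Minimal snow-water-equivalent (SWE) bookkeeping -- placeholder numbers only.
snow_depth_m = 1.8            # e.g., lidar-derived snow depth at one grid cell
snow_density_kg_m3 = 350.0    # bulk snowpack density, typically from a snow model
water_density_kg_m3 = 1000.0

# SWE is the depth of liquid water the snowpack would yield if it all melted.
swe_m = snow_depth_m * snow_density_kg_m3 / water_density_kg_m3
print(f"snow water equivalent: {swe_m:.2f} m ({swe_m * 1000:.0f} mm) of water")
```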

NASA is also focused on the long-term development of tools to measure water in snow through an airborne field campaign called SnowEx. This type of field campaign connects detailed measurements of snow in the Colorado Rocky Mountains taken by researchers on the ground to remote sensing observations made by aircraft flying over the ground sites. The connections made from these highly detailed datasets will help scientists design future satellite missions that will make similar measurements from space.

These airborne snow measurements, along with other programs, complement long-term regional observations from NASA satellites, which produce estimates for entire mountain ranges in the Western U.S. and around the world.

Water in the Sky

When we think of water on Earth, we may think of the ocean, rivers and lakes. But as water cycles around the planet, the atmosphere holds moisture, creating a reservoir in the sky that periodically condenses into rain and snow. NASA is part of a team from more than a dozen countries whose satellites work together to deliver global rainfall data every half hour. Over land, rain has an immediate impact as it soaks into the ground and replenishes the soil moisture that supports crops.

Rainfall data are among the most essential measurements for monitoring freshwater's movement around the planet, and they feed applications that touch people's everyday lives, including weather forecasting, crop monitoring, and flood prediction. For many parts of the world - especially developing countries and hard-to-reach terrain where ground measurements are sparse to non-existent - these global NASA datasets are sometimes the only consistent source of information on rainfall and soil moisture.

Whether concerned with floods, droughts, or the status and quality of water supplies, addressing the water-related needs of humans on Earth starts with knowing where the water is. With unique views from space, NASA is at the forefront of studying and monitoring this most precious resource, which is constantly on the move. Researchers use data from satellites, aircraft, and other efforts to find out where and when water is available around the globe, how much there is, and how those patterns are changing. They then figure out how best to use that data and get it into the hands of the people who need it most.

Over the next few weeks, we'll be exploring areas of NASA research into Earth's freshwater and surveying how those advances help people solve real world problems.

NASA and its partners are using satellites to revolutionize our ability to track and understand the flow of freshwater around Earth - whether it is in the atmosphere, at the Earth's surface, or underground. In the last two decades, freely available NASA datasets have been used for extensive research into the movement, distribution, and interaction of each part of the water cycle worldwide.

It's a complex cycle: water evaporates from warm tropical oceans, condenses into clouds and rides the winds, and a portion of it falls as rain or snow. On the ground, freshwater is stored in ice, snow, rivers and lakes, or it soaks into the ground, disappearing from view as it infiltrates soils and aquifers. It can also evaporate back to the atmosphere, where moisture is tightly tied to Earth's energy flow, which in turn shapes the weather patterns that govern freshwater's distribution.

"Fresh water is critically important to humans, both in obvious ways and in unseen ways such as moving heat around Earth's entire climate system," said Jared Entin, terrestrial hydrology program manager in the Earth Science Division at NASA Headquarters, Washington. "With our current satellites, we are now making great progress in pinning down both the detail needed for local water decisions and the global view essential to better understanding our changing climate."

Researchers funded by NASA have used satellite and airborne data to better inform existing tools for flooding, drought forecasts and famine relief efforts, and for planning and monitoring regional water supplies. These efforts are tackling some of the most pressing needs of people around the world.

NASA satellites monitoring Earth's gravity field have given scientists insight into the movement of large masses such as ice and water - including water hidden underground. This global look at changes in the amount of water stored in aquifers, the massive underground freshwater reservoirs, has revealed some concerning trends. Of the 37 largest aquifers on Earth, a third are being depleted because communities pump water out faster than rainfall recharges it. These declines occur primarily where agriculture and aquifers coincide, and where human water demands can easily exacerbate periodic drought. Among the most stressed over the past decade are the Central Valley of California, the Indus Basin in northwestern India and Pakistan, and the Arabian Aquifer System in Saudi Arabia.
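
A rough sense of how such depletion shows up in the data: fit a long-term trend to monthly water-storage anomalies and look for a persistently negative slope. The sketch below does this with synthetic numbers; it is not a gravity-mission processing pipeline.

```python
import numpy as np

# Synthetic monthly water-storage anomalies (cm of equivalent water height):
# a seasonal cycle, a slow decline, and some noise. Invented data only.
months = np.arange(120)                               # ten years, monthly
rng = np.random.default_rng(0)
storage = (5.0 * np.sin(2 * np.pi * months / 12.0)    # seasonal swing
           - 0.05 * months                            # steady drawdown
           + rng.normal(0.0, 0.5, months.size))       # measurement noise

# A least-squares line through the series gives the long-term trend;
# a persistently negative slope is the kind of signal that flags depletion.
slope_per_month, _intercept = np.polyfit(months, storage, 1)
print(f"long-term trend: {slope_per_month * 12:.2f} cm of water per year")
```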

About 70% of all freshwater used worldwide goes to irrigated agriculture. Underground aquifers act like savings accounts, providing a dependable supply that makes agriculture possible in arid areas where significant rain may fall only once a year, and during droughts when surface water is scarce. We do not know the full extent of these aquifers or when they may run dry, but tracking how available water changes, both seasonally and across the satellite record, helps decision-makers manage their resources.

In addition to witnessing the effects of agriculture, the satellite data show the effects of climate change, most notably the decline of sea ice and of the polar ice sheets. They also capture the ups and downs of natural variability, reflecting a region's runs of wet and dry years. As the global satellite record extends into the future, researchers and water managers will continue to monitor the freshwater hidden below as climate patterns shift and human demands grow.

Credit: 
NASA/Goddard Space Flight Center