Earth

Humid air can extend lifetime of virus-laden aerosol droplets

image: Color map showing how temperature and humidity affect the fate of a free-falling 100-micron droplet released from an initial height of 1.6 meters. For relative humidities (RH) and temperatures (T) below the yellow arc, the droplet will fall to the ground in the number of seconds indicated by the color scale; above the arc, the droplet will completely evaporate in air, never reaching the ground.

Image: 
Binbin Wang

WASHINGTON, August 18, 2020 -- The novel coronavirus that causes COVID-19 is thought to spread through natural respiratory activities, such as breathing, talking and coughing, but little is known about how the virus is transported through the air.

In Physics of Fluids, from AIP Publishing, University of Missouri scientists report a study of how airflow and fluid flow affect exhaled droplets that can contain the virus. Their model includes a more accurate description of the air turbulence that affects an exhaled droplet's trajectory.

Calculations with their model reveal, among other things, an important and surprising effect of humid air. The results show high humidity can extend the airborne lifetime of medium-sized droplets by as much as 23 times.

Droplets exhaled in normal human breath come in a range of sizes, from about one-tenth of a micron to 1,000 microns. For comparison, a human hair has a diameter of about 70 microns, while a typical coronavirus particle is less than one-tenth of a micron. The most common exhaled droplets are about 50 to 100 microns in diameter.

The droplets exhaled by an infectious individual contain virus particles as well as other substances, such as water, lipids, proteins and salt. The research considered not just transport of droplets through the air but also their interaction with the surrounding environment, particularly through evaporation.

The investigators used an improved description of air turbulence to account for natural fluctuations in air currents around the ejected droplet. They were able to compare their results to other modeling studies and to experimental data on particles similar in size to exhaled droplets. The model showed good agreement with data for corn pollen, which has a diameter of 87 microns, approximately the same size as most of the exhaled droplets.

Humidity affects the fate of exhaled droplets, since dry air can accelerate natural evaporation. In air with 100% relative humidity, the simulations show that larger droplets, 100 microns in diameter, fall to the ground approximately 6 feet from the source of exhalation. Smaller droplets of 50 microns in diameter can travel farther in very humid air, as much as 5 meters, or about 16 feet.

Less humid air can slow the spread. At a relative humidity of 50%, none of the 50-micron droplets traveled beyond 3.5 meters.
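For a sense of the fall times shown in the figure above, here is a minimal back-of-the-envelope sketch of settling under Stokes drag alone. It assumes a water-density droplet in still air and deliberately ignores evaporation, turbulence and humidity, which are precisely the effects the published model accounts for, so it is only a rough point of reference, not the authors' calculation.

```python
# Rough Stokes-settling estimate for exhaled droplets (illustrative only;
# ignores evaporation, turbulence and humidity, which the published model includes).

g = 9.81           # gravitational acceleration, m/s^2
rho_drop = 1000.0  # droplet density, approximated as water, kg/m^3
mu_air = 1.8e-5    # dynamic viscosity of air, Pa*s
height = 1.6       # release height used in the study, m

def stokes_fall_time(diameter_m):
    """Fall time from `height` at the Stokes terminal settling velocity."""
    v_settle = rho_drop * g * diameter_m ** 2 / (18 * mu_air)  # m/s
    return height / v_settle

for d_um in (50, 100):
    t = stokes_fall_time(d_um * 1e-6)
    print(f"{d_um} micron droplet: about {t:.0f} s to fall {height} m")
```

Without evaporation, a 100-micron droplet settles out within seconds, which is why humidity, by slowing evaporation, largely determines whether such droplets reach the ground or instead shrink into longer-lived aerosols.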

The investigators also looked at a pulsating jet model to mimic coughing.

"If the virus load associated with the droplets is proportional to the volume, almost 70% of the virus would be deposited on the ground during a cough," said author Binbin Wang. "Maintaining physical distance would significantly remediate the spread of this disease through reducing deposition of droplets onto people and through reducing the probability of inhalation of aerosols near the infectious source."

Credit: 
American Institute of Physics

LJI team gets first-ever look at a rare but vital stem cell in humans

image: Cancer can derail neutrophil development.

Image: 
Daniel Araujo, La Jolla Institute for Immunology

LA JOLLA--Neutrophils are the warriors of the immune system. They are always ready to spring to action to help heal injuries or fight off disease. Unless, that is, something goes wrong in their developmental process.

Immature neutrophils aren't all warriors--they can be dangerous turncoats. High levels of immature neutrophils in the bloodstream can be a tell-tale sign of cancer and may even be a biomarker for COVID-19.

Now scientists at La Jolla Institute for Immunology (LJI) have tracked down the rare stem cells that generate neutrophils in human bone marrow. This research, published August 18, 2020, in Immunity, gives researchers a potential path for intervening in diseases where neutrophil development goes awry.

"We have identified the stem cells that are the early origins of neutrophils, the most abundant blood cell type in humans," says Huy Dinh, Ph.D., a former LJI postdoctoral associate who recently moved to a faculty position at The University of Wisconsin-Madison. Dinh led the study with LJI Professor Catherine C. Hedrick, Ph.D. "Knowing how human neutrophils develop is especially relevant today because immature neutrophils have been found to be elevated in both the blood and lungs of severe COVID-19 patients."

Despite their importance, neutrophils have proven very hard to study. They don't hold up well outside the body, and the stem cells that make them are even harder to investigate because they only live in bone marrow.

In 2018, the Hedrick Lab reported the discovery of a group of "progenitor" stem cells that give rise to mature neutrophils. These progenitors' sole job was to generate neutrophils, yet they appeared to also promote tumor growth. The researchers believed that detecting these progenitors could give doctors a better way to catch early cancer cases. But first, the team needed to know a lot more about neutrophil development.

The new research revealed a progenitor cell type that exists even earlier in human neutrophil development. Dinh, a past SPARK Award recipient, spearheaded the effort together with co-first authors Tobias Eggert, Ph.D., an LJI visiting scientist, and Melissa Meyer, Ph.D., an LJI postdoctoral researcher, using a tool called cytometry by time-of-flight (CyTOF) to distinguish these rare cells from other types of immune progenitor cells. This work also made it possible for the researchers to identify more specific protein markers on the surface of these early progenitors.

The discovery of these protein markers was important because, until now, scientists have had only a few markers with which to track neutrophils over time. The new study gives scientists specific markers for tracking neutrophil development from day one.

The researchers also found that cases of skin and lung cancers are often accompanied by a flood of immature neutrophils, including the early progenitor cells, into the bloodstream. These immature neutrophils change as they interact with tumor cells, though the researchers aren't sure yet how these changes affect cancer progression.

Dinh likens the stages of neutrophil development to the cars on a train. The early progenitors are like the train engine, keeping everything going smoothly along the track to maturity. Cancer shakes everything up, and immature neutrophils jump off the track before they reach maturity. "It's like the train is falling apart," Dinh says.

Neutrophil development has been in the news recently due to the COVID-19 pandemic, as studies have shown immature neutrophils are also more abundant in some patients with COVID-19. Dinh and Hedrick think perhaps the threat of the virus prompts the body to churn out neutrophils too quickly, again forcing immature cells off the track to maturity.

"We need to study this phenomenon further to see if these neutrophils can be tied to case prognosis or if they can be a drug target for COVID-19," says Dinh.

The researchers hope to continue their work to discover the exact mechanisms that stop neutrophils from reaching maturity. "Knowing the earliest cell that gives rise to neutrophils is really critical for trying to target and control these cells," says Hedrick. "But we don't know exactly how to do that yet."

Credit: 
La Jolla Institute for Immunology

There is at least 10 times more plastic in the Atlantic than previously thought

video: Wide shot of Dr Katsiaryna Pabortsava in the microplastics lab. For more footage of the lab and one of the NOC's research ships, please use the following Dropbox link: https://www.dropbox.com/sh/fjdemtlrwsdz0lu/AACBHWREd0-67abvdBK-0L5Xa?dl=0

Image: 
The National Oceanography Centre

The mass of 'invisible' microplastics found in the upper waters of the Atlantic Ocean is approximately 12 to 21 million tonnes, according to research published in the journal Nature Communications today.

Significantly, this figure covers only three of the most common types of plastic litter in a limited size range. Yet it is comparable in magnitude to estimates of all plastic waste that has entered the Atlantic Ocean over the past 65 years: 17 million tonnes. This suggests that the supply of plastic to the ocean has been substantially underestimated.

The lead author of the paper, Dr Katsiaryna Pabortsava from the National Oceanography Centre (NOC), said "Previously, we couldn't balance the mass of floating plastic we observed with the mass we thought had entered the ocean since 1950. This is because earlier studies hadn't been measuring the concentrations of 'invisible' microplastic particles beneath the ocean surface. Our research is the first to have done this across the entire Atlantic, from the UK to the Falklands."

Co-author Professor Richard Lampitt, also from the NOC, added: "If we assume that the concentration of microplastics we measured at around 200 metres deep is representative of that in the water mass down to the seafloor, with an average depth of about 3,000 metres, then the Atlantic Ocean might hold about 200 million tonnes of plastic litter in this limited polymer type and size category. This is much more than is thought to have been supplied."

"In order to determine the dangers of plastic contamination to the environment and to humans we need good estimates of the amount and characteristics of this material, how it enters the ocean, how it degrades and then how toxic it is at these concentrations. This paper demonstrates that scientists have had a totally inadequate understanding of even the simplest of these factors, how much is there, and it would seem our estimates of how much is dumped into the ocean has been massively underestimated".

Pabortsava and Lampitt collected their seawater samples during the 26th Atlantic Meridional Transect expedition in September to November 2016. They filtered large volumes of seawater at three selected depths in the top 200 metres and detected and identified the plastic contaminants using a state-of-the-art spectroscopic imaging technique. Their study focussed on polyethylene, polypropylene and polystyrene, which are among the most commercially prominent and most commonly littered plastic types.

This study builds on the NOC's cutting-edge research into marine plastic contamination, which aims to better understand the magnitude and persistence of exposure to plastics and the potential harms it can cause. This work was supported by the EU H2020 AtlantOS programme and the NOC. The AMT programme was supported by the UK Natural Environment Research Council's National Capability, Climate Linked Atlantic Sector Science (CLASS) programme.

Credit: 
National Oceanography Centre, UK

Free-roaming dogs prevent giant pandas from thriving in the wild

Before China declared giant pandas a protected species in 1962, hunters in pursuit of the black-and-white bear used dogs to track them. Since then, measures have been put in place to protect the vulnerable pandas, but more than half a century later, dogs are still jeopardizing their safety, according to a group of researchers that included Drexel's James Spotila, PhD.

Spotila, the L.D. Betz Chair Professor in the Department of Biodiversity, Earth and Environmental Science in Drexel's College of Arts and Sciences, and the group began to investigate the problem after two captive-born pandas, which had been released into Liziping Nature Reserve, were attacked by dogs.

The group found that dogs are still menacing giant pandas in part because nature reserves in China are often closely connected to human settlements where dogs roam free. Dogs can roam more than 10 km in a night, and some feral dogs have even taken up permanent residence in the reserves.

A GIS analysis of Liziping Nature Reserve revealed this to be the case, as much of that reserve was within the range of free roaming dogs from the nearby villages. The finding led researchers to expand their scope and suggest that reserves designated for the release of translocated pandas should receive priority consideration for dog-control efforts.

Pandas are a vulnerable species in part because they require a minimum habitat of 114 square kilometers to thrive. While most nature reserves designated for giant pandas are large enough to sustain their population, encroachment by free-roaming dogs could significantly limit the bears' territory.

Because of this concern, the research team, working out of the Chengdu Research Base, expanded its analysis to include all giant panda reserves in China. The analysis revealed that, across the entire range, 40% of panda habitat is within reach of roaming dogs. Therefore, the area safely available for giant pandas in nature reserves throughout China is only 60% of the official "protected" area.

"Dogs have to be removed from giant panda reserves if they are to survive in the wild," Spotila said.
"Predation, harassment and disease transmission by dogs can have large-scale edge effects in both fragmented habitats and protected nature reserves."

The team recently published its findings in Scientific Reports under the title "Free-roaming Dogs Limit Habitat Use of Giant Pandas in Nature Reserves." In it, the team recommends a comprehensive approach to dog-control efforts by local governments, implemented by village leaders, that includes licensing and collaring. It also suggests that education for residents, free neuter and vaccination clinics and procedures to ensure ethical treatment (through consultation with the Society for the Prevention of Cruelty to Animals or similar local groups) of feral dogs removed from reserves should be incorporated in a dog management plan.

Spotila believes that China has done a good job in its conservation efforts, but dog-control efforts need to be considered and implemented in order for giant pandas to thrive in the wild.

"Only by understanding and managing complex interactions between humans, domestic animals and wild animals can we sustain natural systems in a world increasingly dominated by humans," Spotila said.

Credit: 
Drexel University

Airborne viruses can spread on dust, non-respiratory particles

Influenza viruses can spread through the air on dust, fibers and other microscopic particles, according to new research from the University of California, Davis and the Icahn School of Medicine at Mt. Sinai. The findings, with obvious implications for coronavirus transmission as well as influenza, are published Aug. 18 in Nature Communications.

"It's really shocking to most virologists and epidemiologists that airborne dust, rather than expiratory droplets, can carry influenza virus capable of infecting animals," said Professor William Ristenpart of the UC Davis Department of Chemical Engineering, who helped lead the research. "The implicit assumption is always that airborne transmission occurs because of respiratory droplets emitted by coughing, sneezing, or talking. Transmission via dust opens up whole new areas of investigation and has profound implications for how we interpret laboratory experiments as well as epidemiological investigations of outbreaks."

Fomites and influenza virus

Influenza virus is thought to spread by several different routes, including in droplets exhaled from the respiratory tract or on secondary objects such as door handles or used tissues. These secondary objects are called fomites. Yet little is known about which routes are the most important. The answer may be different for different strains of influenza virus or for other respiratory viruses, including coronaviruses such as SARS-CoV-2.

In the new study, UC Davis engineering graduate student Sima Asadi and Ristenpart teamed up with virologists led by Dr. Nicole Bouvier at Mt. Sinai to look at whether tiny, non-respiratory particles they call "aerosolized fomites" could carry influenza virus between guinea pigs.

Using an automated particle sizer to count airborne particles, they found that uninfected guinea pigs give off spikes of up to 1,000 particles per second as they move around the cage, while particles given off by the animals' breathing were released at a constant, much lower rate.

Immune guinea pigs with influenza virus painted on their fur could transmit the virus through the air to other, susceptible guinea pigs, showing that the virus did not have to come directly from the respiratory tract to be infectious.

Finally, the researchers tested whether microscopic fibers from an inanimate object could carry infectious viruses. They treated paper facial tissues with influenza virus, let them dry out, then crumpled them in front of the automated particle sizer. Crumpling the tissues released up to 900 particles per second in a size range that could be inhaled, they found. The particles released from the virus-contaminated paper tissues were also able to infect cells.

Credit: 
University of California - Davis

Is turning back the clock in aging fat cells a remedy for lifestyle diseases?

image: (Mouse adipose tissue) The abundance of Rubicon in adipose tissue from 25-month-old mice was significantly decreased compared with that of 3-month-old mice. Levels of an autophagic substrate, p62, also decreased with age, suggesting that autophagy increases with age. β-actin was used as the protein control.

Image: 
Osaka University

Osaka, Japan - No matter how much we try to fight it, aging is a part of life. High cholesterol, diabetes, and fatty liver, the collection of conditions referred to as lifestyle diseases, all become more commonplace as we get older. Interestingly, however, many of these age-related conditions are caused by changes inside adipocytes, the fat cells responsible for storing excess energy.

Now, in a study published in Nature Communications, researchers led by Osaka University have uncovered exactly how these changes lead to the onset of lifestyle diseases, with an eye to reversing the process.

"Adipocytes produce hormones and cytokines that regulate the function of other metabolic organs," explains study lead author Tadashi Yamamuro. "Age-related changes in adipose tissue result in metabolic disorders that are closely associated with life-threatening cardiovascular diseases. However, no one really knows what causes adipocyte dysfunction in aged organisms."

The research team decided to focus on autophagy, the process used by cells to eliminate unwanted or dysfunctional cellular components. Previous studies had shown that autophagy plays an important role in the prevention of various age-related disorders and is likely to be involved in the aging process. But most pertinent was the finding that autophagy is essential for the normal function and longevity of organs such as the liver and kidney.

Says Yamamuro, "We previously showed that a protein called Rubicon, which inhibits autophagy, is upregulated in aging tissues. We therefore hypothesized that Rubicon likely accumulates in aged adipocytes, decreasing autophagic activity and contributing to the onset of metabolic disorders."

Surprisingly though, the researchers found that Rubicon levels were actually decreased in the adipose tissue of aged mice, resulting in increased autophagic activity.

To dig deeper into the underlying mechanism, the researchers developed a mouse line in which Rubicon was specifically inactivated in adipose tissue.

"In the absence of Rubicon, we observed excessive autophagy in adipocytes and a decline in adipocyte function," explains senior author Tamotsu Yoshimori. "As a result, the mice developed lifestyle diseases such as diabetes and fatty liver and had significantly higher cholesterol levels, despite being fed the same diet as control animals."

The researchers went on to identify the specific proteins affected by the increased levels of autophagy, showing that supplementation of these proteins in the Rubicon deletion mice restored adipocyte function.

"This is a really exciting discovery with important therapeutic implications," says Yoshimori. "Because age-dependent loss of adipose Rubicon causes lifestyle diseases via excess autophagy, inhibiting autophagy in adipocytes may help prevent the onset of these prevalent and potentially life-threatening conditions."

Credit: 
Osaka University

Low-cost home air quality monitors prove useful for wildfire smoke

image: A view of the San Francisco Bay Area from Berkeley Lab during the 2018 Camp fire (left) and three weeks prior. The graphs show PM2.5 concentrations (microgram per cubic meter), with the x-axis as measured by the regulatory monitor in downtown Berkeley and the y-axis as measured by PurpleAir monitors. On the good air day, the numbers were roughly the same. However, when wildfire smoke was present the PurpleAir readings were consistently far higher than the regulatory monitor's.

Image: 
Kelly J. Owen/Berkeley Lab

Over the last few years of frequent and intense wildfire seasons, many parts of the U.S. have experienced hazardous air quality for days on end. At the same time a number of low-cost air quality monitors have come on the market, allowing consumers to check the pollutant levels in their own homes and neighborhoods. So, air quality scientists at Lawrence Berkeley National Laboratory (Berkeley Lab) wanted to know: are these low-cost monitors any good?

The answer is: yes - to a degree.

Published recently in the journal Sensors, their study tested four models of low-cost air quality monitors during actual wildfire pollution events and found that their readings of PM2.5 (particulate matter under 2.5 microns, which has been linked to respiratory and cardiovascular issues) were consistently higher than those of the reference monitor used by regulatory agencies. However, because each monitor responded to the smoke in a relatively consistent way, its readings can be used to estimate true PM2.5 levels. Overall, the researchers concluded that the monitors can provide actionable information.

"We compared the low-cost monitors to one that is used by regulatory agencies in air monitoring stations. It turns out their correlations are phenomenally good. When one goes up, the other goes up at the same time, and it is proportional. That gives us a lot of hope for being able to use them for real information," said Woody Delp, one of the lead authors of the study. "And it could let someone know how well their new portable air filter is reducing smoke particles. But from an absolute point of view, it's becoming clear these sensors require some adjustments and checks to use the numbers."

For the study, titled "Wildfire Smoke Adjustment Factors for Low-Cost and Professional PM2.5 Monitors with Optical Sensors," Delp and co-author Brett Singer tested four low-cost air quality monitors:

IQAir AirVisual Pro

PurpleAir Indoor

Air Quality Egg

eLichens Indoor Air Quality Pro station

These devices, which cost in the range of a few hundred dollars, were compared to reference monitors used by regulatory agencies and researchers, which cost $20,000 or more. They also tested two monitors that are used by researchers and industrial hygienists and cost in the range of $5,000 to $10,000. Additionally, the researchers compared public data from PurpleAir PA-II monitors to nearby regulatory monitoring stations impacted by four wildfires in 2018.

Calibrations and adjustment factors

In the past, air quality monitoring has been limited to high-priced professional monitors, making it inaccessible for personal use. Manufacturers recommend that these devices be calibrated to the specific pollution source of interest because the sensors use an optical sensing technique that responds differently to different sources. Pollution from a backyard barbecue or car exhaust may differ in size and density from pollution from a forest fire, and a forest fire may emit different types of particles than an urban fire.

The low-cost monitors use the same optical sensing technique - estimating particle concentrations based on light scattering - but use mass-produced optical sensors that are not as precisely machined as those in the professional-grade devices. In contrast, the most expensive monitors, those used by regulatory agencies, are calibrated using gravimetric analysis, which is based on the weight of particles.

With the air quality monitors deployed at Berkeley Lab inside a well-ventilated single-story laboratory building, the researchers collected data as the Camp Fire burned in Northern California in the fall of 2018. They found that the four low-cost monitors substantially overreported PM2.5 levels, by factors of 1.6 to 2.4 relative to the readings on the regulatory reference monitor. However, the relative changes correlated well with both the regulatory and professional monitors.

The researchers calculated an adjustment factor of approximately 0.48 when using PurpleAir PA-II monitors outdoors during the Camp, Carr, and Mendocino Complex Fires in California and the Pole Creek Fire in Utah (meaning the readings should be multiplied by 0.48 to estimate the true PM2.5 level). This correction is very close to one of the data conversion options ("LRAPA") given on the PurpleAir website.
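As an illustration of how an adjustment factor of this kind can be derived and applied, here is a minimal sketch using made-up paired readings; it is not the authors' code, data or exact fitting procedure.

```python
import numpy as np

# Hypothetical paired PM2.5 readings (micrograms per cubic metre) during a smoke event:
# a regulatory reference monitor and a co-located low-cost optical monitor.
reference = np.array([35.0, 80.0, 150.0, 220.0, 300.0])
low_cost = np.array([70.0, 165.0, 310.0, 470.0, 640.0])

# Least-squares slope through the origin: reference ~= factor * low_cost.
# The slope is the adjustment factor to apply to raw low-cost readings.
factor = np.sum(low_cost * reference) / np.sum(low_cost ** 2)
print(f"adjustment factor: {factor:.2f}")

# Estimate the "true" PM2.5 from a new raw reading on the low-cost monitor.
raw_reading = 500.0
print(f"estimated PM2.5: {raw_reading * factor:.0f} ug/m3")
```

The published factor of about 0.48 for the PurpleAir PA-II came from comparisons of this general kind against regulatory monitors during the four fires, though the study's actual procedure may differ in detail.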

Good for other indoor pollutants too

In a separate study earlier this year, Delp and Singer, along with first author Zhiqiang Wang, evaluated six low-cost air quality monitors by comparing their output to reference PM2.5 and PM10 measurements from 21 common residential sources, such as frying, grilling, microwaving popcorn, vacuuming, and burning candles. The study, published in the journal Building and Environment, was an update to their 2018 study.

They found that for most pollution sources, the low-cost monitors tracked with the professional ones within a factor of two for PM2.5. Delp said the consumer monitors "enable people to identify activities that emit fine particulate matter inside their homes and to determine if operating filters or just keeping windows closed is effectively reducing exposure inside when there is very bad air pollution outside. For these purposes, they work as well as professional grade monitors, and appear to be very reliable."

"We are impressed and excited by the usefulness and performance of these air quality monitors that cost under $300," said Singer, head of the Indoor Environment Group in Berkeley Lab's Energy Technologies Area.

Both studies were supported by the Department of Energy's Building Technologies Office and the U.S. Environmental Protection Agency Indoor Environments Division.

Credit: 
DOE/Lawrence Berkeley National Laboratory

Ultrafast hydrogen bond dynamics of liquid water revealed by THz-induced Kerr effect

image: a, Schematic diagram of the experimental system. A broadband THz pump pulse (peak electric field strength of 14.9 MV/cm, centre frequency of 3.9 THz, and bandwidth of 1-10 THz) excites liquid water to initiate transient birefringence caused by the THz Kerr effect (TKE), which is monitored by an 800 nm probe pulse that becomes elliptically polarized as it passes through the water film. b, The TKE responses of liquid water and heavy water are shown for comparison. The relatively large damping coefficient of heavy water in the stretching mode corresponds to a faster energy decay of the harmonic oscillator, resulting in a reduction of the second peak of the TKE response compared to that of water. c, The TKE response is assigned to the superposition of four components, among which the bidirectional contributions of the bending and stretching modes play dominant roles. The researchers proposed a hydrogen bond oscillator model based on the Lorentz dynamic equation to describe the dynamics of the intermolecular modes of liquid water and successfully reproduced the measured TKE responses.

Image: 
Hang Zhao, Yong Tan, Liangliang Zhang, Rui Zhang, Mostafa Shalaby, Cunlin Zhang, Yuejin Zhao, and Xi-Cheng Zhang

Liquid water is considered the cornerstone of life and has many extraordinary physical and biochemical properties. The hydrogen bond network of liquid water is widely recognized to play a crucial role in these properties. Due to the complexity of intermolecular interactions and the large spectral overlap of the relevant modes, the study of hydrogen bond dynamics is challenging. In recent years, resonantly exciting liquids with terahertz (THz) waves has provided a new perspective for exploring the transient evolution of low-frequency molecular motion. However, because water has a large absorption coefficient in the THz band, applying the THz-induced Kerr effect technique to hydrogen bond dynamics has remained challenging.

In a new paper published in Light: Science & Applications, a team of scientists, led by Professor Yuejin Zhao from the Beijing Key Laboratory for Precision Optoelectronic Measurement Instrument and Technology, School of Optics and Photonics, Beijing Institute of Technology, China, and Professor Liangliang Zhang from the Beijing Advanced Innovation Center for Imaging Technology and Key Laboratory of Terahertz Optoelectronics (MoE), Department of Physics, Capital Normal University, China, together with co-workers, used an intense and broadband THz pulse to resonantly excite intermolecular modes of liquid water and obtained bipolar THz field-induced transient birefringence signals by adopting a free-flowing water film. They proposed a hydrogen bond harmonic oscillator model associated with the dielectric susceptibility and combined it with the Lorentz dynamic equation to investigate the intermolecular structure and dynamics of liquid water. They decomposed the bipolar signals mainly into a positive signal caused by hydrogen bond stretching vibration and a negative signal caused by hydrogen bond bending vibration, indicating that the polarizability perturbation of water presents competing contributions under bending and stretching. The results provide an intuitive time-resolved picture of the polarizability anisotropy, which reflects the intermolecular modes of liquid water on the sub-picosecond scale.
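The hydrogen bond oscillator model mentioned above is built on the Lorentz dynamic equation. A generic sketch of that equation is shown below; the specific parameterisation, coupling and driving terms used in the paper may differ from this minimal form.

```latex
\frac{\mathrm{d}^2 x_i}{\mathrm{d}t^2}
+ \gamma_i \frac{\mathrm{d} x_i}{\mathrm{d}t}
+ \omega_{0,i}^2\, x_i
= \frac{F_i(t)}{m_i}
```

Here x_i is the coordinate of an intermolecular mode (hydrogen bond bending or stretching), ω_0,i its resonance frequency, γ_i its damping coefficient and F_i(t) the driving force imparted by the THz pulse. In this picture, the measured TKE signal is the superposition of the modes' contributions to the polarizability anisotropy, with bending and stretching entering with opposite signs, consistent with the bipolar signals described above.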

THz waves can resonantly excite one or several molecular motion modes in liquids, which makes them a powerful tool for exploring low-frequency molecular dynamics. The scientists summarize the principle of their work:

"We used a THz electric field to resonantly excite the intermolecular modes of liquid water. The transient rotation of a molecule produces an induced dipole moment, which immediately transfers the momentum driven by the THz field to the restricted translational motion of adjacent water molecules. This translational motion can be assigned to a bending mode and a stretching mode, which can lead to the components of polarizability anisotropy perpendicular and parallel to the hydrogen bonds, respectively, thus resulting in bidirectional performance."

"In the experiment, an intense THz excitation source and an ultrathin flowing water film that replaces traditional cuvettes are the basis for achieving high-quality signals." they added.

"The ultrafast intermolecular hydrogen bond dynamics of water revealed by a broadband THz pump pulse can provide further insights into the transient structure of liquid water corresponding to the pertinent modes. This breakthrough could open a new venue for detecting the physical mechanisms of the gas phase of water and crystalline and amorphous ices, as well as the complex interaction of reagents with solvent water molecules." the scientists forecast.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

High intensity physical activity in early life could lead to stronger bones in adulthood

The research, which analysed data from 2,569 participants of the Children of the 90s health study, found that more time spent doing moderate-to-vigorous intensity physical activity (MVPA) from age 12 years was associated with stronger hips at age 25 years, whereas time spent in light intensity activity was less clearly associated with adult hip strength.

Peak bone mass occurs in young adulthood and is considered to be a marker of the risk of fracture and osteoporosis in later life. Hip fractures make up a large proportion of the osteoporosis disease burden.

Researchers looked at data from healthy individuals who had physical activity measured up to four times using accelerometers worn as part of clinical assessments at ages 12, 14, 16 and 25 years. An accelerometer is a device that measures a person's movement for the whole time they wear it.

Researchers also found evidence to suggest that adolescent MVPA was more important than MVPA in adulthood, and that MVPA in early adolescence may be more important than in later adolescence. There was also some evidence that higher-impact activity (consistent with jumping; assessed once in a subsample in late adolescence using a custom accelerometer) was related to stronger hips at age 25.

Dr Ahmed Elhakeem, lead author and Senior Research Associate in Epidemiology, said: "The unique availability of repeated accelerometer assessments over many years, beginning at age 12, within the Children of the 90s cohort allowed us to describe the trajectory of time spent in different physical activity intensities through early life and to examine how this might relate to adult hip strength. The results highlight adolescence as a potentially important period for bone development through high intensity exercise, which could benefit future bone health and prevent osteoporosis in later life. We have also confirmed other studies showing that levels of MVPA decline through adolescence. Our findings show it is really important to support young people to remain active at this age."

Francesca Thompson, Clinical and Operations Director at the Royal Osteoporosis Society (ROS), said: "The ROS is working closely at the moment with Public Health England to review the importance of exercise for bone health in children. The findings from this study are welcome as they provide further evidence that children need to be doing moderate to vigorous intensity physical activity during their early adolescence to maximise bone strength in later life and reduce the risk of painful fractures. Supporting and encouraging young people to be more physically active needs to be a priority for bone as well as general health."

Credit: 
University of Bristol

How protein protects against fatty liver

image: Microscopic images of liver biopsies of three individuals with different degrees of liver fat accumulation

Image: 
DIfE

Non-alcoholic fatty liver disease is the most common chronic liver disease in the world, with sometimes life-threatening consequences. A high-protein, calorie-reduced diet can cause the harmful liver fat to melt away - more effectively than a low-protein diet. A new study by DIfE/DZD researchers published in the journal Liver International shows which molecular and physiological processes are potentially involved.

Causes and consequences of a non-alcoholic fatty liver

Non-alcoholic fatty liver disease is characterized by a build-up of fat in the liver and is often associated with obesity, type 2 diabetes, high blood pressure and lipid disorders. If left untreated, fatty liver can lead to cirrhosis, with life-threatening consequences. The causes of the disease range from an unhealthy lifestyle - that is, eating too many high-fat, high-sugar foods and getting too little exercise - to genetic components. In previous studies, the research team led by PD Dr. Olga Ramich and Professor Andreas Pfeiffer from the German Institute of Human Nutrition Potsdam-Rehbruecke (DIfE) had already observed a positive effect of a high-protein diet on liver fat content. "The new results now give us deeper insights into how the high-protein diet works," said Ramich, head of the research group Molecular Nutritional Medicine at DIfE.

High-protein diet is more effective than low-protein diet

For the current study, the research team led by Ramich and Pfeiffer investigated how the protein content of food influences the amount of liver fat in obese people with non-alcoholic fatty liver. For this, 19 participants followed either a high-protein or a low-protein diet for three weeks. Subsequently, surgery to treat obesity (bariatric surgery) was carried out and liver samples were collected.

Analysis of the samples showed that a calorie-reduced, high-protein diet decreased liver fat more effectively than a calorie-reduced, low-protein diet: while the liver-fat content in the high-protein group decreased by around 40 percent, the amount of fat in the liver samples of the low-protein group remained unchanged. The study participants in both groups lost a total of around five kilograms. "If the results continue to be confirmed in larger studies, the recommendation for an increased intake of protein together with a healthy low-fat diet as part of an effective fatty liver therapy could find its way into medical practice," said Andreas Pfeiffer, head of the Research Group Clinical Nutrition/DZD at DIfE and the Clinic for Endocrinology in the Charité -- Universitätsmedizin Berlin, Campus Benjamin Franklin.

Molecular fat absorption mechanisms

The researchers assume that the positive effect of the high-protein diet is mainly due to the fact that the uptake, storage and synthesis of fat is suppressed. This is indicated by extensive genetic analyses of the liver samples that Professor Stephan Herzig and his team at Helmholtz Zentrum München conducted. According to these analyses, numerous genes that are responsible for the absorption, storage and synthesis of fat in the liver were less active after the high-protein diet than after the low-protein diet.

Unexpected results

In addition, Olga Ramich's research group, together with the Department of Physiology of Energy Metabolism at DIfE, investigated the function of the mitochondria. "Mitochondrial activity was very similar in both groups. That surprised us. We originally assumed that the high-protein diet would increase mitochondrial activity and thus contribute to the degradation of liver fat," said Department Head Professor Susanne Klaus. The researchers were also surprised that serum levels of fibroblast growth factor 21 (FGF21) were lower after the liver-fat-reducing high-protein diet than after the low-protein diet. "FGF21 is known to have beneficial effects on metabolic regulation. Further studies will be necessary to show why this factor was reduced by the high-protein diet despite its positive effects," said Ramich. Furthermore, autophagy activity was lower in liver tissue after the high-protein diet than after the low-protein diet. "Lipid degradation via 'lipophagy', a special form of autophagy, therefore does not appear to be involved in the breakdown of liver fat in the high-protein diet."

As a next step, Ramich and Pfeiffer intend to follow up their findings about the mechanisms involved and thus gain new insights into the mode of action of targeted dietary intervention strategies.

Credit: 
Deutsches Zentrum fuer Diabetesforschung DZD

Native Hawaiian tiger cowries eat alien invasive species

image: Hawaiian tiger cowries feeding on the Orange keyhole sponge (Mycale grandis)

Image: 
Leon Weaver

Researchers at the University of Hawai'i (UH) at Mānoa's Hawai'i Institute of Marine Biology (HIMB) have just discovered that the Hawaiian tiger cowrie (Leho-kiko in Hawaiian) is a voracious predator of alien sponges such as the Orange Keyhole sponge, which can overgrow native corals and has become a concern as it spreads across reefs within Kāne'ohe Bay. In the study, published recently, researchers found that each cowrie eats more than half its body weight in sponges each week.

"We found that cowries ate most species of alien sponges that we offered them, and that a single snail can consume an entire sponge the size of your fist in roughly a week," said Jan Vicente, lead author of the study and postdoctoral researcher at HIMB in the UH Mānoa School of Ocean and Earth Science and Technology (SOEST).

Alien invasive species, such as killer algae in the Mediterranean or lionfish in the Caribbean, can devastate native ecosystems. Hawai'i ranks among the highest in the world for both the number of marine alien invasive species and the success of those invaders in taking over space from native species. Prevention, early detection, and rapid removal are the best tools to prevent impacts from alien species, because once invaders become established, efforts to eradicate them are expensive and often unsuccessful.

One relatively cheap option for management of alien species is known as biocontrol, in which a natural predator of the alien species is also introduced to control the invader. Hawai'i is also home to some spectacular failures of past biocontrol efforts, such as the mongoose and the wolf snail, which created new problems rather than solving existing ones. However, not all biocontrol efforts need to use one alien species to control another, because in some cases a native species can serve that critical role to control an alien invader.

Cowries have been overharvested throughout the Pacific and have experienced a precipitous decline in most populated locations. The Hawaiian tiger cowrie is more valuable than most because it is not found anywhere else and reaches a much larger size. Tiger cowries, like most cowrie shells, have been in high demand in Indo-Pacific trade for over 1,000 years, but harvest of the Hawaiian cowrie is unreported and unregulated by the state of Hawai'i.

"Our study shows that cowries may be able to control alien sponges if they were common enough, but we hear from the Native Hawaiian community that they are far less common today than in the past. We need to protect these culturally and ecologically important animals from overharvest. This study shows that Hawaiian tiger cowries could help control invasive species, so maybe we should stop people from killing them to sell as ornaments in the shell trade." said Vicente.

Co-author and HIMB professor Rob Toonen was quick to add, "If these snails can do the work of controlling alien invasive species for us, it will save the state both time and money to keep them in the ocean rather than on display for someone's shelf."

Credit: 
University of Hawaii at Manoa

Recent global warming trends are inconsistent with very high climate sensitivity

Research published this week in Earth System Dynamics reports that the most sensitive climate models overestimate global warming during the last 50 years.

Three scientists from the University of Exeter studied the output of complex climate models and compared them to temperature observations since the 1970s.

Recent developments in cloud modelling have produced models that portray very large sensitivity to rising greenhouse gas concentrations.

A subset of models even showed that a doubling of CO2 could lead to over 5°C of warming, calling into question whether the goals of the Paris Agreement are achievable even if nations do everything they can.

The lead author of the study, PhD candidate Femke Nijsse from the University of Exeter, said: "In evaluating the climate models we were able to exploit the fact that, thanks to clean air regulation, air pollution in the form of climate-cooling aerosols has stopped increasing worldwide, allowing the greenhouse gas signal to dominate recent warming."

The amount of warming that occurs after CO2 concentrations in the atmosphere are doubled is called the equilibrium climate sensitivity.

The study found that based on the latest generation of climate models the equilibrium climate sensitivity is likely between 1.9 and 3.4 °C.

Co-author Mark Williamson, of Exeter's Global Systems Institute, added: "Global warming since 1970 also provides even better guidance on the rate of climate change in the future.

"We find a likely range for the 'Transient Climate Response' of 1.3-2.1oC, whether we use the latest models or the previous generation of models."

The new study is only one piece of the puzzle.

A recent review paper found that low estimates of climate sensitivity can be excluded because they are, in general, not consistent with climate changes in Earth's past.

Co-author Professor Peter Cox explains the significance of these findings: "It is good to see that studies are now converging on a range of equilibrium climate sensitivity, and that both high and low values can be excluded.

"For over forty years, climate scientists have tried to pinpoint this quantity and it seems that we're finally getting close."

Credit: 
University of Exeter

Researchers discover novel molecular mechanism that enables conifers to adapt to winter

In the boreal forest during late winter, freezing temperatures are typical, but at the same time the sun can already shine very brightly. This combination is especially dangerous to evergreen plants such as conifers. The chlorophyll pigment-proteins in their needles absorb light, but the cold halts enzyme activity, preventing the plants from using the light for photosynthesis. This exposes the cells to damage.

Dissipating the excess light energy as heat, the so-called non-photochemical quenching, is a common, fast, and dynamic but intermittent regulation mechanism in all plants and algae, and it is employed to protect the plant from damage caused by high light intensity. However, the combination of freezing temperatures and high light intensity results in a particular form of quenching in conifers: sustained non-photochemical quenching.

Researchers from the University of Turku, Finland, discovered an essential part of the mechanism associated with sustained non-photochemical quenching in conifers. The discovery is significant because the mechanism in question is still poorly understood.

"We collected needle samples from nature for four years and studied spruce branches in simulated conditions mimicking late winter. On the basis of biophysical and molecular biology analyses, we could show that the triply phosphorylated LHCB1 isoform and phospho-PSBS protein in chloroplast appear to be prerequisites for the development of sustained non-photochemical quenching that safely dissipates absorbed light energy as heat," say Doctoral Candidate Steffen Grebe and Postdoctoral Researcher Andrea Trotta from the Molecular Plant Biology unit of the Department of Biochemistry at the University of Turku.

In the phosphorylation of a protein, a phosphoryl group is added to certain amino acids, which is a common mechanism for protein regulation in cells. The phosphorylation of the proteins discovered in spruce has not been described in science before.

The researchers believe that together with the limited photoinhibition of photosystem II, the phosphorylations lead to structural changes in pigment-proteins so that the needles can effectively dissipate the excess light energy.

Spruce genome sequencing enabled novel research

The regulation mechanisms of photosynthesis have been previously studied on a molecular level mainly on fast-growing species regularly used in plant biology, such as thale cress (Arabidopsis thaliana) and the alga Chlamydomonas reinhardtii. However, it is not possible to study the winter acclimatisation with these plants and easily transfer the knowledge to conifer species. The molecular biology research of conifers became possible after the spruce genome sequencing was published in 2013.

"The spruce genome is approximately ten times larger than that of humans. The genome sequencing of spruce led by our long-time partner, Professor Stefan Jansson from the Umeå University, enabled the molecular photosynthesis study we have now conducted in Turku, says Principal Investigator," Academician Eva-Mari Aro.

The new information on spruces' adaptation to their environment can be used to assess the impact of climate change on the photosynthesis of conifers and their carbon sink capacity, as photosynthesis in conifer forests is one of the most important carbon sinks on a global scale.

Credit: 
University of Turku

A stepping stone for measuring quantum gravity

image: Space-time diagram of quantum state interference. By reversing the non-zero internal spin state at times τ1 and τ2, the particle can be made to follow the blue (spin +/-1) and orange (spin 0) paths. In doing so, the states reach a maximum spatial superposition size Δx before being brought back to interfere at time τ3.

Image: 
R. Marshman et al

A group of theoretical physicists, including two physicists from the University of Groningen, have proposed a 'table-top' device that could measure gravity waves. However, their actual aim is to answer one of the biggest questions in physics: is gravity a quantum phenomenon? The key element for the device is the quantum superposition of large objects. Their design was published in New Journal of Physics on 6 August.

Already in the preprint stage, the paper, written by Ryan J. Marshman, Peter F. Barker and Sougato Bose (University College London, UK), Gavin W. Morley (University of Warwick, UK) and Anupam Mazumdar and Steven Hoekstra (University of Groningen, the Netherlands), was hailed as a new method to measure gravity waves. Instead of the current kilometres-sized LIGO and VIRGO detectors, the physicists working in the UK and the Netherlands propose a table-top detector. This device would be sensitive to lower frequencies than the current detectors, and it would be easy to point it at specific parts of the sky; in contrast, the current detectors only see a fixed part.

Diamond

The key part of the device is a tiny diamond, just a few nanometres in size. 'In this diamond, one of the carbons is replaced by a nitrogen atom,' explains assistant professor Anupam Mazumdar. This atom introduces a free space in the valence band, which can be filled with an extra electron. Quantum theory says that when the electron is irradiated with laser light, it can either absorb or not absorb the photon energy. Absorbing the energy would alter the electron's spin, a magnetic moment that can be either up or down.

'Just like Schrödinger's cat, which is dead and alive at the same time, this electron spin does and does not absorb the photon energy, so that its spin is both up and down.' This phenomenon is called quantum superposition. Since the electron is part of the diamond, the entire object, with a mass of about 10^-17 kilograms, which is huge for quantum phenomena, is in quantum superposition.

'We have a diamond that has up spin and down spin at the same time,' explains Mazumdar. By applying a magnetic field, it is possible to separate the two quantum states. When these quantum states are brought together again by turning off the magnetic field, they will create an interference pattern. 'The nature of this interference depends on the distance the two separate quantum states have travelled. And this can be used to measure gravity waves.' These waves are contractions of space, so that their passing affects the distance between the two separated states and thus the interference pattern.

Missing link

The paper shows that this set-up could indeed detect gravity waves. But that is not what Mazumdar and his colleagues are really interested in. 'A system in which we can obtain quantum superposition of a mesoscopic object such as the diamond, and for a reasonable length of time, would be a real breakthrough,' Mazumdar says. 'It would allow all kinds of measurements to be taken, and one of those could be used to determine whether gravity itself is a quantum phenomenon.' Quantum gravity has been the 'missing link' in physics for nearly a century.

In a paper published in 2017 (1), Mazumdar and his long-time collaborator Sougato Bose, together with several colleagues, suggested that entanglement between two mesoscopic objects could be used to find out whether gravity itself is a quantum phenomenon. Simply put: entanglement is a quantum phenomenon, so when two objects that interact only through gravity show entanglement, this proves that gravity is a quantum phenomenon.

Technology

'In our latest paper, we describe how to create mesoscopic quantum superposition. With two of these systems, we were able to show entanglement.' However, as they noticed during their work, a single such system would be sensitive to gravitational waves, and this became the focus of the New Journal of Physics paper.

'The technology to build these systems could take a few decades to develop,' Mazumdar acknowledges. A vacuum of 10^-15 Pascal is required, while the operating temperature should be as low as possible, near absolute zero (-273 °C). 'Technology to achieve either high vacuum or low temperature is available, but we need the technology to achieve both at the same time.' Furthermore, the magnetic field must be constant. 'Any fluctuation would collapse the quantum superposition.'

Freefall

The reward for creating this kind of system would be great. 'It could be used for all kinds of measurements in fields such as ultra-low energy physics or quantum computing, for example.' And it could, of course, be used to determine whether gravity is a quantum phenomenon. Mazumdar, Bose and colleagues have just uploaded another preprint (2) in which they describe how this experiment could be performed. 'To ensure that the only interaction between the two entangled objects is the gravity between them, the experiment should be done in free fall,' explains Mazumdar. With visible enthusiasm, he describes a one-kilometre long drop shaft in a deep mine, to reduce interference. Two entangled mesoscopic quantum systems should be dropped repeatedly to obtain a reliable measurement. 'I think this can be done in my lifetime. And the result would finally resolve one of the biggest questions in physics.'

Credit: 
University of Groningen

Research brief: Bee neighborly -- sharing bees helps more farmers

image: Researchers combined wild bee ecology, crop values, and land ownership patterns to reveal the benefits to landowners of restoring wild bee habitats in agricultural areas.

Image: 
Eric Lonsdorf / Natural Capital Project, University of Minnesota Institute on the Environment

Many farmers are used to sharing big equipment -- like tractors and other costly machinery -- with neighboring farms. Sharing cuts costs, lowers the farmer's debt load, and increases community wellbeing. But big machinery might not be the only opportunity for farmers to reap the benefits of cost-sharing with their neighbors. New research suggests that the concept could also be applied to a more lively kind of agricultural resource -- wild bees.

"Understandably, farmers with highly valuable crops don't always want to give up plantable space to create habitats for wild bees, especially if their crops could be pollinated by a neighbor's bees for free," said Eric Londsorf, lead scientist for the Natural Capital Project at the University of Minnesota's Institute on the Environment and lead author on the paper. "What we're proposing is that those farmers providing bee habitat could be rewarded for doing so, to the benefit of all."

The research, published in People and Nature, was done in collaboration with scientists at the University of Vermont. The team applied their work to the fields of California's Central Valley, one of the nation's top agricultural areas. They combined wild bee ecology, crop values, and land ownership patterns to reveal the benefits to landowners of restoring bee habitats. In Yolo County, where bee-dependent crops like berries and nuts are worth thousands of dollars per acre, every inch of land counts. 

"Bees don't pay attention to land boundaries," said Lonsdorf. "In the current system, farmers who choose to conserve habitat for bees on their lands are rarely recognized for the pollination benefits that they're also providing to their neighbors." 

Creating wild bee habitat on farms can be as simple as letting a small area of land remain wild, which provides bees with a familiar sanctuary amidst rows of crops. But there is little incentive for farmers to create space on their own lands if the costs are greater than the benefits they'd receive from the bees, which means that few farmers choose to sacrifice precious planting ground for bee habitat. "We know that wild bees are essential pollinators for many of our crops, but they also require space nearby the crops to live, so we need to know where and how to invest." 

The researchers found that if 40 percent of landowners were to provide space for wild bee habitat, those landowners would lose one million dollars themselves but generate nearly two and a half million dollars for their neighbors. If the landowners were able to work together so that those who benefitted paid into the cost of the bee habitat, then everyone could come out ahead, as the toy calculation below illustrates. Especially relevant in the equation are the size and clustering patterns of neighboring farms, which the researchers say are often overlooked in this kind of analysis.
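The arithmetic behind that conclusion can be sketched with the aggregate figures quoted above. The transfer rule below is purely illustrative and is not the mechanism modelled in the paper.

```python
# Toy cost-sharing calculation using the aggregate figures from the article.
# The 25% premium transfer rule is an illustrative assumption, not the paper's mechanism.

habitat_cost_to_providers = 1.0e6      # dollars lost by farmers who set aside land for bee habitat
pollination_gain_to_neighbors = 2.5e6  # dollars gained by neighboring farms from those bees

# Without any payment, providers are out of pocket even though the region gains overall.
regional_net = pollination_gain_to_neighbors - habitat_cost_to_providers
print(f"regional net benefit: ${regional_net:,.0f}")

# Suppose the beneficiaries reimburse the habitat cost plus a 25% premium.
transfer = habitat_cost_to_providers * 1.25
providers_net = transfer - habitat_cost_to_providers
neighbors_net = pollination_gain_to_neighbors - transfer
print(f"providers net: ${providers_net:,.0f}, neighbors net: ${neighbors_net:,.0f}")
```

With any payment between the habitat cost and the pollination gain, both groups end up ahead, which is the cooperation the authors argue policy should encourage.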

"This is about tackling the tragedy of the commons, the idea that what's good for society isn't always what's good for a particular individual," said co-author Taylor Ricketts, Director of the Gund Institute for Environment at the University of Vermont and co-founder of the Natural Capital Project. "This research shows how and where working together can really increase the benefits for everyone, and just as important: where it won't." 

Ideally, the researchers would like to see their work help inform policies that encourage cooperation and resource sharing amongst farmers. They suggest that small groups of neighboring farmers come together, supported by local or national agricultural agencies like the United States Department of Agriculture, to decide how to allocate their land and bee habitats. This common-pool resource framework can be informed by the kind of analysis done by Lonsdorf and his colleagues so that farmers are able to make decisions that result in the highest benefits for the group.  

In an increasingly volatile economy where small savings can go a long way, investing in shared natural resources like pollinators could be a good choice for small-scale farmers who grow high-value crops. "It's an opportunity to overcome the tragedy of the commons," said Lonsdorf. "And our goal is to spark the conversation."

Credit: 
University of Minnesota