
Heart risk raised by sitting in front of the TV, not by sitting at work, finds study

NEW YORK, NY (June 26, 2019)--Sitting for long periods of time has been linked to increased risk of cardiovascular disease and early death, but a new study suggests that not all types of sitting are equally unhealthy.

The study, led by researchers at Columbia University Vagelos College of Physicians and Surgeons, found that leisure-time sitting (while watching TV)--but not sitting at work--was associated with a greater risk of heart disease and death among the study's more than 3,500 participants. The study also found that moderate-to-vigorous exercise may reduce or eliminate the harmful effects of sedentary television watching.

"Our findings show that how you spend your time outside of work may matter more when it comes to heart health," study author Keith M. Diaz, PhD, assistant professor of behavioral medicine at Columbia University Vagelos College of Physicians and Surgeons and a certified exercise physiologist. "Even if you have a job that requires you to sit for long periods of time, replacing the time you spend sitting at home with strenuous exercise could reduce your risk of heart disease and death."

The study was published online today in the Journal of the American Heart Association.

Background

A growing body of research shows that people who are sedentary--especially those who sit for long, uninterrupted periods of time--have a higher risk of cardiovascular disease and death.

But most previous studies did not follow people over time, making it difficult to draw conclusions about the relationship between sedentary behavior and health risk. These studies also included mainly people of European descent rather than African Americans, a group that has a higher risk of heart disease compared with whites. In addition, earlier studies typically measured sedentary time with activity monitors, which cannot distinguish between different types of sedentary behavior.

What the Study Found

The new study followed 3,592 people, all African Americans, living in Jackson, Miss., for almost 8.5 years. The participants reported how much time they typically spent sitting while watching TV and during work. They also reported how much time they spent exercising in their down time.

The participants who had logged the most TV-viewing hours (4 or more hours a day) had a 50% greater risk of cardiovascular events and death compared to those who watched the least amount of TV (less than 2 hours a day).

In contrast, those who sat the most at work had the same health risks as those who sat the least.

Even for the most dedicated TV watchers, moderate to vigorous physical activity--such as walking briskly or doing aerobic exercise--reduced the risk of heart attacks, stroke, or death. No increased risk of heart attack, stroke, or death was seen in people who watched TV for 4 or more hours a day and engaged in 150 minutes or more of exercise a week.

Why Does the Type of Sitting Matter?

In a previous study, Diaz found that excessive sitting is linked to worse health outcomes, and even more so when sitting occurs in lengthy, uninterrupted bouts.

"It may be that most people tend to watch television for hours without moving, while most workers get up from their desk frequently," Diaz says. "The combination of eating a large meal such as dinner and then sitting for hours could also be particularly harmful."

"More research is needed, but it's possible that just taking a short break from your TV time and going for a walk may be enough to offset the harm of leisure-time sitting," adds Diaz. "Almost any type of exercise that gets you breathing harder and your heart beating faster may be beneficial."

And although occupational sitting was less problematic, Diaz notes that the same approach to movement applies at work. "We recognize that it isn't easy for some workers, like truck drivers, to take breaks from sitting, but everyone else should make a regular habit of getting up from their desks. For those who can't, our findings show that what you do outside of work may be what really counts."

The researchers suspect that the study's findings may be applicable to anyone who is sedentary, even though the study focused on African Americans.

What's Next

In future studies, Diaz will examine why TV watching may be the most harmful sedentary behavior and whether the timing of sedentary behavior around dinner time could be a contributing factor.

Credit: 
Columbia University Irving Medical Center

Tool searches EHR data to find child leukemia patients for clinical studies

image: Charles A. Phillips, M.D., is a pediatric oncologist at Children's Hospital of Philadelphia.

Image: 
Children's Hospital of Philadelphia

Researchers who analyzed data in the electronic health records (EHR) of children seen by hematology/oncology specialists at three large medical centers have developed an algorithm to accurately identify appropriate pediatric oncology patients for future clinical studies. By expediting and refining the selection of patients for research, the researchers aim to ultimately improve outcomes for a variety of pediatric cancers.

"Accurately identifying patient cohorts is key to designing better research," said study leader Charles A. Phillips, MD, a pediatric oncologist at Children's Hospital of Philadelphia (CHOP). "Because not every patient in large datasets would be appropriate for a clinical study, having a tool to separate signals from the noise will help researchers leverage data to design pragmatic, real-world studies in patients with a range of different cancers. For instance, we could better evaluate nausea medicines or detect factors that influence the rates of infections in patients with central line placements."

Phillips and colleagues published their study online June 17, 2019 in Pediatric Blood and Cancer.

The study team analyzed EHR-derived data in PEDSnet, a national pediatric clinical research network, from 2011 to 2016 at three large pediatric hospital systems: CHOP, Children's Hospital Colorado and Seattle Children's Hospital. The EHR data included diagnoses, procedures, medications, laboratory tests and provider specialties.

In contrast to the narrowly defined eligibility requirements and smaller numbers of patients in clinical trials testing drugs in specific subtypes of cancers, said Phillips, studies of supportive care issues in patients with a broader range of cancer diagnoses may draw on already available data in EHR, but accuracy in patient selection is crucial.

"We found that over half of the children referred to an inpatient or outpatient clinic with a leukemia or lymphoma diagnosis in their charts did not actually have cancer," said Phillips. Some of the patients were survivors with a remote history of cancer, others were seen to rule out a cancer diagnosis, others were miscoded on the charts." He added that a single, isolated diagnostic code may not be reliable, in contrast to multiple diagnoses.

Therefore, in this study, Phillips and colleagues created a "computable phenotype," automating their search algorithm to check off a series of boxes: starting with at least three visits to a pediatric hematologist-oncologist (27,450 patients), then at least one leukemia or lymphoma diagnosis, which narrowed the number to 4,535. A further screen required the three specialist visits, at least two diagnostic codes and at least two administrations of chemotherapy--which winnowed the total to 1,825 patients. The final group of 1,825 was the computable phenotype curated cohort--suitable as a clinical study group.
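
As a rough illustration of how such a computable phenotype can be applied to tabular EHR extracts, the sketch below implements the final combined screen in Python with pandas. The column names and toy data are hypothetical, not the PEDSnet schema, and this is not the authors' code.

```python
# Illustrative sketch of the final combined screen described above (hypothetical
# column names and toy data; not the authors' code or the PEDSnet schema).
import pandas as pd

def curate_cohort(ehr: pd.DataFrame) -> pd.DataFrame:
    """Keep patients meeting all three criteria of the computable phenotype."""
    return ehr[
        (ehr["heme_onc_visits"] >= 3)               # >= 3 hematology/oncology visits
        & (ehr["leukemia_lymphoma_dx_codes"] >= 2)  # >= 2 leukemia/lymphoma diagnosis codes
        & (ehr["chemo_administrations"] >= 2)       # >= 2 chemotherapy administrations
    ]

# Toy example: only patient 1 satisfies every criterion.
ehr = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "heme_onc_visits": [5, 3, 1],
    "leukemia_lymphoma_dx_codes": [3, 1, 0],
    "chemo_administrations": [4, 0, 0],
})
print(curate_cohort(ehr)["patient_id"].tolist())  # -> [1]
```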

When reviewers analyzed that cohort's full medical records in masked reviews, the computable phenotype showed 100 percent sensitivity and 99 to 100 percent specificity in accurately classifying the patients as having pediatric leukemia or lymphoma.

"This algorithm can accurately and efficiently narrow down the number of medical charts researchers need to review to identify a patient cohort for subsequent clinical studies," said Phillips. Although he added that further studies may be needed to refine the algorithm to meet their study-specific needs, it offers a potential new tool to clinical researchers in improving outcomes for children with leukemia or lymphoma, who represent about 40 percent of all U.S. pediatric cancers.

Credit: 
Children's Hospital of Philadelphia

Being a 'morning person' linked to lower risk of breast cancer

Being a morning person (popularly known as larks) is associated with a lower risk of developing breast cancer than being an evening person (popularly known as owls), finds a study published by The BMJ today.

Sleeping longer than the recommended 7-8 hours a night may also carry an increased risk, the results suggest.

The authors have previously posted a non-peer reviewed, unedited version of this research paper on a recognised preprint server* and presented it at the NCRI Cancer Conference in November 2018.

One in seven women will develop breast cancer at some stage in their lives. Previous studies have shown a link between night shift work and risk of breast cancer, thought to be due to disrupted sleep patterns, light exposure at night, and other lifestyle factors. But there has been much less research into the potential effects of sleep habits on breast cancer risk.

So an international research team set out to examine whether certain sleep traits could have a direct (causal) effect on risk of developing breast cancer.

Using a technique called Mendelian randomisation, they analysed genetic variants associated with three particular sleep traits - morning or evening preference (chronotype), sleep duration, and insomnia - for 180,216 women in the UK Biobank study and 228,951 women in the Breast Cancer Association Consortium (BCAC) study.

Analysing genetic information in this way avoids some of the problems that afflict traditional observational studies, making the results less prone to unmeasured (confounding) factors, and therefore more likely to be reliable.

An association that is observed using Mendelian randomisation strengthens the inference of a causal relationship.
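
For readers unfamiliar with the technique, the sketch below shows the core arithmetic behind one common Mendelian randomisation estimator, the inverse-variance weighted average of per-variant Wald ratios. The numbers are toy values for illustration, not estimates from UK Biobank or BCAC, and this is not the study's analysis pipeline.

```python
# Minimal illustration of two-sample Mendelian randomisation: combine per-variant
# Wald ratios with inverse-variance weights. beta_exp / beta_out stand for
# hypothetical SNP-chronotype and SNP-breast-cancer association estimates;
# se_out are the outcome standard errors. Toy values only.
import numpy as np

def ivw_estimate(beta_exp, beta_out, se_out):
    """Inverse-variance weighted causal effect estimate."""
    beta_exp, beta_out, se_out = map(np.asarray, (beta_exp, beta_out, se_out))
    wald = beta_out / beta_exp           # per-variant causal estimates
    weights = (beta_exp / se_out) ** 2   # inverse variance of each Wald ratio
    return np.sum(wald * weights) / np.sum(weights)

print(ivw_estimate([0.10, 0.08, 0.12], [-0.02, -0.015, -0.03], [0.01, 0.01, 0.012]))
```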

In observational analysis of UK Biobank data, morning preference was associated with a slightly lower risk of breast cancer (one fewer woman per 100) than evening preference, whereas there was little evidence for an association with sleep duration or insomnia symptoms.

However, the authors stress that this represents differences at the extreme ends of the scale, and that the extent of effect is likely to be smaller than that of other known risk factors for breast cancer, such as BMI and alcohol intake.

Mendelian analysis of UK Biobank data provided some supportive evidence for a protective effect of morning preference on breast cancer risk, but imprecise estimates for sleep duration and insomnia symptoms.

Mendelian analysis from BCAC also supported a protective effect of morning preference, and showed a potential harmful effect of longer sleep duration (more than the recommended 7-8 hours) on breast cancer, whereas evidence for insomnia symptoms was inconsistent.

The researchers point to some limitations, for example the study partly relied on self-reported sleep measures and was restricted to women of European ancestry, so findings may not be applicable to other groups.

However, they used several methods to assess data from two high quality resources, and took account of established and potential risk factors. Results also remained largely unchanged after further sensitivity analyses.

As such, the researchers say their findings "provide strong evidence for a causal effect of chronotype on breast cancer risk." Further work to uncover possible reasons for the associations between sleep disruption and breast cancer is required, they add. Nonetheless, these findings "have potential implications for influencing sleep habits of the general population in order to improve health."

In a linked editorial, Professor Eva Schernhammer from the University of Vienna says these findings "identify a need for future research exploring how the stresses on our biological clock can be reduced."

This offers a tremendous opportunity for preserving good health, achieving healthy aging, and, more specifically, for developing new personalised strategies for reducing the risk of chronic diseases associated with the circadian system, she adds.

This line of research "could also help to align working hours with chronotype--to more closely match externally imposed timing with individual diurnal preference, especially in the working population," she concludes.

Credit: 
BMJ Group

The fundamental physics of frequency combs sheds light on nature's problem-solving skills

Nature has a way of finding optimal solutions to complex problems. For example, despite the billions of ways for a single protein to fold, proteins always fold in a way that minimizes potential energy. Slime mold, a brainless organism, always finds the most efficient route to a food source, even when presented with an obstacle. A jump rope, when held at both ends, always settles into the same shape, a curve known as a catenary.

This kind of optimization is explained by what's known as a variational principle: any other deformation - or variation - of the shape found by the protein, mold or jump rope would require more energy.
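
For the jump-rope example, the variational statement can be written down explicitly. The following is the standard textbook form (not taken from the paper), where rho is the rope's mass per unit length and g is gravitational acceleration.

```latex
% Among all shapes y(x) of fixed length L hung between two points, the rope
% minimizes its gravitational potential energy; the minimizer is a catenary.
\[
  \min_{y(x)} \; E[y] \;=\; \rho g \int_{x_1}^{x_2} y \,\sqrt{1 + y'^2}\; dx
  \qquad \text{subject to} \qquad
  \int_{x_1}^{x_2} \sqrt{1 + y'^2}\; dx \;=\; L ,
\]
\[
  \text{with solution} \qquad y(x) \;=\; a \cosh\!\left(\frac{x - x_0}{a}\right) + C .
\]
```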

Now, researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have found that some lasers use the same principle. The research is described in Physical Review Letters.

Frequency combs are widely-used, high-precision tools for measuring and detecting different frequencies -- a.k.a. colors -- of light. Unlike conventional lasers, which emit a single frequency, these lasers emit multiple frequencies in lockstep, evenly spaced to resemble the teeth of a comb.

When a laser produces a frequency comb, it emits waves of light that repeat themselves periodically in time. Depending on the parameters of the comb, these waves can either have constant intensity while varying in color, or look like short pulses of light that build and drop in intensity.

Researchers know how combs produce pulses, but how so-called frequency-modulated lasers maintain a constant intensity in the face of changing frequencies has been a long-standing puzzle.
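
To make the distinction concrete, the numerical sketch below (not the model used in the paper; all parameters arbitrary) sums the same set of equally spaced comb teeth with two different phase profiles: identical phases give short, intense pulses, while a quadratic ("chirped") phase profile gives a nearly flat, frequency-modulated intensity.

```python
# Toy demonstration that the phase relationship among comb teeth decides whether
# the output is pulsed or nearly constant in intensity (arbitrary parameters,
# not the experiment's model).
import numpy as np

n_modes = 20                     # number of comb teeth
spacing = 2 * np.pi              # angular-frequency spacing between teeth
t = np.linspace(0, 2.0, 4000)    # two repetition periods

def intensity(phases):
    field = sum(np.exp(1j * (k * spacing * t + phases[k])) for k in range(n_modes))
    return np.abs(field) ** 2

pulsed = intensity(np.zeros(n_modes))                              # teeth in phase
chirped = intensity((np.pi / n_modes) * np.arange(n_modes) ** 2)   # quadratic phase

print("peak/mean intensity, in-phase comb:", pulsed.max() / pulsed.mean())    # ~20
print("peak/mean intensity, chirped comb: ", chirped.max() / chirped.mean())  # much closer to 1
```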

The team of researchers, led by Federico Capasso, the Robert L. Wallace Professor of Applied Physics and Vinton Hayes Senior Research Fellow in Electrical Engineering, was able to reconstruct, on a time scale of a trillionth of a second, the waveform emitted by light sources known as quantum cascade lasers, which are widely used in spectroscopy and sensing. They found that the lasers choose to emit light waves in a way that not only suppresses intensity fluctuations -- leading to a constant intensity in time -- but also maximizes the power output.

"We discovered that a frequency-modulated laser can adjust parameters by itself, similar to a DJ turning knobs on a music synthesizer, to minimize fluctuations of the emitted intensity wave," said Marco Piccardo, a postdoctoral fellow at SEAS and first author of the paper. "Turning all these knobs in the right way is not an easy task. In producing a nearly-flat intensity waveform, the frequency-modulated laser has solved a complex optimization problem, performing just like an analog computer."

"This discovery unravels the physics of a promising frequency comb technology," said Capasso. "Benefitting by a minimal intensity modulation at the laser output, these devices could rival conventional ultra-short pulse mode-locked lasers in spectroscopy applications."

Credit: 
Harvard John A. Paulson School of Engineering and Applied Sciences

Ancient DNA analysis adds chapter to the story of Neanderthal migrations

After managing to obtain DNA from two 120,000-year-old European Neanderthals, researchers report that these specimens are more genetically similar to Neanderthals that lived in Europe 80,000 years later than they are to a Neanderthal of similar age found in Siberia. The findings, which reveal a stable, 80,000-year ancestry for European Neanderthals, also suggest that this group may have migrated east and replaced some Siberian Neanderthal populations. The work begins to unravel the early history of Neanderthals, which had been inaccessible because no DNA predating 100,000 years ago was available.

Bone samples and genetic evidence indicate that Neanderthals lived in Europe and Central Asia until about 40,000 years ago. Recent studies have shown that those last Neanderthals all belonged to a single group, descended from a common ancestor who lived 97,000 years ago. However, a Neanderthal dated to 90,000 years ago found in Denisova Cave in modern-day Siberia appears to be more closely related to those late Neanderthals than to the so-called Altai Neanderthal found in the same cave but dated to 120,000 years ago. This suggests that there had been an early Neanderthal migration into Siberia, followed by a later migration from Europe that replaced the earlier population.

To clarify how this happened, Stéphane Peyrégne and colleagues obtained nuclear DNA from Western European Neanderthals who lived about 120,000 years ago -- one from Scladina Cave in Belgium (called Scladina), and the other from Hohlenstein-Stadel Cave in Germany (HST). Using advanced techniques to account for microbial and present-day human DNA contamination, the study authors found that Scladina and HST were members of a population in Western Europe that gave rise to all currently identified Neanderthals except the Altai Neanderthal. This suggests that the population to which Scladina and HST belonged lived in Western Europe contemporaneously with the Altai population in Siberia and later migrated east to replace it. Surprisingly, the researchers also found highly divergent mitochondrial DNA in HST, indicating an even more complex history that warrants further investigation.

Credit: 
American Association for the Advancement of Science (AAAS)

Widespread disease diabetes: Why do beta cells refuse to release insulin?

image: Super-connected "leader" cells coordinate insulin response and help us to understand how diabetes develops.

Image: 
© CRTD

In type 2 diabetes, the body's cells become increasingly resistant to insulin, and patients suffer from elevated blood sugar levels with far-reaching consequences. After many years of illness, insulin production itself dries up, and patients have to inject insulin.

What causes the lack of insulin production in people with type 2 diabetes? Researchers from the Center for Regenerative Therapies (CRTD) at the Technische Universität Dresden (TUD) together with colleagues from Imperial College London and other research institutes from the UK, Canada and Italy have observed amazing cell interactions: The beta cells of the pancreas work as highly-connected clusters, known as islets, and their responses to rising blood glucose levels are coordinated by small teams of "leader cells".

Previous work from co-author Professor Guy Rutter from Imperial College London and Professor David Hodson (now at the University of Birmingham in the UK) had provided evidence that this may be the case using isolated tissues. To show that this was also true in living animals, including zebrafish and mice, the research teams developed an innovative imaging technique that allowed them to observe beta cells' hierarchical relationship "in vivo".

"In these model organisms we saw that when blood glucose levels increased, the response of beta cells originated from temporally defined leader cells. When we selectively deleted the leader cells, the level of coordination in subsequent responses to glucose was disrupted", explains CRTD PhD student Luis Delgadillo Silva, one of the two lead authors of the study. Mathematical analysis revealed that the leader cells have a controlling role over the islet. In addition, the researchers were able to show that some beta cells contained a unique molecular signature, which would allow them to be more metabolically active and perhaps more glucose-sensitive.

Based on their findings, the scientists will now aim to understand how important the leader cells are in the development of diabetes. "It's important for us to understand if the leader cells are vulnerable to damage as diabetes develops and, crucially, whether they can be targeted to maintain strong and healthy insulin responses to help cure the disease", explains Dr. Victoria Salem, senior clinical research fellow in the Section of Investigative Medicine at Imperial College London who co-led the UK study.

"To understand better the role of leader cells in islet function, we have established a set of new tools in zebrafish, which will help us to activate or silence beta cells by shining light on them, as well as to track individual cells over time. Using these tools, we will be able to ask precisely how many cells are controlled by a leader cell and what genes determine the identity of a leader cell", says Luis Delgadillo Silva.

The scientists just published their results in the scientific journal Nature Metabolism, and the work is featured on the cover of the journal. The Dresden part of the study received funding from TUD / CRTD, the German Research Foundation, the Free State of Saxony, the German Center for Diabetes Research and the European Foundation for the Study of Diabetes.

Luis Delgadillo Silva is part of the research group of Dr. Nikolay Ninov. The team is investigating the beta cells of the pancreas as the key metabolic sensors and effectors for insulin release. They conduct their studies at the CRTD of TU Dresden, where top researchers from more than 30 countries are deciphering the principles of cell and tissue regeneration for disease diagnosis and treatment. The CRTD links laboratory and clinic, connects scientists with physicians, uses expertise in stem cell research, genome editing, and tissue regeneration - all for one goal: the cure of metabolic diseases such as diabetes, neurodegenerative diseases such as ALS, Alzheimer's and Parkinson's disease, haematological diseases such as leukaemia, as well as eye and bone diseases using novel diagnostic tools and therapeutic options.

Credit: 
Technische Universität Dresden

Experimental physicists redefine ultrafast, coherent magnetism

image: The red arrows mark the ordered magnetic moment of a layer stack of nickel (ferromagnet) and platinum (metal) before an ultrashort laser pulse inverts the magnetization of the two layers

Image: 
© J.K. Dewhurst

Electronic properties of materials can be directly influenced by light absorption in under a femtosecond (10⁻¹⁵ seconds), which is regarded as the limit for the maximum achievable speed of electronic circuits. The magnetic moment of matter, in contrast, could until now be influenced only indirectly, through processes coupling light and magnetism or, in a roundabout way, by means of magnetic fields, which is why magnetic switching takes much longer, at least several hundred femtoseconds. A consortium of researchers from the Max Planck Institutes for Quantum Optics and for Microstructure Physics, the Max Born Institute, the University of Greifswald and Graz University of Technology has now been able to manipulate the magnetic properties of a ferromagnetic material with laser pulses on the time scale of the electric field oscillations of visible light - and thus in sync with the electronic properties. The process was thereby accelerated by a factor of 200 and was measured and visualized using time-resolved attosecond spectroscopy. The researchers described their experiment in the journal Nature.

Composition of the material as a crucial criterion

In attosecond spectroscopy, magnetic materials are bombarded with ultra-short laser pulses and thereby electronically excited. "The light flashes set off an intrinsic and usually delayed process in the material: the electronic excitation is translated into a change in magnetic properties," explains Martin Schultze, who until recently worked at the Max Planck Institute for Quantum Optics in Munich and is now a professor at the Institute of Experimental Physics at TU Graz. Because the experiment combined a ferromagnet with a non-magnetic metal, however, the magnetic reaction was brought about as fast as the electronic one. "Thanks to this special constellation, we were able to optically bring about a spatial redistribution of the charge carriers, which resulted in a directly linked change in the magnetic properties," says Markus Münzenberg. Together with his team in Greifswald, he developed and produced the special material systems.

Schultze is enthusiastic about the scale of the success of the research: "Never before has such a fast magnetic phenomenon been observed. Through this, ultrafast magnetism will take on a completely new meaning." Sangeeta Sharma, researcher at the Max Born Institute in Berlin who predicted the underlying process using computer models, is impressed: "We are expecting a significant development boost from this for all applications in which magnetism and electron spin play a role."

Initial step towards coherent magnetism

Furthermore, the researchers show in their measurements that the observed process runs coherently: this means the quantum mechanical wave nature of the moving charge carriers is preserved. These conditions allow scientists to use individual atoms as information carriers instead of larger units of material or to influence the changing magnetic properties using another specifically delayed laser pulse, thus advancing technological miniaturisation. "Regarding new perspectives, this could lead to similarly fantastic developments in the field of magnetism as electronic coherence has brought to quantum computing," says Schultze, who now leads a working group focusing on attosecond physics at the Institute of Experimental Physics.

Credit: 
Graz University of Technology

The water future of Earth's 'third pole'

video: Rapid changes in the region's climate are affecting glacier flows and snowmelt. Local people are already modifying their land-use practices in response to the changing supply, and the region's ecology is transforming. Scientists estimate that by 2100, these glaciers could be up to 75% smaller in volume.

Watch on YouTube: https://www.youtube.com/watch?v=2Ci84sPsJQU

Download in HD: https://svs.gsfc.nasa.gov/13243

Image: 
NASA's Goddard Space Flight Center

Himalaya. Karakoram. Hindu Kush. The names of Asia's high mountain ranges conjure up adventure to those living far away, but for more than a billion people, these are the names of their most reliable water source.

Snow and glaciers in these mountains contain the largest volume of freshwater outside of Earth's polar ice sheets, leading hydrologists to nickname this region the Third Pole. One-seventh of the world's population depends on rivers flowing from these mountains for water to drink and to irrigate crops.

Rapid changes in the region's climate, however, are affecting glacier melt and snowmelt. People in the region are already modifying their land-use practices in response to the changing water supply, and the region's ecology is transforming. Future changes are likely to influence food and water security in India, Pakistan, China and other nations.

NASA is keeping a space-based eye on changes like these worldwide to better understand the future of our planet's water cycle. In this region where there are extreme challenges in collecting observations on the ground, NASA's satellite and other resources can produce substantial benefits to climate science and local decision makers tasked with managing an already-scarce resource.

The most comprehensive survey ever made of snow, ice and water in these mountains and how they are changing is now underway. NASA's High Mountain Asia Team (HiMAT), led by Anthony Arendt of the University of Washington in Seattle, is in its third year. The project consists of 13 coordinated research groups studying three decades of data on this region in three broad areas: weather and climate; ice and snow; and downstream hazards and impacts.

All three of these subject areas are changing, starting with climate. Warming air and alterations in monsoon patterns affect the regional water cycle - how much snow and rain falls, and how and when the snowpack and glaciers melt. Changes in the water cycle raise or lower the risk of local hazards such as landslides and flooding, and have broad impacts on water allocation and crops that can be grown.

Making Impossible Science Possible

For most of human history, a detailed scientific study of these mountains was impossible. The mountains are too high and steep, and the weather too dangerous. The satellite era has given us the first opportunity to observe and measure snow and ice cover safely in places where no human has ever set foot.

"The explosive growth of satellite technology has been incredible for this region," said Jeffrey Kargel, a senior scientist at the Planetary Science Institute in Tucson, Arizona, and leader of a HiMAT team studying glacial lakes. "We can do things now that we couldn't do ten years ago - and ten years ago we did things we couldn't do before that." Kargel also credited advances in computer technology that have enabled far more researchers to undertake large data-processing efforts, which are required to improve weather forecasting over such complex topography.

Arendt's HiMAT team is charged with integrating the many, varied types of satellite observations and existing numerical models to create an authoritative estimate of the water budget of this region and a set of products local policy makers can use in planning for a changing water supply. A number of data sets by HiMAT teams have already been uploaded to NASA's Distributed Active Archive Center at the National Snow and Ice Data Center. Collectively, the suite of new products is called the Glacier and Snow Melt (GMELT) Toolbox.

Debris Dam Dangers and Other Impacts

There's some urgency in completing the toolbox, because changes in melt patterns appear to be increasing the region's hazards - some of which are found only in this kind of terrain, such as debris dam "failures" on glacial lakes and surging glaciers blocking access to mountain villages and pastures. In the last few decades, towns and infrastructure such as roads and bridges have been wiped out by these events.

Kargel's team is studying catastrophic flooding from glacial lakes. These lakes start as melt pools on the surfaces of glaciers, but under the right conditions they may continue to melt all the way to ground level, pooling behind a precarious pile of ice and debris that was originally the front end of the glacier. An earthquake, rockfall or simply the increasing weight of water may breach the debris dam and create a flash flood.

Lakes like this were almost unknown 50 or 60 years ago, but as most high mountain Asian glaciers have been shrinking and retreating, glacial lakes have been proliferating and growing. The largest one Kargel has measured, Lower Barun in Nepal, is 673 feet (205 meters) deep with a volume of almost 30 billion gallons (112 million cubic meters), or about 45,000 Olympic-sized swimming pools full. The HiMAT team has mapped every glacial lake larger than about 1,100 feet (330 meters) in diameter for three different time periods - about 1985, 2001 and 2015 - to study how the lakes have evolved.
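
The volume comparisons quoted above follow from simple unit conversions; the sketch below reproduces them, assuming the nominal 2,500-cubic-meter volume of an Olympic swimming pool.

```python
# Unit-conversion check of the Lower Barun figures quoted above (Olympic pool
# taken as the nominal 50 m x 25 m x 2 m = 2,500 cubic meters).
volume_m3 = 112e6                     # reported lake volume in cubic meters
gallons = volume_m3 * 264.172         # US gallons per cubic meter
olympic_pools = volume_m3 / 2500.0
depth_ft = 205 * 3.28084              # reported depth of 205 meters, in feet

print(f"{gallons / 1e9:.1f} billion gallons")   # ~29.6
print(f"{olympic_pools:,.0f} Olympic pools")    # ~44,800
print(f"{depth_ft:.0f} feet deep")              # ~673
```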

As the size and number of glacial lakes increase, so does the threat they pose to the local population and infrastructure. Dalia Kirschbaum of NASA's Goddard Space Flight Center in Greenbelt, Maryland, leads a group that is using satellite data to predict what areas are most susceptible to landslides in high mountain Asia, which can then inform the placement of new infrastructure of the region.

Darker Snow, Faster Snowmelt

One critical factor in future rates of snow and ice melt is the role of dust, soot and pollution that settle on the frozen surfaces. Pristine white snow reflects more than 90% of incoming solar radiation back into the atmosphere. But when snow is blanketed by darker-colored particles of soot or dust, this coating absorbs more heat and the snow melts faster. Research suggests that soot deposited on the Alps during the Industrial Revolution contributed to the end of the Little Ice Age in Europe. In Asia, the last 35 years have seen significant increases in the amount of soot settling on mountain snow. Whether these Asian ranges will react the same way the Alps did centuries ago is an important question.
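
The effect of a darker surface can be put in rough numbers: the fraction of sunlight a surface absorbs is one minus its albedo. The values below are illustrative only, not measurements from the HiMAT studies.

```python
# Back-of-the-envelope illustration of why darker snow melts faster (albedo and
# irradiance values are illustrative, not HiMAT measurements).
incoming_w_m2 = 500.0   # assumed midday solar irradiance
for label, albedo in [("pristine snow", 0.90), ("soot-darkened snow", 0.70)]:
    absorbed = (1.0 - albedo) * incoming_w_m2
    print(f"{label}: albedo {albedo:.2f} -> {absorbed:.0f} W/m^2 absorbed")
# Dropping the albedo from 0.90 to 0.70 triples the absorbed energy.
```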

Several HiMAT teams are focused on this issue. Si-Chee Tsay of NASA Goddard is using satellite data to gain a better understanding of the properties of snow, ice, and dust and soot particles in this region. His group is also working in collaboration with regional researchers in Nepal to install sensors at ground level on glaciers located on Mt. Everest, Annapurna and Dhaulagiri, among other sites. These sensors will allow researchers to check the accuracy of satellite readings obtained over the same sites.

Tom Painter of the University of California, Los Angeles, is leading a team using satellite data from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) and the NOAA/NASA Visible Infrared Imaging Radiometer Suite (VIIRS) in the community Weather Research and Forecasting model to quantify past and possible future variations in snow cover and other factors as soot and dust change. Another team, led by Sarah Kapnick of NOAA, is accounting for dust and soot within global climate models, to improve understanding of both historical and predicted future regional changes.

The tallest mountains in the world make for unique challenges in weather forecasting. A team led by Summer Rupper of the University of Utah in Salt Lake City has addressed one of these challenges by developing a model that differentiates between ice and snow that were deposited on the region during the monsoon season and those that came from winter storms, so that scientists can study where and when snow is likely to fall throughout the year.

Early Conclusions

In the HiMAT survey's final year, Arendt said, the research is coming together and the teams' scientific papers are heading for publication. One of the more alarming conclusions is that the glaciers will be 35 to 75% smaller in volume by 2100 due to rapid melting. A paper published on June 19 in Science Advances by HiMAT team members supports this conclusion with an analysis of 40 years of satellite data on glaciers in the Himalayan range. (The early years of data that researchers used for this study come from declassified spy satellites.) Not only are all glaciers in the Himalayan Range losing ice, the average rate of ice loss doubled between the first 25 years of satellite data, 1975-2000, and the most recent 16 years, 2000-2016.

Whether rain and snowfall will also change, and whether changes would compound or mitigate the effects of ice loss, are not yet clear. Precipitation already varies considerably from one range to another in this region, depending on the monsoon and the flow of winter storms into the area. For example, precipitation is currently increasing in the Karakoram Range, where glaciers are either stable or advancing, but in every other range in this region, nearly all glaciers are retreating. Whether that anomaly will continue, grow stronger, or reverse as the climate continues to change is not yet clear. "Global climate dynamics will dictate where storms end up and how they intercept the mountains," Arendt said. "Even small changes in the tracking of the storms can create significant variability."

Findings like these are why the HiMAT teams are eager to complete their GMELT toolbox, Arendt noted. The new products will offer decision-makers the best compilation of knowledge that can currently be made of how high mountain Asia has been changing in recent decades, along with a new set of resources to help them plan how best to prepare for the future of this hard-to-predict region.

Credit: 
NASA/Goddard Space Flight Center

New indicators could help manage global overfishing

The smallest plants and creatures in the ocean power an entire food web, including the fish that much of the world's population depend on for food, work and cultural identity.

In a paper published in Science Advances, NOAA Fisheries researcher Jason Link and colleague Reg Watson from the University of Tasmania's Institute for Marine and Antarctic Studies suggest that scientists and resource managers need to focus on whole ecosystems rather than solely on individual populations. Population-by-population fishery management is more common around the world, but a new approach could help avoid damaging overfishing and the insecurity it brings to fishing economies.

"In simple terms, to successfully manage fisheries in an ecosystem, the rate of removal for all fishes combined must be equal to or less than the rate of renewal for all those fish," said Link, the senior scientist for ecosystem management at NOAA Fisheries and a former fisheries scientist at the Northeast Fisheries Science Center in Woods Hole, Massachusetts.

The authors suggest using large-scale ecosystem indices as a way to determine when ecosystem overfishing is occurring. They propose three indices, each based on widely available catch and satellite data, to link fisheries landings to primary production and energy transfer up the marine food chain. Specific thresholds developed for each index make it possible, they say, to determine whether ecosystem overfishing is occurring. By their definition, ecosystem overfishing occurs when the total catch of all fish is declining, the total catch rate or fishing effort required to get that catch is also declining, and the total landings relative to the production in that ecosystem exceed suitable limits.

"Detecting overfishing at an ecosystem level will help to avoid many of the impacts we have seen when managing fished species on a population-by-population basis, and holds promise for detecting major shifts in ecosystem and fisheries productivity much more quickly," said Link.

As an example, in the North Sea, declines in these indices signaled ecosystem overfishing about 5-10 years earlier than what was pieced together by looking at sequential collapses in individual populations of cod, herring and other species. Undue loss of value, and the shift of catches in that ecosystem toward smaller fishes and invertebrates, could have been avoided, the authors say.

Looking at the Whole Ecosystem

The first index used in the study is the total catch in an area, or how much fish a given patch of ocean can produce. The second is the ratio of total catches to total primary productivity, or how much fish can come from the plants at the base of the food chain. The third index is the ratio of total catch to chlorophyll, another measure for marine plant life, in an ecosystem.

Proposed thresholds for each index are based on the known limits of the productivity of any given part of the ocean. Using these limits, the authors say local or regional context should be considered when deciding what management actions to take to address ecosystem overfishing. Having international standards would make those decisions much easier and emphasize sustainable fisheries.
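
In practice the three indices reduce to simple ratios of widely available quantities. The sketch below computes them for a single hypothetical ecosystem; the input values and the example thresholds are placeholders, not the limits proposed in the paper.

```python
# Illustrative computation of the three ecosystem-level indices described above.
# All inputs and thresholds are hypothetical placeholders, not published values.
def ecosystem_indices(total_catch_t, area_km2, primary_production_t, mean_chlorophyll):
    return {
        "catch_per_area": total_catch_t / area_km2,                           # index 1
        "catch_to_primary_production": total_catch_t / primary_production_t,  # index 2
        "catch_to_chlorophyll": total_catch_t / mean_chlorophyll,             # index 3
    }

example_thresholds = {"catch_per_area": 1.0,
                      "catch_to_primary_production": 0.001,
                      "catch_to_chlorophyll": 1.0e6}

indices = ecosystem_indices(total_catch_t=1.2e6, area_km2=3.0e5,
                            primary_production_t=4.0e8, mean_chlorophyll=0.8)
for name, value in indices.items():
    status = "exceeds" if value > example_thresholds[name] else "within"
    print(f"{name}: {value:.3g} ({status} example threshold)")
```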

The authors named the indices in honor of the late marine biologist John Ryther and NOAA Fisheries scientists Michael Fogarty and Kevin Friedland, both at the Northeast Fisheries Science Center. All have worked extensively on integrating ecosystem and fishery dynamics for better resource management.

Shifting Fish, Not Fleets

"We know that climate change is shifting many fish populations toward the poles, yet the fishing fleets and associated industries are not shifting with them," Link said. "That already has had serious economic and cultural impacts." The authors note that they are able to follow these shifts over time and see how they can exacerbate or even contribute to ecosystem overfishing.

Fisheries are an important part of the global economy. In addition to trade and jobs, fish provide the primary source of protein to more than 35 percent of the world's population, and 50 percent of the people in the least developed countries, according to the authors. Regions where the greatest amount of ecosystem overfishing occurs are also where impacts can be the greatest.

Tropics, Temperate Areas Face Most Overfishing

The researchers looked at 64 large marine ecosystems around the world and found those in the tropics, especially in Southeast Asia, have the highest proportion of ecosystem overfishing. Temperate regions also have a high level of ecosystem overfishing, with limited capability to absorb shifting fishing pressure from the tropics as species move toward the poles.

"Even if tropically-oriented fleets were able to shift latitudes and cross claims for marine exclusive economic zones, it remains unclear if temperate regions could absorb shifts from the tropics. Many temperate regions are also experiencing ecosystem overfishing and catches there have been flat for more than 30 years," Link said.

Potential International Standard

The three indices proposed represent a potential international standard for tracking the status of global fisheries ecosystems.

"They are easy to estimate and interpret, are based on widely repeated and available data, and are a practical way to identify when an ecosystem would be experiencing overfishing based on well understood and well-accepted primary production and food web limitations," Link said. "It would eliminate a lot of the debate about whether or not ecosystem overfishing is happening and instead focus attention on solutions. But until we can define and identify what ecosystem overfishing is, we cannot begin to address it."

Credit: 
NOAA Northeast Fisheries Science Center

Kyushu U researchers unlocking keys to longevity of egg cell supply in mammals

video: The nuclei of dormant mouse egg cells can be seen to be rotating in this time-lapse microscope video taken over 30 minutes. Researchers from Kyushu University have linked this rotation to the mechanism that keeps the egg cells in a dormant state until needed so that the mice can have a long period of fertility.

Image: 
Go Nagamatsu, Kyushu University

Since female mammals are born with a limited number of egg cells, most of the egg cells wait in a dormant state before maturing later in life to ensure a long period of fertility. However, how this dormant state is achieved and then maintained is still largely shrouded in mystery.

Researchers at Kyushu University's Department of Stem Cell Biology and Medicine have now uncovered several conditions necessary for creating this supply of dormant egg cells in mice, providing useful insight for the further understanding and development of reproductive biology and medicine.

The work, which spans two papers, investigates the problem from two different angles: the creation of dormant egg cells from mouse stem cells and the maintenance of the dormant state in ovaries from mice.

While scientists studying mice have been able to create egg cells from stem cells, the egg cells created in the laboratory begin maturing after quickly passing through the dormant state, and the conditions to initially induce a longer-lasting dormant state have been unclear.

"The environment in nature and in the lab is vastly different, so we have been searching for how to recreate the necessary conditions in the lab to obtain the same growth process found in nature," says So Shimamoto, the first author on the Proceedings of the National Academy of Sciences of the United States of America paper reporting the results.

The researchers compared the various chemicals present at different stages in egg cells from mice to those in egg cells grown in the lab and found signs that the egg cells from mice were exposed to a reduced amount of oxygen while also having an increased level of the protein FOXO3.

By limiting the oxygen reaching the egg cells in the lab while at the same time increasing the amount of FOXO3, the researchers obtained egg cells very similar to those in the dormant state from live mice.

"This is the closest so far that egg cells developed from stem cells in the lab have come to resembling those found in nature," comments Katsuhiko Hayashi, head of the group leading the two studies.

At the same time, the researchers have also been investigating how this dormant state is maintained by transplanting mouse ovaries at various stages of development into adult mice and comparing differences between ovaries that retained their dormant egg cells to those that did not.

In this case, they found that egg cells that remained in the dormant state were surrounded by a denser extracellular matrix--the network of molecules between cells--that could be mechanically compressing the dormant egg cells.

Treating the ovaries to break down this extracellular matrix and relieve the compression caused the egg cells to show signs of exiting the dormant state. Furthermore, exerting pressure on these treated ovaries caused the egg cells to show characteristics of returning to the dormant state.

"While these observations supported our initial idea of pressure being a factor, we were surprised to find that the nuclei of some of the egg cells were also rotating when observed under the microscope," explains Go Nagamatsu, lead author on the paper reporting these latter results in Science Advances.

This rotation of the nuclei, which house the genetic material of the cells, also appears to play a role in the dormant state, as compressing the cells induced rotation of the nuclei while treating the cells with a chemical to inhibit rotation led to signs of maturation even under compression.

"By pursuing multiple approaches, we are making progress in better understanding this critical aspect of reproduction, but these results show that many factors come into play and that we may still just be scratching the surface," says Hayashi.

"We hope that the knowledge gained through this and continued research will contribute to developments in reproductive medicine in the future."

Credit: 
Kyushu University

Towards a worldwide inventory of all plants

image: Plant checklists for geographic regions like the island of Tenerife form the basis of the GIFT database. Here 630 native plant species occur from dry coastal scrublands up to the alpine vegetation of Mount Teide. Around 120 of the species only occur on this particular island and more than 300 species are restricted to the Canary Islands.

Image: 
Patrick Weigelt

Declining biodiversity due to man-made habitat destruction and climate change means that information about plant diversity and its distribution across the planet is now crucial for biodiversity conservation. With the Global Inventory of Floras and Traits (GIFT), a team of researchers from the Department of Biodiversity, Macroecology and Biogeography at the University of Göttingen has taken an important step forward in documenting and understanding global plant diversity. The results appear in the Journal of Biogeography.

It is over 200 years since Alexander von Humboldt started investigating the striking differences in plant diversity, and for a long time research progress has been limited by data availability. In recent times, the sheer number of known species and complex multiple facets of biodiversity, such as species richness, functional plant characteristics and relatedness, has overwhelmed efforts to bring all this data together. The researchers had to collect and standardise huge amounts of information from hundreds of published checklists and numerous unpublished regional inventories. For the first time, GIFT collates information about the plant species composition in nearly 2,900 regions including islands and protected areas. The data already covers about 79 percent of the global land surface and includes 80 percent (over 315,000 species) of all plant species known to science. The GIFT database links plant species to their geographic distribution, structural characteristics and to modern reconstructions of their evolutionary relationships, as well as to geographic, climatic and socio-economic characteristics of the regions.
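
To give a sense of the kind of linkage GIFT provides, the sketch below joins a toy species checklist to trait and regional tables with pandas. The table layouts and column names are invented for illustration and are not the actual GIFT schema or API.

```python
# Hypothetical sketch of the kind of linkage GIFT provides (tables and column
# names invented for illustration; not the actual GIFT schema or API).
import pandas as pd

checklists = pd.DataFrame({   # which species occur in which region
    "region": ["Tenerife", "Tenerife", "Crete"],
    "species": ["Spartocytisus supranubius", "Echium wildpretii", "Quercus coccifera"],
})
traits = pd.DataFrame({       # per-species functional characteristics
    "species": ["Spartocytisus supranubius", "Echium wildpretii", "Quercus coccifera"],
    "growth_form": ["shrub", "herb", "tree"],
})
regions = pd.DataFrame({      # per-region environmental characteristics
    "region": ["Tenerife", "Crete"],
    "mean_annual_temp_c": [17.0, 18.5],
})

# Join species occurrences with traits and regional environment in one table.
merged = checklists.merge(traits, on="species").merge(regions, on="region")
print(merged)
```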

"GIFT allows researchers, for the first time, to analyse near complete patterns of global plant diversity and regional species composition along with past and present effects", says Head of Department, Professor Holger Kreft. "Given the recent warnings about the devastating impact of humans on nature, for instance by the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES), and an increasing public awareness about climate change and its consequences, the publication of GIFT is a vital contribution to the field," says Dr Patrick Weigelt, lead author. "We envision data from GIFT to serve as a baseline to assess changes in plant diversity due to climate change, habitat alteration or introduced invasive species from a local right up to a global scale."

Credit: 
University of Göttingen

Restricted permit-only access to Yosemite National Park's Half Dome summit, anticipated to improve hiker safety, did not

image: Half Dome Trail map. The Half Dome Trail splits midway into the John Muir Trail and Mist Trail, which converge at Nevada Falls to form a single route through Little Yosemite Valley to Half Dome. Permit area is the restricted section including subdome, the cable route, and summit.

Image: 
Figure adapted from NPS file created by Jared Doke on August 3, 2010, and Paul Doherty on August 12, 2010, using NAD 1983 UTM Zone 11N.

Philadelphia, June 26, 2019 - Overcrowding on the trails leading to the summit on Yosemite National Park's Half Dome is not the key factor influencing hiker safety, according to a new study in Wilderness & Environmental Medicine, published by Elsevier. Implementation in 2010 of permit-only access to Half Dome cable handrails along the final ascent of this iconic landmark reduced the number of people on the summit at one time, but this did not result in a significant reduction in the overall toll of associated human suffering and mortality, or search and rescue (SAR) activity and costs.

"Using trailhead quotas in national parks to limit the number of hikers prevents natural resource degradation and preserves opportunities for solitude. Does the risk of a dangerous incident increase when more people ascend at once? Our study debunked that widespread belief. The data showed that limiting day-hiking access to only a few hundred people is not effective as a safety strategy," explained lead investigator Susanne J. Spano, MD, University of California, San Francisco Fresno, Fresno, CA, USA.

Yosemite National Park, like many US national parks, is increasingly popular as a tourist destination. Half Dome is a popular but challenging day hike and climbing area, and there are frequent incidents that require SAR activity and expense. Improving safety has become increasingly important with the growing number of visitors. As a result, a lottery to issue permits restricting use of the cable handrails leading to Half Dome was instituted in 2010. The permits reduced the number of hikers in the restricted area by 66 percent, yet analysis of the safety data found that the restrictions were not effective at reducing the risk of dangerous episodes.

This observational study compared the number of incidents, major incidents (those costing more than $500), victims, and fatalities before (2005 to 2009) and after (2011 to 2015) permitting. From 2005 to 2009, there were 85 SAR incidents, 134 victims, eight fatalities, and 38 major incidents. After the permits were required, from 2011 to 2015, the same area saw 54 SAR incidents, 156 victims, four fatalities, and 35 major incidents. Dr. Spano concluded, "Statistically speaking, especially when considering that incidents fell in the park overall during the same time period, that's no difference at all." The study also demonstrated no statistically significant SAR cost savings. Extremity injuries and medical and gastrointestinal illnesses have remained the most frequent causes of rescue activity over the past three decades.
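
As a minimal illustration of why such before/after differences fall short of statistical significance, the sketch below applies a simple two-sided binomial test to the fatality counts, under the null hypothesis that the two five-year periods share the same underlying rate. This is only an illustration; the authors' analysis also accounted for parkwide incident trends.

```python
# Minimal sketch (not the authors' analysis): if fatality risk were unchanged,
# the 8-vs-4 split of the 12 fatalities across two equal periods should look
# like a fair-coin outcome. A large p-value means the drop is consistent with chance.
from scipy.stats import binomtest

fatalities_before, fatalities_after = 8, 4
result = binomtest(fatalities_before, fatalities_before + fatalities_after, p=0.5)
print(f"two-sided p-value: {result.pvalue:.3f}")  # ~0.39 -> no significant change
```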

The investigators noted that a possible factor contributing to Half Dome injuries was "summit fever," precipitated by the difficulty that average hikers have in acquiring permits. With greatly curtailed opportunities, hikers may press on even when they feel fatigued or disabled by injury or bad weather, thinking that they may not get another chance to ascend Half Dome. In 2017, the success rate for the preseason lottery was as low as 2 percent for weekend dates and 32 percent for weekdays; San Francisco Chronicle columnist Tom Stienstra reported in March 2018 that the park granted weekend permits to an average of 24 percent of applicants.

"If people are getting hurt, it's important to figure out why. Limiting day-hiking access to only a few hundred people may not be the best strategy for the public, or the park, unless preventing resource degradation is the key objective of a permit intervention. Prospective studies are needed to fully evaluate whether limiting use of handrails leads to detrimental effects for the permit holders and to better identify the variables having the greatest impact on mortality and cost," noted Dr. Spano.

Credit: 
Elsevier

Bleach-induced transformation for humidity-durable air filters

image: The paddlewheel-shaped structure of HKUST-1 was stable after weeks of submersion in water.

Image: 
DGIST

Adding hydroquinone, a skin-bleaching ingredient, to a well-known 'metal organic framework' changes its copper ions in a way that makes this porous material exceptionally stable in water.

"We developed a new method to enhance the water stability of metal organic frameworks, with potential for applications that can effectively filter and purify air from ultrafine dust without decomposing due to humidity," says DGIST materials scientist Nak Cheon Jeong. The Korean researchers reported their findings in the Journal of the American Chemical Society.

Metal organic frameworks (MOFs) are made from metal ions bonded by organic links. They assemble in a way that leads to the formation of internal cage-like structures, giving the material its porous nature. MOFs have an impressive surface area compared to other porous materials. It is this, and the ability of scientists to tune their structures, that has led to their use in a wide range of applications, including gas uptake, molecule separation, drug delivery, and catalysis. Most MOFs decompose in the presence of humidity and water, so scientists have been looking for ways to make them more durable.

Jeong and his colleagues found that treating a well-known copper-based MOF, called HKUST-1, with hydroquinone at 80°C made the material so stable that it didn't degrade after weeks of submersion in water or even after two years of exposure to humid air.

Copper ions and their organic links in HKUST-1 assemble to form large and small cages with paddlewheel-shaped metal ion nodes. Normally, water molecules attach to elements within this MOF, displacing the bonds between the copper ions and organic links, and causing the material to degrade or transform into a non-porous solid. Hydroquinone treatment, on the other hand, leads to a very stable HKUST-1 in water.

Jeong and his team found that a single electron from hydroquinone is transferred to cupric ions (Cu2+) within HKUST-1, changing them to cuprous ions (Cu+). This change is self-limiting: no more than 30% of the cupric ions change in this way. Half of the Cu+ ions remain in their positions on HKUST-1's paddlewheel cages. But the other half form complexes that dissociate from the structure and become trapped within the material's smaller cages, like a ship-in-a-bottle.

Further studies are needed to understand exactly how these changes lead to such a substantial improvement in HKUST-1's stability in water.

Jeong and his team believe the same concept could be applied to other copper-based paddlewheel MOFs. They plan to conduct follow-up research on potential practical applications for their approach.

Credit: 
DGIST (Daegu Gyeongbuk Institute of Science and Technology)

First snapshots of trapped CO2 molecules shed new light on carbon capture

image: Cryo-EM images show a slice through a single MOF particle in atomic detail (left), revealing cage-like molecules (center) that can trap other molecules inside. The image at right shows carbon dioxide molecules trapped in one of the cages -- the first time this has ever been observed. Bottom right, a drawing of the molecular structure of the cage and the trapped CO2.

Image: 
Li et al., Matter, 26 June 2019

Menlo Park, Calif. -- Scientists from the Department of Energy's SLAC National Accelerator Laboratory and Stanford University have taken the first images of carbon dioxide molecules within a molecular cage -- part of a highly porous nanoparticle known as a MOF, or metal-organic framework, with great potential for separating and storing gases and liquids.

The images, made at the Stanford-SLAC Cryo-EM Facilities, show two configurations of the CO2 molecule in its cage, in what scientists call a host-guest relationship; reveal that the cage expands slightly as the CO2 enters; and zoom in on jagged edges where MOF particles may grow by adding more cages.

"This is a groundbreaking achievement that is sure to bring unprecedented insights into how these highly porous structures carry out their exceptional functions, and it demonstrates the power of cryo-EM for solving a particularly difficult problem in MOF chemistry," said Omar Yaghi, a professor at the University of California, Berkeley and a pioneer in this area of chemistry, who was not involved in the study.

The research team, led by SLAC/Stanford professors Yi Cui and Wah Chiu, described the study today in the journal Matter.

Tiny specks with enormous surfaces

MOFs have the largest surface areas of any known material. A single gram, or three hundredths of an ounce, can have a surface area nearly the size of two football fields, offering plenty of space for guest molecules to enter millions of host cages.
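A quick back-of-the-envelope reading of that comparison, assuming (this is an assumption, not a figure from the article) that a U.S. football field with end zones measures roughly 110 by 49 meters:

    # Back-of-the-envelope reading of the "two football fields per gram" comparison.
    # Assumption (not from the article): a U.S. football field with end zones is ~110 m x 49 m.
    field_area_m2 = 110 * 49                   # roughly 5,400 m^2 per field
    implied_area_per_gram = 2 * field_area_m2  # "nearly two fields" per gram
    print(f"Implied surface area: ~{implied_area_per_gram:,} m^2 per gram")
    # -> on the order of 10,000 m^2 per gram

That works out to roughly 10,000 square meters per gram, broadly in line with the largest surface areas reported for MOFs.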

Despite their enormous commercial potential and two decades of intense, accelerating research, MOFs are just now starting to reach the market. Scientists across the globe engineer more than 6,000 new types of MOF particles per year, looking for the right combinations of structure and chemistry for particular tasks, such as increasing the storage capacity of gas tanks or capturing and burying CO2 from smokestacks to combat climate change.

"According to the Intergovernmental Panel on Climate Change, limiting global temperature increases to 1.5 degrees Celsius will require some form of carbon capture technology," said Yuzhang Li, a Stanford postdoctoral researcher and lead author of the report. "These materials have the potential to capture large quantities of CO2, and understanding where the CO2 is bound inside these porous frameworks is really important in designing materials that do that more cheaply and efficiently."

One of the most powerful methods for observing materials is transmission electron microscopy, or TEM, which can make images in atom-by-atom detail. But many MOFs, and the bonds that hold guest molecules inside them, melt into blobs when exposed to the intense electron beams needed for this type of imaging.

A few years ago, Cui and Li adopted a method that's been used for many years to study biological samples: freeze them so they hold up better under electron bombardment. They used an advanced TEM instrument at the Stanford Nano Shared Facilities to examine flash-frozen samples containing dendrites--finger-like growths of lithium metal that can pierce and damage lithium-ion batteries--in atomic detail for the first time.

Atomic images, one electron at a time

For this latest study, Cui and Li used instruments at the Stanford-SLAC Cryo-EM Facilities, which have much more sensitive detectors that can pick up signals from individual electrons passing through a sample. This allowed the scientists to make images in atomic detail while minimizing the electron beam exposure.

The MOF they studied is called ZIF-8. It came in particles just 100 billionths of a meter in diameter; you'd need to line about 900 of them up to match the width of a human hair. "It has high commercial potential because it's very cheap and easy to synthesize," said Stanford postdoctoral researcher Kecheng Wang, who played a key role in the experiments. "It's already being used to capture and store toxic gases."
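A quick length-scale check of that comparison, assuming (an assumption, not a figure from the article) a human hair about 90 micrometers wide:

    # Length-scale check: how many 100 nm ZIF-8 particles span one hair width?
    particle_diameter_m = 100e-9   # 100 billionths of a meter (100 nm)
    hair_width_m = 90e-6           # assumed typical hair width of ~90 micrometers
    particles_across_hair = hair_width_m / particle_diameter_m
    print(f"~{particles_across_hair:.0f} particles per hair width")
    # -> about 900, matching the figure quoted above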

Cryo-EM not only let them make super-sharp images with minimal damage to the particles, but it also kept the CO2 gas from escaping while its picture was being taken. By imaging the sample from two angles, the investigators were able to confirm the positions of two of the four sites where CO2 is thought to be weakly held in place inside its cage.

"I was really excited when I saw the pictures. It's a brilliant piece of work," said Stanford Professor Robert Sinclair, an expert in using TEM to study materials who helped interpret the team's results. "Taking pictures of the gas molecules inside the MOFs is an incredible step forward."

Credit: 
DOE/SLAC National Accelerator Laboratory

Undercounting of agroforestry skews climate change mitigation planning and reporting

image: Farmer Pricila Kiprotich with fodder trees and shrubs for dairy in Kenya.

Image: 
S Odeyo, World Agroforestry. https://flic.kr/p/iq9Cii

BURLINGTON, VERMONT (26 June 2019) - Farmers incorporate trees into fields and pastures to earn cash from fruit or wood, increase fodder and shade for livestock, promote soil health or protect against wind or water erosion. In all cases, farmers contribute to climate change mitigation by increasing soil and biomass carbon sequestration.

But it appears they are a step ahead of the United Nations Framework Convention on Climate Change, the global organization that aims to stabilize our climate.

In an article published today in the journal Agriculture, Ecosystems and Environment, scientists expose the lack of measurement of and reporting on agroforestry in international climate agreements. Many countries intend to sequester carbon through agroforestry to help meet climate change mitigation targets, but the ability to document sequestration through agroforestry is often severely limited.

The scientists call for improved accounting of and visibility for agroforestry, including through better data and satellite imagery, to support increases in food production and massive scaling of soil and biomass carbon sequestration.

Explicit ambitions, minimal reporting

Farmers are not alone in valuing trees on their land: 40% of developing countries name agroforestry as a strategy for adapting to and mitigating climate change. In Africa, 71% of countries identify agroforestry as a critical climate strategy.

However, the new research shows that just sixteen developing countries have provided quantitative estimates of the number or areal extent of trees outside forests. The gap between what is reported and what could be reported is immense: scientists estimate that some type of agroforestry is practiced on 43% of all agricultural land--over 1 billion hectares--providing subsistence to more than 900 million people.

The scientists recommend four steps to improve national and global accounting of agroforestry, recognizing that better measurement, reporting, and verification of its farm-scale, national, and global benefits are needed to unlock transformative support for large-scale agroforestry. In many cases, capacity building and better use of existing data could improve accounting of agroforestry.

Transparency of climate actions, including in agroforestry, is essential for long-term climate stability.

Credit: 
University of Vermont