Extent of US lives shortened by gun violence twice as great among blacks as whites

The higher rates of firearm suicide among white Americans over the age of 20 haven't offset this wide, and widening, racial gap in death rates linked to gun violence, the figures show.

Firearm deaths have become a major public health problem in the USA: US men can expect to live shorter lives than their peers in many other countries. And while overall US life expectancy increased from 76.8 in 2000 to 78.7 in 2014, it fell for the first time in 50 years in 2015, a trend that continued in 2016.

To try to quantify how much US lives might have been shortened by firearm assault and suicide since the turn of the century, and whether certain age groups and black Americans are disproportionately affected, the researchers used data on firearm deaths from 2000 to 2016.

The data came from the Centers for Disease Control and Prevention Wide-ranging Online Data for Epidemiologic Research (WONDER), and were categorised by age group, intent (suicide or assault), and race, and set against national estimates for life expectancy.

Years of life lost usually fall with age, but they dropped sharply at age 20 for black Americans as a result of firearm assault, and fell steadily after the age of 20 among white Americans as a result of firearm suicide.

But although the premature shortening of life among whites over the age of 20 was greater than among blacks of the same age, this still didn't offset the reduced life expectancy among blacks, a finding that "is indicative of persisting disparities in homicide among younger age groups," say the researchers.

The calculations showed that the overall reduction in life expectancy as a result of gun violence was just under 2.5 years, but it was nearly twice as high among black Americans (4.14 years) as among white Americans (2.23 years).

Firearm assaults cut nearly a year off life expectancy overall, but nearly 3.5 years for black Americans compared with under six months for whites.

Firearm suicides shortened lives by 1.43 years overall, but by just over six months in black Americans compared with 1.62 years in whites.

The study authors point out that research in 2000 had already revealed a gap in deaths linked to gun violence between black and white Americans, a trend that has continued. That gap now looks to have widened even further, with an even greater total premature loss of life.

"Our study using cumulative data from 2000 to 2016 demonstrates a total firearm life expectancy loss of 905.2 days, which is nine times greater than observed in 2000, indicating increasing life expectancy loss by year," they write.
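The headline figures quoted above can be cross-checked with simple arithmetic; a minimal sketch (not from the study itself), converting the reported total loss in days into years:

```python
# Hypothetical consistency check: convert the reported total firearm
# life-expectancy loss from days to years and compare with the
# "just under 2.5 years" figure quoted earlier in the release.

DAYS_PER_YEAR = 365.25  # average calendar year, accounting for leap years

total_loss_days = 905.2
total_loss_years = total_loss_days / DAYS_PER_YEAR

print(round(total_loss_years, 2))  # ~2.48 years, i.e. "just under 2.5 years"
```

The two figures agree: 905.2 days is about 2.48 years.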

The authors acknowledge that the data didn't enable them to break deaths down by ethnicity, Hispanic Americans for example, and that the category of black American may encompass many distinct ethnic groups.

"Americans lose substantial years of life due to firearm injury," write the authors. "In the absence of comprehensive firearms legislation, targeted prevention programmes and policies are needed to mitigate the racial firearm injury gaps in the USA," they conclude.

Credit: 
BMJ Group

Too much or too little sleep linked to increased risk of cardiovascular disease and death

image: A take-home image from the accompanying editorial on the link between sleep and risk of cardiovascular disease and death.

Image: 
European Heart Journal

The amount of time you sleep, including daytime naps, is linked to your risk of developing cardiovascular disease and death, according to a study of over 116,000 people in seven regions of the world, published in the European Heart Journal [1] today (Wednesday).

The researchers found that people who slept for longer than the recommended duration of six to eight hours a day had an increased risk of dying or developing diseases of the heart or blood vessels in the brain. Compared to people who slept for the recommended time, those who slept a total of eight to nine hours a day had a 5% increased risk; people sleeping between nine and ten hours a day had an increased risk of 17% and those sleeping more than ten hours a day had a 41% increased risk. They also found a 9% increased risk for people who slept a total of six or fewer hours, but this finding was not statistically significant.

Before adjusting for factors that might affect the results, the researchers found that for every 1000 people sleeping six or fewer hours a night, 9.4 developed cardiovascular disease (CVD) or died per year; this occurred in 7.8 of those sleeping six to eight hours, 8.4 of those sleeping eight to nine hours, 10.4 of those sleeping nine to ten hours and 14.8 of those sleeping more than ten hours.
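Rates like those above are expressed per 1000 person-years of follow-up. As a minimal sketch (the event and cohort counts below are invented for illustration, not taken from the study):

```python
# Hypothetical illustration of how an incidence rate "per 1000 people
# per year" is calculated: events divided by person-years of follow-up,
# scaled to 1000. The inputs are invented numbers, not study data.

def rate_per_1000(events, people, years):
    """Incidence rate per 1000 person-years of follow-up."""
    return events / (people * years) * 1000

# e.g. 376 events among 5000 people each followed for 8 years:
print(round(rate_per_1000(376, 5000, 8), 1))  # 9.4 per 1000 person-years
```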

The lead author of the publication, Chuangshi Wang, a PhD student at McMaster and Peking Union Medical College, Chinese Academy of Medical Sciences, China, working at the Population Health Research Institute at McMaster, said: "Our study shows that the optimal duration of estimated sleep is six to eight hours per day for adults. Given that this is an observational study that can only show an association rather than proving a causal relationship, we cannot say that too much sleep per se causes cardiovascular diseases. However, too little sleep could be an underlying contributor to death and cases of cardiovascular disease, and too much sleep may indicate underlying conditions that increase risk."

Associations between sleep and death or cardiovascular and other diseases have been suggested by other studies, but results have been contradictory. In addition, they tended to look at particular populations and did not necessarily take account of the fact that in some countries daytime napping can be common and considered healthy.

This EHJ study looked at a total of 116,632 adults aged between 35 and 70 years in 21 countries with different income levels in seven geographic regions (North America and Europe, South America, the Middle East, South Asia, Southeast Asia, China and Africa). They were part of the Prospective Urban Rural Epidemiology (PURE) study that started in 2003.

During an average (median) follow-up time of nearly eight years, 4381 people died and 4365 suffered a major cardiovascular problem such as a heart attack or stroke. The researchers adjusted the results to take account of factors that could affect outcomes, such as age, sex, education, smoking, alcohol consumption, whether the participants lived in urban or rural areas, had a family history of cardiovascular disease, or had a history of diabetes, raised blood pressure, chronic obstructive pulmonary disease or depression.

They found that regular daytime naps were more common in the Middle East, China, Southeast Asia and South America. The duration of daytime naps varied mainly from 30 to 60 minutes. People who slept six or fewer hours at night, but took a daytime nap, and so slept an average of 6.4 hours a day in total, had a slightly increased risk compared to those who slept between six and eight hours at night without a daytime nap, but this finding was not statistically significant.

"Although daytime napping was associated with higher risks of death or cardiovascular problems in those with sufficient or longer sleep at night, this was not the case in people who slept under six hours at night. In these individuals, a daytime nap seemed to compensate for the lack of sleep at night and to mitigate the risks," Ms Wang added.

Professor Salim Yusuf, the Principal Investigator of the PURE study, Distinguished Professor of Medicine and Executive Director of the Population Health Research Institute at McMaster University and Hamilton Health Sciences, concluded: "The general public should ensure that they get about six to eight hours of sleep a day. On the other hand, if you sleep too much regularly, say more than nine hours a day, then you may want to visit a doctor to check your overall health. For doctors, including questions about the duration of sleep and daytime naps in the clinical histories of your patients may be helpful in identifying people at high risk of heart and blood vessel problems or death."

Limitations of the study include that the researchers estimated nocturnal sleep time based on the space between going to bed and waking up, and that they assumed that the duration of night time and daytime naps remained unchanged during the follow-up period. Nor did they collect information on sleep disorders such as insomnia and apnoea (a temporary cessation of breathing while asleep), which can have an impact on sleep and might also affect health.

In an accompanying editorial [2], Dr Dominik Linz, a cardiologist at the Royal Adelaide Hospital and Associate Professor at University of Adelaide, Australia, and colleagues write: "This study provides important epidemiological information, but causative factors explaining the described associations with increased CV [cardiovascular] risk remain speculative." They agree with Ms Wang that too much sleep might be an indicator of an underlying, undiagnosed health condition, and they raise the question: "once a 'pathological sleeping/napping pattern' has been identified, what interventions (if any) should be applied?"

They conclude: "We need to be aware and communicate to our patients, that sleeping a lot and having daytime naps may not always be harmless. Perhaps the ancient Greek poet Homer, author of the Iliad and the Odyssey, summed it up millennia ago when he said: 'Even where sleep is concerned, too much is a bad thing'."

Credit: 
European Society of Cardiology

Enhancing our vision of the past

image: A fossil trilobite with its complex eye. These ancient animals were inferred to have minimally possessed four opsins, like many modern arthropods, and should have therefore been able to see colors.

Image: 
University of Bristol

An international group of scientists led by researchers from the University of Bristol has advanced our understanding of how ancient animals saw the world by combining the study of fossils and genetics.

Ancestors of insects and crustaceans that lived more than 500 million years ago in the Cambrian period were some of the earliest active predators, but not much is known about how their eyes were adapted for hunting.

Work published in the Proceedings of the Royal Society B today suggests that when fossil and genetic data are assessed in tandem, previously inaccessible and exciting conclusions about long dead species can be made.

By examining the morphological characteristics of fossil eyes alongside genetic clues about visual pigments, a cross-disciplinary team led by the University of Bristol's Davide Pisani, Professor of Phylogenomics in the School of Earth Sciences, and Nicholas Roberts, Professor of Sensory Ecology in the School of Biological Sciences, found that ancient predators with more complex eyes are likely to have seen in colour.

Professor Pisani remarked: "Being able to combine fossil and genetic data in this way is a really exciting frontier of modern palaeontological and biological research. Vision is key to many animals' behaviour and ecology, and understanding how extinct animals perceived their environment will help enormously to clarify how they evolved."

By calculating the time of emergence of different visual pigments, and then comparing them to the inferred age of origin of key fossil lineages, the researchers were able to work out the number of pigments likely to have been possessed by different fossil species. They found that fossil animals with more complex eyes appeared to have more visual pigments, and that the great predators of the Cambrian period may have been able to see in colour.

Dr James Fleming, a former PhD student of Professors Pisani and Roberts, explained: "Animal genomes, and therefore opsin genes (constituting the basis of different visual pigments), evolve by processes of gene duplication. The opsin and the pigment that existed before the duplication are like a parent, and the two new opsins (and pigments) that emerge from the duplication process are like children on a family tree.

"We calculated the birth dates of these children, and this allowed us to understand what the ancient world must have looked like to the animals that occupied it. We found that while some of the fossils we considered had only one pigment and were monochromats, i.e. they saw the world as if looking at a black-and-white TV, forms with more complex eyes, like the iconic trilobites, had many pigments and most likely saw their world in colours."

The combination of complex eyes and multiple kinds of visual pigments is what allows animals to distinguish between different objects based on colour alone: what we know as colour vision.

Professor Roberts commented: "It is remarkable to see how in only a very few million years the view those animals had of their world changed from greys to the colourful world we see today."

The project involved scientists from all across the world - from the UK as well as Denmark, Italy, Korea and Japan, where Dr Fleming has now moved to work as a postdoctoral researcher. Each of them brought their own specialities to this multidisciplinary work, providing expertise in genetics, vision, taxonomy and palaeontology.

Credit: 
University of Bristol

Researchers classify Alzheimer's patients in 6 subgroups

image: Genetic data supported the contention that a particular way of sorting people resulted in biologically coherent subgroups (pictured lower right).

Image: 
University of Washington School of Medicine

Researchers have created an approach to classify patients with Alzheimer's disease into subgroups, a finding that may open the door to personalized treatments.

"Alzheimer's, like breast cancer, is not one disease," said lead author Shubhabrata Mukherjee, research assistant professor in general internal medicine at the University of Washington School of Medicine. "I think a good drug might fail in a clinical trial because not all the subjects have the same kind of Alzheimer's."

This study, published in a recent issue of Molecular Psychiatry, involves 19 researchers from several institutions, including Boston University School of Medicine, the VA Puget Sound Health Care System and Indiana University School of Medicine.

The researchers put 4,050 people with late-onset Alzheimer's disease into six groups based on their cognitive functioning at the time of diagnosis and then used genetic data to find biological differences across these groups.

"The implications are exciting," said corresponding author Paul Crane, professor of general internal medicine at the University of Washington School of Medicine. "We have found substantial biological differences among cognitively defined subgroups of Alzheimer's patients."

Identification of cognitive subgroups related to genetic differences is an important step toward developing a precision medicine approach for Alzheimer's disease.

The participants received cognitive scores in four domains: memory, executive functioning, language, and visuospatial functioning.

The largest group (39%) had scores in all four domains that were fairly close to each other. The next largest group (27%) had memory scores substantially lower than their other scores. Smaller groups had language scores substantially lower than their other scores (13%), visuospatial functioning scores substantially lower than their other scores (12%), and executive functioning scores substantially lower than their other scores (3%). There were 6% who had two domains that were substantially lower than their other scores.
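The grouping rule described above (flagging whichever domain scores substantially below the others) can be sketched in code. This is an illustrative reconstruction only, not the study's actual algorithm; the threshold value and function name are invented for demonstration:

```python
# Illustrative sketch (NOT the study's method) of assigning a patient to a
# cognitively defined subgroup: a domain is flagged when its standardized
# score falls "substantially" below the mean of the other three domains,
# using an arbitrary threshold chosen for demonstration.

DOMAINS = ["memory", "executive", "language", "visuospatial"]
THRESHOLD = 1.0  # hypothetical gap, in standardized score units

def assign_subgroup(scores):
    """Return a subgroup label for one patient's standardized domain scores."""
    low = []
    for d in DOMAINS:
        others = [scores[x] for x in DOMAINS if x != d]
        if sum(others) / len(others) - scores[d] >= THRESHOLD:
            low.append(d)
    if not low:
        return "no domain substantially lower"  # the largest group above
    if len(low) == 1:
        return f"{low[0]} substantially lower"
    return "multiple domains substantially lower"

patient = {"memory": -1.5, "executive": 0.2, "language": 0.1, "visuospatial": 0.0}
print(assign_subgroup(patient))  # memory substantially lower
```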

The participants came from five studies, and it took more than two years to standardize the neuropsychological test scores across all the studies in order to detect meaningful patterns. The mean age was 80; 92 percent self-reported white race, and 61 percent were female.

The investigators used genome-wide genetic data to find out if the subgroups are biologically distinct.

Investigators found 33 single nucleotide polymorphisms (SNPs) - specific locations throughout the genome - where the genetic association was very strong for one of the subgroups. These genetic relationships were stronger than the strongest effects found by an earlier and much larger international consortium study where Alzheimer's disease was treated as a single homogeneous condition.

Several years ago, the International Genomics of Alzheimer's Project Consortium published the largest genome-wide association study of Alzheimer's disease and found about 20 SNPs associated with Alzheimer's disease risk.

This study found 33 additional SNPs with even stronger relationships with a single subgroup.

The study also found a particularly strong relationship between a particular variant of the APOE gene and risk for the memory subgroup. The APOE e4 allele is a very strong risk factor for developing Alzheimer's disease for people with European ancestry, and it also appears to influence which cognitive subtype of Alzheimer's a person is likely to develop.

People can currently find out if they have an APOE e4 allele with direct-to-consumer testing; however, the researchers note that many people with an APOE e4 allele never develop Alzheimer's disease, and many who don't carry any known genetic risk factor nevertheless end up with the condition.

While world leaders want to find a cure for Alzheimer's by 2025, so far no one has been able to develop an effective treatment let alone a cure. But this study suggests that thinking of Alzheimer's disease as six distinct conditions may provide a way forward.

"This study is not the end, it's a start," said Mukherjee.

Credit: 
University of Washington School of Medicine/UW Medicine

Machine learning helps predict worldwide plant-conservation priorities

image: The map shows predicted levels of risk to more than 150,000 plant species. Using vast amounts of open-access data, researchers were able to identify high-risk plants worldwide. Warmer colors denote areas with larger numbers of potentially at-risk species, while cooler colors denote areas with low overall predicted risk.

Image: 
Anahí Espíndola and Tara Pelletier.

COLUMBUS, Ohio - There are many organizations monitoring endangered species such as elephants and tigers, but what about the millions of other species on the planet -- ones that most people have never heard of or don't think about? How do scientists assess the threat level of, say, the plicate rocksnail, Caribbean spiny lobster or Torrey pine tree?

A new approach co-developed at The Ohio State University uses data analytics and machine learning to predict the conservation status of more than 150,000 plants worldwide. Results suggest that more than 15,000 species likely qualify as near-threatened, vulnerable, endangered or critically endangered.

The approach will allow conservationists and researchers to identify the species most at risk, and also to pinpoint the geographic areas where those species are highly concentrated.

The study appears online today (Dec. 3, 2018) in the journal Proceedings of the National Academy of Sciences.

"Plants form the basic habitat that all species rely on, so it made sense to start with plants," said Bryan Carstens, a professor of evolution, ecology and organismal biology at Ohio State.

"A lot of times in conservation, people focus on big, charismatic animals, but it's actually habitat that matters. We can protect all the lions, tigers and elephants we want, but they have to have a place to live in."

Currently, the International Union for the Conservation of Nature -- which produces the world's most comprehensive inventory of threatened species (the "Red List") -- more or less works on a species-by-species basis, requiring more resources and specialized work than is available to accurately assign a conservation-risk category to every species.

Of the nearly 100,000 species currently on the Red List, plants are among the least represented, with only 5 percent of all currently known species accounted for.

The new approach co-developed by Carstens and lead author Tara Pelletier, a former Ohio State graduate student who is now an assistant professor of biology at Radford University, aims to expand the number of plant species included.

The research team built their predictive model using open-access data from the Global Biodiversity Information Facility and TRY Plant Trait Database. Their algorithm compared data from those sources with the Red List to find risk patterns in habitat features, weather patterns, physical characteristics and other criteria likely to put species in danger of extinction.
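The workflow just described, training on species whose Red List status is known and then predicting the status of unassessed species from the same kinds of features, can be sketched as follows. The feature values and the simple nearest-neighbour rule below are stand-ins chosen for illustration, not the authors' actual model or data:

```python
# Minimal sketch of the predict-from-labelled-species workflow (assumed,
# not the authors' code). A 1-nearest-neighbour rule stands in for the
# study's machine-learning model; features and labels are invented.

import math

# (range_size, climate_variability) -> Red List label, invented values
training = [
    ((0.9, 0.2), "least concern"),
    ((0.8, 0.3), "least concern"),
    ((0.1, 0.8), "at risk"),
    ((0.2, 0.9), "at risk"),
]

def predict(features):
    """Label an unassessed species by its nearest labelled neighbour."""
    _, label = min((math.dist(features, f), lab) for f, lab in training)
    return label

print(predict((0.15, 0.85)))  # at risk
```

The real study's strength lies in scale: the same train-then-predict idea applied to open-access occurrence and trait data for more than 150,000 species.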

A map of the data shows that at-risk plant species tend to cluster in regions with high native biodiversity, such as southwestern Australia, Central American rainforests and the southeastern coast of the U.S., where more species compete for resources.

"What this allowed us to do is basically make a prediction about what sorts of conservation risks are faced by species that people haven't done these detailed assessments on," Carstens said.

"This isn't a substitute for more-detailed assessments, but it's a first pass that might help identify species that should be prioritized and where people should focus their attention."

Carstens said the biggest challenge was collecting data on such a large scale, noting it took several months of quality-control checking to ensure the team was working with reliable figures.

The new technique was created to be repeatable by other scientists, whether on a global scale like this study or for a single genus or ecosystem.

Credit: 
Ohio State University

New review highlights importance of good sleep routines for children

image: Sleep guidelines for children.

Image: 
University of British Columbia

Sleep hygiene, which includes practices like providing a cool and quiet sleeping environment or reading before bed time to help kids unwind, is increasingly popular among parents looking to ensure their children get a good night's rest. But are these practices all they're cracked up to be? University of British Columbia sleep expert and nursing professor Wendy Hall recently led a review of the latest studies to find out.

"Good sleep hygiene gives children the best chances of getting adequate, healthy sleep every day. And healthy sleep is critical in promoting children's growth and development," said Hall. "Research tells us that kids who don't get enough sleep on a consistent basis are more likely to have problems at school and develop more slowly than their peers who are getting enough sleep."

The American Academy of Sleep Medicine recommends the following amounts of sleep, based on age group:

4 to 12 months - 12 to 16 hours
1 to 2 years - 11 to 14 hours
3 to 5 years - 10 to 13 hours
6 to 12 years - 9 to 12 hours
13 to 18 years - 8 to 10 hours

The UBC review aimed to systematically analyze the evidence for sleep hygiene across different countries and cultures, homing in on 44 studies from 16 countries. The focus was on four age groups in particular: infants and toddlers (four months to two years), preschoolers (three to five years), school-age children (six to 12 years) and adolescents (13 to 18 years). These studies involved close to 300,000 kids in North America, Europe and Asia.

"We found good-to-strong endorsement of certain sleep hygiene practices for younger kids and school-age kids: regular bedtimes, reading before bed, having a quiet bedroom, and self-soothing--where you give them opportunities to go to sleep and go back to sleep on their own, if they wake up in the middle of the night," said Hall.

Even for older kids, keeping a regular bedtime was important. The review found papers that showed that adolescents whose parents set strict guidelines about their sleep slept better than kids whose parents didn't set any guidelines.

Hall and co-author Elizabeth Nethery, a nursing PhD student at UBC, also found extensive evidence for limiting technology use just before bedtime, or during the night when kids are supposed to be sleeping. Studies in Japan, New Zealand and the United States showed that the more exposure kids had to electronic media around bedtime, the less sleep they had.

"One big problem with school-age children is it can take them a long time to get to sleep, so avoiding activities like playing video games or watching exciting movies before bedtime was important," said Hall.

Many of the studies also highlighted the importance of routines in general. A study in New Zealand showed family dinner time was critical to helping adolescents sleep.

Information provided by Chinese studies and one Korean study linked school-age children's and adolescents' short sleep duration to long commute times between home and school and large amounts of evening homework. With more children coping with longer commutes and growing amounts of school work, Hall says this is an important area for future study in North America.

Surprisingly, there wasn't a lot of evidence linking caffeine use before bedtime to poor sleep; it appeared to be the total intake during the day that matters.

While Hall said more studies are needed to examine the effect of certain sleep hygiene practices on sleep quality, she would still strongly recommend that parents set bedtimes, even for older kids, sit down for family dinners, establish rituals like reading before bed, and limit screen time as much as possible.

"Sleep education can form part of school programming," added Hall. "There was a project in a Montreal school where everyone was involved in designing and implementing a sleep intervention--the principal, teachers, parents, kids, and even the Parent Advisory Council. The intervention was effective, because everyone was on board and involved from the outset."

Credit: 
University of British Columbia

Snowpack declines may stunt tree growth and forests' ability to store carbon emissions

image: Researchers removed snowpack from various plots, such as the one seen here, to assess the potential environmental impact of reduced snowpack on northeast hardwood forests over time.

Image: 
Pamela Templer

NEW YORK, December 1, 2018 - Researchers conducting a 5-year-long study examining snow cover in a northern hardwood forest region found that projected changes in climate could lead to a 95 percent reduction of deep-insulating snowpack in forest areas across the northeastern United States by the end of the 21st century. The loss of snowpack would likely result in a steep reduction of forests' ability to store climate-changing carbon dioxide and filter pollutants from the air and water.

The new findings, out today in Global Change Biology, highlight a growing understanding of the broad impact of climate change across seasons on forest ecosystems, according to scientists who leveraged six decades of data showing declining winter snowpack at the Hubbard Brook Experimental Forest. The 7,800-acre research forest in New Hampshire is heavily populated by sugar maple and yellow birch trees, and has been used for over 60 years to study changes in northern hardwood forests--an ecosystem covering over 54 million acres and stretching from Minnesota to southeastern Canada.

"We know global warming is causing the winter snowpack to develop later and melt earlier," said the paper's first author Andrew Reinmann, an assistant professor and researcher with the Environmental Science Initiative at the Advanced Science Research Center (ASRC) at The Graduate Center, CUNY, and with Hunter College's Department of Geography. "Our study advances our understanding of the long-term effects of this trend on northern hardwood forests--which are critical to North America's environmental health and several industries. The experiments we conducted suggest snowpack declines result in more severe soil freezing that damages and kills tree roots, increases losses of nutrients from the forest and significantly reduces growth of the iconic sugar maple trees."

The researchers' 5-year-long experiment consisted of removing snowpack from designated plots during the first 4-6 weeks of winter each year between 2008 and 2012, and then comparing the resulting condition of the soil and trees (all sugar maples) in those plots to the soil and trees in adjacent plots with natural snowpack. Their analysis found that soil frost depth reached over 30 centimeters in areas where snow cover had been removed compared to roughly 5 centimeters at control plots. The severe frost caused damage to tree roots that triggered a cascade of responses, including reduced nutrient uptake by trees, shorter branch growth, loss of nitrogen from soils into nearby waterways, and decreases in soil insect diversity and abundance. Scientists collected sample cores from sugar maple trees on their research plots and measured the width of the cores' rings to reconstruct growth rates. They found that growth declined by more than 40 percent in response to snow removal and increased soil freezing. The trees also were unable to rebound even after snowpack removal ceased.

"These experiments demonstrate the significant impact that changes in winter climate have on a variety of environmental factors, including forest growth, carbon sequestration, soil nutrients and air and water quality," Reinmann said. "Left unabated, these changes in climate could have a detrimental impact on the forests of the region and the livelihoods of the people who rely on them for recreation and industries such as tourism, skiing, snowmobiling, timber and maple syrup production."

Credit: 
Advanced Science Research Center, GC/CUNY

Experts present new recommendations on 'overlapping' type of leukemia

November 30, 2018 - Chronic myelomonocytic leukemia (CMML) is a rare disease with overlapping features of two categories of bone marrow and blood cell disorders that poses challenges in clinical management. Joint recommendations on diagnosis and treatment of CMML from two European specialty societies were published today in HemaSphere, the official journal of the European Hematology Association (EHA). The journal is published in the Lippincott portfolio by Wolters Kluwer.

The new document from the EHA and the European LeukemiaNet provides recommendations for standardized diagnosis, prognosis, and appropriate choice of treatment for adult patients with CMML. These recommendations have been established by a panel of European and US experts chaired by Prof. Pierre Fenaux, MD, PhD, of Hôpital Saint-Louis, Paris.

Expert Guidance on New Advances to Improve Treatment Outcomes in CMML

Chronic myelomonocytic leukemia is an unusual and highly variable condition representing an "overlap" between two different types of bone marrow and blood cancers: myelodysplastic syndromes (MDS - sometimes called "pre-leukemia") and myeloproliferative neoplasms (MPN - including several types of leukemias).

Occurring in about 1 in 100,000 people each year, mainly older adults, CMML can appear in different ways in different patients. Even for specialists, it can be a difficult condition to manage in terms of making the correct diagnosis, predicting the likely outcome (prognosis), and choosing the best treatment.

Until recently, CMML patients were grouped together with MDS patients. To date, there has been only one randomized trial focusing specifically on treatment for CMML. "Despite an increasing knowledge on the molecular and cellular features of CMML, the clinical management of these overlap MDS/MPN syndromes remains poorly codified," Prof. Fenaux and coauthors write.

To address this knowledge gap, the EHA and European LeukemiaNet assembled a panel of international experts, who were tasked with developing an initial set of clinical practice recommendations for CMML. The new article includes the panel's consensus recommendations, organized into three areas:

Diagnosis. Recommendations for the blood and bone marrow tests used to diagnose CMML are presented; the hallmark finding is an elevated number of monocytes - a type of white blood cell. Advanced tests, including flow cytometry and molecular genetic tests, play key roles in narrowing down the diagnosis of CMML.

Prognosis. Outcome prediction is a critical step, as expected survival varies widely among patients with CMML. Although several predictive models have been developed, the best approach to determining prognosis in individual patients remains unclear. Both patient- and disease-related factors have important implications for predicting outcomes and monitoring the response to treatment.

Treatment. Several factors affect the choice of treatment for CMML. For some patients with favorable prognostic factors, "watchful waiting" (observation without treatment) may be appropriate. Available treatment options may lead to longer survival, but currently can't cure CMML. Stem cell transplantation remains the only curative therapy for CMML. However, this isn't an option for every patient and survival rates are relatively low. The experts emphasize the need for new transplantation strategies, including approaches to prevent relapse after transplantation.

The new recommendations are an important step forward in developing standardized approaches to clinical management of CMML. However, reflecting the limitations of the available research data, they are based mainly on expert consensus rather than on highest-quality evidence.

The expert panel urges steps to improve the quality of evidence supporting clinical recommendations for CMML. "Inclusion of CMML patients in clinical trials is strongly encouraged at all stages of the disease," the expert panel concludes. They also note that recently established international collaborative networks will help "to clarify the management strategy of CMML in the coming years."

Credit: 
Wolters Kluwer Health

Greenhouse gas 'detergent' hydroxyl (OH) radical recycles itself in atmosphere

image: Model output of OH primary production over a 24-hour period in July tracks with sunlight across the globe. Higher levels of OH over populated land are likely from OH recycling in the presence of NO and NO2, which are common pollutants from cars and industry.

Image: 
Credits: NASA / Julie Nicely

A simple molecule in the atmosphere that acts as a "detergent" to break down methane and other greenhouse gases has been found to recycle itself to maintain a steady global presence in the face of rising emissions, according to new NASA research. Understanding its role in the atmosphere is critical for determining the lifetime of methane, a powerful contributor to climate change.

The hydroxyl (OH) radical, a molecule made up of one hydrogen atom and one oxygen atom with a free (or unpaired) electron, is one of the most reactive gases in the atmosphere and regularly breaks down other gases, effectively ending their lifetimes. In this way OH is the main check on the concentration of methane, a potent greenhouse gas that is second only to carbon dioxide in contributing to increasing global temperatures.

As methane emissions rise, scientists historically expected the extra methane to use up hydroxyl radicals on the global scale and, as a result, extend methane's lifetime, currently estimated at nine years. In addition to looking globally at primary sources of OH and the amount of methane and other gases it breaks down, however, this new research accounts for secondary sources of OH - recycling that happens when OH reforms in the presence of other gases after breaking down methane - which had previously been observed only on regional scales.

"OH concentrations are pretty stable over time," said atmospheric chemist and lead author Julie Nicely at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "When OH reacts with methane it doesn't necessarily go away in the presence of other gases, especially nitrogen oxides (NO and NO2). The break down products of its reaction with methane react with NO or NO2 to reform OH. So OH can recycle back into the atmosphere."

Nitrogen oxides are one set of several gases that contribute to recycling OH back into the atmosphere, according to Nicely's research, published in the Journal of Geophysical Research: Atmospheres. She and her colleagues used a computer model informed by satellite observations of various gases from 1980 to 2015 to simulate the possible sources for OH in the atmosphere. These include reactions with the aforementioned nitrogen oxides, water vapor and ozone. They also tested an unusual potential source of new OH: the enlargement of the tropical regions on Earth.

OH in the atmosphere also forms when ultraviolet sunlight reaches the lower atmosphere and reacts with water vapor (H2O) and ozone (O3) to form two OH molecules. Over the tropics, water vapor and ultraviolet sunlight are plentiful. The tropics, which span the region of Earth to either side of the equator, have shown some evidence of widening farther north and south of their current range, possibly due to rising temperatures affecting air circulation patterns. This means that the tropical region primed for creating OH will potentially increase over time, leading to a higher amount of OH in the atmosphere. This tropical widening process is slow, however, expanding only 0.5 to 1 degree in latitude every 10 years. But the small effect may still be important, according to Nicely.

She and her team found that, individually, the tropical widening effect and OH recycling through reactions with other gases each comprise a relatively small source of OH, but together they essentially replace the OH used up in the breaking down of methane.

"The absence of a trend in global OH is surprising," said atmospheric chemist Tom Hanisco at Goddard who was not involved in the research. "Most models predict a 'feedback effect' between OH and methane. In the reaction of OH with methane, OH is also removed. The increase in NO2 and other sources of OH, such as ozone, cancel out this expected effect." But since this study looks at the past thirty-five years, it's not guaranteed that as the atmosphere continues to evolve with global climate change that OH levels will continue to recycle in the same way into the future, he said.

Ultimately, Nicely views the results as a way to fine-tune and update the assumptions that are made by researchers and climate modelers who describe and predict how OH and methane interact throughout the atmosphere. "This could add clarification on the question of: Will methane concentrations continue rising in the future? Or will they level off, or perhaps even decrease? This is a major question regarding future climate that we really don't know the answer to," she said.

Credit: 
NASA/Goddard Space Flight Center

US image abroad: It's the message not the messenger

Today's political climate in the U.S. is often peppered with animosity from the U.S. president towards other countries, but how has the U.S. image fared? A Dartmouth study finds that the U.S. image abroad appears to be influenced more by policy content than by the person delivering the message, even when that person is the U.S. president. The results are published in Political Behavior.

Conducted in Japan, the study gauges how U.S. policy messages impact foreign public opinion and is among the first to analyze the effects of a message's various elements.

"Our study reveals that Japanese public opinion of the U.S. depends largely on whether a policy message is cooperative or uncooperative in nature, rather than on who makes that statement," says co-author Yusaku Horiuchi, professor of government and the Mitsui Professor of Japanese Studies at Dartmouth. "Only when the message is uncooperative and attributed to U.S. President Donald Trump does it have a significant effect on changing attitudes, illustrating how Trump's influence on foreign public opinion appears to be conditional on policy content," adds Horiuchi.

To evaluate foreign public opinion of the U.S., the researchers administered a randomized survey experiment in April - May 2017 to more than 3,000 Japanese citizens of voting age. The messages presented varied by source cue, policy content and issue salience. Policy statements were attributed to either "U.S. President Donald Trump" or "an anonymous U.S. Congressman." The policy content presented was either cooperative or uncooperative in tone and drew on issues facing U.S. - Japan relations, which varied by importance: security (highly salient) and educational/cultural exchange programs (low salience).

For example, respondents were presented with varying statements regarding U.S. defense spending for the protection of Japan: one message attributed to an anonymous U.S. Congressman underscored defense cooperation while the other attributed to President Trump reflected the position that the U.S. should not get involved in Japan's defense policy.
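The design described above crosses three binary factors into eight experimental conditions. A minimal sketch of that factorial structure (the condition labels here are paraphrases for illustration, not the study's exact wording):

```python
import itertools
import random

# Three binary factors from the survey experiment
sources = ["President Trump", "anonymous U.S. Congressman"]            # source cue
tones = ["cooperative", "uncooperative"]                               # policy content
issues = ["security (high salience)", "exchange programs (low salience)"]  # issue salience

# Crossing the factors yields the 2 x 2 x 2 = 8 experimental conditions
conditions = list(itertools.product(sources, tones, issues))

# Hypothetical random assignment of respondents to conditions
rng = random.Random(0)  # fixed seed so the sketch is reproducible
assignments = {respondent: rng.choice(conditions) for respondent in range(3000)}
```

Comparing average favorability across these randomly assigned cells is what lets the researchers separate the effect of the message's content from that of its messenger.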

The policy messages used in the study were mostly based on statements made either by Trump during his presidential campaign or by his administration during budget proposals. Following each policy message, respondents were asked to indicate if they had a favorable or unfavorable view of the U.S., Americans, U.S. foreign policy or Donald Trump.

The researchers analyzed the survey data to examine how the source cue, policy content and issue salience impact foreign public opinion as a whole in addition to analyzing the interactions between these factors. Regardless of the types of respondents (e.g., with high vs. low education, with high vs. low interest in politics), the effect of the policy content outweighs the source cue effect.

"If our case of Japan is any indication, Trump's damaging effect on the U.S. international image might not be as irreparable as many in and outside the U.S. believe it to be," explained the co-authors.

Credit: 
Dartmouth College

Study offers new approach to assess sustainability of reef fish

image: Red grouper, one of the five species in the study with estimated sustainability risks greater than 95 percent.

Image: 
Jiangang Luo, UM Rosenstiel School of Marine and Atmospheric Science

MIAMI--A new study helping to improve how sustainability is measured for popular reef fish could help better assess the eco-friendly seafood options at the dinner table.

A team of researchers at the University of Miami (UM) Rosenstiel School of Marine and Atmospheric Science and NOAA Fisheries tested their newly developed fishery risk assessment method on groupers and snappers in the Florida Keys to determine if these tropical reef fish are being managed sustainably.

The new approach developed by UM Rosenstiel School Professor Jerald Ault and colleagues uses fish size-structured abundance data to evaluate fisheries sustainability status, instead of the traditional "catch-per-unit effort" method that requires long-term information collected by fishers to assess the health of a fishery.

The researchers then applied the length-based assessment to six key species in the Florida Keys region--black grouper, red grouper, coney, mutton snapper, yellowtail snapper and hog snapper--to evaluate the sustainability status of the fisheries.

They found that only one species--coney--was within the sustainable range, with a risk score below 35 percent. The other five species had estimated sustainability risks of greater than 95 percent.

While the focus of this study was to develop a general length-based risk analysis methodology appropriate for data-limited fisheries worldwide, say the researchers, the results of the sustainability risk assessment for the species evaluated were in line with previous analyses for reef fishes in the Florida Keys and surrounding regions.

"The ecological and economic importance of tropical reef fish makes their sustainability a key conservation concern," said Ault, the lead author of the study. "The next challenge will be to evaluate the sustainability status of the over 50 principal exploited species in the Florida reef-fish community."

The new risk analysis framework can evaluate the sustainability status of tropical reef fish stocks when traditional catch data are not reliable or available and provide a frame of reference to help balance sustainability risks with management decisions.

In a separate but related study, Ault and colleagues developed a new fisheries-independent ecosystem monitoring survey to estimate biomass of deepwater snappers in the Hawaiian Islands. This new survey provides critical data for the risk analysis framework to assess fisheries sustainability in Hawaii.

"Our results help to improve the science and decision-making capacity for fisheries managers, and promotes the sustainability of coral reef fisheries subject to fishing and environmental changes," said Ault. "These combined methods will greatly improve the capacity and efficacy of fishery management for shallow and deep water coral reef fisheries in Florida, the U.S. Caribbean, and the U.S. Pacific Islands."

The new methods developed are designed to ensure the quality of commercial and recreational fishing today and into the future, said Ault.

Credit: 
University of Miami Rosenstiel School of Marine, Atmospheric, and Earth Science

The Wizard of Oz most 'influential' film of all time according to network science

The Wizard of Oz, followed by Star Wars and Psycho, is identified as the most influential film of all time in a study published in the open access journal Applied Network Science.

Researchers at the University of Turin, Italy, calculated an influence score for 47,000 films listed in IMDb (the internet movie database). The score was based on how much each film had been referenced by subsequent films. The authors found that the top 20 most influential films were all produced before 1980 and mostly in the United States.

Dr. Livio Bioglio, the lead author, said: "We propose an alternative method to box office takings - which are affected by factors beyond the quality of the film such as advertising and distribution - and reviews - which are ultimately subjective - for analysing the success of a film. We have developed an algorithm that uses references between movies as a measure for success, and which can also be used to evaluate the career of directors, actors and actresses, by considering their participation in top-scoring movies."

Applying the algorithm to directors, the five men credited for The Wizard of Oz are all in the top eight, with Alfred Hitchcock, Steven Spielberg and Stanley Kubrick ranked third, fifth and sixth respectively. When the authors used another approach to remove the bias of older movies - which, because they were produced earlier, can potentially influence a greater number of subsequent films - Alfred Hitchcock, Steven Spielberg and Brian De Palma occupied the top spots instead.

When applied to actors, the algorithm ranked Samuel L. Jackson, Clint Eastwood and Tom Cruise as the top three. The authors noticed a strong gender bias towards male actors; the only female in the top ten was Lois Maxwell, who played the recurring role of Miss Moneypenny in the James Bond franchise.

Dr. Bioglio said: "The scores of top-ranked actresses tend to be lower compared to their male colleagues. The only exceptions were musical movies, where results show moderate gender equality, and movies produced in Sweden, where actresses ranked better compared to actors."

To calculate the influence score for the 47,000 films investigated in this study, the authors treated the films as nodes in a network and measured the number of connections each film has to other films and how influential the films connected to it are. Similar network science methods have already been widely applied to measuring the impact of work in other fields, such as scientific publications.
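The scoring idea resembles PageRank-style centrality: a film's influence grows with the number of later films that reference it and with the influence of those films. A minimal sketch of that kind of computation (an illustration of the general approach, not the authors' exact algorithm; the films and references below are toy examples):

```python
def influence_scores(references, damping=0.85, iters=50):
    """PageRank-style influence over a film reference network.

    references: dict mapping a film to the earlier films it cites.
    Influence flows from a citing film to the films it references.
    """
    # Collect every film that appears as a citer or a citee
    films = set(references)
    for cited in references.values():
        films.update(cited)
    n = len(films)
    score = {f: 1.0 / n for f in films}
    for _ in range(iters):
        new = {f: (1.0 - damping) / n for f in films}
        for film in films:
            cited = references.get(film, [])
            if cited:
                # A film passes its influence to the films it references
                share = damping * score[film] / len(cited)
                for c in cited:
                    new[c] += share
            else:
                # A film that cites nothing spreads its weight evenly
                for f in films:
                    new[f] += damping * score[film] / n
        score = new
    return score

# Toy reference network (illustrative, not IMDb data)
refs = {
    "Star Wars": ["The Wizard of Oz", "Metropolis"],
    "Spaceballs": ["Star Wars", "The Wizard of Oz"],
}
scores = influence_scores(refs)
```

The scores sum to 1, so each film's score can be read as its share of the total influence in the network; in this toy example the film cited by the most (and most influential) later films ranks highest.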

Dr. Bioglio said: "The idea of using network analysis for ranking films is not completely new, but to our knowledge this is the first study that uses these techniques to also rank personalities involved in film production."

The authors suggest that their method could be used for research in the arts and by film historians. However, they caution that the results can only be applied to Western cinema as the data on IMDb are strongly biased towards films produced in Western countries.

Credit: 
BMC (BioMed Central)

How the devil ray got its horns

image: A new study shows that the manta ray's distinctive hornlike cephalic lobes, rather than being separate appendages, have their origins as the foremost part of the animals' fins, modified for a new purpose.

Image: 
Photo taken by Jackie Reid, courtesy of the NOAA Image Library

If you ever find yourself staring down a manta ray, you'll probably notice two things right away: the massive, flapping fins that produce the shark cousin's 20-foot wingspan and the two fleshy growths curling out of its head that give it the nickname "devil ray." A new San Francisco State University study shows that these two very different features have the same origin -- a discovery that reflects an important lesson for understanding the diversity of life.

"Small tweaks in early development can contribute to larger differences in how animals' bodies are laid out," explained San Francisco State Professor of Biology Karen Crow.

For Crow and her graduate student John Swenson, now a Ph.D. student at the University of Massachusetts Amherst, the hornlike "cephalic lobes" of manta rays represented a curious problem. All types of fish have two sets of paired appendages, like fins. But somewhere in their evolutionary past, a group of rays appeared to acquire a third set. These cephalic lobes are used for feeding, allowing some species to grapple with shellfish while helping species like manta rays more efficiently hoover up tiny plankton as they flap their way through the open ocean. What wasn't clear was just where these fleshy face funnels came from.

To investigate, the researchers studied the embryos of cownose rays, the closest relatives of the massive mantas. They took samples of genetic material at different stages of the rays' growth to see which genes were active during fin development, akin to peeking at the growing ray's assembly instructions. The team examined hundreds of genes and paid special attention to several "Hox" genes, which contain instructions for growth and development of fins and limbs. It's a group of genes crucial to development in all animals, including humans.

The team's results showed that the ray's horns aren't a third set of appendages at all -- they're simply the foremost bit of fin, modified for a new purpose. They found that the same Hox genes that guide development of the rays' cephalic lobes also play the same role in the fins of a closely related ray species, the little skate, which doesn't have cephalic lobes.

In fact, the way the horns develop is surprisingly simple. All it takes is a tiny notch that deepens and widens as the manta grows, separating each fin into two distinct parts: one for feeding and the remainder for swimming. The team published their results in the journal Frontiers in Ecology and Evolution on Nov. 13.

The researchers say that the findings support a consensus that's emerging among scientists who study evolution: Strange, novel features in nature can often arise from tiny evolutionary tweaks. "Whatever genetic changes occurred, there were far fewer than what we expected," said Crow. A devil ray isn't so different from its hornless cousins. And that lesson applies on a broader scale, too, she explains.

"We share the same genetic toolkit with all the other animals -- and we share many of our genes with all living things," Crow said.

Credit: 
San Francisco State University

What happens when materials take tiny hits

image: This scanning electron micrograph shows the crater left by the impact of a 10-micrometer particle traveling at more than 1 kilometer per second. Impacts at that speed produce some melting and erosion of the surface, as revealed by this research.

Image: 
Courtesy of the researchers

When tiny particles strike a metal surface at high speed -- for example, as coatings being sprayed or as micrometeorites pummeling a space station -- the moment of impact happens so fast that the details of the process haven't been clearly understood, until now.

A team of researchers at MIT has just accomplished the first detailed high-speed imaging and analysis of the microparticle impact process, and used that data to predict when the particles will bounce away, stick, or knock material off the surface and weaken it. The new findings are described in a paper appearing today in the journal Nature Communications.

Mostafa Hassani-Gangaraj, an MIT postdoc and the paper's lead author, explains that high-speed microparticle impacts are used for many purposes in industry, for example, for applying coatings, cleaning surfaces, and cutting materials. They're applied in a kind of superpowered version of sandblasting that propels the particles at supersonic speeds. Such blasting with microparticles can also be used to strengthen metallic surfaces. But until now these processes have been controlled without a solid understanding of the underlying physics.

"There are many different phenomena that can take place" at the moment of impact, Hassani-Gangaraj says, but now for the first time the researchers have found that a brief period of melting upon impact plays a crucial role in eroding the surface when the particles are moving at speeds above a certain threshold.

That's important information because the rule of thumb in industrial applications is that higher velocities will always lead to better results. The new findings show that this is not always the case, and "we should be aware that there is this region at the high end" of the range of impact velocities, where the effectiveness of the coating (or strengthening) declines instead of improving, Hassani-Gangaraj says. "To avoid that, we need to be able to predict" the speed at which the effects change.

The results may also shed light on situations where the impacts are uncontrolled, such as when wind-borne particles hit the blades of wind turbines, when microparticles strike spacecraft and satellites, or when bits of rock and grit carried along in a flow of oil or gas erode the walls of pipelines. "We want to understand the mechanisms and exact conditions when these erosion processes can happen," Hassani-Gangaraj says.

The challenge of measuring the details of these impacts was twofold. First, the impact events take place extremely quickly, with particles travelling at upward of one kilometer per second (three or four times faster than passenger jet airplanes). And second, the particles themselves are so tiny, about a tenth of the thickness of a hair, that observing them requires very high magnification as well. The team used a microparticle impact testbed developed at MIT, which can record impact videos with frame rates of up to 100 million frames per second, to perform a series of experiments that have now clearly delineated the conditions that determine whether a particle will bounce off a surface, stick to it, or erode the surface by melting.

For their experiments, the team used tin particles of about 10 micrometers (ten millionths of a meter) in diameter, accelerated to speeds of up to 1 kilometer per second and hitting a tin surface. The particles were accelerated using a laser beam that instantly evaporates a substrate surface and ejects the particles in the process. A second laser beam was used to illuminate the flying particles as they struck the surface.

Previous studies had relied on post-mortem analysis -- studying the surface after the impact has taken place. But that did not allow for an understanding of the complex dynamics of the process. It was only the high-speed imaging that revealed that melting of both the particle and the surface took place at the moment of impact, in the high-speed cases.
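The plausibility of impact melting at these speeds can be checked with a back-of-envelope energy balance. The sketch below uses approximate handbook values for tin and is a rough estimate, not the authors' analysis:

```python
# Back-of-envelope check: can a ~1 km/s impact melt a tin particle?
# Approximate handbook values for tin:
specific_heat = 228.0          # J/(kg*K)
latent_heat_fusion = 59_000.0  # J/kg
melting_point = 232.0          # deg C
ambient = 20.0                 # deg C

velocity = 1000.0              # m/s, the upper end of the experiments

# Kinetic energy per unit mass is independent of particle size
ke_per_kg = 0.5 * velocity**2  # = 500,000 J/kg

# Energy per kg needed to heat tin to its melting point and then melt it
heat_needed = specific_heat * (melting_point - ambient) + latent_heat_fusion

print(f"available: {ke_per_kg:.0f} J/kg, needed to melt: {heat_needed:.0f} J/kg")
```

The available kinetic energy exceeds the melting requirement several times over, so even if only a fraction of the impact energy converts to heat, localized melting is energetically plausible -- consistent with what the high-speed imaging revealed.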

The team used the data from these experiments to develop a general model to predict the response of particles of a given size travelling at a given speed, says David Veysset, a staff researcher at MIT and co-author of the paper. So far, he says, they have used pure metals, but the team plans further tests using alloys and other materials. They also intend to test impacts at a variety of angles other than the straight-down impacts tested so far. "We can extend this to every situation where erosion is important," he says. The aim is to develop "one function that can tell us whether erosion will happen or not."

That could help engineers "to design materials for erosion protection, whether it's in space or on the ground, wherever they want to resist erosion," Veysset says.

The team included senior author Christopher Schuh, professor and head of the Department of Materials Science and Engineering, and Keith Nelson, professor of chemistry. The work was supported by the U.S. Department of Energy, the U.S. Army Research Office, and the Office of Naval Research.

Credit: 
Massachusetts Institute of Technology

Ending the HIV epidemic: Where does Europe stand?

image: This graph shows progress towards the 90-90-90 targets, overall and by WHO subregion, in 2018, and a comparison between 2018 and 2016.

Image: 
Eurosurveillance

From diagnosis of HIV to successful viral suppression: in a rapid communication published in Eurosurveillance today, ECDC and co-authors from Public Health England and The National AIDS Trust summarise the progress towards HIV elimination in 52 countries in Europe and Central Asia. The main issues: diagnosing those who are unaware of their HIV infection and treating them.

The global targets set out by UNAIDS for 2020 are to diagnose 90% of all HIV-positive people, provide antiretroviral therapy for 90% of those diagnosed, and achieve viral suppression for 90% of those treated (known as 90-90-90 targets). In 2018, 52 of 55 countries completed the survey indicating the progress towards these targets in Europe and Central Asia.

Between "substantial progress" and "concerning"

Is Europe on track to end AIDS by 2020? Following analysis of the data provided by the 52 countries in 2018, progress towards the 90-90-90 targets stands at 86%-91%-92% in the EU/EEA. This means that overall, countries in the EU/EEA are on track to reach the targets by 2020. Looking at the whole Region, however, a striking drop at the second stage of the continuum is apparent: across Europe and Central Asia, the figures show a significant gap in the number of people who are diagnosed with HIV but not receiving treatment: 80%-64%-86%. The article provides results on the targets for each of the reporting countries.

The authors acknowledge "substantial progress" towards the 90-90-90 targets across Europe and Central Asia. However, among the estimated 2.1 million people living with HIV in Europe and Central Asia "only two out of five are estimated to be virally suppressed in 2018". Furthermore, "the substantial drop-off between the percentages diagnosed and treated in the East sub-region is concerning since it enables preventable deaths, serious illness and onward transmission."

The results give new insights into necessary steps in the regional or national HIV responses. As almost two-thirds of the 1.2 million people across the region with transmissible virus are diagnosed but only half of those are on treatment, "the biggest public health impact could be achieved through rapid and sustained scale up of treatment", according to the authors. This is particularly true for countries in the east of the region where the outcome was 76%-46%-78%.
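The continuum figures multiply through: the share of all people living with HIV who end up virally suppressed is the product of the three stage percentages. A quick illustration using the figures quoted above:

```python
def suppressed_fraction(diagnosed, on_treatment, suppressed):
    """Cumulative share of all people living with HIV who are virally
    suppressed, given the three continuum percentages (0-100)."""
    return (diagnosed / 100) * (on_treatment / 100) * (suppressed / 100)

# Percentages reported in the article
eu_eea = suppressed_fraction(86, 91, 92)   # EU/EEA
region = suppressed_fraction(80, 64, 86)   # Europe and Central Asia overall
east   = suppressed_fraction(76, 46, 78)   # eastern sub-region

print(f"EU/EEA: {eu_eea:.0%}, whole region: {region:.0%}, East: {east:.0%}")
```

The whole-region product comes out to roughly 44 percent -- the "only two out of five virally suppressed" quoted by the authors -- while meeting all three 90% targets would yield about 73 percent.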

Policies that diversify and enhance the offer of HIV tests could help address the problem of late diagnosis across Europe, as outlined in the new ECDC guidance on HIV, hepatitis B and C testing. This would include testing for indicator conditions, during screenings for other sexually transmitted infections, in community-based settings, as self/home-testing and for partner notification.

The authors highlight that the 90-90-90 targets remain a "powerful tool to assess progress towards HIV elimination and drive standards in care" for people living with HIV. But these targets do not provide a comprehensive picture of the public health response to HIV. "Each 'last 10 percent' includes people especially marginalised from healthcare services."

What are the 90-90-90 targets?

The so-called continuum of HIV care is a framework which allows countries to monitor the effectiveness of key areas in the response to the HIV epidemic along several stages, from diagnosis towards viral suppression. The overall aim is that people living with HIV are diagnosed (early) and receive antiretroviral treatment (ART), which leads to viral suppression, i.e. the virus is no longer detectable in the blood. Such an undetectable viral load also means that HIV-positive people on effective treatment do not transmit the virus.

Based on the findings of the ECDC Dublin Declaration report on the continuum in 2015, ECDC now monitors a four-stage continuum that is directly relevant in the European region. Stage 1 looks at the estimated number of all people living with HIV (PLHIV); stage 2 at the number of all PLHIV who have been diagnosed; stage 3 at the number of PLHIV who have been diagnosed and are on ART; and stage 4 comprises the number of PLHIV on ART who are virally suppressed. In 2018, 34 of 55 countries provided data on all four continuum stages.

Credit: 
European Centre for Disease Prevention and Control (ECDC)