Model based on liquid biopsies may predict time to progression in colorectal cancer

Bottom Line: An evolutionary model utilizing serial blood samples from patients with advanced colorectal cancer treated with anti-EGFR therapies in a phase II trial could predict time to disease progression on a patient-by-patient basis.

Journal in Which the Study was Published: Cancer Discovery, a journal of the American Association for Cancer Research

Authors: Co-senior authors Andrea Sottoriva, PhD, MSc, Chris Rokos Fellow in Evolution and Cancer and team leader at The Institute of Cancer Research, London; and Nicola Valeri, MD, PhD, team leader in Gastrointestinal Cancer Biology and Genomics at The Institute of Cancer Research, London, and consultant medical oncologist at The Royal Marsden NHS Foundation Trust

Background: "By combining frequent longitudinal sampling of cell-free DNA with mathematical modeling of tumor evolution, we were able to make statistical predictions of patients who were at risk of progression," said Sottoriva. "We could also determine when a cancer was going to come back, on a patient-by-patient basis. This is the first time that quantitative forecasting of this sort has been successfully used in cancer."

While clinicians often use tumor biopsies for cancer genotyping, many tumors have intratumor heterogeneity which can drive treatment resistance; therefore, multiple biopsies in time and space are needed to better understand how tumors evolve to resist therapy, explained Valeri.

Liquid biopsies are non-invasive, allowing the collection of circulating tumor DNA at many time points without additional risk to the patient. Furthermore, the analysis of circulating tumor DNA may capture the intratumoral heterogeneity better than a small piece of the tumor, said Valeri.

While much research has focused on the clinical utility of cell-free DNA (cfDNA) for disease monitoring, the use of liquid biopsies as a predictive tool in estimating time to disease progression has not been thoroughly investigated, Sottoriva noted.

How the Study Was Conducted and Results: The researchers analyzed the results of the PROSPECT-C trial which evaluated biomarkers of both response and resistance to anti-EGFR therapies in patients with wild-type RAS metastatic colorectal cancer. Tumor biopsies were taken from patients at predefined time points of pre-treatment (baseline) and post-treatment (disease progression), and at partial response in some. Additionally, patients provided plasma samples every four weeks until disease progression.

Even though standard tumor genotyping categorized patients as having metastatic colorectal cancer with wild-type RAS, analysis of baseline cfDNA revealed that many of these patients' tumors had aberrations in RAS proteins, which may explain why they were resistant to cetuximab, an EGFR inhibitor, noted Valeri. Furthermore, ultra-deep sequencing of baseline tumor biopsy cores revealed RAS mutations, further highlighting the limitations of standard methods for tumor genotyping, he added.

Valeri and colleagues generated mathematical models which utilized cfDNA and carcinoembryonic antigen (CEA) levels from individual patients' plasma to predict time to progression. The results were validated using RECIST measurements from radiological imaging data.

The mathematical model utilizing CEA measurements was applied to six patients to predict time to clinical progression. Of these predictions, three were within 10 percent of progression time as measured by RECIST.

Notably, predictions generated with high sensitivity cfDNA profiling allowed for the prediction of progression time several weeks in advance, compared with models utilizing CEA measurements.

With the information garnered from the cfDNA, the researchers could generate multiple models based on the predicted growth of individual subclones driven by different mutations. The accuracy of the models utilizing cfDNA relies on the identification of the dominant subclones in patients with polyclonal resistance mechanisms, Valeri said.
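As an illustration of the general idea - not the authors' actual model - one can fit an exponential growth curve to serial biomarker measurements from liquid biopsies and extrapolate forward to a progression threshold. All numbers and the threshold below are invented for the sketch:

```python
import math

def fit_exponential(times, levels):
    """Least-squares fit of log(level) = log(a) + r*t (simple linear regression)."""
    logs = [math.log(v) for v in levels]
    n = len(times)
    mt = sum(times) / n
    ml = sum(logs) / n
    r = sum((t - mt) * (l - ml) for t, l in zip(times, logs)) / \
        sum((t - mt) ** 2 for t in times)
    a = math.exp(ml - r * mt)
    return a, r  # level(t) ~ a * exp(r * t)

def predict_progression(a, r, threshold):
    """Time at which the fitted curve crosses the progression threshold."""
    return math.log(threshold / a) / r

# Hypothetical cfDNA mutant-allele fractions (percent), sampled every 4 weeks,
# roughly doubling per interval as a resistant subclone expands
weeks = [0, 4, 8, 12]
maf = [0.5, 1.0, 2.1, 3.9]
a, r = fit_exponential(weeks, maf)
t_prog = predict_progression(a, r, threshold=15.0)  # hypothetical threshold
print(round(t_prog, 1))  # predicted week of progression
```

This single-clone sketch ignores the polyclonal dynamics the study emphasizes; the authors' models track multiple subclones driven by different mutations.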

Author's Comments: "Integration of novel monitoring technologies like cfDNA, in combination with mathematical modeling of tumor forecasting, may offer the opportunity to act early, stop therapy, or change treatment to stay one step ahead of the disease," said Valeri. "Our method allows for a more accurate prediction as well as improved monitoring of response to therapy."

Study Limitations: Limitations of the study include a small sample size, in addition to focusing on RAS pathway aberrations in the mathematical models, as other genetic and non-genetic determinants will likely cause resistance and disease progression, noted Sottoriva. This model will need to be prospectively validated in future trials, he added.

Credit: 
American Association for Cancer Research

The Lancet Public Health: Number of very elderly needing round-the-clock care set to double by 2035 in England

The number of adults aged 85 years and older needing round-the-clock care in England will almost double to 446,000 over the next 20 years, whilst the overall number of over-65s requiring 24-hour care will rise by more than a third to over 1 million in 2035, according to a new modelling study published in The Lancet Public Health.

The new estimates also sit in the context of growing independence, with the number of adults aged 65 years and older living independently (without care needs) set to rise to 8.9 million by 2035 in England--an increase of over 60% from 5.5 million in 2015--with the gains in independence seen mainly in men.

Nevertheless, the estimates predict an increase in the number of people living into old age with multiple long-term conditions, with the majority (80%) of older adults with dementia and in need of substantial care in 2035 likely to have two or more other diseases.

The study highlights the importance of ensuring that health and social care services adapt to the unprecedented needs of an increasing older population with complex care needs. The authors warn that relying on the informal carers who provide around £57 billion worth of care in the UK is not a sustainable solution. [1]

"The challenge is considerable", says Professor Carol Jagger from the Newcastle University Institute for Ageing, Newcastle, UK. "Our study suggests that older spouse carers are increasingly likely to be living with disabilities themselves, resulting in mutual care relationships that are not yet well recognised by existing care policy and practices. On top of that, extending the retirement age of the UK population is likely to further reduce the informal and unpaid carer pool, who have traditionally provided for older family members. These constraints will exacerbate pressures on already stretched social care budgets."[2]

Little research has been done on how levels of dependency might change for different generations of older people, and forecasts of future care needs remain imprecise because of the limitations of previous models, including limited information about key sociodemographic and lifestyle factors and chronic conditions that affect disability and dependency, and a failure to account for the joint effect of diseases and complex multi-morbidity--the number of older people with four or more diseases is projected to more than double in the next 20 years.

To improve the precision of social care need forecasts, researchers from Newcastle University and the London School of Economics and Political Science developed the Population Ageing and Care Simulation (PACSim) model that accounts for multiple risk factors for dependence and disability including a wide range of sociodemographic factors (eg, level of education) and health behaviours (eg, smoking status, physical activity), as well as 12 chronic diseases and geriatric conditions including coronary heart disease, stroke, hypertension, diabetes, arthritis, cancer, respiratory disease, cognitive impairment, and depression. Using longitudinal data from three large nationally representative studies of adults (aged 35 and older), the study modelled future trends in social care needs for the population aged 65 years and older in England between 2015 and 2035, according to varying levels of dependency. [3]

Adults were categorised as high dependency if they required 24-hour care; medium dependency if they needed help at regular times daily; low dependency if they required care less than daily and were generally looked after in the community; or independent (without care needs).
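The dependency categories above can be viewed as states in a microsimulation, with individuals moving between states year by year. A toy sketch of that structure follows; the transition probabilities are invented and are not PACSim's, which conditions transitions on many risk factors rather than a single fixed matrix:

```python
import random

# Dependency states as defined in the study
STATES = ["independent", "low", "medium", "high"]

# Hypothetical annual transition probabilities (rows sum to 1) - NOT PACSim's
TRANSITIONS = {
    "independent": [0.90, 0.07, 0.02, 0.01],
    "low":         [0.05, 0.80, 0.10, 0.05],
    "medium":      [0.00, 0.05, 0.80, 0.15],
    "high":        [0.00, 0.00, 0.05, 0.95],
}

def simulate_cohort(n=10_000, years=20, seed=1):
    """Count people in each dependency state after `years` annual transitions."""
    rng = random.Random(seed)
    counts = dict.fromkeys(STATES, 0)
    for _ in range(n):
        state = "independent"
        for _ in range(years):
            state = rng.choices(STATES, weights=TRANSITIONS[state])[0]
        counts[state] += 1
    return counts

counts = simulate_cohort()
print(counts)
```

Running cohorts like this forward, with transition rates that depend on diseases and lifestyle factors, is how models of this kind project the 2035 dependency mix.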

Estimates suggest that the number of people aged over 65 will increase by just under 50%, from 9.7 million in 2015 to 14.5 million in 2035, and highlight the differing future care needs of men and women.

Between 2015 and 2035, life expectancy for men aged 65 is projected to rise by 3.5 years to 22.2 years, and the average number of years spent independent is expected to increase by 4.2 years (from 11.1 years to 15.2), whilst time spent living with substantial care needs (medium or high dependency) is likely to decline.

In contrast, for women average life expectancy at 65 will increase by just 3 years (from 21.1 to 24.1). Over this time, the average number of years spent independent is expected to rise by less than a year (from 10.7 years to 11.6), and women will spend almost half of their remaining life with low dependency needs such as help with activities like washing and shopping, alongside a small increase in years requiring intensive 24-hour care (from 2 years in 2015 to 2.7 years in 2035).
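The arithmetic behind these projections is simple: years lived with some care need are the remaining life expectancy at 65 minus the years lived independently. Using the 2035 figures quoted above:

```python
# Figures quoted in the text: life expectancy at 65 and independent years, 2035
men_le_2035, men_indep_2035 = 22.2, 15.2
women_le_2035, women_indep_2035 = 24.1, 11.6

men_dependent = men_le_2035 - men_indep_2035      # years with some care need
women_dependent = women_le_2035 - women_indep_2035

print(f"Men:   {men_dependent:.1f} of {men_le_2035} years with care needs")
print(f"Women: {women_dependent:.1f} of {women_le_2035} years with care needs")
```

Women's 12.5 dependent years are roughly half their remaining life at 65, against 7 years for men, which is the gender gap the authors highlight.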

"Over the next 20 years, although young-old cohorts (aged 65-74) are more likely to enter old age independent, the proportion with multi-morbidities is projected to rise with each successive cohort, and this will result in a greater likelihood of higher dependency with further ageing", explains Professor Jagger. "However, trends for men and women are likely to be very different, with women experiencing more low level dependency than men, highlighting the importance of focusing on disabling long-term conditions such as arthritis that are more common in women than men."[2]

The researchers also analysed how the burden of dementia, with and without other chronic diseases, will change demands for social care over the next 20 years. They found that the profile of older people with substantial care needs (medium or high dependency) is likely to change markedly. For instance, whilst the number of over-65s with dementia alone will fall by around a third (equivalent to 16,000 fewer people) by 2035, those with dementia and two or more other conditions will more than double (equivalent to an additional 493,000 people).

Professor Jagger warns, "This expanding group will have more complex care needs that are unlikely to be met adequately without improved co-ordination between different specialties and better understanding of the way in which dementia affects the management of other conditions."

The authors note several limitations including that the models assume that risk factor profiles of cohorts remain constant over time, and that they did not include other risk factors that might impact dependency and disability such as alcohol use.

Writing in a linked Comment, Professor Eric Brunner and Sara Ahmadi-Abhari from University College Medical School, London, UK say, "Care provision at this intense level for more than 1 million people in 2035 will require careful thought and planning at both local and national level."

They add, "Public health modelling is moving towards more realistic representation of the influences on future health-state occupancy....The PACSim model takes account of likely future dynamics in the epidemiology of old age by conditioning health transitions according to multiple risk factors for dependence, including disease. Evidence is accumulating that this perspective is relevant to prevention policy as well as to predicting future need for social care in older people."

Credit: 
The Lancet

UK MP Twitter abuse increased between 2015 and 2017 general elections

Abuse of politicians online increased substantially in the snap 2017 general election compared to the 2015 general election, according to new research by the University of Sheffield.

The study shows for the first time that both the proportion and volume of Twitter abuse increased between 2015 and 2017. In most cases, this was regardless of the party or gender of the candidate for Member of Parliament.

Scientists analysed over one million tweeted replies to MPs and candidates and found that different topics triggered abusive replies in 2015 and 2017. In 2015, users who tweeted abusive replies were more concerned with the economy; in 2017, they were concerned with national security in the wake of terror attacks on UK soil.

The study, led by Professor Kalina Bontcheva from the Department of Computer Science, also found that although prominent politicians receive a lot of tweets, and thus a fair number of abusive tweets too, there is a tendency for the more well-known politicians to receive proportionally less abuse. This might suggest that, to certain types of senders of abuse, a large target is a less attractive one.

Other findings include:

Male MPs and Conservative politicians were more likely to receive abuse.

In 2015, the majority of abusive replies were received by leaders of the two main parties, David Cameron and Ed Miliband.

In 2017, the majority of abusive replies were again received by the leaders of the two biggest parties, Jeremy Corbyn and Theresa May, but also by Boris Johnson, a leading figure in the Vote Leave campaign.

Concerns about online intimidation have drawn increasing attention, but little research has been done into this trend, which has worrying implications for democracy.

The UK government published 'Intimidation in public life: A Review by the Committee on Standards in Public Life' in December 2017, looking at how abuse and intimidation affected parliamentary candidates during elections.

It found that the parliamentary candidates who provided evidence to the Committee overwhelmingly believed that online intimidation is already discouraging individuals from standing for public office.

In order to understand trends in online abuse towards UK politicians, Professor Bontcheva and her team collected data focused on the Twitter accounts of MPs, candidates and official party accounts. They collected every tweet by each of these users, and every retweet of and reply to these accounts, for the month leading up to each of the 2015 and 2017 general elections.

The team identified abusive language and detected topics that attracted abuse or were of interest to the users who sent abuse.

The team analysed the 2017 dataset to discover more about the users who sent online abuse. They found that those who tweeted abusively had accounts that were, on average, a few months newer, had fewer followers, followed fewer accounts and had fewer posts. One explanation could be that accounts are being created primarily to send anonymous abuse.

A similar analysis of the 2015 data found some differences: these accounts posted more and favourited more.

A review of the 2015 data also found that abuse was concentrated among a smaller number of individuals than in 2017. This could be because Twitter became more active in blocking users who sent abuse between the two elections, prompting the creation of more accounts.

In both the 2015 and 2017 datasets, the team found that significantly more accounts had been closed amongst those that sent abusive tweets than amongst those that did not: 16 per cent versus 6 per cent in 2015, and 8 per cent versus 2 per cent in 2017.
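The closure figures above imply that abusive accounts were closed several times as often as non-abusive ones; a quick check using the reported percentages:

```python
# Closure rates reported in the study: (abusive, non-abusive) accounts closed
rates = {2015: (0.16, 0.06), 2017: (0.08, 0.02)}

for year, (abusive, other) in rates.items():
    ratio = abusive / other
    print(f"{year}: abusive accounts closed {ratio:.1f}x as often")
```

In 2015 the ratio is roughly 2.7x, and by 2017 it rises to 4x, even as overall closure rates fell.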

Professor Bontcheva said: "Whilst there was a clear increase in abuse on Twitter sent to politicians in the 2017 general election compared to 2015, it was interesting to see the differences in the topics they responded to. This clearly shows the different issues that rose to prominence in the two separate elections.

"The increase in abuse towards public figures is a shocking development and one that the UK government is right to take seriously. If people are dissuaded from standing for election, then our representation on a democratic level is under threat."

Credit: 
University of Sheffield

New genetics findings unravel key components of fracture risk in osteoporosis

The largest study ever to investigate the genetics of osteoporosis and fracture risk determined that only two examined factors - bone mineral density (BMD) and muscle strength - play a potentially causal role in the risk of suffering osteoporotic fracture, a major health problem affecting more than 9 million people worldwide every year. Other clinical risk factors, such as vitamin D levels and calcium intake, historically considered to be crucial mediators of fracture, were not found to directly predispose people in the general population to fracture. This research was published in The BMJ.

"These findings suggest that interventions aimed at increasing bone strength are more likely to prevent fractures than widespread supplementation with vitamin D," said Dr. Brent Richards, a genetic epidemiologist at the Lady Davis Institute at the Jewish General Hospital and Professor of Medicine at McGill University, and one of the senior investigators on the paper. "Our study, the first genome-wide association study for fracture risk, has provided important insight on the biologic mechanisms leading to fracture and how to prevent it."

An international team of researchers collaborated to examine data from 185,057 cases and 377,201 controls drawn from the Genetic Factors of Osteoporosis (GEFOS) Consortium, the UK Biobank Study and the biotech company 23andMe. The study was co-led by researchers from McGill University and the Erasmus University Medical Center in Rotterdam, the Netherlands.

"Our research confirms that BMD is the most important determinant of fracture risk and that prevention strategies aimed at increasing or maintaining bone density are the most likely to be successful," Dr. Richards pointed out. "One of the most important aspects of this research is the robust evidence showing that vitamin D supplementation in the general population is unlikely to be effective for the prevention of fracture. This will encourage clinicians to focus patients on building bone density as a more effective preventive measure against fracture."

The researchers came to these conclusions by demonstrating that the genetic factors that lead to lowered vitamin D levels in the general population do not increase risk of fracture.

Approximately 30% of people over the age of 65 take vitamin D supplements, partly because clinical guidelines for osteoporosis management and fracture prevention recommend them. However, recent large randomized controlled clinical trials have failed to confirm any benefit of vitamin D supplementation in patients without pronounced deficiency. These trial results, together with the findings of this study, highlight the need to re-assess its widespread use in clinical practice.

The authors do caution that patients using osteoporosis medication should not discontinue their supplements before consulting their treating physicians. Maintaining a healthy diet, remaining physically active, and getting fifteen minutes of sun exposure every day are the main pillars of sustainable bone health. These results also do not apply to individuals with low vitamin D levels.

Credit: 
McGill University

Drought, groundwater loss sinks California land at alarming rate

ITHACA, N.Y. - The San Joaquin Valley in central California, like many other regions in the western United States, faces drought and ongoing groundwater extraction that outpaces replenishment. The land is sinking as a result -- by up to half a meter annually, according to a new Cornell University study in Science Advances.

Despite much higher-than-normal amounts of rain in early 2017, the large agricultural and metropolitan communities that rely on groundwater in central California experienced only a short respite from an ongoing drought. When the rain stopped, drought conditions returned and the ground has continued to sink, according to researchers.

"With the heavy storms in early 2017, Californians were hopeful that the drought was over," said Kyle Murray, a Cornell doctoral candidate in the field of geophysics. "There was a pause in land subsidence over a large area, and even uplift of the land in some areas. But by early summer the subsidence continued at a similar rate we observed during the drought."

Murray and Rowena Lohman, Cornell associate professor in earth and atmospheric sciences, examined satellite imagery of the San Joaquin Valley. In the farming region of the Tulare Basin in central California, growers have been extracting groundwater for more than a century, said the researchers. Winter rains in the valley and snowmelt from the surrounding mountains replenish the groundwater annually to some extent, but drought has parched the valley since 2011.

Previous studies had found that between 1962 and 2011 the average volume of groundwater depleted each year was at least half a cubic mile. Using satellite-based measurements between 2012 and 2016, the researchers estimated depletion of the Tulare Basin groundwater at 10 cubic miles.
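A little arithmetic shows the scale of the acceleration these figures imply (treating 2012-2016 as a five-year span, which is an assumption about how the interval was counted):

```python
# Depletion figures from the text
historical_rate = 0.5            # cubic miles per year, 1962-2011 average
recent_total = 10.0              # cubic miles depleted, 2012-2016
recent_years = 2016 - 2012 + 1   # inclusive span of 5 years

recent_rate = recent_total / recent_years
print(f"Recent rate: {recent_rate} cubic miles/yr, "
      f"{recent_rate / historical_rate:.0f}x the historical average")
```

That is roughly a fourfold increase over the long-term average, consistent with the drought-driven pumping described above.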

Fresno and Visalia border the Tulare Basin to the north, with Bakersfield to the south. About 250 agricultural products are grown there, with an estimated value of $17 billion annually, according to the U.S. Geological Survey. The valley holds about 75 percent of California's irrigated agricultural land and supplies 8 percent of the United States' agricultural output.

As an engineering problem, subsidence damages infrastructure, causes roads to crack and gives rise to sinkholes - expensive problems to fix, said Lohman. "One of the places where it really matters in California is the aqueduct system that brings water to the region. Aqueducts are engineered very carefully to have the correct slope to carry a given amount of water," she said. "Now, one of the major aqueducts in that area is bowed and can't deliver as much water. It's been a huge engineering nightmare."

Groundwater - as an agricultural and municipal resource - is incredibly important to communities in central California and elsewhere. Said Lohman: "The subsidence we see is a sign of how much the groundwater is being depleted. Eventually, the water quality and cost of extracting it could get to the point where it is effectively no longer available."

Credit: 
Cornell University

A climate 'wake-up call'

If we proactively implement effective fisheries management and limit global temperature rise, the world's oceans still have the potential to be significantly more plentiful in the future than today, despite climate change. This finding is among several in a first-of-its-kind study, "Improved fisheries management could offset many negative effects of climate change," published today in the American Association for the Advancement of Science's journal Science Advances.

"The expected global effects of climate change on our oceans are broadly negative," said Steve Gaines, the study's lead author and dean of UC Santa Barbara's Bren School of Environmental Science & Management, "but we still have the fortunate opportunity to turn the tide and create a more bountiful future."

The study finds that with concerted and adaptive responses to climate change, the world's oceans could actually create more abundant fish populations, more food for human consumption and more profit for fishermen despite the negative impacts of climate change. Conversely, the study cautions, inaction on fisheries management and climate change will mean even more dramatic losses of fish and the benefits they provide to people.

A dozen leading scientists from institutions including UCSB's National Center for Ecological Analysis and Synthesis, Hokkaido University and Environmental Defense Fund (EDF) conducted the research. It is the first study to examine future fishery outcomes under both climate change projections and alternative human responses. It demonstrates that our oceans can be highly productive for decades to come if action is taken now to put effective and forward-looking management practices in place.

"The results from this study are surprisingly positive -- if we can adopt sustainable fishing policies and keep global warming at no more than 2 degrees Celsius, we can still realize significant benefits to fisheries across the globe," said Merrick Burden, senior economist with the EDF Oceans program and an author of the paper. "But these benefits require action and this study serves as a wake-up call to governments that they must change the way that fishing takes place or risk losing a crucial opportunity to secure our food supply for generations to come."

This study examines potential future outcomes for 915 fish stocks across the world under alternative management and climate scenarios. The authors modeled the impact of climate change on fishery productivity and geographical range distribution, which affects how many fish are available and where they can be caught, under four climate projections. These range from a global temperature increase of 1 degree Celsius (strong climate mitigation) to an increase of 4 degrees Celsius (business-as-usual) by 2100. For each of these climate scenarios, the authors examined future biomass, harvest and profit under alternative management approaches using bioeconomic modeling.
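The interplay of management and warming described here can be illustrated, very loosely, with a classic Schaefer-style bioeconomic sketch: a logistic stock grows at a climate-scaled rate and is harvested at a fixed exploitation rate. This is not the authors' model, and every parameter below is invented:

```python
def simulate_fishery(harvest_rate, productivity_scale, years=80,
                     r=0.3, K=1.0, b0=0.5):
    """Logistic (Schaefer) stock dynamics with a constant exploitation rate.

    productivity_scale < 1 mimics a climate-driven loss of productivity.
    Returns final biomass and final-year harvest (both relative to K).
    """
    b = b0
    for _ in range(years):
        growth = productivity_scale * r * b * (1 - b / K)
        catch = harvest_rate * b
        b = max(b + growth - catch, 0.0)
    return b, harvest_rate * b

# Hypothetical scenarios: sustainable harvest with mild warming vs
# aggressive harvest with strong warming
for label, h, p in [("managed, strong mitigation", 0.10, 0.9),
                    ("overfished, business-as-usual", 0.25, 0.6)]:
    biomass, catch = simulate_fishery(h, p)
    print(f"{label}: biomass={biomass:.2f}, catch={catch:.3f}")
```

Even in this toy version, the managed scenario settles at a healthy biomass while the aggressive one collapses, echoing the study's contrast between adaptive management and inaction.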

The new research shows that roughly 50 percent of species examined will shift across national boundaries and nearly all species are expected to experience changes in productivity in response to rising ocean temperatures. These changes will present new challenges for fishing nations. The study found that the implementation of management practices that account for changes in productivity and geographic range distribution can lead to global gains in profits, harvest and biomass compared to today. These practices range from flexible management strategies, including responsible harvest policies that account for changing stock productivity, to the creation and improvement of existing governance institutions to deal with shifting stocks, such as multilateral fishery agreements.

"Cooperation among nations will be increasingly important for ensuring future fisheries benefits as stocks shift across management boundaries," said Tracey Mangin, an author of the paper and researcher at UCSB's Sustainable Fisheries Group, explaining that rising ocean temperatures can send fish stocks beyond their traditional geographical ranges as they track their preferred thermal habitats. "These shifts can undermine previously effective and well-designed management approaches, as they can incentivize overfishing and change which nations have access to the fish stocks, which can weaken existing fishing agreements."

While improved management may lead to improved global outcomes, those outcomes will vary regionally. The results indicate that future fishery profits are expected to decline in tropical latitudes even with management that fully adapts to climate challenges. This means that equatorial nations, many of which have developing economies and are highly dependent on seafood as a source of food and income, will be hardest hit. And how much planetary warming occurs will make a significant difference on the abundance, harvest and profit from fisheries.

"Even with the right management changes, there will be winners and losers, and we have to tackle this head-on," Gaines said. "Success will require not only emissions reductions but also multilateral cooperation and real changes in fisheries management. With our growing global population and the increasing needs for healthy sources of protein, these changes will be critical for meeting United Nations Sustainable Development Goals."

The impacts of inaction are also clear. Billions of people rely on fish as their primary source of protein. Most fishing nations are not responding fast enough to create change, and successful transboundary management programs are relatively rare. But action doesn't take long to have an impact on some species. Studies have demonstrated that many fisheries can bounce back from overfishing in as little as 10 years' time under the right policies.

"Climate change is expected to hit hardest in many of the places where fisheries are already poorly managed -- things are likely to get a lot worse if we don't act," said Christopher Costello, an author of the paper and a professor of environmental and resource economics at UCSB. "We can expect inaction to bring increased conflict as fish move into new waters, along with threats to food security in some of the world's most vulnerable places."

"Fishermen will be among the most affected by climate change, and this research confirms what they are already seeing on the water," said Katie McGinty, senior vice president of EDF Oceans. "The window is narrow, but we have the tools and a clear roadmap to build a future with more fish, more food and more prosperity -- if we act now."

The study did not examine other potential threats from climate change such as ocean acidification, or new ways that species might interact. These threats require further study beyond the scope of this paper.

Credit: 
University of California - Santa Barbara

Genetic susceptibility to lower vitamin D levels and calcium intake not linked to fracture

Having a genetic predisposition to lower vitamin D levels and calcium intake is not associated with an increased risk of osteoporotic fracture, conclude researchers in The BMJ today.

Their findings add to the ongoing debate over the benefits for the general population of vitamin D supplementation, which is recommended by clinical guidelines to prevent fractures.

The findings also back recent clinical trials that have failed to consistently demonstrate a beneficial effect of supplementation for people living in the community.

The international team of researchers set out to assess the role of 15 clinical risk factors considered to be associated with risk of osteoporotic fractures, including vitamin D levels, calcium intake, fasting glucose levels, age of puberty, age at menopause, diabetes and rheumatoid arthritis, using genetics.

First, they analysed the results of genome-wide association studies (GWAS) to evaluate the influence of genetic variation on fracture risk.

Using data from 37,857 fracture cases and 227,116 controls, they identified 15 genetic loci (areas on the chromosomes) associated with fracture risk. They then replicated their findings in 147,200 fracture cases and 150,085 controls.

All 15 of these loci were linked not only to fracture risk but also to bone mineral density.

Then using a technique called Mendelian randomisation, the researchers examined the association of 15 genetic variants (each representing an individual clinical risk factor for osteoporotic fracture) against fracture risk.

Analysing genetic information in this way avoids some of the problems that afflict traditional observational studies, making the results less prone to unmeasured (confounding) factors, and therefore more likely to be reliable.
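In its simplest form, Mendelian randomisation estimates a causal effect with the Wald ratio - a variant's effect on the outcome divided by its effect on the exposure - and combines variants by inverse-variance weighting. A minimal sketch with invented effect sizes (not the study's data):

```python
def wald_ratio(beta_outcome, beta_exposure):
    """Causal estimate from one variant: outcome effect / exposure effect."""
    return beta_outcome / beta_exposure

def ivw_estimate(betas_exp, betas_out, se_out):
    """Inverse-variance-weighted combination of per-variant Wald ratios."""
    weights = [(be / se) ** 2 for be, se in zip(betas_exp, se_out)]
    ratios = [wald_ratio(bo, be) for bo, be in zip(betas_out, betas_exp)]
    return sum(w * r for w, r in zip(weights, ratios)) / sum(weights)

# Hypothetical variants with a strong effect on an exposure (e.g. BMD) and a
# proportional effect on fracture risk -> a non-null causal estimate
betas_exp = [0.10, 0.15, 0.08]       # variant-exposure effects
betas_out = [-0.05, -0.075, -0.04]   # variant-fracture effects (log odds)
se_out = [0.01, 0.01, 0.01]
print(round(ivw_estimate(betas_exp, betas_out, se_out), 2))  # -0.5
```

If the exposure had no causal role (as the study found for vitamin D levels), the per-variant ratios would scatter around zero instead of agreeing on a common value.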

The Mendelian randomisation showed that only bone mineral density had a clear effect on fracture risk.

None of the other well-accepted risk factors tested, for example rheumatoid arthritis, vitamin D levels, calcium intake from dairy sources, fasting glucose, type 2 diabetes and coronary heart disease, had a major causal effect on fracture risk.

Older individuals at high risk of fractures often have low vitamin D levels (due to low dietary intake and sun exposure). Therefore, fracture prevention guidelines have suggested the use of vitamin D supplementation in the general population, explain the researchers. "Our analyses showed that vitamin D levels had no protective linear effect on fracture in community dwelling individuals."

Likewise, they found no evidence for a protective effect of sustained intake of dairy derived calcium on fracture risk.

The researchers point to some study limitations but say, to their knowledge, this is the largest and most comprehensive assessment of the genetic determinants of fracture risk so far.

"Our findings are a reminder that clinically relevant changes in most of these risk factors are unlikely to result in large differences in fracture risk," they write. They also "provide guidance for the design of future clinical trials on interventions that are more likely to be successful in reducing fracture risk."

Credit: 
BMJ Group

Study finds multiple sclerosis drug slows brain shrinkage

image: An NIH-funded clinical trial suggested that the anti-inflammatory drug ibudilast may slow brain shrinkage caused by progressive MS.

Image: 
Courtesy of Robert J. Fox, M.D., Cleveland Clinic, OH.

Results from a clinical trial of more than 250 participants with progressive multiple sclerosis (MS) revealed that ibudilast was better than a placebo in slowing down brain shrinkage. The study also showed that the main side effects of ibudilast were gastrointestinal and headaches. The study was supported by the National Institute of Neurological Disorders and Stroke (NINDS), part of the National Institutes of Health, and published in the New England Journal of Medicine.

"These findings provide a glimmer of hope for people with a form of multiple sclerosis that causes long-term disability but does not have many treatment options," said Walter J. Koroshetz, M.D., director of the NINDS.

Robert J. Fox, M.D., a neurologist at Cleveland Clinic in Ohio, led a team of researchers across 28 clinical sites in a brain imaging study to investigate whether ibudilast was better than placebo in reducing the progression of brain atrophy, or shrinkage, in patients with progressive multiple sclerosis.

In the study, 255 patients were randomized to take up to 10 capsules of ibudilast or placebo per day for 96 weeks. Every six months, the participants underwent MRI brain scans. Dr. Fox's team applied a variety of analysis techniques on the MRI images to assess differences in brain changes between the two groups.

The study showed that ibudilast slowed down the rate of brain atrophy compared to placebo. Dr. Fox and his colleagues discovered that there was a difference in brain shrinkage of 0.0009 units of atrophy per year between the two groups, which translates to approximately 2.5 milliliters of brain tissue. In other words, although both groups experienced atrophy, the brains of the patients in the placebo group shrank on average 2.5 milliliters more over two years compared to the ibudilast group. The whole adult human brain has a volume of approximately 1,350 milliliters. However, it is unknown whether that difference had an effect on symptoms or loss of function.
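The reported numbers can be checked with back-of-envelope arithmetic, assuming the "units of atrophy" are expressed as a fraction of whole-brain volume (which the figures above imply):

```python
# Illustrative arithmetic only; the values come from the article's text.
rate_difference = 0.0009   # units of atrophy per year (placebo minus ibudilast)
brain_volume_ml = 1350     # approximate adult whole-brain volume, in mL
years = 2                  # trial duration (96 weeks)

difference_ml = rate_difference * brain_volume_ml * years
print(f"{difference_ml:.2f} mL")  # ~2.4 mL, matching the reported ~2.5 mL
```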

There was no significant difference between the groups in the number of patients who reported adverse effects. The most common side effects associated with ibudilast were gastrointestinal, including nausea and diarrhea, as well as headaches and depression.

"The trial's results are very encouraging and point towards a potential new therapy to help people with progressive MS," said Dr. Fox. "It also increased our understanding of advanced imaging techniques, so that future studies may require a smaller number of patients followed over a shorter period of time. This leads to increased efficiency of clinical research. These imaging methods may also be relevant to a host of other neurological disorders."

MS occurs when there is a breakdown of myelin, a fatty white substance wrapped around axons, which are long strands that carry messages from and between brain cells. When myelin starts to break down, communication between brain cells slows down, leading to muscle weakness and problems with movement, balance, sensation and vision. MS can be relapsing-remitting, in which symptoms occur then disappear for weeks or months and then may reappear, or progressive, which is marked by a gradual decline in function.

The current study was supported by the NeuroNEXT program, an innovative approach to neurological clinical trials that attempts to streamline Phase 2 studies and make them more efficient.

MediciNova donated the active drug and placebo and provided under 10 percent of the funding for the trial in a cooperative agreement with NINDS. MediciNova also had a representative on the protocol steering committee. The manuscript authors were not bound by any confidentiality agreement.

Future research will test whether reducing brain shrinkage affects thinking, walking, and other problems in people with MS. In addition, future studies will examine whether ibudilast slows the progression of disability in MS patients.

Credit: 
NIH/National Institute of Neurological Disorders and Stroke

Research study sheds new light on relationship between genes and bone fracture risk

Boston (August 29, 2018)--A paper titled "Assessment of the genetic and clinical determinants of fracture risk: genome wide association and mendelian randomization study" appeared today in the British Medical Journal. The paper reports findings from a large international collaboration that identified 15 variations in the genome related to the risk of suffering bone fractures, a major healthcare problem affecting more than 9 million people worldwide every year. The study provides evidence against a causal effect of several proposed clinical risk factors for fractures, including diabetes, rheumatoid arthritis and vitamin D levels, among others. These findings strongly suggest that treatments aimed at increasing bone strength are more likely to be successful in preventing fractures than widespread supplementation of calcium and vitamin D or targeting other risk factors that were not found to mediate the disease.

This study was made possible through a team of researchers from the U.S., Europe, Canada, Asia and Australia who formed the largest effort to date investigating the genetics of osteoporosis and fracture risk. The study team included researchers from the Institute for Aging Research at Hebrew SeniorLife--among them, co-senior author Douglas P. Kiel, M.D., M.P.H., Director of the Musculoskeletal Research Center. The study sample comprised 185,057 cases of bone fractures and 377,201 controls who were part of the Genetic Factors for Osteoporosis ("GEFOS") Consortium, the UKBiobank Study and the 23andMe biotech company.

This first genome-wide association study (GWAS) of fracture risk provides insight into the biologic mechanisms leading to fractures. Most importantly, all of the identified genomic regions found to be associated with fracture have also been previously found to be associated with variation in bone mineral density (BMD), one of the most important risk factors for fracture. Based on this finding, the study team performed an additional analysis called Mendelian randomization, which uses genetic information to determine causal relations between risk factors and disease outcomes. The Mendelian randomization analysis determined that only two examined factors - bone mineral density (BMD) and muscle strength - play a potentially causal role in the risk of suffering osteoporotic fracture. One of the most important findings was that the genetic factors that lead to lowered vitamin D levels do not increase risk of fracture, meaning that vitamin D supplementation is not likely to prevent fractures in the general population. Although vitamin D supplementation is part of clinical guidelines, recent randomized controlled trials have failed to consistently demonstrate a beneficial effect.

According to Dr. Kiel, "Among the clinical risk factors for fracture assessed in the study, only BMD showed a major causal effect on fracture. The genetic factors contributing to fractures are also the same ones that affect BMD. Knowing one's genetic risk for fracture at an early age could be a useful piece of information to persons wanting to maintain their bone health as they age. Also the study identified novel genetic variants that could be used to target future drug therapies to prevent fracture."

Osteoporotic fractures represent a major health risk to older adults:

34 million Americans have low bone density, putting them at increased risk for osteoporosis and broken bones.

The condition leads to bone fragility and an increased risk of fractures, especially of the hip, spine and wrist. About one-quarter of those over age 50 who suffer a hip fracture die within a year of the injury.

Osteoporosis-related fractures were responsible for an estimated $19 billion in health care costs in 2005, with that figure expected to increase to $25 billion by 2025.

Credit: 
Hebrew SeniorLife Hinda and Arthur Marcus Institute for Aging Research

Changes in breakfast and dinner timings can reduce body fat

Modest changes to breakfast and dinner times can reduce body fat, a new pilot study in the Journal of Nutritional Sciences reports.

During a 10-week study on 'time-restricted feeding' (a form of intermittent fasting), researchers led by Dr Jonathan Johnston from the University of Surrey investigated the impact changing meal times has on dietary intake, body composition and blood risk markers for diabetes and heart disease.

Participants were split into two groups - those who were required to delay their breakfast by 90 minutes and have their dinner 90 minutes earlier, and those who ate meals as they would normally (the controls). Participants were required to provide blood samples and complete diet diaries before and during the 10-week intervention and complete a feedback questionnaire immediately after the study.

Unlike previous studies in this area, participants were not asked to stick to a strict diet and could eat freely, provided it was within a certain eating window. This helped researchers assess whether this type of diet was easy to follow in everyday life.

Researchers found that those who changed their mealtimes lost on average more than twice as much body fat as those in the control group, who ate their meals as normal. If these pilot data can be repeated in larger studies, there is potential for time-restricted feeding to have broad health benefits.

Although there were no restrictions on what participants could eat, researchers found that those who changed their mealtimes ate less food overall than the control group. This result was supported by questionnaire responses which found that 57 percent of participants noted a reduction in food intake either due to reduced appetite, decreased eating opportunities or a cutback in snacking (particularly in the evenings). It is currently uncertain whether the longer fasting period undertaken by this group was also a contributing factor to this reduction in body fat.

As part of the study, researchers also examined whether fasting diets are compatible with everyday life and long-term commitment. When questioned, 57 percent of participants felt they could not have maintained the new meal times beyond the prescribed 10 weeks because of their incompatibility with family and social life. However, 43 percent of participants would consider continuing if eating times were more flexible.

Dr Jonathan Johnston, Reader in Chronobiology and Integrative Physiology at the University of Surrey, said:

"Although this study is small, it has provided us with invaluable insight into how slight alterations to our meal times can have benefits to our bodies. Reduction in body fat lessens our chances of developing obesity and related diseases, so is vital in improving our overall health.

"However, as we have seen with these participants, fasting diets are difficult to follow and may not always be compatible with family and social life. We therefore need to make sure they are flexible and conducive to real life, as the potential benefits of such diets are clear to see.

"We are now going to use these preliminary findings to design larger, more comprehensive studies of time-restricted feeding".

Credit: 
University of Surrey

Hospital rating tools should allow patients to customize rankings

Publicly available hospital ratings and rankings should be modified to allow quality measures to be prioritized according to the needs and preferences of individual patients, according to a new RAND Corporation analysis.

Writing in the Aug. 30 edition of the New England Journal of Medicine, researchers propose a new way of rating hospitals by creating tools that allow patients to decide which performance measures to prioritize. For example, researchers demonstrate how the different priorities of a pregnant woman and a middle-aged man needing knee surgery might change which of their local hospitals has the highest overall rating.

The research team created a web tool that demonstrates a way to create customized rankings. The tool, which allows users to create custom rankings of most hospitals in the nation, is based upon the 2016 version of the federal government's Hospital Compare star ratings.

"If the intent of hospital quality ratings is to inform patient choice, why not ask patients for their input?" said Dr. Mark Friedberg, senior author of the paper and a senior physician scientist at RAND, a nonprofit research organization. "We built a tool to show that it's possible to move beyond one-size-fits-all hospital ratings. In the internet era, there's no reason why these report cards can't be customized to each individual patient's needs and preferences."

Researchers demonstrate that a hospital quality report tailored to the "average" patient is likely not to be a good fit for most patients with individual needs.

In one scenario modeled by the team, customizing hospital report cards to the needs of a pregnant woman who lives in the suburbs of Boston drives down the ranking of a large downtown medical center and boosts the ratings of two community hospitals that are closer to her home. In another scenario, the ranking of two hospitals in the Los Angeles suburbs reverses when a man needing elective knee surgery customizes rankings to reflect his own needs.
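At its core, a customized ranking of this kind reduces to a weighted sum over whichever quality measures a patient prioritizes. The sketch below uses invented hospitals, measures and weights, not the actual Hospital Compare data or the RAND tool's scoring method, to show how different priorities can reverse a ranking:

```python
# Hypothetical hospitals scored 0-1 on three illustrative quality measures.
hospitals = {
    "Downtown Medical Center": {"maternity": 0.70, "proximity": 0.30, "knee_surgery": 0.95},
    "Community Hospital A":    {"maternity": 0.90, "proximity": 0.90, "knee_surgery": 0.60},
}

def rank(hospitals, weights):
    """Rank hospitals by a weighted average of the measures a patient prioritizes."""
    total = sum(weights.values())
    scores = {
        name: sum(weights[m] * measures[m] for m in weights) / total
        for name, measures in hospitals.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# A pregnant suburban patient weights maternity care and proximity heavily...
print(rank(hospitals, {"maternity": 3, "proximity": 2, "knee_surgery": 0}))
# ...while a knee-surgery patient's weights put the downtown center on top.
print(rank(hospitals, {"maternity": 0, "proximity": 1, "knee_surgery": 3}))
```

With the first set of weights the community hospital ranks first; with the second, the downtown medical center does, mirroring the two scenarios described above.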

Credit: 
RAND Corporation

Trump supporters on campuses more likely to show prejudice toward international students

BUFFALO, N.Y. - International students at American colleges and universities do not always find a welcoming environment. Research has shown that, as a group, internationals face prejudice from segments of the domestic student population, and a new study by a University at Buffalo psychologist suggests that stereotypes alone do not lead to that prejudice.

"Prejudice against international students is multifaceted," says Wendy Quinton, PhD, a clinical associate professor in UB's psychology department and author of the paper published in the Journal of Diversity in Higher Education.

"But there are factors leading to prejudice that universities can influence."

Her study showed that, aside from stereotypes, other factors, including support for President Donald Trump, predicted prejudice against international students from the domestic student population.

"Some of President Trump's policies, such as promoting 'America First,' the travel ban and his talk of building a wall are in line with unwelcoming attitudes toward immigrants -- but this is the first evidence I've seen linking support for Trump with attitudes toward international students," says Quinton, an expert in prejudice, stigma and social identity who surveyed 389 college students, all of whom self-identified as being born and raised in the United States.

"This finding tells us that if you statistically account for stereotypes, those domestic students who were higher in Trump support still had significantly higher prejudice. Liking Trump goes beyond stereotypes in predicting prejudice against international students."

Stereotypes are ideas or beliefs about members of a certain group. A stereotype can be positive or negative. Prejudice is a negative attitude toward members of a particular group. Stereotypes can lead to prejudice and Quinton says there are stereotypes that predict prejudice against international students.

"I found other factors, not just stereotypes -- and administrators may be able to influence those factors and make a difference," she says.

In addition to Trump support, university identity emerged as another independent predictor of prejudice.

University identity is reflected in a sense of belonging to the school. Do I feel like I'm a member of the campus community? Does the university value me? Do I belong here? Students who didn't strongly identify with the university showed higher levels of prejudice against international students.

"Among the predictors of prejudice against international students that administrators can influence, university identity came out as an important independent predictor," says Quinton. "If you increase university identity and make everyone feel like they belong to one group then the division between domestic and international students should become smaller."

Standardized test scores marginally predicted prejudice. Those domestic students with lower SAT scores had higher levels of prejudice. Domestic students who had fewer positive stereotypes and more negative stereotypes also tended to be prejudiced against internationals.

But socialization is also critical, and it can positively impact those who would otherwise be highly prejudiced against international students, according to Quinton.

She says the process of getting acquainted predicted levels of prejudice.

"Particular kinds of socialization matter the most: Do we study together? Do we share activities together? Do we have the kinds of close contact that can build friendships? That mattered," says Quinton. "For those with lower SAT scores, for those low in positive stereotypes, for those high in Trump support, the more they socialized with international students the lower their prejudice."

"University identity and socialization are both positive factors that universities can foster and that everyone can benefit from."

The results from Quinton's research come on the heels of the first national decrease in international enrollment at American universities after years of increases. Statistics from the U.S. State Department show a 17 percent decline in 2017 in student visas.

That drop had a significant economic impact on the country.

Between 2015 and 2016, international students contributed nearly $33 billion to the U.S. economy and supported over 400,000 jobs, according to NAFSA: Association for International Educators.

Quinton sees the globalization experience as enriching for domestic and international students alike.

"International students are part of the fiber and fabric of American universities. They bring different experiences, cultures, backgrounds, languages and ideas. Domestic students gain a great deal from the presence of internationals on campus and develop a deeper understanding of the world because of that diversity.

"Universities should make the effort to address these variables because the potential benefits are so great, for all involved," she says.

Credit: 
University at Buffalo

Stem cells show promise as drug delivery tool for childhood brain cancer

CHAPEL HILL -- The latest in a series of laboratory breakthroughs could lead to a more effective way to treat the most common brain cancer in children. Scientists from the University of North Carolina Lineberger Comprehensive Cancer Center and UNC Eshelman School of Pharmacy reported results from early studies that demonstrate how cancer-hunting stem cells, developed from skin cells, can track down and deliver a drug to destroy medulloblastoma cells hiding after surgery.

Previously, UNC Lineberger's Shawn Hingtgen, PhD, and his collaborators showed in preclinical studies they could flip skin cells into stem cells that hunt and deliver cancer-killing drugs to glioblastoma, the deadliest malignant brain tumor in adults. In their new study, published in PLOS ONE, the researchers reported they could shrink tumors in laboratory models of medulloblastoma, and extend life. The study is a necessary step toward developing clinical trials that would see if the approach works for children.

Hingtgen said this approach holds promise for reducing side effects and helping more children with medulloblastoma. More than 70 percent of patients with average-risk disease live five years on standard treatment, but not all patients respond, and treatment can cause lasting neurologic and developmental side effects.

"Children with medulloblastoma receive chemotherapy and radiation, which can be very toxic to the developing brain," said Hingtgen, who is an associate professor in the UNC Eshelman School of Pharmacy, an assistant professor in the UNC School of Medicine Department of Neurosurgery, and a member of UNC Lineberger. "If we could use this strategy to eliminate or reduce the amount of chemotherapy or radiation that patients receive, there could be quality-of-life benefits."

Hingtgen and his team showed the natural ability of the stem cells to home to tumors, and began studying them as a way to deliver drugs to tumors and limit toxicity to the rest of the body. Their technology is an extension of a discovery that won researchers a Nobel Prize in 2012, and showed they could transform skin cells into stem cells.

"The cells are like a FedEx truck that will get you to a particular location, and (deliver) potent cytotoxic agents directly into the tumor," Hingtgen said. "We essentially turn your skin into something that will crawl to find invasive and infiltrative tumors."

For the study, researchers reprogrammed skin cells into stem cells, and then genetically engineered them to manufacture a substance that becomes toxic to other cells when exposed to another drug, called a "pro-drug." Inserting the drug-carrying stem cells into the brains of laboratory models after surgery shrank tumors 15-fold and extended median survival in mice by 133 percent. Using human stem cells, the researchers prolonged the mice's survival by 123 percent.

They also developed a laboratory model of medulloblastoma to allow them to simulate the way standard care is currently delivered - surgery followed by drug therapy. Using this model, they discovered that after surgically removing a tumor, the cancer cells that remained grew faster.

"After you resect the tumor, we found it becomes really aggressive," Hingtgen said. "The cancer that remained grew faster after the tumor was resected."

Scott Elton, MD, FAANS, FAAP, chief of the UNC School of Medicine Division of Pediatric Neurosurgery and co-author on the study, said there is a need for new treatments for medulloblastomas that have come back, or recurred, as well as for treatments that are less toxic overall. The ability to use a patient's own cells to directly target the tumor would be "the holy grail" of therapy, according to Elton, and he believes it could hold promise for other rare, and sometimes fatal, brain cancer types that occur in children as well.

"Medulloblastoma is cancer that happens mostly in kids, and while current therapy has changed survival pretty dramatically, it can still be pretty toxic," Elton said. "This is a great avenue of exploration, particularly for the 30 percent of children who struggle or don't make it with standard therapy. We want to nudge the needle even further."

Credit: 
UNC Lineberger Comprehensive Cancer Center

What's that smell? Scientists find a new way to understand odors

image: A mathematical model reveals a map for odors from the natural environment. From left: Yuansheng Zhou and Tatyana Sharpee.

Image: 
Salk Institute

LA JOLLA--(August 29, 2018) Every smell, from a rose to a smoky fire to a pungent fish, is composed of a mixture of odorant molecules that bind to protein receptors inside your nose. But scientists have struggled to understand exactly what makes each combination of odorant molecules smell the way it does or predict from its structure whether a molecule is pleasant, noxious or has no smell at all.

Now, scientists from the Salk Institute and Arizona State University have discovered a new way to organize odor molecules based on how often they occur together in nature, which is the setting in which our sense of smell evolved. They were then able to map this data to discover regions of odor combinations humans find most pleasurable. The findings, published on August 29 in the journal Science Advances, open new avenues for engineering smells and tastes.

"We can arrange sound by high frequency and low frequency; vision by a spectrum of wavelengths and colors," says Tatyana Sharpee, an associate professor in Salk's Computational Neurobiology Laboratory and lead author of the new work. "But when it comes to olfaction, it's been an unsolved problem whether there is a way to organize odors."

Previously, scientists had tried to classify odorant molecules strictly based on their chemical structures. "But it turns out molecules with structures that look very similar can smell very different," says Sharpee.

Using existing data on the odorant molecules found in different samples of strawberries, tomatoes, blueberries and mouse urine, the researchers applied statistical methods to place odorant molecules on a map based on how frequently they occurred together in the four sets of samples.

Those molecules that occurred together more frequently were placed closer to each other.

"It's akin to me telling you that Chicago is x number of miles from New York, Los Angeles and Melbourne. And if you mapped the cities, you'd find that the Earth's surface is curved, not flat--otherwise the distance from, say, Melbourne to Los Angeles wouldn't add up. We did the same thing for odor," explains Sharpee.

Using this strategy, the scientists found that odor molecules could similarly be mapped onto a curved surface in three dimensions. But instead of a sphere, like Earth, it turned out to be the shape of a Pringles potato chip--a shape mathematicians call a hyperboloid.
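The general co-occurrence-to-map idea can be sketched with classical multidimensional scaling on a toy matrix. The counts below are invented, and the paper's actual fit is to a curved hyperbolic surface; this flat embedding only illustrates how pairwise co-occurrence turns into map coordinates:

```python
import numpy as np

# Toy co-occurrence counts for five hypothetical odorant molecules across
# many samples (not the study's real data). Molecules found together more
# often should land closer together on the map.
cooccurrence = np.array([
    [50, 40, 35,  5,  2],
    [40, 50, 30,  6,  3],
    [35, 30, 50,  4,  5],
    [ 5,  6,  4, 50, 45],
    [ 2,  3,  5, 45, 50],
], dtype=float)

# Turn co-occurrence into a dissimilarity: frequent pairs -> small distance.
distance = 1.0 - cooccurrence / cooccurrence.max()

# Classical multidimensional scaling: double-center the squared distances
# and embed the points using the top two eigenvectors.
n = distance.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (distance ** 2) @ J
eigvals, eigvecs = np.linalg.eigh(B)
order = np.argsort(eigvals)[::-1][:2]          # two largest eigenvalues
coords = eigvecs[:, order] * np.sqrt(np.maximum(eigvals[order], 0))

# Molecules 0-2 cluster together; molecules 3-4 form a separate cluster.
print(np.round(coords, 2))
```

Checking whether a flat embedding like this systematically distorts the distances is exactly how one would detect that the data prefer a curved surface, as in the Melbourne-to-Los Angeles analogy above.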

When the team looked at how the molecules clustered on this surface, they found there were pleasant and unpleasant directions, as well as directions that correlated with acidity or how easily odors evaporate from surfaces. These observations now make it easier to construct pleasant odor mixtures to use, for example, in artificial environments (such as a space station).

"By revealing more about how odorant molecules and the brain interact, this work may also have implications for understanding why people with some diseases--like Parkinson's--lose their sense of smell," adds behavioral neuroscientist Brian Smith of Arizona State University, a coauthor of the paper.

Credit: 
Salk Institute

Study demonstrates a new recurrence-based method that mimics Kolmogorov-Smirnov test

WASHINGTON, D.C., August 29, 2018 -- The recurrence plot (RP) is a vital tool for analyzing nonlinear dynamic systems, especially systems involving empirically observed time series data. RPs show patterns in a system's phase space and indicate where the data revisit the same coordinates. RPs can also mimic some types of inferential statistics and linear analyses, such as spectral analysis. A new paper in the journal Chaos, from AIP Publishing, provides a proof of concept for using RPs to mimic the Kolmogorov-Smirnov test, which scientists use to determine whether two data sets differ significantly.

The authors, however, caution that not all types of data can be used with this new method. "Continuous data at an interval or ratio-scale level would be best suited for this technique," said Giuseppe Leonardi, one of the study's authors. "However, discretely distributed data at the same level of measurement, such as dice throws, would also be suitable."

The researchers analyzed recurrence points in the RPs by dividing the RP into four quadrants and counting the number of recurrence points in each cell. Then, they calculated the within-sample and between-sample recurrence rates and used those values, along with expected frequencies, to determine a p-value related to the difference between the samples. This p-value indicated whether the two groups were from the same sample or from different samples.
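The quadrant-counting idea can be sketched as follows. The recurrence radius, the pooled expected count, and the one-degree-of-freedom chi-square approximation are illustrative simplifications, not the paper's exact statistic:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
a = rng.normal(0.0, 1.0, 200)   # sample A
b = rng.normal(0.8, 1.0, 200)   # sample B, with a shifted mean

# Recurrence matrix over the concatenated series: two points "recur" when
# they fall within a small radius eps of each other.
x = np.concatenate([a, b])
eps = 0.2
rp = np.abs(x[:, None] - x[None, :]) < eps
np.fill_diagonal(rp, False)     # ignore trivial self-recurrences

# The RP's quadrants correspond to within-A, within-B, and the two
# (symmetric) between-sample blocks.
n = len(a)
within_a = rp[:n, :n].sum()
within_b = rp[n:, n:].sum()
between = rp[:n, n:].sum()

# If A and B came from the same distribution, between-sample recurrence
# should match the within-sample rates; a chi-square statistic on the
# observed vs. expected counts yields a p-value for the difference.
expected_between = (within_a + within_b) / 2
chi2 = (between - expected_between) ** 2 / expected_between
p_value = stats.chi2.sf(chi2, df=1)
print(f"between={between}, expected~{expected_between:.0f}, p={p_value:.3g}")
```

With the shifted mean in sample B, points recur far less often across samples than within them, and the p-value comes out very small; with identical distributions it would typically not.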

To verify their proof of concept, the researchers conducted a series of simulations to see how their recurrence-based test performed compared to the Kolmogorov-Smirnov test. These simulations involved two groups of normal, skewed normal, or log-normal distributions with various combinations of means and standard deviations. The researchers found that the recurrence-based method performed roughly the same as the Kolmogorov-Smirnov test with a few differences in sensitivity with different distribution types.

The recurrence-based test appeared to be more sensitive at the tails of the distribution than the Kolmogorov-Smirnov test. This could be because the recurrence-based test considers deviations along the whole range of values, unlike the Kolmogorov-Smirnov test, which only accounts for the largest deviation between two distributions. Leonardi explained that this enhanced sensitivity would make the recurrence-based test especially useful for nonlinear data such as human reaction times.

He also cautioned that their method might suggest statistically reliable differences that are too small to be meaningful. "This might be a downside of the test for practical users," Leonardi said. "However, we have not investigated such effects in depth."

This proof of concept demonstrates that the RP can be useful for statistical analysis tools. Going forward, the team plans to investigate the effects of sample size on their method. Leonardi said they would also like to further develop the test to model other types of inferential statistics including analysis of variance.

Credit: 
American Institute of Physics