Culture

People with inadequate access to food are 10% to 37% more likely to die prematurely

Adults with food insecurity (i.e., inadequate access to food because of financial constraints) are 10% to 37% more likely to die prematurely from any cause other than cancer compared to food-secure people, found new research in CMAJ (Canadian Medical Association Journal).

"Among adults who died prematurely, those experiencing severe food insecurity died at an age 9 years earlier than their food-secure counterparts," writes lead author Dr. Fei Men, a postdoctoral fellow in the lab of Professor Valerie Tarasuk at the University of Toronto.

Researchers looked at data from the Canadian Community Health Survey 2005-2017 on more than half a million (510,010) adults in Canada. They categorized people as food secure or as marginally, moderately or severely food insecure. By the end of the study period, 25,460 people had died prematurely, with people who were severely food insecure dying 9 years younger than their food-secure counterparts (59.5 years old versus 68.9 years).

The average life expectancy in Canada in 2008-2014 was 82 years; deaths at or before that age were considered premature in this study.

Severely food-insecure adults were more likely to die prematurely than their food-secure counterparts for all causes except cancers. Premature death by infectious-parasitic diseases, unintentional injuries and suicides was more than twice as likely for those experiencing severe versus no food insecurity.

Previous studies have examined the relationship between inadequate food and death, although none looked at causes of death.

"The significant correlations of all levels of food insecurity with potentially avoidable deaths imply that food-insecure adults benefit less from public health efforts to prevent and treat diseases and injuries than their food-secure counterparts," write the authors.

Policies to address food insecurity have the potential to reduce premature death.

"The markedly higher mortality hazard of severe food insecurity highlights the importance of policy interventions that protect households from extreme deprivation. In Canada, policies that improve the material resources of low-income households have been shown to strengthen food security and health," says Dr. Men.

Credit: 
Canadian Medical Association Journal

Prolonged breath-holding could help radiotherapy treatment of cardiac arrhythmias

A technique that enables patients suffering from heart conditions to hold their breath safely for over 5 minutes could have potential as part of a new treatment for cardiac arrhythmias, say researchers at the University of Birmingham.

In a new study, published in Frontiers in Physiology, researchers initially proposed the technique as a new means for earlier diagnosis of ischaemic heart disease. The technique involves hyperventilating conscious, unmedicated patients using a mechanical ventilator which delivers air to the patient via a face mask.

Hyperventilation causes hypocapnia that leads to temporary constriction in the coronary arteries. The researchers were initially exploring whether this effect could be exploited as an 'early warning system' to diagnose coronary heart disease.

Although more work needs to be done on its diagnostic potential, the research was able to confirm that mechanical hyperventilation and hypocapnia were well tolerated and safe for patients with angina.

The team believe this paves the way to induce breath-holds of over five minutes to support an emerging new technique in which radiotherapy, instead of radiofrequency or freezing, is used for cardiac ablation.

In this procedure, patients with arrhythmias undergo precisely targeted radiotherapy, applied from outside the chest, to destroy tissue that is allowing incorrect electrical signals to cause an abnormal heart rhythm. Breathing is a problem because each breath causes the heart to move within the chest.

Lead author Dr Michael Parkes, of the University's School of Sport, Exercise and Rehabilitation Sciences, explained: "There is still little awareness of the simplicity, availability, and safety of non-invasive mechanical hyperventilation. We have already shown that patients with breast cancer can breath-hold safely for over 5 minutes using this technique. The fact that patients with angina were able to tolerate mechanical hyperventilation so well confirms its potential to improve the newly emerging procedure of using radiotherapy for cardiac ablation.

"Stopping breathing with a safe breath-hold of over 5 minutes, using mechanically induced hypocapnia and now with oxygen-enriched air, could allow surgeons to target the radiotherapy for cardiac ablation much more precisely. The advantage of radiotherapy over radiofrequency or freezing is that radiotherapy is completely non-invasive and is applied from outside the chest, whereas the other techniques require a catheter, passed via a vein in the groin or an artery in the neck, to be placed inside the atria of the heart. Currently such radiotherapy is being considered only when all other ablation and pharmacological techniques have failed."

The next step is to test this technique in patients with cardiac arrhythmias to see if they too can hold their breath long enough to apply the radiotherapy.

Credit: 
University of Birmingham

Want to know what climate change will do in your back yard? There's a dataset for that

image: A small bean farm in Colombia's Darién region. Future climate scenarios can be modeled at the community scale thanks to a dataset created by the CGIAR research program on Climate Change, Agriculture and Food Security (CCAFS) and the International Center for Tropical Agriculture (CIAT).

Image: 
Neil Palmer / International Center for Tropical Agriculture

What the global climate emergency has in store may vary from one back yard to the next, particularly in the tropics where microclimates, geography and land-use practices shift dramatically over small areas. This has major implications for adaptation strategies at local levels and requires trustworthy, high-resolution data on plausible future climate scenarios.

A dataset created by the International Center for Tropical Agriculture (CIAT) and colleagues is filling this niche. Primarily intended to help policymakers devise adaptation strategies for smallholder farmers around the world, the open-access dataset has been used in 350 research papers. Users in at least 186 countries have downloaded almost 400,000 files from the dataset since it went online in 2013.

A full description, review and validation of the dataset, including how it was built, was published January 20 in Scientific Data, an open-access publication by Nature for the description of scientifically valuable datasets.

"Climate models are complex representations of the earth system, but they aren't perfect," said Julian Ramirez-Villegas, the principal investigator of the project and a scientist with CIAT and the CGIAR Research Platform on Climate Change, Agriculture and Food Security (CCAFS). "These errors can have an impact on our agricultural models. Because these models help us make decisions, this can have dire consequences."

While the data has primarily served agricultural research, it has also been used to map the potential global spread of Zika (a mosquito-borne disease), to plan investment strategies for international development, and to predict the ongoing decline of outdoor skating days in Canada due to warmer winters.

"The use and applicability of this data have been really extensive and topically quite broad," said Ramirez-Villegas. "Of course, a large portion of the studies has been done on crops that are key to global food security and incomes such as rice, coffee, cocoa, maize, and others."

Pinpointing climate impacts

Climate-change projections are typically available at coarse scales, ranging from 70 to 400 km. But models of the impact of climate change on many agricultural plant varieties require data at finer scales. The researchers used techniques to increase the spatial resolution (a process known as downscaling) and to correct errors (a process known as bias correction) to create high-resolution future climate data for 436 scenarios.
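The downscaling step can be illustrated with a minimal delta-method sketch: add the coarse model's projected change to a high-resolution observed climatology. This is one common approach, not necessarily the exact algorithm behind this dataset; all names and numbers below are invented for illustration.

```python
import numpy as np

def delta_downscale(gcm_hist, gcm_future, obs_highres_clim):
    """Delta-method downscaling: add the coarse model's projected
    monthly change to a high-resolution observed climatology.

    gcm_hist, gcm_future : coarse-model monthly means, shape (12,)
    obs_highres_clim     : observed climatology, shape (12, nlat, nlon)
    """
    delta = gcm_future - gcm_hist                    # coarse-scale change signal
    return obs_highres_clim + delta[:, None, None]   # broadcast over the fine grid

# Toy example: uniform 20 degC observations on a 2x2 grid,
# and a coarse model projecting +1.5 degC of warming.
obs = np.full((12, 2, 2), 20.0)
hist = np.linspace(18.0, 22.0, 12)
future = hist + 1.5
projected = delta_downscale(hist, future, obs)
print(projected[0, 0, 0])   # 21.5
```

Bias correction works on the same principle in reverse: the mismatch between observations and the model's historical run is used to adjust the model's future run.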

"This is a critical resource for modeling more realistically the future of crops and ecosystems," said Carlos Navarro, the lead author of the study who is affiliated with CIAT and CCAFS.

For a given emissions pathway and future period, each scenario includes monthly information for average and extreme temperatures, rainfall, and 19 other related variables. The data are publicly available in the World Data Center for Climate and the CCAFS-Climate data portal.

"Through these scenarios, we can understand, for instance, how agricultural productivity might evolve if the world continues on the current greenhouse emissions trajectory," said Navarro. "They also provide the data to model what types of adaptations would best counter any negative climate change effects."

Global and regional models analyze climate conditions at rougher scales and simplify natural processes, producing results that may deviate from realistic scenarios.

The dataset is CGIAR's biggest Findable, Accessible, Interoperable, Reusable (FAIR) database. It also underscores CGIAR's role in big data for development, through its Platform for Big Data in Agriculture. The dataset is currently included in its Global Agriculture Research Data Innovation and Acceleration Network (GARDIAN).

The high-resolution scale of this data is useful for scientists, policymakers, NGOs and investors, as it can help them understand local climate change impacts and make better bets on adaptation measures, with plans that can specifically target watersheds, regions, municipalities or countries.

Credit: 
The Alliance of Bioversity International and the International Center for Tropical Agriculture

Racial disparities in drug prescriptions for dementia

Disparities in drug prescribing suggest that black and Asian people with dementia are not receiving the same quality of care as their white peers, according to a new UCL-led study in the UK.

Asian people with dementia are less likely to receive anti-dementia drugs, and take them for shorter periods, according to the findings published in Clinical Epidemiology.

Dementia patients from black ethnic groups who are prescribed antipsychotic drugs, which are mainly used to treat dementia-related distress rather than the primary symptoms, take them for around four weeks longer per year compared to white people in the UK, exceeding suggested limits on how long they should be taken for.

"Our new findings are concerning as they appear to reflect inequalities in the care people receive to treat symptoms associated with dementia," said the study's lead author, Professor Claudia Cooper (UCL Psychiatry).

Researchers analysed data from 53,718 people across the UK who had a dementia diagnosis and 1,648,889 people without dementia, drawn from The Health Improvement Network primary care database and collected between 2014 and 2016.

They found that Asian people with dementia were 14% less likely than white patients to be prescribed anti-dementia drugs when they were potentially beneficial, and received them for an average of 15 fewer days per year.

Anti-dementia drugs - cholinesterase inhibitors or memantine - are the only class of medication available for treating dementia, as they can help with memory and other cognitive abilities, while other medications such as antipsychotics are sometimes prescribed to treat some of the associated behavioural and psychological symptoms.

Previous studies in the USA and Australia have also found disparities in drug treatment for dementia for minority ethnic groups, but this is the first time the issue has been investigated in a large UK study.

The researchers say that the greater socioeconomic disadvantages experienced by minority ethnic groups may lead to barriers to accessing care, while language and cultural barriers could also contribute to disparities.

The researchers found that both black and Asian people with dementia were prescribed antipsychotic drugs for longer than white patients, by 27 and 17 days more, respectively, which could put them at greater risk of harmful side effects.

As they did not identify differences in rates of an initial prescription of antipsychotics, the researchers say the findings may reflect differences in the likelihood of medication being reviewed and stopped when no longer needed.

"Rates of antipsychotic prescribing in all ethnic groups exceeded recommendations for treating the often very distressing behavioural and psychological symptoms of dementia, such as agitation or challenging behaviours, which are the most common reasons antipsychotic drugs are prescribed to people living with dementia," explained Professor Cooper.

"While there has been a very sharp reduction in antipsychotic prescribing in the UK over the past 10 years, these figures suggest there is still work to do to ensure that people with dementia only receive potentially harmful antipsychotic drugs if there are no acceptable alternatives."

Dr Mary Elizabeth Jones (UCL Institute of Epidemiology & Health Care), first author of the study, commented: "While we have yet to find out whether taking antipsychotic drugs for a few weeks more increases the associated risks, which can include falls, cognitive decline, strokes and even death, it's a potentially significant inequality which we should take seriously. More work may need to be done to ensure that guidelines are being consistently met, and that dementia services are culturally competent."

Co-author Professor Jill Manthorpe of the NIHR Health & Social Care Workforce Research Unit, King's College London, said that health professionals should question whether antipsychotic drugs are being prescribed instead of other forms of support that could address causes of the distressing symptoms.

"Families too should ask if there are other alternatives such as social prescribing that may put people in contact with activities and sensory experiences which may help reduce feelings of distress. Culturally meaningful activities may be particularly helpful, such as hearing or playing music or enjoying the experience of tactile objects," she said.

A previous study also led by Professor Cooper found that dementia rates are higher among black people compared to the UK average, and ethnic minority groups may be less likely to be diagnosed in a timely manner. She has also led a study finding that women with dementia have fewer GP visits, receive less health monitoring and take more potentially harmful medication than men with dementia.

Credit: 
University College London

Strongly 'handed' squirrels less good at learning

video: This is a grey squirrel trying to get food with its mouth.

Image: 
University of Exeter

Squirrels that strongly favour their left or right side are less good at learning, new research suggests.

Just as humans are usually left- or right-handed, many animals favour one side of their body for certain tasks.

The strength of this preference varies, with some individuals happy to use either side, while others strongly favour one side (known as being strongly "lateralised").

The University of Exeter study found that grey squirrels which strongly favoured a side did less well on a learning task. They had to learn to use a paw, rather than their mouth, to get nuts.

"It has been suggested that being strongly lateralised makes brains more efficient, with each hemisphere focussing on different tasks," said Dr Lisa Leaver.

"This could help animals survive, which would explain the evolution of laterality across the animal kingdom.

"In fish and birds, there is evidence that being strongly lateralised is linked to better cognitive performance (brain function).

"However, limited data from studies of mammals suggest a weak or even negative relationship.

"Our study measured speed of learning among grey squirrels and, in line with these previous mammal studies, suggests that strong lateralisation is linked to poor cognitive performance."

In the study, wild grey squirrels on the University of Exeter's Streatham Campus were presented with a transparent tube containing peanuts.

Squirrels usually collect food with their mouths, but the tube was too narrow to allow this - so they had to learn to use a paw.

By measuring both how quickly squirrels learned and how strongly they favoured a particular paw, the researchers could assess both learning and laterality.

More than 30 squirrels were observed, with 12 providing enough data for inclusion in the study.

The relationship between laterality and human cognitive performance is still unclear, though some research has suggested that less lateralised (i.e., more ambidextrous) people may be more creative.

"More research on mammals is needed to understand the complex relationship between laterality and cognitive performance," Dr Leaver said.

Credit: 
University of Exeter

Becoming less active and gaining weight: Downsides of becoming an adult

Leaving school and getting a job both lead to a drop in the amount of physical activity, while becoming a mother is linked to increased weight gain, conclude two reviews published today and led by researchers at the University of Cambridge.

Many people tend to put on weight as they leave adolescence and move into adulthood, and this is the age when the levels of obesity increase the fastest. This weight gain is related to changes in diet and physical activity behaviour across the life events of early adulthood, including the move from school to further education and employment, starting new relationships and having children.

Writing in Obesity Reviews, researchers from the Centre for Diet and Activity Research (CEDAR) at Cambridge looked at changes in physical activity, diet and body weight as young adults move from education into employment and to becoming a parent. To do this, they carried out systematic reviews and meta-analyses of existing scientific literature - these approaches allow them to compare and consolidate results from a number of often-contradictory studies to reach more robust conclusions.

Leaving school

In the first of the two studies, the team looked at the evidence relating to the transition from high school into higher education or employment and how this affects body weight, diet and physical activity. In total, they found 19 studies covering ages 15-35 years, of which 17 assessed changes in physical activity, three body weight, and five diet or eating behaviours.

The team found that leaving high school was associated with a decrease of seven minutes per day of moderate-to-vigorous physical activity. The decrease was larger for males than it was for females (a decrease of 16.4 minutes per day for men compared to 6.7 minutes per day for women). More detailed analysis revealed that the change is largest when people go to university, with overall levels of moderate-to-vigorous physical activity falling by 11.4 minutes per day.

Three studies reported increases in body weight on leaving high school, though there were not enough studies to provide a mean weight increase. Two studies suggested that diets decrease in quality on leaving high school and one suggested the same on leaving university.

"Children have a relatively protected environment, with healthy food and exercise encouraged within schools, but this evidence suggests that the pressures of university, employment and childcare drive changes in behaviour which are likely to be bad for long-term health," said Dr Eleanor Winpenny from CEDAR and the MRC Epidemiology Unit at the University of Cambridge.

"This is a really important time when people are forming healthy or unhealthy habits that will continue through adult life. If we can pinpoint the factors in our adult lives which are driving unhealthy behaviours, we can then work to change them."

Becoming a parent

In the second study, the team looked at the impact of becoming a parent on weight, diet and physical activity.

A meta-analysis of six studies found the difference in change in body mass index (BMI) between remaining without children and becoming a parent was 17%: a woman of average height (164cm) who had no children gained around 7.5kg over five to six years, while a mother of the same height would gain an additional 1.3kg. These equate to increases in BMI of 2.8 versus 3.3.
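As a quick arithmetic check of those figures, using BMI = weight (kg) / height (m)²:

```python
# Sanity-check the reported figures: BMI = weight (kg) / height (m)^2.
height_m = 1.64                 # average height cited in the study
gain_no_children = 7.5          # kg gained over five to six years
gain_mother = 7.5 + 1.3         # mothers gained an additional 1.3 kg

def bmi_change(kg):
    return kg / height_m ** 2

print(round(bmi_change(gain_no_children), 1))   # 2.8
print(round(bmi_change(gain_mother), 1))        # 3.3
```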

Only one study looked at the impact of becoming a father and found no difference in change.

There was little evidence on physical activity and diet. Most studies that included physical activity showed a greater decline in parents versus non-parents. The team found limited evidence on diet, which did not seem to differ between parents and non-parents.

"BMI increases for women over young adulthood, particularly among those becoming a mother. However, new parents could also be particularly willing to change their behaviour as it may also positively influence their children, rather than solely improve their own health," said Dr Kirsten Corder, also from CEDAR and the MRC Epidemiology Unit.

"Interventions aimed at increasing parents' activity levels and improving diet could have benefits all round. We need to take a look at the messages given to new parents by health practitioners as previous studies have suggested widespread confusion among new mothers about acceptable pregnancy-related weight gain."

Credit: 
University of Cambridge

What is an endangered species?

image: Gray wolves, like this pair on Isle Royale, are listed as endangered in the United States.

Image: 
Michigan Technological University

Lions and leopards are endangered species. Robins and raccoons clearly are not. The distinction seems simple until one ponders a question such as: How many lions would there have to be and how many of their former haunts would they have to inhabit before we'd agree they are no longer endangered?

To put a fine point on it, what is an endangered species? The quick answer: An endangered species is at risk of extinction. Fine, except questions about risk always come in shades and degrees, more risk and less risk.

Extinction risk increases as a species is driven to extinction from portions of its natural range. Most mammal species have been driven to extinction from half or more of their historic range because of human activities.

The query "What is an endangered species?" is quickly transformed into a far tougher question: How much loss should a species endure before we agree that the species deserves special protections and concerted effort for its betterment? My colleagues and I put a very similar question to nearly 1,000 (representatively sampled) Americans after giving them the information in the previous paragraph. The results, "What is an endangered species?: judgments about acceptable risk," are published today in Environmental Research Letters.

Three-quarters of those surveyed said a species deserves special protections if it had been driven to extinction from any more than 30% of its historic range. Not everyone was in perfect agreement. Some were more accepting of losses. The survey results indicate that people more accepting of loss were less knowledgeable about the environment and self-identified as advocates for the rights of gun and land owners. Still, three-quarters of this more loss-accepting group thought special protections were warranted if a species had been lost from more than 41% of its former range.

These attitudes of the American public are aligned with the language of the U.S. Endangered Species Act -- the law for preventing species endangerment in the U.S. That law defines an endangered species as one that is "in danger of extinction throughout all or a significant portion of its range."

But There Might Be A Problem

Government decision-makers have tended to agree with the scientists they consult in judging what counts as acceptable risk and loss. These scientists express the trigger point for endangerment in very different terms. They tend to say a species is endangered if its risk of total and complete extinction exceeds 5% over 100 years.

Before human activities began elevating extinction risk, a typical vertebrate species would have experienced an extinction risk of 1% over a 10,000-year period. The extinction risk that decision-makers and their consultant experts have tended to consider acceptable (5% over 100 years) corresponds to an extinction risk many times greater than that natural background rate! Experts and decision-makers -- using a law designed to mitigate the biodiversity crisis -- tend to allow for stunningly high levels of risk. But the law and the general public seem to accept only much lower risk that would greatly mitigate the biodiversity crisis. What's going on?
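To put rough numbers on "many times greater": scaling both standards to a common 100-year window (a naive linear scaling, our simplification rather than the authors' calculation) gives:

```python
# Naive linear scaling of both risk standards to a 100-year window.
# Background, pre-human: 1% per 10,000 years ~ 0.01% per century.
background_per_century = 0.01 * (100 / 10_000)
accepted_per_century = 0.05          # the 5%-per-century expert threshold
print(round(accepted_per_century / background_per_century))   # 500
```

On this rough accounting, the commonly accepted threshold is on the order of 500 times the natural background risk.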

One possibility is that experts and decision-makers are more accepting of the risks and losses because they believe greater protection would be impossibly expensive. If so, the American public may be getting it right, not the experts and decision-makers. Why? Because the law allows for two separate judgments. The first: is the species endangered and therefore deserving of protection? The second: can the American people afford that protection? Keeping those judgments separate is vital because making the case that more funding and effort are required to solve the biodiversity crisis is not helped when experts and decision-makers grossly understate the problem -- as they do when they judge endangerment to entail such extraordinarily high levels of risk and loss.

Facts and Values

Another possible explanation for the judgments of experts and decision-makers was uncovered in an earlier paper led by Jeremy Bruskotter of Ohio State University (also a collaborator on this paper). That paper showed that experts tended to base their judgments about grizzly bear endangerment not so much on their own independent expert judgment as on what they think (rightly or wrongly) their peers' judgment would be.

Regardless of the explanation, a good answer to the question "What is an endangered species?" is an inescapable synthesis of facts and values. Experts on endangered species have a better handle on the facts than the general public. However, there is cause for concern when decision-makers do not reflect the broadly held values of their constituents. An important possible explanation for this discrepancy in values is the influence of special interests on decision-makers and experts charged with caring for biodiversity.

Getting the answer right is of grave importance. If we do not know well enough what an endangered species is, then we cannot know well enough what it means to conserve nature, because conserving nature is largely -- either directly or indirectly -- about giving special care to endangered species until they no longer deserve that label.

Credit: 
Michigan Technological University

New tumor-driving mutations discovered in the under-explored regions of the cancer genome

image: Dr. Jüri Reimand, lead author of the study.

Image: 
S.Scacco/CP Images

Toronto - (January 17, 2020) In an unprecedented pan-cancer analysis of whole genomes, researchers at the Ontario Institute for Cancer Research (OICR) have discovered new regions of non-coding DNA that, when altered, may lead to cancer growth and progression.

The study, published today in Molecular Cell, reveals novel mechanisms of disease progression that could lead to new avenues of research and ultimately to better diagnostic tests and precision therapies.

Although previous studies have focused on the two per cent of the genome that codes for proteins, known as genes, this study analyzed mutation patterns within the vast non-coding regions of human DNA that control how and when genes are activated.

"Cancer-driver mutations are relatively rare in these large non-coding regions that often lie far from genes, presenting major challenges for systematic data analysis," says Dr. Jüri Reimand, investigator at OICR and lead author of the study.

"Powered by novel statistical tools and whole genome sequencing data from more than 1,800 patients, we found evidence of new molecular mechanisms that may cause cancer and give rise to more-aggressive tumours."

The research group analyzed more than 100,000 sections of each patient's genome, focusing on the often-overlooked non-coding regions that interact with genes through the three-dimensional genome. One of the 30 key regions discovered was predicted to have a significant role in regulating a known anti-tumour gene in cancer cells, despite being more than 250,000 base pairs away from the gene in the genome. The group performed CRISPR-Cas9 genome editing and functional experiments in human cell lines to explore the cancer-driving properties of this non-coding region.

"We characterized several non-coding regions potentially involved in oncogenesis, but we've just scratched the surface," says Reimand. "With our algorithms and the rapidly growing datasets of patient cancer genomes and epigenetic profiles, we look forward to enabling future discoveries that could lead to new ways to predict how a patient's cancer will progress and ultimately new ways to target a patient's disease or diagnose it more precisely."

Reimand's research group developed the statistical methods behind this study and made them freely available for the research community to use. These methods have been rigorously tested against other algorithms from around the world.

"Looking into the non-coding genome is really important because these vast sections regulate our genes and can switch them on and off. Mutations in these regions can cause these regulatory switches to act abnormally and potentially cause - or advance - cancer," says Helen Zhu, student at OICR and co-first author of the study. "We've shown that our method, called ActiveDriverWGS, can excavate these regions and pinpoint specific areas that are important to cancer growth."
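The statistical intuition behind driver-region detection can be sketched as asking whether a region carries more mutations than a uniform background rate would predict. This is only an illustration, not the actual ActiveDriverWGS model (which uses a more sophisticated regression framework); all rates and counts below are invented.

```python
from math import comb

def binom_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): the chance of seeing at
    least k mutations in n positions under background rate p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Invented numbers: a 500 bp non-coding region, a background rate of
# 2 mutations per kb across the cohort, and 6 observed mutations.
GENOME_RATE = 2 / 1_000
region_len = 500
observed = 6

expected = GENOME_RATE * region_len      # 1.0 mutation expected
p_value = binom_tail(region_len, observed, GENOME_RATE)
print(f"expected {expected:.1f}, observed {observed}, p = {p_value:.4g}")
```

A small p-value flags the region as mutated more often than chance alone would explain, making it a candidate driver worth experimental follow-up.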

"Although these candidate driver mutations are rare, we now have the first experimental evidence that one of the mutated regions regulates cancer genes and pathways in human cell lines," says Dr. Liis Uusküla-Reimand, Research Associate at The Hospital for Sick Children (SickKids) and co-first author of the study. "As the research community collects more data, we plan to look deeper into these regions to understand how the mutations alter gene regulation and chromatin architecture in specific cancer types, to enable the development of new precision therapies for patients with these diseases."

Credit: 
Ontario Institute for Cancer Research

Scurvy is still a thing in Canada

image: First author Dr. Kayla Dadgar

Image: 
Kayla Dadgar

HAMILTON, ON (Jan. 17, 2019) - Scurvy, the debilitating condition remembered as a disease of pirates, is still found in Canada.

The disease, which is caused by a vitamin C deficiency, can result in bruising, weakness, anemia, gum disease, hemorrhage, tooth loss, and even death if undiagnosed and untreated.

McMaster University researchers surveyed the data of patients of Hamilton's two hospital systems over nine years and found 52 with low vitamin C levels. These included 13 patients who could be diagnosed as having scurvy and an additional 39 who tested positive for vitamin C deficiency but did not have documented symptoms.

Among those with scurvy, some cases were related to alcohol use disorder or to bariatric surgery, but the majority were related to other causes of malnutrition such as persistent vomiting, purposeful dietary restrictions, mental illness, social isolation and dependence on others for food.

"Scurvy is seen as a disease irrelevant to the modern world, but it still exists, and clinicians caring for at-risk patients should be aware of it and know how to diagnose it," said John Neary, associate professor of medicine at McMaster and the senior author of the study published this month in the Journal of General Internal Medicine.

First author Kayla Dadgar, who did the research as a medical student at McMaster, said: "Scurvy should be a 'never event' in a healthy society. That it still occurs in Canada in our time indicates that we are not supporting vulnerable people as we should."

The patients with scurvy who were given vitamin C experienced rapid resolution of their symptoms.

Credit: 
McMaster University

New scheduling tool offers both better flight choices and increased airline profits

Image: From left, Vikrant Vaze, assistant professor of engineering at Dartmouth; Keji Wei, an engineering Ph.D. candidate while working on the paper and now a senior operations research analyst at Sabre Corporation; and Alexandre Jacquillat, an assistant professor of operations research and statistics at the MIT Sloan School of Management (credit: Thayer School of Engineering at Dartmouth)

HANOVER, N.H. - January 17, 2020 - Researchers from Dartmouth and the Massachusetts Institute of Technology (MIT) have developed an original approach to flight scheduling that, if implemented, could result in a significant increase in profits for airlines and more flights that align with passengers' preferences. The approach is presented in a paper, "Airline Timetable Development and Fleet Assignment Incorporating Passenger Choice," recently published in Transportation Science, the leading journal in the field of transportation analysis.

Some of the most critical decision-making steps taken by airlines across the world rely on tools that do not fully incorporate passengers' preferences and the dynamics of flight scheduling, resulting in missed profits and unsatisfied passengers, according to the authors. The new paper uses 2016 data from Alaska Airlines to introduce an original integrated optimization approach to comprehensive flight timetabling and fleet assignment while taking into consideration passengers' preferences, such as flight departure time.

"Beyond ticket prices, perhaps the biggest thing that air passengers care about is the convenience of flight schedule. Yet, due to the associated computational complexities, nobody has really tried to completely redesign an airline's flight schedule from scratch to take passenger preference into account," said co-author Vikrant Vaze, assistant professor of engineering at Dartmouth. "This paper does just that, by proposing a comprehensive mathematical model and a new algorithm to solve it. It aligns the flight schedules to passenger preferences, in turn maximizing airline profits."

The model's flexible and comprehensive approach would enable airlines to increase the number of passengers with one-stop itineraries, and, consequently, dramatically increase the total one-stop revenue and the total operating profit compared with the most advanced approaches currently used in the industry. In addition, the paper suggests that an airline using this approach would experience a significant increase in market share.
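The underlying idea can be illustrated with a toy example. The sketch below brute-forces a two-flight timetable under a simple multinomial-logit passenger choice model; every number in it (segment preferences, fare, operating cost, candidate slots, utility decay) is invented for illustration and is unrelated to the paper's model or its Alaska Airlines data, which the authors handle at full scale with a dedicated algorithm.

```python
import itertools
import math

# Hypothetical passenger segments, each with a preferred departure hour.
# All figures below are illustrative assumptions, not data from the paper.
IDEAL_TIMES = [8, 9, 17, 18]          # preferred departure hours of four segments
SEGMENT_SIZE = 100                    # passengers per segment
FARE = 150                            # revenue per passenger
OP_COST = 9000                        # operating cost per scheduled flight
CANDIDATE_SLOTS = [7, 9, 11, 13, 15, 17, 19]
NO_FLY_UTILITY = -2.0                 # utility of the outside (no-fly) option

def schedule_profit(slots):
    """Expected profit of a timetable under a multinomial-logit choice model."""
    revenue = 0.0
    for ideal in IDEAL_TIMES:
        # Utility decays with the gap between ideal and actual departure time.
        utils = [-0.5 * abs(s - ideal) for s in slots]
        denom = math.exp(NO_FLY_UTILITY) + sum(math.exp(u) for u in utils)
        p_fly = sum(math.exp(u) for u in utils) / denom
        revenue += SEGMENT_SIZE * p_fly * FARE
    return revenue - OP_COST * len(slots)

# Brute-force search over all two-flight timetables.
best = max(itertools.combinations(CANDIDATE_SLOTS, 2), key=schedule_profit)
print(best, round(schedule_profit(best)))
```

Even in this toy setting, the profit-maximizing timetable places one flight near the morning peak and one near the evening peak, mirroring the paper's intuition that aligning departures with passenger preferences raises both convenience and revenue.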

First author Keji Wei, who was an engineering PhD candidate at Dartmouth while working on the study, received the Anna Valicek Award at the Airline Group of the International Federation of Operational Research Societies (AGIFORS) Symposium last fall for his work on this paper. Wei is now a senior operations research analyst at Sabre Corporation, a leading technology solutions provider to the travel industry.

In addition to Wei and Vaze, the paper was co-authored by Alexandre Jacquillat, an assistant professor of operations research and statistics at the MIT Sloan School of Management.

The authors note that the paper does not consider factors such as business strategy and aircraft orders, for which data are not available, and, for simplicity's sake, omits airport gate and slot availability. However, the approach is designed to be versatile and usable for a variety of strategic planning decisions made by major airlines within a realistic computational budget.

Vaze is currently working on a follow-up paper that will incorporate revenue management considerations into scheduling and fleet assignment.

Credit: 
Thayer School of Engineering at Dartmouth

Study: Critical care improvements may differ depending on hospital's patient population

Boston, Mass. - Racial disparities have previously been identified across a range of health care environments, sometimes extending into the highest levels of care. A new study led by researchers at Beth Israel Deaconess Medical Center (BIDMC) reveals that while critical care outcomes in intensive care units (ICUs) steadily improved over a decade at hospitals with few minority patients, ICUs with a more diverse patient population did not progress comparably. Published today in the American Journal of Respiratory and Critical Care Medicine, the findings reveal that the gap is most apparent for critically ill African-American patients.

Lead author John Danziger, MD, MPhil, a nephrologist at BIDMC, and colleagues examined trends in ICU mortality and length of stay from 2006 to 2016 in more than 200 hospitals across the United States. To examine differences in critical care outcomes across hospitals, the team compared the data between two types of institutions. For the purpose of the study, hospitals with a greater than 25 percent African-American and/or Hispanic ICU patient census were defined as minority-serving hospitals, while those with less were identified as non-minority hospitals.

The team found a steady annual decline of two percent in ICU deaths at non-minority hospitals; however, the same improvement in mortality rate was not seen at minority-serving hospitals. Minority-serving hospitals also reported longer lengths of ICU stay and critical illness hospitalizations than non-minority hospitals.
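A steady annual decline compounds over the study window; a quick back-of-the-envelope calculation with a hypothetical baseline rate shows the cumulative gap that opens up over ten years between hospitals that improved and those that did not:

```python
# Cumulative effect of a steady 2% annual decline in ICU mortality across the
# ten year-over-year steps of the 2006-2016 window. The baseline rate is a
# hypothetical figure for illustration, not data from the study.
baseline = 100.0  # deaths per 1,000 ICU admissions (illustrative)
improved = baseline * (1 - 0.02) ** 10
print(round(improved, 1))  # about 81.7, i.e., roughly an 18% cumulative reduction
```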

In addition to the disparity for all ICU patients seen in minority-serving hospitals, the researchers observed a particularly stark difference in care for critically ill African-American patients. African-Americans treated at non-minority hospitals experienced a three percent decline in mortality each year, compared to no decline in mortality when treated at minority-serving hospitals.

While the study does not determine whether the outcomes at minority-serving hospitals are due to differences in hospital resources and practices or to systemic disadvantages affecting these patient populations, the findings highlight the profound obstacles minorities and minority-serving hospitals face.

"Although our analysis does not resolve the reasons for differences in outcomes, it identifies minority serving hospitals as an area of great need," said Danziger. "Focusing research efforts to further address these inequalities is critical in mitigating the disadvantages minorities face and ultimately closing the health care divide."

Credit: 
Beth Israel Deaconess Medical Center

Reward improves visual perceptual learning -- but only after people sleep

PROVIDENCE, R.I. [Brown University] -- Past studies have found that rewarding participants during a visual perceptual task leads to performance gains. However, new research suggests that these performance gains occur only if participants follow up the task with sleep.

The new findings may have particular implications for students tempted to sacrifice sleep in favor of late-night study sessions, said study corresponding author Yuka Sasaki, a professor of cognitive, linguistic and psychological sciences at Brown University.

"College students work very hard, and they sometimes shorten their sleep," Sasaki said. "But they need sleep in order to retain their learning."

In the study, published this month in the Proceedings of the National Academy of Sciences, young adults were asked to identify a letter and the orientation of a set of lines on a busy background. Some participants were told to refrain from eating or drinking in the hours leading up to the task and were then given drops of water as a reward for correct responses. In contrast to groups that were not rewarded during training, rewarded participants exhibited significant performance gains -- but only if they slept after the training session. This finding suggests that reward doesn't improve visual perceptual learning until people sleep.

The researchers believe that reward (or anticipation of reward) reinforces neural circuits between reward and visual areas of the brain, and these circuits are then more likely to reactivate during sleep to facilitate task learning. Indeed, during post-training sleep in rewarded participants, electroencephalogram (EEG) recordings found increased activation in the prefrontal, reward-processing area of the brain and decreased activation in the untrained visual areas of the brain.

That pattern of activation can likely be explained by past studies, which suggest that the prefrontal, reward-processing area of the brain sends signals to inhibit some of the neurons in the visual processing area. As a result, irrelevant connections are trimmed and the most efficient connections are preserved, and task performance improves.

The study also examined when the patterns of activation occurred. Untrained visual areas of the brain exhibited reduced activation during both REM and non-REM sleep, but prefrontal, reward-processing areas became active only during REM sleep. REM sleep appears to be particularly important for task learning -- likely because connections are reorganized and optimized during this sleep stage -- and it may be linked to the activation of reward-processing areas of the brain. Consistent with this theory, the rewarded study participants exhibited longer periods of REM sleep compared to those who did not receive a reward during training.

Sasaki added that physical-based rewards, like food and water, may have a stronger impact on neural circuits compared to rewards such as money.

"Water deprivation may be fundamental," she said. "When you're really thirsty and you get water as a reward, the impact of that reward may be more prevailing to the brain."

Future research could examine whether other types of learning, such as motor and associative learning, also benefit from the interaction between reward and sleep.

Going forward, Sasaki hopes the study will encourage collaboration between sleep researchers and scientists studying reinforcement learning.

"Reinforcement learning is a hot topic in neuroscience, but it hasn't interacted much with sleep research," she said. "So this could lead to more interdisciplinary work."

Credit: 
Brown University

UVA engineering professor Jack W. Davidson named an IEEE fellow

CHARLOTTESVILLE, Va. - UVA Engineering computer science professor Jack W. Davidson has been named an Institute of Electrical and Electronics Engineers Fellow in recognition of his contributions to compilers, computer security and computer science education.

The institute’s board of directors annually awards the designation to those who have contributed to the advancement or application of engineering, bringing significant value to society. Fellow is the highest level of membership and is recognized by the technical community as an important career achievement. Fewer than 1% of the total voting membership receive the award.

Davidson received his Ph.D. in computer science from the University of Arizona in 1981. The same year, he joined UVA Engineering as a professor in the Department of Computer Science. During his 38-year career at UVA, he has been the principal investigator on numerous high-profile grants to develop comprehensive methods for protecting software from malicious attacks.

His research accomplishments have made him internationally recognized in cybersecurity. Davidson leads the University’s Cyber Innovation and Society Institute, launched in 2018.

The Cyber Innovation and Society Institute brings together faculty from technical and humanities fields across the University to understand the impact of cyber systems on society, especially how they affect human values such as privacy, freedom, democracy and individual autonomy; to understand the risks and consequences of attacks on cyber systems and strategies for responding to them; to ensure that these systems operate securely and dependably as intended; and to ensure that the data they collect and process are secure from improper use.

This year, Davidson and the Cyber Innovation and Society Institute were awarded a national grant from The Public Interest Technology University Network to establish a course aimed at teaching graduate students to deeply examine the complex ethical, legal and policy implications of new technologies. The graduate course, Innovation in the Public Interest, will be offered for the first time in the spring of 2020.

Davidson is a former recipient of the Institute of Electrical and Electronics Engineers Taylor L. Booth Education Award for outstanding achievement in computer science and engineering education. He has also been active in the Association for Computing Machinery, making many contributions leading to his current tenure on the association’s executive council. Davidson was associate editor on two separate association publications: ACM Transactions on Programming Languages and Systems and ACM Transactions on Architecture and Code Optimization. He also served as chair of the Special Interest Group on Programming Languages. In his role as part of the executive council, he serves as co-chair of the association’s Publications Board.

“It is a great honor to be selected for recognition as IEEE Fellow. It is also a reflection of the supportive environment provided by UVA Engineering that enables faculty to achieve their scientific and professional goals,” Davidson said.

Credit: 
University of Virginia School of Engineering and Applied Science

When David poses as Goliath

Stellar black holes form when massive stars end their life in a dramatic collapse. Observations have shown that stellar black holes typically have masses of about ten times that of the Sun, in accordance with the theory of stellar evolution. Recently, a Chinese team of astronomers claimed to have discovered a black hole as massive as 70 solar masses, which, if confirmed, would severely challenge the current view of stellar evolution.

The publication immediately triggered theoretical investigations as well as additional observations by other astrophysicists. Among those to take a closer look at the object was a team of astronomers from the Universities of Erlangen-Nürnberg and Potsdam. They discovered that it may not necessarily be a black hole at all, but possibly a massive neutron star or even an 'ordinary' star. Their results have now been published as a highlight paper in the renowned journal Astronomy & Astrophysics.

The putative black hole was detected indirectly from the motion of a bright companion star, orbiting an invisible compact object over a period of about 80 days. From new observations, a Belgian team showed that the original measurements were misinterpreted and that the mass of the black hole is, in fact, very uncertain. The most important question, namely how the observed binary system was created, remains unanswered. A crucial aspect is the mass of the visible companion, the hot star LS V+22 25. The more massive this star is, the more massive the black hole has to be to induce the observed motion of the bright star. The latter was considered to be a normal star, eight times more massive than the Sun.

A team of astronomers from Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU) and the University of Potsdam had a closer look at the archival spectrum of LS V+22 25, taken by the Keck telescope at Mauna Kea, Hawaii. In particular, they were interested in studying the abundances of the chemical elements on the stellar surface. Interestingly, they detected deviations in the abundances of helium, carbon, nitrogen, and oxygen compared with the standard composition of a young massive star. The surface pattern showed the ashes of hydrogen fusion, a process that happens only deep in the cores of young stars and whose products would not be expected to appear at the surface.

'At first glance, the spectrum did indeed look like one from a young massive star. However, several properties appeared rather suspicious. This motivated us to have a fresh look at the archival data,' said Andreas Irrgang, the leading scientist of this study and a member of the Dr. Karl Remeis-Observatory in Bamberg, the Astronomical Institute of FAU.

The authors concluded that LS V+22 25 must have interacted with its compact companion in the past. During this episode of mass-transfer, the outer layers of the star were removed and now the stripped helium core is visible, enriched with the ashes from the burning of hydrogen.

However, stripped helium stars are much lighter than their normal counterparts. Combining their results with recent distance measurements from the Gaia space telescope, the authors determined a most likely stellar mass of only 1.1 (with an uncertainty of +/-0.5) times that of the Sun. This yields a minimum mass of only 2-3 solar masses for the compact companion, suggesting that it may not necessarily be a black hole at all, but possibly a massive neutron star or even an 'ordinary' star.
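The relationship at work here, where a lighter visible star implies a lighter unseen companion, follows from the binary mass function of a single-lined spectroscopic binary: the orbit fixes f = (M2 sin i)^3 / (M1 + M2)^2, and setting sin i = 1 gives the minimum companion mass. The sketch below uses a purely hypothetical mass-function value, not the measured one for LS V+22 25, to show the scaling:

```python
# Minimum companion mass M2 (in solar masses) given the visible star's mass M1
# and the orbital mass function f, solving M2**3 = f * (M1 + M2)**2 by bisection.
# The value of f below is an illustrative assumption, not a measured quantity.

def min_companion_mass(m1, f, lo=0.0, hi=1000.0, steps=200):
    """Bisection on g(M2) = M2**3 - f*(M1 + M2)**2, which is negative at 0."""
    for _ in range(steps):
        mid = (lo + hi) / 2
        if mid**3 - f * (m1 + mid)**2 < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

f = 1.2  # hypothetical mass function, in solar masses
light = min_companion_mass(1.1, f)  # stripped helium star, ~1.1 solar masses
heavy = min_companion_mass(8.0, f)  # the originally assumed normal star
print(round(light, 2), round(heavy, 2))
```

With the same (hypothetical) mass function, lowering the visible star's mass from eight solar masses to about one pushes the minimum companion mass down from several solar masses into neutron-star territory, which is the crux of the reinterpretation.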

The star LS V+22 25 has become famous for possibly having a massive black hole companion. However, a closer look reveals that the star itself is a very intriguing object in its own right: while stripped helium stars of intermediate mass have been predicted in theory, only very few have been discovered so far. They are key objects for understanding binary star interactions.

Credit: 
Friedrich-Alexander-Universität Erlangen-Nürnberg

Prosecutors' race, class bias may not drive criminal justice disparities

America's prison populations are disproportionately filled with people of color, but prosecutors' biases toward defendants' race and class may not be the primary cause for those disparities, new research from the University of Arizona suggests.

The finding, which comes from a unique study involving hundreds of prosecutors across the U.S., counters decades' worth of previous research. Those studies relied on pre-existing data, such as charges and punishments that played out in courtrooms. In a 1993 study, for example, researchers found that prosecutors in Los Angeles were 1.59 times more likely to fully prosecute an African American defendant for crack-related charges than a white defendant. That likelihood was 2.54 times greater for Hispanic defendants compared to white defendants.

The new study, led by Christopher Robertson, a professor of law and associate dean for research and innovation at the James E. Rogers College of Law, involved a controlled experiment with prosecutors, asking them to examine the same hypothetical case but changing the race and class of the defendant.

The study, administered online, provided prosecutors with police reports describing a hypothetical crime, which the researchers designed with assistance from experienced prosecutors. All details of the case were the same except for the suspect's race - either black or white - and occupation - fast-food worker or accountant - to indicate the suspect's socioeconomic status. Roughly half of the prosecutors received one version of the case; the other half received the other.
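The random assignment described above can be sketched in a few lines. The version contents and participant count below are illustrative stand-ins, not the study's actual materials:

```python
import random

# Two versions of the same police report, identical except for the suspect's
# race and occupation (the attribute values here are illustrative labels).
VERSIONS = [
    {"race": "black", "occupation": "fast-food worker"},
    {"race": "white", "occupation": "accountant"},
]

def assign(participant_ids, seed=0):
    """Split participants roughly in half between the two case versions."""
    rng = random.Random(seed)
    ids = list(participant_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    assignment = {pid: VERSIONS[0] for pid in ids[:half]}
    assignment.update({pid: VERSIONS[1] for pid in ids[half:]})
    return assignment

assignment = assign(range(300))
n_first = sum(1 for v in assignment.values() if v == VERSIONS[0])
print(n_first, len(assignment) - n_first)
```

Because the two versions are identical apart from the manipulated attributes, any systematic difference in charging decisions between the groups can be attributed to those attributes rather than to features of the case.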

The study allowed researchers to "really isolate the prosecutor's decision-making in a way that mere observational research wouldn't allow," said Robertson, whose co-authors are Shima Baradaran Baughman of the University of Utah and Megan Wright of Penn State. The paper was published in the Journal of Empirical Legal Studies.

The outcomes the study looked for included whether prosecutors charged a felony, whether they chose to fine the defendant or seek a prison sentence, and the proposed cost of the fine or length of the sentence.

"When we put all those together, we see the same severity of charges, fines and sentences across all the conditions, whether the defendant was black, whether the defendant was white, whether the defendant had a high-class career or a low-class career," Robertson said. "Differences in the actual outcomes - in the actual behavior of the prosecutors - is what we would have expected if they were biased. But since we see no difference in the outcomes, we concluded that they were not substantially biased."

Given previous research indicating that rampant bias drives criminal justice disparities, Robertson's results may surprise many - just as they surprised the researchers.

"We were surprised at the bottom line," he said.

Robertson offered one possible explanation for the unexpected result.

"We conducted this study in 2017 and 2018 and prosecutors have been under a spotlight for some time," he said. "They've been training and are aware of and are working hard to not be biased in their own decision-making."

The results do not rule out race and class bias as factors in prosecutorial decision-making but suggest that policymakers committed to addressing systemic racism and classism in the legal system may be more successful seeking reforms in other areas.

"The disparities in outcomes are indisputable," Robertson said. "As we go through the criminal justice system and think about what the right reforms are, the sheer bias of the prosecutor doesn't seem to be the biggest one."

Robertson said policymakers may be better off focusing on disparities that occur before someone is even arrested, in areas such as economic development and education.

"Crime is associated with poverty, and race in America is associated with poverty, so I think some very front-end questions of social policy are really important," he said. "At the same time, I think, on the back end, to shift the focus, there's a growing consensus among people on the left and the right that our 40-year-long war on crime has been ineffectual in some ways and that we could make the criminal justice system much less severe and much less expensive and thereby reduce some of these same disparities."

Robertson also stresses that his study's results aren't the final word on prosecutor bias - a problem that still needs addressing, he said. Even after these findings, he remains a proponent of blinding prosecutors to defendants' race, a detail that is often not relevant to prosecutors after an arrest is made. Prosecutor blinding is the focus of Robertson's next research project.

Credit: 
University of Arizona