
Your smartphone could soon be making your commute much less stressful

Image: Researchers at the University of Sussex used mobile phones to collect data on different modes of transport. Credit: The University of Sussex

Apps that can detect what mode of transport phone users are travelling on and automatically offer relevant advice are set to become a reality after extensive data-gathering research led by the University of Sussex.

Researchers at the University of Sussex's Wearable Technologies Lab believe that the machine learning techniques developed in a global research competition they initiated could also lead to smartphones being able to predict upcoming road conditions and traffic levels, offer route or parking recommendations and even detect the food and drink consumed by a phone user while on the move.

Professor Daniel Roggen, a Reader in Sensor Technology at the University of Sussex, said: "This dataset is truly unique in its scale, the richness of the sensor data it comprises and the quality of its annotations. Previous studies generally collected only GPS and motion data. Our study is much wider in scope: we collected all sensor modalities of smartphones, and we collected the data with phones placed simultaneously at four locations where people typically carry their phones such as the hand, backpack, handbag and pocket.

"This is extremely important to design robust machine learning algorithms. The variety of transport modes, the range of conditions measured and the sheer number of sensors and hours of data recorded is unprecedented."

Prof Roggen and his team collected the equivalent of more than 117 days' worth of data monitoring aspects of commuters' journeys in the UK using a variety of transport methods to create the largest publicly available dataset of its kind.

The project, whose findings will be presented at the Ubicomp conference in Singapore on Friday [October 12], gathered data from four mobile phones carried by researchers as they went about their daily commute over seven months.

The team launched a global competition challenging teams to develop the most accurate algorithms to recognize eight modes of transport (sitting still, walking, running, cycling or taking the bus, car, train or subway) from the data collected from 15 sensors measuring everything from movement to ambient pressure.

The project was supported by Chinese telecoms giant Huawei, with academics at Ritsumeikan University and the Kyushu Institute of Technology in Japan and Saints Cyril and Methodius University of Skopje in Macedonia. Seventeen teams took part: two entries achieved results with more than 90% accuracy, eight scored between 80% and 90%, and nine between 50% and 80%.

The winning team, JSI-Deep of the Jozef Stefan Institute in Slovenia, achieved the highest score of 93.9% using a combination of deep and classical machine learning models. In general, deep learning techniques tended to outperform traditional machine learning approaches, although not by a significant margin.
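To give a flavour of the classical end of the approaches the competition compared, here is a minimal sketch in Python: a nearest-centroid classifier over hand-crafted features computed from sensor windows. The features, synthetic data and two-mode setup are invented for illustration and are far simpler than the 15-sensor SHL dataset or the winning deep models.

```python
import math
import random

# The eight transport modes targeted in the competition.
MODES = ["still", "walk", "run", "bike", "bus", "car", "train", "subway"]

def extract_features(accel, pressure):
    """Toy features for one sensor window: mean and variance of
    acceleration magnitude, plus total pressure change (illustrative
    stand-ins for the many channels in the real dataset)."""
    n = len(accel)
    mean = sum(accel) / n
    var = sum((a - mean) ** 2 for a in accel) / n
    return [mean, var, pressure[-1] - pressure[0]]

class NearestCentroid:
    """Minimal classical baseline: label a window by the closest
    per-mode feature centroid."""
    def fit(self, X, y):
        sums, counts = {}, {}
        for x, label in zip(X, y):
            s = sums.setdefault(label, [0.0] * len(x))
            for i, v in enumerate(x):
                s[i] += v
            counts[label] = counts.get(label, 0) + 1
        self.centroids = {
            label: [v / counts[label] for v in s] for label, s in sums.items()
        }
        return self

    def predict(self, x):
        return min(self.centroids,
                   key=lambda label: math.dist(x, self.centroids[label]))

# Synthetic training windows: "still" has low acceleration variance,
# "walk" has high variance; pressure is held constant.
rng = random.Random(0)

def window(mode):
    spread = 0.02 if mode == "still" else 1.5
    accel = [9.8 + rng.gauss(0, spread) for _ in range(100)]
    return extract_features(accel, [1013.0] * 100)

X = [window("still") for _ in range(20)] + [window("walk") for _ in range(20)]
y = ["still"] * 20 + ["walk"] * 20
clf = NearestCentroid().fit(X, y)
print(clf.predict(window("walk")))  # prints "walk" for this seeded data
```

Real entries replaced these toy features with rich spectral and statistical descriptors (or learned representations) across all sensor channels and phone positions.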

It is now hoped that the highly versatile University of Sussex-Huawei Locomotion-Transportation (SHL) dataset will be used for a wide range of studies into electronic logging devices exploring transportation mode recognition, mobility pattern mining, localization, tracking and sensor fusion.

Prof Roggen said: "By organising a machine learning competition with this dataset we can share experiences in the scientific community and set a baseline for future work. Automatically recognising modes of transportation is important to improve several mobile services - for example to ensure video streaming quality despite entering tunnels or subways, or to proactively display information about connection schedules or traffic conditions.

"We believe other researchers will be able to leverage this unique dataset for many innovative studies and novel mobile applications beyond smart-transportation, for example to measure energy expenditure, detect social interaction and social isolation, or develop new low-power localisation techniques and better mobility models for mobile communication research."

Credit: 
University of Sussex

Recent National Academies report puts research participants' rights at risk, say law scholars

In a Policy Forum article appearing in the Oct. 12 issue of Science, leading bioethics and legal scholars sound the alarm about a recent report from the National Academies of Sciences, Engineering, and Medicine. The Academies' report on "Returning Individual Research Results to Participants" makes recommendations on how to share research results and data with people who agree to participate in research studies and calls for problematic changes to federal law. This report proclaims its support for research participants' rights but, in reality, creates major new roadblocks to the return of data and results to participants and would roll back important privacy protections people have under current law, according to the analysis in the new Science article.

The article's authors, Susan M. Wolf and Barbara J. Evans, collaborated as part of the "LawSeq: Building a Sound Legal Foundation for Translating Genomics into Clinical Application" project funded by the National Human Genome Research Institute and National Cancer Institute of the National Institutes of Health. Wolf is the McKnight Presidential Professor of Law, Medicine & Public Policy; Faegre Baker Daniels Professor of Law; and Professor of Medicine at the University of Minnesota and is Chair of the University's Consortium on Law and Values in Health, Environment & the Life Sciences. Evans is the Mary Ann and Lawrence E. Faust Professor of Law, Professor of Electrical and Computer Engineering, and Director of the Center for Biotechnology & Law at the University of Houston.

"Researchers conducting imaging, environmental health, and genetics studies have offered participants their research findings for years," Wolf and Evans point out. Research participants value access to their results for a wide range of reasons, including protecting their health, and evaluating the privacy risks posed by circulation of their data. People value access to results even when the results are still under study and may be uncertain. Over the past 20 years, researchers have developed pathways for returning results in situations where the results raise clinical concerns, such as suggesting that the person may have a medical condition that needs clinical follow-up evaluation. These pathways are ethically sound and protect the participants' safety by ensuring compliance with necessary laws and regulations. Unfortunately, the Policy Forum article asserts, "the Academies' report rejects this widely supported, legally sound approach" and instead recommends restrictions on access to research results and data.

Wolf and Evans write that, "Efforts to turn back the clock on return of results appear rooted in confusion about the law." The Academies' report incorporates incorrect statements about the federal CLIA legal framework, which aims to ensure the quality of laboratory tests conducted for health care purposes.

The report overstates the degree to which research laboratories can be regulated under the CLIA statute.

The Academies' report also conflicts with existing federal privacy laws that protect research participants' access to their own data. For more than 50 years, Congress has treated individual access to one's own data as an essential element of personal privacy protection, as seen in the Privacy Act that protects data stored in governmental databases, the HIPAA Privacy Rule that protects Americans' medical privacy, and the Genetic Information Nondiscrimination Act that expanded HIPAA's protections to genetic information. Only by seeing the personal data collected can an individual assess the privacy risks involved. Yet the Academies' report recommends that an individual's access to their data be restricted to the subset of data that meets certain quality standards. Wolf and Evans explain how this would undermine federal privacy protections, which recognize that privacy can be put at risk even by low-quality data and data that is wrongly attributed to a person.

Finally, the Policy Forum article criticizes the Academies' recommendation to load multiple decisions about return of results on Institutional Review Boards (IRBs). This would place "substantial new burdens on IRBs, despite extensive literature on the limits of IRB decision making." The report "maximizes the burden on IRBs by mischaracterizing existing consensus guidelines and suggesting that IRBs start over."

Wolf and Evans conclude, "The Academies' report endorses the idea of participant access to results and data, but then builds daunting barriers. The report rejects established legal rights of access, two decades of consensus guidelines, and abundant data showing that participants benefit from access while incurring little risk. The report too often prefers paternalistic silence over partnership."

"True progress on return of results requires accepting participants' established rights of access and respecting the value that participants place on broad access to their data and results. The next step is not to build barriers but to promote transparency."

Credit: 
University of Minnesota

UK Alabama rot risk may be linked to certain types of dog breed and habitat

The risk of contracting cutaneous and renal glomerular vasculopathy (CRGV), popularly known as Alabama Rot, may be higher in certain types of dog breed and land habitat, indicate two linked studies published in this week's Vet Record.

The clinical signs of Alabama Rot typically include skin ulcers and anaemia, progressing to kidney damage and renal failure. As yet, the cause is unknown.

The first known cases in the UK were reported in 2012 in the New Forest in southern England, and cases have tended to occur more frequently at certain times of the year, and in certain geographical areas. But it's not clear what other potential risk factors there might be.

In the first study, the researchers assessed whether certain types of breed might be more at risk. They looked at 101 reported cases (out of 103) diagnosed between November 2012 and May 2017, comparing them with more than 446,000 dogs in receipt of veterinary care at practices submitting data on health issues to the VetCompass programme during 2013.

On average, the vet practice dogs were nearly 4.5 years old, and just over half were male. The most common Kennel Club breeds were gun dogs (spaniels and retrievers), terriers, and toy dogs; working dogs and hounds (salukis, whippets, and Hungarian vizslas) were the least common.

Crossbreds made up over a third (just under 38%) of all the dogs, with Labrador retrievers, Staffordshire bull terriers, and Jack Russells the most common specified breeds.

Compared with the vet practice dogs, those diagnosed with Alabama Rot were more likely to be female (58% vs 48%) and neutered (69% vs 45.5%).

Among the Kennel Club breeds, gun dogs and hounds made up nearly two thirds (60%) of the Alabama Rot cases. They were between 9 and 11 times as likely to have been diagnosed as terriers.

Among the specified breeds, Staffordshire bull terriers, Jack Russells and German shepherds were the least likely to have been diagnosed, while English springer spaniels, whippets, flat-coated retrievers and Hungarian vizslas were the most likely.

"It is possible that these breed associations result from an inherent susceptibility among these breeds as a result of genetic or behavioural patterns, but it is also possible that the predisposition results from geographical confounding whereby these breeds may occur more commonly in areas with a high risk of CRGV occurrence," explain the researchers.

In the second study, the researchers looked only at the 101 dogs that had been diagnosed with Alabama Rot to see if there were any patterns in timing, geography, and terrain.

Most cases (90%) were reported between December and May, with a third diagnosed between January and March. Fewer than one in 10 cases were diagnosed between June and August.

Cases were reported from most of the western and southern regions of England, over the five years, with the lowest risk seemingly in eastern England, in particular, East Anglia.

Habitat emerged as an influential factor, accounting for more than 20 per cent of the difference in the geographical distribution of cases. Dry lowland heathland and woodland areas were the most likely to be associated with a diagnosis, while pasture was the least likely.

Areas with higher maximum temperatures in winter, and higher average rainfall in winter and spring (such as the West and South of England), were also associated with a heightened risk of a diagnosis.

Both these studies are observational, and as such, no definitive conclusions can be reached about causality.

But the researchers say their findings may help raise the index of suspicion among vets, as it is particularly important to treat Alabama Rot promptly, as well as giving dog owners an indication of when to be extra vigilant.

But further research is warranted to find out if the breeds seemingly at higher risk are inherently more vulnerable or whether there are higher proportions of these breeds in areas of greater risk, they conclude.

In a linked editorial, managing editor Suzanne Jarvis reiterates this uncertainty, emphasising the need for caution until further research can shed more light on the matter.

"The next step is to see if this suspected geographical connection holds true; what it is not, is to scare owners of these identified breeds," she insists.

Credit: 
BMJ Group

Ability to recover after 'maximum effort' is crucial to soccer stardom

Footballers' ability to recover after high-intensity effort may depend not on their age but on the division in which they play, a new study has suggested.

A multinational team of scientists led by the Complutense University of Madrid (UCM) carried out maximum-effort tests with Spanish division one and division two soccer players.

They then measured the players' oxygen consumption, heart rate and ventilation during recovery.

Professor Francisco Javier Calderón Montero, from UCM, is the study's lead author. He said: "Regardless of a player's technical ability, the ability to repeat sprints is essential in soccer. Players may need to sprint every 90 seconds during a game, meaning the available recovery time will be short."

"We wanted to discover whether differences in recovery time before the next sprint were linked to the level at which a soccer player competes."

The researchers' findings, published today in Physiological Measurement, show that compared to first division players, second division players took longer to recover from maximum effort exertion.

Professor Montero explained: "Our results showed second division players had higher oxygen consumption and heart rate than first division players after 90 seconds of recovery time. These differences were still clear after 180 seconds of recovery time."

"The second division players, therefore, took much longer to recover to the point where they were able to repeat the effort. They are therefore unlikely to be able to repeat sprints as often and as intensely as first division players."

A sample of 194 male soccer players, from seven clubs in the Spanish Professional Football League, took part in the study. There were 114 first division and 80 second division players comprising: 12 goalkeepers, 57 defenders, 86 midfield players, and 39 strikers.

All underwent the same maximum effort test. They first warmed up by running on a treadmill for two minutes at 4 km/h, before increasing their speed until they reached a heart rate of 120-130 bpm. They maintained this for three minutes. After this warm-up period, they rested to allow the respiratory quotient (RQ) to stabilize. The maximum effort test began at a speed of 6 km/h on a one per cent slope. The players' running speed increased by 2 km/h every two minutes until they reached maximum effort. The players then underwent a three-minute period of active recovery.
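The incremental portion of the protocol can be sketched as a simple stage-schedule generator. The maximum-effort cutoff speed below is a hypothetical parameter: in the study each player simply ran until reaching his individual maximum effort.

```python
def effort_test_schedule(max_speed_kmh):
    """Sketch of the incremental test described in the study: start at
    6 km/h on a 1% slope and add 2 km/h every two minutes. The cutoff
    speed stands in for a player's individual maximum effort."""
    t_min, speed = 0, 6.0
    stages = []
    while speed <= max_speed_kmh:
        stages.append({"start_min": t_min, "speed_kmh": speed, "slope_pct": 1.0})
        t_min += 2
        speed += 2.0
    return stages

# Example: a hypothetical player who reaches maximum effort at 12 km/h
# runs four two-minute stages at 6, 8, 10 and 12 km/h.
for stage in effort_test_schedule(12.0):
    print(stage)
```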

Although the team found a difference in recovery time between players from divisions one and two, the position in which they played had no bearing on the results.

Corresponding author Dr Luca Paolo Ardigò, from the University of Verona, said: "A possible explanation may be that the second division players always played in lesser category clubs with lower training demands. However, the players in our study showed no differences in self-declared duration of training or weekly recovery."

"It is possible that the intensity of training differed, and further study is needed to uncover whether this is a factor."

Credit: 
IOP Publishing

Organs-on-chip technology reveals new drug candidates for Lou Gehrig's disease

Image: An ALS-on-a-chip model, with hiPS-derived optogenetic motor neurons from an ALS patient (green) and hiPS-derived skeletal muscle cells (purple), established to represent ALS pathology. Credit: Tatsuya Osaki/MIT

The investigation of amyotrophic lateral sclerosis (ALS) - also known as Lou Gehrig's disease - through muscle-on-a-chip technology has revealed a new drug combination that may serve as an effective treatment of the progressive neurodegenerative disease. These findings highlight organ-on-a-chip technologies - in which live conditions of the body are mimicked in a microfluidic cell culture - as promising platforms for testing drug candidates and investigating the pathogenesis of ALS, which remains largely unknown.

The disease currently affects around 12,000 to 15,000 people in the U.S. ALS involves the loss of motor neurons in the spinal cord and motor cortex, which leads to progressive paralysis, muscle atrophy and death. While roughly 10% of ALS patients have a familial version of the disease, which can typically be traced back to a genetic mutation, 90% of patients have "sporadic ALS," which has no known familial links or causes. As the few FDA-approved drugs currently on the market for ALS lack full effectiveness, there is an urgent need for better clinical models that can go beyond the limitations of animal models.

Here, Tatsuya Osaki and colleagues created a disease-on-a-chip approach. It features a microfluidic chip loaded with healthy skeletal muscle bundles and induced pluripotent stem cell-derived, light-sensitive motor neurons from a sporadic ALS patient. Light was used to activate muscle contraction and control neural activity on the chips. Compared to chips with non-ALS-patient-derived cells, the ALS-on-a-chip exhibited fewer and weaker muscle contractions, degraded motor neurons, and increased muscle cell death. Application of two neuroprotective molecules - rapamycin and bosutinib (both in clinical trials) - helped recover muscle contraction induced by motor neuron activity and improve neuronal survival in the chip-based model of disease. Importantly, each treatment on its own has a limited ability to penetrate the blood-brain barrier, but when combined, the molecular duo could efficiently cross blood-brain-barrier-like cell layers built onto the chip.

Credit: 
American Association for the Advancement of Science (AAAS)

New appropriate use criteria for lumbar puncture in Alzheimer's diagnosis

In preparation for more tools that detect and measure the biology associated with Alzheimer's and other dementias earlier and with more accuracy, an Alzheimer's Association-led Workgroup has published appropriate use criteria (AUC) for lumbar puncture (spinal tap) and spinal fluid analysis in the diagnosis of Alzheimer's disease.

The AUC is available online from Alzheimer's & Dementia: The Journal of the Alzheimer's Association as an article in press (corrected proof).

"Early and accurate diagnosis of Alzheimer's disease is critical as therapies that have the potential to stop or slow the progression of the disease become available," said Maria C. Carrillo, Ph.D., Chief Science Officer at the Alzheimer's Association. "These criteria will arm medical professionals with necessary guidance when the use of lumbar puncture is an appropriate part of the process to diagnose Alzheimer's disease and other dementias, thereby giving people with dementia and their families the possibility of a head start in preparing for the course of their disease."

Alzheimer's disease is commonly diagnosed by a thorough examination of physical health, medical history and assessment of memory, thinking and reasoning. Lumbar puncture, while not currently in routine clinical practice in the U.S., is anticipated to be a safe and cost-effective way to retrieve cerebrospinal fluid (CSF) to test for biological markers of Alzheimer's disease, potentially delivering valuable diagnostic information to clinicians and their patients earlier in the course of the disease.

The Workgroup's efforts complement the 2013 AUC for brain amyloid PET scans developed by the Society of Nuclear Medicine and Molecular Imaging (SNMMI) and the Alzheimer's Association.

The lumbar puncture AUC criteria recommend clinicians consider the following patient populations as appropriate and inappropriate:

Appropriate uses of lumbar puncture:

A patient has subjective cognitive decline (SCD) and is considered to be at an increased risk for Alzheimer's disease based on indicators that include a persistent decline in memory, an onset age over 60, onset within the last 5 years, and others. The decision to perform CSF biomarker testing in this case should be individualized and most strongly supported when the individual, family and clinician all are concerned about the patient's cognitive decline.

A patient has mild cognitive impairment (MCI) that is persistent, progressive and unexplained. MCI includes mild deficits on cognitive testing but no change in functional abilities.

A patient has symptoms that suggest possible Alzheimer's disease, meaning the dementia could be due to another cause.

A patient has MCI or dementia with onset at an early age.

A patient meets core clinical criteria for probable Alzheimer's disease with typical age of onset.

A patient's dominant symptom is an unexplained change in behavior, such as delusions and delirium, and an Alzheimer's disease diagnosis is being considered.

Inappropriate uses of lumbar puncture:

A patient is cognitively unimpaired, is within the normal range of functioning for their age and lacks significant risk factors for Alzheimer's disease.

A patient is cognitively unimpaired but is considered to be at risk for Alzheimer's disease based on their family history.

A patient has SCD and has been evaluated and found by a clinician not to be at high risk for Alzheimer’s disease based on indications such as no family history or limited concern from an informant like a partner or family member.

A patient has symptoms of rapid eye movement (REM) sleep behavior disorder, which is a strong predictor of disorders such as Parkinson's disease and Lewy body dementia.

A patient already has been diagnosed with Alzheimer's and the test's use would be to determine the stage of their disease or its severity.

A patient is an apolipoprotein E-e4 (ApoE-e4) carrier who has no cognitive impairment. ApoE-e4 is a genetic variant strongly associated with risk for late-onset Alzheimer's.

The test is being used in lieu of genotyping for individuals who are suspected to carry a rare genetic mutation that causes an early-onset form of Alzheimer's disease.

The AUC includes suggestions from the workgroup on implementing the criteria in clinical practice. They recommend that CSF biomarker testing be done by dementia experts who can determine the appropriateness of the test, educate the patient and family about the benefits and risks, ensure the procedure follows established guidelines, and integrate the results into the patient's treatment plan.
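To illustrate how criteria like these might be distilled for a decision-support prototype, here is a toy Python sketch covering a few of the listed uses. The fields and rules are drastic simplifications invented for illustration, not clinical guidance; the published AUC require expert clinical judgment, not boolean flags.

```python
from dataclasses import dataclass

@dataclass
class Patient:
    # Hypothetical boolean distillations of a few AUC items.
    cognitively_impaired: bool
    mci_persistent_progressive_unexplained: bool = False
    diagnosed_alzheimers: bool = False

def lumbar_puncture_appropriate(p: Patient) -> bool:
    """Toy triage over a subset of the published criteria."""
    if not p.cognitively_impaired:
        # Cognitively unimpaired patients (including ApoE-e4 carriers
        # and those with only a family history): inappropriate.
        return False
    if p.diagnosed_alzheimers:
        # Staging or severity testing after diagnosis: inappropriate.
        return False
    # Persistent, progressive, unexplained MCI: appropriate.
    return p.mci_persistent_progressive_unexplained

print(lumbar_puncture_appropriate(
    Patient(cognitively_impaired=True,
            mci_persistent_progressive_unexplained=True)))  # prints True
```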

Credit: 
Alzheimer's Association

Sit-stand office desks cut daily sitting time, may boost job performance

Sit-stand workstations that allow employees to stand, as well as sit, while working on a computer reduce daily sitting time and appear to have a positive impact on job performance and psychological health, finds a trial published by The BMJ today.

The results show that employees who used the workstations for 12 months, on average, reduced their sitting time by more than an hour a day, with potentially meaningful benefits.

High levels of sedentary behaviour (sitting) have been associated with an increased risk of chronic diseases (type 2 diabetes, heart disease, and some cancers) as well as death and have been shown to be detrimental for work related outcomes such as feelings of engagement and presenteeism (going to work despite illness).

Office workers are one of the most sedentary populations, spending 70-85% of time at work sitting, but studies looking at ways to reduce sitting in the workplace have been deemed low quality.

So a team of researchers based in the UK, with collaborators in Australia, set out to evaluate the impact of Stand More AT (SMArT) Work, an intervention designed to reduce sitting time at work.

The trial involved 146 office workers based at the University Hospitals of Leicester NHS Trust, of whom 77 were randomly assigned to the intervention group and 69 to the control group over a 12-month period.

The average age of participants was 41 years, 78% reported being of white European ethnicity, and the majority (80%) were women.

The intervention group were given a height adjustable workstation, a brief seminar with supporting leaflet, and workstation instructions with sitting and standing targets. They also received feedback on sitting and physical activity, an action planning and goal setting booklet, a self monitoring and prompt tool, and coaching sessions. The control group carried on working as usual.

Workers' sitting time was measured using a device worn on the thigh at the start of the study (baseline) and at 3, 6, and 12 months. Daily physical activity levels and responses to questions about work (e.g. job performance, engagement) and health (e.g. mood, quality of life) were also recorded.

At the start of the study, overall sitting time was 9.7 hours per day. The results show that sitting time was lower by 50.62 minutes per day at 3 months, 64.40 minutes per day at 6 months, and 82.39 minutes per day at 12 months in the intervention group compared with the control group. Prolonged sitting time was also reduced in the intervention group.

The reduction in sitting was largely replaced by time spent standing rather than moving, as stepping time and physical activity remained unchanged.

The results also suggest improvements in job performance, work engagement, occupational fatigue, presenteeism, daily anxiety and quality of life, but no notable changes were found for job satisfaction, cognitive function, and sickness absence.

The authors say this was a well-designed trial and their results remained largely unchanged after further analyses. But they acknowledge that their findings may not apply to other organisations, and that self-reporting of work related outcomes may have affected the results.

Nevertheless, they say the SMArT Work successfully reduced sitting time over the short, medium, and longer term, and positive changes were observed in work related and psychological health.

And they suggest future research should assess the longer term health benefits of displacing sitting with standing and how best to promote movement rather than just standing while at work.

In a linked editorial, Dr Cindy Gray at the University of Glasgow says this is an important study that demonstrates lasting reductions in sedentary behaviour and other work-related benefits. But she questions the potential health gains of simply replacing sitting with standing. The intervention did not increase potentially more beneficial physical activity.

She also questions SMArT Work's transferability and suitability for other types of employees, including shift workers, as well as its cost-effectiveness, which she says should be addressed in future research.

Credit: 
BMJ Group

Nutrients may reduce blood glucose levels

Image: Mary-Elizabeth Patti, MD, an investigator in the Section on Integrative Physiology and Metabolism at Joslin Diabetes Center and associate professor of medicine at Harvard Medical School. Credit: John Soares

BOSTON - (October 10, 2018) - Type 2 diabetes is driven by many metabolic pathways, with some pathways driven by amino acids, the molecular building blocks for proteins. Scientists at Joslin Diabetes Center now have shown that one amino acid, alanine, may produce a short-term lowering of glucose levels by altering energy metabolism in the cell.

"Our study shows that it's possible we can use specific nutrients, in this case amino acids, to change metabolism in a cell, and these changes in metabolism can change how cells take up and release glucose in a beneficial way," says Mary-Elizabeth Patti, MD, an investigator in Joslin's Section on Integrative Physiology and Metabolism and senior author on a paper about the work recently published in Molecular Metabolism.

Performed in cells and in mice, her group's research began with an attempt to see what nutrients might activate a key protein called AMP kinase (AMPK), says Patti, who is also an associate professor of medicine at Harvard Medical School.

"AMPK is an enzyme in cells throughout the body that is activated when nutrient supplies are low, or in response to exercise," she explains. "AMPK then causes a lot of beneficial changes in the cell, turning on genes that serve to increase energy production. AMPK is a good thing, and it also can be activated by a variety of treatments for type 2 diabetes, such as metformin."

That raised a question for Patti and her colleagues: Could an amino acid switch on this beneficial enzyme?

The investigators began their study by testing many amino acids in rat liver cells (the liver is a crucial organ in glucose metabolism). "Alanine was the one amino acid that was consistently able to activate AMPK," Patti says.

The researchers then confirmed that AMPK was producing some of its usual metabolic effects after alanine activation. Additionally, the activation could be seen in human and mouse liver cells as well as rat liver cells, and was present with either high or low levels of glucose in the cells.

Next, scientists gave alanine by mouth to mice and found that levels of AMPK rose in the animals. Moreover, if mice ate alanine before they received a dose of glucose, their resulting blood glucose levels were significantly lower. And while glucose metabolism often behaves quite differently in lean mice than in obese mice, this mechanism was seen in both groups of mice.

Following up, the Joslin team found that the glucose lowering didn't seem to be driven by increases in insulin secretion or decreases in secretion of glucagon, a hormone that increases glucose. Instead, AMPK was boosting glucose uptake in the liver and decreasing glucose release. Further experiments in cells demonstrated that the activated enzyme was altering the Krebs cycle, a central component of cell metabolism.

"All these data together suggest that amino acids, and specifically alanine, may be a unique potential way to modify glucose metabolism," Patti sums up. "If it eventually turns out that you can do that by taking an oral drug as a pre-treatment before a meal, that would be of interest. However, this is early-stage research, and we need to test the concept both in mice and ultimately in humans."

Credit: 
Joslin Diabetes Center

New type of stellar collision discovered

Image: This object is possibly the oldest of its kind ever catalogued: the hourglass-shaped remnant named CK Vulpeculae. Originally thought to be a nova, this unusually shaped object has proven challenging to classify over the years. A number of possible explanations for its origins have been considered and discarded. It is now thought to be the result of two stars colliding. These new observations are the first to bring this system into focus, suggesting a solution to a 348-year-old mystery. Credit: ALMA (ESO/NAOJ/NRAO)/S. P. S. Eyres

For three and a half centuries, astronomers have pondered a mystery: What did the French monk and astronomer Père Dom Anthelme see when he described a star that burst into view in June 1670, just below the head of the constellation Cygnus, the swan?

It was long thought to be a nova--a star that periodically brightens as it blows off mass. But now, an international team of astrophysicists, including two professors at the University of Minnesota, has cracked the 348-year-old conundrum. The monk witnessed the explosive merger of white and brown dwarf stars--the first ever identified.

The work, led by astrophysicists at Keele University (England), is published in the Monthly Notices of the Royal Astronomical Society.

White dwarfs are the remnants of stars like the sun at the end of their lives, while brown dwarfs are "failed stars" that have 15-75 times the mass of Jupiter, but not enough to ignite the thermonuclear fusion reactions that power the sun and other stars. The two stars orbited each other until they got too close and merged, spewing out debris whose chemical composition gave away the secret of the mystery object's origin.

The brown dwarf got the raw end of the deal.

"It was as if you put salsa fixings into a blender and forgot to put the lid on," said Charles Woodward, a physics and astronomy professor in the College of Science and Engineering at the University of Minnesota. "The white dwarf was like the blades at the bottom and the brown dwarf was the edibles. It was shredded, and its remains spun out in two jets--like a jet of goop shooting from the top of your blender as you searched frantically for the lid."

Woodward and fellow University of Minnesota physics and astronomy professor Robert Gehrz were members of the team that proposed studying the object and assisted in designing the program of observations, which were done at the Atacama Large Millimeter/submillimeter Array (ALMA) of telescopes in Chile.

Beneath the swan, an odd duck

The unusual star has been dubbed CK Vulpeculae, as it resides in the constellation Vulpecula (the little fox). It is just below the star Albireo, the head of Cygnus, the swan. That location is inside the Summer Triangle of bright stars, which is now high in the south after nightfall. The star is approximately 2,200 light-years from Earth.

The white dwarf and brown dwarf started out fairly ordinary--orbiting each other in a binary system, as astrophysicists believe most stars are born. The white dwarf had an estimated 10 times the brown dwarf's mass. As they merged, the brown dwarf was torn apart and its remains dumped on the surface of the white dwarf. That star's crushing gravity heated the brown dwarf material and caused thermonuclear "burning" that spilled out a cocktail of molecules and unusual forms (isotopes) of chemical elements. It also caused the brightening that caught the eye of the monk Anthelme.

Rounding up the unusual suspects

CK Vulpeculae isn't visible to the naked eye, but through the telescope, the debris ejected during the merger appears as two bright rings of dust and gas that form a glowing hourglass structure around a compact central object. Studying the light from two background stars that had passed through the system, the researchers noted the presence of lithium, a light element that can't exist in the interiors of stars, where nuclear fusion occurs. They also found organic molecules like formaldehyde and methyl alcohol, which also would perish in stellar interiors. Thus, these molecules must have been produced in the debris from the collision.

The amount of dust in the debris was about one percent the mass of the sun.

"That's too high for a classical nova outburst and too low for mergers of more massive stars, as had been proposed earlier," said Sumner Starrfield, a professor at Arizona State University who was involved in the study.

That evidence, plus isotope data, led to the conclusion that the collision was between a white dwarf and brown dwarf. And the remnant star is still blowing off material.

"Collisions like this could contribute to the chemical evolution of our galaxy and universe," noted Minnesota's Gehrz. "The ejected material travels out into space, where it gets incorporated into new generations of stars."

Credit: 
University of Minnesota

More young people are choosing not to drink alcohol

Young people in England aren't just drinking less alcohol - a new study published in BMC Public Health shows that more of them are never taking up alcohol at all, and that the increase is widespread among young people.

Researchers at University College London analysed data from the annual Health Survey for England and found that the proportion of 16-24 year olds who don't drink alcohol has increased from 18% in 2005 to 29% in 2015.

The authors found this trend to be largely due to an increasing number of people who had never been drinkers, from 9% in 2005 to 17% in 2015. There were also significant decreases in the number of young people who drank above recommended limits (from 43% to 28%) or who binge drank (from 27% to 18%). More young people were also engaging in weekly abstinence (from 35% to 50%).

Dr Linda Ng Fat, corresponding author of the study said: "Increases in non-drinking among young people were found across a broad range of groups, including those living in northern or southern regions of England, among the white population, those in full-time education, in employment and across all social classes and healthier groups. That the increase in non-drinking was found across many different groups suggests that non-drinking may be becoming more mainstream among young people which could be caused by cultural factors."

Dr Ng Fat said: "These trends are to be welcomed from a public-health standpoint. Factors influencing the shift away from drinking should be capitalised on going forward to ensure that healthier drinking behaviours in young people continue to be encouraged."

Dr Linda Ng Fat added: "The increase in young people who choose not to drink alcohol suggests that this behaviour may be becoming more acceptable, whereas risky behaviours such as binge drinking may be becoming less normalised."

Increases in non-drinking, however, were not found among ethnic minorities, those with poor mental health, or smokers, suggesting that the risky behaviours of smoking and drinking continue to cluster.

The researchers examined data on 9,699 people aged 16-24 years collected as part of the Health Survey for England 2005-2015, an annual, cross-sectional, nationally representative survey looking at changes in the health and lifestyles of people across England. The authors analysed the proportion of non-drinkers among social demographic and health sub-groups, along with alcohol units consumed by those that did drink and levels of binge drinking.

The authors caution that the cross-sectional, observational nature of this study does not allow for conclusions about cause and effect.

Credit: 
BMC (BioMed Central)

Earlier treatment could help reverse autistic-like behavior in tuberous sclerosis

image: Research in mice indicates that there's a sensitive period for reversing social deficits in tuberous sclerosis complex, a genetic condition that commonly includes autism spectrum disorder. In the model, the TSC1 gene was deleted only in cerebellar Purkinje cells, which have been implicated in autism.

Image: 
Peter Tsai

New research on autism has found, in a mouse model, that drug treatment at a young age can reverse social impairments. But the same intervention was not effective at an older age.

The study is the first to shed light on the crucial timing of therapy to improve social impairments in a condition associated with autism spectrum disorder. The paper, from Boston Children's Hospital, the University of Texas, Harvard Medical School and Toronto's Hospital for Sick Children, was published today in Cell Reports.

Tuberous sclerosis and autism

Many of the hundreds of genes that likely regulate complex cognitive and neuropsychiatric behaviors in people with autism still remain a mystery. However, genetic disorders such as tuberous sclerosis complex, or TSC, are providing clues. Patients often have mutations in the TSC1 or TSC2 gene, and about half develop autism spectrum disorder.

The investigators, led by Peter Tsai, MD, PhD, at UT Southwestern Medical Center, used a mouse model in which the TSC1 gene is deleted in a region of the brain called the cerebellum.

"There were several mouse models of TSC previously published, but they all had seizures and died early in life, making it difficult to study social cognition," says Mustafa Sahin, MD, PhD, who directs the Translational Neuroscience Center and the Translational Research Program at Boston Children's and was the study's senior investigator. "That is one reason why we turned to knocking out the TSC1 gene only in cerebellar Purkinje cells, which have been implicated in autism. These mice have normal lifespans and do not develop seizures."

Timing is everything

The new research built on a previous study published in 2012. In that study, Sahin and colleagues treated the mutant mice starting in the first week of life with rapamycin, a drug approved by the FDA for brain tumors, kidney tumors and refractory epilepsy associated with TSC. They found that they could rescue both social deficits and repetitive behaviors.

But when a similar drug, everolimus, was tested in children with TSC, neurocognitive functioning and behavior didn't significantly improve. Sahin and his colleagues wondered whether there was a specific developmental period during which treatment would be effective.

The new mouse study delineates not only the timeframe for effective rapamycin treatment of certain autism-relevant behaviors, but also some of the cellular, electrophysiological and anatomic mechanisms for these sensitive periods.

"We found that treatment initiated in young adulthood, at 6 weeks, rescued social behaviors, but not repetitive behaviors or cognitive inflexibility," says Sahin.

More importantly, neither the social deficits nor the repetitive behaviors responded when the treatment was started at 10 weeks.

Using advanced imaging, the researchers went on to show that the rescue of social behaviors correlates with reversal of specific MRI-based structural changes, cellular pathology and Purkinje cell excitability. Meanwhile, motor learning rescue appeared independent of Purkinje cell survival or rescue of cellular excitability.

A new clinical trial?

Based on the mouse findings, Sahin is now seeking funds to test whether early treatment can improve a broad range of autistic-like behaviors in children with TSC. Specifically, he'll explore whether treatment as early as 12 to 24 months can help prevent both social deficits and repetitive inflexible behaviors. He hopes to see better results than in the earlier clinical trial, which involved children ages 6 to 21.

Past research indicates that different autism-related disorders may have different windows of treatment. For example, animal studies of Rett syndrome suggest that treatment can be effective relatively late in life and still improve neurological outcome.

Credit: 
Boston Children's Hospital

15 emerging technologies that could reduce global catastrophic biological risks

image: Strategic investment in 15 promising technologies could help make the world better prepared and equipped to prevent future infectious disease outbreaks from becoming catastrophic events. This subset of emerging technologies and their potential application are the focus of a new report, Technologies to Address Global Catastrophic Biological Risks, by a team of researchers at the Johns Hopkins Center for Health Security.

Image: 
Harry Campbell/Johns Hopkins Center for Health Security

Strategic investment in 15 promising technologies could help make the world better prepared and equipped to prevent future infectious disease outbreaks from becoming catastrophic events.

This subset of emerging technologies and their potential application are the focus of a new report, Technologies to Address Global Catastrophic Biological Risks, by a team of researchers at the Johns Hopkins Center for Health Security. The study is among the first to assess technologies for the purpose of reducing GCBRs--a special category of risk defined previously by the Center as threats from biological agents that could lead to sudden, extraordinary, widespread disaster beyond the collective capability of national and international organizations and the private sector to control.

"While systems to respond [to an outbreak] are in place in many areas of the world, traditional approaches can be too slow or limited in scope to prevent biological events from becoming severe, even in the best of circumstances," wrote the Center authors. "This type of response remains critically important for today's emergencies, but it can and should be augmented by novel methods and technologies to improve the speed, accuracy, scalability, and reach of the response."

Through an extensive literature review and interviews with more than 50 experts, the Center project team identified 15 example technologies and grouped them into 5 broad categories that are significantly relevant to public health preparedness and response:

Disease Detection, Surveillance, and Situational Awareness: Ubiquitous Genomic Sequencing and Sensing, Drone Networks for Environmental Detection, Remote Sensing for Agricultural Pathogens

Infectious Disease Diagnostics: Microfluidic Devices, Handheld Mass Spectrometry, Cell-Free Diagnostics

Distributed Medical Countermeasure Manufacturing: 3D Printing of Chemicals and Biologics, Synthetic Biology for Manufacturing MCMs

Medical Countermeasure Distribution, Dispensing, and Administration: Microarray Patches for Vaccine Administration, Self-Spreading Vaccines, Ingestible Bacteria for Vaccination, Self-Amplifying mRNA Vaccines, Drone Delivery to Remote Locations

Medical Care and Surge Capacity: Robotics and Telehealth, Portable Easy-to-Use Ventilator

The project team noted their list is not exhaustive or an endorsement of specific companies. The team used a modified version of DARPA's Heilmeier Catechism to standardize the process of evaluating each technology and formulating guidance for funding decisions. That process informed the team's high-level assessment of the readiness of each technology (from early development to being field-ready), the potential impact of the technology on GCBR reduction (from low to high), and the amount of financial investment that would be needed to meaningfully deploy the technology (from low to high). Details on these findings are included in the report.

Crystal Watson, DrPH, MPH, a senior scholar at the Center, Senior Analyst Matthew Watson, and Senior Scholar Tara Kirk Sell, PhD, MA, co-led the project team, which also included Caitlin Rivers, PhD, MPH; Matthew Shearer, MPH; former Analyst Christopher Hurtado, MHS; former Research Assistant Ashley Geleta, MS; and Tom Inglesby, MD, the Center's director. Their work contributes new ideas to a field in need of innovation despite important, ongoing progress in both the public and private sectors to address pandemic risk.

"The adoption and use of novel technologies for the purpose of epidemic control and public health often lag well behind the innovation curve because they do not have a lucrative market driving their development," wrote the authors. "This leaves unrealized opportunities for improved practice."

They recommend creating a consortium of technology developers, public health practitioners, and policymakers tasked with understanding pressing problems surrounding pandemics and GCBRs and jointly developing technology solutions.

Credit: 
Johns Hopkins Center for Health Security

Social media data used to predict retail failure

Researchers have used a combination of social media and transport data to predict the likelihood that a given retail business will succeed or fail.

Using information from ten different cities around the world, the researchers, led by the University of Cambridge, have developed a model that can predict with 80% accuracy whether a new business will fail within six months. The results will be presented at the ACM Conference on Pervasive and Ubiquitous Computing (Ubicomp), taking place this week in Singapore.

While the retail sector has always been risky, the past several years have seen a transformation of high streets as more and more retailers fail. The model built by the researchers could be useful for both entrepreneurs and urban planners when determining where to locate their business or which areas to invest in.

"One of the most important questions for any new business is the amount of demand it will receive. This directly relates to how likely that business is to succeed," said lead author Krittika D'Silva, a Gates Scholar and PhD student at Cambridge's Department of Computer Science and Technology. "What sort of metrics can we use to make those predictions?"

D'Silva and her colleagues used more than 74 million check-ins from the location-based social network Foursquare from Chicago, Helsinki, Jakarta, London, Los Angeles, New York, Paris, San Francisco, Singapore and Tokyo; and data from 181 million taxi trips from New York and Singapore.

Using this data, the researchers classified venues according to the properties of the neighbourhoods in which they were located, the visit patterns at different times of day, and whether a neighbourhood attracted visitors from other neighbourhoods.

"We wanted to better understand the predictive power that metrics about a place at a certain point in time have," said D'Silva.

Whether a business succeeds or fails is normally based on a number of controllable and uncontrollable factors. Controllable factors might include the quality or price of the store's product, its opening hours and its customer satisfaction. Uncontrollable factors might include unemployment rates of a city, overall economic conditions and urban policies.

"We found that even without information about any of these uncontrollable factors, we could still use venue-specific, location-related and mobility-based features in predicting the likely demise of a business," said D'Silva.

The data showed that across all ten cities, venues that are popular around the clock, rather than just at certain times of day, are more likely to succeed. Additionally, venues that are in demand outside of the typical popular hours of other venues in the neighbourhood tend to survive longer.

The data also suggested that venues in diverse neighbourhoods, with multiple types of businesses, tend to survive longer.
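The "popular around the clock" property can be made concrete. As an illustrative sketch only (the paper's actual metric is not specified here), one could score each venue by the entropy of its hourly check-in distribution, so that venues with visits spread evenly across the day score higher than venues with demand concentrated in a few hours:

```python
import math
from collections import Counter

def temporal_popularity_entropy(checkin_hours):
    """Entropy (in bits) of a venue's check-ins across the hours of the day.

    Higher values mean visits are spread more evenly around the clock;
    lower values mean demand is concentrated in a few hours.
    """
    counts = Counter(checkin_hours)
    total = sum(counts.values())
    probs = [c / total for c in counts.values()]
    return -sum(p * math.log2(p) for p in probs)

# A venue visited only at lunchtime vs. one visited evenly all day:
lunch_only = [12] * 50 + [13] * 50          # check-ins at hours 12 and 13 only
all_day = list(range(24)) * 5               # check-ins spread over all 24 hours
assert temporal_popularity_entropy(lunch_only) < temporal_popularity_entropy(all_day)
```

Under this hypothetical scoring, the finding above would correspond to higher-entropy venues surviving longer on average.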

While the ten cities had certain similarities, the researchers also had to account for their differences.

"The metrics that were useful predictors vary from city to city, which suggests that factors affect cities in different ways," said D'Silva. "As one example, that the speed of travel to a venue is a significant metric only in New York and Tokyo. This could relate to the speed of transit in those cities or perhaps to the rates of traffic."

To test the predictive power of their model, the researchers first had to determine whether a particular venue had closed within the time window of their data set. They then 'trained' the model on a subset of venues, telling the model what the features of those venues were in the first time window and whether the venue was open or closed in a second time window. They then tested the trained model on another subset of the data to see how accurate it was.
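The train-and-test scheme described above can be sketched in code. The following is a minimal illustration with synthetic data, not the researchers' actual pipeline: the feature names, the closure rule, and the choice of a random forest classifier are all assumptions made for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical venue features measured in the first time window:
# [check-ins per day, neighbourhood diversity, off-peak demand share]
n_venues = 1000
X = rng.random((n_venues, 3))

# Label: 1 if the venue had closed by the second time window.
# Purely synthetic rule for illustration: low demand plus low diversity -> closure.
y = ((X[:, 0] + X[:, 1]) < 0.8).astype(int)

# Train on one subset of venues, then test on held-out venues.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.2f}")
```

The key point mirrored from the study is the temporal split: features come from an earlier window, labels (open or closed) from a later one, and accuracy is measured only on venues the model never saw during training.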

According to the researchers, their model shows that when deciding when and where to open a business, it is important to look beyond the static features of a given neighbourhood and to consider the ways that people move to and through that neighbourhood at different times of day. They now want to consider how these features vary across different neighbourhoods in order to improve the accuracy of their model.

Credit: 
University of Cambridge

When is a nova not a nova? When a white dwarf and a brown dwarf collide

image: This object is possibly the oldest of its kind ever catalogued: the hourglass-shaped remnant named CK Vulpeculae.

Image: 
ALMA (ESO/NAOJ/NRAO)/S. P. S. Eyres

Researchers from Keele University have worked with an international team of astronomers to find for the first time that a white dwarf and a brown dwarf collided in a 'blaze of glory' that was witnessed on Earth in 1670.

Atacama Large Millimeter/submillimeter Array (ALMA) in Chile observed the debris from the explosion

This is the first time such an event has been conclusively identified

The dual rings of dust and gas - the debris from the explosion - resemble an hourglass

Studying the remains of the merger, the researchers were able to detect the tell-tale signature of lithium

The remains are also rich in organic molecules such as formaldehyde (H2CO) and methanamide (NH2CHO)

The brown dwarf star was 'shredded' and dumped on the surface of a white dwarf star, leading to the 1670 eruption and the hourglass we see today

Using the Atacama Large Millimeter/submillimeter Array (ALMA) in Chile, the international team of astronomers, including workers from the Universities of Keele, Manchester, South Wales, Arizona State, Minnesota, Ohio State, Warmia & Mazury, and the South African Astronomical Observatory, found evidence that a white dwarf (the remains of a star like the Sun at the end of its life) and a brown dwarf (a 'failed' star without sufficient mass to sustain thermonuclear fusion) collided in a short-lived blaze of glory that was witnessed on Earth in 1670 as Nova Cygni - 'a new star below the head of the Swan.'

In July of 1670, observers on Earth witnessed a 'new star', or nova, in the constellation Cygnus - the Swan. Where previously there was no obvious star, there abruptly appeared a star as bright as those in the Plough, which gradually faded, reappeared, and finally disappeared from view.

Modern astronomers studying the remains of this cosmic event initially thought it was triggered by the merging of two main-sequence stars - stars on the same evolutionary path as our Sun. This so-called 'new star' was long referred to as 'Nova Vulpeculae 1670', and later became known as CK Vulpeculae.

However, we now know that CK Vulpeculae was not what we would today describe as a 'nova', but is in fact the merger of two stars - a white dwarf and a brown dwarf.

By studying the debris from this explosion - which takes the form of dual rings of dust and gas, resembling an hourglass with a compact central object - the research team concluded that a brown dwarf, a so-called failed star without the mass to sustain nuclear fusion, had merged with a white dwarf.

Professor Nye Evans, Professor of Astrophysics at Keele University and co-author on the paper appearing in the Monthly Notices of the Royal Astronomical Society, explains:

"CK Vulpeculae has in the past been regarded as the oldest 'old nova'. However, the observations of CK Vulpeculae I have made over the years, using telescopes on the ground and in space, convinced me more and more that this was no nova. Everyone knew what it wasn't - but nobody knew what it was! But a stellar merger of some sort seemed the best bet. With our ALMA observations of the exquisite dusty hourglass and the warped disc, plus the presence of lithium and peculiar isotope abundances, the jig-saw all fitted together: in 1670 a brown dwarf star was 'shredded' and dumped on the surface of a white dwarf star, leading to the 1670 eruption and the hourglass we see today."

The team of European, American and South African astronomers used the Atacama Large Millimeter/submillimeter Array to examine the remains of the merger, with some interesting findings. By studying the light from two more distant stars as they shine through the dusty remains of the merger, the researchers were able to detect the tell-tale signature of the element lithium, which is easily destroyed in stellar interiors.

Dr Stewart Eyres, Deputy Dean of the Faculty of Computing, Engineering and Science at the University of South Wales and lead author on the paper explains:

"The material in the hourglass contains the element lithium, normally easily destroyed in stellar interiors. The presence of lithium, together with unusual isotopic ratios of the elements C, N, O, indicate that an (astronomically!) small amount of material, in the form of a brown dwarf star, crashed onto the surface of a white dwarf in 1670, leading to thermonuclear 'burning', an eruption that led to the brightening seen by the Carthusian monk Anthelme and the astronomer Hevelius, and in the hourglass we see today."

Professor Albert Zijlstra, from The University of Manchester's School of Physics & Astronomy, co-author of the study, says:

"Stellar collisions are the most violent events in the Universe. Most attention is given to collisions between neutrons stars, between two white dwarfs - which can give a supernova - and star-planet collisions.

"But it is very rare to actually see a collision, and where we believe one occurred, it is difficult to know what kind of stars collided. The type we believe that happened here is a new one, not previously considered or ever seen before. This is an extremely exciting discovery."

Professor Sumner Starrfield, Regents' Professor of Astrophysics at Arizona State University comments:

"The white dwarf would have been about 10 times more massive than the brown dwarf, so as the brown dwarf spiralled into the white dwarf it would have been ripped apart by the intense tidal forces exerted by the white dwarf. When these two objects collided, they spilled out a cocktail of molecules and unusual element isotopes.

"These organic molecules, which we could not only detect with ALMA, but also measure how they were expanding into the surrounding environment, provide compelling evidence of the true origin of this blast. This is the first time such an event has been conclusively identified.

"Intriguingly, the hourglass is also rich in organic molecules such as formaldehyde (H2CO), methanol (CH3OH) and methanamide (NH2CHO). These molecules would not survive in an environment undergoing nuclear fusion and must have been produced in the debris from the explosion. This lends further support to the conclusion that a brown dwarf met its demise in a star-on-star collision with a white dwarf."

Since most star systems in the Milky Way are binary, stellar collisions are not that rare, the astronomers note.

Professor Starrfield adds:

"Such collisions are probably not rare and this material will eventually become part of a new planetary system, implying that they may already contain the building-blocks of organic molecules as they are forming."

Credit: 
Keele University

Half the brain encodes both arm movements

image: Patients implanted with electrocorticography arrays completed a 3D center-out reaching task. (A) Electrode locations were based upon the clinical requirements of each patient and were localized to an atlas brain for display. (B) Patients were seated in the semi-recumbent position and completed reaching movements from the center to the corners of a 50cm physical cube, based upon cues from LED lights located at each target, while hand positions and ECoG signals were simultaneously recorded. Each patient was implanted with electrodes in a single cortical hemisphere and performed the task with the arm contralateral (C) and ipsilateral (D) to the electrode array in separate recording sessions.

Image: 
Bundy et al., JNeurosci (2018)

Individual arm movements are represented by neural activity in both the left and right hemispheres of the brain, according to a study of epilepsy patients published in JNeurosci. This finding suggests the unaffected hemisphere in stroke could be harnessed to restore limb function on the same side of the body by controlling a brain-computer interface.

The right side of the brain is understood to control the left side of the body, and vice versa. Recent evidence, however, supports a connection between the same side of the brain and body during limb movement.

Eric Leuthardt, David Bundy, and colleagues explored brain activity during such ipsilateral movements during a reaching task in four epilepsy patients whose condition enabled invasive monitoring of their brains through implanted electrodes. Using a machine learning algorithm, the researchers demonstrate successful decoding of speed, velocity, and position information of both left and right arm movements regardless of the location of the electrodes. In addition to advancing our understanding of how the brain controls the body, these results could inform the development of more effective rehabilitation strategies following brain injury.
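Decoding continuous kinematics from neural recordings is commonly framed as a regression problem: a model learns a mapping from multichannel neural features to hand position or velocity. The following is a minimal sketch with entirely synthetic data; the channel count, the linear encoding assumption, and the choice of ridge regression are illustrative assumptions, not the decoder used in the study.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic stand-in for ECoG band-power features: 2000 time samples x 32 channels.
n_samples, n_channels = 2000, 32
features = rng.standard_normal((n_samples, n_channels))

# Hypothetical linear encoding: 3D hand position as a weighted sum of channels
# plus measurement noise.
true_weights = rng.standard_normal((n_channels, 3))
position = features @ true_weights + 0.1 * rng.standard_normal((n_samples, 3))

# Fit the decoder on part of the session and evaluate on held-out samples.
X_train, X_test, y_train, y_test = train_test_split(
    features, position, test_size=0.25, random_state=1
)
decoder = Ridge(alpha=1.0).fit(X_train, y_train)
r2 = decoder.score(X_test, y_test)  # coefficient of determination across the 3 axes
print(f"decoding R^2 on held-out data: {r2:.2f}")
```

In the study's setting, the analogous question is whether such a decoder trained on signals from one hemisphere can recover kinematics of the ipsilateral arm as well as the contralateral one.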

Credit: 
Society for Neuroscience