Culture

Popular European football games linked to traffic accidents in Asia

Days when high profile European football matches are played are associated with more traffic accidents in Asia than days when less popular matches are played, finds a study in the Christmas issue of The BMJ.

One explanation may be that Asian drivers stay awake until the early hours of the morning to watch high profile football games and lose sleep as a result.

Football is viewed by more people worldwide than any other sport, but most high profile games are played in Europe, so fans who live outside Europe must watch these games at odd local times owing to differences in time zones.

Asian fans are the most affected: a game scheduled to start at 8 pm in Europe means fans in Beijing, Hong Kong, and Singapore have to stay up until 4:30 am to finish it, while fans in Seoul and Tokyo have to stay up until 5:30 am.
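The time difference works out as follows; a minimal sketch, assuming a winter fixture with an 8 pm Central European Time kick-off and treating the match as exactly 90 minutes (both simplifications, not figures from the study):

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # Python 3.9+

# Illustrative winter fixture: 8 pm kick-off in Central European Time,
# with the final whistle taken as 90 minutes later (stoppage time ignored).
kickoff = datetime(2018, 2, 14, 20, 0, tzinfo=ZoneInfo("Europe/Paris"))
final_whistle = kickoff + timedelta(minutes=90)

for city, tz in [("Singapore", "Asia/Singapore"), ("Beijing", "Asia/Shanghai"),
                 ("Seoul", "Asia/Seoul"), ("Tokyo", "Asia/Tokyo")]:
    local_end = final_whistle.astimezone(ZoneInfo(tz))
    print(f"{city}: match ends around {local_end:%H:%M} local time")
# Singapore and Beijing: ~04:30; Seoul and Tokyo: ~05:30 the next morning
```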

Given that sleep deprivation is associated with poor attention management, slower reaction times, and impaired decision making, one theory is that drivers are more likely to be involved in traffic accidents on days when high profile football games air early in the morning.

If true, this would have important policy implications, as traffic accidents can result in considerable economic and medical costs.

To test this idea, a team of researchers based in China, Singapore, and the USA analysed close to 2 million traffic accidents among taxi drivers in Singapore and all drivers in Taiwan, together with 12,788 European club football games over a seven-year period (2012-2018).

The popularity of a given match was determined according to the average market value of teams (a combined measure of players' salaries). With this metric, a game between Manchester United and Tottenham Hotspur will be classified as popular whereas a game between Burnley and Crystal Palace will be considered less popular.

After taking account of potentially influential factors such as driver age, gender and experience, weather conditions, time of year, and weekend versus weekday effects, the researchers found that days when high profile football games were aired also had higher than average traffic accidents in both Singapore and Taiwan.

For an approximate €135m (£120m; $160m) increase in average market value for matches played on a given day, around one extra accident would occur among Singapore taxi drivers, and for an approximate €8m increase in average market value of matches, around one extra accident would occur among all drivers in Taiwan.
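Read literally as a linear relationship, these figures translate a rise in a day's average match market value into an expected number of extra accidents; a minimal sketch using a purely hypothetical market-value increase, not data from the study:

```python
# Approximate per-accident increments quoted above (euros of average market value
# per one additional accident); illustrative use only.
EUR_PER_EXTRA_ACCIDENT_SG_TAXI = 135e6   # Singapore taxi drivers
EUR_PER_EXTRA_ACCIDENT_TW_ALL = 8e6      # all drivers in Taiwan

def extra_accidents(market_value_increase_eur: float, eur_per_accident: float) -> float:
    """Expected extra accidents implied by a rise in the day's average match market value."""
    return market_value_increase_eur / eur_per_accident

# Hypothetical example: a match day whose average market value is EUR 270m above baseline
delta = 270e6
print(extra_accidents(delta, EUR_PER_EXTRA_ACCIDENT_SG_TAXI))  # ~2 extra taxi accidents in Singapore
print(extra_accidents(delta, EUR_PER_EXTRA_ACCIDENT_TW_ALL))   # ~34 extra accidents in Taiwan
```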

Based on these figures, the researchers estimate that football games may be responsible for at least 371 accidents a year among taxi drivers in Singapore (this figure is likely to be much larger across all drivers in Singapore) and around 41,000 accidents per year among the Taiwanese general public.

In terms of annual economic losses, they estimate these to be more than €820,000 among Singapore taxi drivers and almost €14m among Taiwanese drivers and insurance companies, although they stress that these figures should be interpreted with caution.

This is an observational study, so it can't establish cause, although the researchers were able to rule out many alternative explanations such as roadside conditions and driver characteristics. They also point to some limitations, such as a lack of data on the severity of the accidents reported and an inability to compare match days against non-game days.

Nevertheless, they suggest that football's governing bodies could consider scheduling high profile games on Friday or Saturday evenings local European time (Saturday or Sunday early mornings local Asian time) when fans can sleep in immediately after watching games.

Alternatively, increasing roadside safety in Asia on high profile game days (for example, more traffic patrols), as well as banning all video based devices for drivers, could potentially reduce these economic impacts and injuries related to traffic accidents, they write.

Given that Asia has the most populous time zone, encompassing 24% of the world's population, the total health and economic impact of this finding is likely to be much higher, they conclude.

Credit: 
BMJ Group

Individuals with high ADHD-traits are more vulnerable to insomnia

image: Predrag Petrovic, consultant and associate professor in psychiatry at the Department of Clinical Neuroscience at Karolinska Institutet, Sweden.

Image: 
Andreas Andersson

Individuals with high ADHD-traits who do not meet the criteria for a diagnosis are less able to perform tasks involving attentional regulation or emotional control after a sleepless night than individuals with low ADHD-traits, reports a new study from Karolinska Institutet published in Biological Psychiatry: Cognitive Neuroscience and Neuroimaging.

While sleep deprivation can cause multiple cognitive impairments, there is considerable individual variation in sensitivity to its effects. The reason for this variability has long been an unresolved research question. In the present study, KI researchers investigated how sleep deprivation affects our executive functions, that is, the central cognitive processes that govern our thoughts and actions. They also wanted to ascertain whether people with ADHD tendencies are more sensitive to sleep loss, with more severe functional impairments as a result.

ADHD (attention deficit hyperactivity disorder) is characterized by inattention, impulsiveness and hyperactivity; however, the symptoms vary from person to person and often also include emotional instability.

"You could say that many people have some subclinical ADHD-like symptoms but a diagnosis is only made once the symptoms become so prominent that they interfere with our everyday lives," says Predrag Petrovic, consultant and associate professor in psychiatry at the Department of Clinical Neuroscience at Karolinska Institutet, Sweden, who led the study along with Tina Sundelin and John Axelsson, both researchers at Karolinska Institutet and the Stress Research Institute at Stockholm University.

The study included 180 healthy participants between the ages of 17 and 45 without an ADHD diagnosis. Tendencies towards inattentiveness and emotional instability were assessed on the Brown Attention Deficit Disorder (B-ADD) scale.

The participants were randomly assigned to two groups, one that was allowed to sleep normally and one that was deprived of sleep for one night. They were then instructed to perform a test that measures executive functions and emotional control the following day (a Stroop test with neutral and emotional faces).

The researchers found that the sleep-deprived group showed worse performance in the experimental tasks (including more cognitive response variability). Moreover, people with high ADHD-traits were more vulnerable to sleep deprivation and showed greater impairment than those with low ADHD-traits.

The effects also tracked the most prominent type of subclinical ADHD-like symptom: after sleep deprivation, participants who displayed more everyday problems with emotional instability had greater difficulty with the cognitive task involving emotional regulation, while those with more everyday inattention symptoms had greater difficulty with the non-emotional cognitive task.

"One of the reasons why these results are important is that we know that young people are getting much less sleep than they did just ten years ago," explains Dr Petrovic. "If young people with high ADHD-traits regularly get too little sleep they will perform worse cognitively and, what's more, their symptoms might even end up at a clinically significant level."

Credit: 
Karolinska Institutet

Suicide mortality in Maryland during COVID-19 pandemic

What The Study Did: Differences in suicide deaths by race/ethnicity during the COVID-19 pandemic in Maryland were analyzed in this observational study.

Authors: Paul Sasha Nestadt, M.D., of the Johns Hopkins University School of Medicine in Baltimore, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamapsychiatry.2020.3938)

Editor's Note: The article includes funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Suicide risk among patients with Parkinson disease

What The Study Did: Researchers investigated whether Parkinson disease was associated with an increased risk of suicide among a large group of patients in Taiwan.

Authors: Pei-Chen Lee, Ph.D., of the National Taipei University of Nursing and Health Sciences in Taipei, Taiwan, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamapsychiatry.2020.4001)

Editor's Note: The article includes funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

UV exposure, risk of melanoma in skin of color

What The Study Did: The association between ultraviolet (UV) light exposure and the risk of melanoma in individuals with skin of color was examined with a review of the results of 13 studies.

Authors: Adewole S. Adamson, M.D., M.P.P., of the University of Texas at Austin, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamadermatol.2020.4616)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Giving cells an appetite for viruses

image: A UT Southwestern study identified a gene used in the cellular recycling process called autophagy that rids cells of viruses. The above illustration breaks down the steps involved in this process.

Image: 
UT Southwestern Medical Center

DALLAS - Dec. 16, 2020 - A team led by UT Southwestern researchers has identified a key gene necessary for cells to consume and destroy viruses. The findings, reported online today in Nature, could lead to ways to manipulate this process to improve the immune system's ability to combat viral infections, such as those fueling the ongoing COVID-19 pandemic.

Scientists have long known that cells use a process called autophagy to rid themselves of unwanted material. Autophagy, which translates as "self-eating," involves isolating cellular garbage in double-layered vesicles called autophagosomes, which are then fused with single-layered vesicles known as lysosomes to degrade the materials inside and recycle them into building blocks for other uses.

This process helps cells discard old or defective organelles and protein complexes, bacteria, and viral invaders. For a variety of cellular refuse tasks, researchers have identified clear pathways by which cells initiate and regulate autophagy. But it's been unclear to date, notes study leader Xiaonan Dong, Ph.D., lead author of the study and assistant professor of internal medicine at UTSW, whether a unique pathway specifically targeted viruses for autophagy.

In this latest research, Dong and his colleagues manipulated human cells infected by different viruses to individually deplete more than 18,000 different genes, examining their effects on autophagy. As their initial models, the researchers infected cells with herpes simplex virus type 1 (HSV-1), which causes cold sores and sexually transmitted infections, and Sindbis virus, which causes a mosquito-borne illness most common in parts of Africa, Egypt, the Middle East, the Philippines, and Australia.

The team's investigation identified 216 genes that appear to play a role in viral autophagy. To narrow their search to the key players, the researchers used bioinformatics to analyze the biological processes these genes regulate. They quickly homed in on a gene called sorting nexin 5 (SNX5), which produces a protein that helps recycle plasma membrane-anchored proteins from endosomes, the sorting organelles that often ferry material from outside the cell into its interior. Because viruses often enter cells through this pathway, the team reasoned, SNX5 could be important for their autophagy.

When the scientists shut down SNX5 in human cells, the cells' ability to perform autophagy on HSV-1 and Sindbis viruses significantly decreased - however, their ability to activate autophagy as part of normal cellular cleaning or for bacteria removal stayed intact, suggesting that this gene is specifically used for viral autophagy. This effect persisted for several other viruses, including Zika, West Nile, chikungunya, poliovirus, Coxsackievirus B3, and influenza A, suggesting that SNX5 is part of a general mechanism for autophagy across a broad range of viruses.

Further experiments showed that deleting SNX5 greatly increased susceptibility to infection in both lab-grown cells and in adult and juvenile animals. However, when these cells were infected by viruses manipulated to suppress their ability to induce autophagy, they were largely spared.

Together, explains Dong, these results suggest that cells do have a unique pathway for viral autophagy that likely has SNX5 at the helm. This finding not only solves a long-standing mystery in basic biology, he adds, but it could eventually lead to new ways of fighting viral infections. Most current methods target viruses through their individual weaknesses - an approach that necessitates developing a unique strategy for every viral type.

"By learning how cells naturally take up and degrade viruses," Dong says, "we could discover ways to augment this process, creating a more general strategy for developing broad-spectrum antiviral therapeutics that combat an array of different viral infections."

Co-leaders of this research effort were the late Beth Levine, M.D., former director of the Center for Autophagy Research and professor of internal medicine and microbiology at UT Southwestern, and Ramnik J. Xavier, M.D., the Kurt Isselbacher Professor of Medicine at Harvard Medical School and a core member of the Broad Institute of MIT and Harvard. Levine, a world-renowned leader in the study of autophagy, died in June. She also was an investigator with the Howard Hughes Medical Institute and member of the National Academy of Sciences. At UT Southwestern, she held the Charles Sprague Distinguished Chair in Biomedical Science.

"This is a beautiful study that further cements the legacy of Dr. Levine and work from her lab members," says Julie Pfeiffer, Ph.D., professor of microbiology at UTSW and a contributor to the study who holds the Kern and Marnie Wildenthal President's Research Council Professorship in Medical Science.

Credit: 
UT Southwestern Medical Center

Study highlights stark inequality in survival after cardiac surgery between paying and NHS patients

A new study has revealed paying patients are 20 per cent less likely to die or develop major complications, such as reintervention or stroke, after cardiac surgery than NHS patients - findings researchers say cannot be explained by socioeconomic factors alone.

The study, led by academics at the University of Bristol, looked at the data of over 280,000 patients who underwent adult cardiac surgery over a ten-year period from 2009 to 2018 at 31 NHS cardiac units in England. Of these, 5,967 were private payers and 274,242 were government funded.

Private payers are treated by the same clinical teams but can request their surgeon and when to have surgery. They also have access to 'Cinderella services' such as enhanced menus and single rooms.

Umberto Benedetto, Associate Professor in Cardiac Surgery in Bristol Medical School: Translational Health Sciences (THS) from the University of Bristol is an NHS cardiac surgeon and joint lead author of the research, published in The Lancet Regional Health - Europe. He said the study posed important questions about why there is a significant disparity in health outcomes between government funded patients and those accessing NHS healthcare through a private payer-scheme, even after socio-economic factors had been considered.

"These are patients who are treated in the same hospitals, by the same clinical teams, and yet we have found stark differences in survival between those who pay and those who don't," said Prof Benedetto.

"It's tempting to assume this is due to private payers having a more affluent and therefore better quality of life with fewer comorbidities. However, after analysing the data, we found evidence supporting the hypothesis that private patients receive a better care."

Researchers used data from the National Adult Cardiac Surgery Audit (NACSA) registry. They looked at several outcomes: the primary being in-hospital mortality, but also incidence of in-hospital postoperative cerebrovascular accident (CVA), renal dialysis, sternal wound infection, and re-exploration.

To eliminate socioeconomic status as the sole or primary cause of difference in clinical outcomes between private and NHS payers, the researchers used information on socioeconomic status by linkage with the Iteration of the English Indices of Deprivation (IoD).

Dr Arnaldo Dimagli, Honorary Research Fellow at the University of Bristol and joint lead author, said:

"Our findings support the hypothesis that a complex interaction between socioeconomic and health system-related factors exists for patients undergoing cardiac surgery. This should stimulate further investigations in order to identify interventions which can tackle health inequalities. For example, it is possible that NHS payers have to wait longer to get their operation. This can expose them to the risk of deterioration before surgery which can affect their outcomes."

Credit: 
University of Bristol

Female language style promotes visibility and influence online

A female-typical language style promotes the popularity of talks in the digital context and turns out to be an underappreciated but highly effective tool for social influence. This was shown by UZH psychologists in an international study in which they analyzed 1,100 TED Talks.

A large part of social interaction nowadays takes place digitally. And the digital age has brought new opportunities to interact and communicate with increasingly large audiences. The huge power for social influence of digital media may come with the risk of intensifying common societal biases, such as gender stereotypes. One behavioral manifestation that plays a major role in such social evaluations is language use. In past research that focused on offline contexts, male characteristics were associated with more influence, while female characteristics tended to be associated with less competence.

Men and women show different language styles

To investigate how gender-linked language styles influence the impact of online contributions and whether they are subject to the same rules as in offline environments, an international research group led by the University of Zurich made use of the TED science platform. The topics of the TED Talks range from technology, entertainment and design (TED) to global issues, business, science and culture, with an average two million views for each talk.

The researchers collected the transcripts of nearly 1,100 TED Talks (348 of which were given by women) in order to identify typical male and female language styles. For this purpose, an index was used that placed each speaker between the extremes of very masculine and very feminine speech, which were empirically defined on the basis of large samples. According to previous research, men commonly use more abstract and analytical language, while female-typical language has been described as more narrative, personal, social and emotional; women tend to refer to themselves and to other people more than men do.
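The press release does not detail how that masculine-feminine index was constructed; purely as an illustration of the general idea, here is a minimal, hypothetical sketch that scores a transcript by how much its word use leans toward a female-typical rather than a male-typical word list (the word lists below are invented for the example; the study's index was derived empirically from large samples):

```python
# Hypothetical word lists, loosely reflecting the styles described above; the study's
# actual index was built empirically from large reference samples, not from lists like these.
FEMALE_TYPICAL_WORDS = {"i", "we", "you", "feel", "felt", "family", "friend", "together"}
MALE_TYPICAL_WORDS = {"analysis", "system", "data", "therefore", "rate", "process"}

def style_index(transcript: str) -> float:
    """Score in [-1, 1]: positive leans female-typical, negative leans male-typical."""
    words = [w.strip(".,!?\"'").lower() for w in transcript.split()]
    fem = sum(w in FEMALE_TYPICAL_WORDS for w in words)
    mal = sum(w in MALE_TYPICAL_WORDS for w in words)
    total = fem + mal
    return 0.0 if total == 0 else (fem - mal) / total

print(style_index("I feel we did this together, as a family."))            # close to +1
print(style_index("The data analysis therefore shows the failure rate."))  # close to -1
```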

Female-typical language is linked to higher impact

The researchers correlated the identified speech styles with the number of views as well as the positive and negative ratings the talks received. By doing so, they wanted to find out which language style had a stronger effect in TED Talks: A more instrumental and complex male-typical language style or a simpler and more personal female-typical language style.

"Due to the well documented male advantage in social influence, we expected a general advantage of male-typical language style in terms of talk impact," says first author and UZH psychologist Tabea Meier. "We assumed that this might be the case for women in particular, namely that a male language style might help them overcome the ascribed lower status typically associated with their gender." Instead, the researchers found the opposite - that female-typical language was connected with higher impact.

TED Talks that showed a female-typical language were associated with more talk views irrespective of the speaker's gender. "Female-typical language thus conferred an advantage for male and female speakers alike in our sample," says Meier. "For the most popular TED talks this meant over 700,000 extra views online."

"Beautiful" and "courageous" versus "fascinating" and "informative"

The language style not only predicted the quantitative impact manifested in talk views but also the qualitative impact in types of positive and negative ratings received. More female-typical language style attracted positive ratings like "beautiful", "courageous" and "funny" while male-typical language evoked positive ratings like "ingenious", "fascinating", "informative" and "persuasive".

"These different qualities are in line with common gender stereotypes of women being warmer and emotionally expressive, and men more factual," says author and research group leader Andrea Horn. However, in the digital sphere, unlike offline, such qualities did not interfere with popularity. This was also reflected in the negative ratings of the TED Talks. More female-typical language style also led to fewer "unconvincing" ratings. The authors conclude that a female-typical language style may thus be a powerful tool to promote impact and visibility irrespective of whether the speaker is male or female.

Credit: 
University of Zurich

A pair of lonely planet-like objects born like stars

image: Artist's composition of the two brown dwarfs, in the foreground Oph 98B in purple, in the background Oph 98A in red. Oph 98A is the more massive and therefore more luminous and hotter of the two. The two objects are surrounded by the molecular cloud in which they were formed.

Image: 
© University of Bern, Illustration: Thibaut Roger

Star-forming processes sometimes create mysterious astronomical objects called brown dwarfs, which are smaller and colder than stars, and can have masses and temperatures down to those of exoplanets in the most extreme cases. Just like stars, brown dwarfs often wander alone through space, but can also be seen in binary systems, where two brown dwarfs orbit one another and travel together in the galaxy.

Researchers led by Clémence Fontanive from the Center for Space and Habitability (CSH) and the NCCR PlanetS discovered a curious starless binary system of brown dwarfs. The system CFHTWIR-Oph 98 (or Oph 98 for short) consists of the two very low-mass objects Oph 98 A and Oph 98 B. It is located 450 light years away from Earth in the stellar association Ophiuchus. The researchers were surprised by the fact that Oph 98 A and B are orbiting each other from a strikingly large distance, about 5 times the distance between Pluto and the Sun, which corresponds to 200 times the distance between the Earth and the Sun. The study has just been published in The Astrophysical Journal Letters.

Extremely low masses and a very large separation

The pair is a rare example of two objects similar in many aspects to extra-solar giant planets, orbiting around each other with no parent star. The more massive component, Oph 98 A, is a young brown dwarf with a mass of 15 times that of Jupiter, which is almost exactly on the boundary separating brown dwarfs from planets. Its companion, Oph 98 B, is only 8 times heavier than Jupiter.

Components of binary systems are tied by an invisible link called gravitational binding energy, and this bond gets stronger when objects are more massive or closer to one another. With extremely low masses and a very large separation, Oph 98 has the weakest binding energy of any binary system known to date.
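For two bodies of masses m1 and m2 separated by a distance a, that binding energy scales roughly as G·m1·m2/(2a), so lower masses and a wider orbit both weaken the bond. A minimal sketch, assuming the approximate figures quoted here (15 and 8 Jupiter masses, a separation of about 200 astronomical units) and comparing against the Sun-Earth pair for scale:

```python
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_JUP = 1.898e27     # Jupiter mass, kg
M_SUN = 1.989e30     # solar mass, kg
M_EARTH = 5.972e24   # Earth mass, kg
AU = 1.496e11        # astronomical unit, m

def binding_energy(m1: float, m2: float, a: float) -> float:
    """Approximate gravitational binding energy of a two-body orbit of size a, in joules."""
    return G * m1 * m2 / (2 * a)

oph98 = binding_energy(15 * M_JUP, 8 * M_JUP, 200 * AU)
sun_earth = binding_energy(M_SUN, M_EARTH, 1 * AU)
print(f"Oph 98:    {oph98:.2e} J")
print(f"Sun-Earth: {sun_earth:.2e} J")   # Oph 98 comes out several times weaker
```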

Discovery thanks to data from Hubble

Clémence Fontanive and her colleagues discovered the companion to Oph 98 A using images from the Hubble Space Telescope. Fontanive says: "Low-mass brown dwarfs are very cold and emit very little light, only through infrared thermal radiation. This heat glow is extremely faint and red, and brown dwarfs are hence only visible in infrared light." Furthermore, the stellar association in which the binary is located, Ophiuchus, is embedded in a dense, dusty cloud which scatters visible light. "Infrared observations are the only way to see through this dust", explains the lead researcher. "Detecting a system like Oph 98 also requires a camera with a very high resolution, as the angle separating Oph 98 A and B is a thousand times smaller than the size of the moon in the sky," she adds. The Hubble Space Telescope is among the few telescopes capable of observing objects as faint as these brown dwarfs, and able to resolve such tight angles.
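The quoted angular scale can be checked with small-angle geometry, where the angle in radians is simply the physical separation divided by the distance; a rough sketch using the approximate values given in this article (about 200 astronomical units at 450 light years) and the Moon's roughly half-degree apparent diameter:

```python
import math

AU_PER_LIGHT_YEAR = 63241.1                     # astronomical units in one light year
ARCSEC_PER_RADIAN = 180 / math.pi * 3600

separation_au = 200.0                            # approximate Oph 98 A-B separation
distance_au = 450 * AU_PER_LIGHT_YEAR            # approximate distance to the system

angle_arcsec = separation_au / distance_au * ARCSEC_PER_RADIAN
moon_arcsec = 0.5 * 3600                         # Moon's apparent diameter, ~0.5 degrees

print(f"Oph 98 A-B separation on the sky: {angle_arcsec:.1f} arcsec")       # ~1.4 arcsec
print(f"Ratio to the Moon's diameter: 1/{moon_arcsec / angle_arcsec:.0f}")  # on the order of 1/1000
```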

Brown dwarfs are cool enough for water vapor to form in their atmospheres, creating prominent features in the infrared that are commonly used to identify brown dwarfs. However, these water signatures cannot be easily detected from the surface of the Earth. Located above the atmosphere in the vacuum of space, Hubble makes it possible to probe the existence of water vapor in astronomical objects. Fontanive explains: "Both objects looked very red and showed clear signs of water molecules. This immediately confirmed that the faint source we saw next to Oph 98 A was very likely to also be a cold brown dwarf, rather than a random star that happened to be aligned with the brown dwarf in the sky."

The team also found images in which the binary was visible, collected 14 years ago with the Canada-France-Hawaii Telescope (CFHT) in Hawaii. "We observed the system again this summer from another Hawaiian observatory, the United Kingdom Infra-Red Telescope. Using these data, we were able to confirm that Oph 98 A and B are moving together across the sky over time, relative to other stars located behind them, which is evidence that they are bound to each other in a binary pair", explains Fontanive.

An atypical result of star formation

The Oph 98 binary system formed only 3 million years ago in the nearby Ophiuchus stellar nursery, making it a newborn on astronomical timescales. The age of the system is much shorter than the typical time needed to build planets. Brown dwarfs like Oph 98 A are formed by the same mechanisms as stars. Despite Oph 98 B being the right size for a planet, the host Oph 98 A is too small to have a sufficiently large reservoir of material to build a planet that big. "This tells us that Oph 98 B, like its host, must have formed through the same mechanisms that produce stars, and shows that the processes that create binary stars operate in scaled-down versions all the way down to these planetary masses," comments Clémence Fontanive.

With the discovery of two planet-like worlds - already uncommon products of star formation - bound to each other in such an extreme configuration, "we are really witnessing an incredibly rare output of stellar formation processes", as Fontanive describes.

Bernese space exploration: With the world's elite since the first moon landing

When the second man on the moon, "Buzz" Aldrin, stepped out of the lunar module on July 21, 1969, his first task was to set up the Bernese Solar Wind Composition experiment (SWC), also known as the "solar wind sail", by planting it in the lunar soil, even before the American flag. This experiment, which was planned, and its results analysed, by Prof. Dr. Johannes Geiss and his team from the Physics Institute of the University of Bern, was the first great highlight in the history of Bernese space exploration.

Ever since, Bernese space exploration has been among the world's elite. The numbers are impressive: instruments were flown into the upper atmosphere and ionosphere on rockets 25 times (1967-1993) and into the stratosphere on balloon flights 9 times (1991-2008), over 30 instruments have flown on space probes, and with CHEOPS the University of Bern shares responsibility with ESA for a whole mission.

Credit: 
University of Bern

Turning sweat against itself with a metal-free antiperspirant

Body odor is an unpleasant smell, produced when bacteria living on the skin break down the proteins in sweat. To avoid stinking, some people apply antiperspirants that clog sweat ducts with foreign materials, such as metals, to slow perspiration. As a step toward a more natural solution, researchers reporting in ACS Applied Materials & Interfaces have turned sweat against itself using an evaporation-based approach in which the salts in sweat create a gel-like plug.

Most commercial antiperspirants use metallic salts, such as aluminum chlorohydrate, to clog sweat ducts. Years ago, it was suggested that using these products could increase the risk of developing breast cancer, but according to the National Cancer Institute, no scientific evidence exists that confirms those claims. Nevertheless, consumer and regulatory demand has increased for natural, non-metallic alternatives. Because propylene glycol strongly attracts water and is "generally recognized as safe" by the U.S. Food & Drug Administration, Jonathan Boreyko and colleagues wanted to see if it would draw the water out of sweat, leaving behind a natural salt blockade that would stop the flow.

To simulate a sweat duct in the laboratory, the researchers slowly pushed an artificial sweat solution through a thin glass tube. Then, they placed a polymer cube covered in propylene glycol near the opening. The sweat solution initially bubbled, then formed a gel-like plug that obstructed the tube within two minutes. However, when the cube did not contain propylene glycol or the cube was absent, sweat continually exited the outlet and ran along the outside surface of the tube. The team used glass tubes that were 3.3 times wider than real sweat ducts in human skin, but their models suggest that applying a propylene glycol film directly on skin would also result in dehydrated salts and plug formation. The next step in this research would be to conduct human trials to compare the comfort and effectiveness of a film made of propylene glycol, or other water-attracting materials, against traditional metal-based antiperspirants, the researchers say.

Credit: 
American Chemical Society

Clemson researcher identifies gene teams working in subregions of brain

image: Clemson University graduate student Yuqing "Iris" Hang said researchers must know how genes interact in a normal brain before they can look at whether they act differently in brain tumors or in people with intellectual disabilities.

Image: 
Photo courtesy of Clemson University College of Science and Yuqing Hang

CLEMSON, South Carolina - You must first understand how something works normally before you can figure out why it's broken.

That's the idea behind brain research by Yuqing "Iris" Hang, a Clemson University graduate student pursuing a doctorate in biochemistry in the College of Science's Department of Genetics and Biochemistry.

Hang is studying how genes interact in a normal brain.

"When studying a disease trait, we always need to compare that trait to the normal condition," she said.

An article published recently in Scientific Reports titled "Exploration into biomarker potential of region-specific brain gene co-expression networks" describes Hang's research.

Hang used RNA-seq gene expression profiles from the National Institutes of Health Genotype-Tissue Expression (GTEx) project and software called the Knowledge Independent Network Construction (KINC) to sort thousands of genes into gene groups that team up in specific parts of the normal human brain. Clemson professor Alex Feltus and former graduate student Stephen Ficklin, who is now an assistant professor at Washington State University, developed KINC. Hang is a student in Feltus' Systems Genetics Lab.

Hang then constructed six mini gene co-expression networks for a normally functioning brain and conducted additional statistical tests to reveal correlations between genes and cellular pathways in the brain. In a gene co-expression network, two genes with a high likelihood of interacting with each other will be connected by a line called an edge.
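As a toy illustration of that edge-building idea (not the actual KINC algorithm, which is considerably more sophisticated), the sketch below links two genes whenever their expression profiles across samples are strongly correlated, using made-up data and the numpy and networkx libraries:

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)

# Hypothetical expression matrix: rows are genes, columns are tissue samples
genes = ["GENE_A", "GENE_B", "GENE_C", "GENE_D"]
expression = rng.normal(size=(4, 50))
expression[1] = expression[0] * 0.9 + rng.normal(scale=0.3, size=50)  # make A and B co-expressed

network = nx.Graph()
network.add_nodes_from(genes)
corr = np.corrcoef(expression)   # pairwise correlations between gene expression profiles
threshold = 0.8
for i in range(len(genes)):
    for j in range(i + 1, len(genes)):
        if abs(corr[i, j]) >= threshold:
            # an "edge" joins two genes whose expression rises and falls together
            network.add_edge(genes[i], genes[j], weight=corr[i, j])

print(network.edges(data=True))   # expect a single edge: GENE_A -- GENE_B
```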

During her analysis, Hang found teams of genes working in all parts of the brain -- but only in the brain -- and she also found teams of genes working in subregions of the brain.

"We have brain biomarkers -- genes that are co-expressed pair-wise in different brain tissues -- that are evidence that they are probably doing something normal in those tissues they're not doing in other tissues. They have some collective function," Feltus said. Researchers can now test each of those gene teams to see if gene pairs are changing in brain tumors or in people with intellectual disabilities.

"Now we know what genes are interacting normally in the brain, and then we can test and see if those interactions don't exist in disease," he said.

One notable finding is that genes in the six brain mini gene co-expression networks showed higher mutation rates in tumors than in matched sets of random genes.

"This means those genes could play an important role in brain development. The mutation of those genes would increase the chances of getting a brain tumor or brain-related disease," Hang said.

Feltus called Hang's research a step toward the goal of finding gene teams that are controlling phenotypes.

"One hundred years from now, I don't think there will be any cancer, and I think we'll understand how intellectual disabilities are occurring," he said.

Credit: 
Clemson University

BAME babies at highest risk of Vitamin D deficiency

A third of all babies and half of Black, Asian and Minority Ethnic (BAME) babies are vitamin D deficient, a large study of 3,000 newborns in the West Midlands has shown, highlighting potential shortfalls in the current UK antenatal supplementation programme.

Vitamin D, sometimes referred to as the 'sunshine vitamin', helps the body absorb calcium and phosphate from our diet, making it vital for healthy bones, teeth and muscles. As well as causing bone softness and weakness, vitamin D deficiency in newborn infants can lead to serious life-threatening complications such as seizures, serious heart conditions and, rarely, death in the first months of life. With very few dietary sources of vitamin D, supplementation programmes are in place to ensure adequate vitamin D intake in high-risk groups, which include pregnant women and children.

This latest study, led by experts at the University of Birmingham and Birmingham Women's and Children's NHS Foundation Trust, analysed vitamin D levels in 3,000 dried blood samples collected via a heel prick in the first week of life as part of the national Newborn Blood Spot screening programme. Samples were strategically collected at the end of summer and winter to capture the peak and trough in vitamin D levels. Vitamin D levels were analysed alongside ethnicity, gestational age, maternal age and deprivation indices. The proportions of babies with deficient, insufficient and sufficient levels of vitamin D were evaluated by season of birth and ethnicity.

The majority of newborns tested were white British (59.1%) and born at term. Vitamin D deficiency was present in 35% of the cohort. The results also demonstrate significant seasonal differences, with 52.6% of winter-born babies being vitamin D deficient compared with 18.4% of summer-born babies. Nearly a quarter of babies tested (24%) were from areas with high levels of social deprivation.

Perhaps most significant is the difference in the prevalence of vitamin D deficiency between ethnic groups. Compared with white British babies, babies of Black, Asian and mixed ethnic backgrounds, as well as non-British white babies, had much lower concentrations of the vitamin. Overall, across both seasons, nearly half of the babies from Asian and Black ethnic backgrounds were found to be deficient in vitamin D (47.7% and 47.4%, respectively) compared with 30.3% of white British babies. Across the entire multi-ethnic cohort, nearly 70% of the babies had a low vitamin D status, meaning that more than two thirds of the babies tested were either deficient or had insufficient levels.

Lead author Dr Suma Uday from the University of Birmingham and Birmingham Women's and Children's Hospital said: "Vitamin D deficiency is common in all babies born in the UK, especially in the winter months. The high proportion of dark-skinned infants with low vitamin D status demonstrates potential failings of the UK's national antenatal supplementation programme in protecting these ethnic groups, who are well recognised to be at high risk of vitamin D deficiency. We need to work on addressing the disconnect between provision and uptake of vitamins in high-risk groups like expectant mothers from BAME backgrounds."

Senior author Professor Wolfgang Högler from the University of Birmingham's Institute of Metabolism and Systems Research said: "Our findings suggest that vitamin D supplementation programmes could be much improved if they were delivered and monitored like immunisation programmes. Much easier and more effective would be food fortification with vitamin D, an approach that we have seen to be successful in other high-latitude countries, such as Finland."

Credit: 
University of Birmingham

Whole genomes map pathways of chimpanzee and bonobo divergence

Chimpanzees and bonobos are sister species that diverged around 1.8 million years ago as the Congo River formed a geographic boundary and they evolved in separate environments. Now, a whole-genome comparison of bonobos and chimpanzees reveals the gene pathways associated with the striking differences between the two species' diets, sociality and sexual behaviors.

The journal Genes, Brain and Behavior published the comparative analysis, conducted by anthropologists at Emory University.

"Our paper is the first whole-genome positive selection scan between chimpanzees and bonobos," says John Lindo, Emory assistant professor of anthropology and senior author of the study. "We contrasted the genomes of both species to understand how natural selection has shaped differences between the two closely related primates."

Lindo is a geneticist specialized in ancient DNA and natural selection. "Chimpanzees and bonobos are fascinating because they are very, very closely linked genetically but they have huge behavioral differences," he says.

The two species also share around 99 percent of their DNA with humans, making them our closest living relatives in the animal kingdom. "Understanding the physiological mechanisms underlying the differences in chimpanzee and bonobo behaviors -- particularly the much stronger propensity of bonobos toward conflict resolution instead of fighting -- may also give us information about the genes underlying our own behaviors," Lindo says.

Sarah Kovalaskas, an Emory graduate student of anthropology, is first author of the paper. Before joining Emory she spent nine months in the field, studying the social development of juvenile bonobos in the Democratic Republic of Congo (DRC). Wild bonobos, an endangered species, are only found in forests south of the Congo River in the DRC.

"Bonobos are well-known for being playful, even as adults," Kovalaskas says. "It was fun to observe the juveniles twirling around in the trees, chasing one another and trying to pull each other down. When the mothers tried to wean them, they would sometimes throw tantrums and scream and run around. You can't help but recognize the similarity in behaviors to humans."

Populations of chimpanzees, also an endangered species, are found in a forested belt north of the Congo River and scattered in a few other areas of west and central Africa.

Bonobos and chimpanzees closely resemble one another physically and they were not recognized as separate species until the 1930s. Their behavioral differences are much more distinct. While bonobos organize into female-led societies, chimpanzees are patriarchal. When bonobos encounter other bonobo groups they generally interact peacefully. Bonobos are also known for using sexual behaviors to defuse tension -- including same-sex behaviors among females. Chimpanzees, however, tend to act more aggressively when encountering other chimpanzee groups and may even have violent exchanges that include fatalities.

A leading hypothesis suggests that different feeding ecologies were key to the behavioral divergence between the two species. This theory posits that the abundant ground vegetation in the bonobo territory provided easy access to year-round food without competition from other individuals. Larger groups could feed together instead of foraging in isolation, allowing females to develop strong bonds to counter male domination, and to mate with less aggressive males, leading to a kind of "self-domestication."

The whole genome comparison showed selection in bonobos for genes related to the production of pancreatic amylase -- an enzyme that breaks down starch. Previous research has shown that human populations that began consuming more grains with the rise of agriculture show an increase in copies of a closely related gene that codes for amylase.

"Our results add to the evidence that diet and the available resources had a definite impact on bonobo evolution," Kovalaskas says. "We can see it in the genome."

Compared to chimpanzees, bonobos also showed differences in genetic pathways well-known to be related to social behaviors of animals -- as well as humans. Bonobos had strong selection for genes in the oxytocin receptor pathway, which plays a role in promoting social bonds; serotonin, involved in modulating aggression; and gonadotropin, known to affect sexual behavior.

"The strong female bonds among bonobos, in part, may be mediated by their same-sex sexual behaviors," says co-author James Rilling, professor and chair of Emory's Department of Anthropology. "Our data suggest that something interesting is going on in the bonobo pathways for oxytocin, serotonin and gonadotropin and that future research into the physiological mechanisms underlying behavioral differences between bonobos and chimpanzees may want to target those specific systems."

Credit: 
Emory Health Sciences

Cancer: Tumor driver promoting EMT, metastasis and resistance to therapy

Cancer metastasis, the dissemination of tumor cells into distant organs, is the leading cause of mortality in cancer patients. To undergo metastasis, cells must leave the primary tumor, circulate in the blood, colonize distant organs, and form distant metastases. It has been proposed that epithelial to mesenchymal transition (EMT), a process in which epithelial cells detach from their neighboring cells and acquire mesenchymal migrating properties, is important to initiate the metastatic cascade, allowing the cancer cells to leave the primary tumor. However, the role of genetic mutations in promoting EMT is unknown.

FAT1 is among the most frequently mutated driver genes in a broad range of human cancers. The loss-of-function mutations in this gene suggest that FAT1 acts as a tumor suppressor, preventing cancer development. However, despite the high frequency of FAT1 mutations, its role in cancer is poorly understood.

In a study published in Nature, researchers led by Prof. Cedric Blanpain, MD/PhD, WELBIO investigator, Director of the Laboratory of Stem Cells and Cancer and Professor at the Université libre de Bruxelles, Belgium, demonstrated for the first time that loss of FAT1 promotes EMT, invasive features and metastasis in skin squamous cell carcinoma (the second most frequent cancer in humans), lung cancer (the deadliest cancer) and head and neck tumors.

Ievgenia Pastushenko and colleagues used state-of-the-art genetic models of skin and lung cancers, as well as human skin, lung and head and neck tumors, to assess the role of FAT1 in cancer.

The authors discovered that loss of function of FAT1 promotes a hybrid EMT phenotype, characterized by the co-expression of epithelial and mesenchymal genes in tumor cells. They demonstrated that this hybrid EMT state, occurring following FAT1 loss of function, promotes metastasis and is associated with poor clinical outcome in patients with lung cancers. "It was particularly exciting to identify that mutations in a single gene, FAT1, promote a hybrid EMT state, leading to metastasis and associated with poor prognosis in cancer patients," comments Ievgenia Pastushenko, the first author of this study.

Using different molecular approaches, the authors deciphered the mechanisms by which FAT1 mutations promote the hybrid EMT state. "The identification of the mechanisms that promote this highly metastatic tumor state allowed us to identify drug resistance and vulnerabilities in FAT1 mutated cancers. We found that FAT1 mutated cancers are highly resistant to several drugs, including EGFR inhibitors that are frequently used to treat patients with lung cancers. Most interestingly, we identified that FAT1 mutated cancers are particularly sensitive to other drugs, including Src inhibitors that are currently used to treat patients with blood cancers. These findings will have very important and immediate implications for personalized therapy in patients with FAT1 mutated cancers," comments Prof. Cedric Blanpain, the senior author of this study.

Credit: 
Université libre de Bruxelles

Connections determine everything

image: Stroke Popular Figure

Image: 
M. Nazarova et al.

A team of scientists, with the first author from HSE University, investigated which factors are the most important for upper limb motor recovery after a stroke. The study is published in Stroke, the world's leading journal for cerebrovascular pathology.

A stroke occurs when a blood vessel feeding the brain is blocked or ruptures. About 400,000 stroke cases are registered in Russia annually. Stroke occurs mostly at older ages, but not exclusively: in Russia, for instance, every seventh stroke happens in a young person. Today, more than 1.5 million people living in Russia need motor, speech, or cognitive rehabilitation because of a stroke. Impairment of the upper limb is the most common and one of the most challenging to rehabilitate.

The extent to which the brain is damaged and movements are impaired after a stroke varies greatly among patients. How effectively a patient regains function after a stroke depends both on the severity of the damage and on the adequacy of the rehabilitation interventions. Assessment can be conducted using functional and structural brain imaging approaches.

The published article is dedicated to a study of the relationship between the degree of movement recovery in the upper limb and the structural and functional state of the motor system, assessed using transcranial magnetic stimulation (TMS) and magnetic resonance imaging (MRI). The study included 35 patients of young and middle age (on average 47 years old) who had had a stroke more than six months prior to the study. The patients were divided into three groups depending on the level to which they regained movement in their upper limb: good, moderate, and poor.

The study showed that the structural integrity of the corticospinal tract assessed using structural MRI is the best predictor for the upper limb motor recovery. At the same time, the assessment of the state of the unaffected cerebral hemisphere and the integrity of the connections between the hemispheres did not have an additional value for classification when data on the integrity of the corticospinal tract were available.

The corticospinal tract is a pathway that connects the cerebral cortex with the spinal cord, allowing voluntary limb movement.

In addition, the study showed that MRI and TMS are equally effective for assessing the condition of the corticospinal tract in patients in the chronic phase after stroke and can be used interchangeably when it is necessary to select a group of patients with low levels of recovery.

'However, it is important to consider that when assessing the motor system using TMS, it is necessary to study the evoked motor responses in several muscles of the upper limb. This makes it possible to reduce the number of false negative results--cases in which muscle responses to stimulation were not found, although the corticospinal tract was partially preserved,' says Maria Nazarova, research fellow at the Institute for Cognitive Neurosciences of HSE University.

The findings of the study can be used both to better understand the processes of recovery after a stroke and to plan individual motor rehabilitation in stroke survivors.

Credit: 
National Research University Higher School of Economics