
Social isolation linked to more severe COVID-19 outbreaks

image: Italy - proportion of people aged >80 infected by SARS-CoV-2 in the general population, adjusted for the percentage of people aged >80 in the general population, nursing-home bed rate and epidemic maturity (multivariable linear regression)

Image: 
Liotta et al, 2020 (PLOS ONE)

Regions of Italy with higher family fragmentation and a high number of residential nursing homes experienced the highest rate of COVID-19 infections in people over age 80, according to a new study published May 21, 2020 in the open-access journal PLOS ONE by Giuseppe Liotta of the University of Rome, Italy, and colleagues.

Italy has been one of the countries most affected by the COVID-19 pandemic. Researchers have speculated that this is due to Italy's age demographics as well as the connectedness of the older and younger generations and high rate of intergenerational contact. If true, this would suggest that regions with larger households would have more severe COVID-19 outbreaks in older adults.

In the new study, researchers used publicly available data published by each Italian administrative region, as well as daily COVID-19 situation reports published by the Italian Ministry of Health spanning February 28 through March 31, 2020. All household and population data were extracted between April 1 and 7, 2020.

Across Italian regions, the COVID-19 incidence rate ranged from 0.27% to 4.09% of the population. The mean number of household members ranged from 2.02 to 2.58; the percentage of one-member households ranged from 28.5% to 40.9%; and the percentage of COVID-19 cases occurring in people over age 80 ranged from 4.3% to 23.6%. A model reflecting the percentage of the population over age 80, the days since 50 cases were registered, the nursing-home bed rate, and the mean number of household members was best able to predict COVID-19 incidence among older people in each region, with an adjusted R-squared value of 0.695.
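The model described above is a standard multivariable linear regression. As a rough illustration of how such a fit and its adjusted R-squared are computed (using entirely made-up data, not the study's):

```python
import numpy as np

# Hypothetical illustration of the kind of multivariable linear model the
# study describes: predict regional incidence among people aged >80 from
# regional covariates. All numbers below are invented for demonstration.
rng = np.random.default_rng(0)
n = 20  # hypothetical number of regions

# Made-up predictors: % population >80, days since 50th case,
# nursing-home bed rate, mean household size.
X = rng.uniform(size=(n, 4))
# Made-up outcome with a known linear signal plus a little noise.
y = X @ np.array([0.5, 0.02, 1.5, -0.8]) + 0.05 * rng.normal(size=n)

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

# R-squared and adjusted R-squared (penalising the number of predictors).
resid = y - A @ beta
ss_res = float(resid @ resid)
ss_tot = float(((y - y.mean()) ** 2).sum())
r2 = 1 - ss_res / ss_tot
k = X.shape[1]
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)
print(round(r2, 3), round(adj_r2, 3))
```

The adjustment shrinks R-squared to account for the four predictors relative to the number of regions, which is why the study reports the adjusted figure.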

The authors add: "Variables associated with social isolation are risk factors for an increase in the proportion of cases in Italian patients aged >80 years among the total number of cases." Professor Liotta also notes that "the nursing homes bed rate is one of the determinants of the SARS-CoV-2 infection rate among individuals aged >80 in Italy."

Credit: 
PLOS

Changes needed to prevent controversial pharmaceutical deals

New research from the University of East Anglia (UEA) recommends changes to the system which sees drug companies strike deals with competitors to stop them producing cheaper generic alternatives.

These 'pay-for-delay' deals involve a payment from a branded drug manufacturer to a generic maker in order to delay market entry. In return for withdrawing its challenge, the generic firm receives a payment and/or a license authorizing it to enter the market at a later date, but before the expiration of the patent itself.

Such deals may block entry by other generic firms and have been challenged by competition authorities in Europe and the US on grounds of being anticompetitive. They can cost consumers and health systems millions by delaying the introduction of cheaper generic drugs for several years.

Dr Farasat Bokhari, Dr Franco Mariuzzo and Dr Arnold Polanski, of UEA's School of Economics and Centre for Competition Policy, develop a model of generic entry and patent litigation to show that the branded firm can pay off the first generic challenger and then ward off entry by second or later challengers by threatening to launch an authorized generic via the first paid-off challenger. The model captures the essential features of market entry rules for drugs and the patent litigation in both Europe and the US.

Compared to the current first-filer system in the US, where generic exclusivity is awarded to the first generic applicant, the researchers endorse a switch to a system that instead rewards the first successful challenger, which they say will result in fewer pay-for-delay deals.

Publishing their findings today in the Journal of Economics & Management Strategy, they also recommend preventing a branded firm from launching a pseudo or authorized generic against an independent generic that wins patent litigation, as this will prevent pay-for-delay deals for weak patents.

They advise that competition authorities should be cautious about using payment to a generic firm as a workable surrogate to measure the strength of a patent. This is because the payment depends on other factors as well, and therefore low payment does not necessarily mean that the underlying patent is strong and no harm has been caused to the consumers by the pay-for-delay deal.

Dr Bokhari said: "While pay-for-delay deals may be beneficial to some extent, in that they might save courts and administrative bodies, such as patent offices, time and effort, they allow branded drug firms to charge monopoly prices and in a typical deal there may be several years delay in a cheaper version becoming available.

"Investigation and fines can be important in deterring such deals. However, the more important policy question is what can be done to prevent such entry limiting agreements in the first place?

"One also has to ask why such deals are stable in the first place. If a branded firm pays the generic firm to stay out of the market and they accept the deal, what stops the next generic drug maker knocking on the branded firm's door, looking for a similar payoff? And if they do, how much do they have to pay and how can the original deal be profitable?

"The late generic challengers can be credibly threatened that even if they succeed in invalidating the patent and enter, the branded firm will launch the authorized generic prior to their entry and will capture a large portion of generic profits. Therefore, it is important that the branded firms' ability to launch authorized generics be legislatively limited."

Previous studies have found that a pay-for-delay deal can cost US consumers as much as $3.5 billion per year - with prices dropping by as much as 75% after generic entry - and can delay generic entry to the market by up to five years.

Credit: 
University of East Anglia

Study quantifies China's chronic health burden for the first time

University of Melbourne researchers have, for the first time, quantified the toll that having multiple chronic diseases takes in China, a finding that could have significant implications for the country's economic and health systems.

Researchers say the study is also timely, as COVID-19 has placed further pressure on China's public health emergency management system.

Published in The Lancet Global Health, the study is the first national longitudinal data set of its kind. Researchers found multimorbidity - two or more mental or physical chronic non-communicable diseases (NCDs), such as stroke and cancer - was associated with higher health service use and greater financial burden.

Multimorbidity increased with age, female gender, higher per capita household expenditure and higher educational level. However, it was more common in poorer regions than in the most affluent ones.

The study used data from the three waves of the China Health and Retirement Longitudinal Study conducted in 2011, 2013 and 2015 with Chinese residents aged 45 years and older. Researchers analysed data from 11,817 respondents.

Overall, 62 per cent of participants had physical multimorbidity in 2015. The study concluded that concerted efforts were needed to reduce health inequalities due to multimorbidity and its adverse economic impact.

The Chinese Government is working towards universal health coverage by 2030, and around 1.2 billion of China's 1.4 billion citizens are already covered by one of three social health insurance programs.

However, low levels of service coverage for some beneficiaries and high levels of patient cost-sharing from out-of-pocket fees for health services have raised concerns about the lack of adequate financial protection for patients with NCDs.

First Author Dr Yang Zhao, a researcher at University of Melbourne's Nossal Institute for Global Health, said China's ongoing social health insurance reforms must reduce out-of-pocket spending for patients with multimorbidity.

"Concerted efforts are needed to reduce health inequalities that arise due to multimorbidity, and its adverse economic impact in population groups," Dr Zhao said. "Social health insurance reforms must place emphasis on reducing out-of-pocket spending for patients with multimorbidity to provide greater financial risk protection."

COVID-19 further complicates the situation. Dr Zhao said a national effort coordinated by the Chinese government has helped to contain its spread.

"Some evidence suggests those with multimorbidity are more susceptible to COVID-19 and more likely to be at risk of severe cases and poor outcomes. However, that situation is temporary," Dr Zhao said. "The Chinese government has made unprecedented efforts and invested enormous resources and these containment efforts have stemmed the spread of the disease."

University of Melbourne Nossal Institute for Global Health senior lecturer and senior author, Dr John Tayu Lee, said chronic conditions were a major contributor to China's health burden, outcome inequalities and economic burden. He said this was likely to increase rapidly with an ageing population and high levels of NCD risk factors.

"Multimorbidity is costly to individuals and health systems," Dr Lee said. "Disease-specific guidelines are inadequate for the effective management of individuals with multimorbidity and new clinical guidelines for multimorbidity are needed in China."

Credit: 
University of Melbourne

Immunity to coronaviruses: What do we know so far?

A new review discusses the findings from over 40 studies on coronavirus immunity and what they could mean for the COVID-19 pandemic.

Written by top UK virologists, the article discusses the existing knowledge about immune responses to SARS-CoV-2 and other coronaviruses, and how this could be used to inform virus control strategies. The review, which is free to read in the Journal of General Virology (JGV), collates the available scientific evidence in a number of key areas, including how long immunity to coronaviruses lasts and the prospect of antibody testing.

In the review, Professors Paul Kellam and Wendy Barclay from Imperial College London examine what is so far known about immunity to coronaviruses including SARS, MERS and the four strains of seasonal coronaviruses that circulate in humans every winter. The article goes on to suggest that SARS-CoV-2 could become the fifth seasonal coronavirus with epidemics of the virus occurring over the next several years.

"SARS-CoV-2 is a new virus in humans and because of this we are having to learn quickly many of its basic properties. In the absence of such data right now, we can try to make predictions about the immune response to SARS-CoV-2 by re-examining what we know about the two epidemic coronaviruses of humans, SARS and MERS, and the four seasonal human coronaviruses. We need to be cautious about inferring too much, but it is a good place to start.

"We do not really know what happens on the pathway of a new coronavirus in humans becoming an endemic seasonal infection, but it could be that when the four seasonal coronaviruses first jumped from animals into humans they were much like SARS-CoV-2 in their transmission and pathogenesis. Over time, as population immunity to the seasonal coronaviruses became widespread, the amount of severe disease probably declined. However, seasonal coronaviruses can still cause pneumonia in some people," said Professor Kellam.

A number of factors, including severity of disease, influence how long antibody protection against both SARS and MERS lasts. Studies have shown that antibody protection wanes over time. For seasonal coronaviruses, where disease is mild, there have even been reports of reinfection after as little as 80 days. "We need to find out many things about SARS-CoV-2 immunity, such as how good the immune response is and how long it lasts. We also need to understand whether people with mild or asymptomatic infections develop a strong or weak immune response, and what measurable properties of immunity predict protection from infection. When we know more about these things, we will be better able to understand how SARS-CoV-2 infections will continue over time. However, vaccines are not infections, so it is likely that some of the vaccine candidates will be better at inducing long-lasting immunity and protection from infection," said Professor Kellam.

The authors also discuss the currently available antibody tests for SARS-CoV-2 and explain the differences between these tests, their accuracy and limitations. Knowing the level of immunity to SARS-CoV-2 in the population could be key in controlling the spread of disease and understanding how many people are at risk of infection.

"Understanding immune responses to these viruses is on many people's minds: from the public hearing about vaccines, testing and antibodies, to policy makers and scientists working on or with an interest in the current pandemic," said Alain Kohl, Deputy Editor-in-Chief of JGV. Editor-in-Chief Paul Duprex added: "This review gives an up-to-date and superbly researched overview of this field. Many people will find areas or topics of interest in this article that should help them understand the important discussions going on. We are delighted to see it published in the journal."

Credit: 
Microbiology Society

The genome of chimpanzees and gorillas could help to better understand human tumors

A new study by researchers from the Institute of Evolutionary Biology (IBE), a joint centre of UPF and the Spanish National Research Council (CSIC), shows that, surprisingly, the distribution of mutations in human tumours is more similar to that of chimpanzees and gorillas than to that of humans.

The article, which analyses cancer from the evolutionary point of view, is published today, 19 May, in Nature Communications. It was led by Arcadi Navarro and David Juan and involved the researchers Txema Heredia-Genestar and Tomàs Marquès-Bonet.

Mutations are changes that occur in DNA. They are not distributed evenly throughout the genome: some regions accumulate more and others fewer. Although mutations are common in healthy human cells, cancer cells display a greater number of genetic changes, and during the development of cancer, tumours rapidly accumulate a large number of mutations. Previous studies, however, had made the surprising observation that tumours accumulate mutations in very different regions of the genome from those normally observed in humans.

Now, thanks to the data from the project PanCancer, a research team from the IBE has compared the regions of the genome that accumulate more and less mutations in tumour processes, in the recent history of the human population, and in the history of other primates. The results of this new study reveal that the distribution of mutations in tumours is more like that in chimpanzees and gorillas than in humans.

"To date, it was thought that the genetic differences we find when we compare tumours and healthy humans could be caused by the 'abnormal' way tumours have of accumulating mutations. In fact, we know that tumours rapidly accumulate a large number of mutations and that many of their genome repair mechanisms do not work well", comments Txema Heredia-Genestar, first author of the study who recently completed his PhD at the IBE. "But now, we have discovered that many of these genetic differences have to do with our evolutionary history".

The distribution of mutations in humans, skewed by population events

When an individual's genome is sequenced, a small number of new mutations (around 60) is observed compared with the parents' genomes; the same holds for the parents with respect to the grandparents, and so on with each previous generation. As a result, a person carries approximately three million mutations that represent the evolutionary history accumulated over hundreds of thousands of years. Of these, a few are recent and most are very old.
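The figures above can be checked with a back-of-the-envelope sketch (not the study's calculation; the generation count below is an assumed round number chosen only to make the arithmetic meet):

```python
# ~60 new mutations per generation, accumulated along a lineage, reaches
# the ~3 million mutations mentioned in the text after about 50,000
# generations. That generation count is an assumption for illustration.
new_per_generation = 60
generations = 50_000
total = new_per_generation * generations
print(total)  # 3000000
```

The real picture is more involved (variants are inherited along many ancestral lineages, not a single chain), but the orders of magnitude are consistent with a few recent mutations sitting on top of millions of very old ones.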

However, when tumour mutations are analysed, what is seen are just the mutations that have taken place during the tumour process, since the analysis does not take into account the information on populational history.

"We have seen that the distribution of mutations in the human genome is skewed because of human evolutionary history", Heredia-Genestar explains. The way a tumour accumulates mutations is the same way a human cell accumulates them. "But we do not see this in the human genome, because we have had such a complicated history that our distributions of mutations have changed, and this has deleted the signals we should have", he adds.

Throughout history, the human population has suffered drastic declines and has even repeatedly been on the verge of extinction. This phenomenon is known as a bottleneck, and it causes humans as a species to have very little diversity and fewer mutations: they are very similar to each other. In fact, chimpanzees are four times more genetically diverse than humans.

Therefore, the overall way a cell accumulates mutations can be observed in chimpanzees because they have not undergone these population events. The study concludes that, to understand how mutations accumulate in human cells, which is important for studying tumours, it is more useful to look at how they accumulate in other primates than in human populations, whose signal was destroyed by population events.

"Only cancers, like chimpanzees and gorillas, show the complete mutation landscape of a normal human cell. It is we humans, with our turbulent distant past, who display a distorted distribution of mutations", adds Arcadi Navarro, ICREA research professor at the IBE, full professor at UPF and co-leader of the study.

The research suggests that the conservation and study of the great apes could be highly relevant to understanding human health. David Juan, co-leader of the study, concludes that "in the particular case of the development of tumours, other primates have proved to be a better model for understanding how tumours develop at genetic level than humans themselves. In the future, our closest relatives could shed light on many other human diseases".

Credit: 
Universitat Pompeu Fabra - Barcelona

Weight loss surgery may alter gene expression in fat tissue

Altered gene expression in fat tissue may help explain why individuals who have regained weight after weight loss surgery still experience benefits such as metabolic improvements and a reduced risk of type 2 diabetes. The findings come from a study published in the Journal of Internal Medicine.

The study included women who underwent weight loss surgery; gene analyses were conducted before surgery and again two and five years afterwards. Analyses were also conducted in women who did not undergo surgery.

Most gene expression changes in fat tissue occurred during the first two years after surgery; however, a subset of genes encoding proteins involved in inflammation displayed a continued decrease in expression over five years (during weight regain).

Credit: 
Wiley

Deep learning: A new engine for ecological resource research

image: Ecological resource areas that deep learning has already transformed (the water) and areas whose problems remain to be solved (the mountain)

Image: 
©Science China Press

Ecological resources are an important material foundation for the survival, development, and self-realization of human beings. In-depth and comprehensive research and understanding of ecological resources are beneficial for the sustainable development of human society. Advances in observation technology have improved the ability to acquire long-term, cross-scale, massive, heterogeneous, and multi-source data. Ecological resource research is entering a new era driven by big data.

Deep learning is a big-data-driven machine learning method that can automatically extract complex, high-dimensional, nonlinear features. Although deep learning has achieved better performance in big data mining than traditional statistical learning and machine learning algorithms, huge challenges remain when processing ecological resource data, including multi-source/multi-meta heterogeneity, spatial-temporal coupling, geographic correlation, high-dimensional complexity, and low signal-to-noise ratio. A recent study clarified these frontier issues.

The related research paper, entitled "The Application of Deep Learning in the Field of Ecological Resources Research: Theory, Methods, and Challenges", has been published in Science China Earth Sciences. Prof. GUO Qinghua and Ph.D. student JIN Shichao from the Institute of Botany, Chinese Academy of Sciences, are co-first authors, with GUO Qinghua as corresponding author. The teams of Prof. LIU Yu from Peking University and Prof. XU Qiang from Chengdu University of Technology collaborated on the research.

Deep learning has made significant progress in many fields with the accumulation of data, the improvement of computing power, and the progress of algorithms. This study focuses on the application of deep learning in the field of ecological resources. The main contents include:

1) An overview of the history, development and basic structure of deep learning (Figure 1). The relationships between ecological resource big data research and deep learning structures represented by convolutional neural networks, recurrent neural networks, and graph neural networks were analyzed (Figure 2).

2) The main tasks of deep learning, common public data sets, and tools in ecological resources were summarized (Figure 2).

3) The application of deep learning in plant image classification, crop phenotyping, and vegetation mapping was demonstrated. The application ability and potential of deep learning on structured and unstructured ecological data were analyzed.

4) The challenges and prospects of deep learning in the application of ecological resources were analyzed (Figure 3), including standardization and sharing of data, construction of crowdsourcing collection platform, interpretability of deep neural network, hybrid deep learning with domain knowledge, small sample learning, data fusion, and enrichment and intelligence of applications.
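As a minimal, self-contained illustration of the convolution operation at the core of the convolutional neural networks mentioned in point 1 (a generic sketch, not the authors' models), the following applies a hand-written vertical-edge kernel to a tiny synthetic image; a CNN's first layer learns this kind of feature-extraction filter automatically from data:

```python
import numpy as np

# A tiny 5x5 "image" whose right half is bright: a single vertical edge.
image = np.zeros((5, 5))
image[:, 3:] = 1.0

# Hand-written vertical-edge kernel; a trained CNN would learn such
# filters from data rather than have them specified by hand.
kernel = np.array([[-1.0, 0.0, 1.0]] * 3)

# 'valid' 2D cross-correlation, the basic computation of a conv layer:
# slide the kernel over the image and sum the elementwise products.
kh, kw = kernel.shape
out = np.array([
    [float((image[i:i + kh, j:j + kw] * kernel).sum())
     for j in range(image.shape[1] - kw + 1)]
    for i in range(image.shape[0] - kh + 1)
])
print(out)  # responds strongly where the window straddles the edge
```

Stacking many such learned filters, with nonlinearities and pooling between them, is what lets CNNs classify plant images or map vegetation from remote-sensing data.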

This study explored the relationship between deep learning and ecological resource research. It is of great significance for connecting the technological frontier of computer science with the classical theoretical science in the field of ecological resources. This connection will contribute to the establishment of a new paradigm of theoretical discovery and scientific research in the era of ecological resources big data.

Credit: 
Science China Press

Promoting temporary contracts fails to have the desired effect of increasing employment

image: Researcher at UPV/EHU's Department of Applied Economics V.

Image: 
Tere Ormazabal

With the aim of making the labour market more flexible, between 1990 and 2010 most of the economies, the European ones in particular, adopted various measures designed to reform their labour markets, which involved making it cheaper to dismiss permanent employees and encouraging the use of temporary contracts. "These reforms were based on the belief that the serious unemployment problems facing these economies were due to the existence of various inflexibilities in the labour market which were hampering a rapid adjustment of these markets when responding to economic shocks," explained Josu Ferreiro, lecturer in the Department of Applied Economics V of the UPV/EHU's Faculty of Economics and Business.

Ferreiro, together with his department colleague Carmen Gómez and Philip Arestis, a University of Cambridge lecturer, studied the effect these measures were having on the labour markets of eleven European countries over a 25-year period, from 1988 to 2012. "Dismissal costs were cut by more than half compared with what the legislation of the 1980s stipulated, both in terms of the number of days' pay awarded per year worked and in terms of the cap on the number of monthly salaries paid," said the researcher.

Studies had already begun to be published suggesting that these measures had failed to achieve the desired effect and that there was no clear demonstration that more jobs had been created. "We decided to analyse the repercussions that this reduced employment protection was having not only on total employment but also on permanent employment and temporary employment. We studied each of these aspects separately," said Ferreiro.

The results of the analysis were clear, "clearer than what we had expected to find", said the researcher. And the fact is that they found that "the changes introduced to reduce employment protection, in other words, by making it cheaper to dismiss people and by simultaneously promoting temporary contracts, have had no effect on the total employment rate, because what has happened is that temporary employment has increased a lot whereas indefinite employment has fallen hugely. The evolution in employment depends on economic growth alone, and only a higher rate of economic growth generates an increase in employment".

The clearest effect that they in fact found on the job market as a result of the labour reforms was "the restructuring of employment that has taken place. A segmentation has taken place which means we can talk about two categories of workers: those who are on fixed or permanent contracts, and those on temporary contracts. They almost operate as if they were different job markets where the possibility of people on temporary contracts securing an indefinite contract is very remote. What is more, the conditions of work of temporary contracts are more precarious, firstly because the contracts are shorter and secondly because they pay less," he said.

Overreaction of layoffs during the Covid-19 crisis

Ferreiro is in no doubt that companies have been the major beneficiaries of the reforms: "They find a workforce that is much more manageable, more flexible and also cheaper." So much so that right now the rate of temporary employment has reached unprecedented levels that this economist regards as "excessive". He recalls that when the economy began to revive after the economic crisis, "there was an upturn in employment once again, but the employment created was of a temporary nature," he said.

However, the high rate of temporary work means that the employment that is created is highly volatile: a lot of jobs are destroyed the moment economic activity slows down or falls. "And that is what is happening in the crisis we are going through right now: the effect on employment may be much greater than the actual effect on economic activity due to the great flexibility of the labour market. Just as the health system is not equipped to address a huge increase in the number of patients all at once, neither are our economic systems equipped to absorb the fact that all of a sudden, within a four-month period, 20-30% of people lose their jobs," he reflected.

Ferreiro regards this situation as an opportunity for States to change legislation on employment protection: "Within the short to medium term, countries will probably start to rethink this problem of employment legislation more seriously and to consider the fact that these measures implemented in the past have not had the desired effects and that we have to go for models in which an attempt is made to stabilise employment much more."

Credit: 
University of the Basque Country

NUI Galway study compares the health of Irish children to those across Europe and Canada

A new report, Spotlight on Adolescent Health and Well-being, published today by the WHO Regional Office for Europe, compiles extensive data on the physical health, social relationships and mental well-being of 227,441 schoolchildren aged 11, 13 and 15 from 45 countries. Irish children rank low on substance use, such as smoking and drinking alcohol, and high on physical activity. Ireland also ranks high for problematic social media use.

The report presents the comparative international findings of the Health Behaviour in School-aged Children (HBSC) survey, which is co-ordinated by the WHO and undertaken every four years. The Irish arm of this study is led by Professor Saoirse Nic Gabhainn in the Health Promotion Research Centre at NUI Galway.

In this new report, Irish 11, 13 and 15 year olds are compared to those in 44 other countries across Europe and North America. Key comparative findings show that:

Irish children rank highly for eating breakfast and low for sugar-sweetened soft drink consumption at all ages. There have been significant reductions in sweets and soft drink consumption since 2014

Ireland ranks low at all ages for reported tobacco and alcohol use

Ireland ranks high relative to other countries in reported vigorous physical activity

Life satisfaction has significantly reduced since 2014, and Ireland ranks low for life satisfaction among 15-year olds

Ireland ranks highly for problematic social media use at all ages, and among 13 and 15 year olds, Ireland ranks highly for reports of having been cyberbullied.

The Irish Study

The Irish survey was carried out by the Health Promotion Research Centre at NUI Galway and was the sixth round of data collection in Ireland. The overall study aims to gain new insights, and increase our understanding of young people's health and wellbeing, health behaviours and their social context, both nationally and internationally.

As well as serving a monitoring and knowledge-generating function, one of the key objectives of HBSC has been to inform policy and practice, with the Irish section of the study being funded by the Department of Health. The cross-national survey covers diverse aspects of adolescent health and social behaviour, including self-assessment of mental health; body image; dietary habits; physical activity; school context; relationships with families and peers; tobacco, alcohol and cannabis use; bullying and injuries; and sexual health (for those aged 15 and above only). A special focus on online communication was included in the most recent HBSC survey, to better understand the expanding role of digital technology in young people's lives.

Kate O'Flaherty, Head of Health and Wellbeing at the Department of Health said: "This latest international study enables us to compare our progress in the area of child health and wellbeing with that of 45 other countries; our European neighbours and Canada. This study provides valuable international behavioural information recorded shortly before the onset of the Covid-19 pandemic; the next study will allow us to gauge the effect of the pandemic on teenage behaviour, health and wellbeing.

"There are many areas where Ireland is doing well, for example our low smoking rates, low consumption of alcohol and sugar-sweetened drinks and comparatively good levels of physical activity. The areas of mental wellbeing and life satisfaction were comparatively less positive, and while there is already a lot of good work underway between Government Departments, agencies and other partners to address this, it will be of increased priority as we support wellbeing and resilience in the response to the Covid-19 pandemic."

Professor Saoirse Nic Gabhainn from the Health Promotion Research Centre at NUI Galway, and the Principal Investigator of the Irish HBSC study, commented on the Irish findings within the report: "This study provides valuable insight into the health and wellbeing of children in Ireland, and how we compare with children in other countries. It is positive to note we retain comparatively low rates of substance use and high rates of physical activity. The improvements in dietary behaviour in relation to lower intake of sweets and soft drinks are very welcome. However, there are also some areas that require further effort. For example, compared to other countries, Irish children report higher levels of problematic social media use than most countries and rates of cyberbullying are of concern. It is clear that more work is required to address the reductions in life satisfaction across all age groups."

Key Irish findings:

Health promoting behaviours

Daily breakfast consumption: Irish children ranked within the top 5 countries for the proportion of children reporting daily breakfast consumption

Meeting physical activity recommendations of at least one hour of moderate to vigorous physical activity (MVPA) daily: When compared to other countries, Ireland ranks in the top 10 for the proportion of 11 and 13 year-old boys and girls meeting physical activity recommendations. At all ages, a larger proportion of boys report meeting the recommendations than girls.

Risk-taking behaviour

Initiation of risk behaviours: In Ireland, the proportion of 11 and 13 year-old children reporting initiation of cigarette smoking and alcohol use is low relative to other countries

Drinking behaviours and tobacco use in Ireland are continuing to improve, both over time and relative to other countries

Risky sexual behaviour: Girls in Ireland rank among the top 10 countries for not using the contraceptive pill or condom at last intercourse.

Mental Health and Well-being

Life satisfaction: Irish 15-year olds ranked within the bottom 2 countries for life satisfaction

Life satisfaction has significantly reduced in all age groups of Irish children since 2014

Ireland ranks low on reports of symptoms (stomachache, backache, nervousness and dizziness) at age 11

Girls at age 13 and 15 are more likely than boys to report multiple symptoms

High family affluence is related to better self-rated health, higher life satisfaction and lower rate of multiple symptoms.

Social interaction with family and peers

Perceived family support: Ireland ranked within the bottom ten countries for the proportion of 11 and 13 year-old children reporting high family support

Perceived family and peer support: both have improved among 13 year-old girls since 2014.

Problematic social media use: Across all ages, Ireland ranked within the top 10 countries for the proportion of children categorised with problematic social media use

Bullied others at school: Ireland (13 and 15 year-olds) ranked within the bottom 10 countries for the proportion of children who report bullying others at school

Cyberbullying: Compared to other countries, Ireland ranks among the top 10 countries for prevalence of cyberbullying among older children (13 and 15 year-olds).

Data collected for the study are based on surveys completed by thousands of adolescents, thereby ensuring that their voices and concerns can be taken fully into account when the WHO frames its European strategies, policies and actions for improving child and adolescent health and well-being. The study feeds into a growing body of evidence calling for more effective and targeted interventions by governments and policy makers to tackle the effects of social, health and gender inequalities among young people in Europe.

Credit: 
University of Galway

Tick-borne encephalitis spread across Eurasia with settlers and their pets and prey

Researchers from Sechenov University together with colleagues from several Russian institutes analysed data on the RNA structure of tick-borne encephalitis virus. Much larger than in previous studies, the data volume of the new study allowed them to estimate the age of the virus subtypes and track its spread in Eurasia. The results of the study were published in the journal Viruses.

Tick-borne encephalitis is common in Central and Eastern Europe and a wide band in southern Siberia and the Far East. This disease is dangerous due to the ability of the virus to penetrate the brain and spinal cord, causing motor disorders, cognitive impairments and, in severe cases, paralysis and death. Every year in Russia, 1,500-2,000 people are infected, with about 30% developing neurological complications and 20-100 people dying.

The pathogen belongs to the genus Flavivirus (it also includes Zika virus and the virus that causes dengue fever) and is transmitted mainly through tick bites. There are three subtypes of the virus: Far Eastern, Siberian and European. Each of them is predominant in the region after which it is named, although this division is quite coarse - for example, cases of infection with the Siberian subtype were observed in the Baltic States and Sakhalin, and with the European one - in South Korea and the Altai Mountains. In recent years, the spread of the virus has expanded to northern areas (Kola Peninsula, Arkhangelsk Region) and mountainous regions (in Central Europe and Italy). In 2019, the first cases of infection were recorded in the UK and the Netherlands. To respond promptly to the emergence of the virus in new territories, it is necessary to understand what affects its spread and evolution. Existing studies provide different, sometimes contradictory results, but new data collected in recent years can clarify the situation.

The authors of the article used GenBank - a database that stores more than 200 million nucleotide sequences of RNA and DNA of various species. The data are supplied by scientists from different countries, and their volume is constantly growing, as is the number of sequences describing the RNA of the tick-borne encephalitis virus - in ten years that number has increased fivefold.

Using computer algorithms, the researchers compared the RNA of viruses found in different years on the territory of several countries. Knowing the 'distance' (the proportion of divergent nucleotides) between samples and their collection dates, one can estimate the time of divergence of species or the division of a species into subtypes. This method, called the molecular clock, is based on the assumption that the sequence of nucleotides in a single species changes at an approximately constant rate. Using this method, the scientists estimated the age of the most recent common ancestor for each of the subtypes. The result (about 700 and 900 years for the Siberian and Far Eastern subtypes, respectively) is consistent with earlier studies, while the age of the European subtype (about 1,600 years) was estimated for the first time after the virus was spotted in the Netherlands.

The researchers also tried to find out how genetically similar viruses of the same subtype ended up in different countries at a distance of thousands of kilometres from each other. There are several possible explanations. First, viruses (or ticks infected with them) can travel long distances with animals, such as migratory birds or bats. Second, human activity can contribute to the spread of the virus: transportation of livestock or introduction of animals suitable for hunting in new territories.

'The most important result of the work was that most of the virus spreading events occurred in the last three or four centuries, and in many cases we have observed the transfer of the virus for thousands of kilometres in the last 50-100 years, and viruses in Europe have completely mixed over the last 100-200 years,' said Alexander Lukashev, one of the authors of the work, Director of the Institute of Medical Parasitology and Tropical Medicine at Sechenov University. 'This allows us to consider tick-borne encephalitis as a highly dynamic disease, even as an emerging disease in many regions, and to think about an anthropogenic factor (the spread of infected ticks as a result of transportation of domestic and wild animals) as one of the main mechanisms behind the expansion of the virus.'

In addition, the scientists proposed an algorithm that simulates the sample-selection procedures used in earlier studies. With its help, the authors of the article showed that the differing results of previously published studies are well explained by which sequences each study chose to include.

Credit: 
Sechenov University

Study identifies the mechanism by which eating fish reduces risk of cardiovascular disease

image: Núria Amigó and Xavier Correig, researchers of the Universitat Rovira i Virgili, carried out the study together with researchers from Harvard Medical School, headed by Samia Mora.

Image: 
URV

A study by researchers from the Universitat Rovira i Virgili (URV) and Harvard Medical School has found that consuming omega 3 primarily through fish, but also in supplements containing these fatty acids, can modulate lipoproteins, that is, the particles that transport lipids through the blood, and can reduce the risk of cardiovascular disease. The association between the consumption of omega 3 and the reduction in the risk of suffering cardiovascular events has been demonstrated through the analysis of lipoprotein samples from 26,034 women, the largest and most detailed study ever carried out. The study is particularly important because cardiovascular disease is the most prevalent cause of death, with 1 in 3 people dying from cardiovascular events.

The research has been led by Núria Amigó, CEO of the URV spin off Biosfer Teslab and member of the Metabolomics Interdisciplinary Laboratory (MIL@b) - Metabolomics Platform, which was jointly created by the URV and the CIBERDEM and which is part of the Pere Virgili Health research Institute. Xavier Correig, professor from the Department of Electronic, Electrical and Automatic Engineering and director of the MIL@b - Metabolomics Platform, has participated in the study together with researchers from the Center for Lipid Metabolomics, Division of Preventive Medicine at the Brigham and Women's Hospital (Harvard Medical School) headed by Samia Mora.

Up to now it had been shown that a high consumption of omega 3 fatty acids was associated with lower levels of triglycerides in the blood. However, it had also been related to an increase in LDL cholesterol, that is, low-density cholesterol transported by lipoproteins, also known as bad cholesterol. LDL cholesterol increases the risk of cardiovascular diseases because it can accelerate the formation of atherosclerosis, that is, the process by which the arteries harden and lose their elasticity.

However, the study has found that the increase in LDL cholesterol associated with fish consumption is driven principally by the cholesterol transported by the largest LDL particles, which are less atherogenic, and not by an increase in the total number of LDL particles. Fish consumption was also associated with a decrease in the number of triglycerides transported by every type of lipoprotein, which helps protect the individual from heart disease.

The 3 types of omega 3 fatty acids studied, namely α-linolenic acid (ALA), docosahexaenoic acid (DHA) and eicosapentaenoic acid (EPA), are present in fish and other foods and are essential to human physiology, and the study has found that they differ in their association with the risk of cardiovascular disease. It found that there was no increase in the smallest LDL lipoproteins that transport cholesterol; instead the increase was among the largest LDL lipoproteins, which are not associated with the risk of heart disease. There was a decrease in all of the triglyceride-transporting particles and, moreover, the average size of the HDL and LDL particles increased, a phenomenon that is associated with increased protection from cardiovascular illness.

These conclusions have been obtained through mathematical modelling of the consumption of fish and omega 3 (both as a whole and of the different types ALA, DHA and EPA) and the profile of lipoproteins. The results were obtained by Nuclear Magnetic Resonance, "which can go further than simply analysing triglyceride and cholesterol content and can quantify the number and size of the different subtypes of plasmatic lipoprotein", explained Núria Amigó. She described how among the LDL particles that transport cholesterol "it is the smallest that are associated with a future cardiovascular event".

Another important element of the study is that the mathematical models used to evaluate the association between the consumption of fish and the reduction in cardiovascular risk have isolated other nutritional factors that affect the result, such as the consumption of other foods, the concentration of omega 3 according to the origin of the fish (wild or farmed) and traditional risk factors such as a sedentary lifestyle, age, body mass index and smoking.

The study analysed a cohort from the Women's Health Study by the Brigham and Women's Hospital, affiliated to Harvard Medical School, and involved the use of Nuclear Magnetic Resonance to characterise the plasma of 26,034 women with an average age of 53 (most were between 48 and 59).

Having confirmed that the risk factor associated with lipids, cholesterol concentration, triglycerides and the different subtypes of particles is modulated by the consumption of omega 3 fatty acids, "we now need to find out if the consumption of fish is associated with lower mortality from both cardiovascular diseases and other causes", Amigó explained, because "although the risk is lower in terms of lipids, we need to look at other pro-inflammatory factors and questions such as exposure to heavy metals".

Credit: 
Universitat Rovira i Virgili

Stroke rates among COVID-19 patients are low, but cases are more severe

DALLAS, May 21, 2020 -- The rate of strokes in COVID-19 patients appears relatively low, but a higher proportion of those strokes are presenting in younger people and are often more severe compared to strokes in people who do not have the novel coronavirus. Meanwhile, global rates for stroke hospitalizations and treatments are significantly lower than for the first part of 2019. These findings come from four separate research papers published this week in Stroke, a journal of the American Stroke Association, a division of the American Heart Association.

In "SARS-CoV-2 and Stroke in a New York Healthcare System," researchers reported key demographic and clinical characteristics of patients who developed ischemic stroke associated with the COVID-19 infection and received care within one hospital system serving all 5 boroughs of New York City.

During the study period of March 15 through April 19, 2020, out of 3,556 hospitalized patients with diagnosis of COVID-19 infection, 32 patients (0.9%) had imaging-proven ischemic stroke. They compared those 32 patients admitted with stroke and COVID-19 to those admitted only with stroke (46 patients) and found that the patients with COVID-19:

tended to be younger, average age of 63 years vs. 70 years for non-COVID stroke patients;

had more severe strokes, average score of 19 vs. 8 on the National Institutes of Health Stroke Scale;

had higher D-dimer levels, 10,000 vs. 525, which can indicate significant blood clotting;

were more likely to be treated with blood thinners, 75% vs. 23.9%;

were more likely to have a cryptogenic stroke in which the cause is unknown, 65.6% vs. 30.4%; and

were more likely to be dead at hospital discharge, 63.6% vs. 9.3%.

Conversely, COVID-19 stroke patients were less likely than those stroke patients without the novel coronavirus to have high blood pressure (56.3% vs. 76.1%) or to have a prior history of stroke (3.1% vs. 13%).

The researchers observed that the rate of imaging-confirmed acute ischemic stroke in hospitalized patients with COVID-19 in their New York City hospital system was lower compared to prior reports in COVID-19 studies from China. One reason for the difference might be related to variations in race/ethnicity between the two study populations. In addition, the low rate of ischemic stroke with COVID-19 infection may be an underestimate because "the diagnosis of ischemic stroke can be challenging in those critically ill with COVID-19 infection who are intubated and sedated," said lead study author Shadi Yaghi, M.D., FAHA, of the department of neurology at NYU Grossman School of Medicine in Manhattan.

Yaghi said, "It was difficult to determine the exact cause of the strokes of the COVID-19 patients, however, most patients appeared to experience abnormal blood clotting. Additional research is needed to determine if therapeutic anticoagulation for stroke is useful in patients with COVID-19." The researchers noted that at least one clinical trial is already underway to investigate the safety and efficacy of treatment for active clotting vs. preventive treatment in certain patients with COVID-19 infection presenting with possible clotting indicators.

Yaghi and his coauthors also noted the number of stroke cases with COVID-19 seems to have peaked and is now decreasing. This finding may be related to the overall reduction in COVID-19 hospital admissions, which may be due to social distancing and guidance for people to stay at home. In addition, the number of stroke patients hospitalized during the study period was significantly lower than the same time frame in 2019.

Similar trends are reported in several other studies also published this week in Stroke, reflecting a global disruption of emergency health care services including delayed care and a lower-than-usual volume of stroke emergencies during the COVID-19 pandemic crisis.

In a Hong Kong study, "Delays in Stroke Onset to Hospital Arrival Time during COVID-19," by lead author Kay Cheong Teo, M.B.B.S., researchers compared the stroke onset time to hospital arrival time for stroke and transient ischemic attack (TIA) patients from Jan. 23 to March 24, 2020 (the first 60 days from the first diagnosed COVID-19 case in Hong Kong) to the same time period in 2019. In 2020, 73 stroke patients presented to Queen Mary Hospital compared to 83 in 2019. However, the time from stroke onset-to-arrival time was about an hour longer in 2020 compared with last year (154 minutes vs. 95 minutes). In addition, the number of patients arriving within the critical 4.5-hour treatment window dropped from 72% in 2019 to 55% in 2020.

Also from China, "The impact of the COVID-19 epidemic on stroke care and potential solutions," by lead author Jing Zhao, M.D., Ph.D., detailed survey results from more than 200 stroke centers through the Big Data Observatory Platform for Stroke of China, which consists of 280 hospitals across China. They found that in February 2020, hospital admissions related to stroke dropped nearly 40%, while clot-busting treatment and mechanical clot-removal cases also decreased by 25%, compared to the same time period in 2019. The researchers cited several factors likely contributed to the reduced admissions and prehospital delays during the COVID-19 pandemic, such as lack of stroke knowledge and proper transportation. They also noted that another key factor was patients not coming to the hospital for fear of virus infection.

In a fourth study, "Mechanical Thrombectomy for Acute Ischemic Stroke Amid the COVID-19 Outbreak," by lead author Basile Kerleroux, M.D., researchers in France compared patient data from stroke centers across the country from February 15 through March 30, 2020 to data of patients treated during the same time period in 2019. They found a 21% decrease (844 in 2019 vs. 668 in 2020) in overall volume of ischemic patients receiving mechanical thrombectomy during the pandemic compared to the previous year.

Additionally, there was a significant increase in the amount of time from imaging to treatment overall -- 145 minutes in 2020 compared to 126 minutes in 2019, and that delay increased by nearly 30 minutes in patients transferred to other facilities for treatment after imaging. The researchers said delays may have been due to unprecedented stress on emergency medical system services, as well as primary care stroke centers lacking transfer resources needed to send eligible patients to thrombectomy capable stroke centers within the therapeutic window. They noted stricter applications of guidelines during the pandemic period could also have meant some patients may have not been referred or accepted for mechanical thrombectomy treatment during that time.

Credit: 
American Heart Association

MIPT biophysicists found a way to take a peek at how membrane receptors work

image: MIPT biophysicists explained ways to visualize membrane receptors in their different states. Detailed information on the structure and dynamics of these proteins will enable developing effective and safe drugs to treat many sorts of conditions.

Image: 
Daria Sokol/MIPT Press Office

In a study published in Current Opinion in Structural Biology, MIPT biophysicists explained ways to visualize membrane receptors in their different states. Detailed information on the structure and dynamics of these proteins will enable developing effective and safe drugs to treat many sorts of conditions.

Every second, living cells receive myriads of signals from their environment, usually transmitted through dedicated signaling molecules such as hormones. Most of these molecules are incapable of penetrating the cell membrane, so for the most part such signals are identified at the membrane itself. For that purpose, the cell membrane is equipped with cell-surface receptors that receive outside signals and "interpret" them into a language the cell can understand. Cell-surface receptors are vital for the proper functioning of the cell and the organism as a whole. Should the receptors stop working as intended, communication between cells becomes disrupted, and the organism can develop a medical condition.

GPCRs (G protein-coupled receptors) are a large family of membrane receptors that share a common structure: seven protein helices that cross the membrane and couple the receptor to a G protein situated on the inside of the cell. Interaction of the signal molecule with the receptor triggers a change in the 3D structure, or conformation, of the receptor, which activates the G protein. The activated G protein, in turn, triggers a signal cascade inside the cell, which results in a response to the signal.

GPCR family membrane proteins have been linked to a lot of neurodegenerative and cardiovascular diseases and certain types of cancer. GPCR proteins have also been proven to contribute to conditions such as obesity, diabetes, mental disorders, and others (fig. 2). As a result, GPCRs have become a popular drug target, with a large number of drugs currently on the market targeting this particular family of receptors.

One of the modern approaches to drug development involves analyzing 3D structures of GPCR molecules. But membrane receptor analysis is a slow and extremely laborious process and even when successful, it does not completely reveal the molecule's behavior inside the cell.

"Currently, scientists have two options when it comes to studying proteins. They can either 'freeze' a protein and have its precise static snapshot, or study its dynamics at the cost of losing details. The former approach uses methods such as crystallography and cryogenic electron microscopy; the latter uses spectroscopic techniques," comments Anastasia Gusach, a research fellow at the MIPT Laboratory of Structural Biology of G-protein Coupled Receptors.

The authors of the study demonstrated how combining the structural and the spectroscopic approaches results in "the best of both worlds" in terms of obtaining precise information on the functioning of GPCRs (fig. 3). For instance, the double electron-electron resonance (DEER) and Förster resonance energy transfer (FRET) techniques act as an "atomic ruler", ensuring precise measurements of distances between separate atoms and groups of atoms within the protein. The nuclear magnetic resonance method enables visualizing the overall shape of the receptor molecule, while modified mass spectrometry methods (MRF-MS, HDX-MS) help trace how exposed separate groups of atoms in the protein are to the solvent, thus indicating which parts of the molecule face outwards.

"Studying the GPCR dynamics uses cutting-edge methods of experimental biophysical analysis such as nuclear magnetic resonance (NMR) spectroscopy, electron paramagnetic resonance (EPR) spectroscopy, and advanced fluorescence microscopy techniques including single-molecule microscopy," says Alexey Mishin, deputy head of the MIPT Laboratory for Structural Biology of G-protein Coupled Receptors.

"Biophysicists who use different methods to study GPCRs have been organizing collaborations that have already borne fruit. We hope that this review will help scientists specializing in different methods to find new common ground and work together to obtain a better understanding of how receptors function," adds Anastasia Gusach.

The precise information on how the membrane receptors function and transfer between states will greatly expand the capabilities for structure-based drug design.

Credit: 
Moscow Institute of Physics and Technology

Emerging evidence on genetics of schizophrenia raises hopes for new treatment targets

May 21, 2020 - In recent years, genome-wide association studies (GWAS) have identified many different genetic variants associated with schizophrenia. These genetic discoveries raise the promise of developing urgently needed new treatments targeting the underlying biology and pathophysiology of schizophrenia, according to a special article in the Journal of Clinical Psychopharmacology. The journal is published in the Lippincott portfolio by Wolters Kluwer.

In a translational science update, Rebecca Birnbaum, MD, of Icahn School of Medicine at Mount Sinai, New York, and Daniel R. Weinberger, MD, of Lieber Institute for Brain Development, Johns Hopkins University School of Medicine, Baltimore, review efforts to bridge the gap between new genetic findings and innovative treatments for schizophrenia. "None of the medicines used in psychiatry were initially discovered based on an understanding of the causes or basic mechanisms of psychiatric illnesses," Dr. Weinberger comments.

Dr. Birnbaum adds: "The discovery of genetic risk factors for schizophrenia and other psychiatric diagnoses represent important clues to changing this history, to discovering treatments based on biological mechanisms of causation."

Genetic Associations May Lead to New Treatments for Schizophrenia

The first medications for schizophrenia were discovered serendipitously in the 1950s, with subsequent "me-too" drugs targeting the same limited number of neurotransmitters. While these antipsychotic drugs have benefits in addressing some of the symptoms of schizophrenia, they don't necessarily target all of the potential underlying causes.

With advances in GWAS, researchers are identifying increasing numbers of genetic variants - some relatively common, others rare - associated with schizophrenia. Although the effects of individual genetic variants on schizophrenia risk may be minor, further studies of the genes and pathways they affect might lead to new understanding of the neurobiology of schizophrenia.

Studies of gene expression may provide new insights into the mechanisms by which schizophrenia develops, as well as possible "druggable targets" for new medications. Genes or pathways implicated by both common and rare variants might be especially strong candidates for new drugs. Systems biology approaches - targeting brain functional networks or biological modules affected by schizophrenia risk genes - might be more promising than trying to develop drugs targeting any individual gene.

Neurodevelopmental studies indicate that processes leading to schizophrenia begin very early, suggesting that treatments may be most effective in early life or even the fetal period. Newly discovered genetic associations might also have implications for "precision psychiatry" - using genetic markers to assess individual differences in risk, or to select the most effective treatment for individual patients.

Despite the promise of advances in understanding the genetics of schizophrenia, difficult challenges lie ahead in translating these discoveries into real treatment advances for patients with schizophrenia and other psychiatric disorders. "Translating susceptibility genes into new therapeutic interventions will require extensive investigation and probably some luck," says Dr. Weinberger. "Nevertheless, we are on the threshold of a sea change opportunity in clinical pharmacology."

Credit: 
Wolters Kluwer Health

The European viper uses cloak-and-dazzle to escape predators

image: The viper's zig-zag pattern helps the snake remain undetected; it also provides a warning of the snake's dangerous defense, and it can produce an illusionary effect that may hide the snake's movement as it flees.

Image: 
The University of Jyväskylä/Janne Valkonen

Research from the University of Jyväskylä demonstrates that the characteristic zig-zag pattern on a viper's back performs seemingly opposing functions during a predation event. At first, the zig-zag pattern helps the snake remain undetected. But upon exposure, it provides a conspicuous warning of the snake's dangerous defense. Most importantly, the zig-zag can also produce an illusionary effect that may hide the snake's movement as it flees. The research, published in Animal Behaviour 164 (2020), reveals how a single color pattern can have multiple effects during a predation event, thereby expanding the discussion on protective coloration and anti-predator adaptations.

Protective coloration is one of the simplest but most effective tools that prey species use to evade predators. Typically, different color patterns are useful at different stages of a predation event. Some color patterns are cryptic, obscuring the prey from being detected - think chameleons. Other patterns are aposematic, which blatantly advertise a warning to predators - think wasps. Finally, some patterns can produce optical illusions to startle or confuse predators and give the prey an escape opportunity - think zebras.

But a recent series of experiments, by a team headed by Janne Valkonen and Johanna Mappes at the University of Jyväskylä (Finland), suggests that European vipers (Vipera sp.) can achieve all three tricks with a single color pattern - their characteristic zig-zag.

At first, the zig-zag pattern helps the viper to hide. The researchers hid plasticine models of snakes with different color patterns along paths and noted how often they were detected by people walking the trail. Models with the zig-zag pattern were detected less often than plainly colored models. This is the first confirmation that the viper's zig-zag pattern provides a cryptic function. But even if the viper is detected, the zig-zag can still work its magic - instead of hiding the snake, the pattern now functions to make it more obvious. Previous research has already established that the pattern warns predators about the snake's dangerous bite.

The rapid flickering from the zigs and zags of a fleeing snake can produce a 'flicker-fusion effect' to mammalian predators

The most significant contribution from Dr. Janne Valkonen's study deals with a particular class of illusion generated by the zig-zag pattern. Just as a rapid series of still pictures can produce a smooth animation, the rapid flickering from the zigs and zags of a fleeing snake can fuse into an apparently solid shape.

The team measured the speed of fleeing snakes and calculated the flicker rate of the zig-zag. To an observer, a rapidly changing stimulus (such as a moving zig-zag, or spinning helicopter blade) is perceived as continuous if the flicker rate exceeds a threshold in the visual system.
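The comparison described above reduces to simple arithmetic: how many zig-zag cycles pass a fixed point per second, and whether that exceeds an observer's critical flicker-fusion (CFF) threshold. The sketch below illustrates this logic with hypothetical numbers; the speeds, pattern wavelength, and threshold values are assumptions for illustration, not measurements from the study.

```python
# Illustrative flicker-fusion check (assumed numbers, not study data).
# A moving periodic pattern appears fused into a continuous shape to any
# viewer whose critical flicker-fusion (CFF) threshold is below the
# pattern's flicker rate.

def flicker_rate_hz(speed_m_s, pattern_wavelength_m):
    """Zig-zag cycles passing a fixed point per second."""
    return speed_m_s / pattern_wavelength_m

def appears_fused(flicker_hz, cff_threshold_hz):
    """True if the pattern flickers too fast for this observer to resolve."""
    return flicker_hz > cff_threshold_hz

# Hypothetical values: a snake fleeing at 1.5 m/s with a 2 cm zig-zag period.
rate = flicker_rate_hz(1.5, 0.02)   # 75 Hz
print(appears_fused(rate, 55))      # mammal-like CFF threshold  -> True
print(appears_fused(rate, 120))     # raptor-like CFF threshold  -> False
```

The same flicker rate can thus fall above one predator's threshold and below another's, which is why the illusion would work on mammalian predators but not on a raptor's faster visual system.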

The researchers found that the zig-zag moved quickly enough to produce such a 'flicker-fusion effect' to mammalian predators, although the quicker eyes of a raptor won't be fooled. The effect of this illusion may change the appearance of the moving snake, making it harder to catch. So, like a skilled illusionist, the viper hides by revealing.

The viper's zig-zag seems to be a simple pattern, but it is a masterful illusion that can hide, reveal, and paradoxically achieve both at the same time. This research also resolves an apparent theoretical tension between opposing functions of color patterns. Crypsis and aposematism seem mutually exclusive: one is meant to blend an animal into its surroundings, the other to make it stand out.

However, through the magic of movement and optics, both functions can be gained through the same pattern at different stages in the predation sequence. Furthermore, the one-pattern-to-many-functions relationship between the zig-zag and its antipredator effects implies a far broader scope for the evolution of color patterns and antipredator adaptations than simple one-pattern-to-one-function relations.

Credit: 
University of Jyväskylä - Jyväskylän yliopisto