
Nanoparticles enable efficient delivery of antimicrobial peptides

Announcing a new article publication for BIO Integration journal. In this review article, the authors Yingxue Deng, Rui Huang, Songyin Huang and Menghua Xiong, from South China University of Technology and Sun Yat-sen University (both in Guangzhou, Guangdong, P. R. China), discuss how nanoparticles enable efficient delivery of antimicrobial peptides for the treatment of deep infections.

Antimicrobial peptides (AMPs) are rarely used directly to treat deep infections because of their systemic toxicity and low bioavailability. The authors summarize recent progress in which researchers have employed nanoparticle-based delivery systems to deliver AMPs for the treatment of deep infections.

Nanoparticle-based delivery systems offer a strategy to increase the therapeutic index of AMPs by preventing proteolysis, increasing accumulation at infection sites, and reducing toxicity. In particular, intelligent nanocarriers can achieve selective activation and active targeting at infection sites, improving therapeutic efficacy against bacterial infection while reducing toxicity to normal tissues.

Credit: 
Compuscript Ltd

Study of 630,000 patients unveils COVID-19 outcome disparities across racial/ethnic lines

Researchers at Seattle's Institute for Systems Biology and their collaborators looked at the electronic health records of nearly 630,000 patients who were tested for SARS-CoV-2, and found stark disparities in COVID-19 outcomes -- odds of infection, hospitalization, and in-hospital mortality -- between White and non-White minority racial and ethnic groups. The work was published in the journal Clinical Infectious Diseases.

The team looked at sociodemographic and clinical characteristics of patients who were part of the Providence healthcare system in Washington, Oregon and California. These patients had SARS-CoV-2 tests administered between March 1 and December 31 of 2020.

Of the more than 570,000 patients with known race/ethnicity that were tested:

27.8 percent were non-White minorities

Nearly 55,000 patients tested positive, with minorities representing 50.1 percent

Hispanics represented 34 percent of infections, but only 13 percent of tests

More than 8,500 patients were hospitalized and 1,246 died, with non-White groups representing 56.1 percent and 54.4 percent, respectively.

The study's findings of racial and ethnic distributions of outcomes across the health system tracked with state-level statistics. 

"All minority races/ethnicities faced increased odds of testing positive for SARS-CoV-2 and being hospitalized with COVID-19," said Chengzhen Dai, lead author of the study. "Hispanic patients also exhibited increased morbidity, and Hispanic race/ethnicity was associated with increased odds of in-hospital mortality."

Hispanics were generally younger than White patients and had higher rates of diabetes, but fewer other comorbidities. "The data show major outcome disparities especially among Hispanics, who tested positive at a higher rate, required excess hospitalization and mechanical ventilation, and had higher odds of in-hospital mortality despite younger age, suggesting Hispanic patients are arriving at the hospital with more advanced COVID-19," said Dr. Andrew Magis, corresponding author on the paper and Director of Data Science for ISB's Health Data Science Lab.

ISB, an affiliate of Providence, collaborated with the healthcare system to access and analyze the anonymized electronic health records of the large cohort. 

"This project demonstrates the power of analyzing hundreds of thousands of healthcare records and is a great example of the ongoing multidisciplinary collaboration between ISB and Providence," said ISB Assistant Professor Dr. Jennifer Hadlock, senior author on the paper.

Credit: 
Institute for Systems Biology

The Lancet: Seroprevalence study from Wuhan suggests 6.9% of population had COVID-19 antibodies by April 2020 -- including 40% with neutralizing antibodies that lasted for at least 9 months -- but 82% of cases were asymptomatic

This new study is the first long-term seroprevalence study from Wuhan, China. It involved over 9,000 residents who were tested for antibodies after the Wuhan lockdown was lifted in April 2020, then again in June and in October-December 2020.

532 of 9,542 participants tested positive for antibodies against SARS-CoV-2, which - when adjusted - equated to an estimated seroprevalence of 6.9% in the population. In addition, 82% of participants who tested positive had not experienced any COVID-19 symptoms.

40% of people with antibodies developed neutralising antibodies (those that protect against future infection) in April, and these remained stable for at least nine months, regardless of whether individuals had symptomatic or asymptomatic disease.

The authors say that their findings indicate that a large proportion of the population remains uninfected and mass vaccination will be needed to reach herd immunity to prevent further resurgences of the pandemic.

The first long-term seroprevalence study of residents in Wuhan, China, has found that 6.9% of people in the city had antibodies against COVID-19 in April 2020, and 82% of these people had an asymptomatic infection. Additionally, 40% of people with antibodies developed neutralising antibodies, and these levels did not decrease between April and October-December 2020. The results come from an observational study of 9,542 people published in The Lancet.

The authors say that understanding seroprevalence and how antibody levels change over time in Wuhan will help inform their vaccination strategy, with their findings indicating that mass vaccination is needed to protect against future resurgences of the virus.

The latest seroprevalence study from Wuhan adds to previous seroprevalence studies conducted globally, including in Geneva (Switzerland), Spain, the USA, Iceland and the Netherlands, which have attempted to shed light on the true rate of infection in a population. This is particularly important as the rates of asymptomatic infection are uncertain, with estimates ranging from 6% to 96% globally.

Lead author, Dr Chen Wang, Chinese Academy of Medical Sciences & Peking Union Medical College, China, says: "Assessing the proportion of the population that have been infected with SARS-CoV-2 and who are immune is of utmost importance for determining effective prevention and control strategies to reduce the likelihood of future resurgence of the pandemic. Given that individuals with mild infections might not seek medical care and that asymptomatic individuals are not usually screened, there may be large discrepancies between the reported COVID-19 cases, and actual infected cases, which has been proven by the experiences and data from other countries." [1]

He continues: "Even at the epicentre of the pandemic in China, with more than 50,000 confirmed cases as of April 8, 2020, the estimated seroprevalence in Wuhan remains low, and around 40% of people with antibodies developed neutralising antibodies, suggesting there is still a lack of immunity in the population." [1]

Participants in the study lived across all 13 districts of Wuhan, with all members of a household invited to take part. All ages were included in the study, but people with serious diseases (such as advanced cancer or severe mental illness) were excluded.

Participants completed a questionnaire of demographic and health information, including if they had previously been diagnosed with COVID-19 or had had any COVID-19 symptoms since 1 December 2019. Blood samples were taken to test if antibodies were present in mid-April 2020, mid-June, and between October and December. Infections were classed as symptomatic if a participant reported having had fever and/or respiratory symptoms and was positive for COVID-19 antibodies. The study included 9,542 people from 3,556 families.

Of the 9,542 participants, 532 had antibodies against COVID-19. When adjusted, this equated to a seroprevalence of 6.9% in the population of Wuhan.
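As a rough illustration of why the reported figure differs from a simple division, the crude (unadjusted) rate can be computed directly. This is only a sketch: the study's actual adjustment procedure is not described in this summary.

```python
# Crude (unadjusted) antibody-positive rate from the Wuhan cohort.
positive_tests = 532
participants = 9542

crude_rate = positive_tests / participants
print(f"Crude seroprevalence: {crude_rate:.1%}")  # prints "Crude seroprevalence: 5.6%"

# The study reports 6.9% after adjustment (e.g. for sampling and test
# characteristics); the adjustment method itself is not given here.
```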

The authors found that women had a higher seroprevalence than men, that people aged 66 years or over had a higher seroprevalence than any other age group, that health care workers had a higher seroprevalence than other occupations, and that people who had visited hospital in the past five months had a higher seroprevalence than those who had not (see table 1).

437 (82%) of 532 participants who were positive for antibodies were asymptomatic. The study authors note that this is much higher than past estimates of 40-45% reported worldwide. They say this may be due to recall bias where participants reported their own symptoms five months later, but also say that this is unlikely to overestimate incidence to a large extent in their study because stringent measures were taken in Wuhan to identify every case, and Wuhan residents were vigilant in recording their symptoms during the outbreak.

Around 40% of participants (212/532 people) were positive for neutralising antibodies - those that protect against future infection - in April 2020. The proportion of people who had neutralising antibodies remained stable for the two follow-up periods - with 45% (162/363 people) in June 2020, and 41% (187/454 people) in October-December 2020. In addition, looking at the levels of neutralising antibodies in people's blood using data from 335 people who attended all three blood tests, the authors found that these levels did not significantly decrease over the nine months of the study. However, people who had had asymptomatic COVID-19 had lower levels than people who had confirmed or symptomatic COVID-19 disease.

Co-author, Dr Lili Ren, Institute of Pathogen Biology, Chinese Academy of Medical Sciences & Peking Union Medical College, China, says: "Little is known of the durability of immune responses against SARS-CoV-2 over a long period. In our study, we found that the proportion of participants with antibodies against SARS-CoV-2 was sustained for at least nine months. Importantly, we found that neutralising antibody titres remained stable for at least nine months." [1]

The authors note some limitations to their study, including that they cannot confirm when participants were infected and produced antibodies because most cases were asymptomatic and not confirmed by PCR testing at the outset of their infection. However, they note that there were very few cases of COVID-19 reported in Wuhan between mid-March and April 2020, so assumed infection occurred at least 4 weeks before blood samples were taken.

The authors of a linked Comment, Professor Richard Strugnell and Dr Nancy Wang (who were not involved in the study) from the Doherty Institute, Australia, say the seroprevalence estimate suggests that the number of infections in Wuhan likely exceeded the number of reported COVID-19 cases. They write: "If the seroconversion rate is an accurate reflection of exposure to SARS-CoV-2, the apparent disparity between low case numbers and high seroconversion rate seems to suggest that most seroconverted individuals produced antibodies to SARS-CoV-2 after asymptomatic infection."

They also note that the findings have provided a much deeper understanding of natural seroconversion in a key city in the pandemic, and that the findings underscore the success in controlling the Wuhan outbreak of COVID-19 at a time when testing, tracing, and treatment resources were much less developed: "Efficient global management of COVID-19 will probably succeed or fail on the basis of the immunity induced by natural infection and, especially, vaccination. Given the relative paucity of neutralising antibodies through natural infection, the study by He and colleagues reinforces the need for effective COVID-19 vaccines in the population-level control of the disease. The extraordinary, rapid, and effective control measures implemented in Wuhan might have restricted the spread of the virus, but also reduced naturally-acquired herd immunity by truncating the development of sustained neutralising antibodies."

Although other national and local governments have used alternate and usually less effective strategies to control the spread of COVID-19, even in highly endemic communities the prevalence of disease is usually too low to drive sufficient herd immunity to protect the population. "He and colleagues' findings suggest that herd immunity will likely not develop after natural transmission in settings where infection control mechanisms are successfully introduced, underscoring the importance of effective vaccination strategies to control the spread of COVID-19. This study is an important milestone in the description of SARS-CoV-2 infection and our understanding of immunity in the pandemic."

Credit: 
The Lancet

University of Maryland co-publishes the first full reference genome for rye

image: Rye

Image: 
Markus Spiske, public domain

As one of the founding members of the International Rye Genome Sequencing Group (IRGSG), the University of Maryland (UMD) co-published the first full reference genome sequence for rye in Nature Genetics. UMD and international collaborators saw the need for a reference genome of this robust small grain to allow for the tracking of its useful genes and fulfill its potential for crop improvement across all major varieties of small grains, including wheat, barley, triticale (a cross between wheat and rye that is gaining popularity), and rye. Following the model of international collaboration used when UMD helped sequence the wheat genome, UMD co-developed the idea to produce a reference genome, organized the effort, and contributed to achieving the collective goal. The result is a valuable resource that can help improve grain yield, disease resistance, and temperature tolerance to increase climate resilience in grain crops.

"This reference genome is a wonderful resource, and it opens so many new doors for us," says Vijay Tiwari, assistant professor in Plant Science and Landscape Architecture (PSLA) at UMD and leader of the Maryland Small Grains and Genetics program. "The knowledge that rye offers us to fight physical and disease stressors is going to help us produce better crops that can tolerate disease and climatic changes much better. We can do genome-wide assays to see where useful traits are coming from, and for that, we need a reference genome to provide a framework."

Nidhi Rawat, assistant professor in PSLA and plant pathologist specializing in diseases like Fusarium Head Blight that ravage small grains, adds, "The more we screen, the more we get amazed with how much useful diversity we see in rye. It holds tremendous potential for crop improvement across wheat, rye, triticale, and barley."

Authored by more than 60 scientists from 14 countries, including four research institutions in the U.S., this collaboration represents truly cooperative science. Based on the example of the International Wheat Genome Sequencing Consortium (IWGSC), Nils Stein of the Leibniz Institute of Plant Genetics and Crop Plant Research (IPK) in Germany took the lead on coordinating with the global collaborators to ensure that all the necessary pieces came together to produce the full rye genome. UMD is proud of the work it did to help bring this idea to fruition.

"Before this, there was significant effort to sequence the rye genome, but the fragmented assembly was not sufficient," says Tiwari. "But in this case, scientists all came together without centralized support because we all decided it was a good idea to get this knowledge out to the community. At UMD specifically, we helped develop the consortium, co-developed the idea, and provided resources to get the sequencing done and complete the mapping work. It was really absolute teamwork."

The excitement for this new rye reference genome can be especially felt across scientific and agricultural communities alike, laying the groundwork for many avenues of future research and crop improvement. According to Tiwari and Rawat, rye has a very diverse set of genes that allows it to grow in all kinds of soil and environments, making it very stress tolerant and disease resistant. It is also a cross-pollinated crop unlike self-pollinating wheat and barley, making it ideal for producing more robust hybrid grain varieties.

"Ancient wheat, barley, and rye all evolved around the same time," explains Tiwari. "But rye took a different path and has some unique advantages over the others. For example, finding ways to make wheat and barley cross-pollinating crops makes it easier to produce hybrid wheat or barley and is a huge incentive for increasing yield. Rye has that capability already."

Rawat and Tiwari also stress that rye and triticale (developed by crossing wheat and rye) are important cover crops for this region because of their efficient use of nutrients and need for little fertilizer, making them great for the Chesapeake Bay. "In addition to being good for bread and beer, rye is a popular cover crop because it has a very good portfolio for nitrogen and phosphorus use efficiency which are specifically very important for keeping excess nutrients out of the Bay," says Rawat. "Recently, we screened hundreds of triticale lines for diseases and found useful genetic diversity that seems to be coming from rye. With the availability of the reference genome of rye, it will be very easy to map the genes underlying these useful traits and transfer them to wheat and other small grains."

Rawat and Tiwari are excited at the breeding and research opportunities that this work can open up across the entire spectrum of small grains, allowing for the development of varieties that can meet the diverse needs of growers worldwide.

"It feels really great to see that in the last three years, we have two reference genomes sequenced for small grains [wheat and rye], and UMD was one of the leaders in both of them," says Tiwari. "It is a useful contribution towards the AGNR initiative to increase global food security."

"I'm particularly excited because it not only shows our research excellence at a national and international level, but the real satisfaction comes that the work we are doing in the lab is actually benefiting farmers at the ground level," stresses Rawat. "That is very fulfilling - that is a reward that is invaluable."

Credit: 
University of Maryland

Scientists study co-evolutionary relationship between rust fungi and wheat and barberry

image: Infection processes of Puccinia striiformis f. sp. tritici basidiospores and urediniospores on barberry and wheat.

Image: 
Jing Zhao

Wheat stripe rust is one of the most important wheat diseases and is caused by the plant-pathogenic fungus Puccinia striiformis f. sp. tritici (Pst). Though Pst is known to be highly host-specific, it is nevertheless able to infect two unrelated host plants, wheat and barberry, at different spore stages. Pst infects wheat through its urediniospores and infects barberry with its basidiospores.

"This complex life cycle poses interesting questions on the co-evolution between the pathogen and the hosts, as well as the different mechanisms of pathogenesis underlying the infection of the two different hosts," explained Jing Zhao, an associate research fellow at the College of Plant Protection at Northwest A & F University in China.

In a recent study, Zhao and colleagues studied the co-evolutionary relationship between the rust fungus and its hosts using genes specifically needed for host infection at different spore stages. They comprehensively compared the transcriptomes of Pst during the infection of wheat and barberry leaves and were able to identify the genes needed for either wheat or barberry infection and the genes needed to infect both. They found a larger proportion of evolutionarily conserved genes among those needed for barberry infection, implying a longer history of interaction with Pst.

"As a matter of fact, the barberry family, belonging to primitive angiosperms and originating from 146-113 million years ago, is evolutionarily older than grasses, which means it interacted with rust fungi earlier. Thus, we postulated a hypothesis that barberry might be the primary host of Pst," said Zhao.

Zhao pointed out that Pst cleverly applies distinct strategies to overcome various host defense systems. For example, the fungus is able to secrete different sets of enzymes to degrade different types of cell walls and cuticles based on its perception of different chemical components.

Their work will contribute to a deeper understanding of the roles of barberry in wheat rust disease and sustainable control of stripe rust disease. It also provides a model to understand the evolutionary processes and strategies of different stages of a pathogen during the infection process on different hosts. Read more about this study in "Distinct Transcriptomic Reprogramming in the Wheat Stripe Rust Fungus During the Initial Infection of Wheat and Barberry" published in the MPMI journal.

Credit: 
American Phytopathological Society

Medical cannabis can reduce essential tremor: turns on overlooked cells in central nervous system

Medical cannabis is a subject of much debate. There is still a lot we do not know about cannabis, but researchers from the Department of Neuroscience at the Faculty of Health and Medical Sciences have made a new discovery that may prove vital to future research into and treatment with medical cannabis.

Cannabinoids are compounds found in cannabis and in the central nervous system. Using a mouse model, the researchers have demonstrated that a specific synthetic cannabinoid (cannabinoid WIN55,212-2) reduces essential tremor by activating the support cells of the spinal cord and brain, known as astrocytes. Previous research into medical cannabis has focussed on the nerve cells, the so-called neurons.  

'We have focussed on the disease essential tremor. It causes involuntary shaking, which can be extremely inhibitory and seriously reduce the patient's quality of life. However, the cannabinoid might also have a beneficial effect on sclerosis and spinal cord injuries, for example, which also cause involuntary shaking', says Associate Professor Jean-François Perrier from the Department of Neuroscience, who has headed the research project.

'We discovered that an injection with the cannabinoid WIN55,212-2 into the spinal cord turns on the astrocytes in the spinal cord and prompts them to release the substance adenosine, which subsequently reduces nerve activity and thus the undesired shaking'.

Targeted treatment with no problematic side effects

That astrocytes are part of the explanation for the effect of cannabis is a completely new approach to understanding its medical effect, and it may help improve the treatment of patients suffering from involuntary shaking.

The spinal cord is responsible for most of our movements. Both voluntary and spontaneous movements are triggered when the spinal cord's motor neurons are activated. The motor neurons connect the spinal cord with the muscles, and each time a motor neuron sends impulses to the muscles, it leads to contraction and thus movement. Involuntary shaking occurs when the motor neurons send out conflicting signals at the same time. And that is why the researchers have focussed on the spinal cord.

'One might imagine a new approach to medical cannabis for shaking, where you - during the development of cannabis-based medicinal products - target the treatment either at the spinal cord or the astrocytes - or, at best, the astrocytes of the spinal cord', says Postdoc Eva Carlsen, who did most of the tests during her PhD and postdoc projects.

'Using this approach will avoid affecting the neurons in the brain responsible for our memory and cognitive abilities, and we would be able to offer patients suffering from involuntary shaking effective treatment without exposing them to any of the most problematic side effects of medical cannabis'.

The next step is to do clinical tests on patients suffering from essential tremor to determine whether the new approach has the same effect on humans.

Credit: 
University of Copenhagen - The Faculty of Health and Medical Sciences

Using conservation criminology to understand restaurants' role in the urban wild meat trade

image: Restaurant menu in Kinshasa featuring porcupine followed by a "wild meat broth"

Image: 
WCS

KINSHASA, DEMOCRATIC REPUBLIC OF CONGO (March 18, 2021) - A new study in the journal Conservation Science and Practice finds that restaurants in urban areas in Central Africa play a key role in whether protected wildlife winds up on the menu.

The study, by a team of scientists from Michigan State University, Wildlife Conservation Society (WCS), and University of Maryland, used a crime science "hot product" approach, which looks at frequently stolen items coveted by thieves. The approach offered new insights into wildlife targeted by the urban wild meat trade and can inform urban wildlife policies.

The study engaged lower-, middle-, and upper-tier restaurants to understand which species were traded. The findings revealed that procurement of wild meat by restaurants was more targeted than opportunistic, with monkeys identified as "hot products" - species most at risk of being targeted by the urban wild meat trade in both Kinshasa and Brazzaville, the respective capital cities of the Democratic Republic of Congo and the Republic of Congo.

Restaurants in Brazzaville were more aware of laws and revealed a wider variety of wild meat species at risk of illegal trade. This suggests that awareness of laws is not an effective deterrent against illegal activity and might instead lead to adaptive practices, such as diversifying the range of products offered, thereby spreading out the risk associated with any one particular species.

Though the wild meat trade can and does exist legally, it crosses into illegality when it sources wildlife protected from poaching by national laws and regulations (for example, hunting in protected areas, outside permitted seasons, beyond set quotas, hunting protected species, or hunting without a permit) or when it involves cross-border trade prohibited by international law.

Looking at consumer demand alone, restaurant customers also commonly requested antelopes besides monkeys. The study found that although consumer demand is an important consideration in restaurants' purchasing decisions, it did not account for the cost and effort required to source a tradeable product. When this was factored in, pangolins were also identified as being at risk from the wild meat trade. This suggests the need to work with supply chain actors other than consumers.

Knowing which wildlife are most at risk in urban centers can help focus law enforcement efforts on compliance with species-specific rules, say the authors.

"Restaurants have the potential to help reduce risks from the illegal wildlife trade and make their livelihoods more sustainable. Working with restaurants can also help build a community of informal wildlife guardians complementing law enforcement and legislative action, in a multi-pronged approach," said Sarah Gluszek, lead author now working for FFI.

The illegal wild meat trade is a problem for urban zones in Central Africa, but the dynamics of the trade are poorly understood. At unsustainable rates and in illegal contexts, the wild meat trade is a driver of species extinction; it can also threaten ecosystem services and local food security and contribute to the risk of zoonotic disease spread.

Credit: 
Wildlife Conservation Society

Study reveals significant concerns over growing scale of sex selective abortions in Nepal

image: The findings raise challenges for the Nepali government in relation to sex selective abortion

Image: 
Dr Mahesh Puri

Detailed new analysis published this week in BMJ Open highlights significant concerns about the growing issue of sex selective abortion of girls in Nepal.

Drawing on census data from 2011 and follow-on survey data from 2016, the social scientists estimate that roughly one in 50 girl births were 'missing' from records (i.e. had been aborted) between 2006 and 2011 (22,540 girl births in total). In the year before the census (June 2010 - June 2011) this had risen to one in 38.

For certain areas of the country, the practice was more widespread. In Arghakhanchi, the most affected district, one in every six girl births were 'missing' in census data. In the Kathmandu Valley, Nepal's main urban centre, around 115 boys are born for every 100 girls. Without sex selection we would expect only 105 boys born for every 100 girls.
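For illustration only, the gap between the observed and expected sex ratios can be turned into a rough estimate of the share of girl births missing. This is a back-of-the-envelope sketch, not the authors' estimation method, and the function name is ours:

```python
def missing_girl_fraction(observed_boys_per_100_girls, expected_boys_per_100_girls=105):
    """Rough share of girl births 'missing', holding boy births fixed.

    For a fixed number of boys B, expected girls = B * 100 / expected_ratio
    and observed girls = B * 100 / observed_ratio, so the missing fraction
    is 1 - expected_ratio / observed_ratio.
    """
    return 1 - expected_boys_per_100_girls / observed_boys_per_100_girls

# Kathmandu Valley figures from the study: ~115 boys born per 100 girls,
# against the expected natural ratio of ~105 per 100.
frac = missing_girl_fraction(115)
print(f"Estimated share of girl births missing: {frac:.1%}")  # ≈ 8.7%
```

On these assumptions, roughly one in eleven or twelve girl births in the Kathmandu Valley would be missing, consistent with the national rate being far lower than in the worst-affected districts.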

It has been widely acknowledged over many years that sons have been preferred to daughters in Nepal. Whereas boys are seen as economic and social assets, in some parts of the country girls are more often regarded as a financial burden, requiring a dowry and leaving their family home upon marriage.

However, it is only since abortion legislation in 2002 and the widespread availability of ultrasound from 2004 onwards that there has been the potential to selectively abort female foetuses. There have been growing concerns about this practice over recent years, but to date little empirical research on the scale of the issue.

Importantly, abortions based on the results of sex determination tests are illegal in Nepal and carry prison sentences. But the researchers, writing in BMJ Open, suggest that these laws are not effectively enforced. It is estimated that more than half of abortions carried out in 2014 were illegal, so direct legislation has only limited scope to solve this issue.

Deeper analysis by the team found that sex ratios were skewed, with women who were richer and more highly educated more likely to undertake sex selective abortion. They also found that, in districts where more sex-selection occurred, girls were more likely than boys to die by age five, indicating discrimination both before and after birth.

Lead author Dr Melanie Channon from the University of Bath's Department of Social & Policy Sciences explained: "As fertility falls and urbanisation increases, there is more access to prenatal sex identification technology in Nepal. Our study shows some of the impact this has had over recent years, and we expect there will be a 'trickle-down' of the ability to select the sex of a baby from the wealthiest and most educated as the technology becomes more widely available and more affordable. Put simply and starkly, without concerted effort, there will be an increase in sex-selective abortions in Nepal.

"It is important to stress that the solution to this growing issue is not to ban abortion or ultrasound tests during pregnancy. Many lives have been - and continue to be - saved by these policies. The only lasting solution is to dismantle the deeply rooted gender inequity found across the country in order that people no longer wish to selectively abort female foetuses. The government in Nepal needs to take a lead on this, combining media campaigns with legal and political measures which address the issue of gender equity across a range of themes in the country."

Second author of the paper Dr Mahesh Puri from the Center for Research Environment Health and Population Activities (CREHPA) in Nepal, added: "In view of the easy accessibility to prenatal sex-determination technologies, religious and socio-economic values given to sons, coupled with lack of focussed policy and programme to address gender inequality and weak enforcement of law relating to sex determination tests, the practice of sex selective abortion could further increase in the future.

"Targeted interventions to enable gender equality, increase value of girls, as well as social and economic incentives for vulnerable girls, such as conditional cash transfer schemes and effective enforcement of the law would be required."

The team behind the report urge the Nepali government to recognise this issue and adopt a multi-sectoral national strategy to combat it.

Credit: 
University of Bath

Psychologists report an error in the NICE guidelines for autism

Reporting in the Lancet Psychiatry today, psychologists at the University of Bath highlight that a widely used technique for autism screening is being misused, which may have prevented many people from receiving an autism diagnosis over the past decade.

When individuals with suspected autism are assessed by a GP, a decision to refer them to a specialist for diagnosis is informed by using the Autism Spectrum Quotient. This ten-point scale, known as the 'AQ-10', is an internationally used technique, whereby individuals agree or disagree with statements such as 'I find it difficult to work out people's intentions'. The maximum score is ten, and higher scores represent more autistic traits.

A score of six or above on this scale should signal that an individual needs to be referred to a specialist psychologist or psychiatrist. However, through this new research, psychologists have uncovered that for almost ten years the NICE guidelines have incorrectly been recommending a score of 'more than 6 out of 10'. This error may have consequently prevented people who scored '6' from receiving proper support.

The Bath team were surprised to discover this error, and closely analysed the original research about the autism screening tool in comparison with the NICE guidance. They found that the NICE Guideline Development Group had considered, but rejected, a cut-off score for diagnosis of seven or above (≥7). In their Lancet article, they conclude that the NICE recommendation of a score "more than 6 out of 10" is an error.

The researchers say that the use of an inappropriately high cut-off score makes this autism screening tool less sensitive, and therefore less accurate. Because it is so widely used among GPs and other healthcare professionals, this issue will be contributing to missed autism referrals, diagnoses, and opportunities for intervention and support. Although clinicians are not solely reliant on AQ-10 scores to make referrals, it factors into their decision-making process. As the NICE AQ-10 guidelines have been in place for almost a decade, the consequences of this mistake will be considerable.

The psychologists are calling for an urgent review into this matter so that, until NICE corrects the erroneous guidelines, pending diagnoses are not missed and any errors in previous screening can be rectified. In their paper, they recommend that clinicians and researchers use the cut-off score of "6 or above" (≥6) instead of NICE's "more than 6 out of 10" (≥7) to inform their work.
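The off-by-one effect of the erroneous cut-off can be sketched in a few lines of Python. The scores below are invented purely for illustration; only the two referral rules, the correct "6 or above" (≥6) and the erroneous "more than 6" (≥7), come from the article.

```python
# Hypothetical illustration of how a one-point cut-off shift changes referrals.
# The AQ-10 sums ten agree/disagree items into a score from 0 to 10.

def refer_correct(score: int) -> bool:
    """Cut-off from the original screening research: refer at 6 or above."""
    return score >= 6

def refer_nice_error(score: int) -> bool:
    """Erroneous NICE wording, 'more than 6 out of 10', i.e. 7 or above."""
    return score > 6

scores = [4, 5, 6, 6, 7, 8, 9, 10]  # invented example scores

referred_correct = [s for s in scores if refer_correct(s)]
referred_nice = [s for s in scores if refer_nice_error(s)]

# Everyone scoring exactly 6 is missed under the erroneous guideline.
missed = [s for s in scores if refer_correct(s) and not refer_nice_error(s)]
print(referred_correct)  # [6, 6, 7, 8, 9, 10]
print(referred_nice)     # [7, 8, 9, 10]
print(missed)            # [6, 6]
```

The snippet makes the researchers' point concrete: the two rules differ only for people scoring exactly 6, and it is precisely those people who would not be referred under the incorrect guideline.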

Dr Punit Shah, Associate Professor of Psychology at the University of Bath and the GW4 Neurodevelopmental Neurodiversity Network, explained: "This is a worrying finding as cut-off scores on screening tools underpin their accuracy. Although a difference of 1-point might not seem huge, a 1-point increased cut-off score on a 10-point scale is substantial and makes the instrument less psychologically sensitive. This means that many people going to their GPs who genuinely have autism - perhaps scoring 6 on the scale - are currently less likely to be referred to specialists for full diagnostic assessment. Diagnosis is of course crucial: without a diagnosis, people have less access to appropriate interventions and support, even certain benefits.

"It is impossible to put a number on exactly how many people will have been affected by this, but it is well known that delayed referrals and late diagnoses of autism have negative consequences for the mental health and wellbeing of autistic people and their families. We urgently need to do all we can to raise awareness of this issue, among GPs and other clinicians, while the NICE guidelines are corrected. NHS waiting times for autism assessments are already far too long and these flaws in screening procedures will be compounding this issue."

Lucy Waldren, lead author of the article and also of the Department of Psychology at Bath, suggests the findings have implications for autism and psychiatry research. She says: "Our examination of the literature has discovered that the erroneous NICE guidelines have caused major confusion amongst researchers on which cut-off scores to use. We have found several examples of the incorrect value being applied. Participants in studies have also been inappropriately excluded based on their scores. And, even when the correct value was used, it has been incorrectly attributed to the NICE guidance. If researchers have followed the incorrect NICE guidelines and used the AQ-10 incorrectly in their studies, they may need to reanalyse and republish, or even consider retracting their findings."

Credit: 
University of Bath

Could leak in blood-brain barrier be cause of poor memory?

Have you forgotten where you laid your keys? Ever wondered where you had parked your car? Or had trouble remembering the name of a new neighbor? Unfortunately, these things seem to get worse as one gets older. A big question for researchers is: where does benign forgetfulness end and true disease begin?

One of the keys to having a healthy brain at any age is having a healthy blood-brain barrier, a complex interface of blood vessels that run through the brain. Researchers reviewed more than 150 articles to look at what happens to the blood-brain barrier as we age. Their findings were published March 15 in Nature Aging.

Whether the changes to the blood-brain barrier alter brain function, however, is still up for debate.

"It turns out very little is known about how the blood-brain barrier ages," said lead author William Banks, a gerontology researcher at the University of Washington School of Medicine, and a researcher with the Geriatrics Research Education and Clinical Center at the Veterans Affairs Puget Sound Health Care System. "It's often hard to tell normal aging from early disease."

The blood-brain barrier, discovered in the late 1800s, prevents the unregulated leakage of substances from blood into the brain. The brain is an especially sensitive organ and cannot tolerate direct exposure to many of the substances in the blood. Increasingly, scientists have realized that the blood-brain barrier also allows many substances into the brain in a regulated way to serve the nutritional needs of the brain. It also transports informational molecules from the blood to the brain and pumps toxins out of the brain. A malfunctioning blood-brain barrier can contribute to diseases such as multiple sclerosis, diabetes, and even Alzheimer's disease.

Before scientists can understand how such malfunctioning can contribute to the diseases of aging, they need to understand how a healthy blood-brain barrier normally ages.

Research shows that healthy aging individuals have a very small leak in their blood-brain barrier. This leakage is associated with some measures of the benign forgetfulness of aging, considered by most scientists to be normal. But could this leak and the difficulties in recall be the early stages of Alzheimer's disease?

When a person carries the ApoE4 allele, the strongest genetic risk factor for Alzheimer's disease, researchers said there is an acceleration of most of the blood-brain barrier's age-related changes.

People with ApoE4 have a hard time getting rid of amyloid beta peptide in their brains, which causes an accumulation of plaque. With healthy aging, the pumps in the blood-brain barrier work less efficiently in getting rid of the amyloid beta peptide. The pumps work even less well in people with Alzheimer's disease.

Another key finding in the review is that as we age, two cells begin to change in the blood-brain barrier: pericytes and astrocytes.

Recent work suggests that the leak in the blood-brain barrier that occurs with Alzheimer's may be due to an age-related loss of pericytes. Astrocytes, by contrast, seem to be overactive. Recent work suggests that preserving pericyte function by giving the factors that they secrete or even transplanting them could lead to a healthier blood-brain barrier.

Some research suggests that pericyte health can be preserved by some of the same interventions that extend lifespan, such as regular exercise, caloric restriction, and rapamycin.

Other findings raise the question of whether the brain's source of nutrition and its grip on control of the immune and endocrine systems could deteriorate with aging. Another finding raises the possibility that the rate at which many drugs are taken up by the brain may explain why older folks sometimes have different sensitivities to drugs than their children or grandchildren.

Credit: 
University of Washington School of Medicine/UW Medicine

Sugar tax in Spain has led to only tiny reduction in calories in shopping basket

The introduction of a sugar tax, increasing the price of fizzy drinks and other products high in sugar content, has had only a limited effect in shifting people's dietary habits and behaviours, according to a new study.

Fresh research from an international team of economists, published in the journal Social Science & Medicine, focused on the impact of a sugar tax on people's shopping baskets, comparing customer spending in Catalonia in Spain (where a tax had been introduced) with the rest of the country (where it had not been) from May 2016 to April 2018.

A sugar-sweetened beverages (SSB) tax was introduced in Catalonia in May 2017, but not in the rest of Spain. The tax has a tiered structure whereby the rate increases depending on the amount of sugar contained within a product. The Catalonian approach mirrors that of the UK Soft Drinks Industry Levy, which came into force on 6 April 2018.

The SSB tax meant that, on average, a one-litre bottle of Fanta, Sprite or Seven Up, which cost around €1.02 in the month before the tax, increased to €1.18. However, one of the additional effects of the tax has been reformulation, whereby drinks producers have created and marketed new products with much lower overall sugar content (e.g. Coke Zero).

Drawing on customer data derived from loyalty cards at a Spanish supermarket chain and comparing shopping baskets for nearly a million households (844,943) both before and after the tax was introduced, the research team found that households did reduce purchases of high sugar (taxed) beverages and increased purchases of lower sugar (untaxed) options in response.

However, this reduction was modest. Overall, they calculated this led to an average reduction in sugar of 2.2%, equivalent to just 3.7 calories per person per month. Their results also showed a distributional effect, with regular customers (i.e. those doing most of their shopping at this supermarket chain) and high-income customers most affected by the tax.
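To put the reported effect size in context, a back-of-envelope conversion is possible using the standard figure of 4 kcal per gram of sugar. The 2.2% and 3.7-calorie figures come from the study; the implied monthly baseline is our own rough inference, not a result the authors report.

```python
# Back-of-envelope check of the reported effect size.
# Assumes the standard conversion of 4 kcal per gram of sugar.

KCAL_PER_GRAM_SUGAR = 4.0

reduction_share = 0.022        # study: 2.2% average reduction in sugar purchased
reduction_kcal_month = 3.7     # study: per-person calorie reduction per month

# Convert the calorie reduction back into grams of sugar.
reduction_grams_month = reduction_kcal_month / KCAL_PER_GRAM_SUGAR

# Inferred (not reported) baseline sugar purchased per person per month.
implied_baseline_grams = reduction_grams_month / reduction_share

print(f"{reduction_grams_month:.2f} g less sugar per person per month")   # 0.93 g
print(f"implied baseline: ~{implied_baseline_grams:.0f} g per person per month")
```

Under these assumptions the tax removed under a gram of sugar per person per month, which illustrates why the authors describe the effect as modest at best.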

This might be either due to the fact that the tax was more salient for these groups or that these groups represented the largest share of the sample. Consequently, the researchers argue that much more is needed to influence behaviours and reduce obesity in particular among poorer households.

Lead researcher, Dr Eleonora Fichera from the Department of Economics at the University of Bath (UK) explains: "In response to rising levels of obesity and the serious and significant negative effects this is having for individuals, their families and wider healthcare systems, over the past five years there has been growing interest in the potential effectiveness of sugar taxes.

"By analysing the effect of a tiered tax system for sugar-sweetened beverages in Catalonia and by comparing its impact with the rest of Spain (where a tax was not introduced) our results provide important evidence to policymakers keen to explore the potential effectiveness of this approach.

"And whilst our results demonstrate some impact in shifting behaviours towards products lower in sugar, this effect is modest at best. If these taxes are to be more effective, they need to be more visible at the checkout so that consumers become increasingly aware of the added cost of their high-sugar choices. This requires that the tax is more specific too, ensuring producers are forced to pass the tax through to consumers. Although more than 20% of the Catalan tax was passed through to consumers, not all of it was, making the tax less impactful."

A wide range of additional policies have recently been introduced to tackle growing obesity, including high sugar and high calorie labelling as well as different taxes. In August 2020, Dr Fichera published separate research suggesting that nutritional labelling is helping to improve the nation's diet.

She added: "There is no one, single, silver bullet which will resolve the obesity crisis that many countries in the West are facing. Obesity is a complex problem, exacerbated by a proliferation of high-sugar, high-fat products but also our increasingly sedentary lifestyles. Our approach to tackle it needs to be holistic and co-ordinated."

Credit: 
University of Bath

Therapy for most common cause of cystic fibrosis safe and effective in 6- to 11-year-olds

An international, open-label Phase 3 study, co-led by Susanna McColley, MD, from Ann & Robert H. Lurie Children's Hospital of Chicago, found that a regimen of three drugs (elexacaftor/tezacaftor/ivacaftor) that targets the genetic cause of cystic fibrosis was safe and effective in 6-11-year-olds with at least one copy of F508del mutation in the CFTR gene, which is estimated to represent almost 90 percent of the cystic fibrosis population in the United States.

For children in this age group who have only one copy of F508del mutation - or about 40 percent of patients with cystic fibrosis in the United States - this would be the first treatment that addresses the underlying genetic defect in cystic fibrosis.

This three-drug cystic fibrosis treatment was approved by the FDA in October 2019 for people 12 years and older with at least one copy of F508del mutation. Based on the positive results of this study, the FDA has accepted the application to expand treatment indication to younger children, with a decision expected by June 2021.

"The most exciting aspect of our findings is that this population of children had normal lung function at the start of the study and still had a significant improvement," said Dr. McColley, Co-Global Principal Investigator on the study and senior author, who is the Scientific Director for Interdisciplinary Research Partnerships at Stanley Manne Children's Research Institute at Lurie Children's and Professor of Pediatrics at Northwestern University Feinberg School of Medicine. "Coupled with what we saw in studies and in practice with the older population, starting treatment earlier may avert serious long-term complications and really change the trajectory of health for children with cystic fibrosis."

Cystic fibrosis is a progressive genetic disease that damages multiple organs, including the lungs and pancreas. Currently, average life expectancy is 47 years. The disease is caused by mutations in the CFTR gene that lead to insufficient flow of salt and water in and out of cells. In the lungs, this creates buildup of thick, sticky mucus that can result in chronic lung infections and severe lung disease. Damage to the pancreas occurs even before birth, which interferes with nutrition absorption and growth. While there are approximately 2,000 known mutations of the CFTR gene, the most common is the F508del mutation.

In the 24-week study with 66 children, published in the American Journal of Respiratory and Critical Care Medicine, Dr. McColley and colleagues confirmed the appropriateness of a dose that is half of the adult daily dose of the three-drug treatment for children 6-11 years of age who weigh less than 30 kg, and of the full adult dose for those weighing more. The regimen was well tolerated, and the safety profile was generally consistent with that observed in older patients, with cough, headache and fever as the most common adverse events.

The treatment resulted in significant improvements in lung function, respiratory symptoms and nutritional status. Maintaining or improving nutritional status is associated with better lung function and increased survival in patients with cystic fibrosis. In addition, substantial improvement in sweat chloride concentration, a direct measure of CFTR function, was observed.

"In this study, we saw greater improvements in sweat chloride than those previously seen in adults and adolescents," said Dr. McColley. "This strong response to treatment may lead to better long-term clinical outcomes of cystic fibrosis."

"It's important to note that people with cystic fibrosis who are demographically characterized as having a race other than white or ethnicity characterized as Hispanic are less likely to have an F508del mutation," Dr. McColley said. "This is important because as with other acute and chronic conditions, these populations have more severe disease and lower life expectancy. As drug development continues, we are focused on having a highly effective treatment or cure for everyone with cystic fibrosis."

Credit: 
Ann & Robert H. Lurie Children's Hospital of Chicago

Chemical cocktail creates new avenues for generating muscle stem cells

image: Microscope image showing muscle stem cells produced using the newly discovered chemical cocktail. Muscle cells, in red, are integrating into injured muscle, in green, of an adult mouse.

Image: 
UCLA Broad Stem Cell Research Center

A UCLA-led research team has identified a chemical cocktail that enables the production of large numbers of muscle stem cells, which can self-renew and give rise to all types of skeletal muscle cells.

The advance could lead to the development of stem cell-based therapies for muscle loss or damage due to injury, age or disease. The research was published in Nature Biomedical Engineering.

Muscle stem cells are responsible for muscle growth, repair and regeneration following injury throughout a person's life. In fully grown adults, muscle stem cells are quiescent -- they remain inactive until they are called to respond to injury by self-replicating and creating all of the cell types necessary to repair damaged tissue.

But that regenerative capacity decreases as people age; it also can be compromised by traumatic injuries and by genetic diseases such as Duchenne muscular dystrophy.

"Muscle stem cell-based therapies show a lot of promise for improving muscle regeneration, but current methods for generating patient-specific muscle stem cells can take months," said Song Li, the study's senior author and a member of the Eli and Edythe Broad Center of Regenerative Medicine and Stem Cell Research at UCLA.

Li and his colleagues identified a chemical cocktail -- a combination of the root extract forskolin and the small molecule RepSox -- that can efficiently create large numbers of muscle stem cells within 10 days. In mouse studies, the researchers demonstrated two potential avenues by which the cocktail could be used as a therapy.

The first method uses cells found in the skin called dermal myogenic cells, which have the capacity to become muscle cells. The team discovered that treating dermal myogenic cells with the chemical cocktail drove them to produce large numbers of muscle stem cells, which could then be transplanted into injured tissue.

Li's team tested that approach in three groups of mice with muscle injuries: adult (8-week-old) mice, elderly (18-month-old) mice and adult mice with a genetic mutation similar to the one that causes Duchenne in humans.

Four weeks after the cells were transplanted, the muscle stem cells had integrated into the damaged muscle and significantly improved muscle function in all three groups of mice.

For the second method, Li's team used nanoparticles to deliver the chemical cocktail into damaged muscle tissue. The nanoparticles, which are about one one-hundredth the size of a grain of sand, are made of the same material as dissolvable surgical stitches, and they are designed to release the chemicals slowly as they break down.

The second approach also produced a robust repair response in all three types of mice. When injected into injured muscle, the nanoparticles migrated throughout the injured area and released the chemicals, which activated the quiescent muscle stem cells to begin dividing.

While both techniques were successful, the key benefit of the second one is that it eliminated the need for growing cells in the lab -- all of the muscle stem cell activation and regeneration takes place inside the body.

The team was particularly surprised to find that the second method was effective even in elderly mice, in spite of the fact that as animals age, the environment that surrounds and supports muscle stem cells becomes less effective.

"Our chemical cocktail enabled muscle stem cells in elderly mice to overcome their adverse environment and launch a robust repair response," said Li, who is also chair of bioengineering at the UCLA Samueli School of Engineering and professor of medicine at the David Geffen School of Medicine at UCLA.

In future studies, the research team will attempt to replicate the results in human cells and monitor the effects of the therapy in animals for a longer period. The experiments should help determine if either approach could be used as a one-time treatment for patients with serious injuries.

Li noted that neither approach would fix the genetic defect that causes Duchenne or other genetic muscular dystrophies. However, the team envisions that muscle stem cells generated from a healthy donor's skin cells could be transplanted into a muscular dystrophy patient's muscles -- such as those involved in breathing -- which could extend their lifespan and improve their quality of life.

Credit: 
University of California - Los Angeles Health Sciences

Mobile stroke units improve outcomes and reduce disability among stroke patients

DALLAS, March 17, 2021 -- Stroke patients treated via a mobile stroke unit (MSU) received clot-busting medications faster and more often -- and recovered significantly better -- than patients who received regular emergency care by standard ambulance, according to late-breaking science presented today at the American Stroke Association's International Stroke Conference 2021. The virtual meeting, held March 17-19, 2021, is a premier world meeting for researchers and clinicians dedicated to the science of stroke and brain health.

"Our goal in this study was to treat patients on the mobile stroke unit within an hour of the onset of their stroke symptoms, and we were gratified that one-third of the patients were actually treated within that time frame," said James C. Grotta, M.D., lead study author and director of stroke research at the Clinical Institute for Research and Innovation at Memorial Hermann - Texas Medical Center in Houston. "Our study confirmed that patients who are treated early benefit from a complete reversal of stroke symptoms and avoidance of disability. This suggests that in the first hour after a stroke occurs, the brain is not yet irreversibly damaged and is very amenable to effective treatment."

Mobile stroke units are special ambulances equipped to diagnose and treat stroke quickly. When a stroke is caused by a blood clot blocking an artery in or leading to the brain (an ischemic stroke), the team on board the mobile stroke unit can treat the patient right away with a clot-dissolving medication called tissue plasminogen activator (tPA).

This research, which is part of the ongoing, national BEST-MSU study, examined data from 1,047 patients who suffered an ischemic stroke and were eligible for tPA, treated at seven U.S. centers (Houston; Aurora, Colorado; New York City; Indianapolis; Los Angeles; Memphis, Tennessee; and Burlingame, California) between 2014 and 2020. Researchers compared outcomes of stroke patients brought to the emergency department by mobile stroke unit versus those who arrived by standard emergency medical services (617 patients via mobile stroke unit, and 430 patients via standard ambulance).

They found:

Overall, 97% of patients transported by a mobile stroke unit received tPA, compared to 80% of those brought to the emergency department by a regular ambulance.

One-third of the patients treated by a mobile stroke unit were treated within one hour after the onset of stroke symptoms, compared to only 3% of patients transported by a standard ambulance.

53% of the patients treated by a mobile stroke unit made a complete recovery from the stroke after three months while 43% of the patients treated by a standard ambulance achieved a full recovery.
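The headline percentages above can be turned into simple absolute differences. Note that the study's own "27 per 100 with less disability" figure comes from a fuller ordinal (utility-weighted modified Rankin Scale) analysis, which this sketch does not reproduce; it only tabulates the press release's reported rates.

```python
# Absolute-difference summary of the reported BEST-MSU outcomes.
# Percentages are taken directly from the press release.

msu = {"n": 617, "tpa_rate": 0.97, "full_recovery": 0.53}  # mobile stroke unit
ems = {"n": 430, "tpa_rate": 0.80, "full_recovery": 0.43}  # standard ambulance

# Difference in the share of eligible patients who actually received tPA.
tpa_diff = msu["tpa_rate"] - ems["tpa_rate"]

# Absolute difference in complete recovery at three months,
# expressed as extra full recoveries per 100 patients treated.
per_100 = round((msu["full_recovery"] - ems["full_recovery"]) * 100)

print(f"tPA treatment rate difference: {tpa_diff:.0%}")           # 17%
print(f"extra full recoveries per 100 MSU patients: {per_100}")   # 10
```

On these crude numbers, roughly 10 additional patients per 100 achieve full recovery with MSU care; the study's larger benefit estimate reflects patients who improved across disability levels without reaching complete recovery.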

"Our results mean that, on average, for every 100 patients treated on a mobile stroke unit rather than standard ambulance, 27 will have less final disability and 11 of the 27 will be disability-free," Grotta said. "But for this to happen, patients, caregivers and bystanders need to recognize the signs of stroke and call 9-1-1 immediately."

"More widespread deployment of mobile stroke units may have a major public health impact on reducing disability from stroke," said Grotta. "Although mobile stroke units are costly to equip and staff, they reduce the time to treatment. We also expect that more treatment via mobile stroke units can reduce the need for downstream utilization of long-term care."

Grotta and his team are continuing research to assess health care utilization over the entire year following their patients' strokes, which will give a better idea of the cost-effectiveness of mobile stroke unit implementation on a wider scale. The American Heart Association's 2019 Recommendations for the Establishment of Stroke Systems of Care suggests reimbursement is an issue that warrants further investigation before widespread use of mobile stroke units is likely.

Credit: 
American Heart Association

Illinois youth opioid use linked with other substance misuse, mental health issues

image: University of Illinois researchers Allen Barton (left) and Doug Smith found opioid use among Illinois high school students tied to other substance use and mental health concerns.

Image: 
Photo: L. Brian Stauffer, University of Illinois.

URBANA, Ill. - Opioid use has dramatically increased in the 21st century, especially among young adults. A new study from the University of Illinois provides insights on usage patterns among Illinois high school students to help inform prevention and treatment strategies.

"The societal and personal costs of opioid misuse are massive. There's been a lot of focus on trying to understand how to combat the current epidemic. But we also need to make sure we have good data in order to know how we should apply our efforts," says Allen Barton, assistant professor in the Department of Human Development and Family Studies at U of I and lead author on the study.

The researchers based their study on information from the 2018 Illinois Youth Survey (IYS), which measures risk behaviors among high school students.

Over 230,000 students across Illinois typically participate in the biennial survey, says Doug Smith, professor of social work and director of the Center for Prevention Research and Development at U of I. Smith is co-author on the opioid study and principal investigator for the IYS.

The study focused on 18- to 19-year-olds, the beginning of a developmental stage when opioid use vulnerability is highest, Barton says.

Among the more than 26,000 respondents in this age group, 5.6% (1,468 youth) indicated they had used prescription pain medication in the past year without a prescription or differently than intended; that is, non-medical use of prescription opioids.

"Another 2.6% (682 youth) reported they had used prescription painkillers to get high. This addresses motive of use, which is an important part of understanding the issue," Barton says.

Finally, 0.4% of the sample (105 youth) reported they had used heroin in the past year. Heroin is another form of opioid that is not in the form of prescription medication, Barton notes.

The researchers found clear differences in characteristics of opioid users versus non-users.

"The individuals engaging in opioid use are also engaging in heightened levels of other forms of substance misuse, primarily alcohol and cannabis. They have more mental health concerns and higher suicide intent. And those who are using opioids report much lower grades and much higher levels of being victims of bullying," Barton says.

As opioid use is closely linked with other forms of substance misuse, counselors and medical practitioners should treat it as part of a pattern, Smith states. "This contradicts the typical image of a non-substance-using youth who one day decides to use opioids and then gets progressively addicted. That doesn't typically happen. These kids are already using other substances, often at levels indicative of problematic use. It seems more like a progression of general substance use than specific opioid usage," he notes.

The researchers also analyzed the data to look for profiles among the subset of youth using opioids.

"Our findings indicated three main profiles of individuals reporting opioid use. You have one group, comprising slightly more than half of this subsample, that's using opioids, but not specifically to get high. You have another group of individuals reporting a clear motive of use to get high. And a third, small group that's just using heroin," Barton notes.

While there were many similarities among the three groups, individuals who reported using opioids to get high also reported much more problematic substance abuse overall, as well as higher suicide risk compared to people who are engaged in non-medical use of prescription opioids without such motive, Smith adds.

The researchers say their study shows opioid use is a complex issue which needs tailored approaches to treatment and prevention.

"In order to address opioid use at this developmental stage, which is a transition to adulthood, we need to realize it is indicative of a broader pattern of factors related to other substance use and mental health issues that require attention. A one-step approach to just address the opioid use may not be sufficient," Barton states.

"The good news in this data is that opioid use rates are very low for this demographic across the state. However, for a subset of youth who do use, it appears to be making a difficult situation all the more challenging."

Barton and Smith say the correlation with other forms of substance misuse can help identify opioid use at an early stage.

"For any youth who is getting treatment for another substance, we need to be screening for whether they're using opioids, and we need to have a prevention program within a treatment program," Smith says.

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences