Culture

Cute monkeys perceived as safer, but in reality people get closer to dominant animals

People say they are more willing to approach cute-looking monkeys in the wild, but in reality end up getting closer to dominant monkeys they believe could pose more risk, according to new research.

Researchers at the University of Lincoln, UK, showed a sample of people photos of wild Barbary macaques - a primate that commonly mingles with tourists in Gibraltar and North Africa - and asked them to assess the monkeys' faces according to a variety of traits, including dominance, trustworthiness, cuteness and socialness.

The study participants were also asked how close they would be willing to get to the monkey to feed it or take a photo.

Results showed that people said they were more willing to approach, feed or take photos with macaques that they perceived to be trustworthy, subordinate, cute, social, young, or female. This suggests that people perceive animals with these traits as safer to approach.

Dominant primates were perceived to pose higher threat than subordinate primates and therefore were deemed to be less approachable. But despite their stated preferences, in field observations people ended up getting closer to more dominant macaques.

Dr. Laëtitia Maréchal, Senior Lecturer in Psychology at the University of Lincoln, said: "Despite forming these first impressions based on faces, in reality the interactions we observe don't follow what people say. When people feed wildlife they are more likely to end up close to dominant animals; the ones people claimed to be less willing to approach due to being perceived as less safe.

"It is important to study wildlife interactions to improve the safety and welfare of both humans and the animals involved. This is an important step towards understanding how to better communicate with other species. This has great positive implications for human safety and animal welfare."

The research, published in the academic journal Scientific Reports, featured observations of real-life human and macaque interactions taken at a popular tourist site in Morocco, as well as the image-based tests.

Credit: 
University of Lincoln

Learning empathy as a care giver takes more than experience

PHILADELPHIA - Poverty takes a toll on health in many ways. It often causes malnutrition and hunger, creates barriers to accessing basic resources, and can also affect well-being in subtler ways linked to social discrimination and exclusion. Nurses, among the most important healthcare providers, serve both as advocates for patients and as their most constant caregivers. They are trained to provide compassionate care to all. New research from Thomas Jefferson University shows that existing training may not adequately challenge nursing students' pre-existing assumptions about poverty, and that more needs to be done to help nurses reflect on their role in combating the societal stigma of poverty.

"We should be trained, as nurses, to empathize with our patients and ultimately to help close the gap in health disparities," says author Karen Alexander, PhD, RN, "In our research, we wanted to look at whether or not past experiences with poverty (either lived or as a volunteer) gave nurses a stronger sense of empathy towards populations experiencing poverty."

The results were published March 5th in the Journal of Nursing Education (https://www.healio.com/nursing/journals/jne/2020-3-59-3/%7B3371cd38-5176-4dda-a2e5-bc6ca21259dc%7D/the-relationship-between-past-experience-empathy-and-attitudes-toward-poverty).

The researchers surveyed 104 nursing students using the Jefferson Empathy Scale, an internationally used tool to measure empathy in healthcare contexts, and a second validated survey called Attitudes Towards Poverty (short form) at one time point. They also collected demographic information on students, which included questions on exposure to poverty through lived experience or volunteer experience, as well as age, gender, ethnicity, religion and other factors.

"What surprised us at first was that personal experience with poverty didn't necessarily yield higher empathy scores," says Dr. Alexander. "In fact, the scores were the same as average. What was more surprising was that those students who had interacted with poverty through volunteer experiences had lower empathy scores than the remainder of the cohort."

"The volunteer experience is central to a lot of medical and nursing-school pedagogy," says Dr. Alexander. "It's this idea that exposure is enough to help challenge assumptions, and remove stigma. But it may not have the effect we think it's having. Our results suggest that service learning isn't enough, and it may be, in fact, detrimental."

Students may bring their biases to volunteer experiences, Dr. Alexander explained, and may have those biases confirmed rather than challenged.

One intervention that Dr. Alexander finds particularly useful for gently identifying and dismantling each student's pre-existing opinions about poverty is self-reflection through journaling, combined with peer reflection and discussion.

"It's important for students to be able to see themselves in their patients. To think 'that could be me or someone I know.' It's hard to get to that position in the absence of a meaningful relationship," says Dr. Alexander.

Credit: 
Thomas Jefferson University

Two weeks after sports-related concussion, most patients have not recovered

March 9, 2020 - Less than half of patients with sports-related mild traumatic brain injury (mTBI) achieve clinical recovery within two weeks after injury, reports a study in Clinical Journal of Sport Medicine. The journal is published in the Lippincott portfolio by Wolters Kluwer.

"This study challenges current perceptions that most people with a sports-related mTBI recover within 10 to 14 days," write Stephen Kara, MBChB, of Axis Sports Medicine, Auckland, New Zealand. The findings also question the belief that children recover more slowly after sports-related concussion, and highlight the importance of early access to care after mTBI.

New Evidence on Expected Recovery Time After Sports-Related mTBI

The researchers analyzed recovery time in 594 patients with sports-related mTBI treated at their concussion clinic over a two-year period. (Mild traumatic brain injury and concussion refer to the same injury, but mTBI is the preferred scientific term.) Patients were seen an average of eight days after injury; 77 percent were male. The average age was 20 years, and about 7.5 percent of patients were children under age 12.

Patients were managed in a standardized assessment and management protocol, following current international guidelines (2017 Concussion in Sport Group [CISG] consensus statement). The protocol included an initial period of "relative rest" for 48 hours, with gradually increasing cognitive and physical activity. Patients were re-evaluated at 14 days post injury and then every two weeks until clinical recovery - defined by symptom scores, resolution of any abnormalities on initial examination, and demonstration of exercise tolerance.

At 14 days, mTBI symptoms had resolved in only 45 percent of patients - meaning that 55 percent did not yet have clinical recovery. In contrast, current CISG guidelines state that 80 to 90 percent of sports-related concussions resolve within seven to ten days.

The clinical recovery rate increased to 77 percent at four weeks after injury and 96 percent at eight weeks. Recovery time was similar across age groups, in contrast to the CISG consensus and other position statements suggesting that children take longer to reach clinical recovery after mTBI.

Recovery times were longer for female athletes, as well as for patients with certain "concussion modifiers" (history of migraine or mental health issues) previously linked to prolonged recovery. Patients who waited longer before their first visit to the concussion clinic were also at increased risk of prolonged recovery time: "For a seven-day increase in time to initial appointment, we could expect an approximate 15 percent increase in the number of days until clinical recovery," according to the authors.
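
As a rough illustration of that last relationship, the sketch below applies the reported "about 15 percent more recovery days per extra week of delay" to an assumed baseline recovery time. It is not the authors' statistical model; the baseline value and the choice of a compounding (rather than linear) form are assumptions for illustration only.

```python
# Illustrative sketch (not the study's model): scaling an assumed baseline
# recovery time by ~15% for each additional week of delay before first visit.
def expected_recovery_days(baseline_days: float, delay_days: float,
                           increase_per_week: float = 0.15) -> float:
    """Project recovery days given a delay, assuming compounding per week."""
    weeks_of_delay = delay_days / 7.0
    return baseline_days * (1.0 + increase_per_week) ** weeks_of_delay

# Example: with an assumed 21-day baseline, a 14-day wait before the first
# clinic visit would project to roughly 28 days until clinical recovery.
print(round(expected_recovery_days(21, 14), 1))
```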

The study provides new insights into recovery times after sports-related mTBI, in a large group of patients receiving standardized, guideline-based care. The results suggest that less than half of patients have clinical recovery within 14 days.

"This rate of recovery is slower than described in previous CISG, and other position statements," Dr. Kara and colleagues write. " We believe that our data may reflect the natural recovery timeline for those with a sports-related mTBI."

Dr. Kara and coauthors note that their standardized concussion protocol - including early active rehabilitation and equal access to medical resources - led to similar recovery times regardless of age or level of sport. They also found faster recovery among patients who received medical attention more quickly after injury.

"Early access to care after mTBI leads to faster recovery," Dr. Kara comments. "It enables physicians and therapists to empower patients to be actively involved in their recovery from both a physical and cognitive perspective, supported by a clinical recovery protocol."

Credit: 
Wolters Kluwer Health

Mathematical model could lead to better treatment for diabetes

CAMBRIDGE, MA -- One promising new strategy to treat diabetes is to give patients insulin that circulates in their bloodstream, staying dormant until activated by rising blood sugar levels. However, no glucose-responsive insulins (GRIs) have been approved for human use, and the only candidate that entered the clinical trial stage was discontinued after it failed to show effectiveness in humans.

MIT researchers have now developed a mathematical model that can predict the behavior of different kinds of GRIs in both humans and rodents. They believe this model could be used to design GRIs that are more likely to be effective in humans, and to avoid advancing designs that are unlikely to succeed into costly clinical trials.

"There are GRIs that will fail in humans but will show success in animals, and our models can predict this," says Michael Strano, the Carbon P. Dubbs Professor of Chemical Engineering at MIT. "In theory, for the animal system that diabetes researchers typically employ, we can immediately predict how the results will translate to humans."

Strano is the senior author of the study, which appears today in the journal Diabetes. MIT graduate student Jing Fan Yang is the lead author of the paper. Other MIT authors include postdoc Xun Gong and graduate student Naveed Bakh. Michael Weiss, a professor of biochemistry and molecular biology at Indiana University School of Medicine, along with Kelley Carr, Nelson Phillips and Faramarz Ismail-Beigi of Case Western Reserve University, are also authors of the paper.

Optimal design

Patients with diabetes typically have to measure their blood sugar throughout the day and inject themselves with insulin when their blood sugar gets too high. As a potential alternative, many diabetes researchers are now working to develop glucose-responsive insulin, which could be injected just once a day and would spring into action whenever blood sugar levels rise.

Scientists have used a variety of strategies to design such drugs. For instance, insulin might be carried by a polymer particle that dissolves when glucose is present, releasing the drug. Or, insulin could be modified with molecules that can bind to glucose and trigger insulin activation. In this paper, the MIT team focused on a GRI that is coated with molecules called PBA, which can bind to glucose and activate the insulin.

The new study builds on a mathematical model that Strano's lab first developed in 2017. The model is essentially a set of equations that describes how glucose and insulin behave in different compartments of the human body, such as blood vessels, muscle, and fatty tissue. This model can predict how a given GRI will affect blood sugar in different parts of the body, based on chemical features such as how tightly it binds to glucose and how rapidly the insulin is activated.

"For any glucose-responsive insulin, we can turn it into mathematical equations, and then we can insert that into our model and make very clear predictions about how it will perform in humans," Strano says.

Although this model offered helpful guidance in developing GRIs, the researchers realized that it would be much more useful if it could also work on data from tests in animals. They decided to adapt the model so that it could predict how rodents, whose endocrine and metabolic responses are very different from those of humans, would respond to GRIs.

"A lot of experimental work is done in rodents, but it's known that there are lots of imperfections with using rodents. Some are now quite wittily referring to this situation as 'lost in [clinical] translation,'" Yang says.

"This paper is pioneering in that we've taken our model of the human endocrine system and we've linked it to an animal model," adds Strano.

To achieve that, the researchers determined the most important differences between humans and rodents in how they process glucose and insulin, which allowed them to adapt the model to interpret data from rodents.

Using these two variants of the model, the researchers were able to predict the GRI features that would be needed for the PBA-modified GRI to work well in humans and rodents. They found that about 13 percent of the possible GRIs would work well in both rodents and humans, while 14 percent were predicted to work in humans but not rodents, and 12 percent would work in rodents but not humans.

"We used our model to test every point in the range of potential candidates," Gong says. "There exists an optimal design, and we found where that optimal design overlaps between humans and rodents."

Analyzing failure

This model can also be adapted to predict the behavior of other types of GRIs. To demonstrate that, the researchers created equations that represent the chemical features of a glucose-responsive insulin that Merck tested from 2014 to 2016, which ultimately did not succeed in patients. They now plan to test whether their model would have predicted the drug's failure.

"That trial was based on a lot of promising animal data, but when it got to humans it failed. The question is whether this failure could have been prevented," Strano says. "We've already turned it into a mathematical representation and now our tool can try to figure out why it failed."

Strano's lab is also collaborating with Weiss to design and test new GRIs based on the results from the model. Doing this type of modeling during the drug development stage could help to reduce the number of animal experiments needed to test many possible variants of a proposed GRI.

This kind of model, which the researchers are making available to anyone who wants to use it, could also be applied to other medicines designed to respond to conditions within a patient's body.

"You can envision new kinds of medicines, one day, that will go in the body and modulate their potency as needed based on the real-time patient response," Strano says. "If we get GRIs to work, this could be a model for the pharmaceutical industry, where a drug is delivered and its potency is constantly modulated in response to some therapeutic endpoint, such as levels of cholesterol or fibrinogen."

Credit: 
Massachusetts Institute of Technology

International study completes the largest genetic map of psychiatric disorders so far

image: The research group in Psychiatry, Mental Health and Addictions of the Vall d'Hebron Research Institute (VHIR) and the CIBERSAM.

Image: 
Vall d'Hebron Research Institute (VHIR) and the CIBERSAM.

An international study published in the journal Cell has described 109 genetic variants associated with eight psychiatric disorders: autism, ADHD, schizophrenia, bipolar disorder, depression, obsessive-compulsive disorder, anorexia nervosa and Tourette syndrome, in a total of about 230,000 patients worldwide.

Among the participants in the new study - the most ambitious and detailed study published so far on the genetics of psychiatric disorders - are the researchers Bru Cormand and Raquel Rabionet, from the Faculty of Biology and the Institute of Biomedicine of the University of Barcelona (IBUB), the Research Institute Sant Joan de Déu (IRSJD), the Rare Diseases Networking Biomedical Research Centre (CIBERER), and Marta Ribasés, Josep Antoni Ramos-Quiroga and other members of the research group in Psychiatry, Mental Health and Addictions of the Vall d'Hebron Research Institute (VHIR) and the Mental Health Networking Biomedical Research Centre (CIBERSAM).

The international study is promoted by the Psychiatric Genomics Consortium - the most ambitious international platform on the genetics of psychiatric conditions - and is led by Jordan W. Smoller, from Harvard University (United States). Apart from listing potential genetic predisposition (or resilience) factors for these pathologies, the study identifies the specific genes that the different pathologies share and completes the genetic map of psychiatric disorders.

A new genetic perspective on psychiatric disorders

About 25% of the world population is affected by some type of psychiatric condition that can alter intellectual ability, behavior, affectivity and social relations. The new study - based on 230,000 patients and 500,000 controls - analyzes the genetic basis shared by eight psychiatric pathologies and defines three groups of highly genetically related disorders: those characterized by compulsive behaviors (anorexia nervosa, obsessive-compulsive disorder); mood and psychotic disorders (bipolar disorder, major depression and schizophrenia); and early-onset neurodevelopmental disorders (autism spectrum disorder, ADHD and Tourette syndrome). In this context, the VHIR team contributed a sample of 500 adults with ADHD and 400 healthy controls.

"Those disorders listed in the same group tend to share more risk genetic factors between them than with other groups. Moreover, we saw that these groups built on the basis of genetic criteria match with the clinical output", notes Bru Cormand, professor at the Department of Genetics, Microbiology and Statistics and head of the Neurogenetics Research Group at the UB.

"However, the new study does not put emphasis on the genes shared by members of a particular group but on the genes shared by the highest number of disorders", continues Cormand. "That is, those factors that would somehow give way to a 'sensitive' brain, more likely to suffer from any psychiatric disorder. And the fact that this could be one or another disorder would depend on specific genetic factors, not forgetting about the environmental factors".

Many psychiatric disorders show comorbidity: they tend to co-occur, sometimes in a sequential manner. It is therefore quite likely for a patient to show more than one disorder over his or her lifetime.

According to the results, a gene related to the development of the nervous system - DCC - is a risk factor for all eight studied disorders. Also, the RBFOX1 gene, which regulates splicing in many genes, is involved in seven of the eight disorders. In addition, ADHD and depression share 44% of the genetic risk factors that are common in the general population; for schizophrenia and bipolar disorder, this figure reaches 70%. According to the expert Antoni Ramos-Quiroga, "these results help people with ADHD understand the disorder and also why they can suffer from depression more frequently. Furthermore, this is new scientific evidence that ADHD can persist throughout life and be present in adults. We hope this helps to reduce the social stigma regarding ADHD and the other mental illnesses."

"We now know this situation regarding psychiatric disorders can be explained, in part, by genetics. Therefore, regarding the case of someone with ADHD, we can estimate the genetic risk to develop other disorders s/he does not suffer from yet -for instance, drug addiction- and take preventive measures if the risk is high. However, these predictions are just probabilistic and not fully deterministic", notes the researcher.

Expression of risk factors in psychiatric disorders

Apart from genomics, the study analyzes functional aspects of the genetic risk variants: for instance, their impact on gene expression in space (which organs, brain regions, tissues and even cell types express the disease genes) and in time (at which developmental stage these genes become active). Moreover, it analyzes the genome in three dimensions to detect potential relations between risk variants and distant genes.

One of the most relevant findings of the study reveals that those genes that are risk factors for more than one disorder -genes with pleiotropic effects- are usually active during the second trimester of pregnancy, coinciding with a crucial stage in the development of the nervous system.

Oddly enough, some genetic variants can act as risk factors for one disorder but have a protective effect in others. According to the lecturer Raquel Rabionet, "in the study, we identified eleven areas of the genome in which the effects are opposed in different pairs of disorders; that is, protection in one case and susceptibility in the other. This could make sense in some instances, for example a genetic variant with contrary effects in ADHD - a disorder usually related to obesity - and anorexia."

"However -notes Rabionet- regarding the neurodevelopmental disorders such as autism and schizophrenia, there are genetic variants with opposite effects and others that work in the same direction. This suggests that the genetics of psychiatric disorders is more complex than what we thought and we are still far from solving this puzzle".

Hereditary genetics versus environmental factors

Alterations in single DNA nucleotides - single nucleotide polymorphisms (SNPs) - explain less than a third of the genetic component of these pathologies. The other two thirds may correspond to other types of genetic changes, such as rare variants, which are less common in the human genome.

"Psychiatric disorders have a multifactorial origin", note the experts. For instance, thanks to studies with twins we know ADHD has a 75% genetic load and the remaining 25% would be explained by environmental factors (traumatic experiences during childhood, exposure to toxins, etc.)".

"This panorama could be expanded to the other psychiatric disorders we studied, because the contribution of genetics is generally higher than 50% and SNPs would always explain less than a half of this percentage. That is, SNPs have an important weight but there are many factors yet to be explored", note Cormand and Rabionet, who -as part of the study- worked on the group of patients with ADHD, anorexia or obsessive-compulsive disorder in Catalan hospitals.

Exploring new frontiers of human genetics

The study published in Cell broadens the horizon of a previous study (Nature Genetics, 2013), also promoted by the Psychiatric Genomics Consortium, based on 32,000 patients and 46,000 controls and five disorders (autism, ADHD, schizophrenia, bipolar disorder and depression). The conclusions of the new article improve on those of the earlier study, which analyzed the shared genetics of mental disorders from a global perspective but did not point to specific genes.

In the future, one of the priorities of the Consortium will be to complete the genetic landscape of mental disorders through the analysis of other genetic variations - for instance, copy number variants (CNVs) - that affect large DNA segments. From an epigenetic perspective - in particular, DNA methylation - the researchers also want to analyze the interactions between genes and environment, which could be decisive in psychiatry.

"It will be important to understand how genetic alterations are translated to the phenotype -the disorder- and this involves studying the function of every single gene identified in the genomic studies (using animal or cell models). In any case, the objective is to use genetics to improve and customize the diagnosis, prognosis and therapy of these pathologies which may be highly disabling for the affected people", note Bru Cormand and Raquel Rabionet.

Credit: 
University of Barcelona

Adding smoking cessation to lung cancer screening can reduce mortality by 14%

Denver--March 9, 2020--Including smoking cessation with existing lung cancer screening efforts would reduce lung cancer mortality by 14 percent and increase life-years gained by 81 percent compared with screening alone, according to a study by Rafael Meza of the University of Michigan and colleagues, published in the Journal of Thoracic Oncology, a publication of the International Association for the Study of Lung Cancer.

Annual lung cancer screening with low-dose computed tomography (LDCT) is recommended for adults aged 55-80 with a greater than 30 pack-year smoking history who currently smoke or quit within the previous 15 years. Since about 50% of screen-eligible individuals are still current smokers, cessation interventions at the point of screening are recommended. However, information about the short- and long-term effects of joint screening and cessation interventions is limited.

Dr. Meza and colleagues from the University of Michigan and Georgetown University used an established lung cancer simulation model to project the impact of cessation interventions within the screening context on lung cancer and overall mortality for the 1950 and 1960 US birth-cohorts. Two million individual smoking and life histories were generated per cohort. Simulated individuals were screened annually according to current guidelines and different assumptions of screening uptake rates. Dr. Meza's team then simulated a cessation intervention at the time of the first screen, under a range of efficacy assumptions.

Point-of-screening cessation interventions would greatly reduce lung cancer mortality and delay overall deaths compared to screening alone. For example, under a 30% screening uptake scenario, adding a cessation intervention at the time of the first screen with a 10% success probability for the 1950 birth-cohort would further reduce lung cancer deaths by 14% and increase life-years gained by 81% compared with screening alone.
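
The sketch below shows the bare logic of this kind of cohort simulation: draw individuals, apply a screening uptake probability, apply a cessation success probability at the first screen, and compare deaths with and without the intervention. It is only a toy; the published model simulates full smoking and life histories, and every rate below is invented, so the output will not reproduce the study's 14% figure.

```python
# Toy cohort simulation sketching the study's logic, NOT the published model.
# All probabilities below are invented for illustration.
import random
random.seed(0)

N = 100_000               # simulated screen-eligible current smokers
screen_uptake = 0.30      # share who attend LDCT screening (scenario from the text)
cessation_success = 0.10  # quit probability when offered help at first screen
base_lc_death = 0.05      # assumed lung-cancer death risk if smoking continues
quit_lc_death = 0.035     # assumed risk after quitting (illustrative reduction)

def run(with_cessation: bool) -> int:
    deaths = 0
    for _ in range(N):
        screened = random.random() < screen_uptake
        quits = with_cessation and screened and random.random() < cessation_success
        risk = quit_lc_death if quits else base_lc_death
        deaths += random.random() < risk
    return deaths

d_screen_only = run(with_cessation=False)
d_screen_plus_quit = run(with_cessation=True)
reduction = (d_screen_only - d_screen_plus_quit) / d_screen_only
print(f"relative reduction in lung-cancer deaths: {reduction:.1%}")
```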

However, the actual gains are highly sensitive to variation in screening uptake and cessation probability. Dr. Meza said that even mildly effective cessation interventions could greatly enhance the impact of LDCT screening programs, because cessation not only reduces the risk of lung cancer but also prevents other tobacco-related diseases such as chronic obstructive pulmonary disease (COPD) and cardiovascular disease. Ms. Pianpian Cao, the study's first author and a doctoral student at the University of Michigan, said that most of these benefits won't be realized unless lung screening uptake improves, so more work is needed to promote lung cancer screening and facilitate access, particularly for those at highest risk.

The researchers concluded that further evaluation of specific cessation interventions within lung screening - including the costs and feasibility of implementation and dissemination - is needed to determine the best possible strategies and realize the full promise of lung cancer screening.

Credit: 
International Association for the Study of Lung Cancer

Fresh groundwater flow important for coastal ecosystems

image: Fresh groundwater bubbling up at the coastline in Gunung Kidul, Java, Indonesia. The results highlight that abundant groundwater discharge like this spot is found along only a small proportion of the world's coastlines.

Image: 
Nils Moosdorf, ZMT Bremen

Groundwater is the largest source of freshwater, one of the world's most precious natural resources and vital for crops and drinking water. It is found under our very feet in the cracks and pores in soil, sediments and rocks. Now an international research team led by the University of Göttingen has developed the first global computer model of groundwater flow into the world's oceans. Their analysis shows that 20% of the world's sensitive coastal ecosystems - such as estuaries, salt marshes and coral reefs - are at risk of pollutants transported by groundwater flow from the land to the sea. The research was published in Nature Communications.

The researchers quantified groundwater flow in coastal regions worldwide by combining a newly designed computer model code with a global data analysis of topography, groundwater replenishment and characteristics of the layers of rock below the surface. Their results show that although the flow of fresh groundwater is very low, it is highly variable. This means that for small areas of the coastline, the flow is high enough to act as an important source of freshwater. However, when polluted or carrying an excess of nutrients due to human activity, this actually poses risks to sensitive coastal ecosystems.
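
As a rough illustration of the kind of calculation underlying such flow estimates, the sketch below applies Darcy's law to an assumed stretch of coastal aquifer. It is not the published global model code, and every value (conductivity, gradient, thickness) is an assumption chosen only to show the arithmetic.

```python
# Back-of-the-envelope coastal fresh groundwater discharge via Darcy's law.
# Entirely illustrative values; the published global model is far more detailed.
hydraulic_conductivity = 1e-5   # K, m/s (assumed sandy aquifer)
head_gradient = 0.001           # water-table slope toward the coast (assumed)
aquifer_thickness = 20.0        # m (assumed)
coastline_length = 1_000.0      # m of shoreline considered

darcy_flux = hydraulic_conductivity * head_gradient          # m/s per unit area
discharge = darcy_flux * aquifer_thickness * coastline_length  # m^3/s
print(f"fresh groundwater discharge: {discharge * 86400:.1f} m^3/day per km of coast")
```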

The new results question earlier claims that fresh groundwater flow influences the carbon, iron and silica budgets of the oceans as a whole. However, the local effects of groundwater flow along coastlines are important. Groundwater provides a freshwater resource that has been, and still is, essential in many places around the world. Although this is still poorly understood, the mixing of fresh groundwater and seawater may support local ecosystems that are adapted to slightly salty water. The largest negative effect on coastal ecosystems comes from nutrients such as nitrogen, and from pollutants, which people have introduced on land and which then seep towards the coast. These may take years or even decades to flow to the sea, where they then affect coastal marine ecosystems.

As first author, Dr Elco Luijendijk, University of Göttingen Department of Structural Geology and Geodynamics, says, "We very much hope these new results and the data our model has revealed will motivate follow-up from more detailed studies. It is important to monitor and understand the effects of fresh groundwater flow on coastal ecosystems, especially in regions that have so far not been studied in detail, such as large parts of South America, Africa and southern Asia and many tropical islands."

Credit: 
University of Göttingen

Clotting problem

image: UD's Velia Fowler, professor and chair of the Department of Biological Sciences, has been researching a blood disorder called MYH9, which impedes the typical clotting process.

Image: 
Illustration by Jeffrey C. Chase

Ouch, you've cut your finger! As you fumble to grab a tissue, the paramedics in your blood are already rushing to the scene. These blood cells, called platelets, morph in shape from round to spiny, sticking to each other and to the injured blood vessel walls, to begin patching the gash. The platelets join together with other proteins to form a mesh-like plug--a clot--to stop the bleeding.

But for the estimated 1 in 25,000 people with MYH9-related disorders, caused by mutations in the MYH9 (myosin) gene, blood doesn't clot so well, resulting in a range of health issues--kidney failure, heavy menstrual periods, cataracts, hearing loss.

What's at fault? In research published in Blood, the journal of the American Society of Hematology, University of Delaware Professor Velia M. Fowler and her collaborators at UD and the National Institutes of Health reveal a number of wrong moves by blood cells on their creeping, crawling journey toward platelet formation.

While platelets normally are tiny, less than one-tenth the size of red blood cells, and the average person generates some 40 billion of these clotting cells a day, people with MYH9-related disorders have scant numbers of them. What's more, their platelets are jumbo-sized and can even be as big as red blood cells.

"In people with MYH9-related disorders, the platelets are few and they are just way too big," said Fowler, professor and chair of UD's Department of Biological Sciences. "They look like bluish giants in our stains under the microscope."

Studying cells from mice that mimic the human disease, Fowler and her team began looking for defects in the platelet-making process. Platelets originate from humongous cells called megakaryocytes, which, in turn, are derived from stem cells, in the soft, gelatinous marrow at the center of your bones. These megakaryocytes creep and crawl from the bone marrow to neighboring blood vessels--the sinusoids--which have leaky walls that allow other cells to squeeze through. This is where the megakaryocytes extend branch-like arms called proplatelets into the blood vessel, and the circulating blood shears them off into many small platelets.

"Megakaryocytes are really, really large cells that glide forward pulling their large cell body along behind them like a snail dragging its shell," Fowler said. "They use proteins like myosin and actin, which regulate muscle contractions, to move from here to there in a process called cell motility."

How the cells know to go in a certain direction is still something of a mystery, Fowler said, but it involves the megakaryocyte's ability to sense chemoattractant molecules released from the blood vessels, similar to a dog following a scent.

The team tracked and filmed these megakaryocytes during their migration from bone to blood vessel. Their focus was three mouse-cell lines, each representing a different known mutation in the myosin-9 protein molecule (for which the MYH9 blood disorder is named).

Using an inverted microscope, which allows researchers to view and film samples of live cells from below rather than from the top down, the researchers recorded the direction and distance these megakaryocytes traveled. When they plotted the data, they could see that the mutant cells were all going in the wrong direction like a pack of bloodhounds that had lost their sense of smell.
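
For readers curious how "direction and distance" can be summarized from such recordings, here is a minimal sketch of common track metrics (net displacement, total path length, straightness ratio and net heading). The coordinates are hypothetical and this is not the authors' analysis pipeline.

```python
# Sketch: summarizing one cell track into simple migration metrics.
# Positions are hypothetical (x, y) coordinates in micrometers over time.
import math

track = [(0, 0), (2, 1), (3, 3), (2, 5), (4, 6)]

path_length = sum(math.dist(a, b) for a, b in zip(track, track[1:]))
net_displacement = math.dist(track[0], track[-1])
straightness = net_displacement / path_length   # 1.0 = perfectly straight path
heading = math.degrees(math.atan2(track[-1][1] - track[0][1],
                                  track[-1][0] - track[0][0]))

print(f"path length: {path_length:.1f} um, net displacement: {net_displacement:.1f} um")
print(f"straightness ratio: {straightness:.2f}, net heading: {heading:.0f} degrees")
```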

"They have lost their way somehow, but we don't really know why," Fowler said.

The mutant cells also move in erratic ways: too slowly or too randomly or much faster than they should, sometimes almost in a hyper state.

"The megakaryocyte cells can't get there--to the blood vessel--so you can't get platelets," Fowler said, "but the reason they can't get there is different for each mutation."

Specifically, cells with the R702C mutation experience a loss of myosin contractility--the ability of their microscopic muscle-like cellular structures to contract--making them too slow; cells with the D1424N mutation gain greater contractility resulting in rapid and at times hyperactive movement; and cells with the E1841K mutation produce contractility at random.

Based on these findings, Fowler said, personalized drug therapies and treatments would be needed to enhance or reduce the cells' directionality and movement issues, depending on the patient-specific mutation.

"Just as megakaryocyte migration properties are affected by improper MYH9 myosin function, it is also possible that clots formed by the platelets carrying these mutations are unstable," Fowler said. "Further hematological analysis of platelet properties from MYH9-RD patients will be required to determine if these mutations affect clot formation. Since many patients with MYH9-RD also develop cataracts, hearing loss and kidney problems, our study can also shed light on the causes for other defects associated with this disease in patients."

Fowler, who began this study at the Scripps Research Institute, where she worked prior to joining the UD faculty in 2019, has nothing but praise for her research team, including the first author on the paper, Kasturi Pal, a former postdoctoral researcher at UD.

"Dr. Pal was just fantastic--she had never worked with megakaryocytes before. But she is one of these people who takes risks, initiates collaborations and responds to constructive criticism," Fowler said. "I'm proud to have had a role in mentoring her."

Credit: 
University of Delaware

University of Surrey's 'SMART' study awarded £426k to make multilingual content accessible

Today's world is characterised by live multimedia and multilingual content, such as breaking news, TV programmes and different types of live events like conferences, parliamentary debates and interviews. Such content is not accessible to everyone, and the language service industry has struggled to keep up with the current multilingual content boom.

Respeaking has so far been used to produce intralingual subtitles, i.e. in the same language, and is the most well-established method today to subtitle live TV programmes in the UK. The University of Surrey’s ‘Shaping Multilingual Access through Respeaking Technology’ (SMART) project aims to investigate how this technique can be adjusted to produce interlingual subtitles, i.e. in a different language, and how this could impact society.

Interlingual Respeaking (IRSP) is a complex and hybrid practice combining speech recognition technology, human interpreting and subtitling skills that has the potential to allow access to live multimedia and multilingual content for a wider audience, thus bridging the gap between hearing and non-hearing people and between native and non-native speakers. The range of applications is extremely wide-reaching, including everyday settings such as university lectures, court cases, live conferences, festivals and museum tours.

The project is led by Dr Elena Davitti, Senior Lecturer at The University of Surrey’s Centre for Translation Studies (CTS), a centre of excellence dedicated to cutting-edge research, scholarship and teaching in translation and interpreting. SMART brings together an international consortium of academic and industrial partners. Academics include Dr Simon Evans, Lecturer in Neuroscience at Surrey’s School of Psychology, Professor Lucile Desblache from the University of Roehampton, Dr Pablo Romero-Fresco, University of Vigo and University of Roehampton, and Annalisa Sandrelli, University of International Studies of Rome-UNINT. Industrial partners include Ai-Media, Sky and Sub-ti Ltd.

Tony Abrahams, Co-Founder and CEO of Ai-Media, said: “As a global provider of quality live multilingual captioning, transcription and translation, Ai-Media is delighted to support this groundbreaking project.”

Federico Spoletti, Managing Director of Sub-Ti Ltd, said: “Sub-Ti is very excited to be involved in the University of Surrey’s SMART project. Live subtitles should be accessible to all audiences, beyond sensory or language barriers, and this project will certainly help interlingual communication progress in the future.”

The project will design bespoke training courses to equip language professionals with optimal interlingual respeaking skills and will develop a Best Practice Guide which will provide guidelines to educate service providers on how to make live events and broadcasts accessible.

Dr Davitti said: "SMART's vision is to contribute to barrier-free access to information, entertainment and culture by investigating IRSP, a technique that can broaden the concept of accessibility, empowering different user groups and ultimately enabling a more inclusive and integrated society."

SMART builds on the success of the recently expanded CTS, following the 2019 Expanding Excellence in England (E3) award from Research England. CTS now has a new interdisciplinary team set to pursue CTS’ vision of promoting a human-centric approach to technology use in translation, interpreting and related forms of communication by bringing together traditional human-based research practices with cutting-edge advances in artificial intelligence.


Credit: 
University of Surrey

Food prices after a hard Brexit could increase by £50 per week

The effects of Brexit on different food types, and what this will mean for families, have been measured by research from the University of Warwick. Using structured expert judgement, the researchers combined several experts' estimates mathematically to give an estimate for the panel as a whole.

They estimate that tea, coffee and cocoa will be least affected, and that meat, dairy and jams will be most affected.

If the UK leaves with a minimally disruptive deal - one similar to the terms it had as an EU member - a family of four will likely spend an extra £5.80 per week for a healthy diet, and plausibly up to £18.17.

If the UK leaves without a deal under a hard Brexit, a family of four will likely see an increase of £20.98 per week, and possibly as much as £50.98 per week.

A hard Brexit could result in a family of four seeing their food bill increase by up to £50.98 per week, researchers at the University of Warwick have found. If the UK leaves with a deal, the increase could be as little as £5.80 per week, or plausibly up to £18.17.

A panel of industry experts have estimated the effects of Brexit on different food types and what this would mean for families in the best and worst case scenarios.

Researchers at the University of Warwick then took each expert's estimates and combined them mathematically to give an estimate for the panel as a whole. Their findings are published in the paper 'Anticipated impacts of Brexit scenarios on UK food prices and implications for policies on poverty and health: a structured expert judgement approach' in the journal BMJ Open.

Importantly, these industry and academic experts have not simply estimated the most likely changes, but also how large and how small those changes can plausibly be. This is important to be able to plan for reasonable worst and best case scenarios.

The experts estimate that tea, coffee and cocoa will be least affected by Brexit, and that meat, dairy and jams will be most affected.

The least disruptive deal would be one similar to the terms the UK had under EU membership; this would likely cost a family of four an additional £5.80 per week for a healthy diet, and could plausibly cost as much as £18.17.

Under a hard Brexit, the costs and the uncertainty are much higher: the family of four would likely see an increase of £20.98 per week, but it could be as much as £50.98 per week.

Lead investigator Dr Martine Barons, from the Department of Statistics at the University of Warwick comments:

"We conducted the research using estimates from ten specialists with expertise in food procurement, retail, agriculture, economics, statistics and household food security. We then combined in proportions used to calculate Consumer Price Index food basket costs, median food price change for Brexit with a Deal is expected to be increased by 6%, and with No-deal it is expected to increase 22%.

"Food security in the UK is a topical issue, according to the Trussell Trust over the last five year food bank use increased 73%, and this could increase for families who are unable to absorb these increased costs. There could also be reductions in diet quality leading to long term health problems."

The study was conducted in July 2018, when a hard Brexit seemed unlikely. Since more is now known about the intentions of all parties and how industries have been preparing for Brexit, the University of Warwick has supported Dr Martine Barons with strategic priority funds to update the estimates; this work is currently in progress.

Credit: 
University of Warwick

Homeless Health Research Network releases evidence-based clinical guideline

image: Homeless Health Research Network

Image: 
Homeless Health Research Network

A collaborative approach is required to build healthcare pathways that will end homelessness in Canada, says the Homeless Health Research Network, a pan-Canadian team of experts including researchers from McGill University. Clinicians can play a role by tailoring their interventions using a comprehensive new clinical guideline on homelessness published in the Canadian Medical Association Journal.

The guideline aims to inform clinicians and encourage collaboration with community organizations and policy-makers around priority steps and evidence-based interventions for people who are homeless or precariously housed and those at risk of homelessness.

"It's important that clinicians get involved in ending homelessness because they are so well-placed to make a difference," says co-author Dr. Anne Andermann, an Associate Professor in the Department of Family Medicine and Director of Community-Oriented Primary Care at McGill. "As a first step, we can learn to adapt our clinical approach to better address patient needs in a more integrated way - including their physical health, mental health, and social challenges."

A network of clinicians, academics, and governmental and nongovernmental stakeholders - the Homeless Health Research Network - created the guideline together with five people with lived experience of homelessness. The co-authors also include two McGill medical students, Sebastian Mott and Victoire Kpadé. A steering committee with representatives from across Canada helped coordinate the process.

"Housing is medicine," says Amanda DiFalco, fellow at the Institute of Global Homelessness, and who has experienced homelessness. "We need to integrate these guidelines into health policy and how we teach the next generation of clinicians."

The guideline, which will be updated every five years, recommends the following interventions to help patients who are homeless or vulnerably housed:

1. Permanent supportive housing: connect homeless or vulnerably housed people to a local housing coordinator or case manager to provide links to housing options

2. Income assistance: help people with income insecurity to find and access income-support resources

3. Case management: ensure people with mental health and substance use disorders access local mental health programs and psychiatric services

4. Opioid agonist therapy: provide access to opioid agonist therapy in primary care or referral to an addiction specialist for patients with chronic opioid use

5. Harm-reduction: identify appropriate management for people with substance use issues, or refer them to local addiction and harm reduction services

The homeless population is changing

The homeless population in Canada has changed considerably over the last 25 years, from mostly middle-aged men to increasing numbers of women, youth, Indigenous people, older adults and even families. The estimated homeless population in 2014 was 235,000, of whom 27.3% were women, 18.7% were youth, 6% were recent immigrants or migrants, and a growing number were veterans or seniors.

"Over 2 million Canadians have experienced hidden homelessness in their lifetime, meaning that they have used 'couch surfing' or other approaches to avoid staying in shelters or living on the street," says Dr. Andermann, who founded a community outreach clinic with St. Mary's Family Medicine Centre and the Multicaf food bank in Côte-des-Neiges.

Indigenous homelessness

In Canada, Indigenous peoples are eight times more likely to be homeless than non-Indigenous people. However, there are few Indigenous-led approaches to address homelessness.

To fill this gap, Indigenous historian and York University professor Jesse Thistle and Dr. Janet Smylie, a Métis family physician and research chair at Unity Health Toronto and the University of Toronto, are leading the development of a separate guideline to address Indigenous homelessness. Indigenous elders, researchers and scholars as well as people with experiences of homelessness helped develop the guideline.

While standard definitions of homelessness focus on housing precarity, the researchers based their guideline on a definition of Indigenous homelessness rooted in a breakdown of healthy relationships resulting from colonial disruptions. They propose four broad protocols for health and social service providers working with Indigenous peoples experiencing homelessness: situating one's self, keeoukaywin (visiting), hospitality, and treating people as you would treat your own relative.

Credit: 
McGill University

Advanced optical imaging technique may lead to structure-guided drug design

image: Clemson University biophysics associate professor Hugo Sanabria and an international team of researchers have demonstrated new optical imaging methods that may someday aid in structure-guided drug design.

Image: 
Ken Scar, Clemson University

CLEMSON, South Carolina -- A Clemson University College of Science researcher, together with a team of researchers primarily at Heinrich Heine University in Germany, developed and demonstrated new optical imaging methods to monitor a single molecule in action.

This fluorescence-based technique may accelerate the field of structural biology, helping scientists better understand how molecules are assembled, function, and interact, which in turn may aid in structure-guided drug design.

Hugo Sanabria, an associate professor of physics and astronomy at Clemson, with his colleagues used Förster resonance energy transfer (FRET) to study the lysozyme of the bacteriophage T4. They reported their findings in the paper titled "Resolving dynamics and function of transient states in single enzyme molecules," published on March 6, 2020, in Nature Communications.

According to co-author Claus A.M. Seidel, chair of the Institute for Molecular Physical Chemistry at Heinrich Heine University in Germany, this work underpins essential reaction steps of biomolecular machines (enzymes).

"Our FRET studies demonstrate the need of a third functional state in the famous Michaelis-Menten kinetics," said Seidel. "The Michaelis-Menten description is one of the best-known models of enzyme kinetics."

The centerpiece of this imaging tool is a FRET-based microscope, a sophisticated and powerful machine capable of visualizing biomolecules as small as a few nanometers.

To visualize biomolecules at work, Sanabria and colleagues placed two fluorescent markers on a set of molecules, which created a ruler at the molecular level. By using different locations of the markers, the team collected a set of distances that describe the shape and form of the observed molecule.

In essence, this process generated a collection of data points that were computationally processed, allowing the researchers to distinguish how the molecule looks and how it moves.
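
The "ruler" works because FRET efficiency falls off steeply with the distance between the two markers, following the standard Förster relation E = 1 / (1 + (r/R0)^6). The short sketch below inverts that relation to turn a measured efficiency into a distance; the Förster radius and efficiencies are assumed values for illustration, not numbers from this study.

```python
# Converting a measured FRET efficiency E into a donor-acceptor distance r
# using the standard Forster relation E = 1 / (1 + (r/R0)^6).
R0 = 5.0   # Forster radius in nm; depends on the dye pair (assumed value)

def distance_from_efficiency(E: float, R0: float = R0) -> float:
    """Invert E = 1 / (1 + (r/R0)^6) to recover the inter-dye distance r."""
    return R0 * ((1.0 / E) - 1.0) ** (1.0 / 6.0)

for E in (0.9, 0.5, 0.1):
    print(f"E = {E:.1f}  ->  r = {distance_from_efficiency(E):.2f} nm")
# At E = 0.5 the distance equals R0, here 5 nm.
```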

"We observe changes in the structure, and because our signal is time dependent, we can also get an idea of how the molecule is moving over time," Sanabria said.

In this study, Sanabria and his team of collaborators combined the FRET-based microscope with molecular simulations to examine lysozyme, an enzyme found in tears and mucus. Lysozyme destroys the protective carbohydrate chains in the bacterial cell wall. Scientists widely use lysozyme to study protein structure and function because it is such a stable enzyme.

"We can track the lysozyme of the bacteriophage T4 as it processes its substrate at near atomistic level with unprecedented spatial and temporal resolution," Sanabria said. "We've taken the imaging field to a whole new level."

Sanabria's optical method revealed that the lysozyme structure is different than previously thought. Until now, scientists have determined the structures of proteins like lysozyme mainly using methods such as X-ray crystallography, nuclear magnetic resonance (NMR) spectroscopy and cryo-electron microscopy.

"For the longest time, this molecule was considered a two-state molecule because of how it receives the substrate or cell wall of the target bacteria," he said. "However, we have identified a new functional state."

The team is helping to establish a database where their FRET-based structural models of similarly generated biomolecular models can be stored and accessed by other scientists. Together with the FRET community, the group is also working to establish recommendations for FRET microscopy.

Sanabria aims to apply his imaging methodology to other biomolecules. "This optical method can be used to study protein folding and misfolding or any structural organization of biomolecules," Sanabria said. "It can also be used for drug screening and development, which requires knowing what a biomolecule looks like in order for a drug to target it."

"This work is a milestone in structure determination using FRET to map short-lived functionally relevant enzyme states," Seidel said.

Credit: 
Clemson University

How new data can make ecological forecasts as good as weather forecasts

image: The Indian Ocean Dipole is one example of repeated and predictable climate patterns that create opposite extremes over large regions. When waters off of East Africa warm, it sets up heavy rains in some regions and drought in others. When East African waters cool, those rainfall patterns reverse. Plants and animals likely respond to these regional changes in climate.

Image: 
Illustration by E. Paul Oberlander, with permission from the Woods Hole Oceanographic Institution

MADISON, Wis. -- When El Niño approaches, driven by warm Pacific Ocean waters, we've come to expect both drenching seasonal rains in the southern U.S. and drought in the Amazon. Those opposite extremes have huge effects on society and are increasingly predictable thanks to decades of weather data.

Soon, University of Wisconsin-Madison ecologist Ben Zuckerberg thinks we'll be able to pull off the same forecasting feat for bird migrations and wildlife populations. That's because just as those recurring changes in climate have predictable consequences for humans, they also have predictable effects on plants and animals.

For instance, ecological predictions could help us prepare for diseases in crops or population crashes in endangered species. Good forecasting could tell us where conservation measures are needed most in the coming year or decade.

With a team of scientists, Zuckerberg published a paper March 5 in the journal Trends in Ecology and Evolution describing how species and ecosystems respond to opposite climate extremes across continents, induced by patterns such as El Niño. The team coined a name for these large-scale, opposing ecological outcomes, such as famine on one continent and feast on another, dubbing them "ecological dipoles."

"Plant and animal populations respond to climate at continental scales," says Zuckerberg, who is leading an interdisciplinary team looking to unearth evidence of this global climate-ecology link. "Going forward, we want to know how do we observe this connection? How do we measure it? How do we track how these dynamics are changing?"

He and his team believe that a recent revolution in ecological data makes this possible. With the rise of citizen science, hundreds of thousands of global volunteers have been collecting quality data about the world around them. And the National Science Foundation has begun setting up ecological stations nationwide that mirror the ubiquitous weather stations we rely on for constant data collection.

"We are beginning that revolution right now in ecology where we are able to collect data at a scale that matches what climatologists have been able to use," says Zuckerberg. "Having data that's been collected over continental scales, in real time, and that spans decades is really what you need to analyze the regularity and changes in both climate and ecological dipoles."

The idea that climate affects ecosystems across big expanses is not entirely new. It's been clear for decades that plant and animal behavior can be synchronized across a region. One classic example is acorn production. In certain years, all the oak trees in an area will produce huge amounts of acorns, which in turn leads to population booms in squirrels and other animals. Most likely, climate helps organize this collective response. Better data will make it easier to spot these kinds of patterns across the globe.

Understanding this climate-ecology connection is more urgent than ever as Earth rapidly warms and its climate changes, says Zuckerberg. It's not clear how climate change will affect patterns like El Nino or the plants and animals that respond to those patterns. Getting a handle on how predictable climate extremes affect ecosystems will help researchers respond to changes as they arise.

Now with their theory laid out, Zuckerberg's team is beginning the first project to formally test the ecological dipole idea. They will use citizen science data to track deviations in normal bird migrations and the boom-and-bust cycle of seed production to try to identify a link back to climate across the entire continent.

For Zuckerberg, the fun comes from wrapping his head around this modern-day butterfly effect.

"Shifts in the climate system that can influence these ecological processes originate halfway across the world," he says. "And I love thinking about how these connections are going to change over time. It's really fascinating."

Credit: 
University of Wisconsin-Madison

Nationwide study shows disparities in outpatient care for common orthopaedic problems

March 9, 2020 - Racial/ethnic minorities, people with lower incomes, and other groups are less likely to receive office-based care for common musculoskeletal conditions, reports a nationwide study in Clinical Orthopaedics and Related Research® (CORR®), a publication of The Association of Bone and Joint Surgeons®. The journal is published in the Lippincott portfolio by Wolters Kluwer.

Some of the same characteristics are linked to higher use of more-expensive emergency department (ED) care for orthopaedic conditions, according to the new research by Nicholas M. Rabah and colleagues of Case Western Reserve University School of Medicine, Cleveland. "It is imperative for orthopaedic surgeons to continue to collaborate with policy makers to create targeted interventions that improve access to and use of outpatient orthopaedic care to reduce healthcare expenditures," the researchers write.

Patient Factors Linked to Lower Use of Outpatient Orthopaedic Care, Higher Use of ED Care

The study included data on more than 63,500 patients receiving office-based or ED care for common orthopaedic conditions between 2007 and 2015, drawn from the nationally representative Medical Expenditure Panel Survey. The study focused on eight categories of non-emergent musculoskeletal conditions--for example, osteoarthritis, fractures, and strains and sprains. (The study did not include spinal disorders, which can be treated by either neurosurgeons or orthopaedic surgeons.)

Several sociodemographic factors were linked to lower use of office-based care for musculoskeletal conditions. After adjustment for other characteristics, black and Hispanic patients were about 20 percent less likely to receive outpatient care, compared to white patients.

Use of outpatient orthopaedic care was also lower among Americans with household incomes below the federal poverty line, those without at least a high school education, and those without private insurance (whether on public insurance or uninsured).

In contrast, patients with lower income, lower education, and public insurance status were more likely to receive ED care for these nonemergent musculoskeletal conditions. Hispanic patients also were more likely to receive ED care, although black patients were not. For most of the eight conditions studied, expenditures were significantly higher for ED care than for office-based care.
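
The exact statistical model is not spelled out in this summary, but findings phrased as "about 20 percent less likely after adjustment" typically come from a multivariable logistic regression. The sketch below shows that general approach on synthetic data; the variable names are placeholders, not the Medical Expenditure Panel Survey fields the authors used.

```python
# Minimal sketch (not the authors' model): a logistic regression estimating the
# odds of receiving office-based orthopaedic care while adjusting for other
# characteristics. Data and variable names are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "race": rng.choice(["white", "black", "hispanic"], size=n),
    "below_poverty": rng.integers(0, 2, size=n),
    "private_insurance": rng.integers(0, 2, size=n),
    "age": rng.integers(18, 85, size=n),
})

# Synthetic outcome: probability of an office-based visit depends on covariates
logit_p = (-0.2
           - 0.25 * (df["race"] != "white")
           - 0.4 * df["below_poverty"]
           + 0.5 * df["private_insurance"]
           + 0.01 * (df["age"] - 50))
df["office_visit"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit(
    "office_visit ~ C(race, Treatment('white')) + below_poverty + private_insurance + age",
    data=df).fit(disp=False)

# Exponentiated coefficients are adjusted odds ratios
print(np.exp(model.params).round(2))
```

In a model like this, an adjusted odds ratio of about 0.8 for a group corresponds to roughly 20 percent lower odds of office-based care, the form in which the finding above is reported.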

There are well-documented disparities in healthcare use in the United States. Musculoskeletal disorders are a major health burden, affecting more Americans than either cardiovascular or respiratory disease and accounting for more than $162 billion in healthcare spending per year (based on 2012-14 data).

Office-based care is thought to be the most appropriate site of care for common musculoskeletal conditions. The new study is one of the first to link specific sociodemographic factors to disparities in the use of outpatient orthopaedic care.

Multiple factors may contribute to the observed disparities, including differences in health literacy, beliefs about health and disease, and lack of social support and resources to recognize diseases and make informed decisions. Mr. Rabah and coauthors conclude: "[O]rthopaedic surgeons should focus on improving communication with patients of all backgrounds to help them identify musculoskeletal symptoms that warrant office-based orthopaedic care versus ED care."

Credit: 
Wolters Kluwer Health

The Lancet: First study identifies risk factors associated with death in adults hospitalised with new coronavirus disease in Wuhan

Being of an older age, showing signs of sepsis, and having blood clotting issues when admitted to hospital are key risk factors associated with higher risk of death from the new coronavirus (COVID-19), according to a new observational study of 191 patients with confirmed COVID-19 from two hospitals in Wuhan, China, published in The Lancet.

Specifically, being of an older age, having a high Sequential Organ Failure Assessment (SOFA) score, and having d-dimer greater than 1 μg/mL are the factors that could help clinicians to identify patients with poor prognosis at an early stage.

The new study is the first to examine risk factors associated with severe disease and death in hospitalised adults who had either died or been discharged from hospital. In the study of 191 patients, 137 were discharged and 54 died in hospital. The authors note that interpretation of their findings might be limited by the study's sample size.

In addition, the authors present new data on viral shedding, which indicate that the median duration of viral shedding was 20 days in survivors (ranging from 8 to 37 days), and the virus was detectable until death in the 54 non-survivors.

While prolonged viral shedding suggests that patients may still be capable of spreading COVID-19, the authors caution that the duration of viral shedding is influenced by disease severity, and note that all patients in the study were hospitalised, two-thirds of them with severe or critical illness. Moreover, the estimate of viral shedding duration was limited by the low frequency of respiratory specimen collection and by the lack of quantitative detection of viral genetic material in the samples.

"The extended viral shedding noted in our study has important implications for guiding decisions around isolation precautions and antiviral treatment in patients with confirmed COVID-19 infection. However, we need to be clear that viral shedding time should not be confused with other self-isolation guidance for people who may have been exposed to COVID-19 but do not have symptoms, as this guidance is based on the incubation time of the virus," explains co-lead author Professor Bin Cao from the China-Japan Friendship Hospital and Capital Medical University, China. [1]

He continues: "We recommend that negative tests for COVID-19 should be required before patients are discharged from hospital. In severe influenza, delayed antiviral treatment extends how long the virus is shed, and together these factors put infected patients at risk of dying. Similarly, effective antiviral treatment may improve outcomes in COVID-19, although we did not observe shortening of viral shedding duration after antiviral treatment in our study." [1]

According to co-author Dr Zhibo Liu from Jinyintan Hospital, China: "Older age, showing signs of sepsis on admission, underlying diseases like high blood pressure and diabetes, and the prolonged use of non-invasive ventilation were important factors in the deaths of these patients. Poorer outcomes in older people may be due, in part, to the age-related weakening of the immune system and increased inflammation that could promote viral replication and more prolonged responses to inflammation, causing lasting damage to the heart, brain, and other organs." [1]

For the first time, the study describes the complete picture of the progression of COVID-19. The median duration of fever was about 12 days in survivors, similar to that in non-survivors. Cough, however, may persist for a long time: 45% of survivors still had a cough at discharge. In survivors, dyspnoea (shortness of breath) ceased after about 13 days, but persisted until death in non-survivors. The study also charts the timing of complications such as sepsis, acute respiratory distress syndrome (ARDS), acute cardiac injury, acute kidney injury and secondary infection.

The new analysis includes all adults (aged 18 or older) with laboratory-confirmed COVID-19 admitted to Jinyintan Hospital and Wuhan Pulmonary Hospital after December 29, 2019, who had been discharged or died by January 31, 2020. These were the two designated hospitals for transferring patients with severe COVID-19 from across Wuhan up until February 1, 2020.

During the study, the researchers compared clinical records, treatment data, laboratory results, and demographic data between survivors who had been discharged from hospital and non-survivors. They looked at the clinical course of symptoms, viral shedding, and changes in laboratory findings during hospitalisation (eg, blood examinations, chest x-rays, and CT scans; see table 1 for full list), and used mathematical modelling to examine risk factors associated with dying in hospital.
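
The release does not specify the authors' modelling approach, but admission risk factors such as age, SOFA score, and d-dimer above 1 μg/mL are typically assessed with a multivariable logistic regression reporting odds ratios. The following sketch, run on synthetic data only, illustrates the general shape of that kind of analysis; it is not the study's actual model or data.

```python
# Rough sketch on synthetic data: multivariable logistic regression of
# in-hospital death on the three admission factors highlighted in the paper
# (age, SOFA score, d-dimer > 1 ug/mL). Not the authors' actual analysis.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 191  # same size as the study cohort, purely for flavour
df = pd.DataFrame({
    "age": rng.integers(18, 90, size=n),
    "sofa": rng.integers(0, 15, size=n),
    "d_dimer_over_1": rng.integers(0, 2, size=n),
})

# Synthetic outcome loosely mimicking the reported direction of the effects
logit_p = -6 + 0.05 * df["age"] + 0.3 * df["sofa"] + 1.0 * df["d_dimer_over_1"]
df["died"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(df[["age", "sofa", "d_dimer_over_1"]])
fit = sm.Logit(df["died"], X).fit(disp=False)

# Odds ratios with 95% confidence intervals, the form in which such
# risk factors are usually reported
or_table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
or_table.columns = ["odds_ratio", "ci_lower", "ci_upper"]
print(or_table.round(2))
```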

On average, patients were middle-aged (median age 56 years), most were men (62%, 119 patients), and around half had underlying chronic conditions (48%, 91 patients)--the most common being high blood pressure (30%, 58 patients) and diabetes (19%, 36 patients; table 1). From illness onset, the median time to discharge was 22 days, and the average time to death was 18.5 days.

Compared with survivors, patients who died were older (average age 69 years vs 52 years), had higher Sequential Organ Failure Assessment (SOFA) scores indicating sepsis, and had elevated blood levels of the d-dimer protein (a marker of coagulation) on admission to hospital (tables 1 and 3).

Additionally, lower lymphocyte counts (lymphocytes are a type of white blood cell), elevated levels of interleukin 6 (IL-6, a biomarker of inflammation and chronic disease), and increased high-sensitivity troponin I concentrations (a marker of heart attack) were more common in severe COVID-19 illness (figure 2 and table 3).

Complications such as respiratory failure (98%, 53/54 non-survivors vs 36%, 50/137 survivors), sepsis (100%, 54/54 vs 42%, 58/137), and secondary infections (50%, 27/54 vs 1%, 1/137) were also more frequent in those who died than in survivors (table 2).

The authors note several limitations of the study, including that, because patients still in hospital as of January 31, 2020 (and thus those with relatively more severe disease at an earlier stage) were excluded, the number of deaths does not reflect the true mortality of COVID-19. They also point out that not all laboratory tests (eg, the d-dimer test) were done in all patients, so their exact role in predicting in-hospital death might be underestimated. Finally, a lack of effective antivirals, inadequate adherence to standard supportive therapy, and the use of high-dose corticosteroids, as well as the transfer of some patients to hospital late in their illness, might also have contributed to the poor outcomes in some patients.

Credit: 
The Lancet