Culture

Breakthrough science provides hope for disease that affects 1.5 million people in US

Today the prestigious New England Journal of Medicine (NEJM) publishes research led by Monash University Professor Eric Morand that offers the first real hope for the treatment of lupus, a disease that affects 1.5 million people in the US and more than 5 million globally, 90% of them women, and for which there is no cure.

The results come from an international, three-year Phase 3 trial of a potential new drug that treats this autoimmune disease, also known as systemic lupus erythematosus (SLE).

Lupus is an autoimmune disease in which the immune system attacks healthy parts of the body. It is a particularly insidious disease, with a ten-year mortality of 10%, "which if you are diagnosed in your early twenties is a terrible outcome," according to Professor Morand, who oversaw the global trial in over 360 people with SLE.

The trial, called TULIP 2, evaluated AstraZeneca's anifrolumab and achieved a statistically significant and clinically meaningful reduction in disease activity versus placebo, with both arms receiving standard of care.

Professor Morand has also been key in developing new assessment criteria for lupus, which - because the disease involves a number of organs in the body - can be difficult to both diagnose and monitor.

According to Professor Morand, only one new treatment for the disease has been approved in the last 60 years, and it is not available on the Pharmaceutical Benefits Scheme in Australia.

Between 60% and 80% of adults with SLE show increased expression of interferon-induced genes, reflecting overproduction of the immune protein type I interferon. While previous attempts to block this protein in lupus have failed, the potential new treatment, anifrolumab, works by blocking the interferon receptor found on all cells in the body, aiming to reverse the triggering of lupus symptoms.

Professor Morand said that interferon is associated with other autoimmune diseases such as scleroderma and Sjogren's disease, "so there may be potential for using anifrolumab in the treatment of other interferon-related diseases as well."

In the TULIP 2 trial, eligible patients received a fixed-dose intravenous infusion of anifrolumab or placebo every four weeks. TULIP 2 assessed the effect of anifrolumab in reducing disease activity, finding a significant effect on global disease activity measures.

The trial, which ran from 2015 to 2018, involved 362 patients receiving either 300 mg of the drug or a placebo intravenously once every four weeks for 48 weeks. Benefit was measured using a defined clinical assessment of improvement in all organs, as well as the number of flare-ups, in which the patient experiences fever, painful or swollen joints, fatigue, rashes, or sores or ulcers in the mouth or nose. The volunteers were aged between 18 and 70 and had moderate to severe disease despite standard treatments. Patients with SLE typically die of organ failure.

The study found that - 52 weeks after the trial started - significantly more patients on the drug than the placebo had:

A reduction in overall disease activity in all active organs

An improvement in lupus skin disease

A reduction in steroid drug doses

A reduced annual rate of flares

The TULIP 2 trial followed on from the TULIP 1 trial which failed to meet its primary outcome. The second trial, published in the NEJM, used a different endpoint. "Measurement of treatment response in SLE has been very problematic and this represents a kind of second breakthrough of this trial," Professor Morand said.

AstraZeneca will now work with regulators to bring anifrolumab, a potential new medicine, to patients. The study was done in collaboration with colleagues in Japan, the UK, the US, France and South Korea.

Credit: 
Monash University

Concussions common among college students, more prevalent off the field than on

Concussions are more than twice as prevalent among college students as previously believed, and significantly more likely to occur off the playing field than on, according to a three-year study published Dec. 18 in the journal JAMA Network Open.

The research, which looked at student health data from the University of Colorado Boulder, also found that incidence is slightly higher among females, and more concussions occur in August than any other month.

"This study shows how common head injuries are among this population and that concussions are not restricted to the athletic field," said Dr. John Breck, study co-author and lead physician for CU Boulder Medical Services. "Student health centers around the country should be training their staff in concussion recognition and putting systems in place to help concussed students get the evaluation and treatment they need."

The research, among the first to assess rates among the general college-age population, tracked concussion diagnoses during the academic year at the Wardenburg Student Health Center from August 2015 to May 2018 and through the CU Sports Medicine department, which treats varsity athletes, from 2016 to 2018.

In all, among roughly 30,000 public university undergraduates, about 340 concussions are diagnosed annually - an incidence rate of about one in 75 students per year, the study found.

Notably, 41% of students diagnosed said they had already had between one and three concussions; 5% reported four or more.

Across all years, whether varsity athletes were included or not, non-sport-related concussions outnumbered sport-related concussions.

Among the general undergraduate population, excluding varsity athletes, 64% of concussions were non-sport-related, while the remainder were sustained during organized competitive sports, such as club sports. Falls, such as slips on the ice or crashes on skateboards, accounted for 38% of concussions. Hits to the head, such as those sustained in a fight or accident, constituted 8.5%. Meanwhile, 6.5% resulted from motor vehicle accidents.

When varsity athletes were included, sport-related concussion incidence was 51 per 10,000 students per year and non-sport-related concussion incidence was 81 per 10,000 students per year. Overall concussion incidence was 132 per 10,000 students per year.
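These per-10,000 rates and the "one in 75" figure above are two framings of the same arithmetic. A minimal Python sketch, using the rounded counts quoted in this article (the study's exact denominators may differ slightly):

    # Back-of-the-envelope check of the concussion figures quoted above.
    # Both inputs are the rounded numbers from this article, not the
    # study's exact counts.
    diagnoses_per_year = 340   # "about 340 concussions ... annually"
    students = 30_000          # "roughly 30,000 public university undergraduates"

    rate = diagnoses_per_year / students
    print(f"{rate * 10_000:.0f} per 10,000 students per year")  # ~113
    print(f"about one in {round(1 / rate)} students per year")  # ~1 in 88

    # The study's combined incidence of 132 per 10,000 (varsity athletes
    # included) matches the "one in 75" framing used above:
    print(f"one in {round(10_000 / 132)}")  # one in 76

The small gap between the two framings simply reflects the rounding of the enrollment and diagnosis counts quoted in the article.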

"There is a widely held perception that most concussions are sport-related. Our study shows it can happen to anyone, male or female, engaged in a variety of activities," said co-author Matt McQueen, an integrative physiology professor.

The study also found that, across all three years, concussion incidence soared in August.

"These data do not tell us why August had such high numbers, but anecdotally we know that August is a time of lower academic demand and higher risk-taking behavior," said Breck.

Among varsity athletes, females had a higher rate of concussion, with 54 females and 26 males sustaining concussions across two academic years. This finding is consistent with another recent study which found that concussions among females increased six-fold from 2003-2013, while the increase among males was 3.6-fold.

While it's uncertain exactly why females appear to be more susceptible to concussions, hormonal differences and differences in neck strength and head mass may play a role, said Breck.

Prior research by the Centers for Disease Control and Prevention found concussion incidence rates among 9- to 22-year-olds to be around 98 per 10,000 people per year. The World Health Organization has pegged rates for the general population at around 60 concussions per 10,000. The new study, which looked at only nine months, found a rate more than twice that.

"Our findings suggest that collegiate students, including the general population and varsity athletes, may be at an increased risk of concussion," the authors concluded.

They noted that previous studies either relied on survey self-reports or emergency room visits, or focused on varsity players, possibly resulting in underestimates. Greater awareness could also be encouraging more students to seek care - and that's a good thing, the authors say.

"Missing class and falling behind due to a head injury can be a significant detriment to a student's academic success," said Breck. "It's critical that they get high quality, evidence-based care as soon as possible so they can return to learning in a safe way with as little disruption in their education as possible."

Credit: 
University of Colorado at Boulder

Association of household income with risk of first psychiatric hospitalization in Finland

What The Study Did: National registry data for 6.2 million people in Finland from 1996 to 2014 were used to examine how household income was associated with risk for a first admission to a psychiatric hospital for treatment of a mental disorder.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

Authors: Sami Pirkola, M.D., of the University of Tampere in Tampere, Finland, is the corresponding author.

(doi:10.1001/jamapsychiatry.2019.3647)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Nearly 9 million injured worldwide by fire, heat, and hot substances in 2017

Eight countries, including the US, accounted for half of all heat-related deaths in 2017

'Prevention should be the first priority in reducing intolerable number of injuries and deaths'

Heat-related injuries disproportionately kill the very young and very old

SEATTLE - Heat-related incidents resulted in nearly 9 million injuries and more than 120,000 deaths worldwide in 2017, according to a new scientific study.

Fires, heat, and hot substances, such as cooking oil or a hot stove, disproportionately kill young children and the elderly.

"Prevention should be the first priority in reducing the intolerable number of injuries and deaths," said Dr. Spencer James, senior author on the study and Lead Research Scientist at the Institute for Health Metrics and Evaluation at the University of Washington School of Medicine. "Especially as treatment for burns and related injuries remains relatively expensive and requires robust health care services not often available in low- and middle-income countries."

James and coauthors found the risk of dying from a fire, heat, or hot substance is greatest in Seychelles, where one in 15 injuries results in death. Laos follows closely in second at one in 17. Conversely, Singapore has the lowest risk among all nations at one in 1,000.
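These odds are simply the case-fatality percentages from the rankings at the end of this article, inverted. A minimal Python sketch of the conversion:

    # Convert case-fatality percentages (from the rankings below) into
    # the "one in N" odds quoted in this paragraph.
    case_fatality_pct = {
        "Seychelles": 6.5,
        "Laos": 5.8,
        "Singapore": 0.1,
    }
    for country, pct in case_fatality_pct.items():
        print(f"{country}: one injury in {round(100 / pct)} results in death")
    # Seychelles: one injury in 15 results in death
    # Laos: one injury in 17 results in death
    # Singapore: one injury in 1000 results in death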

The age-adjusted incidence rate in China has increased by 46% since 1990, with new cases increasing from 935,000 to 1.3 million in 2017. In contrast, the US was among the countries with the largest decreases in the age-adjusted incidence rate, falling by 40% over the study period. The authors suggest this progress could be associated with various factors, including use of smoke detectors, building standards, and safety awareness. However, the US ranked fourth highest globally in overall heat-related deaths, with more than 5,500 in 2017.

Published today in the BMJ journal Injury Prevention, the study is part of the annual Global Burden of Disease (GBD) study. It is the first of its kind to quantify the totality of injuries resulting from fire, heat, and hot substances, not just burns. Other injuries analyzed by IHME researchers include amputations, open wounds, muscle and tendon injuries, and fractures.

The analysis provides comparable estimates of mortality and morbidity across 195 countries and territories. Examples of incidents classified as "fire, heat, and hot substances" include:

fires;

explosive or inflammable material accidents;

ignited clothing;

smoke exposure;

contact with caustic or corrosive material;

steam or other hot vapors;

hot drinks, food, fats, or cooking oils;

stoves, ovens, and other household appliances;

heaters, radiators, or other hot pipes;

hot engines and metals, etc.

It should be noted that for the purposes of this analysis "fire, heat, and hot substance" does not include injuries directly due to heat waves, climate change, interpersonal violence (e.g., acid attack), or self-immolation.

In an encouraging global trend, IHME researchers saw an overall decrease in heat-related injury burden between 1990 and 2017. For example, the global death and disability rates dropped by 47% and 24%, respectively, likely due to safety improvements, fire danger awareness, and increased access to quality health care.

Yet heat-related injuries continue to present the greatest burden in low- and middle-income countries. Lower-income areas are both more susceptible to fire, heat, and hot substances as a cause of injury, and also experience higher death rates. Despite some of the largest decreases in the age-adjusted rate of new cases since 1990, Laos, Indonesia, and Malaysia remain among the top countries with the highest risk of death given a heat-related injury in 2017.

Eight countries, seven of which are low- or middle-income, accounted for half of all heat-related deaths in 2017. They are India (27,027 deaths), China (10,836), Russia (7,063), the United States (5,505), Nigeria (4,085), Pakistan (2,603), Democratic Republic of the Congo (2,093), and Ethiopia (2,013).

"It is imperative that health policymakers study these patterns to help inform safety efforts, prevention programs, and resource planning," said James. "But more research is still needed on smoking, types of cooking fuel, smoke alarm efficacy, synthetic clothing, and other factors leading to these injuries."

Additional findings include:

Worldwide, there were 8,991,468 new injuries from fire, heat, and hot substances in 2017, of which 120,632 resulted in death.

Children under 5 and adults 60 and older have the highest mortality rate from fire, heat, and hot substances, but the highest rate of injury occurs between ages 5 and 30.

The leading cause of disability for victims of fire, heat, and hot substances was - by far - burns affecting less than 20% of the body.

A relatively small proportion of all disability can be attributed to burns affecting more than 20% of the body.

PERCENTAGE CHANGE IN INCIDENCE RATES (AGE-STANDARDIZED), 1990 to 2017

Greatest increase

1. Bermuda: 49.8%
2. Kiribati: 49.5%
3. Palestine: 46.5%
4. China: 46.0%
5. Turkey: 38.0%
6. Oman: 35.5%
7. Bahrain: 33.4%
8. Puerto Rico: 33.3%
9. Vietnam: 33.1%
10. Jamaica: 31.8%

Greatest decrease

1. Mauritius: -55.3%
2. Maldives: -50.9%
3. Greenland: -48.6%
4. Laos: -46.3%
5. Indonesia: -44.8%
6. United States: -40.4%
7. Timor-Leste: -38.9%
8. Cambodia: -36.6%
9. Brazil: -34.2%
10. Estonia: -33.7%

DEATH RATES (ALL AGES), 195 COUNTRIES AND TERRITORIES, 2017

Highest death rates

1. Belarus: 6.6 deaths per 100,000 people
2. Latvia: 6.2
3. Lesotho: 5.6
4. Russian Federation: 4.8
5. Georgia: 4.8
6. Ukraine: 4.3
7. Estonia: 4.3
8. South Sudan: 4.3
9. Swaziland: 4.3
10. Greenland: 4.2

Lowest death rates

1. Singapore: 0.23 deaths per 100,000 people
2. Bermuda: 0.37
3. Oman: 0.40
4. Nicaragua: 0.43
5. Colombia: 0.43
6. Switzerland: 0.48
7. Honduras: 0.48
8. São Tomé and Príncipe: 0.49
9. New Zealand: 0.51
10. Turkey: 0.52

DEATH RATES (ALL AGES), US STATES, 2017

Highest death rates

1. Alabama: 3.7 deaths per 100,000 people
2. Mississippi: 3.5
3. South Carolina: 3.2
4. Kentucky: 3.1
5. Arkansas: 2.9
6. Tennessee: 2.9
7. West Virginia: 2.8
8. Louisiana: 2.7
9. Maine: 2.7
10. Oklahoma: 2.6

Lowest death rates

1. California: 0.7 deaths per 100,000 people
2. Hawaii: 1.0
3. Arizona: 1.0
4. Nevada: 1.1
5. Florida: 1.1
6. Colorado: 1.1
7. Utah: 1.1
8. Washington: 1.2
9. Wisconsin: 1.2
10. Idaho: 1.3

RISK OF DEATH GIVEN A HEAT-RELATED INJURY, 2017

Highest risk of death given a fire, heat, and hot substance injury

1. Seychelles: 6.5% of heat-related injuries resulted in death
2. Laos: 5.8%
3. Mauritius: 5.0%
4. Sri Lanka: 5.0%
5. Cambodia: 4.8%
6. Timor-Leste: 4.4%
7. Philippines: 4.1%
8. Indonesia: 4.0%
9. Malaysia: 3.6%
10. Thailand: 3.5%

Lowest risk of death given a fire, heat, and hot substance injury

1. Singapore: 0.1% of heat-related injuries resulted in death
2. New Zealand: 0.2%
3. Oman: 0.2%
4. Palestine: 0.3%
5. Slovenia: 0.3%
6. Turkey: 0.3%
7. Bermuda: 0.3%
8. Albania: 0.3%
9. Macedonia: 0.4%
10. Australia: 0.4%

Credit: 
Institute for Health Metrics and Evaluation

Together you're less alone

image: This young coppery titi monkey (Plecturocebus cupreus) benefits from paternal care. Among pair-living titi monkeys, the males primarily take care of the offspring.

Image: 
Photo: Sofya Dolotovskaya

Alone, as a pair or in groups - the diversity in social systems of primates is interesting because it may also provide insights into human social life. An evolutionary biologist from the German Primate Center - Leibniz Institute for Primate Research, together with a colleague from the University of Texas at San Antonio, investigated how different primate societies evolved and which factors may be responsible for transitions among them. The reconstructions showed that the evolution from a solitary way of life to group living usually occurred via pair living. Pair living thus served as a stepping stone for group living and therefore plays a key role in the evolution of social systems (Science Advances).

In the course of evolution, species have had to adapt to changing environmental conditions. A crucial adaptation in this process is the modification of social behavior. About half of all primate species live in groups, around one third in pairs, and the rest live solitarily. Why these different forms of social complexity evolved, how many transitions among them occurred, and which factors led to those transitions was analyzed on the basis of genetic data and behavioral observations of 362 primate species.

Pair living, the association of one male and one female, poses a puzzle in the evolution of mammalian social systems, as males could achieve higher reproductive success if they did not bond to a single female. "Evolutionary biologists have been struggling for a long time to identify selective advantages of pair living for males," says Peter Kappeler, first author of the study and head of the Behavioral Ecology and Sociobiology Unit at the German Primate Center. At first glance, the two current hypotheses on the development of pair living, the female distribution hypothesis and the paternal care hypothesis, seem to be mutually exclusive. "In fact, our results indicate that the two factors may be complementary," says Kappeler. "Initially, a presumed ecological change in the habitat led to female spatial separation, and solitary males, which previously had several females living in their territory, were subsequently only able to gain access to one female. Paternal care resulting from the pair formation in turn increased the survival probability of the offspring and thus reinforced pair living".

The further transition to group living was possible through an improvement of the ecological situation, which allowed related females to live in close proximity. These could then be joined by one or more males. "However, the pair bond typical for humans within larger social units cannot be explained with our results, since none of our recent ancestors lived solitarily. Nevertheless, the advantages of paternal care also may have led to a consolidation of pair living in humans," says Peter Kappeler.

Credit: 
Deutsches Primatenzentrum (DPZ)/German Primate Center

Email users should have 'more control' over post-mortem message transmission

image: Aston Business School research calls for email users to have more of a say over the way their messages are handled when they die.

Image: 
geralt

Email users should have far more control over the transmission of their messages upon death, a new study suggests.

Currently, Google and Microsoft, the main email providers, have contractual provisions in place which allow them to regulate the post-mortem transmission of emails regardless of copyright and succession laws.

But the study, led by Dr Edina Harbinja, Senior Lecturer in Media Privacy and Law at Aston Law School in Birmingham, suggests these provisions don't offer users meaningful control over their assets, and that the law fails to recognise this.

The paper, published in Death Studies, argues the owner of an email account should have the right to privacy after death, with the default communication of messages to third parties prohibited without the deceased's consent. This would mean amending legislation and policy to give users the choice of what happens to their emails after they have died.

In practice, for example, the law would be reformed to allow a personal representative to be given the authority to determine whether the deceased's emails should be deleted, or valuable unpublished works which only exist in an email account should pass on as copyright to heirs.

Dr Edina Harbinja said: "I have argued for many years that the UK law surrounding digital assets post-mortem should be overhauled.

"Not only is the legislation unclear, but it fails to take into account new developments such as encrypted communications.

"A reform of the law would mean that such confidential data would only be made available in line with the deceased's wishes."

Credit: 
Aston University

A self-healing sweat sensor (video)

image: A newly developed headband can measure electrolyte levels in sweat and can heal itself when cut or scratched during exercise.

Image: 
American Chemical Society

Wearable sensors that track heart rate or steps are popular fitness products. But in the future, working up a good sweat could provide useful information about a person's health. Now, researchers reporting in ACS Applied Materials & Interfaces have developed a headband that measures electrolyte levels in sweat. And unlike many previous sweat sensors, the device can heal itself when cut or scratched during exercise. Watch a video of the sensor in action here.

Human sweat contains biochemical markers, such as metabolites, electrolytes and heavy metals, that can indicate a person's health and even help diagnose some diseases. In recent years, scientists have developed sweat sensors in the form of patches, bandages and tattoos, but their performance can be impaired by natural movements such as walking, running, jumping or throwing. Also, if the sensors become scratched or broken, which can easily happen during exercise, they often cannot be repaired. Sung Yeon Hwang, Jeyoung Park, Bong Gill Choi and colleagues wanted to develop a wearable sweat sensor that could withstand vigorous exercise and quickly repair itself if damaged.

To make their self-healing sensor, the researchers coated carbon fiber thread electrodes with a citric acid-based polymer. When cut, the threads quickly rejoined through hydrogen bonding of the polymer. They sewed the threads, which could detect potassium and sodium ions, into a headband and added a wireless electronic circuit board that could transfer data to a smartphone. A human volunteer wore the headband while exercising on a stationary bike, and the sensor accurately tracked the electrolyte concentrations in their sweat over 50 minutes of exercise. During cycling, the researchers cut the sensor threads with scissors, and the threads healed and returned to normal operation in only 20 seconds.

Credit: 
American Chemical Society

Study suggests early-life exposure to dogs may lessen risk of developing schizophrenia

Ever since humans domesticated the dog, the faithful, obedient and protective animal has provided its owner with companionship and emotional well-being. Now, a study from Johns Hopkins Medicine suggests that being around "man's best friend" from an early age may have a health benefit as well -- lessening the chance of developing schizophrenia as an adult.

And while Fido may help prevent that condition, the jury is still out on whether or not there's any link, positive or negative, between being raised with Fluffy the cat and later developing either schizophrenia or bipolar disorder.

"Serious psychiatric disorders have been associated with alterations in the immune system linked to environmental exposures in early life, and since household pets are often among the first things with which children have close contact, it was logical for us to explore the possibilities of a connection between the two," says Robert Yolken, M.D., chair of the Stanley Division of Pediatric Neurovirology and professor of neurovirology in pediatrics at the Johns Hopkins Children's Center, and lead author of a research paper recently posted online in the journal PLOS One.

In the study, Yolken and colleagues at Sheppard Pratt Health System in Baltimore investigated the relationship between exposure to a household pet cat or dog during the first 12 years of life and a later diagnosis of schizophrenia or bipolar disorder. For schizophrenia, the researchers were surprised to see a statistically significant decrease in the risk of a person developing the disorder if exposed to a dog early in life. Across the entire age range studied, there was no significant link between dogs and bipolar disorder, or between cats and either psychiatric disorder.

The researchers caution that more studies are needed to confirm these findings, to search for the factors behind any strongly supported links, and to more precisely define the actual risks of developing psychiatric disorders from exposing infants and children under age 13 to pet cats and dogs.

According to the American Pet Products Association's most recent National Pet Owners Survey, there are 94 million pet cats and 90 million pet dogs in the United States. Previous studies have identified early life exposures to pet cats and dogs as environmental factors that may alter the immune system through various means, including allergic responses, contact with zoonotic (animal) bacteria and viruses, changes in a home's microbiome, and pet-induced stress reduction effects on human brain chemistry.

Some investigators, Yolken notes, suspect that this "immune modulation" may alter the risk of developing psychiatric disorders to which a person is genetically or otherwise predisposed.

In their current study, Yolken and colleagues looked at a population of 1,371 men and women between the ages of 18 and 65 that consisted of 396 people with schizophrenia, 381 with bipolar disorder and 594 controls. Information documented about each person included age, gender, race/ethnicity, place of birth and highest level of parental education (as a measure of socioeconomic status). Patients with schizophrenia and bipolar disorder were recruited from inpatient, day hospital and rehabilitation programs of Sheppard Pratt Health System. Control group members were recruited from the Baltimore area and were screened to rule out any current or past psychiatric disorders.

All study participants were asked if they had a household pet cat or dog or both during their first 12 years of life. Those who reported that a pet cat or dog was in their house when they were born were considered to be exposed to that animal since birth.

The relationship between the age of first household pet exposure and psychiatric diagnosis was defined using a statistical model that produces a hazard ratio -- a measure over time of how often specific events (in this case, exposure to a household pet and development of a psychiatric disorder) happen in a study group compared to their frequency in a control group. A hazard ratio of 1 suggests no difference between groups, while a ratio greater than 1 indicates an increased likelihood of developing schizophrenia or bipolar disorder. Likewise, a ratio less than 1 shows a decreased chance.

Analyses were conducted for four age ranges: birth to 3, 4 to 5, 6 to 8 and 9 to 12.

Surprisingly, Yolken says, the findings suggest that people who are exposed to a pet dog before their 13th birthday are significantly less likely -- by as much as 24% -- to be diagnosed later with schizophrenia.

"The largest apparent protective effect was found for children who had a household pet dog at birth or were first exposed after birth but before age 3," he says.

Yolken adds that if it is assumed that the hazard ratio is an accurate reflection of relative risk, then some 840,000 cases of schizophrenia (24% of the 3.5 million people diagnosed with the disorder in the United States) might be prevented by pet dog exposure or other factors associated with pet dog exposure.
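To make that arithmetic explicit, here is a minimal Python sketch. The 24% figure and the 3.5 million prevalence come from the article itself; the hazard ratio of roughly 0.76 is an assumption implied by the reported 24% reduction:

    # Attributable-cases arithmetic described above. The hazard ratio of
    # 0.76 is assumed from the reported "as much as 24%" risk reduction;
    # ratios below 1 indicate reduced risk, as explained earlier.
    hazard_ratio = 0.76
    risk_reduction = 1 - hazard_ratio     # 0.24

    us_schizophrenia_cases = 3_500_000    # US diagnoses, as cited above
    preventable = risk_reduction * us_schizophrenia_cases
    print(f"{preventable:,.0f}")          # 840,000, matching the figure above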

"There are several plausible explanations for this possible 'protective' effect from contact with dogs -- perhaps something in the canine microbiome that gets passed to humans and bolsters the immune system against or subdues a genetic predisposition to schizophrenia," Yolken says.

For bipolar disorder, the study results suggest there is no risk association, either positive or negative, with being around dogs as an infant or young child.

Overall for all ages examined, early exposure to pet cats was neutral as the study could not link felines with either an increased or decreased risk of developing schizophrenia or bipolar disorder.

"However, we did find a slightly increased risk of developing both disorders for those who were first in contact with cats between the ages of 9 and 12," Yolken says. "This indicates that the time of exposure may be critical to whether or not it alters the risk."

One example of a suspected pet-borne trigger for schizophrenia is the disease toxoplasmosis, a condition in which cats are the primary hosts of a parasite transmitted to humans via the animals' feces. Pregnant women have been advised for years not to change cat litter boxes to eliminate the risk of the illness passing through the placenta to their fetuses and causing a miscarriage, stillbirth, or potentially, psychiatric disorders in a child born with the infection.

In a 2003 review paper, Yolken and colleague E. Fuller Torrey, M.D., associate director of research at the Stanley Medical Research Institute in Bethesda, Maryland, provided evidence from multiple epidemiological studies conducted since 1953 that showed there also is a statistical connection between a person exposed to the parasite that causes toxoplasmosis and an increased risk of developing schizophrenia. The researchers found that a large number of people in those studies who were diagnosed with serious psychiatric disorders, including schizophrenia, also had high levels of antibodies to the toxoplasmosis parasite.

Because of this finding and others like it, most research has focused on investigating a potential link between early exposure to cats and psychiatric disorder development. Yolken says the most recent study is among the first to consider contact with dogs as well.

"A better understanding of the mechanisms underlying the associations between pet exposure and psychiatric disorders would allow us to develop appropriate prevention and treatment strategies," Yolken says.

Credit: 
Johns Hopkins Medicine

Tel Aviv University study finds widespread misinterpretation of gene expression data

Reproducibility of research data is a major challenge in experimental biology. As data generated by genomic-scale techniques increases in complexity, this problem is becoming more and more worrisome.

RNA-seq, one of the most widely used methods in modern molecular biology, allows the simultaneous measurement, in a single test, of the expression levels of all genes in a given sample. New research by a Tel Aviv University group identifies a frequent technical bias in data generated by RNA-seq technology, which often leads to false results.

The study was conducted by Dr. Shir Mandelbaum, Dr. Zohar Manber, Dr. Orna Elroy-Stein and Dr. Ran Elkon at TAU's Sackler Faculty of Medicine and George S. Wise Faculty of Life Sciences and was published on November 12 in PLOS Biology.

"Recent years have witnessed a growing alarm about false results in biological research, sometimes referred to as the reproducibility crisis," Dr. Elkon, lead author of the study, says. "This study emphasizes the importance of proper statistical handling of data to lessen the number of misleading findings."

A main goal of RNA-seq experiments is to characterize biological processes that are activated or repressed in response to different conditions. The researchers analyzed dozens of publicly available RNA-seq datasets to profile the cellular responses to numerous stresses.

During the research, the scientists noticed that sets of particularly short or long genes repeatedly showed changes in expression level, as measured by the apparent number of RNA transcripts from a given gene. Puzzled by this recurring pattern, the team wondered whether it reflected some universal biological response common to different triggers or whether it stemmed from some experimental condition.

To tackle this question, they compared replicated samples from the same biological condition. Differences in gene expression between replicates can reflect technical effects that are not related to the experiment's biological factor of interest. Unexpectedly, the same pattern of particularly short or long genes showing changes in expression level was observed in these comparisons between replicates. This pattern is the result of a technical bias that seemed to be coupled with gene length, the researchers say.
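The logic of that replicate check can be sketched in a few lines of code. The snippet below is an illustration only, not the authors' published pipeline; the file name, column names, and the use of pandas/scipy are assumptions:

    import numpy as np
    import pandas as pd
    from scipy import stats

    # Hypothetical input: one row per gene, with normalized expression in
    # two replicates of the SAME biological condition, plus gene length.
    df = pd.read_csv("replicate_counts.csv")  # columns: gene, length, rep1, rep2

    # Log fold-change between replicates. With no biological difference
    # between the samples, this should be uncorrelated with any gene property.
    df["lfc"] = np.log2((df["rep1"] + 1) / (df["rep2"] + 1))

    # A significant correlation between this replicate-vs-replicate
    # fold-change and gene length is the signature of the technical bias.
    rho, pval = stats.spearmanr(df["length"], df["lfc"])
    print(f"Spearman rho = {rho:.2f}, p = {pval:.2g}")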

Importantly, the TAU researchers were able to show how the length bias they detected in many RNA-seq datasets led to the false identification of specific biological functions as cellular responses to the conditions tested.

"Such misinterpretation of the data could lead to completely misleading conclusions," Dr. Elkon concludes. "In practical terms, the study also shows how this bias can be removed from the data, thus filtering out false results while preserving the biologically relevant ones."

Credit: 
American Friends of Tel Aviv University

Watered down biodiversity: sample type is critical in environmental DNA studies for biomonitoring

image: Freshwater sampling in Wood Buffalo National Park, Alberta

Image: 
Photo by Daryl Halliwell

DNA-based biomonitoring relies on species-specific segments of organisms' DNA for taxonomic identification and is rapidly advancing for monitoring invertebrate communities across a variety of ecosystems. The analytical approaches taken vary from single-species detection to bulk environmental sample analysis, depending largely on the focus of data generation. However, for freshwater systems, there is often a lack of consideration as to the optimal sample type for maximising detection of macroinvertebrates.

The ecology, life stage and habitat preference (i.e. benthic or water column) of macroinvertebrates ultimately influence the rate of DNA detection, depending on the sampling approach taken. DNA-based biomonitoring of freshwater macroinvertebrates often focuses on detecting the bioindicator groups Ephemeroptera (mayflies), Plecoptera (stoneflies), Trichoptera (caddisflies) and Odonata (dragonflies & damselflies) to infer the health status of freshwater systems.

In their larval stage, these macroinvertebrates - commonly referred to as EPTO - occupy the benthos in rivers, lakes, ponds, and wetlands. Despite this, water samples have been proposed as a surrogate source of macroinvertebrate DNA, without a clear understanding of whether water provides sufficient taxonomic recovery of macroinvertebrates, particularly of EPTO groups.

To address this, a recent collaboration between the Hajibabaei Lab (Centre for Biodiversity Genomics, University of Guelph) and scientists at Environment and Climate Change Canada resulted in a paper in PLOS ONE investigating the recovery of macroinvertebrates, in particular EPTO, in shallow open-water wetlands by comparing matched water and bulk-tissue DNA samples.

Overall, they found that very few taxa were shared between bulk-benthos and water samples, with a much greater richness of macroinvertebrate taxa recovered from benthos. EPTO groups in particular were associated strongly with bulk-benthos samples, with limited EPTO families detected in all matched water samples.

"This pristine, wetland study system is excellent for comparing the relative detection of these taxa without the influence of water flow," said lead author Prof. Mehrdad Hajibabaei. The study illustrates how sample choice is a critical factor for a comprehensive assessment of total macroinvertebrate biodiversity. "This research is vitally important for informing large-scale projects such as STREAM, where a high volume of benthic macroinvertebrate data is now being generated using a standardised DNA-based methodology."

"Species detectability is an important consideration when designing biomonitoring programs," said Dr. Donald Baird, co-author and Research Scientist with Environment and Climate Change Canada's Water Science and Technology Directorate. "Our study shows clearly that to access macroinvertebrates DNA water samples are no substitute for bulk organism collection, as the majority of critical indicator taxa are simply not detected when we know they are present."

Eliminating false negatives and positives is crucial for creating high quality baseline data for determining the health status of Canadian watersheds. "There is a need for consistency of biomonitoring data when assessing total biodiversity, and bulk-benthos samples provide sufficient taxonomic coverage that is both cost-effective and efficient," said Dr. Chloe Robinson, co-author and project manager for STREAM.

This study emphasises the critical nature of choosing representative sampling methods to maximise DNA capture from target organisms whilst avoiding diluting the diversity, to enable informed decisions regarding freshwater health.

Credit: 
Centre for Biodiversity Genomics, University of Guelph

Switching cereals in India for improved nutrition, sustainability

image: UD assistant professor Kyle Davis sits near a millet field in the Himalayan foothills. Davis led a study that shows how India can improve nutrition, climate resilience, and the environment by diversifying its crop production.

Image: 
Photo courtesy of Kyle Davis

When the Green Revolution came to India, it brought with it an emphasis on high-yielding varieties of rice and wheat, which allowed India to triple its cereal production over the past 50 years. As a result, rice contributes almost half of the country's cereal production, and cereals continue to make up much of the calorie consumption in India's urban and rural households.

But that success has led to two new problems: rice does not offer the nutritional benefits of some other cereals, such as sorghum and millets, and at the same time, it is grown in areas that are not necessarily suited to rice production, which can have adverse environmental impacts.

A new study from the University of Delaware published in the Proceedings of the National Academy of Sciences shows that India can sustainably enhance its food supply and improve its environmental footprint by reducing its reliance on rice and planting more nutritious and less environmentally damaging crops such as sorghum, finger millet and pearl millet.

The study was led by Kyle Davis, assistant professor in the College of Earth, Ocean and Environment's Department of Geography and Spatial Sciences and the College of Agriculture and Natural Resources' Department of Plant and Soil Sciences. Davis explained that while the reliance on rice during the Green Revolution succeeded in feeding a large population, it also pushed out a lot of traditional cereals that are still consumed in India but to a lesser extent.

"We've found that those traditional cereals have a higher nutritional quality and also tend to use less water, require less energy to be grown, and emit fewer greenhouse gases on a per kilogram basis," said Davis.

Because rice is flood irrigated, it requires a lot of water, which is a burden in a country like India that is experiencing widespread depletion of groundwater resources.

In addition, the standing water in rice fields creates anaerobic conditions that cause methane, a potent greenhouse gas, to be emitted to the atmosphere. Since the other cereals are not flood irrigated, their production does not generate methane emissions.

"These traditional cereals also tend to be less sensitive to variability in temperature and precipitation so they're more resilient to climate variability," said Davis. "There are also many places where the yields of these cereals are comparable to or higher than rice. For all of those reasons, we wanted to look at whether there were opportunities to replace some rice production with some of these traditional cereals without reducing food supply in the country."

Sorghum and millets were consumed more widely in India a generation ago, and the government in India is interested in promoting the production and consumption of these different crops, even going so far as to declare 2018 the National Year of Millets. Davis said that this study can help the government in deciding which regions would benefit the most from promotion of these cereals.

"Our study provides a lot of value because we're able to pinpoint which districts in which states could see the largest improvements," said Davis. "If the government had to prioritize a few states, they could point to our results and say for example 'Ok, these are the places where our largest water savings are going to happen so we should focus here.' "

The next steps in implementing a more widespread planting of sorghum and millets would be to quantify the willingness of local populations to increase the amount of these different cereals in their diets.

The government would also have to make economic considerations to protect the livelihoods of farmers, as asking a farmer to switch to a different crop might mean that they have different fertilizer requirements or would have to buy more seed. Davis said that there are multiple government subsidy programs in India that help support farmers, but those would have to be modified to make sure that they accommodate the changes.

Finally, Davis said that while they only looked at decreasing some of the rice area and increasing some of the area allocated to these other cereals, it's also possible that India might look at areas that are currently used to produce cotton or sugarcane, water-intensive crops that don't contribute to nutrition, and replace them with sorghum and millets.

All of this could have positive environmental and nutritional benefits and Davis said that he was happy to lead a study that shows the positive impact agriculture can have on the planet.

"This was an India-focused study, but it makes a broader statement about sustainable agriculture and framing agriculture as a solution to multiple global challenges like malnutrition, water scarcity, and greenhouse gas," said Davis. "You often see agriculture presented as causing environmental problems, when in fact agriculture is the solutions to many challenges. Our study shows there are opportunities to realize a number of different benefits through more thoughtful agricultural practices, and it shows that a single intervention can change multiple outcomes for the better."

Credit: 
University of Delaware

Solving the puzzle of IgG4-related disease, the elusive autoimmune disorder

image: Scientists piece together the inflammation mechanism in IgG4-related disease revealing possible therapeutic targets.

Image: 
Tokyo University of Science

Autoimmune diseases are a medical conundrum. In people with these conditions, the immune system - the body's designated defense system - starts attacking the body's own cells or organs, mistaking them for invading disease-causing cells. Often, the cause of this spontaneous dysfunction is not clear, and hence, treatment of these diseases presents a major and ongoing challenge.

One recently discovered autoimmune disease is IgG4-related disease (IgG4-RD), which involves the infiltration of plasma cells specific to the immunoglobulin (antibody) IgG4 into body tissue, resulting in irreversible tissue damage in multiple organs. In most patients with IgG4-RD, blood levels of IgG4 also tend to be higher than those in healthy individuals. Previous studies show that T cells - white blood cells charged with duties of the immune response - play a key role in the disease mechanism. In particular, special T cells called cytotoxic T lymphocytes, or CTLs, were found in abundance in the inflamed or affected pancreas of patients, along with IgG4. But what was the exact role of CTLs?

In a new study published in International Immunology, a team of scientists from Tokyo University of Science decided to find the answer to this question. Prof. Masato Kubo, a member of this team, states that their aim was twofold. "We planned to explore how IgG4 Abs contributes to the CTL-mediated pancreas tissue damage in IgG4-RD, and also to evaluate the pathogenic function of human IgG4 Abs using the mouse model that we have established." The latter is especially important, as IgG4 is not naturally present in mice, meaning that there is a severe lack of adequate animal models to explore this disease.

With these aims, they selected mice that have been genetically programmed to express a protein called ovalbumin (the major protein in egg white) in their pancreas. Then, they injected IgG4 that specifically targets ovalbumin into the mice. Their assumption was that IgG4 would target the pancreas and bring about IgG4-RD-like symptoms. However, what they found was surprising. No inflammation or any other symptom typical of IgG4-RD appeared. This convinced the researchers that IgG4 alone was not the causative factor of IgG4-RD.

Next, to check if it was the CTLs that were perhaps the villain of the story, the scientists injected both IgG4 specific against ovalbumin as well as CTLs. Now, the pancreas of the mice showed tissue damage and inflammation. Thus, it was established that the presence of CTLs and IgG4 was necessary for pancreatic inflammation.

When they probed further, they found that another variation of T cells, known as T follicular helper or "TFH cells," which develop from the natural T cells of the mice, produce self-reactive antibodies like IgG4, which induce inflammation in combination with CTLs.

Once the puzzle was pieced together, the scientists now had the opportunity to zero in on the target step for intervention; after all, if one of these steps is disrupted, the inflammation can be prevented. After much deliberation, they propose that Janus kinase, or JAK, can be a suitable target. JAK is a key component of the JAK-STAT cellular signaling pathway, and this pathway is an integral step in the conversion of natural T cells of the mice to TFH cells. If this JAK is inhibited, this conversion will not take place, meaning that even the presence of CTLs will not be able to induce inflammation.

Prof. Kubo also suggests a broader outlook, not limited to the therapeutic option explored in the study. He states, "based on our findings, the therapeutic targets for IgG4-related diseases can be the reduction of TFH cell responses and the auto-antigen specific CTL responses. These can also provide the fundamental basis for developing new therapeutic applications."

These proposed therapeutic targets need further exploration, but once developed, they have the potential to improve the lives of millions of patients with IgG4-RD worldwide.

Credit: 
Tokyo University of Science

In global south, urban sanitation crisis harms health, economy

ITHACA, N.Y. - Cities in the "global south" - densely populated urban areas that are part of low-income countries in Asia, Africa and Latin America - should phase out pit latrines, septic tanks and other on-site methods of human waste management.

Instead, cities should invest in sewage systems, according to a report released Dec. 18 from the World Resources Institute/Ross Center for Sustainable Cities. Building such systems - based on public sector capital investment, where water and energy are sufficient - will improve health, well-being and the economy for those cities.

Victoria Beard, Cornell professor of city and regional planning, associate dean of research for the College of Architecture, Art and Planning and a World Resources Institute fellow, was one of four authors of the new study. The researchers spent a year examining 15 cities in the global south, and found that 62% of sewage and fecal sludge is unsafely managed.

"Every large, densely populated city would benefit from a well-funded, well-regulated, off-site sanitation system, and in many cases this means a working sewer system," said Beard, a fellow at the Cornell Atkinson Center for Sustainability.

"Even where on-site sanitation systems (pit latrines, septic tanks) are used, you need systems of public regulation and treatment plants that function," Beard said, explaining that on-site sanitation systems are more complicated, in part because they have to be constructed and maintained by a household.

Environmental special interest groups argue against building urban sewers in the global south because of the water and energy these systems require. But this new report argues that, by reusing water and creating energy from waste when possible, off-site sanitation systems are preferable to on-site sanitation systems in densely populated urban areas, where the costs of the latter are largely borne by households.

The report found 10 out of 15 cities had fecal sludge management regulations, and nine reported that these were enforced. Five cities did not regulate fecal sludge.

"Untreated and Unsafe: Solving the Urban Sanitation Crisis in the Global South," explains how previous widely used global indicators underestimated the urban sanitation crisis.

Around the world, the number of urban residents who lack well-managed sanitation increased more than 20%, from 1.9 billion in 2000 to 2.3 billion in 2015, according to the World Health Organization. Add to that a burgeoning world urban population, expected to reach nearly 6 billion by 2030.

Inadequate urban sanitation impedes health, economic growth and productivity, and imposes costs on poor households, according to the authors. It is estimated that unsafe sanitation costs are $223 billion annually, as a result of disease and lost productivity. In fact, the WHO has said that for every dollar invested in improving sanitation in the global south, an economy can return $6 to $9.

The researchers encountered unsafe sanitary conditions in most cities that relied on on-site means and manual laborers who emptied pit latrines.

"We know from our fieldwork," Beard said, "that human waste collected on-site often ends up where it should not be - dumped into streams or farms or dumped on the side of the road, and this creates a huge public health risk."

Beard said on-site sanitation systems are needed for the short or medium term, but "densely populated cities need a well-financed, publicly regulated sanitation system - this is a public good."

From the household perspective, off-site sewage systems cost less than the construction and maintenance of pit latrines or private septic systems - two of the most common forms of on-site sanitation management in contexts where sewer connections are not available or affordable.

As an example, Cochabamba, Bolivia, is a struggling municipality where 27% of households live in low-income neighborhoods. Installing an on-site septic tank there costs 120% of average monthly household income; a one-time emptying costs 120% of average monthly household income as well.

In the Makoko settlement in Lagos, Nigeria, no sewage system exists, but the construction of a pit latrine - in an urban setting - costs 611% of average monthly household income, and latrine emptying is 23% of average monthly household income.

"Upfront, sewers are an expensive investment, but these systems last upward of 50 years," she said. "In many cities, connection to an off-site sanitation system is much more affordable from the perspective of low-income households."

Credit: 
Cornell University

Perpetual predator-prey population cycles

image: This rotifer species is common in freshwater lakes and ponds around the world and was used in the 10-year-long experiment on predator-prey relationships conducted at the University of Potsdam in Germany.

Image: 
Guntram Weithoff, University of Potsdam

How can predators coexist with their prey over long periods without completely depleting the resource that keeps them alive? Experiments performed over a period of 10 years by researchers from McGill University and the Universities of Oldenburg and Potsdam have now confirmed that regular oscillations in predator-prey populations can persist over very long periods.

"Because predators eat their prey there is always a danger that they perish after killing off the resource that kept them alive," says Gregor Fussmann, professor in McGill's Department of Biology and co-senior author of the new study published in Nature. "Yet, if predators are a bit less efficient, prey populations may be able to recover while predator numbers dwindle. This process can lead to potentially endless predator-prey cycles."

The researchers used a microbial predator and prey system to try and understand if these predator-prey population cycles occur naturally through the interaction of the two species or if they are the result of external drivers.

Balanced cycles of predator-prey populations

Predator-prey cycles are based on a feeding relationship between two species: if the prey species rapidly multiplies, the number of predators increases - until the predators eventually eat so many prey that the prey population dwindles again. Soon afterwards, predator numbers likewise decrease due to starvation. This in turn leads to a rapid increase in the prey population - and a new cycle begins.
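This feedback loop is captured by the classic Lotka-Volterra equations. The sketch below (Python, with arbitrary illustrative rate constants; it is not the model fitted in the Nature study) reproduces the self-sustaining oscillation:

    # Minimal Lotka-Volterra sketch of the predator-prey feedback loop.
    # prey:     dx/dt = a*x - b*x*y     (growth minus predation)
    # predator: dy/dt = c*b*x*y - d*y   (gain from predation minus starvation)
    a, b, c, d = 1.0, 0.1, 0.5, 0.5     # arbitrary illustrative constants
    x, y = 10.0, 5.0                    # initial prey and predator densities
    dt = 0.001                          # Euler time step

    for step in range(60_001):
        if step % 5_000 == 0:
            print(f"t={step * dt:5.1f}  prey={x:6.2f}  predators={y:6.2f}")
        x, y = x + (a * x - b * x * y) * dt, y + (c * b * x * y - d * y) * dt
    # The two densities rise and fall out of phase and keep returning close
    # to their starting values: a self-generated, perpetual cycle.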

To confirm these dynamics, the researchers kept plankton in glass vessels under highly controlled conditions - constant temperature - and used rotifers, which feed on algal cells, as predators. The oscillations in rotifer and unicellular algae populations were measured across 50 cycles and more than 300 predator generations - a record for a study of this kind.

"Our experiments confirm the theoretical concept of self-generated predator-prey cycles," says Bernd Blasius, lead author of the new study and head of the Mathematical Modelling group at the University of Oldenburg's Institute for Chemistry and Biology of the Marine Environment. "We mainly observed regular oscillations in the predator and prey populations recurring at almost constant intervals. Unexpectedly, these regular oscillations were repeatedly interrupted by short, irregular periods without any discernible external influences and then independently returned to the original state."

Fussmann explained that their work helps researchers understand more complex systems.

"Our ability to understand the dynamic behaviour of natural ecosystems largely relies on the correctness of very simple theoretical assumptions - including those about how predator-prey cycles arise," Fussmann said. "Our work gives assurance that we are using the right building blocks when we attempt to predict what is happening in complex ecosystems."

The experimental work was performed at the University of Potsdam, where the research team had already measured how external factors can influence population cycles by periodically changing the nutrient supply of the algae. The researchers would now like to explore how other factors occurring in real ecosystems - such as changing temperature - affect predator-prey cycles.

Credit: 
McGill University

Different mutations in a single gene can wreak many types of havoc in brain cells

New York, NY (December 19, 2019) -- Mount Sinai researchers have found that different mutations in a single gene can have myriad effects on a person's health, suggesting that gene therapies may need to do more than just replenish the missing or dysfunctional protein the gene is supposed to encode, according to a study published in Nature Genetics in November.

"You have to fully understand the mutation to understand how to fix it," said Kristen Brennand, PhD, Associate Professor of Genetics and Genomic Sciences, Neuroscience, and Psychiatry at the Icahn School of Medicine at Mount Sinai, and together with Gang Fang, PhD, Associate Professor of Genetics and Genomic Sciences, one of the lead authors of the study. The two researchers "have been collaborating for seven years on multiple projects that combine our complementary expertise in biology and informatics," said Dr. Fang.

The collaboration originated from Dr. Brennand's interest in the function of the gene neurexin-1, or NRXN1, in psychiatric disorders and Dr. Fang's technology expertise in the use of sophisticated techniques for analyzing different forms of individual genes. Much of the work was led by Shijia Zhu, PhD, formerly a postdoctoral fellow in Dr. Fang's lab, and Erin Flaherty, PhD, a former graduate student in Dr. Brennand's lab.

Patients with schizophrenia, autism, and bipolar disorder sometimes carry mutations in NRXN1. Until now, NRXN1 "had largely been studied only in mice. And, from the mouse studies, we know there are over 300 splice isoforms," said Dr. Brennand. "That means that this one gene makes 300 different proteins in the mouse."

The team set out to understand how NRXN1 functions in typical human neurons, and how different mutations might impact cellular function.

Dr. Brennand and her team started with skin samples from several patients at The Mount Sinai Hospital who had mental health diagnoses and carried mutated forms of the gene. They used these samples, as well as samples from participants without these diagnoses, to culture human induced pluripotent stem cells (hiPSCs)--cells with the ability to grow into any cell in the body.

The cells were then induced to grow into neurons. In the cells that came from patients with mutations in NRXN1, the scientists noted differences in the shape and electrical activity of the neurons as well as the rates at which they matured.

But that wasn't all. All people have two copies of the gene. If there is a mutation, it is usually only in one of those copies. The normal, unmutated gene still produces the healthy protein, but the mutated copy is unable to produce any protein, meaning the individual produces less of the protein than is necessary for normal function. The researchers figured that introducing more of the healthy protein would rescue the neurons, but this wasn't always the case.

Some of the mutations cause the second copy of the gene to produce a separate, mutated version of the protein. The researchers found that these mutated proteins may interfere with the action of the healthy protein. The team found that even cells that could produce enough of the healthy protein that they should have functioned normally would suffer if they were also exposed to a mutant form of the protein--and different mutations led to different problems.

"Functionally, these mutant proteins seem to have a dominant negative effect," said Dr. Brennand. "Overexpression of a single mutant protein in healthy neurons is enough to cause them to fire irregularly."

The study was small, and the gene variants the team studied are rare. In the future it will be important to tease out exactly how the variants impact function: do developmental perturbations lead to later differences in activity or vice versa? But both Dr. Brennand and Dr. Fang emphasized that the overall message is crucial for anyone hoping to use genetics to personalize medicine.

"I went into this really naively, thinking that all patients with deletions in this gene would probably show the same effect," she said. "What we learned is that if you want to move towards precision medicine, it matters not just what genes are impacted, but how they're mutated as well."

Credit: 
The Mount Sinai Hospital / Mount Sinai School of Medicine