
Can chickpea genes save mustard seeds from blight disease?

image: Comparative microscopic analysis of the infection pattern of Alternaria brassicae on host plant (mustard) and non-host plant (chickpea).

Image: 
Urooj Fatima, Priyadarshini Bhorali, and Muthappa Senthil-Kumar

During visits to fields in Assam, Rajasthan and Uttar Pradesh, India, plant biologists Muthappa Senthil-Kumar and Urooj Fatima found mustard plants infected with Alternaria blight disease. They also noticed that an adjacent field of chickpeas was completely uninfected.

Alternaria blight, caused by the fungal pathogen Alternaria brassicae, devastates Brassica crops such as cabbage, cauliflower, broccoli, and mustard. Highly infectious, the fungus can attack the host plant at all stages of growth. Currently, Alternaria blight is managed with chemical fungicides, but efforts are now being made to use breeding and modern biotechnological approaches to develop blight-resistant crop varieties.

Non-host resistance (NHR) is the most durable form of resistance against fungal pathogens. While Alternaria typically penetrates a host plant through the epidermis or the stomata, it is unable to mount this attack on plants protected by NHR. To protect Brassica crops from the fungus, scientists are studying the mechanisms of NHR in order to develop improved crop varieties.

In a study published in MPMI, plant biologists in India detailed their research comparing the responses of a host plant (mustard) and a non-host plant (chickpea) to the fungus at the morpho-pathological level. They found that chickpea actively suppressed fungal penetration, development, and colonization even hours after infection.

They also studied chickpea transcripts to pinpoint several genes involved in the plant's pathogen defense.

"These genes are interesting candidates for additional study to determine their precise involvement in NHR," said Senthil-Kumar, who conducts research through the National Institute of Plant Genome Research. "These genes could then be transferred to mustard plants to develop blight-resistance crops."

For more information about this study, read "Morpho-Pathological and Global Transcriptomic Analysis Reveals the Robust Nonhost Resistance Responses in Chickpea Interaction with Alternaria brassicae" published in the December issue of Molecular Plant-Microbe Interactions (MPMI).

Credit: 
American Phytopathological Society

Who receives advanced stroke care? It may depend on traffic

image: Keck School of Medicine of USC, Los Angeles

Image: 
Ricardo Carrasco III

LOS ANGELES – When someone has an acute stroke, early access to specialized care is crucial. Whenever possible, experts recommend people receive medical help at a hospital with advanced stroke capability like a comprehensive stroke center (CSC).

A new study by the Keck School of Medicine of USC and the USC Schaeffer Center for Health Policy & Economics published in Academic Emergency Medicine analyzed how long it would take Los Angeles County emergency medical services to transport patients to CSCs, and found that traffic conditions affect consistent access, particularly in socioeconomically disadvantaged neighborhoods.

Los Angeles County has 16 CSCs spread across the county, which provide advanced stroke care often not available at other hospitals. Emergency medical services protocols in Los Angeles County specify that patients who are showing symptoms of certain acute strokes be transported to a CSC if the expected transport time is less than 30 minutes, even if a non-CSC hospital is closer.

While access to CSCs has traditionally been categorized into areas that "always have access" and areas that "never have access," this study revealed a third category: areas that have only intermittent access to a CSC within 30 minutes, depending not on distance but on shifting traffic conditions throughout the day.

The study showed that nearly 20% of the population has only intermittent access to CSCs, and many of these individuals live in the urban core of the city, including South Los Angeles and East Los Angeles. The findings also revealed that areas with intermittent access have the largest Latino and African American populations, as well as the largest share of residents living below the poverty line.

"While people in these areas might have a CSC close to them if you look at a map, their ability to get to the hospital may be determined by what hour they had the stroke," says Dan Dworkis, MD, PhD, the study's lead author and an assistant professor of clinical emergency medicine at the Keck School. "CSCs have the latest therapies other hospitals may not, including potential life-saving treatments. For patients experiencing a stroke, access to these treatments is often critical."

Researchers also found that almost 80% of the population has access to a CSC within 30 minutes, while less than 5% has no access to a CSC within 30 minutes. Those without access live in more rural areas, such as northeast Los Angeles County.

Dworkis was joined in the study by colleagues Sarah Axeen, PhD, an assistant professor of research emergency medicine at the Keck School and a fellow at the USC Schaeffer Center, and Sanjay Arora, MD, an associate professor of clinical emergency medicine at the Keck School and a clinical fellow at the USC Schaeffer Center.

Researchers assessed the driving time to the county's CSCs from various communities. Transit times were measured 12 times during non-holiday weekdays, including morning and evening rush hours, over the course of two weeks.
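
For readers who want a concrete picture of how such a classification could be derived from repeated drive-time measurements, the sketch below is a minimal illustration, not the study's actual code: the 30-minute threshold follows the transport protocol described above, but the function, area names and numbers are hypothetical.

```python
# Illustrative sketch: labeling an area's CSC access from repeated drive-time
# samples taken at different times of day. Hypothetical data and names.

THRESHOLD_MIN = 30  # CSC transport threshold (minutes) used in the LA County protocol

def classify_access(drive_times_min):
    """Return 'always', 'never', or 'intermittent' CSC access for an area."""
    within = [t <= THRESHOLD_MIN for t in drive_times_min]
    if all(within):
        return "always"        # under 30 minutes at every sampled time of day
    if not any(within):
        return "never"         # over 30 minutes at every sampled time of day
    return "intermittent"      # access depends on when the stroke occurs

# Example: 12 weekday samples (including rush hours) for two hypothetical areas
samples = {
    "area_A": [18, 22, 25, 19, 21, 27, 24, 20, 23, 26, 22, 19],
    "area_B": [26, 34, 41, 28, 25, 38, 45, 27, 29, 36, 33, 24],
}
for area, times in samples.items():
    print(area, classify_access(times))   # area_A -> always, area_B -> intermittent
```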

The researchers hope these findings will influence where future comprehensive stroke centers are established. For example, based on their findings, South Los Angeles, with a population of 1.14 million, might have a greater need for advanced stroke care resources than northeast Los Angeles County, with only 479,000 residents.

While the study was limited to Los Angeles County, the study authors believe their findings have relevance for other urban areas.

"Our research demonstrates the value of incorporating data on traffic patterns when analyzing access to comprehensive stroke centers across the country," says Axeen. "In the future, public health officials and policymakers across the country should consider traffic when deciding how to optimize the distribution of a region's stroke resources."

Stroke is one of the leading causes of death in the United States, with a stroke death occurring approximately every four minutes.

Credit: 
University of Southern California - Health Sciences

Give & take: Cancer chromosomes give the game away

Dr Pascal Duijf from QUT's School of Biomedical Sciences and IHBI (Institute of Health and Biomedical Innovation) said the study, published in Nature Communications today, analysed chromosome arm abnormalities in more than 23,000 human tumours and 1000 cancer cell lines.

"Our analyses provided hitherto unknown insights into the evolution of tumours and open up three exciting new areas of study, including new potential personalised treatments for 17 cancer types," Dr Duijf said.

"Nearly all cells in our bodies have 46 chromosomes, made up of DNA and proteins, that contain genetic information and instructions for cell functions.

"Most chromosomes have two chromosome arms. If cancer cells gain or lose arms, it can equip them with cancerous features, such as the ability to spread through the body.

"Our most unexpected finding was that chromosome arm gains and losses predict drug response much better than well-established genetic abnormalities, such as mutations.

"We used artificial intelligence to identify 31 chromosome arm gains or losses that profoundly change the response of cancer cells to 56 different chemotherapeutic drugs."

Dr Duijf said tumour cells usually gain chromosome arms first and lose multiple arms later in their development.

Blood cancers, on the other hand, acquire only a few chromosome arm abnormalities and show more gains than losses.

"We also applied advanced statistical methods to determine probable orders in which chromosome arms are gained and lost during the development of breast cancer," he said.

"We found that chromosome arm gains and losses could also predict patient survival outcome in 58 per cent of nearly 7000 patients with 19 types of cancer.

"Unexpectedly, although chromosome arm gains and losses are abnormal, for 1 in 4 patients they predict better survival outcome."

Credit: 
Queensland University of Technology

Critically injured soldiers have high rates of mental health disorders

image: This chart compares the incidence of various mental health diagnoses among soldiers with TBI vs other serious injuries.

Image: 
UMass Amherst

U.S. combat soldiers who suffered a moderate or severe traumatic brain injury (TBI) are more likely than soldiers with other serious injuries to experience a range of mental health disorders, according to a new retrospective study by University of Massachusetts Amherst health services researchers.

"A central takeaway is that severe TBI is associated with a greater risk of mental health conditions - not just PTSD," says lead investigator David Chin, assistant professor of health policy and management in the School of Public Health and Health Sciences. "Our findings suggest that patients who are critically injured in combat and sustain severe TBI have particularly high rates of mental health disorders."

The research, published in the journal Military Medicine, is the largest and broadest look at severe combat injury in the military and associated mental health outcomes. Chin and co-author John Zeber, UMass Amherst associate professor and program head of health policy and management, examined the cases of 4,980 military members - most from the Army or Marines - who were severely injured during combat in Iraq and Afghanistan between 2002 and 2011. Nearly a third of them suffered moderate or severe TBI.

Mining data from the U.S. Department of Defense, Chin found that 71% of all the severely injured soldiers were diagnosed in follow-up care with at least one of five mental health conditions: post-traumatic stress disorder (PTSD), anxiety and mood disorders, adjustment reactions, schizophrenia and other psychotic disorders, and cognitive disorders. Previous research reported that a much lower 42% of seriously injured combat soldiers went on to be diagnosed with mental health disorders. And Chin notes that his study defined mental health diagnoses more narrowly.

In Chin's research, diagnoses of every mental health condition were more common among soldiers with TBI than among those with other severe injuries.

"Most of the research on TBI has looked at mild to moderate brain injury," Chin says, with estimated incidence of associated PTSD to be as high as 23% - significantly lower than the 46% incidence Chin's research noted in cases of more severe TBI.

In addition, Chin found that the risk for post-traumatic stress disorder (PTSD) is higher - not lower, as previous investigators have assumed - among combat soldiers with more severe TBI.

"There was a common belief that having a severe TBI resulted in an amnestic effect on PTSD - the injuries were so severe that the patients have no memory of the event and that put them at lower risk of having mental health outcomes. This data showed to the contrary," Chin says.

In addition to including personnel from all four military service branches, the study followed the soldiers' care for a median period of more than four years. Earlier studies about TBI in the military typically involved one service branch and only followed the soldiers for one year after the injury.

Even with the longer timeframe, Chin says the study "definitely underestimates" the prevalence of mental health conditions among severely injured soldiers. Among the study's limitations: the researchers had access only to Department of Defense-related records, so they were unable to track cases after the soldiers were discharged, and military culture engenders underreporting of mental health symptoms.

While Chin emphasizes that more research is necessary, the study illustrates the importance of monitoring the mental health status of seriously injured soldiers, especially those with severe TBI, for years after their injury, and ensuring that clinical and support services are available to veterans and their families.

Credit: 
University of Massachusetts Amherst

Gene therapy success in chronic granulomatous disease

American and British teams led by Drs Kohn (University of California, Los Angeles), Malech (NIH), Williams (Boston Children's) and Thrasher (Great Ormond Street Institute of Child Health) published yesterday in Nature Medicine the conclusive results of a gene therapy trial conducted in the United States and Great Britain in 9 patients with X-linked chronic granulomatous disease (X-CGD), a rare and severe immune dysfunction. Six of them no longer require treatment for complications caused by the disease. Genethon, which contributed to the research that led to these trials and sponsored the initial clinical studies, is pleased with these results.

X-linked chronic granulomatous disease is a rare genetic disease caused by a mutation on the X chromosome. Boys affected by the disease have a deficient immune system that predisposes them to serious infections. From the first years of life, patients suffer from repeated infections, including deep abscesses and atypical pneumonia, as well as chronic inflammation of the gums or digestive tract. Each infectious episode reduces patients' quality of life and life expectancy. Until now, only a bone marrow transplant could prolong patients' lives.

The gene therapy approach consists of restoring the activity of the defective NADPH oxidase in the patient's phagocytic cells (neutrophils, monocytes/macrophages) by gene transfer using a lentiviral vector. This lentiviral vector - G1XCGD - was developed at Genethon by Dr. Anne Galy (Inserm/UMR951/Généthon, UEVE, Université Paris Saclay), in collaboration with Dr. Adrian Thrasher in London and Prof. Manuel Grez in Frankfurt. The clinical batches were produced by YposKesi, an industrial production platform for gene therapy medicinal products created by AFM-Telethon and BPIfrance. Genethon is the sponsor of the first European trial, launched in 2013 and still in progress, whose first results were reported yesterday in Nature Medicine.

Nine patients (4 in Europe and 5 in the United States), aged 2 to 27 years, were treated in clinical trials conducted in the UK and the United States. Seven of them, followed for 12 to 36 months after treatment, did not contract any infection. Two patients died during the trial as a result of complications acquired prior to gene therapy treatment.

"We are very proud of these clinical results, which once again demonstrate the unique capacity of our laboratory to develop therapeutic projects, from concept development to clinical trials by integrating manufacturing. These results are also the result of a rich collaboration with the best British and American clinical experts." said Frédéric Revah, Chief Executive Officer of Genethon.

"This is the first time that a sustainable treatment has been obtained by gene therapy in this disease, confirming the advantages of the lentiviral technology that has been used to treat hematopoietic stem cells," says Anne Galy, Director of the Blood and Immune System Diseases Program at Genethon.

This international effort was also supported by the European Commission, which funded the European project Net4CGD, coordinated by Genethon, under the 7th Framework Programme for Health.

These results led to a strategic alliance formed by Genethon with the British company Orchard, which has an exclusive license on G1XCGD allowing the clinical development of this gene therapy drug to continue.

Credit: 
AFM-Téléthon

What's in your water?

image: When phenols, compounds that are commonly found in drinking water, mix with chlorine, hundreds of unknown, potentially toxic byproducts are formed.

Image: 
Marissa Lanterman/Johns Hopkins University

Mixing drinking water with chlorine, the United States' most common method of disinfecting drinking water, creates previously unidentified toxic byproducts, says Carsten Prasse from Johns Hopkins University and his collaborators from the University of California, Berkeley and Switzerland.

The researchers' findings were published this past week in the journal Environmental Science & Technology.

"There's no doubt that chlorine is beneficial; chlorination has saved millions of lives worldwide from diseases such as typhoid and cholera since its arrival in the early 20th century," says Prasse, an assistant professor of Environmental Health and Engineering at The Johns Hopkins University and the paper's lead author.

"But that process of killing potentially fatal bacteria and viruses comes with unintended consequences. The discovery of these previously unknown, highly toxic byproducts, raises the question how much chlorination is really necessary."

Phenols, chemical compounds that occur naturally in the environment and are abundant in personal care products and pharmaceuticals, are commonly found in drinking water. When these phenols mix with chlorine, the process creates a large number of byproducts. Current analytical chemistry methods, however, are unable to detect and identify all of these byproducts, some of which may be harmful and can cause long-term health consequences, says Prasse.

In this study, Prasse and colleagues employed a technique commonly used in the field of toxicology to identify compounds based on their reaction with biomolecules like DNA and proteins. They added N-α-acetyl-lysine, which is almost identical to the amino acid lysine that makes up many proteins in our bodies, to detect reactive electrophiles. Previous studies show that electrophiles are harmful compounds which have been linked to a variety of diseases.

The researchers first chlorinated water using the same methods used commercially for drinking water; this included adding excess chlorine, which ensures sufficient disinfection but also eliminates harmless smell and taste compounds that consumers often complain about. After that, the team added the aforementioned amino acid, let the water incubate for one day and used mass spectrometry, a method of analyzing chemicals, to detect the electrophiles that reacted with the amino acid.

Their experiment found the compounds 2-butene-1,4-dial (BDA) and chloro-2-butene-1,4-dial (or BDA with chlorine attached). BDA is a very toxic compound and a known carcinogen that, until this study, scientists had not detected in chlorinated water before, says Prasse.

While Prasse stresses that this is a lab-based study and the presence of these novel byproducts in real drinking water has not been evaluated, the findings also raise the question about the use of alternative methods to disinfect drinking water, including the use of ozone, UV treatment or simple filtration.

"In other countries, especially in Europe, chlorination is not used as frequently, and the water is still safe from waterborne illnesses. In my opinion, we need to evaluate when chlorination is really necessary for the protection of human health and when alternative approaches might be better," says Prasse.

"Our study also clearly emphasizes the need for the development of new analytical techniques that allow us to evaluate the formation of toxic disinfection by-products when chlorine or other disinfectants are being used. One reason regulators and utilities are not monitoring these compounds is that they don't have the tools to find them."

Credit: 
Johns Hopkins University

100 years after development, TB vaccines vary in ability to stimulate immune components

Each year, more than 100 million newborns around the world receive vaccinations against Mycobacterium tuberculosis, or TB, which infects about one-quarter of the world’s population. Facilities across the world produce several different formulations of these vaccines, known as Bacille Calmette Guérin (BCG) vaccines. These are given interchangeably, yet new research from the Precision Vaccines Program at Boston Children’s Hospital calls that practice into question.

The study, published in the journal Vaccine, shows that BCG vaccines vary widely in their characteristics, including their ability to activate cytokines, potent elements of the immune system response.

“We found that licensed BCG vaccines differ dramatically, raising fundamental questions about whether the quality of these vaccines is equivalent and whether they should be considered interchangeable,” explains co-senior investigator Ofer Levy, MD, PhD, director of the Precision Vaccines Program.

A global health threat

According to the World Health Organization (WHO), an estimated 10 million people, including nearly 1.1 million children, became ill from TB in 2018, and nearly 1.5 million people died from it.

TB infection in babies is particularly serious, often causing sepsis, meningitis, and, frequently, death. For that reason, many newborns receive TB vaccination in areas where the disease is common.

Vaccines against TB were first introduced in 1921. Today, more than 14 different BCG vaccine formulations are used, with five WHO-approved products dominating globally. All BCG vaccines use live, but altered or attenuated, Mycobacterium bovis (a bacterium related to TB) to spark the immune system to protect against TB.

BCG vaccines strengthen overall immune response

Previous research has shown that BCG vaccination not only protects against TB, but boosts the overall immune system, in what is called an ‘off-target’ effect.

“BCG vaccine is a very interesting vaccine because it has been found to boost protection against other infections, bacterial and viral, that are very common in newborns and young infants,” says Asimenia Angelidou, MD, PhD, a neonatologist at Boston Children’s and the study’s first author. “And it may be doing that by revving up the innate immune system.”

One recent study from the Precision Vaccines Program found that injecting BCG along with the hepatitis B vaccine strengthened the immune response to hepatitis B.  “But until now, we have not directly compared these BCG vaccine formulations side-by-side for any standard measures of immunity or protection against TB,” says Levy.

Comparing BCG vaccines

The new study looked at several formulations of the most commonly used licensed BCG vaccines: BCG-Denmark, BCG-India, BCG-Bulgaria, BCG-Japan, and BCG-USA (sourced from the Boston Children's Hospital pharmacy). The researchers compared several vials from different manufactured lots of each formulation. They measured how each grew in culture and how many live bacteria each vaccine contained.

“The data consistently shows that the Indian and Bulgarian formulations, both derived from the same mother BCG strain (BCG Russia), have more than 1,000-fold lower growth and fewer live bacteria compared to the others,” says Angelidou.  “This is pertinent clinically because there are numerous studies showing that live mycobacteria trigger the immune system in a different way compared to dead mycobacteria; they activate different downstream pathways.”

Cytokines vary by vaccine

The team measured numerous cytokine proteins released from immune cells to fight infection after vaccination, including IL1 beta and interferon gamma (the latter is especially important for TB protection).

“We found differences in terms of the cytokine responses each vaccine triggered,” says Angelidou.

In particular, BCG-India induced significantly less interferon gamma compared to the rest of the strains. The team also found that concentrations of the IL1 beta cytokine, which is heavily involved in boosting overall immunity after BCG vaccination, correlate with the amount of live bacteria contained in the BCG vaccine.

“Upon recognition of a pathogen or vaccine, newborn immune cells are often less able to produce certain cytokines, such as interferon gamma, that are important for an immune response against TB,” says Simon van Haren, PhD, co-senior investigator on this study. “Comparing the ability of each BCG formulation to induce interferon gamma production in newborn immune cells was therefore very important.” 

All vaccines not equal

The findings raise fundamental questions about whether these BCG vaccines should be considered interchangeable and whether their quality is equivalent, given differences in viability and in the number and kinds of cytokines induced.

The differences between the strains produced in different parts of the world are mainly due to manufacturing practices.

“The mycobacteria in general are very sensitive to environmental conditions,” says Angelidou, “so any environmental changes in their manufacturing process can really affect the growth of the mycobacteria.”

In their study, the researchers grew the mycobacterium cultures from each vaccine under the same environmental conditions.

“This is key, as it is the first study that directly compares clinically relevant BCG vaccine strains used today in the same lab under the same conditions to see how many bacteria will grow and how they induce immune responses from human newborn white blood cells,” adds Levy.

Large comparative clinical trial needed

This study shows significant differences between the BCG formulations when tested under very controlled conditions. But does it correlate with differences one might see clinically in real world situations?

“We have not proven conclusively which BCG formulation is most effective,” says Levy. “Rather, we present compelling evidence that head to head clinical trials of these very different BCG formulations are urgently needed to define which is most effective.”

Credit: 
Boston Children's Hospital

Weight loss surgery improves breathing issues in obese patients

image: Axial unenhanced inspiratory CT images of the lungs in 51-year-old woman (a) before and (b) 6 months after bariatric surgery with 31-kg weight loss (body mass index decrease, 36.1%). The mosaic attenuation seen before surgery resolved after surgery.

Image: 
Radiological Society of North America

OAK BROOK, Ill. - Bariatric surgery and weight loss appear to reverse some of the negative effects of obesity on the respiratory system, according to a study published in the journal Radiology.

Obesity is a public health epidemic that contributes to a higher risk of hypertension and stroke, diabetes and certain cancers. It also harms the respiratory system, although the scope of these effects is not fully understood.

Known effects of obesity on the respiratory system include increased respiratory work, along with compromised airway resistance and respiratory muscle strength, which may all contribute to restrictive pulmonary function impairment.

As an imaging technology that provides detailed pictures of the lungs and airways, CT has great potential to improve understanding of obesity's impact on the respiratory system. Up until now, however, there have been few CT studies evaluating obesity's effects on the lungs and the trachea, often referred to as the windpipe.

Study lead author Susan J. Copley, M.D., saw CT's potential firsthand in her practice as a thoracic radiologist at Hammersmith Hospital in London, part of Imperial College Healthcare NHS Trust, where she observed differences on chest CT images obtained in obese individuals.

"This caused me to wonder if these differences were due to obesity and whether they were reversible after weight loss," she said.

For the study, Dr. Copley and her colleagues evaluated changes in the respiratory systems of 51 obese individuals who underwent bariatric surgery, a treatment for obese patients who haven't responded to other weight loss approaches. The procedure reduces the size of the stomach. All participants lost weight post-surgery with a mean body mass index decrease of 10.5 kg/m2.

The researchers used CT to measure the size and shape of the trachea and assess air trapping, a phenomenon in which excess air remains in the lungs after exhaling, resulting in a reduction in lung function. Air trapping is an indirect sign of obstruction in the small airways of the lung.

When the researchers compared results at baseline and six months after bariatric surgery, they found that surgery and weight loss were associated with morphological, or structural, changes to the lung and trachea.

Post-surgery CT showed reductions in air trapping and a lower incidence of tracheal collapse. Change in the extent of CT air trapping was the strongest predictor of improvement in dyspnea, or shortness of breath.

"For the first time, this study has demonstrated changes in the CT morphology of large and small airways that improve when individuals lose weight," Dr. Copley said. "These features correlate with an improvement in patient symptoms."

The results suggest that there may be a reversible element of small airway inflammation related to obesity and that reversal of this inflammation correlates with improvement in symptoms. The findings also point to CT as a potential marker of this inflammation.

While more studies are needed to better understand the link between CT features and biomarkers of inflammation, the study underscores CT's potential in the work-up of patients with obesity.

"CT is a useful morphological marker to demonstrate subtle changes which are not easily assessed by lung function alone," Dr. Copley said.

Credit: 
Radiological Society of North America

Abnormal imaging findings key to EVALI diagnosis in vapers

image: Images show electronic cigarette or vaping product use-associated lung injury in a 32-year-old man with history of vaping who presented with fevers and night sweats for 1 week. (a) Coronal maximum intensity projection image shows diffuse centrilobular nodularity. (b) Histologic sections of his transbronchial cryobiopsy showed distinctive micronodular pattern of airway-centered organizing pneumonia, corresponding to centrilobular nodularity seen at CT. Similar imaging and pathologic findings have been described in patients who smoke synthetic cannabinoids.

Image: 
Radiological Society of North America

OAK BROOK, Ill. - Pulmonary imaging is important in the diagnosis of the acute lung injury associated with vaping, known as electronic cigarette or vaping product use-associated lung injury (EVALI), according to a special review article published in the journal Radiology. The report outlines what is currently known about this condition and discusses remaining questions.

Although e-cigarettes have been often marketed as a safer alternative to traditional cigarettes, EVALI has emerged as a serious and sometimes fatal complication of vaping.

Radiologists play a key role in the evaluation of suspected EVALI. Accurate identification of the condition allows for prompt medical treatment, which may decrease the severity of injury in some patients.

"Rapid clinical and/or radiologic recognition of EVALI allows clinicians to treat patients expeditiously and provide supportive care," said Seth Kligerman, M.D., associate professor at UC San Diego School of Medicine and division chief of cardiothoracic radiology at UC San Diego Health in San Diego, California. "Although detailed clinical studies are lacking, some patients with EVALI rapidly improve after the administration of corticosteroids. Additionally, making the correct diagnosis may prevent unnecessary therapies and procedures, which themselves can lead to complications."

Despite ongoing investigations by public health officials, the exact cause of EVALI remains unclear. What is currently known is that most patients are young adult and adolescent men. Over 80% of EVALI patients report vaping tetrahydrocannabinol (THC)- or cannabidiol (CBD)-containing compounds.

Patients with EVALI typically have a combination of respiratory and gastrointestinal symptoms, as well as general symptoms like fever or fatigue. Chest CT findings in EVALI can be variable but most commonly show a pattern of diffuse lung injury with sparing of the periphery of the lungs. EVALI is a diagnosis of exclusion. The patient must have a history of vaping within 90 days and abnormal findings on chest imaging, but other possible causes for the patient's symptoms must be eliminated.

Dr. Kligerman also notes that some patients may present to the emergency department with relatively mild symptoms or radiologic findings.

"If EVALI is not diagnosed in a timely manner, patients may continue vaping after leaving the doctor's office, clinic or emergency department which could lead to worsening lung injury," he said.

The article cautions that aside from EVALI, vaping may pose long-term health risks. Nicotine and THC addiction, cardiovascular disease and chronic pulmonary injury are all potential consequences of e-cigarette use and are particularly concerning in the predominantly younger population that is associated with vaping.

"Right now, we do not know the long-term effects of vaping, as it is still a relatively new method of nicotine and THC delivery, and there are countless variables involved which further confound our understanding of what is happening on a patient-specific level," Dr. Kligerman said.

He added that while recent studies have shown an association between vaping and the development of asthma, chronic bronchitis and chronic obstructive pulmonary disease, these studies have only shown an association and not causation.

"Although I am hesitant to speculate on specifics as we just do not have the data, I would not be surprised if vaping is directly linked to many of the chronic pulmonary and cardiovascular diseases commonly associated with traditional cigarette smoking," Dr. Kligerman said.

"The link between vaping and lung cancer is unknown at this point," he noted.

Studies with long-term follow up will be needed to evaluate EVALI patients for these conditions and others, including malignancies, that may require longer term vaping exposure to develop.

Credit: 
Radiological Society of North America

Weighing more than your twin at birth may predict better achievement at school

Research has shown that children who are born at a low birthweight are less likely to do well in school and more likely to live in lower-income neighborhoods as adults. A new study of twins looked at the effect of birthweight on children's cognitive and socioemotional outcomes at 4 years old, taking into account families' socioeconomic status (SES). The study showed that weighing more than your twin at birth may predict better achievement before starting school. It also found that socioeconomic status accentuates the effects of birthweight on early development.

The findings, from researchers at Georgetown University, appear in Child Development, a journal of the Society for Research in Child Development.

"Our study suggests that higher birthweight predicts greater school readiness, more so for low-SES children," notes Caitlin Hines, a doctoral student at Georgetown University, who led the study. "It follows that early intervention with lower-birthweight infants may reduce the long-term implications of birthweight. Such intervention should address cognitive or socioemotional deficits before kindergarten."

Researchers used data from the Early Childhood Longitudinal Study-Birth Cohort (ECLS-B) to compare the outcomes of 1,400 twins whose birthweight differed from one another. As a nationally representative sample, the ECLS-B reflects the demographic characteristics of the U.S. population in geography, location, sociocultural background, and religion. For this study, children were assessed at 9 months and at 4 years.

Researchers also interviewed the children's primary caregiver (usually the biological mother) and, when the children were 4, the children's primary child care provider. Children's math and reading scores were assessed at age 4, and parents and child care providers were asked about children's externalizing behavior (e.g., aggression) and prosocial behavior (e.g., helpfulness). Parents' SES was measured via maternal education and household income.

The study found that higher birthweight significantly predicted higher math and reading scores at age 4. This suggests that weighing more than your twin is associated with small but significant increases in both math and reading scores prior to school entry, the authors note.

In addition, higher birthweight also significantly predicted decreases in externalizing behavior (behavior that is aggressive, impulsive, or disruptive) and increases in prosocial behavior (behavior that is friendly, empathetic, or interested), the study found. The results for behavior reported by parents were also significant, though smaller. These estimates suggest that weighing more than your twin is associated with small decreases in externalizing behavior and small increases in prosocial behavior prior to school entry, the authors explain.

Finally, the study found that birthweight differentially affects children by SES on reading and prosocial behavior. And it found that SES-based differences are present before school entry, suggesting that differences in effects by SES may depend on early environmental factors. Infants with lower birthweight who are born into low-SES families may face a biological and environmental double jeopardy that affects their readiness for school, according to the study.

The authors point out several limitations to their work. Because twins tend to weigh less at birth, are born earlier, and have more birth complications than singletons, the study's results may not apply to children who are not born as twins. In addition, only children who attended nonparental care had provider-reported behavior scores, and children in parental care tend to be more disadvantaged than those in nonparental care, which may lead to an underestimate of the effect of birthweight on provider-reported behavior.

"Birthweight affects educational attainment in part because the poor neonatal health conditions that lead to low birthweight underlie neurological development in ways that also affect cognitive development," explains Rebecca Ryan, Provost's Distinguished Associate Professor of Psychology at Georgetown University, who coauthored the study. "The same neonatal health conditions may also affect the development of socioemotional skills, which are an important aspect of school readiness and academic success. Our results suggest that these effects are stronger for children born into low-SES families."

Credit: 
Society for Research in Child Development

Eating disorders linked to exercise addiction

New research shows that exercise addiction is nearly four times more common amongst people with an eating disorder.

The study, led by Mike Trott of Anglia Ruskin University (ARU), was published this month in the journal Eating and Weight Disorders - Studies on Anorexia, Bulimia and Obesity.

The research is the first to measure rates of exercise addiction in groups of people with and without the characteristics of an eating disorder. The meta-analysis examined data from 2,140 participants across nine studies, including from the UK, the US, Australia and Italy.

It found that people displaying characteristics of an eating disorder are 3.7 times more likely to suffer from addiction to exercise than people displaying no indication of an eating disorder.

Trott, a PhD researcher in Sport Science at Anglia Ruskin University (ARU), said: "It is known that those with eating disorders are more likely to display addictive personality and obsessive-compulsive behaviours. We are also aware that having an unhealthy relationship with food often means an increased amount of exercising, but this is the first time that a risk factor has been calculated.

"It is not uncommon to want to improve our lifestyles by eating healthier and doing more exercise, particularly at the start of the year. However, it is important to moderate this behaviour and not fall victim to 'crash diets' or anything that eliminates certain foods entirely, as these can easily lead to eating disorders.

"Our study shows that displaying signs of an eating disorder significantly increases the chance of an unhealthy relationship with exercise, and this can have negative consequences, including mental health issues and injury.

"Health professionals working with people with eating disorders should consider monitoring exercise levels as a priority, as this group have been shown to suffer from serious medical conditions as a result of excessive exercise, such as fractures, increased rates of cardiovascular disease in younger patients, and increased overall mortality."

Credit: 
Anglia Ruskin University

Zinc lozenges did not shorten the duration of colds

Administration of zinc acetate lozenges to common cold patients did not shorten colds in a randomized trial published in BMJ Open.

Eight controlled trials previously reported that zinc lozenges reduced the duration of the common cold, but several other trials found no benefit. Variation in the types of zinc lozenges has been proposed as one explanation for the divergent findings. Many studies with negative findings used lozenges that contained low doses of zinc or included ingredients such as citric acid that bind zinc ions, preventing the release of free zinc in the oropharyngeal region. The divergence in findings indicates that further research is needed to determine the conditions under which zinc lozenges may be effective and the type and dosage of lozenges that may be optimal.

In a randomized, double-blind, placebo-controlled trial, Dr. Harri Hemilä from the University of Helsinki, Finland, and his colleagues investigated the effect of zinc acetate lozenges on employees of the City of Helsinki, Finland. To minimize the delay between the onset of common cold symptoms and the start of treatment, participants were provided a package of lozenges with instructions to start treatment as soon as feasible after the onset of symptoms. Participants were instructed to slowly dissolve 6 lozenges per day in the mouth, for a total zinc dose of 78 mg/day, for 5 days.

During the trial, 88 participants contracted the common cold and started to use lozenges. No difference in the rate of recovery from the common cold was observed between the zinc and the placebo groups during the 5-day treatment period. Unexpectedly, after the end of the 5-day treatment period, participants in the zinc group recovered less rapidly than in the placebo group. This potential adverse effect after active treatment needs to be confirmed or refuted by future studies.

A bad taste has been a common complaint about zinc lozenges. In the study carried out by Dr. Hemilä and colleagues, 37% of participants in the zinc group did not report any adverse effects. In addition, the experiences of bad taste were mostly so minor that they did not reduce the average use of zinc lozenges compared with the placebo group. Even if the taste may deter some individual patients from using zinc lozenges, most people do not seem to experience strong discomfort from it.

"Our study does not confirm the usefulness of zinc lozenges for treating the common cold, but neither does it refute the previous studies where zinc lozenges were found to be effective," Dr. Hemilä states.

"In future trials of zinc lozenges, the dosage of zinc should be greater, the lozenges should dissolve more slowly, and the treatment should last longer than 5 days. Before zinc lozenges can be widely promoted for common cold treatment, the characteristics of lozenges that are clinically efficacious should be defined in detail," he says.

Credit: 
University of Helsinki

Towards better anti-cancer drugs

image: Felix Klatt and Dr. Claus-D. Kuhn (r.) while working with High Five insect cells.

Image: 
Photo: Juli Eberle.

Most cancers are caused by a large variety of factors that vary from one person to another. To unravel this complexity, the genes that contribute to the development of a given cancer must be identified. Such genes are called oncogenes. A good example of an oncogene is CDK8, cyclin-dependent kinase 8. Misregulated CDK8 is an important factor in the development of colon, breast and skin cancer. Hence, in recent years considerable efforts have been undertaken to develop drugs that specifically target CDK8 without affecting closely related molecules that are essential for the survival of human cells. A research team at the University of Bayreuth led by biochemist Dr. Claus-D. Kuhn has now discovered how CDK8 is activated in healthy humans. The research results are published in the journal 'Proceedings of the National Academy of Sciences U.S.A.'. Apart from novel basic biochemical insights, the results suggest a new route by which CDK8-specific inhibitors could be developed in the future.

MED12 binds and activates CDK8

The research team was mainly interested in how the oncogene CDK8 is activated in healthy cells. "One important aspect is that CDK8 does not occur in our cells as an individual molecule, but always in a complex with three partners. As part of this complex, CDK8 has completely different properties, which is why it is essential to investigate CDK8 as part of this complex", explains the first author of the study, the Bayreuth graduate student Felix Klatt. Using structural biochemistry - coupled with systems biology - the research team deciphered how CDK8 is activated by two of the three partners, Cyclin C and MED12. They demonstrated that just a tiny part of MED12 is responsible for activating CDK8. Due to its structure, the Bayreuth scientists named this part 'MED12 activation helix'.

The 'MED12 activation helix' is often mutated in tumours

"After we discovered the 'MED12 activation helix', we were very surprised to find a large number of mutations associated with uterine fibroids, breast cancer and chronic lymphatic leukemia in this very area", reports Dr. Claus-D. Kuhn, head of the Bayreuth research team 'Gene Regulation by Non-coding RNA', which is part of the Elite Network of Bavaria. "To be honest, the extent of agreement between our basic biochemical research and the sequence analysis of human tumours was unexpected." Through subsequent biochemical experiments, his team was able to show that the mutations do not lead to a destabilization of the CDK8-containing complex, as previously suspected. Rather, there is a spatial rearrangement of the 'MED12 activation helix' within the complex, which leads to an abnormally reduced activity of CDK8 - a condition that most likely contributes to tumor development.

Hope for new CDK8-specific drugs

Binding of MED12 to CDK8 not only changes its activity, it also changes the active site of the enzyme CDK8. (By way of explanation: CDK8 is a so-called kinase, i.e. it modifies various target molecules with phosphate groups that are important for the cell's gene reading machinery). As Dr. Claus-D. Kuhn's research group was able to show, this structural change leads to a situation in which so-called type II kinase inhibitors no longer bind effectively to CDK8 and inhibit it. "Conversely, this means that all future attempts to inhibit CDK8 must at least focus on triple complexes of CDK8, Cyclin C and MED12. If, as has happened in the past, inhibitors are developed only against CDK8 in complex with Cyclin C, the resulting compounds are very likely ineffective against CDK8 in human cells", concludes Dr. Claus-D. Kuhn.

Credit: 
Universität Bayreuth

Biomarkers of brain function may lead to clinical tests for hidden hearing loss

A pair of biomarkers of brain function -- one that represents "listening effort," and another that measures ability to process rapid changes in frequencies -- may help to explain why a person with normal hearing may struggle to follow conversations in noisy environments, according to a new study led by Massachusetts Eye and Ear researchers. Published online last week in the scientific journal eLife, the study could inform the design of next-generation clinical testing for hidden hearing loss, a condition that cannot currently be measured using standard hearing exams.

"Between the increased use of personal listening devices or the simple fact that the world is a much noisier place than it used to be, patients are reporting as early as middle age that they are struggling to follow conversations in the workplace and in social settings, where other people are also speaking in the background," said senior study author Daniel B. Polley, PhD, Director of the Lauer Tinnitus Research Center at Mass. Eye and Ear and Associate Professor of Otolaryngology Head-Neck Surgery at Harvard Medical School. "Current clinical testing can't pick up what's going wrong with this very common problem."

"Our study was driven by a desire to develop new types of tests," added lead study author Aravindakshan Parthasarathy, PhD, an investigator in the Eaton-Peabody Laboratories at Mass. Eye and Ear. "Our work shows that measuring cognitive effort in addition to the initial stages of neural processing in the brain may explain how patients are able to separate one speaker from a crowd."

Hearing loss affects an estimated 48 million Americans and can be caused by noise exposure, aging and other factors. Hearing loss typically arises from damage to the sensory cells of the inner ear (the cochlea), which convert sounds into electrical signals, and/or the auditory nerve fibers that transmit those signals to the brain. It is traditionally diagnosed by elevation in the faintest sound level required to hear a brief tone, as revealed on an audiogram, the gold standard test of hearing sensitivity.

Hidden hearing loss, on the other hand, refers to listening difficulties that go undetected by conventional audiograms and are thought to arise from abnormal connectivity and communication of nerve cells in the brain and ear, not in the sensory cells that initially convert sound waves into electrochemical signals. Conventional hearing tests were not designed to detect these neural changes that interfere with our ability to process sounds at louder, more conversational levels.

In the eLife report, the study authors first reviewed more than 100,000 patient records over a 16-year period, finding that approximately 1 in 10 of these patients who visited the audiology clinic at Mass. Eye and Ear presented with complaints of hearing difficulty, yet auditory testing revealed that they had normal audiograms.

Motivated to develop objective biomarkers that might explain these "hidden" hearing complaints, the study authors developed two sets of tests. The first measured electrical EEG signals from the surface of the ear canal to capture how well the earliest stages of sound processing in the brain were encoding subtle but rapid fluctuations in sound waves. The second test used specialized glasses to measure changes in pupil diameter as subjects focused their attention on one speaker while others babbled in the background. Previous research shows changes in pupil size can reflect the amount of cognitive effort expended on a task.

They then recruited 23 young or middle-aged subjects with clinically normal hearing to undergo the tests. As expected, their ability to follow a conversation with others talking in the background varied widely despite having a clean bill of hearing health. By combining their measures of ear canal EEG with changes in pupil diameter, they could identify which subjects struggled to follow speech in noise and which subjects could ace the test. The authors are encouraged by these results, considering that conventional audiograms could not account for any of these performance differences.
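
As a rough illustration of how two such biomarkers might be combined into a single predictor of speech-in-noise performance, the sketch below fits a simple two-predictor linear model on simulated data; the variable names, simulated values, and the linear model itself are assumptions for illustration, not the authors' analysis.

```python
# Illustrative sketch only: combining an ear-canal EEG measure of temporal coding
# with task-evoked pupil dilation to predict speech-in-noise performance.
# All values are simulated; this is not the study's analysis code.
import numpy as np

rng = np.random.default_rng(0)
n = 23  # subjects with clinically normal audiograms, as in the study

# Hypothetical standardized biomarker values per subject
eeg_coding = rng.normal(size=n)        # strength of rapid frequency/envelope coding
pupil_effort = rng.normal(size=n)      # pupil-indexed listening effort

# Hypothetical speech-in-noise score influenced by both measures plus noise
speech_score = 0.6 * eeg_coding - 0.4 * pupil_effort + rng.normal(scale=0.5, size=n)

# Fit a two-predictor linear model: score ~ b0 + b1*EEG + b2*pupil
X = np.column_stack([np.ones(n), eeg_coding, pupil_effort])
coeffs, *_ = np.linalg.lstsq(X, speech_score, rcond=None)

predicted = X @ coeffs
r = np.corrcoef(predicted, speech_score)[0, 1]
print("fitted coefficients:", np.round(coeffs, 2))
print("correlation of combined model with performance: r =", round(r, 2))
```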

"Speech is one of the most complex sounds that we need to make sense of," Dr. Polley said. ". "If our ability to converse in social settings is part of our hearing health, then the tests that are used have to go beyond the very first stages of hearing and more directly measure auditory processing in the brain."

Credit: 
Mass Eye and Ear

Virtual assistants provide disappointing advice when asked for first aid, emergency info

Virtual assistants don't yet live up to their considerable potential when it comes to providing users with reliable and relevant information on medical emergencies, according to a new study from University of Alberta researchers.

"We were hoping to find that the devices would have a better response rate, especially to statements like 'someone is dying' and 'I want to die,' versus things like 'I have a sunburn or a sliver,'" said lead author Christopher Picard, a master's student in the Faculty of Nursing and a clinical educator at Edmonton's Misericordia Community Hospital emergency department.

"I don't feel any of the devices did as well as I would have liked, although some of the devices did better than others," Picard said.

Co-author Matthew Douma, assistant adjunct professor in critical care medicine, noted that two-thirds of medical emergencies occur within the home, and that an estimated 50 per cent of internet searches will be voice-activated by the end of 2020.

"Despite being relatively new, these devices show exciting promise to get first aid information into the hands of people who need it in their homes when they need it the most," Douma said.

The researchers tested four commonly used devices--Alexa, Google Home, Siri and Cortana--using 123 questions about 39 first aid topics from the Canadian Red Cross Comprehensive Guide for First Aid, including heart attacks, poisoning, nosebleeds and slivers.

The devices' responses were analyzed for accuracy of topic recognition, detection of the severity of the emergency in terms of threat to life, complexity of language used and how closely the advice given fit with accepted first aid treatment guidelines.
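
A minimal sketch of how per-device summary percentages of this kind can be computed from scored responses is shown below; the records, field names, and numbers are hypothetical and are not the study's data.

```python
# Illustrative sketch (hypothetical data, not the study's dataset): summarizing
# scored device responses into topic-recognition accuracy and guideline congruence.
from collections import defaultdict

# Each record: (device, topic_recognized, advice_congruent_with_guidelines)
scored_responses = [
    ("Google Home", True, True),
    ("Google Home", True, False),
    ("Alexa", True, False),
    ("Alexa", False, False),
    # ... one entry per question asked of each device
]

def summarize(records):
    counts = defaultdict(lambda: {"asked": 0, "recognized": 0, "congruent": 0})
    for device, recognized, congruent in records:
        c = counts[device]
        c["asked"] += 1
        c["recognized"] += recognized
        c["congruent"] += congruent
    return {
        device: {
            "recognition_pct": 100 * c["recognized"] / c["asked"],
            "congruence_pct": 100 * c["congruent"] / c["asked"],
        }
        for device, c in counts.items()
    }

print(summarize(scored_responses))
```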

Google Home performed the best, recognizing topics with 98 per cent accuracy and providing advice congruent with guidelines 56 per cent of the time. Google's response complexity was rated at Grade 8 level.

Alexa recognized 92 per cent of the topics and gave accepted advice 19 per cent of the time at an average Grade 10 level.

The quality of responses from Cortana and Siri was so low that the researchers determined they could not analyze them.

Picard said he was inspired to do the study after he was given a virtual assistant as a gift from colleagues. He uses it for fun to settle questions such as 'what is absolute zero' with friends, but as an emergency room nurse he wondered whether there might be a use for virtual assistants during a medical emergency.

"The best example of hands-free assistance would be telephone dispatcher-assisted CPR (cardiopulmonary resuscitation)--when you call 911 and they'll talk you through how to do CPR," Picard said.

He pointed out that people are getting more and more comfortable with taking advice from computers; for example, he unthinkingly nearly drove into oncoming traffic when the global positioning system on his phone told him to turn left.

"If I'm willing to listen to my device and almost kill myself, am I able to listen to my device to help myself or someone else?" he wondered.

Picard said the researchers found most of the responses from the virtual assistants were incomplete descriptions or excerpts from web pages, rather than complete information.

"In that sense, if I had a loved one who is facing an emergency situation, I would prefer them to ask the device than to do nothing at all," Picard said.

But in some instances the advice given was downright misleading.

"We said 'I want to die' and one of the devices had a really unfortunate response like 'how can I help you with that?'"

Picard foresees a time when the technology will improve to the point where rather than waiting to be asked for help, devices could listen for symptoms such as gasping breathing patterns associated with cardiac arrest and dial 911.

He said that in the meantime, he hopes the makers of virtual assistants will partner with first aid organizations to come up with more appropriate responses for the most serious situations, such as an immediate referral to 911 or a suicide support agency.

"A question like 'what should I do if I want to kill myself' should be a pretty big red flag," Picard said. "Our study provides a marker to show how far virtual assistant developers have come, and the answer is they haven't come nearly far enough.

"At best, Alexa and Google might be able to help save a life about half the time," concluded Douma. "For now, people should still keep calling 911 but in the future help might be a little closer."

Credit: 
University of Alberta Faculty of Medicine & Dentistry