
A switch to turn fragrances on and off

Image: Salk Institute

LA JOLLA--(August 21, 2018) Salk Institute and Purdue University scientists have discovered the switch in plants that turns off production of terpenoids--carbon-rich compounds that play roles in plant physiology and are used by humans in everything from fragrances and flavorings to biofuels and pharmaceuticals.

Plant terpenoids are found in nutritional supplements, natural insecticides, and drugs used to treat malaria and cancer. The chemotherapy drug Taxol, which is used to treat breast, ovarian, lung, bladder and prostate cancers, is a plant terpenoid. But plants often make them in such low quantities that extracting them for such uses is costly and often impractical.

The findings were reported in the journal Nature Plants on August 20, 2018.

"Several years ago my laboratory discovered a new enzyme found in all plants called isopentenyl phosphate kinase (IPK) that regulates the ebb and flow of living, carbon-based molecules called terpenoids. As is often the case in science, we first unraveled the role of this enzyme in completely different organisms, bacteria and a very ancient group of life called Archaea," says Professor Joseph P. Noel, director of Salk's Jack H. Skirball Center for Biology and Proteomics, Howard Hughes Medical Institute investigator and the paper's co-corresponding author. "By elucidating the three-dimensional structure and chemistry of this enzyme reported in ACS Chemical Biology and eLife in 2010 and 2013, respectively, we revealed that a previously unknown gene found in all plants encoded the very same enzyme as originally discovered in microbes."

Because terpenoids consume considerable amounts of carbon and energy in plants, it has long been recognized that their formation must be tightly controlled so that they are produced only when beneficial to the bacterial or plant host.

For the paper, the Noel lab teamed up with the laboratory of co-corresponding author Natalia Dudareva, Purdue distinguished professor in the Department of Biochemistry and researcher in the Purdue Center for Plant Biology, to unravel how plants switch on and off metabolic pathways controlling the ebb and flow of terpenoid production by regulating the availability of their chemical starting materials.

The Salk-Purdue team had earlier determined how plants turn on terpenoid production, but understanding both the "on" and the "off" switches--the yin and yang--as well as the bottlenecks for flux, is essential for understanding and ultimately tuning up terpenoid yield.

"This is important basic knowledge that opens new targets for engineering of terpenoid metabolic pathways," says Dudareva. "Plants produce these compounds already, but the amounts are small. It might have taken hundreds or thousands of plants to get enough of a compound to use it for something like a pharmaceutical. This new set of unanticipated discoveries will lead to faster, more efficient ways way to obtain sufficient amounts of these products for the benefit of humans."

IPKs convert chemical pools of inert monophosphate terpenoid building blocks into readily used diphosphate building blocks. Using a multipronged approach that includes structural biology, biochemistry, plant genetics and synthetic biology, the research team determined that two Nudix enzymes were the missing links responsible for the removal of a phosphate group to return the active terpenoid diphosphates back to the inert pool of terpenoid monophosphates.

"The Nudix hydrolase family of enzymes are conserved in all organisms, yet their biological roles are largely undefined. Here we uncover an unexpected and new function for members of this family in plants," remarks co-first author Suzanne Thomas, a postdoctoral researcher in the Noel lab.

"We have shown that IPK and Nudix are working together to regulate downstream terpenoid product formation," says co-first author Laura Henry, a recent doctoral graduate of Dudareva's lab who is now an analytical chemist for Heritage Research Group. "Some of these products may be toxic to the plants if the plants make too much of them. This is how the plants regulates their output."

Credit: 
Salk Institute

Identifying drug-resistant hotspots can provide roadmap to reduce tuberculosis in South Africa

Boston--Researchers at Boston Medical Center (BMC) have created a near real-time surveillance method that uses routinely collected laboratory data to identify communities experiencing a high burden of drug-resistant tuberculosis in South Africa. The team mapped the areas of the Western Cape Province with the highest rates of drug-resistant tuberculosis and tracked changes over five years. The results of this study, published in PLOS Medicine, can inform more targeted interventions and public health approaches aimed at reducing the number of people who contract the disease.

South Africa has the highest rate of tuberculosis (TB) incidence in the world, with 4 percent of cases being multidrug resistant, meaning resistant to first-line treatment. The country also has a centrally collected laboratory database that includes TB tests, making it an ideal location to implement a surveillance system that tracks drug-resistant TB cases by clinic location. Researchers developed an algorithm to identify unique patients and episodes of disease from the data, and created heat maps of the region to see which areas were most afflicted between 2008 and 2013.
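
As a sketch of what that kind of aggregation involves (illustrative only: the column names and toy data below are hypothetical, not the BMC team's actual algorithm or dataset), per-clinic resistance rates for a heat map could be computed like this:

```python
# Illustrative sketch, not the published algorithm. Real records would first
# be deduplicated into unique patients and episodes of disease.
import pandas as pd

# One row per TB disease episode: clinic, year, and whether the isolate
# was resistant to first-line treatment (hypothetical toy data).
episodes = pd.DataFrame({
    "clinic":    ["A", "A", "B", "B", "B", "C"],
    "year":      [2008, 2009, 2008, 2008, 2009, 2009],
    "resistant": [0, 1, 0, 0, 1, 0],
})

# Per-clinic, per-year case counts and resistance fractions -- the
# quantities a heat map of drug-resistant TB burden would display.
rates = (episodes
         .groupby(["clinic", "year"])["resistant"]
         .agg(cases="size", resistant_frac="mean")
         .reset_index())
print(rates)
```

Each clinic-year cell of the heat map is then simply resistant_frac, with cases available to flag cells too sparse to be reliable.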

The group identified 799,779 individuals who had specimens submitted for TB tests from clinics during the study period; 28 percent were diagnosed with TB, of which 4.6 percent were resistant to first-line tuberculosis treatment. The spread of these cases was geographically heterogeneous, ranging from zero to 25 percent of TB cases having drug resistance in different parts of the region. There were also significant annual fluctuations in drug-resistant TB percentages at several locations. The communities with the highest rates of drug-resistant TB were Cape Town townships and informal settlements, the rural region of the west coast, and areas bordering the Eastern Cape Province.

"It is critically important that we understand how drug-resistant TB impacts people in specific areas over time," says Karen Jacobson, MD, MPH, infectious disease physician at BMC. "By locating emerging and chronic hotspots of the disease in real-time, public health providers can evaluate the most effective interventions and monitor progress towards TB reduction goals."

Researchers also emphasize the role routinely collected laboratory data plays in identifying both ongoing and short-term TB outbreaks. Their findings show this data is a powerful tool for researchers and providers, allowing for more accurate allocation of resources to treat TB.

Recent evidence shows that the TB epidemic is dynamic and that drug-resistant TB infections spread through transmission from individuals with drug-resistant TB disease. This indicates that constant monitoring could lead to more effective public health interventions, resulting in fewer cases of drug-resistant TB.

"Our model of mapping high-burden communities can serve as a roadmap for regions working to reduce TB incidence by initiating treatment as soon as possible," says Jacobson, who also is an assistant professor of infectious diseases and epidemiology at Boston University School of Medicine and Boston University School of Public Health, respectively.

Credit: 
Boston Medical Center

Newer HIV therapies, Pregabalin warnings, and more

1. Newer HIV therapies have led to dramatic gains in viral suppression rates over the past 2 decades

Younger persons and blacks saw lower rates of improvement

Abstract: http://annals.org/aim/article/doi/10.7326/M17-2242

Editorial: http://annals.org/aim/article/doi/10.7326/M18-1944

URLs go live when the embargo lifts

Viral suppression rates have nearly tripled in the U.S. over the past 2 decades, but disparities still exist for younger persons and blacks living with HIV. Researchers say that newer, better-tolerated treatment regimens, such as fixed-drug combinations that include integrase strand transfer inhibitors (ISTIs), have likely contributed to these dramatic gains. Findings from an observational cohort study are published in Annals of Internal Medicine.

Approximately 1.2 million adults in the U.S. are living with HIV, and men who have sex with men and African Americans are disproportionately affected. Achieving and maintaining HIV viral suppression is essential for optimal outcomes and prevention efforts. As such, understanding trends and predictors of viral suppression is imperative to inform public health policy.

Researchers supported by the National Institutes of Health (NIH) analyzed data for nearly 32,000 adults living with HIV who were enrolled in care at eight Centers for AIDS Research Network of Integrated Clinical Systems sites from 1997 to 2015 to evaluate trends in viral suppression. The researchers evaluated associated factors, such as demographic characteristics and ISTI use. They found that overall rates of viral suppression increased significantly during the timeframe, from 32 percent in 1997 to 86 percent in 2015. They also found that the average interval from enrollment to suppression was shortened substantially from 9 months for those initiating antiretroviral therapy, or ART, between 1997 and 2000 to 2 months for those initiating ART between 2010 and 2015. However, the gains in viral suppression were not equally distributed across populations. Younger persons and blacks were more likely to have detectable viral load. According to the researchers, these disparities warrant further research.

Media contact: For an embargoed PDF, please contact Lauren Evans at laevans@acponline.org. To interview the lead author, Heidi M. Crane, MD, MPH, please contact Susan Gregg at sghanson@uw.edu. To reach the author of the editorial, please contact Hillary Hoffman at hillary.hoffman@nih.gov.

2. Pregabalin may increase risk for death when coprescribed with opioids

Product monograph should be revised to include a warning about the risk for serious adverse events when combined with opioids

Abstract: http://annals.org/aim/article/doi/10.7326/M18-1136

Editorial: http://annals.org/aim/article/doi/10.7326/M18-2175

URLs go live when the embargo lifts

Pregabalin, an anticonvulsant increasingly prescribed as an adjunct for chronic pain, is associated with an increased risk for opioid-related death when coprescribed with opioids. A brief research report is published in Annals of Internal Medicine.

Pregabalin can be sedating and may augment central nervous system (CNS) depression in patients also receiving opioids. Because more than one half of Ontario residents who initiate pregabalin therapy are concurrently prescribed an opioid, determining the risk for opioid-related death among persons coprescribed these medications has important clinical implications.

Researchers from St. Michael's Hospital and the University of Toronto identified a cohort of persons aged 15 to 105 who received publicly funded opioid prescriptions between August 1997 and December 2016. Case patients who died of an opioid-related cause were matched by age, sex, and other characteristics to up to four control participants. After adjusting for multiple variables, the researchers concluded that concomitant exposure with opioids and pregabalin in the preceding 120 days was associated with significantly increased odds of opioid-related death compared with exposure to opioids alone.

According to the researchers, these findings warrant a revision to the pregabalin product monograph, which does not currently warn about the risk for serious adverse events when combined with opioids.

Media contact: For an embargoed PDF, please contact Lauren Evans at laevans@acponline.org. To interview the lead author, Tara Gomes, MHSc, PhD, please contact Debora Creatura at deborah.creatura@ices.on.ca.

3. Studies needed to determine the effects of marijuana use in pregnancy

Abstract: http://annals.org/aim/article/doi/10.7326/M18-1141

URLs go live when the embargo lifts

Marijuana use among pregnant women has increased over the past decade, and research is needed to determine its effect on fetal and neonatal outcomes. Authors from the Kaiser Permanente Northern California health system describe the limitations of previous studies and outline a strategy for future research in Annals of Internal Medicine.

With the growing acceptance, accessibility, and legalization of marijuana, its use is likely to rise among pregnant women. As such, it is crucial to understand the effects of prenatal exposure. A weakness of previous studies is that they did not fully account for confounding factors. The authors present new data on the high rate of substance co-use among pregnant patients in the Kaiser Permanente Northern California health system, who were universally screened for prenatal substance use between 2009 and 2016. More than a third of women who screened positive for marijuana use also had a positive result for at least one other substance. This highlights the importance of controlling for co-occurring prenatal substance use to accurately detect marijuana-specific health risks.

According to the authors, well-designed retrospective and prospective cohort studies should include current, large, representative populations, and use validated measures of marijuana exposure while adjusting for other types of substance use. Large health care systems like Kaiser Permanente are well-positioned for this type of research, the results of which can help women make informed decisions about marijuana use during pregnancy.

Media contact: For an embargoed PDF, please contact Lauren Evans at laevans@acponline.org. To interview the lead author, Kelly C. Young-Wolff, PhD, please contact Brett Israel at Brett.T.Israel@kp.org.

Also new in this issue:

Clinical-Community Partnerships to Reduce Food Insecurity Among High-Need, High-Cost Medicaid Patients

Katherine Rediger, MSN, CRNP; D.R. Bailey Miles, MD

Ideas and Opinions

Abstract: http://annals.org/aim/article/doi/10.7326/M18-1104

Association Between Publication Characteristics and Treatment Effect Estimates

Agnes Dechartres, MD, PhD; Ignacio Atal, MSc; Carolina Riveros, MSc; Joerg Meerpohl, MD; Philippe Ravaud, MD, PhD

Research and Reporting Methods

Abstract: http://annals.org/aim/article/doi/10.7326/M18-1517

Credit: 
American College of Physicians

Young, healthy people still vulnerable to CVD if their LDL cholesterol is high

DALLAS, August 20, 2018 -- Young, healthy people may still face a lifetime risk of premature death from cardiovascular disease if they cannot keep their cholesterol levels in check, according to new observational research in the American Heart Association's journal Circulation.

Researchers in this latest study looked at associations between low-density lipoprotein cholesterol (LDL-C) and non-high-density lipoprotein cholesterol (non-HDL-C) thresholds and cardiovascular disease (CVD) and coronary heart disease (CHD) mortality, to evaluate whether people believed to be at low 10-year risk for heart problems should begin lowering elevated cholesterol earlier through lifestyle changes and, in some cases, cholesterol-lowering medication.

Coronary heart disease remains the leading cause of death in the United States, affecting half of all men and one-third of all women. An estimated 28.5 million Americans have total cholesterol levels of 240 mg/dL or higher. LDL is a type of cholesterol that contributes to clogged arteries, which increases the risk of heart attack and stroke.

"High cholesterol at younger ages means there will be a greater burden of cardiovascular disease as these individuals age. This research highlights the need to educate Americans of any age on the risks of elevated cholesterol, and ways to keep cholesterol at a healthy level throughout life," said Robert Eckel, M.D., past president of the American Heart Association and Director of the Lipid Clinic at University of Colorado Hospital in Aurora. Eckel has been active in developing the AHA's Check.Change.Control.Cholesterol™ initiative to help providers and patients work together to identify cardiovascular health risks.

Clinical trials typically have focused on individuals at moderate or high risk for cardiovascular disease. This observational study included 36,375 young, relatively healthy participants of the Cooper Center Longitudinal Study who were free of diabetes or cardiovascular disease and were followed for 27 years. Among these low-risk individuals, the researchers found that LDL levels were independently associated with increased chances of dying from cardiovascular disease. Before accounting for other risk factors, the researchers also found:

Compared with participants who had LDL readings of under 100 mg/dL, those with LDL levels in the range of 100-159 mg/dL had a 30 to 40 percent higher risk of cardiovascular disease death.

Those with LDL levels of 160 mg/dL or higher had a 70 to 90 percent increased risk of cardiovascular death, compared with participants who had LDL readings of under 100 mg/dL.

Among the group (72 percent men, average age 42), there were 1,086 deaths from cardiovascular disease, such as stroke, and 598 coronary heart disease deaths.

"Our study demonstrates that having a low 10-year estimated cardiovascular disease risk does not eliminate the risk posed by elevated LDL over the course of a lifetime," said lead study author Shuaib Abdullah, M.D., at University of Texas Southwestern Medical Center and Veteran's Affairs North Texas Healthcare System in Dallas, Texas. The study was done in collaboration with investigators from the Cooper Institute. "Those with low risk should pursue lifestyle interventions, such as diet and exercise, to achieve LDLs levels as low as possible, preferably under 100 mg/dL. Limiting saturated fat intake, maintaining a healthy weight, discontinuing tobacco use, and increasing aerobic exercise should apply to everyone."

Credit: 
American Heart Association

Chagas disease, caused by a parasite, is spreading

DALLAS, Aug. 20, 2018 -- Chagas disease, caused by infection with a parasite called Trypanosoma cruzi (T cruzi), causes chronic heart disease in about one third of those infected. Over the past 40 years, Chagas disease has spread to areas where it had not traditionally been seen, including the United States, according to a new American Heart Association scientific statement published in the American Heart Association journal Circulation.

The statement summarizes the most up-to-date information on diagnosis, screening and treatment of T cruzi infection. Infection occurs when feces from the triatomine, an infected blood-sucking insect, enter the body through the bite site or the eye. Triatomine insects are found in Central and South America, where they infest adobe houses, and in the southern United States. The disease can also be passed through contaminated food or drink, from pregnant mothers to their babies, and through blood transfusions and organ transplants.

The health risks of Chagas disease are well-known in Latin America where most cases are found in countries that include Brazil, Argentina, Bolivia, Paraguay, Mexico and El Salvador. However, doctors outside of Latin America are largely unaware of the infection and its connection to heart disease. Countries where infected individuals have been diagnosed include the United States with an estimated 300,000 cases, Spain with at least 42,000 cases, Italy, France, Switzerland, the United Kingdom, Australia and Japan.

"This statement aims to increase global awareness among physicians who manage patients with Chagas disease outside of traditionally endemic environments," said Maria Carmo Pereira Nunes, M.D., Ph.D, co-chair of the committee that produced the statement. "This document will help healthcare providers and health systems outside of Latin America recognize, diagnose and treat Chagas disease and prevent further disease transmission," said Pereira Nunes, who is a cardiologist at the Federal University of Minas Gerais in Belo Horizonte, Brazil.

Although 60-70 percent of people infected with T cruzi never develop any symptoms, those who do can develop heart disease, including heart failure, stroke, life-threatening ventricular arrhythmias (heart rhythm abnormalities) and cardiac arrest. In the Americas, Chagas disease is responsible for more than seven times as many disability-adjusted life-years lost as malaria. However, if caught early, an infection can be cured with medications that have a 60 to 90 percent success rate, depending on when in the course of infection the patient is treated.

"Early detection of Chagas disease is critical, allowing prompt initiation of therapy when the evidence for cure is strong," said statement co-author Caryn Bern, M.D., M.P.H., professor of epidemiology and biostatistics at the University of California in San Francisco.

The risk of infection is extremely low for most travelers to and residents of endemic countries. To minimize risk, people should avoid sleeping in houses with unplastered adobe walls and/or thatch roofs, and avoid unpasteurized sugar cane juice, açai fruit juice and other juices when visiting affected countries.

Credit: 
American Heart Association

Energy-efficient spin current can be controlled by magnetic field and temperature

The transition from light bulbs to LEDs has drastically cut the amount of electricity we use for lighting. Most of the electricity consumed by incandescent bulbs was, after all, dissipated as heat. We may now be on the verge of a comparable breakthrough in electronic computer components. Up to now, these have been run on electricity, generating unwanted heat. If spin current were employed instead, computers and similar devices could be operated in a much more energy-efficient manner. Dr. Olena Gomonay from Johannes Gutenberg University Mainz (JGU) in Germany and her team together with Professor Eiji Saitoh from the Advanced Institute for Materials Research (AIMR) at Tohoku University in Japan and his work group have now discovered an effect that could make such a transition to spin current a reality. This effect significantly simplifies the design of fundamental spintronic components.

Touch a computer that has been running for some time and you will feel heat. This heat is an undesirable side effect of the electric current: undesirable because generating it also consumes energy. We are all familiar with this effect from incandescent bulbs, which became so hot after hours of use that they could burn your fingers; they converted only a fraction of the energy they consumed into light. The energy used by LEDs, on the other hand, goes almost entirely into lighting, which is why they don't become hot and why they are significantly more energy-efficient than traditional incandescent bulbs.

Instead of using an electric current composed of charged particles, a computer could perform calculations using a stream of particles with nonzero spin, manipulating the material of its components in the same way. The primary difference is that no heat is generated, so the processes are much more energy-efficient. Dr. Olena Gomonay from Mainz University and Professor Eiji Saitoh from Tohoku University have now laid the foundations for using these spin currents. More precisely, they have taken the concept of spin currents and applied it to a specific material. Gomonay compares the spin currents involved with how our brains work: "Our brains process immeasurable amounts of information, but they don't heat up in the process. Nature is, therefore, way ahead of us." The team from Mainz is hoping to emulate this model.

Drastic change in current flow

How well spin currents flow depends on the material - just like in the case of electric current. While spin currents can always flow in ferromagnetic materials, in antiferromagnetic materials states with low resistance alternate with those with high resistance. "We have now found a way to control spin currents by means of a magnetic field and temperature, in other words, to control the resistance of an antiferromagnetic system based on spin," explained Gomonay, summarizing her results.

At a temperature close to the phase transition temperature, Gomonay and her team applied a small magnetic field to the material. While the applied magnetic field alters the spin orientation in the material so that spin currents can be transported easily, the temperature has two effects. On the one hand, a higher temperature puts more particles of the material into excited states, meaning there are more spin carriers that can be transported, which makes spin transport easier. On the other hand, the high temperature makes it possible to operate at a low magnetic field.

Thus the resistance and the current flow change drastically by several orders of magnitude. "This effect, which we call spin colossal magnetoresistance or SCMR for short, has the potential to simplify the design of fundamental spintronic components significantly," explained the scientist from Mainz. This is particularly interesting for storage devices such as hard disks. This effect might be employed, for example, to create spin current switches as well as spin current based storage media.

Credit: 
Johannes Gutenberg Universitaet Mainz

Blood test could detect kidney cancer up to five years prior to clinical diagnosis

image: This is Rupal Bhatt, M.D., Ph.D., corresponding author and medical oncologist at Beth Israel Deaconess Medical Center.

Image: 
Beth Israel Deaconess Medical Center

BOSTON - Every year, more than 330,000 people are diagnosed with kidney cancer worldwide. More than 80 percent of those new cases are renal cell carcinomas (RCC). When caught early, the five-year survival rate is more than 90 percent. Patients diagnosed with more invasive tumors, however, have dramatically poorer prognoses, with five-year survival rates of 50 percent and 10 percent for patients diagnosed at stages III and IV, respectively. Early detection could improve the overall survival rate in patients at high risk for death from RCC.

Now, a team of investigators led by Beth Israel Deaconess Medical Center (BIDMC) medical oncologist Rupal Bhatt, MD, PhD, has demonstrated that KIM-1, a protein present at elevated levels in the blood of some patients with renal cell carcinoma at the time of diagnosis, can also serve as a tool to predict the disease's onset up to five years prior to diagnosis. The team's findings were published in the journal Clinical Cancer Research.

"Our study found a significant association between plasma KIM-1 concentrations and the risk of renal cell carcinomas," said Bhatt, corresponding author of the study and an Associate Professor of Medicine at Harvard Medical School. "The team also found that KIM-1 concentrations were associated with poorer survival. Further studies are needed, but a sensitive and specific tumor marker that can detect early stage RCC would have strong potential to improve overall survival."

Bhatt and colleagues, including co-first author David Muller, PhD, a research fellow in Epidemiology and Biostatistics at Imperial College London, analyzed data from the European Prospective Investigation into Cancer and Nutrition (EPIC), one of the world's largest cohort studies investigating the links between diet, lifestyle, environment and chronic diseases, including cancer.

When the team compared KIM-1 concentrations in samples from EPIC participants who developed RCC within five years with samples from participants who remained healthy, they found the average concentration of KIM-1 was twice as high in those eventually diagnosed with RCC. What's more, incorporating KIM-1 concentrations into a model for predicting kidney cancer risk approximately doubled the model's accuracy.

"This work is a big step forward, because KIM-1 is the only blood biomarker shown prospectively to distinguish between people at high and low risk of kidney cancer," said Muller. "But more work will be necessary before we could see this in the clinic."

"It will be important to understand more about the settings in which KIM-1 might be incorporated into patient care," added Bhatt. "We don't expect that KIM-1 will be useful as a screening test, as risk of RCC in the general population is low. KIM-1 is more likely to be relevant in high-risk populations or as an adjunct to other diagnostic procedures."

Credit: 
Beth Israel Deaconess Medical Center

Blood test may identify gestational diabetes risk in first trimester, NIH study indicates

A blood test conducted as early as the 10th week of pregnancy may help identify women at risk for gestational diabetes, a pregnancy-related condition that poses potentially serious health risks for mothers and infants, according to researchers at the National Institutes of Health and other institutions. The study appears in Scientific Reports.

Gestational diabetes occurs only in pregnancy and results when the level of blood sugar, or glucose, rises too high. Gestational diabetes increases the mother's chances for high blood pressure disorders of pregnancy and the need for cesarean delivery, and the risk for cardiovascular disease and type 2 diabetes later in life. For infants, gestational diabetes increases the risk for large birth size. Unless they have a known risk factor, such as obesity, women typically are screened for gestational diabetes between 24 and 28 weeks of pregnancy.

In the current study, researchers evaluated whether the HbA1c test (also called the A1C test), commonly used to diagnose type 2 diabetes, could identify signs of gestational diabetes in the first trimester of pregnancy. The test approximates the average blood glucose levels over the previous 2 or 3 months, based on the amount of glucose that has accumulated on the surface of red blood cells. According to the authors, comparatively few studies have examined whether the HbA1c test could help identify the risk for gestational diabetes, and these studies have been limited to women already at high risk for the condition. The test is not currently recommended to diagnose gestational diabetes at any point in pregnancy.

The researchers analyzed records from the NICHD Fetal Growth Study, a large observational study that recruited more than 2,000 low-risk pregnant women from 12 U.S. clinical sites between 2009 and 2013. The researchers compared HbA1c test results from 107 women who later developed gestational diabetes to test results from 214 women who did not develop the condition. Most of the women had tests at four intervals during pregnancy: early (weeks 8-13), middle (weeks 16-22 and 24-29) and late (weeks 34-37).

Women who went on to develop gestational diabetes had higher HbA1c levels (an average of 5.3 percent), compared to those without gestational diabetes (an average HbA1c level of 5.1 percent). Each 0.1 percent increase in HbA1c above 5.1 percent in early pregnancy was associated with a 22 percent higher risk for gestational diabetes.
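
To put that increment in perspective, here is a back-of-the-envelope illustration; it assumes the 22 percent increase applies multiplicatively per 0.1-point step, which the summary above does not state explicitly. A woman with an early-pregnancy HbA1c of 5.4 percent sits three increments above the 5.1 percent reference, so her associated risk would be roughly

$$ 1.22^{(5.4 - 5.1)/0.1} = 1.22^{3} \approx 1.8, $$

that is, about an 80 percent higher risk of gestational diabetes than a woman at 5.1 percent.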

In middle pregnancy, HbA1c levels declined for both groups. However, HbA1c levels increased in the final third of pregnancy, which is consistent with the decrease in sensitivity to insulin that often occurs during this time period.

"Our results suggest that the HbA1C test potentially could help identify women at risk for gestational diabetes early in pregnancy, when lifestyle changes may be more effective in reducing their risk," said the study's senior author, Cuilin Zhang, Ph.D., of the Epidemiology Branch at NIH's Eunice Kennedy Shriver National Institute of Child Health and Human Development.

Exercise and a healthy diet may lower blood glucose levels during pregnancy. If these measures are not successful, physicians may prescribe insulin to bring blood glucose under control.

The authors noted that further studies are needed to confirm whether measuring HbA1c levels in early pregnancy could determine a woman's later risk for gestational diabetes. Similarly, research is needed to determine whether lowering HbA1c with lifestyle changes, either in early pregnancy or before pregnancy, could reduce the risk for the condition.

Credit: 
NIH/Eunice Kennedy Shriver National Institute of Child Health and Human Development

First-of-its-kind Parkinson's biomarker guidelines invigorate drive for treatments

PHILADELPHIA--Parkinson's disease affects more than 4 million people worldwide, with numbers projected to double in the next few decades. With no known cure, there is a race for treatments to slow or stop the progression of the disease. Key to the research and discovery of treatments for Parkinson's is the identification of biomarkers--measurable biological indicators, such as proteins found in blood, that can help diagnose disease.

Today, a slate of guidelines to shape the future of Parkinson's biomarker research was published in Science Translational Medicine. While previous recommendations have been created to support research on Parkinson's biomarkers, these are the first developed in collaboration with institutions outside of academic medicine, including The Michael J. Fox Foundation for Parkinson's Research.

Biomarkers can not only help predict, diagnose, or monitor disease, but they can also be used to see how well the body responds to a treatment for a disease or condition. For example, within Alzheimer's disease, measures of the protein beta-amyloid help diagnose the disease, and also serve as a drug target in clinical trials. Similarly, measuring cholesterol levels can help with the diagnosis and treatment of cardiovascular disease.

Lead author Alice Chen-Plotkin, MD, the Parker Family Associate Professor of Neurology in the Perelman School of Medicine at the University of Pennsylvania, led the project in partnership with experts from 36 organizations, including government groups, academic institutions, and non-profit funding agencies, to foster collaboration and discovery of these critical biomarkers.

"These players at times have acted in separate worlds, but with a disease affecting so many and lacking in disease-modifying therapies, we're coming together for essential collaboration and innovation," Chen-Plotkin said. "Biomarkers to bolster our efforts to develop new therapies are urgently needed. These guidelines can help make the discovery of biomarkers for Parkinson's a reality."

The guidelines focus on three areas--recommendations for types of biomarkers researchers should identify in order to aid the development of new treatments, resources for collaboration, and research principles to follow.

Previous research efforts have largely focused on biomarkers that distinguish patients with Parkinson's disease from healthy individuals or those with other neurodegenerative diseases such as Alzheimer's disease. However, these guidelines argue for a shift toward biomarkers that look within Parkinson's disease itself, as the disease manifests in patients in many different ways. This is an important element for planning clinical trials and developing new treatments.

Researchers have already built an ecosystem of biobanks at different centers across the world, which hold thousands of biological samples including blood and tissue. These biobanks hold many clues that could help propel the next breakthroughs in the treatment of neurodegenerative diseases, and the guidelines list recommended biobanks as a resource for researcher collaborations.

"Before the advent of these shared biobanks, investigators depended on their own ability to collect hundreds or thousands of samples for testing, preventing potential researchers lacking access to large clinical populations from entering the biomarker discovery arena," said Chen-Plotkin. "However, within the last five years, multiple public-private efforts have laid the groundwork for investigators from both academic and industrial sectors to access well-documented clinical samples. These repositories are all open for collaboration to improve the pipeline to take Parkinson's biomarkers from concept to clinic."

The guidelines also include recommendations for biomarker research standards, such as larger sample sizes and replication across multiple patient groups. These principles will harmonize findings, streamlining and advancing the biomarker discovery process.

Ideas in the new paper originated from a Biomarkers Discovery Workshop convened by The Michael J. Fox Foundation in New York in March of 2016. They were further developed in discussions at the National Institute of Neurological Disorders and Stroke Parkinson's Disease Biomarkers Program annual meeting in Washington, DC, in August, 2016.

Credit: 
University of Pennsylvania School of Medicine

Eating breakfast burns more carbs during exercise and accelerates metabolism for next meal

image: New research published in the American Journal of Physiology suggests that eating breakfast could 'prime' the body to burn carbohydrates during exercise and more rapidly metabolise foods after working out.

Image: 
Javier Gonzalez

Eating breakfast before exercise may "prime" the body to burn carbohydrates during exercise and more rapidly digest food after working out, University of Bath researchers have found.

Scientists from the University's Department for Health, working with colleagues at the universities of Birmingham, Newcastle and Stirling, studied the effect of eating breakfast versus fasting overnight before an hour's cycling. In a control test, breakfast was followed by three hours' rest. The volunteers ate a breakfast of porridge made with milk two hours before exercise.

After exercise or rest, the researchers tested the blood glucose and muscle glycogen levels of the 12 healthy male volunteers who took part.

They discovered that eating breakfast increased the rate at which the body burned carbohydrates during exercise, as well as the rate at which it digested and metabolised food eaten after exercise.

Dr Javier Gonzalez, senior lecturer in the Department for Health who co-led the study, said: "This is the first study to examine the ways in which breakfast before exercise influences our responses to meals after exercise. We found that, compared to skipping breakfast, eating breakfast before exercise increases the speed at which we digest, absorb and metabolise carbohydrate that we may eat after exercise."

Rob Edinburgh, PhD student in the Department for Health who co-led the study, said: "We also found that breakfast before exercise increases carbohydrate burning during exercise, and that this carbohydrate wasn't just coming from the breakfast that was just eaten, but also from carbohydrate stored in our muscles as glycogen. This increase in the use of muscle glycogen may explain why there was more rapid clearance of blood sugar after 'lunch' when breakfast had been consumed before exercise.

"This study suggests that, at least after a single bout of exercise, eating breakfast before exercise may 'prime' our body, ready for rapid storage of nutrition when we eat meals after exercise."

The study is published in American Journal of Physiology: Endocrinology and Metabolism.

An interesting aspect of this research is that it shows that extrapolating from studies of fasted people, as is common in metabolism experiments, may not be reliable, because being fed alters metabolism.

Dr Gonzalez added: "Whilst fasting prior to laboratory trials is common in order to control for baseline metabolic status, these conditions may preclude the application of findings to situations most representative of daily living, because most people are not fasted during the day."

Rob Edinburgh said: "As this study only assessed the short-term responses to breakfast and exercise, the longer-term implications of this work are unclear, and we have ongoing studies looking at whether eating breakfast before or after exercise on a regular basis influences health.

"In particular there is a clear need for more research looking at the effect of what we eat before exercise on health outcomes, but with overweight participants who might be at an increased risk of type 2 diabetes and cardiovascular disease. These are some of the questions we will now try to answer."

Credit: 
University of Bath

Remifentanil during labor could halve the number of women needing an epidural

Half as many women in labour who were given a drug called remifentanil to help manage their pain needed a subsequent epidural, compared with women given pethidine, the current standard of care, according to an open-label randomised controlled trial of 400 women from 14 maternity units in the UK, published in The Lancet.

Epidurals, injections of pain-relief drugs around the spinal cord, provide effective pain relief but increase the risk of needing an instrumental delivery (forceps or vacuum) during birth, which in turn can increase the risk of trauma and long-lasting problems for the mother, such as incontinence and sexual dysfunction.

Pethidine is given to more than a quarter of a million women in labour each year in the UK, and many more worldwide. In the UK and Europe, remifentanil is rarely offered routinely in labour, and its use is restricted to women who cannot receive an epidural for medical reasons (such as blood clotting disorders). The authors suggest that using remifentanil instead of pethidine could reduce the need for epidurals, instrumental deliveries, and consequent morbidity for large numbers of women worldwide, but recommend more research to understand the effects of the low maternal oxygen levels it can cause.

"Previous studies have shown that at least one in three women given pethidine to manage pain during labour require a subsequent epidural as the drug is not always effective. It also has unwanted side effects such as sedation and nausea for the mother, and it may pass into the baby's bloodstream through the placenta," says lead author Dr Matthew Wilson, University of Sheffield, UK. "Our findings challenge the routine use of pethidine for pain relief during labour. Remifentanil reduced the need for an epidural by half and there were no lasting problems for the mothers and babies in our trial, although the effect of remifentanil on maternal oxygen levels needs to be clarified in further studies." [1]

The study included 400 women aged over 16 years old who were giving birth after 37 weeks and had requested opioid pain relief. The participants were told about the trial in antenatal visits or when admitted to hospital to have labour induced, and could sign up to take part when they were in active labour.

Half of the women were allocated to receive remifentanil, and half were allocated to pethidine. Remifentanil was given as a patient-controlled drip: women could receive 40μg of the drug every two minutes by pressing a handheld device. Pethidine was given as an injection of 100mg of the drug into a muscle up to every four hours, with a maximum of 400mg in 24 hours. Because of the difference in how the drugs were given, participants and healthcare professionals knew which drug was being used.
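
For scale, some rough arithmetic from the doses above (not a figure reported in the trial; the two opioids also differ greatly in potency and duration of action, so milligram amounts are not directly comparable): pressing the device at every two-minute lockout would deliver at most

$$ 40\,\mu\mathrm{g} \times \frac{60\,\mathrm{min/h}}{2\,\mathrm{min}} = 1200\,\mu\mathrm{g/h} = 1.2\,\mathrm{mg/h} $$

of remifentanil, versus pethidine's ceiling of $400\,\mathrm{mg}/24\,\mathrm{h} \approx 16.7\,\mathrm{mg/h}$. Remifentanil's rapid onset and offset are what make such a short patient-controlled lockout workable.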

All women in the trial received one-to-one care from a midwife, with checks on the mother's breathing rate, sedation, pain ratings and oxygen levels every 30 minutes. The women could request an epidural at any time, and other pain relief was stopped if an epidural was given.

In the remifentanil group 93% (186/201) of women received the drug, and in the pethidine group 77% (154/200) of women received the drug. The main reasons for not receiving the allocated drug were women giving birth before it could be administered (12 women in the remifentanil group and 17 women in the pethidine group) or the mother deciding to immediately request an epidural after randomisation, without receiving the allocated opioid, which only occurred in the pethidine group (in 22 women).

Half as many women in the remifentanil group went on to have an epidural (19% [39/201]) as in the pethidine group (41% [81/199]), and this difference persisted even when the women who did not receive their allocated drug were excluded.

On average, women in the remifentanil group rated their pain as less severe than women in the pethidine group.

Women given remifentanil were also less likely to need forceps and vacuum during labour than women given pethidine (15% [31/201] vs 26% [52/199]).

However, remifentanil was associated with twice as many mothers having low oxygen levels as pethidine (14% [26/189] vs 5% [8/154]), and more women in the remifentanil group were given supplementary oxygen. Despite this increase, no negative effects for mother or baby were observed, but more research in larger groups will be needed to confirm this.

Breathing problems and sedation in the mother were rare in both groups.

The authors note some limitations, including that the higher number of women in the pethidine group opting for an epidural immediately could reflect preconceptions about pethidine's effectiveness. Once a woman requested an epidural, it would have been unethical to withhold it. However, excluding these episodes from the analysis made little difference to the findings.

Women whose labour was induced were more common in the study than in the general population, because induction provides extra time before labour and made recruitment easier. However, induction is very common, so the authors believe their findings are still relevant.

Writing in a linked Comment, Professor Peter Kranke, University Hospitals of Wuerzburg, Germany, says: "In returning to the initial question of whether remifentanil is better than pethidine, we can carefully say yes in view of the efficacy data by reducing the epidural progression rate by 50%, with the proviso that the reported effect for this conversion to epidural might have been overestimated by an unknown magnitude, and providing that safety issues are thoroughly addressed. For all those who would want a more precise answer, this finding gives impetus to further studies, particularly with regards to the selection of women with contraindications or those who express the explicit desire not to have an epidural analgesia. In this scenario, at least the prospect of the gold standard cannot bias the treatment effect."

Credit: 
The Lancet

Rare cancer could be caught early using simple blood tests

image: A myeloma H&E (hematoxylin and eosin) stain.

Image: 
Wikimedia Commons

A pioneering study into myeloma, a rare cancer, could lead to GPs using simple blood tests to improve early diagnosis.

The study investigated the best combination of blood tests that could be used to diagnose myeloma in GP practices.

The research was a collaboration between the University of Oxford, the University of Exeter and Chiddenbrook Surgery, Crediton, funded by the National Institute for Health Research (NIHR) and is published in the British Journal of General Practice.

Researchers investigated how useful a number of different measures were for indicating the presence of the disease, and suggested which combinations of these tests are sufficient to rule out the disease, sparing patients the worry of an unnecessary specialist referral, and which support a diagnosis.

Blood tests from 2,703 cases, taken up to five years prior to diagnosis, were analysed and compared with those of 12,157 patients without the cancer, matching cases with control patients of similar age among other relevant parameters.

They demonstrated that a simple combination of two blood parameters could be enough to diagnose patients. Such blood tests are routinely conducted in GP surgeries.

Constantinos Koshiaris, lead author of the study, from Oxford University, said: "The combination of levels of haemoglobin, the oxygen carrier in the blood, and one of two inflammatory markers (erythrocyte sedimentation rate or plasma viscosity) is a sufficient test to rule out myeloma. If abnormalities are detected in this test, it should lead to urgent urine protein tests, which can help speed up diagnosis."

Each year approximately 5,700 people are diagnosed with myeloma in the UK alone. It can lead to symptoms such as bone pain, fatigue and kidney failure. It has the longest diagnosis process of all common cancers, and a large number of patients are diagnosed only after emergency care, over a third of whom have had at least three prior primary care consultations.

Professor Willie Hamilton, of the University of Exeter Medical School, is principal investigator on the study. He said: "Ordinarily a GP will see a patient with myeloma every five years, and early diagnosis matters. More timely treatment could significantly improve survival rates for this disease. We report a simple way a GP can check patients presenting with symptoms such as back, rib and chest pain, or recurrent chest infections, and determine whether they have myeloma or not."

The authors also suggest the possibility of integrating a system in the electronic health record to alert clinicians to relevant symptoms or changes in blood parameters related to myeloma.

Credit: 
University of Exeter

Tdap vaccination for pregnant women does not increase risk of autism

A Kaiser Permanente study of more than 80,000 children born over a 4-year period showed that the prenatal Tdap vaccination (tetanus, diphtheria, acellular pertussis) was not associated with increased risk of autism spectrum disorder in children. The study was published today in Pediatrics.

"Infants are at the highest risk of hospitalization and death among any population subgroup after contracting a pertussis infection, a highly contagious respiratory disease also known as the whooping cough," said Tracy A. Becerra-Culqui, PhD, a post-doctoral research fellow with Kaiser Permanente Southern California's Department of Research & Evaluation and lead author of the study. "With waning immunity against pertussis in the United States, it has become very important for pregnant women to be immunized against pertussis. It is an immunity they pass on to their unborn baby."

"Pregnant women can be reassured by this study that there is no indication of an increased risk of autism spectrum disorder in children after being exposed prenatally to the Tdap vaccine," Becerra-Culqui added.

The Advisory Committee on Immunization Practices, which provides guidance on the use of vaccines for the United States, recommends pregnant women receive the Tdap vaccine to prevent pertussis infection, but some women still hesitate.

Kaiser Permanente researchers were able to comprehensively study the hypothesized link between Tdap and autism because of the organization's large and diverse patient population. In Southern California, Kaiser Permanente provides health care in 15 hospitals and about 220 medical offices to approximately 4.4 million members who are broadly representative of the area's population. Recommended vaccinations are free to all members.

This retrospective cohort study looked at autism diagnoses among children born at Kaiser Permanente hospitals in Southern California between Jan. 1, 2011, and Dec. 31, 2014.

The study included 81,993 children and found that:

Prenatal Tdap vaccination coverage ranged from 26 percent for the 2012 birth cohort to 79 percent for the 2014 birth cohort.

The autism spectrum disorder incidence rate in children was 1.5 percent in the maternal Tdap vaccinated group and 1.8 percent in the maternal unvaccinated group, comparable to autism rates in the United States (1.7 percent).

Analyses of the data extracted from electronic health records showed that Tdap vaccination during pregnancy was not associated with increased autism spectrum disorder risk in children.

Results were consistent across study birth years and among first-born children.

"The link between vaccination and development of autism has been refuted by many rigorous scientific investigations. Unfortunately, the misconceptions still generate concerns," said the paper's senior author, Hung Fu Tseng, PhD, of the Department of Research & Evaluation.

"Given the increasing practice to vaccinate pregnant women with Tdap vaccine, it was important to address the concern of a link between maternal vaccination and subsequent development of autism spectrum disorder in children," he added. "We hope that our findings reassure parents that Tdap vaccination during pregnancy was not associated with autism in children."

Credit: 
Kaiser Permanente

WPSI says screen all women annually for urinary incontinence

Women's Preventive Services Initiative says screen all women annually for urinary incontinence

Review: http://annals.org/aim/article/doi/10.7326/M18-0225
Guideline (Free): http://annals.org/aim/article/doi/10.7326/M18-0595
Editorial: http://annals.org/aim/article/doi/10.7326/M18-1768
URLs go live when the embargo lifts

All women should be screened annually for urinary incontinence, according to new guidelines from the Women's Preventive Services Initiative (WPSI). Screening should assess whether women experience urinary incontinence and whether it affects their activities and quality of life. If treatment is indicated, women should be referred for further evaluation. The clinical guideline and evidence review are published in Annals of Internal Medicine.

Urinary incontinence, or the involuntary loss of urine, affects an estimated 51 percent of women overall and can adversely affect a woman's physical, functional, and social well-being. However, many women are reluctant to discuss urinary incontinence with their health care providers, so they may endure symptoms for a long time before the issue is addressed, and the condition often goes unrecognized.

Researchers from Oregon Health and Science University conducted a systematic review of published studies to evaluate whether screening for urinary incontinence in women not previously diagnosed improved physical and functional outcomes. They also assessed studies on the accuracy of screening methods and potential harms. The researchers found that no studies evaluated the overall effectiveness or harms of screening. Limited evidence suggested that some screening methods (brief questionnaires) had fairly high accuracy for identifying symptoms of urinary incontinence in primary care settings.

Despite the lack of direct evidence, the WPSI asserts that screening has the potential to identify urinary incontinence in many women who silently experience its adverse effects. Because early intervention may reduce symptom progression, improve quality of life, and limit the need for more complex and costly treatment, the WPSI recommends annual screening for women of all ages.

The authors of an accompanying editorial from the Women's Health Research Program at Monash University in Melbourne, Victoria, Australia argue that applying a screening test to a large population is a very serious responsibility and should be implemented with caution. The authors suggest advocating for a randomized trial to directly assess the benefits and harms of urinary incontinence screening in women before recommending it for all.

Media contact: For an embargoed PDF, please contact Lauren Evans at laevans@acponline.org. To interview the lead author of the evidence review, Heidi Nelson, MD, MPH, please contact Tracy Brawley at brawley@ohsu.edu. To interview a spokesperson for WPSI, please contact Angela Collom at acollom@acponline.org. To interview the editorialists, please email Robin J. Bell, MBBS, PhD, MPH at robin.bell@monash.edu.

Computer-aided colonoscopy reliably diagnoses small polyps that do not need to be removed
Leaving diminutive, nonneoplastic rectosigmoid polyps in place could save up to $33 million annually

Abstract: http://annals.org/aim/article/doi/10.7326/M18-0249
Editorial: http://annals.org/aim/article/doi/10.7326/M18-1901
URLs go live when the embargo lifts

Real-time computer-assisted diagnosis (CAD) for colonoscopies can help endoscopists reliably distinguish diminutive (less than 5 mm) polyps in the distal colon that do not have to be removed to reduce cancer risk. Diagnosing and leaving these polyps, rather than removing them, can save time and substantial costs annually. Findings from a prospective study are published in Annals of Internal Medicine.

Most cases of colorectal cancer develop from adenomas or sessile serrated polyps, and removal of these lesions is recommended. However, hyperplastic polyps, especially if small and located in the distal part of the large bowel, are not associated with subsequent development of adenomas or colorectal cancer. Therefore, they do not have to be removed to reduce cancer risk. Diagnosing and leaving these polyps would save time and expense, as the annual cost of unnecessary polypectomy is estimated to be $33 million in the United States.

Researchers from the Digestive Disease Center, Showa University Northern Yokohama Hospital, Yokohama, Japan, compared real-time CAD diagnoses, made with endocytoscopy, against pathologic readings of the resected specimens for 791 consecutive patients undergoing colonoscopy. They found that the CAD system achieved a 93.7 percent negative predictive value for identifying diminutive neoplastic polyps. According to the authors, these results suggest that computer-aided colonoscopy has the potential to replace histological assessment of diminutive colorectal polyps in the near future.
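For readers unfamiliar with the metric, negative predictive value (NPV) is the proportion of polyps the system calls non-neoplastic that pathology confirms as non-neoplastic. A minimal sketch of the arithmetic, using hypothetical counts rather than the study's data:

```python
# Minimal sketch (not from the study) of how negative predictive value (NPV)
# is computed for a diagnostic call such as CAD polyp characterization.
# The counts below are hypothetical, chosen only to illustrate the arithmetic.

def negative_predictive_value(true_negatives: int, false_negatives: int) -> float:
    """NPV = TN / (TN + FN): of all polyps the system calls non-neoplastic,
    the fraction that truly are non-neoplastic on pathology."""
    return true_negatives / (true_negatives + false_negatives)

# Hypothetical example: 252 polyps called non-neoplastic, 17 of which
# pathology later showed to be neoplastic.
print(f"NPV = {negative_predictive_value(235, 17):.1%}")  # ~93.3%
```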

Media contact: For an embargoed PDF, please contact Lauren Evans at laevans@acponline.org. To interview the lead author, Yuichi Mori, MD, PhD, please contact him directly at ibusiginjp@gmail.com.

TNF inhibitors not associated with increased cancer recurrence in patients with rheumatoid arthritis and a history of cancer

Abstract: http://annals.org/aim/article/doi/10.7326/M17-2812
URLs go live when the embargo lifts

Use of tumor necrosis factor inhibitors (TNFi) to treat rheumatoid arthritis (RA) is not associated with increased risk for cancer recurrence in patients with a history of cancer. However, meaningful risk increases could not be ruled out completely. Findings from a nationwide population-based cohort study are published in Annals of Internal Medicine.

Clinical guidelines have issued cautions about use of TNFi in patients with a history of cancer because these drugs may have tumor-promoting effects. TNFi are widely used to treat RA, which makes treating patients with RA and a history of cancer a clinical dilemma.

Researchers from the Karolinska Institutet, Stockholm, Sweden, studied data from national registries in Sweden to compare cancer recurrence rates in 467 patients who had started TNFi treatment for RA after a cancer diagnosis between 2001 and 2015 versus an individually matched cohort of 2,164 patients with RA and a similar cancer history who had never been treated with TNFi. They also compared recurrence rates between the TNFi patients and an unmatched cohort of 3,826 patients who were diagnosed with RA during the same timeframe but had no history of biologic treatment before inclusion. Cancer recurrence rates were similar in all of the groups, suggesting that TNFi treatment did not increase the risk for recurrent cancer. However, the authors caution that because several estimates had confidence intervals with upper limits around 2 or above, a clinically relevant increase in risk could not be ruled out.
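To see why those upper limits matter, consider how an incidence rate ratio and its Wald confidence interval are commonly computed in cohort comparisons of this kind. The sketch below uses hypothetical event counts and person-years, not the study's data:

```python
import math

def rate_ratio_ci(events_a, py_a, events_b, py_b, z=1.96):
    """Incidence rate ratio (group A vs. group B) with a 95% Wald CI,
    computed on the log scale: SE(log IRR) = sqrt(1/events_a + 1/events_b)."""
    irr = (events_a / py_a) / (events_b / py_b)
    se = math.sqrt(1 / events_a + 1 / events_b)
    low = math.exp(math.log(irr) - z * se)
    high = math.exp(math.log(irr) + z * se)
    return irr, low, high

# Hypothetical: 15 recurrences over 2,000 person-years among TNFi users vs.
# 70 recurrences over 10,000 person-years in a comparison cohort.
irr, low, high = rate_ratio_ci(15, 2000, 70, 10000)
print(f"IRR = {irr:.2f} (95% CI {low:.2f}-{high:.2f})")  # IRR = 1.07 (95% CI 0.61-1.87)
```

Even with a point estimate near 1, a small number of events yields an upper confidence bound approaching 2, which is the pattern the authors describe.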

Media contact: For an embargoed PDF, please contact Lauren Evans at laevans@acponline.org. To interview the lead author, please contact Pauline Raaschou, MD, PhD directly at pauline.raaschou@sll.se.

Also new in this issue:

Treatment of Acute Intoxication From Inhaled 1,2-Difluoroethane
Juan Pablo Arroyo, MD, PhD; Daniel C. Johnson, PharmD; Julia B. Lewis, MD; Ahmed Al Sheyyab, MB BS; Adam King, MD; Matthew R. Danter, MD; Stuart McGrane, MB, ChB; Joshua P. Fessel, MD, PhD
Observation: Case Report
Abstract: http://annals.org/aim/article/doi/10.7326/L18-0186

Credit: 
American College of Physicians

Genetic tools uncover cause of childhood seizure disorder missed by other methods

Some early childhood seizures result from a rare disease that begins in the first months of life. Researchers at University of Utah Health have developed high-tech tools to uncover the genetic cause of the most difficult-to-diagnose cases. The results are available online on August 13 in the journal npj Genomic Medicine.

"These tools let us peek in the dark corners and under the rug of the genome that other methods do not," said Aaron Quinlan, Ph.D., associate professor of Human Genetics and Biomedical Informatics at U of U Health and senior author on the paper. "With this approach rather than undergoing multiple tests, families can receive results faster, limiting their medical odyssey, at ultimately a lower cost."

According to Betsy Ostrander, M.D., early infantile epileptic encephalopathy (EIEE) begins with intractable seizures in the first months of life.

"Most patients are on four to five medications and still suffer from frequent, debilitating seizures, from once a week to 50-times a day," said Ostrander, assistant professor of Pediatrics at U of U Health and the Division of Pediatric Neurology at Primary Children's Hospital and first author on the paper.

If the condition is not diagnosed early and treated with the available medications, the seizures hinder normal development, leading to intellectual impairment and often an early death. Although more than 50 genes are associated with the disease, routine genetic tests fail to pinpoint the cause of the illness in half of cases, limiting the medical practitioner's ability to alleviate the child's symptoms.

Ostrander and her colleagues turned to experts in bioinformatics at the university to help them sift through the volume of genetic information obtained from 14 patients and their parents.

All of the patients in the study had previously undergone several rounds of genome testing, such as clinical gene panels and chromosomal microarrays, but these methods failed to find the genetic cause of their illness. According to Quinlan, these tests either had too low a resolution or focused only on previously identified causative genes, and so missed the mutation.

Quinlan and his team have created a suite of computational tools with powerful algorithms that scrutinize genetic data and identify errors that lead to disease. They applied these tools to the complete genetic information from the patients and their parents to pinpoint the changes in the genome responsible for the disease.
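The underlying idea of trio-based analysis, one core strategy for spotting the spontaneous (de novo) mutations described below, can be sketched simply. The genotype encoding and the filtering rule here are illustrative assumptions, not the actual logic of RUFUS, GEMINI, or the team's pipeline:

```python
# Illustrative sketch of trio-based de novo variant filtering. Genotypes are
# encoded as allele pairs, e.g. (0, 0) = homozygous reference, (0, 1) =
# heterozygous for one alternate allele. This simplified rule is an
# assumption for illustration, not the actual RUFUS/GEMINI implementation.

def is_candidate_de_novo(child, mother, father):
    """Flag a variant when the child carries an alternate allele
    that appears in neither parent's genotype."""
    child_alts = set(child) - {0}
    parent_alleles = set(mother) | set(father)
    return bool(child_alts) and child_alts.isdisjoint(parent_alleles)

# Hypothetical variant: child heterozygous, both parents homozygous reference.
print(is_candidate_de_novo((0, 1), (0, 0), (0, 0)))  # True
print(is_candidate_de_novo((0, 1), (0, 1), (0, 0)))  # False (inherited)
```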

In the study, they found that a spontaneous mutation was responsible for EIEE in 12 of the 14 patients. In one of these patients, the mutation was found in a gene not previously associated with the disease. The researchers also identified large structural changes (a translocation and a duplication) in the genomes of the remaining two patients. These structural changes affected genes previously linked to EIEE but were undetectable with standard genome testing techniques.

"These families have been drifting through expensive prolonged testing with little hope of finding an answer," Ostrander said. "We can now identify the genetic cause of EIEE and select medications best suited to each patient to decrease the frequency of seizures earlier and hopefully prevent developmental delays."

Quinlan admits that cost is still a limiting factor in deploying this approach more widely. In addition, not every rare disease is associated with a clear genetic change.

The computational tools used in this study (RUFUS, GEMINI, GENE.IOBIO, and LUMPY), developed by University of Utah computational teams, are available to researchers on the USTAR Center for Genetic Discovery website.

"Our unique team of computational biologists are building just the right kind of software tools to cast a wide net, making sure that no or few disease-causing genetic variations or mutations are missed," said Gabor Marth, DSc, professor of Human Genetics at U of U Health and co-director of the USTAR Center for Genetic Discovery. "Ultimately, it is really the combination of the high-quality data that was collected and the comprehensive and accurate methods we developed [that] was the key to achieving the high diagnostic success rate for [these] children."

Credit: 
University of Utah Health