
Determining the atomic structure of natural products more rapidly and accurately

image: Measuring the residual chemical shift anisotropy in a liquid crystalline medium. The method was used to determine the stereochemistry of spiroepicoccin A, a novel natural product isolated from the deep-sea fungus Epicoccum nigrum, which can be found at depths of more than 4,500 meters.

Image: 
Songhwan Hwang, FMP

Many drugs are derived from natural products. But before natural products can be exploited, chemists must first determine their structure and stereochemistry. This can be a major challenge, particularly when the molecules cannot be crystallized and contain only a few hydrogen atoms. A new NMR-based method, developed at the Leibniz-Forschungsinstitut für Molekulare Pharmakologie (FMP), now simplifies the analysis and produces more accurate results. The work has been published in the Journal of the American Chemical Society.

Natural products are present in antibiotics, painkillers and cancer drugs, playing a key role in around 60 percent of all FDA-approved drugs. Plants, fungi and sessile marine organisms are particularly promising sources, because many of them possess chemical defenses to deter predators. However, identifying potential drug candidates is a challenge. First, researchers must accurately determine the structure and stereochemistry (the spatial arrangement of atoms) of the molecules. Without this information, chemists are unable to synthesize the molecules and develop them into drugs. Moreover, the structure is needed to establish whether the molecule has previously been discovered.

Besides X-ray diffraction, which can only be applied to crystallizable molecules, chemists usually use nuclear magnetic resonance (NMR) spectroscopy for structure determination. Recently, the NMR-based parameter "residual chemical shift anisotropy" has taken on particular importance in this context. Studies from the past two to three years have shown that this parameter enables very accurate determination of the structure and stereochemistry of organic molecules. However, it requires special instruments that are not available in all laboratories, and the data analysis involved is time-consuming.

Simplified method produces more accurate results

Researchers from the Leibniz-Forschungsinstitut für Molekulare Pharmakologie (FMP) have now developed a method that enables the residual chemical shift anisotropy to be measured much more easily and effectively. Partners from China (Institute of Oceanology, Chinese Academy of Sciences and South Central University for Nationalities) and Brazil (Universidade Federal de Pernambuco) were also involved in the work, which has now been published in the Journal of the American Chemical Society.

"The NMR-based method we have developed enables chemists to determine the stereochemistry of novel natural products with greater accuracy and efficiency," explained Dr. Han Sun from FMP, who led the study. "Furthermore, the method is very easy to use, making it accessible to all chemists."

The experiment involves combining the natural product with a commercially available peptide with the sequence AAKLVFF. Dissolved in methanol, the peptides form liquid crystals, giving the natural products a weak orientation in the magnetic field. "This particular orientation enables us to measure the residual chemical shift anisotropy of the molecules, which in turn provides accurate information about their structure and stereochemistry," stated Sun, describing the new method.

The example of thalidomide shows how important it is to determine the stereochemistry of compounds correctly. Besides its sedative (hypnotic) effect, the compound thalidomide also has an adverse developmental effect; the two effects are attributable to its two mirror-image forms, (R)-thalidomide and (S)-thalidomide.

Analysis of exotic natural products from the ocean

For their current study, the researchers used a previously unexplored natural product: spiroepicoccin A was isolated from marine microorganisms by the Chinese partners. The substance, obtained from a depth of more than 4,500 meters, only has a few hydrogen atoms attached to its stereocenters, posing a challenge to established NMR methods. Thanks to the new measurement method, however, the structure and stereochemistry of the natural product were unambiguously elucidated. "Even though our method enables us to measure only the relative and not the absolute stereochemistry as yet, our work makes an important contribution to simplifying the determination of challenging natural products," remarked Sun. It appears that pharmaceutical companies have already expressed an interest "because the method accelerates the development of new drugs, which is also our aim."

Credit: 
Forschungsverbund Berlin

Vitamin D supplementation linked to potential improvements in blood pressure in children

PITTSBURGH, Jan. 21, 2020 - Overweight and obese vitamin D-deficient children who took a relatively high dose of vitamin D every day for six months had lower blood pressure and better insulin sensitivity than their peers who took a lower dose, according to the results of a UPMC Children's Hospital of Pittsburgh clinical trial reported in The American Journal of Clinical Nutrition.

However, the study did not show improvements in other markers of cardiovascular and metabolic health, a finding that indicates vitamin D supplementation alone may not be the cure-all for improving the heart health of children at highest risk for diabetes and heart disease.

"Current recommendations for taking vitamin D are pegged to optimal bone health," said lead author Kumaravel Rajakumar, M.D., M.S., professor of pediatrics at the University of Pittsburgh School of Medicine. "But we know vitamin D is involved in more than building healthy bones. It can turn on and off genes that direct our cells to regulate blood glucose levels, and immune and vascular function."

Rajakumar and his colleagues enrolled 225 healthy, but vitamin D-deficient, 10- to 18-year-old children in Pittsburgh who were overweight or obese in the clinical trial, and 211 of them were black. People with darker skin have higher amounts of melanin pigment in their skin and are more likely than their lighter-skinned counterparts to be vitamin D deficient. This is because vitamin D is made in the body when the skin is directly exposed to sunlight, and melanin in the skin acts as a natural sunscreen and inhibits vitamin D production. Overweight and obese children also have a higher risk of vitamin D deficiency, as well as developing diabetes and heart disease.

The children were split into three groups and given pills that appeared identical, but contained different quantities of vitamin D, which is measured in international units, or IUs. One group received a 600 IU tablet daily, which is the current recommended daily dietary allowance. The other two groups received either a 1,000 IU or 2,000 IU tablet daily, still well below the 4,000 IU daily maximum considered safe for children in this age range. During the trial, neither the participants, nor their doctors, knew which dose each child was receiving.

Blood tests showed that the higher the daily dose of vitamin D, the greater the improvement in the participants' blood concentration of vitamin D. By the conclusion of the trial, none of the groups was considered vitamin D deficient.

After six months, the children receiving the daily 2,000 IU vitamin D supplement had reduced fasting blood glucose levels and improved insulin sensitivity -- both of which reduce susceptibility to diabetes and improve cardiovascular health. The children receiving 1,000 IUs of vitamin D daily had lower blood pressure, which matters because high blood pressure increases the risk of heart attack, stroke and kidney disease.

The study did not reveal any significant changes in measures of the health of the membrane that lines the blood vessels or arterial stiffness -- both of which are strong indicators of heart health and were the primary measures that the researchers were seeking to influence with vitamin D supplementation.

"There are many reasons we might not have seen changes in endothelial function or arterial stiffness," said Rajakumar, who also is a pediatrician at UPMC Children's Hospital. "Maybe vitamin D simply doesn't influence these, or perhaps we didn't reach and maintain a level of vitamin D to cause an effect. It could also be that our trial didn't run long enough. However, treatment of vitamin D deficiency with these higher daily doses can have a positive impact on cardiometabolic health of children, without negative side effects."

Credit: 
University of Pittsburgh

Study: Pharmaceutical companies marketing stimulants to physicians

Boston - Results of a new study show that a large number of physicians in the US may have received marketing payments from pharmaceutical companies that produce stimulant medications. Led by researchers at Boston Medical Center's (BMC) Grayken Center for Addiction, the first study of its kind found that one in 18 physicians received some form of pharmaceutical marketing related to stimulants, most often in the form of food or beverage. Published in JAMA Pediatrics, the results indicate the potential role that subtle, low-cost marketing can have in increasing physician prescribing of stimulants, which could be associated with the recent increase in prescribing and misuse of these medications.

In the US, prescription stimulant use doubled from 2006 to 2016, making stimulants the medication class with the highest pharmaceutical expenditure for children. Additionally, the number of attention deficit hyperactivity disorder (ADHD) diagnoses in children is increasing, resulting in more prescriptions being written for medications such as Vyvanse, Quillivant and Mydayis.

"Our study results indicate that the marketing of stimulants by pharmaceutical companies could be contributing to the increased prescribing of both generic and brand name stimulant medications for children with ADHD," said Scott Hadland, MD, MS, MPH, the study's lead author who is a pediatrician and addiction specialist at the Grayken Center for Addiction at BMC. "Given the potential for the misuse of these medications - and the fact that misuse often starts during adolescence and young adulthood - we need to more closely examine whether there should be standards in place limiting the marketing of stimulant medications directly to providers."

The researchers examined data on industry and physician marketing interactions from the Open Payments database between Jan. 1, 2014 and Dec. 31, 2018. They included entries that were non-research payments for both generic and brand name stimulants and looked at the type, value and number of payments made overall, to individual providers, and to providers by specialty.

During the study's five-year period, there were 591,907 payments made to physicians, totaling more than $20 million. The median value of a payment was $14; more than 97 percent of payments were in the form of food and beverage; and nearly 50 percent of the total amount of money spent was on food and beverage. Overall, of the 989,789 physicians studied, 55,105 received marketing payments.
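The "one in 18" figure quoted earlier follows directly from the counts above (55,105 of 989,789 physicians). The snippet below is an illustrative sanity check of that arithmetic, not part of the study itself:

```python
# Sanity check of the reported Open Payments figures (numbers from the text above).
total_physicians = 989_789   # physicians studied
physicians_paid = 55_105     # physicians who received marketing payments
total_payments = 591_907     # individual payments over the five-year period

share_paid = physicians_paid / total_physicians
print(f"Share of physicians receiving payments: {share_paid:.1%}")  # 5.6%
print(f"Roughly one in {round(1 / share_paid)} physicians")         # one in 18
print(f"Payments per paid physician: {total_payments / physicians_paid:.1f}")
```

The derived figure of roughly 10.7 payments per paid physician is consistent with the pattern the study describes: frequent but small payments, mostly food and beverage.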

"As previous studies have shown, marketing - even its most subtle form - can influence prescribing, so doctors should be aware that even something as seemingly benign as a meal from a drug company could be affecting the clinical decisions they make," added Hadland, who also is an assistant professor of pediatrics at Boston University School of Medicine.

Pediatricians as a group received the most marketing payments (40.4 percent of all payments), and over 19 percent of pediatricians received marketing related to stimulants; psychiatrists received the most marketing in terms of dollar value (56.7 percent of the money spent) and were the second highest specialty receiving payments (31.8 percent); and family physicians received 18 percent of the marketing payments. The researchers estimate that as many as one in five pediatricians in practice during the study period may have received marketing.

Previous research has shown that drug misuse often starts during adolescence, and young adults who have ADHD are also at a higher risk of developing a substance use disorder. Hadland notes, however, that many young people who are prescribed stimulants experience benefit from these medications and should continue to receive them. "It's important to point out that young people with untreated ADHD are at risk for developing problematic substance use, and appropriate treatment of ADHD with stimulants may reduce this risk," said Hadland, who suggests that his study results warrant further investigation into how these marketing tactics may be connected to broader excessive use and misuse of stimulant medications.

Credit: 
Boston Medical Center

Vitamin C-B1-steroid combo linked to lower septic shock mortality in kids

Treating septic shock in children with a combination of intravenous vitamin C, vitamin B1 and hydrocortisone (a commonly used steroid) is associated with lower mortality, according to a study from Ann & Robert H. Lurie Children's Hospital of Chicago. This is the first pediatric study of the safe and relatively inexpensive treatment for septic shock, and the preliminary data supports the promising outcomes seen in adults. Findings were published in the American Journal of Respiratory and Critical Care Medicine.

Septic shock is the result of a severe systemic response to infection causing organ failure and dangerously low blood pressure. It is one of the leading causes of death in critically ill children.

"We were surprised and excited to see a substantial reduction in mortality after treating septic shock in children with a high dose of vitamin C combined with vitamin B1 and hydrocortisone," says lead author Eric Wald, MD, MSCI, critical care physician at Lurie Children's and Associate Professor of Pediatrics at Northwestern University Feinberg School of Medicine. "While based on a retrospective analysis, our results are especially compelling in that they are very similar to the positive outcomes found in a recent randomized controlled trial of vitamin C treatment for septic shock in adults."

In their retrospective study, Dr. Wald and colleagues matched 43 patients with septic shock who received the vitamin C-B1-hydrocortisone treatment with 43 patients of similar clinical profile who did not receive it (control group), and with 43 patients who received hydrocortisone only as adjunctive therapy. They found that while controls had mortality of 28 percent at 30 days, mortality in patients treated with the vitamin C combination protocol dropped to 9 percent in the same period. Treatment with hydrocortisone alone did not improve mortality (30 percent at 30 days). Similar reductions in mortality were seen at 90 days (14 percent with vitamin C protocol vs. 35 percent in controls and 37 percent in the hydrocortisone only group).
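For readers who want the effect in absolute terms, the 30-day mortality figures above imply an absolute risk reduction and a number needed to treat. Neither is stated in the release, so the following is illustrative arithmetic only:

```python
# Illustrative arithmetic on the reported 30-day mortality figures
# (28% in controls vs. 9% with the vitamin C protocol, n = 43 per group).
# The absolute risk reduction and number needed to treat are derived here
# for illustration; they are not reported in the press release.
control_mortality = 0.28
treated_mortality = 0.09
arr = control_mortality - treated_mortality  # absolute risk reduction
nnt = 1 / arr                                # number needed to treat
print(f"Absolute risk reduction: {arr:.0%}")       # 19%
print(f"Number needed to treat: about {nnt:.0f}")  # about 5
```

On these figures, roughly one additional child survives to 30 days for every five treated -- though, as the authors note, the retrospective matched design means such estimates should be read cautiously.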

"While it is still unclear why vitamin C appears to reduce mortality from septic shock and we need to dig deeper to understand the mechanism, our results are incredibly promising," says Dr. Wald. "We hope to encourage larger, multi-center studies in children with septic shock to confirm our data."

Credit: 
Ann & Robert H. Lurie Children's Hospital of Chicago

Influenza vaccination of children cuts hospitalization in half: Ben-Gurion U. researchers

BEER-SHEVA...January 21, 2020 - Fully vaccinating children reduced the risk of hospitalization for complications associated with influenza by 54%, according to a new study by Ben-Gurion University of the Negev (BGU) and University of Michigan researchers.

The research, published in the December 2019 print issue of Clinical Infectious Diseases, is one of the few studies worldwide to test the effectiveness of childhood influenza vaccination against the risk of hospitalization due to influenza complications.

The study was led by Dr. Hannah Segaloff, an epidemiologist at the School of Public Health, University of Michigan and Prof. Mark Katz, M.D., of BGU's Department of Health Management, Faculty of Health Sciences and a senior researcher at the Clalit Institute of General Research. Prof. Katz also teaches in BGU's Medical School for International Health.

The retrospective study reviewed vaccination data from 3,746 hospitalizations of children ages six months to 8 years old at six hospitals in Israel. The children were tested for influenza over three winter seasons: 2015-16, 2016-17 and 2017-18.

The findings reveal that the flu vaccine reduced hospitalizations associated with the flu by more than half. They also validate guidelines in the United States and Israel that recommend two vaccine doses for children up to age 8 who have never been vaccinated or who previously received one dose.

"Children vaccinated according to government guidelines are much better protected from influenza than those who only receive one vaccine," says Dr. Segaloff. "Over half of our study population had underlying conditions that may put them at high risk for severe influenza-related complications, so preventing influenza in this group is critically important.

"Our results also showed that the vaccine was effective in three different seasons with different circulating viruses, reinforcing the importance of getting an influenza vaccine every year no matter what virus strain is circulating."

Co-author Prof. Katz adds, "Young children are at high risk of hospitalization due to influenza complications. Children with underlying illnesses such as asthma and heart disease have an even greater risk of getting the complications. It is important to prevent influenza infections in these populations."

The findings also support health organizations' recommendations, including the Israel Ministry of Health, to vaccinate children against influenza every year, preferably before the onset of winter and especially during early childhood. Children under age five are defined as having a high risk of influenza complications.

"This study mirrors a previous study conducted at the Clalit Research Institute, where we found that the flu vaccine reduces the risk of hospitalization in pregnant women by 40%," says Prof. Ran Balicer, a member of the BGU School of Public Health and director of the Clalit Research Institute. "It found that vaccination is the most effective way to prevent both the flu and hospitalization."

The researchers hope parents make an informed decision about the importance of vaccinating their children and that their research will increase vaccination rates among the general public.

Credit: 
American Associates, Ben-Gurion University of the Negev

New roles found for Huntington's disease protein

image: A cross-section of the striatum in a mouse brain. Loss of huntingtin protein in striatal neurons (red) causes neuron loss and an inflammatory response, shown by the infiltration of glial astrocytes (cyan).

Image: 
Caley Burrus, Duke University

DURHAM, N.C. -- A Duke University research team has identified a new function of a gene called huntingtin, a mutation of which underlies the progressive neurodegenerative disorder known as Huntington's Disease.

Using genetic mouse models, they have discovered that neurons in the striatum, a brain area involved in controlling movement, require the huntingtin gene for regulating the body's movements, maintaining cell health during aging, and developing functioning connections between cells.

Addressing the gene's role in maintaining those neural connections may provide a new avenue against Huntington's, the researchers said.

Huntington's Disease is an inherited neurodegenerative disorder that usually emerges in mid-life and leads to impaired motor control, dementia, and psychiatric symptoms. While the genetic basis of this lethal disease was identified more than two decades ago, there are no approved treatments yet to slow its progression or cure it.

The disease is caused by a mutation in one of a person's two copies of the huntingtin gene. The mutation results in the production of an aberrant version of the huntingtin protein, which is toxic to neurons. Although the mutant protein is expressed throughout the body, neurons of the striatum are specifically vulnerable to its effects and degenerate as the disease progresses.

While the mutant huntingtin protein is damaging to neurons, it may also interfere with the remaining, non-mutated huntingtin's ability to perform its normal functions.

Drugs currently being tested in clinical trials are designed to block the defective huntingtin protein, but they also end up decreasing the amount of normal huntingtin in neurons. Huntingtin is known to play several important functions in cells, but its specific role in striatal neuron health and function was not known.

"We hypothesized that the normal huntingtin gene plays a critical role in neuronal health and connectivity, and we wanted to determine what happens to striatal neurons that have had huntingtin eliminated," said lead author Cagla Eroglu, an associate professor of cell biology and neurobiology and the co-director of Regeneration Next Initiative at Duke.

In a study appearing Tuesday in Cell Reports, the team found that deleting the huntingtin gene specifically from the striatal neurons of very young mice caused these neurons to die as the mice aged, similar to the pattern of neuron death seen in Huntington's Disease. They also found that mice lacking huntingtin in their striatal neurons were impaired in their ability to control their movement. Importantly, this loss of movement regulation happened even before the neurons themselves started to die.

"These findings suggest that cell death itself might not be the only trigger of Huntington's Disease symptoms," Eroglu said.

In a healthy brain, striatal neurons control movement by communicating with other neurons through connections called synapses. The researchers found that striatal neurons lacking huntingtin formed abnormal synaptic connections, which could potentially explain the problematic motor function of the mice.

"We believe changes at the neuronal and synapse level happening before cell death are contributing to the progression of the disease," said Caley Burrus, a PhD candidate in Eroglu's lab and first author of the study.

It's possible, the authors said, that therapies to address the faults at the level of the synapse may be beneficial for patients. The team is continuing the research by investigating the precise role huntingtin plays in synapse development, which might lead to specific drugs designed to target these deficits.

"Understanding the critical roles huntingtin plays in neurons is essential to designing therapies that will help individuals with Huntington's Disease in the safest and most effective way possible," Eroglu said.

Credit: 
Duke University

Persistent environmental contaminant changes the gut microbiome of mice

UNIVERSITY PARK, Pa. -- An industrial chemical -- phased out since 2002, but previously used in stain and water-repellent products and firefighting foam -- alters the gut microbiome of mice and could have implications for human health, according to an international team of researchers.

Perfluorooctane sulfonate, or PFOS, persists in the environment and in the bodies of living organisms. Although the U.S. Environmental Protection Agency designated PFOS a "contaminant of emerging concern" and U.S. producers voluntarily ceased its production, it is still detected in the blood of up to 99% of the U.S. population.

"We know that chronic exposure to some environmental chemicals, including persistent organic pollutants, can impact the gut microbiome, and we are actively assessing whether these interactions can impact health," said Andrew Patterson, Tombros Early Career Professor and professor of molecular toxicology, Penn State. "Our study shows that PFOS alters the composition and function of the microbiome, which suggests that this chemical and perhaps related chemicals, have mechanisms of action outside our own cells. Exploring how chemicals impact the microbiome is an important and emerging area of study."

The team's results appeared on Jan. 8 in Toxicology.

The research team studied the effects of PFOS on the mouse microbiome by comparing mice fed a normal diet with mice fed a diet containing PFOS at concentrations somewhat higher than those to which the average human would likely be exposed. Afterward, they examined the livers and gut microbiota of the mice using DNA sequencing, metabolomics and molecular analyses.

The DNA sequences revealed a significant difference in the gut microbiota community between mice that were fed even the lowest dose of PFOS and the control group. Further, the group found that incubation of PFOS with the gut microbiota in vitro resulted in physiologic and metabolic changes to microbes.

"These results support emerging ideas that gut microbes can be sensitive to chemical exposures and that perhaps we need to consider ways to assess how such exposure affects them," said Patterson.

The team also found evidence that PFOS activated at least two or three nuclear receptors that regulate the expression of genes related to the metabolic fate of various chemical entities in the body.

"It's been known for years that PFOS can activate different nuclear receptors, but the mechanism in this case appears to be unique in that PFOS alters the gut microbiome, which in turn causes these changes in receptor activities," said Jeffrey Peters, Distinguished Professor of Molecular Toxicology and Carcinogenesis, Penn State, and deputy director, Penn State Cancer Institute.

Overall, Peters said, the team found that PFOS altered the composition of the mouse microbiome and metabolism of the bacterial communities in their guts.

"In future studies, we plan to follow these mice to see if PFOS and the consequent perturbed gut microbiome modifies metabolic disease," he said.

Credit: 
Penn State

Possible Alzheimer's breakthrough suggested

image: An image of the 'Aggregatin' protein from the paper published by Wang, Zhu and collaborators.

Image: 
Case Western Reserve University

CLEVELAND--Researchers at the Case Western Reserve University School of Medicine say they have identified a previously unknown gene and associated protein which could potentially be suppressed to slow the advance of Alzheimer's disease.

"Based on the data we have, this protein can be an unrecognized new risk factor for Alzheimer's disease (AD)," said Xinglong Wang, an associate professor of pathology at the School of Medicine. "We also see this as a potential novel therapeutic target for this devastating disease."

Wang said proving the latter assertion, which has not yet been tested in humans, would require additional research to corroborate the function of the protein, which they have dubbed "aggregatin." That could eventually mean clinical trials with Alzheimer's patients, he said.

"This protein characteristically accumulates, or aggregates, within the center of plaque in AD patients, like the yolk of an egg--which is part of the reason we named it 'aggregatin'," Wang said.

A research team led by Wang and Xiaofeng Zhu, a professor of Population and Quantitative Health Sciences at the School of Medicine, has filed for a patent through the university's Office of Research and Technology Management for "novel Alzheimer's disease treatments and diagnosis based on this and related study," Wang said.

"We're very excited about this because our study is likely the first systematic work combining the identification from a genome-wide association study of high dimensional brain-imaging data and experimental validation so perfectly in Alzheimer's disease," Zhu said.

Their research was published this month by the scientific journal Nature Communications and supported by grants from the National Institutes of Health (NIH) and the Alzheimer's Association. Genomic and brain imaging data was obtained from the Alzheimer's Disease Neuroimaging Initiative, which is supported by the NIH.

Alzheimer's Disease affects millions

More than 5.7 million Americans have Alzheimer's disease, which is the primary cause of dementia and sixth-leading cause of death in the United States. That population is predicted to reach 14 million by the year 2050, according to the Alzheimer's Association.

The relationship between Alzheimer's (and subsequent brain atrophy) and amyloid plaques--the hard accumulations of beta amyloid proteins that clump together between the nerve cells (neurons) in the brains of Alzheimer's patients--has been well-established among researchers.

Less understood is precisely how that amyloid-beta actually leads to plaque formation--and where this new work appears to have broken new ground, Wang said.

Further, while there has been much research into what genes might influence whether or not someone gets Alzheimer's, there is less understanding of genes that might be linked to the progression of the disease, meaning the formation of plaque and subsequent atrophy in the brain.

The role of 'aggregatin' protein

In the new work, the researchers began by correlating roughly a million genetic markers (called single-nucleotide polymorphisms, or SNPs) with brain images. They were able to identify a specific SNP in FAM222A, a gene linked to different patterns of regional brain atrophy.

Further experiments then suggested that the protein encoded by the gene FAM222A is not only associated with AD patient-related beta-amyloid plaques and regional brain atrophy, but that "aggregatin" attaches to the amyloid beta peptide--the major component of plaque--and facilitates plaque formation.

So when researchers injected mouse models with the "aggregatin" protein (made from the FAM222A gene), plaque (amyloid deposit) formation accelerated in the brain, resulting in more neuroinflammation and cognitive dysfunction. This happened, they report, because the protein was found to bind directly to the amyloid beta peptide, thus facilitating aggregation and plaque formation, Wang said.

Conversely, when they suppressed the protein, the plaques were reduced and neuroinflammation and cognitive impairment alleviated.

Their findings indicate that reducing levels of this protein and inhibition of its interaction with amyloid beta peptide could potentially be therapeutic--not necessarily to prevent Alzheimer's but to slow its progression.

Credit: 
Case Western Reserve University

Improving cardiovascular health of the most vulnerable

image: Rick Stouffer, MD

Image: 
UNC School of Medicine

CHAPEL HILL, NC - Starting in 2016, a two-year partnership between the North Carolina Chapter of the American College of Cardiology (NCACC) and the North Carolina Association of Free and Charitable Clinics (NCAFCC) provided free lipid lowering therapy and clopidogrel to patients at seven free clinics in North Carolina. The results of this pilot study were recently published in the Journal of the American College of Cardiology.

"Through this pilot, we were able to increase the number of patients treated with statin medications, the number of statin medication tablets dispensed, and the use of high-intensity statins, resulting in significant decreases in total cholesterol and LDL levels," said Rick Stouffer, MD, the Ernest and Hazel Craige Distinguished Professor of Cardiovascular Medicine and chief of the division of cardiology.

"This shows the feasibility of a partnership between a medical specialty society and an association of free and charitable clinics, demonstrating the impact that a public health partnership can have on treating cardiovascular disease."

During year one, the seven clinics provided 1,296 patients with statin medications and dispensed a total of 178,384 tablets. During year two, the seven clinics provided 1,550 patients with statin medications, and dispensed a total of 279,474 tablets.

Altogether, the collaboration enabled more patients to receive statins (a 24% increase in year one and a 45% increase in year two) and resulted in more statin tablets being dispensed (a 61% increase in year one and an 83% increase in year two). In addition, there was a 349% increase in the use of high-intensity statin treatment in year one and a 38% increase in year two. In a random sample of 815 patients who had lipid levels measured before and after initiation of the grant, total cholesterol decreased from 208 [173, 236] mg/dl to 175 [147, 209] mg/dl, a statistically significant reduction.
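
As a rough sanity check on the raw counts quoted above (not a calculation from the study itself), the year-over-year changes are easy to recompute. Note that the percentage increases quoted in the article are evidently measured against a pre-partnership baseline that is not reported here, so they differ from these year-2-versus-year-1 figures.

```python
# Hypothetical sanity check on the raw counts quoted above; the article's
# own percentages use a pre-partnership baseline that is not reported here.

def pct_change(old, new):
    """Percentage change from old to new."""
    return 100.0 * (new - old) / old

patients_y1, patients_y2 = 1_296, 1_550
tablets_y1, tablets_y2 = 178_384, 279_474

print(f"Patients, year 2 vs year 1: {pct_change(patients_y1, patients_y2):+.1f}%")  # about +19.6%
print(f"Tablets,  year 2 vs year 1: {pct_change(tablets_y1, tablets_y2):+.1f}%")    # about +56.7%

# Median total cholesterol in the 815-patient sample fell from 208 to 175 mg/dl:
print(f"Total cholesterol change: {pct_change(208, 175):+.1f}%")  # about -15.9%
```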

During the first year, the seven clinics provided 70 patients with clopidogrel (9,854 tablets), and during the second year, the seven clinics provided 81 patients with clopidogrel (13,205 tablets).

"In this project, the North Carolina chapter of the American College of Cardiology was able to provide important cardiovascular medications to a group of vulnerable patients. Most of the patients who attend free clinics rarely, if ever, see a cardiologist. They usually only come to our attention when they get acutely ill and seek care in an emergency department. This project provides a model for specialty societies to improve the health of individuals who lack the resources to visit the specialist in their office."

Credit: 
University of North Carolina Health Care

Bend and snap: New interventions for rib fractures

image: Physicians at MUSC Health offer surgical stabilization of rib fractures for less severe fractures and breaks. It helps patients experience less pain during their recovery.

Image: 
Lauren Hooker, MUSC

When an arm snaps, a leg cracks or a wrist twists, physicians set the bone to ensure it heals properly and with as little discomfort to the patient as possible. But the same cannot be said for most rib fractures.

Past practice and teaching call for little to no treatment, even if it takes months for the patient to breathe normally or just get back to work. The prevailing wisdom has been succinct. "Offer medicine for the pain and a ventilator if breathing is an issue," said Evert Eriksson, M.D., a trauma surgeon at the Medical University of South Carolina and coauthor of the paper. "But otherwise, the bones will form a callus over time that allows it to function as it needs to." And while that idea has morphed over the last decade to make multiple-fracture repair more common, patients with less severe fractures often still go untreated despite pain.

This extended discomfort is what led Denver Health Medical Center surgeon Fredric Pieracci, M.D., along with several other trauma surgeons, to conduct a study with members of the Chest Wall Injury Society. Twelve centers from across the United States came together to evaluate the success of surgical stabilization of rib fractures (SSRF), which involves installing a plate to line up the two ends of the fracture and hold them in place throughout the healing process. They theorized that by stabilizing partially displaced and fractured ribs, patients' pain and quality of life would improve.

As recently published in the Journal of Trauma and Acute Care Surgery, patients who underwent SSRF for three or more rib fractures with partial dislocation reported less pain on the numeric pain scale and a better quality of life after their stabilization surgery.

"This research shows that patients who have partially displaced fractures as well as some pulmonary compromise also benefit from a procedure that is usually reserved for a more severely injured cohort," said Eriksson.

Technological limitations have played a role in keeping surgeons from performing this procedure in the past. It wasn't until recently that surgeons acquired the right equipment to keep surgical incisions small and the risk of complications in the pleural space low. By pulling the muscles aside, instead of cutting through them, surgeons are able to access the chest wall and ribs less invasively. Even the material of the stabilization plates has improved, becoming less rigid and moving more naturally with the patient as the chest expands and contracts with each breath, according to Eriksson.

While the level of narcotic use did not change significantly in patients who received SSRF, these patients consistently reported more comfort and less pain at each interview interval than those who had not undergone the operation. The fractured ribs took just as long to completely heal, but the patients' experiences during this process were far superior, and they reported feeling less pain and easier breathing throughout.

Patients also experienced fewer complications from their rib fractures. By opening the chest, addressing any additional injuries, guiding the bones back into position and removing any excess blood from the area, surgeons decreased the chances that study participants would have any additional bleeding or fluid accumulation in that space.

And the difference was statistically significant. Surgeons reported that the group that underwent SSRF had a zero percent rate of pleural space complications from their injuries, while the group that did not undergo the procedure had a 10% complication rate.

Next, Eriksson wants to look at other bones that are not treated surgically. "I had a patient from this study come to me and say, 'My chest no longer hurts. You fixed that. But now it's my shoulder that's the problem.'" By stabilizing the chest wall, physicians may improve outcomes from clavicle or scapula fractures as well.

This collaborative multicenter effort presents an opportunity for surgeons to address a population that is not traditionally considered for operative treatment. "It gives us an opportunity to help a new set of patients," said Eriksson. "And that is important."

Credit: 
Medical University of South Carolina

Big gains in bone marrow transplant survival since mid-2000s

A bone marrow transplant can be a lifesaving treatment, but it can come with life-threatening risks.

The encouraging news for patients: Those risks have been plummeting for years.

The overall risk of death after transplant dropped 34% between 2003-2007 and 2013-2017, according to an analysis published in the Jan. 21 issue of the Annals of Internal Medicine.

Those gains stem from a sharp decline in transplant-related complications, said corresponding author Dr. George McDonald, an emeritus member at Fred Hutchinson Cancer Research Center. The risk of dying from those complications -- mostly due to infections and diseases involving the liver, kidneys and lungs -- has fallen from 30% to 11% over the past 25 years.
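
Those two figures describe different kinds of decline and are easy to conflate; a quick sketch using only the numbers above separates the absolute drop in percentage points from the relative reduction in risk.

```python
# Complication-related mortality fell from 30% to 11% (figures quoted above).
early_risk, late_risk = 0.30, 0.11

absolute_drop = early_risk - late_risk                  # in percentage points
relative_drop = (early_risk - late_risk) / early_risk   # fraction of the original risk

print(f"Absolute reduction: {100 * absolute_drop:.0f} percentage points")  # 19 points
print(f"Relative reduction: {100 * relative_drop:.0f}%")                   # about 63%
```

So the 30%-to-11% decline is a 19-point absolute drop, but nearly a two-thirds reduction relative to the original risk.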

Other findings weren't all as dramatic, McDonald said. Risk of death from relapse of cancer declined -- but nowhere near as steeply as that from complications. Recurrence of cancer remains a major challenge for the transplant field, he said.

Still, the results should reassure researchers and clinicians at Fred Hutch and elsewhere who have toiled to improve the practice for decades, said McDonald, who also led an earlier analysis in 2010 showing similar striking improvements for bone marrow transplant recipients from the 1990s through the early 2000s.

The team's latest analysis shows that trend has continued. McDonald credits the improved outcomes to small, steady advances made at transplant centers by a diverse cast of doctors, nurses and specialists across every major medical discipline.

"Each of us has been working hard, trying to make our little corner of the problem less severe," said McDonald, who saw his first transplant patient in 1972. "Cumulatively, those little improvements sum up to big improvements in outcomes. This paper reflects 25 years' worth of clinical research."

Bone marrow transplants are lifesaving treatments for patients with blood cancers and other diseases. During these procedures, patients first undergo chemotherapy and/or radiation to destroy their diseased bone marrow and to prevent rejection of donor cells. A donor's healthy, blood-forming stem cells are then given directly into the patient's bloodstream.

For the current study, McDonald and colleagues reviewed the outcomes of 1,148 patients who underwent a transplant at the Hutch's clinical care partner, Seattle Cancer Care Alliance, between 2003 and 2007. They then compared them to 1,131 patients who had the procedure between 2013 and 2017. The most recent cohort was older and sicker at the time of transplant, McDonald said. Yet they still fared better than the previous group.

In absolute terms, the frequency of overall mortality during 2013-2017 was 40%, and this proportion will obviously increase with further follow-up, McDonald said.

McDonald noted that the study was a retrospective analysis of previously collected data and therefore cannot say with certainty what caused the improved outcomes. "But we can make really educated guesses as to why we're getting better," he said.

Those educated guesses involve changes to clinical practice driven by ongoing research at Fred Hutch and elsewhere, including:

Improved methods to prevent, detect and treat the viral, fungal and bacterial infections that threaten immune-compromised transplant patients

Identification, before transplant, of patients at high risk of fatal complications

The use of less-toxic chemotherapy and radiation regimens to prepare patients for transplant, especially those at high risk

Advances in the prevention of graft-vs-host disease (GVHD), in which donor immune cells attack a patient's organs

Lower doses of prednisone to treat GVHD, which has led to fewer infections

Looking ahead, McDonald thinks further clinical research by infectious disease experts and medical specialists in liver, kidney and pulmonary disease will help drive down the risk of death from transplant-related complications into the single digits.

The tougher challenge will be preventing relapse. But there is cause for optimism, said Fred Hutch's Dr. Brenda Sandmaier, an oncologist and study author.

"Now that we have significantly reduced non-relapse mortality, we have a platform to implement different treatments to prevent relapse or treat early evidence of recurring disease," she said. "In another 10 years, (the relapse rate) should and will go down."

Sandmaier pointed to several areas of clinical research underway to reduce the risk of relapse:

New targeted therapies that inhibit molecules involved in disease are being tested in clinical trials to prevent relapse after transplant

Novel pre-transplant preparative regimens such as radioimmunotherapy -- which delivers a radioactive punch to cancer cells with little to no damage to surrounding healthy tissue -- can reduce disease before transplant

New cellular therapies, such as CAR T-cell therapy, could also eliminate disease before transplant

Credit: 
Fred Hutchinson Cancer Center

Setting fires to avoid fires

image: US Forest Service prescribed burn in California's Sierra National Forest.

Image: 
US Forest Service (public domain)

Australians desperate for solutions to raging wildfires might find them 8,000 miles away, where a new Stanford-led study proposes ways of overcoming barriers to prescribed burns - fires purposefully set under controlled conditions to clear ground fuels. The paper, published Jan. 20 in Nature Sustainability, outlines a range of approaches to significantly increase the deployment of prescribed burns in California and potentially in other regions, including Australia, that share similar climate, landscape and policy challenges.

"We need a colossal expansion of fuel treatments," said study lead author Rebecca Miller, a PhD student in the Emmett Interdisciplinary Program in Environment and Resources within the Stanford School of Earth, Energy & Environmental Sciences.

"Prescribed burns are effective and safe," said study co-author Chris Field, the Perry L. McCarty Director of the Stanford Woods Institute for the Environment and Melvin and Joan Lane Professor for Interdisciplinary Environmental Studies. "California needs to remove obstacles to their use so we can avoid more devastating wildfires."

Years of fire suppression in California have led to massive accumulations of wood and plant fuels in forests. Hotter, drier conditions have exacerbated the situation. Prescribed burns, in combination with thinning of the vegetation that allows fire to climb into the tree canopy, have proven effective at reducing wildfire risks. They rarely escape their set boundaries and have ecological benefits that mimic the effects of naturally occurring fires, such as reducing the spread of disease and insects and increasing species diversity.

To put a meaningful dent in wildfire numbers, California needs fuel treatments - whether prescribed burns or vegetation thinning - on about 20 million acres, or nearly 20 percent of the state's land area, according to the researchers. While ambitions for prescribed burns in California have been rising - private, state and federal acres planned for the approach more than doubled between 2013 and 2018 - up to half of that acreage has gone unburned due to concerns about risks like smoky air, as well as outdated regulations and limited resources.

To better understand these barriers, the researchers interviewed federal and state government employees, state legislative staff and nonprofit representatives involved with wildfire management, as well as academics who study the field. They also analyzed legislative policies and combed through prescribed burn data to identify barriers and ultimately propose solutions.

Barriers to burning

Just about everyone the researchers interviewed described a risk-averse culture in the shadow of liability laws that place financial and legal responsibility on burners for any prescribed burn that escapes. Private landowners explained how fears of bankruptcy swayed them to avoid burning on their property. Federal agency employees pointed to an absence of praise or rewards for doing prescribed burns, but punishment for any fires that escape. Federal and state employees claimed that negative public opinion - fear of fires escaping into developed areas and smoke damaging health - remains a challenge.

Limited finances, complex regulations and a lack of qualified burners also get in the way. For example, wildfire suppression has historically diverted funding from wildfire prevention; many state fire crews are seasonal employees hired during the worst wildfire months rather than the months when conditions are best for prescribed burns; and burners who receive federal or state funds must undergo potentially expensive and time-consuming environmental reviews.

Toward solutions

California has taken some meaningful steps to make prescribed burning easier. Recent legislation makes private landowners who enroll in a certification and training program or take appropriate precautions before burning exempt from financial liability for any prescribed burns that escape. And new public education programs are improving public opinion of the practice.

To go further, stakeholders interviewed for the study suggested a range of improvements. They pointed to the need for consistent funding for wildfire prevention (rather than a primary focus on suppression), federal workforce rebuilding and training programs to bolster prescribed burn crews and standardization of regional air boards' burn evaluation and approval processes. Changing certain emissions calculations - prescribed burn smoke is currently considered human-caused, whereas wildfires count as natural emissions - may also incentivize treatments.

Making these changes will require a multi-year commitment by the executive and legislative branches, according to the researchers. The magnitude of the 2017 and 2018 wildfires prompted new wildfire-related policy proposals, but maintaining that focus during lighter fire seasons will be critical to protecting California's communities and managing its ecosystems.

"As catastrophic climate impacts intensify, societies increasingly need to innovate to keep people safe," said study co-author Katharine Mach, an associate professor at the University of Miami who was director of the Stanford Environment Assessment Facility and senior research scientist in the Stanford School of Earth, Energy & Environmental Sciences at the time of the research. "Much of this innovation is conceptually simple: making sure the full portfolio of responses, prescribed burns and beyond, can be deployed."

Credit: 
Stanford University

While promoting diseases like cancer, these enzymes also cannibalize each other

image: Cathepsins eat away at collagen and elastin in Manu Platt's Georgia Tech lab.

Image: 
Georgia Tech / Allison Carter

Like motley bandits, certain enzymes implicated in cancer and other diseases also annihilate each other. A new study reveals details of their mutual foils in the hopes that these behaviors can be leveraged to fight the enzymes' disease potential.

The bandits are cathepsins, enzymes that normally dispose of unneeded protein in our cells. But in unhealthy scenarios, cathepsins can promote illnesses like cancer, atherosclerosis, and sickle cell disease. Many experimental drugs that inhibit them, while effective, have failed due to side effects that could not be well explained, so researchers at the Georgia Institute of Technology abandoned the common focus on single cathepsins to model three key cathepsins as a system.

The researchers found that the cathepsins, denoted by the letters K, L, and S, not only degrade extracellular structures - proteins outside of cells that support cells - but also cannibalize, distract, and deactivate each other. Cathepsins are proteases, enzymes that degrade proteins, and since the cathepsins are themselves proteins, they can degrade each other, too.

Cathepsin Three Stooges

"Auto-digestion is my personal favorite. Think about it: You take a group of cathepsin Ks, and they eat each other. Why? Because they're just closer to each other than to what they would otherwise eat," said the study's principal investigator Manu Platt, an associate professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University.

In disease, cathepsins appear to be like The Three Stooges in a porcelain shop, tearing the shop down while they torment each other. As a result, early on, when the Georgia Tech researchers tried to influence a single cathepsin in the group, outcomes were puzzling, and the researchers felt they might be onto something relevant to past mysterious drug failures.

Through lab experiments and mathematical calculations, they arrived at a computational model that showed how single influences ripple through the system. They published the model as a tool online that other researchers can use to jigger the three cathepsins in group settings, their levels of available targets, and inhibitor chemicals. The tool contrasts cathepsin bungling with cathepsin effectiveness.

The researchers published their results in the Proceedings of the National Academy of Sciences the week of January 20, 2020. The research, which took a systems biology approach, was funded by the National Science Foundation and the National Institutes of Health.

Q&A

How do cathepsins go wrong?

The three cathepsins in this study are best known for their activity in cell organelles called lysosomes under healthy conditions, where they work like molecular woodchippers to cut protein down to amino acids.

"They also serve functions in specific cell types, such as cathepsin S helping the immune system to recognize what to attack and what not to," Platt said.

"Problems happen when cathepsins get overexpressed and end up in the wrong places. They're crazy powerful and degrade the structural proteins elastin and collagen that make up arteries, tendons, the endometrium, and many tissue structures."

"In healthy settings, cathepsin K breaks down old bone to recycle calcium. But when breast cancer comes, those cancerous cells make cathepsin K to destroy collagen around the tumor. And that allows the cells to escape and metastasize to the bone," Platt said.

How is this research relevant to drug development?

"I study cathepsins in illnesses like tendinopathy, endometriosis, atherosclerosis, cancer, and sickle cell disease," Platt said. "So, having a drug on the market to handle cathepsins would be a big deal."

"Many cathepsin inhibitor drugs that have failed clinical trials were very finely targeted but caused big side effects, and some of those cathepsin inhibitor drugs did not even cross-react with other cathepsins they were not targeting - which is usually a good thing - so the cause of the side effects was a mystery," Platt said. "By modeling a system of cathepsins, we think we have a good start toward uncovering that mystery."

"If we don't know how these cathepsins are working with and against each other in complex systems, similar to how they exist in our bodies, then we are going to have a hard time getting anything into the medicine cabinet to inhibit them."

The study floats ideas on new approaches to drug research. For example, in situations where cathepsin S is not the culprit, it could be strategically boosted to break down cathepsins K and L.

What can other researchers expect from the online model?

"They can set up their own experiments and make predictions, including what inhibitors will do, so they can test inhibitors at varying strengths in this system," Platt said. "They can ask questions that they can't answer yet experimentally then test the model's predictions in the lab."

The model processes varying inputs into resulting changes in cathepsin levels and outcomes of degradation and indicates whether they have been deactivated or demolished. Scenarios can be exported as a report and a data spreadsheet.

Credit: 
Georgia Institute of Technology

Physics shows that imperfections make perfect

image: Research shows why fireflies blink in unison even though each individual insect is different.

Image: 
Toan Phan

EVANSTON, Ill. --- Northwestern University researchers have added a new dimension to the importance of diversity.

For the first time, physicists have experimentally demonstrated that certain systems with interacting entities can synchronize only if the entities within the system are different from one another.

This finding offers a new twist to the previous understanding of how collective behavior found in nature -- such as fireflies flashing in unison or pacemaker cells working together to generate a heartbeat -- can arise even when the individual insects or cells are different.

Northwestern's Adilson Motter, who led the research, explained that identical entities naturally behave identically -- until they start interacting.

"When identical entities interact, they often behave differently from each other," said Motter, who is a professor of physics in Northwestern's Weinberg College of Arts and Sciences. "But we identified scenarios in which the entities behave identically again if you make them suitably different from each other."

This discovery could help researchers optimize human-made systems, such as the power grid, in which many parts have to remain synchronized while interacting with one another. It also could potentially inform how groups of humans, such as juries, can coordinate to reach a consensus.

The research will be published on Monday, Jan. 20, in the journal Nature Physics. Motter coauthored the paper with Northwestern's Takashi Nishikawa and Ferenc Molnar, a former postdoctoral researcher at Northwestern who is now at the University of Notre Dame.

This work expands upon Nishikawa's and Motter's 2016 paper, which theoretically predicted the phenomenon.

"It is interesting that systems need to be asymmetric to exhibit behavioral symmetry," said Nishikawa, a research professor of physics in Weinberg. "This is remarkable mathematically, let alone physically. So, many colleagues thought that experimentally demonstrating this effect was impossible."

Motter and his collaborators made the seemingly impossible possible by using three identical electric generators. Each generator oscillated at a frequency of exactly 100 cycles per second. When separated, the identical generators behaved identically.

When connected to form a triangle, their frequencies diverged -- but only until the generators were properly mismatched to have different energy dissipations. At that point, they synchronized again.

"This can be visualized by putting a small lamp between each pair of generators," Molnar explained. "When the generators are identical, the lamp flickers, meaning that the generators are not synchronized. But when the generators' dissipation is tweaked to different levels, the flickering stops, indicating that the generator voltages are oscillating in sync."

The researchers dubbed this phenomenon "converse symmetry breaking" because it represents the opposite of the previously known phenomenon of symmetry breaking, which underlies superconductivity, the Higgs mechanism and even the appearance of zebra stripes.

In symmetry breaking, the dynamical equations have a symmetry that is not observed in the behavior of the system, while converse symmetry breaking concerns situations in which the behavior of the system has a given symmetry only when that symmetry is avoided in the dynamical equations.

"It might seem counterintuitive," Motter said. "But our theory predicts that this is true across many systems, not just electromechanical ones."

Motter's team plans to explore the implications of their findings across social, technological and biological systems. In particular, the team is actively working on the design of a power grid that is more stable while allowing incorporation of an increasing share of energy from renewable sources.

Credit: 
Northwestern University

A cautionary tale about measuring racial bias in policing

Racial bias and policing made headlines last year after a study examining records of fatal police shootings claimed white officers were no more likely to shoot racial minorities than nonwhite officers. There was one problem: The study was based on a logical fallacy.

The original research counted the numbers of fatal shootings, but never considered how often civilians encounter police officers, an essential ingredient to justifying its central claim.

The findings sparked a fiery debate among other academics, including two professors from Princeton University, who raised mathematical concerns about the study's approach. Today, they published their critique as a letter in the Proceedings of the National Academy of Sciences (PNAS).

The pair -- Dean Knox, assistant professor of politics, and Jonathan Mummolo, assistant professor of politics and public affairs -- outline a number of serious flaws in the original study, which was featured in PNAS on Aug. 6, 2019.

For the original study, researchers from Michigan State University and the University of Maryland compiled data on 900 fatal U.S. police shootings from crowdsourced databases. They then contacted each police department, gathering information about the race of the police officers responsible for each fatality.

The researchers then used the shootings data to predict the race of victims. Specifically, they showed that when the shooting officer was black, the civilian who was shot was more likely to be black than white. And controlling for attributes of the county in which shootings occurred, "the relationship between officer and civilian race was attenuated or eliminated." The authors interpreted these results as evidence that white officers are not biased against black civilians.

Yet, Knox and Mummolo show that the authors' conclusions hinge on the assumption that black and white officers encounter black and white civilians in equal numbers. They demonstrate this formally, but a simple thought experiment also illustrates the conceptual problem.

Imagine a white officer encounters 90 white civilians and 10 black, while a black officer encounters 90 black civilians and 10 white, both under identical circumstances. If both officers shot five black and nine white civilians, the results would -- according to the reasoning of the original study -- appear to show no racial bias.

However, once encounter rates are taken into account, one would see the white officer shot 50% of the black civilians he or she saw while the black officer shot 5.6%. Therefore, failing to incorporate information on encounter rates masks racial bias.
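
The arithmetic of that thought experiment can be made explicit in a few lines (a sketch of the hypothetical counts above, not of the study's actual data):

```python
# Hypothetical counts from the thought experiment above: identical shooting
# totals, very different encounter pools.
encounters = {"white_officer": {"black": 10, "white": 90},
              "black_officer": {"black": 90, "white": 10}}
shootings  = {"white_officer": {"black": 5, "white": 9},
              "black_officer": {"black": 5, "white": 9}}

# Counting shootings alone, both officers look identical (5 black, 9 white).
# Per-encounter rates tell a different story:
for officer in encounters:
    rate = shootings[officer]["black"] / encounters[officer]["black"]
    print(f"{officer}: shot {rate:.1%} of black civilians encountered")
# white_officer: 50.0%; black_officer: 5.6%
```

Under these hypothetical counts, the white officer's per-encounter rate of shooting black civilians is nine times the black officer's, which is exactly the disparity the raw shooting counts conceal.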

The data from the original study also only includes records of shootings, ignoring all other police-civilian encounters. And it doesn't take into account that all police officers -- white and nonwhite -- could, in theory, be biased in shooting black men.

These critiques have a number of implications on the way data is collected for research and the benchmarks used for analysis.

"New data on police behavior are coming online all the time, and that is great from a research standpoint," Mummolo said. "But all the data in the world do not negate the need to adhere to basic tenets of statistical theory and causal inference. Studies of racial bias demand the utmost rigor, and when blatant mistakes are made, they need to be quickly corrected. To allow provably false results to stand unchallenged risks confusing the public and lawmakers on one of the most pressing policy issues of our time."

After their critique was initially rejected by PNAS, Mummolo published a Twitter thread highlighting the mathematical problems associated with the original study in August 2019. The team also posted their analysis on the preprint server SSRN.

Responding to the critique, the authors of the original paper released a formal response, stating their claim about the relative probability of white and black officers shooting racial minorities was not supported, but adding that the original findings, "as described in that manuscript, largely stand unchanged."

Knox and Mummolo then appealed the rejection at PNAS, and their critique was accepted.

Credit: 
Princeton School of Public and International Affairs