Culture

Citizen science birding data passes scientific muster

image: A Yellow-headed blackbird in the Bear River Migratory Bird Refuge, Utah.

Image: 
Photo by JJ Horns.

As long as there have been birdwatchers, there have been lists. Birders keep detailed records of the species they’ve seen and compare these lists with each other as evidence of their accomplishments. Now those lists, submitted and aggregated to birding site eBird, can help scientists track bird populations and identify conservation issues before it’s too late.

Joshua Horns is an eBird user himself and a doctoral candidate in biology at the University of Utah. In a paper published today in Biological Conservation, Horns and colleagues report that eBird observations match trends in bird species populations measured by U.S. government surveys to within 0.4 percent.

Many nations don’t conduct official bird surveys, Horns says. “In a lot of tropical nations that's especially worrisome because that's where most birds live.” But he’s now shown that eBird data may be able to fill that gap.

How eBird works

For birders, eBird is a way to add their observations to a worldwide community and to contribute data to a vast and growing database of which birds have been seen where, and when.

Birders at the University of Utah (notably Kenny Frisch, an assistant horticulturalist who has logged 116 of the 120 known species on campus: see sidebar below) have made the university a local hotspot. And eBird has a system in place to ensure that the data submitted reflects reality. Fact-checkers, including Frisch, are contacted by eBird to follow up on unusual sightings. Ornithologist Çagan Sekercioglu (currently the fifth-ranked eBirder in the world with 7,273 species observed) says he has been flagged for fact-checking when he identifies species never before seen in an area, and uses his photographs to verify his sightings.

How many lists?

Horns’ question was whether eBird data could serve as a reliable measure of bird populations. In the United States, he had the luxury of being able to compare birders’ lists to the Breeding Bird Survey, conducted annually by the U.S. Geological Survey throughout the United States and Canada. But in South America, the Caribbean and tropical Africa, along with other bird hotspots, government data is absent. eBird users, however, are present all around the world.

Horns compared more than 11 million eBird lists to government data between 1997 and 2016. To account for the range in birder skill represented in the eBird lists, Horns used the length of the birders’ lists as a proxy for their expertise and experience. “Some studies have shown that as you bird for a longer stretch of time you do record more species, but as you bird for more and more years, the number of species you see on any outing increases as well,” Horns says.

With additional statistical controls to ensure a fair comparison between the eBird and official data, Horns set out to determine how many lists are required to track a species’ population accurately. The cutoff, he found, was about 10,000 lists. So if a country or region has more lists than that, the results suggest, you can be confident that the species population trends observed in the lists reflect reality.
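As a rough sketch of that sufficiency check, the snippet below applies the roughly 10,000-list cutoff reported in the study before fitting a trend. The data, variable names, and plain least-squares fit are illustrative assumptions, not the paper's actual statistical model:

```python
# Illustrative sketch: decide whether a region has enough eBird
# checklists to trust a population trend, using the study's
# reported ~10,000-list cutoff. The least-squares slope below is
# a stand-in for the paper's actual statistical controls.

MIN_LISTS = 10_000  # threshold reported by Horns and colleagues

def trend_slope(years, rates):
    """Ordinary least-squares slope of detection rate vs. year."""
    n = len(years)
    my = sum(years) / n
    mr = sum(rates) / n
    num = sum((y - my) * (r - mr) for y, r in zip(years, rates))
    den = sum((y - my) ** 2 for y in years)
    return num / den

def assess_region(lists_per_year, rates_per_year, years):
    """Return a trend slope, or None if too few checklists exist."""
    if sum(lists_per_year) < MIN_LISTS:
        return None
    return trend_slope(years, rates_per_year)

# Hypothetical region: 11,800 total checklists, so above the cutoff.
years = [2012, 2013, 2014, 2015, 2016]
lists = [1800, 2100, 2400, 2600, 2900]
rates = [0.42, 0.40, 0.39, 0.37, 0.35]  # fraction of lists reporting the species
slope = assess_region(lists, rates, years)
print(round(slope, 4))  # prints -0.017: a negative slope, i.e. an apparent decline
```

A region totaling fewer than 10,000 lists would return `None` here, mirroring the study's finding that sparser data cannot be trusted to reflect real population trends.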

But what about areas that don’t have that many lists? Horns says that lists from bird atlases and ecotourism groups can also be used, again with list length as a proxy for birder skill. Sekercioglu is doing his part, having submitted eBird lists following recent trips to Bolivia, New Guinea and Madagascar.

The eBird data is more accurate for common birds, Horns says, simply because they’re observed so often. “White-crowned pigeons live only in the Florida Keys,” Horns says, “so unless you live in the Florida Keys, you're not going to be seeing them.” Also, more lists are submitted for areas closer to cities. “You're not going to have many people out in Utah’s West Desert looking for birds but there will be a lot in Farmington Bay, near the Great Salt Lake,” Horns says.

But even common birds can be vulnerable. Horns’ analysis of eBird data shows significant declines for 48 percent of 574 North American bird species over the past 20 years. Large numbers of a common bird species could be lost before the general public notices, Horns says. “It's those declines in common species that could really drive down functioning of an ecosystem versus declines in rarer species.”

Horns’ results show the value of citizen science observations by amateurs, although the practice of birdwatching long predates the term “citizen science.” Each time birders head out, tripods and binoculars in hand, they are serving as another set of scientific eyes to help bird conservation efforts.

“We hope this analysis can be taken a step further,” Horns says. “We can use it to start monitoring these birds and pick up on birds that may be declining before they decline so much that it's hard to bring them back.”

Credit: 
University of Utah

Genes play a role in empathy

A new study led by scientists from the University of Cambridge, the Institut Pasteur, Paris Diderot University, the CNRS and the genetics company 23andMe suggests that our empathy is not just a result of our education and experience but is also partly influenced by genetic variations. These results will be published in the journal Translational Psychiatry on March 12, 2018.

Empathy plays a key role in human relationships. It has two parts: the ability to recognize another person's thoughts and feelings, and the ability to respond with an appropriate emotion. The first part is called "cognitive empathy" and the second part is called "affective empathy".

Fifteen years ago, a team of scientists at the University of Cambridge developed the Empathy Quotient or EQ, a brief self-report measure of empathy. Using this test, which measures both types of empathy, the scientists demonstrated that some of us are more empathetic than others, and that women, on average, are slightly more empathetic than men. They also showed that, on average, autistic people have more difficulties with cognitive empathy, even though their affective empathy may be intact.

The Cambridge team, the Institut Pasteur, Paris Diderot University, the CNRS and the genetics company 23andMe can now report the results of the largest genetics study of empathy using information from more than 46,000 23andMe customers. These people all completed the EQ online and provided a saliva sample for genetic analysis.

The results of this study, led by Varun Warrier (University of Cambridge), Professors Simon Baron-Cohen (University of Cambridge) and Thomas Bourgeron (Paris Diderot University, Institut Pasteur, CNRS), and David Hinds (23andMe), first revealed that our empathy is partly down to genetics: at least a tenth of the variation in empathy between individuals is associated with genetic factors.

The findings also confirm that women are, on average, more empathetic than men. However, this difference is not driven by DNA, as no sex differences were observed in the genes that contribute to empathy. This implies that the difference in empathy between the sexes results from other factors, such as socialization, or from non-genetic biological factors, such as prenatal hormone influences, which also differ between the sexes.

Finally, the scientists observed that genetic variants associated with lower empathy are also associated with higher risk for autism.

Varun Warrier explained: "This is an important step towards understanding the role that genetics plays in empathy. But since only a tenth of the variation in the degree of empathy between individuals is down to genetics, it is equally important to understand the non-genetic factors."

Professor Thomas Bourgeron said: "These results offer a fascinating new perspective on the genetic influences that underpin empathy. Each specific gene plays a small role and this makes it difficult to identify them. The next step is to study an even larger number of people, to replicate these findings and to pinpoint the biological pathways associated with individual differences in empathy."

Finally, Professor Simon Baron-Cohen added: "Finding that even a fraction of why we differ in empathy is due to genetic factors helps us understand people, such as those with autism, who struggle to imagine another person's thoughts and feelings. This empathy difficulty can give rise to a disability that is no less challenging than other kinds of disability. We as a society need to support those with disabilities, with novel teaching methods, work-arounds or reasonable adjustments, to promote inclusion."

Credit: 
Institut Pasteur

Proteins associated with diabetic complications and increased heart disease identified

image: Protein pathways that are closely linked to changes in both triglyceride and hemoglobin A1c levels in diabetic patients have been identified in new research by the Intermountain Medical Center Heart Institute in Salt Lake City.

Image: 
Intermountain Medical Center Heart Institute

Protein pathways that are closely linked to changes in both triglyceride and hemoglobin A1c levels in diabetic patients have been identified in new research by the Intermountain Medical Center Heart Institute in Salt Lake City.

The findings of the two related studies spark new interest in additional research that could help healthcare providers understand these links and identify ways to intervene earlier to prevent the onset of heart disease or diabetic complications.

"Understanding the biology of how proteins interact with other cells in the body can improve patient care and help physicians prevent catastrophic events like heart attack, stroke, or death," said Stacey Knight, PhD, a researcher with the Intermountain Medical Center Heart Institute and lead author of the study. "The findings of these studies may help explain the often-increased triglyceride levels that lead to cardiovascular events for diabetic patients."

Results of the two studies on protein pathways will be presented at the American College of Cardiology Scientific Session in Orlando on March 11, at 9:45 a.m., ET.

In diabetic patients, high triglyceride levels are associated with heart disease and stroke. High levels of hemoglobin A1C are also associated with increased complications like diabetic retinopathy.

For one of the studies, researchers looked at 264 patients who were enrolled in the FACTOR-64 study, a clinical trial designed to reduce diabetic patients' risk of cardiovascular disease. A SOMAscan assay was used to determine the plasma levels of more than 4,000 proteins.

Researchers found a significant association between hemoglobin A1c levels and the semaphorin and plexin pathways, both of which have been found to be linked with diabetic retinopathy -- a diabetic complication in which high blood glucose levels damage the blood vessels of the retina.

"We found that an increase of the proteins in this pathway may result in increased hemoglobin A1C -- or that an increased A1C may increase proteins in the pathway," said Dr. Knight. "We'll need to further explore this association to identify how those two elements influence each other."

The second study looked at the same population of patients from the FACTOR-64 study and identified three protein pathways that were significantly associated with triglyceride levels:

Insulin-like growth factor-binding protein

Immunoglobulin

Fibronectin

Additional research is needed to help clinicians better understand the relationships between triglyceride levels and these three protein pathways.

"These initial findings made us pause for a moment and start asking additional questions about these relationships," said Dr. Knight. "We hope to further explore these pathways to better identify where interventions may occur to help reduce risk for cardiovascular events in diabetic patients."

Credit: 
Intermountain Medical Center

Transcatheter aortic valve replacement dramatically improves heart patients' quality of life

image: Patients who undergo a transcatheter aortic valve replacement, or TAVR -- a minimally-invasive surgical procedure that repairs a damaged heart valve -- experienced a significant increase in their quality of life, according to a new study by researchers at the Intermountain Medical Center Heart Institute in Salt Lake City.

Image: 
Intermountain Medical Center Heart Institute

Patients who undergo a transcatheter aortic valve replacement, or TAVR -- a minimally-invasive surgical procedure that repairs a damaged heart valve -- experienced a significant increase in their quality of life, according to a new study by researchers at the Intermountain Medical Center Heart Institute in Salt Lake City.

Intermountain Medical Center Heart Institute researchers found that patients who underwent a TAVR procedure rated their quality of life significantly higher just 30 days after the procedure: 72.9 on a 100-point scale, where 100 is the healthiest possible score, compared with 42 before the procedure. One year after the procedure, these patients continued to see significant improvement, rating their quality of life at 75.4 out of 100.

Researchers say results suggest that patients experienced an easier time completing daily tasks and a better overall quality of life after the procedure, confirming results observed in clinical trials.

Results of the study will be presented at the 2018 American College of Cardiology Scientific Session in Orlando on March 11, at 9:30 a.m., ET. More than 13,000 cardiologists and cardiovascular clinicians from around the world are attending the scientific meeting.

"It's remarkable that patients were able to improve their quality of life from 42 to 72.9. What's most meaningful is that the improvement was sustained, which means the quality of life of our patients continued to be dramatically improved," said Jose Benuzillo, MA, MS, lead author of the study and outcomes analyst for the Cardiovascular Clinical Program at Intermountain Healthcare.

Researchers assessed patients' self-reported health status at baseline, 30 days after the TAVR procedure, and one year after the TAVR, using the Kansas City Cardiomyopathy Questionnaire, a widely accepted survey that measures physical function, severity of symptoms, social limitations, self-care abilities, and quality of life in heart patients.

For the study, researchers examined Intermountain Healthcare's Enterprise Data Warehouse, one of the nation's largest repositories of clinical data, to identify patients who underwent TAVR at Intermountain Medical Center between October 2013 and July 2017.

A total of 471 patients who underwent TAVR at Intermountain Medical Center were analyzed, of whom 460 completed the baseline survey.

Patients treated in everyday practice differ in many ways from those participating in randomized clinical trials, so being able to study these patients is important; it provides a more comprehensive view of the care and outcomes associated with TAVR.

During a TAVR procedure, doctors repair the valve without removing the old, damaged valve. Somewhat similar to a stent placed in an artery, the TAVR approach delivers a fully collapsible replacement valve to the valve site through a catheter.

Once the new valve is expanded, it pushes the old valve leaflets out of the way and the tissue in the replacement valve takes over the job of regulating blood flow.

Credit: 
Intermountain Medical Center

New cardiac pump device improves long-term outcomes for heart failure patients

Boston, MA -- New findings, presented today at the American College of Cardiology, provide long-term information about survival, stroke rates and durability of a novel centrifugal-flow pump compared with a commercial axial flow pump for heart-failure patients. BWH investigators report that patients who received the centrifugal-flow pump had significantly lower rates of pump-related blood clots and stroke. Results from the MOMENTUM 3 trial's analysis at 24 months were presented in a Late Breaking Clinical Trial at ACC by Mandeep R. Mehra, MD, executive director of the Center for Advanced Heart Disease and medical director of the Heart & Vascular Center at Brigham and Women's Hospital, and published simultaneously online in the New England Journal of Medicine.

"This is a pivotal study in the field of advanced heart failure," said Mehra. "Left ventricular assist devices have been in development for 40 years and there have been improvements in their technology but several challenges exist, including problems of blood clots forming in these devices, requiring device replacement. The field has been trying to engineer devices that could obviate some or all of these problems, and we report today on some important advances."

MOMENTUM 3, sponsored by Abbott Inc., evaluated Abbott's HeartMate 3™ left ventricular assist system, a magnetically levitated continuous centrifugal-flow circulatory pump, compared to the HeartMate II™. The trial evaluated how many participants, two years after receiving their device, had not suffered a disabling stroke or had an operation to replace or remove a malfunctioning device. A total of 366 patients were randomized to receive either the centrifugal-flow pump or the axial-flow pump. Researchers report that 151 of 190 patients on the centrifugal-flow pump did not experience a disabling stroke or need a re-operation (79.5 percent), compared to 106 of 176 patients (60.2 percent) on the axial pump. Only three people who received the centrifugal-flow pump needed a re-operation, compared to 30 who received the axial pump. No re-operations occurred due to blood clots in the centrifugal-flow pump. Rates of death or disabling stroke were similar between the two groups, but overall, strokes were less frequent in the centrifugal-flow pump group. Bleeding and infection rates were no different between the two groups.
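The headline event-free rates follow directly from the patient counts in the release; a quick arithmetic check:

```python
# Event-free rates in MOMENTUM 3, recomputed from the counts in
# the release (patients with no disabling stroke and no re-operation
# at two years).
centrifugal_ok, centrifugal_n = 151, 190   # HeartMate 3 arm
axial_ok, axial_n = 106, 176               # HeartMate II arm

centrifugal_rate = 100 * centrifugal_ok / centrifugal_n
axial_rate = 100 * axial_ok / axial_n

print(round(centrifugal_rate, 1))  # prints 79.5
print(round(axial_rate, 1))        # prints 60.2
```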

MOMENTUM 3 launched in 2014 and was designed to dramatically reduce the overall timeline for clinical trials. All patients with refractory heart failure who needed a cardiac pump were eligible for the trial, regardless of whether the pump was intended as bridge to transplantation or destination therapy.

"Traditional trials must undergo safety testing, followed by testing in healthier populations, and it can be over a decade before the broader population has access to such therapies," said Mehra. "Removing restrictions based on transplant status resulted in a unique study that has been extremely successful in its enrollment and highly expeditious in delivering results."

The HeartMate 3 includes several technological adaptations intended to reduce risk of complications. The fully magnetically levitated device runs like a bullet train - its rotor has no mechanical bearings in it and pushes the blood using only magnetism. It is designed to reduce shear stress, which is thought to cause blood clots to form in pumps.

In its next phase, MOMENTUM 3 will evaluate 1,028 patients at the two-year mark to further validate the current findings. Results of the full cohort are expected by the end of 2019.

Credit: 
Brigham and Women's Hospital

Eliminating cost barriers helps heart patients comply with drug regimens

image: This is Tracy Wang, M.D., Duke Clinical Research Institute

Image: 
Duke Health

DURHAM, N.C. -- Doctors often cite the high price of a prescription drug as a reason they don't prescribe it, while patients similarly say that cost is a main reason they quit taking a drug.

Removing this financial barrier might increase the use of evidence-based therapies, improve patient adherence to those medications, and potentially save lives. That theory was tested in a study of heart attack survivors led by the Duke Clinical Research Institute; findings were presented March 11 at the American College of Cardiology annual scientific sessions meeting in Orlando.

"This study provides some good insights into medication-taking behavior and tackling the adherence problem, a big problem in the U.S.," said study chair Eric D. Peterson, executive director of the DCRI. "While financial issues are certainly part of the problem, a more complete answer will be needed to further improve adherence and patient outcomes."

The researchers enrolled 11,001 heart attack patients between June 2015 and June 2016 at hundreds of sites across the country in a study known as ARTEMIS. Doctors at participating hospitals provided usual care, but at roughly half the sites, selected randomly, the cost of anti-platelet medications was offset by vouchers over the course of the study's one-year span.

Payment vouchers eliminated price differences between an older generic therapy called clopidogrel and a newer, more effective version of the therapy, ticagrelor. Doctors had full discretion on which of the two drugs to prescribe.

The study found that clinicians were indeed sensitive to their patients' cost concerns. When patient co-pays were covered, doctors were more than 30 percent more likely to prescribe the more effective drug.

When patients were asked about their medication use, 80 to 85 percent reported that they filled all their prescriptions continuously, but the study's analysis of pharmacy fill data indicated that only 55 percent had been fully compliant.

Regardless of the measure of medication use, the study confirmed that more of the patients who got the pay vouchers stuck to their recommended drug regimens.

But those improvements did not appear to result in a reduced rate of death, heart attacks or strokes compared with patients who got usual care.

"Our study confirms some of our thoughts on how drug prices affect doctors' and patients' behaviors," said lead author Tracy Wang, M.D., associate professor of medicine at Duke University School of Medicine and member of the DCRI.

"But we still have a lot of work to do to understand how we can both measure and improve treatment adherence," Wang said. "We should consider copayment reductions as part of broader initiatives to improve medication use and clinical outcomes."

Credit: 
Duke University Medical Center

Non-invasive technology is a money-saver for heart patients needing PCI

image: Diagnostic called iFR helps identify blockages at a lower cost than an older technology.

Image: 
Shawn Rocco/Duke Health

DURHAM, N.C. -- Doctors evaluating patients for blockages in the heart are aided by having a good roadmap of the vascular terrain before they can insert stents to clear the impasse.

Two technologies have been used with equal success, but now a study presented March 10 at the American College of Cardiology annual meeting by Duke cardiologists shows that the newer method carries a much lower cost, potentially saving each patient at least $800.

In a study of nearly 2,500 heart patients, researchers found that a new technology, non-invasive instantaneous wave-free ratio (iFR), was less expensive than an older technology known as fractional flow reserve (FFR), which requires injection of a drug that dilates the blood vessels, adding complexity, expense and potential risk.

"There are clear clinical advantages to using these technologies to map coronary physiology prior to coronary revascularization procedures, because they provide an accurate evaluation of the blockage, as well how best to treat it," said senior researcher Manesh Patel, M.D., chief of the Division of Cardiology at Duke and member of the Duke Clinical Research Institute. "Unfortunately, there has been resistance to performing FFR in part due to the use of the vasodilator drug, so finding a good alternative is an important clinical step."

Patel said iFR has emerged in recent years as an alternative. Unlike FFR -- which requires administration of the drug adenosine to maximally vasodilate the heart muscle and then measures the difference in pressure along a blocked or narrowed artery -- iFR relies on measuring the pressure at a specific point in the cardiac cycle.

A study called DEFINE-FLAIR compared iFR to FFR and reported last year that the different technologies performed similarly for clinical outcomes, with iFR actually resulting in fewer symptoms for patients before, during and after the procedure.

In the current study, the question was whether iFR might be less expensive than FFR, which would eliminate another potential barrier to its use. Using the DEFINE-FLAIR data for their cost analysis, Patel and colleagues found that the average cost of the catheterization procedure was lower in the iFR group than in the FFR group, at $2,489 vs. $2,564.

The iFR procedure was less costly because it took less time, didn't require the vasodilation drug, and resulted in lower percutaneous coronary intervention rates.

Patients in the iFR group also had significantly fewer coronary artery bypass graft procedures and fewer subsequent revascularizations than those in the FFR group.

Overall, health care costs were estimated at $7,442 with iFR and $8,243 with FFR, for an unadjusted saving of $801 per patient.
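Those savings figures can be verified directly from the reported costs:

```python
# Per-patient cost comparison from the DEFINE-FLAIR cost analysis,
# as reported in the release (all figures in USD).
ifr_total, ffr_total = 7_442, 8_243   # overall health care costs
ifr_cath, ffr_cath = 2_489, 2_564     # catheterization procedure cost

print(ffr_total - ifr_total)  # prints 801: unadjusted saving per patient
print(ffr_cath - ifr_cath)    # prints 75: saving on the procedure itself
```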

"Either of these two technologies improves outcomes for patients with coronary disease, but our study shows that iFR has cost savings with similar outcomes," Patel said. "This should help remove barriers to the more widespread clinical adoption of a technology that can provide physicians with a better conception of patients' unique coronary physiology."

Credit: 
Duke University Medical Center

Osteochondral allograft transplantation effective for certain knee cartilage repairs

NEW ORLEANS, LA - Isolated femoral condyle lesions account for 75% of the cartilage repair procedures performed in the knee joint, and physicians have a variety of techniques to consider as part of surgical treatment. Osteochondral allograft transplantation (OCA) is a valuable and successful approach for this condition, as described by research presented today at the American Orthopaedic Society for Sports Medicine's Specialty Day in New Orleans.

"Our study demonstrated that the modern OCA transplantation technique, which utilizes thin, dowel type grafts, was very effective in treating patients with femoral condyle cartilage lesions," noted Luís E. Tírico, MD, who is currently with the University of Sao Paulo in Sao Paulo, Brazil and served as the research fellow and lead author on the presentation under Dr. William Bugbee, Director of Clinical Research and Head of the Scripps Cartilage Restoration and Transplant Program at Scripps Clinic in La Jolla, CA. "In 200 cases, we noted an 89% satisfaction rate with those treated by this method, along with significant improvements in clinical scores and a low graft failure rate."

The study, which represents the largest reported cohort of isolated femoral condyle lesions treated with the modern, dowel technique for OCA transplantation, included 187 patients (200 knees) who underwent OCA transplantation between June 1999 and August 2014. At a minimum follow-up of two years and an average of 6.7 years, International Knee Documentation Committee (IKDC) total scores improved from 43.7 to 76.2 on average, and Knee injury and Osteoarthritis Outcome Scores (KOOS) improved from 66.5 to 85.3 for pain and from 74.5 to 91.1 for activities of daily living. Further surgery was required in 52 knees (26%), of which 16 (8%) were considered failures, as defined by removal or revision of the allograft.

"The modern technique of OCA transplantation for treating isolated femoral condyle lesions offers patients better results than other cartilage repair procedures," commented Tírico. "These results appear to be equal or superior to any other cartilage repair procedure for the treatment of femoral condyle lesions, and they lead us to consider whether fresh OCA should be viewed as the current gold standard in cartilage repair for focal femoral condyle lesions."

Credit: 
American Orthopaedic Society for Sports Medicine

High school athletes with shoulder instability benefit from nonoperative treatment

NEW ORLEANS, LA - Nonoperative treatment of high school athletes with shoulder instability is an effective approach, according to research presented today at the American Orthopaedic Society for Sports Medicine's Specialty Day in New Orleans. Researchers also noted that using the Non-Operative Instability Severity Score (NSIS) tool can help identify higher-risk patients who may require other forms of treatment.

"Our study showed that of 57 patients who were initially treated nonoperatively for anterior shoulder instability, 79% achieved a full return to sport and completed the following season without missing any practice or playing time," noted lead author John M. Tokish, MD, from the Mayo Clinic Arizona in Phoenix, Arizona, who performed the work with Ellen Shanley, PhD, and their team at the Steadman Hawkins Clinic of the Carolinas. "We also identified certain risk factors, like bone loss, that may make this approach ineffective for certain individuals."

The research included patients who experienced first-time anterior shoulder instability and were managed non-operatively, were involved in high school sports with at least one season of eligibility remaining, and had information available on their return to sport following treatment. Analysis of the patients' NSIS scores showed that those with a score below 7 returned at a 97% rate, while those who scored above 7 had a 59% success rate. Those who were not successful either did not return to their previous sport or sustained an injury leading to time lost. Of the high-risk patients, those with bipolar bone loss had a failure rate of 67%.
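As a sketch of how such a cutoff might be applied, the helper below is hypothetical; only the cutoff of 7 and the two return-to-sport rates come from the release, and the release does not say how a score of exactly 7 is classified:

```python
# Hypothetical sketch of using an NSIS-style cutoff to flag
# higher-risk athletes. The function and its name are illustrative,
# not the clinical tool itself; scores above 7 are treated as
# higher risk (the boundary case of exactly 7 is an assumption).
def nsis_risk_group(score):
    """Classify an athlete by NSIS score using the study's cutoff."""
    return "higher risk" if score > 7 else "lower risk"

# Return-to-sport success rates reported for each group:
success_rate = {"lower risk": 0.97, "higher risk": 0.59}

for score in (5, 9):
    group = nsis_risk_group(score)
    print(score, group, success_rate[group])
```

In practice, the release suggests such a grouping would steer higher-risk athletes (for example, those with bipolar bone loss) toward surgical options rather than nonoperative care.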

"Deciding how to treat young athletes with shoulder instability can be challenging and controversial, but this research shows doing so without surgery is often successful," commented Tokish. "Physicians can use clinical tools like the NSIS to classify higher risk patients - such as those with bipolar bone loss - who are unlikely to benefit from the non-operative treatment approach and guide their decision making."

Researchers recommend future studies look at larger patient groups over a longer period to build on this data.

Credit: 
American Orthopaedic Society for Sports Medicine

Study: Absence of key protein, TTP, rapidly turns young bones old

image: The removal of the gene that produces TTP progressively increases the presence of osteoclasts (red) -- cells that break down and absorb bone -- causing rapid bone loss in mice (middle) compared with healthy mice (left).

Image: 
Photo: Keith Kirkwood

BUFFALO, N.Y. - The absence of a protein critical to the control of inflammation may lead to rapid and severe bone loss, according to a new University at Buffalo study.

The study found that when the gene needed to produce the protein tristetraprolin (TTP) is removed from healthy mice, the animals developed the bones of much older rodents.

Within nine months, mice without the gene experienced a nearly 20 percent loss in oral bone. The results also revealed that overexpressing TTP in the animals led to a 13 percent reduction in bone turnover compared to unaffected mice.

Published on March 7 in the Journal of Dental Research, the study is the first to test TTP's influence on bone loss in an animal model.

Inflammation is a necessary reaction by the immune system to protect the body from injury or infection, but if not controlled, it can lead to the destruction of bone and the prevention of bone formation.

While TTP is known to play a major role in the regulation of inflammation, its production slows with age. The research results could have a profound impact on the management of bone health in the elderly, a population at higher risk of osteoporosis and periodontitis.

"TTP is the brake on the system. Without it, inflammation and bone loss would go unchecked," says Keith Kirkwood, DDS, PhD, lead author and professor in the Department of Oral Biology in UB's School of Dental Medicine.

"We don't know all of the reasons why TTP expression decreases with age. So, understanding the factors behind its expression and relationship with bone loss is the first step toward designing therapeutic approaches."

The researchers aim to advance their investigation toward similar studies in humans, particularly among the aging.

Osteoporosis - a condition in which bones become weak and brittle - and low bone mass together affect nearly 55 percent of people age 50 and older, and the National Osteoporosis Foundation estimates that by 2020 more than 61 million people will have one of the two conditions.

The statistics surrounding periodontitis are equally grim. The infection - which damages the gums, destroys jaw bone and can lead to tooth loss - occurs in 70 percent of adults age 65 and older, according to the Centers for Disease Control and Prevention.

To better understand TTP's role in periodontitis, an inflammatory disease, the researchers studied three groups of healthy mice: a knockout group without the gene to express TTP, a knock-in group whose genes overexpressed TTP, and a control group of unaffected mice.

The rodents were tested for inflammatory conditions, oral bone levels and the presence of osteoclasts - cells that specialize in breaking down bone - in oral tissue at three, six and nine months.

The researchers found that bone in the knockout mice aged more rapidly than in the control group. At three months old, the mice had lost 14 percent of their oral bone. By nine months - still a young age for a mouse - bone loss had increased to 19 percent.

In addition to periodontitis, the knockout mice developed arthritis, eczema and other inflammatory conditions. Osteoclast levels were also higher in the knockout group.

Investigators were surprised to find that the absence of TTP vastly altered the oral microbiome, despite all the rodents being housed in the same space. The finding suggests that systemic inflammation can affect the bacteria in the mouth. Further study is needed to determine whether the new bacteria are pathogenic or play a role in bone loss, says Kirkwood.

Overexpression of TTP in the knock-in mice increased protection against inflammation, lowering bone turnover by 13 percent. The increase in the protein had no effect on the number of osteoclasts, however.

A future investigation will study the effect of TTP on bone health over a two-year period. Kirkwood will also partner with Bruce Troen, MD, professor, and Kenneth Seldeen, PhD, research assistant professor, both in the Jacobs School of Medicine and Biomedical Sciences, to examine the differences in the protein's influence on oral bone and overall bone health.

Credit: 
University at Buffalo

Could living at high altitude increase suicide risk? Evidence suggests possible treatments, reports Harvard Review of Psychiatry

March 9, 2018 - High-altitude areas--particularly the US intermountain states--have increased rates of suicide and depression, suggests a review of research evidence in the Harvard Review of Psychiatry. The journal is published by Wolters Kluwer.

The increased suicide rates might be explained by low blood oxygen levels due to low atmospheric pressure, according to the article by Brent Michael Kious, MD, PhD, of the University of Utah, Salt Lake City, and colleagues. Pending further research, the evidence may point to possible treatments to reduce the effects of low blood oxygen on mood and suicidal thoughts.

Altitude Linked to Variations in Suicide Rate - Further Study of Mechanisms Needed
The researchers reviewed and analyzed previous evidence linking higher altitude of residence to increased risk of suicide and depression, and considered possible explanations for these associations. "There are significant regional variations in the rates of major depressive disorder and suicide in the United States, suggesting that sociodemographic and environmental conditions contribute," Dr. Kious and coauthors write.

They analyzed 12 studies, most performed in the United States, including population-based data on the relationship between suicide or depression and altitude. While the studies used varying methods, most reported that higher-altitude areas had increased rates of depression and suicide. In general, the correlation was stronger for suicide than for depression.

The highest suicide rates were clustered in the intermountain states: Arizona, Colorado, Idaho, Montana, Nevada, New Mexico, Utah, and Wyoming. (Alaska and Virginia also had high suicide rates.) In a 2014 study, the percentage of adults with "serious thoughts of suicide" ranged from 3.3 percent in Connecticut (average altitude 490 feet) to 4.9 percent in Utah (average altitude 6,100 feet).

Other key findings from previous research on altitude and suicide included:

Populations living at higher altitudes had increased suicide rates despite having decreased rates of death from all causes. Rather than a steady increase, the studies suggested a "threshold effect": suicide rates increased dramatically at altitudes between about 2,000 and 3,000 feet.

Suicide rates were more strongly associated with altitude than with firearm ownership. Other factors linked to suicide rate included increased poverty rate, lower income, and smaller population ratios of white and divorced women. However, the studies could not account for all factors potentially affecting variations in suicide, such as substance abuse rates and cultural differences.

While more than 80 percent of US suicides occur in low-altitude areas, that's because most of the population lives near sea level. Adjusted for population distribution, suicide rates per 100,000 population were 17.7 at high altitude, 11.9 at middle altitude, and 4.8 at low altitude. Studies from some other countries, but not all, also reported increased suicide rates at higher altitudes.
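The population-adjustment point above can be sketched with a toy calculation. The rates per 100,000 are those reported in the review; the population figures below are invented purely for illustration.

```python
# Illustrative sketch: why raw suicide counts mislead without adjusting for
# how the population is distributed across altitude bands.
# Rates per 100,000 come from the review; population shares are assumed.

rates_per_100k = {"low": 4.8, "middle": 11.9, "high": 17.7}  # from the review
population = {"low": 250_000_000, "middle": 60_000_000, "high": 15_000_000}  # assumed

# Expected number of suicides in each band = rate x population / 100,000.
counts = {band: rates_per_100k[band] * population[band] / 100_000
          for band in rates_per_100k}
share_low = counts["low"] / sum(counts.values())

# Even with the highest per-capita rate at high altitude, most suicides
# occur at low altitude simply because most people live there.
print(f"share of suicides at low altitude: {share_low:.0%}")
```

With these assumed populations, over half of all suicides fall in the low-altitude band despite its per-capita rate being the lowest, which is the pattern the review describes.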

Why would altitude affect suicide rates? Dr. Kious and coauthors suggest the answer might be "chronic hypobaric hypoxia": low blood oxygen related to low atmospheric pressure. That theory is supported by studies in animals and short-term studies in humans. The authors suggest two pathways by which hypobaric hypoxia might increase the risks of suicide and depression: by altering the metabolism of the neurotransmitter serotonin and/or through its effects on brain bioenergetics.

If borne out by future studies, these mechanisms suggest some possible treatments to mitigate the effects of altitude on depression and suicide risk: supplemental 5-hydroxytryptophan (a serotonin precursor) to increase serotonin levels, or creatine to support brain bioenergetics. Dr. Kious and colleagues identify several areas in need of further research, including the effects of prolonged exposure to altitude on both serotonin metabolism and brain bioenergetics.

Credit: 
Wolters Kluwer Health

NASA's Webb Telescope to make a splash in search for interstellar water

image: Blue light from a newborn star lights up the reflection nebula IC 2631. This nebula is part of the Chamaeleon star-forming region, which Webb will study to learn more about the formation of water and other cosmic ices.

Image: 
European Southern Observatory (ESO)

Water is crucial for life, but how do you make water? Cooking up some H2O takes more than mixing hydrogen and oxygen. It requires the special conditions found deep within frigid molecular clouds, where dust shields against destructive ultraviolet light and aids chemical reactions. NASA's James Webb Space Telescope will peer into these cosmic reservoirs to gain new insights into the origin and evolution of water and other key building blocks for habitable planets.

A molecular cloud is an interstellar cloud of dust, gas, and a variety of molecules ranging from molecular hydrogen (H2) to complex, carbon-containing organics. Molecular clouds hold most of the water in the universe, and serve as nurseries for newborn stars and their planets.

Within these clouds, on the surfaces of tiny dust grains, hydrogen atoms link with oxygen to form water. Carbon joins with hydrogen to make methane. Nitrogen bonds with hydrogen to create ammonia. All of these molecules stick to the surface of dust specks, accumulating icy layers over millions of years. The result is a vast collection of "snowflakes" that are swept up by infant planets, delivering materials needed for life as we know it.

"If we can understand the chemical complexity of these ices in the molecular cloud, and how they evolve during the formation of a star and its planets, then we can assess whether the building blocks of life should exist in every star system," said Melissa McClure of the Universiteit van Amsterdam, the principal investigator on a research project to investigate cosmic ices.

In order to understand these processes, one of Webb's Director's Discretionary Early Release Science projects will examine a nearby star-forming region to determine which ices are present where. "We plan to use a variety of Webb's instrument modes and capabilities, not only to investigate this one region, but also to learn how best to study cosmic ices with Webb," said Klaus Pontoppidan of the Space Telescope Science Institute (STScI), an investigator on McClure's project. This project will take advantage of Webb's high-resolution spectrographs to get the most sensitive and precise observations at wavelengths that specifically measure ices. Webb's spectrographs, NIRSpec and MIRI, will provide up to five times better precision than any previous space telescope at near- and mid-infrared wavelengths.

Infant stars and comet cradles

The team, led by McClure and co-principal investigators Adwin Boogert (University of Hawaii) and Harold Linnartz (Universiteit Leiden), plans to target the Chamaeleon Complex, a star-forming region visible in the southern sky. It's located about 500 light-years from Earth and contains several hundred protostars, the oldest of which are about 1 million years old. "This region has a bit of everything we're looking for," said Pontoppidan.

The team will use Webb's sensitive infrared detectors to observe stars behind the molecular cloud. As light from those faint, background stars passes through the cloud, ices in the cloud will absorb some of the light. By observing many background stars spread across the sky, astronomers can map ices within the cloud's entire expanse and locate where different ices form. They will also target individual protostars within the cloud itself to learn how ultraviolet light from these nascent stars promotes the creation of more complex molecules.

Astronomers also will examine the birthplaces of planets, rotating disks of gas and dust known as protoplanetary disks that surround newly formed stars. They will be able to measure the amounts and relative abundances of ices as close as 5 billion miles from the infant star, which is about the orbital distance of Pluto in our solar system.

"Comets have been described as dusty snowballs. At least some of the water in Earth's oceans likely was delivered by the impacts of comets early in our solar system's history. We'll be looking at the places where comets form around other stars," explained Pontoppidan.

Laboratory experiments

In order to understand Webb's observations, scientists will need to conduct experiments on Earth. Webb's spectrographs will spread incoming infrared light into a rainbow spectrum. Different molecules absorb light at certain wavelengths, or colors, resulting in dark spectral lines. Laboratories can measure a variety of substances to create a database of molecular "fingerprints." When astronomers see those fingerprints in a spectrum from Webb, they can then identify the molecule or family of molecules that created the absorption lines.
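The fingerprint-matching step described above can be illustrated with a toy version. The wavelengths below are rough values for well-known ice absorption features, used only as placeholders; a real laboratory database would contain many lines per molecule with measured band shapes.

```python
# Toy sketch of spectral "fingerprint" matching: compare absorption features
# observed in a spectrum against a small database of lab-measured ice lines.
# Wavelengths are in microns; values are rough illustrative placeholders.

lab_fingerprints = {
    "water ice": [3.05],   # O-H stretch band near 3 microns
    "CO2 ice":   [4.27],   # CO2 stretch band (illustrative value)
    "methane":   [7.7],    # CH4 deformation band (illustrative value)
}

def identify(observed_lines, tolerance=0.1):
    """Return molecules whose lab lines all match some observed line."""
    matches = []
    for molecule, lines in lab_fingerprints.items():
        if all(any(abs(obs - line) <= tolerance for obs in observed_lines)
               for line in lines):
            matches.append(molecule)
    return matches
```

For example, a spectrum with dark features near 3.04 and 4.30 microns would flag water and CO2 ices but not methane, which is the kind of inference the lab database enables at scale.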

"Laboratory studies will help address two key questions. The first is what molecules are present. But just as important, we'll look at how the ices got there. How did they form? What we find with Webb will help inform our models and allow us to understand the mechanisms for ice formation at very low temperatures," explained Karin Öberg of the Harvard-Smithsonian Center for Astrophysics, an investigator on the project.

"It will take years to fully mine the data that comes out of Webb," Öberg added.

The James Webb Space Telescope will be the world's premier infrared space observatory of the next decade. Webb will help humanity solve the mysteries of our solar system, look beyond to distant worlds around other stars, and probe the mysterious structures and origins of our universe and our place in it. Webb is an international project led by NASA with its partners, ESA (European Space Agency) and CSA (Canadian Space Agency).

Credit: 
NASA/Goddard Space Flight Center

Burn specialists report a dramatic increase in burn injury survival over the past 30 years

CHICAGO (March 9, 2018): For many years, people who sustained severe burn injuries often died. But great strides in burn care over the last 30 years have dramatically increased their chances of survival, according to new study findings published as an "article in press" on the Journal of the American College of Surgeons website ahead of print publication.

"Mortality has decreased three to fivefold since the 1980s, ostensibly from the substantial advances in burn care that occurred between 1980 and 1989," said lead study author David N. Herndon, MD, FACS, chief of staff and director of research at the Shriners Hospitals for Children, Galveston, and director of burn services at the University of Texas Medical Branch (UTMB). "Yet, until now, there has never been a definitive study showing the cumulative effect of these advances on survival."

Burns are one of the leading causes of unintentional death and injury in the U.S., according to the American Burn Association. Very large burns--those that cover 50 percent or more of the body's surface area--put people at high risk of infection and death. In addition to burn size, old age, female gender, and damage to lungs due to the inhalation of smoke put people at greater risk of death.

This is the most definitive report of the role advanced burn treatment has played in reducing risk of death, the authors said. Dr. Herndon and colleagues examined the records of 10,384 adult and pediatric burn patients admitted to Shriners Hospitals for Children®, Galveston, or the Blocker Burn Unit in Galveston from 1989 to 2017. Over this time period, protocols directly derived from these advances were used to guide care of these patients.

The researchers applied multivariate regression analysis to create a statistical profile of their burn patients and to identify the main factors associated with mortality. Factors such as age, sex, burn size, and whether the patient suffered smoke inhalation injury (damage to the airways) were recorded at admission, along with each patient's length of stay.

Of the 10,384 burn admissions, a total of 355 victims died. The researchers looked specifically at the main factors that influenced risk of death in different age groups and then created a risk prediction model.

Using mortality data from the medical literature, as well as data from the National Burn Repository, the researchers compared historical predictions of mortality risk with their observed patient data. They found a significant decrease in mortality in their patient population compared with historical predictions from previous studies.

"In this one area of medicine, these new protocols have massively reduced mortality overall," Dr. Herndon said. "Over the last 30 years at our burn center there has been a continuing reduction in the risk of mortality of about 2 percent per year in all age groups, burn sizes, and genders."

The study also identified the most powerful predictors of mortality: the percent of total body surface burned, age, and the presence of inhalation injury. The probability of death rose as age increased, as burn size increased, and with the presence of inhalation injury.
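The kind of risk-prediction model described above can be sketched as a logistic regression on the three strongest predictors. The coefficients below are invented for illustration only; the study's actual model is not given in this release.

```python
# Minimal sketch of a burn-mortality risk model of the type described:
# logistic regression on age, burn size (% total body surface area), and
# inhalation injury. All coefficients are hypothetical placeholders.
import math

def predicted_mortality(age, tbsa_pct, inhalation_injury):
    """Return an illustrative probability of death for a burn patient."""
    # Hypothetical log-odds: risk rises with age, burn size, and inhalation injury.
    z = -6.0 + 0.04 * age + 0.06 * tbsa_pct + (1.5 if inhalation_injury else 0.0)
    return 1.0 / (1.0 + math.exp(-z))

# Larger burns, older age, and inhalation injury each raise predicted risk.
low  = predicted_mortality(age=25, tbsa_pct=20, inhalation_injury=False)
high = predicted_mortality(age=65, tbsa_pct=60, inhalation_injury=True)
```

A model of this shape reproduces the qualitative pattern the authors report: predicted probability of death climbs with age and burn size, and jumps in the presence of inhalation injury.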

The data suggest that the continuous improvement in mortality over time is a result of changes in the standard of care, including protocols for management of inhalation injury; nutrition to combat infection and aid in healing; and receiving early burn excision and skin grafts immediately following injury.

"The most dramatic decreases in mortality most recently have been in patients over age 40," Dr. Herndon said. "Remarkably, a patient up to the age of 40 who has sustained a 95 percent body burn now survives half the time, whereas in earlier times a 50 percent body burn killed that same person."

Other factors not assessed in the study that have contributed to better outcomes in burn patients include improvements in the transfer of critically ill patients to hospitals and burn centers.

"We hope our findings will inspire other burn units to try to keep people alive with extensive burns because it's clear that it can be done. Burn specialists also need to focus on implementing the protocols that have allowed this improvement in survival to occur," Dr. Herndon said. "For example, a woman over the age of 40, with very large burns, is a patient who can survive today if these protocols are implemented."

Beyond the effort to reduce mortality rates in burn victims, researchers hope to concentrate on better treatment strategies to improve quality of life. "Our priorities for future advancements need to focus on decreasing scar tissue and morbidity, effective rehabilitation, and returning patients to work," Dr. Herndon said.

Credit: 
American College of Surgeons

West Coast waters returning to normal but salmon catches lagging

image: Fish school around a drill rig off Southern California. A new report says West Coast waters are returning to normal after warm temperatures shook up the food web.

Image: 
Adam Obaza/West Coast Region/NOAA Fisheries

Ocean conditions off most of the U.S. West Coast are returning roughly to average, after an extreme marine heat wave from about 2014 to 2016 disrupted the California Current Ecosystem and shifted many species beyond their traditional range, according to a new report from NOAA Fisheries' two marine laboratories on the West Coast. Some warm waters remain off the Pacific Northwest, however.

The Southwest Fisheries Science Center and Northwest Fisheries Science Center presented their annual "California Current Ecosystem Status Report" to the Pacific Fishery Management Council at the Council's meeting in Rohnert Park, Calif., on Friday, March 9. The California Current encompasses the entire West Coast marine ecosystem, and the report informs the Council about conditions and trends in the ecosystem that may affect marine species and fishing in the coming year.

"The report gives us an important glimpse at what the science is saying about the species and resources that we manage and rely on in terms of our West Coast economy," said Phil Anderson of Westport, Wash., the Council Chair. "The point is that we want to be as informed as we can be when we make decisions that affect those species, and this report helps us do that."

Unusually warm ocean temperatures, referred to as "the Blob," encompassed much of the West Coast beginning about 2014, combining with an especially strong El Niño pattern in 2015. The warm conditions have now waned, although some after-effects remain.

Feeding conditions have improved for California sea lions and seabirds that experienced mass die-offs caused by shifts in their prey during the Blob.

Plankton species, the foundation of the marine food web, have shifted back slightly toward fat-rich, cool-water species that improve the growth and survival of salmon and other fish.

Recent research surveys have found fewer juvenile salmon, and consequently adult salmon returns will likely remain depressed for a few years until successive generations benefit from improving ocean conditions.

Reports of whale entanglements in fishing gear have remained very high for the fourth straight year, as whales followed prey to inshore areas and ran into fishing gear such as pots and traps.

Severe low-oxygen conditions in the ocean water spanned the Oregon Coast from July to September 2017, causing die-offs of crabs and other species.

Even as the effects of the Blob and El Niño dissipate, the central and southern parts of the West Coast face low snow pack and potential drought in 2018 that could put salmon at continued risk as they migrate back up rivers to spawn.

"Overall we're seeing some positive signs, as the ocean returns to a cooler and generally more productive state," said Toby Garfield, a research scientist and Acting Director of the Southwest Fisheries Science Center. "We're fortunate that we have the data from previous years to help us understand what the trends are, and how that matters to West Coast fishermen and communities."

NOAA Fisheries' scientists compile the California Current Ecosystem Status Report from ocean surveys and other monitoring efforts along the West Coast. The tracking revealed "a climate system still in transition in 2017," as surface ocean conditions return to near normal. Deeper water remained unusually warm, especially in the northern part of the California Current. Warm-water species, such as leaner plankton species often associated with subtropical waters, have lingered in these more-northern zones.

One of the largest and most extensive low-oxygen zones ever recorded off the West Coast prevailed off the Oregon Coast last summer, probably driven by low-oxygen water upwelled from the deep ocean, the report said.

While the cooling conditions off the West Coast began to support more cold-water plankton rich in the fatty acids that salmon need to grow, salmon may need more time to show the benefits, the report said. Juvenile salmon sampled off the Northwest Coast in 2017 were especially small and scarce, suggesting that poor feeding conditions off the Columbia River Estuary may remain.

Juvenile salmon that enter the ocean this year amid the gradually improving conditions will not return from the ocean to spawn in the Columbia and other rivers for another two years or more, so fishermen should not expect adult salmon numbers to improve much until then.

"These changes occur gradually, and the effects appear only with time," said Chris Harvey, a fisheries biologist at the Northwest Fisheries Science Center and coauthor of the report. "The advantage of doing this monitoring and watching these indicators is that we can get a sense of what is likely to happen in the ecosystem and how that is likely to affect communities and economies that are closely tied to these waters."

Credit: 
NOAA Fisheries West Coast Region

Blood donors' leftover immune cells reveal secrets of antibody affinity

During some kinds of blood donations, you get most of your blood back. For example, platelet donation involves a procedure in which donor blood is filtered to harvest the platelets for medical use and the rest of the blood components are returned to the donor's body. The byproducts of this procedure - a fraction of immune cells - are typically discarded.

Researchers at Iowa State University, partnering with the LifeServe Blood Center, have used these leftover blood donor cells to gain crucial insights into how natural killer (NK) cells circulating in the human body differ from those typically studied in the lab. The results of this research are published in the March 9 issue of the Journal of Biological Chemistry.

Adam Barb, an associate professor of biochemistry at Iowa State, studies the receptor CD16a, which is found on the surface of natural killer cells and binds to the antibody immunoglobulin G (IgG). IgG is the most common antibody produced by the human body to coat the surfaces of pathogens or tumors and signal their destruction by natural killer cells. IgG is used as the basis of most antibody immunotherapies, for example against cancer.

How effectively natural killer cells can destroy their targets depends on how tightly the receptor binds to the antibody. Barb's team had previously found that the extent of this attraction, or affinity, depended on the types and amounts of carbohydrates attached to the antibody. In the new study, they set out to find how carbohydrate modifications of the receptor in humans affected the antibody-receptor binding affinity.

"We know that (receptors) can be expressed by the natural killer cell in thousands to millions of different forms," Barb said. "This is because the molecule is coated with complex carbohydrates, like a sugar coat, that can be highly variable."

Because NK cells are found at a low concentration in human blood, researchers who study these receptors typically insert the gene encoding the receptor into cells that can be grown in culture in the lab, an approach called recombinant expression. But it was not clear whether the conditions in cell culture would result in the same carbohydrate modifications to the receptor that occur in the human body.

"All of the work that had been done at that time...was studied with recombinant material, not from primary sources," Barb said. "People had assumed, with respect to this receptor, that the mammalian (cells) used for the recombinant expression would provide the correct types of carbohydrates."

In order to harvest the receptors from the source, Barb's team turned to a nearby blood bank that performed platelet apheresis, because they knew that a fraction of white blood cells were discarded as part of the filtering procedure.

"When the donor is disconnected from the machine, they don't get those (lymphocytes) back, and that filter is usually just thrown away," Barb said. "So basically (they're) concentrating lymphocytes, including natural killer cells, which is exactly what we want, in these filters."

Barb's team obtained these filters from the blood bank and isolated the natural killer cells. They then examined the carbohydrate modifications of receptors from donors' natural killer cells and how these modifications affected binding to antibodies. They found that the carbohydrate modifications in the donors' receptors were much less elaborate than those from recombinant receptors, resulting in higher affinity.

"There was much less (carbohydrate) processing that the NK cells did in comparison to any of the forms that were expressed in these recombinant systems," Barb said. "And as a result of that, the affinity for antibody appears to be higher in natural killer cells than it would be in a receptor that was expressed from recombinant systems. Smaller carbohydrates appear to make for tighter binding interactions."

The study was carried out on natural killer cell samples from donors that were of similar age, sex and blood type, raising the question of how the receptor's carbohydrate modifications may vary in natural populations.

"There appeared to be some degree of variability between donors," Barb said. "(But) how does that change throughout the lifetime, how does that change in response to infection? All of those questions are absolutely things that we would very much like to investigate very specifically."

The results suggest that finding ways to influence the carbohydrate modifications of these receptors could be a way to fine-tune antibody-receptor interactions in the context of antibody therapies.

The work was funded by the National Institutes of Health.

Credit: 
American Society for Biochemistry and Molecular Biology