
Smoking cessation strategies targeting stress reduction may be more successful in women

image: The status/post smartphone app (Infinite Arms, Charleston SC), which is integrated with the research tool REDCap, was used to collect data from smokers in this study.

Image: 
Sarah Pack, Medical University of South Carolina

Gender matters when it comes to smoking cessation.

Women are 31 percent less likely than men to quit smoking successfully, according to the National Institute on Drug Abuse, in part because nicotine replacement therapy is thought to be more effective in male smokers. Laboratory-based studies also suggest that women crave cigarettes more when they experience stress, but that finding has not been clearly replicated in a real-world setting.

In an article published online by Nicotine & Tobacco Research, researchers at the Medical University of South Carolina (MUSC) report the findings of a real-world study in 177 smokers. In the study, female smokers experienced more stress and craving than male smokers after viewing stress cues. Stress cues are images that induce stress, similar to news images of violence or war. However, no gender differences in craving were noted after viewing cellphone-delivered smoking cues. Smoking cues are images that suggest smoking behavior, such as a photograph of a cigarette or a person smoking.

These findings suggest that improving quit outcomes in women may require gender-specific cessation strategies.

"We know that not all existing treatments are equally effective for men and women," says Rachel L. Tomko, Ph.D., assistant professor in the Department of Psychiatry and Behavioral Sciences at MUSC and first author on the article. "That could be because they find different aspects of smoking rewarding and relieving, and there are different things that maintain their smoking. Our findings suggest that stress may be one thing that maintains smoking more for women than for men."

"This research helps us understand what drives smoking behavior and what really may create barriers to treatment that we didn't think were there," explains Kevin M. Gray, M.D., professor in the Department of Psychiatry and Behavioral Sciences at MUSC and senior author on the article. "If smoking were all about the nicotine, then everyone would respond beautifully to nicotine replacement therapy. But it's more nuanced and complex than that. The better we can get at it, the better we will be able to create the right kinds of treatment for each individual."

Participants in the real-world study viewed eight images each day (four sets of two) for two weeks. These included smoking cues, stress cues and neutral images. Each time they received a pair of images, they completed a form assessing their stress, negative emotion and craving levels before viewing the images (their baseline value) and after viewing each image. They also tracked the number of cigarettes they smoked each day.

These data were recorded via a smartphone app (status/post; Infinite Arms; Charleston, SC) that integrates with the research tool REDCap (Vanderbilt University), and these REDCap data were hosted by the South Carolina Clinical and Translational Institute, an NIH Clinical and Translational Science Awards Program Hub.

As already noted, female smokers reported experiencing more stress, negative emotion and craving after viewing stress cues, but not smoking cues, than male smokers. Regardless of gender, smokers with higher baseline levels experienced more stress, negative emotion and craving after viewing stress cues. Because women smoke more in response to stress and environmental triggers, their smoking patterns could be expected to vary more than men's. However, the MUSC team found no difference in the number of cigarettes smoked per day for male and female smokers.

"Fortunately, showing smokers stress and smoking cues did not result in an overall increase in cigarettes smoked," says Tomko. "This is likely because smokers are already exposed to similar images on a daily basis. However, it is surprising that women did not have more day-to-day fluctuations in their number of cigarettes than men. It is possible that minor, everyday stressors result in women smoking a cigarette a bit sooner than they would have otherwise but do not impact the overall rate of smoking. We hope to test this in future research."

With other colleagues at MUSC, Gray and Tomko plan to analyze the daily hormone level data collected during the study to explore how hormones affect stress and smoking. Using a special lighter that can record time, they will also conduct studies to see how long it takes for different smokers to light up after experiencing stress. This could, for example, provide more evidence that stress leads to smoking in women. More broadly, they will continue to map out the gender and other differences that affect how smokers respond to treatment and use that knowledge to better craft cessation therapies.

"The really good news, and we can say this both as clinicians and researchers, is that we have effective treatments for smoking cessation," says Gray. "The challenging news is that, even with effective treatments, most smokers who try still struggle to quit smoking. We can try to make improvements by using the blunt instrument of a bigger, better treatment for everybody. However, I think we should also try to think about what is different between individuals, either gender or other characteristics, and whether those differences help us better tailor our treatments."

Credit: 
Medical University of South Carolina

First confirmed cases of rabbit virus found in UK hares

Collaborative research led by the University of East Anglia has identified one of the causes of recent deaths in UK European brown hare populations.
Working together with diagnostic laboratories in England, Scotland and Germany, researchers detected the first UK cases of rabbit haemorrhagic disease virus type 2 (RHDV2) in dead hares found in two locations - Essex and Dorset.
Researchers from UEA joined forces with Suffolk, Norfolk and Essex Wildlife Trusts, the Department for Environment, Food and Rural Affairs (DEFRA) and the APHA Surveillance Intelligence Unit to investigate the cause of hare deaths following reports of sick and dead hares from members of the public.
Lead researcher Dr Diana Bell, from UEA's School of Biological Sciences, said: "RHDV2 normally affects rabbits, but the disease is known to have jumped to European brown hares in Italy, Spain, France and Australia.

"This is the first time that RHDV2 has been found in hares in the UK.

"RHDV2 is one of several pathogens we are finding in dead hares and it is too early to say which is currently the primary cause of the hare die-off. We are continuing to investigate other causes for the deaths."

Nationally, brown hares have experienced a decline of more than 80 per cent over the past century due to changes in agricultural practice. The intensification of agriculture has limited their supply of food and habitat.

But concerns about new diseases were raised after landowners, farmers and other members of the public started reporting sightings of obviously sick and dead hares in September 2018.

Members of the public were urged to photograph sick and dying hares and, most importantly, collect the bodies for autopsy so that the impact of new and existing diseases on hare populations could be determined.

Dr Bell said: "We are enormously grateful for the continuing tremendous response from the British public in reporting dead hares to us and helping us collect them for post mortems. This is a good example of citizen science.

"Hare deaths are still being reported to us and we are still collecting the bodies to test for RHDV2 and other pathogens that could be contributing to the decline."

Hares can be distinguished from rabbits in a number of ways. Hares are larger than rabbits, with longer hind legs and black-tipped ears that are at least as long as their heads.

"It's still too early to say which diseases are most common at the moment but the expanding dataset will allow us to map reported mortalities over time."

'First cases of rabbit haemorrhagic disease virus type 2 (RHDV2) confirmed in European brown hares (Lepus europaeus) in the UK' is published in Vet Record on Friday, January 25, 2019.

The research team are continuing to collect dead hares for post mortem. If you find a freshly dead hare, please report it to Dr Bell by emailing d.bell@uea.ac.uk.

Credit: 
University of East Anglia

Do economic conditions affect pregnancy outcomes?

Economic downturn during early pregnancy was linked with modest increases in preterm birth in a Paediatric and Perinatal Epidemiology analysis.

For the analysis, researchers examined a dataset of all singleton births in Michigan from 1990 to 2012. Each one percentage point increase in state unemployment during the first trimester of pregnancy was associated with a modest 3% increase in the odds of preterm birth.
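Because odds and probabilities are easily conflated, the sketch below illustrates what a 3% increase in odds means for the underlying risk. The 10% baseline preterm birth rate used here is an assumed figure for illustration, not a number from the study.

```python
# Illustrative arithmetic only: translating a 3% increase in the odds of
# preterm birth into a change in probability. The baseline rate below is
# an assumption for the example, not a figure reported in the analysis.

def odds(p):
    """Convert a probability to odds."""
    return p / (1 - p)

def prob_from_odds(o):
    """Convert odds back to a probability."""
    return o / (1 + o)

baseline_p = 0.10                    # assumed baseline preterm birth rate
new_odds = odds(baseline_p) * 1.03   # odds ratio of 1.03 per 1-point rise
new_p = prob_from_odds(new_odds)

print(round(new_p, 4))
```

Under this assumed baseline, a 3% rise in odds corresponds to only a small absolute increase in risk, which is why the authors describe the association as modest.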

"Rates of preterm birth in the U.S. are higher than other developed nations, and we don't really know why. Our research indicates a need for increased attention by clinicians and the public health community to the potential role of broader socioeconomic factors--in this case, the economy--to pregnancy health," said lead author Dr. Claire E. Margerison, of Michigan State University.

Credit: 
Wiley

Effective method for reducing hospital stay after 'whipple' operation

PHILADELPHIA - Pancreaticoduodenectomy, or the Whipple operation, is one of the most complex abdominal surgeries and is commonly recommended as first-line therapy for cancer located within the pancreatic head. It remains the most effective treatment associated with prolonged survival. The surgery involves removal of parts of the pancreas, bile duct, and small intestine, requiring careful reconstruction of the organs involved. Clinicians at Jefferson have now shown that an accelerated post-operative care pathway can reduce hospital stay and shorten the time to eligibility for adjuvant chemotherapy. The prospective, randomized, controlled study was published in the Journal of the American College of Surgeons.

"The trial was so successful that we were able to halt the study early and change our standard practice to providing this accelerated post-operative care to all eligible patients," said Harish Lavu, MD, Associate Professor of Surgery at Jefferson (Philadelphia University + Thomas Jefferson University) and researcher with the NCI-Designated Sidney Kimmel Cancer Center - Jefferson Health.

The study authors, led by first author Harish Lavu, MD, and senior author Charles J. Yeo, MD, the Samuel D. Gross Professor and Chair of Surgery at Jefferson, analyzed 76 pancreaticoduodenectomy patients who had a low to moderate risk of complications. They compared the standard 7-day pathway for recovery and discharge to one that took only five days to complete. The 5-day pathway included early discharge planning, a shortened stay in the ICU, modified diet and drain management, rigorous physical therapy with in-hospital gym visits, and follow-up via telehealth after discharge.

"Our accelerated recovery pathway incorporates the latest in recovery science by ensuring patients get mobile shortly after surgery, which has been shown to improve outcomes," said Dr. Yeo. "We also assign experienced recovery nurse practitioners to follow-up care via telehealth, which has been shown to reduce unnecessary hospital readmissions. The results of this study validate much of what the field is beginning to view as best practice and it's exciting to be able to define a more effective pathway to better care for patients."

The 5-day Whipple accelerated recovery pathway (WARP) reduced length of stay without significantly increasing complication rates. Under the WARP protocol, 76 percent of patients in the 5-day group were ready for discharge by day 5, compared with only 13 percent of the 7-day group.

Perhaps most significantly, reducing recovery time means that patients with pancreatic cancer can transition more quickly to the next phase of treatment. On average, the shorter stay was associated with reducing time to adjuvant therapy by 15 days (51 days with 5-day, versus 66 days with 7-day recovery).

Credit: 
Thomas Jefferson University

Sci-fi to reality: Superpowered salamander may hold the key to human regeneration

video: Scientists at the University of Kentucky have assembled the genome of the axolotl -- the first step towards unlocking the secrets of regeneration with enormous clinical implications down the road.

Image: 
University of Kentucky

LEXINGTON, Ky. (Jan. 24, 2019) -- Regeneration is one of the most enticing areas of biological research. How are some animals able to regrow body parts? Is it possible that humans could do the same? If scientists could unlock the secrets that confer those animals with this remarkable ability, the knowledge could have profound significance in clinical practice down the road.

Scientists at the University of Kentucky have taken this fantasy one step closer to reality, announcing today that they have assembled the genome of the axolotl, a salamander whose only native habitat is a lake near Mexico City.

Axolotls have long been prized as models for regeneration, said Randal Voss, a professor in the UK Spinal Cord and Brain Injury Research Center and a co-PI on the project.

"It's hard to find a body part they can't regenerate: the limbs, the tail, the spinal cord, the eye, and in some species the lens. Even half of their brain has been shown to regenerate," he said.

Though humans share many of the same genes with axolotl, the salamander genome is ten times larger, posing a formidable barrier to genetic analyses.

According to Jeramiah Smith, an associate professor in the UK Department of Biology and Voss' co-PI, recent efforts have provided much of the genetic data for the axolotl. But, like a pile of puzzle pieces, the data must be assembled in the correct order before scientists can attempt large-scale analyses of genome structure and function, which are key to teasing out the mechanisms that bestow upon axolotls their magical powers.

While the massive undertaking to map the human genome provided scientists with the tools to reproduce data in other organisms, the remarkable computational burden posed by organisms with larger genomes made such efforts largely impossible. But Smith and Voss cleverly adapted a classical genetic approach called linkage mapping to put the axolotl genome together in the correct order quickly and efficiently -- the first genome of this size to be assembled to date.

"Just a few years ago, no one thought it possible to assemble a 30+ Gb genome," said Smith. "We have now shown it is possible using a cost-effective and accessible method, which opens up the possibility of routinely sequencing other animals with large genomes."

As proof of concept, Voss and Smith used the assembled data to rapidly identify a gene that causes a heart defect in an axolotl, thus providing a new model of human disease.

"Biomedical research is increasingly becoming a genetically-driven enterprise," said Voss. "To understand human disease, you have to be able to study gene functions in other organisms like the axolotl."

"Now that we have access to genomic information, we can really start to probe axolotl gene functions and learn how they are able to regenerate body parts. Hopefully someday we can translate this information to human therapy, with potential applications for spinal cord injury, stroke, joint repair...the sky's the limit, really."

The University of Kentucky hosts the only federally-funded axolotl stock center in the U.S., providing axolotls to researchers and educators worldwide. Having a complete genome sequence for the laboratory axolotl greatly increases the value of this resource for biomedical research, particularly since wild axolotls have been designated critically endangered since 2006. According to Voss, UK has almost 1,000 adult axolotls, a laboratory population whose pedigree dates back to the 1800s.

Voss' and Smith's data will be published in the February issue of Genome Research.

Credit: 
University of Kentucky

PopPUNK advances speed of bacterial pathogen surveillance

image: Novel computational tool, PopPUNK, enables rapid identification of bacterial pathogens for research and public health applications.

Image: 
Lees <em>et al</em>. 2019

Jan 24, 2019 -- Differences in genetic diversity among bacterial pathogens correlate with clinically important factors, such as virulence and antimicrobial resistance, prompting the need to identify clusters of similar bacterial strains. However, current bacterial clustering and typing approaches are not suitable for real-time pathogen surveillance and outbreak detection.

In a study published today in Genome Research, researchers developed PopPUNK (Population Partitioning Using Nucleotide K-mers), a computational tool for analyzing tens of thousands of bacterial genomes in a single run, up to 200-fold faster than previous methods. Using k-mers (short DNA sequences of length k), the software enables rapid estimation of the proportion of k-mers present in one genome that are also shared by another. Differences in k-mer content between genomes may represent changes to individual bases in otherwise similar stretches of DNA or differences in gene content. By calculating these relationships across isolates, the population structure of a species can be efficiently estimated.
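The core quantity described above can be sketched in a few lines. This is an illustrative toy version only, not PopPUNK's actual implementation, which uses sketching techniques to make the comparison fast at genome scale:

```python
# Toy sketch of the shared-k-mer idea behind PopPUNK (not its real code):
# estimate the proportion of one sequence's k-mers that also occur in another.

def kmers(seq, k):
    """Return the set of all length-k substrings of seq."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def shared_kmer_proportion(seq_a, seq_b, k):
    """Proportion of seq_a's distinct k-mers that also occur in seq_b."""
    a, b = kmers(seq_a, k), kmers(seq_b, k)
    if not a:
        return 0.0
    return len(a & b) / len(a)

# Two toy "genomes" differing by a single base substitution
g1 = "ACGTACGTGGAACGT"
g2 = "ACGTACGTGGTACGT"
print(round(shared_kmer_proportion(g1, g2, 4), 2))
```

Even a single substitution disrupts every k-mer overlapping it, which is why comparing k-mer content across many values of k lets the method distinguish point mutations from differences in gene content.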

Importantly, PopPUNK applies a machine learning method that enables easy identification of emerging strains in a population. Using a previously published data set of E. coli isolates collected over a ten-year study, PopPUNK was able to efficiently track the prevalence of different strains in the population each year and identify the emergence of antibiotic-resistant strains over time.

Researchers envision PopPUNK will expedite the identification of bacterial strains as the scale of bacterial genomes being sequenced increases and, importantly, allow public health agencies to quickly identify outbreak strains that pose a public health risk.

Credit: 
Cold Spring Harbor Laboratory Press

People behave differently in virtual reality than they do in real life

video: Inside the headset, participants were exposed to yawning while a virtual avatar kept watch.

Image: 
Nicola Anderson/UBC

Immersive virtual reality (VR) can be remarkably lifelike, but new UBC research has found a yawning gap between how people respond psychologically in VR and how they respond in real life.

"People expect VR experiences to mimic actual reality and thus induce similar forms of thought and behaviour," said Alan Kingstone, a professor in UBC's department of psychology and the study's senior author. "This study shows that there's a big separation between being in the real world, and being in a VR world."

The study used virtual reality to examine factors that influence yawning, focusing specifically on contagious yawning. Contagious yawning is a well-documented phenomenon in which people--and some non-human animals--yawn reflexively when they detect a yawn nearby.

Research has shown that "social presence" deters contagious yawning. When people believe they are being watched, they yawn less, or at least resist the urge. This may be due to the stigma of yawning in social settings, or its perception in many cultures as a sign of boredom or rudeness.

The team from UBC, along with Andrew Gallup from State University of New York Polytechnic Institute, tried to bring about contagious yawning in a VR environment. They had test subjects wear an immersive headset and exposed them to videos of people yawning. In those conditions, the rate of contagious yawning was 38 per cent, which is in line with the typical real-life rate of 30-60 per cent.

However, when the researchers introduced social presence in the virtual environment, they were surprised to find it had little effect on subjects' yawning. Subjects yawned at the same rate, even while being watched by a virtual human avatar or a virtual webcam. It was an interesting paradox: stimuli that trigger contagious yawns in real life did the same in virtual reality, but stimuli that suppress yawns in real life did not.

The presence of an actual person in the testing room had a more significant effect on yawning than anything in the VR environment. Even though subjects couldn't see or hear their company, simply knowing a researcher was present was enough to diminish their yawning. Social cues in actual reality appeared to dominate and supersede those in virtual reality.

Virtual reality has caught on as a research tool in psychology and other fields, but these findings show that researchers may need to account for its limitations.

"Using VR to examine how people think and behave in real life may very well lead to conclusions that are fundamentally wrong. This has profound implications for people who hope to use VR to make accurate projections regarding future behaviours," said Kingstone. "For example, predicting how pedestrians will behave when walking amongst driverless cars, or the decisions that pilots will make in an emergency situation. Experiences in VR may be a poor proxy for real life."

If the gap between VR and real life could be closed, scientists would be able to examine the link between the brain, behaviour, and the human experience in both actual reality and altered realities that span place and time, Kingstone added.

The study was published Jan. 22 in Scientific Reports.

Credit: 
University of British Columbia

Zinc deficiency may play a role in high blood pressure

Rockville, Md. (January 24, 2019)--Lower-than-normal zinc levels may contribute to high blood pressure (hypertension) by altering the way the kidneys handle sodium. The study is published ahead of print in the American Journal of Physiology--Renal Physiology.

Zinc deficiency is common in people with chronic illnesses such as type 2 diabetes and chronic kidney disease. People with low zinc levels are also at a higher risk for hypertension. The way in which the kidneys either excrete sodium into the urine or reabsorb it into the body--specifically through a pathway called the sodium chloride cotransporter (NCC)--also plays a role in blood pressure control. Less sodium in the urine typically corresponds with higher blood pressure. Recent research has suggested that zinc may help regulate proteins that in turn regulate the NCC, but a direct link between zinc deficiency and hypertension through this pathway has not been examined.

Researchers compared male mice with zinc deficiency to healthy controls with normal zinc levels. The zinc-deficient mice developed high blood pressure and a corresponding decrease in urinary sodium excretion. The control group did not experience the same changes. A small group of the zinc-deficient mice were fed a zinc-rich diet partway through the study. Once the animals' zinc reached adequate levels, blood pressure began to drop and urinary sodium levels increased. "These significant findings demonstrate that enhanced renal [sodium] reabsorption plays a critical role in [zinc-deficiency]-induced hypertension," the research team wrote.

"Understanding the specific mechanisms by which [zinc deficiency] contributes to [blood pressure] dysregulation may have an important effect on the treatment of hypertension in chronic disease settings," the researchers added.

Credit: 
American Physiological Society

Investigational monoclonal antibody to treat Ebola is safe in adults

image: May 21, 2018 - A healthy volunteer receives an intravenous infusion of mAb114--an experimental treatment for Ebola virus disease--in a Phase 1 clinical trial held at the NIH Clinical Center in Bethesda, Md.

Image: 
NIAID

The investigational Ebola treatment mAb114 is safe, well-tolerated, and easy to administer, according to findings from an early-stage clinical trial published in The Lancet. Eighteen healthy adults received the monoclonal antibody as part of a Phase 1 clinical trial that began in May 2018 at the National Institutes of Health (NIH) Clinical Center in Bethesda, Maryland. The National Institute of Allergy and Infectious Diseases (NIAID) Vaccine Research Center (VRC), part of NIH, developed the investigational treatment and conducted and sponsored the clinical trial.

The investigational treatment is currently being offered to Ebola patients in the Democratic Republic of the Congo (DRC) under compassionate use and as part of a Phase 2/3 clinical trial of multiple investigational treatments. mAb114, a single monoclonal antibody, binds to the core receptor binding domain of the Zaire ebolavirus surface protein, preventing the virus from infecting human cells. Scientists isolated the antibody from a human survivor of the 1995 Ebola outbreak in Kikwit, DRC. Prior studies showed that mAb114 can protect monkeys from lethal Ebola virus disease when given as late as five days after infection.

Participants in the Phase 1 clinical trial received a single intravenous infusion of mAb114, administered over approximately 30 minutes. Three participants received a 5 milligrams per kilogram (mg/kg) dose; five participants received a 25 mg/kg dose; and 10 participants received a 50 mg/kg dose. All infusions were well-tolerated. Four participants reported mild side effects, such as discomfort, muscle or joint pain, headache, nausea, and chills in the three days following the infusion.

As expected, levels of mAb114 in the blood increased as the dosage was increased. Investigators also observed relatively uniform levels of absorption, distribution, and elimination of mAb114 among participants.

The authors note several advantages for deploying mAb114 in an outbreak setting, including the ease and speed of its administration, and its formulation as a freeze-dried powder that does not require freezer storage. The powder is reconstituted with sterile water and added to saline for administration.

In addition to the ongoing Phase 2/3 clinical trial of mAb114 in the DRC, the VRC is planning to initiate another Phase 1 trial of the investigational treatment in Africa.

Credit: 
NIH/National Institute of Allergy and Infectious Diseases

New global task force report questions effectiveness of spinal fusion procedures, provides recommendations

WASHINGTON, D.C. (January 24, 2019)--There is little to no evidence that two surgical procedures used to fuse crumbled vertebrae following a spinal fracture caused by osteoporosis reduce pain for patients any better than non-surgical or placebo procedures, according to a new report from a global task force of bone health experts published today in the Journal of Bone and Mineral Research (JBMR).

The task force was charged by the American Society for Bone and Mineral Research (ASBMR) to assess the relative efficacy and safety of the most commonly used procedures to treat osteoporotic spinal fractures: vertebroplasty, where medical grade cement is injected into the broken vertebrae to fuse the fragments of bone together; and balloon kyphoplasty, where a balloon is inserted into the compressed area of spine to lift it and allow cement to be inserted before the balloon is removed.

The task force's report, the most comprehensive to date, comes amid aggressive and often misleading marketing by device makers touting the procedures to doctors and patients as a "non-invasive" way to get immediate relief from the pain caused by vertebral compression fractures and a way to avoid potential opioid addiction.

"The message for doctors and their patients suffering from painful spinal fractures is that procedures to stabilize spinal fractures should not be a first choice for treatment," said task force member and lead author Peter Ebeling, M.D., Head of the Department of Medicine in the School of Clinical Sciences at Monash University in Australia. "While patients who had these surgeries may have had a short-term reduction in pain, we found that there was no significant benefit over the long-term in improving pain, back-related disability, and quality of life when compared with those who did not have the procedures."

Key findings from the task force report include:

Vertebroplasty (cement injection into the vertebrae) provides no clearly significant benefit in pain control over placebo or sham procedures based on five randomized placebo-controlled trials.

For balloon kyphoplasty, the lack of placebo-controlled trials of this procedure, along with the absence of any benefits of kyphoplasty over vertebroplasty when compared in a small number of head-to-head trials, argues against the use of this procedure.

An estimated 750,000 people each year in the U.S. suffer from vertebral compression fractures caused by osteoporosis, which result in acute and chronic back pain, impaired mobility, and disability. With an increasingly aging population, that number is expected to rise. Some 300,000 patients underwent vertebral augmentation procedures between 2006 and 2014, according to Medicare data. During that time period, a majority of patients (73%) underwent the more expensive balloon cement injection procedure (kyphoplasty). The task force reported that both augmentation procedures were introduced into practice "prior to high quality evidence establishing its efficacy and safety and remains in some settings part of a standard routine of care."

While the procedures are minimally invasive, risks include infection at the site of the injection, cement leakage, and complications associated with elderly patients undergoing anesthesia. There have also been concerns about the increased risk of fractures in vertebrae surrounding the fused area, but the task force concluded that more research is needed to assess this risk.

"This report makes it clear that these procedures are not a magic bullet," said Bart Clarke, M.D., President of the American Society for Bone and Mineral Research, a practicing endocrinologist, Professor of Medicine at the Mayo Clinic, and co-author of an accompanying perspective piece published in the JBMR. "Until now, doctors have been left to sift through the data on their own to determine whether these procedures can benefit their patients. This report coalesces all that information concisely and provides recommendations to guide them."

Dr. Clarke added that at his center at the Mayo Clinic, they do not normally perform vertebral augmentation procedures unless a patient's pain is unmanageable for more than 4-6 weeks. "We've seen that with analgesics and other pain relief, our patients often get better within about 6 weeks."

The task force also focused on the critical need for prevention. Patients who have experienced a first fracture of the hip or spine likely have osteoporosis and are at high risk of a second fracture. Approximately 25% of older men and women who have a hip fracture will have a second fracture within one year, as will around 20% of older patients who have a vertebral fracture. But research shows that treatment rates for hip fracture patients are low and are actually decreasing over time.

"Overall, prevention is critical, and we need to get these high-risk patients on anti-osteoporosis drugs that have been proven to reduce future fractures by as much as 70 percent," Clarke said.

The ASBMR task force offered the following guidance for healthcare professionals and their patients for managing fractures to the spine:

Fully inform their patients of the available evidence: there is little to no evidence that the use of vertebral augmentation works any better than a placebo.

Anti-osteoporosis medications should be started or continued. A change in treatment may also need to be considered if the fracture occurred after 12 months from starting anti-osteoporosis treatment.

"This is a painful condition that for most people spontaneously gets better with time and can be managed with analgesic medications over the short-term," Ebeling said. "From our experiences with patients, we know that non-pharmacologic approaches may be effective, but we need more trials to explore these approaches."

Credit: 
Burness

In life and death, Alzheimer's disease looks different among Hispanic patients

Researchers at the Shiley-Marcos Alzheimer's Disease Research Center (ADRC), part of University of California San Diego School of Medicine, report that in patients diagnosed with Alzheimer's disease (AD) while alive -- diagnoses later confirmed by autopsy -- many cognitive issues symptomatic of the condition were less noticeable in Hispanic patients.

The findings, produced in collaboration with colleagues at the University of Southern California (USC), are published in the January 2019 issue of the Journal of Alzheimer's Disease.

Authors say the data suggests it may be more difficult for clinicians to detect AD in its mild to moderate stages among living Hispanic patients compared to non-Hispanic patients. As a result, intervention and treatment could be delayed and less effective.

The study involved autopsies of 14 Hispanic and 20 non-Hispanic persons, all with autopsy-confirmed physiological evidence of AD, along with autopsies of an equal number of cognitively healthy Hispanic and non-Hispanic individuals without an AD finding.

The scientists looked at patterns of neuropsychological deficits, vascular risk factors and neuropathological differences between the Hispanic and non-Hispanic patients, who were matched by age, education, global mental status and severity of functional decline at first diagnosis.

They found that mild-to-moderately affected Hispanic patients with AD were significantly less impaired than non-Hispanic patients with AD, relative to their respective culturally appropriate control groups, on measurements of memory, attention and executive functioning.

While the patient groups had similar overall AD pathology, the authors found, Hispanics with AD showed greater small blood vessel disease in the brain than non-Hispanics with AD, as well as increased amyloid angiopathy, the accumulation of protein fragments in blood vessels associated with AD.

"There have been very few autopsy studies in Hispanic elderly with Alzheimer's disease that have allowed researchers to gain insight about factors that might make it more difficult to clinically diagnose the disease in this demographic," said senior author David P. Salmon, PhD, professor in the Department of Neurosciences and Helen A. Jarrett Chair in Alzheimer's Research at UC San Diego School of Medicine.

"Information from our study can help guide how we assess living Hispanic patients who may have Alzheimer's, to more accurately detect the disease in its early stages."

AD affects 5.7 million Americans, with that number expected to nearly triple by 2050 without prevention or cure. Some studies suggest that the prevalence of dementia might be higher among Hispanics than non-Hispanic whites. Hispanics comprise the largest ethnic minority in the United States. By 2050, they are projected to represent nearly one-third of the total national population. In California and Texas, Hispanics already represent almost 40 percent of residents.

As those numbers grow, the authors said it will be vital that physicians and dementia clinicians use protocols that allow them to best assess this particular demographic, including measures of AD detection that take into account possible effects of bilingualism, educational disparities and cultural factors on neuropsychological procedures.

Their research suggests multiple factors can influence the assessment of severity and profile of cognitive deficits seen in AD, making it difficult to differentiate the disease from normal aging or other dementing disorders. The authors said further study of neuropsychological deficits associated with AD in elderly Hispanic patients, and consideration of possible health disparities and cultural adaptation of cognitive tests, is necessary.

"The evidence we found is important to moving forward because early identification of Alzheimer's disease can allow for earlier implementation of treatments and interventions that prolong the life and well-being of patients and their caregivers," said first author Gali Weissberger, PhD, a postdoctoral researcher at USC Keck School of Medicine.

"The majority of Alzheimer's disease research has focused on non-Hispanic White populations, and findings from this study suggest that certain contextual factors may contribute to a different and less salient profile of cognitive deficits in Hispanic older adults. Findings support the critical need for additional research with minority groups."

Credit: 
University of California - San Diego

3D printing may help treat osteoarthritis

In a Journal of Orthopaedic Research study, scientists used 3D printing to repair bone in the joints of mini-pigs, an advance that may help to treat osteoarthritis in humans.

Specifically, the investigators used 3D printing with a needle-array to generate articular cartilage and subchondral bone using constructs composed of mesenchymal stem cells derived from fat tissue. Printed constructs were implanted into osteochondral defects created in the knees of six mini-pigs. Computed tomography and magnetic resonance imaging tests revealed significant repair within the defects at three and six months post-implantation.

Credit: 
Wiley

How to escape a black hole: simulations provide new clues about powerful plasma jets

video: This simulation shows a rotating black hole (bottom) and a collisionless plasma jet (top), depicting the densities of electrons and positrons along with magnetic field lines. The black hole's "ergosurface," inside of which all particles must rotate in the same direction as the hole, is shown in green.

Image: 
Kyle Parfrey <em>et al</em>./Berkeley Lab

Black holes are known for their voracious appetites, binging on matter with such ferocity that not even light can escape once it's swallowed up.

Less understood, though, is how black holes purge energy locked up in their rotation, jetting near-light-speed plasmas into space in opposite directions in one of the most powerful displays in the universe. These jets can extend outward for millions of light years.

New simulations led by researchers working at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) and UC Berkeley have combined decades-old theories to provide new insight into the driving mechanisms of the plasma jets, which allow them to steal energy from black holes' powerful gravitational fields and propel it far from their gaping mouths.

The simulations could provide a useful comparison for high-resolution observations from the Event Horizon Telescope, an array that is designed to provide the first direct images of the regions where the plasma jets form.

The telescope will enable new views of the black hole at the center of our own Milky Way galaxy, as well as detailed views of other supermassive black holes.

"How can the energy in a black hole's rotation be extracted to make jets?" said Kyle Parfrey, who led the work on the simulations while he was an Einstein Postdoctoral Fellow affiliated with the Nuclear Science Division at Berkeley Lab. "This has been a question for a long time."

Now a senior fellow at NASA Goddard Space Flight Center in Maryland, Parfrey is the lead author of a study, published Jan. 23 in Physical Review Letters, that details the simulation research.

The simulations, for the first time, unite a theory that explains how electric currents around a black hole twist magnetic fields into forming jets, with a separate theory explaining how particles crossing through a black hole's point of no return - the event horizon - can appear to a distant observer to carry negative energy and lower the black hole's overall rotational energy.

It's like eating a snack that causes you to lose calories rather than gain them. The black hole actually loses mass as a result of slurping in these "negative-energy" particles.
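This energy budget can be made concrete with a standard textbook result not given in the article: in the Penrose picture, a spinning (Kerr) black hole's mass-energy splits into an irreducible part, which can never decrease, and a rotational part, which negative-energy particles can carry away. A hedged sketch of the arithmetic, using the conventional dimensionless spin parameter:

```latex
% Irreducible mass of a Kerr black hole with spin a_* = Jc/(GM^2):
M_{\mathrm{irr}}^2 = \frac{M^2}{2}\left(1 + \sqrt{1 - a_*^2}\right)

% Maximum extractable (rotational) energy fraction:
\frac{E_{\mathrm{rot}}}{Mc^2} \;=\; 1 - \frac{M_{\mathrm{irr}}}{M}
  \;=\; 1 - \sqrt{\tfrac{1}{2}\left(1 + \sqrt{1 - a_*^2}\right)}

% For a maximally spinning hole (a_* = 1) this gives
% 1 - 1/\sqrt{2} \approx 0.29, i.e. up to ~29% of the mass-energy.
```

For a non-spinning hole (a_* = 0) the extractable fraction is zero, consistent with the article's point that the jets tap specifically into the black hole's rotation.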

Computer simulations have difficulty in modeling all of the complex physics involved in plasma-jet launching, which must account for the creation of pairs of electrons and positrons, the acceleration mechanism for particles, and the emission of light in the jets.

Berkeley Lab has contributed extensively to plasma simulations over its long history. Plasma is a gas-like mixture of charged particles that is the universe's most common state of matter.

Parfrey said he realized that more complex simulations to better describe the jets would require a combination of expertise in plasma physics and the general theory of relativity.

"I thought it would be a good time to try to bring these two things together," he said.

Performed at a supercomputing center at NASA Ames Research Center in Mountain View, California, the simulations incorporate new numerical techniques that provide the first model of a collisionless plasma - in which collisions between charged particles do not play a major role - in the presence of a strong gravitational field associated with a black hole.

The simulations naturally produce effects known as the Blandford-Znajek mechanism, which describes the twisting magnetic fields that form jets, and a separate Penrose process that describes what happens when negative-energy particles are gulped down by the black hole.

The Penrose process, "even though it doesn't necessarily contribute that much to extracting the black hole's rotation energy," Parfrey said, "is possibly directly linked to the electric currents that twist the jets' magnetic fields."

While more detailed than some earlier models, Parfrey noted that his team's simulations are still playing catch-up with observations, and are idealized in some ways to simplify the calculations needed to perform the simulations.

The team intends to better model the process by which electron-positron pairs are created in the jets in order to study the jets' plasma distribution and their emission of radiation more realistically for comparison to observations. They also plan to broaden the scope of the simulations to include the flow of infalling matter around the black hole's event horizon, known as its accretion flow.

"We hope to provide a more consistent picture of the whole problem," he said.

Credit: 
DOE/Lawrence Berkeley National Laboratory

What makes the deadly pufferfish so delectable

image: Researchers have identified the key compounds responsible for the taste of pufferfish (Takifugu obscurus).

Image: 
Yuan Liu

Some people consider pufferfish, also known as fugu, a delicacy because of its unique and exquisite flavor, which is perhaps seasoned by knowledge that consumption of the fish could be deadly. Now, researchers have identified the major compounds responsible for the taste of pufferfish, minus the thrill of living dangerously. They report their results in ACS' Journal of Agricultural and Food Chemistry.

Pufferfish get their name from their ability to inflate to a much larger size when threatened by predators. But if that defense mechanism fails, the predator may not survive long after its meal: The liver, ovaries, eyes and skin of most species of pufferfish contain tetrodotoxin, a potent neurotoxin. Although specially trained chefs can prepare fugu that's safe to eat, Yuan Liu and colleagues wondered if they could reproduce the flavor of pufferfish without the life-threatening toxin.

The researchers analyzed the key taste-active compounds in Takifugu obscurus, a species of pufferfish found mainly in the East and South China Seas. First, the team ground up pufferfish muscle tissue and cooked, filtered and centrifuged it to produce a liquid pufferfish extract. They then analyzed the extract and quantified 28 potential taste compounds, such as free amino acids, nucleotides and inorganic ions. Taste tests with trained panelists revealed that 12 of these compounds, when added to water, best simulated the flavor of pufferfish, which involved strong umami (savory) and kokumi (mouthfulness) components. When the researchers added two flavor peptides they isolated in a prior study, the imitation pufferfish extract tasted even more like the real thing.

Credit: 
American Chemical Society

Study finds unique form of chronic sinusitis in older patients

Older patients with a diagnosis of chronic sinusitis -- a disease of the nasal cavity and paranasal sinuses that often persists over many years -- have a unique inflammatory signature that may render them less responsive to steroid treatment, according to a new study published by Vanderbilt researchers.

The study, published in the Journal of Allergy and Clinical Immunology, examined tissue and mucus specimens of 147 patients between the ages of 18 and 78 who required sinus surgery for their chronic sinusitis.

With an initial goal of identifying subgroups of patients based on their inflammatory signature -- the different cytokines and inflammatory proteins found in tissue or mucus -- Vanderbilt investigators recognized that one of the identified subgroups was enriched in patients over age 60.

Intrigued by the findings, the team compared all patients according to age by examining their histopathology, tissue specimens taken during surgery, and the immune markers and inflammatory proteins found in their tissue and mucus, and noticed they were strikingly different.

"Most chronic sinusitis in North America -- particularly the kind that requires surgical intervention -- has an inflammatory signature characterized by a group of cytokines associated with allergy and asthma called Th2-associated cytokines," said Justin Turner, MD, PhD, associate professor of Otolaryngology-Head and Neck Surgery and a lead investigator for the study. "Older patients tend to not have significant elevations of those particular cytokines. In contrast, they have an elevation of cytokines that are associated with the body's innate immune function and both acute and chronic inflammatory responses, and that is highly dependent on age.

"You don't see an elevation in those cytokines until around age 60, and then from that age on, there's a progressive increase in the levels of those cytokines seen in the mucus and the tissue of those patients."

Because of this variation, older patients would theoretically be less likely to respond to the steroids used to treat chronic sinusitis characterized by Th2-associated cytokines.

According to Turner, topical steroids such as nasal sprays and irrigations are heavily relied upon for long-term disease and symptom management.

"We're hoping this data will stimulate some interest in the elderly population with respect to chronic sinusitis management, because it suggests we may need patient-specific treatments targeting these older patients. That's particularly important because steroids can have a number of short- and long-term adverse effects, and those side effects are much more likely in older patients than they are in younger patients," said Turner.

To solidify and build upon these findings, Turner's team is currently using data gathered over the last several years to compare surgical outcomes based on age.

Preliminary data suggest that older patients perceive less benefit from sinus surgery than younger patients, which may indicate that their disease is distinct and that their options for post-operative medical management may be less likely to provide relief.

"Our end goal is that we're looking for better ways to treat chronic sinus disease and to understand the disease process a little better," said Turner. "We feel we have identified a characteristic of a fairly large population of patients that may ultimately change our treatment of those patients going forward. It at least suggests that we need to be doing more research targeted at that population."

Credit: 
Vanderbilt University Medical Center