Unfruitful: Eating more produce will not cure or stop prostate cancer

image: J. Kellogg Parsons, MD, professor of urology at UC San Diego School of Medicine and Moores Cancer Center.

Image: 
UC San Diego Health Sciences

National guidelines recommend that men with prostate cancer eat a vegetable-rich diet, suggesting it might decrease cancer progression and death. But in a Phase III randomized clinical trial, patients with prostate cancer assigned to eat seven or more servings of vegetables and fruits daily saw no extra protection from the increased consumption of micronutrients.

"These data indicate that, despite prevailing scientific and public opinion, eating more vegetables will not alter the course of prostate cancer. It will not, to the best of our knowledge, suppress or cure it," said J. Kellogg Parsons, MD, University of California San Diego School of Medicine and Moores Cancer Center professor of urology and study lead investigator. "However, while eating a healthy diet rich in fruits and vegetables and getting more exercise may not cure cancer, it may keep the body stronger and healthier, which may help patients tolerate cancer treatments."

The Men's Eating and Living (MEAL) study, published January 14, 2020 in the Journal of the American Medical Association and led by UC San Diego Moores Cancer Center and Roswell Park Comprehensive Cancer Center investigators, enrolled 478 men aged 50 to 80 years at 91 sites in the United States. The patients had been diagnosed with early-stage prostate adenocarcinoma and enrolled in an active surveillance program in which patients defer immediate treatment until the disease advances.

Patients were randomized to a control group that received written information about diet and prostate cancer or to a telephone counseling behavioral intervention program that encouraged participants to eat foods high in carotenoids, such as leafy greens, carrots and tomatoes, and cruciferous vegetables such as broccoli and cabbage. Both groups were monitored for two years.

"Patients assigned to the intervention increased their intake of fruits and vegetables to a statistically significant degree, and significantly more than control patients did. These findings were supported by significant changes in the blood carotenoid levels of patients. Nonetheless, these data fail to support prevailing assertions in clinical guidelines and the popular media that diets high in micronutrient-rich vegetables improve cancer-specific outcomes among prostate cancer survivors," said James Marshall, PhD, Distinguished Professor with the Department of Cancer Prevention and Population Sciences at Roswell Park, co-senior author on the study with John Pierce, PhD, Professor Emeritus of Cancer Prevention at UC San Diego School of Medicine.

The study is the first randomized clinical trial to test the effect of dietary intervention on prostate cancer. It was conceived based on preliminary scientific data and on inquiries from patients who wondered if a change in diet would influence their diagnosis or treatment, said Parsons, a urologic oncologist at UC San Diego Health, San Diego's only National Cancer Institute-Designated Comprehensive Cancer Center.

"The most common question I receive from men on active surveillance is, 'Can I decrease the chances that I will need treatment for prostate cancer by changing my diet?' We now have good evidence that a diet rich in fruits and vegetables and light on red meat is not likely to impact the need for treatment," said co-author James Mohler, MD, professor of oncology with Roswell Park's department of urology. "But this study does not provide justification for eating anything you want, either. The overall health benefits of a diet that's relatively low in fat and rich in fruits, vegetables and healthy grains are well-established."

The impact of nutrition on diseases is an ongoing conversation among researchers and clinicians. Scientific studies have identified a strong role for changing diet to improve outcomes in diabetes and cardiovascular disease, but not in cancer, said Parsons.

Although the MEAL study revealed no positive impact on prostate cancer, it did demonstrate that behavioral modification can lead patients to make healthier food choices, said Parsons.

"We designed a simple and inexpensive program that proved we could change people's diets for the better. We hoped that through nutrition we could alter disease outcomes and then use those data to build a network of diet counselors to help men with prostate cancer eat more vegetables," said Parsons. "It's still an endeavor worth considering, possibly in patients with advanced prostate cancer."

Credit: 
University of California - San Diego

Reduced inhaler use is safe for infants with bronchiolitis

Philadelphia (January 13, 2020) - Bronchiolitis, a lung infection that is one of the most common reasons for hospitalizations in young children, is most prevalent during the winter months and is usually treated with albuterol delivered via inhalers, despite evidence showing no benefit in most patients. A multidisciplinary team of researchers from Children's Hospital of Philadelphia (CHOP) redesigned the hospital's standard treatment for the infection and reduced albuterol use without compromising care.

The research was published in the December issue of Pediatrics.

"Given the frequency of albuterol usage and the drug's potential side effects, we wanted to improve the value of the medical care provided to children with bronchiolitis and searched for ways to reduce use," said Michelle Dunn, M.D., attending physician in general pediatrics at CHOP and lead author of the study. "We set out to revise our treatment plan, or clinical pathway, to reflect the current American Academy of Pediatrics (AAP) guidelines and educate clinicians about the recommended changes."

The AAP updated its bronchiolitis guidelines in 2014, recommending against the use of bronchodilators like albuterol in typical patients with bronchiolitis. To bring its clinical practice in line with those recommendations, the CHOP team used a multidisciplinary approach -- with a goal of reducing the use of albuterol for bronchiolitis in infants in both emergency department and inpatient settings.

To do so, the team modified emergency department and inpatient treatment plans to state explicitly that bronchodilators were not recommended for infants with a typical presentation of bronchiolitis, which involves symptoms of a viral upper respiratory infection that progress to the lower respiratory tract. The team also educated nurses, respiratory therapists and physicians on the new guidelines and modified the electronic health record system, creating a "do not order" option that stated bronchodilators were not recommended for routine use.

After implementing the new protocols, albuterol use in infants with bronchiolitis declined from 43% to 20% in the emergency department and from 18% to 11% in inpatient settings, which prevented more than 600 infants from receiving an unnecessary treatment. The team measured patient admission rates, length of stay and revisit rates and found the reduced albuterol use did not impact those metrics.

The study period covered October 2014 through March 2017, which included three winter seasons. During that time, CHOP had 5,115 emergency department visits and 1,948 hospitalizations for bronchiolitis. Of those, 3,834 emergency department visits and 1,119 inpatient hospitalizations were included in the study.

"The methods used in this study can be applied to other diagnoses where there is potential overuse of testing and interventions," said Joseph J. Zorc, MD, MSCE, attending physician in emergency medicine at CHOP and senior author of the study. "The next step for improving bronchiolitis care at CHOP is focusing on the use of the high-flow nasal cannula, an emerging therapy for infants with severe bronchiolitis."

Credit: 
Children's Hospital of Philadelphia

Study suggests new strategy for treating advanced, progressing bile duct cancer

COLUMBUS, Ohio - A new study led by researchers at The Ohio State University Comprehensive Cancer Center - Arthur G. James Cancer Hospital and Richard J. Solove Research Institute (OSUCCC - James) shows how resistance to a promising targeted drug develops in patients with a rare, lethal cancer of the bile ducts called cholangiocarcinoma.

The study, reported in the journal Molecular Cancer Therapeutics, also suggests that adding another drug at the time of progression might re-sensitize tumor cells to the initial drug, called an FGFR inhibitor.

"While the majority of patients with FGFR-positive cholangiocarcinoma benefit from new FGFR inhibitors in clinical trials, most patients unfortunately develop cancers resistant to the drugs," says study leader Sameek Roychowdhury, MD, PhD, a medical oncologist and researcher at the OSUCCC - James. "We believe that this study is an important step in understanding drug resistance, and improving the treatment of this and other cancers caused by abnormal FGFR gene mutations."

Findings also suggest that monitoring fragments of circulating tumor DNA for acquired mutations that cause resistance to FGFR inhibitors may reveal the presence of resistance mutations and mark the time a patient should begin taking the additional drug, an mTOR inhibitor.

The successful treatment of cholangiocarcinoma is challenging because the disease is usually diagnosed at an advanced stage that has a five-year survival rate of 2%. Patients diagnosed earlier also have low five-year survival due to high rates of disease recurrence. Abnormal activation of the FGFR gene happens in 15 to 20% of people with cholangiocarcinoma, and FGFR inhibitors show effectiveness in 70 to 80% of those patients until resistance develops. There are six studies of FGFR inhibitors in clinical trials at the OSUCCC - James.
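Putting those two percentages together gives a rough sense of the population that stands to benefit from FGFR inhibitors. The midpoint values below are illustrative assumptions, not figures reported by the study:

```python
# Illustrative arithmetic only: midpoints of the ranges quoted above,
# not exact figures from the study.
fgfr_positive = 0.175   # midpoint of the "15 to 20%" with abnormal FGFR activation
responder_rate = 0.75   # midpoint of the "70 to 80%" who respond to FGFR inhibitors

benefit_fraction = fgfr_positive * responder_rate
print(f"Roughly {benefit_fraction:.0%} of cholangiocarcinoma patients "
      "might benefit before resistance develops.")
```

On these midpoint assumptions, that works out to roughly 13% of cholangiocarcinoma patients, which is why overcoming acquired resistance matters so much for this group.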

"A better understanding of how treatment resistance develops and how to prevent it is critical for improving the treatment of cholangiocarcinoma and other cancers caused by FGFR mutations," says first author Melanie Krook, PhD, a postdoctoral fellow in Roychowdhury's lab.

"Our findings suggest that cholangiocarcinoma patients treated with an FGFR-targeted therapy could potentially benefit from combination therapies with other drugs such as mTOR inhibitors. Additional laboratory studies are needed to identify the optimal lead strategies for this combination," she adds.

For this study, Roychowdhury, Krook and colleagues examined the FGFR (fibroblast growth factor receptor) gene in the cancer cells of a cholangiocarcinoma patient who died after experiencing disease progression and developing resistance to the FGFR inhibitor infigratinib.

The researchers identified two acquired FGFR mutations in the patient's tumor cells that conferred resistance to FGFR inhibitors. They then used cancer cell lines to learn that the mutations led to activation of the mTOR biochemical pathway. This enabled the cancer cells to grow even in the presence of FGFR inhibitors. Adding an mTOR inhibitor to the cells restored their sensitivity to FGFR inhibitors.

Key findings

Two acquired FGFR2 mutations, p.E565A and p.L617M, were shown to drive resistance to the FGFR inhibitor infigratinib.

The p.E565A mutation upregulates the mTOR signaling pathway, which desensitizes cholangiocarcinoma cell lines to infigratinib and other FGFR inhibitors.

A drug that inhibited the mTOR pathway restored the sensitivity of the cells to infigratinib and other FGFR inhibitors.

"Overall, our findings suggest that an mTOR inhibitor administered at the time of progression may re-sensitize tumor cells to an FGFR inhibitor in patients who develop resistance to these agents," Roychowdhury says.

Credit: 
Ohio State University Wexner Medical Center

Not so fast: Some batteries can be pushed too far

image: At left, a 3D model by Rice University materials scientists shows a phase boundary as a delithiating lithium iron phosphate cathode undergoes rapid discharge. At right, a cross-section shows the "fingerlike" boundary between iron phosphate (blue) and lithium (red). Rice engineers found that too many of the defects intended to make batteries better can in fact degrade their performance and endurance.

Image: 
Mesoscale Materials Science Group/Rice University

HOUSTON - (Jan. 14, 2020) - Intentional defects in batteries have given Rice University scientists a window into the hazards of pushing lithium-ion cells too far.

New simulations by Rice materials scientist Ming Tang and graduate student Kaiqi Yang, detailed in the Journal of Materials Chemistry A, show that too much stress in widely used lithium iron phosphate cathodes can open cracks and quickly degrade batteries.

The work extends recent Rice research that demonstrated how putting defects in particles that make up the cathode could improve battery performance by up to two orders of magnitude by helping lithium move more efficiently.

But the lab's subsequent modeling study revealed a caveat. Under the pressure of rapid charging and discharging, defect-laden cathodes risk fracture.

"The conventional picture is that lithium moves uniformly into the cathode, with a lithium-rich region that expands smoothly into the cathode's center," said Tang, an assistant professor of materials science and nanoengineering at Rice's Brown School of Engineering.

But X-ray images taken at another lab showed something else. "They saw a fingerlike boundary between the lithium-rich and lithium-poor regions, almost like when you inject water into oil," he said. "Our question was, what causes this?"

The root of the problem appears to be that stress destabilizes the initially flat boundary and causes it to become wavy, Tang said. The change in the boundary shape further increases the stress level and triggers crack formation. The study by Tang's group shows that such instability can be increased by a common type of defect in battery compounds called antisites, where iron atoms occupy spots in the crystal where lithium atoms should be.

"Antisites can be a good thing, as we showed in the last paper, because they accelerate the lithium intercalation kinetics," Tang said, "But here we show a countereffect: Too many antisites in the particles encourage the moving interface to become unstable and therefore generate more stress."

Tang believes there's a sweet spot for the number of antisites in a cathode: enough to enhance performance but too few to promote instability. "You want to have a suitable level of defects, and it will require some trial and error to figure out how to reach the right amount through annealing the particles," he said. "We think our new predictions might be useful to experimentalists."

Credit: 
Rice University

Colloidal quantum dot laser diodes are just around the corner

image: These are colloidal quantum dots operating in LED mode.

Image: 
Los Alamos National Laboratory

Los Alamos scientists have incorporated meticulously engineered colloidal quantum dots into a new type of light-emitting diode (LED) containing an integrated optical resonator, which allows the devices to function as lasers. These novel, dual-function devices clear the path towards versatile, manufacturing-friendly laser diodes. The technology can potentially revolutionize numerous fields, from photonics and optoelectronics to chemical sensing and medical diagnostics.

"This latest breakthrough along with other recent advances in quantum dot chemistry and device engineering that we have achieved suggest that laser diodes assembled from solution may soon become a reality," said Victor Klimov, head of the quantum dot group at Los Alamos National Laboratory. "Quantum dot displays and television sets are already available as commercial products. The colloidal quantum dot lasers seem to be next in line."

Colloidal quantum dot lasers can be manufactured using cheaper, simpler methods than modern semiconductor laser diodes that require sophisticated, vacuum-based, layer-by-layer deposition techniques. Solution-processable lasers can be produced in less-challenging lab and factory conditions, and could lead to devices that would benefit a number of emerging fields including integrated photonic circuits, optical circuitry, lab-on-a-chip platforms, and wearable devices.

For the past two decades, the Los Alamos quantum dot team has been working on fundamental and applied aspects of lasing devices based on semiconductor nanocrystals prepared via colloidal chemistry. These particles, also known as colloidal quantum dots, can be easily processed from their native solution environment to create various optical, electronic, and optoelectronic devices. Furthermore, they can be 'size-tuned' for lasing applications to produce colors not accessible with existing semiconductor laser diodes.

In a paper published today in Nature Communications, the Los Alamos researchers successfully resolved several challenges on the path to commercially viable colloidal quantum dot technology. In particular, they demonstrated an operational LED that also functioned as an optically pumped, low-threshold laser. To achieve these behaviors, they incorporated an optical resonator directly into the LED architecture without obstructing charge-carrier flow into the quantum dot emitting layer. Further, by carefully designing the structure of their multilayered device, they achieved good confinement of the emitted light within the ultrathin quantum dot medium, which is on the order of 50 nanometers thick. This is key to obtaining the lasing effect while still allowing efficient excitation of the quantum dots by the electrical current. The final ingredient of this successful demonstration was unique, homemade quantum dots perfected for lasing applications per recipes developed by the Los Alamos team over years of research into the chemistry and physics of these nanostructures.

Presently, the Los Alamos scientists are tackling the remaining challenge, which is boosting the current density to levels sufficient for obtaining so-called 'population inversion' -- the regime when the quantum dot active medium turns into a light amplifier.

Credit: 
DOE/Los Alamos National Laboratory

New research finds ranchers consider diverse factors in managing their land

image: Flood irrigation creates wetland habitats when the water flows over the landscape. Photo courtesy of Ryan Scavo.

Image: 
Virginia Tech

Wetlands in the Intermountain West, a region nestled between the Rocky Mountains, the Cascade Range, and the Sierra Nevada, are home to a diverse range of flora and fauna. Wetlands may make up only two percent of the region, but 80 percent of wildlife rely on the rich habitat they provide. The majority of these wetlands are located on private ranchlands. While the persistence of these "working wetlands" depends on the management decisions of ranchers, their perspectives are often missing from conservation and policy-making discussions.

In a new study published in Rangeland Ecology and Management, Ashley Dayer, an assistant professor in the Department of Fish and Wildlife Conservation in the College of Natural Resources and Environment at Virginia Tech, explores the diverse factors that influence how ranchers manage their land.

In collaboration with the Intermountain West Joint Venture, an organization committed to bird habitat conservation by fostering public-private partnerships, and the University of Montana, Dayer and her graduate student Mary Sketch (M.S. '18) hosted two landowner-listening workshops, one in southern Oregon and another in southwestern Wyoming, and invited various landowners and conservation professionals to encourage dialogue between the two parties. Partners for Conservation, a landowner-led conservation organization, played a key role in successful implementation of the workshops.

"In order to have effective conservation in the west, where ranchers own huge tracts of land, the conservation community is keen to work together with them. Ranchers can make choices to manage their land for the benefits of wildlife or they can make choices that don't prioritize wildlife," said Dayer, an affiliated faculty member of the Global Change Center, housed within the Fralin Life Sciences Institute. "We aimed to facilitate a better understanding of how conservation professionals could work with ranchers toward conservation and wildlife management goals."

The relationships between conservationists and ranchers can be complicated. People are quick to assume that ranchers are solely concerned with profit, but Virginia Tech researchers find that ranchers' decisions are more complex than that. This complexity needs to be taken into consideration when developing programs and policies to foster private lands conservation.

"The workshops created an open, trusting space where there was social learning and social exchange happening. It was important for ranchers to know the researchers and the conservation professionals alike were there to hear them," said Mary Sketch, who was the lead author on this paper and another previously published in Society and Natural Resources on the method itself.

Dayer and Sketch evaluated the complex decision-making process of how ranchers choose to manage their land, more specifically how they choose to irrigate their land and why. They found that various reasons go into deciding how land is managed -- not just money.

"Our project was able to add nuance to that understanding; there is a lot more to it," said co-author Alex Metcalf, a social scientist and assistant professor in the W.A. Franke College of Forestry and Conservation at the University of Montana. "Yes, ranchers have to meet the bottom line because they have to make sure they have food on the table, but other concerns and considerations are at play in the choices that they make for their lands."

This study specifically focused on choices about flood irrigation -- a traditional method involving complex ditch systems that spread water across a field, recharging areas once sustained by natural flooding. When the water flows from the ditches, saturates the field, and seeps into the groundwater, it provides forage for cattle to graze on while providing rich habitat for migrating and breeding waterbirds, like ducks and cranes, as well as sage-grouse, an iconic ground-dwelling bird in decline.

"Flood irrigation is often vilified for not being water efficient. The numbers don't always add up when it comes to saving water because there's so much more in the game of land management and conservation, like creating wildlife habitat. This traditional definition of efficiency doesn't grasp that social-ecological complexity," Sketch said. "Our work suggests an expanded definition that considers how flood irrigation provides bird habitat on working wet meadows, recharges the groundwater for communities downstream, creates in-stream flow for fish, and keeps ranchers ranching."

Ranchers described the factors that either help or hinder the use of flood irrigation on private lands. The study identified cultural considerations as a key enabler for continuing flood irrigation. "Ranchers have strong ties with the ranching lifestyle, so many choose to continue flood irrigation because of its history and their personal connection to it," explained Sketch. "It's something they do every year, the generation of ranchers before them did it, and they want to maintain that tradition."

"What stands out to me in this work is that there are a group of ranchers committed to the future of their land. They rely on that land for their livelihood; they're closely tied to it; they spend every day outside. It's something that they're very passionate about," Dayer said. "I think that's just a critical thing for the majority of the U.S. public living far from ranches to keep in mind -- our food isn't just coming from grocery stores. It's coming from people who are making choices about how land is used and whether to contribute to conservation."

Despite the commitment of ranchers to their land, nearly half of all U.S. ranches are sold every decade and recruitment of younger generations into the ranching lifestyle has declined. Most of these once-open spaces have been lost to subdivisions and other development. Land conversion not only erodes the sense of community and cultural identity among ranchers, it also eliminates important wildlife habitat.

To keep ranches both environmentally and economically sustainable, both workshops highlighted key areas where conservation professionals can increase rancher engagement and ensure working wetlands continue to benefit both landowners and wildlife. Ranchers identified partnerships and open communications with conservation professionals and policymakers as critical to maintaining successful operations in addition to effective, long-lasting conservation practices. Central to strong partnerships is building trust and "honest people sitting around, getting over their biases, their agendas, and listening to one another," said one rancher.

The Intermountain West Joint Venture has a long history of working alongside landowners and conservationists and has become trusted in the region. Their connections, experiences, and on-the-ground work proved valuable in executing the research. As a result, Dayer and Sketch were better able to understand ranchers' experiences and perspectives. The joint venture is also now playing a critical role in ensuring the results of this study are used.

"This research is ground-breaking in that it helps conservation professionals understand the social context of agricultural irrigation decision-making in the West," said Dave Smith, Intermountain West Joint Venture coordinator. "The findings will enable the conservation community to increasingly support agricultural irrigators in continuing to provide vital habitat for wetland-dependent birds on working lands."

Listening turned out to be an effective conservation tool, and Dayer and Sketch hope that this work continues to change how conservation professionals and ranchers work together.

Credit: 
Virginia Tech

Only 1 in 4 Medicare patients participate in cardiac rehabilitation

DALLAS, Jan. 14, 2020 - Too few people covered by Medicare participated in outpatient cardiac rehabilitation after a heart attack or acute heart event or surgery, particularly women, the elderly and non-white patients, according to new research published today in the American Heart Association's journal Circulation: Cardiovascular Quality and Outcomes.

Every year, an estimated 1.3 million U.S. adults with heart disease may qualify for cardiac rehabilitation (this number does not include those with qualifying heart failure).[1] Outpatient cardiac rehabilitation has been shown to improve health outcomes among patients who have heart failure, have suffered heart attacks or have undergone a cardiac procedure such as coronary artery bypass surgery. This observational study measured participation rates and identified the populations and regions most at risk for suboptimal cardiac rehabilitation.

In the review of more than 366,000 patients covered by Medicare who were eligible for outpatient cardiac rehabilitation in 2016, researchers found:

only about 25% (approximately 90,000) participated in a cardiac rehabilitation program;

among those who participated in cardiac rehabilitation, only 24% began the program within 21 days of the acute cardiac event or surgery; and

among those who participated in cardiac rehabilitation, only about 27% completed the full course of the recommended 36 or more cardiac rehabilitation sessions, which have been shown to improve health outcomes.

"Cardiac rehabilitation has strong evidence demonstrating its lifesaving and life-enhancing benefits, and Medicare Part B provides coverage for the program. However, participation in cardiac rehabilitation programs remains low among people covered by Medicare," said lead study author Matthew D. Ritchey, P.T., D.P.T., O.C.S., M.P.H., a researcher at the Centers for Disease Control and Prevention's Division for Heart Disease and Stroke Prevention. "The low participation and completion rates observed translate to upwards of 7 million missed opportunities to potentially improve health outcomes; that is the shortfall if 70% of the people covered by Medicare who had a heart attack or acute heart event or surgery had participated in cardiac rehabilitation and completed 36 sessions."
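A quick back-of-the-envelope check of that figure (the 70% goal and the 36-session course come from the scenario described in the quote; the sessions actually delivered are not itemized in this release):

```python
# Rough check of the "upwards of 7 million missed opportunities" figure.
# The 70% goal and 36-session course are from the scenario quoted above.
eligible = 366_000      # Medicare patients eligible for cardiac rehab in 2016
goal_rate = 0.70        # hypothetical participation goal
full_course = 36        # recommended number of sessions

goal_sessions = round(eligible * goal_rate * full_course)
print(f"Sessions under the 70% scenario: {goal_sessions:,}")  # 9,223,200
```

Subtracting the sessions actually delivered (about 90,000 participants, most completing far fewer than 36 sessions) leaves a shortfall upwards of 7 million sessions, consistent with the quoted figure.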

Additional study findings included:

Participation in outpatient cardiac rehabilitation decreased with increasing age, with only about 10% of patients age 85 and older participating, versus about 32% of those age 65 to 74.

Participation was lower among women than men, about 19% versus about 29%, respectively.

Over half of the patients eligible for cardiac rehabilitation had fewer than five comorbid conditions.

Non-Hispanic whites had the highest participation rate at about 26%, versus 16% for Asians, 14% for non-Hispanic blacks and 13% for Hispanics.

Participation also varied by region, with cardiac rehabilitation being lowest in the Southeastern United States and the Appalachian region.

Patients who had a procedure such as coronary bypass surgery were more likely to participate in cardiac rehabilitation than those who had a heart attack with no procedure performed.

Researchers noted that patients face systemic, logistical and cultural barriers to attending and completing an outpatient cardiac rehabilitation program. At the system level, there are no universally accepted, automated, electronic referral processes for cardiac rehabilitation services. On a personal level, patients may not complete rehabilitation due to the costs and/or the time needed to participate in the program versus returning to work and other personal commitments.

"Improving awareness of the value of cardiac rehabilitation, increasing referral of eligible patients and reducing system and patient barriers to participation are all critical steps in improving the referral, enrollment and participation rates, which, in turn, can improve patient outcomes," said Ritchey. "For example, the Agency for Healthcare Research and Quality recently launched the TAKEheart initiative to implement automatic referral processes with care coordination to increase cardiac rehabilitation referrals, enrollment and retention across hundreds of hospitals. Each of these programs is an important building block for continued improvement for patients."

This study had the following limitations:

Billing codes were used to identify patients eligible for cardiac rehabilitation; however, referral rates cannot be assessed with billing data.

Clinical information was not available for patients; therefore, the authors were unable to validate the billing codes used or to exclude patients who may not have been appropriate for cardiac rehabilitation.

This study was restricted to assessing cardiac rehabilitation use among older patients with Original Medicare coverage; therefore, the findings may not be generalizable to Medicare Advantage members or to younger patients.

The authors were unable to control for factors that may have affected their findings such as the availability of cardiac rehabilitation programs in certain communities.

"It is also important to improve the capacity within existing cardiac rehabilitation programs and to address shortages in available programs, especially in rural areas. One strategy for addressing these shortages could be to increase the use of home-based or tele-cardiac rehabilitation, which have been shown to achieve similar health outcomes as compared to center-based rehabilitation care," said Ritchey.

In 2019, the American Heart Association issued a new Scientific Statement, a collaboration with the American Association for Cardiovascular and Pulmonary Rehabilitation and the American College of Cardiology, detailing the need for and benefits of home-based cardiac rehabilitation programs to improve patient access and health outcomes. The American Heart Association also supports the Increasing Access to Cardiac Rehabilitation Care Act of 2019 (H.R. 3911), introduced in the U.S. House of Representatives in July 2019. The 2018 ACC/AHA Clinical Performance and Quality Measures for Cardiac Rehabilitation, published in April 2018, provide a comprehensive report on the performance and quality measures that can assess and improve the quality of care for patients eligible for cardiac rehabilitation.

Credit: 
American Heart Association

New study finds evidence for reduced brain connections in schizophrenia

image: PET brain scans showing that 18 healthy volunteers (right) have on average higher levels (shown by yellow-red) of synapse marker protein SV2A than 18 participants with schizophrenia (left).

Image: 
E. Onwordi at MRC London Institute of Medical Sciences (LMS)

Advances in scanning have allowed researchers for the first time to show lower levels of a protein found in the connections between neurons in the living brains of people with schizophrenia.

The researchers, who conducted the scans at the psychiatric imaging facility at the Medical Research Council (MRC) London Institute of Medical Sciences, say these changes could underlie the cognitive difficulties seen in schizophrenia and provide targets for research into new treatments.

It was first hypothesised in the early 1980s that schizophrenia was caused by dysfunctional synapses - the junctions where nerve signals are transmitted between neurons in the brain. However, researchers had only been able to study this indirectly, for example in post-mortem brain samples or in animal and cell models in the lab.

In this study, published in Nature Communications, the researchers measured synaptic protein levels in living brains for the first time by using a tracer that emits a signal detectable by a PET brain scan. After being injected, the tracer binds specifically to a protein found in synapses called SV2A (synaptic vesicle glycoprotein 2A), which animal and post-mortem studies have shown to be a good marker of the density of synaptic nerve endings in the brain.

They scanned 18 adults with schizophrenia and compared them to 18 people without schizophrenia.

They found that levels of the synaptic protein SV2A were lower in the front parts of the brain - regions involved in planning - in people with schizophrenia.

Professor Oliver Howes, who led the study, from the MRC London Institute of Medical Sciences, Imperial College London and King's College London, said: "Our current treatments for schizophrenia only target one aspect of the disease - the psychotic symptoms - but the debilitating cognitive symptoms, such as loss of abilities to plan and remember, often cause much more long-term disability and there's no treatment for them at the moment. Synaptic loss is thought to underlie these symptoms.

"Our lab at the MRC London Institute of Medical Sciences is one of the few places in the world with this new tracer, which means we've been able for the first time to show there are lower levels of a synaptic protein in people with schizophrenia. This suggests that loss of synapses could underlie the development of schizophrenia.

"We need to develop new treatments for schizophrenia. This protein SV2A could be a target for new treatments to restore synaptic function."

Dr Ellis Onwordi, who conducted the research, from the MRC London Institute of Medical Sciences, Imperial College London and King's College London, said: "Schizophrenia is a highly debilitating disorder, and the therapeutic options are too limited for many patients. To develop better treatments in the future we need studies like this to shine a light on how the extraordinarily complex wiring of the human brain is altered by this disease."

"Having scans that can characterise the distribution of the approximately 100 trillion synapses in the living brain, and find differences in their distribution between people with and without schizophrenia, represents a significant advance in our ability to study schizophrenia."

The people with schizophrenia who were scanned had all received antipsychotic medication, so the researchers wanted to exclude it as a factor in the synaptic dysfunction. They gave the antipsychotic drugs haloperidol and olanzapine to rats for 28 days and found that they had no effect on levels of the protein SV2A.

Professor Howes said: "This is reassuring as it's suggesting that our antipsychotic treatments aren't leading to loss of brain connections.

"Next we hope to scan younger people in the very early stages to see how synaptic levels change during the development of the illness and whether these changes are established early on or develop over time."

Credit: 
UK Research and Innovation

New discovery on the activity and function of MAIT cells during acute HIV infection

image: Johan Sandberg, professor of viral immunology at the Department of Medicine, Huddinge, Karolinska Institutet.

Image: 
Photo: Stefan Zimmerman.

In a new study published in Nature Communications, researchers at Karolinska Institutet show that MAIT cells (mucosa-associated invariant T cells), part of the human immune system, respond with dynamic activity and reprogramming of gene expression during the initial phase of HIV infection. The study fills a knowledge gap, as previously there has been a lack of awareness of the function of MAIT cells during this particular phase.

Major efforts have been made in recent years to understand how people's immune systems react and act during the first days and weeks of HIV infection.

"Through a close collaboration with the Walter Reed Army Institute of Research, we have gained access to an extensive biobank where we have been able to study samples from individuals, both before and after they have been infected with HIV," says Johan Sandberg, professor at the Department of Medicine at Karolinska Institutet in Huddinge and senior author of the study.

MAIT cells are part of the immune system, where their primary task is to control bacteria at the body's barriers, such as the skin and mucous membranes. Among other things, MAIT cells have an important function in combating bacterial infections of the lung, such as tuberculosis. However, MAIT cells also respond during an HIV infection, even though it is a viral infection. This is probably a response to the bacteria that the infected individual's immune system loses control over during HIV infection.

"Previous observations have shown that MAIT cells disappear in the later stages of HIV infection. In this study we can see that the MAIT cells actually expand at the initial stage with a strong activation in order to fight the bacteria," says Johan Sandberg.

MAIT cells may not help to control HIV itself to any great extent, but when the infection becomes chronic, the number of MAIT cells in the body decreases. Unlike many other parts of the body's immune system, MAIT cells do not recover when the HIV infection is treated.

"In the study, we also discovered that the gene expression patterns of the MAIT cells change gradually during the initial phase," says Johan Sandberg. "They take on different characteristics and their antibacterial function declines, which can affect the individual's immune defence in a negative way."

The result of the study increases the understanding of what happens in detail at the initial phase of HIV infection. Knowledge has previously been lacking in this area as the individuals diagnosed with HIV had often been infected for a longer time when the research began.

"In the HIV research that is currently being conducted, the focus is on developing vaccines and finding cures," says Johan Sandberg. "Knowledge of how the immune system works in the initial phase of HIV infection, such as the activity and properties of the MAIT cells, can facilitate the finding of future interventions."

Credit: 
Karolinska Institutet

Global warming to increase violent crime in the United States

People in the United States could see tens of thousands of extra violent crimes every year--because of climate change alone.

"Depending on how quickly temperatures rise, we could see two to three million more violent crimes between now and the end of the century than there would be in a non-warming world," said Ryan Harp, researcher at the Cooperative Institute for Research in Environmental Sciences (CIRES) at the University of Colorado Boulder and lead author of a new study published today in Environmental Research Letters.

In 2018, Harp and his coauthor, Kris Karnauskas, CIRES Fellow and associate professor in the Department of Atmospheric and Oceanic Sciences at CU Boulder, mined an FBI crime database and NOAA climate data to identify a set of compelling regional connections between warming and crime rates, especially in winter. Warmer winters appeared to be setting the stage for more violent crimes like assault and robbery, likely because milder weather created more opportunities for interactions between people.

Now, the team has projected additional future violent crimes in the United States, by combining the mathematical relationships they uncovered in previous work with output from 42 state-of-the-art global climate models. The team accounted for key factors that previous studies have overlooked, including variations in crime rates across seasons and for different regions of the country.
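The projection approach described above - applying an empirical temperature-crime relationship to an ensemble of climate-model projections - can be illustrated with a toy calculation. Every number below (the baseline crime count, the sensitivity per degree, the model warming values) is assumed purely for illustration and is not taken from the study:

```python
# Toy sketch of projecting additional violent crimes from warming.
# All numbers are assumed for illustration; they are NOT the study's estimates.

BASELINE_CRIMES_PER_YEAR = 1_200_000   # assumed U.S. annual violent crimes
SENSITIVITY_PER_DEGC = 0.01            # assumed +1% crime per +1 degree C of warming

def extra_crimes(warming_degc: float) -> int:
    """Additional annual violent crimes for a given warming (linear toy model)."""
    return round(BASELINE_CRIMES_PER_YEAR * SENSITIVITY_PER_DEGC * warming_degc)

# Averaging over a (hypothetical) ensemble of climate-model projections,
# mimicking the study's use of many global climate models:
model_projections_degc = [1.8, 2.4, 3.1, 2.7]  # assumed end-of-century warming
ensemble_mean = sum(model_projections_degc) / len(model_projections_degc)
print(extra_crimes(ensemble_mean))  # 30000 with these assumed numbers
```

The study's actual model is more detailed, accounting for seasonal and regional variation in the temperature-crime relationship rather than a single national sensitivity.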

"We are just beginning to scratch the surface on the myriad ways climate change is impacting people, especially through social systems and health," Karnauskas said. "We could see a future where results like this impact planning and resource allocation among health, law enforcement and criminal justice communities."

Credit: 
University of Colorado at Boulder

OHSU research informs NIH panel on achieving equity in preventive health services

Six in every 10 Americans have a chronic health condition. These conditions, including heart disease, cancer, diabetes or stroke, are the leading causes of death and disability in the United States and contribute greatly to the nation's annual health care costs, according to the Centers for Disease Control and Prevention.

While many chronic diseases can be prevented, delayed, or identified and treated early when patients work closely with primary care clinicians, differences in the use of health services by racial and ethnic minority groups, rural residents and people of lower socioeconomic status are significant and may contribute to disparities in disease burden and life expectancy.

In a review of 120 previously published articles, researchers at the Pacific Northwest Evidence-based Practice Center at Oregon Health & Science University assessed the effects of barriers that create health disparities in 10 select preventive services, such as cancer screening, smoking cessation or obesity management, and the effectiveness of interventions to reduce barriers.

Their findings indicate that enhanced services such as patient navigation, telephone calls and prompts, and reminders increased cancer screening rates across different patient populations. However, evidence was lacking to determine the effectiveness of interventions for other preventive services.

"In order to achieve health equity in preventive services, additional research is necessary to better understand patient and provider barriers, as well as the roles that health information technology and health systems can play in reducing disparities," says lead author Heidi D. Nelson, M.D., M.P.H., professor of medical informatics and clinical epidemiology and medicine in the OHSU School of Medicine.

The following themes were proposed to enhance future research: community engagement and systems approaches; integration of services and new delivery models; and the need for innovative methods, for example, pragmatic trials conducted in settings where at-risk populations are commonly treated.

Results of this study, published in Annals of Internal Medicine, helped to inform an independent panel, convened by the National Institutes of Health Pathways to Prevention Workshop, to develop 26 recommendations for further research to achieve more equal use and access to 10 preventive health services recommended by the U.S. Preventive Services Task Force.

Credit: 
Oregon Health & Science University

Artificial muscle sheets transform stem cells into bone

image: The shape-memory polymer actuator sheet coated with stem cells shrunken when cold (left) and stretched when heated (right).

Image: 
Source: Polymeric sheet actuators with programmable bioinstructivity, Zijun Deng, Weiwei Wang, Xun Xu, Oliver Gould, Karl Kratz, Nan Ma, and Andreas Lendlein, <em>Proceedings of the National Academy of Sciences</em>, 2020, doi: 10.1073/pnas.1910668117

Stem cells are known for their ability to turn into many different types of cell, be they muscle cells, cartilage, or bone cells. Just like the body they are part of, stem cells sense what happens around them and react accordingly. For decades, researchers have been learning how to steer this differentiation process by changing the cells' environment. The knowledge acquired is already being used in tissue engineering, in other words, to generate substitute materials that restore or maintain damaged biological tissues. However, most research has been done on static scaffolds. Now, researchers from the Helmholtz-Zentrum Geesthacht (HZG), the Berlin-Brandenburg Centre for Regenerative Therapies, the Freie Universität Berlin and the Helmholtz Virtual Institute for Multifunctional Biomaterials in Medicine have used a dynamic scaffold.

New method created

The researchers took a polymer sheet that acts like an artificial muscle. The sheet has the unusual property that it can be trained to reversibly change shape when exposed to repeated temperature changes. The researchers simply moulded a grid onto the underside of the sheet and programmed it to stretch as the temperature went from body temperature (37 °C) to 10 °C and to contract when re-heated. They then seeded the sheet with stem cells and carefully observed the changing shape of the gridded sheet and cells. With the help of this "artificial muscle", the scientists could use one physical signal - the temperature change - to simultaneously send a second, mechanical signal to the stem cells. With these synchronised stimuli it is possible to encourage the stem cells to turn themselves into bone cells.

"Our polymer actuator sheet has a so-called shape-memory function. In our experiments, this allows it to act like a transducer, with which we can effectively instruct the cells to do as we wish. We found that the changes in temperature, combined with the repeated stretching motion of the film, were enough to encourage the stem cells to differentiate into bone cells," explained Professor Andreas Lendlein, an author of the paper and head of the HZG's Institute of Biomaterial Science in Teltow, Germany.

Potential application in complex bone fractures

"The programmed polymer sheets could, for example, later be used to treat bones broken so severely that the body can't repair them on its own. Stem cells from a patient's bone marrow could be cultured on the sheet and adaptively wrap around the bone during an operation. The previously 'trained' cells could then directly strengthen the bones," said Professor Lendlein. Given the recent report in New Scientist of a successful operation at 10 °C at the University of Maryland School of Medicine, such medical implants could become yet another tool in a surgeon's toolkit.

Credit: 
Helmholtz-Zentrum Hereon

Long-term memory performance depends upon gating system, study finds

image: The simple fruit fly, Drosophila melanogaster, is used in the Davis lab to study the genetics underlying memory.

Image: 
Scott Wiseman for Scripps Research

JUPITER, Fla.--Jan. 13, 2020--Storing and retrieving memories is among the most important tasks our intricate brains must perform, yet how that happens at a molecular level remains incompletely understood. A new study from the lab of Neuroscience Professor Ronald Davis, PhD, at Scripps Research, Florida, sheds light on one element of that memory storage process, namely the storage and retrieval of a type of hardwired long-term memory.

The Davis team found that moving memories to long-term storage involves the interplay of multiple genes: a known group whose activity must be upregulated and, unexpectedly, another gatekeeping gene set, Ras, and its downstream connecting molecules, which are down-regulated. If either Ras or its downstream connector Raf is silenced, long-term memory storage is eliminated, the team writes in the Proceedings of the National Academy of Sciences, published the week of Jan. 13.

The type of memory they studied, ironically, has a rather difficult-to-remember name: "protein-synthesis dependent long-term memory," or PSD-LTM for short. To study how it and other types of memory form, scientists rely upon the fruit fly, Drosophila melanogaster, as a model organism. The genetic underpinnings of memory storage are mostly conserved across species, Davis explains.

To assess how the flies' memory consolidation process works at a molecular level, the team used RNA interference to lower the expression of several candidate genes in different areas of the fly brain. Doing so with both the Ras gene and its downstream molecule Raf in the fly brain's mushroom body, its memory-storage area, had a two-pronged effect: it dramatically enhanced intermediate-term memories while completely eliminating PSD long-term memory of an aversive experience, Davis says.

The team's experiments involved exposing flies to certain odors in one section of a glass tube while simultaneously administering a foot-shock. Flies' subsequent avoidant behavior on exposure to that odor indicated their recollection of the unpleasant shock. Regardless of how many times the flies were "trained," lowering expression of Ras and Raf reduced their PSD long-term memory performance, explains first author Nathaniel Noyes, PhD, a research associate in the Davis lab.

While the Ras enzyme, Ras85D, was already known for its roles in organ development and cancer, the studies showed that in the adult brain, it apparently plays memory gatekeeper, helping direct whether experiences should be remembered as intermediate memory that dissipates after a time, or as long-term "protein-synthesis dependent" memory that persists.

Gating off the memory from the intermediate storage process shifted it over to PSD long-term memory storage, indicating that it's an either-or situation. Intermediate storage appears to be the fly brain's preferential, default pathway, Noyes says. He expects that the neurotransmitter dopamine will prove to play a key signaling role.

"We believe that dopamine signals to the brain that this memory is important enough to be stored long-term. We speculate that Ras and Raf receive this dopamine signal and thereby block intermediate memory and promote PSD long-term memory," Noyes says.

How this "intermediate" memory system works in humans requires further study as well, he adds.

"It's becoming apparent that many of the same genes involved in intermediate memory storage also play a role in mammalian memory and plasticity," he notes.

Credit: 
Scripps Research Institute

New study reveals international movements of Atlantic tarpon, need for protection

image: Electronic satellite tags deployed on 300 Atlantic tarpon (Megalops atlanticus) in coastal waters of the western central Atlantic Ocean, Gulf of Mexico and Caribbean Sea, including as far away as Mexico, Belize and Nicaragua, showed that mature tarpon make extensive seasonal migrations.

Image: 
Jiangang Luo, research scientist, University of Miami Rosenstiel School

MIAMI--The results of an 18-year study of Atlantic tarpon by scientists at the University of Miami (UM) Rosenstiel School of Marine and Atmospheric Science revealed that these large silvery fish make extensive seasonal migrations--thousands of kilometers in distance--beyond U.S. borders. The new findings can help protect the fish, which is listed as vulnerable by the International Union for Conservation of Nature (IUCN) and is the main draw of a more than $6 billion catch-and-release sport fishing industry in the United States.

Using electronic satellite tags, the UM research team tracked nearly 300 Atlantic tarpon (Megalops atlanticus) in coastal waters of the western central Atlantic Ocean, Gulf of Mexico and Caribbean Sea, including as far away as Mexico, Belize and Nicaragua. The results showed that the mature tarpon make extensive seasonal migrations along a warm, seasonally moving ocean-water feature known as the 26°C isotherm, where temperatures remain constant. They also found the fish use both freshwater and estuarine habitats throughout their lives, and identified several previously unknown spawning locations in Florida and the Gulf of Mexico.

"Our findings show that there is international connectivity in the U.S. multibillion-dollar recreational tarpon fishing industry," said Jerry Ault, UM Rosenstiel School professor and a co-author of the study. "This is of great importance to anglers and scientists alike to better understand and protect this valuable--and vulnerable--fish and the people who rely on it."

Atlantic tarpon, known as the Silver King, are considered one of the greatest saltwater sport fish due to their size and spectacular fighting ability. They can reach up to eight feet (2.5 meters) long and weigh up to 355 pounds (161 kilograms), with an average speed of 35 miles per hour.

While tarpon fishing is predominantly catch-and-release in the United States, subsistence and commercial harvests of tarpon occur in many other countries, where sport fishing for tarpon is also very popular.

Despite the history and importance of recreational catch-and-release fishing for tarpon in the U.S., Atlantic tarpon are now threatened throughout their range by recreational fishing release mortality, directed commercial harvests, intensive harvesting of key prey species, and habitat degradation, said the scientists.

"A myriad of professional charter boat captains in the Florida Keys rely on tarpon fishing bookings as their principal source of income," said the study's lead author Jiangang Luo, a research scientist at the UM Rosenstiel School. "If the tarpon population declines, or alters their migration patterns due to climate changes, it would significantly affect lives and livelihoods in Florida and beyond."

Using the 18-year dataset, the researchers also found that shark predation on tarpon is more significant than previously known across the southeastern United States, Gulf of Mexico, and northern Caribbean Sea.

Credit: 
University of Miami Rosenstiel School of Marine, Atmospheric, and Earth Science

A new old therapy

The fight against drug-resistant pathogens remains an intense one. While the Centers for Disease Control and Prevention's (CDC) 2019 "biggest threats" report reveals an overall decrease in deaths related to drug-resistant microbes compared with its previous report in 2013, the agency also cautions that new forms of drug-resistant pathogens are still emerging.

Meanwhile, the options for treating infections by these germs are diminishing, confirming doctors' and scientists' worries about the end of the age of antibiotics.

"We knew it was going to be a problem early on," said UC Santa Barbara chemistry and biochemistry professor Irene Chen. "Basically as soon as penicillin was discovered, a few years later it was reported that there was a resistant organism." Thanks to factors such as horizontal gene transfer and rapid reproduction, organisms such as Gram-negative bacteria are able to evolve faster than we can produce antibiotics to control them.

So Chen and her research group are seeking alternatives to antibiotics in a growing effort to stem the tide of incurable bacterial infections. In their work, the group has turned to bacteriophages, a naturally occurring group of viruses that prey on bacteria.

"That's their natural function, really, to grow on and kill bacteria," said Chen, author of a paper that appears in the Proceedings of the National Academy of Sciences. By taking advantage of the bacteriophages' ability to home in on specific bacteria without damaging the rest of the microbiome, the researchers were able to use a combination of gold nanorods and near-infrared light to destroy even multidrug-resistant bacteria without antibiotics.

Phage therapy isn't new, Chen said. In fact, it has been used in the former Soviet Union and Europe for about a century, though it is seen largely as a last-resort alternative to antibiotics. Among the unresolved issues of phage therapy is the incomplete characterization of the phages' biology -- a biology that could allow for unintended consequences due to the phages' own rapid evolution and reproduction, as well as potential toxins the viruses may carry. Another issue is the all-or-nothing aspect of phage therapy, she added.

"It's difficult to analyze the effect of a phage treatment," she said. "You might see it completely work or you might see it completely fail, but you don't have the kind of dose response you want."

To surmount these challenges, the Chen lab developed a method of controlled phage therapy.

"What we did was to conjugate the phages to gold nanorods," she explained. These "phanorods" were applied to bacteria on in-vitro cultures of mammalian cells and then exposed to near-infrared light.

"When these nanorods are photo-excited, they translate the energy from light to heat," Chen said, "and that creates very high local temperatures."

The heat is enough to kill the bacteria, and it also kills the phages, preventing any unwanted further evolution. The result is a guided missile of targeted phage therapy that also allows for dosage control. The lab found success in destroying E. coli, P. aeruginosa and V. cholerae -- human pathogens that cause acute symptoms if left unchecked. They were also able to destroy X. campestris, a bacterium that causes rot in plants.

In a collaboration with UC Santa Barbara mechanical engineer Beth Pruitt, the lab determined that while the heat successfully destroyed bacteria and phage, more than 80% of the mammalian cell culture underneath the bacteria biofilm survived.

"This issue of whether it damages mammalian tissues is very important," Chen said. "Work in nanotechnology and nanomedicine treating bacterial infections indicates that when it's non-targeted, it really does burden the surrounding tissues."

The lab plans to investigate other possible phages to counter other bacteria, possibly engineering a photothermal method that could treat multiple bacterial infections.

Credit: 
University of California - Santa Barbara