Multifocal contact lenses slow myopia progression in children


Children wearing multifocal contact lenses had slower progression of their myopia, according to results from a clinical trial funded by the National Eye Institute, part of the National Institutes of Health. The findings support an option for controlling the condition, also called nearsightedness, which increases the risk of cataracts, glaucoma and retinal detachment later in life. Investigators of the Bifocal Lenses In Nearsighted Kids (BLINK) Study published the results August 11 in the Journal of the American Medical Association.

"It is especially good news to know that children as young as 7 achieved optimal visual acuity and got used to wearing multifocal lenses much the way they would a single vision contact lens. It's not a problem to fit younger kids in contact lenses. It's a safe practice," said BLINK study chair, Jeffrey J. Walline, O.D., Ph.D., associate dean for research at the Ohio State University College of Optometry.

Myopia occurs when a child's developing eyes grow too long from front to back. Instead of focusing images on the retina--the light-sensitive tissue in the back of the eye--images of distant objects are focused at a point in front of the retina. As a result, people with myopia have good near vision but poor distance vision.

Single vision prescription glasses and contact lenses are used to correct myopic vision but fail to treat the underlying problem. Multifocal contact lenses - typically used to improve the near vision of people over age 40 - correct myopic vision in children while simultaneously slowing myopia progression by slowing eye growth.

Shaped like a bullseye, the soft multifocal contact lenses have two basic portions for focusing light. The center portion of the lens corrects nearsightedness so that distance vision is clear, and it focuses light directly on the retina. The outer portion of the lens adds focusing power to bring the peripheral light rays into focus in front of the retina. Animal studies show that bringing light to focus in front of the retina cues the eye to slow growth. The higher the power added, the further in front of the retina it focuses peripheral light.

By comparison, single vision glasses and standard contact lenses focus peripheral light to a point behind the retina, which prompts the eye to keep growing.

The researchers examined whether high-add power contact lenses provided better slowing of myopia progression and eye growth than medium-add power contact lenses. They found that only the high-add power contact lenses produced meaningful slowing of eye growth.

In addition to multifocal lenses, other myopia control options include orthokeratology contact lenses, which are worn overnight to reshape the cornea, or low-dose atropine eye drops used at bedtime.

The U.S. Food and Drug Administration approved one lens for myopia control in November 2019, but multifocal contact lenses have been used off-label to slow myopia progression for many years.

Myopia has surged in prevalence over the past five decades. In 1971, 25% of Americans were myopic, compared to 33% in 2004. By 2050, the worldwide prevalence of myopia is projected to be 54%, and the prevalence of high myopia, the most severe form, is projected to reach 10%. High myopia means that a person requires a correction of at least -5.00 diopters (the diopter is the unit of a lens's focusing power) to see clearly at a distance.

Reasons for the spike are unclear, but evidence suggests that near work, such as screen time, and shrinking outdoor time during early eye development are contributing factors. Genetic factors also play a role in one's predisposition to become myopic.

There are no tests to identify which individuals with myopia will progress to high myopia, but the younger a child is when myopia begins, the more time the condition has to progress if there is no intervention to slow it.

Study participants were 287 myopic children, ages 7 to 11 years. At baseline, the children required -0.75 to -5.00 diopters of correction to achieve clear distance vision. The children were randomly assigned to wear single vision contact lenses or multifocal lenses, the outer portion of which had either high-add power (+2.50 diopters) or medium-add power (+1.50 diopters). They wore the lenses during the day as often as they could comfortably do so. All participants were seen at clinics at The Ohio State University in Columbus or the University of Houston.

After three years, children in the high-add multifocal contact lens group had the slowest progression of their myopia. Mean myopia progression, as measured by changes in the eye prescription required to correct distance vision, was -0.60 diopters for the high-add group, -0.89 diopters for the medium-add group, and -1.05 diopters for the single vision group.

The multifocal lenses also slowed eye growth. The three-year adjusted eye growth was 0.42 mm for the high-add group, 0.58 mm for the medium-add group, and 0.66 mm for the single vision group.

"Greater amounts of myopia and longer eyes are associated with increased prevalence of eye conditions that can lead to visual impairment. Our study shows that eye care practitioners should fit children with high-add power multifocal contact lenses in order to maximize myopia control and the slowing of eye growth," said principal investigator, David A. Berntsen, O.D., Ph.D., associate professor and Golden-Golden professor of optometry at the University of Houston.. "Compared with single vision contact lenses, multifocal lenses slow myopia progression by about 43% over three years."

"There is a clear benefit from multifocal lenses at three years, but further study is needed to determine the ideal duration for wearing the lenses. Researchers will need determine how permanent the prevention of myopia progression will be once children stop wearing the multifocal lenses," said Lisa A. Jones-Jordan, Ph.D., principal investigator of the Data Coordinating Center at the Ohio State University. A follow-up study is underway to see if the benefits hold among children in this study when they go off treatment.

"We also need more information about the exact nature of the visual signals that slow eye growth. If we understood that process better, perhaps we could maximize it to have an even stronger treatment effect," said principal investigator, Donald O. Mutti, O.D., Ph.D., the E.F. Wildermuth Foundation professor of optometry at Ohio State.

Credit: 
NIH/National Eye Institute

ECMO for patients with COVID-19, severe respiratory failure

What The Study Did: We present our experience in using single-access, dual-stage venovenous ECMO (extracorporeal membrane oxygenation), with an emphasis on early extubation of patients while they received ECMO support.

Authors: Antone J. Tatooles, M.D., of Advocate Christ Medical Center in Oak Lawn, Illinois, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamasurg.2020.3950)

Editor's Note: The article includes conflict of interest disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

#  #  #

Media advisory: The full study is linked to this news release.

Embed this link to provide your readers free access to the full-text article This link will be live at the embargo time https://jamanetwork.com/journals/jamasurgery/fullarticle/10.1001/jamasurg.2020.3950?guestAccessKey=fe8b368f-e767-4fb2-a29b-eb4fb25560ea&utm_source=For_The_Media&utm_medium=referral&utm_campaign=ftm_links&utm_content=tfl&utm_term=081120

Credit: 
JAMA Network

Mass General study shows physical distancing slowed growth of COVID-19 in US

BOSTON - Between March 10 and March 25, 2020, all 50 states and the District of Columbia enacted at least one statewide physical distancing measure to help stop the spread of COVID-19. New research from clinicians at Massachusetts General Hospital (MGH) shows these government-issued physical distancing orders significantly slowed the COVID-19 epidemic, leading to an estimated reduction of more than 600,000 cases within three weeks of implementation. The findings were recently published in PLOS Medicine.

"Many have strongly suspected that physical distancing policies helped interrupt COVID-19 transmission during the early days of the U.S. epidemic," said Mark J. Siedner, MD, MPH, an infectious diseases physician at MGH and Associate Professor of Medicine at Harvard Medical School, who co-authored the research. "This study adds clear evidence to support those suspicions. The results show the timing of government-issued orders correlated strongly with reductions in both cases and deaths. In short, these measures work, and policy makers should use them as an arrow in their quivers to get on top of local epidemics where they are not responding to containment measures."

The MGH researchers - in collaboration with colleagues at University College London, the Harvard T.H. Chan School of Public Health, and the Perelman School of Medicine at the University of Pennsylvania - analyzed data from the first five months of the COVID-19 epidemic in the U.S. They collected data on government-issued orders on statewide physical distancing measures and compared changes in COVID-19 cases and COVID-19-attributed deaths in states that implemented physical distancing measures before and after implementation.

The results show the average daily COVID-19 case growth rate began declining approximately one incubation period (i.e., four days) after implementation of the first statewide physical distancing measures. The period of time required for the number of cases to double (the epidemic doubling time) increased from approximately four days to eight days within three weeks of implementation. These findings are consistent with other recently published work. Unique to this study, however, the researchers found that the average daily COVID-19-attributed death rate also began declining after the implementation of physical distancing measures, an outcome that had not previously been analyzed.
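A doubling time stretching from four days to eight corresponds to roughly halving the implied daily growth rate; a short sketch of the standard exponential-growth arithmetic (the doubling times are from the study, the function name is our own):

```python
def implied_daily_growth_rate(doubling_time_days: float) -> float:
    """Daily growth rate r satisfying (1 + r) ** T = 2 for doubling time T,
    assuming constant exponential growth."""
    return 2 ** (1 / doubling_time_days) - 1

before = implied_daily_growth_rate(4)  # ~0.19, i.e. ~19% more cases per day
after = implied_daily_growth_rate(8)   # ~0.09, i.e. ~9% more cases per day
```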

The study looked at a wide array of measures, including school and business closures, restrictions on public gathering, and shelter-in-place orders. Most combinations of these orders appeared to have similarly beneficial effects. Because the different types of physical distancing measures were generally implemented in close temporal proximity to each other, the research team was unable to determine specifically which types of physical distancing measures were most effective.

The findings of the model suggest that statewide physical distancing measures reduced the total number of reported COVID-19 cases by approximately 1,600 cases by one week after implementation and - due to the exponential growth of the spread - by approximately 621,000 cases by three weeks after implementation.
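The widening gap - roughly 1,600 averted cases at one week versus roughly 621,000 at three weeks - is what compounding exponential growth looks like. A toy illustration with entirely hypothetical numbers (the starting caseload and doubling times below are not the study's model, only a sketch of the compounding effect):

```python
# Hypothetical comparison: the same starting caseload doubling every 4 days
# (unchecked) versus every 8 days (after distancing measures take hold).
initial_cases = 10_000  # hypothetical starting point, not from the study

def cases(doubling_time_days: float, day: int) -> float:
    """Case count on a given day under constant exponential growth."""
    return initial_cases * 2 ** (day / doubling_time_days)

gap_week1 = cases(4, 7) - cases(8, 7)
gap_week3 = cases(4, 21) - cases(8, 21)
# The averted-case gap at week three is roughly 20 times the week-one gap,
# even though neither growth rate changes after day zero.
```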

One of the most significant limitations of the study is that the implementation of statewide physical distancing measures was not a controlled experiment. If state governments intensified physical distancing measures in response to worsening local epidemics, the analysis would likely have shown that the policies were less effective. The authors further emphasize that, in many states, people may have begun spontaneously changing their behavior in response to a worsening local epidemic even prior to any statewide measures.

"The findings show that physical distancing measures slowed the growth of the COVID-19 epidemic and saved lives, and also bought our health care leaders some time to fortify their surge capacity to deal with the epidemic," said senior author Alexander C. Tsai, MD, psychiatrist at MGH and Associate Professor of Psychiatry at Harvard Medical School. "Unfortunately, the national response has largely abdicated the responsibilities of planning a coordinated response, so we probably could have done more sooner. This means much of the time before these local measures were implemented was simply wasted."

Added Siedner: "This is a case where past success does not predict future control. COVID-19 case growth appears to be trending upward in many states around the country, including Massachusetts. If containment and more conservative measures fail, we should be prepared to slow or reverse reopening efforts. Until a vaccine is made available and widely deployed in an equitable fashion, we have few other options. Fortunately, our data show that these measures work -- if we have the wherewithal to use them."

Credit: 
Massachusetts General Hospital

SARS-CoV-2 infection among health care workers in hospital

What The Study Did: This study sought to establish the rate of COVID-19 among health care workers through widespread screening for SARS-CoV-2 exposure in a large community hospital.

Authors: Allen Jeremias, M.D., M.Sc., of  St Francis Hospital in Roslyn, New York, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamainternmed.2020.4214)

Editor's Note: Please see the article for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

New guidelines for managing mucositis now available

Updated clinical practice guidelines for managing mucositis, a very common and often debilitating complication of cancer therapy, were recently published in the journal Cancer. Patients experiencing mucositis often require enteral or parenteral nutrition, consume more opioids, and experience more interruptions to cancer therapy than patients who do not experience mucositis.

The new guidelines summary, which will give healthcare professionals better tools to deliver care for cancer patients, is the result of an extensive and meticulous literature review and clinical interpretation by the Multinational Association of Supportive Care in Cancer/International Society of Oral Oncology (MASCC/ISOO). The MASCC/ISOO charged its Mucositis Study Group, composed of 250 experts from 33 countries, with conducting the systematic review.

Led by Sharon Elad, DMD MSc, professor at the University of Rochester Medical Center's Eastman Institute for Oral Health, the Mucositis Study Group's major goal is to improve outcomes of patients experiencing mucositis associated with cancer therapies.

Mucositis affects the inner lining of the oral and gastrointestinal tract. Oral mucositis often leads to difficulty eating and swallowing. Gastrointestinal mucositis is associated with nausea, vomiting, diarrhea, bloating, intestinal cramping, and anal pain. For patients who are immunosuppressed, oral mucositis is associated with greater risk for bacteremia, which has possible systemic implications.

Highlights from this newly published summary paper include additional recommendations for the use of photobiomodulation therapy and benzydamine, as well as a stronger guideline statement for cryotherapy. Each of these guidelines is defined for a specific setting and cancer patient population.

"Interestingly, natural honey had sufficient evidence, when used topically and then swallowed, to suggest possible mucositis prevention for patients with head and neck cancer who receive treatment with either radiotherapy or radio-chemotherapy," said Dr. Elad.

"But it's important to note that the long-term effect of this intervention is unclear at this point," she added. "Even with the best evidence-based interventions, we don't have an ultimate guideline for mucositis in all clinical settings. Future research will hopefully identify better interventions that will relieve the patient's pain and improve quality of life."

This summary paper captures the highlights of a series of frequently cited detailed publications describing the approach to various categories of interventions. This includes the following categories for oral mucositis: (1) anti-inflammatory agents, (2) photobiomodulation therapy, (3) protocols categorized as basic oral care, (4) growth factors and cytokines, (5) antimicrobials, mucosal coating agents, anesthetics, and analgesics, (6) cryotherapy, (7) vitamins, minerals, and nutritional supplements, (8) natural and miscellaneous agents. Likewise, it includes a guidelines publication about all interventions for gastrointestinal mucositis.

The 2019/20 guidelines update is a landmark in the evolution of the mucositis clinical practice guidelines. The first MASCC/ISOO guidelines paper was published in 2003 and was updated in 2007 and 2014. The guidelines are continuously updated by a large multi-disciplinary group of clinicians and scientists using meticulous methods that ensure validity and applicability. Projects carried out by the various MASCC/ISOO Study Groups result in clinical practice guidelines, position papers, publications, and other products that advance supportive care in cancer.

Credit: 
University of Rochester Medical Center

Texas A&M researchers developing first oral anthrax vaccine for livestock, wildlife

image: Dr. Jamie Benn Felix, along with a team of Texas A&M researchers, is working to develop an anthrax vaccine that could be delivered orally to deer and wildlife.

Image: 
Texas A&M University College of Veterinary Medicine & Biomedical Sciences

There may soon be a new weapon in the centuries-old battle against anthrax in wildlife thanks to groundbreaking work at the Texas A&M University College of Veterinary Medicine & Biomedical Sciences (CVMBS).

Anthrax, a disease caused by a bacterium called Bacillus anthracis, contaminates surface soil and grasses, where it may be ingested or inhaled by livestock or grazing wildlife. This is especially common in the western Texas Hill Country, where each year the disease kills livestock and wildlife.

While normally not an attention-grabbing problem, a spike of cases in 2019 made headlines around the state.

According to Dr. Jamie Benn Felix, a postdoctoral research associate in the Cook Wildlife Lab led by CVMBS Department of Veterinary Pathobiology's (VTPB) Dr. Walt Cook, that spike may have been responsible for the deaths of more than 10,000 animals.

"If you assume the economic value for each animal was $1,000, which is probably extremely low given the number of exotic species on some of the ranches, you're looking at an economic loss of $10 million in just a few months," she said. "And given the problems with reporting cases, it could be significantly higher than that."

The good news is that there is already a vaccine for anthrax, which many livestock owners administer annually. Unfortunately, it can only be administered by injection, which is time-consuming for livestock and not feasible for wildlife.

With that in mind, Benn Felix and the Cook Wildlife Lab team, in collaboration with VTPB researchers Dr. Allison Rice-Ficht and Dr. Thomas Ficht, went to work to attempt to create a formulation to deliver the vaccine orally, which would allow for potential distribution to wildlife. She recently published the results of a pilot study in Nature and is now moving on to the next round of tests.

If successful, they will have developed the first effective oral vaccine against anthrax for wildlife.

"The preliminary results showed that this concept has potential, so now we are starting up a deer study and we'll see where it goes from there," Benn Felix said.

Anthrax is among the oldest enemies of microbiologists, and the current vaccination method, which uses what's known as the Sterne strain, is basically the same as it was 85 years ago when Max Sterne developed it; an oral vaccine has therefore been a goal for some time.

In fact, in the past, many livestock owners trying to save time and effort would pour the vaccine over food, but previous testing by Benn Felix proved the ineffectiveness of this method.

The main issue with an oral vaccine is keeping the bacteria alive in the gastrointestinal tract long enough, and in the right amount, to produce the desired immune response in the animal. Previous efforts using different strains of the bacteria and other delivery vehicles have been made to that end, but have thus far not proven effective.

Benn Felix's approach is both simpler and more complex -- simpler, because her approach uses the same strain that has been proven effective for decades, but more complex because of the use of a gel-like suspension.

"Our idea is that with this oral anthrax vaccine, we can get it into a bait of some sort and then easily vaccinate these animals," Benn Felix said. "The formulation that we're using is the same live strain of bacteria from the current commercial vaccine put into a gel-like substance."

Benn Felix compared the release of the vaccine in the gel-like substance, technically known as alginate encapsulation, to a common gumball machine.

"It's the same general idea as those big glass gumball machines you would see in the mall or a store, in which you put a quarter and get a single gumball out," she said. "The gel holds a bunch of the live attenuated bacteria and it gradually releases some of that bacteria over time."

Though they're currently still working at a small scale, Benn Felix and her team are keeping an eye to the distant future and considering how this vaccine might be implemented at a larger scale.

One example they're looking at is what Dr. Tonie Rocke did at the National Wildlife Health Center in Madison, Wisconsin, with a plague vaccine for prairie dogs.

"They put their vaccine into a bait that was flavored with peanut butter flavoring," Benn Felix said. "That is the same general idea that we're going for with this; we would just distribute the baits and then see how many were consumed, or we would have trail cameras that would see if there was any non-target species that ate any of it.

"There are a lot of things that would go into formulating the bait -- making sure the vaccine is still stable and viable when it's in the bait and then seeing how it would affect or be consumed by wildlife or any other wildlife we don't want to have it," she said.

Currently, one of Benn Felix's biggest obstacles is a lack of data on exactly how much damage is caused by anthrax in wildlife in Texas. Her team is actively reaching out to ranchers, hunters and other groups across the state in an effort to increase the reporting on anthrax cases.

"If anthrax outbreaks aren't reported, it appears as if it's not an issue and the federal government and other organizations don't prioritize funding," Benn Felix said. "I didn't realize this was even an issue until I moved to Texas. Reporting outbreaks will help generate critical data about this issue and demonstrate as a fact what we down here already know, which is that it's a huge issue."

Credit: 
Texas A&M University

COVID-19 clinical trials lack diversity

Despite disproportionately higher rates of COVID-19 infection, hospitalization and death among people of color, minority groups are significantly underrepresented in COVID-19 clinical trials, according to a new perspective authored by faculty from the University of Georgia and University of Colorado and pharmacists from Phoebe Putney Memorial Hospital in Albany.

Published by the New England Journal of Medicine, the article calls on government agencies, medical journals, and funders of research, among others, to diversify study participants so that results can be generalized to the larger U.S. population. Lead author is Daniel Chastain, a clinical assistant professor of pharmacy at UGA's Albany campus. Co-authors also include Sharmon Osae and Henry Young from the UGA College of Pharmacy and Joeanna Chastain from Phoebe.

In the nationally funded Adaptive COVID-19 Treatment Trial that is testing the efficacy of the antiviral remdesivir, Black Americans accounted for 20% of the total patient population. In the Gilead-funded clinical trial of the drug, only roughly one out of every 10 patients given remdesivir were Black. Latinx and Native Americans comprised 23% of the former trial and less than 1% of the latter.

"The overwhelming majority of the patients in both of those large clinical trials were Caucasians," said Chastain. "Knowing that African Americans die at a higher rate than Caucasians, can I say that this medication will work in them as well? Yes, they enrolled a bunch of patients and yes they got these data out as fast as possible, but can we use this information to inform treatments in all patients?"

The remdesivir trials showed patients given the drug recovered from COVID-19 slightly faster than those who received placebos, but Black, Indigenous and people of color often experience more severe symptoms and complications from the disease. It remains undetermined whether they will respond as well to the medication.

"Why aren't we putting up infrastructure for clinical trial sites in areas that were heavily hit by COVID?" Chastain said. "If we would've included Albany, those clinical trials would've been more diversified and would've been much more representative of what the coronavirus pandemic looks like in our area and throughout the U.S."

Chastain previously co-authored a paper, published in the Journal of Hospital Medicine, urging health care providers to learn from medical mistakes made during previous disease outbreaks and to be more circumspect in using unproven and undertested treatments and therapies on patients.

"I think the hardest question to address is what's the harm? I have no idea what the potential long-term complications of these treatments may be. We don't know. That's what makes me the most nervous going forward," Chastain said. "We're so prone and we're taught that you always have to 'do something,' but sometimes doing something is the worst thing to do in that scenario."

Credit: 
University of Georgia

Does high blood sugar worsen COVID-19 outcomes?

As COVID-19 continues to rage across the U.S., researchers are digging deeper into how the virus wreaks havoc on the body, especially for those with a pre-existing chronic illness.

Now, after preliminary observations of 200 COVID-19 patients with severe hyperglycemia, a Michigan Medicine team is shedding light, in a new American Diabetes Association paper, on why high blood sugar may trigger worse outcomes in people infected with the virus. The researchers have also developed a blood sugar management tool that may reduce the risk of secondary infections, kidney issues and intensive care stays in people with diabetes, prediabetes or obesity who get COVID-19.

"Based on preliminary observations of our patients, those with one of these pre-existing conditions are at high risk for making the virus-induced respiratory dysfunction much worse, potentially resulting in death," says first author Roma Gianchandani, M.D., a professor of internal medicine in the Michigan Medicine division of metabolism, endocrinology and diabetes.

But why?

Senior author Rodica Pop-Busui, M.D., Ph.D., the Larry D. Soderquist Professor of Diabetes, professor of internal medicine and vice chair of clinical research in the Department of Internal Medicine, suspects it's the low-grade, inflammatory nature of diabetes and hyperglycemia that promotes the virus' inflammatory surge, resulting in insulin resistance and severe hyperglycemia.

"When the body becomes this inflamed, it triggers an abnormal immune response that instead of just attacking the virus, affects the rest of the body's healthy cells and tissue, leading to rapid deterioration in health," she says.

Specifically, these patients are at increased risk of needing mechanical ventilation, kidney replacement therapy due to kidney failure, medications known as vasopressors to counteract dangerously low blood pressure, or steroids to combat acute respiratory distress syndrome.

"All of these complications make blood sugar management more difficult, but our team is convinced this management is essential to prevent complications that lead to prolonged inpatient stays, or morbidity," Gianchandani says. "A recent study has already shown there's a correlation between well-controlled blood sugar and lower levels of inflammatory markers."

The research team developed a tool to identify and manage high blood sugar in COVID-19 patients, placing them into certain risk categories that looked at hyperglycemia severity, presence of obesity, level of insulin resistance, extent of kidney dysfunction and evidence of rapid changes in inflammatory markers.

Implementing an algorithm

The newly created hyperglycemia management teams set out to find a way to monitor patients' diabetes without having to use more personal protective equipment to visit the rooms all the time. It was also important to reduce the health care provider's exposure to the virus as much as possible.

Although typically accurate, a continuous glucose monitor wouldn't be as helpful because a patient's low blood pressure and the use of blood pressure medications could falsely elevate its blood sugar readings.

The new protocol called for insulin delivery every six hours, and at the same time a nurse would check in on the patient. Some patients who were on ventilators or receiving high doses of Vitamin C would get their arterial or venous blood sugar levels checked, replacing the need for the team's blood sugar check.

For patients with the highest blood sugar levels and severe hyperglycemia, insulin infusions were an option until their levels fell within the normal range.

These efforts successfully lowered blood sugar levels without increasing nurse contact, PPE usage, or the overall burden on primary care teams.

"Improving blood sugar control was important in reducing the amount of secondary infections and kidney issues this cohort of patients are susceptible to," Gianchandani says. "This might help shorten ICU stays and lessen the amount of patients that need a ventilator."

It's important to note this algorithm wasn't developed as a result of a clinical trial, but is based solely on preliminary observations in the patients the team followed. A larger, randomized and controlled study is necessary to determine how this algorithm impacts mortality, time to recovery, the length of ICU stays and rate of severe complications.

"Our team is looking forward to the next steps in confirming our hypothesis," Gianchandani says. "In the meantime, I think these observations validate the importance of blood sugar management in COVID-19 patients and can serve as a guide or inspiration for other institutions."

Credit: 
Michigan Medicine - University of Michigan

EULAR: Timely detection of axial spondyloarthritis

A large European cross-sectional study has now examined which factors delay the diagnosis of axSpA (1). A major factor is the number and type of medical professionals who have assessed the patients.

The European League Against Rheumatism (EULAR) therefore recommends initiating measures to prevent misdirected referrals, to facilitate timely referral of patients with a high probability of axSpA to a rheumatologist and consequently speed up the diagnosis.

Axial spondyloarthritis (axSpA) is a chronic systemic inflammatory rheumatic disease associated with inflammation of the spine. "The affected patients often have been experiencing spinal pain since early adulthood", EULAR President Professor Dr Iain B. McInnes, Director of the Institute of Infection, Immunity & Inflammation, The University of Glasgow, Scotland, explains: "Over time, the structure of the axial skeleton changes and its ability to move becomes increasingly limited." Patients experience difficulties getting dressed, bathing, showering, putting on shoes and climbing stairs, among other things. The onset of axSpA affects younger patients, with the first symptoms typically experienced between 20 and 30 years of age.

"Symptoms can vary widely and physicians may initially fail to associate them with the disease," Professor Dr John Isaacs, Director of Therapeutics, The University of Newcastle, UK and Scientific Chair of the EULAR Scientific Committee, explains. He cites pain as a symptom typically affecting both the lower back and upper back, including the neck. "Many years can pass until the correct diagnosis is made. It is important, however, to detect and accordingly treat the disease as early as possible. Timely treatment can help to prevent permanent damage to bones and joints."

The "European Map of Axial Spondyloarthritis (EMAS)" study investigated which factors can impact the time to diagnosis. 2,846 patients from 13 European countries with an average age of 48.1 years, who on average had been suffering from axial spondyloarthritis for 17.2 years, participated in the cross-sectional study.

Professor Dr Marco Garrido-Cumbrera of the University of Seville, Spain, scientific advisor to the Axial Spondyloarthritis International Federation (ASIF), together with the EMAS Steering Committee, presents the findings in a recent study: patients were 26.6 +/- 11.1 years old on average at symptom onset, the mean age at diagnosis was 33.7 +/- 11.5 years and the mean time to diagnosis was 7.4 +/- 8.4 years. Several variables were associated with a longer time to diagnosis, among them younger age at symptom onset and female gender. The most strongly associated factor, however, was the number of medical professionals involved prior to diagnosis: incorrect diagnoses proposed along the way delayed the eventual correct diagnosis by a rheumatologist.

In this large sample, the diagnostic delay averaged 7.4 years but reached up to 15 years in some patients. The burden on patients is enormous, as they suffer disabling pain for many years without even knowing its source. As Garrido-Cumbrera adds: "A diagnosis introduces a specific and appropriate treatment and brings hope for the future."

According to Professor Dr. Denis Poddubnyy from the Charité-Universitätsmedizin Berlin, who also participated in the study, "the fact that visiting a higher number of healthcare professionals delayed the diagnosis shows that there is an urgent need to take measures that prevent misdirected referrals and bring patients with a high probability of axSpA directly to a rheumatologist." "Improving professional training could indeed prevent unnecessary delays in diagnosis", Poddubnyy concludes.

Credit: 
European Alliance of Associations for Rheumatology (EULAR)

Increased breast cancer risk in obesity linked to fat cell chemicals

Obesity increases the release of tumour-promoting molecules from fat tissue and is associated with an increased risk of breast cancer, according to a study published in Endocrine-Related Cancer. The study found that fat tissue from people with obesity released increased amounts of extracellular vesicles (EVs) enriched in harmful, inflammatory molecules into the bloodstream, which can alter breast cancer cells to become more aggressive and invasive. These findings suggest that substances released from fat tissue may increase tumour malignancy, which could lead to new therapeutic targets and improved cancer treatments.

Obesity is a global public health problem with rapidly increasing rates and has been linked to an increased risk of breast cancer. Breast cancer is the most common cancer among women, and despite the well-known association between obesity and increased breast cancer risk, few studies have evaluated the role of fat tissue. Fat tissue is known to release inflammatory molecules that can increase the risk of diabetes, cardiovascular disease and cancer. It also produces EVs that carry inflammatory molecules and other substances, including enzymes and molecules involved in cell-to-cell communication. More of these vesicles are released from the fat tissue of people with obesity. A better understanding of how the contents of EVs may affect cancer cells could help explain the link between obesity and a poor cancer prognosis.

Professor Christina Barja-Fidalgo and her group, from the State University of Rio de Janeiro, analysed the effects of fat tissue-derived EVs on breast cancer cells. Fat tissue was obtained from obese patients who had undergone weight-loss surgery, and from lean people undergoing plastic surgery. These tissues were incubated in culture for 24 hours and the amount and type of substances secreted by each were compared. The analysis showed that fat tissue from obese patients secreted higher amounts of inflammatory molecules and also produced a greater number of small vesicles, which may increase breast cancer risk.

"When the extracellular vesicles carrying inflammatory molecules interact with breast cancer cells, we see they are able to modify their behaviour, so that they become more aggressive with increased capacity to invade other tissues." Prof Barja-Fidalgo explains.

These findings demonstrate how obesity can induce alterations in fat tissue function, which may explain how obesity contributes to increased breast cancer risk, as well as why obese women are more likely to have a worse prognosis.

Prof Barja-Fidalgo adds: "Identifying these harmful fat tissue secretions in the blood of obese patients could be a new parameter to be monitored, as an indicator of cancer progress. Understanding the content of the vesicles released by fat tissue during obesity may provide new therapeutic targets and improve cancer treatment."

Prof Barja-Fidalgo now plans to investigate the characteristics and specific contents of extracellular vesicles released by fat tissue, to determine the main molecules associated with obesity-induced alterations in tumour cells and their modes of action.

Credit: 
Society for Endocrinology

Molecules in urine allow doctors to monitor skin cancer

What if you could simply provide a urine sample rather than undergo a painful surgical procedure to find out if your cancer was responding to treatment? It may seem too good to be true, but researchers at Pavol Jozef Šafárik University in Košice, Slovakia, have identified fluorescent molecules in urine that may allow patients with malignant melanoma to do just that.

Tracking cancer progression is important as it allows doctors to see if someone is responding to treatment. At present, malignant melanoma patients require invasive biopsies to diagnose and track the progression of their cancer. Using this new approach, doctors could ask patients to provide a urine sample instead, and then fluorescent molecules in the sample could reveal disease progression rapidly and inexpensively.

The research article, "Fluorescence biomarkers of malignant melanoma detectable in urine", has been published in De Gruyter's open access journal Open Chemistry. It describes a group of fluorescent molecules - easily detectable in urine - which correlate with melanoma progression, creating new possibilities for monitoring the disease.

This technique is badly needed as malignant melanoma is particularly challenging to treat and monitor. This skin cancer is highly aggressive and frequently spreads to other sites in the body so monitoring its progression is very important. However, current techniques mean that patients have to undergo invasive surgery to remove tissue samples and then lab technicians must perform expensive and time-consuming analysis of these samples. Unfortunately, patients may avoid getting timely diagnosis and treatment as they fear these invasive procedures.

These issues prompted Dr. Ivana Špaková and colleagues to look for an alternative. They focused on specific fluorescent molecules that cancer cells produce during metabolic processes involved in their growth and progression, and which end up in urine.

The researchers analyzed urine samples from patients with malignant melanoma and healthy controls using fluorescence spectroscopy, a simple and inexpensive detection method, to see if there were any differences in levels of the fluorescent markers. They also performed genetic analysis for the same patients to examine genes involved in melanoma progression.

The urine samples from the malignant melanoma patients contained different levels of the metabolism-linked fluorescent markers compared with those from healthy controls. Strikingly, the levels of the fluorescent molecules in the urine correlated with the stage of melanoma and the expression of genes that are linked to melanoma progression, suggesting that the molecules have significant potential as biomarkers.

"Our results show that we can successfully use urine, a simply and non-invasively collected biological material, to determine the progression and treatment response of malignant melanoma," said Špaková. "The results highlight the potential of 'waste metabolites' in monitoring disease. This method is a user friendly and straightforward technique which could be performed using standard laboratory equipment."

Credit: 
De Gruyter

Recipe for success -- interaction proteomics become a household item

image: The MAC-tag workflow allows molecular microscopy of proteins.

Image: 
Ella Maru Studio

Proteins in human cells do not function in isolation, and their interactions with other proteins define their cellular functions. Detailed understanding of protein-protein interactions (PPIs) is therefore key to deciphering the regulation of cellular networks and pathways, in both health and disease.

In a study published in the September issue of Nature Protocols (advance online August 10), a research team led by Research Director Markku Varjosalo from the Institute of Biotechnology & HiLIFE, University of Helsinki, introduces an optimised and integrated interaction proteomics protocol, named MAC-tag technology. The protocol combines two state-of-the-art methods, affinity purification-mass spectrometry (AP-MS) and proximity-dependent biotin identification (BioID), to allow rapid identification of protein-protein interactions and more.

The MAC-tag technology offers an easy way to probe the molecular-level localisation of a protein of interest (an accompanying online MS microscopy resource is available at http://www.proteomics.fi). The MAC-tag and the integrated approach will empower not only the interaction proteomics community but also cell, molecular and structural biologists with an experimentally proven, integrated workflow for mapping in detail the physical and functional interactions and the molecular context of proteins.

"The MAC-tag technology stems from long-term efforts on developing new systems biology tools for systematically studying the molecular interactions of proteins. The identification of protein-protein interactions and their changes in disease settings, such as cancer, has proven in our hands a powerful tool and has allowed us to find exact molecular mechanisms underlying these diseases. In principle, our protocol can be used in so many different ways that we probably have not even envisioned half of them." Dr. Varjosalo states.

The MAC-tag technology is currently in use by Dr. Varjosalo and his consortium of virologists, medicinal chemists and other 'omics' researchers in the search for novel druggable host proteins as therapeutic targets to inhibit SARS-CoV-2 infection and thereby fight COVID-19.

Credit: 
University of Helsinki

Excess weight among pregnant women may interfere with child's developing brain

Obesity in expectant mothers may hinder the development of the babies' brains as early as the second trimester, a new study finds.

Led by researchers at NYU Grossman School of Medicine, the investigation linked high body mass index (BMI), an indicator of obesity, to changes in two brain areas, the prefrontal cortex and anterior insula. These regions play a key role in decision-making and behavior, with disruptions having previously been linked to attention-deficit/hyperactivity disorder (ADHD), autism, and overeating.

In their new study, publishing online August 11 in the Journal of Child Psychology and Psychiatry, the investigators examined 197 groups of metabolically active nerve cells in the fetal brain. Using millions of computations, the study authors divided the groups into 16 meaningful subgroups based on over 19,000 possible connections between the groups of neurons. They found only two areas of the brain where their connections to each other were statistically strongly linked to the mother's BMI.
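The reported counts are consistent with simple combinatorics: 197 groups of neurons yield one possible connection per unordered pair of groups. A minimal sketch (the variable names are illustrative, not from the study):

```python
from math import comb

# 197 groups of metabolically active nerve cells; each unordered pair
# of groups is one possible connection between them.
n_groups = 197
n_connections = comb(n_groups, 2)  # equivalent to n * (n - 1) // 2
print(n_connections)  # 19306 -- the "over 19,000 possible connections"
```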

"Our findings affirm that a mother's obesity may play a role in fetal brain development, which might explain some of the cognitive and metabolic health concerns seen in children born to mothers with higher BMI," says Moriah Thomason, PhD, the Barakett Associate Professor in the Department of Child and Adolescent Psychiatry at NYU Langone Health.

As obesity rates continue to soar in the United States, it is more important than ever to understand how the condition may impact early brain development, says Thomason, who is also an associate professor in the Department of Population Health at NYU Langone.

Previous studies showing an association between obesity and brain development had mostly looked at cognitive function in children after birth. The new investigation is believed to be the first to measure changes in fetal brain activity in the womb, and as early as six months into pregnancy.

Thomason says this approach was designed to eliminate the potential influence of breastfeeding and other environmental factors occurring after birth and to examine the earliest origins of negative effects of maternal BMI on the developing child's brain.

For the investigation, the research team recruited 109 women with BMIs ranging from 25 to 47. (According to the National Institutes of Health, women are considered "overweight" if they have a BMI of 25 or higher and "obese" if their BMI is 30 or higher.) The women were all between six and nine months pregnant.
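For reference, BMI is weight in kilograms divided by height in metres squared; the NIH cut-offs cited above can be sketched as follows (the example weight and height are hypothetical):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def nih_category(value: float) -> str:
    # Cut-offs cited in the article: >= 25 "overweight", >= 30 "obese".
    if value >= 30:
        return "obese"
    if value >= 25:
        return "overweight"
    return "below overweight range"

# Hypothetical example: 80 kg at 1.65 m tall.
print(round(bmi(80, 1.65), 1))      # 29.4
print(nih_category(bmi(80, 1.65)))  # overweight
```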

The research team used MRI to measure fetal brain activity and map patterns of communication between large numbers of brain cells clustered together in different regions of the brain. They then compared the study participants to identify differences, based on BMI, in how groups of neurons communicate with each other.

The investigators caution that their study was not designed to draw a direct line between the differences they found and ultimate cognitive or behavioral problems in children. The study only looked at fetal brain activity. But, Thomason says, they now plan to follow the participants' children over time to determine whether the brain activity changes lead to ADHD, behavioral issues, and other health risks.

Credit: 
NYU Langone Health / NYU Grossman School of Medicine

Study points out opioid risks for patients transitioning to skilled nursing facilities

PORTLAND, Ore. - Hospital patients discharged to skilled nursing facilities often bring a high-dose painkiller prescription with them, suggesting more attention should be paid to opioid safety for those patients, research from the Oregon State University College of Pharmacy shows.

The findings are important because they shed light on an understudied aspect of the opioid-fueled public health crisis that has gripped the United States for more than two decades.

Also, 61% of the patients in the study who received an opioid prescription upon hospital discharge were older than 65 - an age demographic that carries a high risk of opioid-associated harm.

"Increased efforts are likely needed to optimize opioid prescribing among patients transitioning from hospitals to skilled nursing facilities," said corresponding author Jon Furuno, an associate professor and the interim chair of the Department of Pharmacy Practice.

The study was published in Pharmacoepidemiology and Drug Safety.

Traced to over-prescribing that began in the 1990s, the opioid epidemic claims more than 40,000 American lives per year, according to the U.S. Department of Health and Human Services. Ten million people a year misuse prescription opioids, and 2 million suffer from an opioid use disorder.

"An estimated 130 people die each day in this country due to an opioid overdose," Furuno said. "And prescription opioid misuse in the United States also does economic damage of more than $78 billion per year."

Over a one-year period, Furuno and collaborators at Oregon State, Oregon Health & Science University and the University of Massachusetts Medical School looked at 4,374 hospital patients who were discharged to a skilled nursing facility - a facility for people to receive short-term, rehabilitative care or long-term residential care.

Seventy percent of the patients received an opioid prescription upon hospital discharge, and 68% of those prescriptions were for oxycodone - 1.5 times as potent as morphine. Moreover, more than half of the prescriptions had a daily morphine milligram equivalent of 90 or higher - a threshold the Centers for Disease Control and Prevention says prescribers should "avoid" or "carefully justify."
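The 90-MME threshold combines dose and potency: a daily oxycodone dose is converted to morphine milligram equivalents by multiplying by 1.5, the conversion factor implied above. A minimal sketch (the example dosing regimen is hypothetical):

```python
# Oxycodone is 1.5x as potent as morphine, per the conversion factor
# cited in the article.
OXYCODONE_MME_FACTOR = 1.5

def daily_mme(daily_dose_mg: float, factor: float) -> float:
    """Daily morphine milligram equivalents for a single opioid."""
    return daily_dose_mg * factor

# Hypothetical example: 15 mg oxycodone four times a day = 60 mg/day.
mme = daily_mme(60, OXYCODONE_MME_FACTOR)
print(mme)        # 90.0
print(mme >= 90)  # True: at the threshold CDC says to avoid or carefully justify
```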

"Being a surgical patient, being female, having a diagnosis of cancer or chronic pain, and receiving an opioid on the first day of hospital admission were all independently associated with the likelihood of receiving an opioid prescription upon discharge to a skilled nursing facility," Furuno said. "For patients or residents in those facilities, opioid risks are often compounded by the fact many of them are taking multiple drugs for multiple conditions."

Also, some of those patients are frail and suffer from cognitive impairment that can make safe opioid prescribing more challenging.

"And notably, skilled nursing facility residents are also often undertreated for pain," Furuno said. "These results support the complexity and need to optimize opioid prescribing in this patient population."

Future research, he added, should look at the frequency of inappropriate opioid prescribing among patients leaving hospitals for skilled nursing facilities; prescribing practices within hospitals; and outcomes as patients go home from skilled nursing facilities.

"Complicating matters is the growing number of joint replacement and other surgical patients who receive opioid prescriptions, may stay in a skilled nursing facility for just a short time and then are discharged back into their communities," Furuno said. "Prescribers and pharmacists need to work together to ensure patients' pain is managed safely, and knowing which patients are most at risk can inform the best use of resources like medication counseling and other interventions."

Credit: 
Oregon State University

Brain activity during psychological stress may predict chest pain in people with heart disease

DALLAS, August 10, 2020 — Stress-induced activity in the inferior frontal lobe of the brain may have a direct correlation with chest pain among people with coronary artery disease, according to new research released today in Circulation: Cardiovascular Imaging, an American Heart Association journal.

Angina pectoris, or stable angina, is chest pain or discomfort due to inadequate blood flow to the heart. Angina occurs due to coronary artery disease, and, while recent research has indicated psychological factors including mental stress can lead to angina, little is known about the brain mechanisms involved. This study was designed to measure how activity in the inferior frontal lobe of the brain – the area of the brain associated with emotional regulation and stress – affects the severity of self-reported angina.

“Our study sought to understand the degree to which health care providers should incorporate stress and other psychological factors when evaluating and treating angina,” said lead investigator Amit J. Shah, M.D., M.S.C.R., assistant professor of epidemiology at Emory University’s Rollins School of Public Health in Atlanta. “Although brain imaging during a mental stress challenge is not a test that can be ordered in clinical settings, the study shows an important proof-of-concept that shows the brain’s reactivity to stress is an important consideration when considering angina treatment.”

A total of 148 people with coronary artery disease participated in the study from 2011 to 2014. The study participants were, on average, 62 years old, and consisted of 69% men and 31% women. The group underwent testing of mentally stressful events in a clinical setting and had brain imaging and cardiac imaging conducted in conjunction with the tests.

Participants were assessed with three tests that were performed over a two-week period: a mental stress test with brain imaging; a mental stress test with heart imaging; and an exercise or chemical stress test with heart imaging. During these tests, the researchers monitored participants for chest pain. Additional questionnaires for chest pain and cardiovascular events were assessed after two years.

Investigators examined factors related to the severity of the participants’ angina and observed that brain activity in the inferior frontal lobe showed the strongest relationship with self-reported angina at baseline and also at a two-year follow-up appointment. The results indicated that:

participants who reported having monthly, weekly or daily angina symptoms had higher inferior frontal lobe activity in response to mental stress at both baseline and the two-year follow-up;
those who reported angina during mental stress testing with cardiac imaging also had higher inferior frontal lobe activation compared to individuals who did not have active chest pain during mental stress testing; and
there was a significant association between inferior frontal lobe activation during stress and the degree of change in angina frequency at the two-year follow-up, suggesting that brain-related changes might predict worsened future angina.

“We were surprised by the strength of the relationship between the level of activity in this brain region and the frequency of chest pain reported, as well as the lack of a relationship to factors that are normally considered important when treating angina, such as heart imaging,” said Shah. “The top three factors that explained angina frequency were all stress related, including brain activation, depressive symptoms and PTSD symptoms. This is surprising because when we manage angina in clinical settings, we normally do not consider stress as an underlying factor, and rather focus on blood flow in the heart.”

The authors note the study did have some limitations in that the testing protocols may not reflect real-life stressors, potentially leading to an underestimation of the role stress plays on angina. They also point out the angina questionnaire was answered retrospectively by participants, as opposed to a diary-style log, yet this method of information gathering is not unusual during clinical trials. Additionally, the associations they found suggest but do not prove a cause-and-effect relationship between brain stress reactivity and angina.

Credit: 
American Heart Association