Culture

Using personalized medicine to avoid resistance to leukemia treatment

T-cell acute lymphoblastic leukemia (T-ALL) is a very aggressive type of blood cancer. It is relatively rare but draws a lot of attention because it mostly develops in children under the age of 20. The standard treatment for T-ALL is intensive chemotherapy, which produces favorable outcomes with an overall survival of 75% after 5 years.

However, some patients do not respond to this treatment, or they only respond for a short period, after which the disease grows back. These patients therefore need alternative therapies.

Researchers from the Faculty of Health and Medical Sciences, University of Copenhagen, have now identified a combination treatment, which could potentially benefit patients that do not respond to standard chemotherapy.

'Our study suggests that we could use personalized medicine to target the cancer cells in the subgroup of T-ALL patients that do not initially respond to or stop responding to the standard chemotherapy. By combining two specific protein inhibitors, we have shown that we can obtain a strong and durable effect on leukemia cell growth. This might improve the overall survival of T-ALL patients', says Giulia Franciosa, Assistant Professor at the Novo Nordisk Foundation Center for Protein Research.

Targeting two proteins to avoid resistance

The majority of T-ALL patients have mutations in the so-called Notch1 gene. This mutation causes a cell surface receptor to induce cancer cell growth. By using a drug that inhibits this receptor, it is possible to stop the cancer cells from dividing and growing. Unfortunately, the cancer cells are good at adapting and in many cases develop resistance towards the Notch-inhibitor.

'The challenge we are facing with drug resistance is very hard to overcome as long as we are only targeting one protein, in this case the Notch1 receptor, at a time. That is why we have been looking for a therapy option that targets two proteins at the same time, making it much more difficult for the cancer cells to develop resistance. And we found one', says Giulia Franciosa.

Mass spectrometry proteomics gives unbiased answers

By comparing cells that are sensitive to Notch-inhibition with cells that are resistant - either from the outset or after developing resistance over time - the researchers identified a specific signaling protein responsible for the drug resistance: Protein Kinase C. By targeting both proteins at the same time, they were able to eliminate the resistance.

'We used high-resolution mass spectrometry based proteomics to study the underlying molecular mechanisms that cause the resistance. The proteomics technology allows us to analyze the entire set of proteins, the proteome, present in a cell at the same time. By using this technique, we can map out differences and similarities between the responsive and non-responsive cells in an unbiased way. And that is how we found that Protein Kinase C activity is upregulated in resistant cells', says Jesper Velgaard Olsen, Professor at the Novo Nordisk Foundation Center for Protein Research.
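
To make the comparison concrete, here is a minimal sketch, in Python, of the kind of differential-abundance test such a proteomics workflow relies on. The protein names, intensity values and thresholds are invented for illustration and are not taken from the study; a real analysis would also correct for multiple testing across thousands of proteins.

```python
# Hypothetical sketch: flag proteins upregulated in Notch-inhibitor-resistant cells.
# All protein names, intensities and cutoffs are invented for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# log2 protein intensities for four replicate samples per condition
sensitive = {"PRKCA": rng.normal(20.0, 0.3, 4),
             "NOTCH1": rng.normal(23.0, 0.3, 4),
             "GAPDH": rng.normal(25.0, 0.3, 4)}
resistant = {"PRKCA": rng.normal(21.5, 0.3, 4),   # simulated upregulation
             "NOTCH1": rng.normal(23.1, 0.3, 4),
             "GAPDH": rng.normal(25.0, 0.3, 4)}

for name in sensitive:
    log2_fc = resistant[name].mean() - sensitive[name].mean()
    _, p_val = stats.ttest_ind(resistant[name], sensitive[name])
    flag = "UP in resistant cells" if (log2_fc > 1 and p_val < 0.05) else ""
    print(f"{name}: log2 fold change = {log2_fc:+.2f}, p = {p_val:.3g} {flag}")
```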

The researchers hope that their findings in time can be used in the treatment of T-ALL patients who do not tolerate or respond to standard chemotherapy.

Credit: 
University of Copenhagen - The Faculty of Health and Medical Sciences

People with disabilities faced pandemic triage biases

When COVID-19 patients began filling up ICUs throughout the country in 2020, health care providers faced difficult decisions: they had to decide which patients were most likely to recover with care and which were not, so that resources could be prioritized.

But a new paper from the University of Georgia suggests that unconscious biases in the health care system may have influenced how individuals with intellectual disabilities were categorized in emergency triage protocols.

The state-level protocols, while crucial for prioritizing care during disasters, frequently allocated resources to able-bodied patients over ones with disabilities, the researchers found.

The study, published in Disaster Medicine and Public Health Preparedness, found that some states had emergency protocols saying that individuals with brain injuries, cognitive disorders or other intellectual disabilities may be poor candidates for ventilator support. Others had vague guidelines that instructed providers to focus resources on patients who are most likely to survive. Adults with disabilities are significantly more likely to have comorbidities, such as heart disease and diabetes. In the case of COVID-19, those conditions were considered risk factors for poor outcomes, relegating these patients to the bottom of the care hierarchy.

To compound the problem, COVID-19 hospital protocols that banned visitors often shut out family members and advocates who might otherwise have spoken up for these individuals. For patients unable to communicate their needs, the situation could easily turn deadly.

"I think when you leave people out of the conversations making these decisions, you see an issue like structural discrimination and bias," said Brooke Felt, lead author of the paper who graduated from UGA in 2020 with Master of Social Work and Master of Public Health degrees.

"Ableism, which is when people discriminate against those with disabilities and favor people with able bodies, is just so ingrained into the health care system. It is definitely a bias that a lot of people have, and sometimes people don't even recognize it."

Priority of care

Triaging, or prioritizing care and resources during emergencies or disasters, can be a subjective process. State protocols offer a degree of guidance about how to allocate resources, but in the moment, decisions about care often come down to individual health care providers.

In the case of the COVID-19 pandemic, hospitals were overwhelmed with patients and frequently had to make these judgment calls without extensive medical histories or patient advocates.

Curt Harris, corresponding author of the paper and director of the Institute for Disaster Management in UGA's College of Public Health, stressed that this research isn't an attack on clinical providers, who have shouldered an enormous burden throughout the pandemic.

"I do not believe clinicians are deliberately doing this," he said. "I just don't think they have been given the requisite education needed for population-level health issues, nor is it easy for clinicians to reconcile what constitutes high quality of life for patients with intellectual disabilities. This is more of an educational opportunity for clinicians to recognize that an issue exists and begin to make systemic changes, so we do not repeat the same mistakes in the future."

Addressing inequities

At the center of these emergency protocols is the underlying implication that an able-bodied person is more worthy of life-saving treatments than one who is intellectually disabled, Felt said.

One way to address this oversight would be to integrate social workers into the emergency response process, the authors said. They can act as advocates for those who can't speak on their own behalf.

"Involving social workers means you may be more likely to have someone who recognizes the structural inequalities, biases and discrimination and can bring those issues more into focus so they can be addressed," said Felt.

Altering pre-medical and medical education curricula to incorporate training on how to work with individuals with disabilities--something that often isn't covered--could also go a long way in closing the gap in care.

Felt had asked a friend in a medical program about bias and discrimination training in his coursework. She was shocked to learn his clinical education did not cover the topic.

"I feel like that should be a foundational class," she said. "That's something that definitely needs to change."

Credit: 
University of Georgia

People with familial longevity show better cognitive aging

(Boston)--If you come from a family where people routinely live well into old age, you will likely have better cognitive function (the ability to clearly think, learn and remember) than peers from families where people die younger. Researchers affiliated with the Long Life Family Study (LLFS) recently broadened that finding in a paper published in Gerontology, suggesting that people who belong to long-lived families also show slower cognitive decline over time.

The Long Life Family Study has enrolled over 5,000 participants from almost 600 families and has been following them for the past 15 years. The study is unique in that it enrolls individuals belonging to families with clusters of long-lived relatives. Since 2006, the LLFS has recruited participants belonging to two groups: the long-lived siblings (also called the proband generation) and their children. Since they share lifestyle and environmental factors, the spouses of these two groups have also been enrolled in the LLFS as a referent group.

To assess cognitive performance, the researchers administered a series of assessments meant to test different domains of thinking, such as attention, executive function and memory, to the study participants over two visits approximately eight years apart. This allowed the researchers to ask whether individuals from families with longevity have better baseline cognitive performance than their spouses do and whether their cognition declines more slowly over time than that of their spouses.

To study this question, LLFS researchers used a model to determine the change in score on several neuropsychological tests from one visit to the next. "This model allows us to assess both the cross-sectional effect of familial longevity at baseline visit and the longitudinal effect over follow-up time," says co-lead author Mengtian Du, a doctoral student in biostatistics at Boston University School of Public Health.
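
To illustrate how a single model can capture both a baseline difference and a difference in the rate of decline, the sketch below fits a linear mixed model to simulated two-visit data in Python. It is not the LLFS analysis itself; the group labels, effect sizes and noise levels are assumptions made purely for the example.

```python
# Illustrative sketch only: a mixed model separating a cross-sectional (baseline) group
# effect from a group difference in slope over follow-up time. All data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for group, baseline_shift, slope in [("family_longevity", 2.0, -0.20),
                                     ("spouse_referent", 0.0, -0.35)]:
    for subject in range(100):
        intercept = 50 + baseline_shift + rng.normal(0, 3)
        for years in (0, 8):   # two visits roughly eight years apart
            rows.append({"id": f"{group}_{subject}", "group": group, "years": years,
                         "score": intercept + slope * years + rng.normal(0, 1.5)})

df = pd.DataFrame(rows)

# Random intercept per participant; the 'group:years' term captures the difference
# in the rate of cognitive change between the two groups.
model = smf.mixedlm("score ~ group * years", df, groups=df["id"]).fit()
print(model.summary())
```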

They showed that individuals from long-lived families performed better than their spouses on two tests: a symbol coding test, which has participants match symbols to their corresponding numbers and provides insight into psychomotor processing speed, attention, and working memory, and a paragraph recall test, which asks participants to remember a short story and assesses episodic memory. The researchers from the LLFS also found that individuals in the younger generation (participants born after 1935) exhibited a slower rate of cognitive decline on the symbol coding test than did their spouses.

"This finding of a slower decline in processing speed is particularly remarkable because the younger generation is relatively young at an average age of 60 years and therefore these differences are unlikely to be due to neurodegenerative disease," explains corresponding author Stacy Andersen, PhD, assistant professor of medicine at Boston University School of Medicine. "Rather we are detecting differences in normal cognitive aging."

According to Andersen this suggests that people with familial longevity demonstrate resilience to cognitive aging. "By studying the LLFS families we can learn about the genetics, environmental factors, and lifestyle habits that are essential in optimizing cognitive health throughout the lifespan."

Credit: 
Boston University School of Medicine

Confirmation of an auroral phenomenon discovered by Finns

image: A new auroral phenomenon discovered by Finnish researchers a year ago is probably caused by areas of increased oxygen atom density occurring in an atmospheric wave channel. The speculative explanation offered by the researchers gained support from a new study.

Image: 
Graeme Whipps

Observations made by University of Helsinki researchers have strengthened the proposed mechanism by which the type of aurora borealis named 'dunes' is born. In the new study, photographs of the phenomenon taken by an international group of hobbyists in Finland, Norway and Scotland were compared with concurrent satellite data.

The rare type of aurora borealis was seen in the sky on 20 January 2016 and recorded in photos taken by several hobbyists.

"The dunes were seen for almost four hours in a very extensive area, with the pattern extending roughly 1,500 kilometres from east to west and some 400 kilometres from north to south," says Postdoctoral Researcher Maxime Grandin from the Centre of Excellence in Research of Sustainable Space coordinated by the University of Helsinki.

Useful photographic and video material was collected in close cooperation with Finnish aurora borealis hobbyists, utilising both the internet and social media. Among other things, a time lapse video shot on the night in question by a Scottish hobbyist was found. The video was used to estimate the dunes' propagation speed at over 200 m/s.
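
For context, the speed estimate comes down to simple kinematics: divide the distance a dune crest appears to travel across the ground by the time elapsed between frames. The numbers below are invented to show the arithmetic and are not taken from the actual footage.

```python
# Back-of-the-envelope sketch of the propagation-speed estimate (illustrative numbers).
displacement_km = 45.0      # apparent crest displacement mapped on the ground
elapsed_s = 3.5 * 60        # time between the two video frames used

speed_m_per_s = displacement_km * 1000 / elapsed_s
print(f"Estimated propagation speed: {speed_m_per_s:.0f} m/s")  # roughly 214 m/s, i.e. over 200 m/s
```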

The study was published in the esteemed AGU Advances journal.

Validity of the wave guide theory confirmed

Northern Lights are born when charged particles ejected by the Sun, such as electrons, collide with oxygen atoms and nitrogen molecules in Earth's atmosphere. The collision momentarily excites the atmospheric species, and this excitation is released in the form of light.

New types of aurora borealis are rarely discovered. The identification of this new auroral form last year was the result of an exceptional collaboration between hobbyists who provided observations and researchers who started looking into the matter.

The new auroral form named dunes is relatively rare, and its presumed origin is peculiar.

"The differences in brightness within the dune waves appear to be caused by the increased density of atmospheric oxygen atoms," says Professor Minna Palmroth.

A year ago, researchers at the Centre of Excellence in Research of Sustainable Space concluded that the dune-like shape of the new auroral emission type could be caused by concentrations of atmospheric oxygen. This increased density of oxygen atoms is assumed to be brought about by an atmospheric wave known as a mesospheric bore travelling horizontally within a wave guide established in the upper atmosphere.

This rare wave guide is created between the mesopause - the upper boundary of the atmospheric layer known as the mesosphere - and an inversion layer that is intermittently formed below it. This enables waves of a certain wavelength to travel long distances through the channel without subsiding.

The electron precipitation and temperature observations made in the recently published study supported the interpretations of the dunes' origins made a year earlier. An independent observation was made of the wave channel appearing in the area of the dunes, but there are no observation data for the mesospheric bore itself yet.

"Next, we will be looking for observations of the mesospheric bore in the wave guide," Maxime Grandin says.

According to the observation data, electron precipitation occurred in the area where the dunes appeared on 20 January 2016. Therefore, it is highly likely that electrons having the appropriate energy to bring about auroral emissions at an altitude of roughly 100 kilometres were involved. The observations were collected by the SSUSI instrument carried by a DMSP satellite, which measures, among other things, electron precipitation.

On the night in question, there was an exceptionally strong temperature inversion layer in the mesosphere, or a barrier generated by layers of air with different temperatures. The inversion layer associated with the origins of the wave channel was measured with the SABER instrument carried by the TIMED satellite. The observation supports the hypothesis according to which the auroral form originates in areas of increased oxygen density occurring in the upper atmosphere wave guide.

Credit: 
University of Helsinki

New look at a bright stellar nursery

image: This composite combines radio (orange) and infrared images of the W49A molecular cloud, where young stars are forming.

Image: 
DePree, et al.; Sophia Dagnello, NRAO/AUI/NSF; Spitzer/NASA.

This overlay shows radio (orange) and infrared images of a giant molecular cloud called W49A, where new stars are being formed. A team of astronomers led by Chris DePree of Agnes Scott College used the National Science Foundation's Karl G. Jansky Very Large Array (VLA) to make new, high-resolution radio images of this cluster of still-forming, massive stars. W49A, 36,000 light-years from Earth, has been studied for many decades, and the new radio images revealed some tantalizing changes that have occurred since an earlier set of VLA observations in 1994 and 1995.

The VLA radio images show the shape and movement of giant clouds of ionized hydrogen gas formed by the intense ultraviolet radiation from young stars. Comparing old and new VLA images of these ionized regions has shown changes indicating new activity in some of the regions. This new activity includes a narrow, fast-moving jet in one region, supersonic gas motions in three others, and an unexpected reduction in the radio brightness in another.

The astronomers, who reported their findings in the Astronomical Journal, plan to continue observing this region regularly to track changes that will reveal new details about the complex processes of star formation and interactions of the outflows from young stars.

Credit: 
National Radio Astronomy Observatory

Chronic exposure to low levels of blast may be associated with neurotrauma

Scientists at the Walter Reed Army Institute for Research demonstrated that biomarkers associated with traumatic brain injury were elevated among law enforcement and military personnel, particularly in active duty participants with longer duration of service. Most notably, these elevated biomarker levels were observed in individuals without a diagnosed brain injury or concussion.

Some law enforcement and military personnel are regularly exposed to low levels of blast, particularly during training, due to the use of explosive charges and high caliber weapons. Understanding effects from these occupational exposures is a military health care priority to improve diagnosis and mitigation of ill effects.

While repeated exposure to low level blast is not known to result in clinically diagnosed traumatic brain injury, exposures have been linked to a series of reported symptoms such as headaches, fatigue, dizziness, memory difficulties, and tinnitus (ringing in the ears) -- collectively referred to as "breacher's brain" among members of affected communities.

This study, published in the Journal of the American Medical Association, measured neurotrauma biomarker concentrations in blood samples from 106 military and law enforcement personnel who were not actively engaged in training or physical activity at the time of blood collection and compared those concentrations with commercially available samples from individuals who were similar in sex and age but unlikely to have been exposed to blast.
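
As a rough illustration of this kind of group comparison (not the study's actual statistical analysis), the sketch below contrasts simulated concentrations of a single biomarker in exposed personnel against matched control samples using a nonparametric test; all values are made up.

```python
# Hypothetical sketch: compare one biomarker between exposed personnel and controls.
# The simulated concentrations below are illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
exposed = rng.lognormal(mean=1.2, sigma=0.4, size=106)    # e.g. pg/mL in exposed group
controls = rng.lognormal(mean=1.0, sigma=0.4, size=106)   # matched control samples

stat, p_val = stats.mannwhitneyu(exposed, controls, alternative="greater")
print(f"median exposed = {np.median(exposed):.2f}, "
      f"median control = {np.median(controls):.2f}, p = {p_val:.3g}")
```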

"We found that five biomarkers previously associated with TBI and brain diseases were elevated among personnel when compared to controls," said Dr. Angela Boutte, lead author on the paper and a researcher at the WRAIR Brain Trauma Neuroprotection branch. "Given the difficulty of identifying and evaluating injury associated with repeated low level blast exposure, we hope these data are the first step in our collective goal to identify objective biomarkers as clinically relevant diagnostic tools."

Dr. Bharani Thangavelu and Dr. Walter Carr, WRAIR brain health researchers and co-authors, emphasized the potential impact of blast exposure experienced by military personnel stating, "Low level blast exposure in routine military training should not be expected to result in acute, gross behavioral deficits for the majority of personnel. However, repeated exposure across years does correlate with symptomology, especially when a history of chronic exposure is exacerbated by new, large magnitude exposures."

Efforts to identify and quantify the impact of blast and traumatic brain injury on Service Members have increased dramatically in recent years, including initiatives in response to Congressional mandates. Biomarkers of blast effects on brain health will be a useful tool in this effort, especially as tools that augment decision-making based on symptoms reported by personnel.

Credit: 
Walter Reed Army Institute of Research

Cheap but desirable: Generic drugs a great alternative to the brand-names for hypertension

image: Study shows that generic hypertension drugs could be as good as their brand-name counterparts in lowering blood pressure over the long term

Image: 
Chinese Medical Journal

Hypertension is a common medical condition and a primary cause of cardiovascular diseases and stroke worldwide. Unfortunately, as Professor Wei-Li Zhang of the Chinese National Center for Cardiovascular Diseases notes, "the unaffordability of drugs is a major barrier to medication adherence among patients living in low- and middle-income areas."

One of the countries where hypertension is becoming a major problem is China where, researchers estimate, between 244 million and 300 million adults are living with hypertension. But true to Prof. Zhang's words, most cases of hypertension in China are not adequately controlled, and patients who do seek treatment for hypertension must pay for outpatient clinic visits and bear medication costs out-of-pocket.

An emerging option for controlling the costs of antihypertensive pharmacotherapy is to prescribe generic drugs, which have the same active ingredients as their brand-name counterparts but cost substantially less. Several countries have taken steps to encourage the prescription of generic drugs as a way of reducing healthcare spending and enabling more patients to afford their medications, but some clinicians have reservations about prescribing generics due to concerns that such drugs may not be as effective as their brand-name counterparts. A team of researchers led by Prof. Zhang therefore decided to conduct an investigation to determine whether generic antihypertensive drugs are as effective as brand-name drugs at controlling blood pressure over the long term. Their results appear in a paper recently published in Chinese Medical Journal.

In their study, they analyzed data from a cohort study of patients with hypertension conducted across 18 hospitals in 12 Chinese provinces. Baseline screenings occurred between 2013 and 2015, and follow-up assessments continued until August 1, 2017. The researchers focused on 2,176 study participants who used brand-name antihypertensive drugs and 4,352 participants who used generic drugs. They conducted statistical analyses to compare the two groups in terms of changes in blood pressure over the follow-up period.

They found that generic drugs were just as effective as brand-name drugs at lowering blood pressure. Between the group using generic drugs and that using brand-name drugs, the percentages of patients who achieved well-controlled blood pressure levels (defined here as systolic blood pressures below 140 mmHg and diastolic blood pressures below 90 mmHg) and the likelihoods of experiencing adverse cardiovascular outcomes, such as coronary heart disease and stroke, were similar. After adjusting for age, sex, body mass index values, the number of antihypertensive drugs taken, and traditional cardiovascular risk factors, the mean reduction in systolic blood pressure was 7.9 mmHg for participants who took brand-name drugs and 7.1 mmHg for participants who took generic drugs.
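
The kind of covariate-adjusted comparison described above can be illustrated with a simple regression on simulated data, as sketched below. The covariates mirror those listed in the article, but the dataset, effect sizes and model choice are assumptions made for illustration, not the authors' analysis.

```python
# Illustrative sketch only (simulated data): covariate-adjusted comparison of the change
# in systolic blood pressure between generic and brand-name users.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 2000
df = pd.DataFrame({
    "generic": rng.integers(0, 2, n),      # 1 = generic drug user, 0 = brand-name user
    "age": rng.normal(58, 10, n),
    "male": rng.integers(0, 2, n),
    "bmi": rng.normal(25, 3, n),
    "n_drugs": rng.integers(1, 4, n),
})
# Simulated outcome: change in systolic BP (negative = reduction), small group difference.
df["sbp_change"] = (-7.5 + 0.6 * df["generic"] - 0.03 * (df["age"] - 58)
                    + rng.normal(0, 8, n))

model = smf.ols("sbp_change ~ generic + age + male + bmi + n_drugs", df).fit()
print(f"adjusted group difference: {model.params['generic']:.2f} mmHg "
      f"(p = {model.pvalues['generic']:.3f})")
```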

However, some differences in outcomes did emerge when the researchers examined specific participant subgroups. Among participants who were younger than 60 years old, those who took brand-name drugs were more likely to achieve well-controlled blood pressure levels than their counterparts who took generic drugs were. A similar difference between brand-name drugs and generic drugs emerged when the investigators restricted their analyses to men with hypertension.

Despite the differences observed in the subgroup analyses, the results provide strong evidence overall for generic drugs being just as effective as brand-name ones at lowering blood pressure levels. Importantly, the generic drugs achieve these effects at a much lower cost; a 1-mmHg reduction in systolic blood pressure turned out to be US$315.40 cheaper with generic drugs than with brand-name ones. This cost difference means that treatment with generic drugs is only half as expensive as treatment with brand-name drugs, and this will doubtless mean a great deal to the many people in China and beyond who live on low-to-medium incomes.

That's why Prof. Zhang expresses hope that these findings will provide "an impetus for physicians and patients to preferentially use generic drugs instead of expensive brand-name drugs to lower blood pressure levels." By expanding access to antihypertensive treatment, the use of generic drugs will provide important benefits to public health within China and other developing countries.

Credit: 
Cactus Communications

Testing tool can quickly distinguish between viral and bacterial infections

DURHAM, N.C. - When patients complain of coughing, runny nose, sneezing and fever, doctors are often stumped because they have no fundamental tool to identify the source of the respiratory symptoms and guide appropriate treatments.

That tool might finally be on its way. In a study proving feasibility, researchers at Duke Health showed that their testing technology can accurately distinguish between a viral and a bacterial infection for respiratory illness - a critical difference that determines whether antibiotics are warranted. And, importantly, the test provided results in under an hour.

"This is exciting progress," said study lead Ephraim Tsalik, associate professor in the departments of Medicine and Molecular Genetics and Microbiology at Duke University School of Medicine.

"We've been working on this for over a decade," Tsalik said. "We knew in 2016 that our test worked in the research setting, but it's always been our goal to have a test that could produce results rapidly, while patients are at their doctor's office. It's important that the distinction can be made quickly to ensure that antibiotics are not inappropriately prescribed."

Tsalik and colleagues published the results of their study in the journal Critical Care Medicine; the findings confirm the test's accuracy, with results available in under an hour.

The researchers have developed a gene expression method that diverges from current diagnostic strategies, which focus on identifying specific pathogens. The current tests are time-consuming and can only identify a pathogen if it's specifically targeted by the test in the first place.

Host gene expression, however, looks for a distinct immune signal that is unique to the type of infection the body is fighting. The immune system activates one set of genes when fighting bacterial infections and a different set of genes in response to a viral infection. After the team discovered these gene expression signatures for bacterial and viral infection, they collaborated with BioFire Diagnostics, a company that specializes in molecular diagnostics, to develop this first-of-its kind test.

In a multisite study of more than 600 patients presenting to hospital emergency departments with respiratory infections, the tests identified bacterial infections with 80% accuracy and viral infections with nearly 87% accuracy. The current standard tests have about 69% accuracy. Tests provided results in less than an hour, and their accuracy was confirmed retrospectively using two different methods.
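
For readers wondering how such headline figures are computed, the sketch below shows the arithmetic from a hypothetical confusion matrix; the counts are invented solely to reproduce percentages of the same order as those reported.

```python
# Illustrative accuracy calculation from invented counts (not the study's data).
def accuracy(true_pos, true_neg, false_pos, false_neg):
    correct = true_pos + true_neg
    return correct / (correct + false_pos + false_neg)

# Hypothetical counts among ~600 patients for the bacterial-infection call
print(f"Bacterial call accuracy: {accuracy(120, 360, 60, 60):.0%}")   # 80%
# Hypothetical counts for the viral-infection call
print(f"Viral call accuracy: {accuracy(340, 180, 40, 40):.0%}")       # about 87%
```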

"Acute respiratory illness is the most common reason that people visit a health care provider when feeling sick," Tsalik said. "Patients with these symptoms are inappropriately treated with antibiotics far too often due to challenges in discriminating the cause of illness, fueling antibiotic resistance. Our study shows that a rapid test to distinguish between these two sources of illness is possible and could improve clinical care."

Tsalik said additional studies are underway to validate this approach in additional groups of patients. The researchers are also working to adapt the technology to produce more specific information, including whether the virus causing illness is influenza or SARS-CoV-2.

Credit: 
Duke University Medical Center

Research reveals Medicaid expansion is still improving hospital finances

A new study published in Medical Care Research and Review found that the Affordable Care Act, which expanded Medicaid programs to cover people previously uninsured, provided a financial boost to hospitals.

The study conducted by faculty at the Colorado School of Public Health on the University of Colorado Anschutz Medical Campus is the first to investigate the effects of Medicaid expansion by comparing estimates using data from both the Internal Revenue Service (IRS) and the Centers for Medicare and Medicaid Services (CMS).

"The IRS and CMS data sources serve as primary resources for assessing the impact of Medicaid expansion on hospitals' financial status. The comparison of the two is timely and can inform the decisions of health practitioners, policymakers and regulators at a state and national level," said lead author Tatiane Santos, MPH, PhD, faculty at the Colorado School of Public Health and fellow at the Wharton School and Leonard Davis Institute of Health Economics at the University of Pennsylvania.

Santos adds, "This is especially relevant in the context of the recently passed American Rescue Plan Act of 2021, which provides additional incentives for the 12 states that have not yet expanded Medicaid."

The researchers examined the state-level impact of Medicaid expansion on hospital finances. Based on the IRS data, they found that uncompensated care costs in expansion states declined by 28 percent relative to 2013, the year before Medicaid expansion, when uncompensated care amounted to 9.3 percent of operating expenses. Based on the CMS data, the decline was 32 percent relative to 2013, when uncompensated care represented 5.0 percent of operating expenses.
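
Translating those relative declines into rough post-expansion levels is simple arithmetic, sketched below; the reading of the 2013 figures as baselines is ours, not the authors'.

```python
# Worked arithmetic for the quoted declines (illustrative interpretation only).
baselines = {"IRS": (9.3, 0.28), "CMS": (5.0, 0.32)}   # (2013 share of operating expenses %, relative decline)

for source, (share_2013, decline) in baselines.items():
    post = share_2013 * (1 - decline)
    print(f"{source}: {share_2013}% of operating expenses in 2013 "
          f"-> roughly {post:.1f}% after a {decline:.0%} relative decline")
```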

These results are in line with previous studies that have reported that expansion has resulted in substantial reductions in hospitals' uncompensated care costs and increases in their Medicaid shortfalls (these shortfalls are the difference between Medicaid reimbursement and what it costs providers to care for patients).

Nationally, the estimated net effect of expansion reduced not-for-profit hospital costs by two percentage points based on IRS data and 0.83 percentage points based on CMS data. Across expansion states, the estimated net effects varied widely with approximately a 10-fold difference for hospitals based on IRS data and a two-fold difference based on CMS data.

Another key finding revealed that the increase in hospitals' Medicaid shortfalls has been occurring more gradually, a result that may be partially attributable to a growing Medicaid population in expansion states.

The authors mention that while Medicaid expansion has clearly had an impact on hospitals' financial status, assessment of the actual magnitude of the effects is sensitive to the data sources used.

"Expansion effects have also varied by state, which may be an indicator of how states may potentially weather the COVID-19 pandemic financial shocks, including unemployment and increasing Medicaid enrollment. These are important findings for future consideration as Medicaid expansion continues to be a source of debate across the United States as a health policy initiative," added Santos.

The authors suggest that future studies should further explore the differences across IRS and CMS data. They suggest that as the pandemic unfolds Medicaid will be especially critical in serving the most vulnerable populations. States will need to make difficult financial decisions to protect their safety net hospitals and hospitals at highest risk of financial distress.

Credit: 
University of Colorado Anschutz Medical Campus

Researchers identify protein "signature" of severe COVID-19

BOSTON - Researchers at Massachusetts General Hospital (MGH) have identified the protein "signature" of severe COVID-19, which they describe in a new study published in Cell Reports Medicine. "We were interested in asking whether we could identify mechanisms that might be contributing to death in COVID-19," says MGH infectious disease expert Marcia Goldberg, MD, who studies interactions between microbial pathogens and their hosts, and is senior author of the study. "In other words, why do some patients die from this disease, while others--who appear to be just as ill--survive?"

In March 2020, when the first patients with symptoms of COVID-19 began arriving at MGH's emergency department (ED), Goldberg was contacted by her colleague, Michael Filbin, MD, MS, an attending physician and director of Clinical Research at MGH's ED, and lead author of the study. Filbin and Goldberg had earlier begun collaborating with MGH immunologist Nir Hacohen, PhD, to develop methods for studying human immune responses to infections, which they had applied to the condition known as bacterial sepsis. The three agreed to tackle this new problem with the goal of understanding how the human immune system responds to SARS-CoV-2, the novel pathogen that causes COVID-19.

To undertake this study, the MGH team used proteomics, which is the analysis of the entire protein composition (or proteome) of a cell, tissue or organism. In this case, proteomic analysis was used to study blood specimens taken from patients arriving at the hospital's ED with respiratory symptoms consistent with COVID-19. Collecting these specimens required a large team of collaborators from many departments, who worked overtime for five weeks to amass blood samples from 306 patients who tested positive for COVID-19, as well as from 78 patients with similar symptoms who tested negative for the coronavirus.

Next, Arnav Mehta, MD, PhD, a postdoctoral researcher at the Broad Institute of MIT and Harvard, was brought on board to oversee interpretation of the complex data produced by the proteomic analysis. Mehta also works in Hacohen's lab, and the two had long been interested in using proteomic analysis of blood as an alternative to biopsies (which are invasive and painful). "We have been asking, What can we learn about what's happening in the body just by looking at protein signatures in the blood?" says Mehta.

The study found that most patients with COVID-19 have a consistent protein signature, regardless of disease severity; as would be expected, their bodies mount an immune response by producing proteins that attack the virus. "But we also found a small subset of patients with the disease who did not demonstrate the pro-inflammatory response that is typical of other COVID-19 patients," says Filbin, yet these patients were just as likely as others to have severe disease. Filbin notes that patients in this subset tended to be older people with chronic diseases, who likely had weakened immune systems.

The next step was to compare the protein signatures of patients with severe disease (defined as those who required intubation or who died within 28 days of hospital admission) with patients with less-severe cases of COVID-19. The comparison allowed the researchers to identify more than 250 "severity associated" proteins. Importantly, notes Mehta, blood was drawn from patients three times (on enrollment, then three and seven days later). "That allowed us to look at the trajectory of the disease," says Mehta. Among other revelations, this showed that the most prevalent severity-associated protein, a pro-inflammatory protein called interleukin-6, or IL-6, rose steadily in patients who died, while it rose and then dropped in those with severe disease who survived. Early attempts by other groups to treat COVID-19 patients experiencing acute respiratory distress with drugs that block IL-6 were disappointing, though more recent studies show promise in combining these medications with the steroid dexamethasone.
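
The trajectory contrast can be pictured with a toy example: the sketch below tabulates an IL-6-like marker at enrollment, day 3 and day 7 for two hypothetical patients. The values are invented and only illustrate the rise-then-fall versus continued-rise pattern described above.

```python
# Toy illustration of the two trajectory shapes (all values invented).
days = [0, 3, 7]
trajectories = {
    "severe, survived": [40.0, 120.0, 55.0],   # rises, then falls
    "died":             [45.0, 130.0, 310.0],  # keeps rising
}

for label, values in trajectories.items():
    direction = "still rising at day 7" if values[2] > values[1] else "falling by day 7"
    print(f"{label}: {dict(zip(days, values))} -> {direction}")
```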

However, Hacohen notes that many of the other severity-associated proteins the analysis identified are likely important for understanding why only a portion of COVID-19 patients develop severe cases. Learning how the disease affects the lungs, heart and other organs is essential, he says, and proteomic analysis of the blood is a relatively easy method for getting that information. "You can ask which of the many thousands of proteins that are circulating in your blood are associated with the actual outcome," says Hacohen, "and whether there is a set of proteins that tell us something."

Goldberg believes that the proteomic signatures identified in this study will do just that. "They are highly likely to be useful in figuring out some of the underlying mechanisms that lead to severe disease and death in COVID-19," says Goldberg, noting her gratitude to the patients involved in the study. Their samples are already being used to study other aspects of COVID-19, such as identifying the qualities of antibodies that patients form against the virus.

Credit: 
Massachusetts General Hospital

Team cracks century-old mystery over the health struggles of explorer Ernest Shackleton

BOSTON - Researchers from Massachusetts General Hospital (MGH) appear to have solved the 120-year-old mystery surrounding the failing health of famed Antarctic explorer Sir Ernest Shackleton over the course of his daring expeditions to Antarctica in the early part of the twentieth century. In a paper published online in the Journal of Medical Biography, the team moved beyond past theories of congenital heart defect and scurvy advanced by physicians and historians to conclude that the British explorer suffered from beriberi, a serious and potentially life-threatening condition caused by a deficiency of the nutrient thiamine.

"Historians have traditionally looked at Shackleton's symptoms in isolation and speculated about their cause," says lead author Paul Gerard Firth, MD, head of the Division of Community and Global Health in the Department of Anesthesia, Critical Care and Pain Medicine at MGH. "We looked at other explorers on the expedition, as well as members of other early expeditions, and found that some had symptoms--such as breathlessness, neuropathy and effort intolerance--similar to Shackleton's that could be attributed to beriberi. With the benefit of what we now know about nutritional diseases, we believe that beriberi-induced cardiomyopathy--a disease of the heart muscle that makes it difficult for the heart to pump blood--is the correct diagnosis for Ernest Shackleton's deteriorating health."

The researchers learned that Edward Wilson, one of two physicians on Shackleton's first voyage to Antarctica beginning in 1901--when the explorer fell seriously ill and had to return home after voyaging closer to the South Pole than any previous human--may have suspected beriberi after consulting his medical textbooks, but didn't settle on that diagnosis at a time when so little was known about the condition. Instead, the prolonged bouts of extreme shortness of breath and physical weakness Shackleton experienced on the British "Discovery" expedition of 1901 to 1903 were ascribed by his contemporaries and subsequent historians to scurvy or underlying heart disease.

"While Wilson concluded that Shackleton's condition was the result of scurvy--a vitamin C deficiency--that appeared to us to be an incomplete explanation for his labored breathing," says Firth. "Shackleton, after all, had very slight symptoms of scurvy when his breathing difficulties began, and mild scurvy does not cause heart problems."

This careful parsing of the historical evidence led Firth and his colleagues to an alternative nutritional cause of Shackleton's health struggles. "Many of the signs and symptoms of beriberi seen in early explorers developed after three months of thiamine deficiency," explains co-author Lauren Fiechtner, MD, director of the Center for Pediatric Nutrition at MGH. "And that would be consistent with a thiamine-deficient diet they experienced during the grueling months of winter explorations. Fortunately, replacement of the missing thiamine with vitamin B1 supplements can resolve the deficiency within hours or days, although that was not known at the time."

Even severe health challenges were not enough to prevent Shackleton from setting out on a third attempt to reach the South Pole in 1914, a fateful voyage, since recounted in books and movies, in which his ship Endurance became trapped in pack ice and broke apart, with all 28 crewmen reaching safety after two years and two heroic rescue efforts engineered by Shackleton. In late 1921, the intrepid explorer embarked on his fourth expedition, but suffered a heart attack on January 5, 1922, and died on his ship at age 47.

"The exact nature of Ernest Shackleton's faltering health has puzzled historians and the public for years," says Firth, "and almost 100 years after the start of his fourth and final expedition we're satisfied that we have finally uncovered a medically and scientifically valid explanation."

Credit: 
Massachusetts General Hospital

Do people aged 105 and over live longer because they have more efficient DNA repair?

Researchers have found that people who live beyond 105 years tend to have a unique genetic background that makes their bodies more efficient at repairing DNA, according to a study published today in eLife.

This is the first time that people with 'extreme longevity' have had their genomes decoded in such detail, providing clues as to why they live so long and manage to avoid age-related diseases.

"Aging is a common risk factor for several chronic diseases and conditions," explains Paolo Garagnani, Associate Professor at the Department of Experimental, Diagnostic and Specialty Medicine, University of Bologna, Italy, and a first author of the study. "We chose to study the genetics of a group of people who lived beyond 105 years old and compare them with a group of younger adults from the same area in Italy, as people in this younger age group tend to avoid many age-related diseases and therefore represent the best example of healthy aging."

Garagnani and colleagues, in collaboration with several research groups in Italy and a research team led by Patrick Descombes at Nestlé Research in Lausanne, Switzerland, recruited 81 semi-supercentenarians (those aged 105 years or older) and supercentenarians (those aged 110 years or older) from across the Italian peninsula. They compared these with 36 healthy people matched from the same region who were an average age of 68 years old.

They took blood samples from all the participants and conducted whole-genome sequencing to look for differences in the genes between the older and younger group. They then cross-checked their new results with genetic data from another previously published study which analysed 333 Italian people aged over 100 years old and 358 people aged around 60 years old.

They identified five common genetic variants, located in a region between two genes called COA1 and STK17A, that were more frequent in the 105+/110+ age groups. When they cross-checked this against the published data, they found the same variants in the people aged over 100. Computational analyses predicted that this genetic variability likely modulates the expression of three different genes.
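
To illustrate the kind of frequency comparison involved, the sketch below runs a Fisher's exact test on a single hypothetical variant's carrier counts in the two groups; the counts are invented and do not correspond to any variant reported in the study.

```python
# Hypothetical sketch: is a variant more frequent in the 105+/110+ group than in controls?
from scipy import stats

# rows: carries the variant / does not; columns: 105+ group (n=81), younger controls (n=36)
table = [[34, 8],
         [47, 28]]

odds_ratio, p_val = stats.fisher_exact(table, alternative="greater")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_val:.3g}")
```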

The most frequently seen genetic changes were linked to increased activity of the STK17A gene in some tissues. This gene is involved in three areas important to the health of cells: coordinating the cell's response to DNA damage, encouraging damaged cells to undergo programmed cell death and managing the amount of dangerous reactive oxygen species within a cell. These are important processes involved in the initiation and growth of many diseases such as cancer.

The most frequent genetic changes are also linked to reduced activity of the COA1 gene in some tissues. This gene is known to be important for the proper crosstalk between the cell nucleus and mitochondria - the energy-production factories in our cells whose dysfunction is a key factor in aging.

Additionally, the same region of the genome is linked to an increased expression of BLVRA in some tissues - a gene that is important to the health of cells due to its role in eliminating dangerous reactive oxygen species.

"Previous studies showed that DNA repair is one of the mechanisms allowing an extended lifespan across species," says Cristina Giuliani, Senior Assistant Professor at the Laboratory of Molecular Anthropology, Department of Biological, Geological and Environmental Sciences, University of Bologna, and a senior author of the study. "We showed that this is true also within humans, and data suggest that the natural diversity in people reaching the last decades of life are, in part, linked to genetic variability that gives semi-supercentenarians the peculiar capability of efficiently managing cellular damage during their life course."

The team also measured the number of naturally occurring mutations that people in each age group had accumulated throughout their life. They found that people aged 105+ or 110+ had a much lower burden of mutations in six out of seven genes tested. These individuals appeared to avoid the age-related increase in disruptive mutations, and this may have helped protect them against diseases such as heart disease.

"This study constitutes the first whole-genome sequencing of extreme longevity at high coverage that allowed us to look at both inherited and naturally occurring genetic changes in older people," says Massimo Delledonne, Full Professor at the University of Verona and a first author of the study.

"Our results suggest that DNA repair mechanisms and a low burden of mutations in specific genes are two central mechanisms that have protected people who have reached extreme longevity from age-related diseases," concludes senior author Claudio Franceschi, Professor Emeritus of Immunology at the University of Bologna.

Credit: 
eLife

New neuroimaging technique studies brain stimulation for depression

image: Shixie Jiang, MD, a third-year psychiatry resident at the USF Health Morsani College of Medicine, was the lead author for a study using an emerging functional neuroimaging technology, called diffuse optical tomography, during repetitive transcranial magnetic stimulation (rTMS).

Image: 
USF Health/University of South Florida

TAMPA, Fla. (May 4, 2021) -- Repetitive transcranial magnetic stimulation, or rTMS, was FDA approved in 2008 as a safe and effective noninvasive treatment for severe depression resistant to antidepressant medications. A small coil positioned near the scalp generates repetitive, pulsed magnetic waves that pass through the skull and stimulate brain cells to relieve symptoms of depression. The procedure has few side effects and is typically prescribed as an alternative or supplemental therapy when multiple antidepressant medications and/or psychotherapy do not work.

Despite increased use of rTMS in psychiatry, the rates at which patients respond to therapy and experience remission of often-disabling symptoms have been modest at best.

Now, for the first time, a team of University of South Florida psychiatrists and biomedical engineers applied an emerging functional neuroimaging technology, known as diffuse optical tomography (DOT), to better understand how rTMS works so they can begin to improve the technique's effectiveness in treating depression. DOT uses near-infrared light waves and sophisticated algorithms (computer instructions) to produce three-dimensional images of soft tissue, including brain tissue.

Comparing depressed and healthy individuals, the USF researchers demonstrated that this newer optical imaging technique can safely and reliably measure changes in brain activity induced during rTMS in a targeted region of the brain implicated in mood regulation. Their findings were published April 1 in the Nature journal Scientific Reports.

"This study is a good example of how collaboration between disciplines can advance our overall understanding of how a treatment like TMS works," said study lead author Shixie Jiang, MD, a third-year psychiatry resident at the USF Health Morsani College of Medicine. "We want to use what we learned from the application of the diffuse optical tomography device to optimize TMS, so that the treatments become more personalized and lead to more remission of depression."

DOT has been used clinically for imaging epilepsy, breast cancer, and osteoarthritis and to visualize activation of cortical brain regions, but the USF team is the first to introduce the technology to psychiatry to study brain stimulation with TMS.

"Diffuse optical tomography is really the only modality that can image brain function at the same time that TMS is administered," said study principal investigator Huabei Jiang, PhD, a professor in the Department of Medical Engineering and father of Shixie Jiang. The DOT imaging system used for USF's collaborative study was custom built in his laboratory at the USF College of Engineering.

The researchers point to three main reasons why TMS likely has not lived up to its full potential in treating major depression: nonoptimized brain stimulation targeting; unclear treatment parameters (i.e., rTMS dose, magnetic pulse patterns and frequencies, and rest periods between stimulation intervals); and incomplete knowledge of how nerve cells in the brain respond physiologically to the procedure.

Portable, less expensive, and less confining than some other neuroimaging equipment like MRIs, DOT still renders relatively high-resolution, localized 3D images. More importantly, Dr. Huabei Jiang said, DOT can be used during TMS without interfering with the treatment's magnetic pulses and without compromising the images and other data generated.

DOT relies on the fact that higher levels of oxygenated blood correlate with more brain activity and increased cerebral blood flow, and lower levels indicate less activity and blood flow. Certain neuroimaging studies have also revealed that depressed people display abnormally low brain activity in the prefrontal cortex, a brain region associated with emotional responses and mood regulation.

By measuring changes in near-infrared light, DOT detects changes in brain activity and, secondarily, changes in blood volume (flow) that might be triggering activation in the prefrontal cortex. In particular, the device can monitor altered levels of oxygenated, deoxygenated, and total hemoglobin, a protein in red blood cells carrying oxygen to tissues.

The USF study analyzed data collected from 13 adults (7 depressed and 6 healthy controls) who underwent DOT imaging simultaneously with rTMS at the USF Health outpatient psychiatry clinic. Applying the standard rTMS protocol, the treatment was aimed at the brain's left dorsolateral prefrontal cortex - the region most targeted for depression.

The researchers found that the depressed patients had significantly less brain activation in response to rTMS than the healthy study participants. Furthermore, peak brain activation took longer to reach in the depressed group, compared to the healthy control group.

This delayed, less robust activation suggests that rTMS as currently administered under FDA guidelines may not be adequate for some patients with severe depression, Dr. Shixie Jiang said. The dose and timing of treatment may need to be adjusted for patients who exhibit weakened responses to brain stimulation at baseline (initial treatment), he added.

Larger clinical trials are needed to validate the USF preliminary study results, as well as to develop ideal treatment parameters and identify other dysfunctional regions in the depression-affected brain that may benefit from targeted stimulation.

"More work is needed," Dr. Shixie Jiang said, "but advances in neuroimaging with new approaches like diffuse optical tomography hold great promise for helping us improve rTMS and depression outcomes."

Credit: 
University of South Florida (USF Health)

Bringing up baby: A crocodile's changing niche

image: Two crocodiles sit atop each other

Image: 
David Clode

Giant relatives of modern crocodiles might have been kings of the waterways during the Cretaceous period, eating anything--including dinosaurs--that got a little too close to the water's edge, but even the largest of these apex predators started off small. Figuring out how these little crocs grew up in a world surrounded by giants is no small task. Now croc fossils from Texas are shedding light on how these animals changed their diets as they grew, helping them find a place of their own in environments alongside their bigger, badder relatives.

According to the study, published by Cambridge University Press, the crocodiles in question belong to the species Deltasuchus motherali and lived along the coastline of Texas 96 million years ago. Previously known from a single adult skull, this 20-foot-long crocodile left behind bite marks on turtles and, yes, dinosaurs. The new discoveries include at least 14 more individuals of Deltasuchus, ranging in size from as large as the original specimen down to a paltry (if still snappy) four feet in length.

Having so many crocs from the same fossil population is not common, and the smaller, more delicate bones of juveniles often did not survive the fossilization process.

"So many fossil groups are only known from one or a handful of specimens," said paleontologist Stephanie Drumheller, lead author of the study and a lecturer of earth and planetary sciences at the University of Tennessee, Knoxville. "It can be easy to fall into the trap of only thinking about the adults." The researchers ran into challenges piecing together this ancient ecosystem, however. Deltasuchus wasn't alone in its coastal swamps.

Living alongside Deltasuchus were other large crocs, like Terminonaris and Woodbinesuchus.

"These two large croc species were comparable in size to an adult Deltasuchus, but because they had long, narrow snouts with slender interlocking teeth, they were targeting smaller prey in the environment," said Thomas Adams, co-author of the new study and curator of paleontology and geology at the Witte Museum in San Antonio.

A smaller crocodile, Scolomastax, lived in the area as well, but its unusual jaw and chunky dentition hint that it preferred hard food and maybe even plants.

"These results confirm previous work that shows fossil crocs were much more diverse and creative when it came to coexisting in the same environments," said Chris Noto, co-author and associate professor at the University of Wisconsin-Parkside. "The very warm conditions of the Cretaceous supported a greater number of reptiles and allowed them to explore new niches not possible in the present day."

When these crocodiles died, their skeletons fell apart as they fossilized, getting jumbled together and complicating efforts to tell which bones went with which animal. To solve this puzzle, the team turned to 3D scanning technology to reconstruct the skulls. UT undergraduate student Hannah Maddox meticulously scanned each piece and stitched the scans together into 3D models of complete skulls.

"It was like solving a great puzzle," said Maddox. "Every piece brought you closer to seeing a toothy grin that hadn't been seen in millions of years."

As the models came together, a more complete picture of how Deltasuchus lived started to take shape.

The juveniles had lighter, skinnier snouts and teeth than their older relatives--faces better suited to snap up quicker, softer prey than the heavier, powerful jaws of their parents. This might have helped make sure that little Deltasuchus were not in direct competition with the similarly sized hard-prey specialists in their environments, but as they grew they had other neighbors to consider. The large-bodied, slender-snouted role was already filled by other species. So Deltasuchus shifted in another direction as it grew, bulking up and taking on the heavy jaws and sturdy teeth of an ambush predator.

"This is an amazing fossil discovery where we not only have a population of a single species, but in an ecosystem that has multiple predators coexisting by filling separate niches," said Adams.

Similar results were found in recent analyses of young tyrannosaurs, which spent their teenaged years outcompeting other medium-sized predators in their ecosystems.

Credit: 
University of Tennessee at Knoxville

Why does heart scarring cause abnormal rhythms in some people but not others?

Scientists have shed light on why some people who have a stroke do not also have abnormal heart rhythms, even though their hearts contain similar scar tissue.

Their results, published today in eLife, could help identify the best treatments for people who might be at risk of recurrent stroke, new heart disorders, or both.

Strokes are often caused by abnormal blood flow resulting from rapid, irregular beating in the upper chambers of the heart, a condition called atrial fibrillation (AFib). But some people have strokes that appear to have been caused by the heart, yet there is no evidence of AFib. In fact, around 25% of strokes fall into this group - called embolic strokes of undetermined source (ESUS).

"The absence of rhythm disorders in these people is confusing because we know that both atrial fibrillation and ESUS are associated with the build-up of a similar level of scar tissue in the heart," explains first author Savannah Bifulco, a graduate student at the Department of Bioengineering, University of Washington, Seattle, US. "We wanted to test whether there is some fundamental difference in the scar tissue between these two groups of patients that might explain why AFib patients suffer from rhythm disorders but ESUS patients do not."

The team developed 90 computer-based models using magnetic resonance imaging (MRI) scans from patients: 45 models were derived from patients who had a stroke of undetermined source and 45 from those who had AFib and had not yet received treatment. They compared the amount and location of the scar tissue in the upper-left heart chamber across all samples and then used simulations to test whether it was still possible to trigger an abnormal heart rhythm.

"Using real patient MRIs, we created computerised models of the hearts of patients who have had a stroke, but do not have AFib. We then ran those models through a battery of virtual stress tests we originally designed to help understand the effects of disease-related atrial changes in patients who did have AFib," explains co-senior author Patrick Boyle, Assistant Professor of Bioengineering, who leads the Cardiac Systems Simulation Lab at the University of Washington. "Interestingly, we found that models from ESUS and AFib patients were equally likely to be affected by this arrhythmia initiation protocol. This is surprising, because it suggests ESUS and AFib patients have the same proverbial tinderbox of fibrotic remodelling. We believe the implication is that these stroke patients are only missing the trigger to start the fibrillation process - the spark to light the fire."

Undetectable AFib is thought to be a potential cause of ESUS, and all people who have had a stroke of undetermined source are usually monitored for AFib and started on aspirin to prevent another stroke. If AFib is detected, stronger anti-clotting drugs would be recommended. As with all treatments, these drugs come with side effects and risks of their own, and it is important to know who really needs them. Yet only 30% of ESUS patients ever show evidence of AFib, making it impossible for clinicians to know which patients should be treated as high-risk for AFib and which ones are better with monitoring alone. Now, the team is moving towards using this modelling approach for stroke and arrhythmia risk stratification in potentially vulnerable groups.

"By using these tools of advanced imaging, computational power and outcomes data to create robust and validated computational models of arrhythmia, we're paving the way towards a better understanding and gaining valuable insights into the nature of each individual's disease course," says co-senior author Nazem Akoum, Director, Atrial Fibrillation Program, Division of Cardiology, University of Washington School of Medicine. "Our goal is to make computational modelling more integrated into how clinical decisions are made, placing what we see in simulations alongside many other factors like medical co-morbidites, diagnostic tests and family history. We want to help clinicians wring every last drop of information and insight from these images to help them paint the most complete picture possible for their patients."

Credit: 
eLife