Culture

Most diets lead to weight loss and lower blood pressure, but effects largely disappear after a year

Reasonably good evidence suggests that most diets result in similar modest weight loss and improvements in cardiovascular risk factors over a period of six months, compared with a usual diet, finds a study published by The BMJ today.

By the 12 month follow-up, weight reduction had diminished and improvements in cardiovascular risk factors had largely disappeared, except in association with the Mediterranean diet, which saw a small but important reduction in 'bad' LDL cholesterol.

As such, at least for short-term benefits, the researchers suggest that people should choose the diet they prefer without concern about the size of benefits.

Obesity has nearly tripled worldwide since 1975, prompting a plethora of dietary recommendations for weight management and cardiovascular risk reduction.

But so far, there has been no comprehensive analysis comparing the relative impact of different diets for weight loss and improving cardiovascular risk factors, such as blood pressure and cholesterol levels.

To address this, a team of international researchers set out to determine the relative effectiveness of dietary patterns and popular named diets among overweight or obese adults.

Their findings are based on the results of 121 randomised trials with 21,942 patients (average age 49) who followed a popular named diet or an alternative control diet and reported weight loss and changes in cardiovascular risk factors.

The studies were designed differently, and were of varying quality, but the researchers were able to allow for that in their analysis.

They grouped diets by macronutrient patterns (low carbohydrate, low fat, and moderate macronutrient - similar to low fat, but slightly more fat and slightly less carbohydrate) and according to 14 popular named dietary programmes (Atkins, DASH, Mediterranean, etc).

Compared with a usual diet, low carbohydrate and low fat diets resulted in a similar modest reduction in weight (between 4 and 5 kg) and reductions in blood pressure at six months. Moderate macronutrient diets resulted in slightly less weight loss and blood pressure reductions.

Among popular named diets, Atkins, DASH, and Zone had the largest effect on weight loss (between 3.5 and 5.5 kg) and blood pressure compared with a usual diet at six months. No diets significantly improved levels of 'good' HDL cholesterol or C reactive protein (a chemical associated with inflammation) at six months.

Overall, weight loss diminished at 12 months among all dietary patterns and popular named diets, while the benefits for cardiovascular risk factors of all diets, except the Mediterranean diet, essentially disappeared.

The researchers point to some study limitations that could have affected the accuracy of their estimates, but say their comprehensive search and thorough analyses support the robustness of the results.

As such, they say moderate certainty evidence shows that most macronutrient diets result in modest weight loss and substantial improvements in cardiovascular risk factors, particularly blood pressure, at six but not 12 months.

Differences between diets are, however, generally trivial to small, implying that for short-term cardiovascular benefit people can choose the diet they prefer from among many of the available diets without concern about the magnitude of benefits, they conclude.

The extensive range of popular diets analysed "provides a plethora of choice but no clear winner," say researchers at Monash University, Australia, in a linked editorial.

As such, they suggest conversations should shift away from specific choice of diet, and focus instead on how best to maintain any weight loss achieved.

As national dietary guidelines fail to resonate with the public, taking a food-based approach with individuals and encouraging them to eat more vegetables, legumes, and whole grains and less sugar, salt and alcohol is sound advice, they add.

"If we are to change the weight trajectory of whole populations, we may learn more from understanding how commercial diet companies engage and retain their customers, and translate that knowledge into more effective health promotion campaigns," they conclude.

Credit: 
BMJ Group

Majority of patients responded in CAR T-cell trial for mantle cell lymphoma

image: This is Michael Wang, M.D.

Image: 
The University of Texas MD Anderson Cancer Center

HOUSTON -- A one-year follow-up study led by The University of Texas MD Anderson Cancer Center revealed a majority of patients with mantle cell lymphoma resistant to prior therapies may benefit from treatment with CD19-targeting chimeric antigen receptor (CAR) T-cell therapy. Findings were published in the April 1 online issue of the New England Journal of Medicine.

The multi-center, 20-site, Phase II ZUMA-2 study reported that 93% of patients responded to the CAR T-cell therapy KTE-X19, with 67% achieving a complete response. At a median one-year follow-up, 57% of patients were in complete remission, and the estimated progression-free survival and overall survival were 61% and 83%, respectively. At the time of this analysis, 76% of all treated patients in the study were alive.

In CAR T-cell therapy, patients' T cells are extracted through a process called leukapheresis and genetically reengineered with CAR molecules that help T cells attack cancer cells. The reengineered T cells are infused back into the patient. In this study, a type of CAR T-cell therapy known as KTE-X19 was manufactured and administered to patients.

"ZUMA-2 is the first multi-center, Phase II study of CAR T-cell therapy for relapsed/refractory mantle cell lymphoma, and these efficacy and safety results are encouraging," said Michael Wang, M.D.. professor of Lymphoma & Myeloma. "Although this study continues, our reported results, including a manageable safety profile, point to this therapy as an effective and viable option for patients with relapsed or refractory mantle cell lymphoma."

All patients had relapsed or refractory disease after receiving up to five therapies, and all patients had received Bruton's tyrosine kinase (BTK) inhibitor therapy. BTK inhibitor therapy has greatly improved outcomes in patients with relapsed or refractory mantle cell lymphoma, yet patients who have disease progression after receiving the treatment are likely to have poor outcomes, with median overall survival of just six to 10 months. Few patients in this category qualify to proceed to an allogeneic stem cell transplant.

In this study, the patients' median age was 65 years, and 84% were male. More than 80% of the patients had stage IV disease, and more than half were diagnosed as intermediate to high-risk in a mantle cell lymphoma prognostic index.

The study reported grade three or greater side effects, with the most common being neutropenia and thrombocytopenia. The majority of patients experienced cytokine release syndrome, a common side effect of CAR T-cell therapy, but the syndrome was effectively managed in all patients.

Credit: 
University of Texas M. D. Anderson Cancer Center

Study: Therapy by phone is effective for depression in people with Parkinson's

MINNEAPOLIS - Depression is common in people with Parkinson's disease and contributes to faster physical and mental decline, but it is often overlooked and undertreated. Cognitive-behavioral therapy has shown promising results for treating depression in people with Parkinson's, yet many people don't have access to therapists who understand Parkinson's and can provide this evidence-based depression treatment.

The good news is that participating in cognitive-behavioral therapy by telephone may be effective in reducing depression symptoms for people with Parkinson's, according to a study published in the April 1, 2020, online issue of Neurology®, the medical journal of the American Academy of Neurology.

"These results are exciting because they show that specialized therapy significantly improves depression, anxiety and quality of life in people with Parkinson's disease and also that these results last for at least six months," said study author Roseanne D. Dobkin, Ph.D., of Rutgers-Robert Wood Johnson Medical School in Piscataway, N.J., and a member of the American Academy of Neurology. "While these findings need to be replicated, they also support the promise of telemedicine to expand the reach of specialized treatment to people who live far from services or have difficulty traveling to appointments for other reasons."

The study involved 72 people with an average age of 65 who had Parkinson's disease for an average of six years and depression for nearly three years. The majority were taking antidepressants, and many were already receiving other kinds of talk therapy.

For three months, half of the people took part in weekly, one-hour sessions of cognitive-behavioral therapy by telephone, while also continuing their usual medical and mental health care. The cognitive-behavioral sessions focused on teaching new coping skills and thinking strategies individually tailored to each participant's experience with Parkinson's disease. Their care partners, such as a spouse, another family member, or a close friend, were trained to help their partner use these new skills in between sessions. After the three months were up, participants could choose to continue the sessions up to once a month for six months.

The other half of the people received their usual care, which, for many, included taking antidepressants and/or receiving other forms of talk therapy in their community.

At the beginning of the study, the participants had an average score of 21 on a measure of depression symptoms where scores of 17 to 23 indicate moderate depression. After three months of cognitive-behavioral therapy, scores for that group fell to an average of 14, which indicates mild depression. The people receiving their usual care had no change in their scores.

Six months after finishing the weekly cognitive-behavioral sessions, those participants had maintained their improvements in mood.

A total of 40% of those who engaged in cognitive-behavioral therapy met the criteria for being "much improved" in their depression symptoms, while none of the people who simply continued their usual care did.

"Depression affects up to 50% of people with Parkinson's disease and may occur intermittently throughout the course of illness. Additionally, in many instances, depression is a more significant predictor of quality life than motor disability. So easily accessible and effective depression treatments have the potential to greatly improve people's lives," Dobkin said.

A limitation of the study was that it did not include people with very advanced Parkinson's disease or those who also had dementia, so the results may not apply to them. Also, while insurance coverage for telemedicine is growing, it is not yet available in all cases or all states.

Credit: 
American Academy of Neurology

Managing negative thoughts helps combat depression in Parkinson's patients

People with Parkinson's disease who engage in cognitive behavioral therapy -- a form of psychotherapy that increases awareness of negative thinking and teaches coping skills -- are more likely to overcome depression and anxiety, according to a Rutgers study.

The study was published in the journal Neurology.

About 50 percent of people diagnosed with Parkinson's disease will experience depression, and up to 40 percent have an anxiety disorder.

"The psychological complications of Parkinson's disease have a greater impact on the quality of life and overall functioning than the motor symptoms of the disease," said lead author Roseanne Dobkin, a professor of psychiatry at Rutgers Robert Wood Johnson Medical School. "Untreated, depression can accelerate physical and cognitive decline, compromise independence and make it more difficult for individuals to proactively manage their health, like taking medication, exercising and visiting the physical therapist."

Depression in Parkinson's patients is underrecognized and often goes untreated. Among those who receive treatment, antidepressant medication is the most common approach, though many patients continue to struggle with depressive symptoms. The researchers investigated how adding cognitive behavioral therapy to the care individuals were already receiving would affect their depression.

Cognitive behavioral therapy sessions helped patients re-examine their usual ways of coping with the daily challenges of Parkinson's. Therapy was individually tailored, targeting negative thoughts - such as "I have no control" - and behaviors including social withdrawal or excessive worrying. Treatment also emphasized strategies for managing the disease, such as exercise, medication adherence and setting realistic daily goals.

The researchers enrolled 72 people diagnosed with both Parkinson's and depression. All participants continued their standard treatment. In addition, half the participants (37 people) also received cognitive behavioral therapy over the telephone weekly for three months, then monthly for six months. By the end of treatment, individuals receiving only standard care showed no change in their mental health status, whereas 40 percent of the patients receiving cognitive behavioral therapy showed their depression, anxiety and quality of life to be "much improved."

The convenience of phone treatment reduced barriers to care, allowing patients access to personalized, evidence-based mental health treatment, without having to leave their homes, Dobkin said.

"A notable proportion of people with Parkinson's do not receive the much needed mental health treatment to facilitate proactive coping with the daily challenges superimposed by their medical condition," she said. "This study suggests that the effects of the cognitive behavioral therapy last long beyond when the treatment stopped and can be used alongside standard neurological care to improve global Parkinson's disease outcomes."

Credit: 
Rutgers University

American robins now migrate 12 days earlier than in 1994

image: A robin wearing a GPS tracker on its back.

Image: 
Brian Weeks

Every spring, American robins migrate north from all over the U.S. and Mexico, flying up to 250 miles a day to reach their breeding grounds in Canada and Alaska. There, they spend the short summer in a mad rush to find a mate, build a nest, raise a family, and fatten up before the long haul back south.

Now climate change is making seasonal rhythms less predictable, and springtime is arriving earlier in many parts of the Arctic. Are robins changing the timing of their migration to keep pace, and if so, how do they know when to migrate? Although many animals are adjusting the timing of their migration, the factors driving these changes in migratory behavior have remained poorly understood.

A new study, published in Environmental Research Letters, concludes that robin migration is kicking off earlier by about five days each decade. The study is also the first to reveal the environmental conditions along the migration route that help the birds keep up with the changing seasons. Lead author Ruth Oliver completed the work while earning her doctorate at Columbia University's Lamont-Doherty Earth Observatory.

At Canada's Slave Lake, a pit stop for migrating birds, researchers have been recording spring migration timing for a quarter century. Their visual surveys and netting censuses revealed that robins have been migrating about five days earlier per decade since 1994.

In order to understand what factors are driving the earlier migration, Oliver and Lamont associate research professor Natalie Boelman, a coauthor on the paper, knew they needed to take a look at the flight paths of individual robins.

Their solution was to attach tiny GPS "backpacks" to the birds, after netting them at Slave Lake in mid-migration. "We made these little harnesses out of nylon string," Oliver explained. "It basically goes around their neck, down their chest and through their legs, then back around to the backpack." The unit weighs less than a nickel -- light enough for the robins to fly unhindered. The researchers expect that the thin nylon string eventually degrades, allowing the backpacks to fall off.

The researchers slipped these backpacks onto a total of 55 robins, tracking their movements for the months of April through June. With the precise location from the GPS, the team was able to link the birds' movements with weather data on air temperature, snow depth, wind speed, precipitation, and other conditions that might help or hinder migration.

The results showed that the robins start heading north earlier when winters are warm and dry, and suggest that local environmental conditions along the way help to fine-tune their flight schedules.

"The one factor that seemed the most consistent was snow conditions and when things melt. That's very new," said Oliver. "We've generally felt like birds must be responding to when food is available -- when snow melts and there are insects to get at -- but we've never had data like this before."

Boelman added that "with this sort of quantitative understanding of what matters to the birds as they are migrating, we can develop predictive models" that forecast the birds' responses as the climate continues to warm. "Because the timing of migration can indirectly influence the reproductive success of an individual, understanding controls over the timing of migratory events is important."

For now, it seems as though the environmental cues are helping the robins to keep pace with the shifting seasons. "The missing piece is, to what extent are they already pushing their behavioral flexibility, or how much more do they have to go?" said Oliver.

Because the study caught the birds in mid-migration, the tracking data doesn't reflect the birds' full migration path. To overcome this limitation, the researchers plan to analyze tissue from the robins' feathers and claws, which they collected while attaching the GPS harnesses, to estimate where each bird spent the previous winter and summer.

Over the long term, Oliver says, she hopes to use the GPS trackers to sort out other mysteries as well, such as how much of the change in migration timing is due to the behavioral responses found in the study versus natural selection to changing environments, or other factors.

"This type of work will be really cool once we can track individuals throughout the course of their life, and that's on the near-term horizon, in terms of technological capabilities," she said. "I think that will really help us unpack some of the intricacies of these questions."

Credit: 
Columbia Climate School

Global nuclear medicine community shares COVID-19 strategies and experiences

In an effort to provide safer working environments for nuclear medicine professionals and their patients, clinics across five continents have shared their approaches to containing the spread of COVID-19 in a series of editorials, published ahead of print in The Journal of Nuclear Medicine. This compilation of strategies, experiences and precautions is intended to support nuclear medicine clinics as they make decisions regarding patient care.

Clinicians from Africa, Asia, Australia, Europe and North America provided summaries of the steps their individual hospitals and clinics have taken to combat the COVID-19 pandemic. According to editorial author Ken Herrmann, MD, MBA, chair of nuclear medicine at the University of Duisburg-Essen, Essen, Germany, the most common steps taken by clinics have been to triage patients upon arrival, reduce elective nuclear medicine studies, improve hygiene practices and establish rotations of medical personnel to create back-up teams should a staff member become infected.

For patients undergoing essential nuclear medicine procedures, incidental findings can suggest signs of COVID-19, according to editorial author Domenico Albano, MD, nuclear medicine physician at the University of Brescia and Spedali Civili Brescia in Italy. Reporting on local experience in a region with high COVID-19 prevalence, Albano and colleagues found six out of 65 asymptomatic PET/CT patients and one of 12 radioiodine patients showed signs of interstitial pneumonia. Five of the seven patients were confirmed to have COVID-19; the remaining two did not receive immediate testing but underwent quarantine and careful monitoring.

"Our observations show that it is mandatory for healthcare personnel to employ hygienic measures, minimize patient contact and optimize distance, and use protective equipment for general clinical services in regions with high COVID prevalence," said Albano. "It is also important to consider potential COVID-19 related findings during reading, and to report such findings to the patient and his referring physicians immediately, for appropriate action."

Of particular interest to the nuclear medicine community is the safety of performing ventilation/perfusion studies. "Previous literature has documented a small degree of leakage of the aerosol from the closed delivery system into the room with the potential for expired air and aerosolized secretions to contaminate personnel within the imaging suite," noted Lionel S. Zuckier, MD, MBA, FRCPC, editorial author and chief of the division of nuclear medicine at Montefiore Medical Center and Albert Einstein College of Medicine in Bronx, New York. "In addition, patients frequently cough following inhalation of the radiopharmaceutical, which may further expose nuclear medicine workers to aerosolized secretions."

Given these circumstances, ventilation/perfusion studies have the potential to result in aerosolized secretions that can contribute to the spread of COVID-19. Zuckier and colleagues recommend eliminating the ventilation portion from lung perfusion/ventilation scans to reduce the risk of spreading COVID-19.

Additional steps taken by clinics to combat the spread of COVID-19 include limiting or cancelling research studies and scheduling symptomatic patients needing essential studies for the end of the day (allowing for thorough cleaning after the study). Some clinics have conducted refresher courses in infection control management and basic emergency management, while others have stressed the need for kindness and consideration in this unprecedented time.

Concerns were also voiced by nuclear medicine clinics around the world regarding potential nuclear reactor production restrictions and international travel limitations. Strategies to tackle these issues are being addressed by nuclear medicine and molecular imaging society leadership worldwide.

"These are difficult times; none of us has ever experienced anything like the current pandemic," remarked Johannes Czernin, MD, JNM editor-in-chief and chief of the Ahmanson Translation Imaging Division at the David Geffen School of Medicine at the University of California, Los Angeles in California. "We thank the hundreds of thousands of health care workers worldwide who have not hesitated for a moment to come in and do their job at times of great personal risk. These are selfless people who help and support each other; cover for each other; volunteer to step in and up as needed."

Credit: 
Society of Nuclear Medicine and Molecular Imaging

AGA issues formal recommendations for PPE during gastrointestinal procedures

Bethesda, Maryland (April 1, 2020) -- Today, the American Gastroenterological Association (AGA) published new COVID-19 recommendations in Gastroenterology, the official journal of the AGA: AGA Institute Rapid Recommendations for Gastrointestinal Procedures During the COVID-19 Pandemic.

This rapid recommendation document was commissioned and approved by the AGA Institute Clinical Guidelines Committee, AGA Institute Clinical Practice Updates Committee, and the AGA Governing Board to provide timely, methodologically rigorous guidance on a topic of high clinical importance to the AGA membership and the public.

KEY GUIDANCE FOR GASTROENTEROLOGISTS

Treat all patients like they have coronavirus: In the absence of accurate and reliable testing for COVID-19 infection, and given the prolonged asymptomatic shedding that precedes symptoms, all patients should be considered at risk.

Boost PPE for endoscopic procedures: All health care workers should use N95 (or N99 or PAPR) masks instead of surgical masks and double gloves instead of single gloves as part of personal protective equipment (PPE), regardless of the patient's COVID-19 status. There is strong evidence to support this recommendation for upper GI procedures, and lower-certainty evidence for lower GI procedures (i.e., colonoscopy). Also, negative pressure rooms, when available, may help mitigate the spread of infection. Data on extended use and re-use of masks were also reviewed, but there was insufficient evidence to comment on the safety of extended use (up to 8 hours) or re-use. Indirect laboratory testing of masks suggests loss of durability and fit of N95 masks after 5 donnings.

Details: Endoscopies can generate aerosolized viral particles that can stay viable for up to 3 hours, depending on airflow dynamics. Furthermore, viral particles can stay viable for 72 hours on plastic surfaces and can easily promote spread through direct contact. SARS-CoV-2 RNA can also be shed in stool, though the risks associated with exposure are uncertain. Therefore, individuals performing endoscopic procedures are at higher risk of developing infection.

Triage GI procedures: Triaging of GI procedures is necessary to minimize risk to health care providers and patients and to limit the spread of infection during a pandemic. We provide a framework for decision-making that is focused on the impact of the delay on patient-important outcomes. Decisions to defer procedures should be made on a case-by-case basis using telemedicine as an adjunctive tool to help triage.

Credit: 
American Gastroenterological Association

Tracking tau

In the fight against neurodegenerative diseases such as frontotemporal dementia, Alzheimer's and Chronic Traumatic Encephalopathy, the tau protein is a major culprit. Found abundantly in our brain cells, tau is normally a team player -- it maintains structure and stability within neurons, and it helps with transport of nutrients from one part of the cell to another.

All that changes when tau misfolds. It becomes sticky and insoluble, aggregating and forming neurofibrillary tangles within neurons, disrupting their function and ultimately killing them. Worse, it probably takes relatively few misfolded tau proteins from one cell to turn its neighbors into malfunctioning, dying brain cells.

"This abnormal form of tau starts to spread from cell to cell," said UC Santa Barbara neuroscientist Kenneth S. Kosik. "It's reminiscent of a serious problem that's known in biology, called prion diseases, such as mad cow disease."

Importantly, unlike true prion diseases, which are spread by contact with infected tissue or bodily fluid, prion-like diseases such as frontotemporal dementia and other tauopathies aren't contagious -- they can't be spread from person to person or by coming into contact with infected tissue. However, the replication is eerily familiar: A misfolded tau protein gets out of a cell and gets taken up by a normal neighboring cell. It then acts as a template in that cell, Kosik explained, which subsequently produces misfolded tau. Over and over again, the cells produce and secrete the toxic knockoff version of tau until whole regions of the brain are affected, which over time will rob a person of their cognitive and physical functions.

What if the spread could be contained? If caught early enough, controlling the proliferation of pathological tau could keep the neurodegenerative disease from progressing, and give the patient a shot at a normal life. But in order to do that, scientists first have to understand how the protein gets around.

In a paper published in the journal Nature, Kosik and his team have uncovered one such mechanism by which tau travels from neuron to neuron. Not only does it shed light on the extensively studied but rather poorly understood tau propagation in neurodegenerative disease, it hints at a way to control the spread of pathological tau.

"The discovery of a mechanism by which tau transits from cell to cell provides a clue that will open up a deep structural approach to the design molecules that can prevent tau spread," said Kosik, who is the Harriman Professor of Neuroscience Research in UC Santa Barbara's Department of Molecular, Cellular, and Developmental Biology.

The major player in this mechanism of uptake and spread, it turns out, is LRP1 (low-density lipoprotein receptor-related protein 1). It's located on the brain cell membrane and is involved in several biological processes, among them helping the neuron take in cholesterol, which is used as part of cellular structure.

LRP1, the researchers discovered, takes up tau in neighboring cells after it escapes from a cell into the extra-cellular space. One of several low density lipoprotein receptors, LRP1 was singled out by process of elimination: By systematically inhibiting the expression of each of the members of this family via CRISPRi technology, and exposing them to tau, the researchers determined that genetic silencing of LRP1 effectively inhibited tau uptake.

"This protein is an interesting one in its own right because it's a little bit like an extracellular trash can," Kosik said. "It doesn't just pick up tau; if there's other rubbish out there, it also picks it up."

But what about tau is LRP1 recognizing? Digging deeper, the scientists found that a stretch of the amino acid lysine on the tau protein acts as a kind of secret handshake that opens the doors to the neuron.

"So these are all clues," Kosik said.

Stopping the Spread

"Since our cellular work showed that tau can interact with the cell-surface receptor LRP1 and that this causes tau's endocytosis, our hypothesis was that if we reduce LRP1 expression in the mice we should reduce the ability for neighboring neurons to take up tau," explained the study's lead author, postdoctoral researcher Jennifer Rauch.

To back their in-vitro studies, the researchers injected tau into mice, some of which had their LRP1 genes downregulated by a LRP1 suppressor RNA. The tau proteins were bound by a small string of amino acids to a green fluorescent protein to help the scientists observe tau after it was injected.

"As soon as this construct is in a cell, the amino acid connector gets cut, and the fluorescent protein and tau separate from each other," Kosik explained. What they found was that in the animals with normal LRP1, the tau had a tendency to spread; in the LRP1-suppressed mice, the protein stayed put, greatly reducing the likelihood that it would be taken up and replicated by other, normal neurons. "This is the first time we've seen the complete obliteration of tau spread," he said.

"When we reduce LRP1 expression, we see reduced tau spread in the animals," said Rauch, who has previously worked on the role of heparan sulphate proteoglycans on tau uptake. She pointed out a recent study that included Kosik and graduate student Juliana Acost-Uribe that described a patient with a severe genetic form of early-onset Alzheimers's but was spared getting the disease due to a second mutation that appeared to prevent tau spread. The team is keen to learn how this patient's second mutation might prevent tau spread possibly by interacting with LRP1.

"Next," Rauch said, "we are focusing on trying to decipher the interface of the tau-LRP1 interaction and understand if this could be a drug-able target."

Credit: 
University of California - Santa Barbara

Study offers new insight into the impact of ancient migrations on the European landscape

image: A graphic depicting the spread of Yamnaya ancestry over a period of around 8,000 years

Image: 
Fernando Racimo

Neolithic populations have long been credited with bringing about a revolution in farming practices across Europe. However, a new study suggests it was not until the Bronze Age several millennia later that human activity led to significant changes to the continent's landscape.

Scientists from the University of Copenhagen and the University of Plymouth led research tracing how the two major human migrations recorded in Holocene Europe - the northwestward movement of Anatolian farmer populations during the Neolithic and the westward movement of Yamnaya steppe peoples during the Bronze Age - unfolded.

In particular, they analysed how these migrations were associated with changes in vegetation - changes which led to Europe's forests being replaced with the agricultural landscape still much in evidence today.

Their results, published in PNAS, show the two migrations differ markedly in both their spread and environmental implications, with the Yamnaya expansion moving quicker and resulting in greater vegetation changes than the earlier Neolithic farmer expansion.

The study - also involving the University of Gothenburg and the University of Cambridge - used techniques commonly applied in environmental science to model climate and pollution, and applied them to instead analyse human population movements in the last 10 millennia of European history.

It showed that a decline in broad-leaf forest and an increase in pasture and natural grassland vegetation was concurrent with a decline in hunter-gatherer ancestry, and may have been associated with the fast movement of steppe peoples during the Bronze Age.

It also demonstrated that natural variations in climate patterns during this period are associated with these land cover changes.

The research is the first to model the spread of ancestry in ancient genomes through time and space, and provides the first framework for comparing human migrations and land cover changes, while also accounting for changes in climate.

Dr Fernando Racimo, Assistant Professor at the University of Copenhagen and the study's lead author, said: "The movement of steppe peoples that occurred in the Bronze Age had a particularly strong impact on European vegetation. As these peoples were moving westward, we see increases in the amount of pasture lands and decreases in broad leaf forests throughout the continent. We can now also compare movements of genes to the spread of cultural packages. In the case of the Neolithic farming revolution, for example, the two track each other particularly well, in both space and time."

The research made use of land cover maps showing vegetation change over the past 11,000 years, which were produced through the University of Plymouth's Deforesting Europe project.

Scientists working on that project have previously shown more than half of Europe's forests have disappeared over the past 6,000 years due to increasing demand for agricultural land and the use of wood as a source of fuel.

Dr Jessie Woodbridge, Research Fellow at the University of Plymouth and co-author on the study, added: "European landscapes have been transformed drastically over thousands of years. Knowledge of how people interacted with their environment in the past has implications for understanding the way in which people use and impact upon the world today. Collaboration with palaeo-geneticists has allowed the migration of human populations in the past to be tracked using ancient DNA, and for the first time allowed us to assess the impact of different farming populations on land-cover change, which provides new insights into past human-environment interactions."

Credit: 
University of Plymouth

Golden age of Hollywood was not so golden for women

image: Casablanca (1942) had a male director, male producer, three male screenwriters, and seven featured male actors.

Image: 
Bill Gold

EVANSTON, Ill. -- The Golden Age of Hollywood is known for its glitz, glamour and classic movies. Northwestern University researchers have peeled back the gilded sheen to reveal an industry tarnished by severe gender inequity.

By analyzing a century of data (1910 to 2010) in the American Film Institute Archive and the Internet Movie Database (IMDb), the researchers found that female representation in the film industry hit an all-time low during the so-called Golden Age. Women's representation in the industry is still struggling to recover today.

"A lot of people view this era through rose-colored glasses because Hollywood was producing so many great movies," said Northwestern's Luís Amaral, who led the study. "They argue that types of movies being made -- such as Westerns, action and crime -- caused the decrease in female representation. But we found the decrease occurred across all genres, including musicals, comedy, fantasy and romance."

The study will be published on April 1 in the journal PLOS ONE.

Amaral is the Erastus Otis Haven Professor of Chemical and Biological Engineering in Northwestern's McCormick School of Engineering.

Consistent findings across all genres and jobs

To conduct the study, Amaral and his team analyzed 26,000 movies produced between 1910 and 2010.

The team looked across all genres -- action, adventure, biography, comedy, crime, drama, documentary, family, fantasy, film-noir, history, horror, music, musical, mystery, romance, sci-fi, sport, thriller, war, Western and short -- to measure how many women worked as actors, screenwriters, directors and producers.
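
For readers curious what such a measurement looks like in practice, the short Python sketch below computes the share of women per year and job type from a film-credits table. It is purely illustrative; the file name, column names and labels are hypothetical stand-ins, not the study's actual data or code.

# Hypothetical sketch only -- not the authors' pipeline. Assumes a credits
# table with one row per credited person per film and columns:
# year, role ("actor", "director", "producer", "screenwriter"), gender.
import pandas as pd

credits = pd.read_csv("film_credits.csv")  # hypothetical input file

# Fraction of women among credited people, per year and per job type
share = (
    credits.assign(is_female=credits["gender"].eq("female"))
           .groupby(["year", "role"])["is_female"]
           .mean()
           .unstack("role")
)

# Decade averages make the long-run trend (the "U-shape" noted below) easier to see
by_decade = share.groupby((share.index // 10) * 10).mean()
print(by_decade)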

Across all genres and all four job types, the resulting graphs form the exact same "U-shape" pattern. Roles for women increased from 1910 to 1920 and then sharply dropped. From around 1950, they increased steadily until 2010.

"In general, we found that the percentage of women compared to men in any role was consistently below 50% for all years from 1912 until now," said study coauthor Murielle Dunand, a former intern in Amaral's laboratory and current student at Massachusetts Institute of Technology.

'Men hire men'

Amaral said his findings reflect what was happening in the film industry.

Before Hollywood's Golden Age, the industry was fueled by independent filmmakers, and women's participation was steadily increasing. From 1910 to 1920, according to Amaral's data, women actors comprised roughly 40% of casts. Women wrote 20% of movies, produced 12% and directed 5%. By 1930, acting roles for women were cut in half; producing and directing roles hit close to zero.

Amaral and Dunand said the data suggest that the studio system, which emerged between 1915 and 1920, is most likely responsible for the shift. The industry condensed from a somewhat diverse collection of independent filmmakers scattered across the country to just five studios (Warner Bros., Paramount, MGM, Fox and RKO Pictures), which controlled everything.

"As the studio system falls under the control of a small group of men, women are receiving fewer and fewer jobs," Amaral said. "It looks like male producers hire male directors and male writers. This is association, not causation, but the data is very suggestive."

Women improve conditions for other women

Then, two groundbreaking lawsuits caused the studio system to break apart. First, Oscar-nominated actor Olivia de Havilland, who had an exclusive contract with Warner Bros., sued the studio in 1943 to be freed from her contract and won. In 1948, the U.S. federal government sued Paramount Pictures in an antitrust case. At the time, movie studios owned their own theaters and distributed their own movies. When Paramount lost, studios could no longer exclusively produce, distribute and exhibit their films.

"These legal changes took the power away from a handful of men and gave more people the power to start changing the industry," Amaral said. "There is a connection between increased concentration of power and decreased participation of women."

Among the insights hidden in the data, Amaral found that women producers tend to hire greater proportions of women to work in their films.

"Producers affect the gender of the director," he said. "Women with power in Hollywood are making conditions better for other women."

Credit: 
Northwestern University

Shining a spotlight on the history of gender imbalance in Hollywood

image: Historical trends of gender imbalance in the U.S. movie industry.

Image: 
Amaral et al, PLOS ONE 2020 (CC BY)

A new analysis reveals long-term trends in female representation in the U.S. movie industry, including a sharp decline associated with the "Studio System" era that dominated Hollywood from 1922 to 1950. Luís A. Nunes Amaral of Northwestern University, Illinois, and colleagues present these findings in the open-access journal PLOS ONE on April 1, 2020.

While research suggests that gender diversity has significant benefits across industries, women remain underrepresented in many fields, including the U.S. movie industry. However, the movie industry is unlikely to be affected by mechanisms -- such as lack of interest or differences in innate ability -- that are sometimes hypothesized to explain gender imbalance in other fields.

To better understand gender imbalance in Hollywood, Amaral and colleagues used data from the American Film Institute and the Internet Movie Database to analyze female representation among the teams behind more than 26,000 movies produced in the U.S. from 1911 to 2010.

The analysis showed that female representation among actors, directors, and producers declined dramatically with the birth of the Hollywood Studio System in 1922. During this period, which lasted through 1950, just a few major studios controlled all facets of the movie-making process.

Further statistical analysis uncovered an association between the lack of female producers during the Studio System era and a decline in female directors, screenwriters, and actors. After legal challenges ended the Studio System, actresses gained bargaining power, enabling some to later become producers and directors. Since then, female representation in Hollywood has slowly increased, but remains low.

While the new findings do not demonstrate causal relationships, they suggest that female producers and directors may help further the careers of other women in the industry. The findings could also shed new light on disparities in other fields, such as computer science, that experienced similar declines in female representation as the fields grew in importance.

The authors add: "Our study reveals that even in an activity -- acting -- where women have greater levels of interest than men and at least equal ability, they are still discriminated against. Our study is also consistent with the hypothesis that when an industry grows in importance and size it can experience a collapse of diversity."

Credit: 
PLOS

Physical force alone spurs gene expression, study reveals

image: Researchers, including, from left, Ning Wang, a professor of mechanical science and engineering; postdoctoral fellow Jian Sun; and doctoral student Erfan Mohagheghian discovered that mechanical forces on cells can boost gene expression in the nucleus.

Image: 
Photo by L. Brian Stauffer

CHAMPAIGN, Ill. -- Cells will ramp up gene expression in response to physical forces alone, a new study finds. Gene activation, the first step of protein production, starts less than one millisecond after a cell is stretched - hundreds of times faster than chemical signals can travel, the researchers report.

The scientists tested forces that are biologically relevant - equivalent to those exerted on human cells by breathing, exercising or vocalizing. They report their findings in the journal Science Advances.

"We found that force can activate genes without intermediates, without enzymes or signaling molecules in the cytoplasm," said University of Illinois mechanical science and engineering professor Ning Wang, who led the research. "We also discovered why some genes can be activated by force and some cannot."

Previous studies revealed that some genes are susceptible to physical manipulations of cells, but Wang and his colleagues were the first to show that stretching cells alone could influence how such genes are expressed. The team first demonstrated this phenomenon with genes they had inserted in cells. The current study finds that naturally occurring genes can also be activated by stretching.

In the new work, the researchers observed that special DNA-associated proteins called histones played a central role in whether gene expression increased in response to forces that stretched the cell. Histones regulate DNA, winding it up to package it in the nucleus of the cell.

One class of histones, known as histone H3, appears to prevent force-responsive gene expression when methylated at an amino acid known as lysine 9 (H3K9). Methylation involves adding a molecular tag known as a methyl group to a molecule.

The scientists observed that H3K9 methylation was highest at the periphery of the nucleus and largely absent from the interior, making the genes in the interior more responsive to stretching.

"The genes near the nuclear periphery cannot be activated even if you stretch them, whereas the genes that are close to the center can be activated by stretching," Wang said. "This is because the H3K9 histones at the periphery are highly methylated."

The researchers found they could suppress or boost force-responsive gene expression by increasing or decreasing H3K9 histone methylation.

The scientists also tested whether the frequency of an applied force influenced gene expression. They found that cells were most responsive to forces with frequencies up to about 10-20 hertz.

"Living cells in the human body experience forces of various frequencies (for example breathing, heartbeats, walking, running, jumping and singing), typically ranging from 0.2 hertz to hundreds of hertz," the researchers wrote. At the highest frequencies, cells became stiffer and the enzymes that guide gene transcription could not bind to the DNA, the team found.

Cells' immediate responsiveness to force makes sense from an evolutionary perspective, Wang said.

"Cells must be able to respond quickly to things in their environment so they can survive," he said.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Immunotherapy effective in metastatic prostate cancers with specific markers of immune activation

image: Sumit Subudhi, M.D., Ph.D.

Image: 
The University of Texas MD Anderson Cancer Center

HOUSTON -- Although metastatic castration-resistant prostate cancer (mCRPC) typically has limited response to immunotherapy, a subset of patients with pretreatment evidence of active T-cell responses in their tumors experienced prolonged survival following treatment with ipilimumab in a Phase II trial at The University of Texas MD Anderson Cancer Center.

The results, published today in Science Translational Medicine, suggest that certain patients with mCRPC may benefit from immune checkpoint inhibitors and provide biomarkers for identifying this subgroup.

"Our results indicate that immune checkpoint blockade can instigate T-cell responses to tumor neoantigens despite a low tumor mutational burden in prostate cancer," said lead author Sumit Subudhi, M.D., Ph.D., assistant professor of Genitourinary Medical Oncology. "We found specific markers among a subset of patients with the greatest benefit, such as T-cell density and interferon-γ signaling, that may help improve our ability to select patients for treatment with checkpoint blockade."

Cancers with the strongest responses to immune checkpoint inhibitors, such as melanoma or lung cancer, tend to have high levels of underlying gene mutations, which lead to production of mutated proteins, or neoantigens, that can be recognized as abnormal by the immune system. Prostate cancers have relatively low mutation levels and fewer neoantigens present.

However, small groups of mCRPC patients within larger Phase III trials have seen favorable outcomes with checkpoint inhibitors, explained Subudhi, which drove the researchers to ask whether effective immune responses could be stimulated by checkpoint blockade in tumors with low mutation levels.

To investigate this question, the research team launched the Phase II trial in collaboration with MD Anderson's immunotherapy platform, which is co-led by corresponding author Padmanee Sharma, M.D., Ph.D., professor of Genitourinary Medical Oncology and Immunology. The platform is part of the institution's Moon Shots Program®, a collaborative effort to accelerate the development of scientific discoveries into clinical advances that save patients' lives.

The trial enrolled 30 MD Anderson patients with mCRPC between January 2015 and May 2018. Of those, 29 received at least one dose of ipilimumab and were able to be included in the final analysis. Median follow-up after the first treatment was 45.5 months.

Across all patients, median progression-free survival (PFS) according to radiographic imaging was 3 months, and median overall survival (OS) was 24.3 months. Eight patients (28%) experienced grade 3 toxicities, the most common of which were dermatitis and diarrhea, and none experienced grade 4 or 5 toxicities.

The researchers noted a "favorable" cohort of nine patients with PFS greater than six months and OS greater than one year, and an "unfavorable" cohort of ten patients with PFS less than six months and OS less than one year. At the time of analysis, six (67%) patients from the "favorable" cohort were alive, with survival ranging between 33 and 54 months.

By comparing pretreatment samples from these two cohorts, the researchers identified markers associated with improved responses to checkpoint blockade. Those in the "favorable" cohort had a higher density of cytotoxic and memory T cells in the tumor as well as increased expression of interferon (IFN)-γ signaling.

Further, the researchers showed that T cells isolated from patients in the "favorable" cohort were capable of recognizing and responding to the neoantigens present in their tumor, whereas T cells from patients in the "unfavorable" group did not appear to have the same responses.

"We were encouraged to see that prostate cancers with a low mutational burden do in fact express neoantigens that elicit T-cell responses that lead to favorable clinical outcomes," said Sharma. "Our findings indicate that anti-CTLA-4 immune checkpoint therapy warrants additional studies in order to develop treatment strategies that may improve survival of patients with metastatic prostate cancer."

Moving forward, the authors plan to investigate this question in larger, multi-institutional studies to validate the findings of the current trial.

Credit: 
University of Texas M. D. Anderson Cancer Center

Prehistoric artifacts suggest a Neolithic era independently developed in New Guinea

New artifacts uncovered at the Waim archaeological site in the highlands of New Guinea - including a fragment of the earliest symbolic stone carving in Oceania - illustrate a shift in human behavior between 5050 and 4200 years ago in response to the widespread emergence of agriculture, ushering in a regional Neolithic Era similar to the Neolithic in Eurasia. The location and pattern of the artifacts at the site suggest a fixed domestic space and symbolic cultural practices, hinting that the region began to independently develop hallmarks of the Neolithic about 1000 years before Lapita farmers from Southeast Asia arrived in New Guinea.

While scientists have known that wetland agriculture originated in the New Guinea highlands between 8000 and 4000 years ago, there has been little evidence for corresponding social changes like those that occurred in other parts of the world. To better understand what life was like in this region as agriculture spread, Ben Shaw et al. excavated and examined a trove of artifacts from the recently identified Waim archaeological site. "What is truly exciting is that this was the first time these artifacts have been found in the ground, which has now allowed us to determine their age with radiocarbon dating," Shaw said.

The researchers analyzed a stone carving fragment depicting the brow ridge of a human or animal face, a complete stone carving of a human head with a bird perched on top (recovered by Waim residents), and two ground stone pestle fragments with traces of yam, fruit and nut starches on their surfaces. They also identified an obsidian core that provides the first evidence for long-distance, off-shore obsidian trade, as well as postholes where house posts may have once stood.

Credit: 
American Association for the Advancement of Science (AAAS)

Ancient hominins had small brains like apes, but longer childhoods like humans

image: Using precise imaging technology to scan fossil skulls, researchers found that as early as 3 million years ago, children had a long dependence on caregivers.

Image: 
Courtesy of Zeray Alemseged

Human ancestors that lived more than 3 million years ago had brains that were organized like chimpanzee brains, but had prolonged brain growth like humans, new research from the University of Chicago and other leading institutions shows.

That means these hominins -- the species Australopithecus afarensis, made famous by the Lucy and Dikika child fossils found in Ethiopia -- had a mosaic of ape and human features, a hallmark of evolution.

By using precise technology to scan eight fossil skulls from this region, the researchers also resolved a longstanding question of whether this species had prolonged childhood, a period of time unique to humans that allows us to learn and grow.

"As early as 3 million years ago, children had a long dependence on caregivers," said Zeresenay (Zeray) Alemseged, PhD, Donald N. Pritzker Professor of Organismal Biology and Anatomy and senior author of the research, published April 1 in the journal Science Advances. "That gave children more time to acquire cognitive and social skills. By understanding that childhood emerged 3.5 million years ago, we are establishing the timing for the advent of this milestone event in human evolution."

Alemseged, who discovered the Dikika child fossil in 2000 and leads the Dikika field project in Ethiopia, has studied its species for decades and helped design the new research. Widely accepted to be ancestral to all later hominins, including humans, Australopithecus afarensis lived in East Africa more than 3 million years ago and had many human-like features: They walked upright, had brains that were 20% larger than chimpanzees and may have used sharp stone tools.

But many questions about the species remain unresolved, including whether its brain was organized like humans -- which could indicate more complex behaviors, like communication -- and whether it also had protracted brain growth.

When Alemseged discovered the Dikika child, he used a CT scan to examine its skull, and by studying its teeth determined that its age at time of death was around 3 years. To understand how the child's brain was organized, however, he needed more precise imaging technology, so his team used synchrotron-computed tomography -- which uses extremely powerful X-rays to reveal detailed information about a material's structure -- to scan the child's skull and seven other skulls from the same region.

While brains do not fossilize, they do leave imprints on the inside of the skull. With the scans, the researchers could measure endocranial volume, and see the placement of the lunate sulcus -- a fissure that separates the anterior and posterior parts of the brain. This placement differs in humans and chimpanzees -- in humans, who have a large prefrontal cortex, the fissure is pushed further down in the brain. In chimpanzees, the fissure is closer to the front. The scans revealed that Australopithecus afarensis had a lunate sulcus in a similar position to the fissure found in chimpanzee brains.

"This resolves a contentious argument that has polarized paleontologists for years," Alemseged said. "We can now say the organization of the brain was more ape-like."

Did that mean that the species acted more like chimpanzees? Not necessarily. The group of researchers also used synchrotron-computed tomographic scans to count the Dikika child's dental growth lines. Similar to growth rings in trees, these growth lines can show the exact birth and death date of the child. The team's dental experts then calculated the child's age as 2.4 years.

"That allows you to ask how much of the brain was formed at that given age," Alemseged said.

When researchers compared the child's endocranial volume to that of a chimpanzee and humans, they found that brain development in Australopithecus afarensis was protracted, like in humans today. That meant the species had a long childhood, which laid the foundation for subsequent evolution of the brain and social behavior that differentiates humans today.

Alemseged's collaborators on this research were from the Max Planck Institute for Evolutionary Anthropology, Florida State University, the School for Advanced Research, the European Synchrotron Radiation Facility, Griffith University, Arizona State University, the Natural History Museum of London, and University College London.

The Dikika child, known as Selam (which means "peace" in the Ethiopian Amharic language), has been the source of several important papers on human evolution. Alemseged's graduate students are still studying the fossil, looking at the development of its face and shoulder growth. Ultimately, Alemseged hopes to compile everything that he and his lab have discovered about the fossil into one compendium.

"This fossil has played a pivotal role in allowing paleoanthropologists to ask and answer several major questions about how we became human," he said.

Credit: 
University of Chicago Medical Center