
Combination of immunotherapy and VEGF inhibitor improves survival in HCC

image: IMbrave150 abstract table

Image: 
© European Society for Medical Oncology

Singapore, 23 November 2019 - Combination therapy with the PD-L1 inhibitor atezolizumab and the VEGF inhibitor bevacizumab significantly improves overall survival and progression-free survival in patients with unresectable hepatocellular carcinoma (HCC) compared to standard of care, according to results from a phase 3 study reported at the ESMO Asia 2019 Congress. (1,2)

"This is the first study in 11 years to show an improvement in survival with a new fist-line treatment option compared to sorafenib, which has been the standard of care throughout this time," said study first author Ann-Lii Cheng, Director of the National Taiwan University Cancer Center, Taipei, Taiwan. He added, "Atezolizumab plus bevacizumab has the potential to be a practice-changing treatment option in hepatocellular carcinoma."

Unresectable HCC is currently a major challenge in countries with a high prevalence of this cancer. Most patients in countries without screening programmes present with unresectable or advanced HCC because of the late appearance of symptoms, resulting in a very high mortality rate (almost 80%).

"Despite many studies over the past 11 years, we have been unable to find any better treatment option. This has been very frustrating because sorafenib has a response rate of around 10% and is associated with severe side-effects," he explained.

The phase 3 IMbrave150 study randomised patients with unresectable HCC to a combination of atezolizumab plus bevacizumab or to sorafenib. Atezolizumab reactivates the immune response to tumour cells, while bevacizumab stops tumours from growing the new blood vessels they need to obtain nutrients and oxygen and also helps upregulate host immunity against the cancer.

Results showed statistically significant and clinically meaningful improvements in the co-primary endpoints of overall survival and progression-free survival in patients treated with atezolizumab plus bevacizumab compared to those treated with sorafenib.

Commenting on the findings, Ian Chau, Consultant Medical Oncologist at the Royal Marsden Hospital, London, UK, said: "This is the first time a novel treatment has shown a survival benefit compared to the current standard of care. The results are very encouraging and there is a strong possibility this drug combination will be approved by regulatory authorities and be incorporated into international guidelines for advanced HCC."

Angela Lamarca, Consultant Medical Oncologist at the Christie NHS Foundation Trust, Manchester, UK, agreed: "I think this is a breakthrough and based on the results, the combination of atezolizumab plus bevacizumab could become the new standard of care."

Lamarca added, "The results are clinically meaningful in the setting of advanced HCC, as well as statistically significant. The delayed deterioration in quality of life is also important - patients are living longer and their quality of life is better."

She considered the study well designed, with several strengths: its large sample size (just over 500 patients), the use of co-primary endpoints of PFS and OS, assessment of response/progression by a central reviewer, and analysis based on the intention-to-treat population.

Lamarca also noted that the median follow-up of 8.6 months is relatively short, with the median OS for atezolizumab plus bevacizumab not yet reached. Chau agreed, noting that the improvement in OS is currently based on relatively immature data, with longer follow-up needed to confirm the magnitude of the OS benefit.

Looking to the future, Chau said, "The combination of atezolizumab plus bevacizumab will be very useful to patients with advanced HCC as a new systemic therapy but, with the high cost of immunotherapy and anti-angiogenic agents, it will also be important that those drugs are accessible to patients."

Study results

The phase 3 IMbrave150 study randomised 501 patients with unresectable HCC on a 2:1 basis to atezolizumab (1200 mg IV every three weeks) plus bevacizumab (15 mg/kg IV every three weeks) or sorafenib (400 mg twice daily). The patients continued with their assigned treatment until unacceptable toxicity or loss of clinical benefit as judged by study investigators.

Results showed the hazard ratio (HR) for overall survival (OS) was 0.58 (95% CI 0.42-0.79, p=0.0006) after a median follow-up of 8.6 months. The median OS had not yet been reached with atezolizumab plus bevacizumab, compared to 13.2 months for patients randomised to sorafenib. Median progression-free survival (PFS) was also significantly longer (6.8 vs 4.3 months, HR 0.59, 95% CI 0.47-0.76, p<0.0001).

Overall response rate was more than twice as high with atezolizumab plus bevacizumab as with sorafenib (27% vs 12%, p<0.0001).

Grade 3-4 adverse events occurred in 57% of patients treated with atezolizumab plus bevacizumab and 55% of those receiving sorafenib. Grade 5 adverse events occurred in 5% and 6% of patients, respectively.

Notes to Editors

Please make sure to use the official name of the meeting in your reports: ESMO Asia 2019 Congress
Official Congress Hashtag: #ESMOAsia19

References

1 Abstract LBA3 'IMbrave150: Efficacy and safety results from a ph III study evaluating atezolizumab (atezo) + bevacizumab (bev) vs sorafenib (Sor) as first treatment (tx) for patients (pts) with unresectable hepatocellular carcinoma (HCC)' will be presented by Ann-Lii Cheng during the Presidential Symposium on Saturday, 23 November 2019, 11:00-12:30 (SGT) in Hall 406. Annals of Oncology, Volume 30, 2019 Supplement 9

2 ESMO Asia Congress 2019 https://www.esmo.org/Conferences/ESMO-Asia-Congress-2019

Disclaimer

This press release contains information provided by the authors of the highlighted abstracts and reflects the content of those abstracts. It does not necessarily reflect the views or opinions of ESMO who cannot be held responsible for the accuracy of the data. Commentators quoted in the press release are required to comply with the ESMO Declaration of Interests policy and the ESMO Code of Conduct.

About the European Society for Medical Oncology (ESMO)

ESMO is the leading professional organisation for medical oncology. With more than 23,000 members representing oncology professionals from over 150 countries worldwide, ESMO is the society of reference for oncology education and information. ESMO is committed to offer the best care to people with cancer, through fostering integrated cancer care, supporting oncologists in their professional development, and advocating for sustainable cancer care worldwide. Visit http://www.esmo.org

LBA3 - IMbrave150: Efficacy and safety results from a ph III study evaluating atezolizumab (atezo) + bevacizumab (bev) vs sorafenib (Sor) as first treatment (tx) for patients (pts) with unresectable hepatocellular carcinoma (HCC)

A-L. Cheng1, S. Qin2, M. Ikeda3, P. Galle4, M. Ducreux5, A. Zhu6, T-Y. Kim7, M. Kudo8, V. Breder9, P. Merle10, A. Kaseb11, D. Li12, W. Verret13, Z. Xu14, S. Hernandez15, J. Liu16, C. Huang17, S. Mulla18, H.Y. Lim19, R. Finn20

1Department of Oncology, National Taiwan University Cancer Center and National Taiwan University Hospital, Taipei, Taiwan, 2Cancer Center, People's Liberation Army Cancer Center, Nanjing, China, 3Hepatobiliary & Pancreatic Oncology Dept., National Cancer Center Hospital East, Kashiwa, Japan, 4I. Medical Department, University Medical Center Mainz, Mainz, Germany, 5Medical Oncology, Gustave Roussy Cancer Center, Villejuif, France, 6Cancer Center, Harvard Medical School, Massachusetts General Hospital Cancer Center, Boston, MA, USA, 7Medical Oncology Center, Seoul National University College of Medicine, Seoul, Republic of Korea, 8Department of Gastroenterology and Hepatology, Kindai University Faculty of Medicine, Osaka, Japan, 9Chemotherapy Dept No17, N.N. Blokhin Russian Cancer Research Center, Moscow, Russian Federation, 10Hepatology and Gastroenterology Unit, Hopital de la Croix-Rousse, Lyon, France, 11GI Medical Oncology, The University of Texas MD Anderson Cancer Center, Houston, TX, USA, 12Department of Medical Oncology & Therapeutics Research, City of Hope Comprehensive Cancer Center and Beckman Research Institute, Duarte, CA, USA, 13Product Development - Oncology, Genentech, Inc., San Francisco, CA, USA, 14Product Development, Roche, Beijing, China, 15Product Development Oncology Department, Genentech, Inc. - Member of the Roche Group, South San Francisco, CA, USA, 16Product Development, Shanghai Roche Pharmaceuticals Ltd., Shanghai, China, 17Safety Science Oncology, Roche Product Development, Shanghai, China, 18Product Development, F. Hoffmann-La Roche, Ltd., Mississauga, ON, Canada, 19Department of Medicine, Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul, Republic of Korea, 20Jonsson Comprehensive Cancer Center, Geffen School of Medicine at UCLA, Los Angeles, CA, USA

Introduction: Ph 1b data has shown promising efficacy and safety for atezo + bev in unresectable HCC pts who have not received prior systemic therapy. Here, we report the primary analysis data from the Ph 3 IMbrave150 trial comparing atezo + bev vs sor in this pt population.

Methods: IMbrave150 enrolled systemic treatment (tx)-naïve pts with unresectable HCC. Pts were randomised 2:1 to receive either atezo 1200 mg IV q3w + bev 15 mg/kg IV q3w or sor 400 mg BID until unacceptable toxicity or loss of clinical benefit per investigator. Coprimary endpoints were OS and PFS by independent review facility (IRF)-assessed RECIST 1.1. The key secondary endpoints IRF-ORR per RECIST 1.1 and IRF-ORR per HCC mRECIST were also part of the study statistical testing hierarchy.

Results: The ITT population included 336 pts randomised to atezo + bev and 165 randomised to sor. Baseline demographics were well balanced between arms. With a median follow-up of 8.6 mo, OS HR was 0.58 (95% CI, 0.42, 0.79; P = 0.0006) and PFS HR was 0.59 (95% CI, 0.47, 0.76; P < 0.0001).

Conclusions: IMbrave150 demonstrated statistically significant and clinically meaningful improvement in both OS and PFS for atezo + bev vs sor in pts with unresectable HCC who have not received prior systemic therapy. The safety of atezo + bev is consistent with the known safety profile of each agent, and no new safety signals were identified. Atezo + bev has the potential to be a practice-changing tx in HCC.

Credit: 
European Society for Medical Oncology

Caught in afterglow: 1st detection of Inverse Compton emission from dying gamma-ray burst

image: Researchers from the Centre for Astro-Particle Physics (CAPP) within the Department of Physics at the University of Johannesburg (UJ) collaborate with the teams from the Fermi gamma-ray space telescope, the MAGIC telescopes on the Canary Islands and many others to better understand energy emissions from stars.
Prof Soebur Razzaque (left) is the Director of the centre, Dr Feraol Fana Dirirsa (right), a research fellow. Both contributed to the validation of Inverse Compton emission by the afterglow of the gamma-ray burst GRB 190114C.

Image: 
Therese van Wyk, University of Johannesburg

A dying star emits intense flashes of light called a gamma-ray burst. Most days, the Fermi gamma-ray space telescope detects these flashes. About 20 years ago, scientists predicted that a gargantuan energy level - tera-electron volts - would be detected in burst afterglow.

In January, the MAGIC telescopes on the Canary Islands observed light at this energy level for the first time. The theories predicting how such light would be produced are now validated.

Burst mechanics

When a star dies, its core collapses. While it collapses, the core shoots out hot plasma material at nearly the speed of light. Intense flashes of light called gamma-ray bursts result from these hot plasma jets.

When telescopes on satellites observe an area of the night sky, they use two ways to recognise the bursts coming from dying stars.

First, if the bursts last relatively long, from a few seconds to a few minutes, then they are called long-duration bursts. Second, such a burst starts with a 'bang' of very bright gamma-ray emission that pulses brighter and dimmer before it fades away. This is called the variable phase of its emission.

The making of afterglow

While the star's core is collapsing, the star is still rotating on its axis. At the same time, the core starts spewing the fast-moving jet of super-heated ionized matter, which radiates along the star's axis of rotation. It is this jet of ionized star matter that causes the gamma-ray burst from the dying star.

As the jet radiates away from the star, it encounters resistance, even though this is happening in outer space. Pressure builds up and slows down the jet - and then starts producing shock waves. The shock waves are just like the sonic boom from a supersonic jet plane.

The shock waves heat up electrons in the space environment around the jet. The heated electrons then start spiralling in the magnetic medium surrounding the jet. At the same time, the electrons emit light in all wavelengths of the electromagnetic spectrum.

The electrons' light is called the afterglow radiation from the gamma-ray burst. It can last from days to months and is relatively easy to observe. It is explained by the synchrotron radiation model in physics.

Scientists routinely detect afterglow radiation at radio, optical, X-ray and gamma-ray wavelengths. The Fermi Gamma-Ray Space Telescope (FGST) can even detect afterglow light at giga-electron volt energies. A giga-electron volt is 10 to the power 9 electron volts.

Elusive light

Until January 2019 something had been missing from the afterglow picture, however. It was the light energy that scientists had been expecting to see for about 20 years.

Researchers, including Prof Soebur Razzaque, predicted that afterglow from gamma-ray bursts would include far more powerful light, generated at the tera-electron volt level - ten to the power 12 electron volts, at least a thousand times more powerful than what the FGST had detected up to then.
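To put those units in perspective, the arithmetic can be checked directly (a minimal sketch in Python; the quantities are simply the unit definitions given in the passages above):

    # Energy scales from the text: Fermi detects afterglow light at
    # giga-electron volt (GeV) energies; the predicted emission is at
    # tera-electron volt (TeV) energies.
    eV = 1.0
    GeV = 1e9 * eV    # 10 to the power 9 electron volts
    TeV = 1e12 * eV   # 10 to the power 12 electron volts

    print(TeV / GeV)  # 1000.0 -- a TeV photon carries a thousand times more energy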

Prof Razzaque is the Director of the Centre for Astro-Particle Physics (CAPP) within the Department of Physics at the University of Johannesburg (UJ).

"We said that the heated, spiralling electrons around the jet should be undergoing another process. This additional process is called inverse-Compton radiation. Also, this process would generate light with an energy level of a tera-electron volts.

"But it was not possible to validate this theory, because we had not detected light at that energy level yet," says Razzaque.

"Also, if we could detect such light, we hoped for a new window to study the extreme environment that gamma-ray burst afterglow is produced in," he adds.

Caught in space and terra firma

On 14 January 2019, that window opened up. Several telescopes on board space missions observed a gamma-ray burst, which was named GRB 190114C. One of these telescopes was the Fermi Gamma-ray Space Telescope, another the Swift Space Observatory.

Within hours, scientists realised GRB 190114C was out of the ordinary. They could see extremely high-energy photons, or light particles. The established synchrotron radiation model could not readily explain these photons.

In fact, about a minute after the gamma-ray burst's light reached Earth, the MAGIC telescopes on the Canary Islands found what researchers had hoped for. The telescopes had detected radiation of 1 tera-electron volt or more, lasting as long as predicted for a dying star.

Later, the Fermi and Swift satellites also observed the burst's long-lasting afterglow radiation. For the first 10 days after the burst, many telescopes on Earth could also detect the afterglow. The radiation ranged from radio frequencies up to very high-energy gamma rays.

Extreme environment

In South Africa, Dr Feraol Fana Dirirsa started analysing gamma-ray data from the Fermi space telescope soon after the burst. He is a research fellow at the Centre for Astro-Particle Physics at UJ.

Meanwhile, Prof. Razzaque worked with several other scientists from the Fermi, Swift and MAGIC telescope teams. They investigated modelling of multi-wavelength afterglow emission from GRB 190114C.

It soon became clear that the high-energy light that MAGIC detected had validated their predictions. This light was the tera-electron volt radiation from inverse Compton emission, identified for the first time.

"We observed a huge range of frequencies in the electromagnetic radiation afterglow of GRB 190114C. It is the most extensive to date for a gamma-ray burst," says Razzaque.

"We're elated that the theories around inverse-Compton emission are now validated. However, we need more observations from bursts like this. More data will help us to better understand the extreme environment that gamma-ray burst afterglow is produced in. Meanwhile, other theoretical models about gamma rays and stars are still waiting for direct observation," he adds.

Credit: 
University of Johannesburg

Research shows old newspapers can be used to grow carbon nanotubes

image: TEM images of raw carbon soot grown on kaolin sized paper showing (a) roped single-walled carbon nanotubes (SWCNTs) helically wrapped by a SWCNT, and large SWCNTs, (b) collapsed, (c) folded, and (d) twisted nanotubes. Scale bar = 10 nm (a-c) and 50 nm (d).

Image: 
Rice University

A research collaboration between Rice University and the Energy Safety Research Institute (ESRI) at Swansea University has found that old newspapers can be used as a low-cost, eco-friendly material on which to grow single-walled carbon nanotubes on a large scale.

Carbon nanotubes are tiny molecules with incredible physical properties that can be used in a huge range of things, such as conductive films for touchscreen displays, flexible electronics, fabrics that create energy and antennas for 5G networks.

The new study, published in the MDPI journal C, details the research experiments carried out in producing carbon nanotubes, which could have the potential to solve some of the problems associated with their large-scale production, such as:

The high cost of preparing a suitable surface for chemical growth.

The difficulties in scaling up the process, as only single surface growth processes have been previously available.

The research team discovered that the large surface area of newspapers provided an unlikely but ideal way to chemically grow carbon nanotubes.

Lead researcher Bruce Brinson said: "Newspapers have the benefit of being used in a roll-to-roll process in a stacked form making it an ideal candidate as a low-cost stackable 2D surface to grow carbon nanotubes."

However, not all newspaper is equally good - only newspaper produced with sizing made from kaolin, which is china clay, resulted in carbon nanotube growth.

Co-author Varun Shenoy Gangoli said: "Many substances including talc, calcium carbonate, and titanium dioxide can be used in sizing in papers which act as a filler to help with their levels of absorption and wear. However it was our observation that kaolin sizing, and not calcium carbonate sizing, showed us how the growth catalyst, which in our case was iron, is affected by the chemical nature of the substrate."

ESRI Director Andrew Barron, also a professor at Rice University in the USA, said: "While there has been previous research showing that graphene, carbon nanotubes and carbon dots can be synthesised on a variety of materials, such as food waste, vegetation waste, animal, bird or insect waste, and chemically grown on natural materials, to date this research has been limited.

"With our new research, we have found a continuous flow system that dramatically reduces the cost of both substrate and post synthesis process which could impact on the future mass manufacture of single walled carbon nanotubes."

Credit: 
Swansea University

New study provides insight into the mechanisms of blood clots in cancer patients

(Boston)--Researchers have identified a potential new signaling pathway that may help further the understanding of blood clot formation in cancer patients and ultimately help prevent this complication from occurring.

A pulmonary embolism generally occurs when a clot from the deep veins of the extremities, also known as deep vein thrombosis (DVT), becomes dislodged and travels to the lungs. This event, combined with DVT, is the second most common cause of non-cancer-related death in patients with malignancy. Patients with cancer are at an increased risk of developing blood clots for reasons that, until now, had been unclear.

Researchers from Boston University School of Medicine (BUSM) performed a detailed analysis examining the levels of different molecules and breakdown products--known as metabolites--in the blood as well as within blood clots from experimental tumor models. They discovered increased blood levels of two molecules called kynurenine and indoxyl sulfate, both of which are metabolites of the amino acid tryptophan, a common dietary component. These high levels of kynurenine and indoxyl sulfate were also associated with increased blood clot size in an experimental model.

The study shows that by pharmacologically inhibiting the aryl hydrocarbon receptor (AHR) pathway, a known target of indoxyl sulfate and kynurenine, the researchers could reduce blood clot size, suggesting that this pathway may be a target for future drug development.

According to the researchers, this study suggests that kynurenine and indoxyl sulfate might be key culprits in generating clot formation in patients with cancer via AHR signaling. Moreover, they may provide exciting new opportunities for treating and preventing these known complications in the future. "The importance for the patient is two-fold," explained corresponding author Vipul Chitalia, MD, PhD, associate professor of medicine at BUSM. "First, these metabolites can be measured in the blood of patients with cancer and can potentially guide us in predicting risk of deep vein thrombosis. Also, the signaling pathway triggered by these metabolites can potentially be inhibited by compounds that can be developed in the future as a drug for this complication."

"In addition, dietary modifications could also be considered in such cases," adds coauthor Katya Ravid, DSc, professor of medicine and biochemistry at BUSM.

Credit: 
Boston University School of Medicine

Discovery paves the way for blocking malaria transmission in Brazil

The bacteria that form the gut microbiota influence important processes of the human body, such as digestion, nutrient absorption, and defense against pathogens. The same type of relationship is present in most animals, including the Anopheles darlingi mosquito, the main vector of malaria in Brazil.

In the case of this insect, the composition of the gut microbiota appears to determine the susceptibility to infection by Plasmodium vivax - the species responsible for 90% of malaria cases in Brazil. That is, when the mosquito bites a sick human, an interaction occurs between the parasite and the insect's intestinal bacteria, which is crucial for the disease transmission cycle to continue.

This was the conclusion of a study conducted at São Paulo State University (UNESP), which will be presented this Friday (22/11) in Lyon, France, during the FAPESP Week France symposium. According to the researchers responsible, the discovery enables strategies to be devised for blocking malaria transmission in the vector.

"We discovered that, in the gut of Anopheles, the parasitic load has an influence on the composition of the microbiota and vice-versa. After further investigating the parasite-bacteria relationship, by incorporating microbiota composition data into genetic analyses relating to mosquito immunity, we intend to carry out gene silencing studies. The aim is to develop mosquitoes that are immune to Plasmodium vivax, that is, which do not get infected and, consequently, do not transmit the parasite to humans," said Jayme Augusto de Souza-Neto, a professor at the Bioprocesses and Biotechnology Department of the Faculty of Agricultural Sciences at UNESP in Botucatu and coordinator of the project, supported by São Paulo Research Foundation - FAPESP.

The immune system is the key

This is the first study to analyze, in an integrated way, the transcriptome (the set of genes being expressed) and the gut microbiota of Anopheles darlingi infected by Plasmodium vivax.

A previous study, conducted by another group of scientists using Anopheles gambiae mosquitoes infected by protozoa of the Plasmodium falciparum species, demonstrated that the microbiota influences the parasite's development inside the mosquito. By comparing insects with and without bacteria in the gut, it was found at the time that the microbiota as a whole interferes in the development of the protozoa. It was discovered that, when there are no bacteria (when they are eliminated with antibiotics, for example), Plasmodium tends to develop more easily in the insect's body.

The work carried out at UNESP makes advances by demonstrating that not only the presence of bacteria in the gut but, above all, the composition of that microbiota appears to play a determining role in the intensity of the infection.

"In the groups of mosquitoes with low parasitic infection, we also observed a low quantity of bacteria and a high immune response. In the groups with high parasitic infection, there was a high quantity of bacteria and a low immune response," recounted Souza-Neto.

The researchers also compared the transcriptional responses (the gene expression profiles) as well as the load and composition of the gut microbiota of the mosquitoes. "There is a difference in microbiota composition between the groups of insects with high and low parasitic loads. This is probably related to the immune response, which is different in these two groups," he said.

The microbiota of the mosquitoes studied was basically composed of varied strains of two families of bacteria: Enterobacteria and Flavobacteria. "There is a dynamic. When the parasitic load increases, some specific bacteria become more abundant and others less so. They appear to act in this process in a highly coordinated way," the researcher reported.

According to Souza-Neto, as the immune response is shared between the bacteria and parasite, the defense against the parasite also reaches the bacteria and vice-versa. "From observing this bacteria-parasite interaction we perceived that, in general, the bacterial and parasitic loads follow exactly the same tendency. The explanation seems to be related to the expression profile of the genes linked to the mosquito's immune system," he said.

"The transcriptome was associated with the mosquito's complement system [proteins that form part of the immune system of invertebrates]. Previous studies have already related the response against the parasite with the complement system. Our interest lies in finding genes that, when superexpressed, make the mosquito refractory to infection by the protozoa, so that it also cannot transmit the parasite to humans," he said.

Another possible explanation would lie in the microbiota's response to the parasite. "The bacteria produce proteins, metabolites, or molecules with antiparasitic action. Reactive oxygen species, such as hydrogen peroxide, may help kill the Plasmodium. This direct action may occur independently and simultaneously to the mosquito's immune system," he said.

The discovery enables population modification strategies to be developed in the future, such as releasing transgenic mosquitoes that are immune to the malaria parasite into nature. The approach is different from population suppression, recently attempted in the fight against dengue, which involves releasing sterile males of the Aedes aegypti species.

"This strategy would be especially interesting for Brazil, where Anhopheles darlingi is the main vector of malaria, but also for other countries in South America," said Souza-Neto.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

How the brain decides to punish or not

image: Brain map of significant ALE values for social punishment. Left = left. Coordinates are presented in Talairach space.

Image: 
Oksana Zinchenko, 'Brain Responses to Social Punishment: a Meta-Analysis.' Sci Rep 9, 12800 (2019) doi:10.1038/s41598-019-49239-1 https://www.nature.com/articles/s41598-019-49239-1

Oksana Zinchenko, Research Fellow at the Institute of Cognitive Neuroscience, HSE University, has conducted a meta-analysis of 17 articles to find out which areas of the brain are involved in decisions to impose social punishment. It appears that, in both victims of violations and witnesses, punishment decisions activate the brain regions responsible for focusing attention, processing information, and responding effectively to social interaction. The findings of the study were published in Scientific Reports.

Social punishment is necessary in order to maintain order and cooperation in society. In their everyday lives, people who have committed wrongdoings may face reprimand or rejection. A decision to invoke punishment may be implemented by a person who was affected by the violation of norms ('second-party punishment'), or by a neutral person who nevertheless knows about the norm violation ('third-party punishment'). It was already known that certain brain areas activate in victims of violations as well as in witnesses in response to different forms of social punishment. However, it was not entirely clear which areas in particular were activated.

A typical game for the study of social punishment is the Ultimatum Game, in which one test subject decides how much of an amount given to him or her will be shared with another subject. The participant is free to divide it up as he or she likes, even keeping the entire amount. If the second participant finds the decision unfair, they can punish the offender (for example, by rejecting the proposed division), i.e. execute 'second-party punishment'. Alternatively, the punishment can be invoked by a third test subject, a witness of the transaction, which constitutes third-party punishment.
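For readers who want the mechanics spelled out, the sketch below encodes the two punishment roles in Python. This is a minimal illustration only: the stake, fairness threshold and fine are hypothetical values, not parameters from the studies analysed.

    # Second- vs third-party punishment in an Ultimatum-style game.
    STAKE = 10.0               # amount the proposer is given to divide
    FAIRNESS_THRESHOLD = 0.3   # offers below 30% of the stake are judged unfair

    def is_unfair(offer):
        """Judge an offer against the (hypothetical) fairness threshold."""
        return offer < STAKE * FAIRNESS_THRESHOLD

    def second_party_punish(offer):
        """The affected responder rejects an unfair division (both get nothing)."""
        return is_unfair(offer)

    def third_party_punish(offer, fine=2.0):
        """A neutral witness fines the proposer of an unfair division.
        Returns the fine imposed (0 if the offer is judged fair)."""
        return fine if is_unfair(offer) else 0.0

    offer = STAKE * 0.2                  # the proposer keeps 80%, offers 20%
    print(second_party_punish(offer))    # True: the victim rejects the division
    print(third_party_punish(offer))     # 2.0: the witness imposes a fine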

Oksana Zinchenko employed activation likelihood estimation (ALE) to analyze data on the brain activity of 383 participants of 17 studies devoted to the subject of social punishment. The participants were either playing the Ultimatum game or were engaged in other types of strategic games simulating norm-violating events that would result in a social punishment. While the participants were performing these tasks, the researchers applied functional Magnetic Resonance Imaging (fMRI) to record their brain activity.

The analysis revealed that brain areas such as the bilateral claustrum (with activation spreading to the insular cortex) and the left superior frontal and right inferior frontal gyri were always activated in social punishment tasks. These areas belong to either the salience network or the central-executive network of the brain. These neural systems are responsible for focusing attention, detecting errors, and processing contextual information - all essential components of punishment decision-making. The right inferior frontal gyrus is regarded as a key region in the brain's 'emotional empathy network', required for adequate responses to various social interactions. As for the left superior frontal gyrus, its main function is believed to be storing information in working memory during decision-making.

However, the meta-analysis revealed no concordant activation in other brain regions, including those corresponding to the mentalizing network, which operates differently with respect to second-party and third-party punishments. This network is responsible for evaluating a wrongdoer's intentions, and some of its regions may be triggered differently depending on the type of punishment under consideration.

The researchers have yet to perform a more in-depth analysis of the differences in the brain's responses to various types of social punishment. Meanwhile, we can better understand what mechanisms underlie social control and people's ability to cooperate by studying the similarities in information processing related to social punishment.

Credit: 
National Research University Higher School of Economics

Predicting vulnerability to Alzheimer's disease and delirium

Boston, Mass. - Marked by acute temporary confusion, disorientation and/or agitation, postoperative delirium is the most common post-surgical complication in older adults, striking as many as half of adults older than 65 who undergo high-risk procedures such as cardiac surgery or hip replacements. Postoperative delirium is also tightly linked to Alzheimer's disease (AD). Although each can occur independently, Alzheimer's is a leading risk factor for delirium, and an episode of delirium puts patients at increased risk for cognitive decline and Alzheimer's disease. However, the physiological mechanisms linking delirium and Alzheimer's disease remain largely unknown.

In a paper published today in Alzheimer's & Dementia: The Journal of the Alzheimer's Association, researchers at Beth Israel Deaconess Medical Center (BIDMC) shed new light on a genetic risk factor for Alzheimer's disease that may indirectly influence patients' risk of postoperative delirium. In a study of older adults without dementia undergoing major non-cardiac surgery, the researchers observed that patients carrying a specific variant of a gene appeared to be much more vulnerable to delirium under certain conditions than people without this genetic variant. The team's findings could open the door to future interventions to prevent or mitigate postoperative delirium in at-risk patients.

"Our findings confirmed our hypothesis that patients' risk of postoperative delirium differs by genetic predisposition," said Sarinnapha M. Vasunilashorn, PhD, an Assistant Professor of Medicine in the Division of General Medicine at BIDMC. "We observed a strong and significant association between high postoperative inflammation and delirium incidence, duration and severity among patients carrying a variant of the gene considered to be risky, while the association was weaker and non-significant among non-carriers."

Vasunilashorn and colleagues focused on a gene called APOE (short for apolipoprotein E). The risky version of the gene - notated as APOE ε4 - is the strongest known genetic risk factor for late-onset Alzheimer's disease and a widely studied genetic risk marker for delirium. While recent studies have shown no direct relationship between APOE ε4 and delirium, Vasunilashorn's team hypothesized that the gene variant might indirectly influence risk of delirium by modifying the body's response to inflammation - part of the immune system's natural defense system - indicated by the presence of an inflammatory marker in the blood called CRP (C-reactive protein).

Using data from the Successful Aging after Elective Surgery (SAGES) study, an ongoing prospective cohort study investigating risk factors and long-term outcomes of delirium, the scientists looked at the incidence, severity and duration of delirium in 560 patients 70 years or older who underwent major non-cardiac surgeries under general or spinal anesthesia. Patients were monitored for delirium, assessed by daily cognitive assessments of patients' attention, memory and orientation throughout their hospital stay.

Analyzing data from patients' blood (drawn before surgery, immediately after surgery, two days after and one month after) revealed that, among carriers of the APOE ε4 gene variant, patients with high levels of inflammation had an increased risk of postoperative delirium. However, among non-carriers of the APOE ε4 gene variant, the scientists found no such association.

"Our findings suggest that APOE ?4 may be an indicator of brain vulnerability," said Vasunilashorn, who also holds appointments at Harvard Medical School, and the Harvard T.H. Chan School of Public Health. "This work may inform the targeting of future interventions, such as anti-inflammatory treatments, for prevention of postoperative delirium and its associated adverse long-term cognitive outcomes in patients with this genetic susceptibility."

Credit: 
Beth Israel Deaconess Medical Center

Bacteria-infecting viruses bind mucosal surface and protect from disease

image: Helium ion microscope image of bacterial growth.

Image: 
Gabriel Almeida/University of Jyväskylä

Mucosal surfaces protect organisms from external stressors and disease. Bacteriophages, viruses that infect bacteria, have been shown to preferentially bind to mucosal surfaces, which has been suggested to provide an extra level of immunity against bacterial infections. Researchers at the University of Jyväskylä, Finland, tested this idea using fish, phages (viruses) and a fish-infecting bacterium. Phages were confirmed to bind to the mucosal surface, staying there for days and giving protection from subsequent bacterial infection. The research was published in mBio in November 2019.

The mucosal surfaces are important for the protection of tissues and homeostasis, but are often targeted by disease-causing bacteria. Phages have been suggested to specifically bind to host mucosal surfaces and prevent colonization by pathogenic bacteria. In this symbiotic model, phage populations are enriched in the mucus, a substrate in which encounters with their bacterial hosts are more probable, while the animal benefits from protection against invading bacteria.

Researchers at the University of Jyväskylä tested this idea using rainbow trout, phages (viruses) and a fish-infecting bacterium (Flavobacterium columnare). Phages were found to bind to the fish mucosa and remain there for several days. Phages bound in mucus also protected the fish from disease, even though the pathogenic bacteria had a strong chemotaxis towards mucus and exposure to mucosal molecules made them more virulent.

However, the mucosal environment made the bacteria more susceptible to phage infection, revealing a new aspect of the tripartite interactions between mucosal surfaces, bacteria and phages.

In conclusion, the mucosal environment influences both bacteria and phages. These interactions are important for understanding disease ecology and have significant implications for preventive phage therapy approaches.

Credit: 
University of Jyväskylä - Jyväskylän yliopisto

NASA examines tropical storm Fung-Wong's rainfall

image: The GPM core satellite passed over Tropical Storm Fung-Wong on Nov. 22 at 3:08 a.m. EST (0808 UTC). Heaviest rainfall (pink) was north of the center falling at a rate of 1.6 inches (40 mm) per hour. Another area far north of the center showed rainfall occurring at a rate of 1 inch (25 mm) per hour (red). Light rain (blue) was found throughout the rest of the storm.

Image: 
NASA/JAXA/NRL

NASA analyzed Tropical Storm Fung-Wong's rainfall and found two small areas of moderate to heavy rainfall, even though the storm was being battered by strong wind shear.

NASA has the unique capability of peering under the clouds in storms and measuring the rate at which rain is falling. The Global Precipitation Measurement mission or GPM core satellite passed over Fung-Wong from its orbit in space and measured rainfall rates throughout the storm on Nov. 22 at 3:08 a.m. EST (0808 UTC).

Heaviest rainfall was being pushed north of the center where it was falling at a rate of 1.6 inches (40 mm) per hour. Another area far north of the center showed heavy rainfall occurring at a rate of 1 inch (25 mm) per hour. Light rain was found throughout the rest of the storm.

In general, wind shear is a measure of how the speed and direction of winds change with altitude. Tropical cyclones are like rotating cylinders of winds. Each level needs to be stacked vertically on top of the other in order for the storm to maintain strength or intensify. Wind shear occurs when winds at different levels of the atmosphere push against the rotating cylinder of winds, weakening the rotation by pushing it apart at different levels. Winds from the south were pushing against the storm and displacing the heaviest rainfall north of the center.

Seven hours later, by 10 a.m. EST, the Joint Typhoon Warning Center noted that Fung-Wong had become devoid of the heavy rainfall that GPM found earlier. That is an indication that the storm is continuing to weaken under the wind shear.

On Nov. 22 at 10 a.m. EST (1500 UTC), despite the wind shear, Tropical Storm Fung-Wong was holding onto tropical storm status with maximum sustained winds near 35 knots (40 mph/65 kph). Fung-Wong was located near latitude 24.8 degrees north and longitude 125.3 degrees east about 169 miles southwest of Kadena Air Base, Okinawa Island, Japan.

Fung-Wong is moving north-northeast and is expected to dissipate within 24 hours.

Typhoons and hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

Credit: 
NASA/Goddard Space Flight Center

In a first for cell biology, scientists observe ribosome assembly in real time

LA JOLLA, CA - A team of scientists from Scripps Research and Stanford University has recorded in real time a key step in the assembly of ribosomes--the complex and evolutionarily ancient "molecular machines" that make proteins in cells and are essential for all life forms.

The achievement, reported in Cell, reveals in unprecedented detail how strands of ribonucleic acid (RNA), cellular molecules that are inherently sticky and prone to misfold, are "chaperoned" by ribosomal proteins into folding properly and forming one of the main components of ribosomes.

The findings overturn the longstanding belief that ribosomes are assembled in a tightly controlled, step-wise process.

"In contrast to what had been the dominant theory in the field, we revealed a far more chaotic process," says James R. Williamson, PhD, a professor in the Department of Integrative Structural & Computational Biology at Scripps Research. "It's not a sleek Detroit assembly line--it's more like a trading pit on Wall Street."

For the study, Williamson's lab collaborated with the lab of Joseph Puglisi, PhD, a professor at Stanford University. Although the work is a significant feat of basic cell biology, it should enable important advances in medicine. For example, some current antibiotics work by inhibiting bacterial ribosomes; the new research opens up the possibility of designing future antibiotics that target bacterial ribosomes with greater specificity--and thus, fewer side effects.

More generally, the research offers biologists a powerful new approach to the study of RNA molecules, hundreds of thousands of which are active at any given time in a typical cell.

"This shows that we now can examine in detail how RNAs fold while they are being synthesized and proteins are assembling on them," says first author Olivier Duss, PhD, a postdoctoral research fellow in the Department of Integrative Structural & Computational Biology at Scripps Research. "This has been a very difficult thing to study in biology because it involves several distinct biological processes that are dependent on each other and have to be detected simultaneously."

The team used an advanced imaging technology called "zero-mode waveguide single-molecule fluorescence microscopy," which they have adapted in recent years for real-time tracking of RNAs and proteins. Ribosomes are made of both RNA and proteins, reflecting a molecular partnership that is widely believed to go back nearly to the dawn of life on Earth.

In a proof-of-principle study published last year, the researchers used their approach to record an early, brief and relatively well-studied stage of ribosome assembly from the bacterium E. coli. This involved the transcription, or copying out from its corresponding gene, of a ribosomal RNA, and initial interactions of this RNA strand with a ribosomal protein.

In the new study, the team extended this approach by tracking not only the transcription of a ribosomal RNA but also its real-time folding. The work provided a detailed look at a complex, and until-now mysterious, part of E. coli ribosome assembly--the formation of an entire major component, or domain, of the E. coli ribosome, with assistance from eight protein partners that end up incorporated into the structure.

A key finding was that the ribosomal protein partners guide the folding of the RNA strand through multiple temporary interactions with the strand, well before they nestle into their final places in the folded RNA-protein molecule. The findings, according to the researchers, also hint at the existence of unknown RNA assembly factors, most likely proteins, that were not present in their lab-dish-type imaging experiments but are present in cells and boost the efficiency of RNA folding.

"Our study indicates that in ribosomal RNA-folding, and perhaps more generally in RNA-folding in cells, many proteins help fold RNA though weak, transient and semi-specific interactions with it," Duss says.

The team will now be able to extend this research further to study not only the rest of ribosome assembly, which involves multiple RNA strands and dozens of proteins, but also the many other types of RNA-folding and RNA-protein interaction in cells.

In principle, this research will yield insights into how RNAs misfold and how such events could be corrected. Scientists believe that many diseases involve or potentially involve the improper folding and related processing of RNAs in cells.

Treatments that already target ribosomes might also be improved. Some current antibiotics, including a class known as aminoglycosides, work by binding to sites on bacterial ribosomes that are not present on human ribosomes. These drugs can have side effects because they also impair the ribosomes of good bacteria, for example in the gut.

"When we understand more fully how bacterial ribosomes assemble and function, we could potentially target them in ways that affect a narrower group of harmful bacterial species and spare the good ones, reducing side effects for patients," Duss says.

Because ribosomes function as protein makers, they are also crucial to the survival of fast-growing tumor cells. Several classes of cancer drug already work by slowing ribosome formation in one way or another. A better understanding of the human ribosome would, in principle, enable its assembly to be targeted more precisely and potently to block cancer growth, Duss notes.

Credit: 
Scripps Research Institute

Estimating how self-reported hearing trouble varied among older adults

What The Study Did: Researchers used nationally representative survey data from adults 60 or older to estimate how self-reported hearing trouble varied across sociodemographic characteristics and by actual hearing loss.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

Authors: Adele M. Goman, Ph.D., of the Johns Hopkins Bloomberg School of Public Health in Baltimore, is the corresponding author.

(doi:10.1001/jamaoto.2019.3584)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Big plans to save the planet depend on nanoscopic materials improving energy storage

image: Nanomaterials will be key components for enabling wearable technology, according to an international team of researchers whose comprehensive report on the future of the field was published in Science this week.

Image: 
Drexel University

The challenge of building an energy future that preserves and improves the planet is a massive undertaking. But it all hinges on the charged particles moving through invisibly small materials.

Scientists and politicians have recognized the need for an urgent and substantial shift in the world's mechanisms of energy production and consumption in order to arrest its momentum toward environmental cataclysm. A course correction of this magnitude is certainly daunting, but a new report in the journal Science suggests that the technological path to achieving sustainability has already been paved, it's just a matter of choosing to follow it.

The report, authored by an international team of researchers, lays out how research in the field of nanomaterials for energy storage over the last two decades has enabled the big step that will be necessary to make use of sustainable energy sources.

"Most of the biggest problems facing the push for sustainability can all be tied back to the need for better energy storage," said Yury Gogotsi, PhD, Distinguished University and Bach professor at Drexel University's College of Engineering and lead author of the paper. "Whether it's a wider use of renewable energy sources, stabilizing the electric grid, managing the energy demands of our ubiquitous smart and connected technology or transitioning our transportation toward electricity - the question we face is how to improve the technology of storing and disbursing energy. After decades of research and development, the answer to that question may be offered by nanomaterials."

The authors present a comprehensive analysis on the state of energy storage research involving nanomaterials and suggest the direction that research and development must take for the technology to achieve mainstream viability.

The Jam

Almost all plans for energy sustainability - from the Green New Deal to the Paris Agreement to the various regional carbon emissions policies - assert the need to rein in energy consumption while also tapping into new renewable sources, like solar and wind power. The bottleneck for both of these efforts is the need for better energy storage technology.

The problem with integrating renewable resources into our energy grid is that it's difficult to manage energy supply and demand given the unpredictable nature of...nature. So, massive energy storage devices are necessary to accommodate all the energy that is generated when the sun is shining and the wind is blowing and then be able to disburse it quickly during high energy-use periods.

"The better we become at harvesting and storing energy, the more we'll be able to use renewable energy sources that are intermittent in nature," Gogotsi said. "Batteries are like the farmer's silo - if it's not large enough and constructed in a way that will preserve the crops, then it might be difficult to get through a long winter. In the energy industry right now, you might say we're still trying to build the right silo for our harvest - and that's where nanomaterials can help."

The Fix

Unstopping the energy-storage logjam has been a concerted goal for scientists who apply engineering principles to creating and manipulating materials at the atomic level. Their efforts in the last decade alone, which were highlighted in the report, have already improved the batteries that power smartphones, laptops and electric cars.

"Many of our greatest achievements in energy storage in recent years are thanks to the integration of nanomaterials," Gogotsi said. "Lithium-ion batteries already use carbon nanotubes as conductive additives in battery electrodes to make them charge faster and last longer. And an increasing number of batteries use nano-silicon particles in their anodes for increasing the amount of energy stored.

Introduction of nanomaterials is a gradual process and we will see more and more nanoscale materials inside the batteries in the future."

Battery design, for a long time, has been based primarily on finding progressively better energy materials and combining them to store more electrons. But, more recently, technological developments have allowed scientists to design the materials of energy storage devices to better serve these transmission and storage functions.

This process, called nanostructuring, introduces particles, tubes, flakes and stacks of nanoscale materials as the new components of batteries, capacitors and supercapacitors. Their shape and atomic structure can speed the flow of electrons - the heartbeat of electrical energy. And their ample surface area provides more resting places for the charged particles.

The effectiveness of nanomaterials has even allowed scientists to rethink the basic design of batteries themselves. With metallically conducting nanostructured materials ensuring that electrons can freely flow during charge and discharge, batteries can lose a good bit of weight and size by eliminating metal foil current collectors that are necessary in conventional batteries. As a result, their form is no longer a limiting factor for the devices they're powering.

Batteries are getting smaller, charging faster, lasting longer and wearing out slowly - but they can also be massive, charge progressively, store huge amounts of energy for long periods of time and distribute it on-demand.

"It is a very exciting time to work in the area of nanoscale energy storage materials," said Ekaterina Pomerantseva, PhD, an associate professor in the College of Engineering and coauthor of the paper. "We now have more nanoparticles available than ever - and with different compositions, shapes and well-known properties. These nanoparticles are just like Lego blocks, and they need to be put together in a smart way to produce an innovative structure with performance superior of any current energy storage device. What makes this task even more captivating is the fact that unlike Legos, it is not always clear how different nanoparticles can be combined to create stable architectures. And as these desired nanoscale architectures become more and more advanced, this task becomes more and more challenging, triggering the critical thinking and creativity of scientists."

The Future

Gogotsi and his coauthors suggest that capitalizing on the promise of nanomaterials will require some manufacturing processes to be updated and continued research on how to ensure the materials' stability as their size is scaled up.

"The cost of nanomaterials compared to conventional materials is a major obstacle, and low-cost and large-scale manufacturing techniques are needed," Gogotsi said. "But this has already been accomplished for carbon nanotubes with hundreds of tons manufacturing for needs of battery industry in China. Preprocessing the nanomaterials in this way would allow the use of current battery manufacturing equipment."

They also note that the use of nanomaterials would eliminate the need for certain toxic materials that have been key components in batteries. But they also suggest establishing environmental standards for future development of nanomaterials.

"Whenever scientists consider new materials for energy storage, they should always take into account toxicity to humans and environment, also in case of accidental fire, incineration or dumping into waste," Gogotsi said.

What this all means, according to the authors, is that nanotechnology is making energy storage versatile enough to evolve with the shift in energy sourcing that forward-looking policies are calling for.

Credit: 
Drexel University

New machine learning algorithms offer safety and fairness guarantees

image: Philip Thomas at UMass Amherst, with colleagues there and at Stanford, says they hope machine learning researchers will go on to develop new and more sophisticated algorithms using a new framework they developed, based on reasoning probabilistically about safety, which can be used responsibly for applications where machine learning was previously considered too risky. They also call on others to conduct research in this space.

Image: 
UMass Amherst

AMHERST, Mass. - Seventy years ago, science fiction writer Isaac Asimov imagined a world where robots would serve humans in countless ways, and he equipped them with built-in safeguards - now known as Asimov's Three Laws of Robotics - to prevent them, among other goals, from ever harming a person.

Guaranteeing safe and fair machine behavior is still an issue today, says machine learning researcher and lead author Philip Thomas at the University of Massachusetts Amherst. "When someone applies a machine learning algorithm, it's hard to control its behavior," he points out. This risks undesirable outcomes from algorithms that direct everything from self-driving vehicles to insulin pumps to criminal sentencing, say he and co-authors.

Writing in Science, Thomas and his colleagues Yuriy Brun, Andrew Barto and graduate student Stephen Giguere at UMass Amherst, Bruno Castro da Silva at the Federal University of Rio Grande do Sul, Brazil, and Emma Brunskill at Stanford University this week introduce a new framework for designing machine learning algorithms that makes it easier for users of the algorithm to specify safety and fairness constraints.

"We call algorithms created with our new framework 'Seldonian' after Asimov's character Hari Seldon," Thomas explains. "If I use a Seldonian algorithm for diabetes treatment, I can specify that undesirable behavior means dangerously low blood sugar, or hypoglycemia. I can say to the machine, 'while you're trying to improve the controller in the insulin pump, don't make changes that would increase the frequency of hypoglycemia.' Most algorithms don't give you a way to put this type of constraint on behavior; it wasn't included in early designs."

"But making it easier to ensure fairness and avoid harm is becoming increasingly important as machine learning algorithms impact our lives more and more," he says.

However, "a recent paper listed 21 different definitions of fairness in machine learning. It's important that we allow the user to select the definition that is appropriate for their intended application," he adds. "The interface that comes with a Seldonian algorithm allows the user to do just this: to define what 'undesirable behavior' means for their application."

Asimov's Foundation series, in which Seldon appears, is set in the same universe as his Robot series. Thomas explains, "Everything has fallen apart, the galactic empire is collapsing, partly because the Three Laws of Robotics require certainty. With that level of safety required, robots are paralyzed with indecision because they cannot act with certainty and guarantee that no human will be harmed by their actions."

Seldon proposes fixing this by turning to reasoning probabilistically about safety. "That's a good fit for what we're doing," Thomas says. The new approach he and his colleagues provide allows for probabilistic constraints and requires the algorithm to specify ways the user can tell it what to constrain. He says, "The framework is a tool for the machine learning researcher. It guides them toward creating algorithms that are easier for users to apply responsibly to real-world problems."
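In broad strokes, a Seldonian algorithm splits its data, proposes a candidate solution on one part, and returns it only if a high-confidence safety test passes on the other part. The sketch below is a minimal Python illustration of that recipe, not the authors' released code: the names (seldonian_train, g_hat, ttest_upper_bound), the 50/50 split and the Student's-t bound are illustrative assumptions.

    import numpy as np
    from scipy import stats

    def ttest_upper_bound(samples, delta):
        # (1 - delta) high-confidence upper bound on the mean of `samples`,
        # via a one-sided Student's t interval.
        n = len(samples)
        return samples.mean() + samples.std(ddof=1) / np.sqrt(n) * stats.t.ppf(1.0 - delta, n - 1)

    def seldonian_train(train_fn, g_hat, data, delta=0.05):
        # Split the data: one half picks a candidate solution, the other safety-tests it.
        idx = np.random.default_rng(0).permutation(len(data))
        candidate, safety = data[idx[: len(data) // 2]], data[idx[len(data) // 2:]]

        theta = train_fn(candidate)       # candidate selection: optimize the primary objective
        g_samples = g_hat(theta, safety)  # per-point estimates of the constraint g(theta)

        # Safety test: return theta only if we are (1 - delta)-confident that
        # g(theta) <= 0 -- e.g. "hypoglycemia does not become more frequent"
        # in the insulin-pump example.
        if ttest_upper_bound(g_samples, delta) <= 0.0:
            return theta
        return None  # "No Solution Found": refuse rather than risk unsafe behavior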

To test the new framework, they applied it to predicting grade point averages for a data set of 43,000 students in Brazil, creating a Seldonian algorithm with constraints that successfully avoided several types of undesirable gender bias. In another test, they showed how an algorithm could improve the controller in an insulin pump while guaranteeing that it would not increase the frequency of hypoglycemia.
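For the grade-point-average experiment, the user-supplied pieces might look like the following. This is a hypothetical illustration of one possible constraint, not the study's code: the field names (x, gpa, is_female), the least-squares trainer and the tolerance EPSILON are all assumptions, and rows is assumed to be a NumPy structured array compatible with seldonian_train above.

    EPSILON = 0.05  # tolerated gap in mean prediction error (hypothetical value)

    def train_fn(rows):
        # Candidate selection: ordinary least squares on the candidate split.
        theta, *_ = np.linalg.lstsq(rows["x"], rows["gpa"], rcond=None)
        return theta

    def g_hat(theta, rows):
        # rows: structured array with fields "x" (feature vector), "gpa", "is_female".
        err = rows["x"] @ theta - rows["gpa"]
        female = rows["is_female"].astype(bool)
        p = female.mean()
        # Reweight points so the sample mean estimates
        # E[error | female] - E[error | male]; g(theta) <= 0 then holds exactly
        # when that gap is at most EPSILON. (Bounding the gap in both directions
        # would add a second, sign-flipped constraint.)
        signed = np.where(female, err / p, -err / (1.0 - p))
        return signed - EPSILON

    theta = seldonian_train(train_fn, g_hat, rows, delta=0.05)  # rows loaded elsewhere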

Thomas says, "We believe there's massive room for improvement in this area. Even with our algorithms made of simple components, we obtained impressive results. We hope that machine learning researchers will go on to develop new and more sophisticated algorithms using our framework, which can be used responsibly for applications where machine learning used to be considered too risky. It's a call to other researchers to conduct research in this space."

Credit: 
University of Massachusetts Amherst

Fish in California estuaries are evolving as climate change alters their habitat

image: The threespine stickleback is found throughout the coastal areas of the Northern Hemisphere.

Image: 
Courtesy of E. Palkovacs

The threespine stickleback, a small fish found throughout the coastal areas of the Northern Hemisphere, is famously variable in appearance from one location to another, making it an ideal subject for studying how species adapt to different environments. A new study shows that stickleback populations in estuaries along the coast of California have evolved over the past 40 years as climate change has altered their coastal habitats.

The study, published November 21 in Global Change Biology, looked at variation in the armoring that protects the stickleback from predators, specifically the number of bony plates along their sides (called lateral plates). Previous research showed that populations in northern California have a more complete set of this armoring than populations in southern California, corresponding to differences in their habitats.

"There's a gradient from drier systems in the south, where the estuaries are more pond-like, with more vegetation, to increasingly more open, river-like systems as you go north," explained coauthor Eric Palkovacs, associate professor of ecology and evolutionary biology at UC Santa Cruz.

The new study found that threespine stickleback in some California estuaries are evolving to have fewer lateral plates as their habitats become more pond-like due to a warmer, drier climate. Stickleback populations at some central California sites are now looking more like the low-plated populations typical of southern California.

"The lateral plates provide armoring, but the downside is it costs energy to build them and they can limit maneuverability," Palkovacs said. The vegetation and other structural features found in slow-moving water are thought to increase the need for maneuverability and decrease the need for armoring because the stickleback have more places to escape or hide from predators.

Palkovacs and first author Simone Des Roches, who led the study as a postdoctoral researcher in his lab at UC Santa Cruz, took advantage of an opportunity to study historical specimens of threespine stickleback from surveys conducted along the California coast by coauthor Michael Bell in the 1970s. Bell, who has spent most of his career studying sticklebacks, recently retired from Stony Brook University in New York and is now affiliated with the Museum of Paleontology at UC Berkeley.

"It's a real treat for me toward the end of my career to have collaborators who are able to get added value out of these collections," Bell said.

Des Roches and Palkovacs resurveyed the same sites where Bell had collected stickleback specimens in the 1970s and early 80s. They found that the frequency of low-plated stickleback has generally increased over time. Furthermore, the southern limit of the distribution of completely-plated stickleback has moved northward.

Previous research had already established the genetic basis of the differences seen in the lateral plates of threespine stickleback, and genetic testing of the fish Des Roches collected confirmed this.

"We know the genetics underpinning this trait and how it correlates with the environment, so we have these well established relationships of the genes to the morphological traits and the traits to the environment. That allows us to infer that what we're seeing is evolutionary change," said Des Roches, who is now at the University of Washington.

She added that, although this study focused on the lateral plates, other traits are probably also evolving in response to climate-driven changes in the habitat. For example, the fish Des Roches collected in the Big Sur River in 2017 not only have fewer lateral plates than the ones Bell collected in 1974, but their overall morphology is also strikingly different.

"Their head shape and mouth shape are completely different, which is consistent with eating different prey. We didn't look at those traits in this study, but I have an idea of what you might find if you did," Des Roches said.

Bell said it's not surprising that stickleback populations in California are evolving in response to climate change, because his previous research had shown how rapidly this species can evolve. He has done extensive research in Alaska looking at what happens when sea-run stickleback colonize freshwater lakes.

"In Alaska, stickleback go from the morphology of sea-run fish to that of freshwater fish in less than ten generations. The difference is that these populations in California are staying in the same place and the environment is changing around them, causing them to evolve," Bell said.

The evidence that climate change is altering the genetic composition of natural populations is significant, he said. Other studies have found morphological changes in plants or animals associated with climate change, but very few have shown changes in traits for which the underlying genetic basis is well established.

Another important aspect of this study is that the changes in stickleback are occurring in response to a climate-driven transformation of their habitat. "We tend to focus on the direct effects on organisms of increased temperatures or changes in precipitation, but we also have to think about how climate change will impact the characteristics of the habitats those organisms rely on," Palkovacs said.

As more stickleback populations become dominated by fish with few lateral plates, the gene variant (or allele) for the completely-plated morphology could disappear from those populations, Des Roches said. This loss of genetic diversity could limit the adaptability of those populations in the future if, for example, a new predator were introduced into their environment.

"They might lack the genetic variation required to adapt, making them more vulnerable to extinction," Des Roches said.

Credit: 
University of California - Santa Cruz

New research finds signal of decreased early post-transplant survival in new heart transplant system

MINNEAPOLIS, MN - November 20, 2019 - In an analysis of the new heart organ allocation system for transplant patients in the U.S., researchers have identified a signal of a decrease in heart transplant survival rates. The study, "An Early Investigation of Outcomes with the 2018 Donor Heart Allocation System in the United States," is published as a rapid communication in the Journal of Heart and Lung Transplantation.

For the first time in over a decade, modifications were made to the U.S. donor heart allocation system in October of 2018, aimed at better distinguishing the most medically urgent heart transplant candidates. The old system, in place since 2005, led to overcrowding of the list, prolonged waiting times and consequent inequity in allocation across geographic regions. The new system was envisioned to allow more equitable organ allocation while providing an overall benefit to patients awaiting heart transplantation.

"This is an early trend, however, it is concerning," said lead author Rebecca Cogswell, MD, who is an assistant professor at the University of Minnesota Medical School's Department of Medicine in the Division of Cardiology and medical director of mechanical circulatory support with M Health Fairview. Cogswell and colleagues at the U of M and several institutions across the U.S., including Brigham and Women's Hospital and Harvard Medical School, undertook an early look at outcomes as a result of the new allocation system.

"This early look is similar to the kind of surveillance that occurs in large clinical trials to ensure safety," Cogswell explained.

The authors found that, as intended, the new allocation system has resulted in sicker patients being transplanted with greater frequency; however, unintended consequences are emerging. Organs are being retrieved from longer distances, and fewer patients supported on durable left ventricular assist devices are receiving heart transplants in the U.S.

"The increase in mortality appears to be driven by the fact that patients who are receiving hearts are sicker than in the previous system," Cogswell reported.

The researchers found that waitlist mortality has decreased under the new system. Cogswell explained, "As waitlist mortality in the previous system was relatively low, the absolute impact of this reduction in waitlist mortality is small compared to the increase in death after transplantation that we are observing in this early examination of the new system."
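The arithmetic behind that comparison is worth making explicit. The snippet below uses purely hypothetical rates - they are not figures from the study - to show how a small absolute improvement in waitlist mortality can be outweighed by a larger absolute rise in early post-transplant mortality.

    # Hypothetical rates, for the arithmetic only -- not results from the study.
    waitlist_old, waitlist_new = 0.08, 0.06  # chance of dying while waiting
    post_tx_old, post_tx_new = 0.07, 0.12    # chance of early death after transplant

    # Overall chance of death = die waiting + (survive waiting) * die after transplant.
    overall_old = waitlist_old + (1 - waitlist_old) * post_tx_old  # ~= 0.144
    overall_new = waitlist_new + (1 - waitlist_new) * post_tx_new  # ~= 0.173

    # A 2-point waitlist gain is more than offset by a 5-point post-transplant
    # loss, so overall mortality rises despite the waitlist improvement.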

"If these early observations of a substantial decline in heart transplant survival persist, and we certainly hope that they do not, several programs will be under stress for their very survival," said Mandeep R. Mehra, MD, senior author of this study, who is executive director of the Center for Advanced Heart Disease at Brigham and Women's Hospital and a professor of Medicine at Harvard Medical School.

Cogswell stated that more data will be needed to confirm these trends and to inform policy changes.

"As a community, we have a responsibility to look at this data at regular intervals to determine if we need to implement changes sooner rather than later," Cogswell emphasized.

Credit: 
University of Minnesota Medical School