
B-cell enrichment predictive of immunotherapy response in melanoma, sarcoma and kidney cancer

HOUSTON -- The likelihood of a patient responding to immune checkpoint blockade may depend on B cells in the tumor, located within specialized immune-cell clusters known as tertiary lymphoid structures (TLS), according to researchers at The University of Texas MD Anderson Cancer Center.

Studies published today in Nature conclude that enrichment of B cells, a type of immune cell known for producing antibodies, in TLS was predictive of response to checkpoint blockade in patients with melanoma, soft-tissue sarcomas and renal cell carcinomas (RCC).

Checkpoint inhibitors offer the potential for long-term survival to patients across many cancer types, but not all patients benefit equally. Researchers have previously identified several useful biomarkers of response, which help identify patients who may or may not benefit from checkpoint blockade.

The current studies conclude that both the presence of B cells and their location within TLS, which function as lymph node-like structures within the tumor, are critical for response to checkpoint blockade, suggesting a dynamic interaction among several components of the immune system.

Mature B cells in tumors of responders suggest active role in tumor immune response

An MD Anderson-led study found that B-cell markers were the most differentially expressed genes in responders relative to non-responders, and B cells in the tumors of responders appeared to be more mature and specialized. These findings were first presented at the 2019 American Association for Cancer Research Annual Meeting.

"These findings open up a whole new area -- that B cells are actually big drivers in cancer immunotherapy, specifically checkpoint blockade," said corresponding author Jennifer Wargo, M.D., professor of Genomic Medicine and Surgical Oncology. "This could lead us to important biomarkers for therapy response as well as potentially new therapeutic options."

The team analyzed samples from patients with advanced melanoma receiving neoadjuvant, or pre-surgical, checkpoint inhibitors as part of a clinical trial sponsored by MD Anderson's Melanoma Moon Shot®, part of the institution's Moon Shots Program®, a collaborative effort to accelerate scientific discoveries into clinical advances that save patients' lives.

The researchers also studied a group of patients with metastatic RCC being treated with neoadjuvant checkpoint blockade as part of a clinical trial led by Padmanee Sharma, M.D., Ph.D., professor of Genitourinary Medical Oncology and Immunology, and Jianjun Gao, M.D., Ph.D., associate professor of Genitourinary Medical Oncology.

Tumor samples were collected from patients at baseline and during treatment through the APOLLO platform, and detailed immune profiling was completed in part by the immunotherapy platform, both part of the Moon Shots Program.

In each cohort, the expression of B cell-related genes was significantly higher in responders and was predictive of response to checkpoint blockade. These findings were further corroborated in an analysis of curated melanoma samples from The Cancer Genome Atlas, in which high expression of B-cell markers was associated with significantly improved overall survival.
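The comparison described above — testing whether expression of B cell-related genes differs between responders and non-responders — can be sketched with a standard nonparametric test. The gene name and expression values below are purely illustrative, not the study's data:

```python
# Sketch of a responder vs. non-responder expression comparison, as the
# study describes. Values are hypothetical normalized expression levels
# for a B-cell marker (e.g. MS4A1/CD20), not the published data.
from scipy.stats import mannwhitneyu

responders = [8.1, 7.9, 9.2, 8.6, 8.8, 7.5]
non_responders = [5.2, 6.1, 4.8, 5.9, 5.5, 6.3]

# One-sided Mann-Whitney U test: is expression higher in responders?
stat, p = mannwhitneyu(responders, non_responders, alternative="greater")
print(f"U = {stat}, p = {p:.4f}")
```

In practice such a test would be run per gene across thousands of genes, with multiple-testing correction applied before calling any marker "differentially expressed".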

"These data indicate the importance of cell types other than T cells, such as B cells, in the anti-tumor immune responses generated by immune checkpoint therapies," said Sharma. "There is a great need to identify biomarkers of response to therapy, and these data may allow for future studies focused on developing composite biomarkers that represent both the T- and B-cell responses."

The researchers determined that B cells were localized in the TLS, and the density of B cells and TLS in the tumor was higher in responders. Further analysis of these infiltrating B cells showed that those in responders expressed more markers of mature and differentiated B cells, such as memory B cells and plasma cells.

"Through these studies, we find that B cells are not just innocent bystanders, but are themselves contributing in a meaningful way to the anti-tumor immune response," said first author Beth Helmink, M.D., Ph.D., fellow in Surgical Oncology.

Wargo also collaborated on another study published today, led by Göran Jönsson, Ph.D., and researchers at Lund University in Sweden, which analyzed an additional group of patients with metastatic melanoma and similarly suggests an important role for B cells within these lymphoid structures.

The researchers continue work to clarify the precise role for B cells in driving responses, but suggest they may be producing tumor-specific antibodies that could be leveraged for future therapeutic approaches to enhance checkpoint blockade.

A full list of collaborating authors, research support and disclosures can be found with the paper here.

Sarcoma patients with B-cell enrichment have improved survival and response rates

In a cancer type previously thought to be refractory to immunotherapy, profiling of soft-tissue sarcomas established five distinct classes of the disease that predict survival outcomes and response to checkpoint blockade. Those with the best outcomes were marked by enrichment of B cells within TLS in the tumor, according to results published today.

The study was led by Wolf Fridman, M.D., Ph.D., and a team from the French National Institute of Health and Medical Research together with Hussein Tawbi, M.D., Ph.D., associate professor of Melanoma Medical Oncology at MD Anderson.

"These results suggest there may be new ways of predicting responses to immunotherapy by including B cells as a novel biomarker," said Tawbi. "Perhaps most exciting is this also opens up the possibility for a therapeutic targeting of B cells in ways that could identify new avenues for treating these patients."

Soft-tissue sarcoma is a rare type of cancer that develops in soft tissues of the body, such as muscles and fat. This diverse group of cancers comprises more than 50 subtypes, classified by their appearance under a microscope, which doesn't yield tremendous insight into underlying biological behavior, explained Tawbi.

Thus, the researchers sought to classify sarcomas by their immune characteristics, profiling the expression of immune-related genes in more than 600 patient samples. The resulting classification grouped sarcomas into five classes, ranging from "immune desert" tumors to "immune high" tumors.

Tumors with the highest levels of immune markers had significantly longer overall survival than "immune desert" sarcomas. The expression of B-cell markers was the factor most strongly associated with survival in these patients.

A closer look at tumor samples revealed that TLS existed almost exclusively in the "immune high" tumors, and these structures had high densities of many immune cell types, including B cells.

To investigate correlations with response to checkpoint blockade, the researchers analyzed pre-treatment samples from patients enrolled in SARC028, a multi-center trial performed through the Sarcoma Alliance for Research through Collaboration (SARC) and led by Tawbi. Patients on this trial had metastatic soft-tissue sarcomas and were treated with checkpoint blockade against PD-1.

There were no responders among those with low expression of immune markers, but half of patients in the "immune high" class saw a response to checkpoint blockade. These patients also had a significantly improved progression-free survival compared to those in the "immune desert" classification.

"All of the patients that responded to checkpoint inhibitors did truly have those immune-high signatures, especially with enriched B cells, highlighting the fact that there might be a really important role for these cells in the response to immunotherapy," said Tawbi. "Based on these results, it may now be possible for us to identify more types of sarcomas for which we can use immunotherapy effectively."

The authors are working to validate these findings in a broader cohort of patients and to identify the underlying mechanisms for B cells acting in the tumor, but they suggest these findings can be used to build a novel risk-stratification tool for better utilizing immunotherapies in patients with sarcoma.

A full list of collaborating authors, research support and disclosures can be found with the paper here.

Credit: 
University of Texas M. D. Anderson Cancer Center

Irrigation alleviates hot extremes

Large-scale irrigation is one of the land management practices with the largest effect on climate conditions - and especially hot extremes - in various regions across the globe. Yet how the climatic effects of irrigation compare to those of global warming is largely unknown.

In a new study published in Nature Communications, an international team of researchers under the lead of Sonia Seneviratne, professor for Land-Climate Dynamics at ETH Zurich, has examined the influence of irrigation more closely. They used observational data and global climate simulations to isolate the climatic effects of irrigation from the warming induced by other natural and human climatic drivers, predominantly greenhouse gas emissions.

Cooling effect on hot days

The observational and model results consistently highlight a strong irrigation-induced cooling during warm extremes in intensely irrigated regions such as Southern Europe, North Africa, South Asia, and the United States. The research team found that the effect of global warming contributing to more frequent and intense heat extremes was partly or completely offset by the cooling effect of irrigation over these regions, and South Asia in particular. Over South Asia, irrigation locally reduced the likelihood of hot extremes by a factor of two to eight, with particularly strong effects over the Indo-Gangetic Plain.

"This means that, while global warming increases the likelihood of hot extremes almost globally, in some regions, irrigation expansion cancels or even reverses this effect", explains Wim Thiery, who performed this research as a postdoc at the Institute for Atmospheric and Climate Science. He has since moved to the University of Brussels to take up an assistant professorship.

"In summary, we showed that irrigation expansion has regionally masked the historical warming of hot extremes from anthropogenic greenhouse gas emissions and all other climate drivers combined", adds Seneviratne.

Will the benefit continue?

While the irrigation-induced cooling is mostly limited to irrigation hotspots, these are often located in densely populated areas. Around one billion people currently benefit from this dampened increase in hot extremes, because irrigated area expanded massively, more than quadrupling, over the 20th century. These results therefore highlight that irrigation has substantially reduced human exposure to the warming of hot extremes. However, it is questionable whether this benefit will continue in the future.

Diminishing groundwater reserves (fossil water) and retreating glaciers, for example in the Himalaya, may decrease water supply for irrigation in the long term. "Besides a possible stagnation or even decrease in the global area being irrigated, agricultural water use may potentially become more efficient to meet sustainable development goals related to water resources availability, food security and biodiversity", Thiery points out. In that case the irrigation-induced cooling might level off, leading to accelerated warming across the irrigation hotspots. At the moment, this is still hypothetical; the researchers aim to address this question with their ongoing research.

Credit: 
ETH Zurich

Imprinted color patterns

Structural colors arise when a nanoscale pattern imprinted on a surface selectively reflects certain wavelengths of light. Chinese scientists have introduced an azopolymer that allows the imprinting of nanopatterns in a novel room-temperature lithographic process. A key aspect of the technique is the light-induced phase change of a novel azopolymer, explains the study published in the journal Angewandte Chemie. The process relies solely on light regulation and allows nanoimprinting even on flexible substrates.

Delicately structured surfaces are present in many relevant areas, including anticounterfeiting of banknotes and chip manufacturing. In the electronics industry, surface patterns, such as printed circuits, are created by photolithographic processes. In photolithography, a photoresist, a polymeric material sensitive to ultraviolet (UV) light, is irradiated through a mask. The weakened areas are washed away, and the structures are finished by etching, imprinting and other processes. Preparing the photoresist for UV irradiation conventionally requires heating and cooling steps, which change the material's behavior.

Unfortunately, materials tend to shrink upon cooling, which poses problems when nanosized patterns are desired. Therefore, Haifeng Yu and his colleagues from Peking University have developed a nanolithography process that works entirely at room temperature. Key to the method is a novel photoresist that changes its mechanical behavior solely by light irradiation. A heating step is no longer necessary. The new photoresist contains a chemical component called azobenzene, which switches from a straight "trans" into a bent "cis" form, and vice versa, when irradiated with light. This azobenzene, which is attached to the polymer backbone, causes the mechanochemical changes of the resulting azopolymer.

For pattern fabrication, the authors first liquefied the azopolymer layer coated on a flexible plastic surface by shining UV light on it. Then they pressed a transparent nanopatterned silicone sheet on the liquefied areas and irradiated the layers with visible light. This light induced hardening of the azopolymer, which adopted the template nanopattern. Then the scientists applied a photomask and irradiated the layers with UV light to re-liquefy the uncovered areas. For the final imprint, they pressed another nanopatterned sheet on the azopolymer structure and hardened the layers with visible light to obtain the finished nanopatterned coating layer. This technique is called "athermal nanoimprint lithography".

The nanopatterned surface appeared in multiple structural colors. Tiny letters or ornamental drawings changed their colors depending on the angle they were viewed at. According to the authors, the technique is not limited to structural colors. "It is adaptable to many other substrates like silicon wafers and other light-active materials," they say. The researchers envision applications in nanofabrication areas where heat-independent imprinting processes are required and phototunable materials have advantages.

Credit: 
Wiley

Nine researchers receive EMBO Installation Grants to establish independent laboratories

Heidelberg 15 January 2020 - EMBO is pleased to announce that nine life scientists have been selected to receive Installation Grants, which will support them in establishing independent laboratories in the Czech Republic, Poland, Portugal and Turkey.

"All nine researchers have demonstrated their ability to conduct outstanding scientific research, and we are delighted to welcome them to the EMBO community," says EMBO Director Maria Leptin.

The EMBO Installation Grants promote the international mobility of scientists, with applicants being required to have spent at least two consecutive years outside the country in which they are establishing their laboratory. The purpose of the grants is to encourage young group leaders to establish their laboratories in countries that find it difficult to compete scientifically with large well-funded research centres in other countries.

"It is our goal to promote excellence in life sciences across all thirty EMBC Member States and beyond," says Maria Leptin. "We look forward to supporting these talented life scientists at this important stage in their careers, and to contributing to a thriving life science community across all of Europe."

Installation Grantees receive funding of 50,000 euros per year for three to five years. They also become part of the EMBO Young Investigator Network, through which they gain access to a range of training and networking opportunities.

Funding and selection of Installation Grantees

The Installation Grants are funded primarily by the participating Member States of the European Molecular Biology Conference (EMBC), EMBO's intergovernmental funding body. The funding bodies of the member states that participated in the 2019 call are the Ministry of Education, Youth and Sports (Czech Republic), the Ministry of Science and Higher Education (Poland), Fundação para a Ciência e a Tecnologia (Portugal) and TÜBITAK (Turkey).

The EMBO Young Investigator Committee carries out the assessment and selection of the applicants and makes recommendations to the Strategic Development Installation Grant Board, which consists of representatives from the participating member states.

The 2019 Installation Grantees will establish five laboratories in Poland, two in the Czech Republic, and one each in Portugal and Turkey. Each country's funding body independently decides how many grants to fund each year.

Since the programme's inception in 2006, it has supported 112 group leaders. This year's grantees represent six nationalities. The next application deadline for EMBO Installation Grants is 15 April 2020.

More information about the programme is available online: https://www.embo.org/funding-awards/installation-grants

Installation Grantees 2019

Panagiotis Alexiou, CEITEC, Masaryk University, Brno, CZ, Deep Learning for Genomic and Transcriptomic Pattern Identification

Yusuke Azuma, Jagiellonian University, Krakow, PL, Engineered Bacterial Compartments for Protein Design and Production

Bahar Degirmenci Uzun, Bilkent University, Ankara, TR, Roles of primary cilia and the intestinal stem cell niche in obesity

Peter Draber, BIOCEV, First Faculty of Medicine, Charles University, Vestec, CZ, Search for new therapeutic targets to modulate the immune system

Rafal Mostowy, Jagiellonian University, Krakow, PL, Co-evolution between bacterial capsules and phage tails

Aleksandra Pekowska, Nencki Institute of Experimental Biology, Warsaw, PL, Evolutionary and Functional Genomics of Astrocytes

Pedro Sousa-Victor, Instituto de Medicina Molecular João Lobo Antunes, Lisbon, PT, Mechanisms of tissue ageing and repair: roadblocks to stem cell-based therapies

Grzegorz Sumara, Nencki Institute of Experimental Biology, Warsaw, PL, Signalling cascades in metabolic diseases

Piotr Szwedziak, ReMedy Unit, Warsaw University, PL, Cellular architecture of Archaea in the context of eukaryogenesis

Credit: 
EMBO

Toward safer disposal of printed circuit boards 

Printed circuit boards are vital components of modern electronics. However, once they have served their purpose, they are often burned or buried in landfills, polluting the air, soil and water. Most concerning are the brominated flame retardants added to printed circuit boards to keep them from catching fire. Now, researchers reporting in ACS Sustainable Chemistry & Engineering have developed a ball-milling method to break down these potentially harmful compounds, enabling safer disposal.

Composed of 30% metallic and 70% nonmetallic particles, printed circuit boards support and connect all of the electrical components of a device. Metallic components can be recovered from crushed circuit boards by magnetic and high-voltage electrostatic separations, leaving behind nonmetallic particles including resins, reinforcing materials, brominated flame retardants and other additives. Scientists have linked compounds in brominated flame retardants to endocrine disorders and fetal tissue damage. Therefore, Jujun Ruan and colleagues wanted to develop a method to remove the flame retardants from waste printed circuit boards so that they wouldn't contaminate the environment.

The researchers crushed printed circuit boards and removed the metallic components by magnetic and high-voltage electrostatic separations, as is typically done. Then, they put the nonmetallic particles into a ball mill - a rotating machine that uses small agate balls to grind up materials. They also added iron powder, which prior studies had shown was helpful for removing halogens, such as bromine, from organic compounds. After ball-milling, the bromine content on the surface of the particles had decreased by 50%, and phenolic resin compounds had decomposed. The researchers determined that during the ball-milling process, iron transferred electrons to flame retardant compounds, causing carbon-bromine bonds to stretch and break. 

The authors acknowledge funding from the 111 Project, the Natural Science Foundation of Guangdong Province, China and the Pearl River Star of Science and Technology.

The study is freely available as an ACS Editor's Choice article here.

For more research news, journalists and public information officers are encouraged to apply for complimentary press registration for the ACS Spring 2020 National Meeting & Exposition in Philadelphia.

The American Chemical Society (ACS) is a nonprofit organization chartered by the U.S. Congress. ACS' mission is to advance the broader chemistry enterprise and its practitioners for the benefit of Earth and its people. The Society is a global leader in providing access to chemistry-related information and research through its multiple research solutions, peer-reviewed journals, scientific conferences, eBooks and weekly news periodical Chemical & Engineering News. ACS journals are among the most cited, most trusted and most read within the scientific literature; however, ACS itself does not conduct chemical research. As a specialist in scientific information solutions (including SciFinder® and STN®), its CAS division powers global research, discovery and innovation. ACS' main offices are in Washington, D.C., and Columbus, Ohio.

To automatically receive news releases from the American Chemical Society, contact newsroom@acs.org.


Credit: 
American Chemical Society

A new 'cool' blue 

image: A new class of blue hibonite pigments has improved properties over existing colorants, such as cobalt blue.

Image: 
Adapted from ACS Omega 2019, DOI: 10.1021/acsomega.9b03255

Throughout history, people have sought vibrant blue pigments. The Egyptians and Babylonians used lapis lazuli 6,000 years ago. In 1802, a French chemist synthesized cobalt blue. More recently, in 2009 scientists discovered YInMn Blue, otherwise known as "Oregon Blue." But most of these pigments have limitations in terms of cost, stability, color or toxicity. Now, researchers in ACS Omega report a new class of 'cool' blue colorants that are inexpensive, durable and more environmentally friendly.

For the last 200 years, cobalt blue (CoAl2O4) has been a dominant commercial blue pigment because of its color intensity, ease of synthesis and versatility. However, 33% of the colorant by mass is carcinogenic Co2+, making cobalt blue relatively expensive and environmentally harmful to produce. Mas Subramanian, who discovered Oregon Blue, and colleagues at Oregon State University wanted to develop a new class of blue pigments that had enhanced color properties, reduced cost and lower cobalt content than cobalt blue.

The researchers were inspired by the crystalline structure of a light-blue mineral called hibonite. The team systematically substituted Al3+ (aluminum) ions in hibonite with Co2+, Ni2+ (nickel) or Ti4+ (titanium) ions. The resulting series of pigments showed a range of intense blue colors, some with reddish hues. The pigments were stable even when soaked in acidic or basic solutions. In contrast to cobalt blue, the new blues reflected near-infrared light, which could make them useful as 'cool pigments' in energy-saving, heat-reflecting coatings. Importantly, the Co2+ concentration in the new hibonite blues was as low as 4% by mass, making the pigments cheaper and more environmentally friendly.
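The cobalt mass fractions quoted here are easy to verify from the chemical formulas. The CoAl2O4 figure (~33%) follows directly; the hibonite-type composition CaAl11.5Co0.5O19 used below is an illustrative stoichiometry chosen to land near 4%, not necessarily the study's exact compound:

```python
# Back-of-the-envelope check of the cobalt mass fractions quoted above.
# Standard atomic masses in g/mol.
M = {"Ca": 40.078, "Al": 26.982, "Co": 58.933, "O": 15.999}

def mass_fraction(formula: dict, element: str) -> float:
    """Mass fraction of `element` in a formula given as {symbol: count}."""
    total = sum(M[el] * n for el, n in formula.items())
    return M[element] * formula.get(element, 0) / total

# Cobalt blue spinel, CoAl2O4
co_in_spinel = mass_fraction({"Co": 1, "Al": 2, "O": 4}, "Co")
# Hypothetical Co-substituted hibonite, CaAl11.5Co0.5O19
co_in_hibonite = mass_fraction({"Ca": 1, "Al": 11.5, "Co": 0.5, "O": 19}, "Co")

print(f"Co in CoAl2O4: {co_in_spinel:.1%}")          # ~33%
print(f"Co in hibonite blue: {co_in_hibonite:.1%}")  # ~4%
```

Even a half-atom-per-formula-unit substitution into the much heavier hibonite lattice dilutes cobalt nearly tenfold by mass relative to the spinel.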

Credit: 
American Chemical Society

Reinventing the computer: Brain-inspired computing for a post-Moore's Law era

image: Reinventing computing to better emulate the neural architectures in the brain is the key to solving dynamical problems. For example, with a photo of Abraham Lincoln and advancements in nonlinearity, causality and sparsity, a computer can instantly identify his face and return similar images.

Image: 
Jack D. Kendall and Suhas Kumar

WASHINGTON, D.C., January 15, 2020 -- Since the invention of the transistor in 1947, computing development has seen a consistent doubling of the number of transistors that can fit on a chip. But that trend, known as Moore's Law, may reach its limit as components of submolecular size encounter problems with thermal noise, making further scaling impossible.

In their paper published this week in Applied Physics Reviews, from AIP Publishing, authors Jack Kendall, of Rain Neuromorphics, and Suhas Kumar, of Hewlett Packard Labs, present a thorough examination of the computing landscape, focusing on the operational functions needed to advance brain-inspired neuromorphic computing. Their proposed pathway includes hybrid designs that combine digital architectures with a resurgence of analog ones, made possible by memristors, which are resistors with memory that can process information directly where it is stored.

"The future of computing will not be about cramming more components on a chip but in rethinking processor architecture from the ground up to emulate how a brain efficiently processes information," Kumar said.

"Solutions have started to emerge which replicate the natural processing system of a brain, but both the research and market spaces are wide open," Kendall added.

Computers need to be reinvented. As the authors point out, "Today's state-of-the-art computers process roughly as many instructions per second as an insect brain," and they lack the ability to effectively scale. By contrast, the human brain is about a million times larger in scale, and it can perform computations of greater complexity due to characteristics like plasticity and sparsity.

Reinventing computing to better emulate the neural architectures in the brain is the key to solving dynamical nonlinear problems, and the authors predict neuromorphic computing will be widespread as early as the middle of this decade.

The advancement of computing primitives, such as nonlinearity, causality and sparsity, in new architectures, such as deep neural networks, will bring a new wave of computing that can handle very difficult constrained optimization problems like weather forecasting and gene sequencing. The authors offer an overview of materials, devices, architectures and instrumentation that must advance in order for neuromorphic computing to mature. They issue a call to action to discover new functional materials to develop new computing devices.

Their paper is both a guidebook for newcomers to the field to determine which new directions to pursue, as well as inspiration for those looking for new solutions to the fundamental limits of aging computing paradigms.

Credit: 
American Institute of Physics

Emotions to help engage school students in learning

Psychology researchers from HSE University have tested the reliability of a student engagement scale on 537 Russian primary school students. The findings indicated that the emotional component contributes the most to school engagement. The paper has been published in the journal PLOS ONE.

Student engagement impacts children's performance and future success. It is also used as a primary predictor of educational dropout or successful school completion in Europe and North America. The concept of school engagement is broader than the concept of learning motivation: it includes the assessment of a child's general well-being at school, their interest and preparedness to participate in learning activities.

Engagement can be assessed in three components: behavioural, emotional, and cognitive. The behavioural component relates to the child's activity, participation in school events, and readiness to follow the school rules; the emotional component assesses the feeling of comfort, the sense of belonging and interest in the school; the cognitive component assesses the child's willingness to acquire knowledge and their ability for self-regulation.

No school engagement questionnaire assessing these three factors of engagement was available in Russian. That is why HSE researchers decided to adapt and trial one of the most popular international instruments, the School Engagement scale, developed in 2005.

The researchers translated the School Engagement scale from English with the assistance of developmental psychologists and made the test understandable for children aged 6 to 12. A total of 537 children in the 1st through 4th grades at different schools in Moscow took part in the assessment.

Children marked on a piece of paper how they related to different phrases. For example, the phrase 'I feel bored at school' assesses emotional involvement; 'I watch learning-related TV shows' assesses the cognitive component, and 'I'm attentive in class' - the behavioural one.

Researchers used statistical methods to confirm the validity and reliability of the survey and analyzed the data on Russian school students' engagement. According to factor analysis, the emotional component is the most important for assessing the overall engagement of primary school pupils. This is consistent with past findings: a child's interest and comfort at school is particularly important for engagement in learning activities.
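One standard reliability check for a questionnaire of this kind is Cronbach's alpha, which measures how consistently a subscale's items move together. A minimal sketch, using synthetic 5-point responses rather than the study's data:

```python
# Cronbach's alpha for one questionnaire subscale, a common reliability
# statistic for scales like the one described above. The Likert-style
# responses below are simulated, not the study's data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, n_items) matrix of item scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
# Simulate 200 respondents answering 4 correlated items on a 1-5 scale:
# each response is a shared latent trait plus item-specific noise.
latent = rng.normal(size=200)
items = np.clip(
    np.round(3 + latent[:, None] + rng.normal(scale=0.7, size=(200, 4))), 1, 5
)
alpha = cronbach_alpha(items)
print(f"alpha = {alpha:.2f}")
```

Values above roughly 0.7 are conventionally taken as acceptable internal consistency; factor analysis would then check that items load on the intended behavioural, emotional and cognitive components.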

'If we look at school as a place where we constantly "rack our brain", continually solve problems and think, it may seem that cognitive involvement would be the factor that is most important for a student's performance and readiness to make the effort. But this is not true. The emotional component makes a bigger contribution,' said Marie Arsalidou, one of the study authors. 'The kids need to feel happy and comfortable at school. And this makes sense: when you are in a place where you are happy, you are ready to work more.'

The researchers also observed no differences in engagement between boys and girls or among children of different ages. Previously, some international studies found that school engagement usually drops as children get older. The researchers of this most recent study suggest that the difference in their results may be related to their younger sample or to cultural practices: the unification of the study process across all grades in the Russian school system may produce similar engagement levels.

The translated questionnaire can now be used at Russian schools to evaluate how involved students are in the study process and how comfortable and interested they feel in class. The engagement measurement can then help prevent performance decrease and dropout. The survey is freely available here.

Credit: 
National Research University Higher School of Economics

First randomized clinical trial found no harms from dementia screening in primary care

image: Groundbreaking CHOICE study from Regenstrief Institute and Indiana University School of Medicine researchers finds no evidence that dementia screening is harmful. However, most patients declined to follow up on a positive screening result.

Image: 
Regenstrief Institute

INDIANAPOLIS - Research scientists at Regenstrief Institute and Indiana University School of Medicine have conducted the first randomized controlled trial to evaluate the pros and cons of population screening for dementia. The researchers found no harm, as measured by patient reported depressive and anxiety symptoms, from screening for Alzheimer's disease and related dementia in diverse rural, suburban and urban primary care clinics in Indiana.

Furthermore, the trial did not identify any benefit from screening in reducing emergency department visits and hospitalizations, or increasing advance care planning.

"Many patients and families have concerns that dementia screenings may create anxiety or depression in patients because there is, as yet, no cure for this disease. However, this study shows that is not the case," said Nicole Fowler, PhD, MHSA, associate director of the Indiana University Center for Aging Research at Regenstrief Institute.

Dr. Fowler, an assistant professor of medicine at IU School of Medicine and a Regenstrief research scientist, is first author of the new study, which is published in the Journal of the American Geriatrics Society.

"Dementia screening provides awareness for the patient and their family, allowing them to take action - including advance care planning -- and we now know that the screening does not harm the patient. Though we found many patients declined to follow up a positive screening, the knowledge obtained from a screening at least allows them to enter a watchful waiting period or choose to be engaged," Dr. Fowler said.

While 70 percent of study participants who screened positive for cognitive impairment declined a follow-up diagnostic assessment, those who did complete a follow-up and then received collaborative care had significantly fewer hospital admissions than study participants who were not screened but later developed cognitive impairment. Previous studies led by IU School of Medicine and Regenstrief research scientists have found that the collaborative dementia care model decreased behavioral and psychological symptoms in patients living with dementia and reduced healthcare utilization, resulting in annual cost savings ranging from $908 to $2,856 per patient.

"For a number of reasons, including the lack of drugs to treat dementia and the stigma around the condition, people are hesitant to engage in the next steps of the process after screening," said Dr. Fowler. "The health care system needs to help bridge this gap and encourage people to follow up on the results of screening tests as they would for any other condition."

The study noted that "the finding of statistical equivalence of screening on patients' symptoms of depression and anxiety" is important given that previous studies measuring the public's perceived attitude of dementia screening reported that patients were concerned that this screening would make them feel depressed or anxious. In fact, it did not.

Dementia affects more than 5 million people in the United States and is frequently unrecognized and underdiagnosed in primary care settings, where most older adults receive their health care. It is estimated that as many as half of primary care physicians are unaware of their older patients' cognitive status.

More than 4,000 primary care patients age 65 years and older were enrolled in the randomized, controlled Indiana University Cognitive Health Outcomes Investigation of the Comparative Effectiveness of Dementia Screening (CHOICE) trial. Two-thirds of study participants were female, and two-thirds were white.

"This study is groundbreaking because we have scientifically negated the concern that dementia screening may be harmful," said Malaz Boustani, M.D., MPH, the study's senior author and principal investigator of the trial. "Until now, the lack of evidence of potential harm of dementia screening has been a barrier to dementia screening in primary care. Hopefully our finding of no harm from screening has eliminated this obstacle."

Credit: 
Regenstrief Institute

Spinning quantum dots

The name 'quantum dots' is given to particles of semiconducting materials that are so tiny - a few nanometres in diameter - that they no longer behave quite like ordinary, macroscopic matter. Thanks to their quantum-like optical and electronic properties, they are showing promise as components of quantum computing devices, but these properties are not yet fully understood. Physicists Sanjay Prabhakar of Gordon State College, Georgia, USA and Roderick Melnik of Wilfrid Laurier University, Waterloo, Canada have now described the theory behind some of these novel properties in detail. This work is published in EPJ B.

In the coming quantum computing era, information storage and processing may depend on so-called spintronic devices that exploit the electron spin as well as its charge as a unit of information. This will only be possible, however, if the spin of a single electron can be controlled. Researchers have recently suggested that it should be possible to control the spin of electrons in quantum dots with electric fields through spin-orbit coupling, which is the interaction of the electron's spin with its motion. It is this interplay between electric fields and electron spins that Prabhakar and Melnik have now modelled.

Spin-orbit coupling leads to a split in an electron's energy levels, which can be detected as line splitting in a spectrum. The researchers simulated this effect in quantum dots made from different semiconductor materials, moving slowly through electric fields. They solved the Schrödinger equation for the system, observed strong beating patterns in the spin values and revealed that spin-orbit coupling occurs in these slowly moving dots, inducing a magnetic field in the absence of an external one. These emerging magnetic properties suggest that the dots could, indeed, have potential in quantum computing as storage and processing devices.
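The beating pattern the authors observed arises whenever two close-lying precession frequencies superpose. The toy two-level sketch below illustrates only that effect; the frequencies are hypothetical stand-ins, not output of the paper's full Schrödinger solver.

```python
import numpy as np

# Toy model: a spin-1/2 precessing under two nearby effective-field
# frequencies (stand-ins for two spin-orbit contributions). The spin
# expectation value is then the average of two cosines, whose envelope
# oscillates at the difference frequency -- the "beat".
omega1, omega2 = 1.00, 1.05           # hypothetical angular frequencies
t = np.linspace(0.0, 400.0, 4000)

sz = 0.5 * (np.cos(omega1 * t) + np.cos(omega2 * t))

# The beat period is set by how close the two frequencies are.
beat_period = 2 * np.pi / abs(omega2 - omega1)
print(f"beat period ~ {beat_period:.1f} (same time units as t)")
```

The closer the two frequencies, the longer the beat period, which is why the effect becomes prominent for slowly moving dots where the competing couplings are of comparable strength.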

Credit: 
Springer

Air pollution from oil and gas production sites visible from space

Oil and gas production has doubled in some parts of the United States in the last two years, and scientists can use satellites to see impacts of that trend: a significant increase in the release of the lung-irritating air pollutant nitrogen dioxide, for example, and a more-than-doubling of the amount of gas flared into the atmosphere.

"We see the industry's growing impact from space," said Barbara Dix, a scientist at the Cooperative Institute for Research in Environmental Sciences (CIRES) at the University of Colorado Boulder and lead author of the new assessment published in the AGU journal Geophysical Research Letters. "We really are at the point where we can use satellite data to give feedback to companies and regulators, and see if they are successful in regulating emissions."

Dix and a team of U.S. and Dutch researchers set out to see if a suite of satellite-based instruments could help scientists understand more about nitrogen oxides pollution (including nitrogen dioxide) coming from engines in U.S. oil and gas fields. Combustion engines produce nitrogen oxides, which are respiratory irritants and can lead to the formation of other types of harmful air pollutants, such as ground-level ozone.

On oil and gas drilling and production sites, there may be several small and large combustion engines, drilling, compressing gas, separating liquids and gases, and moving gas and oil through pipes and storage containers, said co-author Joost de Gouw, a CIRES Fellow and chemistry professor at CU Boulder. The emissions of those engines are not controlled. "Cars have catalytic converters, big industrial stacks may have emissions reduction equipment..." de Gouw said. "Not so with these engines."

Conventional "inventories" meant to account for nitrogen oxides pollution from oil and gas sites are often very uncertain, underestimating or overestimating the pollutants, de Gouw said. And there are few sustained measurements of nitrogen oxides in many of the rural areas where oil and gas development often takes place, Dix said.

So she, de Gouw and their colleagues turned to nitrogen dioxide data from the Ozone Monitoring Instrument (OMI) on board a NASA satellite and the Tropospheric Monitoring Instrument (TropOMI) on a European Space Agency satellite. They also looked at gas flaring data from an instrument on the NOAA/NASA Suomi satellite system.

Between 2007 and 2019, across much of the United States, nitrogen dioxide pollution levels dropped because of cleaner cars and power plants, the team found, confirming findings reported previously. The clean air trend in satellite data was most obvious in urban areas of California, Washington and Oregon and in the eastern half of the continental United States. "We've cleaned up our act a lot," Dix said.

However, several areas stood out with increased emissions of nitrogen dioxide: the Permian, Bakken and Eagle Ford oil and gas basins, located in Texas and New Mexico, North Dakota, and Texas, respectively.

In those areas, the scientists used a type of time-series analysis to figure out where the pollutant was coming from: Drilling of new wells vs. longer-term production. They could do this kind of analysis because drilling activity swings up and down quickly in response to market forces while production changes far more slowly (once a well is drilled, it may produce oil and natural gas for years or even decades).
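The attribution logic described above can be sketched as a least-squares decomposition: regress the observed pollution series on a fast-swinging drilling proxy and a slow-moving production proxy. Everything below is illustrative; the series, weights and noise are hypothetical stand-ins, not the study's data or method.

```python
import numpy as np

# Hypothetical monthly series: a drilling proxy that swings with market
# cycles, and a production proxy that ramps up slowly over the decade.
months = np.arange(120)
drilling = 1.0 + 0.5 * np.sin(2 * np.pi * months / 24)
production = 0.2 + 0.008 * months

# Observed NO2 column modeled as a weighted sum of the two, plus noise.
rng = np.random.default_rng(0)
no2 = 0.8 * drilling + 0.6 * production + rng.normal(0, 0.05, months.size)

# Least-squares fit recovers the per-activity emission factors, because
# the fast and slow components are statistically distinguishable.
X = np.column_stack([drilling, production])
coef, *_ = np.linalg.lstsq(X, no2, rcond=None)

drill_share = coef[0] * drilling.mean()
prod_share = coef[1] * production.mean()
print(f"drilling share of NO2: {drill_share / (drill_share + prod_share):.0%}")
```

The separation works for the reason the article gives: drilling responds to market forces on short timescales, while production changes slowly, so the two leave distinct fingerprints in the time series.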

Before a downturn in drilling in 2015, drilling generated about 80 percent of nitrogen dioxide from oil and gas sites, the team reported. After 2015, drilling and production produced roughly equal amounts of the pollutant. Flaring is estimated to contribute up to 10 percent in both time frames.

The researchers also developed a new oil and gas emissions inventory, using data on fuel use by the industry, the location of drilling rigs, and well-level production data. The inventory confirmed the satellite trends, said co-author Brian McDonald, a CIRES scientist working in NOAA's Chemical Sciences Division. "It is a promising development that what we observe from space can be explained by expected trends in emissions from the oil and gas industry."

"Scientifically, this is especially important: we can do source attribution by satellite," de Gouw said. "We need to know the important sources to address these emissions in the most cost-efficient manner."

Credit: 
University of Colorado at Boulder

NASA infrared data analyzes cloud top temperatures in Tropical Cyclone Claudia

image: On Jan. 14 at 1:23 a.m. EST (0623 UTC), NASA's Aqua satellite analyzed Tropical Storm Claudia using the Atmospheric Infrared Sounder or AIRS instrument. AIRS found the coldest cloud top temperatures (purple) as cold as or colder than minus 63 degrees Fahrenheit (minus 53 degrees Celsius) north and east of Claudia's center.

Image: 
NASA JPL/Heidar Thrastarson

Satellite data of Tropical Cyclone Claudia's cloud top temperatures revealed that the storm was weakening.

One of the ways NASA researches tropical cyclones is using infrared data that provides temperature information. The AIRS instrument aboard NASA's Aqua satellite captured a look at those temperatures in Claudia's cloud tops and got insight into the storm's strength.

Cloud top temperatures provide information to forecasters about where the strongest storms are located within a tropical cyclone. Tropical cyclones do not always have uniform strength; some sides are stronger than others. The stronger the storms, the higher they extend into the troposphere, and the colder the cloud top temperatures.

On Jan. 14 at 1:23 a.m. EST (0623 UTC), NASA's Aqua satellite analyzed the storm using the Atmospheric Infrared Sounder or AIRS instrument. AIRS found that the coldest cloud top temperatures were warming, an indication that the uplift of air in the storm was not as strong as before. AIRS found temperatures as cold as or colder than minus 63 degrees Fahrenheit (minus 53 degrees Celsius) around Claudia's center. NASA research has shown that cloud top temperatures that cold indicate strong storms with the capability to create heavy rain.
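The threshold test described above, flagging cloud tops at or below minus 63 degrees Fahrenheit (minus 53 degrees Celsius), can be sketched as a simple mask over a temperature grid. The grid values below are hypothetical, not actual AIRS data.

```python
import numpy as np

# Hypothetical grid of cloud-top brightness temperatures (degrees Celsius).
cloud_top_c = np.array([
    [-40.0, -55.0, -61.0],
    [-48.0, -63.0, -70.0],
    [-30.0, -52.0, -58.0],
])

# The article's rule of thumb: tops at or below -53 C (-63 F) mark storms
# strong enough to produce heavy rain.
THRESHOLD_C = -53.0
strong = cloud_top_c <= THRESHOLD_C
print(f"strong-storm pixels: {strong.sum()} of {cloud_top_c.size}")

# Fahrenheit equivalent of the threshold, for cross-checking the article.
threshold_f = THRESHOLD_C * 9 / 5 + 32
```

Tracking how the count of such pixels shrinks from one overpass to the next is one way to quantify the weakening the forecasters describe.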

On Jan. 15, satellite imagery showed strongest storms within Claudia were separated well to the west of the low level center, indicating wind shear from the east was tearing the storm apart. The Joint Typhoon Warning Center noted, "Central convection has begun to unravel and elongate as convective tops warmed." Claudia is expected to weaken further as it moves over cooler waters.

At 7:55 a.m. EST (8:55 p.m. AWST) on Jan. 15, the Australia Bureau of Meteorology noted that Claudia had maximum sustained winds near 40 mph (65 kph) and was weakening. It was located near latitude 20.7 degrees south and longitude 105.8 degrees east.

Tropical Cyclone Claudia continues to move towards the southwest, well away from the Western Australia coast. It is expected to become a depression by Jan. 16 and weaken to a remnant low-pressure area.

Typhoons and hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

The AIRS instrument is one of six instruments flying on board NASA's Aqua satellite, launched on May 4, 2002.

Credit: 
NASA/Goddard Space Flight Center

Researchers discover novel potential target for drug addiction treatment

MINNEAPOLIS, MN - January 15, 2020 - New University of Minnesota Medical School research identifies a novel potential target for treating drug addiction through "the hidden stars of the brain."

Dopamine is one of the major reward molecules of the brain and contributes to learning, memory and motivated behaviors. Disruption of dopamine signaling is associated with addiction-related disorders, such as amphetamine use and abuse.

A new study published in Neuron suggests that targeting astrocyte calcium signaling could decrease the behavioral effects of amphetamine. The study was co-led by Michelle Corkrum, PhD, a third-year medical student in the Medical Scientist Training Program (MD/PhD) at the University of Minnesota Medical School, and Ana Covelo, PhD, in the lab of Alfonso Araque, PhD, and in collaboration with Mark Thomas, PhD.

Named for their star shape, astrocytes are what Corkrum describes as "the hidden stars of the brain." Astrocytes have traditionally been considered "support cells" of the brain and have been overlooked as active contributors to brain function. This study shows that astrocytes do contribute to information processing and to how organisms think and function in the world.

Corkrum and colleagues found that astrocytes respond to dopamine with increases in calcium in the nucleus accumbens, one of the major reward centers in the brain. This increase in calcium was related to the release of ATP/adenosine to modulate neural activity in the nucleus accumbens. Then, they looked specifically at amphetamine because it is known to increase dopamine and psychomotor activity in organisms. They found that astrocytes respond to amphetamine with increases in calcium, and if astrocyte activity is ablated, the behavioral effect of amphetamine decreases in a mouse model.

"These findings suggest that astrocytes contribute to amphetamine signaling, dopamine signaling and overall reward signaling in the brain," Corkrum said. "Because of this, astrocytes are a potentially novel cell type that can be specifically targeted to develop efficacious therapies for diseases with dysregulated dopamine."

Corkrum attributes the success of this study to the collaborative nature of the University of Minnesota and of its graduate program in neuroscience. "We were able to integrate the phenomenal resources that the U of M offers to conduct state-of-the-art research and work with numerous different core facilities, which played key roles in this study," Corkrum said.

Corkrum plans to continue this research and explore what happens with repeated exposures, withdrawal and reinstatement of amphetamine and how the stage of addiction or disease state could affect the need to increase or decrease the activity of astrocytes.

Credit: 
University of Minnesota Medical School

BU and Thai researchers find strengths and gaps in Thailand diabetes care

As Thailand transitions to a high-middle-income country, noncommunicable chronic diseases such as diabetes are on the rise.

A new study by researchers from Boston University School of Public Health (BUSPH) and Chulalongkorn and Mahidol Universities in Bangkok identifies the strengths and weaknesses of diabetes care in Thailand's universal health system. Published in the journal PLOS ONE, the study found that the majority of Thai adults with diabetes were never diagnosed, but that most of those who were diagnosed did receive treatment and got the condition under control.

"Our findings highlight both the achievements of universal health care in Thailand and also the opportunities that remain both on a national level and regionally to ensure that people living with diabetes are integrated into care," says Dr. Andrew Stokes, assistant professor of global health at BUSPH and the study's corresponding author.

The researchers used data from the 2014 Thai National Health Examination Survey, which included both face-to-face interviews and a physical exam portion that collected blood samples after overnight fasting. Of the 15,663 Thai adults included in the study, 8.8% appeared to have diabetes based on their blood samples and/or reporting being treated for diabetes. Of those who appeared to have diabetes, the researchers found that 67.0% reported ever being screened for diabetes, 34.0% reported being diagnosed, 33.3% had been treated, and 26.0% had their diabetes under control.
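The reported proportions form a classic care cascade, in which each stage retains only a fraction of those who need care. A minimal sketch, using the study's published percentages but a hypothetical cohort size (the survey microdata are not reproduced here):

```python
# Care-cascade proportions reported in the study, applied to a
# hypothetical cohort of adults who appeared to have diabetes.
n_with_diabetes = 1000

cascade = {
    "screened":   0.670,
    "diagnosed":  0.340,
    "treated":    0.333,
    "controlled": 0.260,
}

counts = {stage: round(n_with_diabetes * p) for stage, p in cascade.items()}
for stage, n in counts.items():
    print(f"{stage:>10}: {n} of {n_with_diabetes}")

# The steepest drop is between screening and diagnosis -- the gap the
# authors target with community-based testing.
drop = counts["screened"] - counts["diagnosed"]
```

Laying the numbers out this way makes the study's headline finding concrete: the system loses most people before diagnosis, while treatment and control track diagnosis fairly closely.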

"Thai healthcare systems may have put emphasis on expanding coverage both in terms of population coverage and medical care benefit packages, which they did quite well with relatively low cost (and limited resources). Nevertheless, this paper highlights the importance of improving the quality of care, especially primary care and public health promotion and disease prevention," says study co-author Dr. Piya Hanvoravongchai, lecturer in the Department of Preventive and Social Medicine in the Faculty of Medicine at Chulalongkorn University.

The researchers found that living in areas with more medical staff and health centers, such as the south and central parts of the country and in urban centers, as well as being older, made a patient more likely to be diagnosed and to have their diabetes under control.

"This suggests that investing in infrastructure in resource-scarce areas could have improved outcomes for diabetes in Thailand. To address the gaps at screening and diagnosis, that might require also expanding efforts beyond the clinic into community settings where people could get tested for high blood sugar where they live and work, and then get linked with medical care," says study lead author Dr. Lily Yan, a resident at Boston Medical Center and a master of science student at BUSPH. "In order for health systems to intervene effectively, they have to first recognize that there is a problem."

Credit: 
Boston University School of Medicine

Scientists pioneer new generation of semiconductor neutron detector

image: Neutron detector based on a single crystal of LiInP2Se6.

Image: 
Argonne National Laboratory

Whether you are trying to detect a possible radiation signature from a suspicious package or vehicle, or you are measuring power output in a nuclear reactor, being able to detect neutrons efficiently and precisely represents a significant challenge.

Most neutron detectors work based on one of two different technologies. Some, like those based on helium, are gas-filled. Others, like those based on lithium or boron, involve scintillators that absorb neutrons and emit light in response. In neither case are the neutrons converted directly into an electrical current, and thus into a directly readable signal.

In a new study from the U.S. Department of Energy’s (DOE) Argonne National Laboratory and Northwestern University, scientists have developed a new type of semiconductor neutron detector that boosts detection rates by reducing the number of steps involved in neutron capture and transduction.

“Our material shows that semiconductors that have been previously discounted can be promising if you have the right crystal.” — Mercouri Kanatzidis, Argonne materials scientist

This new material, called LiInP2Se6, converts neutrons into pairs of charged electrons and holes. When a voltage is applied to the material, the electron-hole pairs separate, and a current is generated.
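The direct-conversion readout can be sketched as pulse-height counting: each absorbed neutron yields a charge pulse near a characteristic energy, so events pile up in a narrow peak rather than a broad smear. All numbers below are hypothetical, chosen only to illustrate the counting logic.

```python
import numpy as np

# Hypothetical pulse-height spectrum from a direct-conversion detector:
# each neutron capture deposits roughly the same energy, so pulse heights
# cluster in a narrow peak (here, a Gaussian around a nominal energy).
rng = np.random.default_rng(1)
peak_energy_kev = 4780                 # hypothetical capture-energy scale
pulses = rng.normal(peak_energy_kev, 60, size=5000)

# Count events in a window around the expected peak -- the "higher and
# narrower peak" the article attributes to semiconductor detectors.
window = (pulses > peak_energy_kev - 200) & (pulses < peak_energy_kev + 200)
print(f"events in peak window: {window.sum()} of {pulses.size}")
```

A narrow, well-defined peak is also what makes the "neutron forensics" described later possible: the position of the peak encodes the energy, and hence the likely source, of the detected neutrons.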

“The true advantage of this semiconductor compared to other types of materials is that it is able to directly detect thermal neutrons,” said Argonne materials scientist Duck Young Chung, one of the authors of the study. “That improves the sensitivity of this detector because it doesn’t require an amplifier and a whole process.”

When neutrons are converted into charged particles in scintillators, many of them are lost in the process of generating a current and being detected. This is because the absorbent lithium atoms are dispersed in a relatively low concentration in these materials, requiring a thicker layer to absorb the neutrons. By contrast, the neutron absorbent part of the semiconductor is much more concentrated, reducing the loss of signal.

Semiconductor-based technologies also offer higher energy resolution than scintillators in general, which translates into a higher sensitivity to the absorbed neutrons. “Instead of having a multiple-step process in which you lose a lot of your particles, you now have much higher sensitivity,” Chung said.

Northwestern graduate student Daniel Chica and postdoctoral researcher Yihui He succeeded in producing high-quality crystals of 6Li-enriched LiInP2Se6, which they made into a simple device capable of thermal neutron detection when exposed to a weak source.

The sensitivity of the detector registers as a higher and narrower peak reading of a characteristic energy signature associated with the neutrons being detected. It is for this reason that another advantage of the semiconductor material could lie in its ability to do what researchers call “neutron forensics.”

“Essentially, by knowing the energy of the neutrons you are detecting, you can determine exactly what isotope produced them,” said Argonne materials scientist and Northwestern University professor Mercouri Kanatzidis. 

Previous semiconductor materials for neutron detection have been difficult and expensive to make. “There are only a few materials that have been studied as semiconductors for neutron detection, and there isn’t a lot of research being done into new options,” Chung said. “But studies like this might lay the groundwork for new studies that could drive down the cost.”

According to Kanatzidis, this study could lead to a renaissance of interest in semiconductor technology for neutron detection. “The idea for this has existed, but no one has found the right material to demonstrate it; other materials were plagued by materials issues that caused people to give up because they could not attain the proper performance,” he said. “Here our material shows that semiconductors that have been previously discounted can be promising if you have the right crystal.”

Credit: 
DOE/Argonne National Laboratory