
Two drugs used in combination prove to be effective against the most aggressive asbestos-related cancer in mice

image: Section of a mouse invasive mesothelioma tumour.

Image: 
CNIO

Exposure to asbestos is the major risk factor for malignant mesothelioma, a type of aggressive cancer that mostly affects the pleura - the tissue that lines the lungs - and the peritoneum - the tissue that lines the abdomen. This has been known for decades, yet mesothelioma remains poorly characterised at the molecular level and treatment options are scarce. A study carried out by researchers at the Spanish National Cancer Research Centre (CNIO) on key molecular processes in this type of cancer identified two drugs that, used in combination, could be effective against the most aggressive type of mesothelioma. The compounds are already being used in clinical trials for other tumours, a fact that might help speed up studies on mesothelioma patients.

In addition, the authors of the study propose a set of molecular markers to predict therapy response, which could be used to determine which patients are eligible for the clinical trials.

The study, led by Paco Real, Head of the Epithelial Carcinogenesis Group at CNIO, and researcher Miriam Marqués, from the same Group, is being published in Cancer Research.

"The results of the study are relevant because mesothelioma has not been widely studied, has a poor prognosis and no targeted therapies," says Paco Real. "We have found evidence in mice and human cell lines of the potential effectiveness of two compounds that are by no means new. Now we want to start performing clinical trials as soon as possible."

Mesothelioma - a rare form of cancer

In Spain, five out of a million people die of mesothelioma every year. This figure is likely to increase over the coming years, both in Spain and across Europe. Mesothelioma is commonly considered an occupational disease occurring after decades of exposure to asbestos in the workplace. In Spain, asbestos was widely used in the construction industry, especially in the 1960s, until it was banned as a potential health hazard in 2001.

Mesothelioma is a relatively rare form of cancer and, consequently, it is rarely studied in depth. The most aggressive type of mesothelioma is known as sarcomatoid carcinoma and accounts for nearly 25% of cases. Survival times of patients with this type of mesothelioma who get the currently available therapies range between 14 and 18 months.

It is precisely on sarcomatoid mesothelioma that the study carried out by the Epithelial Carcinogenesis Group at CNIO focuses.

Unexpected finding

The results of the study were to a great extent unexpected. The researchers were working on bladder cancer in mice, focusing on two tumour suppressor genes: Pten and Trp53. Surprisingly, the mice did not develop bladder cancer but an aggressive form of sarcomatoid mesothelioma. (The authors believe that, given the fast development of this aggressive form of mesothelioma, there was no time for bladder tumours to develop.)

Looking for the molecular cause of the rapid spread of the tumour cells, the researchers identified enhanced activity of the MEK/ERK and PI3K pathways via novel mechanisms involving a protein of the GPCR family. Blockage of these pathways with MEK and PI3Kbeta inhibitors reduced the proliferative and invasive capacities of mesothelioma cells in mice.

Drug testing under way

The next step was to look for drugs that inhibit the above-mentioned proteins and that were already being tested in humans. These drugs are Selumetinib, a MEK inhibitor, and AZD8186, a molecule that targets PI3Kbeta.

In Paco Real's words, "First we identified altered molecules in sarcomatoid mesothelioma in mice, then we verified that the proteins were also altered in sarcomatoid mesothelioma in humans and found a drug combo that was effective both in mice and in human cell lines. We chose drugs that were already being tested for other human tumours, because their toxicity is well understood and so they can be used in clinical trials soon."

Since Pten and TP53 are tumour suppressor genes, and MEK and PI3Kbeta show enhanced activity in many other human tumours, the findings might be relevant to types of cancer other than malignant mesothelioma. Marqués and Real are studying this possibility as well.

Credit: 
Centro Nacional de Investigaciones Oncológicas (CNIO)

Patient step counts predict lung cancer treatment outcomes, study finds

ARLINGTON, Va., January 8, 2020 -- Numerous studies have shown that monitoring physical activity promotes better health, from helping to reduce body mass index to flagging early signs of hypertension. A new study suggests step counters could play yet another role: predicting outcomes for people undergoing chemoradiation therapy for lung cancer.

"I consider step counts to be a new vital sign for cancer treatment," said Nitin Ohri, MD, a radiation oncologist at Albert Einstein College of Medicine and Montefiore Health System in New York and lead investigator of the study published in the November 2019 issue of the International Journal of Radiation Oncology • Biology • Physics, the flagship scientific journal of the American Society for Radiation Oncology (ASTRO) (and made available online ahead of publication in August 2019). "We found that tracking our patients' activity levels prior to treatment could give clinicians data with powerful implications."

"Our study shows that people who are inactive for their age will have a significantly more difficult time with radiation therapy. They are more likely to end up in the hospital, experience treatment delays and disease recurrence; and are less likely to survive. This is valuable information worth considering when making treatment decisions."

Dr. Ohri and his team measured activity levels for 50 patients with locally advanced, non-small cell lung cancer who wore step counters prior to undergoing concurrent chemoradiation therapy. Participants were categorized as inactive, moderately active or highly active, based on the number of steps they took each day and adjusted for their age.
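Below is a minimal sketch, not the study's actual method, of how a per-patient mean step count might be turned into the inactive, moderately active and highly active categories described above; the age adjustment and thresholds are invented for illustration only.

```python
# Illustrative sketch only: the reference curve and cutoffs below are
# assumptions, not values from Dr. Ohri's study.
def activity_category(mean_daily_steps: float, age: int) -> str:
    """Classify a patient's pre-treatment activity level, adjusted for age."""
    # Assumed reference: ~6,000 steps/day at age 60, declining modestly with age.
    expected_for_age = max(6000.0 - 50.0 * (age - 60), 1000.0)
    ratio = mean_daily_steps / expected_for_age
    if ratio < 0.5:
        return "inactive"
    if ratio < 1.0:
        return "moderately active"
    return "highly active"

print(activity_category(mean_daily_steps=2200, age=68))  # -> inactive
print(activity_category(mean_daily_steps=7500, age=62))  # -> highly active
```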

Researchers found dramatic differences in how well patients in each group fared during treatment, with people who were inactive at baseline faring poorest. For example, half of the people in the inactive group had to be hospitalized during treatment, compared to just 9% of people who were more active. Only about 10% of inactive patients were alive and without disease after 18 months, compared to roughly 60% of those who were more active. Overall, 45% of inactive patients were still alive after 18 months, compared to more than 75% of those who were more active.

While this study focused on activity levels prior to the start of treatment, previous research by Dr. Ohri has shown that patients often become less active during treatment, with negative consequences. "When activity levels declined during treatment, that was an indicator that patients are at high risk for hospitalization within the next few days," he said.

While Dr. Ohri acknowledges that the modest study size limits its ability to change clinical practice, he hopes his findings will encourage investigators in large, multi-institutional clinical trials of new cancer treatments to monitor patient activity levels as part of their data collection.

"If someone's step counts decrease dramatically during treatment - say, from 5,000 to 2,000 steps a day - that change needs to spark some conversations. Having an objective indicator of patients' functional status could be critical in identifying who needs extra care during treatment."

Credit: 
American Society for Radiation Oncology

Study finds losing a night of sleep may increase blood levels of Alzheimer's biomarker

A preliminary study by researchers at Uppsala University has found that when young, healthy men were deprived of just one night of sleep, they had higher levels of tau - a biomarker for Alzheimer's disease - in their blood than when they had a full, uninterrupted night of rest. The study is published in the medical journal Neurology.

Tau is a protein found in neurons that can form tangles, which accumulate in the brains of people with Alzheimer's disease. This accumulation can start decades before symptoms of the disease appear. Previous studies of older adults have suggested that sleep deprivation can increase the level of tau in the cerebrospinal fluid. Trauma to the head can also increase circulating concentrations of tau in blood.

"Many of us experience sleep deprivation at some point in our lives due to jet lag, pulling an all-nighter to complete a project, or because of shift work, working overnights or inconsistent hours," said study author Jonathan Cedernaes, MD, PhD, from Uppsala University in Sweden. "Our exploratory study shows that even in young, healthy individuals, missing one night of sleep results in a slight increase in the level of tau in blood. This suggests that over time, similar types of sleep disruption could potentially have detrimental effects."

The study involved 15 healthy, normal-weight men with an average age of 22. They all reported regularly getting seven to nine hours of quality sleep per night.

There were two phases to the study. For each phase, the men were observed under a strict meal and activity schedule in a sleep clinic for two days and nights. Blood samples were taken in the evening and again in the morning. For one phase, participants were allowed to get a good night of sleep both nights. For the other phase, participants were allowed to get a good night of sleep the first night followed by a second night of sleep deprivation. During sleep deprivation, lights were kept on while participants sat up in bed playing games, watching movies or talking.

Researchers found that the men had an average 17-percent increase in tau levels in their blood after a night of sleep deprivation compared to an average 2-percent increase in tau levels after a good night of sleep.
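The percent-change arithmetic behind those figures is straightforward; the sketch below uses made-up tau concentrations purely to illustrate how an evening-to-morning change is expressed relative to the evening baseline.

```python
# Illustrative only: the tau concentrations below are invented, not study data.
def percent_change(evening: float, morning: float) -> float:
    """Overnight change in blood tau, relative to the evening baseline."""
    return (morning - evening) / evening * 100.0

print(f"{percent_change(evening=2.00, morning=2.34):.0f}%")  # ~17%, like the sleep-deprived night
print(f"{percent_change(evening=2.00, morning=2.04):.0f}%")  # ~2%, like the normal night
```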

Researchers also looked at four other biomarkers associated with Alzheimer's but there were no changes in levels between a good night of sleep and one night of no sleep.

"It's important to note that while accumulation of tau in the brain is not good, in the context of sleep loss, we do not know what higher levels of tau in blood represent" said Cedernaes. "When neurons are active, release of tau in the brain is increased. Higher levels in the blood may reflect that these tau proteins are being cleared from the brain or they may reflect an overall elevation of the concentration of tau levels in the brain. Future studies are needed to investigate this further, as well as to determine how long these changes in tau last, and to determine whether changes in tau in blood reflects a mechanism by which recurrent exposure to restricted, disrupted or irregular sleep may increase the risk of dementia. Such studies could provide key insight into whether interventions targeting sleep should begin at an early age to reduce a person's risk of developing dementia or Alzheimer's disease."

The main limitation of the study was its small size. In addition, it looked only at healthy young men, so the results may not be the same for women or older people.

Credit: 
Uppsala University

How the rice blast fungus 'eats' its own cell wall to launch an attack

image: During infection in rice blast fungus, chitin from its cell wall breaks down into chitosan and acetic acid. This process stimulates cell differentiation in the host plant, leading to appressorium formation, which then facilitates the infection process.

Image: 
Dr. Takashi Kamakura

All living organisms respond and adapt to changes in their environment. These responses are sometimes so significant that they cause alterations in the internal metabolic cycles of the organism--a process called "metabolic switching." For example, rice blast fungus--a pathogenic fungal species that causes the "rice blast" infection in rice crops--switches to the "glyoxylate cycle" when the nutrient source starts to deplete. Another response to environmental change is called "cell differentiation", where cells switch to another type altogether. In rice blast fungus, for example, the fungal cells differentiate and generate a large amount of pressure on the cell wall, causing the fungus to develop a specialized structure called "appressorium," which ultimately facilitates the infection. Such methods of adaptation have been seen across various organisms, but exactly how they occur is not very clear yet.

In a recent study published in iScience, a team of researchers at Tokyo University of Science, led by Prof Takashi Kamakura, found for the first time that extremely low concentrations of acetic acid alter cellular processes in rice blast fungus. Their research was based on the fact that Cbp1--a protein that can remove acetyl groups from chitin (the main component of the cell wall of fungi)--plays a huge role in appressorium formation by converting chitin into chitosan and releasing acetic acid. Explaining the objective of the study, Prof Kamakura says, "Metabolic switching in nutrient-deficient environments depends on changes in the nutrient source, but its mechanism has remained poorly understood until now. Since chitin was known to induce a subsequent resistance response (immune response), we speculated that Cbp1 functions to escape recognition from plants. Also, because the enzymatic activity of Cbp1 affects cell differentiation, we hypothesized that the reaction product of chitin deacetylation by Cbp1 may be a signal for cell differentiation."

For their study, the scientists used a mutant form of the fungus that did not produce Cbp1 and thus could not form appressorium as it was unable to produce acetic acid from chitin. The scientists observed that when minuscule concentrations of acetic acid, even as low as a hundred molecules per fungal spore, were added, appressorium formation was restored in the mutants. This implied that acetic acid could act as a chemical signal to trigger cell differentiation. Then, to better understand the role of acetic acid in the glyoxylate cycle, the researchers focused on an enzyme unique to this metabolic pathway: isocitrate lyase. They found that the mutant forms of the fungus had much lower levels of this enzyme, meaning that they could not switch to the glyoxylate cycle. But, as seen before, the addition of acetic acid at an extremely low concentration was enough to restore normal levels of the enzyme and thus induce appressorium formation. "Our study is the first to reveal the novel role of acetic acid in metabolic switching and cell differentiation in eukaryotic cells," remarks Prof Kamakura.

Interestingly, these findings indicate that using chitin molecules from their own cell wall could be a survival strategy used by several types of bacteria and fungi. This would allow them to thrive in environments deprived of nutrients--such as on the surface of a host leaf--and avoid host defense/immune mechanisms. Acetic acid could then be used both as a carbon source and as a signal to trigger metabolic switching and cell differentiation. Prof Kamakura explains, "The use of acetic acid obtained from a pathogen's own cell wall for the activation of the glyoxylate cycle is perhaps a general mechanism in various infection processes."

The finding that extremely low concentrations of a small, simple molecule like acetic acid could induce significant changes in cellular processes is remarkable; comparable effects were previously known only for certain hormones in animals. Understanding this type of inter-species chemical interaction could prove to be immensely valuable in agriculture, bioengineering, and medicine, to name a few areas. Prof Kamakura concludes, "It is yet to be found whether this phenomenon is common to other organisms. But, since metabolites such as butyric acid derived from human intestinal bacteria are involved in immune cell activation and cancer progression, our findings have implications in a wide variety of fields, including medicine and agriculture."

Credit: 
Tokyo University of Science

Evolving landscape added fuel to Gobi Desert's high-speed winds

image: A new study finds that the dark, rocky landscape of the Hami basin in the Gobi Desert helped to make it one of the windiest places in China.

Image: 
Alex Pullen and Paul Kapp

On February 28, 2007, harsh winds blew 10 train cars off a track running near China's Hami basin, killing three passengers and seriously injuring two others. Hurricane-force gusts of 75 mph or more scour this basin every 15-20 days or so, on average, and can reach maximum speeds of more than 120 mph. A study published last week in Nature Communications has documented a new feedback loop that may have helped to make this basin in the Gobi Desert one of the windiest places in China.

"It's an odd-looking environment because it's covered by these dark-colored gravels," explained lead author Jordan Abell. "It's really hot, and can be extremely windy. Our team wondered if the surface plays any role in these extreme conditions." Abell is a graduate student at Columbia University's Lamont-Doherty Earth Observatory and the Department of Earth and Environmental Sciences. His advisor is Lamont-Doherty geochemist Gisela Winckler, also a co-author on the paper.

The Hami basin may once have been covered in a fine, light-colored sediment, similar to California's Death Valley. Within the past 3 million years, however, strong winds carried away those fine sediments, leaving behind a sea of gray and black rocks.

Using a weather and forecasting model, Abell and his colleagues studied how this change from light to dark landscape affected wind speeds in the basin. By absorbing more sunlight, the darker stones exposed by wind erosion heated up the air within the depression. The team found that the resulting differences in temperature between the depression and the surrounding mountains increased wind speeds by up to 25 percent. In addition, the amount of time the area experiences high wind speeds increased by 30 to 40 percent.
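The core of that feedback is an albedo effect: darker surfaces reflect less sunlight and so absorb more energy. The sketch below is a back-of-the-envelope illustration with assumed albedo and insolation values, not the weather-forecasting model the team actually used.

```python
# Back-of-the-envelope illustration; albedo and insolation values are assumptions.
SOLAR_IN = 900.0               # assumed clear-sky insolation, W/m^2
ALBEDO_LIGHT_SEDIMENT = 0.35   # assumed reflectance of pale, fine sediment
ALBEDO_DARK_GRAVEL = 0.10      # assumed reflectance of dark desert gravel

def absorbed_flux(insolation: float, albedo: float) -> float:
    """Shortwave energy absorbed by the surface, in W/m^2."""
    return insolation * (1.0 - albedo)

light = absorbed_flux(SOLAR_IN, ALBEDO_LIGHT_SEDIMENT)
dark = absorbed_flux(SOLAR_IN, ALBEDO_DARK_GRAVEL)
print(f"light surface absorbs {light:.0f} W/m^2, dark surface {dark:.0f} W/m^2")
print(f"extra heating over the darker surface: {dark - light:.0f} W/m^2")
```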

Thus, by changing how much sunlight the ground absorbs, wind erosion appears to have exacerbated wind speeds in this region. It's the first time this positive feedback loop has been described and quantified, said Abell.

But it's probably not the only example of its kind. The researchers think this interaction may have helped to shape other stony deserts in Australia, Iran, and perhaps even on Mars.

Understanding this relationship between landscape changes, albedo, and wind erosion may help to make climate simulations more accurate for both the past and future.

Climate models typically do not account for changes in the reflectance of landscapes other than those caused by ice and vegetation. They also tend to assume arid landscapes remain unchanged over time. That could be problematic in some cases, said Abell.

"If you wanted to calculate the wind or atmospheric circulation in this area 100,000 years ago, you would need to consider the change in the surface geology, or else you could be incorrect by 20 or 30 percent," he said.

He added that the newly discovered relationship could also help to accurately model how other landscape changes, such as urbanization and desertification, influence atmospheric patterns by changing the reflectance of the earth's surface.

Credit: 
Columbia Climate School

Plant-derived SVC112 hits cancer stem cells, leaves healthy cells alone

image: Tin Tin Su, PhD, and colleagues show promise of new drug SVC112 against cancer stem cells

Image: 
Paul Muhlrad

The red, tube-shaped flowers of the firecracker bush (Bouvardia ternifolia), native to Mexico and the American Southwest, attract hummingbirds. The bush also produces the chemical bouvardin, which the lab of University of Colorado Cancer Center and CU Boulder researcher Tin Tin Su, PhD, and others have shown slows a cancer's ability to make the proteins that tell cancer cells to grow and spread. Now a paper based on nearly half a decade of work, published in the journal Cancer Research, shows that the molecule SVC112, which is based on bouvardin and was synthesized by Su's Colorado-based pharmaceutical startup, SuviCa, Inc., acts specifically against head and neck cancer stem cells (CSCs), resulting in better tumor control with less toxicity to healthy cells than existing, FDA-approved protein synthesis inhibitors. The group hopes these promising preclinical results will lay the groundwork for human clinical trials of SVC112 in head and neck cancer patients.

"Proteins are the keys to initiating genetic programs in the cells to tell them Now you grow, now you stay put, now you metastasize. And those proteins are called transcription factors," says paper co-senior author, Antonio Jimeno, MD, PhD, director of the Head and Neck Cancer Clinical Research Program and co-leader of the Developmental Therapeutics Program at CU Cancer Center, member of the Gates Center for Regenerative Medicine, and the Daniel and Janet Mordecai Endowed Chair for Cancer Stem Cell Research at the CU School of Medicine.

Cancer stem cells (CSCs) are a subpopulation of cancer cells that, like healthy stem cells, act as factories, manufacturing cells that make up the bulk of a cancer's tissue. Unfortunately, CSCs often resist treatments like radiation and chemotherapy, and can survive to restart tumor growth once treatment ends.

"Many groups have linked the production of transcription factors to the survival and growth of cancer stem cells, but inhibitors have just been too toxic - they come with too many side effects. Definitely our studies suggest that this drug could be an advantage over existing drugs. It inhibits protein synthesis in a way that no other drug does and that's why we're excited," says Su, who is also co-leader of the CU Cancer Center Molecular and Cellular Oncology Program.

Importantly, the group's work showed that SVC112 acts specifically against proteins like Myc and Sox2 needed by cancer stem cells, while leaving healthy cells relatively unharmed. They did this by comparing the effects of the drug in "matched pairs" of cancer cells and healthy cells grown from samples graciously donated by five head and neck cancer patients in Colorado. For further comparison, the group did the same experiments with the FDA-approved protein synthesis inhibitor known as omacetaxine mepesuccinate (also called homoharringtonine, or HHT).

"Having cancer cells along with matched non-cancer cells from the same patient is pretty unique. When we tested these matched pairs with SVC112 and with HHT, what we saw is the approved drug eliminated both cancer and normal cells, whereas SVC112 had selectivity - it affected cancer cells but not healthy cells - so theoretically the effects on the normal tissue will be less," Su says. In fact, healthy cells were between 3.8 and 5.6 times less sensitive to SVC112 than were cancer cells (healthy cells and cancer cells were equally sensitive to the FDA-approved drug HHT).

The next step was using SVC112 to treat head and neck tumors grown in mouse models from samples of human tumors. Earlier work had shown that SVC112 sensitized previously radiation-resistant CSCs to radiation treatment, and so the group tested SVC112 and radiation alone and in combination.

"What we saw is that only when you decrease the population of cancer stem cells to under 1 percent of the total makeup of a tumor did the tumor shrink," Jimeno says. "It's like cancer stem cells are in the control tower, directing the growth of the tumor. If you impair enough of these directors, other cancer cells don't know what to do and cancer growth slows down or stops."

Ongoing work continues in two major directions, with Su's team continuing to propel the drug toward the clinic and Jimeno's team working to understand the basic biology driving the drug's action, how best to combine it with other treatments such as radiation or immunotherapy, and its potential uses in other cancer types.

"This is the first report of the drug, from the drug's chemical structure, its basic effects on commercial cell lines, to its mechanism of action with patient-derived cell lines and more complex action on CSCs, all the way to animal models from patient samples," Jimeno says.

Early drug development undertaken outside the funding structure of established pharmaceutical sponsors often requires contributions from many sources, and the current project is no exception, receiving support from subcontracts to SuviCa's Small Business Innovation Research (SBIR) award, a National Institutes of Health grant to the Su lab, pilot funding from the CU Cancer Center, and philanthropic support from the Gates Center and the CU School of Medicine.

"We are so grateful for the belief from all these organizations and individuals, and especially to our patients, whose courage has been essential in making the models we need to test this new drug," says Jimeno.

Proposals are already underway to take the next important step: Testing SVC112 in an early human clinical trial.

Credit: 
University of Colorado Anschutz Medical Campus

Experiments into amorphous carbon monolayer lend new evidence to physics debate

Plastic, glass and gels, also known as bulk amorphous materials, are everyday objects to all of us. But for researchers, these materials have long been scientific enigmas - specifically when it comes to their atomic makeup, which lacks the strict ordered structure of crystals found in most solids such as metals, diamonds and salts.

Although generally believed by the scientific community to be continuous random networks of atoms, a long-standing, fundamental question existed: Are amorphous materials truly continuous random networks or do they have nanocrystallites embedded within them?

Now, we finally have answers - thanks to a new study detailing the first successful experiments growing, imaging with atomic resolution, and investigating the properties of two-dimensional amorphous carbon. The paper appears today in Nature and is published by an international team of researchers, including Sokrates Pantelides, University Distinguished Professor of Physics and Engineering at Vanderbilt University.

"For the first time, thanks to the discovery of this monolayer material, we're able to confirm the composition of an amorphous structure as a random network containing nanocrystallites, lending strong evidence to one side of the primordial debate," said Pantelides. "But this work not only provides answers; It presents a physical, two-dimensional carbon material, distinct from the lauded graphene, with potentially promising applications well into our future."

Future device applications of the material, according to Pantelides, could include anti-corrosion barriers for magnetic hard discs in future computers and for current collector electrodes in batteries.

The questions regarding amorphous material composition persisted for years due to long-standing technological issues for researchers, which included limitations in small-scale microscopy that prevented physicists from accurately imaging three-dimensional amorphous materials at the atomic scale. And while researchers were able to accurately image amorphous monolayers, such monolayers were until now fabricated by using high-energy electron beams to disorder crystalline monolayers. 

The first-ever stable monolayer of amorphous carbon, grown by a team led by Barbaros Özyilmaz of the National University of Singapore and imaged by the group of Kazu Suenaga in Tsukuba science city, Japan, makes these issues problems of the past.

A theoretical physicist, Professor Pantelides worked remotely with the teams in Singapore and Japan to integrate experimental data, theory fundamentals, and results of calculations. A former graduate student of Pantelides, Junhao Lin, a post-doctoral fellow in the Suenaga group, performed the key microscopy. Vanderbilt post-doctoral fellow Yun-Peng Wang constructed an appropriate model and performed calculations.

The growth method, which uses a cold substrate and a laser to provide energy in a controlled way, yields reproducible monolayer films and has led to newfound knowledge of the material's atomic arrangements and its electrical, mechanical and optical properties.

Thanks to the team's successful development and findings, the reproducible approach opens the door for research into the growth of other amorphous two-dimensional materials.   

Credit: 
Vanderbilt University

APS Tip Sheet: Improving quantum information processing

image: A new protocol compares the closeness of quantum states in information sent from different devices.

Image: 
APS Journals

January 6, 2020 - Scientists use basic units of quantum information, called qubits, to share information across quantum devices. Qubits store information by simultaneously inhabiting two quantum-mechanical states. Now, Elben and colleagues have established a new protocol to cross-check whether two quantum computers are producing equal results. The researchers tested their protocol on data from a previous experiment involving highly entangled 10-qubit quantum states. They found the process required notably fewer measurements than quantum tomography, in which each quantum state must be reconstructed in full. The process will enable these devices to better measure and compare the closeness -- or fidelity -- of multiple quantum states across devices and time. The study could improve the way quantum information processing simulators and computers transmit information.
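For context, the quantity being compared is the state fidelity. The sketch below computes the standard (Uhlmann) fidelity between two explicitly constructed density matrices; it is only a reference calculation, not the randomized-measurement protocol of Elben et al., whose whole point is to estimate this number without reconstructing the full states.

```python
# Reference calculation only: Uhlmann fidelity F = (Tr sqrt(sqrt(rho) sigma sqrt(rho)))^2.
# The protocol described above estimates this quantity without building rho and sigma.
import numpy as np
from scipy.linalg import sqrtm

def fidelity(rho: np.ndarray, sigma: np.ndarray) -> float:
    sqrt_rho = sqrtm(rho)
    return float(np.real(np.trace(sqrtm(sqrt_rho @ sigma @ sqrt_rho))) ** 2)

# Example with assumed noise levels: a nearly pure single-qubit state vs. a noisier copy.
proj0 = np.array([[1.0, 0.0], [0.0, 0.0]])   # |0><0|
rho = 0.98 * proj0 + 0.02 * np.eye(2) / 2    # state on device A (assumed)
sigma = 0.90 * proj0 + 0.10 * np.eye(2) / 2  # state on device B (assumed)
print(f"fidelity = {fidelity(rho, sigma):.4f}")  # close to, but below, 1.0
```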

Credit: 
American Physical Society

Outbreak science: Infectious disease research leads to outbreak predictions

Infectious diseases have a substantial and growing impact on the health of communities around the world, and pressure to both predict and prevent such diseases is mounting. LSU Assistant Professor of Biological Sciences Tad Dallas and colleagues developed a simple approach to accurately predict disease outbreaks by combining novel statistical techniques and a large dataset on pathogen biogeography.

Pathogen biogeography refers to the spatial distribution of pathogens across the globe, and fundamentally links biogeography, community ecology, and epidemiology.

"Our approach leverages data on the entire network of pathogens and countries in order to forecast potential pathogen outbreak, emergence and re-emergence events," Dallas said. "Emergence events, which are first records of a pathogen recorded in a given country, are incredibly difficult to predict, as they are sort of by definition unexpected."

Using information from the Global Infectious Diseases and Epidemiology Network, or GIDEON data resource, Dallas collaborated with two co-authors, Colin Carlson from Georgetown University, and Timothée Poisot from the University of Montreal.

The study, published in Royal Society Open Science, weighs data on a large number of pathogens to forecast outbreaks, whereas previous approaches are much simpler. A simple approach to understanding and forecasting the occurrence of a pathogen in a country would require only information on the specific pathogen of interest. As Dallas explained, a researcher taking this more traditional approach would say, "If I observe an outbreak of a given pathogen in a country in half the number of years I looked for it, then I predict that there is a 50 percent chance the pathogen will be there next year." But it's possible that the distribution of other pathogens and other countries may inform the prediction of outbreak potential of the pathogen of interest. For instance, similar pathogens may have similar global distributions, and similar countries may have similar pathogen communities. Using this information to inform the forecasting of a single pathogen in a country of interest greatly increased the predictive accuracy relative to the traditional null model.
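That traditional baseline is easy to make concrete. The sketch below implements only the null model as described in the quote above, with made-up presence/absence records; the authors' actual approach additionally borrows information from similar pathogens and similar countries, which is not reproduced here.

```python
# Sketch of the null baseline only; the yearly records are invented for illustration.
from typing import Dict, List, Tuple

# (pathogen, country) -> presence (1) or absence (0) in each past year
records: Dict[Tuple[str, str], List[int]] = {
    ("pathogen_X", "country_A"): [1, 1, 0, 1, 1, 0, 1, 1],
    ("pathogen_Y", "country_B"): [0, 0, 0, 1, 0, 0, 0, 0],
}

def null_forecast(history: List[int]) -> float:
    """Probability of occurrence next year = fraction of past years observed."""
    return sum(history) / len(history) if history else 0.0

for (pathogen, country), history in records.items():
    print(f"{pathogen} in {country}: P(next year) ~= {null_forecast(history):.2f}")
```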

Dallas and colleagues recognize that detailed information on countries and pathogens may be required to more accurately forecast what countries certain pathogens appear in, but the simple method they developed outperforms traditional null models.

"The null model was an important component of this work, as it takes temporal data and makes a reasonable assumption," Dallas said. "While simple, this is an approach which has been used to forecast disease risk.[2] Our model predicts pathogen outbreaks better than this null approach, while it fails to predict emergence events. The goal is not necessarily to forecast all pathogen outbreak events everywhere, but simply to provide a benchmark for comparison to more advanced techniques."

This kind of research is important for people to know because it not only provides a baseline to test more complicated models against, but it's also important for people to be aware of the distributions of pathogens in general, Dallas said.

"Infectious disease outbreaks, whether they be widespread like Influenza or fairly geographically restricted like Ebola, may be difficult to prevent," he said. "However, if we can forecast outbreak potential in time, public health officials and governments can preemptively prepare for a potential outbreak event."

Dallas' favorite part of his research so far has been combining knowledge across different subfields of ecology.

"By combining aspects of community ecology into the study of human infectious disease, we were able to gain some insight into the distribution of pathogens at a global scale," he said. "And to apply ideas across subfields, it helps to have a great team of collaborators from somewhat different fields, as I definitely found in Colin Carlson and Timothée Poisot."

While Dallas' current research focuses on the global distribution of animal parasites and less with infectious diseases, he is furthering disease research by advising honor student Tivon Eugene, who is examining spatio-temporal patterns of pathogen outbreak and emergence events at LSU.

The rapid spread of pathogens can be attributed to growing contact between wildlife and humans, changes in climate and land use, food insecurity and geopolitical conflict. Looking ahead, developing the means to predict and prevent infectious disease outbreaks and emergence events will require collaborative scientific efforts like Dallas' to address this global health concern.

Credit: 
Louisiana State University

NASA-NOAA satellite catches Tropical Cyclone Blake and western Australia fires

image: NASA-NOAA's Suomi NPP satellite captured an image of Tropical Storm Blake covering a good portion of Western Australia on January 8, 2020, although its center is just inland from the Kimberley coast. The red spots at the southern extent of the state indicate wildfires. The brown stream extending over the Southern Indian Ocean and into the Great Australian Bight is the smoke generated from the fires.

Image: 
NASA Worldview, Earth Observing System Data and Information System (EOSDIS)

Tropical Cyclone Blake made landfall on the Kimberley coast of Western Australia, and NASA-NOAA's Suomi NPP satellite provided an image that showed its center inland, with the storm extending to the southern part of the state, where fires raged.

On January 8, 2020, the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard Suomi NPP provided a visible image of Blake that showed its center was just inland from the Kimberley coast. Bands of thunderstorms stretched far south all the way to the southern coast, where fires are raging. In the VIIRS image, red spots at the southern extent of the state indicate the heat signature of wildfires burning. A brown stream extending over the Southern Indian Ocean and into the Great Australian Bight is the smoke generated from the fires.

At 10 p.m. EST on January 7 (0300 UTC on January 8), the Joint Typhoon Warning Center (JTWC) in Pearl Harbor, Hawaii, issued its final warning on the system. At that time, Blake was centered near latitude 20.7 degrees south and longitude 119.7 degrees east, about 66 nautical miles east-southeast of Port Hedland, Australia. Blake was moving to the southwest and had maximum sustained winds near 35 knots (40 mph).

By January 8 at 10 a.m. EST, radar from the Australian Bureau of Meteorology's station in Kalgoorlie showed some showers in the area.

According to the website Emergency WA (Western Australia) there were fires burning south of Kalgoorlie in the Dundas Nature Reserve on January 7 at 11 p.m. EST. That fire is affecting Salmon Gums including the southern part of the Dundas Nature Reserve, Norseman, Higginsville, Widgiemooltha, Londonderry, Burra Rock, Victoria Rock and North Cascade in the Shires of Coolgardie, Dundas, Esperance and Widgiemooltha.

Blake is dissipating inland over the northwestern part of Western Australia, but generating rain that extends to the south.

Tropical cyclones and hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

Credit: 
NASA/Goddard Space Flight Center

Research will help land managers take risk-analysis approach to new wildfire reality

image: Taylor Creek and Klondike Fires, Rogue-Siskiyou NF, OR, 2018.

Image: 
Photo by Kari Greer

CORVALLIS, Ore. - New digital tools developed by Oregon State University will enable land managers to better adapt to the new reality of large wildfires through analytics that guide planning and suppression across jurisdictional boundaries that fires typically don't adhere to.

Led by Chris Dunn, a research associate in the OSU College of Forestry with several years of firefighting experience, scientists have used machine learning algorithms and risk-analysis science to analyze the many factors involved in handling fires: land management objectives, firefighting decisions, fire mitigation opportunities and the needs of communities, the environment and the fire management system.

Their findings were published in Environmental Research Letters.

"We have to learn to live with fire," Dunn said. "There is no forecast or evidence of a future without more fire. If we accept that view, then the question becomes, what kind of fire do we want?"

Now, Dunn notes, "we suppress 97 or 98% of fires such that we experience the 2 or 3% that are really bad, where we have no chance of successful suppression because they're just so explosive."

But those numbers over time can be inverted, he says.

"We can choose to have more beneficial fires whose impacts aren't as consequential and less of those really bad ones, eventually," Dunn said. "It could ultimately mean more fire, but more of the right kind of fire in the right places for the right reasons."

Using fire-prone landscapes of the Pacific Northwest as their study areas, Dunn and collaborators developed a trio of complementary, risk-based analytics tools - quantitative wildfire risk assessment, mapping of suppression difficulty, and atlases of locations where fires might be controlled.

"These tools can be a foundation for adapting fire management to this new reality," Dunn said. "They integrate fire risk with fire management difficulties and opportunities, which makes for a more complete picture of the fire management landscape."

That picture makes possible a risk-based planning structure that allows for preplanning responses to wildfires, responses that balance risk with the likelihood of success.

The landscapes used in the study are "multijurisdictional" - i.e., a mix of federal, state and private property - which highlights the shared responsibility of wildfire risk mitigation, Dunn said.

"We're a couple decades into having really large wildfires here in the American West," he said. "Fires today are bigger, faster and more intense - we're really in a new fire reality. We see this issue globally, like the intense fires currently burning in Australia.

"It's time we step up to the plate with risk-analysis analytics and computing power to complement the experiential knowledge of our fire management service," Dunn said. "As partners, scientists, managers and communities, we can work together to determine how to best interact with fires now and into the future."

The models allow for a view of landscapes "through a fire lens outside of fire season, so we can think in advance, plan for them, be more strategic with our fire engagement. Ultimately, we can move toward better outcomes," he said.

Before 1910, frequent low-severity surface fires played a key role in maintaining the forests of the mountain regions of the western United States. In the decades since, the fire deficits that resulted from federal policy - in concert with grazing, logging and land-use changes - have caused major structural shifts in older forests as shade-tolerant and fire-intolerant species have moved in.

The policy of fire suppression traces its roots to the Great Fire of 1910, which killed 87 people, destroyed several towns and burned an area roughly the size of Connecticut. The blaze consumed 3 million acres of forest in Idaho, Montana, Washington and British Columbia and resulted in timber losses of an estimated $1 billion.

However, attempting total fire exclusion leads to what researchers and forestry professionals refer to as the "wildfire paradox" - the harder you try to stamp out wildfires, the worse the fires are when you can't extinguish them.

"The instinct to suppress large fires is a pathology that works against us," Dunn said. "But with our models and process, these decision tools help diversify our initial response and lead to a default fire response that allows fire to play its ecological role while also providing the risk-reduction benefits of recently burned areas."

Planning units on a map known as PODs - short for potential operational delineations - summarize risk to inform wildfire response decisions. Their boundaries line up with "high probability control features" - roads, rivers, lakes, canyons, etc. - to help ensure that a response will be successful if launched.

"Suppression is necessary in areas where there are high values at risk, and we can be more successful at doing suppression when it's needed," Dunn said. "The fires we need to fight, maybe we can catch them at 500 acres instead of 5,000 if we do planning and management ahead of time instead of when we are chasing the fires."

Jurisdictional or ownership boundaries rarely align with effective control locations, Dunn said, and thus the PODs highlight areas where partners can work together for shared stewardship.

The computer models generate maps that show "how firefighters see the landscape when they're engaging a fire," Dunn said - a visual perspective he hopes can spark a cultural change that will lead to living with fire more constructively.

"Historically, we have zoned landscapes based on timber resources or critical wildlife habitat such as the northern spotted owl," he said. "Now we have the firefighters' perspective that we can share with communities, timber companies, NGOs, creating a platform for partners to understand the decisions made by our fire management service. This can foster partnerships to address the fire challenges of today and ultimately provide the best near- and long-term outcomes for our ecosystems and communities."

Credit: 
Oregon State University

'She' goes missing from presidential language

CAMBRIDGE, MA -- Throughout most of 2016, a significant percentage of the American public believed that the winner of the November 2016 presidential election would be a woman -- Hillary Clinton.

Strikingly, a new study from cognitive scientists and linguists at MIT, the University of Potsdam, and the University of California at San Diego shows that despite those beliefs, people rarely used the pronoun "she" when referring to the next U.S. president before the election. Furthermore, when reading about the future president, encountering the pronoun "she" caused a significant stumble in their reading.

"There seemed to be a real bias against referring to the next president as 'she.' This was true even for people who most strongly expected and probably wanted the next president to be a female," says Roger Levy, an MIT professor of brain and cognitive sciences and the senior author of the new study. "There's a systematic underuse of 'she' pronouns for these kinds of contexts. It was quite eye-opening."

As part of their study, Levy and his colleagues also conducted similar experiments in the lead-up to the 2017 general election in the United Kingdom, which determined the next prime minister. In that case, people were more likely to use the pronoun "she" than "he" when referring to the next prime minister.

Levy suggests that sociopolitical context may account for at least some of the differences seen between the U.S. and the U.K.: At the time, Theresa May was prime minister and very strongly expected to win, plus many Britons likely remember the long tenure of former Prime Minister Margaret Thatcher.

"The situation was very different there because there was an incumbent who was a woman, and there is a history of referring to the prime minister as 'she' and thinking about the prime minster as potentially a woman," he says.

The lead author of the study is Titus von der Malsburg, a research affiliate at MIT and a researcher in the Department of Linguistics at the University of Potsdam, Germany. Till Poppels, a graduate student at the University of California at San Diego, is also an author of the paper, which appears in the journal Psychological Science.

Implicit linguistic biases

Levy and his colleagues began their study in early 2016, planning to investigate how people's expectations about world events, specifically, the prospect of a woman being elected president, would influence their use of language. They hypothesized that the strong possibility of a female president might override the implicit bias people have toward referring to the president as "he."

"We wanted to use the 2016 electoral campaign as a natural experiment, to look at what kind of language people would produce or expect to hear as their expectations about who was likely to win the race changed," Levy says.

Before beginning the study, he expected that people's use of the pronoun "she" would go up or down based on their beliefs about who would win the election. He planned to explore how long it would take for changes in pronoun use to appear, and how much of a boost "she" usage would experience if a majority of people expected the next president to be a woman.

However, such a boost never materialized, even though Clinton was expected to win the election.

The researchers performed their experiment 12 times between June 2016 and January 2017, with a total of nearly 25,000 participants from the Amazon Mechanical Turk platform. The study included three tasks, and each participant was asked to perform one of them. The first task was to predict the likelihood of three candidates winning the election -- Clinton, Donald Trump, or Bernie Sanders. From those numbers, the researchers could estimate the percentage of people who believed the next president would be a woman. This number was higher than 50 percent during most of the period leading up to the election, and reached just over 60 percent right before the election.

The next two tasks were based on common linguistics research methods -- one to test people's patterns of language production, and the other to test how the words they encounter affect their reading comprehension.

To test language production, the researchers asked participants to complete a paragraph such as "The next U.S. president will be sworn into office in January 2017. After moving into the Oval Office, one of the first things that ...."

In this task, about 40 percent of the participants ended up using a pronoun in their text. Early in the study period, more than 25 percent of those participants used "he," fewer than 10 percent used "she," and around 50 percent used "they." As the election got closer, and Clinton's victory seemed more likely, the percentage of "she" usage never went up, but usage of "they" climbed to about 60 percent. While these results indicate that the singular "they" has reached widespread acceptance as a de facto standard in contemporary English, they also suggest a strong persistent bias against using "she" in a context where the gender of the individual referred to is not yet known.

"After Clinton won the primary, by late summer, most people thought that she would win. Certainly Democrats, and especially female Democrats, thought that Clinton would win. But even in these groups, people were very reluctant to use 'she' to refer to the next president. It was never the case that 'she' was preferred over 'he,'" Levy says.

For the third task, participants were asked to read a short passage about the next president. As the participants read the text on a screen, they had to press a button to reveal each word of the sentence. This setup allows the researchers to measure how quickly participants are reading. Surprise or difficulty in comprehension leads to longer reading times.

In this case, the researchers found that when participants encountered the pronoun "she" in a sentence referring to the next president, it cost them about a third of a second in reading time -- a seemingly short amount of time that is nevertheless known from sentence processing research to indicate a substantial disruption relative to ordinary reading -- compared to sentences that used "he." This did not change over the course of the study.
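The analysis behind that comparison is essentially an average of per-trial reading times at the critical pronoun. The sketch below uses invented reading times purely to illustrate the calculation; it is not the study's data or analysis code.

```python
# Illustrative only: reading times (in seconds) at the pronoun region are invented.
from statistics import mean

rt_she = [0.95, 1.10, 1.02, 0.88, 1.15]   # trials where the passage used "she"
rt_he = [0.66, 0.72, 0.70, 0.64, 0.75]    # trials where the passage used "he"

slowdown = mean(rt_she) - mean(rt_he)
print(f"mean slowdown for 'she': {slowdown:.2f} s")  # on the order of a third of a second
```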

"For months, we were in a situation where large segments of the population strongly expected that a woman would win, yet those segments of the population actually didn't use the word 'she' to refer to the next president, and were surprised to encounter 'she' references to the next president," Levy says.

Strong stereotypes

The findings suggest that gender biases regarding the presidency are so deeply ingrained that they are extremely difficult to overcome even when people strongly believe that the next president will be a woman, Levy says.

"It was surprising that the stereotype that the U.S. president is always a man would so strongly influence language, even in this case, which offered the best possible circumstances for particularized knowledge about an upcoming event to override the stereotypes," he says. "Perhaps it's an association of different pronouns with positions of prestige and power, or it's simply an overall reluctance to refer to people in a way that indicates they're female if you're not sure."

The U.K. component of the study was conducted in June 2017 (before the election) and July 2017 (after the election but before Theresa May had successfully formed a government). Before the election, the researchers found that "she" was used about 25 percent of the time, while "he" was used less than 5 percent of the time. However, reading times for sentences referring to the prime minister as "she" were no faster than those for "he," suggesting that there was still some bias against "she" in comprehension relative to usage preferences, even in a country that already has a woman prime minister.

The type of gender bias seen in this study appears to extend beyond previously seen stereotypes that are based on demographic patterns, Levy says. For example, people usually refer to nurses as "she," even if they don't know the nurse's gender, and more than 80 percent of nurses in the U.S. are female. In an ongoing study, von der Malsburg, Poppels, Levy, and recent MIT graduate Veronica Boyce have found that even for professions that have fairly equal representation of men and women, such as baker, "she" pronouns are underused.

"If you ask people how likely a baker is to be male or female, it's about 50/50. But if you ask people to complete text passages that are about bakers, people are twice as likely to use he as she," Levy says. "Embedded within the way that we use pronouns to talk about individuals whose identities we don't know yet, or whose identities may not be definitive, there seems to be this systematic underconveyance of expectations for female gender."

Credit: 
Massachusetts Institute of Technology

New Phytopathology journal focus issue emphasizes virological advances

image: Evaluation of an attenuated papaya leaf distortion mosaic virus mutant for cross protection against infection by a GFP-tagged severe strain, assessed via symptoms and GFP fluorescence in three papaya cultivars (Tuo et al.).

Image: 
Decai Tuo and Wentao Shen

Viruses cause some of the most important plant diseases worldwide with an estimated $60 billion in annual yield losses to agricultural crops. Since the first report of tobacco mosaic virus in the late 19th century, discovery of new plant viruses has increased rapidly, particularly since the advent of next-generation sequencing technology.

Furthermore, the development of reverse genetics systems for a growing number of plant viruses since the late 20th century has facilitated identification of viral determinants involved in replication, movement, vector transmission, host range, and pathogenicity. Studies on virus-host and virus-vector interactions have facilitated identification of factors involved in disease development and horizontal transmission, respectively.

Given the importance of and rapid research progress in plant virology in recent years, Phytopathology emphasized virological advances in its Fundamental Aspects of Plant Viruses focus issue, which is available now.

Phytopathology is an international journal publishing articles on fundamental research that advances understanding of the nature of plant diseases, the agents that cause them, their spread, the losses they cause, and measures used to control them.

Credit: 
American Phytopathological Society

Lifestyle choices could slow familial frontotemporal dementia

A physically and mentally active lifestyle confers resilience to frontotemporal dementia (FTD), even in people whose genetic profile makes the eventual development of the disease virtually inevitable, according to new research by scientists at the UC San Francisco Memory and Aging Center.

The research aligns with long-standing findings that exercise and cognitive fitness are among the best ways to prevent or slow Alzheimer's disease, but is the first study to show that the same types of behaviors can benefit people with FTD, which is caused by a distinct form of brain degeneration.

FTD is a neurodegenerative disease that can disrupt personality, decision-making, language, or movement abilities, and typically begins between the ages of 45 and 65. It is the most common form of dementia in people under 65 (accounting for 5 to 15 percent of dementia cases overall) and typically results in rapid cognitive and physical decline and death in less than 10 years. There are currently no drugs to treat FTD, though numerous clinical trials for the disease are underway at UCSF Memory and Aging Center and elsewhere.

"This is devastating disease without good medical treatments, but our results suggest that even people with a genetic predisposition for FTD can still take actions to increase their chances of living a long and productive life. Their fate may not be set in stone," said Kaitlin Casaletto, PhD, assistant professor of neurology at the UCSF Memory and Aging Center and corresponding author of the new study, published January 8, 2020 in Alzheimer's and Dementia.

'If This Were a Drug, We Would Be Giving it to All Our Patients'

About 40 percent of people with FTD have a family history of the disease, and scientists have identified specific dominant genetic mutations that drive the development of the disease in roughly half of these cases. But even in these individuals, the disease can have very different courses and severity.

"There's incredible variability in FTD, even among people with the same genetic mutations driving their disease. Some people are just more resilient than others for reasons we still don't understand," said, Casaletto, a member of the UCSF Weill Institute for Neurosciences. "Our hypothesis was that the activities people engage in each day of their lives may contribute to the very different trajectories we see in clinic, including when the disease develops and how it progresses."

To test this hypothesis, Casaletto and colleagues studied how lifestyle differences affected FTD progression in 105 people with dominant, disease-causing genetic mutations who were mostly asymptomatic or had experienced only mild, early-stage symptoms. The research participants were drawn from two large multisite studies, called ARTFL and LEFFTDS (recently combined into a study known as ALLFTD), led by co-authors Adam Boxer, MD, PhD, and Howie Rosen, MD, also of the UCSF Memory and Aging Center.

As part of these larger studies, all participants underwent initial MRI scans to measure the extent of brain degeneration caused by the disease, completed tests of thinking and memory, and reported on their current levels of cognitive and physical activity in their daily lives (e.g., reading, spending time with friends, jogging). At the same time, their family members completed regular gold-standard assessments of how well the study participants were functioning in their lives -- managing finances, medications, bathing themselves, and so on. All of these measures were repeated at annual follow-up visits to track the long-term progression of participants' disease.

Even after only two to three visits (one to two years into the ongoing study), Casaletto and her team have already begun to see significant differences in the speed and severity of FTD between the most and least active individuals in the study, with mental and physical activity showing similarly sized effects.

Specifically, the researchers found that functional decline, as assessed by participants' family members, was 55 percent slower in the most active 25 percent of participants than in the least active 25 percent. "This was a remarkable effect to see so early on," Casaletto said. "If this were a drug, we would be giving it to all of our patients."
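To make that comparison concrete, the sketch below (in Python, with entirely invented numbers; it is not the study's data or analysis code) illustrates how a "percent slower decline" figure of this kind can be computed by splitting participants into activity quartiles and comparing average rates of functional decline.

    # Hypothetical illustration only: invented activity scores and functional
    # decline rates for a cohort, used to show how a quartile comparison of
    # decline rates yields a "percent slower" figure like the one above.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 105                                    # cohort size, as in the study
    activity = rng.normal(size=n)              # composite activity score (invented)
    decline = -1.0 + 0.4 * activity + rng.normal(scale=0.3, size=n)  # annual functional change (invented)

    q25, q75 = np.percentile(activity, [25, 75])
    least_active = decline[activity <= q25]    # bottom activity quartile
    most_active = decline[activity >= q75]     # top activity quartile

    slowing = 1 - most_active.mean() / least_active.mean()
    print(f"Decline in the most active quartile is {slowing:.0%} slower.")

The real analysis relied on family-reported functional assessments collected longitudinally, so it is considerably more involved than this two-group comparison; the sketch only shows the shape of the calculation behind the headline percentage.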

The researchers found that participants' lifestyles did not significantly alter the inexorable degeneration of brain tissue associated with FTD, as measured by follow-up MRI scans a year into the study. But even among participants whose brain scans revealed signs of atrophy, the most mentally and physically active participants continued to perform twice as well as the least active participants on cognitive tests. These results suggest that active lifestyles may slow FTD symptoms by providing some form of cognitive resilience to the consequences of brain degeneration.

Findings Could Illuminate Biology of Brain Resilience Across Dementias

The researchers anticipate seeing even larger differences in cognitive decline between more and less active groups as the merged ALLFTD study continues to follow these participants over time. "We've seen such significant effects in just the first year or two in people with very mild disease -- if these results hold, we may see that an active lifestyle sets individuals on a different trajectory for the coming years," Casaletto said.

The next step for the research is to include more detailed and objective assessments of participants' physical and mental activity -- including fitting them with wearable FitBit activity sensors -- to begin to estimate exactly how much activity is needed to promote cognitive resilience.

Casaletto cautions that the results, though exciting, so far only report a correlation: "It is possible that some participants have less active lifestyles because they have a more severe or aggressive form of FTD, which is already impacting their ability to be active. Clinical trials that manipulate cognitive and physical activity levels in people with FTD mutations are needed to prove that lifestyle changes can alter the course of the disease."

With this caveat in mind, Casaletto hopes the findings will not only encourage care teams and individuals with family histories of FTD to adopt lifestyle changes that could provide more productive years of life, but also that the ongoing study will lead to a better biological understanding of the drivers of resilience in people with FTD.

"We can see that lifestyle differences impact people's resilience to FTD despite very penetrant genetics, so now we can start to ask more fundamental questions, like how these behaviors actually affect the brain's biology to confer that resilience." Casaletto said. "Is that biological effect something we could replicate pharmacologically to help slow the progression of this terrible disease for everyone?"

Credit: 
University of California - San Francisco

Nanoparticles deliver 'suicide gene' therapy to pediatric brain tumors growing in mice

Johns Hopkins researchers report that a type of biodegradable, lab-engineered nanoparticle they fashioned can successfully deliver a "suicide gene" to pediatric brain tumor cells implanted in the brains of mice. The poly(beta-amino ester) nanoparticles, known as PBAEs, were part of a treatment that also used a drug to kill the cells and prolong the test animals' survival.

In their study, described in a report published in January 2020 in the journal Nanomedicine: Nanotechnology, Biology and Medicine, the researchers caution that, for safety and biological reasons, the suicide gene they used -- herpes simplex virus type I thymidine kinase (HSVtk), which makes tumor cells more sensitive to the lethal effects of the antiviral drug ganciclovir -- is unlikely to be the exact therapy used to treat human medulloblastoma and atypical teratoid/rhabdoid tumors (AT/RT) in children.

So-called "suicide genes" have been studied and used in cancer treatments for more than 25 years. The HSVtk gene makes an enzyme that helps restore the function of natural tumor suppression.

Specifically, the experiments found that the combination of the suicide gene and ganciclovir, given by intraperitoneal injection, killed more than 65% of the two types of pediatric brain tumor cells; the cells had been deliberately "transfected" with the gene, and the drug was administered seven days after the nanoparticle therapy delivered the genetic material. Mice bearing an AT/RT-type tumor lived 20% longer after receiving the treatment -- 42 days, compared to 35 days for untreated mice. Those with a group 3 medulloblastoma-type tumor implanted in the brain lived 63% longer, surviving 31 days compared to 19 days for untreated mice.
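As a quick check of the survival figures quoted above, the short Python snippet below computes the percent increase in survival for each tumor type; the day counts come straight from the text, and the calculation is ordinary arithmetic rather than anything from the study's own code.

    # Percent increase in median survival of treated vs. untreated mice,
    # using the day counts reported in the text.
    def percent_longer(treated_days: float, untreated_days: float) -> float:
        return (treated_days - untreated_days) / untreated_days * 100

    print(f"AT/RT: {percent_longer(42, 35):.0f}% longer")                    # ~20%
    print(f"Group 3 medulloblastoma: {percent_longer(31, 19):.0f}% longer")  # ~63%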

"It's an exciting alternate way to be able to deliver gene therapy to a tumor in a selective fashion that targets only tumor cells," says Eric Jackson, M.D., associate professor of neurosurgery at the Johns Hopkins University School of Medicine. "Our idea now is to find other collaborators who may have a gene therapy that they think would work well to kill these tumors."

Medulloblastoma and AT/RT are two of the most prevalent and deadly pediatric brain malignancies. Traditional treatments, including radiation, can harm healthy tissue as well as the tumor, and can produce long-lasting developmental side effects in growing children, making it critical to find new therapies, Jackson notes.

Gene therapy that targets only cancer cells is a promising treatment avenue, but many gene therapy methods use a modified virus to deliver their therapeutic payloads of DNA, a method that may not be safe or suitable for pediatric use. "A lot of these viruses are safe if you have a mature immune system, but in very young patients with more fragile immune systems, a virus delivery system may pose additional risks," says Jackson.

To address this issue, Jackson collaborated with Jordan Green, Ph.D., an investigator at the Johns Hopkins Kimmel Cancer Center Bloomberg~Kimmel Institute for Cancer Immunotherapy, director of the Johns Hopkins Biomaterials and Drug Delivery Laboratory and professor of biomedical engineering, to find a different kind of carrier for gene therapy. Green and his colleagues developed the PBAE class of polymeric nanoparticles, which can be engineered to bind and carry DNA.

The biodegradable PBAEs are injected into a tumor mass where they safely release their DNA cargo after being ingested by tumor cells. In previous studies that used similar particles to deliver gene therapy to adult brain cancers and liver cancers in cell cultures and in rodents, Green and his colleagues found that the nanoparticles preferentially target tumor cells over healthy cells.

The mechanism that allows the particles to target tumor cells preferentially is still being investigated, but Green thinks "the chemical surface of the particle is likely interacting with proteins that are on the surface of certain types of cancer cells."

Green and his colleagues altered the nanoparticles to target the two pediatric malignancies. "By making small chemical changes to the polymers that make up the nanoparticles, we can significantly change the cellular uptake into particular kinds of cancer cells, and the subsequent gene delivery to the cytosol, in a cell-specific way," Green says.

Jackson says he hopes the nanoparticles can be used to deliver a variety of gene-based treatments - including therapies that alter the expression levels of genes, turn genes on and off altogether, or sensitize cells to other therapies - depending on the specifics of a patient's tumor. "In some ways, we're still in the discovery phase of what genes to target" in medulloblastoma and AT/RT, he says.

The nanoparticles "can carry larger genes than what can be carried by a virus, and can carry combinations of genes," says Green. "It's a platform that doesn't have limitations on the cargo size being delivered, or limitations related to immunogenicity or toxicity. And, it is easier to manufacture than a virus."

Credit: 
Johns Hopkins Medicine