Mayo researchers discover way to prime cancer tumors for immunotherapy

ROCHESTER, Minn. -- A cancer tumor's ability to mutate allows it to escape from chemotherapy and other attempts to kill it. So, encouraging mutations would not be a logical path for cancer researchers. Yet a Mayo Clinic team and their collaborators took that counterintuitive approach and discovered that while it created resistance to chemotherapy, it also made tumors sensitive to immunotherapy. They also found that this approach worked successfully across tumor types and individual patient genomes. Their findings involving mouse models and human cells appear in Nature Communications.

The international team of researchers based in Rochester, Minn. and London, led by Richard Vile, Ph.D., a Mayo Clinic professor of pediatric oncology, studied models of both pediatric brain tumors and melanoma. They found that, in mice, high levels of the protein APOBEC3B drove a high rate of tumor mutations. Yet at the same time, these levels of APOBEC3B also sensitized cells to treatment with immune checkpoint blockade, a major mechanism of immunotherapy.

"When you put that in the context of vaccine therapy, the mutations generate neoepitopes, a type of peptide that is a prime target for killer T cells," says Dr. Vile. "So that, combined with the checkpoint blockade, makes for a potential cross-tumor therapy."

The results showed a high rate of cures in subcutaneous melanoma and brain tumor models, with effectiveness regardless of tumor type or location. The results also showed that an individualized approach for each patient is not required. The team hopes to translate this work into clinical trials for pediatric brain tumors within the next year.

Credit: 
Mayo Clinic

Why bumble bees are going extinct in a time of 'climate chaos'

video: Climate change means places are getting hotter than ever before, and these extremes appear to be driving the disappearance of bumble bees across continents.

Image: 
Peter Soroye, University of Ottawa

When you were young, were you the type of child who would scour open fields looking for bumble bees? Today, it is much harder for kids to spot them, since bumble bees are drastically declining in North America and in Europe.

A new study from the University of Ottawa found that in the course of a single human generation, the likelihood of a bumble bee population surviving in a given place has declined by an average of over 30%.

Peter Soroye, a PhD student in the Department of Biology at the University of Ottawa, Jeremy Kerr, professor at the University of Ottawa and head of the lab group Peter works in, and Tim Newbold, research fellow at UCL (University College London), linked the alarming idea of "climate chaos" to extinctions, and showed that those extinctions began decades ago.

"We've known for a while that climate change is related to the growing extinction risk that animals are facing around the world," first author Peter Soroye explained. "In this paper, we offer an answer to the critical questions of how and why that is. We find that species extinctions across two continents are caused by hotter and more frequent extremes in temperatures."

"We have now entered the world's sixth mass extinction event, the biggest and most rapid global biodiversity crisis since a meteor ended the age of the dinosaurs." - Peter Soroye

Massive decline of the most important pollinators on Earth

"Bumble bees are the best pollinators we have in wild landscapes and the most effective pollinators for crops like tomato, squash, and berries," Peter Soroye observed. "Our results show that we face a future with many less bumble bees and much less diversity, both in the outdoors and on our plates."

The researchers discovered that bumble bees are disappearing at rates "consistent with a mass extinction."

"If declines continue at this pace, many of these species could vanish forever within a few decades," Peter Soroye warned.

The technique

"We know that this crisis is entirely driven by human activities," Peter Soroye said. "So, to stop this, we needed to develop tools that tell us where and why these extinctions will occur."

The researchers looked at how climate change increases the frequency of extreme events such as heatwaves and droughts, creating a sort of "climate chaos" that can be dangerous for animals. Knowing that species have different temperature tolerances (what's too hot for some might not be for others), they developed a new measurement of temperature.

"We have created a new way to predict local extinctions that tells us, for each species individually, whether climate change is creating temperatures that exceed what the bumble bees can handle," Dr. Tim Newbold explained.

To test their hypothesis and new technique, the researchers used data on 66 bumble bee species across North America and Europe collected over a 115-year period (1900-2015). By comparing where bees are now to where they used to be historically, they were able to see how bumble bee populations have changed.

"We found that populations were disappearing in areas where the temperatures had gotten hotter," Peter Soroye said. "Using our new measurement of climate change, we were able to predict changes both for individual species and for whole communities of bumble bees with a surprisingly high accuracy."
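The core of the species-specific exceedance idea can be sketched in a few lines of Python. This is a toy illustration with made-up species lists and temperatures, not the authors' actual model or dataset:

```python
# Toy sketch of a species-specific thermal-exceedance test.
# The species entries and temperatures below are hypothetical.

# Hottest temperatures (deg C) historically tolerated at occupied sites.
baseline_max_temps = {
    "Bombus terricola": [24.1, 25.3, 26.0, 23.8, 25.5],
    "Bombus impatiens": [28.2, 29.5, 30.1, 27.9, 29.0],
}

def thermal_limit(species):
    """Estimate a species' upper limit as the hottest temperature
    observed where it historically persisted."""
    return max(baseline_max_temps[species])

def predicts_local_extinction(species, recent_max_temp):
    """Flag a site when recent extremes exceed the species' limit."""
    return recent_max_temp > thermal_limit(species)

# A site that now reaches 27 deg C exceeds the cold-adapted species'
# limit (26.0) but not the heat-tolerant one (30.1).
print(predicts_local_extinction("Bombus terricola", 27.0))  # True
print(predicts_local_extinction("Bombus impatiens", 27.0))  # False
```

The key design point, per the study's description, is that the threshold is computed per species rather than using one fixed temperature for all bees.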

A new horizon of research

This study doesn't end here. In fact, it opens the door to tracking extinction risk for other species such as reptiles, birds and mammals.

"Perhaps the most exciting element is that we developed a method to predict extinction risk that works very well for bumble bees and could in theory be applied universally to other organisms," Peter Soroye indicated. "With a predictive tool like this, we hope to identify areas where conservation actions would be critical to stopping declines."

"Predicting why bumble bees and other species are going extinct in a time of rapid, human-caused climate change could help us prevent extinction in the 21st century." - Dr. Jeremy Kerr

There is still time to act

"This work also holds out hope by implying ways that we might take the sting out of climate change for these and other organisms by maintaining habitats that offer shelter, like trees, shrubs, or slopes, that could let bumble bees get out of the heat," Dr. Kerr said. "Ultimately, we must address climate change itself and every action we take to reduce emissions will help. The sooner the better. It is in all our interests to do so, as well as in the interests of the species with whom we share the world."

Credit: 
University of Ottawa

First Phase I clinical trial of CRISPR-edited cells for cancer shows cells safe and durable

Following the first U.S. test of CRISPR gene editing in patients with advanced cancer, researchers report that these patients experienced no negative side effects and that the engineered T cells persisted in their bodies for months. The results from this Phase 1 clinical trial suggest the gene editing approach was safe and feasible, which until now had been uncertain, and the findings represent an important step toward the ultimate goal of using gene editing to help a patient's immune system attack cancer.

CRISPR-Cas9 gene editing provides a powerful tool to improve the natural ability of human T cells to fight cancer. Though engineered T cell therapies are greatly advancing cancer treatment, whether CRISPR-Cas9-edited T cells would be tolerated and survive once reinfused into a human has been unknown. Focusing on three patients in their 60s with advanced cancers that didn't respond to standard treatments, Edward A. Stadtmauer and colleagues sought to test this.

They removed T lymphocytes from the patients' blood and used CRISPR to delete genes from the cells that might interfere with the immune system's ability to fight cancer. Next, the researchers used a virus to arm the T cells to attack NY-ESO-1, a protein typically found on cancer cells. They infused the cells back into the patients and monitored cell engraftment and persistence. There were no toxicity-related side effects, the authors report, and the engineered T cells could be detected up to nine months post-infusion.

The work builds on earlier research developing adoptive T cell therapies led by Dr. Carl June, who was also part of this study. In a related Perspective, Jennifer Hamilton and Jennifer Doudna write, "these findings provide the cell engineering field with a guide for the safe production and non-immunogenic administration of gene edited somatic cells."
They add: "The big question that remains unanswered by this study is whether gene-edited, engineered T cells are effective against advanced cancer," while also noting that the study by Stadtmauer and colleagues was restricted to editing protocols available in 2016, in which gene-edited T cells are likely less effective than they would be using newer technology available today.

Credit: 
American Association for the Advancement of Science (AAAS)

Mosquitoes seek heat using repurposed ancestral cooling receptor

In a mosquito responsible for transmission of malaria, heat-seeking behavior, critical to this insect's ability to locate and feed on warm-blooded hosts, relies on a thermoreceptor that was once devoted to heat avoidance (to help the mosquito keep cool). Today, the receptor is wired for heat targeting (to help the insect find its next meal). By suggesting a means to block mosquito heat-seeking, the new study reporting this finding may help guide the development of novel methods for controlling mosquito-borne illnesses like malaria.

Of all insect vectors, mosquitoes are perhaps the most notorious, responsible for transmitting a host of different pathogens. Efforts to control malaria through vaccines, or to control its mosquito vectors using pesticides, have proven difficult, leading researchers to pursue alternative strategies. Like other disease-spreading insects, mosquitoes use specialized receptors that sense body heat to target the source of their next blood meal. However, the molecular basis of their heat-seeking behavior has remained unknown.

Chloe Greppi and colleagues evaluated whether ancestral cooling-activated receptors play a role in heat sensing in Anopheles gambiae, the primary mosquito vector responsible for the transmission of malaria in most of sub-Saharan Africa. Using genome-wide analyses and labeled CRISPR-Cas9 mutants, Greppi et al. identified the evolutionarily conserved sensory thermoreceptor IR21a as a key driver of heat-seeking behaviors. In other insects, Ir21a is a cooling receptor that mediates heat avoidance, allowing the insects to maintain optimal body temperatures. According to the authors, the evolution of blood-feeding in the An. gambiae mosquito involved a repurposing of this ancestral thermoreceptor to facilitate warmth sensing instead. While blocking Ir21a did not end heat-seeking behavior outright, it significantly reduced the ability of female mosquitoes to find a source of blood.
"Thermoreception has been a relatively neglected aspect of vector biology, with research efforts focused largely on chemoreception," writes Claudio Lazzari in a related Perspective. Identifying the root of thermosensation opens research avenues and possibilities for controlling vector-borne disease, Lazzari says.

Credit: 
American Association for the Advancement of Science (AAAS)

Scientists discover how rogue communications between cells lead to leukemia

image: Rogue communications in the membrane of blood stem cells.

Image: 
Ilpo Vattulainen and Joni Vuorio from the University of Helsinki

New research has deciphered how rogue communications in blood stem cells can cause leukaemia.

The discovery could pave the way for new, targeted medical treatments that block this process.

Blood cancers like leukaemia occur when mutations in stem cells cause them to produce too many blood cells.

An international team of scientists, including researchers at the University of York, has discovered how these mutations allow cells to deviate from their normal method of communicating with each other, causing the development of blood cells to spiral out of control.

The scientists used super-resolution fluorescent microscopy to study the way blood stem cells talk to each other in real time.

They observed how cells receive instructions from 'signalling proteins', which bind to a receptor on the surface of another cell before transmitting a signal telling the cell how to behave.

Blood stem cells communicate via cytokines, which are one of the largest and most diverse families of signalling proteins and are critical for the development of blood cells and the immune system.

Understanding this process led researchers to the discovery that mutations associated with certain types of blood cancers can cause blood stem cells to 'go rogue' and communicate without cytokines.

The stem cells begin to transmit uncontrolled signals causing the normal system of blood cell development to become overrun, producing an imbalance of healthy white and red blood cells and platelets.

Professor Ian Hitchcock from the York Biomedical Research Institute and the Department of Biology at the University of York, said: "Our bodies produce billions of blood cells every day via a process of cells signalling between each other. Cytokines act like a factory supervisor, tightly regulating this process and controlling the development and proliferation of the different blood cell types.

"Our observations led us to a previously unknown mechanism for how individual mutations trigger blood stem cells to start signalling independently of cytokines, causing the normal system to become out of control and leading to diseases like leukaemia.

"Understanding this mechanism may enable the identification of targets for the development of new drugs."

The research team used a combination of molecular modelling, structural biology, biophysics, super-resolution microscopy and cell biology to demonstrate for the first time that these specific receptors on the surface of blood stem cells are linked by cytokines to form pairs.

Co-author of the study, Professor Jacob Piehler from Osnabrück University, said: "By directly visualising individual receptors at physiological conditions under the microscope, we were able to resolve a controversy that has preoccupied the field for more than 20 years."

Professor Ilpo Vattulainen from the University of Helsinki added: "Our biomolecular simulations unveiled surprising features concerning the orientation of active receptor pairs at the plasma membrane, explaining how mutations render activation possible without a ligand (such as a cytokine). These predictions were subsequently confirmed experimentally."

First author Dr. Stephan Wilmes, who started the project as a postdoc at Osnabrück University before moving to the University of Dundee, said: "It was truly inspiring to tackle this highly relevant biomedical question by applying cutting-edge biophysical techniques. Here in Dundee, I had the chance to perform complementary activity assays, which corroborated our mechanistic model."

Credit: 
University of York

Key molecular machine in cells pictured in detail for the first time

video: This is a video of the histone mRNA three-prime (3') end-processing machine.

Image: 
Liang Tong Lab, Columbia University

CHAPEL HILL, NC - February 6, 2020 - Scientists from the UNC School of Medicine, Columbia University, and Rockefeller University have revealed the inner workings of one of the most fundamental and important molecular machines in cells.

The researchers, in a study published in Science, used biochemical experiments and cryo-electron microscopy (cryo-EM) to determine the atomic structure of a complex assembly of molecules known as the histone mRNA three-prime (3') end-processing machine. This machine plays a fundamental role in proper activity and duplication of the cell genome and when defective, it may lead to human diseases, including cancers.

Histone proteins are found in all plants and animals, and they form a "beads-on-a-string" arrangement where the DNA in chromosomes is wrapped around the beads of histones. Histones ensure the efficient packaging of DNA and help regulate which genes are turned "on" and which are kept "off," processes needed for all cells to function properly.

The histone mRNA 3' end-processing machine is responsible for cutting - at precisely the right place - the mRNA transcript that is copied out from a histone gene and encodes the corresponding histone protein. The machine performs an essential role in cells' production of histone proteins, which occurs at high levels whenever a cell divides and must replicate its DNA. The structure shows how the machine is activated only after it binds the histone mRNA, preventing cleavage of other RNAs.

"This structure provides the first atomic insights into a critical process in cells, and beautifully explains the large body of current knowledge on this machinery," said senior author Liang Tong, PhD, William R. Kenan, Jr. Professor of the Biological Sciences Department at Columbia University. "The structure has been long awaited by scientists in the field, and the elegant amphora shape of the machinery is an unexpected bonus. The structure also provides valuable insights into other RNA 3'-end processing machineries, because they share key components with the histone machinery."

Tong's recent studies had shown the canonical machinery only in an inactive form; now, scientists have a glimpse of how the machinery is activated.

"This structure is another illustration of the remarkable power of the new cryo-EM technique," Tong added.

Solving the structure of this complex, a landmark achievement of molecular biology, is the culmination of nearly 40 years of research by a number of laboratories and molecular biologists.

"I started on the problem of histone mRNA and how it's regulated when I first started my laboratory as an assistant professor at Florida State in 1974," said study co-author William Marzluff, PhD, Kenan Distinguished Professor of Biochemistry and Biophysics at the UNC School of Medicine and the Integrative Program in Biological and Genome Sciences at UNC-Chapel Hill. "And this is certainly the most important contribution we have made in this field of inquiry so far."

"For a long time we have been studying different pieces of this molecular machine, but now for the first time we know how all the pieces fit together and work together," said Zbigniew Dominski, PhD, professor in the Department of Biochemistry & Biophysics at the UNC School of Medicine, who led the discovery of many of the components of the machine. "It's as if someone opened up the hood of an old car so that at last you could see how the whole engine looks and works, suddenly learning about unexpected mechanical and functional details."

Dominski was a co-corresponding author of the study with Thomas Walz, PhD, Professor and Head of Laboratory of Molecular Electron Microscopy at Rockefeller University.

A special tail

Every protein is produced in a process that starts with a gene. Special enzymes copy out, or transcribe, the information in the gene in the form of ribonucleic acid (RNA), a close molecular cousin of DNA in the cell nucleus. A special molecular machine called a 3' end processing machine must then cut that strand of RNA at the correct place to process it into a molecule called a messenger RNA (mRNA), which migrates into the main part of the cell and is translated there into the final protein.

The mRNAs for virtually all proteins are processed by one type of 3' end-processing machine, which cuts them at the correct place and adds a special tail to them. Histone transcripts in animal cells, which encode histone proteins needed for cell division, are processed by a different machine, which cuts them but adds no tail. This is the machine characterized in the new structural study.

"No one really knows why histone mRNAs are different from other mRNAs; it's what we call a theological question," Marzluff joked.

The canonical and histone RNA 3' end-processing machines are each composed of more than a dozen individual proteins and RNA molecules. Some of these elements are found in both machines, suggesting a common evolutionary origin. Since the histone 3' processing machine contains the same three core proteins as the canonical machine, including the protein that actually cleaves the RNA, the process of activation of the two machines is likely similar, although the way the two machines recognize their RNA targets is distinct.

Tong, Dominski, Marzluff and their colleagues succeeded in assembling a working version of the histone RNA 3' end-processing machine from its 13 protein and 2 RNA components, essentially in a test-tube. The machine was then imaged using cryo-electron microscopes at the New York Structural Biology Center (NYSBC), and subsequent data processing ultimately led to a structure at near atomic resolution. The team was also able to mutate key components to verify their individual functions.

The amphora

The structure of the machine turned out to resemble an amphora with one long handle. The cryo-EM analysis also revealed how the machine recognizes histone RNA and cuts it at precisely the right place.

"It detects two elements of the RNA strand, and only when they are present does the cutting device in this machine expose its blades, so to speak," Dominski said. "There is no randomness, no accident; it cleaves only what it is supposed to cleave, and the study reveals this beautifully."

Dominski has been investigating this histone RNA-processing machinery since the mid-1990s and with Marzluff, Tong, and others has been responsible for key discoveries about individual components. "This is more or less the end of the road, as far as understanding how this machine works," he said. "We've resolved that pretty clearly with this study, and it's a good feeling."

Credit: 
University of North Carolina Health Care

Apps could take up less space on your phone, thanks to new 'streaming' software

image: Researchers have developed software that reduces space taken up by apps on a smartphone, allowing users to continue downloading the apps they want without deleting some first.

Image: 
Jamayal Tanweer

WEST LAFAYETTE, Ind. -- If you resort to deleting apps when your phone's storage space is full, researchers have a solution.

New software "streams" data and code resources to an app from a cloud server when necessary, allowing the app to use only the space it needs on a phone at any given time.

"It's like how Netflix movies aren't actually stored on a computer. They are streamed to you as you are watching them," said Saurabh Bagchi, a Purdue University professor of electrical and computer engineering, and computer science, and director of the Center for Resilient Infrastructures, Systems and Processes.

"Here the application components, like heavy video or graphics or code paths, are streaming instantly despite the errors and slowdowns that are possible on a cellular network."

Bagchi's team showed in a study how the software, called "AppStreamer," cuts down storage requirements by at least 85% for popular gaming apps on Android devices.

The software seamlessly shuffles data between an app and a cloud server without stalling the game. Most study participants didn't notice any differences in their gaming experience while the app used AppStreamer.

Since AppStreamer works for these storage-hungry gaming apps, it could work for other apps that usually take up far less space, Bagchi said. The software also allows the app itself to download faster to a phone.

The researchers will present their findings Feb. 18 at the 17th International Conference on Embedded Wireless Systems and Networks in Lyon, France. Conference organizers have selected this study as one of three top papers.

AppStreamer is a type of software known as middleware, located between the apps on a device and the operating system.

The middleware automatically predicts when to fetch data from a cloud server. AT&T Labs Research provided data from cellular networks for this study to help evaluate which bandwidths AppStreamer would use and how much energy it would consume.

AppStreamer could help phones better accommodate 5G connectivity - high-speed wireless cellular networks that would allow devices to download movies in seconds and handle other data-heavy tasks much faster than the 4G networks currently available to most phones.

Using AppStreamer on a 5G network would mean that an app downloads instantly, runs faster and takes up minimal space on a phone.

The researchers also designed AppStreamer to use "edge computing," which stores and sends data from edge servers. These servers, located in spots such as cellphone towers, are closer to a device compared to the cloud. The shorter distance reduces data download time.
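The release doesn't detail AppStreamer's prediction algorithm, but the idea of middleware that "automatically predicts when to fetch data" can be sketched with something as simple as a first-order Markov model over past resource accesses. Everything below (class name, block identifiers) is a hypothetical illustration, not Purdue's implementation:

```python
from collections import defaultdict, Counter

class PrefetchPredictor:
    """First-order Markov predictor: after seeing block A, prefetch
    the blocks that most often followed A in earlier sessions."""

    def __init__(self):
        self.transitions = defaultdict(Counter)  # block -> next-block counts
        self.prev = None

    def observe(self, block_id):
        """Record one access from the app's resource-access stream."""
        if self.prev is not None:
            self.transitions[self.prev][block_id] += 1
        self.prev = block_id

    def predict(self, block_id, k=2):
        """Return up to k blocks most likely to be needed next,
        so the middleware can fetch them before the app asks."""
        return [b for b, _ in self.transitions[block_id].most_common(k)]

# Hypothetical access trace from two play sessions of a game.
p = PrefetchPredictor()
for block in ["menu", "level1", "level2", "menu", "level1", "level3"]:
    p.observe(block)

print(p.predict("menu"))  # ['level1'] -- fetched before it's requested
```

A real prefetcher would also weigh bandwidth and energy cost, which is consistent with the study's use of AT&T cellular-network data to evaluate those trade-offs.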

Bagchi's lab researches ways to make edge computing more reliable. Bagchi wrote on those challenges in an article recently published in Communications of the ACM.

The researchers believe that AppStreamer could be good for more than just phones. In order for self-driving cars to respond to their surroundings more safely, they would need to reliably pull data from servers in milliseconds. Middleware such as AppStreamer could eventually supply this functionality through edge computing on a 5G network.

Credit: 
Purdue University

Cancer-causing culprits could be caught by their DNA fingerprints

Causes of cancer are being catalogued through an international study revealing the genetic fingerprints of DNA-damaging processes that drive cancer development. Researchers from University of California San Diego School of Medicine, Wellcome Sanger Institute, Duke-NUS Medical School Singapore, the Broad Institute of MIT and Harvard, with collaborators around the world, have created the most detailed list yet of these genetic fingerprints, providing clues to how each cancer develops.

These fingerprints will allow scientists to search for previously unknown chemicals, biological pathways and environmental agents responsible for causing cancer.

"We identified almost every publicly available cancer genome at the start of this project and analyzed their whole genome sequences," said first author Ludmil B. Alexandrov, PhD, assistant professor in the Departments of Cellular and Molecular Medicine and Bioengineering at UC San Diego. "The data from these thousands of cancers allowed us to describe mutational signatures in much more detail, and we are confident that we now know most of the signatures that exist."

The research, published on February 5, 2020 in Nature as part of the global Pan-Cancer Project, will help delineate the causes of cancer, inform prevention strategies and define new directions for cancer diagnoses and treatments.

In the United States, the National Cancer Institute estimates 1.7 million new cases of cancer were diagnosed in 2018 and more than 600,000 people died from the disease. Approximately 38 percent of men and women in this country will be diagnosed with cancer in their lifetime.

Cancer is caused by genetic changes -- mutations -- in the DNA of a cell, prompting the cell to divide uncontrollably. Many known causes of cancer, such as ultraviolet light and tobacco use, leave a specific fingerprint of damage in the DNA, known as a mutational signature. These fingerprints can help understand how cancers develop, and potentially how they might be prevented. However, past studies have not been large enough to identify all potential mutational signatures.

"Using our detailed catalogue of the range of mutational signatures in cancer DNA, researchers worldwide will now be able to investigate which chemicals or processes are linked to these signatures," said co-senior author Mike Stratton, PhD, director of the Wellcome Sanger Institute. "This will increase our understanding of how cancer develops, and discover new causes of cancer, helping to inform public health strategies to prevent cancer."

This study identified new mutational signatures that had not been seen before, from single letter typo mutations, to slightly larger insertions and deletions of genetic code. The result is the largest database of reference mutational signatures. Only about half of all mutational signatures have known causes.
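Mutational-signature extraction is commonly posed as non-negative matrix factorization (NMF): a tumors-by-mutation-classes count matrix V is decomposed as V ≈ W @ H, where the rows of H are signatures and W holds each tumor's exposure to them. The toy sketch below uses synthetic counts and a plain NumPy implementation of the standard multiplicative updates; it illustrates the general technique, not the Pan-Cancer Project's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic "true" signatures over 6 mutation classes
# (stand-ins for C>A, C>G, ... categories).
true_sigs = np.array([[0.60, 0.20, 0.10, 0.05, 0.03, 0.02],
                      [0.02, 0.03, 0.05, 0.10, 0.20, 0.60]])
exposures = rng.integers(50, 500, size=(20, 2)).astype(float)  # 20 tumors
V = rng.poisson(exposures @ true_sigs).astype(float) + 1e-9    # counts

# Lee-Seung multiplicative updates for the Frobenius-norm objective.
k = 2
W = rng.random((V.shape[0], k)) + 0.1
H = rng.random((k, V.shape[1])) + 0.1
for _ in range(500):
    H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
    W *= (V @ H.T) / (W @ H @ H.T + 1e-9)

signatures = H / H.sum(axis=1, keepdims=True)  # each row sums to 1
print(signatures.round(2))
```

In real analyses the number of signatures k is itself chosen by model selection, and specialized tools are used; the decomposition idea is the same.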

"Some types of these DNA fingerprints, or mutational signatures, reflect how the cancer could respond to drugs," said co-senior author Steven Rozen, PhD, director of the Center for Computational Biology and professor of Cancer and Stem Cell Biology at Duke-NUS Medical School. "Further research into this could help to diagnose some cancers and what drugs they might respond to."

The global Pan-Cancer Project is the largest and most comprehensive study of whole cancer genomes. The collaboration has created a huge resource of primary cancer genomes, available to researchers worldwide to advance cancer research.

"The availability of a large number of whole genomes enabled us to apply more advanced analytical methods to discover and refine mutational signatures and expand our study into additional types of mutations," said co-senior author Gad Getz, PhD, institute member of the Broad Institute of MIT and Harvard, professor of pathology at Harvard Medical School, and faculty member and director of bioinformatics at the Massachusetts General Hospital Cancer Center.

"Our new collection of signatures provides a more complete picture of biological and chemical processes that damage or repair DNA and will enable researchers to decipher the mutational processes that affect the genomes of newly sequenced cancers."

Credit: 
University of California - San Diego

NASA sees tropical storm Damien form off Australia's Pilbara coast

image: On Feb. 6, 2020, the MODIS instrument that flies aboard NASA's Terra satellite provided a visible image of newly developed Tropical Storm Damien off Western Australia's Pilbara coast.

Image: 
NASA Worldview

The low-pressure area that formed off Australia's Kimberley coast and lingered there for a couple of days has moved west and developed into Tropical Cyclone Damien off the Pilbara coastline. NASA's Terra satellite passed over the Southern Indian Ocean and provided forecasters with a visible image of the new tropical storm. The Pilbara Coast is also known as the northwest coast of Western Australia.

On Feb. 6, the Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Terra satellite provided a visible image of Damien that showed the storm had a more rounded shape than it did the previous day as it continued to consolidate. A more rounded shape of a tropical cyclone indicates it is becoming a more organized storm. Satellite imagery revealed a small central dense overcast with rain bands wrapping in towards the low-level circulation center.

The Australian Bureau of Meteorology (ABM) issued warnings and watches for Damien on Feb. 6. The Warning Zone includes Wallal Downs to Mardie, including Port Hedland, Karratha and Dampier. The Watch Zone extends from Mardie to Onslow, and the inland central Pilbara including Tom Price, Paraburdoo, Marble Bar and Nullagine. A Blue Alert is in effect for residents in or near Wallal Downs to Port Hedland and Mardie to Onslow but not including Onslow, (including the towns of Pannawonica, Tom Price, Paraburdoo, Nullagine and Marble Bar). A Yellow Alert is in effect for residents in or near Port Hedland to Mardie and south to Millstream (including the Town of Port Hedland, Whim Creek, Point Samson, Wickham, Roebourne, Karratha and Dampier).

At 11:00 p.m. AWST (10 a.m. EST) on Feb. 6, the ABM said Tropical Cyclone Damien had maximum sustained winds near 75 kilometers per hour (40 knots/47 mph) with higher gusts. It was located near latitude 17.5 degrees south and longitude 118.1 degrees east, about 315 kilometers (196 miles) north of Port Hedland and 385 kilometers (239 miles) north-northeast of Karratha. Damien is moving to the west-southwest at 20 kilometers (12 miles) per hour.

The tropical cyclone is expected to continue to intensify as it tracks to the west-southwest. Damien is expected to turn south towards the Pilbara coast during Friday, Feb. 7. ABM cautioned that, "Severe tropical cyclone impact is forecast for the Pilbara coast during Saturday, Feb. 8."

NASA's Terra satellite is one in a fleet of NASA satellites that provide data for hurricane research.

Tropical cyclones/hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

Credit: 
NASA/Goddard Space Flight Center

Graphene mapping 50 times faster

image: Denoising algorithm improves Raman scanning speed.

Image: 
©Science China Press

Graphene keeps raising high expectations as a strong, ultrathin, two-dimensional material that could also be the basis for new components in information technology. There is a huge need for characterization of graphene devices. This can be done using Raman spectroscopy: laser light is sent to the material sample, and scattered photons reveal the rotations and vibrations of the molecules inside, and thus the crystal structure. On average, only around 1 in 10 million photons is scattered in this way. This not only makes it hard to detect the right information, it is also very slow: imaging a single pixel may take half a second. The question is whether Raman remains the best option, or whether there are better alternatives. UT researchers Sachin Nair and Jun Gao kept Raman spectroscopy as their starting point but improved its speed drastically: not by changing the technique itself, but by adding an algorithm.

Noise reduction

The algorithm, called Principal Component Analysis (PCA), is well known in the world of signal processing. It is used to improve the signal-to-noise ratio: PCA determines the characteristics of the noise and those of the "real" signal. The larger the dataset, the more reliable this recognition is, and the more clearly the actual signal can be distinguished. In addition, modern Raman instruments have a detector called an electron-multiplying charge-coupled device (EMCCD) that further improves the signal-to-noise ratio. The net result of this work is that processing one pixel takes not half a second but 10 milliseconds or less, so mapping a single sample no longer takes hours. An important feature for vulnerable materials like graphene oxide is that the laser intensity can be lowered by two or three orders of magnitude. These are major steps toward getting a fast grip on a material's properties.
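As a rough illustration of the idea, the sketch below denoises a stack of synthetic spectra with a plain SVD-based PCA: the leading components are assumed to carry the signal, and the rest are discarded as noise. All spectra, peak positions and noise levels are invented for the demo; this is not the authors' implementation.

```python
import numpy as np

def pca_denoise(spectra, n_components):
    """Denoise a (pixels x wavenumbers) matrix of spectra by keeping
    only the leading principal components of the centered data."""
    mean = spectra.mean(axis=0)
    U, s, Vt = np.linalg.svd(spectra - mean, full_matrices=False)
    s[n_components:] = 0.0          # drop components assumed to be noise
    return (U * s) @ Vt + mean      # low-rank reconstruction plus mean

# Synthetic demo: 500 noisy copies of a two-peak "Raman" spectrum
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 256)
clean = np.exp(-((x - 0.3) / 0.02) ** 2) + 0.5 * np.exp(-((x - 0.7) / 0.03) ** 2)
noisy = clean + rng.normal(0.0, 0.5, size=(500, 256))
denoised = pca_denoise(noisy, n_components=1)

err_before = np.mean((noisy - clean) ** 2)
err_after = np.mean((denoised - clean) ** 2)
print(err_before > err_after)  # prints True: MSE drops after denoising
```

The same rank-truncation trick is why a larger scan helps: more pixels give the decomposition more evidence for which directions are signal and which are noise.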

Multi-purpose

Beyond graphene, the improved Raman technique can also be used for other two-dimensional materials like germanene, silicene, molybdenum disulfide, tungsten disulfide and boron nitride. Use of the algorithm is not limited to Raman spectroscopy; techniques like atomic force microscopy and other hyperspectral techniques could also benefit from it.

The research was done in the Physics of Complex Fluids group of Prof Frieder Mugele, part of UT's MESA+ Institute. The researchers collaborated with the Medical Cell BioPhysics group and the Physics of Interfaces and Nanomaterials group, both also at the University of Twente.

Credit: 
Science China Press

Computer simulation for understanding brain cancer growth

The growth of brain cancers can be better understood with the help of a new computer platform developed by international scientists coordinated by Newcastle University.

The platform, which can be used to help develop better treatments for glioma, is freely available: https://biodynamo.org/

Publishing today in the journal Methods, the team from Newcastle University, the University of Cyprus, UCL and CERN describe how they have developed the three-dimensional model which brings together the macroscopic scale of tissue with the microscopic scale of individual cells. This allows the platform to realistically model cancer mechano-biology while maintaining the resolution power which makes it cost-effective.

Dr Roman Bauer from the School of Computing said: "Built on top of the latest computing technologies, the BioDynaMo platform enables users to perform simulations on an increased scale and complexity making it possible to tackle challenging scientific research questions."

Although many specialised software tools exist, establishing this high-performance, general-purpose platform is a major step forward.

Jean de Montigny, PhD student from the Faculty of Medical Sciences said: "The advantage for scientists and medics is that BioDynaMo can be used on standard laptops or desktop computers and provides a software platform which can be used to easily create, run and visualise 3D agent-based biological simulations."
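BioDynaMo itself is a high-performance C++ platform; purely to show what a minimal agent-based growth simulation looks like, here is a toy Python sketch in which each cell on a 3D lattice may divide into a neighbouring site each step. Every name and parameter is invented for the illustration and bears no relation to BioDynaMo's actual API.

```python
import random

def simulate_tumor(steps, p_divide=0.3, seed=1):
    """Toy 3D agent-based tumour growth: each cell may divide into a
    neighbouring lattice site each step; occupied sites stay unique."""
    rng = random.Random(seed)
    cells = {(0, 0, 0)}  # start from a single cell at the origin
    neighbors = [(dx, dy, dz)
                 for dx in (-1, 0, 1) for dy in (-1, 0, 1) for dz in (-1, 0, 1)
                 if (dx, dy, dz) != (0, 0, 0)]
    for _ in range(steps):
        for cell in list(cells):
            if rng.random() < p_divide:
                dx, dy, dz = rng.choice(neighbors)
                cells.add((cell[0] + dx, cell[1] + dy, cell[2] + dz))
    return cells

colony = simulate_tumor(steps=20)
print(len(colony))  # colony size after 20 growth steps
```

Real platforms add the pieces this toy omits, such as mechanical forces between cells, diffusing nutrients and growth factors, and parallel execution over millions of agents.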

Glioma

Glioma is a broad category of brain and spinal cord tumours that originate from neural stem cells and glial cells that support neurons. Treatment for patients (who tend to be older adults) involves surgery to remove as much of the tumour as possible followed by chemotherapy and radiotherapy.

By modelling the growth of a glioma, clinicians can better understand how a tumour may develop and progress. Cancer researchers could also use such models in the future to improve the outcomes of relevant therapeutic procedures.

The paper develops a three-dimensional in silico hybrid model of cancer that describes the multivariate phenotypic behaviour of tumour and host cells. The model encompasses the role of cell migration and adhesion, the influence of the extracellular matrix, the effects of oxygen and nutrient availability, and the signalling triggered by chemical cues and growth factors.

CERN contributed its deep expertise in large-scale computing to the collaboration with Newcastle University, and the project has been supported by Intel.

Moreover, the University of Cyprus and UCL investigators contributed with their expertise in multiscale modelling.

BioDynaMo is an international collaboration partially funded by the CERN Budget for Knowledge Transfer to Medical Applications, through a grant awarded in 2016.

Credit: 
Newcastle University

Bridging the gap between AI and the clinic

The power of artificial intelligence (AI) in medicine lies in its ability to find important statistical patterns in large datasets. A recently published study is an important proof of concept for how AI can help doctors and brain tumour patients make better treatment decisions.

Meningiomas - tumours that arise from the membranes that surround the brain and spinal cord - are the most common primary central nervous system tumour, with an incidence of 8.14 per 100,000 population. While they generally have better outcomes than other brain tumours, there is a great deal of variability in aggressiveness. Being able to predict malignancy and accurately estimate survival is therefore incredibly important in deciding whether surgery is the best option for the patient.

In this study, researchers from The Neuro (Montreal Neurological Institute-Hospital) and the Montreal Children's Hospital of the McGill University Health Centre trained machine learning algorithms on data from more than 62,000 patients with a meningioma. Their goal was to find statistical associations between malignancy, survival, and a series of basic clinical variables including tumour size, tumour location, and surgical procedure.
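The study's actual models and variables are richer, but the general recipe, fitting a model that maps basic clinical variables to an outcome and scoring it on held-out cases, can be sketched on synthetic data. Every number and variable below is invented for illustration; none of it comes from the registry the authors used.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000
# Synthetic stand-ins for basic clinical variables (invented, not registry data)
tumor_size = rng.normal(30.0, 10.0, n)   # mm
age = rng.normal(60.0, 12.0, n)          # years
# Malignancy is made to depend weakly on size and age
true_logits = 0.05 * (tumor_size - 30.0) + 0.02 * (age - 60.0) - 2.0
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logits))).astype(float)

# Design matrix: intercept plus centered predictors
X = np.column_stack([np.ones(n), tumor_size - 30.0, age - 60.0])
w = np.zeros(3)

# Plain gradient descent on the logistic log-loss
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.01 * (X.T @ (p - y)) / n

p = 1.0 / (1.0 + np.exp(-X @ w))
acc = np.mean((p > 0.5) == (y > 0.5))
print(round(float(acc), 3))  # held-in accuracy of the fitted model
```

A model like this recovers a positive weight on tumour size, mirroring the kind of statistical association the study searched for, though the real work also validates on held-out patients and compares several algorithms.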

While the study demonstrated that the models could effectively predict outcomes in individual patients, the researchers emphasised the need for further refinements using larger sets that include brain imaging and molecular data.

They also developed an open-source smartphone app to allow clinicians and other researchers to interactively explore the predictive algorithms described in the paper. They hope that making the app entirely free and open source could help future projects translate newly developed machine learning algorithms to real-world clinical practice. The app is available here for demonstration: http://www.meningioma.app

Jeremy Moreau, a PhD candidate and the study's first author, says the idea of the app was to make the predictive models accessible to the average clinician. While more work is necessary before it can be used in the clinic, Moreau says putting it in the hands of doctors lets them offer the feedback that will be needed for further development.

"We have gotten great feedback on how the app can be used to explore how different clinical factors might influence malignancy and survival," says Moreau. "We believe it provides a unique entry-point for furthering the translatability and transparency of machine learning models, which too often remain impossible for the average clinician to evaluate because of the time and programming knowledge this would require."

The study was published in the journal npj Digital Medicine on January 30, 2020. The study was funded by the Foundation of the Department of Neurosurgery and was also undertaken thanks in part to funding from the Canada First Research Excellence Fund, awarded to McGill University for the Healthy Brains, Healthy Lives Initiative. Moreau also received training awards from the Fonds de recherche du Québec - Santé (FRQS) and the Foundation of Stars. Moreau is co-supervised by Sylvain Baillet, a neuroscientist at The Neuro and Dr. Roy Dudley, a clinician-scientist at The Montreal Children's Hospital. Patient data was accessed through the National Cancer Institute's Surveillance, Epidemiology, and End Results Database.

The Neuro

The Neuro - The Montreal Neurological Institute and Hospital - is a world-leading destination for brain research and advanced patient care. Since its founding in 1934 by renowned neurosurgeon Dr. Wilder Penfield, The Neuro has grown to be the largest specialized neuroscience research and clinical center in Canada, and one of the largest in the world. The seamless integration of research, patient care, and training of the world's top minds make The Neuro uniquely positioned to have a significant impact on the understanding and treatment of nervous system disorders. In 2016, The Neuro became the first institute in the world to fully embrace the Open Science philosophy, creating the Tanenbaum Open Science Institute. The Montreal Neurological Institute is a McGill University research and teaching institute. The Montreal Neurological Hospital is part of the Neuroscience Mission of the McGill University Health Centre. For more information, please visit http://www.theneuro.ca

Credit: 
McGill University

When kids face discrimination, their mothers' health may suffer

COLUMBUS, Ohio - A new study is the first to suggest that children's exposure to discrimination can harm their mothers' health.

The findings reveal that, even when biological and environmental explanations for a woman's health status between ages 40 and 50 are accounted for, there is an association between her children being treated unfairly and a decline in her health during midlife.

"Our study suggests that when a child experiences discrimination, these instances of unfair treatment are likely to harm his or her mother's health in addition to their own," said Cynthia Colen, associate professor of sociology at The Ohio State University and lead author of the study.

The discrimination captured by the data included high-impact incidents of unfair treatment in the workplace as well as everyday mistreatment ranging from poor restaurant service to harassment.

Though previous research has suggested that pregnant women's experiences of discrimination can negatively affect the health of their babies, this is the first time researchers have identified health effects of unfair treatment in the opposite direction, from older children to their middle-aged mothers. The finding prompted the authors to argue that discrimination should be considered not just a social problem, but a health problem.

"When we think about discrimination, we tend to think about what happens to an individual if they themselves experience unfair treatment, whether it's because of their sex or their race or something else," Colen said. "This paper argues that the health effects of discrimination reverberate through families and have the potential to reverberate through communities.

"Our results suggest that discrimination is better understood as a complex social exposure with far-reaching health implications."

The study is published online in the Journal of Health and Social Behavior and will appear in a future print edition.

Colen and her colleagues studied two generations of families using data from mother-child pairs in the National Longitudinal Survey of Youth 1979, a nationally representative sample of men and women who have been surveyed on a regular basis for over 40 years. The NLSY is run by Ohio State's Center for Human Resource Research.

The dataset for this work included 3,004 mothers and 6,562 children, and the analysis focused on adolescents' and young adults' answers to survey questions about exposure to acute or chronic discrimination, and on their mothers' self-rated health at ages 40 and 50.

The discrimination measures were developed in the 1990s by study co-author David Williams of Harvard University. Acute discrimination could include being unfairly fired from a job or a threatening encounter with police. The chronic discrimination measure assesses the frequency of routine interpersonal exchanges that leave a person feeling disrespected, insulted or demeaned.

The measures were determined based on responses to such questions as "Have you ever been unfairly denied a promotion?" or "Have you ever been unfairly stopped, searched, questioned, physically threatened, or abused by police?" for acute discrimination. To measure chronic discrimination, respondents answered questions like "How often have you been treated with less respect than other people?" and "How often have you been called names or insulted?"

African American adolescents and young adults reported the most experiences of discrimination: Almost 22 percent of blacks reported frequent instances of acute discrimination, compared to 14 percent of Hispanics and 11 percent of whites.

Racial disparities in the mothers' health status also were evident: By age 50, 31 percent of blacks reported having fair or poor health, compared to 17 percent of whites and 26 percent of Hispanics.

Analyzing the data in statistical models revealed that mothers of children reporting moderate or high levels of acute discrimination were up to 22 percent more likely to face a decline in their health between ages 40 and 50 than mothers of children who reported low levels of acute discrimination. Smaller but significant declines in health were also noted for mothers whose children experienced frequent chronic discrimination. These associations were evident among African Americans, Hispanics and whites.
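Read as a risk ratio, the "22 percent more likely" figure has a simple arithmetic meaning. The counts below are hypothetical, chosen only so the ratio lands on 1.22; they are not the study's data.

```python
# Hypothetical counts: mothers whose children reported high vs. low acute
# discrimination, split by whether the mother's health declined by age 50
decline_high, total_high = 305, 1000   # high-discrimination group
decline_low, total_low = 250, 1000     # low-discrimination group

risk_high = decline_high / total_high  # 0.305
risk_low = decline_low / total_low     # 0.250
risk_ratio = risk_high / risk_low
print(round(risk_ratio, 2))  # 1.22 -> "22 percent more likely"
```

The study's actual models additionally adjust for biological and environmental covariates, so the published figure is not a raw two-by-two ratio like this one.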

Racial health disparities have been well-documented in previous research, but the specific reasons for these discrepancies can be hard to identify and quantify. Colen expected to find that children's experiences with discrimination would help explain why mothers of color had poorer health than whites, but found that this was true only among African American mothers.

The analysis showed that children's experiences with acute discrimination explained almost 10 percent, and chronic discrimination about 7 percent, of the gap in health declines between black and white women, but these experiences were not linked to the health gap between white and Hispanic moms - even though the data showed that these disparities exist. Colen said adding health data from the mothers at age 60, which wasn't available when she conducted this research, may provide a clearer picture of the intergenerational health effects of discrimination over time.

"We have known for a long time that people who are treated unfairly are more likely to have poor mental and physical health," Colen said. "Now we know that these negative health effects aren't restricted to the person who experiences discrimination firsthand - instead they are intergenerational, and they are likely to be a contributor to racial disparities in health that mean people of color can expect to die younger and live less healthy lives."

Credit: 
Ohio State University

New tool monitors real time mutations in flu

image: An influenza virus binds to receptors on a respiratory tract cell, allowing the virus to enter and infect the cell.

Image: 
U.S. Centers for Disease Control and Prevention

A Rutgers-led team has developed a tool to monitor influenza A virus mutations in real time, which could help virologists learn how to stop viruses from replicating.

The gold nanoparticle-based probe measures viral RNA in live cells infected with influenza A, according to a study in The Journal of Physical Chemistry C. It is the first time in virology that experts have used imaging tools with gold nanoparticles to monitor mutations in influenza, with unparalleled sensitivity.

"Our probe will provide important insight on the cellular features that lead a cell to produce abnormally high numbers of viral offspring and on possible conditions that favor stopping viral replication," said senior author Laura Fabris, an associate professor in the Department of Materials Science and Engineering in the School of Engineering at Rutgers University-New Brunswick.

Viral infections are a leading cause of illness and deaths. The new coronavirus, for example, has led to more than 24,000 confirmed cases globally, including more than 3,200 severe ones and nearly 500 deaths as of Feb. 5, according to a World Health Organization report.

Influenza A, a highly contagious virus that arises every year, is concerning due to the unpredictable effectiveness of its vaccine. Influenza A mutates rapidly, growing resistant to drugs and vaccines as it replicates.

The new study highlights a promising new tool for virologists to study the behavior of influenza A, as well as any other RNA viruses, in host cells and to identify the external conditions or cell properties affecting them. Until now, studying mutations in cells has required destroying them to extract their contents. The new tool enables analysis without killing cells, allowing researchers to get snapshots of viral replication as it occurs. Next steps include studying multiple segments of viral RNA and monitoring the influenza A virus in animals.

Credit: 
Rutgers University

Alaska's national forests contribute 48 million salmon a year to state's fishing industry

image: Alaska's Tongass and Chugach National Forests, which contain some of the world's largest remaining tracts of intact temperate rainforest, contribute an average of 48 million salmon a year to the state's commercial fishing industry, a new USDA Forest Service-led study has found. The average value of these 'forest fish' when they are brought back to the dock is estimated at $88 million per year.

Image: 
Ali Freibott, US Forest Service

Alaska's Tongass and Chugach National Forests, which contain some of the world's largest remaining tracts of intact temperate rainforest, contribute an average of 48 million salmon a year to the state's commercial fishing industry, a new USDA Forest Service-led study has found. The average value of these "forest fish" when they are brought back to the dock is estimated at $88 million per year.

Led by the Forest Service's Pacific Northwest Research Station, the study used Alaska Department of Fish and Game data and fish estimates from 2007 to 2016 to quantify the number and value of Pacific salmon originating from streams, rivers, and lakes on the Tongass and Chugach, which are, respectively, the largest and second-largest national forests in the country. The study focused on five commercially important salmon species--Chinook, coho, sockeye, pink, and chum--caught primarily in four commercial salmon management areas adjacent to these two forests.

"Pacific salmon fisheries are absolutely central to Alaska's economy and culture," said Adelaide Johnson, a Juneau-based hydrologist with the Pacific Northwest Research Station and the study lead. "We suspected that many of the ocean-caught Pacific salmon that support the fishing industry likely began their lives in forest streams that drain the Tongass and Chugach National Forests."

Johnson and Forest Service colleagues Ryan Bellmore and Ronald Mendel, and Alaska Department of Fish and Game's Stormy Haught, used a three-step process to determine the number of fish originating from the Tongass and Chugach. First, they calculated the total number of salmon caught in regional commercial harvest areas. They then subtracted the number of salmon originating from hatcheries--a process facilitated by the hatchery practice of marking juvenile fish--and the number of salmon that originated outside national forest boundaries, such as commercially caught fish that were born in Canadian rivers and rivers farther south in the contiguous United States.
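The three-step bookkeeping reduces to a subtraction for each management area. The helper and the numbers below are hypothetical, picked so the result lands on the study's 48-million average; they are not figures from the paper.

```python
def forest_origin_salmon(total_caught, hatchery_origin, outside_origin):
    """Estimate salmon originating on national forest lands by removing
    hatchery fish and fish born outside forest boundaries from the catch."""
    return total_caught - hatchery_origin - outside_origin

# Hypothetical numbers for one season across the management areas
print(forest_origin_salmon(total_caught=60_000_000,
                           hatchery_origin=9_000_000,
                           outside_origin=3_000_000))  # 48000000
```

In the real analysis each term is itself an estimate: hatchery fish are identified by marks placed on juveniles, and outside-origin fish by the river systems they return to.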

"Our findings underscore just how important Alaska's forest rivers and lakes are for sustaining salmon," said Bellmore, who also is based in Juneau. "At the same time, this study vastly underestimates the value of salmon because it does not include subsistence and recreational salmon fisheries, which are critically important to local communities and the regional economy."

The authors note that even salmon that do not originate from the Chugach and Tongass may still be supported by these forests for a portion of their lives. Pacific salmon fry, for example, that emerge upstream of national forest lands will migrate downstream and may use rivers, lakes, and estuaries within national forest boundaries for rearing.

Additional research is needed to clarify all of the pathways by which these national forests support productive fisheries. Nevertheless, this study can contribute to discussions about alternative land management strategies that might affect salmon populations and associated commercial salmon fisheries, Johnson said.

The USDA Forest Service's Pacific Northwest Research Station--headquartered in Portland, Ore.--generates and communicates scientific knowledge that helps people make informed choices about natural resources and the environment. The station has 11 laboratories and centers located in Alaska, Washington, and Oregon and about 300 employees. Learn more online at https://www.fs.usda.gov/pnw/.

Credit: 
USDA Forest Service - Pacific Northwest Research Station