Deep Learning with Multimodal Representation for Pancancer Prognosis Prediction

2019 ◽  
Author(s):  
Anika Cheerla ◽  
Olivier Gevaert

Abstract
Estimating the future course of cancer is invaluable to physicians; however, current clinical methods fail to effectively use the vast amount of multimodal data that is available for cancer patients. To tackle this problem, we constructed a deep neural network-based model to predict the survival of patients for 20 different cancer types using gene expression data, microRNA data, clinical data and histopathology whole slide images (WSIs). We developed an unsupervised encoder to compress these four data modalities into a single feature vector for each patient, handling missing data through a resilient, multimodal dropout method. Encoding methods were tailored to each data type: deep highway networks were used to extract features from genomic and clinical data, and convolutional neural networks to extract features from pathology images. We then used these feature encodings, trained on pancancer data, to predict pancancer and single-cancer survival, achieving a C-index of 0.784 overall. This work shows that it is possible to build a pancancer model for prognosis that also predicts prognosis in single cancer sites. Furthermore, our model handles multiple data modalities, efficiently analyzes WSIs, and flexibly summarizes patient details into an unsupervised, informative profile. We thus present a powerful automated tool to accurately determine prognosis, a key step towards personalized treatment for cancer patients.
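The "resilient, multimodal dropout" described above (randomly silencing whole modalities during training so the fused representation tolerates missing data at test time) can be sketched roughly as follows. This is a minimal illustration that assumes a PyTorch implementation; the tensor layout, the dropout probability and the rescaling step are assumptions made for illustration, not the authors' published code.

```python
import torch

def multimodal_dropout(features, mask, p_drop=0.25, training=True):
    """Zero out whole modality vectors so the model learns to cope with
    missing modalities.

    features: list of (batch, dim) tensors, one per modality
    mask:     (batch, n_modalities) float tensor; 1 if the modality is
              observed for that patient, 0 if it is missing
    p_drop:   probability of dropping an observed modality (assumed value)
    """
    n_mod = len(features)
    if training:
        # Randomly drop each observed modality with probability p_drop.
        drop = (torch.rand(mask.shape, device=mask.device) < p_drop).float()
        mask = mask * (1.0 - drop)
    # Zero out missing/dropped modalities and rescale the remaining ones so
    # the expected magnitude of the fused representation stays comparable
    # whether two or four modalities are present.
    kept = mask.sum(dim=1, keepdim=True).clamp(min=1.0)
    scale = n_mod / kept
    return [f * mask[:, i:i + 1] * scale for i, f in enumerate(features)]
```

Rescaling by the number of retained modalities plays the same role as the 1/(1-p) factor in ordinary dropout: it keeps the fused feature vector on a consistent scale regardless of how many modalities survive for a given patient.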

2019 ◽  
Vol 35 (14) ◽  
pp. i446-i454 ◽  
Author(s):  
Anika Cheerla ◽  
Olivier Gevaert

Abstract
Motivation: Estimating the future course of patients with cancer lesions is invaluable to physicians; however, current clinical methods fail to effectively use the vast amount of multimodal data that is available for cancer patients. To tackle this problem, we constructed a multimodal neural network-based model to predict the survival of patients for 20 different cancer types using clinical data, mRNA expression data, microRNA expression data and histopathology whole slide images (WSIs). We developed an unsupervised encoder to compress these four data modalities into a single feature vector for each patient, handling missing data through a resilient, multimodal dropout method. Encoding methods were tailored to each data type: deep highway networks were used to extract features from clinical and genomic data, and convolutional neural networks to extract features from WSIs.
Results: We used pancancer data to train these feature encodings and predict single cancer and pancancer overall survival, achieving a C-index of 0.78 overall. This work shows that it is possible to build a pancancer model for prognosis that also predicts prognosis in single cancer sites. Furthermore, our model handles multiple data modalities, efficiently analyzes WSIs and flexibly summarizes multimodal patient data into an unsupervised, informative representation. We thus present a powerful automated tool to accurately determine prognosis, a key step towards personalized treatment for cancer patients.
Availability and implementation: https://github.com/gevaertlab/MultimodalPrognosis
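To make the encoding and survival-prediction steps in the abstract above concrete, here is a rough sketch of a per-modality highway-network encoder and a Cox partial-likelihood training loss; the concordance index reported in the paper would then be computed from the resulting risk scores on held-out patients. PyTorch, the class names, the 512-dimensional shared feature space and the ten-layer depth are illustrative assumptions, not the configuration of the released code at https://github.com/gevaertlab/MultimodalPrognosis.

```python
import torch
import torch.nn as nn

class HighwayLayer(nn.Module):
    """One highway layer: a learned gate mixes a transformed signal with the raw input."""
    def __init__(self, dim):
        super().__init__()
        self.transform = nn.Linear(dim, dim)
        self.gate = nn.Linear(dim, dim)

    def forward(self, x):
        h = torch.relu(self.transform(x))
        t = torch.sigmoid(self.gate(x))
        return t * h + (1.0 - t) * x


class ModalityEncoder(nn.Module):
    """Map one tabular modality (clinical, mRNA or miRNA) into a shared feature space."""
    def __init__(self, in_dim, feat_dim=512, n_layers=10):
        super().__init__()
        self.proj = nn.Linear(in_dim, feat_dim)
        self.highway = nn.Sequential(*[HighwayLayer(feat_dim) for _ in range(n_layers)])

    def forward(self, x):
        return self.highway(self.proj(x))


def cox_partial_likelihood(risk, time, event):
    """Negative Cox partial log-likelihood for a batch of predicted risk scores.

    risk:  (batch,) predicted log-hazard per patient
    time:  (batch,) observed survival or censoring times
    event: (batch,) 1 if death was observed, 0 if censored
    """
    event = event.float()
    order = torch.argsort(time, descending=True)       # longest survivors first
    risk, event = risk[order], event[order]
    log_cumsum = torch.logcumsumexp(risk, dim=0)       # log of each patient's risk-set sum
    return -((risk - log_cumsum) * event).sum() / event.sum().clamp(min=1.0)
```

In this sketch, WSI features would come from a separate convolutional network, each modality's encoder output would pass through the multimodal dropout shown earlier, and the fused vector would feed a small head that outputs the risk score trained with the loss above.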


2015 ◽  
Vol 28 (5) ◽  
pp. 965-976 ◽  
Author(s):  
Bohyeon Kim ◽  
Il Do Ha ◽  
Maengseok Noh ◽  
Myung Hwan Na ◽  
Ho-Chun Song ◽  
...  
