2019
DOI: 10.1101/577197
Preprint
Deep Learning with Multimodal Representation for Pancancer Prognosis Prediction

Abstract: Estimating the future course of cancer is invaluable to physicians; however, current clinical methods fail to effectively use the vast amount of multimodal data that is available for cancer patients. To tackle this problem, we constructed a deep neural network based model to predict the survival of patients for 20 different cancer types using gene expression, microRNA data, clinical data and histopathology whole slide images (WSIs). We developed an unsupervised encoder to compress these four data modalities in…
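The abstract describes compressing four data modalities into a shared patient representation. As a minimal sketch only (the paper's actual model uses learned per-modality encoders and a dropout-based scheme for missing modalities, none of which is reproduced here), a hypothetical fusion step might average whatever per-modality embedding vectors are available, so that a missing modality is simply skipped:

```python
def fuse_modalities(embeddings):
    """Fuse per-modality embedding vectors into one patient vector
    by averaging the available ones. A None entry marks a missing
    modality and is skipped. This is an illustrative stand-in for
    the paper's learned multimodal fusion, not its actual method."""
    present = [e for e in embeddings if e is not None]
    dim = len(present[0])
    return [sum(e[k] for e in present) / len(present) for k in range(dim)]

# Example: gene-expression and clinical embeddings present, WSI missing.
patient_vec = fuse_modalities([[1.0, 2.0], [3.0, 4.0], None])
```

Mean pooling is only one of several plausible fusion choices; concatenation or attention-weighted pooling would be equally valid sketches.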


Cited by 11 publications (21 citation statements)
References 42 publications
“…Table 3 also showed the results of DCAP using multi-omics data in the last column. In comparison, three traditional methods (the general Cox model, the Cox model with lasso regularization, and the Cox model with elastic net) achieved C-index values between 0.565 and 0.569, much lower than those obtained by several advanced methods in recent studies (Cox_DL, Cox_transfer, Cox_TRACE, Cox_cCMTL) (Cheerla and Gevaert, 2019; Wang et al., 2017). These advanced methods obtained C-index values between 0.605 and 0.632, with an average of 0.620.…”
Section: Predicting Cancer Prognosis With Multi-omics Data
confidence: 75%
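The C-index values compared above measure how often a model's predicted risk ordering agrees with the observed survival ordering across comparable patient pairs. A minimal pure-Python sketch of the standard concordance-index computation (with the common conventions: censored pairs are skipped unless the earlier event was observed, and score ties count as half-concordant):

```python
from itertools import combinations

def c_index(times, events, scores):
    """Concordance index for survival predictions.
    times: observed follow-up times; events: 1 if death observed,
    0 if censored; scores: predicted risk (higher = worse prognosis).
    Returns the fraction of permissible pairs ordered correctly."""
    concordant, permissible = 0.0, 0
    for i, j in combinations(range(len(times)), 2):
        if times[j] < times[i]:
            i, j = j, i  # ensure i has the shorter observed time
        if not events[i]:
            continue  # shorter time is censored: pair not comparable
        permissible += 1
        if scores[i] > scores[j]:
            concordant += 1.0
        elif scores[i] == scores[j]:
            concordant += 0.5  # ties count as half-concordant
    return concordant / permissible
```

A C-index of 0.5 corresponds to random ordering and 1.0 to perfect ranking, which is why the 0.565–0.569 vs. 0.605–0.632 gap quoted above is meaningful.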
“…The solid lines were drawn by DCAP and the dotted lines (grey) were drawn by DCAP-kmeans. (Wang et al., 2017) b: The results reported in reference (Cheerla and Gevaert, 2019). c: DCAP using only mRNA. Figure 3: The C-index values obtained by DCAP models in cross-cancer prognosis prediction. The DCAP models trained on one cancer are used to predict prognosis on another cancer.…”
Section: Figure
confidence: 94%
“…For this, it combines models that excel at processing each of these modalities into a single model. For example, an integration of histological images and gene expression data has been shown to improve pan-cancer prognosis [190]. At the same time, pregnancy outcomes are highly interrelated and may point towards different phenotypes with similar pathologies.…”
Section: Box 4 Machine Learning: Multimodal and Multitask Learning F
confidence: 99%
“…The problem of small data size may be a limiting factor in many biomedical analyses, especially when studies are conducted with data that is expensive to produce, or in the case of multi-modal data (Cheerla and Gevaert, 2019). Our work shows the promise of meta-learning for biomedical applications to alleviate the problem of limited data.…”
Section: Results
confidence: 92%
“…For example, autoencoder architectures have been employed to extract features from genomic data for liver cancer prognosis prediction (Chaudhary et al., 2018). The Cox model has also been integrated in a neural network setting to allow greater modeling flexibility (Cheerla and Gevaert, 2019; Ching et al., 2018; Luck et al., 2017; Yousefi et al., 2017).…”
Section: Introduction
confidence: 99%
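Integrating the Cox model into a neural network, as the citation above describes, typically means using the negative log partial likelihood of the Cox model as the training loss, with the network output serving as the linear predictor. A hedged sketch of that loss (ignoring tied event times, which real implementations handle with Breslow or Efron corrections):

```python
import math

def cox_neg_log_partial_likelihood(times, events, risks):
    """Negative log partial likelihood of the Cox model.
    times: observed times; events: 1 if the event was observed,
    0 if censored; risks: model outputs (linear predictors).
    Each observed event contributes log(sum of exp(risk) over the
    risk set) minus its own risk. Ties are not specially handled."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    nll = 0.0
    for pos, i in enumerate(order):
        if not events[i]:
            continue  # censored subjects contribute only via risk sets
        risk_set = order[pos:]  # subjects still at risk at times[i]
        log_denom = math.log(sum(math.exp(risks[j]) for j in risk_set))
        nll += log_denom - risks[i]
    return nll
```

With all risks equal, each of n uncensored events contributes log of its risk-set size, so three events with zero risk give log(3) + log(2) + log(1) = log(6), a quick sanity check for any implementation.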