Feature selection and survival modeling in The Cancer Genome Atlas
2013 · DOI: 10.2147/ijn.s40733

Cited by 13 publications (14 citation statements) · References 14 publications
“…For the 1NN model, the correlation coefficient r1 using the full set of iron genes was 0.23. This is a higher correlation than that found by Kim and Bredel [20], who reported a correlation of 0.18 between observed and predicted GBM survival times when employing 1NN over a collection of 12,042 genes. For the Coxnet algorithm, we performed 10-fold cross-validation to determine the L1 regularization parameters.…”
Section: Results (mentioning, confidence: 53%)
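The 1NN prediction scheme described in the quotation — predicting a sample's survival as that of its nearest neighbor in gene-expression space, then correlating predicted with observed times — can be sketched in a few lines. The data below are synthetic stand-ins (the cited study used GBM expression data over a gene panel), and the function name `one_nn_predict` is illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for an expression matrix (samples x genes) and survival times
X_train = rng.normal(size=(50, 20))
y_train = 2.0 * X_train[:, 0] + rng.normal(scale=0.5, size=50)
X_test = rng.normal(size=(20, 20))
y_test = 2.0 * X_test[:, 0] + rng.normal(scale=0.5, size=20)

def one_nn_predict(X_tr, y_tr, X_te):
    """Predict each test sample's survival as that of its nearest training sample."""
    # Euclidean distance from every test sample to every training sample
    d = np.linalg.norm(X_te[:, None, :] - X_tr[None, :, :], axis=2)
    return y_tr[d.argmin(axis=1)]

y_pred = one_nn_predict(X_train, y_train, X_test)
# Pearson correlation between observed and 1NN-predicted survival (analogous to r1)
r = np.corrcoef(y_test, y_pred)[0, 1]
print(round(r, 2))
```

With many uninformative features, the nearest neighbor is often found on noise dimensions, which is one reason feature selection (e.g. restricting to the iron-gene panel) can raise this correlation.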
“…The correlation coefficient r2 for the Coxnet model was -0.30 (since we are comparing risk with survival times, a negative correlation is expected). While this correlation was slightly weaker than that of the 1NN model, removing 53 of 61 genes produced stronger correlations than those of Kim and Bredel [20], who found a correlation of -0.22 using Coxnet applied to a collection of 12,042 genes, and of -0.24 when restricting to genes associated with cancer pathways. The Coxnet algorithm minimizes a penalized likelihood and can set multiple regression coefficients to exactly zero.…”
Section: Results (mentioning, confidence: 74%)
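The sparsity behavior noted above — the L1 penalty driving coefficients exactly to zero — can be illustrated with a minimal sketch. Coxnet penalizes the Cox partial likelihood; to keep the example self-contained, a squared-error loss is used here instead (plain lasso via coordinate descent), which exhibits the same soft-thresholding mechanism:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """L1-penalized least squares via coordinate descent.
    Coxnet applies the same L1 penalty to the Cox partial likelihood;
    squared error is used here only to keep the sparsity mechanism visible."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual with feature j's contribution added back
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r / n
            z = (X[:, j] @ X[:, j]) / n
            # Soft-thresholding: coefficients with |rho| <= lam become exactly 0
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z
    return beta

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 10))
y = X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=100)
beta = lasso_cd(X, y, lam=0.2)
print(int((beta == 0).sum()), "coefficients set exactly to zero")
```

In the study, this zeroing-out is what removed most of the candidate genes, analogous to the 53 of 61 genes dropped in the quoted result; the regularization strength (here `lam`) is the parameter the authors chose by 10-fold cross-validation.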
“…K-NN is a popular classifier due to its simplicity and its efficiency even with noisy data [29]. Despite this simplicity, it can achieve high accuracy rates in medical applications [30,31]. K-NN assigns a class to each data point in the test set according to the majority class among its k nearest neighbors in the training set [32].…”
(mentioning, confidence: 99%)