2015
DOI: 10.1016/j.csda.2014.08.013
Ordinal Logic Regression: A classifier for discovering combinations of binary markers for ordinal outcomes

Abstract: In medicine, it is often useful to stratify patients according to disease risk, severity, or response to therapy. Since many diseases arise from complex gene-gene and gene-environment interactions, patient strata may be defined by combinations of genetic and environmental factors. Traditional statistical methods require specifying interactions a priori, making it difficult to identify high-order interactions. Alternatively, machine learning methods can model complex interactions; however, these models are often …
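The a-priori interaction requirement that the abstract contrasts against can be made concrete with a small sketch. The snippet below is illustrative only and is not the paper's Ordinal Logic Regression method: it fits a proportional-odds (ordinal logistic) model with statsmodels (≥ 0.13 assumed) on simulated binary markers, where the analyst must write the gene-gene interaction term out by hand. The marker names and data are hypothetical.

```python
# Hypothetical sketch: a proportional-odds (ordinal logistic) model in which a
# gene-gene interaction must be specified a priori as an explicit product term,
# the limitation the abstract refers to. Marker names and data are made up.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "snp1": rng.integers(0, 2, n),      # binary genetic marker
    "snp2": rng.integers(0, 2, n),      # binary genetic marker
    "smoker": rng.integers(0, 2, n),    # binary environmental factor
})

# Simulate an ordinal severity outcome (0 < 1 < 2) driven by a marker combination.
latent = 1.5 * (df.snp1 & df.snp2) + 0.8 * df.smoker + rng.logistic(size=n)
df["severity"] = pd.cut(latent, bins=[-np.inf, 0.5, 2.0, np.inf], labels=[0, 1, 2])

# The analyst must name the interaction explicitly; combinations that are not
# written into the design matrix are never considered by the model.
X = df[["snp1", "snp2", "smoker"]].copy()
X["snp1_x_snp2"] = df.snp1 * df.snp2

model = OrderedModel(df["severity"], X, distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())
```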

Cited by 5 publications (1 citation statement)
References 28 publications
“…The inconsistencies in the literature may occur under different model application conditions, as further discussed below. LR ( 38 ) involves techniques for determining the effects of multiple independent variables on a dependent variable. Logistic sigmoid units are typically used to output (class) binary classifiers rather than ternary classifications, while KNN relies on several nearest neighbor points for classification and is known to work reliably in smaller datasets, as has been shown in previous studies ( 39 ).…”
Section: Discussion (mentioning)
confidence: 99%
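The citing statement's contrast between a logistic (sigmoid) binary classifier and k-nearest neighbours can be illustrated with a short sketch. Nothing below comes from the cited papers: scikit-learn, the toy dataset, and all parameter choices are assumptions made for illustration.

```python
# Illustrative sketch of the contrast drawn in the citing statement: a logistic
# (sigmoid) model as a binary classifier versus k-nearest neighbours, which
# classifies by majority vote among nearby points and is often applied to
# smaller datasets. Library and data choices are assumptions, not from the paper.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Small synthetic binary-outcome dataset.
X, y = make_classification(n_samples=200, n_features=6, n_informative=3,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

logit = LogisticRegression().fit(X_tr, y_tr)               # sigmoid output -> two classes
knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)  # vote among 5 nearest neighbours

print("logistic regression accuracy:", logit.score(X_te, y_te))
print("k-nearest neighbours accuracy:", knn.score(X_te, y_te))
```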