2007
DOI: 10.2202/1544-6115.1248

Sparse Logistic Regression with Lp Penalty for Biomarker Identification

Abstract: In this paper, we propose a novel method for sparse logistic regression with the non-convex regularization Lp (p < 1). Based on a smooth approximation, we develop several fast algorithms for learning the classifier that are applicable to high-dimensional datasets such as gene expression data. To the best of our knowledge, these are the first algorithms to perform sparse logistic regression with an Lp and elastic net (Le) penalty. The regularization parameters are chosen by maximizing the area under the ROC curve (AUC)…
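The penalty behind this approach, sum_j |w_j|^p with p < 1, is non-convex and non-smooth at zero, which is why the abstract emphasizes a smooth approximation. Below is a minimal Python sketch of that general idea only; the surrogate (w_j^2 + eps)^(p/2), the plain gradient-descent loop, and every constant are illustrative assumptions, not the paper's actual algorithms.

import numpy as np

def sigmoid(z):
    """Numerically safe logistic function."""
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

def fit_lp_logistic(X, y, p=0.5, lam=0.1, eps=1e-6, lr=0.01, n_iter=2000):
    """Gradient descent on an Lp-penalized logistic loss.

    The non-smooth penalty lam * sum_j |w_j|**p is replaced by the smooth
    surrogate lam * sum_j (w_j**2 + eps)**(p / 2), so plain gradient
    descent applies. Labels y are assumed to be 0/1.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        mu = sigmoid(X @ w)                 # predicted probabilities
        grad_loss = X.T @ (mu - y) / n      # logistic-loss gradient
        # d/dw of (w**2 + eps)**(p/2) is p * w * (w**2 + eps)**(p/2 - 1)
        grad_pen = lam * p * w * (w**2 + eps) ** (p / 2.0 - 1.0)
        w -= lr * (grad_loss + grad_pen)
    return w

As the abstract notes, p and the regularization strength would then be tuned by maximizing cross-validated AUC rather than fixed as in this sketch.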

Cited by 57 publications (47 citation statements); references 28 publications.
“…Here, we evaluate the influence of smoothing, normalization, and template choice on the performance of two classification methods: penalized logistic regression (PLR) and linear SVM (hard and soft margin) when combined with a high-dimensional image warping technique called symmetric normalization (SyN) (Avants et al [3]) as implemented in the Advanced Normalization Tools (ANTS) software package. Penalized logistic regression (PLR) has been used in genetics to analyze microarray and sequence data (Liu et al [34]; Park and Hastie [39]; Shevade and Keerthi [44]; Zhu and Hastie [53]), and in neuroimaging applications it has been used to analyze fMRI (Ryali et al [43]; Yamashita et al [52]) and regional volumes of sMRI (Casanova et al [6,7]) data. Here we use PLR to solve the very large classification problems that result when voxels from sMRI images are used as input features.…”
Section: Introduction (mentioning; confidence: 99%)
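The excerpt above applies penalized logistic regression to voxel-level sMRI features, where predictors vastly outnumber subjects. A minimal sketch of ridge-penalized (L2) logistic regression with scikit-learn on synthetic stand-in data; the array shapes and the C value are placeholders, not the cited studies' pipelines.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical stand-in for voxel features: rows = subjects, columns = voxels.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2000))
y = rng.integers(0, 2, size=60)

# L2 (ridge) penalty; C is the inverse regularization strength.
clf = make_pipeline(
    StandardScaler(),
    LogisticRegression(penalty="l2", C=0.1, max_iter=1000),
)
clf.fit(X, y)
print(clf.predict(X[:5]))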
“…Filter approaches are independent of any specific learning algorithm [22,2], while wrapper approaches involve a learning algorithm as part of the evaluation procedure [39,23]. Some representative wrapper methods include Sequential Forward Selection (SFS) [39], Sequential Forward Floating Selection (SFFS) [32], and sparse logistic regression based methods [27,29,14,34].…”
Section: Metaheuristics for Feature Selection (mentioning; confidence: 99%)
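A wrapper method of the kind the excerpt describes repeatedly refits the learning algorithm to score candidate feature subsets. The sketch below uses scikit-learn's SequentialFeatureSelector, which implements plain sequential forward selection (not the floating SFFS variant); the dataset and parameter choices are illustrative.

from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=50, n_informative=5, random_state=0)

# Wrapper selection: the classifier itself scores each candidate subset via CV.
sfs = SequentialFeatureSelector(
    LogisticRegression(max_iter=1000),
    n_features_to_select=5,
    direction="forward",
    cv=5,
)
sfs.fit(X, y)
print(sfs.get_support(indices=True))  # indices of the selected features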
“…For example, sparse logistic regression has been used in the prediction of leukemia [15], Alzheimer's disease [22], and cancers [10]. In recent years, researchers have also designed different regularization terms [13,17,26] to enforce more complex sparsity patterns on the learned model. However, all these works require a vector-based representation of the data.…”
Section: Related Work (mentioning; confidence: 99%)
“…Depending on the different sparsity structures the model wants to explore, we can construct different sparsity-inducing regularization terms. By adding them to the objective of conventional logistic regression, we obtain different types of sparse logistic regression models [13,17,26].…”
Section: Introduction (mentioning; confidence: 99%)
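To make the last point concrete, here is how two common sparsity-inducing penalties change a fitted logistic model. This uses scikit-learn's convex L1 and elastic-net penalties as stand-ins, since the non-convex terms of [13], [17], [26] are not available there; the data and hyperparameters are illustrative.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=100, n_informative=10, random_state=0)

# L1 (lasso-style) penalty drives many coefficients exactly to zero.
l1 = LogisticRegression(penalty="l1", solver="saga", C=0.1, max_iter=5000).fit(X, y)

# Elastic net mixes L1 sparsity with L2 shrinkage via l1_ratio.
enet = LogisticRegression(
    penalty="elasticnet", solver="saga", l1_ratio=0.5, C=0.1, max_iter=5000
).fit(X, y)

print("nonzero L1 coefficients:", np.count_nonzero(l1.coef_))
print("nonzero elastic-net coefficients:", np.count_nonzero(enet.coef_))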