2021
DOI: 10.48550/arxiv.2109.03028
Preprint

Robust adaptive Lasso in high-dimensional logistic regression with an application to genomic classification of cancer patients

Abstract: Penalized logistic regression is extremely useful for binary classification with a large number of covariates (significantly higher than the sample size), having several real-life applications, including genomic disease classification. However, the existing methods based on the likelihood-based loss function are sensitive to data contamination and other noise and, hence, robust methods are needed for stable and more accurate inference. In this paper, we propose a family of robust estimators for sparse logistic…

Cited by 1 publication (6 citation statements)
References 33 publications
“…The minimization of the objective (9) produces robust adaptively weighted DPD-LASSO estimators, which include the DPD-LASSO estimator for w(·) = 1. The resulting penalized MDPDEs are indeed robust for all positive values of α when the initial estimator β is also robust, as proved in Basu et al. [7], and non-robust at α = 0, which corresponds to the MLE. Moreover, they are consistent and asymptotically normal in the high-dimensional set-up of non-polynomial order, i.e., when log(k) = O(n^s) for some s ∈ (0, 1), under some regularity conditions.…”
Section: Robust Regularized Logistic Regression (supporting)
confidence: 64%
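The adaptively weighted objective, equation (9) of the citing paper, is not reproduced in this excerpt. Purely as an illustration, the sketch below assumes the standard per-observation density power divergence (DPD) loss for the Bernoulli model, as in Basu et al. [7], combined with a weighted ℓ1 penalty; the adaptive-LASSO-style weight choice, the tuning parameters, and the function names are placeholders rather than the citing paper's notation.

```python
import numpy as np

def dpd_loss(beta, X, y, alpha):
    """Per-sample density power divergence loss for logistic regression.

    Up to additive constants, this tends to the negative log-likelihood
    as alpha -> 0, matching the non-robust MLE case noted above.
    """
    pi = 1.0 / (1.0 + np.exp(-X @ beta))        # fitted P(y = 1 | x)
    pi = np.clip(pi, 1e-10, 1.0 - 1e-10)        # numerical safety
    return (pi ** (1 + alpha) + (1 - pi) ** (1 + alpha)
            - (1 + 1 / alpha) * (y * pi ** alpha + (1 - y) * (1 - pi) ** alpha))

def weighted_dpd_lasso_objective(beta, X, y, alpha, lam, beta_init, gamma=1.0):
    """DPD loss plus an adaptively weighted L1 penalty.

    The weights follow a common adaptive-LASSO recipe, w_j = 1 / |beta_init_j|^gamma,
    built from a (robust) initial estimate; gamma = 0 gives unit weights and hence
    a plain DPD-LASSO-type objective.  This is an assumed form, not the exact
    weight function w(.) of the cited work.
    """
    w = 1.0 / (np.abs(beta_init) + 1e-8) ** gamma
    return dpd_loss(beta, X, y, alpha).mean() + lam * np.sum(w * np.abs(beta))
```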
“…Fokianos [11], Park and Hastie [21], Plan and Vershynin [23], Zhu and Hastie [30], and Sun and Wang [26] are interesting papers based on the LASSO estimator for the logistic regression model. Basu et al. [7] extended the LASSO procedure with a DPD-based loss function for the logistic regression model, producing more robust estimators. The so-called LASSO penalized MDPDE (DPD-LASSO) is then given by…”
Section: Robust Regularized Logistic Regression (mentioning)
confidence: 99%
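The DPD-LASSO expression itself is truncated in the excerpt above. For a usage illustration only, reusing the hypothetical weighted_dpd_lasso_objective from the earlier sketch with unit weights (gamma = 0) and a generic derivative-free optimizer gives a plain DPD-LASSO-style fit on simulated data; an actual implementation would rely on a coordinate-descent or proximal-gradient algorithm that handles the non-smooth ℓ1 term.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, k = 200, 20
X = rng.normal(size=(n, k))
beta_true = np.zeros(k)
beta_true[:3] = [2.0, -1.5, 1.0]                      # sparse ground truth
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))

# Unit penalty weights (gamma = 0) give a plain DPD-LASSO-type objective;
# weighted_dpd_lasso_objective is the hypothetical helper sketched above.
obj = lambda b: weighted_dpd_lasso_objective(
    b, X, y, alpha=0.5, lam=0.05, beta_init=np.ones(k), gamma=0.0)

fit = minimize(obj, np.zeros(k), method="Nelder-Mead",
               options={"maxiter": 50000, "maxfev": 50000})
print(np.round(fit.x, 2))   # the larger coefficients should land in the first three slots
```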