2019 IEEE 8th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP)
DOI: 10.1109/camsap45676.2019.9022472

Robust, Sparse and Scalable Inference Using Bootstrap and Variable Selection Fusion

Abstract: This paper introduces a new regularized version of the robust τ-regression estimator for analyzing high-dimensional data sets subject to gross contamination in the response variables and covariates. We call the resulting estimator the adaptive τ-Lasso; it is robust to outliers and high-leverage points and simultaneously employs an adaptive ℓ1-norm penalty term to reduce the bias associated with large true regression coefficients. More specifically, this adaptive ℓ1-norm penalty term assigns a weight to each regressi…
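
Although the abstract is truncated, the penalty it describes follows the standard adaptive-Lasso pattern. As a generic sketch (the weight form w_j and the exact robust loss are placeholders; the paper's specific choices are not visible in the truncated text):

```latex
\hat{\boldsymbol{\beta}}
  = \arg\min_{\boldsymbol{\beta}}
    \;\hat{\tau}^{2}\!\left(\mathbf{y} - \mathbf{X}\boldsymbol{\beta}\right)
    + \lambda \sum_{j=1}^{p} w_{j}\,\lvert\beta_{j}\rvert,
\qquad
w_{j} = \frac{1}{\lvert\hat{\beta}_{j}^{\,\mathrm{init}}\rvert^{\gamma}},
\quad \gamma > 0,
```

where τ̂² is a robust τ-scale of the residuals and β̂ᵢₙᵢₜ is an initial estimate. Coefficients that are large in the initial fit receive small weights and are penalized less, which is the bias-reduction mechanism the abstract refers to.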

Cited by 3 publications (3 citation statements) · References 38 publications

“…Finally, the BLB estimator is obtained by averaging over each subset. Several follow-up works include: the subsampled double bootstrap (SDB) proposed by Sengupta et al. (2016), which combines the idea of the BLB with a fast double bootstrap by using only one inflated resample for each subset; the bag of little fast and robust bootstrap (BLFRB) by Basiri et al. (2015), which incorporates into the BLB a fast fixed-point estimation technique from the fast and robust bootstrap method for each subset and can thus be applied to any estimator representable as smooth fixed-point equations; and a two-stage algorithm from Mozafari-Majd and Koivunen (2019) for datasets with sparsity and outlying observations, where variable selection is performed for each disjoint subset in the first stage and the actual inference is performed in the second stage by employing a robust MM-estimator-based extension of the BLB method.…”
Section: Bootstrap
confidence: 99%
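
The BLB averaging step quoted above is compact enough to sketch. The following is a minimal, illustrative Python implementation of the BLB of Kleiner et al., assuming a scalar estimator and a standard-error quality measure; the function names (`blb_standard_error`, `weighted_mean`) and parameter defaults are assumptions for illustration, not code from the cited papers.

```python
import numpy as np

def blb_standard_error(data, estimator, n_subsets=10, subset_exp=0.6,
                       n_boot=100, seed=None):
    """Bag of Little Bootstraps (BLB) sketch.

    Estimates the standard error of `estimator` (a function taking a data
    subset and per-point resampling counts) by bootstrapping each small
    subset and averaging the per-subset estimates, as in the quoted passage.
    Illustrative only -- not the implementation from the cited papers.
    """
    rng = np.random.default_rng(seed)
    n = len(data)
    b = int(n ** subset_exp)              # subset size b = n^gamma, b << n
    per_subset_se = []
    for _ in range(n_subsets):
        # The original BLB uses disjoint subsets; random subsets keep the sketch short.
        subset = rng.choice(data, size=b, replace=False)
        stats = []
        for _ in range(n_boot):
            # A size-n resample is represented by multinomial counts over
            # the b distinct subset points, so cost stays O(b) per resample.
            counts = rng.multinomial(n, np.full(b, 1.0 / b))
            stats.append(estimator(subset, counts))
        per_subset_se.append(np.std(stats, ddof=1))
    # Final BLB step from the quoted passage: average over the subsets.
    return float(np.mean(per_subset_se))

def weighted_mean(x, counts):
    """Toy estimator: sample mean under resampling weights."""
    return float(np.average(x, weights=counts))

if __name__ == "__main__":
    x = np.random.default_rng(0).standard_normal(10_000)
    print(blb_standard_error(x, weighted_mean, seed=1))
```

The key computational trick is that each size-n resample is encoded as multinomial counts over only b distinct points, so the estimator never touches more than b observations at a time.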
“…However, for two-sample U-statistics, the computational advantage of the BLB and SDB may not be available. Moreover, the bootstrap algorithms proposed in Basiri et al. (2015) and Mozafari-Majd and Koivunen (2019) cannot be applied to the two-sample U-statistic directly. Therefore, we pursue other versions of bootstrap algorithms.…”
Section: Bootstrap
confidence: 99%
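
For orientation, recall the standard definition of a two-sample U-statistic with kernel h of orders (r, s) applied to independent samples X₁,…,X_m and Y₁,…,Y_n (textbook notation, not taken from the cited paper):

```latex
U_{m,n}
  = \binom{m}{r}^{-1}\binom{n}{s}^{-1}
    \sum_{1 \le i_1 < \cdots < i_r \le m}\;
    \sum_{1 \le j_1 < \cdots < j_s \le n}
    h\!\left(X_{i_1},\dots,X_{i_r};\, Y_{j_1},\dots,Y_{j_s}\right).
```

Roughly speaking, because the statistic couples the two samples through every pair of index subsets, subset-based schemes such as the BLB and SDB, which inflate each little subset to full size on its own, do not directly reproduce its sampling distribution, which is why the quoted authors turn to other bootstrap variants.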