2021
DOI: 10.1002/cjs.11590

Optimal subsampling for linear quantile regression models

Abstract: Subsampling techniques are efficient methods for handling big data. Quite a few optimal sampling methods have been developed for parametric models in which the loss functions are differentiable with respect to parameters. However, they do not apply to quantile regression (QR) models as the involved check function is not differentiable. To circumvent the non‐differentiability problem, we consider directly estimating the linear QR coefficient by minimizing the Hansen–Hurwitz estimator of the usual loss function …
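
To make the idea in the abstract concrete, here is a minimal sketch (not the authors' implementation) of subsampled linear quantile regression: draw a subsample with given probabilities and estimate the QR coefficient by minimizing a Hansen–Hurwitz, i.e. inverse-probability-weighted, estimator of the check-function loss. The function names, the uniform pilot probabilities, and the Nelder–Mead solver are illustrative assumptions; the paper's optimal sampling probabilities and optimization details are not reproduced here.

import numpy as np
from scipy.optimize import minimize

def check_loss(u, tau):
    # Quantile check function rho_tau(u) = u * (tau - 1{u < 0}).
    return u * (tau - (u < 0))

def subsample_qr(X, y, tau=0.5, r=500, probs=None, seed=None):
    # Draw a size-r subsample with replacement using probabilities `probs`
    # (uniform if None) and minimize the Hansen-Hurwitz estimate of the loss.
    rng = np.random.default_rng(seed)
    n = len(y)
    if probs is None:
        probs = np.full(n, 1.0 / n)          # uniform pilot probabilities (assumption)
    idx = rng.choice(n, size=r, replace=True, p=probs)

    def hh_objective(beta):
        # Hansen-Hurwitz estimator of the full-data check loss:
        # (1 / (n * r)) * sum_{i in subsample} rho_tau(y_i - x_i' beta) / pi_i
        resid = y[idx] - X[idx] @ beta
        return np.sum(check_loss(resid, tau) / probs[idx]) / (n * r)

    beta0 = np.linalg.lstsq(X[idx], y[idx], rcond=None)[0]   # crude least-squares start
    fit = minimize(hh_objective, beta0, method="Nelder-Mead")
    return fit.x

# Toy usage: n = 100,000 observations, subsample of size r = 500.
rng = np.random.default_rng(0)
n = 100_000
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.standard_t(df=3, size=n)
print(subsample_qr(X, y, tau=0.5, r=500, seed=1))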

Cited by 10 publications (4 citation statements); references 14 publications.

Citation statements (ordered by relevance):
“…Regarding the first issue, it is generally accepted that carefully designed sampling probabilities make unequal probability sampling more efficient than simple random or uniform sampling. Many researchers have developed efficient or optimal sampling plans for frequently encountered parametric statistical problems, including linear regression models (Ma et al., 2014), logistic regression (Fithian and Hastie, 2014; Wang et al., 2018; Wang, 2019), softmax regression (Yao et al., 2023), generalized linear models (Ai et al., 2021b), quantile regression (Ai et al., 2021a; Fan et al., 2021; Wang and Ma, 2021), and more general models (Shen et al., 2021; Yu et al., 2022).…”
Section: Introduction (mentioning)
confidence: 99%
“…The current literature on subdata selection is rapidly growing. Much of the relevant literature focuses on identifying subdata that yields precise estimates of parameters in a given statistical model, for example, for linear regression [see 10, 23, 8, 36, 41], logistic regression [42, 39, 7], multinomial logistic regression [46], generalized linear models [13, 36, 1, 50, 53, 16], quantile regression [40, 2, 12, 33], and quasi-likelihood [50]. All of these methods assume a true underlying model.…”
Section: Introduction (mentioning)
confidence: 99%
“…Later, Yao and Wang (2019) and Ai et al. (2021b) extended the subsampling method to softmax regression and generalized linear models, respectively. Very recently, Wang and Ma (2021), Ai et al. (2021a), Fan et al. (2021), and Shao et al. (2022) applied the optimal subsampling method to ordinary quantile regression, and Shao and Wang (2021) and Yuan et al. (2022) developed subsampling methods for composite quantile regression.…”
Section: Introduction (mentioning)
confidence: 99%