2018
DOI: 10.5705/ss.202016.0402
Conditional quantile correlation learning for ultrahigh dimensional varying coefficient models and its application in survival analysis

Abstract: In this paper, we consider a robust approach to ultrahigh dimensional variable screening under varying coefficient models. Unlike existing works that focus on the mean regression function, we propose a novel procedure based on conditional quantile correlation sure independence screening (CQCSIS). The new proposal is applicable to heterogeneous or heavy-tailed data in general and is invariant to monotone transformations of the response. Furthermore, we generalize such a screening procedure to a…
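As a rough illustration of the marginal ranking idea behind quantile-correlation screening, the sketch below ranks covariates by the sample quantile correlation of Li, Li and Tsai (2015) and keeps the top ⌊n/log n⌋. This is a minimal assumption-laden sketch, not the authors' CQCSIS implementation (which involves varying coefficients); all function names are hypothetical.

```python
import numpy as np

def quantile_correlation(y, x, tau=0.5):
    """Sample quantile correlation qcor_tau(y, x) =
    mean(psi_tau(y - Q_tau) * (x - mean(x))) / sqrt(tau*(1-tau)*var(x)),
    where psi_tau(w) = tau - 1{w < 0} (Li, Li and Tsai, 2015)."""
    q = np.quantile(y, tau)
    psi = tau - (y < q).astype(float)        # psi_tau(y - Q_tau)
    qcov = np.mean(psi * (x - x.mean()))     # quantile covariance
    return qcov / np.sqrt(tau * (1 - tau) * x.var())

def screen(y, X, tau=0.5, d=None):
    """Rank covariates by |qcor| and keep the top d (default: n / log n)."""
    n, p = X.shape
    d = d or int(n / np.log(n))
    scores = np.array([abs(quantile_correlation(y, X[:, j], tau))
                       for j in range(p)])
    return np.argsort(scores)[::-1][:d]      # indices of retained covariates
```

Because the ranking depends on y only through indicators of the form 1{y < Q_tau(y)}, it is invariant to monotone transformations of the response, which is the robustness property the abstract highlights.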

Cited by 6 publications (7 citation statements) · References 34 publications (67 reference statements)
“…Variable or feature selection in high-dimensional linear quantile regression has been extensively studied in the literature, and various shrinkage and screening techniques have been introduced to identify significant covariates (e.g., Wang, Wu and Li, 2012; Fan, Fan and Barut, 2014; Zheng, Peng and He, 2015; Ma, Li and Tsai, 2017). For extensions to high-dimensional nonparametric quantile regression, we refer to Fernández-Val (2011), He, Wang and Hong (2013) and Xia, Li and Fu (2018). In this section, we consider a general nonparametric quantile regression setting that contains high-dimensional mixed continuous and discrete covariates.…”
Section: Extension to High-dimensional Setting
Confidence: 99%
“…The above theorem complements some existing sure screening properties in high-dimensional quantile estimation (cf. He, Wang and Hong, 2013; Ma, Li and Tsai, 2017; Xia, Li and Fu, 2018). An alternative kernel screening procedure is to conduct leave-one-out kernel estimation for each marginal quantile regression and then use a data-driven CV method to determine the optimal smoothing parameter.…”
Section: Let
Confidence: 99%
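The leave-one-out CV step mentioned in the statement above can be sketched as follows: for each candidate bandwidth, estimate the marginal conditional quantile at each point with a kernel-weighted sample quantile that excludes that observation, score the fit with the check loss, and pick the bandwidth minimizing the averaged loss. This is a generic sketch under assumed choices (Gaussian kernel, weighted-quantile estimator), not the cited authors' procedure.

```python
import numpy as np

def check_loss(u, tau):
    """Quantile check loss rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0).astype(float))

def weighted_quantile(y, w, tau):
    """tau-th quantile of y under non-negative weights w."""
    order = np.argsort(y)
    cw = np.cumsum(w[order])
    idx = np.searchsorted(cw, tau * cw[-1])
    return y[order][min(idx, len(y) - 1)]

def loo_cv_score(x, y, h, tau=0.5):
    """Leave-one-out CV check loss of a kernel-weighted marginal
    quantile estimate with Gaussian kernel bandwidth h."""
    n = len(y)
    loss = 0.0
    for i in range(n):
        w = np.exp(-0.5 * ((x - x[i]) / h) ** 2)
        w[i] = 0.0                       # leave observation i out
        loss += check_loss(y[i] - weighted_quantile(y, w, tau), tau)
    return loss / n

def select_bandwidth(x, y, hs, tau=0.5):
    """Pick the bandwidth minimizing the LOO-CV criterion."""
    return min(hs, key=lambda h: loo_cv_score(x, y, h, tau))
```

Leaving observation i out of its own fit is what makes the criterion data-driven: an undersmoothed fit that merely interpolates its neighbors is penalized rather than rewarded.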
“…Assuming that the error term follows a normal distribution, the Wald P value for testing the null hypothesis β_j = 0 can be obtained from the survreg function in R. After sorting the 20,000 P values in ascending order, the top markers in the ranked list reflect the strongest covariate effects on the mean survival time. Following the recommendation of Fan and Lv and other references, we keep the top ⌊n/log(n)⌋ variables in the ranked list and screen out the remaining covariates. Thus, the first 66 variables in the ranked list are selected for downstream analysis.…”
Section: Simulation
Confidence: 99%
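The selection step quoted above reduces to sorting marginal p-values and keeping the ⌊n/log n⌋ smallest. A minimal sketch (the function name is hypothetical; the p-values themselves would come from the marginal Wald tests described above):

```python
import numpy as np

def keep_top_by_pvalue(pvals, n):
    """Keep the floor(n / log(n)) covariates with the smallest marginal
    p-values, the Fan-and-Lv-style screening size quoted above."""
    d = int(np.floor(n / np.log(n)))
    order = np.argsort(pvals)        # ascending: smallest p-value first
    return np.sort(order[:d])        # indices of retained covariates
```

For example, a sample size of n = 400 (an illustrative value; the excerpt does not state n) gives ⌊400/log 400⌋ = 66, matching the 66 variables retained in the quoted simulation.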
“…To the best of our knowledge, there are very few works applying this classical dependence concept in the high-dimensional setting. Xia et al. (2018) proposed a robust conditional feature screening approach; however, their method is robust only with respect to the response, not the covariates. Another relevant recent work is Ma et al. (2017).…”
Section: Introduction
Confidence: 99%