2019
DOI: 10.1111/biom.13065
Multiclass Linear Discriminant Analysis With Ultrahigh-Dimensional Features

Abstract: Within the framework of Fisher's discriminant analysis, we propose a multiclass classification method which embeds variable screening for ultrahigh-dimensional predictors. Leveraging inter-feature correlations, we show that the proposed linear classifier recovers informative features with probability tending to one and can asymptotically achieve a zero misclassification rate. We evaluate the finite sample performance of the method via extensive simulations and use this method to classify post-transplan…
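The abstract describes a two-step idea: screen ultrahigh-dimensional predictors down to an informative subset, then apply Fisher's linear discriminant analysis to the retained features. The sketch below illustrates that pipeline on simulated data using a simple marginal ANOVA screen; this is an assumption-laden simplification, not the paper's correlation-aware screening procedure.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import f_classif

rng = np.random.default_rng(0)

# Simulated ultrahigh-dimensional setting: 3 classes, p >> n,
# with only the first 10 features informative.
n_per_class, p, n_classes = 30, 2000, 3
X = rng.standard_normal((n_per_class * n_classes, p))
y = np.repeat(np.arange(n_classes), n_per_class)
for k in range(n_classes):
    X[y == k, :10] += k  # shift class means on the informative features

# Step 1 (screening): rank features by the one-way ANOVA F-statistic and
# keep the top d. (Marginal screen only; the paper leverages inter-feature
# correlations, which this sketch omits.)
F, _ = f_classif(X, y)
d = 50
keep = np.argsort(F)[-d:]

# Step 2: Fisher's LDA on the retained features only.
clf = LinearDiscriminantAnalysis().fit(X[:, keep], y)
acc = clf.score(X[:, keep], y)
print(f"retained {d} of {p} features, training accuracy {acc:.2f}")
```

With strong mean shifts, the marginal screen recovers essentially all informative features and the reduced-dimension LDA separates the classes well.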

Cited by 13 publications (17 citation statements)
References 39 publications (71 reference statements)
“…As stated before, jointly estimating the whole Ω matrix of all 185,000 voxels is computationally prohibitive. Here we employed a “divide-and-conquer” algorithm introduced in Li et al 36 , to detect the local brain networks. The local networks are much smaller in size and thus their corresponding precision matrices are much easier to calculate.…”
Section: Methods (mentioning)
Confidence: 99%
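The excerpt above describes a divide-and-conquer strategy: rather than inverting one enormous covariance matrix, variables are split into small local groups whose precision matrices are cheap to compute. The toy sketch below uses fixed contiguous blocks with ridge shrinkage; this is an illustrative assumption, since the cited method detects data-driven local networks rather than fixed blocks.

```python
import numpy as np

rng = np.random.default_rng(1)

n, p, block = 200, 60, 15
X = rng.standard_normal((n, p))

def blockwise_precision(X, block_size, shrink=0.1):
    """Block-diagonal precision estimate from disjoint variable blocks."""
    p = X.shape[1]
    Omega = np.zeros((p, p))
    for start in range(0, p, block_size):
        idx = slice(start, min(start + block_size, p))
        S = np.cov(X[:, idx], rowvar=False)
        # Ridge shrinkage keeps each small block invertible.
        S += shrink * np.eye(S.shape[0])
        # Invert only the small block, never the full p x p matrix.
        Omega[idx, idx] = np.linalg.inv(S)
    return Omega

Omega = blockwise_precision(X, block)
print(Omega.shape)
```

Each inversion costs O(block³) instead of O(p³), which is what makes the approach feasible when p is in the hundreds of thousands of voxels.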
“…They are, therefore, usually ignored in contemporary neuroimaging association studies. However, when taken into consideration their connections with other signals, marginally weak signals could exude strong predictive effects 35,36 . Such a case is illustrated in Figure 1.…”
Section: Introduction (mentioning)
Confidence: 99%
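The phenomenon quoted above, a marginally weak signal becoming strongly predictive once its correlations with other features are accounted for, can be reproduced with a tiny simulation. In this hypothetical construction, x2 is pure shared noise with essentially zero marginal correlation with the response, yet jointly with x1 it cancels that noise and sharply improves the fit.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000

s = rng.standard_normal(n)        # true signal
e = rng.standard_normal(n)        # noise shared between x1 and x2
x1 = s + e                        # marginally strong predictor
x2 = e                            # marginally unrelated to y
y = s + 0.1 * rng.standard_normal(n)

r_marginal = np.corrcoef(x2, y)[0, 1]

def r2(cols):
    """R^2 of an ordinary least-squares fit of y on the given columns."""
    X = np.column_stack(cols + [np.ones(n)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

print(f"marginal corr(x2, y): {r_marginal:.2f}")
print(f"R^2 with x1 only: {r2([x1]):.2f}, with x1 and x2: {r2([x1, x2]):.2f}")
```

The marginal correlation of x2 with y is near zero, so a purely marginal screen would discard it, yet adding x2 roughly doubles the explained variance because x1 - x2 recovers the signal s.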
“…To deal with the curse of dimensionality, several developments have been made over the last decade or so. For example, among others, new developments include the nearest shrunken centroids [40], shrunken centroids regularized discriminant analysis [18], features annealed independence rule (fair) [12], sparse and penalized LDA [38,42], regularized optimal affine discriminant (road) [13], multi-group sparse discriminant analysis [16], pairwise sure independent screening [29], and the ultra highdimensional multiclass LDA [23]. The general idea of these methods is to incorporate a feature selection strategy in a classifier in order to obtain certain optimality properties in the sense of misclassification rates.…”
Section: Introduction (mentioning)
Confidence: 99%
“…As in our brain‐GWAS, each response image consists of Q ≈ 350,000 voxels, and each genome consists of P ≈ 560,000 SNPs, while we only have n = 373 samples. Furthermore, conditions that guarantee selection consistency for the MSGLasso may fail to hold for ultrahigh‐dimensional cases (Li, Hong & Li 2019).…”
Section: Introduction (mentioning)
Confidence: 99%