2009 Ninth IEEE International Conference on Data Mining
DOI: 10.1109/icdm.2009.78
Least Square Incremental Linear Discriminant Analysis

Abstract: Linear discriminant analysis (LDA) is a well-known dimension reduction approach, which projects high-dimensional data into a low-dimensional space with the best separation of different classes. In many tasks, the data accumulates over time, and thus incremental LDA is more desirable than batch LDA. Several incremental LDA algorithms have been developed and achieved success; however, the eigenproblem involved requires a large computation cost, which hampers the efficiency of these algorithms. In this pap…
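To make the abstract concrete, here is a minimal sketch of classical (batch) Fisher LDA as the paper describes it: projecting high-dimensional data onto directions that best separate the classes by solving a scatter-matrix eigenproblem. This is illustrative only and is not the paper's least-square incremental formulation; the function name and regularization-free pseudo-inverse are choices made here.

```python
import numpy as np

def lda_projection(X, y, n_components):
    """Batch Fisher LDA sketch: maximize between-class scatter
    relative to within-class scatter, then project X."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    # generalized eigenproblem Sb w = lambda * Sw w,
    # solved here via a pseudo-inverse of Sw
    evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(-evals.real)
    W = evecs[:, order[:n_components]].real
    return X @ W
```

This eigenproblem is exactly the per-update cost the abstract says incremental variants struggle with, which motivates the paper's least-square reformulation.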


Cited by 32 publications (13 citation statements); references 11 publications.
“…Seng et al. [24], in their groundbreaking work, used Bi-Directional Principal Component Analysis (BDPCA) [26] and Least-Square Linear Discriminant Analysis (LSLDA) [27] for dimensionality reduction and class discrimination. In their study, the two models were cascaded, with the output of BDPCA used as the input of LSLDA [27]. The extracted features are then forwarded to a convolutional neural network.…”
Section: Zhang et al. in their work
confidence: 99%
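The cascade quoted above can be sketched in a few lines; plain PCA and a binary Fisher direction stand in here for BDPCA and LSLDA (the function name and stand-in methods are assumptions, not the cited models), and the resulting one-dimensional feature would be what is forwarded to a downstream classifier such as a CNN.

```python
import numpy as np

def pca_then_lda(X, y, k):
    """Two-stage cascade sketch: PCA reduction, then a binary
    Fisher-LDA direction for class discrimination."""
    # Stage 1: PCA via SVD of the centered data, keep k components
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt[:k].T
    # Stage 2: binary Fisher direction w = Sw^+ (m1 - m0)
    Z0, Z1 = Z[y == 0], Z[y == 1]
    Sw = ((Z0 - Z0.mean(0)).T @ (Z0 - Z0.mean(0))
          + (Z1 - Z1.mean(0)).T @ (Z1 - Z1.mean(0)))
    w = np.linalg.pinv(Sw) @ (Z1.mean(0) - Z0.mean(0))
    return Z @ w  # one discriminative feature per sample
```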
“…As for learning from data streams, some algorithms are naturally incremental or can be easily extended to an incremental version, including k-NN, the naive Bayes classifier, binary linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), and so on. In addition, incremental/online versions of more sophisticated algorithms have been proposed in the literature, including but not limited to decision trees [46], random forests [41], [11], multi-class LDA [34], [28], logistic regression [31], support vector machines [32], [5], and other kernel methods [29], [25]. Besides the base learning algorithms, online versions of the ensemble learning techniques bagging and boosting were also derived in [36] by approximating the binomial distribution with a Poisson distribution.…”
Section: Related Work
confidence: 99%
“…Furthermore, many well-optimised algorithms [34] can be readily applied to large data. Finally, its adaptable solution supports efficient model updates for incremental learning [28]. The new HER model casts re-id into such a regression problem, benefiting from all of its advantages in scalability.…”
Section: Related Work
confidence: 99%
“…However, this work is the first to formulate it for a verification setting as in re-id. For its incremental extension, our model, HER+, differs significantly from [28], which only supports updates on a single sample without employing regularisation.…”
Section: Related Work
confidence: 99%