2011 Conference Record of the Forty-Fifth Asilomar Conference on Signals, Systems and Computers (ASILOMAR)
DOI: 10.1109/acssc.2011.6190349
Shrinkage Fisher information embedding of high dimensional feature distributions

Abstract: In this paper, we introduce a dimensionality reduction method that can be applied to clustering of high-dimensional empirical distributions. The proposed approach is based on a stabilized information-geometric representation of the feature distributions. The problem of dimensionality reduction on spaces of distribution functions arises in many applications, including hyperspectral imaging, document clustering, and classifying flow cytometry data. Our method is a shrinkage-regularized version of Fisher …
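The abstract is truncated here, so the following is not the authors' shrinkage Fisher embedding itself but a minimal sketch of the general pipeline the abstract describes: treat each data object as an empirical distribution (a histogram), compute pairwise distances between distributions, and embed those distances into a low-dimensional space suitable for clustering. The Hellinger distance and classical MDS used below are illustrative stand-ins, not details taken from the paper.

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete probability distributions."""
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def mds_embed(D, dim=2):
    """Classical MDS: embed a symmetric distance matrix into `dim` dimensions."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    B = -0.5 * J @ (D ** 2) @ J               # double-centered Gram matrix
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:dim]           # top eigenpairs
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0))

# Toy data: 20 histograms (empirical distributions) from two clusters,
# one concentrated on the first five bins, the other on the last five.
rng = np.random.default_rng(0)
hists = np.vstack([
    rng.dirichlet(np.r_[np.full(5, 8.0), np.full(5, 1.0)], 10),
    rng.dirichlet(np.r_[np.full(5, 1.0), np.full(5, 8.0)], 10),
])
n = len(hists)
D = np.array([[hellinger(hists[i], hists[j]) for j in range(n)]
              for i in range(n)])
Y = mds_embed(D, dim=2)   # low-dimensional coordinates for clustering
```

In this toy setup the two groups of histograms are well separated under the Hellinger distance, so the 2-D embedding preserves the cluster structure; the paper's contribution is a shrinkage-stabilized, Fisher-information-based alternative to the plain distance matrix used here.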

Year Published: 2015

Cited by 1 publication (1 citation statement). References 11 publications.
“…The first one is nonlinear approaches, for example, Isomap embedding (Isomap) [5], local tangent space alignment (LTSA) [6], Laplacian eigenmaps (LE) [7], local linear embedding (LLE) [8], and so forth. The other one is linear approaches, for example, principal component analysis (PCA), linear discriminant analysis (LDA), random projection (RP) [9], Locality Preserving Projection (LPP) [10], and so forth.…”
Section: Introduction
confidence: 99%
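Two of the linear methods named in the citing passage, PCA and random projection, can be sketched in a few lines of NumPy. This is a generic illustration of the two techniques, not code from the cited works; the dimensions and data are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 50))   # 100 samples, 50-dimensional features
k = 10                           # target dimensionality

# PCA via SVD: center the data, then project onto the top-k
# right singular vectors (the principal directions).
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
X_pca = Xc @ Vt[:k].T            # (100, k) scores; columns are uncorrelated

# Random projection: a Gaussian matrix scaled by 1/sqrt(k), which
# approximately preserves pairwise distances (Johnson-Lindenstrauss).
R = rng.normal(size=(50, k)) / np.sqrt(k)
X_rp = X @ R                     # (100, k)
```

PCA picks the directions of maximal variance and requires a pass over the data, while random projection is data-independent and cheaper; the nonlinear methods listed alongside them (Isomap, LTSA, LE, LLE) instead build embeddings from local neighborhood graphs.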