2002
DOI: 10.1109/72.977295
The MCA EXIN neuron for the minor component analysis

Abstract: Minor component analysis (MCA) deals with the recovery of the eigenvector associated with the smallest eigenvalue of the autocorrelation matrix of the input data and is a very important tool for signal processing and data analysis. It is almost exclusively solved by linear neurons. This paper presents a linear neuron endowed with a novel learning law, called MCA EXIN, and analyzes its features. The neural literature about MCA is very poor, in the sense that both a little theoretical basis is given (almost al…
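To make the problem statement concrete, here is a small NumPy sketch of what MCA computes in batch form: the eigenvector associated with the smallest eigenvalue of the data autocorrelation matrix. This is an illustration of the problem only, not the paper's neural method; the synthetic data model and dimensions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic zero-mean data: 3-D samples whose weakest direction is the third axis.
n, d = 5000, 3
A = np.diag([3.0, 2.0, 0.1])        # per-axis standard deviations (illustrative)
X = rng.standard_normal((n, d)) @ A

# Sample autocorrelation matrix of the input data.
R = X.T @ X / n

# Minor component: eigenvector of the smallest eigenvalue.
eigvals, eigvecs = np.linalg.eigh(R)  # eigenvalues in ascending order
minor = eigvecs[:, 0]
```

With this setup, `minor` aligns closely with the third coordinate axis, the direction of least variance. Neural MCA methods estimate this vector online instead of forming `R` explicitly.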

Cited by 124 publications (46 citation statements)
References 48 publications
“…Well-known tools for computing the principal and minor subspace of a data matrix are Oja's rule [1,2] and several variations of it [3,4,5,6]. Many other methods that involve Rayleigh and inverse Rayleigh quotients [7,8,9] are derived for performing principal subspace analysis (PSA), minor subspace analysis (MSA), and principal and minor component analysis (PCA and MCA, respectively).…”
Section: Introduction
confidence: 99%
“…This paper reviews single-neuron MCA rules. The review is based on the study by Cirrincione et al.,[2] where the rules OJAn,[3] LUO,[4,5,6] MCA-EXIN (here EXIN for short),[2] OJA+,[8] and FENG…”
Section: Introduction
confidence: 99%
“…MCA has several applications in adaptive signal processing, parameter estimation, and computer vision. [1,2] In neural network techniques for MCA, estimates of the minor eigenvectors are updated by sequentially processing data vectors drawn from the distribution. Neural MCA methods are advantageous for high-dimensional data, since they avoid the computation of the covariance matrix, and are suitable for the tracking of non-stationary distributions.…”
Section: Introduction
confidence: 99%
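The excerpt above describes the sequential, covariance-free style of neural MCA. As a generic illustration only (this is not the MCA EXIN learning law, nor any specific rule from the cited papers), a minimal anti-Hebbian update with renormalization acts as stochastic power iteration on I − ηR and drifts toward the minor direction; the step size and data model below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stream of zero-mean samples whose weakest direction is the third axis (illustrative).
A = np.diag([3.0, 2.0, 0.1])
def sample():
    return A @ rng.standard_normal(3)

# Hypothetical minimal online MCA sketch (NOT the MCA EXIN law):
# anti-Hebbian update w <- w - eta * y * x, followed by renormalization.
# In expectation this is power iteration on (I - eta * R), whose dominant
# eigenvector is the minor eigenvector of R for a small enough eta.
w = rng.standard_normal(3)
w /= np.linalg.norm(w)
eta = 0.01
for _ in range(20_000):
    x = sample()
    y = w @ x               # neuron output
    w -= eta * y * x        # anti-Hebbian step
    w /= np.linalg.norm(w)  # keep the weight vector on the unit sphere
```

Note that each sample is processed once and discarded: no covariance matrix is ever formed, which is the advantage for high-dimensional or non-stationary data that the excerpt points out.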