DOI: 10.1007/978-0-387-69319-4_6

Sparse Component Analysis: a New Tool for Data Mining

Abstract: In many practical data-mining problems, the data X under consideration (given as an (m × N)-matrix) is of the form X = AS, where the matrices A and S, with dimensions m × n and n × N respectively (often called the mixing matrix or dictionary and the source matrix), are unknown (m < n < N). We formulate conditions (SCA-conditions) under which we can recover A and S uniquely (up to scaling and permutation), such that S is sparse in the sense that each column of S has at least one zero element. We call this the Sparse Component Analysis…
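The data model in the abstract can be illustrated with a small numerical sketch. The dimensions (m=3, n=5, N=8) and the zero pattern below are illustrative assumptions, not values from the paper; the per-column sparsity level used here is stronger than the abstract's minimal "at least one zero" condition:

```python
import numpy as np

# Sketch of the SCA data model X = A S, assuming m < n < N.
# All dimensions are illustrative choices, not from the paper.
rng = np.random.default_rng(0)
m, n, N = 3, 5, 8

A = rng.standard_normal((m, n))   # mixing matrix / dictionary (m x n)
S = rng.standard_normal((n, N))   # source matrix (n x N)

# Enforce sparsity column-wise: here we zero out n - m + 1 entries per
# column, a stronger level often assumed for identifiability; the
# abstract's weaker condition requires only one zero per column.
for j in range(N):
    zero_rows = rng.choice(n, size=n - m + 1, replace=False)
    S[zero_rows, j] = 0.0

X = A @ S                         # observed data (m x N)

print(X.shape)                                        # (3, 8)
print(all((S[:, j] == 0).any() for j in range(N)))    # True
```

With only m = 3 observed mixtures of n = 5 sources, the factorization is underdetermined; it is the sparsity of S that the SCA-conditions exploit to make A and S recoverable up to scaling and permutation.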

Cited by 34 publications (31 citation statements); references 27 publications.
“…LDSTM analysis identified thirty-nine well-localized spatial components comprising cortical (18), subcortical (2), and cerebellar (19) regions. Cortical and subcortical spatial components (the a_k's) are shown in Fig.…”
Section: LDSTM Results
confidence: 99%
“…Local dimension-reduced modelling (LDSTM) as presented here addresses an approach to source estimation and localization in resting state fMRI data analysis that dispenses with artificial stochastic model assumptions, such as those used in classical blind source separation (principal component analysis (PCA), independent component analysis (ICA) and non-negative matrix factorization (NMF) [3,18,19,21]). In addition to being sparse, the columns of the observation matrix act as point-spreading functions that allow system sources and their observation matrix to be identified via LSCA [32] of the whole fMRI dataset.…”
Section: Discussion
confidence: 99%
“…Within the signal processing community, besides independence, sparsity is another commonly imposed assumption that arises from the principle of parsimony (Babaie-Zadeh et al., 2006; Bronstein et al., 2005; Chen et al., 1998; Georgiev et al., 2007; Mairal et al., 2010; Wright et al., 2010). For neural coding, the sparseness means that the number of activated neurons is much less than the total number of neurons in the population in one time period.…”
Section: Introduction
confidence: 98%
“…The assumption underlying sparse representation is that each voxel's fMRI signal is linearly composed of sparse components. Georgiev et al. (2007) proposed a sparse component analysis algorithm to identify the sources in fMRI data without the independence assumption. Li et al. (2009, 2012) applied sparse representation for detection of voxels in fMRI data with task-relevant information (Li et al., 2009) and proposed a sparse representation-based multivariate pattern analysis algorithm to localize brain activation patterns corresponding to different stimulus classes/brain states (Li et al., 2012).…”
Section: Introduction
confidence: 99%
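The sparse-representation assumption quoted above (each signal is a linear combination of a few dictionary atoms) can be sketched with a simple orthogonal matching pursuit in NumPy. OMP is one standard recovery method, not necessarily the algorithm used in the cited papers, and the dictionary size and sparsity level below are illustrative assumptions:

```python
import numpy as np

def omp(D, x, k):
    """Greedily select k atoms of dictionary D that best explain x."""
    residual = x.copy()
    support = []
    for _ in range(k):
        # pick the atom most correlated with the current residual
        idx = int(np.argmax(np.abs(D.T @ residual)))
        support.append(idx)
        # least-squares fit on the selected atoms, then update residual
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coef
    s = np.zeros(D.shape[1])
    s[support] = coef
    return s

rng = np.random.default_rng(1)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)       # unit-norm atoms
s_true = np.zeros(50)
s_true[[3, 17, 41]] = [1.5, -2.0, 0.7]
x = D @ s_true                        # signal = sparse mix of 3 atoms

s_hat = omp(D, x, k=3)
print(np.nonzero(s_hat)[0])           # indices of the recovered atoms
```

In the noiseless, well-conditioned regime sketched here, greedy pursuit typically identifies the true support; with noise or coherent dictionaries, convex relaxations such as basis pursuit are common alternatives.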
“…Various approaches have been proposed to cope with these drawbacks by considering sparse suboptimal solutions to KPCA; see, e.g., [1,9], [19,Chapter 14], and the references therein. For other methods that employ sparsity constraints in data mining applications, such as sparse component analysis, see, e.g., [6,7,16].…”
confidence: 99%