2017
DOI: 10.1016/j.neucom.2017.01.045

Graph regularized multilayer concept factorization for data representation

Cited by 32 publications (22 citation statements)
References 32 publications
“…Several methods of NMF are discussed here, including: semi-supervised constrained NMF [19], semi-supervised graph-based discriminative NMF [20], a Bayesian learning approach that reduces the generalization error upper bound using NMF [21] and its update rules [22], sparse NMF, which provides a better characterization of the features [23], sparse unmixing NMF [24], locally weighted sparse graph-regularized NMF [25], graph-regularized NMF [26], graph dual regularization [27], multiple graph-regularized NMF [28], graph-regularized multilayer NMF [29], adaptive graph-regularized NMF [30], hyper-graph-regularized NMF [31], graph regularization with sparse NMF [32], multi-view NMF [33], extended incremental NMF [34], incremental orthogonal projective NMF [35], correntropy-induced-metric NMF [36], multi-view NMF [37], patch-based NMF [38], MMNMF [39], regularized NMF [40], and FR conjugate gradient NMF [41]. However, these methods fail to address the problems of non-orthogonality that arise from the nonnegative elements in NMF.…”
Section: Related Work (mentioning)
confidence: 99%
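Many of the graph-regularized variants listed in this excerpt share one core idea: augment the NMF reconstruction objective with a graph Laplacian penalty so that neighboring data points keep similar low-dimensional coefficients. The following Python/NumPy sketch shows GNMF-style multiplicative updates for such an objective; it is a minimal illustration assuming a precomputed nonnegative affinity matrix W over the data points, not code from any of the cited works.

import numpy as np

def gnmf(X, W, k, lam=1.0, n_iter=200, eps=1e-10, seed=0):
    # Illustrative sketch: factorize nonnegative X (m x n) as U @ V.T while
    # penalizing lam * tr(V.T @ L @ V), where L = D - W is the graph Laplacian
    # of the affinity matrix W (n x n) over the columns (data points) of X.
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, k))
    V = rng.random((n, k))
    D = np.diag(W.sum(axis=1))  # degree matrix of the affinity graph
    for _ in range(n_iter):
        # Multiplicative update for the basis matrix U
        U *= (X @ V) / (U @ (V.T @ V) + eps)
        # Multiplicative update for the coefficients V; the graph penalty
        # adds lam*W@V to the numerator and lam*D@V to the denominator
        V *= (X.T @ U + lam * (W @ V)) / (V @ (U.T @ U) + lam * (D @ V) + eps)
    return U, V

With lam = 0 these reduce to the standard NMF multiplicative updates, which makes the effect of the graph term easy to isolate in experiments.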
“…Nonnegative matrix factorization with local similarity learning (KLS-NMF) [44] introduced a self-expressiveness mechanism into matrix factorization to learn the local similarity of the data in the kernel space. Chen et al. [49] performed multilayer concept factorization by exploiting the local structure. Qian et al. [50] combined sparse graph learning with NMF to solve for the feature matrix.…”
Section: Introduction (mentioning)
confidence: 99%
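For context, concept factorization (CF) differs from NMF in that each basis vector is itself a nonnegative combination of the data points, X ≈ X @ W @ V.T, which is the building block that the multilayer and graph-regularized CF variants mentioned above extend. Below is a minimal sketch of the standard single-layer CF multiplicative updates in Python/NumPy; the function name and parameters are illustrative assumptions, and the multilayer or locally regularized extensions are not shown.

import numpy as np

def concept_factorization(X, k, n_iter=200, eps=1e-10, seed=0):
    # Illustrative sketch: factorize nonnegative X (m x n) as X @ W @ V.T,
    # where W (n x k) mixes data points into concept vectors and V (n x k)
    # holds the new representation of each data point.
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    W = rng.random((n, k))
    V = rng.random((n, k))
    K = X.T @ X  # only inner products of the data points are needed
    for _ in range(n_iter):
        KW = K @ W
        # Multiplicative update for the concept-mixing matrix W
        W *= (K @ V) / (KW @ (V.T @ V) + eps)
        KW = K @ W
        # Multiplicative update for the representation matrix V
        V *= KW / (V @ (W.T @ KW) + eps)
    return W, V

Because the updates depend on the data only through K = X.T @ X, the same scheme carries over to kernelized settings such as the local similarity learning in the kernel space mentioned in the excerpt.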
“…Therefore, how to extract effective information from high-dimensional data becomes very important. In recent years, data representation has played an important role in pattern recognition and image processing [1]-[3]. A suitable data representation helps reveal the underlying structure of the data, which facilitates subsequent processing.…”
Section: Introduction (mentioning)
confidence: 99%