2016
DOI: 10.1109/jstsp.2016.2555239
Abstract: Mining useful clusters from high-dimensional data has received significant attention from the computer vision and pattern recognition community in recent years. Linear and non-linear dimensionality reduction have played an important role in overcoming the curse of dimensionality. However, such methods are often accompanied by three different problems: high computational complexity (usually associated with nuclear norm minimization), non-convexity (for matrix factorization methods) and susceptibility to gr…

Cited by 106 publications (99 citation statements); references 32 publications.
“…This includes signal analysis concepts such as Fourier transforms [32], filtering [30], [33], wavelets [34], [35], filterbanks [36], [37], multiresolution analysis [38]–[40], denoising [41], [42], dictionary learning [43], [44], and stationary signal analysis [45], [46]. Spectral clustering and principal component analysis approaches based on graph signal filtering have also been proposed recently [47], [48]. Several approaches have been proposed for learning the graph directly from data [49], [50].…”
Section: Preliminaries on Graph Signal Processing
confidence: 99%
“…This work is also related to the matrix factorization proposed by Shahid et al. [22], where the graph Laplacians of both the features and the observations regularize the decomposition of a dataset into a low-rank matrix and a sparse matrix representing noise. The observations are then clustered using k-means on the low-dimensional principal components of the smooth low-rank matrix.…”
Section: A Related Work
confidence: 99%
“…(3) [23, 24] incorporate structural knowledge into RPCA by adding spectral graph regularisation. Given the graph Laplacian Φ of each data similarity graph, Robust PCA on Graphs (RPCAG) and Fast Robust PCA on Graphs (FRPCAG) add an additional tr(LΦLᵀ) term to the PCP objective for the low-rank component L. The main drawback of the above-mentioned models is that the side information needs to be accurate and noiseless, which is not trivial in practical scenarios.…”
Section: Related Work
confidence: 99%
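The spectral graph regulariser described in the statement above can be illustrated numerically. The sketch below (an assumption-laden toy, not the authors' implementation) evaluates the tr(LΦLᵀ) term for a small path-graph Laplacian, showing that it vanishes for a signal that is constant across the graph and grows for one that oscillates across edges:

```python
import numpy as np

def graph_regulariser(L, Phi):
    """tr(L @ Phi @ L.T): the spectral graph penalty on the low-rank
    component L, given a graph Laplacian Phi. Smooth rows of L (slowly
    varying across graph edges) incur a small penalty."""
    return np.trace(L @ Phi @ L.T)

# Toy Laplacian of the path graph 1-2-3.
Phi = np.array([[ 1., -1.,  0.],
                [-1.,  2., -1.],
                [ 0., -1.,  1.]])

L_smooth = np.array([[1.,  1., 1.]])  # constant across the graph
L_rough  = np.array([[1., -1., 1.]])  # oscillates across both edges

print(graph_regulariser(L_smooth, Phi))  # 0.0
print(graph_regulariser(L_rough, Phi))   # 8.0
```

This is why adding tr(LΦLᵀ) to the PCP objective pushes the recovered low-rank component to respect the data similarity graph, and also why noisy side information (an inaccurate Φ) degrades the recovery, as the statement notes.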
“…The tolerance threshold for RPCAG and FRPCAG is set to 10⁻⁷ in all cases for consistency. We choose λ = 1/max(n₁, n₂) for a general matrix of dimension n₁ × n₂, as suggested in [23, 24]. For the simulation experiments, γ in RPCAG is given by the minimiser (at γ = 0.2) of ‖L − L₀‖_F / ‖L₀‖_F on the benchmark problem (Figure 7).…”
Section: A Parameter Calibration
confidence: 99%
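The calibration quantities in the statement above are simple to compute. A minimal sketch, using the λ rule and the relative recovery error exactly as written in that passage (the matrices here are illustrative placeholders, not the benchmark data):

```python
import numpy as np

def default_lambda(n1, n2):
    """Regularisation weight for an n1 x n2 matrix, per the rule quoted above."""
    return 1.0 / max(n1, n2)

def relative_error(L, L0):
    """||L - L0||_F / ||L0||_F: the criterion minimised when selecting gamma."""
    return np.linalg.norm(L - L0, "fro") / np.linalg.norm(L0, "fro")

print(default_lambda(100, 50))  # 0.01

L0 = np.ones((2, 2))                       # toy ground-truth low-rank matrix
L = np.array([[1.0, 1.0], [1.0, 0.0]])     # toy recovered matrix
print(relative_error(L, L0))               # 0.5
```

In the experiment described, γ would be swept over a grid and the value minimising this relative error on the benchmark (reported as γ = 0.2) retained.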