2017
DOI: 10.1007/978-3-319-59876-5_42

Fast Spectral Clustering Using Autoencoders and Landmarks

Abstract: In this paper, we introduce an algorithm for performing spectral clustering efficiently. Spectral clustering is a powerful clustering algorithm that suffers from high computational complexity, due to eigen decomposition. In this work, we first build the adjacency matrix of the corresponding graph of the dataset. To build this matrix, we only consider a limited number of points, called landmarks, and compute the similarity of all data points with the landmarks. Then, we present a definition of the Lap…
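As a rough illustration of the landmark idea sketched in the abstract, the snippet below builds a point-to-landmark similarity matrix. The choice of k-means centroids as landmarks, the Gaussian kernel, and all names are illustrative assumptions for a minimal sketch, not details taken from the paper.

```python
# Minimal sketch of landmark-based affinity construction: similarities are
# computed only between data points and a small set of landmarks, instead of
# between all pairs of points. Landmark selection via k-means and the Gaussian
# kernel are assumptions for illustration only.
import numpy as np
from sklearn.cluster import KMeans

def landmark_affinity(X, n_landmarks=100, sigma=1.0, seed=0):
    """Return the n_samples x n_landmarks similarity matrix Z."""
    # Pick landmarks as k-means centroids (random sampling would also work).
    km = KMeans(n_clusters=n_landmarks, n_init=10, random_state=seed).fit(X)
    landmarks = km.cluster_centers_
    # Squared Euclidean distances between every point and every landmark.
    d2 = ((X[:, None, :] - landmarks[None, :, :]) ** 2).sum(axis=-1)
    # Gaussian similarity; rows can later be normalized to form a sparse graph.
    Z = np.exp(-d2 / (2.0 * sigma ** 2))
    return Z
```

Because Z has only n_samples x n_landmarks entries, downstream spectral steps scale with the number of landmarks rather than with the full dataset size.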

Cited by 13 publications (14 citation statements)
References 12 publications (10 reference statements)
“…There exist many different improvements over the basic spectral clustering explained above. Some of these developments are distributed spectral clustering (Chen et al, 2010), consistency spectral clustering (Von Luxburg et al, 2008), correctional spectral clustering (Blaschko & Lampert, 2008), spectral clustering by autoencoder (Banijamali & Ghodsi, 2017), multi-view spectral clustering (Kumar & Daumé, 2011), self-tuning spectral clustering (Zelnik-Manor & Perona, 2004), and fuzzy spectral clustering. Some existing surveys and tutorials on spectral clustering are (Von Luxburg, 2007; Guan-zhong & Li-bin, 2008; Nascimento & De Carvalho, 2011; Guo et al, 2012).…”
Section: Other Improvements Over Spectral Clustering (mentioning, confidence: 99%)
“…Aledhari et al [1] proposed a deep-learning-based method to minimize a large genomic DNA dataset for transmission over the internet. Banijamali et al [6] integrated the recent deep auto-encoder technique into landmark-based spectral clustering.…”
Section: Related Work (mentioning, confidence: 99%)
“…Note that these scenarios are not different ways of splitting and distributing the data for fast computation; rather, each should be viewed as one type of distributed setting: in D1, data at different sites have roughly disjoint supports; in D2, data at different sites have some overlap in their supports; and in D3, individual sites have similar data distributions. A total of 40000 data points are generated from the Gaussian mixture (6), and the number of representative points is 1000 (i.e., the data compression ratio is 40:1). The number of data points at Sites 1 and 2, as well as the number of representative points per site, can be calculated from the site specification and the data compression ratio accordingly.…”
Section: Synthetic Data (mentioning, confidence: 99%)
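The compression-ratio arithmetic quoted above is simple enough to spell out. The toy sketch below reproduces the 40000-to-1000 reduction; the per-site split fractions are hypothetical and only show how per-site counts would follow from the ratio, not how the cited experiment actually splits the data.

```python
# Toy arithmetic for the quoted setting: 40000 points compressed to 1000
# representative points (a 40:1 ratio), split across two sites.
# The site fractions below are illustrative assumptions only.
n_points = 40_000
n_representatives = 1_000
compression_ratio = n_points // n_representatives   # 40

site_fractions = {"site_1": 0.6, "site_2": 0.4}      # hypothetical split
for site, frac in site_fractions.items():
    n_local = int(n_points * frac)
    n_local_reps = n_local // compression_ratio
    print(f"{site}: {n_local} points -> {n_local_reps} representatives")
```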
“…number and width of layers), which forces the network to obtain a different representation of the data while keeping the important information. However, most of the recent research on deep clustering [35,36,[38][39][40] proposes a different structure for each studied dataset, and the clustering efficiency of the proposed techniques depends strongly on a specific DAE structure.…”
Section: Deep Autoencoder: Challenges and Issues (mentioning, confidence: 99%)
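To make the point about layer count and width concrete, here is a minimal fully connected deep autoencoder sketch in PyTorch. The specific widths (512, 128) and code dimension are illustrative assumptions; none of the cited works prescribes this structure, and in practice these choices are tuned per dataset, which is exactly the dependence the statement above criticizes.

```python
# Minimal fully connected deep autoencoder. Layer widths are illustrative
# assumptions; changing hidden_dims or code_dim changes the representation
# the network learns, which is the design sensitivity discussed above.
import torch
import torch.nn as nn

class DeepAutoencoder(nn.Module):
    def __init__(self, input_dim, hidden_dims=(512, 128), code_dim=10):
        super().__init__()
        dims = [input_dim, *hidden_dims, code_dim]
        # Encoder: progressively narrower layers down to the code dimension.
        enc = []
        for d_in, d_out in zip(dims[:-1], dims[1:]):
            enc += [nn.Linear(d_in, d_out), nn.ReLU()]
        self.encoder = nn.Sequential(*enc[:-1])  # no activation on the code
        # Decoder: mirror image of the encoder.
        rev = dims[::-1]
        dec = []
        for d_in, d_out in zip(rev[:-1], rev[1:]):
            dec += [nn.Linear(d_in, d_out), nn.ReLU()]
        self.decoder = nn.Sequential(*dec[:-1])  # no activation on the output

    def forward(self, x):
        code = self.encoder(x)
        return self.decoder(code), code
```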