2019
DOI: 10.1007/978-3-030-30490-4_51

Improving Deep Image Clustering with Spatial Transformer Layers

Abstract: Image clustering is an important but challenging task in machine learning. As in most image processing areas, the latest improvements have come from models based on deep learning. However, classical deep learning methods struggle with spatial image transformations such as scale and rotation. In this paper, we propose the use of visual attention techniques to mitigate this problem in image clustering methods. We evaluate the combination of a deep image clustering model called Deep Adaptive Clust…
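The paper's architecture is not fully specified in this abstract, but the spatial transformer layer it builds on (Jaderberg et al., 2015) has a well-known core operation: an affine sampling grid followed by differentiable bilinear sampling, which lets a network undo scale and rotation before clustering features are extracted. The sketch below illustrates only that warping step in NumPy; the function names (`affine_grid`, `bilinear_sample`, `spatial_transform`) are chosen for illustration and in the actual STN the matrix `theta` would be predicted by a small localisation network.

```python
import numpy as np

def affine_grid(theta, H, W):
    # Build an output sampling grid in normalized coords [-1, 1]
    # and map it through the 2x3 affine matrix theta, as in an STN.
    ys, xs = np.meshgrid(
        np.linspace(-1, 1, H), np.linspace(-1, 1, W), indexing="ij")
    coords = np.stack([xs.ravel(), ys.ravel(), np.ones(H * W)])  # (3, H*W)
    src = theta @ coords                                         # (2, H*W)
    return src[0].reshape(H, W), src[1].reshape(H, W)

def bilinear_sample(img, xs, ys):
    H, W = img.shape
    # Map normalized source coords back to pixel indices.
    px = (xs + 1) * (W - 1) / 2
    py = (ys + 1) * (H - 1) / 2
    x0 = np.clip(np.floor(px).astype(int), 0, W - 2)
    y0 = np.clip(np.floor(py).astype(int), 0, H - 2)
    dx, dy = px - x0, py - y0
    # Weighted average of the four neighbouring pixels.
    out = (img[y0, x0] * (1 - dx) * (1 - dy)
           + img[y0, x0 + 1] * dx * (1 - dy)
           + img[y0 + 1, x0] * (1 - dx) * dy
           + img[y0 + 1, x0 + 1] * dx * dy)
    # Zero out samples that fell outside the input image.
    out[(np.abs(xs) > 1) | (np.abs(ys) > 1)] = 0.0
    return out

def spatial_transform(img, theta):
    # One forward pass of the STN sampling step: grid + sampling.
    H, W = img.shape
    xs, ys = affine_grid(theta, H, W)
    return bilinear_sample(img, xs, ys)

# The identity transform leaves the image unchanged;
# other affine thetas produce rotated/scaled/flipped views.
img = np.arange(16, dtype=float).reshape(4, 4)
identity = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
warped = spatial_transform(img, identity)
```

Because every step (grid generation and bilinear weighting) is differentiable in `theta`, the layer can be trained jointly with a clustering objective such as Deep Adaptive Clustering, which is the combination the abstract evaluates.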

Cited by 3 publications (1 citation statement)
References 9 publications
“…Attention mechanisms were first used in Natural Language Processing (NLP) (Bahdanau et al, 2014;Vaswani et al, 2017) then in computer vision (Guan et al, 2018;Jaderberg et al, 2015;Woo et al, 2018). Some methods have been proposed for unsupervised learning, but mainly on specific cases, like graph clustering (Wang et al, 2019) or the use of spatial attention (Souza and Zanchettin, 2019). However, a few works have also been conducted for time series clustering.…”
Section: Attention Mechanism
confidence: 99%