2020
DOI: 10.1007/978-3-030-58586-0_45
Mitigating Embedding and Class Assignment Mismatch in Unsupervised Image Classification

Cited by 25 publications (26 citation statements)
References 18 publications
“…It is important to note that certain deep image clustering methods report superior performance to our method [21,15,24,16]. However, most of these approaches are multi-stage pipelines that combine an initialization scheme, multiple losses, and a fine-tuning step.…”
Section: Clustering Image Datasets
confidence: 85%
“…k-means [7] can also be viewed as a special case of a GMM in which each cluster is represented as an untilted (isotropic) sphere. Since these traditional clustering algorithms rely on hand-designed parameters and therefore have limited representability, there have been efforts to incorporate neural networks to model more expressive clusters [4,12,13,14,15,16].…”
Section: Related Work
confidence: 99%
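The k-means-as-spherical-GMM remark above can be made concrete with a minimal sketch: k-means is a GMM with equal mixture weights, a shared isotropic covariance, and hard (argmax) assignments in place of soft responsibilities. The function name `kmeans`, the `init_idx` parameter, and the toy two-blob data are illustrative choices, not from the cited papers.

```python
import numpy as np

def kmeans(X, k, iters=20, init_idx=None):
    """Plain k-means. Equivalent to fitting a GMM whose components are
    spherical ("untilted") with a shared radius and equal weights, with
    soft E-step responsibilities replaced by hard argmax assignments."""
    rng = np.random.default_rng(0)
    idx = init_idx if init_idx is not None else rng.choice(len(X), k, replace=False)
    centers = X[np.asarray(idx)].astype(float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # hard E-step: assign each point to its nearest center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # M-step: each center becomes the mean of its assigned points
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return labels, centers

# two tight, well-separated blobs; initialize one center inside each blob
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (50, 2)),
               rng.normal(5.0, 0.1, (50, 2))])
labels, centers = kmeans(X, 2, init_idx=[0, 99])
```

With this init the two blobs separate cleanly; allowing a full covariance matrix per component (a general GMM) would instead recover tilted ellipsoidal clusters.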
“…As a solution, in this paper we introduce Elsa (Energy-based learning for semi-supervised anomaly detection), a novel anomaly detection method that unifies contrastive learning and energy-based functions. Our model benefits from the high representation power of unsupervised contrastive learning via its pre-training step, which can accommodate existing algorithms [6,14,15]. It then applies a carefully designed energy function over the pre-trained embedding to learn the probability distribution p(x) of normal data, with the help of a small set of labels indicating whether given samples are normal or OOD.…”
Section: Introduction
confidence: 99%
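The energy-function idea in the quote above can be sketched in its generic form. This is the standard logsumexp energy score used in energy-based OOD detection, not Elsa's actual carefully designed energy function; the name `energy_score` and the temperature parameter `T` are assumptions for illustration.

```python
import numpy as np

def energy_score(logits, T=1.0):
    """Generic energy score E(x) = -T * logsumexp(logits / T).
    Lower energy corresponds to higher (unnormalized) density p(x),
    so in-distribution samples should score lower than OOD samples."""
    z = np.asarray(logits, dtype=float) / T
    m = z.max(axis=-1, keepdims=True)          # shift for numerical stability
    return float(-T * (m.squeeze(-1) + np.log(np.exp(z - m).sum(axis=-1))))

confident = energy_score([10.0, 0.0, 0.0])   # peaked logits -> low energy
uncertain = energy_score([1.0, 1.0, 1.0])    # flat logits -> higher energy
```

A peaked logit vector (a sample the model recognizes) yields lower energy than a flat one, which is the property OOD scoring relies on.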
“…Entropy-based balancing has often been adopted to prevent degenerate solutions along with these kinds of objectives [16,21,41].…”
[Figure 1 (from the citing paper): illustration of the work's basic concept, a robust learning approach via clean-sample selection using pseudo-labels from an unsupervised clustering algorithm.]
Section: Introduction
confidence: 99%
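The entropy-based balancing mentioned in the last quote can be sketched as follows: penalize the negative entropy of the batch-mean cluster assignment, so the loss grows when all samples collapse into a single cluster. The function names `softmax` and `balance_penalty` and the toy batches are illustrative assumptions, not the formulation of any specific cited paper.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with the usual max-shift for stability."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def balance_penalty(logits, eps=1e-12):
    """Negative entropy of the batch-mean assignment, -H(mean_i p_i).
    Adding this term to a loss (to be minimized) pushes the average
    assignment toward uniform, ruling out the degenerate solution in
    which every sample lands in one cluster."""
    p_mean = softmax(np.asarray(logits, dtype=float)).mean(axis=0)
    return float((p_mean * np.log(p_mean + eps)).sum())

# a collapsed batch (all samples pushed into cluster 0) is penalized
# more heavily than a batch spread evenly over three clusters
collapsed = np.tile([5.0, 0.0, 0.0], (9, 1))
spread = np.vstack([np.eye(3) * 5.0] * 3)
```

Note the entropy is taken over the *mean* assignment, not per sample: confident per-sample assignments are still allowed, as long as the clusters stay balanced on average.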