Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence 2018
DOI: 10.24963/ijcai.2018/171

Centralized Ranking Loss with Weakly Supervised Localization for Fine-Grained Object Retrieval

Abstract: Fine-grained object retrieval has attracted extensive research focus recently. Its state-of-the-art schemes are typically based upon convolutional neural network (CNN) features. Despite the extensive progress, two issues remain open. On one hand, the deep features are coarsely extracted at image level rather than precisely at object level, and are thus disturbed by background clutter. On the other hand, training CNN features with a standard triplet loss is time consuming and incapable of learning discriminative f…
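As context for the triplet-loss critique in the abstract, below is a minimal PyTorch sketch of the standard triplet loss that the paper contrasts with its centralized ranking loss. The margin, batch size, and embedding dimension are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a standard triplet loss (illustrative values only).
import torch
import torch.nn.functional as F

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Hinge loss: the positive must be closer to the anchor than the
    negative by at least `margin` (squared Euclidean distances)."""
    d_pos = (anchor - positive).pow(2).sum(dim=1)
    d_neg = (anchor - negative).pow(2).sum(dim=1)
    return F.relu(d_pos - d_neg + margin).mean()

# Usage with random L2-normalized embeddings (batch of 32, 128-d).
a = F.normalize(torch.randn(32, 128), dim=1)
p = F.normalize(torch.randn(32, 128), dim=1)
n = F.normalize(torch.randn(32, 128), dim=1)
print(triplet_loss(a, p, n))
```

Because the loss is defined over triplets, training requires mining informative anchor-positive-negative combinations from each mini-batch, which is the main source of the training cost the abstract refers to.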

Cited by 51 publications (52 citation statements); references 15 publications. All 52 citation statements are classified as mentioning (0 supporting, 0 contrasting), published between 2019 and 2024.
“…We have conducted extensive experiments on two widely-used FGIR benchmarks, CUB-200-2011 and CARS196, with comparisons to a set of state-of-the-art methods. Quantitatively, it outperforms CRL-WSL (Zheng et al. 2018) by 12.0% on CARS196, and runs 5× faster in training than the triplet loss.…”
Section: Introduction (mentioning)
confidence: 94%
“…Fine-Grained Image Retrieval (FGIR). FGIR has attracted increasing research focus in recent years (Wei et al. 2017; Xie et al. 2015; Zhang et al. 2016; Zheng et al. 2018). It aims to differentiate subordinate classes, where the challenges are two-fold: 1) most classes are highly correlated and difficult to distinguish due to their subtle differences, i.e., small inter-class variance.…”
Section: Related Work (mentioning)
confidence: 99%
“…The clustering-based structured losses aim to learn a discriminative embedding space by optimizing a clustering metric, and are applied in many fields of computer vision such as face recognition [53,54] and fine-grained image retrieval (FGIR) [55,56]. Clustering loss [57] utilizes a structured prediction framework to realize clustering, scoring the ground-truth clustering higher than the alternatives.…”
Section: Clustering-based Structured Loss (mentioning)
confidence: 99%
“…The triple-center loss (TCL) [59] was proposed to learn a center for each category and to separate the cluster centers and their associated samples across different categories. To enhance FGIR performance, the centralized ranking loss (CRL) [55] was proposed to optimize the centers and to improve intra-class compactness and inter-class separability. Later, the decorrelated global-aware centralized loss (DGCRL) [56] was proposed to optimize the center space via a Gram-Schmidt independence operation and to enhance the clustering result by combining it with the softmax loss.…”
Section: Clustering-based Structured Loss (mentioning)
confidence: 99%
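To make the center-based idea in the statement above concrete, here is a minimal, hypothetical PyTorch sketch of a center-ranking loss in the spirit of CRL: each embedding is pushed closer to its own learnable class center than to any other center by a margin. This is an illustration of the concept, not the exact formulation of Zheng et al. (2018); the class name, margin, and dimensions are assumptions.

```python
# Hypothetical center-ranking loss sketch (not the paper's exact CRL).
import torch
import torch.nn as nn
import torch.nn.functional as F

class CenterRankingLoss(nn.Module):
    def __init__(self, num_classes, dim, margin=0.5):
        super().__init__()
        # One learnable center per fine-grained class.
        self.centers = nn.Parameter(torch.randn(num_classes, dim))
        self.margin = margin

    def forward(self, embeddings, labels):
        # Squared Euclidean distance from each embedding to every center: (B, C).
        dists = torch.cdist(embeddings, self.centers).pow(2)
        pos = dists.gather(1, labels.unsqueeze(1))            # own-center distance
        mask = F.one_hot(labels, self.centers.size(0)).bool() # own-class columns
        # Hinge: the own center must be closer than every other center by `margin`.
        hinge = F.relu(pos - dists + self.margin).masked_fill(mask, 0.0)
        return hinge.sum(dim=1).mean()

# Usage with random embeddings and labels (sizes are illustrative).
loss_fn = CenterRankingLoss(num_classes=200, dim=128)
emb = F.normalize(torch.randn(16, 128), dim=1)
lbl = torch.randint(0, 200, (16,))
print(loss_fn(emb, lbl))
```

Because the ranking is computed against class centers rather than sampled triplets, each sample contributes a loss term without triplet mining, which is the intuition behind the reported training speed-up of center-based losses over the standard triplet loss.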
“…To some extent, SCDA is not a deep learning method, because it only localizes salient regions with a pretrained CNN model. To address this limitation of unsupervised fine-grained image retrieval with pretrained models, CRL-WSL [24] proposes a unified architecture that jointly learns salient regions and meaningful deep descriptors from labeled data. Recently, DCL-NS [25] has obtained state-of-the-art performance on several benchmark datasets.…”
Section: B. Fine-Grained Image Retrieval (mentioning)
confidence: 99%
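The statement above contrasts SCDA-style localization with a pretrained CNN against the jointly trained CRL-WSL. As an illustration of the former idea only, the following sketch thresholds channel-summed activations of a backbone at their mean to obtain a foreground mask and pools a masked descriptor. The ResNet-50 backbone, mean threshold, and average pooling are assumptions for this demo, not the method of [24] or [25].

```python
# Illustrative activation-map localization sketch in the spirit of SCDA.
import torch
import torchvision.models as models

# weights=None keeps the demo offline; real use would load pretrained weights.
backbone = models.resnet50(weights=None)
features = torch.nn.Sequential(*list(backbone.children())[:-2]).eval()  # conv maps only

def masked_descriptor(images):
    with torch.no_grad():
        fmap = features(images)                      # (B, C, H, W)
    activation = fmap.sum(dim=1, keepdim=True)       # aggregate over channels
    # Foreground mask: locations whose aggregated activation exceeds the mean.
    mask = (activation > activation.mean(dim=(2, 3), keepdim=True)).float()
    # Average-pool descriptors over the masked (salient) locations only.
    pooled = (fmap * mask).sum(dim=(2, 3)) / mask.sum(dim=(2, 3)).clamp(min=1.0)
    return torch.nn.functional.normalize(pooled, dim=1)

print(masked_descriptor(torch.randn(2, 3, 224, 224)).shape)  # torch.Size([2, 2048])
```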