Proceedings of the 5th ACM on International Conference on Multimedia Retrieval 2015
DOI: 10.1145/2671188.2749336

Effective, Efficient, and Scalable Unsupervised Distance Learning in Image Retrieval Tasks

Abstract: Various unsupervised learning methods have been proposed with significant improvements in the effectiveness of image search systems. However, despite these effectiveness gains, such approaches commonly require high computational effort and do not properly address efficiency and scalability requirements. In this paper, we present a novel unsupervised learning approach for improving the effectiveness of image retrieval tasks. The proposed method is also scalable and efficient, as it exploits parallel and hete…

Cited by 14 publications (15 citation statements). References 41 publications.
“…The small number of images per class (only 4) makes this dataset a very challenging one for unsupervised learning algorithms. Despite this fact, the CPRR achieved high gains ranging from +5.08% to +17.72% and superior to a recent baseline [9] . Fig.…”
Section: Effectiveness Evaluation (mentioning)
confidence: 77%
“…In this work, two different depths are considered: L, which defines a broader neighborhood used by the rank normalization step; and k, which defines a local neighborhood used by the Cartesian product operations. Since both k and L are much smaller than n, a sparse matrix structure [9] can be used for storing the similarity scores.…”
Section: Rank Similarity Score (mentioning)
confidence: 99%
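To make the sparse-storage point in the excerpt above concrete, the sketch below shows one way to keep rank-based similarity scores when only the top-L neighbors of each image are retained, so memory grows with n·L rather than n². This is a minimal illustration under stated assumptions, not the paper's implementation: the names `ranked_lists` and `build_sparse_similarity` and the position-based weighting are hypothetical choices made for the example.

```python
# Minimal sketch of the sparse-storage idea: when only the top-L neighbors of
# each of the n images are kept, similarity scores fit in a sparse matrix with
# O(n * L) non-zeros instead of a dense n x n matrix.
# `ranked_lists` and the position-based weighting are illustrative assumptions,
# not the authors' actual formulation.

import numpy as np
from scipy.sparse import lil_matrix, csr_matrix

def build_sparse_similarity(ranked_lists, L):
    """ranked_lists[i] lists image indices ordered by similarity to image i;
    only the first L positions of each list are stored."""
    n = len(ranked_lists)
    W = lil_matrix((n, n), dtype=np.float32)
    for i, neighbors in enumerate(ranked_lists):
        for pos, j in enumerate(neighbors[:L]):
            # Simple rank-based score: earlier (closer) positions get larger weights.
            W[i, j] = L - pos
    return csr_matrix(W)  # compressed row format for fast neighborhood lookups

# Toy usage with 5 images and hypothetical ranked lists (in practice L << n):
ranked = [
    [0, 2, 1, 4, 3],
    [1, 0, 3, 2, 4],
    [2, 0, 1, 3, 4],
    [3, 4, 1, 0, 2],
    [4, 3, 0, 2, 1],
]
W = build_sparse_similarity(ranked, L=3)
print(W.nnz)        # 15 stored entries instead of 25
print(W.toarray())  # dense view, for inspection only
```

Because each row holds at most L entries, row-wise operations such as the rank normalization and Cartesian product steps mentioned in the excerpt only need to touch a small, bounded number of entries per image.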