2018
DOI: 10.1109/tip.2017.2749147

Hashing with Angular Reconstructive Embeddings

Abstract: Large-scale search methods are increasingly critical for many content-based visual analysis applications, among which hashing-based approximate nearest neighbor search techniques have attracted broad interest due to their high efficiency in storage and retrieval. However, existing hashing works are commonly designed for measuring data similarity by Euclidean distances. In this paper, we focus on the problem of learning compact binary codes using the cosine similarity. Specifically, we propose novel angul…
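As background for the angular setting the abstract describes, the classical way to hash under cosine similarity is random-hyperplane (SimHash) coding, where two vectors agree on a bit with probability 1 − θ/π for angle θ between them. The sketch below is this generic baseline, not the paper's ARE method; the function name and toy vectors are illustrative.

```python
import numpy as np

def cosine_hash(X, n_bits=16, seed=0):
    """Random-hyperplane (SimHash) binary codes: two vectors collide on a
    given bit with probability 1 - theta/pi, theta being their angle."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_bits))  # random hyperplane normals
    return (X @ W >= 0).astype(np.uint8)           # sign of projection -> {0,1}

# Two nearly-parallel vectors share most bits; an orthogonal one does not.
a = np.array([[1.0, 0.0, 0.0]])
b = np.array([[0.99, 0.14, 0.0]])   # small angle to a
c = np.array([[0.0, 1.0, 0.0]])     # 90 degrees from a
ha, hb, hc = (cosine_hash(v, n_bits=64) for v in (a, b, c))
print((ha != hb).sum(), (ha != hc).sum())  # few bits differ vs. roughly half
```

Because the seed is fixed, every call draws the same hyperplanes, so codes from separate calls are comparable.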

Cited by 94 publications (33 citation statements)
References 49 publications
“…Similarly, the ordinal relation preserving restriction for bitwise weights is re-defined as in Eq. (9).…”
Section: S(i,j) (mentioning)
confidence: 99%
“…Data-independent hashing, such as locality-sensitive hashing (LSH) [7], randomly generates hashing functions, and it typically requires a long binary code or multiple hash tables to achieve satisfactory performance. In contrast, data-dependent hashing algorithms, such as BDMFH [8] and ARE [9], utilize machine learning mechanisms to learn similarity-preserving binary codes. Bidirectional discrete matrix factorization hashing (BDMFH) [8] alternates two mutually promoting processes: learning binary codes from data and recovering data from the binary codes.…”
Section: Introduction (mentioning)
confidence: 99%
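The data-independent behavior this excerpt describes — random hash functions needing either long codes or multiple hash tables for good recall — can be sketched as a toy multi-table LSH index. This is a generic illustration under assumed parameters, not the implementation of LSH [7] or any cited system; `LSHIndex` and its parameter names are hypothetical.

```python
import numpy as np
from collections import defaultdict

class LSHIndex:
    """Toy multi-table LSH: L independent tables, each keyed by k
    random-hyperplane bits. Adding tables raises recall without
    lengthening any single code."""
    def __init__(self, dim, n_tables=8, n_bits=12, seed=0):
        rng = np.random.default_rng(seed)
        self.planes = [rng.standard_normal((dim, n_bits)) for _ in range(n_tables)]
        self.tables = [defaultdict(list) for _ in range(n_tables)]

    def _keys(self, x):
        # One bit-pattern key per table: signs of the random projections.
        return [tuple((x @ P >= 0).astype(int)) for P in self.planes]

    def add(self, idx, x):
        for table, key in zip(self.tables, self._keys(x)):
            table[key].append(idx)

    def query(self, x):
        # Union of matching buckets across tables = candidate set
        # for exact re-ranking.
        cands = set()
        for table, key in zip(self.tables, self._keys(x)):
            cands.update(table[key])
        return cands

rng = np.random.default_rng(1)
data = rng.standard_normal((1000, 32))
index = LSHIndex(dim=32, n_tables=8, n_bits=12)
for i, x in enumerate(data):
    index.add(i, x)
q = data[42] + 0.01 * rng.standard_normal(32)  # slightly perturbed copy of item 42
print(42 in index.query(q))
```

With 12 bits per table a single table may miss a true neighbor on one flipped bit; querying 8 tables makes a miss for this near-duplicate query vanishingly unlikely.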
“…Finally, the algorithm to learn parameters should be fast, and for unseen samples the hashing method should produce the hash codes efficiently. It is very challenging to simultaneously satisfy all three requirements, especially under the binary constraint, which leads to an NP-hard mixed-integer optimization. The hashing methods proposed in the literature can be categorized into data-independent (Gionis et al. (1999); Kulis and Grauman (2009); Raginsky and Lazebnik (2009)) and data-dependent methods; the latter have recently received more attention in both (semi-)supervised (Do et al. (2016b); Kulis and Darrell (2009); Lin et al. (2014); Liu et al. (2012); Norouzi et al. (2012); Shen et al. (2015); Cao et al. (2018); Jain et al. (2017); Liu et al. (2016); Lin et al. (2015); Lai et al. (2015); Lin et al. (2016)) and unsupervised (Carreira-Perpiñán and Raziperchikolaei (2015); Do et al. (2016a, 2017); Gong and Lazebnik (2011); He et al. (2013); Heo et al. (2012); Shen et al. (2018); Hu et al. (2018); Huang and Lin (2018); Y. Duan et al. (2018); Wang et al. (2018); Duan et al. (2017); En et al. (2017)) manners.…”
Section: Introduction (mentioning)
confidence: 99%
“…A solution to avoid an exhaustive comparison is to employ approximate NN (ANN) search algorithms. ANN algorithms aim at identifying images from a large archive that have a high probability of being NNs of the query image with sublinear, or even constant, time complexity [1]. Recently, hashing-based ANN search techniques have been used in RS due to their highly time-efficient (in terms of both storage and speed) and accurate search capability within huge data archives [2].…”
Section: Introduction (mentioning)
confidence: 99%
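The hashing-based ANN search this excerpt refers to ultimately reduces to ranking database codes by Hamming distance to the query code, which is cheap because packed binary codes compare via XOR plus popcount. A minimal sketch, not any cited system's pipeline; `hamming_rank` and the toy codes are illustrative.

```python
import numpy as np

def hamming_rank(query_code, db_codes):
    """Return database indices sorted by Hamming distance to the query.
    Codes are uint8 arrays of packed bits (np.packbits output)."""
    xor = np.bitwise_xor(db_codes, query_code)      # differing bits
    dists = np.unpackbits(xor, axis=1).sum(axis=1)  # popcount per row
    return np.argsort(dists, kind="stable"), dists

bits = np.array([[0, 1, 1, 0, 1, 0, 0, 1],
                 [0, 1, 1, 0, 1, 0, 1, 1],   # 1 bit away from the query
                 [1, 0, 0, 1, 0, 1, 1, 0]])  # complement: 8 bits away
codes = np.packbits(bits, axis=1)
query = codes[0]
order, dists = hamming_rank(query, codes)
print(order.tolist(), dists.tolist())  # [0, 1, 2] [0, 1, 8]
```

Packing 8 bits per byte gives the storage efficiency the excerpt mentions; the linear scan itself is sublinear only with further bucketing, which the multi-table indexing above provides.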