2016 New Trends in Signal Processing (NTSP) 2016
DOI: 10.1109/ntsp.2016.7747793

Hash function generation by neural network

Cited by 14 publications (10 citation statements) | References 2 publications
“…Most notably, there has been work on learning locality-sensitive hash (LSH) functions to build Approximate Nearest Neighbor (ANN) indexes. For example, [66,68,40] explore the use of neural networks as a hash function, whereas [69] even tries to preserve the order of the multi-dimensional input space. However, the general goal of LSH is to group similar items into buckets to support nearest-neighbor queries, usually by learning approximate similarity measures in a high-dimensional input space using some variant of the Hamming distance.…”
Section: Related Work
confidence: 99%
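The LSH idea described in the statement above can be illustrated with random-hyperplane hashing (SimHash): each output bit is the sign of a projection onto a random hyperplane, so similar vectors get Hamming-close codes. This is a minimal sketch, not the construction of any cited paper; the dimensions, seed, and function names are chosen for the example.

```python
import random

random.seed(0)

DIM, BITS = 8, 16  # input dimension and code length (illustrative values)

# One random hyperplane per output bit; the sign of the projection gives the bit.
planes = [[random.gauss(0.0, 1.0) for _ in range(DIM)] for _ in range(BITS)]

def lsh_hash(vec):
    """Random-hyperplane LSH: nearby vectors tend to share Hamming-close codes."""
    return tuple(1 if sum(p * x for p, x in zip(plane, vec)) >= 0 else 0
                 for plane in planes)

def hamming(a, b):
    """Hamming distance between two equal-length bit tuples."""
    return sum(x != y for x, y in zip(a, b))

v = [random.gauss(0.0, 1.0) for _ in range(DIM)]
near = [x + 0.01 * random.gauss(0.0, 1.0) for x in v]  # small perturbation of v
far = [random.gauss(0.0, 1.0) for _ in range(DIM)]      # unrelated vector

print("near:", hamming(lsh_hash(v), lsh_hash(near)),
      "far:", hamming(lsh_hash(v), lsh_hash(far)))
```

Because a 1% perturbation rarely flips the sign of a projection, the near vector typically shares almost all code bits with `v` and lands in the same or an adjacent bucket, which is exactly the grouping behaviour the quoted passage attributes to LSH.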
“…However, order-preserving hashing is unsuitable for sorting, since it does not provide fast enough training and inference times, and, to the best of our knowledge, no existing sorting algorithm uses order-preserving hashing. Similarly, locality-sensitive hashing [54,56,57] also cannot be used to sort a single numeric value, as we are concerned with sorting along a single dimension rather than with efficiently finding similar items in a multi-dimensional space. Finally, perfect hashing's objective is to avoid element collisions, which would initially seem an interesting choice for Learned Sort.…”
Section: Related Work
confidence: 99%
“…The application of neural networks as hash functions has been researched mainly in the context of security. Turčaník et al. [32] initialize a neural network with random weights and biases. Lian et al. [23] apply a chaotic map function to the weights and biases.…”
Section: Neural Networks as Hash Functions
confidence: 99%
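The idea attributed to Turčaník et al. of fixing randomly initialized weights and biases can be sketched as a tiny feed-forward network whose frozen random parameters define a deterministic digest. This is a hypothetical toy, not the paper's actual architecture: the layer sizes, seed, activation, and bit-thresholding are all assumptions made for the example.

```python
import random

rng = random.Random(42)          # fixed seed -> reproducible random weights
IN, HIDDEN, OUT = 16, 32, 16     # layer sizes (illustrative, not from the paper)

# Frozen random parameters: the network is never trained.
w1 = [[rng.uniform(-1, 1) for _ in range(IN)] for _ in range(HIDDEN)]
b1 = [rng.uniform(-1, 1) for _ in range(HIDDEN)]
w2 = [[rng.uniform(-1, 1) for _ in range(HIDDEN)] for _ in range(OUT)]
b2 = [rng.uniform(-1, 1) for _ in range(OUT)]

def nn_hash(data: bytes) -> int:
    """Map a block of up to 16 bytes to a 16-bit digest via one hidden layer."""
    x = [byte / 255.0 for byte in data.ljust(IN, b"\0")[:IN]]
    h = [max(0.0, sum(w * xi for w, xi in zip(row, x)) + b)   # ReLU
         for row, b in zip(w1, b1)]
    y = [sum(w * hi for w, hi in zip(row, h)) + b
         for row, b in zip(w2, b2)]
    # Threshold each output neuron to one digest bit.
    return sum(1 << i for i, yi in enumerate(y) if yi >= 0)

print(f"{nn_hash(b'hello world'):04x}")
```

Note that a plain random network like this diffuses small input changes poorly and offers no cryptographic guarantees, which is one reason constructions such as Lian et al.'s add chaotic maps on top of the weights and biases.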
“…This is possible because a key design feature of GNNs is that they should be permutation invariant, or at least equivariant [13], which also provides the desired effect in this case. The application of neural networks as hash functions has mainly been researched from a security perspective [23,32]. The goal there is to provide a one-way function that guarantees large changes in the resulting hash even when there are only small differences in the input.…”
Section: Introduction
confidence: 99%
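The permutation invariance the statement above attributes to GNNs can be illustrated in hashing terms with a commutative aggregation: hash each element independently, then combine with an order-independent operation such as XOR. This is a sketch of the invariance property only, not a GNN implementation; the mixer and function names are chosen for the example.

```python
from functools import reduce

def element_hash(x: int) -> int:
    """Mix a single 32-bit element (a common integer-mixing recipe; illustrative)."""
    x = ((x ^ (x >> 16)) * 0x45D9F3B) & 0xFFFFFFFF
    x = ((x ^ (x >> 16)) * 0x45D9F3B) & 0xFFFFFFFF
    return x ^ (x >> 16)

def set_hash(items) -> int:
    """XOR-aggregate per-element hashes: every permutation yields the same value."""
    return reduce(lambda a, b: a ^ b, (element_hash(i) for i in items), 0)

print(set_hash([1, 2, 3]) == set_hash([3, 1, 2]))  # → True
```

Because XOR is commutative and associative, the digest depends only on the multiset of elements, the same order-independence a permutation-invariant GNN readout provides.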