2012
DOI: 10.1111/j.1467-8640.2011.00423.x

Similarity‐based Retrieval With Structure‐sensitive Sparse Binary Distributed Representations

Abstract: We present an approach to similarity-based retrieval from knowledge bases that takes into account both the structure and semantics of knowledge base fragments. Those fragments, or analogues, are represented as sparse binary vectors that allow a computationally efficient estimation of structural and semantic similarity by the vector dot product. We present the representation scheme and experimental results for the knowledge base that was previously used for testing of leading analogical retrieval models MAC/FAC…
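As a rough illustration of the retrieval scheme the abstract describes, the sketch below encodes fragments as sparse binary vectors and scores similarity with the dot product. The dimensionality, sparsity level, and helper names are assumptions chosen for the example, not parameters from the paper.

```python
import numpy as np

def random_sparse_binary(dim=10_000, active=100, rng=None):
    """Sparse binary vector with a fixed number of active bits.
    dim and active are illustrative values, not the paper's parameters."""
    if rng is None:
        rng = np.random.default_rng()
    v = np.zeros(dim, dtype=np.uint8)
    v[rng.choice(dim, size=active, replace=False)] = 1
    return v

def similarity(a, b):
    """Dot product of binary vectors = count of shared active bits."""
    return int(a.astype(np.int64) @ b)

rng = np.random.default_rng(0)
x = random_sparse_binary(rng=rng)           # one knowledge-base fragment
y = x.copy()                                # a structurally similar analogue
y[rng.choice(np.flatnonzero(y), size=20, replace=False)] = 0
z = random_sparse_binary(rng=rng)           # an unrelated fragment

print(similarity(x, y))   # 80: high overlap between similar analogues
print(similarity(x, z))   # ~1: chance-level overlap for unrelated fragments
```

Because only a small fraction of bits are active, the dot product can be computed over the active positions alone, which is what makes this similarity estimate computationally cheap.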

Cited by 45 publications (30 citation statements)
References 74 publications

“…VSAs make use of additional operations on high-dimensional vectors. So far, VSAs have been applied in various fields including robotics [45], addressing catastrophic forgetting in deep neural networks [9], medical diagnosis [73], fault detection [29], analogy mapping [52], reinforcement learning [30], long-short term memory [11], text classification [31], and synthesis of finite state automata [49]. They have been used in combination with deep-learned descriptors before, e.g.…”
Section: Vector Symbolic Architectures
confidence: 99%
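To make the "additional operations" mentioned in this excerpt concrete, here is a minimal Binary Spatter Code-style sketch (one common VSA flavor, not the specific scheme of the paper above): binding by element-wise XOR, bundling by bitwise majority vote, and unbinding to query a bound record.

```python
import numpy as np

rng = np.random.default_rng(1)
DIM = 10_000  # illustrative dimensionality

def random_hv():
    """Dense random binary hypervector (Binary Spatter Code style)."""
    return rng.integers(0, 2, size=DIM, dtype=np.uint8)

def bind(a, b):
    """Binding: element-wise XOR; the result is dissimilar to both inputs."""
    return a ^ b

def bundle(*vs):
    """Bundling: bitwise majority vote; the result stays similar to each
    input. With an even number of inputs, ties resolve to 0 here."""
    return (np.sum(vs, axis=0) > len(vs) / 2).astype(np.uint8)

def sim(a, b):
    """Normalized bit agreement in [0, 1]; ~0.5 for unrelated vectors."""
    return float(np.mean(a == b))

# Encode a two-slot record of role-filler pairs, then query it.
role_color, role_shape = random_hv(), random_hv()
red, circle = random_hv(), random_hv()
record = bundle(bind(role_color, red), bind(role_shape, circle))

# Unbinding with XOR recovers a noisy version of the bound filler.
probe = bind(record, role_color)
print(sim(probe, red), sim(probe, circle))  # ~0.75 vs ~0.5
```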
“…Note that such data representation schemes by similarity preserving binary vectors have been developed for objects represented by various data types, mainly for (feature) vectors (see survey in [131]), but also for structured data types such as sequences [102,72,85,86] and graphs [127,128,148,136,62,134]. A significant part of this research is developed in the framework of distributed representations [45,76,106,126,89], including binary sparse distributed representations [102,98,103,127,128,113,114,137,138,139,148,135,136,61,134,129,130,131,132,31,33] and dense distributed representations [75,76] (see [82,84,87,88,83] for examples of their applications).…”
Section: Generalization In NAMs
confidence: 99%
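As a concrete instance of the similarity-preserving binary codes this excerpt surveys, the sketch below uses sign-of-random-projection hashing (SimHash-style LSH) to map real-valued feature vectors to binary codes whose bitwise agreement tracks input similarity; it is an illustrative assumption, not any particular cited scheme.

```python
import numpy as np

rng = np.random.default_rng(2)

def simhash_bits(x, planes):
    """Similarity-preserving binary code: sign of random projections."""
    return (planes @ x > 0).astype(np.uint8)

planes = rng.standard_normal((256, 32))   # 256-bit codes for 32-d features
a = rng.standard_normal(32)
b = a + 0.1 * rng.standard_normal(32)     # a near neighbor of a
c = rng.standard_normal(32)               # an unrelated vector

ha, hb, hc = (simhash_bits(v, planes) for v in (a, b, c))
print(np.mean(ha == hb))  # high agreement: similar inputs
print(np.mean(ha == hc))  # ~0.5: dissimilar inputs
```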
“…This is achieved by using codewords with very large dimensions. Several different types of VSAs have been introduced, each using different representations (Plate, 2003; Kanerva, 2009; Gayler, 1998; Rachkovskij and Kussul, 2001; Aerts et al., 2009; Rachkovskij et al., 2013; Gallant and Okaywe, 2013; Snaider and Franklin, 2014).…”
Section: Related Work
confidence: 99%