2023
DOI: 10.1145/3558000

A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part II: Applications, Cognitive Models, and Challenges

Abstract: This is Part II of the two-part comprehensive survey devoted to a computing framework most commonly known under the names Hyperdimensional Computing and Vector Symbolic Architectures (HDC/VSA). Both names refer to a family of computational models that use high-dimensional distributed representations and rely on the algebraic properties of their key operations to incorporate the advantages of structured symbolic representations and vector distributed representations. Holographic Reduced Representations [322, 32…
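The algebraic operations the abstract alludes to can be illustrated concretely. Below is a minimal, self-contained sketch (not code from the survey; all names and parameters are illustrative) of the two core HDC/VSA operations, binding and bundling, on random bipolar hypervectors:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # high dimensionality makes random hypervectors quasi-orthogonal

def random_hv():
    """Draw a random bipolar hypervector."""
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    """Binding (elementwise product): output is dissimilar to both inputs."""
    return a * b

def bundle(*vs):
    """Bundling (elementwise majority): output stays similar to each input."""
    return np.sign(np.sum(vs, axis=0))

def sim(a, b):
    """Normalized dot product; ~0 for unrelated hypervectors."""
    return a @ b / D

# Encode the record {color: red, shape: square} as one hypervector.
color, red, shape, square = (random_hv() for _ in range(4))
record = bundle(bind(color, red), bind(shape, square))

# Unbinding with a role vector recovers a noisy copy of its filler.
recovered = bind(record, color)
print(sim(recovered, red))     # ≈ 0.5 (bundling two pairs halves the signal)
print(sim(recovered, square))  # ≈ 0
```

Because binding and bundling are elementwise, the result always has the same dimensionality D regardless of how many role-filler pairs are encoded — the property that distinguishes these models from the raw tensor product.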

Cited by 41 publications (22 citation statements)
References 361 publications
“…Similarly, it is important to disentangle the strengths and weaknesses of NeuroPixelHD compared to other machine learning models (RF, SVM, etc.). In general, HDC-based models may at times achieve higher task accuracy compared to non-symbolic alternatives (Kim et al, 2018; Imani et al, 2017), but their main strength lies in their symbolic and interpretable structure and computational efficiency (Kleyko et al, 2023; Thomas et al, 2021; Imani et al, 2021). The interpretability of HDC allows one to ask “cognitive” queries from a learned model.…”
Section: Discussion
mentioning
confidence: 99%
“…We begin by showing numerical results which measure the quality of the retrieved structures in terms of the unbinding error P_ε and the SNR of overlaps defined in Eqn. 20. In all reported results, the extracted item (and the associated query) comes from pairs that are not part of the cueing structure S_0.…”
Section: Retrieval Of Structured Memories
mentioning
confidence: 96%
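The unbinding error P_ε and Eqn. 20 are defined in the cited paper and are not reproduced here. As a generic illustration of the quantity being measured, the sketch below empirically estimates how often unbinding a role from a superposition of role-filler pairs retrieves the wrong codebook item (all parameter values are arbitrary choices, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(1)
D, n_items, n_pairs, trials = 1_000, 50, 10, 200

# Codebook ("item memory") of candidate fillers.
items = rng.choice([-1, 1], size=(n_items, D))

errors = 0
for _ in range(trials):
    roles = rng.choice([-1, 1], size=(n_pairs, D))
    fillers = rng.integers(0, n_items, size=n_pairs)
    # Structure: thresholded superposition of role-filler bindings.
    S = np.sign((roles * items[fillers]).sum(axis=0))
    # Query: unbind role 0, then pick the most similar codebook item.
    guess = int(np.argmax(items @ (S * roles[0])))
    errors += guess != fillers[0]

print(errors / trials)  # empirical estimate of the unbinding error
```

Increasing n_pairs raises the crosstalk noise in the superposition and hence the error rate, which is the trade-off such retrieval-quality measurements characterize.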
“…An early attempt used the tensor product to create a distributed representation of pairwise relations between discrete items [6]. Subsequently, several Vector-Symbolic Architectures (VSA) were proposed as compressions of the tensor product to avoid the increase in dimensionality of the representation, allowing for the creation of hierarchies of relations in a compact way [17–22]. More recently, several architectures for deep or recurrent neural networks have been proposed to promote flexible relational reasoning [23–30].…”
mentioning
confidence: 99%
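The contrast the excerpt draws — tensor-product binding, whose size grows with every nesting level, versus VSA compressions that stay fixed-size — can be sketched with Holographic Reduced Representations' circular convolution. This is a rough illustration under the standard N(0, 1/D) component assumption, not an implementation from any cited work:

```python
import numpy as np

rng = np.random.default_rng(2)
D = 1_000
# Components ~ N(0, 1/D), the standard HRR assumption.
a, b = rng.standard_normal((2, D)) / np.sqrt(D)

# Tensor-product binding: D*D entries, growing with every nesting level.
tp = np.outer(a, b)
print(tp.size)  # 1000000

def cconv(x, y):
    """Circular convolution: HRR's fixed-dimensional binding operation."""
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(y)))

def involution(x):
    """Approximate inverse x* = (x_0, x_{D-1}, ..., x_1) used for unbinding."""
    return np.concatenate(([x[0]], x[:0:-1]))

bound = cconv(a, b)                      # still only D entries
recovered = cconv(involution(a), bound)  # noisy reconstruction of b
cos = recovered @ b / (np.linalg.norm(recovered) * np.linalg.norm(b))
print(cos)  # roughly 0.7: the filler survives unbinding
```

The compressed binding keeps every level of a relational hierarchy in the same D-dimensional space, at the cost of a noisy (rather than exact) reconstruction that must be cleaned up against an item memory.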
“…Hyperdimensional (HD) computing, also known as Vector Symbolic Architectures (Gayler, 2003; Kanerva, 2009; Kleyko et al., 2022c, d), with origins in Holographic Reduced Representations (Plate, 1994a, 1995), is an approach to performing computations using vectors that contain many (on the order of hundreds or more) components. In HD computing, each basic concept within a domain is associated with a single vector.…”
Section: Introduction
mentioning
confidence: 99%
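The idea that each basic concept is associated with a single high-dimensional vector is usually paired with an "item memory" that maps noisy query results back to a known concept. A minimal sketch (names and parameters are illustrative, not from the cited work):

```python
import numpy as np

rng = np.random.default_rng(3)
D = 2_000

# Item memory: each basic concept gets one fixed random hypervector.
concepts = ["apple", "banana", "cherry"]
codebook = {c: rng.choice([-1, 1], size=D) for c in concepts}

def cleanup(query):
    """Return the concept whose hypervector best matches a noisy query."""
    return max(concepts, key=lambda c: int(codebook[c] @ query))

# Even after flipping 40% of the components, the right concept wins.
noisy = codebook["banana"].copy()
noisy[rng.random(D) < 0.4] *= -1
print(cleanup(noisy))  # banana
```

This robustness to component-level noise is a direct consequence of the quasi-orthogonality of random high-dimensional vectors.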