2022
DOI: 10.3389/fnins.2022.757125
GrapHD: Graph-Based Hyperdimensional Memorization for Brain-Like Cognitive Learning

Abstract: Memorization is an essential functionality that enables today's machine learning algorithms to provide a high quality of learning and reasoning for each prediction. Memorization gives algorithms prior knowledge to keep the context and define confidence for their decision. Unfortunately, the existing deep learning algorithms have a weak and nontransparent notion of memorization. Brain-inspired HyperDimensional Computing (HDC) is introduced as a model of human memory. Therefore, it mimics several important funct…

Cited by 32 publications (11 citation statements)
References 64 publications
“…Specifically, hypervector representation is (1) holographic, meaning that information is distributed evenly across the components of the hypervector (Kleyko et al, 2023), (2) robust, meaning that hypervectors are extremely noise tolerant as a natural consequence of their redundancy (Kanerva, 2009; Poduval et al, 2022b; Barkam et al, 2023a), and (3) simple, meaning that only lightweight operations are needed to perform learning tasks (Hernandez-Cane et al, 2021; Ni et al, 2022b). In addition, the ability of hypervectors to operate symbolically through simple arithmetic allows HDC to perform cognitive tasks such as memorization, learning, and association in a transparent and compositional way (Poduval et al, 2022a; Hersche et al, 2023). Given the importance of the aforementioned properties, most HDC frameworks have a dedicated, specially designed HDC encoder for mapping original inputs to corresponding hypervectors.…”
Section: Introduction
confidence: 99%
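The three properties the statement above attributes to hypervectors (holographic, robust, simple) can be illustrated with a minimal pure-Python sketch. The dimensionality, seed, and helper names here are illustrative choices, not taken from the cited works:

```python
import random

D = 10_000  # hypervector dimensionality; thousands of components is typical in HDC
rng = random.Random(42)

def rand_hv():
    """Random bipolar hypervector with components in {-1, +1}."""
    return [rng.choice((-1, 1)) for _ in range(D)]

def cos_sim(a, b):
    """Cosine similarity; bipolar hypervectors all have norm sqrt(D)."""
    return sum(x * y for x, y in zip(a, b)) / D

a = rand_hv()
b = rand_hv()

# Holographic + robust: corrupt 10% of a's components chosen at random.
# No single component is special, so the damaged copy stays recognizable.
noisy = a[:]
for i in rng.sample(range(D), D // 10):
    noisy[i] = -noisy[i]

print(cos_sim(a, noisy))  # 0.8: flipping 10% of bipolar components costs exactly 0.2
print(cos_sim(a, b))      # near 0: unrelated hypervectors are quasi-orthogonal
```

Only elementwise arithmetic is used here, which is the "simple" property: no gradients or matrix factorizations are required to compare or corrupt representations.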
“…We exploit Hyper-Dimensional Computing (HDC) as an alternative computational model that mimics important brain functionalities to achieve high-efficiency and noise-tolerant computation (Kanerva, 2009; Rahimi et al, 2016b; Pale et al, 2021, 2022; Zou et al, 2021). HDC supports operators that emulate the behavior of associative memory and enable higher cognitive functionalities (Gayler, 2004; Kanerva, 2009; Poduval et al, 2022). In HDC, objects are encoded with high-dimensional vectors, called hypervectors, which have thousands of elements (Kanerva, 2009; Rahimi et al, 2016b; Imani et al, 2019c).…”
Section: Introduction
confidence: 99%
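The associative-memory behavior mentioned in the statement above is commonly realized in HDC with two operators: binding (elementwise multiply) to associate a key with a value, and bundling (elementwise add) to superpose several associations in one memory hypervector. A minimal sketch, with illustrative names and dimensions not drawn from the cited works:

```python
import random

D = 10_000
rng = random.Random(7)

def rand_hv():
    return [rng.choice((-1, 1)) for _ in range(D)]

def bind(a, b):
    """Elementwise multiply: associates two hypervectors; self-inverse for bipolar vectors."""
    return [x * y for x, y in zip(a, b)]

def bundle(vs):
    """Elementwise add: superposes several hypervectors into one."""
    return [sum(comp) for comp in zip(*vs)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Three key-value pairs stored in ONE memory hypervector.
keys = {name: rand_hv() for name in ("color", "shape", "size")}
vals = {name: rand_hv() for name in ("red", "round", "small")}
memory = bundle([bind(keys["color"], vals["red"]),
                 bind(keys["shape"], vals["round"]),
                 bind(keys["size"],  vals["small"])])

# Query: unbind with a key, then "clean up" against the known value vectors.
query = bind(memory, keys["shape"])          # ≈ vals["round"] + cross-talk noise
best = max(vals, key=lambda name: dot(query, vals[name]))
print(best)  # round
```

Because binding with a bipolar key is its own inverse, unbinding recovers the stored value plus quasi-orthogonal cross-talk from the other pairs, which the nearest-neighbor cleanup step removes.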
“…This encoding is performed using a set of pre-generated base vectors. HDC is well suited to several learning tasks in IoT systems because: (i) it is computationally efficient and amenable to hardware-level optimization 30–32, (ii) it supports single-pass training with no back-propagation or gradient computation, (iii) it offers an intuitive and human-interpretable model 33, (iv) it is a computational paradigm that can be applied to a wide range of learning and cognitive problems 33–45, and (v) it provides strong robustness to noise, a key strength for IoT systems 46. Despite these advantages, existing HDC encoding schemes are not designed for handling neuromorphic data.…”
Section: Introduction
confidence: 99%
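The "pre-generated base vectors" mentioned in the last statement can be sketched with a simple sequence encoder: each symbol gets a fixed random base hypervector, a cyclic shift tags it with its position, and the tagged vectors are bundled. This is one common HDC encoding scheme, not the specific encoder of any cited work; all names and parameters below are illustrative:

```python
import random

D = 10_000
rng = random.Random(0)

# Pre-generated base hypervectors: one fixed random bipolar vector per symbol.
base = {c: [rng.choice((-1, 1)) for _ in range(D)]
        for c in "abcdefghijklmnopqrstuvwxyz"}

def rotate(v, k):
    """Cyclic shift: a cheap permutation that tags a symbol with its position."""
    k %= D
    return v[-k:] + v[:-k]

def encode(word):
    """Bundle the position-tagged base vectors of the word's symbols."""
    return [sum(comp) for comp in
            zip(*(rotate(base[c], i) for i, c in enumerate(word)))]

def cos_sim(a, b):
    num = sum(x * y for x, y in zip(a, b))
    den = (sum(x * x for x in a) * sum(y * y for y in b)) ** 0.5
    return num / den

h1, h2, h3 = encode("hello"), encode("hella"), encode("xyzzy")
print(round(cos_sim(h1, h2), 2))  # high: 4 of 5 position-symbol pairs agree
print(round(cos_sim(h1, h3), 2))  # near 0: no position-symbol pair matches
```

Similar inputs map to similar hypervectors, which is what lets a downstream HDC model classify by comparing an encoded input against class hypervectors with a single similarity check.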