2021
DOI: 10.1007/978-3-030-85099-9_13
On Effects of Compression with Hyperdimensional Computing in Distributed Randomized Neural Networks

Abstract: A shift away from the prevalent supervised learning techniques is foreseeable in the near future: from complex, computationally expensive algorithms toward more flexible and elementary training schemes. The strong revitalization of randomized algorithms can be framed within this shift. We recently proposed a model for distributed classification based on randomized neural networks and hyperdimensional computing, which accounts for the cost of information exchange between agents by using compression. The use of comp…
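The compression the abstract alludes to can be illustrated with a standard hyperdimensional computing (HDC) primitive: key-value superposition. The sketch below is not the paper's exact procedure; it is a minimal numpy example under assumed choices (bipolar keys, elementwise binding, a hypothetical D x n_classes readout matrix standing in for the output weights of a randomized network):

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000       # hypervector dimensionality
n_classes = 10   # number of readout columns to compress

# Hypothetical readout matrix W (D x n_classes), e.g. the trained
# output weights of an RVFL-style randomized network.
W = rng.standard_normal((D, n_classes))

# One random bipolar key hypervector per column.
keys = rng.choice([-1.0, 1.0], size=(n_classes, D))

# Compress: bind each column with its key (elementwise multiplication)
# and superpose all bound columns into a single D-dimensional vector.
compressed = np.zeros(D)
for c in range(n_classes):
    compressed += keys[c] * W[:, c]

# Approximate decompression: binding with the same key recovers the
# column plus zero-mean crosstalk from the other superposed columns;
# fidelity degrades with the number of columns packed together.
for c in range(n_classes):
    w_hat = keys[c] * compressed
    r = np.corrcoef(W[:, c], w_hat)[0, 1]
    print(f"column {c}: corr(original, decoded) = {r:.3f}")
```

Packing more columns into one hypervector reduces communication cost but increases crosstalk noise in the decoded weights, which is the compression/accuracy trade-off that motivates studies like this one.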

Cited by: 2 publications (1 citation statement)
References: 37 publications (32 reference statements)
“…There are also efforts to apply HDC/VSA within machine learning outside of classification. Examples of such efforts are using data transformed into HVs for clustering [7,137], unsupervised learning [279,306], multi-task learning [32-34], distributed learning [149,361], model compression [32,143,361,362], and ensemble learning [25,409].…”
Section: Applications in Machine Learning Beyond Classification
Confidence: 99%
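The first step shared by the applications listed in the citation statement is transforming data into hypervectors (HVs). A minimal sketch of one common encoding, a fixed random projection followed by the sign nonlinearity, and a k-means-style clustering loop run entirely in HV space, is given below; the cited works may use different encodings, and all dimensions and data here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
D = 10_000        # hypervector dimensionality
n_features = 64   # input feature dimensionality
n_points = 200

# Hypothetical input data: two Gaussian blobs.
X = np.concatenate([
    rng.normal(-1.0, 1.0, size=(n_points // 2, n_features)),
    rng.normal(+1.0, 1.0, size=(n_points // 2, n_features)),
])

# Encode data into bipolar HVs: random projection, then sign.
P = rng.standard_normal((n_features, D))
H = np.sign(X @ P)  # shape (n_points, D)

# Clustering in HV space: bundle (sum) the HVs of each cluster into a
# centroid HV, then reassign points by cosine similarity.
k = 2
assign = rng.integers(0, k, size=n_points)
for _ in range(10):
    centroids = np.stack([H[assign == c].sum(axis=0) for c in range(k)])
    sims = (H @ centroids.T) / (
        np.linalg.norm(H, axis=1, keepdims=True)
        * np.linalg.norm(centroids, axis=1) + 1e-9
    )
    assign = sims.argmax(axis=1)

print("cluster sizes:", np.bincount(assign, minlength=k))
```

Because bundling is a simple sum, cluster centroids can be updated incrementally and merged across agents, which is one reason HV representations appear in the distributed-learning settings cited above.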