2017 12th International Workshop on Self-Organizing Maps and Learning Vector Quantization, Clustering and Data Visualization (WSOM 2017)
DOI: 10.1109/wsom.2017.8020009
Fusion of deep learning architectures, multilayer feedforward networks and learning vector quantizers for deep classification learning

Abstract: The advantages of prototype-based learning vector quantizers are the intuitive and simple model adaptation as well as the easy interpretability of the prototypes as class representatives of the class distribution to be learned. Although they frequently yield competitive performance and show robust behavior, powerful alternatives have nowadays gained increasing attraction. In particular, deep multilayer network architectures frequently achieve very high accuracies and are, thanks to modern graphics processing units us…
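As a minimal illustration of the prototype-based learning the abstract refers to, the following sketch shows the classical LVQ1 update rule: the nearest prototype is attracted toward a training sample of the same class and repelled otherwise. This is a standalone NumPy sketch of standard LVQ1, not code from the paper (which fuses LVQ with deep feedforward networks); the function names and learning rate are illustrative assumptions.

```python
import numpy as np

def lvq1_step(prototypes, proto_labels, x, y, lr=0.1):
    """One LVQ1 update: move the winning (closest) prototype toward x
    if its label matches y, otherwise push it away."""
    dists = np.linalg.norm(prototypes - x, axis=1)
    w = np.argmin(dists)                      # index of the winner
    sign = 1.0 if proto_labels[w] == y else -1.0
    prototypes[w] += sign * lr * (x - prototypes[w])
    return prototypes

def lvq1_predict(prototypes, proto_labels, x):
    """Classify x by the label of its nearest prototype."""
    return proto_labels[np.argmin(np.linalg.norm(prototypes - x, axis=1))]
```

Because each prototype is a point in the input space with a class label, the learned model is directly interpretable as class representatives, which is the advantage the abstract emphasizes.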

Help me understand this report

Search citation statements

Order By: Relevance

Paper Sections

Select...
1
1
1
1

Citation Types

0
4
0

Year Published

2018
2018
2022
2022

Publication Types

Select...
3
2
2

Relationship

0
7

Authors

Journals

Cited by 12 publications (4 citation statements). References 44 publications.
“…Multiple studies explored the integration of the hierarchical feature-extracting capability of deep feature extractors with VQ [12,50,4] which were also later adapted to Continual Learning. TPCIL [46] proposed to retain the topology of the feature space to preserve old knowledge over the increments.…”
Section: Vector Quantization
Confidence: 99%
“…This allows to rewrite both of them using equivalent operations. This strategy is already described for different levels of abstraction in [53,54,55,56,57].…”
Section: A Different View on Fully-Connected Layers
Confidence: 99%
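The equivalence between fully-connected layers and distance computations alluded to in the statement above typically rests on the identity ‖x − w‖² = ‖x‖² − 2⟨x, w⟩ + ‖w‖²: a squared-Euclidean distance layer can be rewritten as a dot product (a fully-connected layer) plus bias terms. The following NumPy check of that identity is my own sketch, not code from the cited works:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(5)       # input vector
W = rng.standard_normal((3, 5))  # 3 prototype rows / FC weight rows

# Squared Euclidean distances, computed directly ...
d_direct = np.sum((W - x) ** 2, axis=1)

# ... and via a fully-connected (dot-product) layer with bias terms:
# ||x - w||^2 = ||x||^2 - 2<x, w> + ||w||^2
d_fc = np.dot(x, x) - 2.0 * (W @ x) + np.sum(W ** 2, axis=1)
```

Since ‖x‖² is constant across prototypes, ranking prototypes by distance reduces to ranking by the affine expression −2⟨x, w⟩ + ‖w‖², which is exactly what a fully-connected layer with suitable biases computes.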
“…One of the first contributions reporting an approach for the fusion of LVQ with NNs is [53]. Later on, the idea was formulated more precisely [54]. In [57], a fused network was applied to train a network on MNIST and Cifar10.…”
Section: Related Work
Confidence: 99%
“…deep neural networks in applications that involve sophisticated feature extraction, vector quantization methods can be found incorporated in current deep learning architectures [15], [22], [23], and have recently shown impressive robustness against adversarial attacks, suggesting suitability in security critical applications [24].…”
Section: Introduction
Confidence: 99%