2023
DOI: 10.1609/aaai.v37i8.26104
Why Capsule Neural Networks Do Not Scale: Challenging the Dynamic Parse-Tree Assumption

Abstract: Capsule neural networks replace simple, scalar-valued neurons with vector-valued capsules. They are motivated by the pattern recognition system in the human brain, where complex objects are decomposed into a hierarchy of simpler object parts. Such a hierarchy is referred to as a parse-tree. Conceptually, capsule neural networks have been defined to mimic this behavior. The capsule neural network (CapsNet), by Sabour, Frosst, and Hinton, is the first actual implementation of the conceptual idea of capsule neural…
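The abstract describes CapsNet's central idea of replacing scalar neurons with vector-valued capsules, where a capsule's vector length encodes the probability that its entity is present. A minimal sketch of the "squash" nonlinearity from Sabour, Frosst, and Hinton's CapsNet, which rescales a capsule's pre-activation vector so its norm lies in [0, 1) while preserving its direction (an illustrative standalone function, not the paper's full implementation):

```python
import numpy as np

def squash(s, eps=1e-8):
    """CapsNet squash nonlinearity: v = (|s|^2 / (1 + |s|^2)) * (s / |s|).

    Short vectors shrink toward zero; long vectors approach unit length,
    so the output norm can be read as a presence probability."""
    norm_sq = np.sum(s ** 2)
    norm = np.sqrt(norm_sq + eps)  # eps avoids division by zero
    return (norm_sq / (1.0 + norm_sq)) * (s / norm)

# A capsule with pre-activation norm 5 squashes to norm 25/26 ~ 0.96:
v = squash(np.array([3.0, 4.0]))
```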

Cited by 3 publications (1 citation statement). References 20 publications.
“…Moreover, in recent literature related to capsule networks, it was observed that with an increase in model depth, capsules tend to become inactive. Such inactive capsules are no longer activated [42].…”
Section: A Comparison of Different Siamese Networks
confidence: 99%