2024
DOI: 10.3390/app14041491
Stable Low-Rank CP Decomposition for Compression of Convolutional Neural Networks Based on Sensitivity

Chenbin Yang, Huiyi Liu

Abstract: Modern convolutional neural networks (CNNs) play a crucial role in computer vision applications. The intricacy of application scenarios and the growth of datasets both significantly raise the complexity of CNNs. As a result, they are often overparameterized and incur significant computational costs. One potential solution for optimizing and compressing CNNs is to replace convolutional layers with a low-rank tensor decomposition; the most suitable technique for this is Canonical Polyadic (CP) decomposition. H…
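The abstract does not spell out the construction, but as a rough illustration of how a CP-decomposed convolutional layer is typically realized, the sketch below replaces a single k×k convolution with the four thin convolutions induced by a rank-R CP factorization of its kernel (the replacement scheme popularized by Lebedev et al.). The function name, layer sizes, and the fixed rank R are illustrative assumptions; the paper's sensitivity-based rank selection and stability treatment are not modeled here.

```python
import torch.nn as nn

def cp_decomposed_conv(c_in, c_out, kernel_size, rank, stride=1, padding=0):
    """Replace a (c_out, c_in, k, k) convolution with the four thin
    convolutions induced by a rank-`rank` CP decomposition of its kernel.
    The rank would normally be chosen per layer (e.g., by a sensitivity
    criterion), which is not modeled in this sketch."""
    return nn.Sequential(
        # 1x1 conv mixing the input channels into the rank-R space
        nn.Conv2d(c_in, rank, kernel_size=1, bias=False),
        # depthwise conv over the vertical spatial mode
        nn.Conv2d(rank, rank, kernel_size=(kernel_size, 1), groups=rank,
                  stride=(stride, 1), padding=(padding, 0), bias=False),
        # depthwise conv over the horizontal spatial mode
        nn.Conv2d(rank, rank, kernel_size=(1, kernel_size), groups=rank,
                  stride=(1, stride), padding=(0, padding), bias=False),
        # 1x1 conv projecting back to the output channels
        nn.Conv2d(rank, c_out, kernel_size=1, bias=True),
    )

# Example: a 3x3 layer with 128 -> 256 channels compressed to rank 64.
# Original weights: 256 * 128 * 3 * 3 = 294,912 parameters.
# Decomposed:       128*64 + 64*3 + 64*3 + 64*256 = 24,960 (plus biases).
compressed = cp_decomposed_conv(128, 256, kernel_size=3, rank=64, padding=1)
```

In practice the four factor convolutions are initialized from the CP factors of the trained kernel and the network is then fine-tuned, since the approximation error otherwise degrades accuracy.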

Cited by 2 publications (1 citation statement). References 53 publications.
“…Hence, stability is one of the most important issues if an ANN model is to be used in practical applications [49]. This has motivated numerous studies on the development of stable neural control strategies for different types of ANN models [38,[40][41][42]44,[50][51][52], including fractional-order neural networks of a continuous and discrete nature [23,24,27,28,30,43,46,47]. For discrete fractional-order neural network models, considerable investigations have been devoted to the Mittag-Leffler stability properties [28,30], which generalize the exponential stability ones.…”
Section: Introduction (mentioning)
confidence: 99%