2022
DOI: 10.3390/info13100451

Convolutional Neural Network Model Compression Method for Software—Hardware Co-Design

Abstract: Owing to their high accuracy, deep convolutional neural networks (CNNs) are extensively used; however, they are also highly complex, and current CNN systems require real-time performance and acceleration. A graphics processing unit (GPU) is one possible way to improve real-time performance, but its performance-to-power ratio is poor owing to its high power consumption. By contrast, field-programmable gate arrays (FPGAs) offer lower power consumption and a flexible architecture, making them mor…

Cited by 4 publications (1 citation statement); the citing publications appeared in 2023 (2) and 2024 (2).
References 19 publications.

Citation statement:
“…Nevertheless, deploying CNNs on FPGAs presents challenges such as high storage requirements, external memory bandwidth limitations, and significant computational demands, especially as future models become more complex. Consequently, how to compress and optimize CNNs without sacrificing model performance becomes a crucial preprocessing step for their deployment on FPGAs [1]. Gong et al. proposed that deep neural networks generally contain parameter redundancy.…”
Section: Introduction
Confidence: 99%
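The cited statement argues that parameter redundancy makes compression a crucial preprocessing step before FPGA deployment. As a minimal, hypothetical sketch (not the paper's specific method), the Python/NumPy snippet below illustrates two common compression steps, magnitude pruning and uniform symmetric 8-bit quantization, applied to a randomly initialized convolution kernel; the array shape, sparsity target, and bit width are illustrative assumptions only.

# Illustrative sketch: exploiting parameter redundancy in one CNN layer.
# All names, shapes, and thresholds are hypothetical, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical conv kernel: (out_channels, in_channels, kH, kW).
weights = rng.normal(0.0, 0.05, size=(64, 3, 3, 3))

# Magnitude pruning: zero out the smallest 50% of weights by absolute value.
sparsity = 0.5
threshold = np.quantile(np.abs(weights), sparsity)
pruned = np.where(np.abs(weights) >= threshold, weights, 0.0)

# Uniform symmetric 8-bit quantization of the remaining weights.
scale = np.max(np.abs(pruned)) / 127.0
q_weights = np.round(pruned / scale).astype(np.int8)   # compact int8 storage
dequant = q_weights.astype(np.float32) * scale         # reconstruction for error check

print(f"nonzero weights: {np.count_nonzero(pruned)} / {weights.size}")
print(f"max reconstruction error: {np.max(np.abs(dequant - pruned)):.5f}")

In practice, pruning and quantization of this kind reduce both on-chip storage and external memory bandwidth, which are the FPGA constraints named in the citation statement; the specific co-design method of the cited paper may differ.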