2021
DOI: 10.1155/2021/6630552
Extensible Embedded Processor for Convolutional Neural Networks

Abstract: Convolutional neural networks (CNNs) require significant computing power during inference. Smartphones, for example, may not run a facial recognition system or search algorithm smoothly due to the lack of resources and supporting hardware. Methods for reducing memory size and increasing execution speed have been explored, but choosing effective techniques for an application requires extensive knowledge of the network architecture. This paper proposes a general approach to preparing a compressed deep neural ne…

Cited by 3 publications (1 citation statement). References 18 publications.
“…As many embedded systems are battery powered, hardware accelerators may not be power efficient for systems with power constraints [2]. Misko et al [14] achieve extensible embedded processors for CNN computations with minimal addition to existing microprocessor hardware. Custom SIMD instructions of small convolution size 3 × 3 are added to increase the data parallelism of the SIMD units.…”
Section: Hardware Approach
confidence: 99%
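To make the cited technique concrete, below is a minimal scalar C sketch of the 3 × 3 convolution kernel that such a custom SIMD instruction would accelerate: each output element is a 9-element dot product, and the custom instructions replace this inner loop with wider data-parallel operations. The function name, data layout, and sizes are illustrative assumptions, not details taken from Misko et al.

```c
#include <stdio.h>

#define H 8   /* input height (illustrative) */
#define W 8   /* input width  (illustrative) */

/* Scalar reference for a single-channel 3x3 convolution with valid padding.
 * Each output element is a 9-element dot product; custom SIMD convolution
 * instructions would evaluate these dot products with greater data
 * parallelism. Names and sizes here are illustrative only. */
static void conv3x3(const float in[H][W], const float k[3][3],
                    float out[H - 2][W - 2]) {
    for (int y = 0; y < H - 2; y++) {
        for (int x = 0; x < W - 2; x++) {
            float acc = 0.0f;
            for (int ky = 0; ky < 3; ky++)
                for (int kx = 0; kx < 3; kx++)
                    acc += in[y + ky][x + kx] * k[ky][kx];
            out[y][x] = acc;
        }
    }
}

int main(void) {
    float in[H][W], out[H - 2][W - 2];
    const float k[3][3] = { {0, -1, 0}, {-1, 4, -1}, {0, -1, 0} }; /* Laplacian */

    /* Fill the input with a simple ramp so the example is self-contained. */
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            in[y][x] = (float)(y * W + x);

    conv3x3(in, k, out);
    printf("out[0][0] = %.1f\n", out[0][0]);  /* 0.0 for this ramp input */
    return 0;
}
```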