2018 23rd Asia and South Pacific Design Automation Conference (ASP-DAC)
DOI: 10.1109/aspdac.2018.8297401

CANNA: Neural network acceleration using configurable approximation on GPGPU

Cited by 14 publications (5 citation statements)
References 24 publications
“…There are many other ways of approximating multiplication that had not been applied to deep CNNs, such as [43], [44], [45] among countless others. While we believe that the studied multiplier designs are the most promising, there are most likely other related opportunities for improving CNNs.…”
Section: Related Work
confidence: 99%
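The statement above concerns approximate multiplier designs for deep CNNs. As a purely illustrative sketch in C, and not one of the designs cited in [43]-[45] nor CANNA's own scheme, the following program shows one of the simplest forms of approximate multiplication: operand truncation, where low-order bits are zeroed before the multiply, shrinking the hardware partial-product array at the cost of a bounded relative error.

/* Illustrative sketch only: truncation-based approximate multiplication.
 * The k least-significant bits of each operand are zeroed before the
 * multiply, trading accuracy for a cheaper multiplier. */
#include <stdio.h>
#include <stdint.h>

/* Zero the k least-significant bits of x. */
static uint32_t truncate_lsb(uint32_t x, unsigned k) {
    return (k >= 32) ? 0 : (x >> k) << k;
}

/* Approximate product: multiply the truncated operands. */
static uint64_t approx_mul(uint32_t a, uint32_t b, unsigned k) {
    return (uint64_t)truncate_lsb(a, k) * (uint64_t)truncate_lsb(b, k);
}

int main(void) {
    uint32_t a = 12345, b = 6789;
    uint64_t exact = (uint64_t)a * b;
    for (unsigned k = 0; k <= 8; k += 4) {
        uint64_t approx = approx_mul(a, b, k);
        double rel_err = 100.0 * (double)(exact - approx) / (double)exact;
        printf("k=%u  exact=%llu  approx=%llu  rel.err=%.3f%%\n",
               k, (unsigned long long)exact, (unsigned long long)approx, rel_err);
    }
    return 0;
}

Raising k widens the error but removes more partial products; the operand values and truncation widths here are arbitrary and chosen only to make the accuracy/cost trade-off visible.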
“…To further enhance the associative mechanism, additional approximations not strictly related to the arithmetic MAC approximation can offer an orthogonal dimension for optimization. Bit obfuscation and operand precision lowering were used to relax the matching rules indeed, further increasing the repetitiveness of certain patterns and the probability the required data get available in the associative memory [17][18][19]. To be noted that the error introduced by approximate matching rules might call for auxiliary error-recovering policies through online hardware calibration and/or custom training procedures.…”
Section: Introduction
confidence: 99%
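The statement above describes relaxing associative-matching rules through operand precision lowering, so that recurring operand patterns hit in the associative memory more often. The C sketch below is a hypothetical software analogue of that idea, not CANNA's hardware: operands are masked to their high-order bits before being looked up in a small direct-mapped table, so nearby operand pairs reuse a cached product instead of triggering a new multiplication. The table size, hash function, and DROP_BITS parameter are illustrative assumptions.

/* Illustrative sketch only: a software analogue of an associative table whose
 * matching rule is relaxed by lowering operand precision. Operands are masked
 * to their high-order bits before lookup, so near-identical operand pairs map
 * to the same entry and the stored product is reused. */
#include <stdio.h>
#include <stdint.h>
#include <string.h>

#define TABLE_SIZE 256
#define DROP_BITS  4   /* precision lowering: ignore 4 LSBs of each operand */

typedef struct { uint32_t key_a, key_b; uint64_t product; int valid; } Entry;

static Entry table[TABLE_SIZE];
static unsigned hits = 0, misses = 0;

static uint32_t lower_precision(uint32_t x) { return (x >> DROP_BITS) << DROP_BITS; }

static uint64_t assoc_mul(uint32_t a, uint32_t b) {
    uint32_t ka = lower_precision(a), kb = lower_precision(b);
    unsigned idx = (ka * 31u + kb) % TABLE_SIZE;          /* toy hash for the sketch */
    Entry *e = &table[idx];
    if (e->valid && e->key_a == ka && e->key_b == kb) {   /* approximate match: reuse */
        hits++;
        return e->product;
    }
    misses++;                                             /* miss: compute and cache */
    e->key_a = ka; e->key_b = kb; e->product = (uint64_t)ka * kb; e->valid = 1;
    return e->product;
}

int main(void) {
    memset(table, 0, sizeof table);
    /* Nearby operand values collapse onto the same entry after masking. */
    uint32_t ops[][2] = { {1000, 2000}, {1003, 2001}, {1015, 2010}, {500, 700} };
    for (unsigned i = 0; i < 4; i++)
        printf("%u * %u ~= %llu\n", ops[i][0], ops[i][1],
               (unsigned long long)assoc_mul(ops[i][0], ops[i][1]));
    printf("hits=%u misses=%u\n", hits, misses);
    return 0;
}

The approximate match introduces error exactly as the statement notes, which is why the citing work points to calibration or retraining as possible error-recovery measures.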