2021
DOI: 10.1145/3462329
Quality-assured Approximate Hardware Accelerators Based on Machine Learning and Dynamic Partial Reconfiguration

Abstract: Machine learning is widely used these days to extract meaningful information out of the Zettabytes of sensor data collected daily. Applications that require analyzing and understanding the data to identify trends, e.g., surveillance, exhibit some error tolerance. Approximate computing has emerged as an energy-efficient design paradigm aiming to take advantage of the intrinsic error resilience in a wide set of error-tolerant applications. Thus, inexact results could reduce power consumption, delay, area, and ex…

Cited by 9 publications (1 citation statement)
References 54 publications
“…Approximate computing (AC) has reemerged as a mainstream computing paradigm. The main principle of AC is sacrificing output accuracy and accepting less-than-optimal results to save area, power, and delay [20]. AC is applicable at the software level, e.g., loop perforation, and hardware level, e.g., inexact full adders [21].…”
Section: Approximate Computing
confidence: 99%
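The quoted passage names one approximation technique at each level: loop perforation in software and inexact full adders in hardware. Both can be sketched in a few lines; this is an illustrative sketch with assumed parameters and a simplified carry approximation, not code from the paper or its references [20][21]:

```python
# Software level: loop perforation -- execute only a fraction of a loop's
# iterations, trading output accuracy for less work. The `skip` rate and
# the mean-estimation task are assumptions for this example.
def perforated_mean(values, skip=4):
    sampled = values[::skip]          # perforate: keep 1 of every `skip` iterations
    return sum(sampled) / len(sampled)

# Hardware level (modeled in software): an inexact 1-bit full adder that
# keeps the exact sum bit but approximates carry-out as input `a`,
# saving gates at the cost of a wrong carry on 2 of the 8 input patterns.
def inexact_full_adder(a, b, cin):
    s = a ^ b ^ cin                   # exact sum bit
    cout = a                          # approximated carry-out (exact: majority(a, b, cin))
    return s, cout

data = list(range(1, 101))            # exact mean is 50.5
approx = perforated_mean(data)        # uses 25 of 100 iterations -> 49.0, ~3% error
```

The adder call `inexact_full_adder(0, 1, 1)` illustrates the error case: it returns `(0, 0)` where an exact adder would produce carry-out 1.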