2014
DOI: 10.1145/2567662
High-Level Abstractions and Modular Debugging for FPGA Design Validation

Abstract: Design validation is the most time-consuming task in the FPGA design cycle. Although manufacturers and third-party vendors offer a range of tools that provide visibility and control over the different stages of a design, many require that the design be fully re-implemented for even simple parameter modifications, or do not allow the design to be run at full speed. Designs are typically first modeled using a high-level language and later rewritten in a hardware description language, first for simulation and then …

Cited by 17 publications (5 citation statements). References 6 publications.
“…Their work, however, does not support most of the compiler optimizations performed during HLS, and the use of SignalTap causes high memory usage for the trace buffers, as also reported in Reference [33]. Iskander et al. [27] propose an approach composed of two parts: a High-Level Validation and a Low-Level Debug. For the high-level validation, they run the golden reference software on a softcore on the FPGA, saving the results and comparing them with the results obtained from the accelerators.…”
Section: Related Work
confidence: 99%
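The high-level validation idea described in the statement above can be sketched as a simple software comparison loop: a golden reference model computes the expected results, which are then checked against the results read back from the accelerator. This is a minimal illustrative sketch; the function names and the toy computation are assumptions, not taken from the cited tools.

```python
# Hedged sketch of golden-reference validation: a software model
# (the golden reference) produces expected outputs, which are compared
# element-by-element with outputs read back from the FPGA accelerator.
# The computation here (2x + 1) is purely illustrative.

def golden_reference(inputs):
    """Software model producing the expected results."""
    return [x * 2 + 1 for x in inputs]

def read_accelerator_results(inputs):
    """Stand-in for reading results back from the accelerator;
    a correct accelerator would match the golden reference."""
    return [x * 2 + 1 for x in inputs]

def validate(inputs):
    """Return a list of (index, expected, actual) mismatches; empty means pass."""
    expected = golden_reference(inputs)
    actual = read_accelerator_results(inputs)
    return [(i, e, a)
            for i, (e, a) in enumerate(zip(expected, actual))
            if e != a]

print(validate([1, 2, 3]))  # → []
```

In the approaches summarized above, the comparison runs against results produced on the FPGA itself (with the reference executing on a softcore), rather than in host software as in this sketch.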
“…The golden reference and the hardware signature are compared at the end of the execution and bugs are automatically detected. Iskander et al. [19] propose a different hybrid approach composed of two parts: a High-Level Validation and a Low-Level Debug. For the high-level validation, they run the golden reference software on a softcore on the FPGA, saving the results and comparing them with the results obtained from the accelerators.…”
Section: Related Work
confidence: 99%
“…To directly improve the productivity of source-level debugging, several academic and commercial HLS tools have added verification and debug support to their tool flows [47,34,58,59,43,45,44,46,95]. These verification techniques include detecting HW/SW execution discrepancies (simulation-based or on-board execution-based), developing software-like debug environments, and synthesizing C assertions to hardware.…”
Section: Source-Level Debugging
confidence: 99%
“…Several commercial [34, 58, 59] and academic tools [47] couple hardware execution with a software reference model to compare outputs from both executions at runtime. However, they typically compare output streams at the interface level and thus lack the detailed internal behavior needed to diagnose the cause of a mismatched execution.…”
Section: Simulation-Based Discrepancy
confidence: 99%
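The interface-level limitation noted in the statement above can be illustrated with a minimal sketch: comparing the two output streams reveals where they diverge, but says nothing about which internal signal or operation caused the divergence. The function name and the sample streams are illustrative assumptions.

```python
# Hedged sketch of interface-level stream comparison: outputs from the
# hardware and the software reference model are checked as they stream
# out. A mismatch pinpoints *where* the streams diverge, but not which
# internal signal or operation caused it — the diagnostic gap the
# citing work points out.

def compare_streams(reference_stream, hardware_stream):
    """Return the index of the first mismatch, or None if the streams agree."""
    for i, (ref, hw) in enumerate(zip(reference_stream, hardware_stream)):
        if ref != hw:
            return i
    return None

print(compare_streams([1, 4, 9, 16], [1, 4, 9, 16]))  # → None
print(compare_streams([1, 4, 9, 16], [1, 4, 8, 16]))  # → 2
```

A first-mismatch index narrows the failure to one output position; tracing it back to an internal cause still requires the finer-grained visibility that the debug tools surveyed above aim to provide.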