2019 IEEE International Symposium on Performance Analysis of Systems and Software (ISPASS)
DOI: 10.1109/ispass.2019.00040
mRNA: Enabling Efficient Mapping Space Exploration for a Reconfiguration Neural Accelerator

Cited by 27 publications (15 citation statements) · References 24 publications
“…mRNA [146] is a mapper that performs design space exploration to find the optimal mapping targeting the reconfigurable DNN accelerator MAERI [137]. Similar to other design space exploration tools, it explores all possible permutations of the for loops of the Conv layer's 7-nested-loop representation and all possible combinations of tiling factors.…”
Section: Tools for Design Space Exploration (DSE)
Confidence: 99%
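The mapping space the statement above describes (all loop orderings times all tiling-factor combinations) can be enumerated directly. The sketch below is illustrative only: the loop names follow the standard Conv loop nest, but the layer dimension sizes are assumptions, not values from the mRNA paper.

```python
# Sketch of the mapping space an mRNA-style DSE tool enumerates:
# every permutation of the 7 Conv loops x every legal tiling factor
# per dimension (single-level tiling, assumed layer shape).
from itertools import permutations
from math import prod

# The 7 loops of a Conv layer's loop nest: N (batch), K (output
# channels), C (input channels), Y/X (output rows/cols), R/S (filter
# rows/cols).
LOOPS = ("N", "K", "C", "Y", "X", "R", "S")

def divisors(n):
    """Tiling factors that evenly divide a dimension of size n."""
    return [d for d in range(1, n + 1) if n % d == 0]

# Loop-order component: every permutation of the 7 loops (7! = 5040).
loop_orders = list(permutations(LOOPS))

# Tiling component: one factor per dimension (assumed layer shape).
dims = {"N": 4, "K": 64, "C": 64, "Y": 56, "X": 56, "R": 3, "S": 3}
tiling_options = prod(len(divisors(v)) for v in dims.values())

# Total single-level mapping points for this one layer.
print(len(loop_orders) * tiling_options)
```

Even with single-level tiling and a small layer, the product runs into the hundreds of millions, which is why mRNA prunes rather than exhaustively evaluates.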
“…Mappers for Spatial Accelerators: Most prior works focus on developing mappers specific to their architectures; for example, mRNA [28] targets MAERI [7] and Auto-TVM [29] targets the GEMM core of the VTA architecture [30], limiting their applicability to generic spatial accelerators. Mapping optimizers such as Interstellar [31] and dMazeRunner [32] are specific to convolutions and fix certain aspects of the dataflow (such as the choice of parallel loops and loop order), constraining the space.…”
Section: Related Work
Confidence: 99%
“…[12,38] maximized psum reuse, and [3] maximized weight reuse in the RF and psum reuse in the SPM, without exploring other execution methods. Likewise, [15,37,39,40] considered a batch size of N = 1 image, missing opportunities for weight reuse. Thus, prior techniques organized the loops in certain fixed ways and, without explicit modeling of the complete spatiotemporal execution, lacked information about different execution methods.…”
Section: Related Work
Confidence: 99%
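The weight-reuse point in the statement above is easy to quantify: with a batch of one image there is nothing to share weights across, while a larger batch lets the same weights serve every image. A minimal sketch, where the layer shape and fetch model are assumptions rather than numbers from the cited works:

```python
# Illustrative count of off-chip weight fetches, showing the reuse
# opportunity a batch of N > 1 exposes. The Conv shape (K*C*R*S) is
# an assumed example, not taken from the cited papers.
WEIGHT_ELEMS = 64 * 64 * 3 * 3  # K * C * R * S for one assumed layer

def weight_fetches(batch, reuse_across_batch):
    """Each image needs the full weight tensor once; with batch-level
    reuse the weights are fetched once and shared by all images."""
    return WEIGHT_ELEMS if reuse_across_batch else batch * WEIGHT_ELEMS

print(weight_fetches(1, False))  # N = 1: no batch reuse possible
print(weight_fetches(8, True))   # N = 8 with reuse: same traffic as N = 1
print(weight_fetches(8, False))  # N = 8 without reuse: 8x the traffic
```

An execution method that keeps weights stationary across the batch therefore cuts weight traffic by a factor of N, which is exactly the opportunity a fixed N = 1 formulation cannot see.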
“…Pruning the search space: The space of execution methods is vast because the total options for multi-level tiling of the loop nest range from several hundred to thousands [39], and for each tiling configuration the loops can be reordered in numerous ways. For example, we can organize a 7-deep loop nest of convolution into 7!…”
Section: Related Work
Confidence: 99%
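The 7! in the truncated quote above works out to 5040 loop orderings per tiling configuration, and constraining the loop order (as tools like Interstellar and dMazeRunner do) shrinks that factor sharply. A back-of-envelope sketch; the choice of fixing the relative order of 3 loops is an assumed example of such a constraint, not a rule from any cited work:

```python
# Size of the loop-ordering component of the search space, before and
# after a simple ordering constraint.
from math import factorial

total_orders = factorial(7)  # 5040 orderings of a 7-deep Conv loop nest

# Assumed pruning example: if a mapper fixes the relative order of 3 of
# the loops (e.g. the parallel/spatial ones), only 7!/3! orders remain.
pruned_orders = factorial(7) // factorial(3)

print(total_orders, pruned_orders)
```

Multiplied by the several hundred to thousands of tiling configurations the quote cites, even the pruned ordering count leaves a space far too large for naive exhaustive search, which motivates the pruning strategies these works propose.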