2016
DOI: 10.1145/2914770.2837636

SMO: an integrated approach to intra-array and inter-array storage optimization

Abstract: The polyhedral model provides an expressive intermediate representation that is convenient for the analysis and subsequent transformation of affine loop nests. Several heuristics exist for achieving complex program transformations in this model. However, there is also considerable scope to utilize this model to tackle the problem of automatic memory footprint optimization. In this paper, we present a new automatic storage optimization technique which can be used to achieve both intra-array as well as inter-array…
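
To make the notion of intra-array storage optimization concrete, here is a small hand-written C sketch of the classic contraction of a 1-D Jacobi-style stencil. It illustrates the kind of modulo storage mapping the abstract alludes to, not the algorithm proposed in the paper; the array name, sizes, and boundary handling are invented for the illustration.

```c
/* Minimal sketch of intra-array contraction (illustration only, not the
 * SMO algorithm).  A 1-D Jacobi-style stencil written naively needs a
 * full A[T][N] array; since only two consecutive time steps are ever
 * live at once, the time dimension can be folded with the storage
 * mapping (t, i) -> (t mod 2, i), shrinking the footprint to 2*N. */
#include <stdio.h>

#define T 100
#define N 1024

static double A[2][N];   /* contracted storage: 2*N cells instead of T*N */

int main(void) {
    for (int i = 0; i < N; i++)
        A[0][i] = (double)i;                 /* values at time step 0 */

    for (int t = 1; t < T; t++) {
        A[t % 2][0] = A[(t - 1) % 2][0];     /* carry boundary values */
        A[t % 2][N - 1] = A[(t - 1) % 2][N - 1];
        for (int i = 1; i < N - 1; i++)
            A[t % 2][i] = 0.5 * (A[(t - 1) % 2][i - 1] + A[(t - 1) % 2][i + 1]);
    }

    printf("%f\n", A[(T - 1) % 2][N / 2]);
    return 0;
}
```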

Cited by 4 publications (8 citation statements) | References 20 publications
“…Can we guarantee that we obtain the mapping that is dimension-wise optimal? • While Heuristic 1 can be generalized to inter-array optimizations where different statements have different mappings, as done by Bhaskaracharya for their intra-array mapping [3], it is unclear how to extend our lattice-based method with reuse vectors. • Although we have focused on compactness of the allocation in this paper, it is not always the best for performance.…”
Section: Discussion (mentioning)
confidence: 99%
“…The first difference is that we assume a single σ, i.e., σ_T = σ_S, so that we can work with the difference j − i. But we could also apply Heuristic 1 to map different arrays in the same memory space, each with a possibly different mapping, as explored in SMO [3]. The second difference is that, at each step, we remove all pairs such that σ(j) − σ(i) = 0 (as in the search for maximal parallelism [12]) while, in Pluto, dependences need to be kept for defining other dimensions for tiling.…”
Section: Link With Multi-dimensional Scheduling and Tiling (mentioning)
confidence: 99%
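
As a rough illustration of what mapping different arrays into the same memory space means in practice, the following hand-written C sketch lets two temporaries with non-overlapping lifetimes share a single buffer. The names (buf, B, C) and the liveness argument are invented for the example; it does not reproduce the heuristic discussed in the quote or the SMO algorithm itself.

```c
/* Illustration of inter-array reuse (not the quoted heuristic): B is
 * dead before C is first written, so both temporaries can legally be
 * mapped onto the same buffer. */
#include <stdio.h>

#define N 1000

static double a[N], d[N];
static double buf[N];      /* one buffer shared by the temporaries B and C */
#define B(i) buf[(i)]
#define C(i) buf[(i)]

int main(void) {
    for (int i = 0; i < N; i++) a[i] = (double)i;

    for (int i = 0; i < N; i++) B(i) = 2.0 * a[i];   /* produce B        */
    double s = 0.0;
    for (int i = 0; i < N; i++) s += B(i);           /* last use of B    */

    for (int i = 0; i < N; i++) C(i) = a[i] + s;     /* B is dead: reuse */
    for (int i = 0; i < N; i++) d[i] = C(i) * C(i);  /* consume C        */

    printf("%f %f\n", s, d[N / 2]);
    return 0;
}
```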
“…There is the possibility to contract array space after scheduling [6,7,11,19,25,34]. However, the re-scheduled program may inherently require more memory than the source program, especially if the scheduler is unaware that its decisions may increase the memory footprint.…”
Section: Related Work (mentioning)
confidence: 99%
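
The point that a re-scheduled program may need more memory than the source can be seen on a toy example: distributing a loop (a legal scheduling decision) forces a scalar temporary to be expanded into an array unless a later contraction pass shrinks it back. The sketch below is hand-written to illustrate this effect and is not taken from any of the cited papers.

```c
/* Toy example: how a scheduling decision can enlarge the memory footprint. */
#include <stdio.h>

#define N 1000

static double a[N], b[N];

int main(void) {
    for (int i = 0; i < N; i++) a[i] = (double)i;

    /* Source program: one scalar t is enough. */
    for (int i = 0; i < N; i++) {
        double t = 2.0 * a[i];
        b[i] = t + 1.0;
    }

    /* After loop distribution, t is live across the two loops for every i,
     * so the scalar must be expanded to an array of N elements. */
    static double t_exp[N];
    for (int i = 0; i < N; i++) t_exp[i] = 2.0 * a[i];
    for (int i = 0; i < N; i++) b[i] = t_exp[i] + 1.0;

    printf("%f\n", b[N / 2]);
    return 0;
}
```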
“…A conflict set as presented in [7,11] can be useful to determine whether a scalar is conflicting with an array (Listing 6 line 13). For the purpose of DeLICM we additionally need to analyze the stored content.…”
Section: Related Work (mentioning)
confidence: 99%
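
To illustrate why it matters whether a scalar conflicts with an array element, here is a hand-written before/after C sketch in the spirit of that kind of transformation: the scalar t can be mapped onto A[i] only because the old value of A[i] is already dead while t is live, i.e., the two do not conflict. This is a toy reconstruction, not code or the algorithm from DeLICM.

```c
/* Toy before/after sketch (not DeLICM itself): mapping a scalar onto an
 * array element is legal only when their live ranges do not conflict. */
#include <stdio.h>

#define N 1000

static double A[N];

/* Before: a scalar temporary t carries the value between statements. */
static void before(void) {
    for (int i = 0; i < N; i++) {
        double t = A[i] * 2.0;   /* old A[i] is dead after this read      */
        t = t + 1.0;             /* t is live only inside that dead range */
        A[i] = t;
    }
}

/* After: t is stored in A[i] itself; no extra scalar storage remains. */
static void after(void) {
    for (int i = 0; i < N; i++) {
        A[i] = A[i] * 2.0;
        A[i] = A[i] + 1.0;
    }
}

int main(void) {
    for (int i = 0; i < N; i++) A[i] = (double)i;
    before();
    after();
    printf("%f\n", A[N / 2]);
    return 0;
}
```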
“…Memory allocation for polyhedral programs is a well-studied problem, and there are two main approaches. One either does memory allocation after the schedule is chosen [4,5,19,20,43,52,64,66] since it often leads to a smaller memory footprint, or else uses a schedule-independent memory allocation, based on the so-called universal occupancy vectors (UOV). This problem is solved when the program has uniform dependences, i.e., when each dependence can be described by a constant vector, and for some simple extensions of this [57,72].…”
Section: A2 Memory Allocation (mentioning)
confidence: 99%
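
For the schedule-independent flavour, a small hand-written C sketch of the classic occupancy-vector example may help: for a uniform recurrence with dependences (1,0) and (0,1), the vector (1,1) is a valid universal occupancy vector in the UOV literature, so indexing storage by i − j (constant along that vector) needs only 2N − 1 cells instead of N², independently of which legal schedule executes the nest. The macro name A and the sizes are invented for the illustration, and the sketch follows the classic UOV example rather than any algorithm from this paper.

```c
/* Sketch of a schedule-independent allocation based on a universal
 * occupancy vector (UOV); illustration only. */
#include <stdio.h>

#define N 512

/* Values at (i, j) and (i+1, j+1) may share storage under the UOV (1, 1),
 * so we index by i - j, which is constant along that vector: 2N-1 cells
 * replace the full N*N array. */
static double buf[2 * N - 1];
#define A(i, j) buf[(i) - (j) + (N - 1)]

int main(void) {
    for (int i = 0; i < N; i++) A(i, 0) = 1.0;   /* boundary values */
    for (int j = 0; j < N; j++) A(0, j) = 1.0;

    /* Uniform dependences (1, 0) and (0, 1). */
    for (int i = 1; i < N; i++)
        for (int j = 1; j < N; j++)
            A(i, j) = 0.5 * (A(i - 1, j) + A(i, j - 1));

    printf("%f\n", A(N - 1, N - 1));
    return 0;
}
```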