Proceedings of the 7th International Symposium on Memory Management 2008
DOI: 10.1145/1375634.1375642

Memory management for self-adjusting computation

Abstract: The cost of reclaiming space with traversal-based garbage collection is inversely proportional to the amount of free memory, i.e., O(1/(1 − f)), where f is the fraction of memory that is live. Consequently, the cost of garbage collection can be very high when the size of the live data remains large relative to the available free space. Intuitively, this is because allocating a small amount of memory space will require the garbage collector to traverse a significant fraction of the memory only to discover litt…
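The O(1/(1 − f)) cost stated in the abstract can be illustrated with a small numerical sketch; the function name and the unit constant factor here are illustrative, not taken from the paper:

```python
# Sketch of the tracing-GC cost model from the abstract: with live
# fraction f, a collector traverses the live data to reclaim the free
# fraction (1 - f), so the amortized cost per reclaimed word grows as
# 1/(1 - f), up to a constant factor.

def gc_cost_per_free_word(f: float) -> float:
    """Amortized tracing cost per word reclaimed, up to a constant factor."""
    assert 0.0 <= f < 1.0
    return 1.0 / (1.0 - f)

for f in (0.5, 0.9, 0.99):
    print(f"live fraction {f:.2f}: relative cost {gc_cost_per_free_word(f):.0f}x")
```

As the live fraction approaches 1, the relative cost grows without bound (2x at f = 0.5, 10x at f = 0.9, 100x at f = 0.99), which is the regime the paper targets.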


Cited by 28 publications (34 citation statements); references 36 publications (48 reference statements).
“…Self-adjusting computation [3,6,5,24,15] offers a solution to the incremental-computation problem by enabling any computation to respond to changes in its data by efficiently recomputing only the subcomputations that are affected by the changes. To this end, a self-adjusting computation tracks dependencies between the inputs and outputs of subcomputations, and, in incremental runs, only rebuilds subcomputations affected (transitively) by modified inputs.…”
Section: Self-adjusting Computation
confidence: 99%
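The dependency-tracking idea described in the citation above — track which subcomputations read which inputs, and on a change re-run only the affected subcomputations — can be sketched minimally as follows. This is an illustrative toy, not the paper's implementation; the `Cell` and `Thunk` names are hypothetical:

```python
# Minimal sketch of self-adjusting computation: inputs record their
# readers, and writes rebuild only the subcomputations that depend
# (here, directly) on the changed input.

class Cell:
    """A modifiable input; records the thunks that read it."""
    def __init__(self, value):
        self.value = value
        self.readers = set()

    def read(self, thunk):
        self.readers.add(thunk)   # track the dependency
        return self.value

    def write(self, value):
        self.value = value
        for thunk in list(self.readers):
            thunk.recompute()     # rebuild only affected subcomputations

class Thunk:
    """A subcomputation whose cached result is rebuilt when an input changes."""
    def __init__(self, fn, *cells):
        self.fn, self.cells = fn, cells
        self.recompute()

    def recompute(self):
        self.result = self.fn(*(c.read(self) for c in self.cells))

a, b = Cell(1), Cell(2)
total = Thunk(lambda x, y: x + y, a, b)
print(total.result)  # 3
a.write(10)          # only `total` is re-run
print(total.result)  # 12
```

A full system additionally propagates changes transitively through a dependency graph and combines this with memoization, as the cited works describe.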
“…[35]). Recent advances on self-adjusting computation propose general-purpose techniques that can achieve optimal update times (e.g., [6,5,24]). Self-adjusting computation offers abstractions for automatic incrementalization, allowing programs written in a conventional style to be compiled to programs that can respond to changes in their data inputs automatically.…”
Section: Related Work
confidence: 99%
“…The first was a pure higher-order language with a modal type system that was implemented both as a Standard ML library with a monad and explicit destination-passing [2] and a Haskell library using several monads to enforce the modal constraints [6]. Subsequent proposals included a direct-style higher-order language compiled into a continuation-passing style (CPS) higher-order language implemented in the MLton Standard ML compiler [13], and a low-level imperative language implemented as a compiler for C [10]. All of these designs focus on strict languages with call-by-value (CBV) functions that eagerly evaluate function arguments, and none of them supported efficient reordering.…”
Section: Related Work
confidence: 99%
“…More recent work on self-adjusting computation introduced dynamic dependency graphs [3] and a way to integrate them with a form of memoization [2,4]. Several flavors of self-adjusting computation have been implemented in programming languages such as C [19], Haskell [8], Java [34] and Standard ML [27,10].…”
Section: Introduction
confidence: 99%