2007 Design, Automation & Test in Europe Conference & Exhibition
DOI: 10.1109/date.2007.364434

Dynamic Power Management under Uncertain Information

Cited by 16 publications (15 citation statements) · References 9 publications
“…To be more realistic, we consider in this work that the SR state cannot be directly obtained by the PM. In contrast to previous work on POMDPs [7], [8], the PM has no prior knowledge of the characteristics of the SR. Therefore, workload prediction has to be incorporated to provide partial information to the PM so that the PM can learn in the observation domain of the SR.…”
Section: System Model
confidence: 74%
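The excerpt above hinges on the PM maintaining a belief over the hidden SR state from observations alone. As a rough illustration (not the construction used in the cited paper), the sketch below performs a standard POMDP/Bayes-filter belief update; the two-state SR model and the matrices T and O are invented placeholders.

```python
import numpy as np

# Hypothetical two-state SR model: 0 = idle, 1 = busy.
T = np.array([[0.9, 0.1],    # T[s, s'] = P(next state s' | current state s)
              [0.3, 0.7]])
O = np.array([[0.8, 0.2],    # O[s', o] = P(observation o | next state s')
              [0.25, 0.75]])

def belief_update(belief, obs):
    """One Bayes-filter step: predict through T, then weight by the likelihood of obs."""
    predicted = belief @ T               # sum_s b(s) * P(s' | s)
    weighted = predicted * O[:, obs]     # multiply by P(obs | s')
    return weighted / weighted.sum()     # renormalize to a probability distribution

b = np.array([0.5, 0.5])                 # uninformed prior over the hidden SR state
for o in [1, 1, 0]:                      # a short, fabricated observation trace
    b = belief_update(b, o)
print(b)                                 # the PM's belief after three observations
```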
“…Other enhancements include the time-indexed semi-MDP of Simunic et al. [6]. To cope with uncertainties in the underlying hardware state, DPM policies based on partially observable Markov decision processes (POMDPs) have been proposed in [7] and [8]. Note that in the aforesaid stochastic DPM approaches, request inter-arrival times and system service times are modeled as stationary processes that satisfy certain probability distributions.…”
Section: Introduction
confidence: 98%
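As a concrete reading of the stationarity assumption mentioned in this excerpt, the following sketch fits an exponential model to observed request inter-arrival times and uses it in a naive shut-down test against a break-even time. The trace, break-even value, and decision threshold are fabricated for illustration and do not come from the cited works.

```python
import numpy as np

# Fabricated inter-arrival trace (seconds) standing in for measured requests.
inter_arrivals = np.array([12.0, 8.5, 15.2, 9.8, 11.1])
rate = 1.0 / inter_arrivals.mean()       # MLE of the exponential rate parameter

t_breakeven = 5.0                        # hypothetical idle time that amortizes sleep/wake energy

# Under a stationary exponential model, P(idle period > t) = exp(-rate * t).
p_long_idle = float(np.exp(-rate * t_breakeven))
print(f"P(idle > break-even) = {p_long_idle:.2f}")

# A naive threshold policy derived from the fitted distribution.
if p_long_idle > 0.5:
    print("policy: transition the device to sleep")
else:
    print("policy: stay active")
```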
“…The workload variation has a significant impact on the system performance and power consumption. Thus, a robust power management technique must consider the uncertainty and variability that emanate from the environment, hardware and application characteristics [11] and must be able to interact with the environment to obtain information that can be processed to produce optimal policies.…”
Section: Introduction
confidence: 99%
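One hedged way to picture "interacting with the environment to obtain information" is a model-free, tabular Q-learning loop over observed workload levels and power-state actions. The state labels, rewards, and toy dynamics below are assumptions made for illustration, not the method of any paper cited here.

```python
import random

ACTIONS = ["stay_active", "go_sleep"]
Q = {}                                   # Q[(observation, action)] -> value estimate
alpha, gamma, epsilon = 0.1, 0.95, 0.1   # learning rate, discount, exploration rate

def choose_action(obs):
    """Epsilon-greedy action selection over the learned Q-values."""
    if random.random() < epsilon:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q.get((obs, a), 0.0))

def update(obs, action, reward, next_obs):
    """Standard one-step Q-learning update."""
    best_next = max(Q.get((next_obs, a), 0.0) for a in ACTIONS)
    old = Q.get((obs, action), 0.0)
    Q[(obs, action)] = old + alpha * (reward + gamma * best_next - old)

# Toy interaction loop with fabricated environment responses: sleeping during
# an idle interval saves power (+1); sleeping just before a request arrives
# incurs a wake-up penalty (-2).
obs = "idle"
for _ in range(200):
    action = choose_action(obs)
    next_obs = random.choice(["idle", "busy"])
    reward = 1.0 if (action == "go_sleep" and next_obs == "idle") else (
             -2.0 if (action == "go_sleep" and next_obs == "busy") else 0.0)
    update(obs, action, reward, next_obs)
    obs = next_obs
```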
“…Stochastic methods are a natural choice for modeling and optimizing such systems. Stochastic models such as Markov decision processes [4], semi-Markov decision processes [5], and partially observable Markov decision processes [6], [7] have been investigated in previous DPM research. All of these models assume that the workload is intrinsically Markovian and that this embedded Markov model can be reconstructed (or trained) from the given observation sequence.…”
Section: Introduction
confidence: 99%
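The "reconstructed (or trained)" embedded Markov model amounts, in the simplest case, to counting transitions in the observation sequence and row-normalizing the counts. The sketch below does exactly that on a made-up two-level workload trace; the state labels and trace are illustrative assumptions.

```python
import numpy as np

states = ["low", "high"]                 # hypothetical workload levels
index = {s: i for i, s in enumerate(states)}

# Fabricated observation sequence standing in for a measured workload trace.
trace = ["low", "low", "high", "high", "high", "low", "low", "high"]

# Count state-to-state transitions observed in the trace.
counts = np.zeros((len(states), len(states)))
for prev, nxt in zip(trace, trace[1:]):
    counts[index[prev], index[nxt]] += 1

# Row-normalize to obtain the maximum-likelihood transition matrix.
T_hat = counts / counts.sum(axis=1, keepdims=True)
print(T_hat)
```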
“…The authors of [6] consider an integrated circuit under process, voltage, and temperature variations as a partially observable system. A POMDP is applied to search for the optimal power management policy of such a system.…”
Section: Introduction
confidence: 99%