Dynamic Speed Scaling Minimizing Expected Energy Consumption for Real-Time Tasks
Abstract
This paper proposes a Discrete Time Markov Decision Process (MDP) approach to compute the optimal on-line speed scaling policy that minimizes the energy consumption of a single processor executing a finite or infinite set of jobs with real-time constraints. We establish several qualitative properties of the optimal policy: monotonicity with respect to the job parameters and a comparison with on-line deterministic algorithms. Numerical experiments over several scenarios show that our approach performs well when compared with off-line optimal solutions and outperforms on-line solutions that are oblivious to statistical information on the jobs.
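To make the idea concrete, the following is a minimal sketch (not the paper's exact model) of a discrete-time speed-scaling MDP solved by backward induction: the state is the remaining work and the time left before a hard deadline, the action is the processor speed, the per-step energy is a convex power function of the speed, and extra work arrives randomly at each step. All names and parameters (ALPHA, S_MAX, W_MAX, HORIZON, P_ARRIVAL) are illustrative assumptions, not values from the paper.

```python
# Hypothetical toy illustration: a discrete-time MDP for single-processor
# speed scaling.  State = (remaining work w, steps left until the deadline t);
# action = speed s in {0,...,S_MAX} work units per step; per-step energy
# cost = s**ALPHA (convex power function); with probability P_ARRIVAL one
# extra unit of work arrives at each step.  Backward induction returns the
# policy minimizing expected energy while clearing the backlog by the deadline.

import math

ALPHA = 3          # power exponent (assumed convex power model)
S_MAX = 4          # maximum processor speed (work units per step)
W_MAX = 12         # largest backlog we track (arrivals above it are capped)
HORIZON = 8        # steps until the hard deadline
P_ARRIVAL = 0.3    # probability that one extra work unit arrives per step

INF = math.inf

# V[t][w] = minimal expected energy to finish w units within t remaining steps.
V = [[INF] * (W_MAX + 1) for _ in range(HORIZON + 1)]
policy = [[0] * (W_MAX + 1) for _ in range(HORIZON + 1)]

# Terminal condition: at the deadline, only an empty backlog is feasible.
V[0][0] = 0.0

for t in range(1, HORIZON + 1):
    for w in range(W_MAX + 1):
        best_cost, best_speed = INF, 0
        for s in range(min(w, S_MAX) + 1):
            w_served = w - s
            # Expected future cost over the random arrival of one work unit.
            arrive = V[t - 1][min(w_served + 1, W_MAX)]
            no_arrive = V[t - 1][w_served]
            future = P_ARRIVAL * arrive + (1 - P_ARRIVAL) * no_arrive
            cost = s ** ALPHA + future
            if cost < best_cost:
                best_cost, best_speed = cost, s
        V[t][w] = best_cost
        policy[t][w] = best_speed

# Example: optimal speed when 5 work units remain and 6 steps are left.
print("optimal speed:", policy[6][5], "expected energy:", V[6][5])
```

Because the expectation over arrivals enters the backward recursion, the resulting policy depends on the arrival statistics, which is the intuition behind comparing statistics-aware on-line policies with oblivious ones.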
Domains
Computer Science [cs]