2017
DOI: 10.1287/ijoc.2017.0758

Machine Speed Scaling by Adapting Methods for Convex Optimization with Submodular Constraints

Abstract. In this paper, we propose a new methodology for the speed-scaling problem based on its link to scheduling with controllable processing times and submodular optimization. It results in faster algorithms for traditional speed-scaling models, characterized by a common speed/energy function. Additionally, it efficiently handles the most general models with job-dependent speed/energy functions on single and multiple machines.
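For orientation, the "common speed/energy function" in the traditional models is usually the classic power function; the sketch below is stated under that assumption, in our own notation (the works $w_j$, speeds $s_j$, and exponent $\alpha$ are illustrative, not taken from the paper). Processing a job $j$ with work $w_j$ at constant speed $s_j$ takes time $p_j = w_j / s_j$ and, with power function $P(s) = s^{\alpha}$, $\alpha > 1$, consumes energy
\[
E_j \;=\; p_j \, s_j^{\alpha} \;=\; w_j \, s_j^{\alpha - 1} \;=\; w_j^{\alpha} \, p_j^{\,1-\alpha},
\]
which is a convex, decreasing function of the processing time $p_j$. This is what links speed scaling to scheduling with controllable processing times: total energy becomes a separable convex function of the processing times, and job-dependent models replace $s^{\alpha}$ by per-job convex speed/energy functions.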

Cited by 9 publications (9 citation statements)
References 38 publications (37 reference statements)

“…[127] extends this result to a range of different strictly convex cost functions for the case of continuous variables. Their result is used in [128] to solve optimization problems on graphs and in [153] to derive efficient algorithms for processor scheduling problems. For a special case of RAPs with nested constraints, the equivalence of (a, b, f)-separable RAPs is proven in [5] for the case where the functions f are strictly convex and differentiable, b = 0, and where the variables are continuous.…”
Section: Reduction Results in the Literature (mentioning, confidence: 99%)
“…Recently, [153] applied the equivalence result in [127] to improve the time complexity of several other speed scaling problems. Together with the result in this section, this suggests that there is a great potential for using the reduction result in this chapter to contribute to more efficient algorithms within this research field.…”
Section: 5.5 (mentioning, confidence: 99%)
“…5. In our recent paper Shioura, Shakhlevich and Strusevich (2017), we demonstrate how the flow and submodular optimization techniques can be applied to the off-line problems of speed scaling. These problems reduce to minimizing convex separable functions under submodular constraints.…”
Section: Discussion (mentioning, confidence: 99%)
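As a point of reference, the problem class named in the excerpt above ("minimizing convex separable functions under submodular constraints") commonly takes the following generic form; the notation here is ours, not a restatement of the paper's model:
\[
\min \; \sum_{j \in N} f_j(x_j)
\quad \text{s.t.} \quad
x(S) \le \varphi(S) \ \text{ for all } S \subseteq N,
\qquad
x(N) = \varphi(N),
\]
where each $f_j$ is convex, $x(S) = \sum_{j \in S} x_j$, and $\varphi$ is a submodular set function with $\varphi(\emptyset) = 0$. The feasible region is the base polytope of $\varphi$, which is what makes flow and submodular optimization techniques applicable.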
“…Recently, Shioura et al. (2017) applied the equivalence result in Nagano and Aihara (2012) to improve the time complexity of several other speed-scaling problems. Together with the result in this section, this suggests that there is a great potential for using the reduction result in this article to contribute to more efficient algorithms within this research field.…”
Section: Storage Operation in Energy Systems (mentioning, confidence: 99%)
“…Nagano and Aihara (2012) extend this result to a range of different strictly convex cost functions for the case of continuous variables. Their result is used by Nagano and Kawahara (2013) to solve optimization problems on graphs and by Shioura et al. (2017) to derive efficient algorithms for processor scheduling problems. For a special case of RAPs with nested constraints, the equivalence of (a, b, f)-separable RAPs is proven by Akhil and Sundaresan (2018) for the case in which the functions f are strictly convex and differentiable, b = 0, and with continuous variables.…”
Section: Introduction (mentioning, confidence: 99%)
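For completeness, the nested-constraint RAPs mentioned in the excerpt above are a well-known special case of the submodular setting; the chain and capacities below are a generic sketch in our notation, not taken from the cited papers. Given a chain $N_1 \subset N_2 \subset \dots \subset N_m = N$ and nonnegative capacities, which may be assumed nondecreasing, $c_1 \le c_2 \le \dots \le c_m$ (each $c_k$ can be replaced by $\min_{l \ge k} c_l$ without changing the feasible set), the nested constraints
\[
x(N_k) \le c_k, \quad k = 1, \dots, m, \qquad x \ge 0,
\]
describe a polymatroid with rank function $\varphi(S) = \min\{c_k : S \subseteq N_k\}$ (and $\varphi(\emptyset) = 0$), which is monotone and submodular. Separable convex RAPs over such chains are therefore covered by the general submodular-constrained machinery discussed in these excerpts.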