Proceedings of the 4th Conference on Innovations in Theoretical Computer Science 2013
DOI: 10.1145/2422436.2422491

Robust optimization in the presence of uncertainty

Abstract: We study optimization in the presence of uncertainty, such as noise in measurements, and advocate a novel approach to tackling it. The main difference from existing approaches is that we do not assume any knowledge about the nature of the uncertainty (such as, for instance, a probability distribution). Instead, we are given several instances of the same optimization problem as input, and, assuming they are typical w.r.t. the uncertainty, we make use of this in order to compute a solution that is good for the sample…
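
The abstract is truncated here, but the setting it describes can be made concrete with a small, purely illustrative sketch: several noisy instances of one problem are given, and a solution is chosen that performs well on all of them. The min-max rule below is an assumption made only for illustration, not the method proposed in the paper (which, as the citing works note, is built around counting and sampling near-optimal solutions); all names and data are invented.

```python
# Illustrative sketch only (all names and data below are invented): given several
# noisy instances of the same problem, pick the candidate solution whose worst-case
# total cost over the sample is smallest.  This min-max selection is not the
# algorithm of the paper; it merely makes the "good for the sample" idea concrete.
from itertools import combinations
from typing import Dict, FrozenSet, List


def robust_choice(instances: List[Dict[str, float]],
                  candidates: List[FrozenSet[str]]) -> FrozenSet[str]:
    """Candidate whose worst-case total cost over all sampled instances is minimal."""
    def worst_cost(sol: FrozenSet[str]) -> float:
        return max(sum(inst[e] for e in sol) for inst in instances)
    return min(candidates, key=worst_cost)


# Toy setting: pick 2 of 4 items; two noisy measurements of the item costs are given.
items = ["a", "b", "c", "d"]
instances = [
    {"a": 1.0, "b": 5.0, "c": 2.0, "d": 4.0},
    {"a": 4.0, "b": 1.5, "c": 2.5, "d": 4.5},
]
candidates = [frozenset(c) for c in combinations(items, 2)]
print(robust_choice(instances, candidates))   # a pair that is cheap under both instances
```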

Cited by 12 publications (12 citation statements). References 15 publications.
“…Our contribution, therefore, provides a faster solution than explicit enumeration for the problems where counting of approximate solutions is required [12]. Counting and sampling from close-to-optimum solutions is the key element of the recent optimization method with uncertain input data of Buhmann et al. [2]. Our work thus makes a step towards practical algorithms in this context.…”
Section: Dynamic Programming as Shortest-Path Computation in DAGs (mentioning)
confidence: 97%
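
For readers unfamiliar with the counting step this citation refers to, here is a minimal sketch, under the assumption of a DAG with non-negative integer edge weights, of counting close-to-optimum s-t paths by dynamic programming over a weight budget. It only illustrates the flavour of the task; it is neither the faster algorithm of the citing paper nor the method of [2], and the function name and toy graph are invented.

```python
# Minimal, assumption-laden sketch: count s-t paths in a DAG whose total integer
# weight stays within a budget B, e.g. B = floor((1 + gamma) * opt) for some
# approximation ratio gamma.
from collections import defaultdict
from graphlib import TopologicalSorter


def count_paths_within_budget(edges, s, t, budget):
    """edges: dict {u: [(v, w), ...]} with non-negative integer weights w."""
    preds = defaultdict(list)                 # v -> list of (predecessor, weight)
    for u, outs in edges.items():
        for v, w in outs:
            preds[v].append((u, w))
    deps = {v: [u for u, _ in ins] for v, ins in preds.items()}
    deps.setdefault(s, [])
    # dp[v][c] = number of s-v paths with total weight exactly c (only c <= budget kept)
    dp = defaultdict(lambda: defaultdict(int))
    dp[s][0] = 1
    for v in TopologicalSorter(deps).static_order():
        for u, w in preds[v]:
            for c, n in list(dp[u].items()):
                if c + w <= budget:
                    dp[v][c + w] += n
    return sum(dp[t].values())


# Toy DAG: two s-t paths, both of weight 3; with budget 3 both are counted.
edges = {"s": [("a", 1), ("b", 2)], "a": [("t", 2)], "b": [("t", 1)]}
print(count_paths_within_budget(edges, "s", "t", budget=3))   # -> 2
```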
“…We achieve our result by a modification of a conceptually interesting dynamic program for all feasible solutions for the knapsack problem [13]. Our motivation for studying our counting problem comes from a new approach [2] to cope with uncertainty in optimization problems. There, we not only need to count the number of approximate solutions for a given problem instance, but we also need to count the number of solutions that are approximate (within a given approximation ratio) for two problem instances at the same time.…”
Section: Introduction (mentioning)
confidence: 99%
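
As a rough illustration of the kind of dynamic program for all feasible knapsack solutions mentioned above (the citing paper modifies it further, e.g. to count solutions that are approximate for two instances simultaneously, which is not shown here), a standard counting DP over the capacity looks as follows; the function name and the toy instance are invented.

```python
# Hedged sketch: count ALL feasible solutions of a 0/1 knapsack instance, i.e. the
# subsets whose total weight fits the capacity.  This shows the counting idea only,
# not the modified DP of the citing paper.
def count_feasible_knapsack(weights, capacity):
    """Number of subsets of `weights` (non-negative integers) with total weight <= capacity."""
    # dp[c] = number of subsets seen so far with total weight exactly c
    dp = [0] * (capacity + 1)
    dp[0] = 1                                   # the empty subset
    for w in weights:
        for c in range(capacity, w - 1, -1):    # iterate downwards: each item used at most once
            dp[c] += dp[c - w]
    return sum(dp)


print(count_feasible_knapsack([2, 3, 4], capacity=5))   # {}, {2}, {3}, {4}, {2,3} -> 5
```

The pseudo-polynomial table over capacities keeps the count exact without ever listing the up-to-2^n subsets explicitly, which is what makes such counting DPs attractive in the first place.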
“…(1)). In general, computation or at least estimation of these cardinalities requires enumerating the elements of S and testing whether they belong to C_γ [5]. This operation is computationally hard, since S grows exponentially with the number of vertices and the enumeration problems are often in #P. For algorithms, these enumeration problems arise in a constrained form, i.e., the algorithm itself might help to optimize this enumeration, eliminating all those solutions that get "out of consideration" as the algorithm progresses.…”
Section: Algorithm Regularization for Minimum Spanning Trees (mentioning)
confidence: 99%
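
To see why the enumeration described above is expensive, a brute-force sketch (workable only for tiny graphs, and not the regularization algorithm of the citing paper) can enumerate all spanning trees and count those whose weight is within a factor (1 + γ) of the minimum; the helper names below are invented.

```python
# Brute-force illustration (assumption: tiny graphs only) of computing |C_gamma| by
# enumeration: list every spanning tree and count those whose weight is within
# (1 + gamma) times the minimum spanning tree weight.
from itertools import combinations


def spanning_trees(n, edges):
    """Yield all spanning trees of a graph on vertices 0..n-1; edges = [(u, v, w), ...]."""
    for subset in combinations(edges, n - 1):
        parent = list(range(n))                 # fresh union-find for each candidate subset

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]   # path halving
                x = parent[x]
            return x

        acyclic = True
        for u, v, _ in subset:
            ru, rv = find(u), find(v)
            if ru == rv:
                acyclic = False
                break
            parent[ru] = rv
        if acyclic:                             # n-1 edges and no cycle => spanning tree
            yield subset


def count_gamma_approximate_trees(n, edges, gamma):
    weights = [sum(w for _, _, w in t) for t in spanning_trees(n, edges)]
    opt = min(weights)
    return sum(w <= (1 + gamma) * opt for w in weights)


edges = [(0, 1, 1), (1, 2, 2), (0, 2, 2), (2, 3, 1), (1, 3, 3)]
print(count_gamma_approximate_trees(4, edges, gamma=0.25))   # -> 4 near-optimal trees
```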
“…Both approaches and related robust estimates are described in Section 2. Section 3 introduces a new class of estimators, obtained by relaxing the constraint of optimality and defining regions of acceptability, similarly to [23] in discrete combinatorial problems, or [24, 25] in operations research. The rationale behind this relaxation is to be able to construct an estimate k̂ which produces values of the cost function close enough to the minimal cost attainable given the configuration induced by u ∈ U, with high enough probability.…”
Section: Introduction (mentioning)
confidence: 99%