Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence 2020
DOI: 10.24963/ijcai.2020/297
Non-monotone DR-submodular Maximization over General Convex Sets

Abstract: Many real-world problems can often be cast as the optimization of DR-submodular functions defined over a convex domain. These functions play an important role with applications in many areas of applied mathematics, such as machine learning, computer vision, operations research, communication systems, or economics. In addition, they capture a subclass of non-convex optimization that provides both practical and theoretical guarantees. In this paper, we show that for maximizing non-monotone DR-submodular f…

Cited by 9 publications (18 citation statements). References 15 publications (1 reference statement).
“…In particular, Durr et al (2021) show that, under the previous assumptions (i)-(iv), there exists a Frank-Wolfe type of algorithm with…”
Section: Introduction
confidence: 91%
“…While it is highly desirable to design efficient approximation algorithms under this general setting where neither the objective function is monotonic nor the feasible set is down-closed, unfortunately, Vondrák (2013) shows that such a problem admits no constant approximation under the value oracle model. Our main contribution is to present a 1/4 = 0.25-approximation Frank-Wolfe type of algorithm with a sub-exponential time-complexity under the value oracle model, improving a previous approximation ratio of 1/(3√3) ≈ 0.1924 with the same order of time-complexity by Durr et al (2021).…”
confidence: 92%
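The improvement quoted above is easy to verify numerically. The following snippet (not part of the excerpt) simply checks that the 1/4 = 0.25 ratio of Du et al (2022) exceeds the 1/(3√3) ≈ 0.1924 ratio of Durr et al (2021):

```python
import math

# Approximation ratio of Durr et al. (2021): 1 / (3 * sqrt(3))
old_ratio = 1 / (3 * math.sqrt(3))
# Approximation ratio of Du et al. (2022): 1 / 4
new_ratio = 1 / 4

assert abs(old_ratio - 0.1924) < 1e-3  # matches the quoted ≈ 0.1924
assert new_ratio > old_ratio           # the 2022 ratio is strictly larger
```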
“…This problem was studied in Durr et al (2021) and Du et al (2022), with the latter improving the approximation ratio in the former. Algorithm 3 recovers the algorithm in Du et al (2022).…”
Section: Example 3: Nonnegative Non-monotone Submodular and Solvable ...
confidence: 99%
“…Motivated by the above-mentioned situation, a few recent works started to consider DR-submodular maximization subject to a general (not necessarily down-closed) convex set constraint K. In general, no constant approximation ratio can be guaranteed for this problem in sub-exponential time due to a hardness result by Vondrák [27]. However, Durr et al [13] showed that this inapproximability result can be bypassed when the convex set constraint K includes points whose ℓ∞-norm is less than the maximal value of 1. Specifically, Durr et al [13] showed a sub-exponential time offline algorithm guaranteeing a (1/(3√3))·(1 − m)-approximation for this problem, where m is the minimal ℓ∞-norm of any vector in K. Later, Thắng & Srivastav [25] showed how to obtain a similar result in an online (regret minimization) setting, and an improved sub-exponential offline algorithm obtaining a (1/4)·(1 − m)-approximation was suggested by Du et al [12].…”
Section: Introduction
confidence: 99%
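The citation statements above refer to Frank-Wolfe type algorithms without reproducing them. As an illustration only, the sketch below runs a generic Frank-Wolfe ascent on a toy DR-submodular objective over the box [0,1]^n; the objective, step schedule, and linear oracle are assumptions for exposition and are not the algorithms of Durr et al or Du et al:

```python
import numpy as np

# Toy objective: f(x) = sum(x) - 0.5 * x^T A x with A entrywise nonnegative.
# Its Hessian is -A <= 0 entrywise, so f is DR-submodular on [0, 1]^n.
rng = np.random.default_rng(0)
n = 4
A = np.abs(rng.normal(size=(n, n)))
A = (A + A.T) / 2  # symmetrize

def f(x):
    return x.sum() - 0.5 * x @ A @ x

def grad_f(x):
    return np.ones(n) - A @ x

def frank_wolfe_max(x0, steps=200):
    """Generic Frank-Wolfe ascent over the box [0, 1]^n (illustrative only)."""
    x = x0.copy()
    for t in range(steps):
        g = grad_f(x)
        # Linear maximization oracle over [0, 1]^n: set v_i = 1 where g_i > 0.
        v = (g > 0).astype(float)
        gamma = 2.0 / (t + 2)          # standard diminishing step size
        x = x + gamma * (v - x)        # convex combination stays in the box
    return x

x_star = frank_wolfe_max(np.zeros(n))
```

Each iterate is a convex combination of feasible points, so the method never leaves the constraint set; this is the property that makes Frank-Wolfe style methods natural for general convex-set constraints, where projections may be expensive.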