Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence 2018
DOI: 10.24963/ijcai.2018/207
Distributed Pareto Optimization for Subset Selection

Abstract: The subset selection problem, which asks for a few items to be selected from a ground set, arises in many applications such as maximum coverage, influence maximization, and sparse regression. The recently proposed POSS algorithm is a powerful approximation solver for this problem. However, POSS requires centralized access to the full ground set, and is thus impractical for large-scale real-world applications where the ground set is too large to be stored on a single machine. In this paper, we propose a distributed version of…
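To make the bi-objective formulation behind POSS concrete, here is a minimal Python sketch of the Pareto optimization approach the abstract refers to. The iteration budget, mutation scheme details, and archive bookkeeping are illustrative assumptions, not the paper's exact configuration; the theoretical analysis uses a budget on the order of 2·e·n·k².

```python
import random

def poss(f, n, k, iters=2000, seed=0):
    """Minimal POSS sketch: treat subset selection as a bi-objective
    problem (maximize f(x), minimize |x|) and evolve a Pareto archive.

    f     : monotone set function, f(frozenset of item indices) -> float
    n     : size of the ground set {0, ..., n-1}
    k     : cardinality budget
    iters : iteration budget (illustrative; the theory uses ~2*e*n*k^2)
    """
    rng = random.Random(seed)
    archive = {frozenset(): f(frozenset())}  # non-dominated solutions

    def strictly_dominates(fa, sa, fb, sb):
        # (fa, sa) is at least as good on both objectives (higher
        # f-value, smaller size) and strictly better on one of them.
        return fa >= fb and sa <= sb and (fa > fb or sa < sb)

    for _ in range(iters):
        parent = rng.choice(list(archive))
        # Bit-wise mutation: flip each item's membership w.p. 1/n.
        child = set(parent)
        for i in range(n):
            if rng.random() < 1.0 / n:
                child ^= {i}
        child = frozenset(child)
        if len(child) >= 2 * k:  # POSS discards clearly infeasible sizes
            continue
        fc = f(child)
        # Keep the child only if no archived solution dominates it.
        if any(strictly_dominates(fa, len(a), fc, len(child))
               for a, fa in archive.items()):
            continue
        # Remove archived solutions the child weakly dominates.
        archive = {a: fa for a, fa in archive.items()
                   if not (fc >= fa and len(child) <= len(a))}
        archive[child] = fc

    # Report the best feasible solution found (|x| <= k).
    feasible = {a: fa for a, fa in archive.items() if len(a) <= k}
    return max(feasible, key=feasible.get)

# Toy check: maximum coverage with 6 candidate sets and budget k = 3.
sets = [{0, 1, 2}, {2, 3}, {4, 5, 6}, {6, 7}, {8, 9}, {0, 9}]
cover = lambda x: len(set().union(*(sets[i] for i in x))) if x else 0
print(sorted(poss(cover, n=len(sets), k=3)))
```

The key design point is that the set size is kept as an explicit second objective rather than a hard constraint, so the archive retains small partial solutions that can later escape local optima of f.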

Cited by 17 publications (12 citation statements)
References 10 publications
“…It is worth mentioning that recently some work (Qian et al. 2015, 2016, 2018) converts a general subset selection problem into a bi-objective problem with an additional objective, the set size, and uses a simple evolutionary algorithm to optimise it. Such Pareto optimisation for subset selection can achieve the same general approximation guarantee as the greedy algorithm, but has a better ability to avoid local optima.…”
Section: Subset Selection
confidence: 99%
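For contrast with the bi-objective sketch above, here is the standard greedy baseline that the quoted statement compares against; `f` is again an arbitrary monotone set function over item indices, and the signature mirrors the `poss` sketch rather than any particular library.

```python
def greedy(f, n, k):
    """Standard greedy baseline: repeatedly add the item with the
    largest marginal gain until |X| = k. For monotone submodular f
    this gives the classic (1 - 1/e) approximation guarantee."""
    chosen = frozenset()
    for _ in range(min(k, n)):
        best = max((i for i in range(n) if i not in chosen),
                   key=lambda i: f(chosen | {i}))
        chosen = chosen | {best}
    return chosen
```

Greedy commits to one item per step and can never undo a choice, which is exactly the local-optimum behaviour the Pareto approach is said to mitigate.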
“…For concise illustration, we will mainly show the difference in the proof of Theorem 3. [34] Assume that a set function f is monotone and non-submodular. Then, for any x ∈ {0,1}^n, there…”
Section: Theorem
confidence: 99%
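The quoted theorem is truncated, so its exact statement cannot be recovered here. For context only, the general guarantee established in the POSS line of work for monotone but not necessarily submodular objectives is usually stated via the submodularity ratio γ; the following is a sketch of that known result, not the truncated Theorem 3 itself.

```latex
% Known guarantee from the POSS line of work (Qian et al., 2015):
% for a monotone objective f with submodularity ratio \gamma \in [0,1],
% POSS returns, within a polynomial expected number of iterations,
% a subset X with |X| \le k such that
f(X) \;\ge\; \left(1 - e^{-\gamma}\right) \cdot \max_{|Z| \le k} f(Z)
```

For submodular f we have γ = 1, and the bound reduces to the familiar (1 − 1/e) factor of the greedy algorithm.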
“…One is to enhance the search ability of existing EAs by re-scheduling the local search operators [20], [21], while the other is to simplify the problem via Divide-and-Conquer (DC) [22], [23], [24] or dimension reduction [25], [26]. To save computational time in each iteration, parallel or distributed computing techniques are frequently employed to optimize individual solutions or decision variables on different threads [12], [13], [27], [28], [29]. Among them, the DC methodology has frequently been introduced into EAs for large-scale optimization problems, since it can be regarded as an integrated solution that improves both of the above factors for EAs.…”
Section: The Divide-and-Conquer Based EAs
confidence: 99%
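To illustrate the divide-and-conquer pattern in this setting, here is a minimal sketch of a two-round scheme (partition the ground set, solve locally, then solve once more over the pooled local winners), reusing the `poss`/`greedy` sketches above as the local solver. The random partitioning and the plain-union merge rule are illustrative assumptions, not the paper's exact distributed protocol.

```python
import random

def distributed_select(f, n, k, machines=4, solver=greedy, seed=0):
    """Two-round divide-and-conquer sketch for subset selection when
    the ground set is too large for one machine.

    Round 1: partition the ground set across `machines`; each machine
             selects a local subset of size <= k (parallel in practice).
    Round 2: pool all local winners and run the solver once more on
             this much smaller candidate set.
    """
    rng = random.Random(seed)
    items = list(range(n))
    rng.shuffle(items)
    parts = [items[m::machines] for m in range(machines)]

    pooled = []
    for part in parts:
        # Restrict f to this partition by re-indexing its items.
        local_f = lambda x, part=part: f(frozenset(part[i] for i in x))
        pooled.extend(part[i] for i in solver(local_f, len(part), k))

    # Round 2: final selection over the pooled local winners.
    pooled_f = lambda x: f(frozenset(pooled[i] for i in x))
    winner = solver(pooled_f, len(pooled), k)
    return frozenset(pooled[i] for i in winner)
```

The appeal of this pattern is that each machine only ever evaluates f on items from its own partition, and the second round touches at most machines·k candidates, so no node needs the full ground set in memory.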