2019
DOI: 10.48550/arxiv.1901.02051
Preprint
DPPNet: Approximating Determinantal Point Processes with Deep Networks

Abstract: Determinantal Point Processes (DPPs) provide an elegant and versatile way to sample sets of items that balance the point-wise quality with the set-wise diversity of selected items. For this reason, they have gained prominence in many machine learning applications that rely on subset selection. However, sampling from a DPP over a ground set of size N is a costly operation, requiring in general an O(N^3) preprocessing cost and an O(Nk^3) sampling cost for subsets of size k. We approach this problem by introdu…
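To make the quality/diversity trade-off concrete, here is a minimal sketch (not the paper's DPPNet) of the standard L-ensemble formulation: a DPP over a ground set of N items, parameterized by an N x N positive semi-definite kernel L, assigns a subset S probability proportional to det(L_S), the determinant of the submatrix indexed by S. Near-duplicate items make L_S close to singular, so diverse subsets score higher. The kernel and feature values below are illustrative assumptions.

```python
import numpy as np

def dpp_unnormalized_prob(L, subset):
    """Return det(L_S): the unnormalized L-ensemble DPP probability of `subset`."""
    idx = np.asarray(subset)
    return float(np.linalg.det(L[np.ix_(idx, idx)]))

# Toy ground set: items 0 and 1 are near-duplicates, item 2 is dissimilar.
feats = np.array([[1.0, 0.0],
                  [0.99, 0.1],
                  [0.0, 1.0]])
L = feats @ feats.T  # Gram matrix, hence positive semi-definite

similar = dpp_unnormalized_prob(L, [0, 1])  # near-duplicate pair -> small det
diverse = dpp_unnormalized_prob(L, [0, 2])  # dissimilar pair -> large det
print(similar, diverse)
```

Exact sampling from this distribution is what incurs the O(N^3) eigendecomposition cost the abstract refers to; the determinant evaluation above only scores a single fixed subset.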

Cited by 1 publication (1 citation statement)
References 21 publications (23 reference statements)
“…Particular attention has been paid to determinantal point processes due to the intuitive way they capture negative dependence, and the fact that they are parameterized by a single positive semi-definite kernel matrix. Convenient parameterization has allowed an abundance of fast algorithms for learning the kernel matrix [23,26,43,47], and sampling [2,40,46]. SR distributions are a fascinating and elegant probabilistic family whose applicability in machine learning is still an emerging topic [17,34,41,45].…”
Section: Related Work
confidence: 99%