2020 · Preprint
DOI: 10.48550/arxiv.2006.15412

Submodular Combinatorial Information Measures with Applications in Machine Learning

Rishabh Iyer, Ninad Khargonkar, Jeff Bilmes, et al.

Abstract: Information-theoretic quantities like entropy and mutual information have found numerous uses in machine learning. It is well known that there is a strong connection between these entropic quantities and submodularity since entropy over a set of random variables is submodular. In this paper, we study combinatorial information measures that generalize independence, (conditional) entropy, (conditional) mutual information, and total correlation defined over sets of (not necessarily random) variables. These measur…
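For context, the submodularity the abstract invokes is the diminishing-returns property of set functions; a minimal statement of it, together with the entropy fact the paper builds on (standard definitions, restated here rather than quoted from the paper):

    f(A ∪ {v}) − f(A) ≥ f(B ∪ {v}) − f(B)   for all A ⊆ B ⊆ V and v ∈ V \ B.

Entropy over subsets of random variables, f(S) = H(X_S), satisfies this because conditioning never increases entropy: the marginal gain f(S ∪ {v}) − f(S) = H(X_v | X_S) shrinks as S grows.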

Cited by 4 publications (6 citation statements) · References 29 publications

Citation statements (ordered by relevance):
“…are some examples [8]. Submodular Mutual Information (SMI): Given a set of items A, Q ⊆ V, the submodular mutual information (MI) [5,7] is defined as…”
Section: Preliminaries
confidence: 99%
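The definition this snippet truncates is the paper's submodular mutual information; restated for a submodular function f over ground set V (our restatement of the cited definition):

    I_f(A; Q) = f(A) + f(Q) − f(A ∪ Q).

When f is set entropy, f(S) = H(X_S), this recovers Shannon mutual information: I(X_A; X_Q) = H(X_A) + H(X_Q) − H(X_{A ∪ Q}).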
“…For balanced subset selection via Basil we use the recently introduced SMI functions in [7,5] and their extensions introduced in [14] as acquisition functions. Note that we only use a subset of the functions presented in [14], those that are the most scalable, for per-class selection of data points.…”
Section: Examples of SMI Functions
confidence: 99%
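To make the acquisition-function role concrete: a minimal sketch of greedy per-class selection that scores candidates by their SMI with a query set, assuming a facility-location f and a precomputed similarity matrix. The names (facility_location, smi, greedy_select) and the plain greedy loop are illustrative assumptions, not Basil's actual implementation.

import numpy as np

def facility_location(sim, S):
    # f(S) = sum over every item i of its best similarity to S; f(∅) = 0.
    if not S:
        return 0.0
    return sim[:, sorted(S)].max(axis=1).sum()

def smi(sim, A, Q):
    # Submodular mutual information: I_f(A; Q) = f(A) + f(Q) − f(A ∪ Q).
    return (facility_location(sim, A) + facility_location(sim, Q)
            - facility_location(sim, A | Q))

def greedy_select(sim, Q, budget):
    # Repeatedly add the candidate with the largest SMI with the query set Q
    # (e.g. exemplars of one class); diminishing returns discourages
    # redundant picks, yielding a query-relevant yet diverse batch.
    A = set()
    candidates = set(range(sim.shape[0])) - Q
    for _ in range(budget):
        v = max(candidates, key=lambda u: smi(sim, A | {u}, Q))
        A.add(v)
        candidates.remove(v)
    return A

# Toy usage: 30 items, symmetric nonnegative similarities, a 3-item query.
rng = np.random.default_rng(0)
M = rng.random((30, 30))
sim = (M + M.T) / 2
print(sorted(greedy_select(sim, Q={0, 1, 2}, budget=5)))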
“…are some examples [9]. Submodular Mutual Information (SMI): Given a set of items A, Q ⊆ V, the submodular mutual information (MI) [6,8] is defined as…”
Section: Preliminaries
confidence: 99%
“…Wang et al [191] apply the 3-increasing condition to develop a variation of the continuous greedy algorithm, and Chen et al [44] study functions with alternating monotonicity conditions, something we will also encounter. Measures of redundancy [68] and analogues of mutual information [94] also fit the form of second derivatives and are highly relevant to works on 3-increasing functions.…”
Section: 5.1d Higher-Order Monotonicity and Derivatives
confidence: 99%
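For reference, the "second derivatives" and "3-increasing" language above uses the standard discrete derivatives of a set function (standard definitions, restated here):

    ∂_v f(A) = f(A ∪ {v}) − f(A),
    ∂_u ∂_v f(A) = f(A ∪ {u, v}) − f(A ∪ {u}) − f(A ∪ {v}) + f(A),   u, v ∉ A.

f is submodular iff ∂_u ∂_v f(A) ≤ 0 everywhere, and a function is 3-increasing when the analogous third derivative ∂_w ∂_u ∂_v f(A) is nonnegative everywhere.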