2014 IEEE International Symposium on Information Theory 2014
DOI: 10.1109/isit.2014.6875120

Sorting with adversarial comparators and application to density estimation

Abstract: We study maximum selection and sorting of n numbers using pairwise comparators that output the larger of their two inputs if the inputs are more than a given threshold apart, and output an adversarially chosen input otherwise. We consider two adversarial models: a non-adaptive adversary that decides on the outcomes in advance based solely on the inputs, and an adaptive adversary that can decide on the outcome of each query depending on previous queries and outcomes. Against the non-adaptive adversary, we derive …
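As a hedged illustration of this comparator model (not an algorithm from the paper), the non-adaptive adversary can be simulated by fixing in advance the outcome of every near-tie pair; running naive sequential max selection against it shows how faulty near-tie comparisons can cascade. The function names and the random tie-fixing policy below are assumptions for the sketch:

```python
import random

def make_adversarial_comparator(tau, rng):
    """Non-adaptive adversary: the outcome of every pair within tau
    is decided independently of the query sequence; here we fix each
    close pair's outcome once, at random, the first time it is seen."""
    fixed = {}

    def compare(x, y):
        if abs(x - y) > tau:
            return max(x, y)              # comparator is reliable here
        key = (min(x, y), max(x, y))
        if key not in fixed:              # decide each close pair once
            fixed[key] = rng.choice(key)
        return fixed[key]

    return compare

def sequential_max(values, compare):
    """Naive max selection with a running winner. Near-tie errors can
    cascade, so in the worst case the returned value may trail the
    true maximum by up to (n-1) * tau."""
    winner = values[0]
    for v in values[1:]:
        winner = compare(winner, v)
    return winner

compare = make_adversarial_comparator(tau=1.0, rng=random.Random(0))
values = [3.0, 3.5, 9.0, 8.6, 2.0]
print(sequential_max(values, compare))   # 9.0 or 8.6, depending on the adversary
```

In this toy run only one near-tie involves the maximum, so the output stays within tau of it; the paper's point is that against a worst-case non-adaptive adversary, smarter (randomized) selection rules are needed to bound the loss.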


Cited by 12 publications (40 citation statements). References 12 publications.
“…We note that the quadratic running time can be brought down to near-linear, as shown in [DK14,AJOS14] (with the same sample complexity, although at the price of a worse semi-agnostic constant). This improved running time, however, is not crucial for our applications.…”
Section: Removing the Assumption On Knowing OPT
confidence: 69%
“…Theorem 12) provide broad generalizations of the bounds for learning Ising models and Gaussian MRFs presented in recent work of Devroye et al. [29]. Their bounds are obtained by bounding the VC complexity of the Yatracos class induced by the distributions of interest, while our bounds are obtained by constructing ε-nets of the distributions of interest, and running a tournament-style hypothesis selection algorithm [28,25,1] to select one distribution from the net. Since the distribution families we consider are non-parametric, our main technical contribution is to bound the size of an ε-net sufficient to cover the distributions of interest.…”
Section: Setting Revenue Guarantee and Sample Complexity Prior Result…
confidence: 91%
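The tournament-style hypothesis selection cited here is commonly instantiated as a Scheffé tournament. A minimal sketch for discrete distributions follows; the function name, dict-based representation, and most-wins scoring rule are illustrative assumptions, not the exact procedure of [28,25,1]:

```python
def scheffe_tournament(hypotheses, samples, support):
    """Pick the hypothesis winning the most pairwise 'duels'.
    hypotheses: list of dicts mapping each support point to a probability.
    For each pair (i, j), the Scheffe set is A = {x : p_i(x) > p_j(x)};
    the duel winner is the hypothesis whose mass on A is closer to the
    empirical mass that the samples place on A."""
    n = len(samples)
    emp = {x: 0.0 for x in support}
    for s in samples:
        emp[s] += 1.0 / n
    wins = [0] * len(hypotheses)
    for i in range(len(hypotheses)):
        for j in range(i + 1, len(hypotheses)):
            A = [x for x in support if hypotheses[i][x] > hypotheses[j][x]]
            pi = sum(hypotheses[i][x] for x in A)
            pj = sum(hypotheses[j][x] for x in A)
            pe = sum(emp[x] for x in A)
            if abs(pi - pe) <= abs(pj - pe):
                wins[i] += 1
            else:
                wins[j] += 1
    return max(range(len(hypotheses)), key=wins.__getitem__)

# Toy usage: samples from a fair coin; the fair hypothesis should win.
fair, biased = {0: 0.5, 1: 0.5}, {0: 0.9, 1: 0.1}
samples = [0, 1] * 50
print(scheffe_tournament([fair, biased], samples, [0, 1]))  # 0 (the fair coin)
```

Run over the hypotheses of an ε-net, this comparison-based selection is exactly where the adversarial-comparator abstraction of the paper enters: a duel between two hypotheses that are both close to the truth can go either way.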
“…While this recent work bounds the VC dimension of the Yatracos class of these families of distributions, for our more general families of non-parametric distributions we instead construct covers under either total variation distance or Prokhorov distance, and combine our cover-size upper bounds with generic tournament-style algorithms; see e.g. [28,25,1]. The details are provided in Appendix F. While there are many details, we illustrate one snippet of an idea used in constructing an ε-cover, in total variation distance, of the set of all MRFs with hyper-edges E of size at most d and a discrete alphabet Σ on every node.…”
Section: Sample Complexity For Learning MRFs and Bayes Nets
confidence: 99%
“…Besides providing an asymptotically smaller search space for Nash equilibria in anonymous games, or any other optimization problem over PMDs, the polynomial rather than exponential dependence of the cover size on n has direct consequences for the learnability of these distributions; see Theorem 7 (from [DK14]) and [AJOS14] for a similar result, which improve a long line of similar results in the probability literature [DL01]. In particular, a cover of polynomial size implies directly that these distributions can be learned from a number of samples logarithmic in n, despite their support being polynomial in n. Motivated by such applications of covers to algorithms and learning, we use our structural result to obtain an improved cover theorem.…”
Section: Introduction
confidence: 99%
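As a hedged note on the cover-to-learnability step: the standard tournament bounds (see e.g. [DL01]) select a hypothesis from an ε-cover of size N using a number of samples logarithmic in N, so a polynomial cover yields

```latex
m \;=\; O\!\left(\frac{\log N}{\varepsilon^{2}}\right),
\qquad
N = n^{O(1)} \;\Longrightarrow\; m \;=\; O\!\left(\frac{\log n}{\varepsilon^{2}}\right),
```

matching the "logarithmic in n, despite support polynomial in n" claim above, up to a constant-factor loss in the accuracy guarantee that depends on the tournament variant.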