2015
DOI: 10.1007/s00453-015-9971-3

Learning Poisson Binomial Distributions

Abstract: We consider a basic problem in unsupervised learning: learning an unknown Poisson binomial distribution. A Poisson binomial distribution (PBD) over {0, 1, . . . , n} is the distribution of a sum of n independent Bernoulli random variables which may have arbitrary, potentially non-equal, expectations. These distributions were first studied by Poisson (Recherches sur la Probabilité des jugements en matière criminelle et en matière civile. Bachelier, Paris, 1837) and are a natural n-parameter generalization of the …
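
The abstract's definition is easy to make concrete. The following sketch is hypothetical and not taken from the paper (NumPy usage and the name sample_pbd are ours); it draws from a PBD by summing n independent Bernoulli variables with arbitrary, possibly unequal expectations:

```python
import numpy as np

def sample_pbd(p, size=1, rng=None):
    """Draw `size` samples from the Poisson binomial distribution with
    Bernoulli expectations p = (p_1, ..., p_n), possibly unequal.
    Each sample lies in {0, 1, ..., n}."""
    rng = np.random.default_rng() if rng is None else rng
    p = np.asarray(p, dtype=float)
    # One Bernoulli(p_i) draw per parameter, summed across the n parameters.
    return rng.binomial(1, p, size=(size, len(p))).sum(axis=1)

# Example: a PBD over {0, ..., 3} with three unequal biases.
samples = sample_pbd([0.1, 0.5, 0.9], size=10_000)
```

When p_1 = · · · = p_n this reduces to the ordinary Binomial(n, p), which is the sense in which a PBD is an n-parameter generalization of the familiar binomial distribution.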

Cited by 39 publications (45 citation statements). References 52 publications (95 reference statements).
“…We close this subsection by observing that while the results of [Roo00, DP15] were used in a crucial way in subsequent work of Daskalakis et al [DDS15] to obtain a sample complexity upper bound on learning Poisson binomial distributions, in our context we use these results to obtain a sample complexity lower bound for population recovery. Intuitively, the difference is that in the [DDS15] scenario of learning an unknown Poisson binomial distribution, there is no noise process affecting the samples: the learning algorithm is assumed to directly receive draws from the underlying Poisson binomial distribution being learned.…”
Section: Our Lower Bounds
confidence: 95%
“…Consider the following very simple learning problem: Let {X_i}_{i=1}^N be independent random variables where X_i is promised to be supported on the two-element set {0, i} but Pr[X_i = i] is unknown: what is the sample complexity of learning X = X_1 + · · · + X_N? Even though each random variable X_i is "as simple as a non-trivial random variable can be" - supported on just two values, one of which is zero - a straightforward lower bound given in [DDS12b] shows that any algorithm for learning X even to constant accuracy must use Ω(N) samples, which is not much better than the trivial brute-force algorithm based on support size.…”
Section: Introduction
confidence: 99%
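
To make the construction in the quote above concrete, here is a hypothetical sketch (the name sample_weighted_sum and the NumPy usage are ours, not from [DDS12b]): each X_i takes value i with unknown probability q_i and 0 otherwise, so the law of the sum X depends on all N parameters, which is what drives the Ω(N) lower bound.

```python
import numpy as np

def sample_weighted_sum(q, size=1, rng=None):
    """Sample X = X_1 + ... + X_N, where X_i is supported on {0, i}
    and Pr[X_i = i] = q_i. Unlike a Poisson binomial distribution,
    the summands have N distinct nonzero values, so the distribution
    of X encodes all N unknown parameters."""
    rng = np.random.default_rng() if rng is None else rng
    q = np.asarray(q, dtype=float)
    values = np.arange(1, len(q) + 1)               # X_i contributes i when it fires
    fires = rng.binomial(1, q, size=(size, len(q)))
    return fires @ values

samples = sample_weighted_sum([0.2, 0.8, 0.5, 0.1], size=5)
```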
“…the union supp(X_1) ∪ · · · ∪ supp(X_N) of their support sets is small. Inspired by this, Daskalakis et al [DDS12b] studied the simplest non-trivial version of this learning problem, in which each X_i is a Bernoulli random variable (so the union of all supports is simply {0, 1}; note, though, that the X_i's may have distinct and arbitrary biases). The main result of [DDS12b] is that this class (known as Poisson Binomial Distributions) can be learned to error ε with poly(1/ε) samples - so, perhaps unexpectedly, the complexity of learning this class is completely independent of N, the number of summands.…”
Section: Introduction
confidence: 99%
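
To illustrate the learning setup in the quote above - though not the [DDS12b] algorithm itself, which instead exploits structural approximations of PBDs by sparse or binomial-like distributions - here is a hedged NumPy sketch (all names are ours) that computes the exact PBD law by convolution and measures how far a naive empirical histogram is from it in total variation distance. The point of [DDS12b] is precisely that one can beat this naive baseline with poly(1/ε) samples, independent of n:

```python
import numpy as np

def pbd_pmf(p):
    """Exact pmf of the PBD over {0, ..., n}: convolve the n Bernoulli laws."""
    pmf = np.array([1.0])
    for pi in p:
        pmf = np.convolve(pmf, [1.0 - pi, pi])
    return pmf

def tv_distance(a, b):
    """Total variation distance between two pmfs on the same support."""
    return 0.5 * np.abs(a - b).sum()

rng = np.random.default_rng(0)
p = rng.uniform(size=50)                               # 50 arbitrary, unequal biases
truth = pbd_pmf(p)
m = 2_000                                              # sample budget
samples = rng.binomial(1, p, size=(m, len(p))).sum(axis=1)
empirical = np.bincount(samples, minlength=len(p) + 1) / m
print(tv_distance(truth, empirical))
```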