2020 IEEE 61st Annual Symposium on Foundations of Computer Science (FOCS)
DOI: 10.1109/focs46700.2020.00023
Outlier-Robust Clustering of Gaussians and Other Non-Spherical Mixtures

Cited by 15 publications (51 citation statements). References 25 publications.
“…Unfortunately, this strategy inherently introduces polynomial factors in d, and it cannot give what we are after. For the special case of clusterable mixtures of Gaussians, Bakshi and Kothari [5] and Diakonikolas et al. [20] proved dimension-independent polynomial identifiability; their approach was based on classifying the ways in which two single Gaussians can have total variation distance close to one. When it comes to the more general problem of handling mixtures whose components can overlap non-trivially, it seems difficult to follow the same route, because we can no longer match components from the two mixtures to each other and almost cancel them both out.…”
Section: Key Challenges (mentioning)
confidence: 99%
“…Diakonikolas et al. [21] gave a robust algorithm for learning mixtures of spherical Gaussians. In recent breakthroughs, Bakshi and Kothari [5] and Diakonikolas et al. [20] gave robust algorithms for learning clusterable mixtures of Gaussians, and building on this, Kane [37] gave a robust algorithm for learning mixtures of two Gaussians. We note that these works do place some mild restrictions on the mixing weights and the variances.…”
mentioning
confidence: 99%
“…Furthermore, our lower bound implies that, for the class of distributions we consider, if a pair of distributions is close in variational distance, then the distributions also have close parameters. This type of implication is integral to certain arguments in outlier-robust moment estimation algorithms and clustering [4,20].…”
mentioning
confidence: 99%
“…The functional form of the TV distance bound is often much more useful in practice because it can be evaluated directly from only the means and covariances of the distributions. This has opened the door to new applications in a variety of areas, such as analyzing ReLU networks [34], distribution learning [3,4], private distribution testing [7,8], and average-case reductions [6].…”
mentioning
confidence: 99%
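The quote above notes that a parameter-based TV distance bound can be evaluated directly from means and covariances. As an illustration only (this is not the specific bound from the cited works), the sketch below upper-bounds the total variation distance between two multivariate Gaussians by combining the closed-form Gaussian KL divergence with Pinsker's inequality; the function name `tv_upper_bound` is my own.

```python
import numpy as np

def tv_upper_bound(mu0, cov0, mu1, cov1):
    """Upper bound on TV(N(mu0, cov0), N(mu1, cov1)).

    Uses Pinsker's inequality, TV <= sqrt(KL / 2), with the
    closed-form KL divergence between multivariate Gaussians.
    Only the means and covariances are needed.
    """
    d = mu0.shape[0]
    inv1 = np.linalg.inv(cov1)
    diff = mu1 - mu0
    # Closed-form KL(N0 || N1) for multivariate Gaussians.
    kl = 0.5 * (
        np.trace(inv1 @ cov0)
        + diff @ inv1 @ diff
        - d
        + np.log(np.linalg.det(cov1) / np.linalg.det(cov0))
    )
    # Pinsker's bound; clip at 1 since TV distance never exceeds 1.
    return min(1.0, np.sqrt(max(kl, 0.0) / 2.0))

# Two unit-variance Gaussians on the line with means 0 and 1.
print(tv_upper_bound(np.array([0.0]), np.eye(1),
                     np.array([1.0]), np.eye(1)))  # 0.5
```

This bound is loose when the distributions are far apart, but it makes concrete the point in the quote: closeness in parameters can be checked, and closeness in TV distance certified, without any sampling.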