2020
DOI: 10.1103/physreva.101.012326

Quantum expectation-maximization algorithm

Abstract: Clustering algorithms are a cornerstone of machine learning applications. Recently, a quantum algorithm for clustering based on the k-means algorithm has been proposed by Kerenidis, Landman, Luongo and Prakash. Based on their work, we propose a quantum expectation-maximization (EM) algorithm for Gaussian mixture models (GMMs). The robustness and quantum speedup of the algorithm are demonstrated. We also show numerically the advantage of GMM over k-means for non-trivial cluster data.
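
For readers unfamiliar with the classical algorithm being quantized, the sketch below is a minimal NumPy/SciPy EM loop for fitting a GMM: the E-step computes each point's responsibilities, and the M-step re-estimates weights, means, and covariances. This is an illustrative classical baseline only, not the paper's quantum algorithm; the function name, initialization scheme, and regularization constant are assumptions.

    # Minimal classical EM for a Gaussian mixture model (illustrative sketch;
    # the quantum algorithm accelerates the analogous E- and M-steps).
    import numpy as np
    from scipy.stats import multivariate_normal

    def em_gmm(X, k, n_iter=50, seed=0):
        """Fit a k-component GMM to data X (n x d) with plain EM."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        pi = np.full(k, 1.0 / k)                      # mixing weights
        mu = X[rng.choice(n, size=k, replace=False)]  # init means from data
        sigma = np.array([np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(k)])

        for _ in range(n_iter):
            # E-step: responsibilities gamma[i, j] = P(cluster j | x_i)
            gamma = np.column_stack([
                pi[j] * multivariate_normal.pdf(X, mu[j], sigma[j])
                for j in range(k)
            ])
            gamma /= gamma.sum(axis=1, keepdims=True)

            # M-step: re-estimate weights, means, and covariances
            nj = gamma.sum(axis=0)                    # effective cluster sizes
            pi = nj / n
            mu = (gamma.T @ X) / nj[:, None]
            for j in range(k):
                diff = X - mu[j]
                sigma[j] = (gamma[:, j, None] * diff).T @ diff / nj[j]
                sigma[j] += 1e-6 * np.eye(d)          # regularize for stability
        return pi, mu, sigma

On two well-separated synthetic blobs, e.g. X = np.vstack([np.random.randn(100, 2) + 3, np.random.randn(100, 2) - 3]) with k=2, the recovered weights approach 1/2 and the means approach the blob centers.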

Cited by 9 publications (8 citation statements)
References 22 publications

“…Quantum computers embody a physically feasible computational model which may exceed the limit of conventional computers, and have been researched vigorously for several decades. In particular, the amplitude estimation algorithm [1][2][3][4][5][6] has attracted a lot of attention as a fundamental subroutine of a wide range of application-oriented quantum algorithms, such as the Monte Carlo integration [7][8][9][10][11][12][13] and machine learning tasks [14][15][16][17][18][19][20]. However, those quantum amplitude estimation algorithms are assumed to work on ideal quantum computers, and their performance on noisy computers should be carefully investigated.…”
Section: Introduction
confidence: 99%
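
As a numerical aside on why the amplitude estimation subroutine named in this excerpt works: for a state that hits the "good" subspace with probability a = sin²θ, the Grover operator acts as a rotation by 2θ on a two-dimensional invariant subspace, so estimating its eigenphase recovers a. The toy NumPy check below simulates only that 2×2 rotation; it is intuition for the subroutine, not an implementation of any cited algorithm, and the numerical values are illustrative.

    # Principle behind quantum amplitude estimation, checked classically:
    # the Grover operator is a rotation by 2*theta, so its eigenphase
    # encodes the amplitude a = sin^2(theta).
    import numpy as np

    a = 0.3                            # target amplitude to estimate
    theta = np.arcsin(np.sqrt(a))
    Q = np.array([[np.cos(2 * theta), -np.sin(2 * theta)],
                  [np.sin(2 * theta),  np.cos(2 * theta)]])  # Grover rotation

    # Eigenvalues of Q are exp(+-2i*theta); phase estimation on Q would
    # read out 2*theta and hence a.
    eigphase = np.angle(np.linalg.eigvals(Q)).max() / 2
    print(np.sin(eigphase) ** 2)       # ~0.3, recovering a
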
“…Independent of this work, Miyahara, Aihara, and Lechner extended the q-means algorithm [18] for Gaussian Mixture Models [29]. The techniques used in [29] are similar to ours. The main difference is that in their work the update step is performed using a hard-clustering approach (as in the k-means algorithm), that is, for updating the centroid and the covariance matrices of a cluster j, only the data points for which cluster j is nearest are taken into account.…”
Section: Previous Work
confidence: 99%
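
The distinction drawn in this excerpt is easy to state in code. Below is a hedged sketch of the two M-step conventions for the cluster means (covariances update analogously): soft EM weights every point by its responsibility, while the hard-clustering variant counts each point only toward its best cluster. The function names and array shapes are illustrative assumptions, and every cluster is assumed to receive at least one point.

    # Soft vs. hard M-step for the cluster means, given data X (n x d)
    # and responsibilities gamma (n x k). Illustrative sketch only.
    import numpy as np

    def soft_means(X, gamma):
        # Standard EM for GMMs: mean of cluster j is the
        # responsibility-weighted average over all points.
        nj = gamma.sum(axis=0)             # effective cluster sizes
        return (gamma.T @ X) / nj[:, None]

    def hard_means(X, gamma):
        # k-means-style update described in the excerpt: each point
        # contributes only to the cluster that best explains it.
        labels = gamma.argmax(axis=1)
        return np.array([X[labels == j].mean(axis=0)
                         for j in range(gamma.shape[1])])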