2022
DOI: 10.1038/s41598-022-06559-z
Distance-based clustering using QUBO formulations

Abstract: In computer science, clustering is a technique for grouping data. Ising machines can solve distance-based clustering problems described by quadratic unconstrained binary optimization (QUBO) formulations. A typical simple method using an Ising machine makes each cluster size equal and is not suitable for clustering unevenly distributed data. We propose a new clustering method that provides better performance than the simple method, especially for unevenly distributed data. The proposed method is a hybrid algori…
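As a rough illustration of how distance-based clustering can be cast as a QUBO for an Ising machine, the sketch below builds a generic one-hot assignment formulation: binary variable x[i, k] is 1 when point i belongs to cluster k, the objective sums pairwise distances inside each cluster, and a penalty enforces exactly one cluster per point. This is not the paper's proposed hybrid method, and all names (build_clustering_qubo, penalty, brute_force_solve) are illustrative.

```python
import itertools
import numpy as np

def build_clustering_qubo(points, n_clusters, penalty):
    """Build a QUBO matrix for distance-based clustering.

    Binary variable x[i, k] (flattened to index i * n_clusters + k) is 1
    when point i is assigned to cluster k.  The objective sums pairwise
    distances inside each cluster; the penalty enforces one-hot assignment.
    """
    n = len(points)
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    size = n * n_clusters
    Q = np.zeros((size, size))
    idx = lambda i, k: i * n_clusters + k

    # Objective: sum_k sum_{i<j} d_ij * x_ik * x_jk
    for k in range(n_clusters):
        for i in range(n):
            for j in range(i + 1, n):
                Q[idx(i, k), idx(j, k)] += dist[i, j]

    # Constraint: penalty * (sum_k x_ik - 1)^2 for every point i
    # (constant term dropped; x^2 = x for binary variables)
    for i in range(n):
        for k in range(n_clusters):
            Q[idx(i, k), idx(i, k)] -= penalty          # linear part
            for l in range(k + 1, n_clusters):
                Q[idx(i, k), idx(i, l)] += 2 * penalty  # quadratic part
    return Q

def brute_force_solve(Q):
    """Exhaustively minimize x^T Q x (only feasible for tiny problems)."""
    best_x, best_e = None, np.inf
    for bits in itertools.product([0, 1], repeat=Q.shape[0]):
        x = np.array(bits)
        e = x @ Q @ x
        if e < best_e:
            best_x, best_e = x, e
    return best_x, best_e

if __name__ == "__main__":
    pts = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.2, 4.9]])
    Q = build_clustering_qubo(pts, n_clusters=2, penalty=50.0)
    x, _ = brute_force_solve(Q)
    print(x.reshape(len(pts), 2))  # one-hot cluster assignment per point
```

Nothing in this simple formulation balances cluster sizes; the "equal cluster size" behaviour of the typical simple method mentioned in the abstract usually comes from an additional size-constraint term (e.g. a penalty on (Σ_i x_ik − N/K)²), which is what makes that method ill-suited to unevenly distributed data.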

Cited by 17 publications (18 citation statements). References 36 publications.
“…The stochastic algorithm uses a first-order optimization process that improves the network learning process. During training, the network parameter θ should be updated as defined in (12):…”
Section: ∨ ( ) (mentioning)
confidence: 99%
“…θ ← θ − η ∇_θ J(x_{i:i+k}, y_{i:i+k}; θ)   (12)
In (12), the gradient is denoted ∇_θ, the learning rate is η, and k satisfies 1 < k < n. The training samples are selected from the entire dataset to reduce the computational complexity, that is, the computation cost. The training samples are examined to obtain unbiased data using the stochastic process, and momentum learning is applied to improve the data analysis process.…”
Section: J(x_{i:i+k}, y_{i:i+k}) (mentioning)
confidence: 99%
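The quoted passages describe stochastic, mini-batch training with momentum; the excerpt names only the symbols of equation (12) (∇_θ, η, and a batch size k with 1 < k < n). The snippet below is therefore only a minimal numpy sketch of that general kind of update, θ ← θ − η∇_θJ on randomly drawn batches; the least-squares loss and every name (minibatch_sgd_momentum, beta, velocity) are assumptions, not taken from the citing paper.

```python
import numpy as np

def minibatch_sgd_momentum(X, y, eta=0.05, k=16, beta=0.9, epochs=500, seed=0):
    """Mini-batch SGD with momentum for a linear least-squares model.

    Each step draws a batch of k samples (1 < k < n) from the full dataset,
    computes the gradient of the loss J on that batch only, and updates the
    parameter vector:  theta <- theta - eta * velocity, where velocity
    accumulates past gradients (momentum).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    theta = np.zeros(d)
    velocity = np.zeros(d)

    for _ in range(epochs):
        batch = rng.choice(n, size=k, replace=False)
        Xb, yb = X[batch], y[batch]
        # Gradient of J(theta) = (1 / 2k) * ||Xb @ theta - yb||^2
        grad = Xb.T @ (Xb @ theta - yb) / k
        velocity = beta * velocity + grad
        theta -= eta * velocity
    return theta

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 3))
    true_theta = np.array([2.0, -1.0, 0.5])
    y = X @ true_theta + 0.01 * rng.normal(size=200)
    print(minibatch_sgd_momentum(X, y))  # should approach true_theta
```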
“…In the past decade, many optimization problems (including Karp's 21 NP-complete problems) have been successfully mapped onto the Ising Hamiltonian [14-20]. Here, mapping refers to the procedure of formulating a Hamiltonian, similar to (1), whose minimum encodes the optimal solution of the given problem; see Lucas [14]. This means that the minimum can only be achieved when an optimal spin configuration (ground state) is found, where the latter corresponds to the solution of the problem.…”
Section: Introduction (mentioning)
confidence: 99%
“…To solve a combinatorial optimization problem, one must map the considered problem onto the Ising Hamiltonian. In the past decade, many optimization problems (including Karp's 21 NP-complete problems) have been successfully mapped onto the Ising Hamiltonian [14-20]. Here, mapping refers to the procedure of formulating a Hamiltonian, similar to (1), whose minimum encodes the optimal solution of the given problem; see [14].…”
Section: Introduction (mentioning)
confidence: 99%
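Both passages describe mapping an optimization problem onto the Ising Hamiltonian, i.e., rewriting its cost function so that the Hamiltonian's ground state encodes the optimal solution. A standard way to do this from a QUBO x^T Q x is the change of variables x_i = (1 + s_i)/2 with spins s_i ∈ {−1, +1}; the sketch below (the function name qubo_to_ising and the toy problem are illustrative, not taken from either paper) performs exactly that substitution and returns the couplings J_ij, local fields h_i, and constant offset.

```python
import numpy as np

def qubo_to_ising(Q):
    """Map a QUBO  min_x x^T Q x,  x_i in {0, 1},  onto an Ising Hamiltonian
    H(s) = sum_{i<j} J_ij s_i s_j + sum_i h_i s_i + offset,  s_i in {-1, +1},
    via the substitution x_i = (1 + s_i) / 2."""
    Q = np.triu(Q + np.tril(Q, -1).T)   # fold into upper-triangular form
    n = Q.shape[0]
    J, h, offset = np.zeros((n, n)), np.zeros(n), 0.0
    for i in range(n):
        # Diagonal (linear) terms: Q_ii x_i = Q_ii (1 + s_i) / 2
        h[i] += Q[i, i] / 2.0
        offset += Q[i, i] / 2.0
        for j in range(i + 1, n):
            # Quadratic terms: Q_ij x_i x_j = Q_ij (1 + s_i)(1 + s_j) / 4
            J[i, j] += Q[i, j] / 4.0
            h[i] += Q[i, j] / 4.0
            h[j] += Q[i, j] / 4.0
            offset += Q[i, j] / 4.0
    return J, h, offset

if __name__ == "__main__":
    # Tiny QUBO: minimize x0 + x1 - 2*x0*x1  (optima at x = (0, 0) and (1, 1))
    Q = np.array([[1.0, -2.0],
                  [0.0,  1.0]])
    J, h, c = qubo_to_ising(Q)
    for s0 in (-1, 1):
        for s1 in (-1, 1):
            energy = J[0, 1] * s0 * s1 + h[0] * s0 + h[1] * s1 + c
            print(((1 + s0) // 2, (1 + s1) // 2), energy)
```

Feeding the Q matrix from the clustering sketch above into qubo_to_ising yields the fields h_i and couplings J_ij that an Ising machine or annealer would take as its problem input.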