Quantum Annealing for Clustering


This paper studies quantum annealing (QA) for clustering, which can be seen as an extension of simulated annealing (SA). We derive a QA algorithm for clustering and propose an annealing schedule, which is crucial in practice. Experiments show the proposed QA algorithm finds better clustering assignments than SA. Furthermore, QA is as easy as SA to implement.


💡 Research Summary

The paper investigates the use of quantum annealing (QA) as an alternative to simulated annealing (SA) for the unsupervised learning task of clustering. The authors first formulate clustering as an energy‑minimization problem by representing the assignment of each data point to one of K clusters with binary variables. This formulation is then mapped onto an Ising Hamiltonian H_P that encodes the intra‑cluster variance (the usual K‑means objective) as the problem energy.
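As a concrete illustration of this formulation, the sketch below computes the K‑means‑style problem energy that such a Hamiltonian encodes. This is illustrative Python, not the paper's code; the function name `clustering_energy` and the toy data are assumptions.

```python
import numpy as np

def clustering_energy(X, assign, K):
    """K-means-style problem energy: sum of squared distances of each
    point to the centroid of its assigned cluster (intra-cluster variance).
    `assign[n]` is the cluster index of point n, i.e., the integer form
    of the binary assignment variables."""
    E = 0.0
    for k in range(K):
        members = X[assign == k]
        if len(members) > 0:
            centroid = members.mean(axis=0)
            E += ((members - centroid) ** 2).sum()
    return E

# Toy data: two well-separated blobs in 2-D.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (5, 2)),
               rng.normal(5.0, 0.1, (5, 2))])
good = np.array([0] * 5 + [1] * 5)  # correct split -> low energy
bad = np.array([0, 1] * 5)          # mixed split  -> high energy
```

A lower energy corresponds to a better clustering, which is what makes the problem amenable to annealing.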

QA is introduced by adding a transverse‑field driver Hamiltonian H_D = −∑_i σ_i^x to H_P and constructing a time‑dependent total Hamiltonian H(t) = A(t) H_D + B(t) H_P. The schedule functions A(t) and B(t) control the relative strength of quantum tunneling versus classical energy minimization. Early in the anneal, A ≫ B, so quantum fluctuations dominate and the system can tunnel through high‑energy barriers; later, A → 0 and B → 1, forcing the system to settle into a low‑energy configuration of H_P.
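The simplest schedule satisfying these boundary conditions is linear interpolation. The following sketch is an assumption for illustration; the paper's actual (and carefully tuned) schedule may differ.

```python
def schedule(t, T):
    """Linear annealing schedule over total anneal time T:
    A(t) decays from 1 to 0 (driver / tunneling term),
    B(t) grows from 0 to 1 (problem term)."""
    s = t / T
    return 1.0 - s, s  # A(t), B(t)

# Early in the anneal A >> B; at the end A -> 0, B -> 1.
A_early, B_early = schedule(5.0, 100.0)
A_end, B_end = schedule(100.0, 100.0)
```

In practice the schedule shape is the crucial tuning knob the abstract alludes to: too fast an anneal freezes the system in excited states, too slow wastes computation.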

A major technical contribution is the derivation of a QA‑specific transition probability for clustering. Using the Suzuki–Trotter decomposition, the d‑dimensional data problem with M Trotter replicas is mapped to a two‑dimensional classical lattice of coupled copies of the system. The Metropolis–Hastings acceptance rule for flipping a spin (i.e., moving a point to a different cluster) then takes the standard path‑integral form

P_accept = min{1, exp(−β ΔE_eff)},

where ΔE_eff is the change in the effective classical energy

E_eff = (1/M) ∑_{k=1}^{M} E_P(σ^{(k)}) − J_Γ ∑_{k=1}^{M} ∑_i σ_i^{(k)} σ_i^{(k+1)},

with E_P(σ^{(k)}) the problem energy of replica k, periodic boundary σ^{(M+1)} = σ^{(1)}, and J_Γ = (1/2β) ln coth(βΓ/M) the ferromagnetic coupling between neighboring Trotter slices induced by the transverse field Γ.
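A minimal sketch of Metropolis sampling over such a replicated system follows. This is illustrative Python under my own assumptions (function names, and a naive O(MN) full‑energy recomputation per step), not the paper's implementation.

```python
import math
import random

def effective_energy(configs, E_problem, J_gamma):
    """Effective classical energy of M Trotter replicas: mean problem
    energy minus a ferromagnetic coupling between neighbouring replicas
    (periodic in the Trotter direction)."""
    M = len(configs)
    E = sum(E_problem(c) for c in configs) / M
    for k in range(M):
        a, b = configs[k], configs[(k + 1) % M]
        E -= J_gamma * sum(sa * sb for sa, sb in zip(a, b))
    return E

def metropolis_step(configs, E_problem, J_gamma, beta, rng=random):
    """Propose flipping one spin in one replica and accept with
    probability min(1, exp(-beta * dE)) on the effective energy."""
    M, N = len(configs), len(configs[0])
    k, i = rng.randrange(M), rng.randrange(N)
    E_old = effective_energy(configs, E_problem, J_gamma)
    configs[k][i] *= -1  # propose the flip
    dE = effective_energy(configs, E_problem, J_gamma) - E_old
    if dE > 0 and rng.random() >= math.exp(-beta * dE):
        configs[k][i] *= -1  # reject: undo the flip
    return configs
```

Recomputing the full effective energy each step is wasteful; a practical implementation would compute the local energy difference of the flipped spin, but the acceptance rule is identical.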

