“…As in [5], we apply mean shift to segment the image into super-pixels (mean shift variants can be used to obtain directly full segmentations [2,16,24]). We compare the speed and segmentation quality obtained by using mean shift, medoid shift, and quick shift (see Fig.…”
Section: Image Segmentation
“…spaces endowed with a distance). In fact, mean shift is essentially a gradient ascent algorithm [3,5,24] and the gradient may not be defined unless the data space has additional structure (e.g. Hilbert space or smooth manifold structure).…”
Abstract. We show that the complexity of the recently introduced medoid-shift algorithm in clustering N points is O(N^2), with a small constant, if the underlying distance is Euclidean. This makes medoid shift considerably faster than mean shift, contrary to what was previously believed. We then exploit kernel methods to extend both mean shift and the improved medoid shift to a large family of distances, with complexity bounded by the effective rank of the resulting kernel matrix, and with explicit regularization constraints. Finally, we show that, under certain conditions, medoid shift fails to cluster data points belonging to the same mode, resulting in over-fragmentation. We propose a remedy for this problem by introducing a novel, simple and extremely efficient clustering algorithm, called quick shift, that explicitly trades off under- and over-fragmentation. Like medoid shift, quick shift operates in non-Euclidean spaces in a straightforward manner. We also show that the accelerated medoid shift can be used to initialize mean shift for increased efficiency. We apply our algorithms to clustering data on manifolds, image segmentation, and the automatic discovery of visual categories.
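The abstract's description of quick shift — link each point to the nearest neighbor at which a density estimate is higher, and cut links longer than a threshold to trade off under- and over-fragmentation — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Gaussian Parzen density, the parameter names `sigma` and `tau`, and the brute-force O(N^2) distance computation are our assumptions.

```python
import numpy as np

def quick_shift(X, sigma=1.0, tau=3.0):
    """Minimal quick shift sketch: link each point to its nearest
    neighbor of strictly higher density; links longer than tau are
    cut, so their sources become cluster roots."""
    N = len(X)
    # Pairwise squared distances (brute force, O(N^2) memory/time).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # Parzen density estimate with a Gaussian window of width sigma.
    density = np.exp(-d2 / (2 * sigma ** 2)).sum(1)
    parent = np.arange(N)
    for i in range(N):
        higher = np.where(density > density[i])[0]
        if len(higher):
            j = higher[np.argmin(d2[i, higher])]  # nearest denser neighbor
            if d2[i, j] <= tau ** 2:              # cut links longer than tau
                parent[i] = j
    # Follow parent links to a root to obtain cluster labels.
    def root(i):
        while parent[i] != i:
            i = parent[i]
        return i
    return np.array([root(i) for i in range(N)])
```

Raising `tau` merges fragments (fewer clusters, risking under-fragmentation); lowering it splits them, which is the trade-off the abstract refers to.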
“…Also, different kernels give rise to different versions of the MS algorithm [8]. More numerical analysis and extensions of MS can be found in [6], [7], [8], [10], [15].…”
“…Recently, Yuan and Li [22] pointed out that convex-kernel-based MS is equivalent to half-quadratic (HQ) optimization of the KDE function (1.1). The HQ analysis framework implies that MS is a quadratic bounding (QB) optimization for KDE, a fact also discovered by Fashing and Tomasi [11].…”
Section: Introduction
“…2 Quadratic Bounding Nature of MS. The fact that MS is a QB optimization was originally discovered by Fashing and Tomasi in [11], motivated by the relationship between MS and the Newton-Raphson method. In fact, the QB nature of MS can be derived more directly from the HQ optimization viewpoint [22]: when the kernel is convex and monotonically decreasing, the MS algorithm can be interpreted as HQ optimization of the KDE function f_k(x). This can be shown by using the theory of convex conjugate functions [17] to introduce the following augmented energy function with d + N variables:…”
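The construction the truncated snippet refers to is the standard half-quadratic one; the following is our reconstruction in generic notation (the cited paper's exact symbols and normalization may differ). For a convex, monotonically decreasing kernel profile k, the KDE and the augmented energy in d + N variables are:

```latex
% KDE with profile k (normalization illustrative)
\hat f_k(x) = \frac{1}{N}\sum_{i=1}^{N} k\!\left(\frac{\|x - x_i\|^2}{h^2}\right)

% Convex conjugacy, k(z) = \sup_{p}\bigl(p z - k^*(p)\bigr), yields the
% augmented energy in the d + N variables (x, p_1, \dots, p_N):
\hat F(x, p) = \frac{1}{N}\sum_{i=1}^{N}
  \left( p_i\,\frac{\|x - x_i\|^2}{h^2} - k^*(p_i) \right)

% Alternating maximization recovers mean shift:
%   p-step:  p_i \leftarrow k'\!\bigl(\|x - x_i\|^2 / h^2\bigr)
%            (each p_i \le 0 since k is decreasing)
%   x-step:  x \leftarrow \sum_i p_i x_i \Big/ \sum_i p_i
```

The x-step maximizes a function quadratic in x in closed form, which is precisely the quadratic-bounding reading of MS attributed to [11].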
Mean-Shift (MS) is a powerful non-parametric clustering method. Although it can achieve good accuracy, its computational cost is high even on moderately sized data sets. In this paper, for the purpose of algorithm speedup, we develop an agglomerative MS clustering method called Agglo-MS and analyze its mode-seeking ability and convergence properties. Our method is built upon an iterative query-set compression mechanism motivated by the quadratic bounding optimization nature of MS. The whole framework can be efficiently implemented in linear running time. Furthermore, we show that pairwise constraint information can be naturally integrated into our framework to derive a semi-supervised non-parametric clustering method. Extensive experiments on toy and real-world data sets validate the speedup and numerical accuracy of our method, as well as the superiority of its semi-supervised version.
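The cost this abstract targets is mean shift's O(N^2) kernel evaluations per iteration, since every query point is shifted against the full data set. A minimal sketch of plain MS (not the Agglo-MS method itself — the Gaussian kernel, bandwidth `h`, and fixed iteration count are illustrative assumptions):

```python
import numpy as np

def mean_shift(X, h=1.0, iters=30):
    """Minimal mean shift sketch with a Gaussian kernel. Each
    iteration costs O(N^2) kernel evaluations -- the bottleneck
    that query-set compression schemes aim to remove."""
    Y = X.copy()  # query points being shifted toward modes
    for _ in range(iters):
        # Squared distances from every query point to every data point.
        d2 = ((Y[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        W = np.exp(-d2 / (2 * h ** 2))          # Gaussian kernel weights
        Y = (W @ X) / W.sum(1, keepdims=True)   # shift to the weighted mean
    return Y
```

Points drawn from the same mode collapse onto a common fixed point, so clusters can be read off by grouping nearly coincident outputs.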