2007 IEEE 11th International Conference on Computer Vision
DOI: 10.1109/iccv.2007.4408979

Half Quadratic Analysis for Mean Shift: with Extension to A Sequential Data Mode-Seeking Method

Abstract: Theoretical understanding and extension of the mean shift procedure have received much attention recently [8,18,3]

Cited by 10 publications (7 citation statements)

References 19 publications
“…As in [5], we apply mean shift to segment the image into super-pixels (mean shift variants can be used to obtain directly full segmentations [2,16,24]). We compare the speed and segmentation quality obtained by using mean shift, medoid shift, and quick shift (see Fig.…”
Section: Image Segmentation
confidence: 99%
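The quoted passage describes the standard use of mean shift for super-pixel segmentation: pixels are clustered in a joint spatial-color feature space. Below is a minimal sketch of that idea, assuming scikit-learn's MeanShift; the function name, the spatial_weight scaling, and the bandwidth value are illustrative assumptions, not the pipeline of [5].

```python
import numpy as np
from sklearn.cluster import MeanShift

def meanshift_superpixels(image, spatial_weight=0.5, bandwidth=8.0):
    """Cluster pixels of an (H, W, 3) image in a joint (x, y, r, g, b) space."""
    h, w, _ = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Stack down-weighted spatial coordinates with the color channels.
    feats = np.column_stack([
        xs.ravel() * spatial_weight,
        ys.ravel() * spatial_weight,
        image.reshape(-1, 3).astype(float),
    ])
    labels = MeanShift(bandwidth=bandwidth, bin_seeding=True).fit_predict(feats)
    return labels.reshape(h, w)
```

Down-weighting the spatial coordinates trades off spatial compactness against color homogeneity of the resulting super-pixels.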
“…spaces endowed with a distance). In fact, mean shift is essentially a gradient ascent algorithm [3,5,24] and the gradient may not be defined unless the data space has additional structure (e.g. Hilbert space or smooth manifold structure).…”
Section: Introduction
confidence: 99%
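Since this quote characterizes mean shift as gradient ascent on the kernel density estimate, a plain-NumPy sketch of the corresponding fixed-point iteration may help; the Gaussian bandwidth h and the stopping tolerance are illustrative choices, not values from the paper.

```python
import numpy as np

def mean_shift_mode(x, data, h=1.0, tol=1e-6, max_iter=500):
    """Seek the KDE mode nearest to the starting point x (data: N x d array)."""
    for _ in range(max_iter):
        # Gaussian kernel weights w_i = exp(-||x - x_i||^2 / (2 h^2)).
        d2 = np.sum((data - x) ** 2, axis=1)
        w = np.exp(-d2 / (2 * h ** 2))
        # Mean-shift update: move to the weighted mean of the samples.
        x_new = (w[:, None] * data).sum(axis=0) / w.sum()
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x
```

Each update moves x to the weighted mean of the samples, i.e. a gradient-ascent step on the KDE with an adaptive step size, which is why convergence needs no step-size tuning.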
“…Also, different kernels give rise to different versions of the MS algorithm [8]. More numerical analysis and extensions of MS can be found in [6][7][8][10][15].…”
Section: Introduction
confidence: 99%
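To make the kernel dependence concrete: under the Epanechnikov profile k(t) = max(1 - t, 0), the mean-shift weight g(t) = -k'(t) is an indicator function, so each step reduces to a plain average over a window. The sketch below is our own illustration, not code from [8]:

```python
import numpy as np

def epanechnikov_shift(x, data, h=1.0):
    """One mean-shift step under the Epanechnikov profile k(t) = max(1 - t, 0)."""
    d2 = np.sum((data - x) ** 2, axis=1) / h ** 2
    inside = d2 < 1.0          # g(t) = -k'(t) is the indicator of [0, 1)
    if not inside.any():
        return x               # no samples in the window; stay put
    return data[inside].mean(axis=0)
```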
“…Recently, Yuan and Li [22] pointed out that convex-kernel-based MS is equivalent to half-quadratic (HQ) optimization of the KDE function (1.1). The HQ analysis framework of MS implies that MS is a quadratic bounding (QB) optimization for KDE, a fact also discovered by Fashing and Tomasi [11].…”
Section: Introduction
confidence: 99%
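For readers who want the QB argument the passage refers to spelled out, here is a compact sketch in our own notation (profile k, weight g = -k', bandwidth absorbed into the squared distances, c a normalization constant); it is a reconstruction, not the exact derivation of [11] or [22]:

```latex
% Convexity of the profile k gives the tangent lower bound
%   k(t) >= k(s) + k'(s)(t - s),
% so with s_i = ||y - x_i||^2 the KDE is bounded below by a quadratic in x:
\hat f_k(x) = \frac{c}{N}\sum_{i=1}^{N} k\!\left(\lVert x - x_i\rVert^{2}\right)
\;\ge\; \frac{c}{N}\sum_{i=1}^{N}\Big[\, k(s_i)
        + k'(s_i)\big(\lVert x - x_i\rVert^{2} - s_i\big)\Big].
% Maximizing the right-hand side over x (the weights -k'(s_i) are nonnegative
% because k is decreasing) yields exactly the mean-shift update:
x^{+} = \frac{\sum_{i=1}^{N} g(s_i)\, x_i}{\sum_{i=1}^{N} g(s_i)},
\qquad g(t) = -k'(t).
```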
“…2 Quadratic Bounding Nature of MS The fact that MS is a QB optimization was originally discovered by Fashing and Tomasi in [11], motivated by the relationship between MS and the Newton-Raphson method. Actually, the QB nature of MS can be derived more directly from the HQ optimization viewpoint of MS [22]: when the kernel is convex and monotonically decreasing, the MS algorithm can be explained as HQ optimization for the KDE function f̂_k(x). This feature can be shown by using the theory of convex conjugate functions [17] to introduce the following augmented energy function with d + N variables:…”
Section: Introduction
confidence: 99%
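The quote breaks off before the equation it announces. As a hedged reconstruction in the spirit of the HQ analysis of [22] (our notation, normalization constants dropped, not the paper's exact formula): the convex conjugate k* lets a convex, monotonically decreasing profile be written as k(t) = sup over p ≤ 0 of [pt - k*(p)], which lifts the KDE objective to an augmented energy in the d + N variables (x, p_1, ..., p_N):

```latex
% Augmented energy: maximizing F jointly over (x, p) recovers KDE maximization.
\hat F(x, p_1, \dots, p_N) = \frac{1}{N}\sum_{i=1}^{N}
\Big( p_i\,\lVert x - x_i\rVert^{2} \;-\; k^{*}(p_i) \Big).
% Alternating maximization reproduces the mean-shift iteration:
%   p-step (Fenchel equality):  p_i = k'(||x - x_i||^2)   (each p_i <= 0),
%   x-step (weighted mean):     x = \sum_i p_i x_i / \sum_i p_i .
```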