2004
DOI: 10.1109/tpami.2004.1262179

Segmentation given partial grouping constraints

Abstract: We consider data clustering problems where partial grouping is known a priori. We formulate such biased grouping problems as a constrained optimization problem, where structural properties of the data define the goodness of a grouping and partial grouping cues define the feasibility of a grouping. We enforce grouping smoothness and fairness on labeled data points so that sparse partial grouping information can be effectively propagated to the unlabeled data. Considering the normalized cuts criterion in particu…
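The abstract describes biased grouping as a constrained optimization built on the normalized cuts criterion, with partial grouping cues entering as linear constraints on the indicator vectors. As a rough, hedged illustration only (not the paper's exact algorithm), the sketch below restricts the usual generalized eigenproblem (D − W)x = λDx to the null space of a homogeneous constraint matrix U assembled from must-link pairs; the affinity matrix W, the pair list, and the use of the unnormalized Laplacian are assumptions made for this example.

```python
import numpy as np
from scipy.linalg import eigh, null_space

def constrained_spectral_embedding(W, must_link_pairs, k=2):
    """Toy sketch: spectral embedding under homogeneous must-link constraints.

    W               : (n, n) symmetric nonnegative affinity matrix (assumed)
    must_link_pairs : list of (i, j) index pairs forced into the same group
    k               : number of embedding dimensions to return
    """
    n = W.shape[0]
    d = W.sum(axis=1)
    D = np.diag(d)
    L = D - W                                   # unnormalized graph Laplacian

    # Each must-link pair (i, j) becomes a homogeneous row u with u[i] = 1,
    # u[j] = -1, so feasible indicators x satisfy U x = 0, i.e. x[i] = x[j].
    U = np.zeros((len(must_link_pairs), n))
    for r, (i, j) in enumerate(must_link_pairs):
        U[r, i], U[r, j] = 1.0, -1.0

    # Restrict the generalized eigenproblem L x = lambda D x to null(U).
    Z = null_space(U) if len(must_link_pairs) > 0 else np.eye(n)
    vals, vecs = eigh(Z.T @ L @ Z, Z.T @ D @ Z)
    # Smallest-eigenvalue directions; the first is typically the near-constant
    # trivial solution and is often discarded before clustering.
    return Z @ vecs[:, :k]
```

In practice one would cluster the rows of the returned embedding (e.g. with k-means) to obtain the final grouping.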

Cited by 199 publications (158 citation statements)
References 26 publications
“…Often in graph partitioning, prior knowledge about the cluster assignment is encoded as pairwise constraints, like "must-link" or "cannot-link". One example is [18], where pairwise grouping information is included as linear equality constraints. However, these constraints have to be homogeneous, so they can only encode the information that a pair of examples belongs to the same cluster.…”
Section: Related Work (mentioning)
confidence: 99%
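To make the point about homogeneity concrete: a must-link pair (i, j) can be written as a single homogeneous equality u·x = 0 with u[i] = 1 and u[j] = -1, which any labeling with x[i] = x[j] satisfies regardless of the actual label values. The indices and labels in the toy check below are made up for illustration.

```python
import numpy as np

n = 5
u = np.zeros(n)
u[1], u[3] = 1.0, -1.0                        # hypothetical must-link pair (1, 3)

x_consistent = np.array([1, -1, 1, -1, 1])    # points 1 and 3 share a label
x_violating  = np.array([1, -1, 1,  1, 1])    # points 1 and 3 now differ

print(u @ x_consistent)   # 0.0  -> satisfies u.x = 0, feasible
print(u @ x_violating)    # -2.0 -> violates the constraint
# A cannot-link requirement x[i] != x[j] has no homogeneous linear-equality
# form of this kind, which is the limitation the citing authors point out.
```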
“…Generally, incorporating this prior knowledge will help to improve the performance of the algorithm. A natural way to incorporate prior information is through explicit constraints [7,18], solving the correspondingly augmented optimization problem.…”
Section: Introduction (mentioning)
confidence: 99%
“…The answer to the query leads to a constraint on z_u and z_v. However, this will lead to the "non-smooth" cluster-label problem mentioned in [9]: objects that are similar can have different cluster labels. To overcome this, we adopt the solution in [9] and consider the "average" label X_u, which can be viewed as a smoothed version of the cluster labels of the objects that are similar to the u-th object.…”
Section: Bayesian Feedback in Clustering (mentioning)
confidence: 99%
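For intuition, here is a minimal sketch of what an affinity-weighted "average" label could look like, assuming an affinity matrix W and hard labels z; it is not claimed to be the exact definition of X_u used in [9].

```python
import numpy as np

def smoothed_labels(W, z):
    """Affinity-weighted average of cluster labels (a hedged sketch).

    W : (n, n) nonnegative affinity matrix, W[u, v] = similarity of u and v
    z : (n,)   hard cluster labels, e.g. +1 / -1 for two clusters

    Returns X with X[u] = sum_v W[u, v] * z[v] / sum_v W[u, v]: each object's
    label is replaced by the average label of the objects similar to it.
    """
    degrees = W.sum(axis=1)
    return (W @ z) / np.maximum(degrees, 1e-12)   # guard against isolated points
```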
“…An essential difference between these algorithms is the locality of the grouping process. Shi and Malik [3] and Yu and Shi [1,7] solve it from a global standpoint, whereas Felzenszwalb and Huttenlocher [4], Nock [2], and Nielsen and Nock [5] make greedy local decisions to merge the connected components of induced subgraphs. Since segmentation is a global optimization process, the former approach is a priori a good candidate for tackling the problem, even though it faces computational complexity issues [3].…”
Section: Introduction (mentioning)
confidence: 99%
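For contrast with the global eigenvector approach, the sketch below illustrates the greedy, local merging style attributed to Felzenszwalb and Huttenlocher [4]: edges are processed in order of increasing dissimilarity, and two components merge only when the connecting edge is comparable to their internal variation. The merge predicate and the constant k follow the spirit of that method, not necessarily its exact formulation.

```python
class DisjointSet:
    """Union-find over n elements, tracking an internal-difference value per component."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n
        self.internal = [0.0] * n      # largest edge weight merged inside each component

    def find(self, a):
        while self.parent[a] != a:
            self.parent[a] = self.parent[self.parent[a]]
            a = self.parent[a]
        return a

    def union(self, a, b, w):
        ra, rb = self.find(a), self.find(b)
        if self.size[ra] < self.size[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]
        self.internal[ra] = max(self.internal[ra], self.internal[rb], w)

def greedy_merge(n, edges, k=1.0):
    """edges: list of (weight, i, j) dissimilarities; k: scale-of-observation constant."""
    ds = DisjointSet(n)
    for w, i, j in sorted(edges):                     # process cheapest edges first
        ri, rj = ds.find(i), ds.find(j)
        if ri == rj:
            continue
        # Merge only if the connecting edge is no heavier than either component's
        # internal difference plus a size-dependent tolerance (F-H style predicate).
        tol_i = ds.internal[ri] + k / ds.size[ri]
        tol_j = ds.internal[rj] + k / ds.size[rj]
        if w <= min(tol_i, tol_j):
            ds.union(ri, rj, w)
    return [ds.find(i) for i in range(n)]
```

Yu and Shi's biased normalized cut, by contrast, commits to a single global eigenproblem, as sketched after the abstract above.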
“…1, is based on a segmentation framework previously studied by Yu and Shi [1,7]: grouping with bias. It is particularly useful for domains in which the user may interact with the segmentation by inputting constraints to bias its result: sensor models in MRFs [8], human-computer interaction, spatial attention, and others [1].…”
Section: Introduction (mentioning)
confidence: 99%