2009
DOI: 10.1016/j.patcog.2008.11.029

A simultaneous learning framework for clustering and classification

Cited by 27 publications (24 citation statements)
References 25 publications (44 reference statements)
“…Thus, their performance degrades if the underlying input-output map changes over time. There is relatively little work on incorporating both labeled and unlabeled data while building ensembles, in contrast to the substantial amount of recent interest in semisupervised learning (including semisupervised clustering, semisupervised classification, clustering with constraints, and transductive learning methods) using a single model [Chapelle et al. 2006; Zhu and Goldberg 2009; Cai et al. 2009; Forestier et al. 2010; Chen et al. 2009]. …”
Section: Related Work (mentioning)
confidence: 99%
“…they adopt a two-step learning paradigm that fails to realize simultaneous optimization of both criteria. This may limit the strength of both clustering and classification. To obtain a satisfactory clustering and classification result, and inspired by our previous work [21], we present a multi-objective simultaneous learning framework (named MSCC) for both clustering and classification learning. In its implementation, we first employ Bayesian theory to bridge the two tasks and make all their objectives depend only on the same set of cluster centers as the parameters to be optimized.…”
Section: Hidden Layer (mentioning)
confidence: 99%
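The excerpt above describes clustering and classification objectives that share a single parameter set, the cluster centers. A minimal sketch of that idea follows, assuming a nearest-center decision rule in place of the Bayesian formulation the paper uses; the function and variable names are hypothetical, not MSCC's actual API.

```python
import numpy as np

def simultaneous_objectives(X, y, centers, center_labels):
    """Evaluate two objectives over one shared parameter set (sketch only;
    the actual MSCC objectives in the cited paper may differ).

    X             : (n, d) data matrix
    y             : (n,)   integer class labels
    centers       : (k, d) cluster centers -- the only free parameters
    center_labels : (k,)   class label attached to each center
    """
    # Squared distance from every point to every center: shape (n, k)
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    nearest = d2.argmin(axis=1)

    # Clustering objective: mean within-cluster scatter (lower is better)
    j_clu = d2[np.arange(len(X)), nearest].mean()

    # Classification objective: error of the nearest-center decision rule
    # (a stand-in for the Bayesian decision rule the excerpt mentions)
    j_cls = (center_labels[nearest] != y).mean()

    return j_clu, j_cls
```

Because both numbers are functions of the same centers, a multi-objective optimizer can trade them off directly, without the single scalar weight discussed in the next excerpt.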
“…Such algorithms sequentially optimize the clustering criterion and the classification criterion, and thus fail to achieve simultaneous optimality for the two criteria. Recently, we took a small step forward in this research and proposed a simultaneous learning algorithm for clustering and classification (named SCC) [21]. In SCC, the classification criterion and the clustering criterion are combined into a single objective function by a trade-off parameter whose goal is to balance classification and clustering performance; however, its optimal value is generally hard to choose except by an exhaustive search over some range, which imposes a heavy learning burden.…”
Section: Introduction (mentioning)
confidence: 99%
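By contrast, the earlier SCC algorithm collapses the two criteria into one scalar objective via a trade-off parameter. A hedged sketch, reusing the hypothetical objectives above, shows such a combined objective and the exhaustive search the excerpt calls a heavy burden; the exact combination used in SCC [21] may differ.

```python
import numpy as np

def combined_objective(X, y, centers, center_labels, lam):
    """Illustrative single objective: lam * classification error
    + (1 - lam) * within-cluster scatter (form assumed here, not
    taken from the SCC paper)."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    nearest = d2.argmin(axis=1)
    j_clu = d2[np.arange(len(X)), nearest].mean()
    j_cls = (center_labels[nearest] != y).mean()
    return lam * j_cls + (1.0 - lam) * j_clu

# Exhaustive search over the trade-off parameter: each candidate value
# of lam requires re-optimizing the centers from scratch, which is
# exactly the learning burden the excerpt points out.
# for lam in np.linspace(0.0, 1.0, 11):
#     ...optimize centers under combined_objective(..., lam)...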
“…These two tasks are usually treated separately, because their goals are quite different. However, when label information is available, learning the clustering and classification tasks simultaneously allows them to benefit from each other [5].…”
Section: Introduction (mentioning)
confidence: 99%
“…This paper focuses on learning clustering and classification tasks simultaneously; hence the cluster assumption is used as prior knowledge about the data. It suggests that when the intrinsic structure of the data, revealed by some clustering algorithm, is incorporated into the classification task, better performance may be achieved [5,13,25].…”
Section: Introduction (mentioning)
confidence: 99%
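As one concrete, assumed instance of the cluster assumption, a simple cluster-then-label baseline assigns each cluster the majority class of its labeled members. This is a plain two-step stand-in for illustration, not the simultaneous method of [5], and the helper name is hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_then_label(X_labeled, y_labeled, X_unlabeled, k):
    """Cluster-then-label baseline: points in the same cluster are
    presumed to share a class (y_labeled must hold nonnegative ints)."""
    X_all = np.vstack([X_labeled, X_unlabeled])
    km = KMeans(n_clusters=k, n_init=10).fit(X_all)
    labels_l = km.labels_[: len(X_labeled)]

    # Majority class per cluster, estimated from the labeled points;
    # -1 marks clusters that contain no labeled point at all.
    cluster_to_class = {}
    for c in range(k):
        members = y_labeled[labels_l == c]
        cluster_to_class[c] = (
            np.bincount(members).argmax() if len(members) else -1
        )

    labels_u = km.labels_[len(X_labeled):]
    return np.array([cluster_to_class[c] for c in labels_u])
```

The design choice worth noting is the one the excerpt criticizes: clustering here is fixed before classification begins, so label information never feeds back into the cluster structure.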