2012
DOI: 10.1162/neco_a_00284
Improved Generative Semisupervised Learning Based on Finely Grained Component-Conditional Class Labeling

Abstract: We introduce new inductive, generative semisupervised mixtures with more finely grained class label generation mechanisms than in previous work. Our models combine the advantages of semisupervised mixtures, which achieve label extrapolation over a component, and nearest-neighbor (NN)/nearest-prototype (NP) classification, which achieves accurate classification in the vicinity of labeled samples or prototypes. For our NN-based method, we propose a novel two-stage stochastic data generation, with all samples first ge…
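To ground the abstract's core idea, label extrapolation over mixture components, the sketch below runs semi-supervised EM on a Gaussian mixture: labeled samples have their component responsibilities fixed by their class, while unlabeled samples are soft-assigned, so both jointly shape the component parameters. This is a minimal illustration under simplifying assumptions (one unit-variance component per class), not the paper's finely grained component-conditional labeling model or its two-stage NN-based generation.

```python
import numpy as np

def semisup_em(X_lab, y_lab, X_unl, n_classes, n_iter=50):
    """Semi-supervised EM for a unit-variance Gaussian mixture.

    Simplifying assumption: one component per class, so a component's
    label is determined by its labeled members (the paper's models allow
    a finer component-conditional label mechanism).
    """
    X = np.vstack([X_lab, X_unl])
    # Initialize each component's mean from its labeled class mean.
    mu = np.array([X_lab[y_lab == k].mean(axis=0) for k in range(n_classes)])
    pi = np.full(n_classes, 1.0 / n_classes)
    # Labeled points: responsibilities fixed (hard) on their class component.
    R_lab = np.eye(n_classes)[y_lab]
    for _ in range(n_iter):
        # E-step: soft-assign unlabeled points by posterior under current params.
        logp = np.log(pi) - 0.5 * ((X_unl[:, None, :] - mu[None]) ** 2).sum(-1)
        R_unl = np.exp(logp - logp.max(axis=1, keepdims=True))
        R_unl /= R_unl.sum(axis=1, keepdims=True)
        R = np.vstack([R_lab, R_unl])
        # M-step: re-estimate mixing weights and means from ALL data,
        # so unlabeled samples refine the decision boundary.
        Nk = R.sum(axis=0)
        pi = Nk / len(X)
        mu = (R.T @ X) / Nk[:, None]
    return mu, pi

def classify(X, mu, pi):
    # Label extrapolation: a point inherits the label of its most
    # probable component, even far from any labeled sample.
    logp = np.log(pi) - 0.5 * ((X[:, None, :] - mu[None]) ** 2).sum(-1)
    return np.argmax(logp, axis=1)
```

With only a couple of labeled points per class, the unlabeled data pull the component means toward the true cluster centers, which is the extrapolation benefit the abstract contrasts with purely local NN/NP classification.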

Cited by 1 publication (1 citation statement)
References 36 publications
“…Semi-supervised machine learning algorithms are able to exploit unlabeled data to infer more accurate predictors compared to the case where only labeled data would be used. We refer to [44] and the references therein as an example of semi-supervised machine learning algorithms. We summarize the required preprocessing steps, advantages, and disadvantages of the three proposed criteria in Table 5.…”

Section: Discussion
confidence: 99%