2017
DOI: 10.5391/ijfis.2017.17.1.1

Deep Neural Network Self-training Based on Unsupervised Learning and Dropout

Abstract: In supervised learning methods, a large amount of labeled data is necessary to find reliable classification boundaries to train a classifier. However, labeled data is hard to obtain in practice, and labeling data is time-consuming and costly. Although unlabeled data is far more plentiful than labeled data, most supervised learning methods are not designed to exploit it. Self-training is one of the semi-supervised learning methods that alternatively r…
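The title's dropout component can be illustrated in isolation. Below is a minimal NumPy sketch of inverted dropout, the standard formulation in which surviving activations are rescaled at training time; the function name and signature are illustrative, not taken from the paper.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p and rescale
    survivors by 1/(1-p), so the expected activation is unchanged.
    At inference time (training=False) the input passes through as-is."""
    if not training or p == 0.0:
        return x
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)
```

Because of the rescaling, no weight adjustment is needed at test time, which is why this variant is the one used in most modern frameworks.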

Cited by 58 publications (28 citation statements)
References 13 publications (13 reference statements)
“…Every next round, we first (a) retrain the model using jointly ground truth labels and confident pseudo-labels; and then (b) update the pseudo-labels for all unlabeled data using the new model. It is crucial to reset model weights before every round of self-training not to accumulate errors in pseudo-labels during multiple rounds [29].…”
Section: Self-training Process
confidence: 99%
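The round-based procedure described in the quote above can be sketched as follows. This is a hedged illustration: scikit-learn's LogisticRegression stands in for the deep network, and `self_train`, its parameters, and the confidence threshold are assumptions rather than the cited paper's code.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def self_train(X_lab, y_lab, X_unlab, rounds=5, threshold=0.9):
    """Round-based self-training. The model is re-initialized every round
    so pseudo-label errors from earlier rounds do not accumulate, and
    pseudo-labels for ALL unlabeled examples are refreshed each round."""
    X_train, y_train = X_lab, y_lab
    model = None
    for _ in range(rounds):
        model = LogisticRegression(max_iter=1000)   # fresh weights every round
        model.fit(X_train, y_train)                 # (a) retrain on labels + pseudo-labels
        proba = model.predict_proba(X_unlab)        # (b) re-predict all unlabeled data
        confident = proba.max(axis=1) >= threshold  # keep only confident pseudo-labels
        X_train = np.vstack([X_lab, X_unlab[confident]])
        y_train = np.concatenate([y_lab, proba.argmax(axis=1)[confident]])
    return model
```

Note that the training set is rebuilt from the original labeled data each round rather than grown incrementally, mirroring the quote's point about not carrying stale pseudo-labels forward.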
“…Average voting scheme corrects examples which could be mislabeled by one of the models, hence facilitates more reliable pseudo-labeling. Moreover, to further reduce the error amplification we retrain our models from scratch and predict labels for all unlabeled examples every round in similar spirit as [29].…”
Section: Introduction
confidence: 99%
“…Lee et al [28] propose a deep-learning approach to fault monitoring in semiconductor manufacturing. They use a Stacked de-noising Auto-encoder (SdA) approach to provide an unsupervised learning solution.…”
Section: Existing Work
confidence: 99%
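The de-noising auto-encoder idea underlying the SdA mentioned here is to reconstruct a clean input from a corrupted copy. Below is a toy sketch using scikit-learn's MLPRegressor as a stand-in for one auto-encoder layer; the data, noise level, and layer size are assumptions, not Lee et al.'s setup.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.random((200, 20))                    # clean inputs
X_noisy = X + rng.normal(0, 0.1, X.shape)    # corrupted copies

# One de-noising auto-encoder layer: learn to map the noisy
# input back to the clean input through a narrow hidden layer.
ae = MLPRegressor(hidden_layer_sizes=(8,), max_iter=500, random_state=0)
ae.fit(X_noisy, X)
```

In a stacked SdA, the hidden representation learned by one such layer becomes the input to the next, yielding an unsupervised feature hierarchy before any labels are used.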
“…(3). We added a tiny change to the squashing function given in [2], shown in Eq. (4), since v_j, the output of capsule j, and s_j might share the 0 vector.…”
Section: Capsule Network
confidence: 99%
“…In particular, deep neural networks have achieved great success in various applications [1,2], especially in tasks involving visual information. Many state-of-the-art models have been introduced in the field that perform diverse tasks with high accuracy and effectiveness.…”
Section: Introduction
confidence: 99%