2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2018.8461700
Open Set Recognition by Regularising Classifier with Fake Data Generated by Generative Adversarial Networks

Cited by 36 publications (19 citation statements) · References 4 publications
“…Many techniques have been proposed for OOD detection based on traditional classification algorithms, such as OCSVM [41, 43, 44, 45], KNN [15], and K-means [16]. Recently, deep learning-based OOD detection techniques have been proposed [17, 19, 20, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55], some of which we now highlight.…”
Section: Background and Related Work (mentioning)
confidence: 99%
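The excerpt above lists one-class SVMs among the traditional OOD detectors. As a rough illustration only (not code from any of the cited works), the following Python sketch uses scikit-learn's OneClassSVM on synthetic feature vectors; the data shapes and hyperparameters are arbitrary assumptions.

```python
# Illustrative sketch: one-class SVM as a simple OOD detector over feature vectors.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X_in = rng.normal(0.0, 1.0, size=(500, 16))    # in-distribution training features (synthetic)
X_test = rng.normal(0.0, 3.0, size=(100, 16))  # shifted test features (synthetic)

ocsvm = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05)
ocsvm.fit(X_in)

# +1 -> treated as in-distribution, -1 -> flagged as OOD
pred = ocsvm.predict(X_test)
print("flagged OOD:", int((pred == -1).sum()), "of", len(X_test))
```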
“…GANs are generally incorporated into an open set discriminative framework by training a GAN on the known input dataset and converting the trained discriminator into an OOD classifier [46, 59]. Alternatively, the generator may be used to synthesize samples that are designated as belonging to an OOD class and then added to the full training set used to train a classifier [47, 48, 49, 50]. However, such GAN augmentation techniques may produce samples that do not adequately cover the entirety of the in-distribution (ID) decision boundaries, resulting in non-optimal OOD classifiers [60].…”
Section: Background and Related Work (mentioning)
confidence: 99%
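The regularisation idea described in the excerpt above, and in the paper under review, can be sketched as follows. This is a minimal, assumed illustration in PyTorch, not the authors' actual training code: GAN-generated samples are assigned an extra (K+1)-th "fake/unknown" label and mixed into the batches used to train the classifier. The stand-in generator, network sizes, and input shape (32x32 RGB) are placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

K = 10            # number of known classes (assumption)
latent_dim = 100  # assumed GAN latent size

# Stand-in generator so the sketch runs end to end; in practice this would be
# a GAN generator trained on the known classes.
generator = nn.Sequential(
    nn.Linear(latent_dim, 3 * 32 * 32),
    nn.Tanh(),
    nn.Unflatten(1, (3, 32, 32)),
)

classifier = nn.Sequential(           # toy (K+1)-way classifier over 32x32 RGB inputs
    nn.Flatten(),
    nn.Linear(3 * 32 * 32, 256), nn.ReLU(),
    nn.Linear(256, K + 1),            # index K is the fake/unknown class
)
opt = torch.optim.Adam(classifier.parameters(), lr=1e-3)

def train_step(x_real, y_real):
    """One update mixing real known-class data with GAN fakes labelled K."""
    z = torch.randn(x_real.size(0), latent_dim)
    x_fake = generator(z).detach()                          # no gradient into G here
    y_fake = torch.full((x_real.size(0),), K, dtype=torch.long)

    x = torch.cat([x_real, x_fake], dim=0)
    y = torch.cat([y_real, y_fake], dim=0)

    loss = F.cross_entropy(classifier(x), y)                # (K+1)-way cross-entropy
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# toy usage with random tensors standing in for a real batch
x = torch.randn(8, 3, 32, 32)
y = torch.randint(0, K, (8,))
print(train_step(x, y))
```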
“…These images, referred to as counterfactual images, belong to unknown classes but look like known classes. Using the GAN framework, another work [25] aims to generate synthetic data that serve as fake unknown classes for the classifier, making it robust against real unknown classes. Yu et al. [35] proposed the adversarial sample generation (ASG) framework, which produces unseen-class data.…”
Section: Deep Neural Network-based Algorithms (mentioning)
confidence: 99%
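The excerpt above notes that the synthetic "fake unknown" class is meant to make the classifier robust to real unknowns at test time. One plausible inference rule, shown below purely as an assumption (the excerpt does not specify it), is to reject a sample as unknown when the fake/unknown logit wins or the best known-class probability falls below a threshold tau.

```python
import torch
import torch.nn.functional as F

def open_set_predict(logits: torch.Tensor, k_known: int, tau: float = 0.5):
    """logits: (N, k_known + 1); returns class ids, with id k_known meaning 'unknown'."""
    probs = F.softmax(logits, dim=1)
    known_prob, known_cls = probs[:, :k_known].max(dim=1)   # best known-class probability
    return torch.where(known_prob >= tau, known_cls,
                       torch.full_like(known_cls, k_known)) # otherwise reject as unknown

# example: 3 known classes, 2 samples
logits = torch.tensor([[4.0, 0.1, 0.2, 0.3],   # confident known class 0
                       [0.2, 0.1, 0.3, 3.0]])  # fake/unknown logit dominates
print(open_set_predict(logits, k_known=3))     # tensor([0, 3])
```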
“…The first type includes SVM-based methods [26] and distance-based methods [1,2]. A collection of recent OSR works ventures in the generative direction [4,11,19,31]. A subset of OSR methods, named NEL, mainly employs Bayesian methods, such as the infinite Gaussian mixture model (IGMM) [24], to learn the UCs.…”
Section: Related Work (mentioning)
confidence: 99%
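For the distance-based flavour mentioned in the excerpt above, a minimal sketch (an illustration, not the method of the cited works) is a nearest-class-mean rule with a rejection radius: samples far from every known class mean are labelled unknown.

```python
import numpy as np

def fit_class_means(X: np.ndarray, y: np.ndarray):
    """Compute one mean feature vector per known class."""
    classes = np.unique(y)
    return classes, np.stack([X[y == c].mean(axis=0) for c in classes])

def predict_open_set(X: np.ndarray, classes, means, radius: float):
    """Nearest class mean, rejecting samples farther than `radius` from every mean."""
    d = np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2)  # (N, C) distances
    pred = classes[d.argmin(axis=1)].astype(object)
    pred[d.min(axis=1) > radius] = "unknown"                       # too far from all known classes
    return pred
```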
“…The presence of such instances weakens the robustness of conventional machine learning algorithms, as these algorithms do not account for instances from unknown classes in either the training or the test environment. To overcome this challenge, a series of related research directions has become popular in recent years; examples include anomaly detection (AD) [13,15,27,34], few-shot learning (FSL) [12,25], zero-shot learning (ZSL) [21,29], open set recognition (OSR), and open-world classification (OWC) [1,2,4,5,8,11,14,17,19,20,23,26,30,31]. Collectively, each of these works belongs to one of four different categories [6], differing in the kind of instances observed by the model during training and testing.…”
Section: Introduction (mentioning)
confidence: 99%