2019 IEEE 4th International Conference on Signal and Image Processing (ICSIP) 2019
DOI: 10.1109/siprocess.2019.8868599
Robust Dynamic Classifier Selection for Remote Sensing Image Classification

Abstract: Dynamic classifier selection (DCS) is a classification technique that, for each new sample to be classified, selects and uses the most competent classifier among a set of available ones. We propose a novel DCS model (R-DCS) based on the robustness of its prediction: the extent to which the classifier can be altered without changing its prediction. In order to define and compute this robustness, we adopt methods from the theory of imprecise probabilities. Additionally, two selection strategies for R-DCS mo…
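The generic DCS idea described in the abstract can be sketched as follows. This is not the paper's R-DCS robustness model (which relies on imprecise probabilities); it is a minimal illustration using overall local accuracy as the competence measure, with a hypothetical pool of three scikit-learn classifiers:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import NearestNeighbors

# Toy data split into train / validation / test.
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_tr, X_tmp, y_tr, y_tmp = train_test_split(X, y, test_size=0.5, random_state=0)
X_val, X_te, y_val, y_te = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

# A pool of heterogeneous base classifiers.
pool = [LogisticRegression(max_iter=1000).fit(X_tr, y_tr),
        DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr),
        GaussianNB().fit(X_tr, y_tr)]

nn = NearestNeighbors(n_neighbors=7).fit(X_val)

def dcs_predict(x):
    """For a new sample, select the classifier that is most accurate
    in the sample's local region of the validation set, then use it."""
    _, idx = nn.kneighbors(x.reshape(1, -1))
    region_X, region_y = X_val[idx[0]], y_val[idx[0]]
    local_acc = [clf.score(region_X, region_y) for clf in pool]
    best = int(np.argmax(local_acc))  # most competent classifier here
    return pool[best].predict(x.reshape(1, -1))[0]

preds = np.array([dcs_predict(x) for x in X_te])
print("DCS accuracy:", (preds == y_te).mean())
```

R-DCS replaces the local-accuracy competence criterion with a robustness measure; the selection loop itself has the same per-sample structure.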

Cited by 3 publications (7 citation statements)
References 16 publications
“…The intuition behind measuring the cosine distance of the cluster centroid and the input sample is to determine more relevant samples. The cosine similarity formula can be derived from equation (12), where a→ and b→ are two vectors containing the information of the cluster centroid and the input sample image. Later, all similar image samples are categorized into a hypothetical class, such as X. The hypothetical class X contains the samples of new spectral band images.…”
Section: Training Sample Repository Module
confidence: 99%
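The cosine similarity referred to in this citation statement (the citing paper's equation (12)) is the standard one, cos(a→, b→) = (a→·b→)/(|a→||b→|). A minimal sketch, with hypothetical centroid and sample vectors standing in for flattened image features:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between vectors a and b: (a.b) / (|a||b|)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical cluster centroid and two flattened image samples.
centroid = np.array([0.9, 0.1, 0.4])
sample_near = np.array([0.8, 0.2, 0.5])
sample_far = np.array([0.1, 0.9, 0.0])

# A sample pointing in nearly the same direction as the centroid
# scores close to 1 and is treated as more relevant.
print(cosine_similarity(centroid, sample_near))
print(cosine_similarity(centroid, sample_far))
```

Samples whose similarity to the centroid is high are the ones grouped into the hypothetical class X described above.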
“…The experiments are carried out on the Google Cloud Platform (GCP). In the GCP server (us-west1-b region), we installed the Compute Engine.
//For feature extraction task
(2) Perform convolution operation with F = 64 and nonlinearity, via equations (3) and (4)
(3) Add Gaussian noise, via equation (6)
(4) Apply max-pooling, via equation (7)
(5) Activate and deactivate neurons using dropout with p, via equation (8)
Forward accuracy to the performance feedback module
(5) Determine the ensemble accuracies using voting, via Algorithm 2
//Spectral band streams are classified with their respective instances in the DEC module
(6) if % accuracy for B ≥ //if sample is not misclassified
(7) Repeat steps 3, 4, and 5
(8) if % accuracy for B ≤ //if sample is misclassified
(9) Save B //save misclassified sample in training repository as potential new spectral band
(10) Counter++
(11) Repeat steps 3, 4, and 5
(12) if counter = 50 //number of misclassified instances reaches 50
(13) Cluster M_c using K-means where K = 1, [35] //cluster all misclassified data samples using the K-means approach; with K = 1 the case is assigned to the class of its nearest neighbor
(14) Determine optimized centroid, [35] //to optimize the similar sample instances
(15) Compare cosine distance of cluster sample with cluster centroid, via equation (12) //to segregate the most relevant samples in the cluster
(16) Assign all nearest samples a hypothetical class X = B_{n+1} //a new class with additional spectral band information
(17) Create new instance classifier i_{n+1} //new single instance, 20-layered architecture
(18) Train new instance classifier I = i_{n+1} with hypothetical class X = B_{n+1}, via Table 3 //online training with selected hyperparameters as depicted in Table 3
ALGORITHM 3: Continued.…”
Section: Platform and Libraries
confidence: 99%
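Steps (12)–(16) of the quoted algorithm, buffering misclassified samples, clustering them with K-means where K = 1 (which reduces to taking the mean as the single centroid), and keeping only samples close to that centroid in cosine distance, can be sketched as follows. The similarity threshold and buffer contents are assumptions for illustration; the citing paper does not state its threshold here:

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity: (a.b) / (|a||b|)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def select_relevant(misclassified, threshold=0.9):
    """K-means with K = 1 reduces to the mean of the buffer; keep samples
    whose cosine similarity to that centroid exceeds the threshold."""
    M = np.asarray(misclassified, dtype=float)
    centroid = M.mean(axis=0)  # single-cluster centroid
    sims = np.array([cosine_sim(x, centroid) for x in M])
    return M[sims >= threshold], centroid

# 50 buffered "misclassified" feature vectors (step 12), one of them an outlier.
rng = np.random.default_rng(0)
buffer = rng.normal(loc=1.0, scale=0.2, size=(50, 8))
buffer[0] = -buffer[0]  # outlier pointing away from the cluster

relevant, centroid = select_relevant(buffer)
print(len(relevant), "of", len(buffer), "samples kept for the hypothetical class")
```

The surviving samples are the ones assigned to the hypothetical class X = B_{n+1} and used to train the new instance classifier in steps (16)–(18).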
“…We combined this property with DCS and applied it to classification in our earlier work [58], but only as a proof of concept for toy cases with binary classes and two classifiers. In a follow-up work [59] we presented an abstract concept of how the robustness measures can be employed to improve the classification performance of DCS in HSI classification.…”
Section: Introduction
confidence: 99%
“…Here we develop a novel robust DCS (R-DCS) model in a general setting with multiple classes and multiple classifiers, and use it to take into account the imprecision of the model that is caused by errors in the sample labels. The main novelty lies in interpreting erroneous labels as model imprecision and addressing this problem from the point of view of the robustness of PGMs to model perturbations; this also sets this work apart from our previous, more theoretical, work on the robustness of PGMs [57,58,59], which did not consider the problem of erroneous labels. The main issue with erroneous labels, also referred to as noisy labels [60,61], is that they mislead model training and severely decrease classification performance [62,63,64].…”
Section: Introduction
confidence: 99%