2020
DOI: 10.1109/access.2019.2960065

A New Loss Function for CNN Classifier Based on Predefined Evenly-Distributed Class Centroids

Abstract: With the development of convolutional neural networks (CNNs) in recent years, network structures have become increasingly complex and varied, achieving very good results in pattern recognition, image classification, object detection and tracking. For CNNs used for image classification, in addition to the network structure, more and more research now focuses on improving the loss function, so as to enlarge the inter-class feature differences and reduce the intra-class feature variations …

Cited by 37 publications (23 citation statements)
References 17 publications
“…Where N, in this instance, is the number of iterations, f_i is the training loss value and y_i is the testing loss value. Consequently, MSE is calculated as presented in (2) [36], [37].…”
Section: Mean Squared Error
confidence: 99%
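The MSE described in the statement above (averaging the squared gap between training-loss values f_i and testing-loss values y_i over N iterations) can be sketched as follows; the function name and example values are ours, not from the cited work:

```python
import numpy as np

def mse(f, y):
    """Mean squared error: (1/N) * sum_i (f_i - y_i)^2, where f holds
    the training-loss values and y the testing-loss values over N
    iterations, matching the quoted definition."""
    f = np.asarray(f, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(f)  # N, the number of iterations
    return np.sum((f - y) ** 2) / n

# Illustrative call with made-up loss curves
print(mse([0.5, 0.4, 0.3], [0.6, 0.5, 0.2]))
```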
“…All experimental results show that our method achieves better classification accuracy than the classic Softmax loss, AM-Softmax, and PEDCC-loss. Here we set two sets of hyperparameters, s = 7.5, m = 0.35 and s = 10, m = 0.5, following the original PEDCC-loss [9]. The hyperparameter s is used to make the network trainable.…”
Section: Experiments and Results
confidence: 99%
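The hyperparameters s and m quoted above play the usual scale and additive-margin roles of an AM-Softmax-style loss: the margin m is subtracted from the target-class cosine similarity, and the scale s enlarges the logits enough for training to converge. A minimal NumPy sketch of that formulation (our own illustration, not the cited paper's code):

```python
import numpy as np

def am_softmax_loss(features, weights, labels, s=7.5, m=0.35):
    """Additive-margin softmax loss sketch.

    Cosine logits come from L2-normalised features and class weights;
    the target-class cosine gets the margin m subtracted, then all
    logits are scaled by s before cross-entropy. Defaults follow the
    s = 7.5, m = 0.35 setting quoted from the paper."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=0, keepdims=True)
    cos = f @ w                                   # (batch, classes)
    idx = np.arange(len(labels))
    logits = s * cos
    logits[idx, labels] = s * (cos[idx, labels] - m)  # apply margin
    # numerically stable cross-entropy on the scaled logits
    logits -= logits.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_prob[idx, labels].mean()
```

Because the margin only penalises the correct class, minimising this loss forces the target cosine to exceed the others by at least m, which is how the inter-class separation mentioned in the abstract is enforced.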
“…Although the Softmax+cross-entropy loss function has achieved great success in many computer vision fields as the norm for neural networks, researchers have made improvements to make the features learned by the convolutional neural network more distinctive between classes. Loss functions such as center-loss [5], L-Softmax [6], A-Softmax [7], AM-Softmax (Additive Margin) [8], PEDCC-loss (Pre-defined Evenly-Distributed Class Centroids) [9], etc. can improve the final accuracy of models trained for face classification and validation.…”
Section: Introduction
confidence: 99%
“…Due to its solidifying characteristics, PEDCC provides a unique research perspective on key problems in the field of pattern recognition, such as interpretable supervised/unsupervised learning, incremental learning, uncertainty analysis and so on. PEDCC has now been used in CNN classifiers [12], classification autoencoders [13], clustering [14], semi-supervised learning [15], etc. Although PEDCC has shown some excellent characteristics and has been well applied in some respects, its mathematical generation method and related characteristics have not been well studied, which hinders its further application.…”
Section: Introduction
confidence: hi