2021
DOI: 10.18523/2617-7080320203-10

Generalization of Cross-Entropy Loss Function for Image Classification

Abstract: Classification is one of the most common tasks in machine learning. This supervised learning problem consists of assigning each input to one of a finite number of discrete categories. Classification arises naturally in numerous applications, such as medical image processing, speech recognition, maintenance systems, accident detection, and autonomous driving. In the last decade, deep learning methods have proven extremely efficient across many machine learning problems, including classification…
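Since the abstract is truncated, it helps to state explicitly the loss function being generalized. Below is a minimal NumPy sketch of the standard (Shannon) cross-entropy loss for multi-class classification; the function name and the `eps` clipping constant are illustrative choices, not taken from the paper.

```python
import numpy as np

def cross_entropy_loss(probs, targets, eps=1e-12):
    """Mean Shannon cross-entropy between predicted class probabilities
    and one-hot targets.

    probs   : (N, K) array of predicted probabilities (rows sum to 1)
    targets : (N, K) array of one-hot ground-truth labels
    eps     : clipping constant for numerical stability (illustrative choice)
    """
    probs = np.clip(probs, eps, 1.0)  # avoid log(0)
    return -np.mean(np.sum(targets * np.log(probs), axis=1))

# Example: 2 samples, 3 classes
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]])
targets = np.array([[1, 0, 0],
                    [0, 1, 0]])
print(cross_entropy_loss(probs, targets))  # ~0.290 = (ln(1/0.7) + ln(1/0.8)) / 2
```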

Cited by 20 publications (12 citation statements). References 12 publications (20 reference statements).
“…Lebesgue measure for the continuous case) by a Radon-Nikodym derivative between probability measures. Shannon's entropy can be generalized to other entropies such as Rényi [15] and Tsallis-Havrda-Charvat [16,17]. In this paper, we are interested in a particular generalization of Shannon cross-entropy: Tsallis-Havrda-Charvat cross-entropy [18].…”
Section: Introduction
confidence: 99%
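For context, the Tsallis-Havrda-Charvat cross-entropy referenced in this statement is commonly written as follows. This is one standard form; normalization conventions vary across authors, so it should not be read as the cited paper's exact definition:

```latex
% Tsallis-Havrda-Charvat cross-entropy, parameter \alpha > 0, \alpha \neq 1
H_{\alpha}(p, q) = \frac{1}{\alpha - 1}\Bigl(1 - \sum_{i} p_i \, q_i^{\,\alpha - 1}\Bigr)
% Shannon cross-entropy is recovered in the limit \alpha \to 1, since
% q_i^{\alpha-1} = e^{(\alpha-1)\ln q_i} \approx 1 + (\alpha - 1)\ln q_i
% and \sum_i p_i = 1:
\lim_{\alpha \to 1} H_{\alpha}(p, q) = -\sum_{i} p_i \ln q_i
```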
“…This prominence is due to many reasons. First, CE has good theoretical grounding in information theory, which makes it useful for theoretical analysis of systems [13]. Second, CE loss has been proven to rival many loss functions in large datasets [43].…”
Section: Related Work, A. Cross Entropy Loss
confidence: 99%
“…Although many loss functions exist, cross-entropy remains one of the most reported and used for two-class classification [73], as is the case here. Specifically, the function measures the difference between two probability distributions, calculating the entropy associated with each class or element.…”
Section: F. Loss Function
confidence: 99%
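To make the two-class case above concrete, here is a minimal NumPy sketch of binary cross-entropy; the function name and the `eps` clipping constant are illustrative assumptions, not drawn from the cited work.

```python
import numpy as np

def binary_cross_entropy(p_hat, y, eps=1e-12):
    """Mean binary cross-entropy for two-class classification.

    p_hat : (N,) predicted probabilities of the positive class
    y     : (N,) ground-truth labels in {0, 1}
    eps   : clipping constant for numerical stability (illustrative)
    """
    p_hat = np.clip(p_hat, eps, 1.0 - eps)  # avoid log(0) on both sides
    return -np.mean(y * np.log(p_hat) + (1 - y) * np.log(1.0 - p_hat))

# Example: confident correct predictions yield a small loss
print(binary_cross_entropy(np.array([0.9, 0.2]), np.array([1, 0])))  # ~0.164
```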