2018
DOI: 10.4310/cis.2018.v18.n1.a2
An investigation for loss functions widely used in machine learning

Cited by 41 publications (24 citation statements). References 0 publications.
“…Repetition was done 20 times for the full model and 10 times for the short model with the default threshold of 0.01. Logistic loss/log loss was calculated with the “MLmetrics” package, and the best models were shown in the plots [36].…”
Section: Methods
confidence: 99%
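The statement above reports log loss computed with the R package "MLmetrics". As a minimal sketch of what that metric computes (this is an illustrative Python reimplementation, not the MLmetrics code itself), binary log loss averages the negative log-likelihood of the true labels under the predicted probabilities:

```python
import math

def log_loss(y_true, y_pred, eps=1e-15):
    """Binary logistic loss (cross-entropy), averaged over samples.

    Predictions are clipped to (eps, 1 - eps) to avoid log(0).
    """
    total = 0.0
    for y, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1 - eps)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

# Confident, correct predictions give a small loss.
print(log_loss([1, 0, 1], [0.9, 0.1, 0.8]))
```

A perfectly calibrated classifier drives this toward 0; confidently wrong predictions are penalized heavily because of the logarithm.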
“…In particular, we would rather have a small but regular error than very large errors that occur slightly less frequently, since the latter would severely affect the precision of the flux measurement. Common loss functions are investigated in [17].…”
Section: Loss Function For Training the Deep Learning Neural Network
confidence: 99%
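The preference expressed above (small regular errors over rare large ones) is exactly what a squared loss encodes. A hypothetical numerical illustration, assuming two error profiles with the same mean absolute error:

```python
def mse(errs):
    """Mean squared error over a list of residuals."""
    return sum(e * e for e in errs) / len(errs)

def mae(errs):
    """Mean absolute error over a list of residuals."""
    return sum(abs(e) for e in errs) / len(errs)

# Ten small, regular errors of 0.1 ...
regular = [0.1] * 10
# ... versus one large error of 1.0 among nine perfect predictions.
rare_large = [1.0] + [0.0] * 9

print(mae(regular), mae(rare_large))  # equal under MAE: 0.1 and 0.1
print(mse(regular), mse(rare_large))  # 0.01 vs 0.1 under MSE
```

Both profiles have identical MAE, but squaring makes the rare large error ten times costlier, so training against MSE pushes the model toward the "small but regular" regime the authors prefer.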
“…One of the most commonly used loss functions in regression tasks is Mean Squared Error (MSE), or L2 loss [26]. MSE is the mean of the squared distances between the real and predicted values; it is defined as:…”
Section: Loss Function
confidence: 99%
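The definition is truncated in the excerpt above; the standard MSE formula it refers to is $\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}(y_i - \hat{y}_i)^2$. A minimal sketch of that formula (illustrative, not the cited paper's code):

```python
def mean_squared_error(y_true, y_pred):
    """MSE = (1/n) * sum((y_i - yhat_i)**2) over all n samples."""
    assert len(y_true) == len(y_pred), "inputs must have equal length"
    n = len(y_true)
    return sum((y - yh) ** 2 for y, yh in zip(y_true, y_pred)) / n

print(mean_squared_error([3.0, 1.0, 2.0], [2.5, 1.0, 3.0]))
```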