2009
DOI: 10.1016/j.datak.2009.07.003

On the use of classification reliability for improving performance of the one-per-class decomposition method

Cited by 18 publications (8 citation statements)
References 39 publications (67 reference statements)
“…We now present two reconstruction rules for OpC: the first is a traditional implementation referred to as Hamming decoding (H_d), whereas the second, referred to as the MDS rule, was introduced in [4]. In the case of H_d, the final decision is given by O(x) = ω_s, with:…”
Section: Decomposition Methods (mentioning)
confidence: 99%
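The excerpt above breaks off before the formal decision rule, but Hamming decoding itself is standard: the winning class ω_s is the one whose codeword lies closest, in Hamming distance, to the vector of binary outputs. A minimal sketch in Python, assuming outputs in {0, 1} and the identity code matrix of a one-per-class (OpC) scheme; the function name hamming_decode and the example are illustrative, not taken from the paper:

```python
import numpy as np

def hamming_decode(binary_outputs: np.ndarray, code_matrix: np.ndarray) -> int:
    """Return the index s of the class whose codeword is closest, in Hamming
    distance, to the vector of binary classifier outputs.

    binary_outputs : shape (L,), entries in {0, 1}, one per dichotomizer.
    code_matrix    : shape (K, L), row i is the codeword of class omega_i.
                     For one-per-class (OpC) decomposition this is the K x K
                     identity matrix (class i is positive only for the i-th
                     dichotomizer).
    """
    distances = np.sum(code_matrix != binary_outputs, axis=1)  # Hamming distances
    return int(np.argmin(distances))                           # index of omega_s

# Example: 4-class OpC task in which only the third dichotomizer fires.
code_matrix = np.eye(4, dtype=int)
outputs = np.array([0, 0, 1, 0])
print(hamming_decode(outputs, code_matrix))  # -> 2
```

With the identity code matrix this reduces to picking the class whose dichotomizer was the only one to fire; when several codewords are equally close, np.argmin simply returns the first minimum.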
“…This algorithm, intrinsically binary, requires the use of a decomposition method to cope with multiclass problems, reducing these tasks to several binary subtasks. A reconstruction rule then provides the final classification [2][3][4]. Furthermore, it was proven that the unn classifier can effectively estimate the posterior probability of each classification act [5].…”
Section: Introduction (mentioning)
confidence: 99%
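As a sketch of the decomposition-plus-reconstruction pipeline the excerpt describes, the fragment below trains one binary dichotomizer per class and reconstructs the final label from the largest estimated posterior. Logistic regression is used only as a stand-in binary learner, and the maximum-posterior rule is one simple reconstruction choice; neither is claimed to be the method of the cited paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def train_one_per_class(X, y, classes):
    """One-per-class decomposition: the i-th binary classifier separates
    class classes[i] from all the other classes."""
    return [LogisticRegression(max_iter=1000).fit(X, (y == c).astype(int))
            for c in classes]

def reconstruct(dichotomizers, X):
    """Reconstruction-rule sketch: for each sample, pick the class whose
    dichotomizer reports the highest estimated posterior, and return that
    posterior as a rough reliability score."""
    # column i = estimated P(class i | x) from the i-th binary classifier
    scores = np.column_stack([clf.predict_proba(X)[:, 1] for clf in dichotomizers])
    return scores.argmax(axis=1), scores.max(axis=1)  # class indices, reliabilities
```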
“…= H(T)/H(T;Y), where H(T;Y) = −Σ_z p_t(z) log_2 p_y(z);
(22) NI based on cross-entropy: NI_22 = H(Y)/H(Y;T), where H(Y;T) = −Σ_z p_y(z) log_2 p_t(z);
(23) NI based on cross-entropy: NI_23 = 1 …;
(24) cross-entropy: NI_24 = (H(T)+H(Y))/(H(T;Y)+H(Y;T)).
Numerical examples in binary classifications (M1-M4 and M6: C_1 = 90, C_2 = 10; M5: C_1 = 95, C_2 = 5). (R) = ranking order for the model, where R = A, B, ..., in descending order from the top.…”
(mentioning)
confidence: 99%
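The quoted formulas can be computed directly from a confusion matrix, taking p_t and p_y as its row and column marginals. The sketch below does exactly that; labelling the first (truncated) ratio NI_21, omitting NI_23 (its definition is cut off in the excerpt), and the example confusion matrix are assumptions made for illustration only.

```python
import numpy as np

def cross_entropy_nis(confusion: np.ndarray) -> dict:
    """Cross-entropy-based normalized information (NI) measures, computed from a
    confusion matrix with rows = target classes T and columns = predicted classes Y."""
    n = confusion.sum()
    p_t = confusion.sum(axis=1) / n      # marginal distribution of targets
    p_y = confusion.sum(axis=0) / n      # marginal distribution of outputs
    eps = 1e-12                          # guard against log2(0)

    H_T  = -np.sum(p_t * np.log2(p_t + eps))   # entropy H(T)
    H_Y  = -np.sum(p_y * np.log2(p_y + eps))   # entropy H(Y)
    H_TY = -np.sum(p_t * np.log2(p_y + eps))   # cross-entropy H(T;Y)
    H_YT = -np.sum(p_y * np.log2(p_t + eps))   # cross-entropy H(Y;T)

    return {
        "NI21": H_T / H_TY,                    # label assumed for the truncated ratio
        "NI22": H_Y / H_YT,
        "NI24": (H_T + H_Y) / (H_TY + H_YT),
    }

# Hypothetical binary confusion matrix with a 90/10 class imbalance,
# in the spirit of the quoted numerical example.
cm = np.array([[85, 5],
               [4, 6]])
print(cross_entropy_nis(cm))
```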
“…In activity classification, a false assignment can occur due to the unreliable nature of sensor data [17], incorrect execution of an activity [18], similar activities with overlapping features [19], or the inability of a learning algorithm to assign the correct label [20]. In health care systems, the reliability of activity recognition models is extremely important; therefore, along with the correct recognition of activities, a model should also be capable of detecting and avoiding false assignments [20]. Most activity recognition approaches, while focusing on segmentation and recognition, may ignore false assignments [10,13,21].…”
Section: Introduction (mentioning)
confidence: 99%
“…Segmented activity instances are classified into activity classes using different learning models such as Hidden Markov Models (HMM) [10], Conditional Random Fields (CRF) [11], Naive Bayes (NB) [12], Support Vector Machines (SVM) [13], Artificial Neural Networks (ANN) [14,15], and Decision Trees (DT) [16]. In activity classification, a false assignment can occur due to the unreliable nature of sensor data [17], incorrect execution of an activity [18], similar activities with overlapping features [19], or the inability of a learning algorithm to assign the correct label [20]. In health care systems, the reliability of activity recognition models is extremely important; therefore, along with the correct recognition of activities, a model should also be capable of detecting and avoiding false assignments [20].…”
Section: Introduction (mentioning)
confidence: 99%
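Both excerpts above tie the usefulness of the cited work to detecting and avoiding false assignments through classification reliability. A minimal reject-option sketch, assuming reliability is approximated by the maximum posterior probability and using an arbitrary threshold; both choices are illustrative, not the authors' method.

```python
import numpy as np

def classify_with_reject(posteriors: np.ndarray, threshold: float = 0.7) -> np.ndarray:
    """Accept a classification only when its estimated reliability (here, the
    top posterior probability) exceeds a threshold; otherwise flag the sample
    as a potential false assignment.

    posteriors : shape (n_samples, n_classes), rows sum to 1.
    Returns predicted labels, with -1 marking rejected (unreliable) samples.
    """
    labels = posteriors.argmax(axis=1)
    reliability = posteriors.max(axis=1)
    labels[reliability < threshold] = -1   # withhold the decision
    return labels

# Hypothetical posteriors for three activity instances.
p = np.array([[0.92, 0.05, 0.03],    # confident  -> accepted
              [0.40, 0.35, 0.25],    # ambiguous  -> rejected
              [0.10, 0.15, 0.75]])   # confident  -> accepted
print(classify_with_reject(p))        # -> [ 0 -1  2]
```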