2020
DOI: 10.1007/s10462-020-09858-x
Multi-dimensional Bayesian network classifiers: A survey

Abstract: Multi-dimensional classification is a cutting-edge problem, in which the values of multiple class variables have to be simultaneously assigned to a given example. It is an extension of the well-known multi-label subproblem, in which the class variables are all binary. In this article, we review and expand the set of performance evaluation measures suitable for assessing multi-dimensional classifiers. We focus on multi-dimensional Bayesian network classifiers, which directly cope with multi-dimensional classifi…

Cited by 27 publications (24 citation statements) · References 98 publications
“…Traditional equations for precision, recall and, therefore, F1 score can only be used for a unique binary class variable. However, Gil-Begue et al [53] extended them for multiple, possibly nonbinary, class variables. Let B be a function that computes any of these evaluation metrics by receiving a confusion matrix; then the metric scores are obtained with macro- and micro-averaging as follows:

Macro-averaging averages the scores of each class variable:

$$B_{\mathrm{macro}} = \frac{1}{d} \sum_{y=1}^{d} B_{C_y}, \quad \text{where } B_{C_y} = \begin{cases} \dfrac{1}{|\Omega_{C_y}|} \displaystyle\sum_{c_j \in \Omega_{C_y}} B(tp_{c_j}, fp_{c_j}, tn_{c_j}, fn_{c_j}) & \text{if } |\Omega_{C_y}| > 2, \\[1ex] B(tp_{C_y}, fp_{C_y}, tn_{C_y}, fn_{C_y}) & \text{otherwise.} \end{cases}$$

If the class variable $C_y$ is binary, only the confusion matrix for one of its classes $(tp_{C_y}, fp_{C_y}, tn_{C_y}, fn_{C_y})$ is considered.

Micro-averaging aggregates the confusion matrices of each class variable:

$$B_{\mathrm{micro}} = B\left(\sum_{y=1}^{d} \dots\right)$$
…”
Section: Methodsmentioning
confidence: 99%
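For concreteness, the macro-/micro-averaging scheme quoted above can be sketched in Python, using precision as an example of the metric B. This is an illustrative sketch, not code from the cited works: the function names, the one-vs-rest confusion counts, and the choice of positive class for binary variables are all assumptions.

```python
def confusion_counts(y_true, y_pred, label):
    """One-vs-rest confusion counts (tp, fp, tn, fn) for a single class label."""
    tp = fp = tn = fn = 0
    for t, p in zip(y_true, y_pred):
        if p == label:
            tp, fp = (tp + 1, fp) if t == label else (tp, fp + 1)
        else:
            fn, tn = (fn + 1, tn) if t == label else (fn, tn + 1)
    return tp, fp, tn, fn

def precision(tp, fp, tn, fn):
    """An example metric B that receives a confusion matrix."""
    return tp / (tp + fp) if tp + fp else 0.0

def macro_average(B, Y_true, Y_pred):
    """Average B over the d class variables; a nonbinary variable's score is
    itself averaged over its classes, as in the quoted scheme."""
    scores = []
    for y_true, y_pred in zip(Y_true, Y_pred):  # one pair per class variable
        labels = sorted(set(y_true) | set(y_pred))
        if len(labels) > 2:
            s = sum(B(*confusion_counts(y_true, y_pred, c))
                    for c in labels) / len(labels)
        else:
            s = B(*confusion_counts(y_true, y_pred, labels[-1]))  # assumed positive class
        scores.append(s)
    return sum(scores) / len(scores)

def micro_average(B, Y_true, Y_pred):
    """Aggregate confusion matrices across class variables, then apply B once."""
    agg = [0, 0, 0, 0]
    for y_true, y_pred in zip(Y_true, Y_pred):
        labels = sorted(set(y_true) | set(y_pred))
        cells = labels if len(labels) > 2 else labels[-1:]
        for c in cells:
            for i, v in enumerate(confusion_counts(y_true, y_pred, c)):
                agg[i] += v
    return B(*agg)
```

With two binary class variables whose per-variable precisions are 2/3 and 1, macro-averaging gives their mean (5/6), while micro-averaging pools the confusion counts first, so variables with more positive predictions weigh more heavily.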
“…Bayes Net was used as a classifier and can yield highly accurate classifications if correctly trained. A comprehensive study of Bayesian networks can be found in [51, 52, 53].…”
Section: Background and Related Workmentioning
confidence: 99%
“…However, the combinatorial nature still exists, which means these deficiencies cannot be fully addressed. A family of MDC models called multi-dimensional Bayesian network classifiers [25] aims at learning different kinds of DAG structures over the class space to explicitly model class dependencies. However, determining DAG structures is computationally demanding, and generally only nominal features can be handled.…”
Section: Related Workmentioning
confidence: 99%