2014
DOI: 10.1016/j.knosys.2014.07.014

QMIQPN: An enhanced QPN based on qualitative mutual information for reducing ambiguity

Cited by 1 publication (1 citation statement)
References 28 publications
“…The entropy, H(X), of a discrete random variable X with p(x) = Pr(X = x) as its probability mass function is defined as follows: Also, the mutual information (MI) between two random variables X and Y with a joint probability distribution Pr(x, y) is formulated as follows [22]: Hence, Qualitative Mutual Information (QMI) is defined by multiplying the mutual information formula by a utility function U(X_i, Y_j); the formula is as follows [23, 24]: Different informative functions can be applied as the utility function. In the proposed approach, we use the Fisher ratio [2].…”
Section: Mutual Information, Qualitative Mutual Information and C… (mentioning)
confidence: 99%
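The excerpt above refers to three formulas ("… is defined as follows:") that did not survive extraction. As a hedged sketch based on the standard information-theoretic definitions, rather than the cited paper's exact typesetting, the missing expressions are presumably:

H(X) = -\sum_{x} p(x) \log p(x)

I(X; Y) = \sum_{x} \sum_{y} \Pr(x, y) \log \frac{\Pr(x, y)}{\Pr(x)\,\Pr(y)}

QMI(X; Y) = \sum_{i} \sum_{j} U(X_i, Y_j)\, \Pr(x_i, y_j) \log \frac{\Pr(x_i, y_j)}{\Pr(x_i)\,\Pr(y_j)}

where the utility function U(X_i, Y_j) weights each term of the mutual information; in the excerpted approach it is instantiated with the Fisher ratio.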