2021
DOI: 10.1504/ijiids.2021.118555
The extended Kullback-Leibler divergence measure in the unknown probability density function cases and applications

Year Published: 2022, 2023
Cited by 2 publications (2 citation statements)
References 0 publications
“…The Kullback–Leibler divergence between two probability density functions f(x) and g(x) is determined by formula (5) [30]: D_KL(f‖g) = ∫ f(x) ln(f(x)/g(x)) dx. We use the extended Kullback–Leibler divergence determined by formula (6) [31], in which the empirical probability of the data is used. In this context, it is important to note that the samples are arranged in ascending order.…”
Section: Estimate the Parameters of the Component Probability Distrib… (mentioning)
confidence: 99%
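The divergence in formula (5) can be approximated numerically with a simple Riemann sum. This is a minimal sketch, not the paper's implementation; the function names and the Gaussian test densities are illustrative assumptions:

```python
import math

def kl_divergence(f, g, xs):
    """Approximate D_KL(f || g) = ∫ f(x) ln(f(x)/g(x)) dx by a
    Riemann sum over the equally spaced grid xs."""
    dx = xs[1] - xs[0]
    total = 0.0
    for x in xs:
        fx, gx = f(x), g(x)
        if fx > 0 and gx > 0:          # skip points outside either support
            total += fx * math.log(fx / gx) * dx
    return total

def normal_pdf(mu, sigma):
    """Density of N(mu, sigma^2), used here only as a test case."""
    return lambda x: math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# For two unit-variance normals the closed form is (mu_f - mu_g)^2 / 2 = 0.5,
# so the numeric estimate below should be close to 0.5.
xs = [i * 0.01 - 10.0 for i in range(2001)]
f = normal_pdf(0.0, 1.0)
g = normal_pdf(1.0, 1.0)
print(kl_divergence(f, g, xs))
```

The closed-form value for this Gaussian pair gives a convenient sanity check on the grid approximation.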
“…Step 2: Calculate the extended Kullback–Leibler divergence values [11] for each candidate probability distribution compared with the data. Output: the probability distribution best suited to the sub-dataset is the one with the smallest extended Kullback–Leibler divergence.…”
Section: Estimate the Parameters of the Component Probability Distrib… (mentioning)
confidence: 99%
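The selection rule in Step 2 — score every candidate distribution against the data and keep the one with the smallest divergence — can be sketched as follows. The binned empirical divergence used here is a hedged stand-in for the paper's extended Kullback–Leibler formula, and the candidate densities are illustrative assumptions:

```python
import math
import random

def empirical_kl(samples, pdf, bins=30):
    """Approximate the divergence between the empirical distribution of
    `samples` and a candidate density `pdf` by binning the data and
    comparing empirical bin probabilities with the candidate's
    (a simple stand-in for the extended KL divergence)."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins
    counts = [0] * bins
    for s in samples:
        i = min(int((s - lo) / width), bins - 1)   # clamp the max sample
        counts[i] += 1
    n = len(samples)
    total = 0.0
    for i, c in enumerate(counts):
        if c == 0:
            continue
        p = c / n                          # empirical bin probability
        center = lo + (i + 0.5) * width
        q = pdf(center) * width            # candidate bin probability
        if q > 0:
            total += p * math.log(p / q)
    return total

def normal_pdf(mu, sigma):
    return lambda x: math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def exp_pdf(lam):
    return lambda x: lam * math.exp(-lam * x) if x >= 0 else 0.0

# Illustrative sub-dataset: Gaussian samples, so the normal candidate should win.
random.seed(0)
data = [random.gauss(5.0, 1.0) for _ in range(2000)]
candidates = {"normal(5,1)": normal_pdf(5.0, 1.0),
              "exponential(0.2)": exp_pdf(0.2)}
best = min(candidates, key=lambda name: empirical_kl(data, candidates[name]))
print(best)
```

Choosing the arg-min over candidates mirrors the Output line of the step above: the distribution with the smallest divergence from the data is declared the best fit for that sub-dataset.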