2019
DOI: 10.1051/ps/2018026
Statistical estimation of conditional Shannon entropy

Abstract: The new estimates of the conditional Shannon entropy are introduced in the framework of the model describing a discrete response variable depending on a vector of d factors having a density w.r.t. the Lebesgue measure in R^d. Namely, the mixed-pair model (X, Y) is considered, where X and Y take values in R^d and an arbitrary finite set, respectively. Such models include, for instance, the famous logistic regression. In contrast to the well-known Kozachenko–Leonenko estimates of unconditional entropy, the prop…
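
For orientation, the quantity being estimated in this mixed-pair setting is the conditional Shannon entropy of the discrete response Y given the continuous factor vector X. A standard way to write it (not quoted from the paper; the notation f(x, y) for the mixed joint density is an assumption of this sketch) is:

```latex
% Conditional Shannon entropy of a discrete Y given X with values in R^d.
% f(x, y) denotes the mixed joint density, i.e. P(Y = y, X \in A) = \int_A f(x, y)\,dx,
% and f(x) = \sum_y f(x, y) is the marginal density of X.
H(Y \mid X) = -\sum_{y} \int_{\mathbb{R}^d} f(x, y)\,\log\frac{f(x, y)}{f(x)}\,dx .
```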

Cited by 9 publications (8 citation statements)
References 37 publications
“…2. Adapt to the time series case those methods that do not assume any specific probability distribution model in the data, such as [18,38,39,40].…”
Section: Proposed Methods
confidence: 99%
“…The score function J in Equation (1) requires the computation of the following MI terms: 1) I(TS_i; TS_j), 2) I(TS; C), and 3) I(TS_i; TS_j | C). To compute them, methods that do not assume any probability distribution model in the data are considered [18,38,39,40]. All of these methods estimate the information shared between vector-valued variables using a k-nearest neighbor strategy.…”
Section: Proposed Methods
confidence: 99%
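
The k-nearest-neighbor strategy mentioned in this citation statement is typically a KSG-type (Kraskov–Stögbauer–Grassberger) estimator. Below is a minimal Python sketch of that idea for two continuous vector-valued samples; it is not the cited authors' implementation, and the function name knn_mutual_information and the default k = 3 are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def knn_mutual_information(x, y, k=3):
    """KSG-style k-NN estimate of I(X; Y) for continuous samples.

    x, y: arrays of shape (n_samples, d_x) and (n_samples, d_y).
    Returns an estimate in nats (can be slightly negative due to noise).
    """
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n = len(x)
    joint = np.hstack([x, y])

    # Distance to the k-th nearest neighbor in the joint space (max-norm).
    eps = cKDTree(joint).query(joint, k=k + 1, p=np.inf)[0][:, -1]

    # Count neighbors strictly within eps in each marginal space (excluding the point itself).
    tree_x, tree_y = cKDTree(x), cKDTree(y)
    nx = np.array([len(tree_x.query_ball_point(x[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])
    ny = np.array([len(tree_y.query_ball_point(y[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])

    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

# Illustrative usage on correlated Gaussian data
# (true MI here is -0.5 * log(1 - 0.8**2) ≈ 0.51 nats).
rng = np.random.default_rng(0)
x = rng.normal(size=(2000, 1))
y = 0.8 * x + 0.6 * rng.normal(size=(2000, 1))
print(knn_mutual_information(x, y))
```

A conditional term such as I(TS_i; TS_j | C) is then often obtained by stratifying over the values of the discrete conditioning variable C and averaging the per-stratum estimates with their empirical frequencies.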
“…Although the plug-in estimator of entropy is underbiased, using it in conjunction with Equation (21) yields estimates of mutual information that may be positively or negatively biased depending on the joint distribution [39]. Although largely unexplored (see [121, 122] for recent work), the empirical estimation of mutual information between discrete X and analog Y is also of interest in a number of applications; for example, suppose that […] and Y is an intracellular voltage trace from visual cortex neurons.…”
Section: Mutual Information: Memoryless Sources
confidence: 99%
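
On the plug-in route referenced above: for fully discrete data, the plug-in estimate of mutual information is simply the identity I(X; Y) = H(X) + H(Y) - H(X, Y) evaluated with empirical entropies, which is how the entropy bias propagates into the MI estimate. A minimal sketch, assuming discrete samples and illustrative function names (this is not the estimator analysed in the quoted reference):

```python
import numpy as np

def plugin_entropy(labels):
    """Plug-in (empirical) entropy in nats of a discrete sample."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def plugin_mutual_information(x, y):
    """Plug-in MI estimate via I(X; Y) = H(X) + H(Y) - H(X, Y)."""
    # Encode each (x, y) pair as a single joint symbol.
    joint = [f"{a}|{b}" for a, b in zip(x, y)]
    return plugin_entropy(x) + plugin_entropy(y) - plugin_entropy(joint)

# Small example: two independent fair coins have MI = 0, yet the plug-in
# estimate is typically positive at finite sample sizes, illustrating the bias.
rng = np.random.default_rng(1)
x = rng.integers(0, 2, size=200)
y = rng.integers(0, 2, size=200)
print(plugin_mutual_information(x, y))
```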