2018
DOI: 10.14209/jcis.2018.1

Estimation of Transfer Entropy between Discrete and Continuous Random Processes

Abstract: Transfer entropy is a widely applied measure of causality, and one of its identities expresses it as a sum of mutual information terms. In this article we evaluate two existing methods of mutual information estimation in the specific application of detecting causality between a discrete random process and a continuous random process: the binning method and the nearest neighbours method. Simulated examples confirm that, overall, the nearest neighbours method detects causality more reliably th…
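One standard identity writes first-order transfer entropy from a source X to a target Y in terms of mutual information: TE_{X→Y} = I(Y_t; X_{t−1}, Y_{t−1}) − I(Y_t; Y_{t−1}). The following is a minimal Python sketch of the binning estimator compared in the paper, using plug-in entropies over equally-spaced bins; the function names and the choice of 8 bins are illustrative assumptions, not the paper's code.

```python
import numpy as np

def entropy(labels):
    # Plug-in Shannon entropy (in nats) of a discrete sample;
    # labels may be a 1-D vector or a 2-D array of joint symbols.
    _, counts = np.unique(labels, return_counts=True, axis=0)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def mutual_information(a, b):
    # I(A; B) = H(A) + H(B) - H(A, B), estimated from samples.
    joint = np.column_stack([a, b])
    return entropy(a) + entropy(b) - entropy(joint)

def transfer_entropy(x, y, bins=8):
    # TE_{X->Y} = I(Y_t; X_{t-1}, Y_{t-1}) - I(Y_t; Y_{t-1}),
    # with the continuous target y discretized into equally-spaced bins
    # and the source x assumed already discrete.
    y_binned = np.digitize(y, np.histogram_bin_edges(y, bins=bins))
    y_t, y_past, x_past = y_binned[1:], y_binned[:-1], np.asarray(x)[:-1]
    joint_past = np.column_stack([x_past, y_past])
    return mutual_information(y_t, joint_past) - mutual_information(y_t, y_past)
```

As the paper's simulations suggest, this equally-spaced binning variant is fast but prone to false positives relative to the nearest neighbours estimator.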

Cited by 2 publications (1 citation statement)
References 24 publications (33 reference statements)
“…The allocation of data points to equally-spaced bins is less time consuming than other methods to estimate TE as the Nearest Neighbours method but has the drawback of detecting more false positives than the latter (Assis and de Assis 2018). In this paper we employ a q = 3-quantile binning, partitioning the data into three bins through the 5% and 95% empirical quantiles of the data distribution as suggested by Behrendt et al (2019); Dimpfl and Peter (2018).…”
Section: Entropy Estimation
Citation type: mentioning | Confidence: 99%
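For reference, the q = 3-quantile binning described in this citation statement can be sketched as follows. This is my own minimal reading of the setup attributed to Behrendt et al. (2019) and Dimpfl and Peter (2018), not their code; the function name is hypothetical.

```python
import numpy as np

def quantile_bin(series, lower=0.05, upper=0.95):
    # Partition a continuous series into three bins using the 5% and
    # 95% empirical quantiles of the data distribution as cut points.
    lo, hi = np.quantile(series, [lower, upper])
    # np.digitize maps values below lo to symbol 0, values in [lo, hi)
    # to symbol 1, and values at or above hi to symbol 2.
    return np.digitize(series, [lo, hi])
```

Entropies are then computed from the relative frequencies of the three symbols, as in the plug-in estimator sketched above.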