Neuronal Spike Train Entropy Estimation by History Clustering
Watters and Reeke, Neural Computation, 2014
DOI: 10.1162/neco_a_00627

Abstract: Neurons send signals to each other by means of sequences of action potentials (spikes). Ignoring variations in spike amplitude and shape that are probably not meaningful to a receiving cell, the information content, or entropy, of the signal depends only on the timing of the action potentials; and because there is no external clock, only the interspike intervals, and not the absolute spike times, are significant. Estimating spike train entropy is a difficult task, particularly with small data sets, and many methods…
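As a minimal sketch of the quantity the abstract describes, the snippet below estimates the Shannon entropy of a spike train's interspike-interval (ISI) distribution with a simple plug-in (histogram) estimator. The function name, bin width, and binning scheme are illustrative assumptions, not the paper's history-clustering method, which is designed precisely to improve on naive estimators like this one.

```python
import numpy as np

def isi_entropy_bits(spike_times, bin_width=0.005):
    """Plug-in Shannon entropy (bits) of the ISI distribution:
    discretize each interval into a fixed-width bin, then apply
    H = -sum p * log2(p) over the observed bin frequencies."""
    isis = np.diff(np.sort(np.asarray(spike_times, dtype=float)))
    bins = np.round(isis / bin_width).astype(int)   # discretize each ISI
    _, counts = np.unique(bins, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum() + 0.0)     # +0.0 normalizes -0.0

# A perfectly regular train has a single ISI value, hence zero entropy.
regular = np.arange(0, 1.0, 0.01)
print(isi_entropy_bits(regular))  # 0.0
```

With small data sets this estimator is biased downward, which is one motivation for the more careful methods the abstract alludes to.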

Cited by 4 publications (4 citation statements)
References 32 publications
“…We identified potential spikes using an action potential detector described elsewhere ( Choi et al, 2006 ). Then, we performed a principal component analysis (PCA) for each channel using the open-source cluster analysis program KlustaKwik ( http://klusta-team.github.io/klustakwik/ ) ( Watters and Reeke, 2014 ). Clusters of potential spikes were determined based on the first three components of the PCA.…”
Section: Methods
confidence: 99%
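The quoted workflow projects detected spike waveforms onto their first three principal components before clustering. KlustaKwik itself is an external program; the sketch below uses plain NumPy as a stand-in for the PCA step only, on synthetic waveforms, so the data shapes and function name are assumptions for illustration.

```python
import numpy as np

def first_three_pcs(waveforms):
    """Return each waveform's coordinates on the first three principal
    components (rows = detected spikes, columns = waveform samples)."""
    X = waveforms - waveforms.mean(axis=0)         # center each sample point
    # SVD of the centered data: rows of Vt are the principal axes,
    # ordered by decreasing variance explained.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:3].T                            # scores on PC1..PC3

rng = np.random.default_rng(0)
spikes = rng.normal(size=(200, 32))                # 200 spikes, 32 samples each
scores = first_three_pcs(spikes)
print(scores.shape)  # (200, 3)
```

The three-column score matrix is what a clustering program such as KlustaKwik would then partition into putative single units.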
“…For example, multi-modal data with well-separated peaks may have higher variability than uniformly distributed data, where the outcomes are the least predictable. Shannon's entropy (Shannon and Weaver, 1949) is widely used to measure randomness (Steuer et al, 2001; McDonnell et al, 2011; Watters and Reeke, 2014); however, it is not suitable for continuous distributions. A few other randomness measures based on entropy have been used in the neural context recently.…”
Section: Introduction
confidence: 99%
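The quoted point that discrete Shannon entropy "is not suitable for continuous distributions" can be illustrated with a short sketch (my own example, not from the cited works): the plug-in entropy of a continuous sample grows with the number of histogram bins rather than converging, which is why differential entropy or other measures are used for continuous data.

```python
import numpy as np

def histogram_entropy_bits(x, n_bins):
    """Plug-in Shannon entropy (bits) of a sample after histogramming."""
    counts, _ = np.histogram(x, bins=n_bins)
    p = counts[counts > 0] / counts.sum()          # drop empty bins
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(1)
x = rng.uniform(size=10_000)                       # continuous uniform sample
coarse = histogram_entropy_bits(x, 16)             # ~log2(16) = 4 bits
fine = histogram_entropy_bits(x, 1024)             # much larger
print(coarse, fine)                                # entropy rises with finer binning
```

For a uniform sample the estimate tracks log2(number of bins), so it diverges as the bins shrink instead of measuring an intrinsic property of the distribution.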
“…In recent decades, the theoretical analysis of signal transduction has been broadly applied in various research fields, in parallel with significant development of information theory [1][2][3][4][5]. Informational thermodynamics for analyzing dynamic biochemical networks and systems biology have also been developed to assess the cell response to external stimuli [1][2][3][4][5][6]. In addition, in in vivo analysis, a significant amount of data on signal transduction has accumulated, and the quantitative analysis of a network of signaling cascades can be performed using new technology [7][8][9][10][11][12][13][14][15][16].…”
Section: Introduction
confidence: 99%