A flexible Bayesian framework for unbiased estimation of timescales
Preprint (2020). DOI: 10.1101/2020.08.11.245944

Abstract: Timescales characterize the pace of change for many dynamic processes in nature: radioactive decay, metabolization of substances, memory decay in neural systems, and epidemic spreads. Measuring timescales from experimental data can reveal underlying mechanisms and constrain theoretical models. Timescales are usually estimated by fitting the autocorrelation of sample time-series with exponential decay functions. We show that this standard procedure often fails to recover the correct timescales, exhibiting large…


Cited by 9 publications (13 citation statements). References 94 publications (197 reference statements).
“…Compared to fitting exponential decay functions in the time domain (e.g., Murray et al, 2014 )—which can be biased even without the presence of additional components ( Zeraati et al, 2020 )—the frequency domain approach is advantageous when a variable power law exponent and strong oscillatory components are present, as is often the case for neural signals (example of real data in Figure 1D ). While the oscillatory component can corrupt naive measurement of τ as time for the ACF to reach 1/e ( Figure 1D , inset), it can be more easily accounted for and removed in the frequency domain as Gaussian-like peaks.…”
Section: Results
confidence: 99%
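The excerpt above notes that an oscillatory component can corrupt the naive measurement of τ as the time for the ACF to reach 1/e. A minimal sketch of that effect, assuming a discretized Ornstein–Uhlenbeck (AR(1)) process; this is an illustrative reconstruction, not code from the cited works, and all names and parameters are assumptions:

```python
import numpy as np

def sample_acf(x, max_lag):
    """Biased sample autocorrelation function, normalized to 1 at lag 0."""
    x = x - x.mean()
    c0 = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / c0
                     for k in range(max_lag + 1)])

def tau_one_over_e(ac, dt):
    """Naive timescale: first lag at which the ACF falls below 1/e."""
    below = np.where(ac < 1.0 / np.e)[0]
    return float(below[0]) * dt if below.size else float("nan")

rng = np.random.default_rng(0)
dt, tau_true, n = 1.0, 20.0, 200_000
a = np.exp(-dt / tau_true)                 # AR(1) ~ discretized OU process
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = a * x[t - 1] + rng.standard_normal()

ac_clean = sample_acf(x, 100)
tau_clean = tau_one_over_e(ac_clean, dt)   # close to tau_true

# Add a strong oscillation: its ACF contributes a cosine term that drags
# the combined ACF below 1/e far earlier than the true decay would.
osc = 2.0 * np.cos(2 * np.pi * 0.05 * np.arange(n)) * x.std()
ac_osc = sample_acf(x + osc, 100)
tau_osc = tau_one_over_e(ac_osc, dt)       # typically much smaller
```

The oscillation's cosine-shaped ACF crosses 1/e at a lag set by its frequency rather than the process timescale, which is why the excerpt recommends removing such components (e.g., as Gaussian-like spectral peaks) before estimating τ.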
“…The first problem with finding the best recipe for criticality in the brain is our inability to identify the brain's state from the observations we can make. We are slowly learning how to deal with strong subsampling (under-observation) of the brain network [17, 20, 56, 191–193]. However, even if we obtained a perfectly resolved observation of all activity in the brain, we would face the problem of constant input and spontaneous activation that renders it impossible to find natural pauses between avalanches, and hence makes avalanche-based analyses ambiguous [52].…”
Section: Discussion
confidence: 99%
“…Computing the autocorrelation time with the generalized timescale is difficult because the coefficients C(T) can be negative and are too noisy at large delays T. While model fitting is in general more data-efficient than the model-free estimation presented here, it can also produce biased and unreliable estimates [16]. Furthermore, when the coefficients do not decay exponentially, a more complex model has to be fitted [53], or the analysis simply cannot be applied.…”
Section: Discussion
confidence: 99%
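The model-fitting approach mentioned in the excerpt above, fitting an exponential decay to the autocorrelation coefficients, can be sketched as follows. This is an illustrative reconstruction under assumed parameters, not the cited estimator; a simple log-linear regression stands in for a full nonlinear fit, and as the excerpt notes, such fits can be biased on short or non-exponential data:

```python
import numpy as np

# Illustrative sketch: fit exp(-lag/tau) to the sample ACF by linear
# regression on log-correlations. All names and parameters are assumptions.
rng = np.random.default_rng(1)
dt, tau_true, n = 1.0, 20.0, 50_000
a = np.exp(-dt / tau_true)                 # AR(1) ~ discretized OU process
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = a * x[t - 1] + rng.standard_normal()

x = x - x.mean()
c0 = np.dot(x, x)
max_lag = 30
ac = np.array([np.dot(x[:n - k], x[k:]) / c0
               for k in range(1, max_lag + 1)])

# The log-linear fit is only defined for positive coefficients; noisy or
# negative values at long lags are exactly where such fits break down.
valid = ac > 0
lags = np.arange(1, max_lag + 1)[valid]
slope, _ = np.polyfit(lags * dt, np.log(ac[valid]), 1)
tau_fit = -1.0 / slope                     # estimate of tau_true
```

Restricting the fit to positive coefficients sidesteps, but does not solve, the problem the excerpt raises: when coefficients go negative or do not decay exponentially, a single-exponential model is simply the wrong model.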
“…Often, history dependence is characterized by how much spiking is correlated with spiking at a certain time lag [14, 15]. From the decay time of this lagged correlation, one obtains an intrinsic timescale of how long past information can still be read out [9–11, 16]. However, to quantify not only a timescale of statistical dependence, but also its strength, one has to quantify how much of a neuron’s spiking depends on its entire past.…”
Section: Introduction
confidence: 99%
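The intrinsic timescale described in the excerpt above can be sketched on synthetic data: a binary spike train driven by a latent rate with exponential autocorrelation inherits that decay in its lagged spike correlations. This is a hypothetical illustration, not the cited papers' method; the rate model, spiking probability, and lag range are all assumptions:

```python
import numpy as np

# Illustrative sketch: read off an intrinsic timescale from the decay of
# the lagged correlation of a binary spike train.
rng = np.random.default_rng(2)
dt, tau_true, n = 1.0, 10.0, 400_000
a = np.exp(-dt / tau_true)
rate = np.empty(n)                          # latent rate, AR(1) dynamics
rate[0] = 0.0
for t in range(1, n):
    rate[t] = a * rate[t - 1] + rng.standard_normal()

# Per-bin spiking probability modulated by the latent rate (assumed model).
p = np.clip(0.2 + 0.05 * rate, 0.0, 1.0)
spikes = (rng.random(n) < p).astype(float)

s = spikes - spikes.mean()
c0 = np.dot(s, s)
lags = np.arange(1, 16)
corr = np.array([np.dot(s[:n - k], s[k:]) / c0 for k in lags])

# The lagged spike correlations decay with the rate's timescale; a
# log-linear fit over short lags recovers it (guarded against log(0)).
slope, _ = np.polyfit(lags * dt, np.log(np.maximum(corr, 1e-12)), 1)
tau_intrinsic = -1.0 / slope
```

Note the excerpt's caveat: this recovers only a timescale of statistical dependence, not its strength, which is why the cited work turns to quantifying how much spiking depends on the entire past.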