2014
DOI: 10.1140/epjst/e2014-02286-7

A tutorial on time-evolving dynamical Bayesian inference

Abstract: In view of the current availability and variety of measured data, there is an increasing demand for powerful signal-processing tools that can cope successfully with the problems that often arise when data are analysed. In practice, many data-generating systems are not only time-variable, but are also influenced by neighbouring systems and subject to random fluctuations (noise) from their environments. To encompass problems of this kind, we present a tutorial about the dynamical Bay…

Cited by 42 publications (44 citation statements)
References 70 publications
“…Note that the chosen coupling functions q(x_i, x_j) represent the encryption key. Bearing this in mind, given a 2 × M time series X = {x_n ≡ x(t_n)} (t_n = nh) as input, the main task for dynamical Bayesian inference is to reveal the unknown model parameters and the noise diffusion matrix, M = {c, D}, which ultimately comes down to maximisation of the posterior conditional probability p_X(M|X) of observing the parameters M given the data X [49]. The relationship of this posterior conditional probability to the prior density p_prior(M) (which encompasses observation-based prior knowledge of the unknown parameters) and to the likelihood function ℓ(X|M) (the conditional probability density of observing X given the choice M) is given by Bayes' theorem:…”
Section: Dynamical Bayesian Inference
confidence: 99%
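The quoted passage describes the core of the method: the posterior p_X(M|X) over the model parameters is obtained from the prior and the likelihood via Bayes' theorem, and is then maximised. As a hedged illustration (not the paper's implementation), the sketch below infers a single coupling parameter c of a noisy phase model dφ₁/dt = ω + c·sin(φ₂ − φ₁) + noise by evaluating the log-posterior on a grid and taking the maximum; the model, parameter values, and Gaussian prior are all assumptions made for the example.

```python
import numpy as np

# Hedged sketch of dynamical Bayesian inference for one coupling parameter c:
# maximise p(c | X) ∝ l(X | c) * p_prior(c) over a grid, assuming a known
# noise strength D and a Gaussian likelihood for the phase increments.

rng = np.random.default_rng(0)
h, omega, c_true, D = 0.01, 1.0, 0.6, 0.05
n = 5000

# Simulate two coupled phases (Euler-Maruyama); phi2 drives phi1.
phi1 = np.zeros(n)
phi2 = np.zeros(n)
for k in range(n - 1):
    phi1[k + 1] = (phi1[k] + h * (omega + c_true * np.sin(phi2[k] - phi1[k]))
                   + np.sqrt(2 * D * h) * rng.standard_normal())
    phi2[k + 1] = phi2[k] + h * 1.8   # deterministic driver, frequency 1.8

# Log-posterior of a candidate c given the observed phase increments.
dphi = np.diff(phi1)
base = np.sin(phi2[:-1] - phi1[:-1])

def log_post(c):
    mu = h * (omega + c * base)                # predicted increments
    ll = -np.sum((dphi - mu) ** 2) / (4 * D * h)  # Gaussian log-likelihood
    lp = -c ** 2 / (2 * 10.0)                  # broad Gaussian prior on c
    return ll + lp

grid = np.linspace(-2, 2, 401)
c_map = grid[int(np.argmax([log_post(c) for c in grid]))]
print(round(c_map, 2))
```

In the full method the parameter vector and the noise diffusion matrix are inferred jointly and updated sequentially over time windows; the grid search here stands in for that machinery only to make the posterior-maximisation step concrete.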
“…Further details of the development, software implementation and application of the dynamic Bayesian inference approach can be found in [21], [27], [28], [49], and references therein.…”
Section: Dynamical Bayesian Inference
confidence: 99%
“…the encrypted signals. More details about the method, its implementation and coding can be found in [28], [29].…”
Section: B. Dynamical Bayesian Inference
confidence: 99%
“…The odd parameters correspond to the coupling terms inferred for ϕ_1 in the direction 2 → 1, and the even parameters correspond to the coupling terms inferred for ϕ_2 in the direction 1 → 2. See [32] for further details and an in-depth tutorial on dynamical Bayesian inference and its implementation. In summary, Bayesian inference is applied to ϕ*_x and ϕ*_A …”
Section: Dynamical Bayesian Inference
confidence: 99%
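The last quoted passage attributes alternate entries of the inferred parameter vector to the two coupling directions. A minimal illustration of that bookkeeping, with a hypothetical coefficient vector and the common convention of summarising each direction by the Euclidean norm of its coefficients (both assumptions, not taken from the quoted paper):

```python
import numpy as np

# Hypothetical inferred coupling coefficients, alternating by direction:
# positions 1, 3, 5, ... (odd) -> coupling 2 -> 1 acting on phi_1,
# positions 2, 4, 6, ... (even) -> coupling 1 -> 2 acting on phi_2.
c = np.array([0.5, 0.1, -0.3, 0.05, 0.2, 0.0])

c_21 = c[0::2]   # odd-position parameters: direction 2 -> 1
c_12 = c[1::2]   # even-position parameters: direction 1 -> 2

# Summarise each direction by the Euclidean norm of its coefficients.
strength_21 = np.linalg.norm(c_21)
strength_12 = np.linalg.norm(c_12)
print(strength_21 > strength_12)  # here the 2 -> 1 direction dominates
```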