2005
DOI: 10.1109/tsp.2005.853103
Mixture-based extension of the AR model and its recursive Bayesian identification

Abstract: An extension of the AutoRegressive (AR) model is studied, which allows transformations and distortions of the regressor to be handled. Many important signal processing problems are amenable to this Extended AR (EAR) model. It is shown that Bayesian identification and prediction of the EAR model can be performed recursively, in common with the AR model itself. The EAR model does, however, require that the transformation be known. When it is unknown, the associated transformation space is represent…
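The abstract's claim that Bayesian identification of the (E)AR model can be performed recursively can be illustrated with a minimal sketch. The code below is not the paper's EAR algorithm; it is the standard conjugate Gaussian recursion for plain AR coefficients with an assumed known noise variance, which is the special case the EAR model extends. All function and variable names are illustrative.

```python
import numpy as np

def recursive_bayes_ar(x, p, noise_var=1.0, prior_var=100.0):
    """Posterior mean/covariance of AR(p) coefficients after one
    recursive sweep through the signal x (conjugate Gaussian updates)."""
    mean = np.zeros(p)                  # prior mean of the AR coefficients
    cov = prior_var * np.eye(p)         # weakly informative prior covariance
    for t in range(p, len(x)):
        phi = x[t - p:t][::-1]          # regressor: the p most recent samples
        # One Gaussian conditioning step (a Kalman-filter measurement update)
        s = phi @ cov @ phi + noise_var
        k = cov @ phi / s               # gain vector
        mean = mean + k * (x[t] - phi @ mean)
        cov = cov - np.outer(k, phi @ cov)
    return mean, cov

# Usage: recover known AR(2) coefficients from simulated data.
rng = np.random.default_rng(0)
true_a = np.array([0.6, -0.3])
x = np.zeros(500)
for t in range(2, 500):
    x[t] = true_a @ x[t - 2:t][::-1] + 0.1 * rng.standard_normal()
mean, cov = recursive_bayes_ar(x, p=2, noise_var=0.01)
```

Each data point updates the posterior in O(p^2) time, so the cost per sample is constant — the recursive property the abstract highlights; the EAR model replaces `phi` with a (possibly unknown) transformation of the regressor.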

Cited by 17 publications (9 citation statements). References 32 publications.
“…To design a sliding mode controller based on Eq. (28), the dynamics of the system and the gain matrix of inputs are defined as follows:…”
Section: Prediction Error Methods Algorithm
confidence: 99%
“…A serious problem in model-based control of robotic systems is the heavy computational burden imposed on the robot computer, which makes real-time execution of the control algorithm costly [18]. To date, several solutions to this problem have been proposed based on identification and estimation strategies [19–33]. To address it, we propose a model that can be posed directly between the actuators' outputs and the sensor measurements of distances from the sliding surfaces.…”
Section: Introduction
confidence: 99%
“…In order to further obtain a tractable variational treatment of the MSFA model, we consider the expression of the MSFA model density conditional on the latent factor vectors y_ij and scale vectors u_j, given by (24), and re-express it in terms of a marginalization over a set of binary latent variables denoting the label of the component factor analyzer from which each of the observable data x_j, j = 1, …, n derives. Let us denote by Z = {z_j}_{j=1}^n the set of label indicator vectors z_j = (z_ij), with z_ij ∈ {0, 1} and such that z_ij = 1 if x_j is viewed as generated by the i-th mixture component analyzer, and z_ij = 0 otherwise.…”
Section: B. The Variational Bayes Mixture of Student's-t Factor Analyzers
confidence: 99%
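The binary labels z_ij described in the quote above are, in EM and variational treatments, replaced by their posterior expectations ("responsibilities"). A minimal sketch of that construction, shown for a one-dimensional Gaussian mixture rather than the quote's mixture of factor analyzers (the component density differs, the indicator machinery is the same); all names here are illustrative:

```python
import numpy as np

def responsibilities(x, means, variances, weights):
    """r[i, j] = posterior probability that sample x_j was generated by
    mixture component i, i.e. E[z_ij | x_j] for the binary labels z_ij."""
    x = np.asarray(x, dtype=float)
    # log w_i + log N(x_j | m_i, v_i), stacked to shape (k, n)
    log_p = np.stack([
        np.log(w) - 0.5 * np.log(2 * np.pi * v) - (x - m) ** 2 / (2 * v)
        for m, v, w in zip(means, variances, weights)
    ])
    log_p -= log_p.max(axis=0)      # subtract column max for numerical stability
    r = np.exp(log_p)
    return r / r.sum(axis=0)        # columns sum to 1, mirroring sum_i z_ij = 1

# Usage: two well-separated components; each sample is assigned confidently.
r = responsibilities([0.0, 5.0], means=[0.0, 5.0],
                     variances=[1.0, 1.0], weights=[0.5, 0.5])
```

The hard constraint z_ij ∈ {0, 1} with a single 1 per column becomes the soft constraint that each column of `r` sums to 1.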
“…Variational Bayesian inference has previously been applied to GMMs (e.g. [22]), autoregressive models [23], [24], SMMs [13], [14] and conventional (Gaussian) MFA [25], thereby avoiding the singularity and overfitting problems of ML approaches.…”
Section: Introduction
confidence: 99%
“…Our novel approach is based on variational approximation methods [8], which have recently emerged as a deterministic alternative to Markov chain Monte-Carlo (MCMC) algorithms for doing Bayesian inference for probabilistic generative models [9,10], with better scalability in terms of computational cost [11]. Variational Bayesian inference has previously been applied to relevance vector machines [12], GMMs [13], autoregressive models [14,15], SMMs [16,17], mixtures of factor analyzers [18,19,20], discrete HMMs [21], Gaussian HMMs [22], as well as HMMs with…”
Section: Introduction
confidence: 99%