2016
DOI: 10.1109/tac.2015.2491678
AR Identification of Latent-Variable Graphical Models

Abstract: The paper proposes an identification procedure for autoregressive Gaussian stationary stochastic processes in which the manifest (or observed) variables are mostly related through a small number of latent (or hidden) variables. The method exploits the sparse plus low-rank decomposition of the inverse of the manifest spectral density and the efficient convex relaxations recently proposed for such a decomposition.
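The sparse plus low-rank structure mentioned in the abstract can be motivated by the static (i.i.d. Gaussian) analogue: marginalizing out a few latent variables from a sparse joint precision matrix yields a manifest precision that is a sparse matrix minus a low-rank correction. The sketch below is illustrative only (it is not the paper's algorithm, and all variable names are assumptions); it constructs such a model and checks the rank of the low-rank term via the Schur complement.

```python
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_lat = 8, 2  # counts of manifest and latent variables (illustrative)

# Sparse precision among the observed variables: diagonal plus one edge.
S = np.eye(n_obs) * 3.0
S[0, 1] = S[1, 0] = 0.5

# Coupling between observed and latent variables, and latent precision.
B = rng.standard_normal((n_obs, n_lat)) * 0.3
K_hh = np.eye(n_lat) * 2.0

# Marginal (manifest) precision is the Schur complement of the latent block:
#   K_manifest = S - B K_hh^{-1} B^T
# i.e. "sparse minus low-rank", with rank of the correction at most n_lat.
L = B @ np.linalg.inv(K_hh) @ B.T
K_manifest = S - L

print(np.linalg.matrix_rank(L))   # rank of the low-rank correction
```

In the autoregressive setting treated by the paper, the same decomposition appears frequency-by-frequency in the inverse manifest spectral density, which is what makes the convex relaxation over sparse and low-rank terms applicable.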

Cited by 69 publications (95 citation statements) | References 24 publications
“…In this section, we describe an estimation method which is clearly different from the one in [10]. More precisely, we generalize the Expectation-Maximization algorithm from [5], developed there for independent and identically distributed random variables, to a VAR process.…”
Section: New Algorithm
confidence: 99%
“…This step of the algorithm has a strong theoretical justification, which stems from the fact that Φ⁻¹(ω) is the Maximum Entropy solution of a covariance extension problem (see [10], Remark 2.1). The number of iterations, N_it, is the same as in the case of the first loop.…”
Section: Initialization
confidence: 99%