2017
DOI: 10.1016/j.automatica.2017.07.053
A new kernel-based approach to system identification with quantized output data

Abstract: In this paper we introduce a novel method for linear system identification with quantized output data. We model the impulse response as a zero-mean Gaussian process whose covariance (kernel) is given by the recently proposed stable spline kernel, which encodes information on regularity and exponential stability. This serves as a starting point to cast our system identification problem into a Bayesian framework. We employ Markov Chain Monte Carlo methods to provide an estimate of the system. In particular, we d…
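
To make the modelling step described in the abstract concrete, the following is a minimal sketch, assuming the first-order stable spline (TC) kernel K(i, j) = c · α^max(i, j) with 0 < α < 1 and a simple binary quantizer; the kernel variant, the quantizer, and all parameter values are illustrative assumptions rather than the paper's exact setup. The sketch builds the kernel Gram matrix, draws an impulse response from the zero-mean Gaussian process prior, and simulates quantized output data (the MCMC estimation step itself is not reproduced here):

import numpy as np

def stable_spline_kernel(n, alpha=0.9, scale=1.0):
    # First-order stable spline (TC) kernel Gram matrix:
    # K[i, j] = scale * alpha**max(i, j), with 0 < alpha < 1,
    # encoding smoothness and exponential decay of the impulse response.
    idx = np.arange(n)
    return scale * alpha ** np.maximum.outer(idx, idx)

def simulate_quantized_output(u, g, threshold=0.0, noise_std=0.1, rng=None):
    # Convolve the input u with the impulse response g, add Gaussian noise,
    # and pass the result through a binary quantizer (illustrative choice).
    rng = np.random.default_rng() if rng is None else rng
    y_lin = np.convolve(u, g)[: len(u)]                 # noiseless linear output
    y_noisy = y_lin + noise_std * rng.standard_normal(len(u))
    return np.where(y_noisy >= threshold, 1.0, -1.0)    # quantized measurements

# Draw an impulse response from the zero-mean GP prior and generate data.
rng = np.random.default_rng(0)
n, N = 50, 200                                  # impulse response / data lengths
K = stable_spline_kernel(n, alpha=0.85)
g = rng.multivariate_normal(np.zeros(n), K)     # g ~ N(0, K)
u = rng.standard_normal(N)
y_quantized = simulate_quantized_output(u, g, rng=rng)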

Cited by 47 publications (20 citation statements)
References 41 publications

“…In [1], [19], the EM algorithm was used to optimize the likelihood function, while in [37] a variational approximation approach was utilized. Additionally, Bayesian frameworks were applied in, e.g., [9]. The authors of [43] proposed an algorithm based on the recursive prediction error method to estimate the linear part of Wiener systems, which can be used to deal with quantized output models, but both quantizers and the range of parameters were assumed to be known.…”
Section: B. Related Work
Mentioning confidence: 99%
“…See Appendix C, Remark 9. This theorem is key to establishing consistent estimation of the weighted adjacency matrix A*, since it shows that θ* = vec(A*c*) can be obtained by optimizing (9). Given its significance, one direction for future work is to generalize the disturbance assumption.…”
Section: A. An Objective Function and Its Concavity
Mentioning confidence: 99%
“…Recently, a number of set-valued identification algorithms have been developed. They can be classified roughly by system model: for linear systems there are FIR models [9][10][11][12][13][14], while for nonlinear systems there are Wiener [15] and Hammerstein [16] models. They can also be categorized by quantization scheme, e.g., fixed-level quantizers and uniform quantizers.…”
Section: Introduction
Mentioning confidence: 99%
“…Besides, identification algorithms can also be categorized by quantization scheme, e.g., fixed-level quantizers and uniform quantizers. In particular, for FIR models with finite quantization levels, the available approaches include the empirical measure approach [9,17], which enjoys unbiasedness and strong consistency; the EM algorithm [10,11], which converges exponentially fast under certain conditions; the kernel-based method [12]; and the quadratic programming-based method [13].…”
Section: Introduction
Mentioning confidence: 99%
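
As a side note on the two quantization schemes mentioned in the statement above, here is a minimal sketch contrasting a fixed-level (binary) quantizer with a uniform quantizer; the function names, threshold, and step size are illustrative assumptions:

import numpy as np

def binary_quantizer(y, threshold=0.0):
    # Fixed-level (binary) quantizer: reports only whether y exceeds the threshold.
    return np.where(np.asarray(y) >= threshold, 1.0, -1.0)

def uniform_quantizer(y, step=0.5):
    # Uniform quantizer: rounds y to the nearest multiple of a fixed step size.
    return step * np.round(np.asarray(y) / step)

y = np.array([-0.8, -0.1, 0.3, 1.2])
binary_quantizer(y)    # -> [-1., -1.,  1.,  1.]
uniform_quantizer(y)   # -> [-1., -0.,  0.5, 1.]

The binary quantizer only reports which side of the threshold each sample falls on, whereas the uniform quantizer retains amplitude information up to the step size.
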
“…Zhao et al. [12] considered linear system identification with batched binary-valued observations and constructed an iterative parameter estimation algorithm to achieve the maximum likelihood estimate. Bottegal et al. [13] modelled the impulse response as a zero-mean Gaussian process whose covariance (kernel) was given by the recently proposed stable spline kernel and introduced a novel method for linear system identification with quantised output data. Zhao et al. [14] addressed the recursive non-parametric identification of non-linear systems with adaptive binary sensors.…”
Section: Introduction
Mentioning confidence: 99%