2019
DOI: 10.1109/access.2019.2953126

Recursive Parsimonious Subspace Identification for Closed-Loop Hammerstein Nonlinear Systems

Abstract: In this paper, a recursive closed-loop subspace identification method for Hammerstein nonlinear systems is proposed. To reduce the number of unknown parameters to be identified, the original hybrid system is decomposed into two parsimonious subsystems, with each subsystem related directly to either the linear dynamics or the static nonlinearity. To avoid redundant computations, a recursive least-squares (RLS) algorithm is established for identifying the common terms in the two parsimonious subsystems, whil…
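The RLS step mentioned in the abstract can be sketched in generic form. This is a minimal textbook recursive least-squares update, not the paper's parsimonious two-subsystem formulation; all names and the regressor layout are illustrative.

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=1.0):
    """One standard recursive least-squares step (illustrative sketch;
    the paper's method shares common terms across two parsimonious
    subsystems, which is not reproduced here)."""
    phi = phi.reshape(-1, 1)
    K = P @ phi / (lam + phi.T @ P @ phi)   # gain vector
    e = y - (phi.T @ theta).item()          # a priori prediction error
    theta = theta + K * e                   # parameter update
    P = (P - K @ phi.T @ P) / lam           # covariance update
    return theta, P
```

Called once per sample, this avoids re-solving the full least-squares problem at every time step, which is the "redundant computation" the recursive formulation removes.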

Cited by 9 publications (11 citation statements)
References 37 publications
“…Corollary 1. When applying algorithm (23)-(25) to the time-invariant noisy H-W systems parameterized by (4) under Assumptions 1-5, if (29)-(31) and (41) are fulfilled, the estimation error θ̃_t is exponentially bounded in mean square and bounded with probability one. Proof: If the H-W system is time-invariant, we have…”
Section: Convergence Analysis
confidence: 99%
“…The more complex types, namely Hammerstein-Wiener (H-W) and Wiener-Hammerstein (W-H), consist of their combinations. Many works have been published to identify these kinds of systems, such as the two-stage method [5]-[7], the maximum likelihood estimation method [8], [9], the iterative method [10]-[15], the recursive method [16]-[25], etc. All of the contributions mentioned above studied the identification of time-invariant block-oriented systems.…”
Section: Introduction
confidence: 99%
“…, k(y_2, z)], ψ_j(y_n) is the j-th eigenvector of the N × N covariance function matrix evaluated at the N data points, corresponding to eigenvalue λ_j, and ψ̃_j(y_n) = √N λ_j ψ_j(y_n). We can interpret the j-th eigenfunction in (20) as a linear combination of kernel values evaluated at the N data points, where the weights of the linear combination are provided by ψ̃_j(y_n). There are in general as many eigenfunctions as data points N, i.e.…”
Section: Kernel Eigenfunction Approximation
confidence: 99%
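The eigenfunction construction in the excerpt above can be illustrated with a small Nyström-style sketch. The RBF kernel and the √N/λ_j extension scaling are assumptions for illustration; the cited paper's exact kernel and scaling convention may differ.

```python
import numpy as np

def rbf(a, b, ell=1.0):
    # Squared-exponential kernel (assumed choice for this sketch)
    return np.exp(-0.5 * (a - b) ** 2 / ell ** 2)

def kernel_eig(y, ell=1.0):
    # Eigendecomposition of the N x N kernel covariance matrix
    # evaluated at the N data points, sorted by decreasing eigenvalue.
    K = rbf(y[:, None], y[None, :], ell)
    lam, psi = np.linalg.eigh(K)
    order = np.argsort(lam)[::-1]
    return lam[order], psi[:, order]

def eigenfunction(z, j, y, lam, psi, ell=1.0):
    # j-th eigenfunction at a new point z: a linear combination of the
    # kernel values k(y_n, z), weighted by the entries of the j-th
    # eigenvector (Nystrom extension with sqrt(N)/lam_j scaling).
    k_z = rbf(y, z, ell)
    return np.sqrt(len(y)) / lam[j] * (k_z @ psi[:, j])
```

Evaluated at one of the original data points y_n, this extension collapses to √N ψ_j(y_n), consistent with the weighting described in the excerpt.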
“…Once we have generated these points, we form an N × N kernel covariance matrix for each state, input, and output variable. Performing eigenvalue decomposition of these n_x + n_u + n_y kernel matrices produces N(n_x + n_u + n_y) eigenfunctions according to (20). We label the N eigenfunctions corresponding to each state variable x_i as ψ…”
Section: Finding Kernel Eigenfunctions for GP-SSM
confidence: 99%
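The per-variable construction described in that excerpt can be sketched as follows, assuming an RBF kernel and illustrative array names (X, U, Y hold the N sampled values of each state, input, and output variable, one column per variable):

```python
import numpy as np

def per_variable_eigendecompositions(X, U, Y, ell=1.0):
    # Form an N x N kernel covariance matrix for each of the
    # n_x + n_u + n_y scalar variables and eigendecompose it,
    # yielding N eigenfunction values per variable.
    def eig_of(v):
        K = np.exp(-0.5 * (v[:, None] - v[None, :]) ** 2 / ell ** 2)
        lam, psi = np.linalg.eigh(K)
        order = np.argsort(lam)[::-1]
        return lam[order], psi[:, order]
    columns = np.hstack([X, U, Y]).T   # one row per scalar variable
    return [eig_of(v) for v in columns]
```

Each entry of the returned list holds the N eigenvalues and the N eigenfunction values (at the data points) for one variable, matching the N(n_x + n_u + n_y) count in the excerpt.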