2012
DOI: 10.1016/j.jprocont.2011.09.002
Multi-loop nonlinear internal model controller design under nonlinear dynamic PLS framework using ARX-neural network model

Cited by 43 publications (26 citation statements)
References 28 publications
“…Kaspar and Ray proposed a proportional-integral-derivative (PID) control scheme under this framework [12], and Chen and Cheng [14] designed multi-loop adaptive PID controllers based on a modified decoupling PLS framework. Hu et al. [16,17] proposed a multi-loop internal model controller in the dynamic PLS framework and achieved better disturbance-rejection performance. Lü and Liang [18] proposed a multi-loop constrained MPC scheme.…”
Section: Dynamic PLS Model Description
confidence: 99%
“…Suppose {Δt_r(k|k), · · · , Δt_r(k + N_p − 1|k)} is the optimal solution sequence of Equation (17). At each sampling instant, Δt_r(k|k) is added to the control input from the previous instant and applied to the system.…”
Section: Stability Analysis
confidence: 99%
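The excerpt above describes the standard receding-horizon step: an optimal sequence of input increments is computed at every instant, but only the first increment is applied before re-optimizing. A minimal sketch of that loop, using an assumed toy first-order plant and a one-step closed-form "optimization" in place of Equation (17) from the cited paper:

```python
# Receding-horizon sketch: solve for an increment sequence, apply only the
# first element, then repeat. Plant and cost below are illustrative
# assumptions, not the controller from the cited paper.

def plant_step(y, u):
    """Toy first-order plant y(k+1) = 0.8*y(k) + 0.2*u(k) (assumed)."""
    return 0.8 * y + 0.2 * u

def optimal_increment(y, u, setpoint):
    """First element of the optimal sequence for a one-step horizon:
    choose du so that 0.8*y + 0.2*(u + du) = setpoint."""
    return (setpoint - 0.8 * y) / 0.2 - u

def receding_horizon(y0, setpoint, steps):
    y, u = y0, 0.0
    for _ in range(steps):
        du = optimal_increment(y, u, setpoint)  # keep only the first increment
        u = u + du                              # add it to the previous control
        y = plant_step(y, u)                    # apply to the system
    return y

print(receding_horizon(0.0, 1.0, 5))
```

With this exact one-step horizon the output reaches the setpoint immediately; real MPC uses a longer horizon and constrained optimization, so only the feedback structure carries over.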
“…It is used to predict the behavior of structures equipped with passive controller systems. It is worth mentioning that, as the system order increases, the number of candidate terms becomes very large [18][19][20], which leads to increased complexity in the analysis [21]. The ARX model is described in detail in [21,22].…”
Section: Nonlinear ARX
confidence: 99%
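The growth in candidate terms noted above is easy to quantify: a linear ARX predictor is a sum over lagged outputs and inputs, while a polynomial NARX model takes all monomials of those lags up to some degree, so the regressor count grows combinatorially with order. A small sketch under those assumptions:

```python
# Illustrative ARX(na, nb) one-step predictor:
#   y(k) = sum_i a_i*y(k-i) + sum_j b_j*u(k-j)
# and the count of polynomial NARX candidate regressors, which grows
# combinatorially with the number of lagged variables.
from math import comb

def arx_predict(a, b, y_past, u_past):
    """One-step ARX prediction from lagged outputs/inputs (newest first)."""
    return sum(ai * yi for ai, yi in zip(a, y_past)) + \
           sum(bj * uj for bj, uj in zip(b, u_past))

def narx_candidate_terms(n_lags, degree):
    """Number of monomials of total degree <= `degree` in n_lags lagged
    variables (a multiset-coefficient count)."""
    return comb(n_lags + degree, degree)

print(arx_predict([0.5], [1.0], [2.0], [1.0]))  # 0.5*2 + 1*1 = 2.0
print(narx_candidate_terms(4, 3))   # 35 candidate terms
print(narx_candidate_terms(10, 3))  # 286 candidate terms
```

Going from 4 to 10 lagged variables at degree 3 raises the candidate count from 35 to 286, which is the complexity blow-up the passage refers to.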
“…(19) was used, and the resulting modeling performance is depicted in Fig. 6b, which shows a good fit to the testing signal with an MSE of 2.759 × 10⁻⁴.…”
Section: Plant
confidence: 99%
“…In this context, to train the NN in the IMC, the most commonly applied learning methods in the literature are gradient-based methods such as the back-propagation algorithm (BPA) [9,19]. However, these methods suffer from several drawbacks, such as slow convergence and a tendency to become trapped in local minima of the search space.…”
Section: Introduction
confidence: 99%
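The back-propagation training the excerpt refers to can be sketched with a single tanh neuron: forward pass, gradient of a squared-error loss through the activation, and a gradient-descent weight update. The model, data, and learning rate below are assumptions for demonstration only, not the network from the cited work:

```python
# Minimal gradient-based (back-propagation style) training of a one-neuron
# network y = tanh(w*x + b), illustrating the BPA mentioned above.
import math

def train(xs, ts, lr=0.5, epochs=200):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, t in zip(xs, ts):
            y = math.tanh(w * x + b)         # forward pass
            grad_z = (y - t) * (1 - y * y)   # backprop through MSE and tanh
            w -= lr * grad_z * x             # gradient-descent updates
            b -= lr * grad_z
    return w, b

# Fit the neuron to a simple monotone mapping.
w, b = train([-1.0, 0.0, 1.0], [-0.7, 0.0, 0.7])
print(math.tanh(w * 1.0 + b))  # close to the target 0.7
```

Even this toy loop shows the drawbacks the passage lists: convergence speed depends on the learning rate, and on multimodal loss surfaces the same update rule can settle in a local minimum, which is what motivates the non-gradient alternatives discussed by the citing paper.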