Proceedings of the 2000 American Control Conference (ACC), IEEE Cat. No. 00CH36334, 2000
DOI: 10.1109/acc.2000.877020

Lagrangian solution methods for nonlinear model predictive control

Cited by 4 publications (4 citation statements)
References 9 publications
“…(4) to obtain x(k), and then use the Newton-Raphson recursive method to solve for the control input u(k), whereby u_{l+1}(k) = u_l(k) − [f_u(u_l(k))]^{-1} f(u_l(k)) (6) …”
Section: Nonlinear Generalized Predictive (mentioning)
confidence: 99%
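
The citing work's equation (6) is a Newton-Raphson recursion for the control input. A minimal sketch of that kind of root-finding iteration, assuming a generic scalar residual f(u) with derivative df(u); the function names, tolerance, and example residual below are illustrative, not taken from the cited paper:

    # Minimal Newton-Raphson sketch for solving f(u) = 0 for a scalar control input.
    # f, df, u0, tol, and max_iter are illustrative assumptions, not from the paper.
    def newton_raphson(f, df, u0, tol=1e-8, max_iter=50):
        u = u0
        for _ in range(max_iter):
            step = f(u) / df(u)      # Newton update: u_{l+1} = u_l - f(u_l) / f'(u_l)
            u = u - step
            if abs(step) < tol:      # stop once the update is negligibly small
                break
        return u

    # Example: drive a simple quadratic residual to zero (converges to sqrt(2)).
    u_star = newton_raphson(lambda u: u**2 - 2.0, lambda u: 2.0 * u, u0=1.0)

In a predictive-control setting, f(u) would stand for the prediction-error expression being driven to zero at each sampling instant k.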
“…It is critical to observe that this hierarchical low-rank structure is not invariant to row and column permutations. GOFMM appropriately permutes K, using only entries from K, before constructing the matrices U, V, D, and S. Background and significance: dense SPD matrices appear in Cholesky and LU factorization [4], in Schur complement matrices [5], in Hessian operators in optimization in simulation [6] and machine learning [7], in kernel methods for statistical learning [8], [9], and in N-body methods and integral equations [10], [2]. In many applications, the entries of the input matrix K are given by K_ij = K(x_i, x_j), where x_i and x_j are points in D dimensions and K is a kernel function, for example a Gaussian with bandwidth h: exp(−(1/2) h^{-2} ‖x_i − x_j‖^2).…”
Section: Introduction (mentioning)
confidence: 99%
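
For context on the kernel matrices referenced above, a minimal sketch of assembling a dense Gaussian kernel matrix K_ij = exp(−‖x_i − x_j‖^2 / (2h^2)) from points in d dimensions; the NumPy routine and parameter names are illustrative assumptions, not part of GOFMM's interface:

    import numpy as np

    def gaussian_kernel_matrix(X, h):
        # X: (n, d) array of points; h: Gaussian bandwidth.
        # Pairwise squared distances via ||x_i - x_j||^2 = ||x_i||^2 + ||x_j||^2 - 2 x_i.x_j
        sq = np.sum(X**2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
        np.maximum(d2, 0.0, out=d2)            # clip tiny negatives from round-off
        return np.exp(-0.5 * d2 / h**2)        # K_ij = exp(-||x_i - x_j||^2 / (2 h^2))

    # Example: a small kernel matrix on random 3-D points.
    X = np.random.rand(100, 3)
    K = gaussian_kernel_matrix(X, h=0.5)

Such a matrix is symmetric and positive semidefinite (positive definite for distinct points), which is the SPD structure the hierarchical low-rank approximation exploits.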
“…Left: Compression time and its breakdown into three phases: neighbor search, tree creation, and skeletonization. Right: Matrix-matrix multiplication of the kernel matrix with a 5M-by-512 matrix, for both the synchronous kernel (light brown) and the asynchronous version (dark brown) (6,…”
mentioning
confidence: 99%
“…Dense SPD matrices appear in scientific computing, statistical inference, and data analytics. They appear in Cholesky and LU factorization [16], in Schur complement matrices for saddle point problems [6], in Hessian operators in optimization [36], in kernel methods for statistical learning [17, 23], and in N-body methods and integral equations [19, 20]. In many applications, the entries of the input matrix K are given by K_ij = K(x_i, x_j), where K : R^d × R^d → R is a kernel function.…”
Section: Introduction (mentioning)
confidence: 99%