2016
DOI: 10.1007/978-3-319-18842-3_6
Conic Linear Programming

Cited by 320 publications (483 citation statements) | References 32 publications
“…The minimization problem of (10) was solved using the quasi-Newton optimization technique [22] because it does not require the calculation of the Hessian and is hence fast. The initial vectors taken for the optimization were the columns of the identity matrix, that is Φ = I.…”
Section: Results (mentioning)
confidence: 99%
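The approach quoted above can be sketched with SciPy's BFGS routine, a standard quasi-Newton method that approximates the Hessian from gradient differences. The objective below is a hypothetical stand-in for the paper's problem (10), which is not reproduced here; only the initialization Φ = I follows the excerpt.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical objective standing in for the paper's problem (10):
# find Phi minimizing ||A @ Phi - I||_F^2 for an illustrative matrix A.
def objective(phi_flat, A):
    phi = phi_flat.reshape(A.shape)
    return np.linalg.norm(A @ phi - np.eye(A.shape[0])) ** 2

A = np.array([[2.0, 0.5],
              [0.5, 1.0]])

# Initial vectors: the columns of the identity matrix, i.e. Phi = I.
phi0 = np.eye(2).ravel()

# BFGS is quasi-Newton: it never forms the Hessian explicitly,
# which is the speed advantage the excerpt refers to.
result = minimize(objective, phi0, args=(A,), method="BFGS")
```

Because no analytic gradient is supplied, SciPy falls back to finite differences; for larger problems a hand-coded gradient (`jac=`) is usually worthwhile.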
“…We have chosen the quasi-Newton method [22] to carry out the minimization in (10), the convergence of which is well known. To establish the convergence of the proposed algorithm as a whole, we define the error terms of (10) at the end of the i-th iteration as…”
Section: B. Convergence (mentioning)
confidence: 99%
“…To ensure that the gradient step respects the constraints on α (α_d ≥ 0 and Σ_d α_d = 1), the following strategy is used: similarly to [47,48,45], the…”
Section: End For (mentioning)
confidence: 99%
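The constraint set in this excerpt (nonnegative weights summing to one) is the probability simplex, and a common way to keep a gradient step feasible is to follow it with a Euclidean projection onto that simplex. The excerpt's exact strategy is elided, so the sketch below shows one standard projection algorithm; the weights `alpha` and the gradient are illustrative.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto {a : a_d >= 0, sum_d a_d = 1}
    via the sort-and-threshold algorithm."""
    n = v.size
    u = np.sort(v)[::-1]                 # sort descending
    css = np.cumsum(u)
    # largest index rho with u_rho * rho > cumsum_rho - 1 (1-based)
    rho = np.nonzero(u * np.arange(1, n + 1) > (css - 1.0))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

# One projected-gradient step on hypothetical weights alpha.
alpha = np.array([0.5, 0.3, 0.2])
grad = np.array([0.1, -0.2, 0.4])        # illustrative gradient
alpha = project_simplex(alpha - 0.5 * grad)
```

After the projection, `alpha` again satisfies both constraints, so the iteration never leaves the feasible set.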
“…In this case, the integrand is known explicitly throughout the integral since all the terms in (6) are known explicitly in terms of P from the joint angles q = q(t, P ). Given J(P ) and its gradient, we could easily minimize it over P using Matlab's BFGS [9] algorithm in the function "fminunc." Figure 3 shows the locally optimal solution found to this problem using the parameter optimization approach mentioned above.…”
Section: Case 1: Fully Actuated Robot (mentioning)
confidence: 99%
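The workflow this excerpt describes, minimizing a cost J(P) with a known analytic gradient via Matlab's `fminunc` (BFGS), has a direct SciPy analogue. The quadratic cost below is a stand-in for the paper's J(P), which integrates dynamics terms along the joint trajectory q(t, P).

```python
import numpy as np
from scipy.optimize import minimize

# Stand-in smooth cost J(P) with an analytic gradient; the actual
# J(P) in the cited work is an integral over the robot's motion.
def J(P):
    return (P[0] - 1.0) ** 2 + 10.0 * (P[1] + 2.0) ** 2

def grad_J(P):
    return np.array([2.0 * (P[0] - 1.0), 20.0 * (P[1] + 2.0)])

# Equivalent of calling fminunc with a user-supplied gradient:
res = minimize(J, x0=np.zeros(2), jac=grad_J, method="BFGS")
```

Supplying `jac=` matters here for the same reason it does with `fminunc`: an exact gradient avoids finite-difference noise and gives BFGS its usual fast local convergence.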
“…In addition to this example, we have used this basic approach to solve for much more complex optimal high-dive motions for a human-like diver in [16]. When we computed the above solution to the underactuated Acrobot, we did not expect numerical difficulties, since we had the exact gradient of the objective function and the optimization algorithm has well-established convergence properties for this case [9]. However, we did encounter some numerical problems and had to adjust some of the tolerances in the optimizer in order to achieve convergence, and the computation time, even in the best of cases (about 5 minutes on a PIII-800 PC), was much longer than in the previous example.…”
Section: Case 2: Underactuated Robot (mentioning)
confidence: 99%
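The tolerance adjustments mentioned in this excerpt correspond, in SciPy's BFGS, to options such as the gradient tolerance `gtol` and the iteration cap `maxiter`. The badly scaled quadratic below is a hypothetical example of a problem where the defaults may stop short of the minimum.

```python
import numpy as np
from scipy.optimize import minimize

# Ill-conditioned stand-in objective (condition number ~1e6), the kind
# of problem where optimizer tolerances may need hand-tuning.
def J(P):
    return P[0] ** 2 + 1.0e6 * P[1] ** 2

# Tighten the gradient tolerance and raise the iteration limit,
# analogous to adjusting fminunc's options to achieve convergence.
res = minimize(J, x0=np.array([1.0, 1.0]), method="BFGS",
               options={"gtol": 1e-10, "maxiter": 5000})
```

Which tolerances need adjusting, and by how much, is problem-dependent, as the excerpt's numerical difficulties illustrate.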