2022
DOI: 10.48550/arxiv.2208.03979
Preprint

A correlative sparse Lagrange multiplier expression relaxation for polynomial optimization

Abstract: In this paper, we consider polynomial optimization with correlative sparsity. For such problems, one may apply correlative sparse sum-of-squares (CS-SOS) relaxations to exploit the correlative sparsity, which produces semidefinite programs of smaller size than the classical moment-SOS relaxations. The Lagrange multiplier expression (LME) is another useful tool for constructing tight SOS relaxations of polynomial optimization problems, leading to convergence at a lower relaxation order. However…
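To make the abstract's terms concrete, the following is a minimal LaTeX sketch of a correlative sparse SOS certificate; the cliques I_k, index sets J_k, and constraints g_j are assumed notation for illustration, not taken from the paper.

% Sketch (assumed notation): dense vs. CS-SOS certificates for
%   min_x f(x)  subject to  g_j(x) >= 0,  j = 1, ..., m.
% Dense SOS relaxation, one large PSD block:
%   f - \gamma = \sigma_0 + \sum_{j=1}^{m} \sigma_j g_j,  \sigma_j \in \Sigma[x].
% CS-SOS relaxation: with variable cliques I_1, ..., I_p covering the
% correlative sparsity pattern, each summand sees only its clique's variables:
\[
  f(x) - \gamma \,=\, \sum_{k=1}^{p} \Big( \sigma_{k,0}(x_{I_k})
      + \sum_{j \in J_k} \sigma_{k,j}(x_{I_k})\, g_j(x_{I_k}) \Big),
  \qquad \sigma_{k,j} \in \Sigma[x_{I_k}],
\]
% yielding p smaller semidefinite blocks in place of one large block.

The Lagrange multiplier expression mentioned in the abstract refers to writing the multipliers at critical points as polynomial functions of x, which can be substituted into the relaxation to tighten it; per the title, the paper combines this device with correlative sparsity.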

Cited by 1 publication (1 citation statement) | References 22 publications (67 reference statements)

“…One direction is to try to reduce the complexity of analyzing discrete-time dynamics by finding sparse, exploitable structures other than switching. Correlative sparsity is generally incompatible with robust counterparts, but further investigation should identify cases in which imposing that the multipliers ζ have a CSP is valid [98]. Another avenue is to incorporate warm starts into SDP solvers, so that system estimates can be updated as more data gets added to D. Chapter 7…”
Section: This Work Formulated Infinite-dimensional Robust Counterpart… | Classification: mentioning
Confidence: 99%