2021 60th IEEE Conference on Decision and Control (CDC)
DOI: 10.1109/cdc45484.2021.9683286

Neural Network Verification using Polynomial Optimisation

Abstract: The desire to provide robust guarantees on neural networks has never been more important, as their prevalence in society is increasing. One popular method that has seen a large amount of success is to use bounds on the activation functions within these networks to provide such guarantees. However, due to the large number of possible ways to bound the activation functions, there is a trade-off between conservativeness and complexity. We approach the problem from a different perspective, using polynomial optimis…
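
The "bounds on the activation functions" mentioned here are typically pre-activation intervals obtained by interval bound propagation (IBP), which the citation statements below also rely on. A minimal sketch of IBP through one fully connected layer, assuming a monotone activation; the function names are illustrative, not from the paper:

```python
import numpy as np

def ibp_affine(l, u, W, b):
    """Propagate elementwise bounds l <= x <= u through x -> W @ x + b.

    Splitting W into its positive and negative parts gives bounds that
    are sound for every x in the input box.
    """
    W_pos, W_neg = np.maximum(W, 0.0), np.minimum(W, 0.0)
    return W_pos @ l + W_neg @ u + b, W_pos @ u + W_neg @ l + b

def ibp_activation(l, u, phi=np.tanh):
    """A monotone activation maps interval endpoints to interval endpoints."""
    return phi(l), phi(u)

# Illustrative usage on one random layer.
rng = np.random.default_rng(0)
W, b = rng.normal(size=(3, 2)), rng.normal(size=3)
lo, hi = ibp_affine(np.array([-1.0, -1.0]), np.array([1.0, 1.0]), W, b)
lo, hi = ibp_activation(lo, hi)
```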

Cited by 6 publications (5 citation statements)
References 18 publications
“…These constraints can be of any form; however, common choices are box, slope, and sector constraints, due to their ability to be expressed in a semi-definite programming framework. In our framework, we are able to choose any set of polynomial constraints to represent the NN, as shown in previous work [19], [20]. These polynomial constraints can be represented as a semi-algebraic set, which we express with notation…”
Section: A. Abstracting the Neural Network as Constraints (mentioning, confidence: 99%)
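
For concreteness, a minimal sketch of how such constraints form a semi-algebraic set, for one neuron with pre-activation $x \in [l, u]$ and output $y = \phi(x)$; the particular constraint choices in [19], [20] may differ, and the sector form below assumes $\phi$ is monotone and passes through the origin (e.g. tanh or ReLU):

```latex
\begin{align*}
  &\text{box:}    && (x - l)(u - x) \ge 0, \qquad (y - \phi(l))(\phi(u) - y) \ge 0,\\
  &\text{sector:} && (y - \alpha x)(y - \beta x) \le 0, \qquad 0 \le \alpha \le \beta,\\
  &\text{slope:}  && \big(y_2 - y_1 - \alpha(x_2 - x_1)\big)\big(y_2 - y_1 - \beta(x_2 - x_1)\big) \le 0
                     \quad \text{across neuron pairs.}
\end{align*}
```

Collecting every such polynomial $g_i$ gives the semi-algebraic set $S = \{(x, y) : g_i(x, y) \ge 0,\ i = 1, \dots, m\}$, over which the polynomial optimisation is posed.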
“…where α is determined by the IBP bounds. Alternatively, we can use two overlapping sectors, which can provide better constraints on the activation function when the IBP bounds are large [19].…”
Section: A. Abstracting the Neural Network as Constraints (mentioning, confidence: 99%)
“…For the sigmoid and tanh activation functions, it is possible to create two overlapping sector constraints using the IBP values. [15] provides more details about these constraints.…”
Section: Neural Network Constraints (mentioning, confidence: 99%)
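
A plausible shape for those two overlapping sectors, anchored at the IBP endpoints $(l, \phi(l))$ and $(u, \phi(u))$; the exact slope choices are given in [19] and [15], so the $\alpha_i, \beta_i$ below are placeholders, chosen so that each sector contains the graph of $\phi$ on $[l, u]$:

```latex
\begin{align*}
  \big(y - \phi(l) - \alpha_1 (x - l)\big)\big(y - \phi(l) - \beta_1 (x - l)\big) &\le 0,\\
  \big(y - \phi(u) - \alpha_2 (x - u)\big)\big(y - \phi(u) - \beta_2 (x - u)\big) &\le 0.
\end{align*}
```

The intersection of the two sectors is tighter than either sector alone, which is precisely what helps when $u - l$ is large.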
“…To alleviate the computational cost, a memory-efficient first-order algorithm was introduced in (Dathathri et al. 2020). With the same aim, layerwise SDP relaxations were used in (Batten et al. 2021; Newton and Papachristodoulou 2021) by exploiting the cascaded NN structure based on chordal graph decomposition (Zheng, Fantuzzi, and Papachristodoulou 2021). To the best of our knowledge, the LayerSDP method (Batten et al. 2021) achieves the tightest relaxations by combining SDP relaxation with the triangle relaxation (Ehlers 2017).…”
Section: Introduction (mentioning, confidence: 99%)
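
The triangle relaxation referenced here is the standard convex hull of the ReLU graph over an interval straddling zero, from (Ehlers 2017) rather than from this paper. For $y = \max(x, 0)$ with IBP bounds $l < 0 < u$:

```latex
\begin{align*}
  y \ge 0, \qquad y \ge x, \qquad y \le \frac{u\,(x - l)}{u - l}.
\end{align*}
```

These three linear inequalities describe exactly the convex hull of $\{(x, \max(x, 0)) : l \le x \le u\}$; LayerSDP layers them on top of the layerwise SDP relaxation to tighten it.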