Proceedings of the 42nd ACM SIGPLAN International Conference on Programming Language Design and Implementation 2021
DOI: 10.1145/3453483.3454056
Fast and precise certification of transformers

Cited by 11 publications (8 citation statements) · References 20 publications
“…CROWN-BaF (Shi et al. 2020) addresses the above issue by using the forward mode to handle non-linear functions in attention layers. DeepT (Bonaert et al. 2021), in contrast, is based on abstract interpretation with Zonotopes and improves efficiency via noise-symbol reduction. Prior work shows that DeepT is more precise than CROWN-BaF but consumes more time.…”
Section: Contributions
confidence: 99%
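The noise-symbol reduction attributed to DeepT in the quote above can be illustrated with a minimal sketch. This is a hypothetical 1-D zonotope helper for illustration only, not the paper's implementation: a zonotope variable is `center + Σ generators[i]·ε_i` with each `ε_i ∈ [-1, 1]`, and reduction merges the smallest noise terms into one interval term.

```python
import numpy as np

def zonotope_bounds(center, generators):
    """Concrete interval [l, u] of a 1-D zonotope: center ± sum of |coefficients|."""
    radius = np.abs(generators).sum()
    return center - radius, center + radius

def reduce_noise_symbols(center, generators, k):
    """Keep the k largest-magnitude noise symbols and soundly merge the rest
    into a single fresh interval term. In 1-D the bounds are preserved exactly;
    in higher dimensions the merge loses cross-dimension correlations but
    remains a sound over-approximation."""
    order = np.argsort(-np.abs(generators))
    keep = generators[order[:k]]
    merged = np.abs(generators[order[k:]]).sum()  # over-approximates the dropped terms
    return center, np.append(keep, merged)
```

The payoff is that downstream abstract-transformer steps scale with the number of noise symbols, so trading a few symbols for one interval term buys speed at a bounded precision cost.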
“…Limitations of prior work. Prior works (Shi et al. 2020; Bonaert et al. 2021) have indeed made valuable contributions to the robustness certification of Transformers. However, we contend that their precision is compromised by the loose relaxations applied to softmax functions.…”
Section: Introduction
confidence: 99%
“…The softmax function [10][11][12] is widely used in machine learning and deep learning, mainly for multi-class classification problems, where we typically want the output to be the probability that an input belongs to each class.…”
Section: Anti-domain Adaptation of Knowledge Mapping
confidence: 99%
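The standard softmax mentioned in the quote maps a vector of logits to a probability distribution. A minimal numerically stable version (the max-subtraction is the usual stability trick; softmax is shift-invariant, so the result is unchanged):

```python
import numpy as np

def softmax(z):
    # Subtract the max before exponentiating to avoid overflow;
    # this does not change the output because softmax is shift-invariant.
    e = np.exp(z - np.max(z))
    return e / e.sum()

probs = softmax(np.array([2.0, 1.0, 0.1]))  # largest logit gets the largest probability
```

Its non-linearity is exactly why certifiers must relax it, and why loose softmax relaxations (as the previous quote argues) hurt certification precision.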
“…DeepZ introduces novel generic, point-wise zonotope abstract transformers for the ReLU, Sigmoid, and Tanh activations that are both scalable and fast. Another work in this direction is [9], where the authors propose DeepT, an abstract-interpretation-based certification method for Transformer networks. They attempt to certify larger Transformers against synonym-replacement attacks.…”
Section: Robustness by Certification
confidence: 99%
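The point-wise zonotope ReLU transformer mentioned in the quote can be sketched as follows. This is a simplified 1-D illustration of the DeepZ-style relaxation, not the library's code: when the input interval straddles zero, the output is over-approximated by a sound linear relaxation with slope `λ = u/(u−l)` plus one fresh noise symbol of radius `μ = −λ·l/2`.

```python
import numpy as np

def relu_zonotope(center, generators):
    """Abstract ReLU on a 1-D zonotope x = center + Σ generators[i]·ε_i, ε_i ∈ [-1, 1].
    Returns (new_center, new_generators) soundly enclosing ReLU(x)."""
    radius = np.abs(generators).sum()
    l, u = center - radius, center + radius
    if l >= 0:                       # always active: ReLU is the identity
        return center, generators
    if u <= 0:                       # always inactive: output is exactly 0
        return 0.0, np.zeros(0)
    lam = u / (u - l)                # slope of the sound linear relaxation
    mu = -lam * l / 2                # midpoint offset; also the fresh-noise radius
    new_center = lam * center + mu
    new_gens = np.append(lam * generators, mu)  # one fresh noise symbol
    return new_center, new_gens
```

For example, an input zonotope with bounds [-1, 1] yields an output enclosing the true ReLU range [0, 1], with only one noise symbol added per unstable neuron, which is what keeps the analysis scalable.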