2021
Preprint
DOI: 10.48550/arxiv.2106.06946
Boosting Randomized Smoothing with Variance Reduced Classifiers

Abstract: Randomized Smoothing (RS) is a promising method for obtaining robustness certificates by evaluating a base model under noise. In this work we: (i) theoretically motivate why ensembles are a particularly suitable choice as base models for RS, and (ii) empirically confirm this choice, obtaining state-of-the-art results in multiple settings. The key insight of our work is that the reduced variance of ensembles over the perturbations introduced in RS leads to significantly more consistent classifications for a giv…
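The abstract describes the core mechanism: a smoothed classifier predicts the class the base model selects most often under Gaussian input noise, and using an ensemble as the base model reduces the variance of those per-noise-sample predictions. Below is a minimal sketch of smoothed prediction with an ensemble base model, assuming PyTorch; the model sizes, noise level `sigma`, and sample count are illustrative placeholders, not the paper's configuration.

```python
# Minimal sketch: randomized-smoothing prediction with an ensemble base
# classifier (assumes PyTorch; all hyperparameters are illustrative).
import torch

def smoothed_predict(base_models, x, sigma=0.25, n_samples=100):
    """Majority-vote prediction of the smoothed classifier g(x).

    base_models: list of classifiers; their averaged softmax acts as the
    (variance-reduced) base model. x: a single input tensor.
    """
    counts = None
    with torch.no_grad():
        for _ in range(n_samples):
            noisy = x + sigma * torch.randn_like(x)  # Gaussian perturbation
            # Ensemble step: averaging member probabilities reduces the
            # variance of the base classification under noise.
            probs = torch.stack(
                [m(noisy.unsqueeze(0)).softmax(dim=-1) for m in base_models]
            ).mean(dim=0)
            one_hot = torch.nn.functional.one_hot(
                probs.argmax(dim=-1), probs.shape[-1]
            ).squeeze(0)
            counts = one_hot if counts is None else counts + one_hot
    return counts.argmax().item()

# Toy usage: three small MLPs on 10-dimensional inputs, 3 classes.
models = [torch.nn.Sequential(torch.nn.Linear(10, 16), torch.nn.ReLU(),
                              torch.nn.Linear(16, 3)) for _ in range(3)]
x = torch.randn(10)
print(smoothed_predict(models, x))
```

More consistent per-sample votes translate into larger vote margins, which is what drives the higher certified radii the abstract refers to.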

Cited by 1 publication (1 citation statement) · References 21 publications
“…Such a procedure can be implemented through data augmentation: the model is trained on transformed samples, so we expect it to have higher robust accuracy. During the inference stage, the smoothed model has to be evaluated many times (Horváth et al. 2021), thus increasing the complexity. In our approach, we can apply the certification directly to the original model trained in an augmented way.…”
Section: Types of Model Training and Computation Cost of Inference
Confidence: 99%
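The citing statement contrasts two costs: training-time data augmentation versus the many noisy forward passes a smoothed model needs at inference. A minimal sketch of the augmentation side, assuming PyTorch; the architecture, data, and step count are hypothetical placeholders.

```python
# Minimal sketch: Gaussian noise data augmentation during training
# (assumes PyTorch; data and hyperparameters are illustrative).
import torch

model = torch.nn.Sequential(torch.nn.Linear(10, 16), torch.nn.ReLU(),
                            torch.nn.Linear(16, 3))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = torch.nn.CrossEntropyLoss()
sigma = 0.25  # typically matched to the noise level used at certification

x_batch = torch.randn(32, 10)           # placeholder training data
y_batch = torch.randint(0, 3, (32,))

for _ in range(10):                      # a few illustrative steps
    noisy = x_batch + sigma * torch.randn_like(x_batch)  # augmentation
    loss = loss_fn(model(noisy), y_batch)
    opt.zero_grad()
    loss.backward()
    opt.step()
```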