ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp40776.2020.9054731

Building Firmly Nonexpansive Convolutional Neural Networks

Abstract: Building nonexpansive Convolutional Neural Networks (CNNs) is a challenging problem that has recently gained a lot of attention from the image processing community. In particular, it appears to be the key to obtaining convergent Plug-and-Play algorithms. This problem, which relies on an accurate control of the Lipschitz constant of the convolutional layers, has also been investigated for Generative Adversarial Networks to improve robustness to adversarial perturbations. However, to the best of our knowledge, n…
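
As a concrete illustration of controlling the Lipschitz constant of a convolutional layer, the sketch below estimates the spectral norm of a stride-1 convolution by power iteration on the operator and its adjoint, then rescales the kernel so the layer is at most 1-Lipschitz. This is a generic spectral-normalization recipe written in PyTorch, not the construction proposed in the paper; the function names and the fixed padding are illustrative assumptions.

# Minimal sketch (assumed setup): power iteration estimates the largest
# singular value of the linear map x -> conv2d(x, weight).
import torch
import torch.nn.functional as F

def conv_spectral_norm(weight, input_shape, n_iters=50):
    # Power iteration on A^T A, where A is the zero-padded, stride-1
    # convolution and A^T its adjoint (the transposed convolution).
    x = torch.randn(1, *input_shape)
    x = x / x.norm()
    for _ in range(n_iters):
        y = F.conv2d(x, weight, padding=1)            # y = A x
        x = F.conv_transpose2d(y, weight, padding=1)  # x = A^T y
        x = x / x.norm()
    # With ||x|| = 1, the Rayleigh quotient gives sigma ~ ||A x||.
    return F.conv2d(x, weight, padding=1).norm()

def rescale_to_nonexpansive(weight, input_shape, target=1.0):
    # Divide by the estimated norm only when it exceeds the target,
    # so the layer becomes (approximately) target-Lipschitz.
    sigma = conv_spectral_norm(weight, input_shape)
    return weight * (target / sigma.clamp(min=target))

For example, for a kernel w of shape (8, 3, 3, 3), rescale_to_nonexpansive(w, (3, 32, 32)) returns a kernel whose convolution has operator norm at most roughly 1 on 32x32 RGB inputs.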

Cited by 29 publications (22 citation statements). References 16 publications.
“…Prior work has also shown that Lipschitz-constrained residual networks yield excellent performance without sacrificing stable convergence [23], [46]. Additionally, there has recently been an explosion of techniques for training Lipschitz-constrained and firmly nonexpansive deep neural nets [23], [64]–[66].…”
Section: B. Convergence Analysis (mentioning)
confidence: 99%
“…PnP methods have been successfully applied in the literature with various splitting schemes: HQS (Zhang et al., 2017b), ADMM (Romano et al., 2017; Ryu et al., 2019), and Proximal Gradient Descent (PGD) (Terris et al., 2020). First used with classical non-deep denoisers such as BM3D (Chan et al., 2016) and pseudo-linear denoisers (Nair et al., 2021), more recent PnP approaches (Meinhardt et al., 2017; Ryu et al., 2019) rely on efficient off-the-shelf deep denoisers such as DnCNN (Zhang et al., 2017a).…”
Section: Related Work (mentioning)
confidence: 99%
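
To make the splitting concrete, here is a minimal sketch of Plug-and-Play proximal gradient descent (PnP-PGD), in which the proximal step of a prior is replaced by a denoiser; the names pnp_pgd, grad_f, and denoise are hypothetical placeholders, not an API from any of the cited works.

import numpy as np

def pnp_pgd(x0, grad_f, denoise, gamma, n_iters=100):
    # Alternate a gradient step on the data-fidelity term f with a
    # denoising step standing in for the proximal operator of the prior.
    x = x0
    for _ in range(n_iters):
        x = denoise(x - gamma * grad_f(x))
    return x

# Toy usage: least squares f(x) = 0.5 * ||A x - b||^2 with soft-thresholding
# as the "denoiser" (the prox of the l1 norm, hence firmly nonexpansive);
# with this choice the scheme reduces to the classical ISTA iteration.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 10)), rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - b)
denoise = lambda x: np.sign(x) * np.maximum(np.abs(x) - 0.1, 0.0)
x_hat = pnp_pgd(np.zeros(10), grad_f, denoise, gamma=1.0 / np.linalg.norm(A, 2) ** 2)
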
“…Sreehari et al (2016) first used the proximal theorem of Moreau (Moreau, 1965) to give sufficient conditions for the denoiser to be an exact proximal map, which are applied to a pseudo-linear denoiser. The convergence with pseudo-linear denoisers have then been extensively studied (Gavaskar & Chaudhury, 2020;Nair et al, 2021;Chan, 2019) (Sun et al, 2019), firmly nonexpansive (Sun et al, 2021;Terris et al, 2020) or simply a nonexpansive (Reehorst & Schniter, 2018;Liu et al, 2021). These settings are unrealistic as deep denoisers do not generally satisfy such properties.…”
Section: Related Workmentioning
confidence: 99%
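
The firm nonexpansiveness assumed in some of these works can be stated and checked numerically: an operator T is firmly nonexpansive when ||T(x) - T(y)||^2 <= <x - y, T(x) - T(y)> for all x, y, or equivalently when T = (I + N)/2 for some nonexpansive N. The randomized check below is a hedged sketch; the helper names are my own.

import numpy as np

def firmly_nonexpansive_check(T, dim=8, trials=1000, tol=1e-9):
    # Randomized test of ||T(x) - T(y)||^2 <= <x - y, T(x) - T(y)>.
    for _ in range(trials):
        x, y = np.random.randn(dim), np.random.randn(dim)
        d = T(x) - T(y)
        if d @ d > (x - y) @ d + tol:
            return False
    return True

# Example: averaging a nonexpansive map N (here, projection onto a box,
# which is nonexpansive) yields a firmly nonexpansive T = (I + N) / 2.
N = lambda v: np.clip(v, -1.0, 1.0)
T = lambda v: 0.5 * (v + N(v))
print(firmly_nonexpansive_check(T))  # True (up to random sampling)
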
“…The theoretical convergence of sequences generated by PnP algorithms has drawn a lot of attention in recent years [12,13,14,15]. However, it often relies on strong structural constraints on the denoiser (e.g.…”
Section: Introduction (mentioning)
confidence: 99%
“…However, it often relies on strong structural constraints on the denoiser (e.g. NNs without residual skip connections) [14,15], and/or the limit point is not clearly characterized [12,13].…”
Section: Introduction (mentioning)
confidence: 99%