2020
DOI: 10.1007/s11228-019-00526-z
Deep Neural Network Structures Solving Variational Inequalities

Abstract: Motivated by structures that appear in deep neural networks, we investigate nonlinear composite models alternating proximity and affine operators defined on different spaces. We first show that a wide range of activation operators used in neural networks are actually proximity operators. We then establish conditions for the averagedness of the proposed composite constructs and investigate their asymptotic properties. It is shown that the limit of the resulting process solves a variational inequality which, in …
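The abstract's claim that common activation operators are proximity operators can be illustrated with ReLU, which coincides with the proximity operator of the indicator function of the nonnegative orthant (a minimal numerical sketch of this one special case, not the paper's general result; the function names are illustrative):

```python
import numpy as np

def relu(x):
    """Standard ReLU activation."""
    return np.maximum(x, 0.0)

def prox_nonneg_indicator(x):
    # Proximity operator of iota_{R_+^n}:
    #   argmin_u { iota_{R_+^n}(u) + 0.5 * ||u - x||^2 }
    # which is the Euclidean projection onto the nonnegative orthant.
    return np.clip(x, 0.0, None)

rng = np.random.default_rng(0)
x = rng.standard_normal(5)
# The two operators agree pointwise, so ReLU is a proximity operator.
assert np.allclose(relu(x), prox_nonneg_indicator(x))
```

Because proximity operators are firmly nonexpansive, this viewpoint is what lets the paper treat layered compositions of activations and affine maps with fixed-point machinery.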

Cited by 94 publications (106 citation statements).
References 39 publications.
“…, K − 1}, R_k(· + b_k) is firmly nonexpansive [38, Proposition 12.28]. Finally, [59, Theorem 3.8] completes the proof.…”
Section: Robustness of iRestNet to an Input Perturbation
Confidence: 81%
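The property invoked in this citation statement, that a firmly nonexpansive operator such as ReLU composed with a bias shift remains firmly nonexpansive, can be checked numerically (an illustrative sketch under the assumption T(x) = ReLU(x + b); this is not the cited proof):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

# Firm nonexpansiveness of T = relu(. + b) means, for all x, y:
#   ||T(x) - T(y)||^2 <= <x - y, T(x) - T(y)>
rng = np.random.default_rng(1)
b = rng.standard_normal(10)  # illustrative bias vector

for _ in range(1000):
    x, y = rng.standard_normal(10), rng.standard_normal(10)
    tx, ty = relu(x + b), relu(y + b)
    lhs = np.dot(tx - ty, tx - ty)
    rhs = np.dot(x - y, tx - ty)
    assert lhs <= rhs + 1e-12  # inequality holds up to rounding
```

The shift does not affect the inequality because the differences x − y and T(x) − T(y) are translation-invariant, which is why biases are harmless in these nonexpansiveness arguments.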
“…Before stating our main stability theorem, we recall the result from [59, Lemma 3.3] in Proposition 4 below. We then derive Proposition 5, which will appear useful when addressing the robustness of the global network.…”
Section: Preliminary Results
Confidence: 99%