2020
DOI: 10.1007/s00041-020-09761-7

Parseval Proximal Neural Networks

Abstract: The aim of this paper is twofold. First, we show that a certain concatenation of a proximity operator with an affine operator is again a proximity operator on a suitable Hilbert space. Second, we use our findings to establish so-called proximal neural networks (PNNs) and stable tight frame proximal neural networks. Let $\mathcal{H}$ and $\mathcal{K}$ be real Hilbert spaces, $b \in \mathcal{K}$ …
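The abstract's central claim — that composing a proximity operator with a suitable affine map yields nonexpansive network layers — can be illustrated numerically. The sketch below is an illustrative toy, not the authors' implementation: it builds one layer of the form T^T σ(T x + b) with an orthogonal matrix T and ReLU as the proximity-operator activation, then checks nonexpansiveness on a random pair of inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Orthogonal matrix T (so T^T T = I), obtained via QR decomposition; b is a bias.
T, _ = np.linalg.qr(rng.standard_normal((8, 8)))
b = rng.standard_normal(8)

def pnn_layer(x):
    """One toy proximal layer: T^T sigma(T x + b), with sigma = ReLU.

    ReLU is the proximity operator of the indicator of the nonnegative
    orthant, and T is orthogonal, so the layer is nonexpansive.
    """
    return T.T @ np.maximum(T @ x + b, 0.0)

# Nonexpansiveness check: ||layer(x) - layer(y)|| <= ||x - y||.
x, y = rng.standard_normal(8), rng.standard_normal(8)
lhs = np.linalg.norm(pnn_layer(x) - pnn_layer(y))
rhs = np.linalg.norm(x - y)
```

Since ‖T^T u‖ = ‖u‖ for orthogonal T and ReLU is componentwise nonexpansive, `lhs <= rhs` holds for every pair of inputs; stacking such layers keeps the whole network 1-Lipschitz.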



Cited by 44 publications (35 citation statements). References 33 publications (51 reference statements).
“…The unfolded networks we consider here fall into the larger class of proximal neural networks studied in [17][18][19]. Many other related works are in the context of dictionary learning or sparse coding: The central problem of sparse coding is to learn weight matrices for an unfolded version of ISTA.…”
Section: Related Work
confidence: 99%
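The statement above refers to learning weight matrices for an unfolded version of ISTA. For orientation, here is a minimal NumPy sketch of unfolded ISTA for the standard ℓ1-regularized least-squares problem; the function and variable names are ours, and in learned variants (LISTA-style) the matrices W1 and W2 become trainable per-layer weights.

```python
import numpy as np

def soft_threshold(v, tau):
    # Soft-shrinkage: the proximity operator of tau * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista_unfolded(A, y, lam, n_layers=20):
    """Run n_layers iterations of ISTA for min_x 0.5*||Ax - y||^2 + lam*||x||_1.

    Each iteration x -> soft_threshold(W1 x + W2 y, lam/L) is one "layer" of
    the unfolded network; here W1, W2 are fixed by A, in LISTA they are learned.
    """
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    W1 = np.eye(A.shape[1]) - (A.T @ A) / L
    W2 = A.T / L
    x = np.zeros(A.shape[1])
    for _ in range(n_layers):
        x = soft_threshold(W1 @ x + W2 @ y, lam / L)
    return x
```

For `A` equal to the identity, the iteration collapses to a single soft-thresholding of `y`, which is a convenient sanity check.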
“…Recently, neural networks on infinite dimensional spaces have received a certain attention [21]. For applications of their finite dimensional counterparts, we refer to [27,28]. To this end, let (X, ‖·‖) be a real ℓ^p-space, p ∈ (1, ∞), and A_k : X → X, k = 1, 2, ..., d, a family of affine mappings which are α_k-firmly nonexpansive.…”
Section: Deep Learning
confidence: 99%
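In the familiar Hilbert-space case (p = 2), the α-firm nonexpansiveness of an affine map mentioned above can be checked numerically. The sketch below is our own illustration, not from the cited works: it constructs an α-averaged affine map T = (1 − α)I + α(R · + b) from an orthogonal (hence nonexpansive) matrix R and verifies the defining inequality on a random pair of points.

```python
import numpy as np

rng = np.random.default_rng(1)
alpha = 0.5

# Nonexpansive linear part R: an orthogonal matrix, so ||R u|| = ||u||.
R, _ = np.linalg.qr(rng.standard_normal((5, 5)))
b = rng.standard_normal(5)

def T(x):
    # alpha-averaged affine map: (1 - alpha) x + alpha (R x + b).
    return (1 - alpha) * x + alpha * (R @ x + b)

# alpha-firm nonexpansiveness (Hilbert-space form):
# ||Tx - Ty||^2 + (1 - alpha)/alpha * ||(I - T)x - (I - T)y||^2 <= ||x - y||^2.
x, y = rng.standard_normal(5), rng.standard_normal(5)
lhs = (np.linalg.norm(T(x) - T(y)) ** 2
       + (1 - alpha) / alpha * np.linalg.norm((x - T(x)) - (y - T(y))) ** 2)
rhs = np.linalg.norm(x - y) ** 2
```

For an orthogonal R the inequality in fact holds with equality; any strictly contractive R would make it strict.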