2019 IEEE 8th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP)
DOI: 10.1109/camsap45676.2019.9022676

An Underparametrized Deep Decoder Architecture for Graph Signals

Abstract: While deep convolutional architectures have achieved remarkable results in a gamut of supervised applications dealing with images and speech, recent works show that deep untrained non-convolutional architectures can also outperform state-of-the-art methods in several tasks such as image compression and denoising. Motivated by the fact that many contemporary datasets have an irregular structure different from a 1D/2D grid, this paper generalizes untrained and underparametrized non-convolutional architectures to…

Cited by 10 publications (6 citation statements)
References 21 publications
“…The GCG architecture presented in Section IV incorporated the topology of G via the vertex-based convolutions implemented by the graph filter H with S = Ã. In this section, we introduce the graph decoder (GD) architecture, a new graph-aware denoising NN that incorporates the topology of G via a (nested) collection of graph upsampling operators [48]. Specifically, we propose the linear transformation for the GD denoiser to be given by…”
Section: Graph Upsampling Decoder (mentioning)
confidence: 99%
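The upsampling-based decoder described in this excerpt can be pictured with a short sketch. The exact upsampling operators, layer widths, and the (truncated) linear transformation of the GD denoiser are not given here, so the class name GraphDecoderSketch, the precomputed matrices in ups, and the channel-mixing layers below are illustrative assumptions rather than the architecture of the cited work.

```python
import torch
import torch.nn as nn

class GraphDecoderSketch(nn.Module):
    """Illustrative untrained decoder for graph signals: a coarse latent input is
    pushed through a nested collection of graph upsampling operators, interleaved
    with small per-node channel-mixing layers (the source of the few parameters)."""

    def __init__(self, ups, widths):
        super().__init__()
        # ups[l]: upsampling matrix of shape (N_{l+1}, N_l), assumed precomputed
        # from a hierarchical clustering of the graph G (not specified in the excerpt).
        # widths[l]: number of feature channels at resolution level l.
        self.ups = [torch.as_tensor(U, dtype=torch.float32) for U in ups]
        self.mix = nn.ModuleList(
            [nn.Linear(widths[l], widths[l + 1], bias=False) for l in range(len(ups))]
        )
        self.act = nn.ReLU()
        self.out = nn.Linear(widths[-1], 1, bias=False)  # single-channel graph signal

    def forward(self, z):
        # z: (N_0, widths[0]) fixed coarse latent tensor; only the weights are fitted.
        x = z
        for U, lin in zip(self.ups, self.mix):
            x = U @ x             # vertex-domain upsampling to the next resolution
            x = self.act(lin(x))  # pointwise nonlinearity after channel mixing
        return self.out(x).squeeze(-1)  # graph signal of size N at full resolution
```

Because the channel-mixing layers are tiny relative to the signal size N, a decoder of this form stays underparametrized, which is what provides the implicit regularization in this family of methods.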
“…The goal in graph-signal denoising is to recover the original graph signal x ∈ R^N given the noisy graph-signal observation y = x + w, with w ∈ R^N representing a noise vector. To that end, we approach the denoising problem as in [17], [30] by minimizing…”
Section: A Graph Signal Denoising (mentioning)
confidence: 99%
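The objective minimized in [17], [30] is truncated in the excerpt. As a stand-in, the sketch below assumes the common untrained-network formulation from this line of work: fit the few weights of an untrained decoder so its output matches the noisy observation y, and return that output as the denoised estimate. The helper name denoise_with_untrained_net and the plain squared-error loss are assumptions for illustration.

```python
import torch

def denoise_with_untrained_net(y, net, z, n_iters=500, lr=1e-2):
    """Fit the weights of an untrained network `net` so that net(z) approximates the
    noisy graph signal y, and return net(z) as the denoised estimate.
    A plain squared-error data-fidelity term is assumed here; the exact objective
    minimized in the cited works is truncated in the excerpt above."""
    y = torch.as_tensor(y, dtype=torch.float32)
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    for _ in range(n_iters):
        opt.zero_grad()
        loss = torch.sum((net(z) - y) ** 2)  # data-fidelity term only
        loss.backward()
        opt.step()
    with torch.no_grad():
        return net(z)
```

With net set to something like the GraphDecoderSketch above and z a fixed random latent, the small number of trainable weights, rather than an explicit penalty, is what keeps the fit from reproducing the noise.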
“…where the scalars α and β control the trade-off between the different components of the loss function at the generator. As typically done, we optimize (10) in an iterative manner where we first fix θ and update the parameters ψ of the discriminator via mini-batch training, followed by fixing ψ and updating the values of θ accordingly. More details on the implementation and hyperparameter selection are given in the next section.…”
Section: Generative Adversarial Network for Graph Signal Imputation (mentioning)
confidence: 99%
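The alternating update described in the excerpt can be written as a short training step. The loss (10), the generator and discriminator architectures, and the exact role of α and β are not reproduced here, so the binary cross-entropy adversarial terms and the masked reconstruction term below are placeholder assumptions that merely illustrate the fix-θ / update-ψ, then fix-ψ / update-θ pattern.

```python
import torch
import torch.nn as nn

def alternating_step(gen, disc, opt_theta, opt_psi, y_obs, mask, z, alpha=1.0, beta=1.0):
    """One round of the alternating scheme: update the discriminator parameters psi
    with the generator parameters theta frozen, then update theta with psi frozen.
    The specific losses are placeholders, not the loss (10) of the cited work."""
    bce = nn.BCEWithLogitsLoss()

    # 1) Fix theta, update psi: train the discriminator on a mini-batch.
    opt_psi.zero_grad()
    fake = gen(z).detach()                       # detach() keeps theta fixed here
    d_real, d_fake = disc(y_obs), disc(fake)
    d_loss = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    d_loss.backward()
    opt_psi.step()

    # 2) Fix psi, update theta: adversarial term plus a reconstruction term on the
    #    observed entries of the graph signal (mask marks the non-missing samples),
    #    traded off by the scalars alpha and beta.
    opt_theta.zero_grad()
    x_hat = gen(z)
    d_hat = disc(x_hat)
    g_loss = alpha * bce(d_hat, torch.ones_like(d_hat)) \
             + beta * torch.mean(mask * (x_hat - y_obs) ** 2)
    g_loss.backward()
    opt_theta.step()
    return d_loss.item(), g_loss.item()
```

Only opt_psi touches the discriminator and only opt_theta touches the generator, so each phase leaves the other parameter set unchanged even though gradients are computed through both networks.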
“…A broad range of graph signals exists, with meaningful examples including neural activity defined on the regions of a brain network, the spread of a rumor on a social network, and the delay experienced at each station of a subway network. In recent years, the processing of graph signals has attracted a lot of attention from the statistics, machine learning (ML), and signal processing (SP) communities, with relevant results including sampling and inpainting [1]-[7], denoising [8]-[10], filtering [11], [12], and deep graph convolutional architectures [13]-[15] for graph-supported data.…”
Section: Introduction (mentioning)
confidence: 99%