2015 23rd European Signal Processing Conference (EUSIPCO)
DOI: 10.1109/eusipco.2015.7362634
Enhanced lasso recovery on graph

Abstract: This work aims at recovering signals that are sparse on graphs. Compressed sensing offers techniques for signal recovery from a few linear measurements, and graph Fourier analysis provides a signal representation on graphs. In this paper, we leverage these two frameworks to introduce a new Lasso recovery algorithm on graphs. More precisely, we present a non-convex, non-smooth algorithm that outperforms the standard convex Lasso technique. We carry out numerical experiments on three benchmark graph datasets. Spar…
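For context, here is a minimal sketch of the standard convex Lasso baseline that the abstract reports outperforming, solved in the graph spectral domain with plain ISTA. The path graph, problem sizes, and regularization weight are illustrative assumptions; the paper's own non-convex, non-smooth algorithm is not reproduced here.

```python
# Sketch (not the paper's algorithm): recover a signal that is sparse in the
# graph Fourier basis from a few Gaussian measurements, using the standard
# convex Lasso baseline solved by ISTA. Sizes and the path graph are assumed.
import numpy as np

rng = np.random.default_rng(0)

# Graph Fourier basis: eigenvectors of the Laplacian of a small path graph.
n = 64
adj = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
lap = np.diag(adj.sum(axis=1)) - adj
_, U = np.linalg.eigh(lap)               # columns = graph Fourier modes

# Ground truth: k-sparse vector of graph Fourier coefficients.
k = 5
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
signal = U @ x_true                       # signal that is sparse on the graph

# Compressed measurements y = M @ signal, with m << n.
m = 25
M = rng.normal(size=(m, n)) / np.sqrt(m)
y = M @ signal

# Lasso in the spectral domain: min_x 0.5*||y - (M U) x||^2 + lam*||x||_1,
# solved with soft-thresholded gradient steps (ISTA).
Phi = M @ U
step = 1.0 / np.linalg.norm(Phi, 2) ** 2  # 1 / Lipschitz constant of the gradient
lam = 0.05
x = np.zeros(n)
for _ in range(2000):
    grad = Phi.T @ (Phi @ x - y)
    x = x - step * grad
    x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

Recovery quality depends on the measurement count m, sparsity k, and lam; the paper's non-convex penalty is motivated precisely by the bias this convex baseline exhibits.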

Cited by 12 publications (21 citation statements)
References 13 publications (20 reference statements)

Citation statements, ordered by relevance:
“…To mitigate issues associated with diminishing gradients and loss of accuracy in such a deep architecture, the connections between layers are implemented as gated residual connections [53]. After the last attention block, the node features are pooled (Fig. 1), resulting in a final molecular embedding consisting of 2,048 floats.…”
Section: Results
confidence: 99%
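To illustrate the gating idea in this quote, here is a minimal sketch of a gated residual connection; the GRU-like convex-combination form, layer sizes, and names are assumptions, not the cited paper's exact formulation.

```python
# Hedged sketch of a gated residual connection: instead of out = x + f(x),
# a learned gate mixes the block output with its input. The convex-combination
# form below is an assumption, not the cited paper's exact design.
import torch
import torch.nn as nn

class GatedResidual(nn.Module):
    """out = g * update + (1 - g) * residual, with g learned per feature."""
    def __init__(self, dim: int):
        super().__init__()
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, residual: torch.Tensor, update: torch.Tensor) -> torch.Tensor:
        g = torch.sigmoid(self.gate(torch.cat([residual, update], dim=-1)))
        return g * update + (1.0 - g) * residual

# Usage: wrap each attention block's output before feeding the next layer.
block = GatedResidual(dim=128)
h = torch.randn(10, 128)       # node features entering the layer
h_new = torch.randn(10, 128)   # attention block output
h_out = block(h, h_new)
```

Because the gate can saturate toward the identity path, gradients flow through deep stacks largely unattenuated, which is the motivation the quote gives.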
“…We developed the model based on the graph attention network [52] architecture. Our additions to that architecture are usage of gated residual connections [53] and feature vectors on edges as well as vertices. The model operates on molecular graphs, vertices being the atoms, and edges being the bonds.…”
Section: Model Architecture and Training
confidence: 99%
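A sketch of how edge feature vectors can enter an attention score alongside vertex features, in the spirit of the additions described above; the scoring function and dimensions are assumptions borrowed from the original GAT design, not the citing paper's code.

```python
# Illustrative sketch only: an unnormalized attention logit computed from the
# source node, destination node, and edge (bond) features of each edge.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EdgeAttention(nn.Module):
    def __init__(self, node_dim: int, edge_dim: int):
        super().__init__()
        self.score = nn.Linear(2 * node_dim + edge_dim, 1)

    def forward(self, h_src, h_dst, e):
        # h_src, h_dst: (E, node_dim) endpoint features; e: (E, edge_dim).
        z = torch.cat([h_src, h_dst, e], dim=-1)
        return F.leaky_relu(self.score(z), negative_slope=0.2)
```

A softmax over each destination node's incoming logits would then normalize these into attention weights, exactly as in standard GAT.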
“…We note that the RGCN defined in (4)-(5) is an anisotropic variant of GCN [33]. Similar to Residual GateGCN [34], our RGCN has residual connections on the node feature representations, and explicitly maintains edge features at each layer. Intuitively, the edge feature representations at different layers encode the pair-wise human interaction information.…”
Section: Relational Graph Convolutional Network
confidence: 99%
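A hedged sketch of an anisotropic, gated graph-convolution layer with residual node updates and per-layer edge features, loosely following the Residual GateGCN pattern [34] the quote refers to; all layer names, the dense-adjacency representation, and shapes are illustrative assumptions.

```python
# Sketch of one anisotropic layer: edge gates (computed from both endpoints
# and the previous edge feature) weight each neighbor differently, unlike the
# isotropic averaging of vanilla GCN. Both nodes and edges get residuals.
import torch
import torch.nn as nn

class AnisotropicGCNLayer(nn.Module):
    def __init__(self, d: int):
        super().__init__()
        self.U = nn.Linear(d, d)  # self transform
        self.V = nn.Linear(d, d)  # neighbor transform
        self.A = nn.Linear(d, d)  # edge update from source node
        self.B = nn.Linear(d, d)  # edge update from target node
        self.C = nn.Linear(d, d)  # edge update from previous edge feature

    def forward(self, h, e, adj):
        # h: (n, d) node features; e: (n, n, d) edge features;
        # adj: (n, n) 0/1 adjacency mask.
        e_new = self.A(h)[:, None, :] + self.B(h)[None, :, :] + self.C(e)
        gate = torch.sigmoid(e_new) * adj[..., None]      # anisotropic weights
        agg = (gate * self.V(h)[None, :, :]).sum(dim=1)   # gated neighbor sum
        h_new = h + torch.relu(self.U(h) + agg)           # residual node update
        return h_new, e + e_new                           # residual edge update

layer = AnisotropicGCNLayer(d=32)
h, e = torch.randn(6, 32), torch.randn(6, 6, 32)
adj = (torch.rand(6, 6) < 0.5).float()
h, e = layer(h, e, adj)
```

The per-edge sigmoid gate is what makes the variant anisotropic: two neighbors with identical features can still contribute differently depending on the maintained edge state.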
“…The most frequent one, with (p, q) = (1, 2), is used [38], [24] as a surrogate to ℓ0 [39], [40]. This ratio was used to enhance lasso recovery on graphs [41]. Its early history includes the "minimum entropy deconvolution" proposed in [42], where the "varimax norm", akin to the kurtosis (ℓ4/ℓ2)^4, is maximized to yield visually simpler (spikier) signals.…”
Section: B. Penalties with Quasinorm and Norm Ratios
confidence: 99%
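A quick numerical check of why the ℓ1/ℓ2 ratio acts as a sparsity surrogate: it is scale-invariant, reaches its minimum of 1 on 1-sparse vectors, and grows to √n on flat vectors of length n. The example vectors below are illustrative.

```python
# The l1/l2 ratio rewards spiky (sparse) vectors and penalizes flat ones,
# independently of scale -- the property exploited by the ratio penalties above.
import numpy as np

def l1_over_l2(x):
    return np.abs(x).sum() / np.linalg.norm(x)

spiky = np.array([5.0, 0.0, 0.0, 0.0])
flat = np.array([1.0, 1.0, 1.0, 1.0])
print(l1_over_l2(spiky))      # 1.0  -> minimal: perfectly sparse
print(l1_over_l2(flat))       # 2.0  -> sqrt(4), maximal: not sparse at all
print(l1_over_l2(10 * flat))  # 2.0  -> unchanged: scale invariance
```

Scale invariance is what distinguishes the ratio from the plain ℓ1 norm, which shrinks all coefficients and biases the convex Lasso that the surveyed paper improves on.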