2022
DOI: 10.48550/arxiv.2203.08058
Preprint

Graph filtering over expanding graphs

Abstract: Our capacity to learn representations from data is related to our ability to design filters that can leverage their coupling with the underlying domain. Graph filters are one such tool for network data and have been used in a myriad of applications. But graph filters work only with a fixed number of nodes despite the expanding nature of practical networks. Learning filters in this setting is challenging not only because of the increased dimensions but also because the connectivity is known only up to an attach…
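The abstract's point about fixed-size filters can be made concrete with a standard polynomial graph filter, y = Σ_k h_k S^k x, where S is a graph shift operator (e.g. the adjacency matrix) and h_k are the filter taps. The sketch below uses a made-up 4-node graph and arbitrary taps; note that while the taps h are size-independent, S is an n×n matrix, which is exactly why an expanding node set is problematic.

```python
import numpy as np

def graph_filter(S, x, h):
    """Apply the polynomial graph filter y = sum_k h[k] * S^k x.

    S : (n, n) graph shift operator (adjacency or Laplacian)
    x : (n,) graph signal
    h : filter taps h_0, ..., h_K
    """
    y = np.zeros_like(x, dtype=float)
    Skx = x.astype(float)      # S^0 x
    for hk in h:
        y += hk * Skx
        Skx = S @ Skx          # next power: S^{k+1} x
    return y

# Toy 4-node path graph (adjacency matrix) and an impulse signal.
S = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
x = np.array([1.0, 0.0, 0.0, 0.0])

y = graph_filter(S, x, h=[0.5, 0.3, 0.2])
print(y)  # filtered signal spreads along the path: [0.7, 0.3, 0.2, 0.0]
```

Because the taps are decoupled from n, the same h can in principle be reused on a grown graph, but only if the new shift operator S is known, which motivates the attachment-model setting the abstract describes.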

Cited by 1 publication (1 citation statement)
References 27 publications
“…However, real networks are dynamic and noisy, and the respective signals are also time varying. Therefore, one of the biggest challenges is to extend graph filters to this dynamic setting in a principled manner by accounting for the variability in the number of nodes, edges, and graph signals [325]–[327]. 4) In several nonlinear tasks (e.g., classification) graph filters have often been designed via suboptimal losses to prioritize convex and mathematically tractable solutions.…”
Section: A Look Ahead
confidence: 99%