2019
DOI: 10.48550/arxiv.1909.10325
Preprint

Graph Signal Processing -- Part II: Processing and Analyzing Signals on Graphs

Ljubisa Stankovic,
Danilo Mandic,
Milos Dakovic
et al.

Abstract: Data analytics on graphs deals with information processing of data acquired on irregular but structured graph domains. The focus of Part I of this monograph has been on both the fundamental and higher-order graph properties, graph topologies, and spectral representations of graphs. Part I also establishes rigorous frameworks for vertex clustering and graph segmentation, and illustrates the power of graphs in various data association tasks. Part II embarks on these concepts to address the algorithmic and practi…

Help me understand this report

Search citation statements

Order By: Relevance

Paper Sections

Select...
2
1
1
1

Citation Types

0
13
0

Year Published

2020
2020
2022
2022

Publication Types

Select...
3
3

Relationship

1
5

Authors

Journals

citations
Cited by 7 publications
(13 citation statements)
references
References 66 publications
0
13
0
Order By: Relevance
“…Chebyshev basis vs. other bases. Chebyshev polynomials are widely used to approximate various functions in digital signal processing and graph signal filtering [31,32]. It has been shown that, for functions analytic in an ellipse containing the approximation interval, the truncated Chebyshev expansion gives an approximate minimax polynomial [11].…”
Section: The Motivation of Revisiting ChebNet (mentioning)
confidence: 99%
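The truncated Chebyshev expansion referenced in this snippet is what makes eigendecomposition-free graph filtering practical. Below is a minimal NumPy sketch, assuming a symmetric Laplacian L with spectrum in [0, lmax] and a hypothetical filter response h(λ) = exp(−2λ) (neither is from the cited papers); it fits the coefficients with numpy.polynomial.chebyshev.chebinterpolate, whose interpolation at Chebyshev points is close to the minimax polynomial for analytic h, and applies the filter through the three-term recurrence:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Hypothetical filter response on the Laplacian spectrum (an assumption,
# not a filter taken from the cited papers).
h = lambda lam: np.exp(-2.0 * lam)

def cheb_filter(L, f, lmax, K=10):
    """Apply h(L) f via a truncated Chebyshev expansion, with no
    eigendecomposition of L required."""
    # Chebyshev coefficients of h on [0, lmax], mapped to [-1, 1].
    c = C.chebinterpolate(lambda x: h((x + 1.0) * lmax / 2.0), K)
    Lt = (2.0 / lmax) * L - np.eye(L.shape[0])  # spectrum rescaled to [-1, 1]
    # Three-term recurrence: T_0 f = f, T_1 f = Lt f,
    # T_k f = 2 Lt (T_{k-1} f) - T_{k-2} f.
    T_prev, T_curr = f, Lt @ f
    out = c[0] * T_prev + c[1] * T_curr
    for k in range(2, K + 1):
        T_prev, T_curr = T_curr, 2.0 * (Lt @ T_curr) - T_prev
        out += c[k] * T_curr
    return out

# Sanity check on a 5-vertex path graph against exact spectral filtering.
A = np.diag(np.ones(4), 1); A = A + A.T
L = np.diag(A.sum(axis=1)) - A
f = np.random.default_rng(0).standard_normal(5)
lam, U = np.linalg.eigh(L)
exact = U @ (h(lam) * (U.T @ f))
print(np.allclose(cheb_filter(L, f, lmax=lam.max(), K=15), exact, atol=1e-6))  # True
```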
“…where l represents the index of each layer, c_l is the number of filters (channels) of the l-th layer, Θ^l_{i,j} is a diagonal matrix which contains the set of learnt parameters of the l-th layer, and ρ(·) is the activation function of the neurons. In (121), the summation ensures the aggregation of features filtered by different convolutional kernels, Θ^l_{i,j}, which is similar to a linear combination across kernels in CNNs. Although it achieves graph convolution through NNs, this work has two main limitations: i) localisation in the vertex domain cannot be ensured by Θ^l_{i,j}, although it is crucial in convolutional neural networks for extracting local stationary features; ii) the computational complexity brought by the O(N^2) multiplications by U and U^T, together with the one-off eigendecomposition of L to obtain U, may be prohibitive for large graphs.…”
Section: Graph Fourier Transform (mentioning)
confidence: 99%
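The layer this snippet describes, i.e. Eq. (121) of the cited monograph, is compact to write down. The sketch below is a plain NumPy rendering under assumed shapes (the names spectral_conv_layer, Theta, and the tanh nonlinearity are illustrative, not from the source); it also makes the O(N^2) cost of the U and U^T multiplications per channel explicit:

```python
import numpy as np

def spectral_conv_layer(U, X, Theta, rho=np.tanh):
    """One spectral graph-convolution layer in the style of Eq. (121).

    U     : (N, N) graph Fourier basis (eigenvectors of L).
    X     : (N, c_in) input feature channels.
    Theta : (c_in, c_out, N) learnt diagonals of the spectral filters.
    """
    N, c_in = X.shape
    c_out = Theta.shape[1]
    Xhat = U.T @ X                 # GFT of every input channel: O(N^2 c_in)
    Y = np.zeros((N, c_out))
    for j in range(c_out):
        # Aggregate the spectrally filtered input channels for output channel j.
        s = sum(Theta[i, j] * Xhat[:, i] for i in range(c_in))
        Y[:, j] = U @ s            # inverse GFT back to the vertex domain
    return rho(Y)
```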
“…the volume normalized cut, CutV(V_1, V_2) in (156), takes the form of a generalised Rayleigh quotient of L, given by [120,121] CutV…”
Section: Spectral Bisection Based Minimum Cut (mentioning)
confidence: 99%
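Since the quotient itself is truncated in the snippet, here is a sketch of the standard relaxation it points to, stated as an assumption rather than the cited derivation: minimising x^T L x / x^T D x over sign vectors is relaxed to the generalised eigenproblem L x = λ D x, and the sign pattern of the second eigenvector (the generalised Fiedler vector) yields the bisection. The example graph and function names are illustrative:

```python
import numpy as np
from scipy.linalg import eigh

def spectral_bisection(A):
    """Bisect a graph by the sign pattern of the generalised Fiedler vector.

    Relaxes the volume-normalised cut to min_x (x^T L x) / (x^T D x),
    i.e. the generalised eigenproblem L x = lambda D x.
    """
    D = np.diag(A.sum(axis=1))
    L = D - A
    lam, X = eigh(L, D)          # generalised symmetric eigenproblem
    fiedler = X[:, 1]            # eigenvector of the second-smallest eigenvalue
    return fiedler >= 0          # boolean mask: True -> V_1, False -> V_2

# Two triangles joined by a single edge split cleanly along that edge.
A = np.zeros((6, 6))
for u, v in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    A[u, v] = A[v, u] = 1.0
print(spectral_bisection(A))     # e.g. [ True  True  True False False False]
```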
“…Graph Shift Filters. The weighted adjacency matrix can be used as a shift operator to filter signals on graphs. Such a graph filter represents a linear combination of vertex-shifted graph signals, which captures graph information at a local level [15]. For example, the operation g = (I + A)f produces a filtered signal, g ∈ R^N, such that g_n = f_n + Σ_{m∈Ω_n} a_{n,m} f_m, where Ω_n denotes the 1-hop neighbours that are directly connected to the n-th vertex.…”
Section: Graph Signal Processing (mentioning)
confidence: 99%
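A minimal numerical sketch of this shift filter, on an assumed 4-vertex weighted graph (the adjacency values are illustrative), confirming that the matrix form (I + A)f matches the node-wise sum over 1-hop neighbours:

```python
import numpy as np

# Weighted adjacency of a small hypothetical 4-vertex graph.
A = np.array([[0. , 0.5, 0. , 0.2],
              [0.5, 0. , 1.0, 0. ],
              [0. , 1.0, 0. , 0.7],
              [0.2, 0. , 0.7, 0. ]])
f = np.array([1.0, -2.0, 0.5, 3.0])

g = (np.eye(4) + A) @ f        # graph shift filter: g = (I + A) f

# Node-wise check: g_n = f_n + sum over 1-hop neighbours of a_{n,m} f_m
# (the zero entries of row n exclude non-neighbours automatically).
n = 1
print(np.isclose(g[n], f[n] + A[n] @ f))   # True
```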
“…For example, the operation g = (I + A)f produces a filtered signal, g ∈ R^N, such that g_n = f_n + Σ_{m∈Ω_n} a_{n,m} f_m, where Ω_n denotes the 1-hop neighbours that are directly connected to the n-th vertex. For M graph signals stacked in a matrix form as F ∈ R^{N×M}, the resulting graph filter can be compactly written as G = (I + A)F [15].…”
Section: Graph Signal Processing (mentioning)
confidence: 99%
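For the stacked form, the same shift filter applies to all M columns at once; a short sketch under the same assumed adjacency:

```python
import numpy as np

N, M = 4, 3
A = np.array([[0. , 0.5, 0. , 0.2],
              [0.5, 0. , 1.0, 0. ],
              [0. , 1.0, 0. , 0.7],
              [0.2, 0. , 0.7, 0. ]])
F = np.random.default_rng(0).standard_normal((N, M))  # M signals, one per column

G = (np.eye(N) + A) @ F        # filters all M graph signals in one product

# Column j of G equals the single-signal filter applied to column j of F.
print(np.allclose(G[:, 0], (np.eye(N) + A) @ F[:, 0]))   # True
```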