ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp40776.2020.9053859
Gaussian Processes Over Graphs

Abstract: We propose Gaussian processes for signals over graphs (GPG) using the a priori knowledge that the target vectors lie over a graph. We incorporate this information using a graph-Laplacian based regularization which enforces the target vectors to have a specific profile in terms of graph Fourier transform coefficients, for example lowpass or bandpass graph signals. We discuss how the regularization affects the mean and the variance in the prediction output. In particular, we prove that the predictive variance of t…
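To make the abstract's idea concrete, here is a minimal, hypothetical sketch of the kind of model it describes: a GP over graph signals whose covariance across the N graph nodes is the Laplacian-based matrix (I + βL)⁻¹, so that smoother (lowpass) graph signals are favored a priori. The graph, β, kernel, and data below are made-up toy choices, not the paper's actual experimental setup.

```python
import numpy as np

def rbf(X1, X2, ell=1.0):
    # squared-exponential kernel between rows of X1 and X2
    d2 = (np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-0.5 * d2 / ell**2)

# 4-node path graph and its combinatorial Laplacian L = D - A
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

beta = 1.0
B = np.linalg.inv(np.eye(4) + beta * L)   # Laplacian-based node covariance

rng = np.random.default_rng(0)
X_tr = rng.normal(size=(5, 2))   # 5 training inputs, 2 features each
Y_tr = rng.normal(size=(5, 4))   # one 4-node graph signal per input
X_te = rng.normal(size=(2, 2))   # 2 test inputs

noise = 1e-2
# separable (Kronecker) covariance over stacked signals: K_x ⊗ B
K_full = np.kron(rbf(X_tr, X_tr), B) + noise * np.eye(5 * 4)
K_cross = np.kron(rbf(X_te, X_tr), B)

# standard GP posterior mean, one predicted graph signal per test input
mean = (K_cross @ np.linalg.solve(K_full, Y_tr.reshape(-1))).reshape(2, 4)
```

The Kronecker structure here is the separable multi-output-GP design that the citation statements below attribute to this line of work; the specific choice (I + βL)⁻¹ encodes the graph smoothness prior.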

Cited by 17 publications (11 citation statements)
References 46 publications
“…Using discrete kernels in GPs is a relatively unexplored area, possibly due to the difficulties in hyper-parameter optimization and inducing point selection. Discrete kernels have been used on graphs [42] and strings [43] (also for biological problems [23]), but so far only on relatively small datasets with full GPs. In parallel work, [33] study GPs on strings for Bayesian optimization, but their problems are small and they do not use inducing points or any other scalability approach.…”
Section: Related Work A: Sparse Gaussian Processes
confidence: 99%
“…A second option is to leverage notions of graph convolutions for the same purpose (Opolka and Liò, 2020; Walker and Glocker, 2019). From a slightly different perspective, the studies by Venkitaraman et al. (2020) and Zhi et al. (2020) follow the literature of multi-output GPs with a separable kernel design. Finally, a Matérn GP on graphs has been proposed by Borovitskiy et al. (2021), although their model resembles kernels on graphs (Smola and Kondor, 2003).…”
Section: Related Work
confidence: 99%
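The "kernels on graphs" family mentioned above can be illustrated with a diffusion kernel, K = exp(−βL), built directly from the graph Laplacian. The following is a hypothetical toy sketch (the cycle graph, β, and observed nodes are made up), showing GP regression over the nodes of a single graph:

```python
import numpy as np

# 5-node cycle graph and its Laplacian L = D - A
N = 5
A = np.zeros((N, N))
for i in range(N):
    A[i, (i + 1) % N] = A[(i + 1) % N, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

# diffusion kernel K = exp(-beta * L), computed via the
# eigendecomposition of the symmetric Laplacian
beta = 0.5
evals, evecs = np.linalg.eigh(L)
K = evecs @ np.diag(np.exp(-beta * evals)) @ evecs.T

# GP regression over nodes: observe the signal at two nodes,
# predict the posterior mean at every node
obs = [0, 2]
y_obs = np.array([1.0, -1.0])
noise = 1e-3
post_mean = K[:, obs] @ np.linalg.solve(
    K[np.ix_(obs, obs)] + noise * np.eye(len(obs)), y_obs)
```

Because L is symmetric positive semidefinite, K is symmetric positive definite, so it is a valid GP covariance over the node set; with small observation noise the posterior mean nearly interpolates the observed nodes.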
“…A GP model on graphs would allow for modelling uncertainty associated with the nodes in the graph and making predictions on unlabelled nodes. A key requirement in building GPs on graphs is incorporating the graph information into the design of the GP kernel, for example using convolution-like operations (Ng et al., 2018; Walker and Glocker, 2019; Opolka and Liò, 2020) or following the separable kernel design of multi-output GPs (Venkitaraman et al., 2020; Zhi et al., 2020).…”
Section: Introduction
confidence: 99%
“…Using discrete kernels in GPs is a relatively unexplored area, possibly due to the difficulties in hyper-parameter optimization and inducing point selection. Discrete kernels have been used on graphs [35] and strings [2] (also for biological problems [31]), but so far only on relatively small data sets with full GPs. To the best of our knowledge, we are the first to address the scalability problem with discrete GPs using inducing points and discrete optimization.…”
Section: Related Work
confidence: 99%