2022
DOI: 10.48550/arxiv.2205.09968
Preprint

A General Framework for quantifying Aleatoric and Epistemic uncertainty in Graph Neural Networks

Abstract: Graph Neural Networks (GNNs) provide a powerful framework that elegantly integrates graph theory with machine learning for the modeling and analysis of networked data. We consider the problem of quantifying the uncertainty in GNN predictions stemming from modeling errors and measurement uncertainty. We consider aleatoric uncertainty in the form of probabilistic links and noise in the feature vectors of nodes, while epistemic uncertainty is incorporated via a probability distribution over the model parameters. We prop…
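As a rough illustration of the abstract's distinction (not the paper's exact method), predictive uncertainty from an ensemble of softmax outputs, one per draw of the model parameters, can be split with the standard entropy decomposition: total uncertainty is the entropy of the mean prediction, aleatoric uncertainty is the mean of the per-draw entropies, and epistemic uncertainty is their difference (the mutual information between prediction and parameters). The sketch below assumes the softmax vectors are given; how they are produced (e.g., by sampling parameters or perturbing links and features) is outside its scope.

```python
import math

def entropy(p):
    """Shannon entropy (in nats) of a discrete distribution."""
    return -sum(q * math.log(q) for q in p if q > 0)

def decompose_uncertainty(samples):
    """Decompose predictive uncertainty from an ensemble of softmax vectors.

    samples: list of softmax vectors, one per draw of the model parameters.
    Returns (total, aleatoric, epistemic) with total = aleatoric + epistemic.
    """
    k = len(samples[0])
    mean = [sum(s[i] for s in samples) / len(samples) for i in range(k)]
    total = entropy(mean)                                   # predictive entropy
    aleatoric = sum(entropy(s) for s in samples) / len(samples)
    epistemic = total - aleatoric                           # mutual information
    return total, aleatoric, epistemic

# Two parameter draws that disagree strongly -> large epistemic share.
total, alea, epi = decompose_uncertainty([[0.9, 0.1], [0.1, 0.9]])
```

When the draws disagree (as above), the mean prediction is near-uniform, so the total entropy is high while each individual draw is confident, leaving most of the uncertainty in the epistemic term.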

Cited by 1 publication (1 citation statement)
References 26 publications
“…The overall candidate node prediction task is summarized in Algorithm A.1 in the appendix. It is interesting to note that if this GNN classifier can provide a confidence interval around its class predictions as shown in [28], then the succeeding DRL engine can utilize that interval while searching for optimal seed nodes. This further strengthens the overall framework and is kept as future work.…”
Section: Prediction of Candidate Nodes
confidence: 99%
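The future work sketched in the citation statement, a DRL engine that exploits the classifier's confidence interval while searching for seed nodes, could for instance rank candidates by the lower bound of the interval rather than the point prediction. This is a hypothetical illustration; the node names, scores, and interval half-widths below are made-up inputs, not outputs of the cited method.

```python
def select_seeds(scores, half_widths, k):
    """Pick the k nodes with the highest lower confidence bound.

    scores:      {node: predicted class probability (point estimate)}
    half_widths: {node: half-width of the confidence interval}
    """
    lower = {n: scores[n] - half_widths[n] for n in scores}
    return sorted(lower, key=lower.get, reverse=True)[:k]

# Node 'b' has a higher point score than 'a' but a much wider interval,
# so the conservative lower-bound ranking prefers 'a'.
seeds = select_seeds({"a": 0.80, "b": 0.85, "c": 0.40},
                     {"a": 0.05, "b": 0.30, "c": 0.05}, k=2)
```

Ranking by the lower bound penalizes candidates the classifier is unsure about, which is one simple way a downstream search could "utilize that interval."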