2020 28th European Signal Processing Conference (EUSIPCO), 2021
DOI: 10.23919/eusipco47968.2020.9287610
Towards Finite-Time Consensus with Graph Convolutional Neural Networks

Cited by 4 publications (7 citation statements: 0 supporting, 7 mentioning, 0 contrasting)
Citing publications by year: 2021 (2), 2023 (2)
References 22 publications
“…The goal of average consensus is to let each node reach the same average value of all nodes. It can be done in many ways, including flooding, distributed linear iterations [26], deep neural networks [27], graph filters [28], and graph convolutional neural networks [29]. In this paper, we leverage the GNN [29] to reach average consensus on global model parameters for the following reasons.…”
Section: Decentralized Learning Tasks Over Network (mentioning)
confidence: 99%
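The statement above lists distributed linear iterations [26] as a classical route to average consensus. As a point of reference, here is a minimal sketch of that baseline; the weight matrix, node values, and function name are illustrative assumptions, not taken from the cited works.

```python
# Minimal sketch of average consensus via distributed linear iterations
# (the approach attributed to [26]). All names/values here are illustrative.
import numpy as np

def average_consensus(x0, W, num_iters=50):
    """Iterate x_{k+1} = W x_k. With a symmetric, doubly stochastic W whose
    second-largest eigenvalue modulus is below 1, every entry of x converges
    to the average of x0."""
    x = np.asarray(x0, dtype=float)
    for _ in range(num_iters):
        x = W @ x  # each node mixes its value with its neighbors' values
    return x

# Example: 4-node ring graph with uniform, doubly stochastic mixing weights.
W = np.array([
    [0.5,  0.25, 0.0,  0.25],
    [0.25, 0.5,  0.25, 0.0 ],
    [0.0,  0.25, 0.5,  0.25],
    [0.25, 0.0,  0.25, 0.5 ],
])
x0 = np.array([1.0, 3.0, 5.0, 7.0])
print(average_consensus(x0, W))  # all entries approach mean(x0) = 4.0
```

Such iterations converge only asymptotically; the finite-time methods discussed in this report aim instead to reach the exact average in a fixed number of steps.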
“…It can be done in many ways, including flooding, distributed linear iterations [26], deep neural networks [27], graph filters [28], and graph convolutional neural networks [29]. In this paper, we leverage the GNN [29] to reach average consensus on global model parameters for the following reasons. First, in the decentralized learning setting, the network composed of devices can be modeled as a graph: the devices form the set of nodes, and each link between devices can be seen as an edge.…”
Section: Decentralized Learning Tasks Over Network (mentioning)
confidence: 99%
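The modeling step in this statement (devices as nodes, links as edges) can be made concrete in a few lines; the link list below is a made-up example, not data from the citing paper.

```python
# Illustrative only: turning a decentralized learning network into a graph
# shift/mixing matrix. The device links are an assumed toy example (a ring).
import numpy as np

device_links = [(0, 1), (1, 2), (2, 3), (3, 0)]  # communication links
n_devices = 4

A = np.zeros((n_devices, n_devices))
for i, j in device_links:
    A[i, j] = A[j, i] = 1.0  # undirected link between devices i and j

# Row-normalizing the adjacency gives a simple mixing matrix usable as the
# graph shift operator in consensus iterations or graph filters.
W = A / A.sum(axis=1, keepdims=True)
```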
“…We consider this scenario in the paper. If one wants to use fewer iterations while sacrificing some MSE, a new finite-time AC method [19] may be considered. The new method extends the idea to nonlinear graph filters and designs filter coefficients using a learning framework of graph convolutional neural networks.…”
Section: Introduction (mentioning)
confidence: 99%
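This statement summarizes the cited paper's approach: extend finite-time average consensus to nonlinear graph filters whose coefficients are learned in a GCNN framework. Below is a hedged sketch of that idea; the layer structure, nonlinearity, and tap values are illustrative assumptions, not the authors' actual architecture.

```python
# Hedged sketch of learned consensus via nonlinear polynomial graph filters,
# in the spirit of [19]/[29]. Layer count, taps, and ReLU are assumptions.
import numpy as np

def gcnn_consensus(x, S, taps_per_layer):
    """Each layer computes a polynomial graph filter sum_k h_k S^k x and then
    applies a pointwise nonlinearity; the taps h_k are what training would
    adjust so the output approaches the all-average signal in few hops."""
    for taps in taps_per_layer:       # one list of filter taps per layer
        z = np.zeros_like(x)
        shifted = x.copy()            # S^0 x
        for h_k in taps:              # accumulate sum_k h_k S^k x
            z += h_k * shifted
            shifted = S @ shifted     # one more local exchange: S^{k+1} x
        x = np.maximum(z, 0.0)        # ReLU nonlinearity (assumed choice)
    return x

# Toy usage: a 4-node ring with uniform mixing weights as the shift operator.
S = np.array([
    [0.5,  0.25, 0.0,  0.25],
    [0.25, 0.5,  0.25, 0.0 ],
    [0.0,  0.25, 0.5,  0.25],
    [0.25, 0.0,  0.25, 0.5 ],
])
x = np.array([1.0, 3.0, 5.0, 7.0])
taps = [[0.2, 0.5, 0.3], [0.4, 0.6]]  # placeholder taps; learned in practice
print(gcnn_consensus(x, S, taps))
```

Because every multiplication by S uses only neighbor-to-neighbor exchanges, the whole computation remains distributed, which is what makes the learned-filter approach a drop-in alternative to classical linear iterations.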