2021
DOI: 10.1007/s11280-021-00878-3
Binarized graph neural network

Cited by 22 publications (10 citation statements)
References 28 publications
“…[34] proposes an analysis that provides insights into better extracting and fusing information from the protein-protein interaction network for drug repurposing. Furthermore, [7,14,15,20,35] designed representation learning models [29,36] that preserve both positive and negative link information within signed graphs. These methodologies, however, lack the capability to fully exploit the structural and attribute information in signed bipartite graphs.…”
Section: GNNs on Bipartite Graph
Mentioning confidence: 99%
“…Degree-Quant [317] performs quantization-aware training on graphs, yielding INT8 models that often perform as well as their FP32 counterparts. BGN [318] learns binarized parameters, enabling GNNs to learn discrete embeddings. Bi-GCN [320] binarizes both the network parameters and the node attributes, reducing memory consumption by roughly 30x for both and accelerating inference by about 47x.…”
Section: Quantized GNN
Mentioning confidence: 99%
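The weight-binarization idea summarized in this statement is easy to sketch. Below is a minimal, hypothetical PyTorch layer in the spirit of Bi-GCN / XNOR-style binarization: weights are quantized to {-1, +1} with sign and rescaled by their per-column mean absolute value. This is not the cited authors' code; the class and argument names (BinarizedGraphConv, adj_norm) are illustrative, and real methods such as Bi-GCN also binarize the node attributes.

```python
import torch
import torch.nn as nn

class BinarizedGraphConv(nn.Module):
    # GCN-style layer with sign-binarized weights and a real-valued
    # per-column scale (mean absolute weight), XNOR-Net style.
    # Illustrative sketch only, not an implementation of Bi-GCN/BGN.
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(in_dim, out_dim) * 0.01)

    def forward(self, adj_norm: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        alpha = self.weight.abs().mean(dim=0)   # (out_dim,) scale factors
        w_bin = torch.sign(self.weight)         # values in {-1, 0, +1}; 0 is rare
        # Message passing: A_hat @ X @ sign(W), rescaled by alpha.
        return (adj_norm @ (x @ w_bin)) * alpha


# Toy usage: 4-node chain graph with a row-normalized adjacency matrix.
adj = torch.tensor([[0., 1., 0., 0.],
                    [1., 0., 1., 0.],
                    [0., 1., 0., 1.],
                    [0., 0., 1., 0.]])
adj_norm = adj / adj.sum(dim=1, keepdim=True)
x = torch.randn(4, 8)
layer = BinarizedGraphConv(8, 16)
out = layer(adj_norm, x)        # shape (4, 16)
```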
“…In [47], the authors propose to binarize the Graph Attention (GAT) operator [45] and evaluate their method on small-scale datasets such as Cora [37] and Pubmed [29]. In [48], the authors successfully apply the XNOR-Net method to the GCN model [29], but again only on small-scale datasets.…”
Section: Model Compression in Geometric Deep Learning
Mentioning confidence: 99%
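Training such binarized operators hinges on getting gradients through the non-differentiable sign function, which XNOR-Net-style methods (as applied to GCN in [48]) typically do with a straight-through estimator. A minimal PyTorch sketch, not taken from the cited works:

```python
import torch

class BinarizeSTE(torch.autograd.Function):
    # Sign binarization with a straight-through estimator: the forward
    # pass quantizes to {-1, +1}; the backward pass copies the incoming
    # gradient for inputs in [-1, 1] (hard-tanh derivative) so the
    # full-precision latent weights can still be updated.
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.where(x >= 0, torch.ones_like(x), -torch.ones_like(x))

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        return grad_out * (x.abs() <= 1).to(grad_out.dtype)


# Toy usage: gradients flow through the binarization.
w = torch.randn(3, 3, requires_grad=True)
loss = BinarizeSTE.apply(w).sum()
loss.backward()          # w.grad is 1 where |w| <= 1, else 0
```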
“…Balance functions. Recent work [42] has uncovered possible limitations in binary graph and point cloud learning models when quantizing the output of max-pooling aggregation of batch-normalized high-dimensional features. Similarly, the authors of [47] claim that a balance function is necessary to avoid large values in the outputs of the dot-product operations when most pre-quantization inputs are positive. We evaluate two strategies for re-centering the input of sign after max aggregation: mean-centering and median-centering (the latter ensuring a perfectly balanced distribution of positive and negative values pre-quantization).…”
Section: Experimental Evaluation
Mentioning confidence: 99%
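The two re-centering strategies named in this statement are straightforward to illustrate. A minimal sketch, assuming per-node feature tensors in PyTorch; the function names below are our own, not from the cited work:

```python
import torch

def median_centered_sign(h: torch.Tensor, dim: int = -1) -> torch.Tensor:
    # Subtract the per-row median before sign quantization, so that
    # (roughly, for even sizes) half of the pre-quantization values
    # are positive and half negative.
    med = h.median(dim=dim, keepdim=True).values
    return torch.sign(h - med)

def mean_centered_sign(h: torch.Tensor, dim: int = -1) -> torch.Tensor:
    # Mean-centering: cheaper, but only approximately balanced.
    return torch.sign(h - h.mean(dim=dim, keepdim=True))


# Mostly non-negative inputs, as after max aggregation of ReLU features:
h = torch.relu(torch.randn(2, 16))
print(median_centered_sign(h).mean(dim=-1))  # close to 0: balanced signs
```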