2021
DOI: 10.48550/arxiv.2103.03212
Preprint

Weisfeiler and Lehman Go Topological: Message Passing Simplicial Networks

Abstract: The pairwise interaction paradigm of graph machine learning has predominantly governed the modelling of relational systems. However, graphs alone cannot capture the multi-level interactions present in many complex systems, and the expressive power of such schemes was proven to be limited. To overcome these limitations, we propose Message Passing Simplicial Networks (MPSNs), a class of models that perform message passing on simplicial complexes (SCs), topological objects generalising graphs to higher dimensions…
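As a rough illustration of the idea in the abstract, the sketch below runs one round of message passing on a toy simplicial complex, with each simplex aggregating features from its boundary (faces) and co-boundary (cofaces). The toy complex, scalar features, and plain sum aggregator are illustrative assumptions, not the paper's exact scheme, which also uses lower- and upper-adjacent neighbours and learned update functions.

```python
from itertools import combinations

def faces(simplex):
    """Boundary of a simplex: all subsets with one vertex removed."""
    return [frozenset(c) for c in combinations(sorted(simplex), len(simplex) - 1)]

def message_pass(simplices, features):
    """One hypothetical layer: each simplex adds to its own feature the
    summed features of its faces (boundary) and cofaces (co-boundary)."""
    new = {}
    for s in simplices:
        boundary = [features[f] for f in faces(s) if f in features]
        coboundary = [features[t] for t in simplices
                      if len(t) == len(s) + 1 and s < t]  # s is a face of t
        new[s] = features[s] + sum(boundary) + sum(coboundary)
    return new

# Toy complex: one filled triangle {0,1,2} with its three edges and vertices.
tri = frozenset({0, 1, 2})
simplices = [frozenset({v}) for v in tri] + faces(tri) + [tri]
feats = {s: 1.0 for s in simplices}  # constant initial features
out = message_pass(simplices, feats)
# Each edge receives from 2 vertices and 1 triangle: 1 + 2 + 1 = 4.0
```

With constant features, the output already distinguishes simplex dimensions: vertices, edges, and the triangle each see different neighbourhood sizes.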

Cited by 12 publications
(28 citation statements)
References 23 publications
“…This SNN architecture is a particular case of the proposed SCNN, with less expressive power but also fewer parameters. The message passing neural network (MPNN) for simplicial complexes [13] aggregates and updates features from direct simplicial neighbours and from simplices of different orders, e.g., nodes and triangles. By considering an order-one simplicial convolution as the message aggregation step, we then obtain the architectures in [14,Eq.…”
Section: Simplicial Convolutional Neural Network
confidence: 99%
“…In [8], a basic simplicial neural network (SNN) was proposed, with a convolutional layer composed of a basic simplicial filter [12] and a nonlinearity. Message passing neural networks (MPNNs) have been generalised to simplicial complexes in [13], where the aggregation and updating functions consider, in addition to the edge data, data defined on adjacent simplices, i.e., nodes and triangles. The neural network architectures in [14,15] are instances of [13] obtained by specifying the aggregation functions as simplicial filters.…”
confidence: 99%
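The order-one simplicial convolution these excerpts refer to can be sketched as a filter on edge signals built from the two parts of the Hodge Laplacian: the lower Laplacian B1ᵀB1 (edges adjacent through shared nodes) and the upper Laplacian B2B2ᵀ (edges adjacent through shared triangles). The incidence matrices, toy complex, and filter weights below are illustrative assumptions, not the exact architecture of the cited works.

```python
import numpy as np

def simplicial_conv(B1, B2, x, w):
    """Order-one simplicial filter on an edge signal x:
    y = w0*x + w1*L_low @ x + w2*L_up @ x."""
    L_low = B1.T @ B1   # lower Hodge Laplacian: node-mediated edge adjacency
    L_up = B2 @ B2.T    # upper Hodge Laplacian: triangle-mediated edge adjacency
    return w[0] * x + w[1] * (L_low @ x) + w[2] * (L_up @ x)

# Toy complex: one filled triangle on nodes {0,1,2}.
# Edges ordered (0,1), (0,2), (1,2), oriented low-to-high.
B1 = np.array([[-1, -1,  0],
               [ 1,  0, -1],
               [ 0,  1,  1]], dtype=float)    # node-to-edge incidence
B2 = np.array([[ 1], [-1], [ 1]], dtype=float)  # edge-to-triangle incidence

x = np.ones(3)                                  # constant edge signal
y = simplicial_conv(B1, B2, x, [1.0, 0.5, 0.5])
```

A sanity check on the construction is that B1 @ B2 vanishes, the "boundary of a boundary is zero" identity that any valid pair of incidence matrices must satisfy.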
“…The class of models based on the WL test is not, in general, locally permutation equivariant, in that these models still use a message passing scheme with a permutation-invariant update function. Despite this, many of them inject permutation-equivariant information into the feature space, which improves their expressivity (Bouritsas et al, 2020; Morris et al, 2019a; Bodnar et al, 2021b;a). In these models, the information injected into the feature space is predetermined by a choice of which structural or topological information to use, whereas our model uses representations of the permutation group, making it very general while still guaranteeing expressivity.…”
Section: Local Equivariant Graph Network
confidence: 99%
“…Morris et al (2019a) build models based on different WL variants that consider local and global connections. Bodnar et al (2021b) introduce a WL test on simplicial complexes and incorporate it into a message passing scheme. Bodnar et al (2021a) extend the work on simplicial complexes to cell complexes, which subsume simplicial complexes.…”
Section: Introduction
confidence: 99%