2020
DOI: 10.48550/arxiv.2002.02406
Preprint

Message Passing Query Embedding

Cited by 1 publication (3 citation statements)
References 0 publications
“…Complex Query Answering. In the complex (multi-hop) query answering setup with logical operators, existing models employ different approaches, e.g., geometric [15,23,38], probabilistic [24,9], neural-symbolic [26,8,5], neural [21,4], and GNN [11,3]. Still, all the approaches are created and evaluated exclusively in the transductive mode where the set of entities does not change at inference time.…”
Section: Related Work
confidence: 99%
“…Having a uniform featurization mechanism for both seen and unseen entities, it is now possible to apply any previously-transductive complex query answering model with learnable entity embeddings and logical operators [23,11,24,8]. Moreover, it was recently shown [5] that a combination of simple link prediction pre-training and a non-parametric logical executor allows to effectively answer complex FOL queries in the inference-only regime without training on any complex query sample.…”
Section: Nodepiece-qe: Inductive Node Representationmentioning
confidence: 99%