Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) 2018
DOI: 10.18653/v1/p18-1024

LinkNBed: Multi-Graph Representation Learning with Entity Linkage

Abstract: Knowledge graphs have emerged as an important model for studying complex multi-relational data. This has given rise to the construction of numerous large-scale but incomplete knowledge graphs encoding information extracted from various resources. An effective and scalable approach to jointly learn over multiple graphs and eventually construct a unified graph is a crucial next step for the success of knowledge-based inference for many downstream applications. To this end, we propose LinkNBed, a deep relational l…
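The core idea in the abstract — jointly learning over multiple graphs with entity linkage so that a unified representation emerges — can be illustrated with a minimal sketch. This is not the paper's actual architecture (LinkNBed uses a deep relational encoder); it only shows the linkage mechanism with a simple translational score, and all names (`linkage`, `ent_vec`, `score`) are illustrative.

```python
import numpy as np

# Illustrative sketch, NOT the paper's model: two graphs (prefixes A:/B:)
# share one embedding space, and an entity-linkage map forces linked
# entities to resolve to a single canonical vector.

rng = np.random.default_rng(0)
dim = 8

# Linkage asserts that B:paris_fr and A:Paris denote the same real entity.
linkage = {"B:paris_fr": "A:Paris"}
entities = ["A:Paris", "A:France", "B:france_country"]
relations = ["capital_of"]

ent_vec = {e: rng.normal(size=dim) for e in entities}
rel_vec = {r: rng.normal(size=dim) for r in relations}

def lookup(entity):
    """Resolve an entity through the linkage map to its canonical vector."""
    return ent_vec[linkage.get(entity, entity)]

def score(h, r, t):
    """Translational plausibility score f(h,r,t) = -||h + r - t||;
    closer to 0 means more plausible. Stands in for the deep encoder."""
    return -np.linalg.norm(lookup(h) + rel_vec[r] - lookup(t))

# Because B:paris_fr resolves to A:Paris, a triple stated in either
# graph is scored with the same shared representation.
s_a = score("A:Paris", "capital_of", "A:France")
s_b = score("B:paris_fr", "capital_of", "A:France")
assert np.isclose(s_a, s_b)
```

The linkage map is what makes learning "joint": gradients from facts in either graph would update the same canonical vectors, which is the intuition behind constructing a unified graph embedding.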

Cited by 38 publications (24 citation statements)
References 28 publications
“…It combines three different transformation methods to learn embeddings for supporting multi-lingual learning. LinkNBed [21] learns latent representations of nodes and edges in multiple graphs, where a unified graph embedding is constructed, avoiding the bias caused by the transformation between different HIN embeddings.…”
Section: Learning Across Multiple HINs
confidence: 99%
“…Das et al [3] show that training Random Forest on around 1,000 labels can obtain ∼ 95% F-measure for easy data sets, and ∼ 80% F-measure for harder data sets. Deep learning allows comparing long text values by their embedding representations, and starts to show promise when matching texts and dirty data [32,48]. Finally, logic-based learning methods (e.g., probabilistic soft logic) enable linking entities of multiple types at the same time, called collective linkage [38].…”
Section: For Entity Resolution
confidence: 99%
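The excerpt above notes that deep learning compares long text values by their embedding representations. As a toy stand-in for the learned neural embeddings used in the cited systems, the sketch below represents each text value as a bag of character trigrams and compares values by cosine similarity; the helper names (`trigrams`, `cosine`) are invented for illustration.

```python
import numpy as np
from collections import Counter

# Toy entity-resolution matcher: embed each string as a sparse
# bag-of-character-trigrams vector, then compare by cosine similarity.
# Real systems use learned embeddings; this only shows the
# "compare by representation" idea from the excerpt.

def trigrams(text):
    """Sparse character-trigram counts, with padding so short
    strings still yield at least one trigram."""
    text = f"  {text.lower()}  "
    return Counter(text[i:i + 3] for i in range(len(text) - 2))

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    keys = set(a) | set(b)
    va = np.array([a.get(k, 0) for k in keys], dtype=float)
    vb = np.array([b.get(k, 0) for k in keys], dtype=float)
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

sim_match = cosine(trigrams("Intl. Business Machines"),
                   trigrams("International Business Machines"))
sim_other = cosine(trigrams("Intl. Business Machines"),
                   trigrams("Oracle Corporation"))
assert sim_match > sim_other  # near-duplicates score higher
```

Dirty or abbreviated values ("Intl." vs "International") still land close in this representation, which is why embedding-style comparison shows promise on the harder, dirtier data sets the excerpt mentions.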
“…Challenged by the diverse schemata, relational structures and granularities of knowledge representations in different KGs (Nikolov et al, 2009), traditional symbolic methods usually fall short of supporting heterogeneous knowledge association (Suchanek et al, 2011; Lacoste-Julien et al, 2013; Paulheim and Bizer, 2013). Recently, increasing efforts have been put into exploring embedding-based methods (Chen et al, 2017; Trivedi et al, 2018; Jin et al, 2019). Such methods capture the associations of entities or concepts in a vector space, which can help overcome the symbolic and schematic heterogeneity.…”
Section: Introduction
confidence: 99%