2013
DOI: 10.1007/978-3-642-40994-3_40
Tensor Factorization for Multi-relational Learning

Abstract: Relational learning has become ubiquitous in many fields of application. Here, we review tensor factorization for relational learning on the basis of RESCAL, which has shown state-of-the-art relational learning results while scaling to knowledge bases with millions of entities and possibly billions of known facts.
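RESCAL factorizes each frontal slice X_k of a multi-relational tensor as X_k ≈ A R_k Aᵀ, with one entity matrix A shared across relations and a small core R_k per relation. A minimal numpy sketch of this factorization via alternating least squares (function and variable names are illustrative, not the paper's implementation):

```python
import numpy as np

def rescal_als(X, rank, iters=100, lam=1e-3, seed=0):
    """Toy RESCAL-style ALS: factor each slice X[k] ~= A @ R[k] @ A.T.

    X    : list of (n, n) adjacency slices, one per relation
    rank : latent dimension
    An illustrative sketch, not the paper's optimized solver.
    """
    n = X[0].shape[0]
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((n, rank))
    R = [rng.standard_normal((rank, rank)) for _ in X]
    for _ in range(iters):
        # A-update: ridge-regularized normal equations, accumulated
        # over all slices (A appears on both sides of each slice).
        num = sum(Xk @ A @ Rk.T + Xk.T @ A @ Rk for Xk, Rk in zip(X, R))
        AtA = A.T @ A
        den = sum(Rk @ AtA @ Rk.T + Rk.T @ AtA @ Rk for Rk in R)
        A = num @ np.linalg.inv(den + lam * np.eye(rank))
        # R-update: exact least-squares projection of each slice into
        # the latent space spanned by A, via the pseudoinverse.
        P = np.linalg.pinv(A)
        R = [P @ Xk @ P.T for Xk in X]
    return A, R
```

On data that is exactly low-rank, a few dozen ALS sweeps with the true rank drive the reconstruction error close to zero; the shared A is what lets information propagate across relations ("collective learning").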

Year Published: 2014–2023


Cited by 128 publications (129 citation statements)
References 9 publications
“…From [4], we take the idea from relational network learning that tensors provide a way to represent multiple relations between nodes. We expand the adjacency matrix A, which stores the link information, into a graph information tensor that stores how many links exist between the nodes.…”
Section: Graph Information Tensor
confidence: 99%
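The construction quoted above can be sketched directly: stack one adjacency matrix per relation type into a tensor whose entries count the links of each type between each node pair (the edge list and all names below are made-up illustrations, not from the citing paper):

```python
import numpy as np

# Illustrative multi-relational edge list: (source, target, relation).
edges = [
    ("alice", "bob", "follows"),
    ("alice", "bob", "follows"),   # repeated edge -> count of 2
    ("bob", "carol", "cites"),
]

# Index nodes and relation types.
nodes = sorted({n for s, t, _ in edges for n in (s, t)})
rels = sorted({r for _, _, r in edges})
ni = {n: i for i, n in enumerate(nodes)}
ri = {r: k for k, r in enumerate(rels)}

# Graph information tensor: X[k, i, j] counts type-k links from i to j.
X = np.zeros((len(rels), len(nodes), len(nodes)), dtype=int)
for s, t, r in edges:
    X[ri[r], ni[s], ni[t]] += 1
```

Each frontal slice X[k] is the adjacency matrix of one relation, so a single-relation graph is recovered as the special case with one slice.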
“…As in [27] we can exploit the following property of the singular value decomposition (SVD) regarding the Kronecker product of two matrices [28]…”
Section: Collective Factorization Of Type-constrained Relations
confidence: 99%
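The SVD property alluded to is a standard fact about Kronecker products (sketched here from general linear algebra, not taken from [27] or [28]): if A = U₁Σ₁V₁ᵀ and B = U₂Σ₂V₂ᵀ, then A ⊗ B = (U₁⊗U₂)(Σ₁⊗Σ₂)(V₁⊗V₂)ᵀ, so the singular values of A ⊗ B are exactly the pairwise products of the singular values of A and B. A quick numerical check:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((2, 2))

# Singular values of the Kronecker product, largest first.
sv_kron = np.linalg.svd(np.kron(A, B), compute_uv=False)

# All pairwise products of the factors' singular values, sorted to match.
sv_pair = np.sort(np.outer(np.linalg.svd(A, compute_uv=False),
                           np.linalg.svd(B, compute_uv=False)).ravel())[::-1]
```

This lets an SVD of a large Kronecker-structured matrix be computed from two much smaller SVDs, which is what makes it useful for factorizing type-constrained relations.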
“…The use of factorization approaches to predict ground atoms was pioneered in [17]; [1], [18], [19], and [20] applied tensor models for this task, where [1] introduced the RESCAL model. This model was the basis of many recently published works: [21] introduced non-negativity constraints, [22] presented a logistic factorization, and [20] explicitly models the 2nd- and 3rd-order interactions. [23] introduced neural tensor networks that include a RESCAL factorization, and [2] used a neural tensor decomposition for defining graph-based priors during construction of the Google Knowledge Vault.…”
Section: Introduction
confidence: 99%
“…The big advantage of the proposed method is that R(*) can now be derived by simply projecting X(*) into the latent space spanned by the RESCAL factor matrix A. This can be done very efficiently: consider the ALS updates derived in [17]. Essentially, what is needed is to calculate the latent matrix for the materialized view, i.e., R(*), as…”
Section: Querying Factorized Probabilistic Databases
confidence: 99%
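The projection described above can be sketched as follows: with the factor matrix A held fixed, the least-squares latent core for a new slice is obtained from the pseudoinverse of A, so no full refactorization is needed (variable names like `X_star` are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n, r = 6, 2
A = rng.standard_normal((n, r))          # fixed RESCAL-style factor matrix
R_true = rng.standard_normal((r, r))
X_star = A @ R_true @ A.T                # a view lying exactly in A's latent space

# Least-squares projection of X_star into the latent space spanned by A.
P = np.linalg.pinv(A)                    # Moore-Penrose pseudoinverse
R_star = P @ X_star @ P.T
```

Because A here has full column rank, P @ A is the identity and the projection recovers the generating core exactly; for noisy slices it returns the least-squares fit instead.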
“…T explicitly, which is both slow and impractical [18], and pattern-based approaches (such as stochastic gradient descent) have not proven effective with a Bernoulli cost function.…”
Section: Cost Functions
confidence: 99%