Proceedings of the 26th International Conference on World Wide Web Companion - WWW '17 Companion 2017
DOI: 10.1145/3041021.3054238
Enhancing Knowledge Graph Embedding with Probabilistic Negative Sampling

Cited by 12 publications (10 citation statements)
References 3 publications
“…if t is a fixed constant. The rate decreases to (N log N) for the choice of t producing (17). It is also implied by (17)…”
Section: Upper Bounds
confidence: 87%
“…Let the number of entities N → ∞ and C, K, U, d_E, d_R, α, γ be absolute constants. If Assumptions 1 and 2 hold and ρ_1 + ρ_2 = o(log N), then asymptotic inequalities (16), (17), and (18) in Theorem 1 hold.…”
Section: Theorem
confidence: 99%
“…Kanojia et al. [46] propose probabilistic negative sampling to address the issue of skewed data that commonly exists in knowledge bases. For relations with less data, uniform or Bernoulli random sampling fails to predict the missing part of golden triples among semantically possible options even after hundreds of epochs of training.…”
Section: Probabilistic Sampling
confidence: 99%
“…Kanojia et al. [44] propose probabilistic negative sampling to address the issue of skewed data that commonly exists in knowledge bases. For relations with less data, uniform or Bernoulli random sampling fails to predict the missing part of golden triplets among semantically possible options even after hundreds of epochs of training.…”
Section: Probabilistic Sampling
confidence: 99%
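The citation statements above describe the core idea of the paper: for sparse relations, corrupting triples purely at random (uniform or Bernoulli sampling) yields uninformative negatives. A minimal sketch of that idea, with all names and the mixing parameter `beta` assumed for illustration (not taken from the paper), is to bias a fraction of negative draws toward a relation-specific pool of plausible candidate entities:

```python
import random

# Hypothetical sketch of probabilistic negative sampling (names assumed):
# with probability `beta`, the corrupting entity is drawn from a
# relation-specific candidate pool (e.g. entities observed with that
# relation); otherwise it falls back to uniform sampling over all
# entities, as in standard uniform negative sampling.
def corrupt_tail(head, relation, tail, all_entities, candidates_by_rel, beta=0.5):
    pool = candidates_by_rel.get(relation)
    if pool and random.random() < beta:
        negative = random.choice(pool)          # biased draw: plausible candidate
    else:
        negative = random.choice(all_entities)  # uniform fallback
    if negative == tail:                        # avoid regenerating the true triple
        negative = random.choice(all_entities)
    return (head, relation, negative)
```

With `beta = 0` this degenerates to plain uniform sampling; raising `beta` concentrates negatives on semantically possible options, which is the failure mode the quoted statements say uniform and Bernoulli sampling cannot address for low-data relations.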