2016 IEEE 16th International Conference on Data Mining (ICDM) 2016
DOI: 10.1109/icdm.2016.0111
Large-Scale Embedding Learning in Heterogeneous Event Data

Cited by 75 publications (74 citation statements) · References 18 publications
“…This observation is in line with previous studies [7], and can be explained by the heterogeneity of node types in HINs. Nodes of different types in HINs have different properties, such as degree distribution.…”

Section: Methods (supporting)

confidence: 93%
“…Chang et al. propose to embed HINs with additional node features via deep architectures [2], which does not suit typical HINs consisting of only typed nodes and edges. Gui et al. devise an HIN embedding algorithm to model a special type of HIN with hyper-edges, which does not apply to general HINs [7]. More recently, an HIN embedding algorithm was proposed that transcribes semantics in HINs via meta-paths [4].…”

Section: Related Work (mentioning)

confidence: 99%
“…The proximity of the interaction between the two words can be modeled as the conditional probability of predicting the observed target word given the context word [7], where the conditional probability is estimated via softmax function, with low-dimensional vectors of words as parameters. This model has also been generalized to network data, such as [8], [11], [23]. …”
Section: Introduction (mentioning)
confidence: 99%
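The excerpt above describes the skip-gram-style formulation: the proximity between a context word and a target word is modeled as the conditional probability of the target given the context, computed by a softmax over inner products of low-dimensional vectors. A minimal sketch of that computation follows; the vocabulary size, embedding dimension, and the tables `U` (target vectors) and `V` (context vectors) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab_size, dim = 5, 3
V = rng.normal(size=(vocab_size, dim))  # context-word vectors (hypothetical)
U = rng.normal(size=(vocab_size, dim))  # target-word vectors (hypothetical)

def conditional_prob(target: int, context: int) -> float:
    """p(target | context) via a softmax over inner products,
    as in skip-gram-style embedding models."""
    scores = U @ V[context]       # score of every candidate target word
    scores -= scores.max()        # subtract max for numerical stability
    exp_scores = np.exp(scores)
    return float(exp_scores[target] / exp_scores.sum())

# For any fixed context word, the probabilities over all targets sum to one.
probs = np.array([conditional_prob(t, context=0) for t in range(vocab_size)])
```

The same softmax parameterization generalizes to network data by replacing word pairs with node pairs, which is the extension the excerpt attributes to [8], [11], [23].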