2018
DOI: 10.48550/arxiv.1802.00910
Preprint
GeniePath: Graph Neural Networks with Adaptive Receptive Paths

Cited by 17 publications (18 citation statements)
References 11 publications
“…The system is designed to uncover fraudsters in the claim stage by classifying accounts or orders as fraudulent or not. We specifically address the problem of fraudster gang detection with the help of several powerful graph learning algorithms including unsupervised Deepwalk [16] and supervised DistRep and GeniePath [17]. The merits, knowledge, and practices we learn from applying graph data are discussed and we show how we apply them on our most popular real-world large-scale e-commerce insurance products.…”
Section: A. Challenges In Insurance Fraud Detection (mentioning, confidence: 99%)
“…Common GNN approaches we use for the fraud detection problem are struct2vec [23] and GeniePath [17]. Struct2vec aggregates neighbors by simply summing them up while GeniePath stacks adaptive path layers for breadth and depth exploration in the graph.…”
Section: Graph Learning Algorithms (mentioning, confidence: 99%)
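The contrast in that statement — summing neighbors versus adaptively weighting them — can be sketched in a few lines. This is a minimal NumPy illustration, not the authors' implementation: GeniePath's breadth function is a GAT-style attention layer with several learned parameter matrices, condensed here into a single hypothetical weight matrix `Wa`.

```python
import numpy as np

def sum_aggregate(h, adj):
    """Struct2vec-style breadth step: each node simply sums its neighbors."""
    return adj @ h

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def adaptive_breadth(h, adj, Wa):
    """GeniePath-style breadth step (sketch): attention weights decide
    which neighbors matter instead of treating them all equally."""
    z = np.tanh(h @ Wa)                       # transformed node features
    scores = z @ z.T                          # pairwise compatibility
    scores = np.where(adj > 0, scores, -1e9)  # mask non-neighbors
    alpha = softmax(scores, axis=1)           # per-node attention over neighbors
    return alpha @ h, alpha

# Toy 4-node graph: nodes 0-3 with a square adjacency pattern.
rng = np.random.default_rng(0)
n, d = 4, 3
h = rng.normal(size=(n, d))
adj = np.array([[0, 1, 1, 0],
                [1, 0, 0, 1],
                [1, 0, 0, 1],
                [0, 1, 1, 0]], dtype=float)
Wa = rng.normal(size=(d, d))

h_sum = sum_aggregate(h, adj)
h_adp, alpha = adaptive_breadth(h, adj, Wa)
```

Both steps produce an updated `(n, d)` feature matrix; the difference is that `alpha` lets each node downweight uninformative neighbors, which is the "adaptive" part of the receptive path.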
“…To avoid storing all states, these methods present improved training strategies, such as subgraph training [27] or stochastically asynchronous training [28]. Furthermore, some complex network architectures utilize gating units to control the selection of node neighborhoods [29], design two graph convolution networks that account for local and global consistency on the graph [30], or adjust a node's receptive field on the graph via hyper-parameters [31].…”
Section: Graph Neural Network (mentioning, confidence: 99%)
“…Earlier works [1,6] use GCNs with constant filter size for the node-based classification task and show the superiority of GCN but do not address the problem of heterogeneity of the graph. In [7], a method is proposed that determines a receptive path for each node rather than a field for performing the convolutions for representation learning. Irrespective of nearest neighbors, the aim is to perform convolutions with selective nodes in the receptive field.…”
Section: Introduction (mentioning, confidence: 99%)
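The depth side of the receptive path — deciding how far signal should propagate, rather than how wide — can be sketched with an LSTM-style gate. This is an assumption-laden simplification of GeniePath's depth function: the real layer maintains a full LSTM cell state, while the hypothetical `depth_gate` below keeps only an input gate and a forget gate to show the filtering idea.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def depth_gate(h_prev, h_msg, Wi, Wf):
    """Depth-exploration sketch: an input gate admits the new hop's
    message while a forget gate controls how much of the accumulated
    state survives, so distant hops are filtered rather than summed in."""
    i = sigmoid(h_msg @ Wi)  # how much of this hop to admit
    f = sigmoid(h_msg @ Wf)  # how much accumulated state to keep
    return f * h_prev + i * np.tanh(h_msg)

rng = np.random.default_rng(1)
d = 3
h_prev = rng.normal(size=(4, d))  # state after earlier hops
h_msg = rng.normal(size=(4, d))   # aggregated message from the next hop
Wi = rng.normal(size=(d, d))
Wf = rng.normal(size=(d, d))
h_next = depth_gate(h_prev, h_msg, Wi, Wf)
```

With zero gate weights both gates sit at 0.5, blending old state and new message evenly; training pushes them apart so each node learns its own effective depth.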