2023
DOI: 10.3390/s23010463
Graph Learning-Based Blockchain Phishing Account Detection with a Heterogeneous Transaction Graph

Abstract: Recently, cybercrimes that exploit the anonymity of blockchain are increasing. They steal blockchain users’ assets, threaten the network’s reliability, and destabilize the blockchain network. Therefore, it is necessary to detect blockchain cybercriminal accounts to protect users’ assets and sustain the blockchain ecosystem. Many studies have been conducted to detect cybercriminal accounts in the blockchain network. They represented blockchain transaction records as homogeneous transaction graphs that have a mu…

Cited by 8 publications (6 citation statements)
References 35 publications (38 reference statements)
“…The second combination alternates GNN layers and Transformer layers in the DG encoder [100], [126], [136]. The third combination is a parallel encoding of the DG by independent transformer and GNN layers, followed by a combination of their encoded hidden states [120], merging the strengths of both layers. Additionally, some dynamic graph models [73], [98], [99], [122], [123], [125], [127] use transformers exclusively as graph encoders, exploiting the self-attention mechanism for node hidden states propagation without relying on traditional GNN architectures.…”
Section: Combination of Transformer with DGNNs
confidence: 99%
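The citation statement above describes a parallel design in which independent Transformer and GNN layers encode the same graph and their hidden states are then combined. As a rough illustration only (not code from any of the cited papers), a minimal NumPy sketch of that parallel encoding might look like this; the toy graph, dimensions, and concatenation-based combination are all assumptions for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

def gnn_layer(A, X, W):
    """One message-passing step: aggregate neighbor features via the
    row-normalized adjacency matrix A (with self-loops), then project."""
    return np.tanh(A @ X @ W)

def self_attention(X, Wq, Wk, Wv):
    """Single-head self-attention over all nodes: global interactions,
    independent of the graph's edge structure."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ V

# Toy graph: 4 nodes, 8-dim node features.
n, d = 4, 8
X = rng.normal(size=(n, d))
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)
A /= A.sum(axis=1, keepdims=True)  # row-normalize

Wg = rng.normal(size=(d, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

h_gnn = gnn_layer(A, X, Wg)            # local structural view
h_att = self_attention(X, Wq, Wk, Wv)  # global attention view
h = np.concatenate([h_gnn, h_att], axis=1)  # combine both encodings
print(h.shape)  # (4, 16)
```

Concatenation is only one way to merge the two hidden states; summation or a learned gate would fit the same parallel pattern.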
“…We have distinguished position encoding into 4 types. The first method is the embedding based on the time-related index of nodes or edges [98], [100], [120], [124], [127]. The second method utilizes (global) graph spectral information [99], such as the Laplacian or its eigenvectors.…”
Section: Positional Encoding (PE) on Dynamic Graphs
confidence: 99%
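The statement above distinguishes positional encodings built from time-related indices from those built from graph spectral information such as the Laplacian's eigenvectors. As a hedged sketch of both ideas (sinusoidal indexing and the eigenvector choice are illustrative assumptions, not the exact constructions of the cited works):

```python
import numpy as np

def time_index_pe(t, d):
    """First PE type: a sinusoidal embedding of a time-related index t."""
    freqs = 1.0 / (10000 ** (np.arange(0, d, 2) / d))
    angles = t * freqs
    pe = np.zeros(d)
    pe[0::2] = np.sin(angles)
    pe[1::2] = np.cos(angles)
    return pe

def laplacian_pe(A, k):
    """Second PE type: the k smallest nontrivial eigenvectors of the
    graph Laplacian L = D - A, one k-dim position vector per node."""
    D = np.diag(A.sum(axis=1))
    vals, vecs = np.linalg.eigh(D - A)  # eigenvalues in ascending order
    return vecs[:, 1:k + 1]  # skip the constant (trivial) eigenvector

# Toy path graph on 4 nodes.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
pe_nodes = laplacian_pe(A, k=2)   # one 2-dim spectral PE per node
pe_t = time_index_pe(t=5, d=8)    # PE for an edge/node with time index 5
print(pe_nodes.shape, pe_t.shape)  # (4, 2) (8,)
```

Either encoding would typically be added to (or concatenated with) the node features before the Transformer or GNN layers process them.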
“…These algorithms fall broadly into three distinct categories, each with its unique approach to achieving consensus within the network. The first category encompasses proof-based consensus algorithms, wherein nodes seeking to participate in the verification process must demonstrate their qualification for appending tasks [9]. The second category revolves around voting-based consensus mechanisms, requiring validators to share their validation results before reaching a final decision on new blocks or transactions.…”
Section: Introduction
confidence: 99%