2021
DOI: 10.1007/s10489-021-02498-w
Enhancing attributed network embedding via enriched attribute representations

Cited by 8 publications (4 citation statements). References: 20 publications.
“…At the core of these algorithms is the decision tree, a hierarchical structure designed to partition a data set with numerous records into smaller subsets through a series of decision rules. Essentially, a decision tree is a tool that, through sequential decision steps, segments substantial data sets into more manageable groups of records [27].…”
Section: Decision Tree
confidence: 99%
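The quoted passage describes the recursive partitioning a decision tree performs. Below is a minimal sketch of that idea using scikit-learn; the synthetic data set and the parameters are illustrative assumptions, not taken from the cited paper.

```python
# A decision tree recursively partitions a data set with many records into
# smaller subsets via a sequence of decision rules (one rule per internal node).
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier, export_text

# Synthetic data set standing in for a "substantial" set of records.
X, y = make_classification(n_samples=1000, n_features=4,
                           n_informative=3, random_state=0)

# Each internal node of the fitted tree is one decision rule that splits
# the records reaching it into two smaller groups.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=[f"x{i}" for i in range(4)]))
```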
“…ACC is the ratio of the correctly predicted true negative (TN) and true positive (TP) counts to the total number of predictions, which also includes the false negative (FN) and false positive (FP) counts. The ACC value is given by Equation (7) [27].…”
Section: Performance Metrics
confidence: 99%
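For reference, the conventional accuracy definition that the quoted passage paraphrases (the citing paper's Equation (7) is not reproduced on this page, so this is the standard formula rather than a verbatim copy):

$$\mathrm{ACC} = \frac{TP + TN}{TP + TN + FP + FN}$$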
“…Machine learning is a revolutionary approach to improving the data analysis and learning capabilities of computers. Thanks to machine learning, solving complex problems has become more accessible and effective [35]. Information about the machine learning methods used in the study is given below.…”
Section: Machine Learning
confidence: 99%
“…Researchers have gradually focused on representing network nodes with low-dimensional, high-density vectors, thereby preserving the structure and feature information of the original network. The learned feature vectors are then used for tasks such as graph-based classification, clustering, and link prediction [9][10][11]. As a research direction of graph neural networks, graph embedding has also attracted attention [12].…”
Section: Introduction
confidence: 99%
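A minimal sketch of the workflow the quoted passage describes: map network nodes to low-dimensional dense vectors, then reuse those vectors for a downstream task such as node classification. The toy graph, the labels, and the simple SVD-based embedding are illustrative assumptions, not the method of the cited paper.

```python
import networkx as nx
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression

G = nx.karate_club_graph()          # small toy network
A = nx.to_numpy_array(G)            # adjacency matrix (structure information)

# Embed each node as a dense 8-dimensional vector.
Z = TruncatedSVD(n_components=8, random_state=0).fit_transform(A)

# Downstream task: node classification using the learned vectors.
y = np.array([G.nodes[i]["club"] == "Officer" for i in G.nodes])
clf = LogisticRegression(max_iter=1000).fit(Z, y)
print("training accuracy:", clf.score(Z, y))
```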