2021
DOI: 10.7717/peerj-cs.526
Fusion of text and graph information for machine learning problems on networks

Abstract: Today, increased attention is drawn towards network representation learning, a technique that maps nodes of a network into vectors of a low-dimensional embedding space. A network embedding constructed this way aims to preserve node similarity and other specific network properties. Embedding vectors can later be used for downstream machine learning problems, such as node classification, link prediction and network visualization. Naturally, some networks have text information associated with them. For instance,…
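To make the pipeline in the abstract concrete, the sketch below builds a simple spectral (Laplacian eigenmap) node embedding and reuses the vectors for node classification. This is an illustrative stand-in for the text-and-graph fusion methods studied in the paper, not the paper's own method; the graph, embedding dimension, and classifier choice are all assumptions.

```python
# Minimal sketch of network representation learning for a downstream task.
# Assumption: a spectral embedding stands in for the fusion models in the paper.
import networkx as nx
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

G = nx.karate_club_graph()                       # small benchmark network
L = nx.normalized_laplacian_matrix(G).toarray()  # symmetric normalized Laplacian

# Eigenvectors with the smallest non-trivial eigenvalues give a
# low-dimensional embedding that preserves node similarity.
eigvals, eigvecs = np.linalg.eigh(L)
dim = 4
X = eigvecs[:, 1:dim + 1]                        # skip the trivial eigenvector

# Downstream task: predict each node's community label from its embedding.
y = np.array([G.nodes[v]["club"] == "Officer" for v in G.nodes], dtype=int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)
print("node classification accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```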

Cited by 25 publications (3 citation statements) · References 22 publications
“…Recent studies show that a combination of deep learning models with semi-supervised techniques gives state-of-the-art results in terms of scalability, speed, and quality in downstream tasks (Makarov et al., 2021; Makarov, Makarov & Kiselev, 2021; Makarov, Korovina & Kiselev, 2021). However, static models are limited by the necessity to retrain the model with each significant change of graph structure.…”
Section: Related Work (mentioning)
Confidence: 99%
“…There are many applications of GNNs to specific domains in recommender systems, such as knowledge-aware [28], [29] or social recommendations [30]–[32]. However, our work concentrates on the classic sequential formulation of recommender systems.…”
Section: Graph Neural Network for Recommender Systems (mentioning)
Confidence: 99%
“…Ko et al. [10] take a different approach to generating hard points: they generate points by linearly interpolating between negative data points during training and picking the hardest point. Other approaches may directly learn a metric-space transformation via graph embeddings [14,15].”
Section: Other Approaches (mentioning)
Confidence: 99%
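The interpolation-based hard-negative idea quoted above can be illustrated with a short sketch. This is a generic reconstruction under stated assumptions (Euclidean distance to an anchor as the "hardness" criterion, i.e. the closest candidate), not the exact procedure of Ko et al. [10]; all names are hypothetical.

```python
# Hedged sketch: synthesize candidate negatives by linearly interpolating
# between pairs of existing negatives, then keep the "hardest" candidate
# (assumed here to be the one closest to the anchor embedding).
import numpy as np

rng = np.random.default_rng(0)
anchor = rng.normal(size=8)           # embedding of the anchor/query point
negatives = rng.normal(size=(16, 8))  # embeddings of negative examples

def hardest_interpolated_negative(anchor, negatives, n_pairs=32):
    idx = rng.integers(len(negatives), size=(n_pairs, 2))
    alphas = rng.uniform(size=(n_pairs, 1))
    # Candidates lie on segments between two randomly sampled negatives.
    candidates = alphas * negatives[idx[:, 0]] + (1 - alphas) * negatives[idx[:, 1]]
    dists = np.linalg.norm(candidates - anchor, axis=1)
    return candidates[np.argmin(dists)]  # hardest = closest to the anchor

hard_neg = hardest_interpolated_negative(anchor, negatives)
print("distance of hardest synthetic negative:", np.linalg.norm(hard_neg - anchor))
```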