2021
DOI: 10.1016/j.eswa.2021.115146
A novel embedding approach to learn word vectors by weighting semantic relations: SemSpace

Cited by 10 publications (2 citation statements)
References 21 publications
“…The idea of deriving meaning representations in terms of vectors from semantic networks has been previously investigated in the computational linguistics literature (e.g. Chakaveh et al., 2018; Grover & Leskovec, 2016; Orhan & Tulu, 2021; Perozzi et al., 2014; Pilehvar & Navigli, 2015; for review, see Grohe, 2020). For instance, Perozzi et al. (2014) obtained vectors for each node in a graph by applying the random walk method.…”
Section: Spreading Activation as a Theory of Semantic Processing (mentioning)
confidence: 99%
“…Tulu et al. [41] proposed an automatic scoring system based on SemSpace [42], a novel embedding approach to learn word vectors by weighting semantic relations, used to feed a Manhattan LSTM network (which is precisely suited to similarity tasks [43]). The proposed method has also been evaluated on the CU-NLP dataset.…”
Section: Related Work (mentioning)
confidence: 99%
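For context on the Manhattan LSTM mentioned in this excerpt, the following is a minimal sketch of the MaLSTM similarity idea: two token sequences pass through one shared LSTM, and similarity is exp(-L1 distance) between the final hidden states. It is not the scoring system of Tulu et al. [41]; the vocabulary size, dimensions, and example inputs are illustrative assumptions, written in PyTorch.

```python
# Hypothetical MaLSTM sketch (illustrative only): a shared LSTM encodes two
# sequences; similarity = exp(-||h_a - h_b||_1), bounded in (0, 1].
import torch
import torch.nn as nn

class MaLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=50, hidden_dim=32):
        super().__init__()
        # In a SemSpace-style setup the embedding would be initialized from
        # pretrained sense/word vectors; here it is random for brevity.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)

    def encode(self, token_ids):
        _, (h_n, _) = self.lstm(self.embedding(token_ids))
        return h_n[-1]  # final hidden state, shape (batch, hidden_dim)

    def forward(self, seq_a, seq_b):
        h_a, h_b = self.encode(seq_a), self.encode(seq_b)
        # Manhattan similarity between the two sentence encodings.
        return torch.exp(-torch.sum(torch.abs(h_a - h_b), dim=1))

# Toy usage with already-indexed token sequences (placeholder data).
model = MaLSTM(vocab_size=100)
a = torch.tensor([[1, 5, 7, 2]])
b = torch.tensor([[1, 5, 9, 2]])
print(model(a, b))  # tensor of similarity scores in (0, 1]
```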