2017
DOI: 10.1016/j.knosys.2017.09.008

Compositional approaches for representing relations between words: A comparative study

Abstract: Identifying the relations that exist between words (or entities) is important for various natural language processing tasks such as relational search, noun-modifier classification and analogy detection. A popular approach to represent the relation between a pair of words is to extract the patterns in which the words co-occur from a corpus, and assign each word-pair a vector of pattern frequencies. Despite the simplicity of this approach, it suffers from data sparseness, information scalability and linguistic…

Cited by 11 publications (4 citation statements)
References 18 publications
“…Vector differences have been found to be the most robust encoding method in the context of word analogies (Hakami and Bollegala, 2017).…”
mentioning
confidence: 99%
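As a concrete illustration of the vector-difference (PairDiff) encoding this statement refers to, here is a minimal sketch. The toy 3-dimensional `emb` dictionary is invented for illustration; a real system would load pre-trained vectors such as word2vec or GloVe:

```python
import numpy as np

# Toy 3-dimensional embeddings, invented for illustration only.
emb = {
    "man":   np.array([0.9, 0.0, 0.1]),
    "woman": np.array([0.8, 0.8, 0.1]),
    "king":  np.array([0.8, 0.1, 0.3]),
    "queen": np.array([0.7, 0.9, 0.3]),
}

def pair_diff(a, b):
    # Encode the relation between words a and b as a vector difference.
    return emb[b] - emb[a]

def cosine(u, v):
    return float(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

# The analogy "man : woman :: king : queen" holds to the degree that
# the two difference vectors point in the same direction.
print(cosine(pair_diff("man", "woman"), pair_diff("king", "queen")))
```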
“…The averaged token representation is taken as the term representation. A query or a candidate is estimated by the sum of the representations of each term pair, which is represented as the embedding vector differences (Hakami and Bollegala, 2017; Ushio et al., 2021). The candidate with the highest cosine similarity to the query is chosen as the answer.…”
Section: Baselines for Analogical QA
mentioning
confidence: 99%
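A rough sketch of the analogical-QA baseline this statement describes, assuming whitespace tokenization and a plain dictionary of word embeddings. The helper names `term_vector`, `pair_representation`, and `answer` are hypothetical, not taken from the cited work:

```python
import numpy as np

def term_vector(term, emb, dim):
    # Average the token embeddings of a (possibly multi-word) term;
    # unknown tokens fall back to the zero vector.
    return np.mean([emb.get(t, np.zeros(dim)) for t in term.split()], axis=0)

def pair_representation(pairs, emb, dim):
    # A query or candidate is the sum of per-pair vector differences.
    return sum(term_vector(b, emb, dim) - term_vector(a, emb, dim)
               for a, b in pairs)

def cosine(u, v):
    return float(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)

def answer(query_pairs, candidates, emb, dim):
    # Choose the candidate whose summed-difference representation is
    # most cosine-similar to the query's representation.
    q = pair_representation(query_pairs, emb, dim)
    return max(candidates,
               key=lambda cand: cosine(q, pair_representation(cand, emb, dim)))
```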
“…Bollegala et al. (2015a) showed that PairDiff can be used as a proxy for learning better word embeddings, and Vylomova et al. (2016) conducted an extensive empirical comparison of PairDiff using a dataset containing 16 different relation types. Besides PairDiff, concatenation (Hakami and Bollegala, 2017; Yin and Schütze, 2016), circular correlation and convolution (Nickel et al., 2016) have been used in prior work for representing the relations between words. Because the relation embedding is composed using word embeddings instead of being learned as a separate parameter, we refer to methods that are based on this approach as compositional relational embedding methods.…”
Section: Introduction
mentioning
confidence: 99%
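The compositional operators this statement names can each be written as a parameter-free function of the two word embeddings. A minimal sketch assuming equal-length NumPy vectors; the circular correlation and convolution use the standard Fourier-domain formulation:

```python
import numpy as np

def pair_diff(a, b):
    # PairDiff: the relation as the offset between the two embeddings.
    return b - a

def concatenation(a, b):
    # Concatenation: stack both vectors (doubles the dimensionality).
    return np.concatenate([a, b])

def circular_convolution(a, b):
    # Circular convolution, computed efficiently via the FFT.
    return np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)).real

def circular_correlation(a, b):
    # Circular correlation: convolution with one input conjugated
    # in the Fourier domain.
    return np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)).real
```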