Proceedings of the Nineteenth Conference on Computational Natural Language Learning 2015
DOI: 10.18653/v1/k15-1027
Task-Oriented Learning of Word Embeddings for Semantic Relation Classification

Abstract: We present a novel learning method for word embeddings designed for relation classification. Our word embeddings are trained by predicting words between noun pairs using lexical relation-specific features on a large unlabeled corpus. This allows us to explicitly incorporate relation-specific information into the word embeddings. The learned word embeddings are then used to construct feature vectors for a relation classification model. On a well-established semantic relation classification task, our method significantly…
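The abstract only sketches the training setup, so here is a minimal, hypothetical illustration of the core idea: learn word vectors by predicting the words that appear between a noun pair, with the pair itself serving as the (simplified) relation-specific feature. This is a toy sketch under assumptions, not the authors' code; the vocabulary, the shared embedding table, and `train_step` are all invented for illustration.

```python
# Sketch (assumed, not the paper's implementation) of training embeddings
# by predicting in-between words from a noun pair.
import numpy as np

rng = np.random.default_rng(0)

VOCAB = ["company", "acquired", "bought", "of", "causes", "in", "disease"]
word_to_id = {w: i for i, w in enumerate(VOCAB)}
DIM = 50

# One vector per vocabulary word. For simplicity, the noun-pair features
# and the predicted in-between words share this single table.
E = rng.normal(scale=0.1, size=(len(VOCAB), DIM))
W_out = rng.normal(scale=0.1, size=(len(VOCAB), DIM))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def train_step(head, tail, between, lr=0.05):
    """One SGD step: predict a word appearing between the noun pair."""
    h = E[word_to_id[head]] + E[word_to_id[tail]]  # pair feature vector
    probs = softmax(W_out @ h)                     # P(word | noun pair)
    grad = probs.copy()
    grad[word_to_id[between]] -= 1.0               # dL/dlogits (cross-entropy)
    grad_h = W_out.T @ grad                        # backprop into the pair
    W_out[...] -= lr * np.outer(grad, h)
    E[word_to_id[head]] -= lr * grad_h
    E[word_to_id[tail]] -= lr * grad_h

# Toy example: from "company acquired company", predict "acquired"
# given the noun pair (company, company).
train_step("company", "company", "acquired")
```

Because the prediction target is the material between the two nouns (e.g. "acquired", "causes"), the gradient pushes relation-bearing information into the noun vectors, which is what makes the resulting embeddings useful as features for a downstream relation classifier.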

Cited by 43 publications (45 citation statements)
References 20 publications (33 reference statements)
“…Many researchers in the general and medical domains have focused on feature-based and kernel-based methods [21, 24–28], CNNs with softmax classification [4], the factor-based compositional embedding model (FCM) [29], and word embedding-based models [30]. Many RNN- and CNN-based variants exist.…”
Section: Related Work
confidence: 99%
“…The only exception is the DepNN model, which obtains a better result than FCM on the same embeddings. The task-specific embeddings from Hashimoto et al. (2015) lead to the best performance (an improvement of 0.7%). In the task-specific setting, FCM represents entity words and context words with separate sets of embeddings.…”
Section: Effects of the Word Embeddings
confidence: 99%
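The snippet above notes that FCM, in its task-specific setting, looks up entity words and context words in separate embedding tables. As a concrete illustration (assumed, not FCM's actual code; the sentence, vocabulary, and `embed` helper are invented), the switch amounts to choosing a table per token based on its role:

```python
# Sketch of role-dependent embedding lookup: entity tokens use one table,
# all other tokens use another.
import numpy as np

rng = np.random.default_rng(1)
VOCAB = {"the": 0, "virus": 1, "caused": 2, "infection": 3}
DIM = 8

entity_emb = rng.normal(size=(len(VOCAB), DIM))   # table for entity words
context_emb = rng.normal(size=(len(VOCAB), DIM))  # table for context words

def embed(sentence, entity_positions):
    """Return per-token vectors, switching tables by token role."""
    return np.stack([
        entity_emb[VOCAB[w]] if i in entity_positions else context_emb[VOCAB[w]]
        for i, w in enumerate(sentence)
    ])

vecs = embed(["the", "virus", "caused", "infection"], entity_positions={1, 3})
print(vecs.shape)  # (4, 8)
```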
“…This observation suggests that the other compositional models may also benefit from the work of Hashimoto et al. (2015).…”
Section: Embeddings
confidence: 99%
“…One possible strategy is to change how the embedding is learned in the first place. For example, some approaches learn word embeddings that are better suited to capturing sentiment (Tang et al., 2016), or learn embeddings that are optimized for relation extraction (Hashimoto et al., 2015). Other approaches, however, start with a pre-trained embedding, which is then modified in a particular way.…”
Section: Related Work
confidence: 99%
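The last snippet contrasts learning task-specific embeddings from scratch with modifying pre-trained ones. As a toy illustration of the second strategy (a retrofitting-style update; the vectors, the `neighbours` graph, and the 0.5/0.5 weighting are all assumptions, not any cited paper's method), each word can be pulled toward its neighbours in a lexical graph while staying near its pre-trained position:

```python
# Sketch of modifying pre-trained embeddings with external relations.
import numpy as np

pretrained = {
    "good":  np.array([1.0, 0.2]),
    "great": np.array([0.8, 0.4]),
    "bad":   np.array([-0.9, 0.1]),
}
# Toy relation graph: synonyms should end up close together.
neighbours = {"good": ["great"], "great": ["good"], "bad": []}

retrofitted = {w: v.copy() for w, v in pretrained.items()}
for _ in range(10):  # a few fixed-point iterations
    for w, nbrs in neighbours.items():
        if not nbrs:
            continue  # words without neighbours keep their vectors
        nbr_mean = np.mean([retrofitted[n] for n in nbrs], axis=0)
        # Balance fidelity to the pre-trained vector against
        # agreement with the neighbour average.
        retrofitted[w] = 0.5 * pretrained[w] + 0.5 * nbr_mean

print(retrofitted["good"], retrofitted["great"])  # pulled together
```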