Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL) 2019
DOI: 10.18653/v1/k19-1085

Learning Analogy-Preserving Sentence Embeddings for Answer Selection

Abstract: Answer selection aims at identifying the correct answer for a given question from a set of potentially correct answers. Contrary to previous works, which typically focus on the semantic similarity between a question and its answer, our hypothesis is that question-answer pairs are often in analogical relation to each other. Using analogical inference as our use case, we propose a framework and a neural network architecture for learning dedicated sentence embeddings that preserve analogical properties in the sem…
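
The abstract's hypothesis, that a question and its correct answer stand in the same relation as another question-answer pair, can be expressed as a constraint on embedding offsets. Below is a minimal sketch of one way such a constraint might be enforced, assuming sentence embeddings are already available as NumPy vectors; the hinge formulation, the margin value, and the name analogy_loss are illustrative assumptions, not the paper's actual objective.

```python
import numpy as np

def cosine(u, v):
    # Cosine similarity with a small epsilon to avoid division by zero.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8))

def analogy_loss(q1, a1, q2, a2_pos, a2_neg, margin=0.2):
    """Hinge loss on the analogy q1 : a1 :: q2 : a2.

    The offset (q1 - a1) should be closer, in cosine similarity, to the
    offset toward the correct answer (q2 - a2_pos) than to the offset
    toward an incorrect candidate (q2 - a2_neg), by at least `margin`.
    """
    ref = q1 - a1
    pos = cosine(ref, q2 - a2_pos)
    neg = cosine(ref, q2 - a2_neg)
    return max(0.0, margin - pos + neg)
```

Minimizing such a quantity over many quadruples pushes question-answer offsets toward a shared direction, which is one reading of "analogy-preserving".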

Cited by 8 publications (8 citation statements)
References 23 publications
“…Vector cosine similarity in an unsupervised word embedding enabled the prediction of applications for materials years before their publication in the materials science literature ( 49 ). Several supervised analogy learning methods based on word embeddings have been successfully applied in a variety of natural language processing tasks ( 22 , 50 , 51 ). Our algorithm uses this approach to leverage information about cancer and kinases latent in the published literature.…”
Section: Discussion (mentioning)
confidence: 99%
“…Vector cosine similarity was taken recently to enable prediction of materials for applications years before their publication in the materials science literature by unsupervised word embedding (39). Several supervised analogy learning methods based on word embeddings have been successfully applied in a variety of natural language processing tasks (13,40,41). Our algorithm uses this approach to leverage information about cancer and kinases latent in the published literature.…”
Section: Discussion (mentioning)
confidence: 99%
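
The cosine-similarity analogy queries referenced in these statements are typically run directly over unit-normalized word vectors. A minimal sketch of such a query (the classic "a is to b as c is to ?" form, often called 3CosAdd) is given below; the dictionary vocab_vectors mapping words to NumPy arrays is a hypothetical stand-in for any pretrained embedding table, and this is not the specific algorithm of the cited works.

```python
import numpy as np

def solve_analogy(vocab_vectors, a, b, c):
    """Answer 'a is to b as c is to ?' by ranking every word w by
    cos(w, b - a + c) over a dict of word -> embedding vector."""
    target = vocab_vectors[b] - vocab_vectors[a] + vocab_vectors[c]
    target = target / (np.linalg.norm(target) + 1e-8)
    best_word, best_score = None, float("-inf")
    for word, vec in vocab_vectors.items():
        if word in (a, b, c):  # exclude the query words themselves
            continue
        score = float(vec @ target) / (np.linalg.norm(vec) + 1e-8)
        if score > best_score:
            best_word, best_score = word, score
    return best_word, best_score
```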
“…In a more recent line of work, analogical embedding is used for the problem of answer selection in query-answering systems, namely finding the correct answer for a given question from a set of candidate (potentially correct) answers [9]. Analogy quadruples are defined in the form of q_p : a_p :: q_i : a_ij, where q_p and a_p denote a question and its correct answer (so-called "prototypes"), and q_i, a_ij are the i-th question and its j-th candidate answer.…”
Section: Related Work (mentioning)
confidence: 99%
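
Read at inference time, the quadruple q_p : a_p :: q_i : a_ij suggests ranking the candidates a_ij by how well each one completes the analogy started by a known prototype pair (q_p, a_p). The sketch below illustrates that reading, assuming all sentences are already embedded as NumPy vectors; the offset-cosine scoring rule is an assumption for illustration, not necessarily the scoring used in [9].

```python
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8))

def select_answer(q_p, a_p, q_i, candidate_answers):
    """Pick the candidate a_ij whose offset (q_i - a_ij) is most similar
    to the prototype offset (q_p - a_p), i.e. the candidate that best
    completes the analogy q_p : a_p :: q_i : a_ij."""
    ref = q_p - a_p
    scores = [cosine(ref, q_i - a_ij) for a_ij in candidate_answers]
    return int(np.argmax(scores)), scores
```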