2019
DOI: 10.1007/978-3-030-33246-4_41
Manhattan Siamese LSTM for Question Retrieval in Community Question Answering

Abstract: Community Question Answering (cQA) platforms allow users to post their questions, expecting other users to provide them with answers. We focus on the task of question retrieval in cQA, which aims to retrieve previous questions that are similar to new queries. The past answers related to those similar questions can therefore be used to respond to the new queries. The major challenges in this task are the shortness of the questions and the word mismatch problem, as users can formulate the same query using different wording.

Cited by 14 publications (7 citation statements)
References 18 publications
“…In order to improve the QR task, we propose an Attentive Siamese LSTM approach for question retrieval, referred to as ASLSTM, to retrieve semantically similar questions in cQA [21]. As illustrated in Figure 1, our approach is composed of three main modules, namely question preprocessing, word embedding learning, and Manhattan LSTM (MaLSTM)…”
Section: Description of the Proposed ASLSTM Approach (mentioning, confidence: 99%)
“…One of the most important works in this category is the MaLSTM [12] method, which measures the similarity of two sentences using the Manhattan distance at the output layer of a Siamese LSTM network architecture. Similarly, Othman et al. [29] conducted a study based on the principle of returning, from the community questions and answers on the web, the semantically closest question and its relevant answer to the question asked by a user, for both English and Arabic, using a Siamese LSTM and the Manhattan vector distance. In benchmark tests performed on the Yahoo! Answers dataset, successful results are obtained…”
Section: Related Work (mentioning, confidence: 99%)
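The MaLSTM similarity described in this statement compares the two final hidden states with the Manhattan (L1) distance and maps the result into (0, 1] via exp(-d). A minimal sketch in Python/NumPy, assuming the two questions have already been encoded into fixed-size hidden-state vectors (the dimensions and values below are placeholders, not taken from the paper):

```python
import numpy as np

def malstm_similarity(h_left: np.ndarray, h_right: np.ndarray) -> float:
    """MaLSTM-style similarity: exp of the negative Manhattan (L1) distance
    between the final LSTM hidden states of the two questions.
    Returns a value in (0, 1]; identical representations give 1."""
    return float(np.exp(-np.sum(np.abs(h_left - h_right))))

# Placeholder 4-dimensional hidden states for two similar questions.
h_q1 = np.array([0.20, -0.10, 0.50, 0.30])
h_q2 = np.array([0.25, -0.05, 0.45, 0.35])
print(malstm_similarity(h_q1, h_q2))  # high score for close representations
```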
“…In (Othman et al., 2019), the authors use word embeddings to capture semantic and syntactic information from textual contexts and to vectorize the questions. The embedding vectors feed a Siamese LSTM neural network, and the similarity between the questions is obtained from the Manhattan distance between the final LSTM hidden states…”
Section: State of the Art (mentioning, confidence: 99%)
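Putting that description together, a Siamese network shares one embedding layer and one LSTM between both questions and scores the pair with the Manhattan-distance similarity. The PyTorch sketch below is only an illustration under assumed settings: the vocabulary size, embedding and hidden dimensions are placeholders, and the embedding weights are random rather than the pretrained word embeddings used in the paper.

```python
import torch
import torch.nn as nn

class SiameseLSTM(nn.Module):
    """Minimal sketch: both questions pass through the same embedding layer
    and LSTM (shared weights); the pair is scored with exp(-L1 distance)
    between the final hidden states."""

    def __init__(self, vocab_size=10000, embed_dim=300, hidden_dim=50):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)

    def encode(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        embedded = self.embedding(token_ids)
        _, (h_n, _) = self.lstm(embedded)   # h_n: (1, batch, hidden_dim)
        return h_n.squeeze(0)               # (batch, hidden_dim)

    def forward(self, q1_ids, q2_ids):
        h1, h2 = self.encode(q1_ids), self.encode(q2_ids)
        manhattan = torch.sum(torch.abs(h1 - h2), dim=1)
        return torch.exp(-manhattan)        # similarity in (0, 1]

# Usage with random token ids standing in for two preprocessed questions.
model = SiameseLSTM()
q1 = torch.randint(1, 10000, (2, 12))
q2 = torch.randint(1, 10000, (2, 12))
print(model(q1, q2))  # tensor of 2 similarity scores
```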