2019
DOI: 10.1609/aaai.v33i01.33016318

Multi-Task Learning with Multi-View Attention for Answer Selection and Knowledge Base Question Answering

Abstract: Answer selection and knowledge base question answering (KBQA) are two important tasks of question answering (QA) systems. Existing methods solve these two tasks separately, which requires a large amount of repetitive work and neglects the rich correlation information between the tasks. In this paper, we tackle answer selection and KBQA simultaneously via multi-task learning (MTL), motivated by the following observations. First, both answer selection and KBQA can be regarded as a ranking problem, with one at tex…

Cited by 53 publications (51 citation statements)
References 7 publications (15 reference statements)
“…For example, in some QA cases, the answer type may play a key role in determining the final answer, so the corresponding weight e_t will be larger than the other weights. The final score of each answer is the sum of the scores from the different answer aspects in the question-towards-answer attention.…”
Section: Question-towards-answer Attention
confidence: 98%
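The excerpt above describes the aggregation step of question-towards-answer attention: each answer aspect (e.g. entity, type, relation) receives a weight e_t, and the final answer score is the attention-weighted sum of the per-aspect scores. Below is a minimal NumPy sketch of that aggregation; the function name question_towards_answer_score and the dot-product scoring of aspect vectors against the question vector are illustrative assumptions, not the cited paper's implementation.

import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D array
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def question_towards_answer_score(question_vec, aspect_vecs, aspect_scores):
    # question_vec:  (d,)   question representation
    # aspect_vecs:   (k, d) one vector per answer aspect (e.g. entity, type, relation)
    # aspect_scores: (k,)   matching score of the candidate answer under each aspect
    e = aspect_vecs @ question_vec         # e_t: how strongly the question attends to each aspect
    weights = softmax(e)                   # aspects the question cares about get larger weights
    return float(weights @ aspect_scores)  # final score: weighted sum of per-aspect scores

# toy usage: rows of aspect_vecs stand for [entity, type, relation]
rng = np.random.default_rng(0)
question_vec = rng.normal(size=8)
aspect_vecs = rng.normal(size=(3, 8))
aspect_scores = np.array([0.2, 0.9, 0.1])
print(question_towards_answer_score(question_vec, aspect_vecs, aspect_scores))

For a question that asks about the answer type, the type-aspect weight should dominate, so the type-aspect score contributes most to the final sum, which is the behavior the excerpt describes.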
“…Recently, some combined or fused methods have been proposed, and at the same time, some domain features have been investigated in these models. Deng et al [8] utilized a multi-task learning framework to solve answer selection and relation detection simultaneously. Qu et al [22] proposed a similarity matrix-based CNN model to improve the detection performance.…”
Section: Related Work
confidence: 99%
“…Co-Attention: We apply the co-attention mechanism (Deng et al 2018) to calculate interactive attention between each QA pair, which enables QA pairs to be aware of their potential semantic relationships. Specifically, we first compute the attention matrix M_c:…”
Section: Attention Layer
confidence: 99%
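The excerpt above computes an interaction matrix M_c between a question and a candidate answer before attending over both sides. Below is a minimal NumPy sketch of one common way to do this, using a bilinear form M_c = tanh(Q U A^T) followed by max-pooling and a softmax on each side; the bilinear parameterization and the pooling choice are assumptions for illustration, not necessarily the exact form used in the cited work.

import numpy as np

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def co_attention(Q, A, U):
    # Q: (m, d) question token representations
    # A: (n, d) answer token representations
    # U: (d, d) bilinear interaction matrix (learned in a real model)
    M_c = np.tanh(Q @ U @ A.T)             # (m, n) interaction matrix for the QA pair
    q_weights = softmax(M_c.max(axis=1))   # attention over question tokens
    a_weights = softmax(M_c.max(axis=0))   # attention over answer tokens
    q_vec = q_weights @ Q                  # (d,) answer-aware question summary
    a_vec = a_weights @ A                  # (d,) question-aware answer summary
    return q_vec, a_vec

# toy usage
rng = np.random.default_rng(0)
d = 16
Q, A, U = rng.normal(size=(7, d)), rng.normal(size=(5, d)), rng.normal(size=(d, d))
q_vec, a_vec = co_attention(Q, A, U)
print(q_vec.shape, a_vec.shape)            # (16,) (16,)

Each side's summary vector is conditioned on the other side through M_c, which is what lets the QA pair "be aware of" its counterpart as the excerpt puts it.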
“…Another, equally widespread point of view, underlines the use of SSNSs as sources of scientific knowledge [21; 40; 41; 42]. Researchers passively consult documents uploaded by their colleagues [19] or actively seek the knowledge they need through the use of features like "question and answers" [40]. This attitude is often found in junior researchers or graduate students, but it is also common among more experienced users [43].…”
Section: Scientific Social Networking Sites Knowledge Sharing and Kn…
confidence: 99%