Proceedings of the 55th Annual Meeting of the Association For Computational Linguistics (Volume 1: Long Papers) 2017
DOI: 10.18653/v1/p17-1053
Improved Neural Relation Detection for Knowledge Base Question Answering

Abstract: Relation detection is a core component of many NLP applications, including Knowledge Base Question Answering (KBQA). In this paper, we propose a hierarchical recurrent neural network enhanced by residual learning which detects KB relations given an input question. Our method uses deep residual bidirectional LSTMs to compare questions and relation names via different levels of abstraction. Additionally, we propose a simple KBQA system that integrates entity linking and our proposed relation detector to make the …
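The matching idea in the abstract can be sketched with a toy scorer: represent a relation at two levels of abstraction (its words, and the whole relation name as one symbol), build a question representation with a residual combination of layers, and rank candidate relations by cosine similarity. Everything below is an illustrative stand-in, not the paper's model: the real encoder is a deep residual bidirectional LSTM, whereas this sketch uses mean-pooled random embeddings and a `tanh` "second layer".

```python
import numpy as np

rng = np.random.default_rng(0)
EMB_DIM = 8
_vocab = {}

def embed(token):
    # Deterministic per-token vector, standing in for learned embeddings.
    if token not in _vocab:
        _vocab[token] = rng.standard_normal(EMB_DIM)
    return _vocab[token]

def encode(tokens):
    # Stand-in for a BiLSTM encoder layer: mean-pool token embeddings.
    return np.mean([embed(t) for t in tokens], axis=0)

def relation_repr(relation):
    # Two levels of abstraction: the relation's words, and the whole
    # relation name as a single symbol; max-pool across both.
    word_level = relation.replace(".", " ").replace("_", " ").split()
    return np.maximum(encode(word_level), encode([relation]))

def question_repr(tokens):
    # Residual idea: sum a lower-level and a (toy) higher-level layer,
    # so the deeper layer only has to learn a refinement.
    shallow = encode(tokens)
    deep = np.tanh(shallow)  # placeholder for a second BiLSTM layer
    return shallow + deep

def score(question_tokens, relation):
    # Cosine similarity between question and relation representations.
    q, r = question_repr(question_tokens), relation_repr(relation)
    return float(q @ r / (np.linalg.norm(q) * np.linalg.norm(r) + 1e-9))
```

A KBQA pipeline would then rank the candidate relations attached to a linked entity by this score and keep the best one.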

Cited by 273 publications
(264 citation statements)
References 26 publications
“…Yu, M., et al. [25] proposed a hierarchical RNN enhanced with residual learning in 2017 to improve performance; given an input question, it detects the corresponding relation in the knowledge base.…”
Section: Problem Formulation and Related Work
confidence: 99%
“…LSTM [6] 70.9 | GRU [11] 71.2 | BuboQA [13] 74.9 | BiGRU [4] 75.7 | Attn. CNN [23] 76.4 | HR-BiLSTM [24] 77.0 | BiLSTM-CRF [16] … all but one of the existing approaches. We suspect that the final score can be further improved by finding better rules for logical form selection; however, that is not the goal of this study.…”
Section: Approach Accuracy
confidence: 99%
“…[11] explore building question representations on both word- and character-level. [24] explore relation detection in depth and propose a hierarchical word-level and symbol-level residual representation. Both [4] and [11] improve upon them by incorporating structural information such as entity type for entity linking.…”
Section: Related Work
confidence: 99%
“…Adapting the Relation Representation The relation network proposed in Yu et al. (2017) has two parts for relation representations: one at word-level and the other at relation-level. The two parts are fed into the relation network to generate the final relation representation.…”
Section: Relation Detection with the Adapter
confidence: 99%
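One minimal reading of the two-part relation representation quoted above is to combine a word-level vector and a relation-level vector into a single final vector before scoring. The sketch below uses plain concatenation as a hypothetical combiner; in the cited work the two parts are fed through a learned relation network, not a fixed operation.

```python
import numpy as np

def final_relation_repr(word_level_vec, relation_level_vec):
    # Hypothetical combiner: concatenate the word-level and the
    # relation-level vectors into one final relation representation.
    # The actual model would pass both parts through a learned network.
    return np.concatenate([word_level_vec, relation_level_vec])
```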