Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers) 2014
DOI: 10.3115/v1/p14-2105
Semantic Parsing for Single-Relation Question Answering

Abstract: We develop a semantic parsing framework based on semantic similarity for open domain question answering (QA). We focus on single-relation questions and decompose each question into an entity mention and a relation pattern. Using convolutional neural network models, we measure the similarity of entity mentions with entities in the knowledge base (KB) and the similarity of relation patterns and relations in the KB. We score relational triples in the KB using these measures and select the top scoring relational t…
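The scoring scheme the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's implementation: the paper uses convolutional neural network models for both similarity measures, whereas here a toy letter-trigram cosine similarity stands in for them, and the KB, question decomposition, and all names are illustrative assumptions.

```python
# Sketch of single-relation QA by triple scoring (illustrative only).
# The paper's CNN-based similarity models are replaced here by a toy
# letter-trigram cosine similarity; the KB and inputs are made up.
from collections import Counter
import math

def trigrams(s):
    # Letter trigrams over the word-boundary-marked string, e.g.
    # "obama" -> "#obama#" -> {"#ob", "oba", "bam", "ama", "ma#"}.
    s = "#" + s.lower().replace(" ", "#") + "#"
    return Counter(s[i:i + 3] for i in range(len(s) - 2))

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def score_triple(mention, pattern, subject, relation):
    # Combine entity-side and relation-side similarity; the paper
    # combines its two CNN similarity scores analogously.
    return (cosine(trigrams(mention), trigrams(subject)) *
            cosine(trigrams(pattern), trigrams(relation)))

# Toy KB of (subject, relation, object) triples (hypothetical data).
kb = [
    ("barack obama", "place of birth", "honolulu"),
    ("barack obama", "spouse", "michelle obama"),
    ("abraham lincoln", "place of birth", "hodgenville"),
]

# A question like "Where was Obama born?" decomposed (by assumption)
# into an entity mention and a relation pattern.
mention, pattern = "obama", "born place"
best = max(kb, key=lambda t: score_triple(mention, pattern, t[0], t[1]))
print(best[2])  # prints: honolulu
```

The top-scoring triple's object is returned as the answer; in the toy run only the first triple scores above zero, since the other two fail on the relation side or the entity side respectively.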

Cited by 329 publications (125 citation statements) · References 9 publications
“…Following this principle, a variety of neural net architectures and training approaches have been successfully applied [11], [13], [20], [22], [23], [39], [49], [58], [59], [60], [61], [65], [66]. Particularly, RNNs [22], [23], [49] are also widely used in NLP.…”
Section: Deep Learning Review
confidence: 99%
“…Those expression problems make KBQA a further tough task. Previous works (Yih, He, and Meek 2014; Yih et al. 2015) adopt letter-trigrams for the diverse expressions, which is similar to the character level of Chinese. And the lattices are combinations of words and characters, so with lattices, we can utilize word information at the same time.…”
Section: Introduction
confidence: 99%
“…Recent work in [17], [18] showed that a deep neural network (DNN) can be used to directly encode KGs for semantic parsing. By leveraging our prior work, this paper extends these KG methods in several ways.…”
Section: Introduction
confidence: 99%
“…By leveraging our prior work, this paper extends these KG methods in several ways. While the approach in [17] is single-relation Question Answering, our approach is large-scale multi-concept (entity, relation, fact) open domain semantic parsing. Our approach is web-scale, learning neural embeddings for all the concepts of Wikipedia in the open source Freebase KG [7].…”
Section: Introduction
confidence: 99%