Proceedings of the 6th Workshop on Representation Learning for NLP (RepL4NLP-2021) 2021
DOI: 10.18653/v1/2021.repl4nlp-1.24

Knowledge Informed Semantic Parsing for Conversational Question Answering

Abstract: Smart assistants are tasked with answering various questions about world knowledge. These questions range from the retrieval of simple facts to complex, multi-hop questions involving various operators (e.g., filter, argmax). Semantic parsing has emerged as the state of the art for answering these kinds of questions by forming queries that extract information from knowledge bases (KBs). Specifically, neural semantic parsers (NSPs) effectively translate natural questions into logical forms, which are executed on KBs…
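To make the abstract's pipeline concrete, here is a minimal, hypothetical sketch of what executing a parsed logical form against a KB could look like. The toy KB, the operator names (filter, argmax), and the example question are illustrative assumptions, not the paper's actual formalism.

```python
# Toy KB: (subject, relation, object) triples.
KB = [
    ("France", "capital", "Paris"),
    ("France", "population", 67_000_000),
    ("Germany", "capital", "Berlin"),
    ("Germany", "population", 83_000_000),
    ("Spain", "capital", "Madrid"),
    ("Spain", "population", 47_000_000),
]

def query(relation):
    """One KB hop: return {subject: object} pairs for a relation."""
    return {s: o for s, r, o in KB if r == relation}

def filter_op(entities, relation, value):
    """Keep only entities whose `relation` equals `value`."""
    facts = query(relation)
    return {e for e in entities if facts.get(e) == value}

def argmax_op(entities, relation):
    """Return the entity with the largest value for `relation`."""
    facts = query(relation)
    return max(entities, key=lambda e: facts[e])

# "Which of these countries has the largest population, and what is
# its capital?" -> a two-hop logical form ending in an argmax operator.
countries = {"France", "Germany", "Spain"}
largest = argmax_op(countries, "population")  # hop 1 + argmax
answer = query("capital")[largest]            # hop 2
print(answer)  # Berlin
```

In an NSP, the model would generate the sequence of operators and arguments above from the question text; only the execution step is shown here.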


Cited by 3 publications (2 citation statements). References 19 publications (10 reference statements).
“…This work was further extended in , which includes a graph attention network to exploit correlations between entities and predicates. Another related contemporaneous work is Thirukovalluru et al. (2021), where the decoder is informed with entity embeddings coming from KG random walks.…”
Section: Related Work
Confidence: 99%
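As a rough illustration of the idea mentioned in this snippet, the sketch below derives entity embeddings from KG random walks in the DeepWalk/node2vec style: sample walks over the graph, then train skip-gram embeddings on them. The toy graph, the gensim dependency, and all hyperparameters are assumptions for illustration; the actual scheme in Thirukovalluru et al. (2021) may differ.

```python
import random
from gensim.models import Word2Vec  # pip install gensim

# Toy KG as an adjacency list over entity nodes.
GRAPH = {
    "Paris": ["France", "Seine"],
    "France": ["Paris", "Europe"],
    "Seine": ["Paris"],
    "Europe": ["France"],
}

def random_walk(start, length, rng):
    """One uniform random walk of `length` nodes starting at `start`."""
    walk = [start]
    for _ in range(length - 1):
        walk.append(rng.choice(GRAPH[walk[-1]]))
    return walk

rng = random.Random(0)
walks = [random_walk(node, 10, rng) for node in GRAPH for _ in range(50)]

# Treat the walks as "sentences" and train skip-gram embeddings; the
# resulting vectors could then be fed to a decoder as entity features.
model = Word2Vec(walks, vector_size=32, window=3, min_count=1, sg=1, epochs=5)
print(model.wv["Paris"][:5])  # first few dimensions of one entity embedding
```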
“…Much like how knowledge acquisition in cognitive development progresses from recognizing concrete objects to gradually understanding their relations to one another (Lucariello et al., 1992), we aim to extend language models' existing rough understanding of entities (Heinzerling and Inui, 2021) to the types that govern how entities are related. Instilling type knowledge in multi-purpose models can improve performance in tasks like entity linking (Onoe and Durrett, 2020), question answering (Févry et al., 2020a), and semantic parsing (Thirukovalluru et al., 2021).…”
Section: Introduction
Confidence: 99%