Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1535

A Split-and-Recombine Approach for Follow-up Query Analysis

Abstract: Context-dependent semantic parsing has proven to be an important yet challenging task. To leverage the advances in context-independent semantic parsing, we propose to perform follow-up query analysis, aiming to restate context-dependent natural language queries with contextual information. To accomplish the task, we propose STAR, a novel approach with a well-designed two-phase process. It is parser-independent and able to handle multifarious follow-up scenarios in different domains. Experiments on the FollowUp dataset …
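To make the two-phase idea concrete, below is a minimal, hypothetical Python sketch of the restatement step: the precedent query and the follow-up query are treated as sequences of spans, and the recombine phase substitutes the precedent spans that conflict with spans from the follow-up query. The function name, span segmentation, and conflict mapping are illustrative assumptions, not the authors' implementation.

    # Hypothetical sketch of the split-and-recombine idea described in the abstract.
    def restate(precedent_spans, followup_spans, conflicts):
        """Recombine phase: build a standalone query by substituting the
        precedent spans that conflict with follow-up spans.

        precedent_spans: spans of the preceding (context) query
        followup_spans:  spans of the follow-up query
        conflicts:       mapping precedent span index -> follow-up span index
        """
        restated = []
        for i, span in enumerate(precedent_spans):
            if i in conflicts:
                restated.append(followup_spans[conflicts[i]])  # substitute conflicting span
            else:
                restated.append(span)                          # keep contextual span
        return " ".join(restated)

    # Toy example: "show sales in 2018" followed by "how about 2019"
    precedent = ["show sales", "in", "2018"]
    followup = ["how about", "2019"]
    print(restate(precedent, followup, {2: 1}))  # -> "show sales in 2019"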

Cited by 6 publications (3 citation statements)
References 30 publications
“…Each cell belongs to one of three edit types: None, Substitute and Insert. Liu et al (2019a) imposed an intermediate structure span and decomposed the incomplete utterance rewriting into two sub-tasks. In dialogue generation, Pan et al (2019) presented a cascaded model which first picks words from the context via BERT, and then combines these words to generate the rewritten utterance, and Su et al (2019) distinguished the weights of context utterances and the incomplete utterance using a hyper-parameter λ.…”
Section: Related Work (mentioning)
confidence: 99%
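The edit-type formulation mentioned in the statement above can be illustrated with a small, hypothetical Python sketch: each token of the incomplete utterance is tagged None, Substitute, or Insert, and the rewritten utterance is obtained by applying those edits. This is a generic illustration of the idea, not the cited authors' code.

    # Illustrative sketch of rewriting with per-token edit types: None, Substitute, Insert.
    NONE, SUBSTITUTE, INSERT = "None", "Substitute", "Insert"

    def apply_edits(utterance_tokens, edits):
        """edits: list of (edit_type, replacement_tokens) aligned with utterance_tokens.
        An Insert places the replacement tokens before the aligned token."""
        out = []
        for token, (edit, repl) in zip(utterance_tokens, edits):
            if edit == NONE:
                out.append(token)
            elif edit == SUBSTITUTE:
                out.extend(repl)
            elif edit == INSERT:
                out.extend(repl)
                out.append(token)
        return out

    # "what about him" -> "what about the manager", resolving "him" from context
    tokens = ["what", "about", "him"]
    edits = [(NONE, []), (NONE, []), (SUBSTITUTE, ["the", "manager"])]
    print(" ".join(apply_edits(tokens, edits)))  # -> "what about the manager"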
“…In addition, the applications of RL on language involve topics such as natural language generation [11], conversational semantic parsing [23] and text classification [42].…”
Section: Related Work (mentioning)
confidence: 99%
“…Moreover, the chance of a search failing due to looking up irrelevant keywords in the knowledge base becomes lower. Query splitting techniques are therefore relevant in searching [11].…”
Section: Introduction (mentioning)
confidence: 99%