Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics 2019
DOI: 10.18653/v1/p19-1473

Unified Semantic Parsing with Weak Supervision

Abstract: Semantic parsing over multiple knowledge bases enables a parser to exploit structural similarities of programs across the multiple domains. However, the fundamental challenge lies in obtaining high-quality annotations of (utterance, program) pairs across various domains needed for training such models. To overcome this, we propose a novel framework to build a unified multi-domain enabled semantic parser trained only with weak supervision (denotations). Weakly supervised training is particularly arduous as the …

Cited by 8 publications (3 citation statements)
References 18 publications

“…Since annotating a sufficiently large number of SQL programs is a resource-heavy task, semantic parsers can be trained with execution-output denotations under a weak-supervision setting. These execution denotations can be used to model a reward signal in order to train the underlying semantic parser (Zhong et al., 2017; Liang et al., 2018; Hagopian et al., 2019; Agrawal et al., 2019; Agarwal et al., 2019). They can also be used to train the semantic parser with a Maximum Marginal Likelihood (MML) objective by using a limited number of SQL programs as latent logical forms (Min et al., 2019; Wang et al., 2021a).…”
Section: Ethical Considerations (mentioning)
Confidence: 99%
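
To make the contrast in this excerpt concrete, here is a minimal, self-contained sketch of the two weak-supervision objectives it mentions: a reward-based surrogate that reinforces programs whose execution matches the denotation, and an MML objective that marginalizes over those consistent programs as latent logical forms. The parser scores and candidate set are toy stand-ins of ours, not any cited system's implementation.

```python
import torch

# Scores a hypothetical parser assigns to four candidate programs
# for a single utterance (toy values; a real parser produces these).
scores = torch.tensor([1.2, 0.3, -0.5, 0.8], requires_grad=True)
log_probs = torch.log_softmax(scores, dim=-1)

# Weak signal: which candidates execute to the gold denotation.
consistent = torch.tensor([True, False, False, True])

# (a) Reward-style objective: denotation match as a 0/1 reward,
# reinforcing matching programs (a REINFORCE-like surrogate).
rewards = consistent.float()
loss_reward = -(rewards * log_probs).sum()

# (b) Maximum Marginal Likelihood (MML): marginalize over the
# consistent programs, treating them as latent logical forms.
masked = log_probs.masked_fill(~consistent, float("-inf"))
loss_mml = -torch.logsumexp(masked, dim=-1)

loss_mml.backward()  # gradient flows only through consistent candidates
print(loss_reward.item(), loss_mml.item())
```

The key difference visible here: the reward surrogate weights every matching program equally, while MML's logsumexp lets the model concentrate probability on whichever consistent program it already prefers.
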
“…Given the labeled-data bottleneck problem, many techniques have been proposed to reduce the demand for labeled logical forms. Many weakly supervised learning methods have been proposed (Artzi and Zettlemoyer, 2013; Berant et al., 2013; Reddy et al., 2014; Agrawal et al., 2019), such as denotation-based learning (Pasupat and Liang, 2016; Goldman et al., 2018) and iterative searching (Dasigi et al., 2019). Semi-supervised semantic parsing has also been proposed (Chen et al., 2018a).…”
Section: Related Work (mentioning)
Confidence: 99%
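
As an illustration of the denotation-based learning this excerpt mentions, the toy sketch below (our rendering of the general recipe, not any cited system) enumerates a tiny hypothetical program space over a table and keeps the programs whose execution matches the answer; those consistent programs then act as pseudo-gold logical forms for training.

```python
# Toy table and a hypothetical mini program space (name -> executable form).
rows = [{"city": "Paris", "pop": 2.1}, {"city": "Lyon", "pop": 0.5}]

PROGRAMS = {
    "max_pop": lambda t: max(r["pop"] for r in t),
    "min_pop": lambda t: min(r["pop"] for r in t),
    "count_rows": lambda t: len(t),
}

def consistent_programs(table, denotation):
    """Keep every program whose execution yields the target denotation."""
    return [name for name, prog in PROGRAMS.items() if prog(table) == denotation]

# For the utterance "largest population?" with answer 2.1,
# one pseudo-gold program survives the search.
print(consistent_programs(rows, 2.1))  # ['max_pop']
```

Spurious programs that reach the right answer for the wrong reason would also survive this filter, which is exactly why the literature pairs the search step with careful training objectives.
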
“…Supervised approaches often suffer from a lack of training data, because it is expensive to annotate an utterance with its logical form. Many weakly supervised techniques have been investigated, e.g., supervision using denotations for utterances (Liang et al., 2017; Guu et al., 2017; Cheng and Lapata, 2018; Misra et al., 2018; Goldman et al., 2018; Liang et al., 2018; Agrawal et al., 2019). One main challenge here is the large search space of potential programs that needs to be explored.…”
Section: Related Work (mentioning)
Confidence: 99%
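
The search-space challenge this excerpt raises can be made concrete with a small sketch: the number of token sequences grows exponentially with program length, and beam search (one standard mitigation, shown here purely for illustration with a hypothetical token set and scoring function) keeps only the top-scoring partial programs at each step.

```python
import heapq

TOKENS = ["select", "filter", "max", "count", "city", "pop"]  # hypothetical DSL

def score(prefix):
    """Stand-in for a parser's log-probability of a partial program."""
    return -len(set(prefix))  # toy heuristic: favor fewer distinct tokens

def beam_search(beam_size=3, max_len=4):
    beams = [()]
    for _ in range(max_len):
        candidates = [b + (t,) for b in beams for t in TOKENS]
        beams = heapq.nlargest(beam_size, candidates, key=score)  # prune
    return beams

print(len(TOKENS) ** 4)  # 1296 unpruned length-4 programs
print(beam_search())     # only beam_size survivors after pruning
```
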