Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
DOI: 10.18653/v1/2020.emnlp-main.323
Fast semantic parsing with well-typedness guarantees

Abstract: AM dependency parsing is a linguistically principled method for neural semantic parsing with high accuracy across multiple graphbanks. It relies on a type system that models semantic valency but makes existing parsers slow. We describe an A* parser and a transition-based parser for AM dependency parsing which guarantee well-typedness and improve parsing speed by up to 3 orders of magnitude, while maintaining or improving accuracy.
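The abstract describes an A* parser whose agenda only admits well-typed partial analyses. A minimal sketch of such an agenda-based A* decoder, assuming illustrative names (`expand`, `is_goal`, `heuristic`) that are not the paper's actual API:

```python
import heapq

# Hypothetical sketch of type-constrained A* decoding, loosely in the
# spirit of the abstract: the expansion function is assumed to yield
# only well-typed successors, so the first complete item popped from
# the agenda is a well-typed parse. All names here are illustrative
# assumptions, not the authors' implementation.

def astar_decode(initial_items, expand, is_goal, heuristic):
    """Best-first search over partial parses.

    initial_items: iterable of (cost, item) pairs
    expand: item -> iterable of (step_cost, successor); assumed to
            yield only well-typed successors (the type system filters)
    is_goal: item -> bool, True for a complete well-typed parse
    heuristic: item -> admissible estimate of remaining cost
    """
    agenda = []
    for cost, item in initial_items:
        heapq.heappush(agenda, (cost + heuristic(item), cost, item))
    while agenda:
        _, cost, item = heapq.heappop(agenda)
        if is_goal(item):
            return cost, item
        for step_cost, nxt in expand(item):
            new_cost = cost + step_cost
            heapq.heappush(agenda, (new_cost + heuristic(nxt), new_cost, nxt))
    return None  # no well-typed complete parse reachable
```

Because ill-typed items never enter the agenda, the search space shrinks and the first goal item popped is guaranteed well-typed, which is the source of both the speed and the guarantee claimed in the abstract.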

Cited by 21 publications (27 citation statements)
References 23 publications
“…with fixed tree decoder (incl. post-processing bugfix for AMR as per Lindemann et al (2020)). FG'20 is Fernández-González and Gómez-Rodríguez (2020).…”
Section: Results (confidence: 99%)
“…The central challenge of compositional methods lies in the fact that the compositional structures are not provided in the graphbanks. Existing AM parsers (Groschwitz et al., 2018; Lindemann et al., 2020) use hand-built heuristics to extract AM dep-trees for supervised training from the graphs in the graphbank. These heuristics require extensive expert work, including graphbank-specific decisions for source allocations and graphbank- and phenomenon-specific patterns to extract type requests for reentrancies.…”
Section: Decomposition Algorithm (confidence: 99%)
“…Constrained Decoding. After neural parsers modeled semantic parsing as a sentence-to-logical-form translation task (Yih et al., 2015; Krishnamurthy et al., 2017; Iyyer et al., 2017; Jie and Lu, 2018; Lindemann et al., 2020), many constrained decoding algorithms were also proposed, such as type-constraint-based illegal token filtering (Krishnamurthy et al., 2017), a Lisp-interpreter-based method (Liang et al., 2017), and type constraints for generating valid actions (Iyyer et al., 2017).…”
Section: Related Work (confidence: 99%)
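The passage above mentions type-constraint-based illegal token filtering as one family of constrained decoding methods. A minimal sketch of that idea, with a toy action representation that is an assumption for illustration, not any cited system's actual data structures:

```python
# Hypothetical sketch of type-constrained decoding as described in the
# quoted passage: at each step, candidate actions whose result type does
# not match the type expected by the parser state are masked out before
# the highest-scoring action is chosen. The dict-based action format and
# function names are toy assumptions.

def filter_by_type(actions, expected_type):
    """Keep only actions whose result type matches the expected type."""
    return [a for a in actions if a["type"] == expected_type]

def greedy_step(scored_actions, expected_type):
    """Pick the best-scoring legal action, or None if none is legal."""
    legal = filter_by_type(scored_actions, expected_type)
    if not legal:
        return None
    return max(legal, key=lambda a: a["score"])
```

Masking before the argmax means the decoder can never emit an ill-typed token, regardless of what the neural scorer prefers.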