2016
DOI: 10.48550/arxiv.1601.01280
Preprint

Language to Logical Form with Neural Attention

Cited by 70 publications (50 citation statements)
References 24 publications
“…semantic parsing [11] and question answering [12]. Deng et al [18] present a model, WYGIWYS, which employs a CNN for text and layout recognition in tandem with an attention-based neural machine translation system.…”
Section: Related Work (mentioning)
confidence: 99%
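As a rough illustration of the CNN-plus-attention pairing described in the excerpt above, the sketch below (written here in PyTorch; the module names, sizes, and grid-attention formulation are illustrative assumptions, not Deng et al.'s released code) encodes an image into a grid of feature vectors and lets an attention-based recurrent decoder emit output tokens over that grid.

```python
import torch
import torch.nn as nn

class CNNEncoder(nn.Module):
    """Turn an input image into a grid of feature vectors ("annotations")."""
    def __init__(self, feat_dim=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, feat_dim, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )

    def forward(self, images):                        # images: (B, 1, H, W)
        feats = self.conv(images)                     # (B, C, H/2, W/2)
        B, C, H, W = feats.shape
        return feats.view(B, C, H * W).transpose(1, 2)  # (B, H*W, C)

class AttnDecoder(nn.Module):
    """Attention-based recurrent decoder over the encoder's feature grid."""
    def __init__(self, vocab_size, feat_dim=64, hid=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hid)
        self.rnn = nn.GRUCell(hid + feat_dim, hid)
        self.attn = nn.Linear(hid, feat_dim)
        self.out = nn.Linear(hid, vocab_size)

    def step(self, prev_token, h, enc):               # enc: (B, L, feat_dim)
        query = self.attn(h).unsqueeze(1)             # (B, 1, feat_dim)
        weights = (query * enc).sum(-1).softmax(-1)   # (B, L) attention weights
        context = (weights.unsqueeze(-1) * enc).sum(1)  # (B, feat_dim)
        h = self.rnn(torch.cat([self.embed(prev_token), context], dim=-1), h)
        return self.out(h), h                         # logits over output tokens

# Toy usage: one decoding step over a random "image".
enc = CNNEncoder()(torch.randn(2, 1, 32, 32))         # (2, 256, 64)
dec = AttnDecoder(vocab_size=50)
logits, h = dec.step(torch.tensor([3, 7]), torch.zeros(2, 128), enc)
print(logits.shape)                                   # torch.Size([2, 50])
```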
“…Recently, with the rapid development of deep learning methods, the encoder-decoder model with attention mechanism has been adopted to deal with the mathematical expression recognition problem [4]. The encoder-decoder model has been successfully applied to a variety of tasks such as machine translation [7], image caption [8], speech recognition [9], document understanding [10], semantic parsing [11] and question answering [12]. This model is commonly employed to transform input data into high-level representations with an encoder, and then the representations are used to generate the target format data by the decoder.…”
Section: Introduction (mentioning)
confidence: 99%
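The excerpt above states the general encoder-decoder recipe: the encoder maps the input into high-level representations, and an attention-equipped decoder generates the target sequence from them. The following is a minimal, self-contained sketch of that pattern (assumed PyTorch; the class name, dimensions, teacher forcing, and dot-product attention are choices made here for brevity, not code from any of the cited systems).

```python
import torch
import torch.nn as nn

class Seq2SeqAttention(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, dim=128):
        super().__init__()
        self.src_embed = nn.Embedding(src_vocab, dim)
        self.tgt_embed = nn.Embedding(tgt_vocab, dim)
        self.encoder = nn.LSTM(dim, dim // 2, batch_first=True, bidirectional=True)
        self.decoder = nn.LSTMCell(2 * dim, dim)
        self.proj = nn.Linear(dim, tgt_vocab)

    def forward(self, src, tgt_in):
        # Encoder: a high-level representation for every source position. (B, S, dim)
        enc, _ = self.encoder(self.src_embed(src))
        B = src.size(0)
        h = c = torch.zeros(B, enc.size(-1), device=src.device)
        logits = []
        for t in range(tgt_in.size(1)):               # teacher-forced decoding
            # Attention: score each source representation against the decoder state.
            scores = torch.bmm(enc, h.unsqueeze(-1)).squeeze(-1)            # (B, S)
            context = torch.bmm(scores.softmax(-1).unsqueeze(1), enc).squeeze(1)
            h, c = self.decoder(
                torch.cat([self.tgt_embed(tgt_in[:, t]), context], dim=-1), (h, c))
            logits.append(self.proj(h))
        return torch.stack(logits, dim=1)             # (B, T, tgt_vocab)

# Toy usage with random token ids, just to show the shapes.
model = Seq2SeqAttention(src_vocab=100, tgt_vocab=50)
out = model(torch.randint(0, 100, (2, 7)), torch.randint(0, 50, (2, 5)))
print(out.shape)                                      # torch.Size([2, 5, 50])
```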
“…Semantic parsing in turn can be considered a form of the general problem of program synthesis (see [30] for a survey), where the input to the program synthesizer is a natural language description of what the program should do. Recently there have been many works applying ML to semantic parsing [7,51,21,37,32,102] and program synthesis from input-output pairs [45,69,26,65,13,25] (and non-ML success stories [29]).…”
Section: Existing Research Using Minecraft a Number Of Machine Learni... (mentioning)
confidence: 99%
“…Semantic parsing is a task of transducing natural language to meaning representations, which in turn can be expressed through many different semantic formalisms including lambda calculus (Zettlemoyer and Collins, 2012), DCS (Liang et al, 2013), Discourse Representation Theory (DRT) (Kamp and Reyle, 2013), AMR (Banarescu et al, 2013) and so on. This availability of annotated data in English has translated into the development of a plethora of models, including encoder-decoders (Dong and Lapata, 2016; Jia and Liang, 2016) as well as tree or graph-structured decoders (Dong and Lapata, 2016, 2018; Liu et al, 2018; Yin and Neubig, 2017).…”
Section: Introduction (mentioning)
confidence: 99%
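To make the "natural language to meaning representation" target concrete, the toy example below shows one lambda-calculus-style logical form (in the spirit of the formalisms cited above) both as the flat token sequence an encoder-decoder parser such as Dong and Lapata (2016) would emit and as the nested tree a tree-structured decoder would expand. The query, predicate names, and helper function are illustrative assumptions, not drawn from any specific dataset release.

```python
from typing import List, Union

# "what states border texas" -> a lambda-calculus-style logical form (illustrative).
LOGICAL_FORM = "( lambda $0 e ( and ( state:t $0 ) ( next_to:t $0 texas:s ) ) )"

# A sequence decoder targets the linearized token stream, emitted left to right.
seq_target: List[str] = LOGICAL_FORM.split()

Tree = Union[str, list]

def parse_sexpr(tokens: List[str]) -> Tree:
    """Rebuild the nested tree a tree-structured decoder would expand top-down."""
    def read(i: int):
        if tokens[i] == "(":
            node, i = [], i + 1
            while tokens[i] != ")":
                child, i = read(i)
                node.append(child)
            return node, i + 1
        return tokens[i], i + 1
    tree, _ = read(0)
    return tree

if __name__ == "__main__":
    print(seq_target)               # flat sequence target
    print(parse_sexpr(seq_target))  # nested tree target
```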