Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing 2016
DOI: 10.18653/v1/d16-1197
Neural Generation of Regular Expressions from Natural Language with Minimal Domain Knowledge

Abstract: This paper explores the task of translating natural language queries into regular expressions which embody their meaning. In contrast to prior work, the proposed neural model does not utilize domain-specific crafting, learning to translate directly from a parallel corpus. To fully explore the potential of neural models, we propose a methodology for collecting a large corpus of regular expression, natural language pairs. Our resulting model achieves a performance gain of 19.6% over previous state-of-the-art m…
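The abstract describes a parallel corpus of (natural language, regular expression) pairs from which the model learns to translate directly. A minimal sketch of what such pairs look like, and of the functional check a predicted regex can be scored against, is shown below; the example descriptions and patterns are illustrative, not drawn from the actual dataset.

```python
import re

# Hypothetical (natural language, regex) pairs of the kind a parallel
# corpus for this task would contain; these are illustrative examples,
# not entries from the paper's corpus.
corpus = [
    ("lines containing the word 'dog'", r".*\bdog\b.*"),
    ("lines that start with a capital letter", r"[A-Z].*"),
    ("lines made up only of digits", r"\d+"),
]

def matches(pattern: str, text: str) -> bool:
    """Return True if the candidate regex fully matches the text."""
    return re.fullmatch(pattern, text) is not None

# A predicted regex can be evaluated functionally: does it accept
# strings the description implies and reject ones it excludes?
assert matches(corpus[1][1], "Hello world")
assert not matches(corpus[1][1], "hello world")
```

Functional checks like this matter because two syntactically different regexes can denote the same language, so string-identical comparison against a reference regex would under-count correct predictions.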

Cited by 68 publications (120 citation statements); References 4 publications.
“…Another approach is to treat semantic parsing as a machine translation problem, where the logical form is linearized then predicted as an unstructured sequence of tokens (Andreas et al., 2013). This approach is taken by recent neural semantic parsers (Jia and Liang, 2016; Dong and Lapata, 2016; Locascio et al., 2016; Ling et al., 2016). This approach has the advantage of predicting the logical form directly from the question without latent variables, which simplifies learning, but the disadvantage of ignoring type constraints on logical forms.…”
Section: Related Work (mentioning, confidence: 99%)
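The linearization step this citation statement describes can be sketched as follows: a structured pattern is flattened into a token sequence that a sequence model can predict one symbol at a time. This is a minimal illustration of one possible tokenization; the exact output vocabularies used by the cited parsers differ.

```python
import re

def linearize(regex: str) -> list[str]:
    """Flatten a regex string into a token sequence, treating
    character classes like [A-Z] and escapes like \\d as single
    tokens, so a sequence model emits one symbol per step."""
    token_pattern = r"\[[^\]]*\]|\\.|."
    return re.findall(token_pattern, regex)

# The structured pattern becomes an unstructured token sequence:
assert linearize(r"[A-Z]+\d*") == ["[A-Z]", "+", "\\d", "*"]
```

Treating character classes and escapes as atomic tokens keeps the output vocabulary small and prevents the model from having to learn bracket matching character by character.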
“…There is significant existing research on mapping NL directly to executable programs in the form of logical forms (Zettlemoyer and Collins, 2005), λ-DCS (Liang et al., 2013), regular expressions (Kushman and Barzilay, 2013; Locascio et al., 2016), database queries (Iyer et al., 2017; Zhong et al., 2017) and general purpose programs (Balog et al., 2016; Allamanis et al., 2015b). Ling et al. (2016) […] Gu et al. (2016b) use neural models to map NL queries to a sequence of API calls, and Neelakantan et al. (2015) augment neural models with a small set of basic arithmetic and logic operations to generate more meaningful programs.…”
Section: Related Work (mentioning, confidence: 99%)
“…RE (Locascio et al., 2016). By contrast, our work aims to use REs to improve the prediction ability of an NN.…”
Section: Related Work (mentioning, confidence: 99%)