Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) 2018
DOI: 10.18653/v1/p18-1194

Marrying Up Regular Expressions with Neural Networks: A Case Study for Spoken Language Understanding

Abstract: The success of many natural language processing (NLP) tasks is bound by the number and quality of annotated data, but there is often a shortage of such training data. In this paper, we ask the question: "Can we combine a neural network (NN) with regular expressions (RE) to improve supervised learning for NLP?". In answer, we develop novel methods to exploit the rich expressiveness of REs at different levels within a NN, showing that the combination significantly enhances the learning effectiveness when a small…

Cited by 43 publications (36 citation statements)
References 27 publications (35 reference statements)
“…We also compare our method with the basic neural networks enhanced by existing methods of combining rules and neural networks. Luo et al. (2018) propose three ways to utilize RE matching results in a neural model: 1) use the results as additional input features; 2) use the results to guide attention; 3) use the results to directly tune the output logits. As our basic networks do not involve attention, we enhance them using 1), 3), or both, denoted as +i, +o and +io respectively.…”
Section: RE-enhanced Basic Network (mentioning, confidence: 99%)
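The fusion levels quoted above can be made concrete with a minimal sketch. The snippet below is not the authors' code; the regular expressions, dimensions, and the `REFusedClassifier` name are illustrative assumptions. It shows RE match indicators used as extra input features (+i) and as a direct adjustment of the output logits (+o); the attention-level variant (2) is omitted because, as the excerpt notes, the basic networks compared there do not use attention.

```python
import re
import torch
import torch.nn as nn

# Hypothetical intent-level REs (patterns invented for illustration).
INTENT_RES = {
    0: re.compile(r"\bflights?\b.*\bfrom\b", re.I),      # e.g. a "flight" intent
    1: re.compile(r"\bairfares?\b|\bhow much\b", re.I),  # e.g. an "airfare" intent
}

def re_match_vector(utterance: str, n_intents: int) -> torch.Tensor:
    """One binary indicator per intent: 1.0 if any RE for that intent fires."""
    v = torch.zeros(n_intents)
    for intent_id, pattern in INTENT_RES.items():
        if pattern.search(utterance):
            v[intent_id] = 1.0
    return v

class REFusedClassifier(nn.Module):
    """Sentence encoding plus RE indicators at the input (+i) and at the logits (+o)."""
    def __init__(self, input_dim: int, n_intents: int):
        super().__init__()
        # +i: RE indicators are concatenated to the sentence representation.
        self.fc = nn.Linear(input_dim + n_intents, n_intents)
        # +o: a learnable weight scales the RE indicators added to the logits.
        self.logit_weight = nn.Parameter(torch.tensor(1.0))

    def forward(self, sent_vec: torch.Tensor, re_vec: torch.Tensor) -> torch.Tensor:
        logits = self.fc(torch.cat([sent_vec, re_vec], dim=-1))  # +i
        return logits + self.logit_weight * re_vec               # +o
```

At inference time, `re_match_vector(utterance, n_intents)` is computed once per utterance and passed alongside the sentence encoding, so the RE knowledge is available both at the input and at the output layer.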
“…Nevertheless, symbolic rules are still an indispensable tool in various industrial NLP applications. Regular expressions (RE) are one of the most representative and useful forms of symbolic rules and are widely used for solving tasks such as pattern matching (Hosoya and Pierce, 2001; Zhang et al., 2018) and intent classification (Luo et al., 2018). RE-based systems are highly interpretable…”
Section: Introduction (mentioning, confidence: 99%)
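To make the symbolic rules in this excerpt concrete, here is a minimal, purely RE-based intent classifier; the rule table is invented for illustration and is not taken from any cited system. Its interpretability comes from the fact that every prediction can be traced back to the single pattern that fired.

```python
import re

# Illustrative rule table (pattern -> intent label); not from the cited papers.
INTENT_RULES = [
    (re.compile(r"\bflights?\s+from\b", re.I), "flight"),
    (re.compile(r"\bground transportation\b", re.I), "ground_service"),
    (re.compile(r"\bhow much\b.*\b(fare|ticket)\b", re.I), "airfare"),
]

def classify(utterance: str) -> str:
    """Return the intent of the first rule that matches, or 'unknown'."""
    for pattern, intent in INTENT_RULES:
        if pattern.search(utterance):
            return intent
    return "unknown"

print(classify("show me flights from Boston to Denver"))  # -> flight
```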
“…Instead of massive labeling, human experts only need to label a small set of documents to fine-tune the neural network. Luo et al. [49] incorporated knowledge of regular expressions into the training of neural networks to solve typical spoken language understanding (SLU) tasks. Experiments demonstrate that the learning performance can be significantly improved by the implicit knowledge encoded within regular expressions.…”
Section: Related Work (mentioning, confidence: 99%)
“…Locascio et al. (2016) proposed training an LSTM NN to generate REs from sample pieces of text. Luo et al. (2018) incorporate knowledge of REs into the training of NNs at three different levels: as input features to NNs, as regularizations of the outputs of NN layers, or as a reward/penalty in the loss functions of NNs.…”
Section: Related Work (mentioning, confidence: 99%)
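The third level mentioned in this excerpt, RE signals acting as a reward/penalty in the loss, could look roughly like the following sketch. The `alpha` weight and the exact form of the penalty are assumptions for illustration, not the paper's precise objective; `re_vec` is a per-intent binary indicator of which REs fired on the input, as in the earlier sketch.

```python
import torch
import torch.nn.functional as F

def re_regularized_loss(logits: torch.Tensor,
                        gold_labels: torch.Tensor,
                        re_vec: torch.Tensor,
                        alpha: float = 0.5) -> torch.Tensor:
    """Cross-entropy plus a penalty for probability mass kept away from
    intents whose REs fired (illustrative, not the paper's exact loss)."""
    ce = F.cross_entropy(logits, gold_labels)               # standard supervised term
    probs = F.softmax(logits, dim=-1)
    penalty = (re_vec * (1.0 - probs)).sum(dim=-1).mean()   # RE-disagreement term
    return ce + alpha * penalty
```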