Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
DOI: 10.18653/v1/n18-2013
Sentence Simplification with Memory-Augmented Neural Networks

Abstract: Sentence simplification aims to simplify the content and structure of complex sentences, and thus make them easier to interpret for human readers, and easier to process for downstream NLP applications. Recent advances in neural machine translation have paved the way for novel approaches to the task. In this paper, we adapt an architecture with augmented memory capacities called Neural Semantic Encoders (Munkhdalai and Yu, 2017) for sentence simplification. Our experiments demonstrate the effectiveness of our a…
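The memory-augmented encoder the abstract refers to, the Neural Semantic Encoder, processes a sentence by repeatedly attending over a memory of token representations, composing the retrieved slot with the current input, and writing the composition back. The toy numpy sketch below shows only that read-compose-write loop; it is purely illustrative (the real NSE learns LSTM-based read, compose, and write functions — nothing here is taken from the paper's implementation):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def nse_step(memory, x):
    """One illustrative read-compose-write step.
    memory: (slots, d) matrix of per-token memory vectors
    x:      (d,) current input representation
    """
    z = softmax(memory @ x)            # read: attention over memory slots
    m_read = z @ memory                # retrieved memory vector, shape (d,)
    c = np.tanh(x + m_read)            # compose (stand-in for a learned LSTM)
    # write: erase attended slots proportionally, then add the composition
    memory = (1 - z)[:, None] * memory + np.outer(z, c)
    return memory, c

# Toy run: 5 memory slots, hidden size 4 (sizes are arbitrary).
rng = np.random.default_rng(0)
memory = rng.normal(size=(5, 4))
memory, c = nse_step(memory, rng.normal(size=4))
```

The erase-then-add write rule here is a simplification of the paper's gated update, kept only to show how a slot's content changes in proportion to its attention weight.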

Cited by 39 publications (57 citation statements) | References: 20 publications
“…As in the field of machine translation, early studies (Specia, 2010; Wubben et al., 2012; Xu et al., 2016) were mainly based on statistical machine translation (Koehn et al., 2007; Post et al., 2013). Inspired by the success of neural machine translation (Bahdanau et al., 2015), recent studies (Nisioi et al., 2017; Zhang and Lapata, 2017; Vu et al., 2018; Guo et al., 2018; Zhao et al., 2018) use the encoder-decoder model with the attention mechanism. These studies do not consider the level of each sentence.…”
Section: Text Simplification
confidence: 99%
“…Nisioi et al. (2017) was the first major application of Seq2Seq models to text simplification, applying a standard encoder-decoder approach with attention and beam search. Vu et al. (2018) extended this framework to incorporate memory augmentation, which simultaneously performs lexical and syntactic simplification, allowing them to outperform standard Seq2Seq models.…”
Section: Related Work
confidence: 99%
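The "standard encoder-decoder approach with attention" described in this statement reduces, at each decoding step, to scoring the encoder states against the current decoder state and taking a weighted sum as the context. A minimal numpy sketch of dot-product attention; shapes and names are illustrative assumptions, not drawn from the cited implementations:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_context(encoder_states, decoder_state):
    """Score each source position against the current decoder state,
    normalize the scores, and return the weighted sum (context vector)."""
    scores = encoder_states @ decoder_state   # (src_len,)
    weights = softmax(scores)                 # attention distribution
    return weights @ encoder_states, weights  # context: (d,), weights: (src_len,)

# Toy example: 4 source positions, hidden size 3 (sizes are arbitrary).
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))   # encoder hidden states
s = rng.normal(size=3)        # current decoder state
context, w = attention_context(H, s)
```

In the full models, the context vector is concatenated with the decoder state to predict the next output token; beam search then keeps the top-k partial outputs at each step instead of greedily taking the argmax.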
“…In the case of neural network methods, [43] enabled sentence simplification by deep reinforcement learning, in which simplicity, relevance, and fluency are considered in the reward. [32] adapted a sequence-to-sequence model with a Neural Semantic Encoder [31] for the purpose of sentence simplification. [29] is a simple sequence-to-sequence model that can be used efficiently for in-domain and cross-domain sentence simplification.…”
Section: Related Work 2.1 Sentence Simplification
confidence: 99%
“…The first approach generates a simple sentence from an input sentence with a statistical machine translation framework that treats simplification as monolingual machine translation [22] [36] [34] [35] [46]. The second approach uses a neural network [29] [32] [43]. In other NLP tasks (e.g., QA, word embeddings), neural network methods have achieved excellent results and are being actively researched.…”
Section: Introduction
confidence: 99%