Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop 2019
DOI: 10.18653/v1/p19-2036
Controllable Text Simplification with Lexical Constraint Loss

Abstract: We propose a method to control the level of a sentence in a text simplification task. Text simplification is a monolingual translation task that translates a complex sentence into a simpler, easier-to-understand alternative. In this study, we use the grade levels of the US education system as sentence levels. Our text simplification method succeeds in translating an input into a specific grade level by considering the levels of both sentences and words. Sentence level is considered by adding the target…
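As a rough illustration of the two control signals the abstract describes, the sketch below shows a grade-level token prepended to the source (sentence-level control) and a penalty on the probability mass a decoder assigns to words above the target grade (word-level control). This is a minimal PyTorch sketch under assumed names (`word_grades`, `<grade-K>` tokens) and an assumed penalty form; it is not the authors' implementation or their exact lexical constraint loss.

```python
# Hypothetical sketch (not the paper's code): grade-level control token plus a
# lexical constraint penalty added to a standard seq2seq cross-entropy loss.
import torch
import torch.nn.functional as F

def add_grade_token(src_ids, target_grade, vocab):
    """Sentence-level control: prepend an assumed <grade-K> token to the source."""
    grade_id = vocab[f"<grade-{target_grade}>"]
    return [grade_id] + src_ids

def lexical_constraint_loss(logits, target_grade, word_grades, weight=1.0):
    """Word-level control (assumed form): penalize probability mass on words
    whose grade level exceeds the target grade.

    logits:      (batch, seq_len, vocab) decoder outputs
    word_grades: (vocab,) tensor of per-word grade levels
    """
    probs = F.softmax(logits, dim=-1)
    too_hard = (word_grades > target_grade).float()     # 1.0 for words above target grade
    penalty = (probs * too_hard).sum(dim=-1).mean()     # mean mass placed on those words
    return weight * penalty

def training_loss(logits, gold_ids, target_grade, word_grades):
    """Cross-entropy on the gold simplification plus the constraint term."""
    ce = F.cross_entropy(logits.view(-1, logits.size(-1)), gold_ids.view(-1))
    return ce + lexical_constraint_loss(logits, target_grade, word_grades)
```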

Cited by 48 publications (63 citation statements)
References 14 publications
“…The goal of text simplification is to transform text into a variant that is more broadly accessible to a wide variety of readers while preserving the content. While this has been accomplished using a range of approaches (Shardlow, 2014), most text simplification research has focused on fully-automated approaches (Xu et al, 2016;Zhang and Lapata, 2017;Nishihara et al, 2019). However, in some domains, such as healthcare, using fully-automated text simplification is not appropriate since it is critical that the important information is preserved fully during the simplification process.…”
Section: Introduction
confidence: 99%
“…Recently, Kriz et al (2019) used two techniques to address the problem that the model tends to copy words directly, resulting in long and complicated output sentences. Nishihara et al (2019) proposed a method to simplify original sentences into sentences at different levels. Unlike most sequence-to-sequence models, Dong et al (2019) proposed a neural programmer-interpreter approach that directly predicts explicit edit operations.…”
Section: Related Work
confidence: 99%
“…Kriz et al [36] modified the Seq2seq model by improving the training loss function and the decoding method at inference time. Nishihara et al [37] proposed a controllable text simplification model that improves the Seq2seq model with a lexical constraint loss, but their method depends heavily on the dataset because it requires target sentences labeled with sentence levels. These models have achieved good results on the sentence simplification task.…”
Section: Related Work
confidence: 99%