Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1491
Recursive Context-Aware Lexical Simplification

Abstract: This paper presents a novel architecture for recursive context-aware lexical simplification, REC-LS, that is capable of (1) making use of the wider context when detecting the words in need of simplification and suggesting alternatives, and (2) taking previous simplification steps into account. We show that our system outputs lexical simplifications that are grammatically correct and semantically appropriate, and outperforms the current state-of-the-art systems in lexical simplification.

Cited by 18 publications (18 citation statements)
References 25 publications
“…For instance, in example (1) a CWI system might identify engulfed as a complex word, which would allow an LS system to replace it with a simpler alternative, e.g. flooded, in the next step (Paetzold and Specia, 2016a; Gooding and Kochmar, 2019b):…”
Section: Introduction
confidence: 99%
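The two-step pipeline described in this excerpt, complex word identification (CWI) followed by lexical simplification (LS), can be sketched as follows. This is a minimal illustration only: the frequency scores, candidate lists, and function names are hypothetical placeholders, not the actual models from the cited papers.

```python
# Hypothetical corpus frequencies (higher = simpler); illustration only.
WORD_FREQUENCY = {
    "the": 1000, "village": 120, "was": 900, "engulfed": 3,
    "by": 800, "water": 400, "flooded": 60, "covered": 45,
}
# Hypothetical substitution candidates for complex words.
SUBSTITUTIONS = {"engulfed": ["flooded", "covered"]}

def identify_complex_words(tokens, threshold=50):
    """Step 1 (CWI): flag tokens whose frequency falls below a threshold."""
    return [t for t in tokens if WORD_FREQUENCY.get(t, 0) < threshold]

def simplify(tokens, threshold=50):
    """Step 2 (LS): replace each flagged token with its most frequent candidate."""
    complex_words = set(identify_complex_words(tokens, threshold))
    out = []
    for t in tokens:
        if t in complex_words and t in SUBSTITUTIONS:
            out.append(max(SUBSTITUTIONS[t], key=lambda c: WORD_FREQUENCY.get(c, 0)))
        else:
            out.append(t)
    return out

print(simplify("the village was engulfed by water".split()))
# -> ['the', 'village', 'was', 'flooded', 'by', 'water']
```

Real systems replace the frequency threshold with a trained classifier and rank substitutes by contextual fit, but the two-stage structure is the same.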
“…Similar to the approaches proposed in Gooding and Kochmar (2019), Hartmann and dos Santos (2018), and De Hertog and Tack (2018), we use word and character embeddings. We compare pre-trained non-contextualized word embeddings, i.e., GloVe (Pennington et al., 2014), and pre-trained contextualized word embeddings, i.e., ELMo (Peters et al., 2018) and BERT (Devlin et al., 2019), with pre-trained contextualized character embeddings, i.e., stacked Flair (Akbik et al., 2018, 2019a; a combination of GloVe and Flair) and PooledFlair (Akbik et al., 2019b).…”
Section: Word and Character Embeddings
confidence: 99%
“…These scores indicate which words are likely to cause problems for a reader. The words that are identified as problematic can be the subject of numerous types of intervention, such as direct replacement in the setting of lexical simplification (Gooding and Kochmar, 2019), or extra information being given in the context of explanation generation.…”
Section: Introduction
confidence: 99%