Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop 2020
DOI: 10.18653/v1/2020.acl-srw.13
SCAR: Sentence Compression using Autoencoders for Reconstruction

Cited by 7 publications (3 citation statements) · References 14 publications
“…Self-consistency Learning has been adopted in the field of Natural Language Processing in recent years for tasks like text encoding (Li et al., 2015), sentence compression and summarization (Baziotis et al., 2019; Malireddy et al., 2020), NER (Iovine et al., 2022), and data-to-text generation. Models that leverage self-consistency learning usually include two modules that are reversals of each other.…”
Section: Related Work
confidence: 99%
“…Another commonly proposed unsupervised framework is to use autoencoders and reconstruction objectives (Miao and Blunsom, 2016; Févry and Phang, 2018; Malireddy et al., 2020). These approaches are based on the assumption that a good sentence compression is one from which the original sentence can be inferred.…”
Section: Unsupervised Sentence Compression
confidence: 99%
“…Sentence compression is currently dominated by supervised methods (Malireddy et al., 2020; Nguyen et al., 2020; Nóbrega et al., 2020) and relies heavily on syntactic dependency trees (Le et al., 2019; Xu and Durrett, 2019b; Wang and Chen, 2019; Kamigaito and Okumura, 2020). Unsupervised methods have been explored to extract sentences from documents to represent the key points (Jang and Kang, 2021).…”
Section: Related Work
confidence: 99%