Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1233

Neural Generative Rhetorical Structure Parsing

Abstract: Rhetorical structure trees have been shown to be useful for several document-level tasks including summarization and document classification. Previous approaches to RST parsing have used discriminative models; however, these are less sample efficient than generative models, and RST parsing datasets are typically small. In this paper, we present the first generative model for RST parsing. Our model is a document-level RNN grammar (RNNG) with a bottom-up traversal order. We show that, for our parser's traversal …
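To make the abstract's "bottom-up traversal order" concrete, the sketch below shows how a shift-reduce style action sequence can assemble an RST-style tree over elementary discourse units (EDUs). It is an illustrative reconstruction, not the paper's model: the action names, the `Node` class, and the Elaboration/NS example are assumptions for illustration, and the actual parser additionally scores every action with a document-level RNNG.

```python
# Illustrative sketch (not the authors' implementation): replaying a bottom-up
# shift-reduce action sequence to build an RST-style tree over EDUs.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Node:
    label: str                        # relation label, or "EDU" for leaves
    nuclearity: Optional[str] = None  # e.g. "NS", "SN", "NN" for internal nodes
    children: tuple = ()
    text: str = ""

def parse_bottom_up(edus: List[str], actions: List[tuple]) -> Node:
    """Replay a bottom-up action sequence over a stack of subtrees."""
    stack: List[Node] = []
    buffer = list(edus)
    for act in actions:
        if act[0] == "SHIFT":          # move the next EDU onto the stack
            stack.append(Node(label="EDU", text=buffer.pop(0)))
        elif act[0] == "REDUCE":       # combine the top two subtrees
            _, relation, nuclearity = act
            right = stack.pop()
            left = stack.pop()
            stack.append(Node(relation, nuclearity, (left, right)))
    assert len(stack) == 1 and not buffer
    return stack[0]

# Toy example: two EDUs joined by an Elaboration relation (nucleus-satellite).
tree = parse_bottom_up(
    ["The parser is generative,", "which helps on small datasets."],
    [("SHIFT",), ("SHIFT",), ("REDUCE", "Elaboration", "NS")],
)
print(tree.label, tree.nuclearity)  # Elaboration NS
```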

Cited by 18 publications (20 citation statements)
References 37 publications (46 reference statements)
“…10) already achieves good performance, with Dyer et al (2016) delivering promising results by using greedy decoding. As a recent example for discourse parsing, Mabona et al (2019) successfully combine standard beam-search with shift-reduce parsing using two parallel beams for shift and reduce actions. Overall, recent work shows that beam-search approaches and their possible extensions can effectively address scalability issues in multiple parsing scenarios.…”
Section: Related Work
confidence: 99%
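The "two parallel beams" mentioned in the statement above can be pictured with a small sketch. The code below is an assumed reading of that description, not the released parser: successor hypotheses are routed to a shift beam or a reduce beam depending on the action that produced them, and each beam is pruned separately, so hypotheses with different numbers of word-generating actions are not ranked against one another. The `step`/`expand` interface is hypothetical.

```python
# Hedged sketch of a two-beam step for shift vs. reduce actions
# (an assumed reading of the cited description, not the authors' code).
import heapq

def step(hypotheses, expand, beam_size):
    """hypotheses: list of (score, state); expand yields (new_score, new_state, action)."""
    shift_beam, reduce_beam = [], []
    for score, state in hypotheses:
        for new_score, new_state, action in expand(score, state):
            target = shift_beam if action == "SHIFT" else reduce_beam
            target.append((new_score, new_state))
    # Prune each beam separately so scores stay comparable within a beam.
    shift_beam = heapq.nlargest(beam_size, shift_beam, key=lambda h: h[0])
    reduce_beam = heapq.nlargest(beam_size, reduce_beam, key=lambda h: h[0])
    return shift_beam, reduce_beam
```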
“…Inspired by the recent success in applying beam-search to enhance the scalability of multiple NLP parsing tasks (Mabona et al., 2019; Fried et al., 2017; Dyer et al., 2016; Vinyals et al., 2015), we propose a novel heuristic beam-search approach that can automatically generate discourse trees containing structure- and nuclearity-attributes for documents of arbitrary length.…”
Section: Predicting Discourse Structure and Nuclearity From Arbitrary Documents
confidence: 99%
“…There has also been some recent work on reducing the imbalanced probability bias. Mabona et al (2019) propose an algorithmic solution for organising beam search into buckets that have the same number of expensive transitions. Crabbé et al (2019) propose a sampling based approach with the same motivation of controlling which hypotheses are being compared.…”
Section: Other Relevant Work
confidence: 99%
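The bucketing idea summarized in the statement above can be sketched briefly, under the assumption that "expensive" (e.g. word-generating) transitions are counted per hypothesis and pruning happens within each bucket so that cheap and expensive derivations are never ranked against each other. The data layout and function name below are hypothetical, not taken from the paper's code.

```python
# Assumed illustration of bucketed pruning for beam search.
from collections import defaultdict
import heapq

def prune_by_bucket(hypotheses, k):
    """hypotheses: list of (score, num_expensive, state); keep top-k per bucket."""
    buckets = defaultdict(list)
    for score, num_expensive, state in hypotheses:
        buckets[num_expensive].append((score, state))
    return {n: heapq.nlargest(k, items, key=lambda h: h[0])
            for n, items in buckets.items()}
```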
“…The combination of these two limitations has been one of the main reasons for the limited application of neural discourse parsing for more diverse downstream tasks. While there have been neural discourse parsers proposed (Braud et al., 2017; Yu et al., 2018; Mabona et al., 2019), they still cannot consistently outperform traditional approaches when applied to the RST-DT dataset, where the amount of training data is arguably insufficient for such data-intensive approaches.…”
Section: Introduction
confidence: 99%