2015
DOI: 10.1162/tacl_a_00142
One Vector is Not Enough: Entity-Augmented Distributed Semantics for Discourse Relations

Abstract: Discourse relations bind smaller linguistic units into coherent texts. However, automatically identifying discourse relations is difficult, because it requires understanding the semantics of the linked arguments. A more subtle challenge is that it is not enough to represent the meaning of each argument of a discourse relation, because the relation may depend on links between lower-level components, such as entity mentions. Our solution computes distributional meaning representations by composition up the synta…

Cited by 116 publications (118 citation statements)
References 46 publications
“…The first one is denoted as PDTB-Lin (Lin et al., 2009); it uses sections 2-21 for training, section 22 as dev, and section 23 as test set. The second one is labeled PDTB-Ji (Ji and Eisenstein, 2015), and uses sections 2-20 for training, 0-1 as dev, and evaluates on sections 21-22. Our third setting follows the recommendations of , and performs 10-fold cross validation on the whole corpus (sections 0-23).…”
Section: Methods
confidence: 99%
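The three PDTB evaluation settings described above can be sketched as a small lookup table. This is a minimal illustration of the section splits quoted in the statement; the names `PDTB_SPLITS` and `split_for`, and the dict layout, are our own assumptions, not from any cited work (the third, cross-validation setting is omitted since its source citation is missing in the statement).

```python
# Hypothetical helper mapping WSJ sections (00-24) to train/dev/test
# partitions under the two named PDTB evaluation settings quoted above.
PDTB_SPLITS = {
    "PDTB-Lin": {                       # Lin et al. (2009)
        "train": list(range(2, 22)),    # sections 02-21
        "dev":   [22],
        "test":  [23],
    },
    "PDTB-Ji": {                        # Ji and Eisenstein (2015)
        "train": list(range(2, 21)),    # sections 02-20
        "dev":   [0, 1],
        "test":  [21, 22],
    },
}

def split_for(section: int, setting: str) -> str:
    """Return which partition a WSJ section falls into under a setting."""
    for part, sections in PDTB_SPLITS[setting].items():
        if section in sections:
            return part
    return "unused"
```

For example, `split_for(23, "PDTB-Lin")` returns `"test"`, while the same section is unused under PDTB-Ji.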
“…The PDTB framework allows annotations to be labelled with more than one label. In such cases we only keep the first label, in line with previous studies (among others Ji and Eisenstein, 2015; Rutherford et al., 2017).…”
Section: Methods
confidence: 99%
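The preprocessing convention above amounts to a one-line reduction. The function name and the list representation of a multi-label annotation are our assumptions, not part of the cited studies:

```python
def first_label(labels):
    """Keep only the first sense label when a PDTB annotation carries
    several (assumed input: labels in annotation order)."""
    return labels[0] if labels else None
```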
“…Recently, neural network approaches have become popular. Ji and Eisenstein (2015) use two RNNs on the syntactic trees of the arguments, whereas Zhang et al. (2015) use a CNN to perform discourse parsing in a multi-task setting where they consider both explicit and implicit discourse relations. use a simple yet robust feedforward network and achieves the highest performance on the out-of-domain blind test in the CoNLL 2016 shared task.…”
Section: Related Work
confidence: 99%
“…Some end-to-end deep sentence modeling based approaches have advanced the performance of discourse relation classification. Some researchers (Ji and Eisenstein, 2015; Wang and Lan, 2016; Qin et al., 2017) propose to learn the semantic representation of each argument with neural networks, such as the CNN, the RNN, and the Bi-LSTM, for classification. These methods effectively capture inner semantic connections of each argument via distributed representation learning. However, they did not consider semantic interactions between the two arguments.…”
confidence: 99%
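The limitation noted in this last statement can be made concrete with a minimal sketch (ours, not any cited system): each argument is encoded independently into a single vector, here simply a mean of word embeddings, and the two vectors are concatenated for classification, so no feature models interactions between the arguments. The toy vocabulary and dimensions are illustrative assumptions.

```python
import numpy as np

# Toy embeddings: 50-dimensional random vectors for a handful of words.
rng = np.random.default_rng(0)
vocab = {w: rng.normal(size=50)
         for w in ["the", "market", "fell", "rates", "rose"]}

def encode(tokens):
    """One vector per argument: the mean of its word embeddings."""
    return np.mean([vocab[t] for t in tokens], axis=0)

arg1 = encode(["the", "market", "fell"])
arg2 = encode(["rates", "rose"])

# Concatenation keeps the two argument encodings separate: a downstream
# linear classifier sees each argument's features, but no cross terms.
features = np.concatenate([arg1, arg2])   # 100-dimensional
```

Any pairwise interaction (e.g. between entity mentions across the two arguments) is invisible to a classifier over `features`, which is precisely the gap the statement points out.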