Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing, 2015
DOI: 10.3115/v1/p15-1112

A Re-ranking Model for Dependency Parser with Recursive Convolutional Neural Network

Abstract: In this work, we address the problem of modeling all the nodes (words or phrases) in a dependency tree with dense representations. We propose a recursive convolutional neural network (RCNN) architecture to capture syntactic and compositional-semantic representations of phrases and words in a dependency tree. Different from the original recursive neural network, we introduce convolution and pooling layers, which can model a variety of compositions by the feature maps and choose the most informative compositions by the pooling layers.
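As a rough illustration of the composition described in the abstract, the sketch below convolves a head word's vector with each of its dependents' vectors, max-pools over the resulting feature-map columns to form the node representation, and sums per-node scores so a re-ranker could compare candidate parse trees. The dimensions, the tanh nonlinearity, the linear scoring vector, and all names here are illustrative assumptions rather than the authors' exact formulation; details such as the paper's training objective are omitted.

```python
import numpy as np

# Minimal sketch of the composition step described in the abstract:
# each (head, dependent) pair is convolved with a shared filter to
# produce one feature-map column, and element-wise max pooling keeps
# the most informative composition per dimension. All parameters are
# randomly initialized for illustration only.

rng = np.random.default_rng(0)
d = 8                                        # node representation size (assumed)
W = rng.standard_normal((d, 2 * d)) * 0.1    # shared filter over [head; child]
b = np.zeros(d)                              # bias term
v = rng.standard_normal(d) * 0.1             # per-node scoring vector (assumed)

def compose(head_vec, child_vecs):
    """Build a phrase representation for a head word and its dependents."""
    if not child_vecs:
        return head_vec                       # leaves keep their word embedding
    columns = [np.tanh(W @ np.concatenate([head_vec, c]) + b) for c in child_vecs]
    return np.max(np.stack(columns), axis=0)  # max pooling over feature-map columns

def score_tree(tree, embeddings):
    """Recursively compose a dependency tree bottom-up and accumulate
    per-node scores; a re-ranker would keep the candidate parse with the
    highest total score (this scoring scheme is an assumption)."""
    head, children = tree                     # tree = (word index, [subtrees])
    results = [score_tree(c, embeddings) for c in children]
    node_repr = compose(embeddings[head], [r for r, _ in results])
    node_score = float(v @ node_repr) + sum(s for _, s in results)
    return node_repr, node_score

# Toy usage: a 3-word sentence with word 1 as root and words 0 and 2
# as its dependents, given as one hypothetical candidate tree.
embeddings = {i: rng.standard_normal(d) * 0.1 for i in range(3)}
candidate = (1, [(0, []), (2, [])])
_, total = score_tree(candidate, embeddings)
print(f"candidate tree score: {total:.4f}")
```

In a full re-ranking setup, score_tree would be applied to each parse in a k-best list produced by a base dependency parser, and the highest-scoring candidate would be kept.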


Years Published: 2015–2023


Cited by 33 publications (50 citation statements).
References: 19 publications.
“…While finalizing the current paper we discovered a paper by Zhu et al. (2015) proposing a similar model which is evaluated on syntactic parsing. Our work goes substantially beyond theirs, however, as it takes a parse forest rather than a single tree as input.…”
Section: Recursive Convolutional Neural Network
Citation type: mentioning
confidence: 99%
“…Ma et al. (2015) extend the work of Kim (2014) by taking into account dependency relations so that long-range dependencies can be captured. The model proposed by Zhu et al. (2015), which is very similar to our recursive convolutional neural network model, uses a convolutional network for composition. Our work, although also employing a convolutional network and syntactic information, goes beyond them: we address the issue of how to deal with uncertainty about the correct parse inside the neural architecture.…”
Section: Related Work
Citation type: mentioning
confidence: 99%
“…Zhu et al. (2015) proposed a recursive convolutional neural network (RCNN) architecture to capture syntactic and compositional-semantic representations of phrases and words in a dependency tree. Different from the original recursive neural network, they introduced convolution and pooling layers, which can model a variety of compositions by the feature maps and choose the most informative compositions by the pooling layers.…”
Section: Related Work
Citation type: mentioning
confidence: 99%
“…Recently, distributed representations have been widely used in a variety of natural language processing (NLP) tasks (Collobert et al., 2011; Devlin et al., 2014; Socher et al., 2013; Turian et al., 2010; Mikolov et al., 2013b; Bengio et al., 2003). Specific to transition-based parsing, neural network-based methods have also received increasing attention due to their ability to minimize feature engineering effort and their boosted performance (Le and Zuidema, 2014; Stenetorp, 2013; Bansal et al., 2014; Chen and Manning, 2014; Zhu et al., 2015).…”
Section: Introduction
Citation type: mentioning
confidence: 99%