2014
DOI: 10.1109/taslp.2014.2327295
Dependency Parse Reranking with Rich Subtree Features

Abstract: In pursuing machine understanding of human language, highly accurate syntactic analysis is a crucial step. In this work, we focus on dependency grammar, which models syntax by encoding transparent predicate-argument structures. Recent advances in dependency parsing have shown that employing higher-order subtree structures in graph-based parsers can substantially improve parsing accuracy. However, the inefficiency of this approach increases with the order of the subtrees. This work explores a new reranking a…
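The abstract describes scoring candidate dependency parses with rich subtree features. A minimal sketch of the general idea, reranking a k-best list by weighted counts of grandparent-parent-child subtree features, is shown below; the feature template, weights, and function names are illustrative assumptions, not the paper's actual model.

```python
# Sketch of k-best dependency parse reranking with a higher-order
# subtree feature template. All names and weights are hypothetical.

from collections import Counter

def subtree_features(heads, tags):
    """Extract grandparent-parent-child tag-triple features.

    heads[i] is the head index of token i+1 (0 = root);
    tags[i] is the POS tag of token i+1.
    """
    feats = Counter()
    for child, head in enumerate(heads, start=1):
        if head == 0:
            continue  # child attaches directly to the root
        grand = heads[head - 1]
        g_tag = "ROOT" if grand == 0 else tags[grand - 1]
        feats[("gpc", g_tag, tags[head - 1], tags[child - 1])] += 1
    return feats

def rerank(candidates, tags, weights):
    """Return the candidate parse with the highest feature score."""
    def score(heads):
        return sum(weights.get(f, 0.0) * c
                   for f, c in subtree_features(heads, tags).items())
    return max(candidates, key=score)

tags = ["NN", "VB", "NN"]        # toy sentence: noun verb noun
candidates = [[2, 0, 2],         # both nouns attach to the verb
              [2, 0, 1]]         # second noun attaches to the first
weights = {("gpc", "ROOT", "VB", "NN"): 1.0}
best = rerank(candidates, tags, weights)
```

Here the first candidate fires the weighted feature twice (both noun children under the root verb), so it outscores the second and is returned.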

Cited by 9 publications (9 citation statements)
References 17 publications
“…We choose two-layer Bi-LSTM with 256 hidden dimensions. The adopted CNN has three layers with 100 filters per layer of size [3,4,5], respectively. For FastText [72], we use the bi-gram setting and only one layer for optimization.…”
Section: Sentiment Analysis
confidence: 99%
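The quoted setup uses three parallel CNN filter sizes [3, 4, 5] with 100 filters per size, max-pooled over the sequence. A pure-Python sketch of that feature-extraction shape is given below; the dimensions, random filters, and helper name are illustrative assumptions, not the cited paper's implementation.

```python
# Sketch of a text-CNN feature extractor: filter sizes [3, 4, 5],
# 100 filters per size, max-over-time pooling. Values are random
# and illustrative only.

import random

def conv1d_max(embeddings, filt):
    """Slide one filter over the token embeddings; max-pool responses."""
    k = len(filt)
    best = float("-inf")
    for i in range(len(embeddings) - k + 1):
        resp = sum(filt[j][d] * embeddings[i + j][d]
                   for j in range(k)
                   for d in range(len(embeddings[0])))
        best = max(best, resp)
    return best

random.seed(0)
dim, seq_len = 8, 12  # toy embedding dimension and sentence length
embeddings = [[random.gauss(0, 1) for _ in range(dim)]
              for _ in range(seq_len)]

features = []
for size in (3, 4, 5):            # the three filter sizes
    for _ in range(100):          # 100 filters per size
        filt = [[random.gauss(0, 0.1) for _ in range(dim)]
                for _ in range(size)]
        features.append(conv1d_max(embeddings, filt))
# yields 3 x 100 = 300 pooled features per sentence
```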
“…Word embedding is a real-valued vector representation of words by embedding both semantic and syntactic meanings obtained from unlabeled large corpus. It is a powerful tool widely used in modern natural language processing (NLP) tasks, including semantic analysis [1], information retrieval [2], dependency parsing [3], [4], [5], question answering [6], [7] and machine translation [6], [8], [9]. Learning a high quality representation is extremely important for these tasks, yet the question "what is a good word embedding model" remains an open problem.…”
Section: Introduction
confidence: 99%
“…Wang et al structured a regional CNN-LSTM model based on a subtree to analyze sentiment predictions [35]. A reranking approach for the dependency tree was provided utilizing complex subtree representations [36]. A bidirectional dependency tree representation was provided to extract dependency features from the input sentences [37].…”
Section: Dependency Tree
confidence: 99%
“…Mo Shen et al. [2] focused on dependency grammar, in which syntax is modelled by encoding predicate-argument structures. A new reranking approach was proposed for dependency parsing that can utilize complex subtree representations.…”
Section: Literature Survey
confidence: 99%