2019
DOI: 10.1609/aaai.v33i01.33016658
Dependency Grammar Induction with a Neural Variational Transition-Based Parser

Abstract: Dependency grammar induction is the task of learning dependency syntax without annotated training data. Traditional graph-based models with global inference achieve state-of-the-art results on this task, but they require O(n^3) run time. Transition-based models enable faster inference with O(n) time complexity, but their performance still lags behind. In this work, we propose a neural transition-based parser for dependency grammar induction, whose inference procedure utilizes rich neural features with O(n) time…
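The O(n) claim in the abstract comes from the transition-based formulation: a sentence of n words is parsed with a fixed number of shift/arc actions rather than global O(n^3) inference. Below is a minimal sketch of the arc-standard transition system, which is a standard choice for such parsers (an assumption here, not necessarily the exact system of the paper); the paper's neural policy is replaced by a trivial hand-written one just to make the linear transition count concrete.

```python
def arc_standard_parse(words, choose):
    """Parse with arc-standard transitions; returns 1-based head indices (0 = root)
    and the number of transitions taken."""
    heads = [0] * (len(words) + 1)               # heads[i] = head of token i
    stack, buffer = [0], list(range(1, len(words) + 1))  # 0 is the artificial root
    transitions = 0
    while buffer or len(stack) > 1:
        action = choose(stack, buffer)
        if action == "shift":                    # move next buffer token onto the stack
            stack.append(buffer.pop(0))
        elif action == "left-arc":               # second-top token takes top as its head
            dep = stack.pop(-2)
            heads[dep] = stack[-1]
        else:                                    # right-arc: top token takes second-top as head
            dep = stack.pop()
            heads[dep] = stack[-1]
        transitions += 1
    return heads[1:], transitions

# Trivial stand-in policy (hypothetical, replacing the learned neural scorer):
# shift while the buffer is nonempty, then right-arc, yielding a
# right-branching chain rooted at the first word.
def chain_policy(stack, buffer):
    return "shift" if buffer else "right-arc"

heads, n_trans = arc_standard_parse(["the", "dog", "barks"], chain_policy)
# n shifts + n arc actions = 2n transitions total, i.e. linear in sentence length
```

Each token is shifted once and attached once, so the loop executes exactly 2n times regardless of the tree produced; this is the source of the O(n) inference cost that the paper contrasts with graph-based global inference.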

Cited by 23 publications (34 citation statements) · References 22 publications
“…Williams et al (2018) observe a similar phenomenon in the context of learning latent trees for classification tasks. However, Li et al (2019) find that it is possible to use a transition-based parser as the inference network for dependency grammar induction, if the inference network is constrained via posterior regularization (Ganchev et al, 2010) based on universal syntactic rules (Naseem et al, 2010).…”
Section: Amortized Variational Inference
“…Furthermore, we compare our model with current state-of-the-art discriminative models, the neural variational transition-based parser (NVTP) (Li et al, 2019) (Noji et al, 2016). DV: deterministic variant of D-NDMV.…”
Section: Results On Universal Dependency Treebank
“…VV: variational variant of D-NDMV. NVTP: neural variational transition-based parser (Li et al, 2019). CM: Convex-MST.…”
Section: Results On Universal Dependency Treebank