Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2016
DOI: 10.18653/v1/n16-1024
Recurrent Neural Network Grammars

Abstract: This is a modified version of a paper originally published at NAACL 2016 that contains a corrigendum at the end, with improved results after fixing an implementation bug in the RNNG composition function. We introduce recurrent neural network grammars, probabilistic models of sentences with explicit phrase structure. We explain efficient inference procedures that allow application to both parsing and language modeling. Experiments show that they provide better parsing in English than any single previously…
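The corrigendum mentioned above concerns the RNNG composition function: when a REDUCE action closes a constituent, the embeddings of its nonterminal label and its children are composed into a single vector that goes back on the stack (in the paper, via a bidirectional LSTM). A minimal sketch, assuming PyTorch and illustrative layer sizes rather than the paper's exact configuration:

```python
import torch
import torch.nn as nn

class Composer(nn.Module):
    """Sketch of an RNNG-style composition function. The bidirectional
    LSTM reads the open-nonterminal embedding followed by the child
    embeddings; the final states are projected back to the stack
    dimension. Layer sizes and the tanh projection are illustrative
    assumptions, not the paper's exact configuration."""

    def __init__(self, dim: int):
        super().__init__()
        self.bilstm = nn.LSTM(dim, dim, bidirectional=True, batch_first=True)
        self.proj = nn.Linear(2 * dim, dim)

    def forward(self, label_emb: torch.Tensor, child_embs: list) -> torch.Tensor:
        dim = label_emb.size(-1)
        # Sequence: open nonterminal, then children left to right,
        # shaped (1, 1 + num_children, dim).
        seq = torch.stack([label_emb, *child_embs]).unsqueeze(0)
        out, _ = self.bilstm(seq)
        # Final forward state and final backward state (at position 0).
        fwd, bwd = out[0, -1, :dim], out[0, 0, dim:]
        return torch.tanh(self.proj(torch.cat([fwd, bwd])))
```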

Cited by 465 publications (638 citation statements)
References 41 publications
“…As reported in [9], reranking with generative RNN models delivers gains over end-to-end discriminative RNN models; our framework therefore has the potential to advance the reranking models [9], [11] by integrating these generative RNN models into decoding†, which remains future work. On the German dataset in particular, our parser outperforms other state-of-the-art parsers both with auto-labelled POS tags and with gold tags.…”
Section: Comparison With State-of-the-art Constituent Parsers
Mentioning confidence: 83%
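The reranking setup this statement refers to is straightforward to sketch: a discriminative parser proposes candidate trees, and a generative model (such as a generative RNNG) rescores them. A minimal sketch, assuming callable scorers and an illustrative interpolation weight:

```python
def rerank(candidates, gen_logprob, disc_logprob, alpha=0.8):
    """Select the candidate tree with the best mixture score.

    `candidates` is an iterable of parse trees proposed by a
    discriminative model; `gen_logprob` and `disc_logprob` score a tree
    under the generative and discriminative models respectively. The
    linear interpolation with weight `alpha` is an illustrative
    assumption, not a prescribed recipe.
    """
    return max(
        candidates,
        key=lambda t: alpha * gen_logprob(t) + (1.0 - alpha) * disc_logprob(t),
    )
```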
“…For example, [7] proposed an end-to-end RNN parser whose performance is comparable to that of state-of-the-art feature-rich parsers, and [9] proposed a variant RNN model for parsing. Unlike these pure neural network models, ours is a hybrid model consisting of neural networks and feature-rich models, which can combine the merits of both.…”
Section: Related Work
Mentioning confidence: 99%
“…The first assumption is violated when, e.g., non-local features (Huang, 2008) are used to define probabilities, or when probabilities are defined by recurrent neural nets that use hidden states derived from whole subtrees (Socher et al., 2013; Dyer et al., 2016). The second assumption is violated when, e.g.…”
Mentioning confidence: 99%
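The point about hidden states derived from whole subtrees can be made concrete: in a generative RNNG, each action's probability conditions on the entire derivation history, so there is no context-free locality for a chart-based dynamic program to exploit. A minimal sketch, with a hypothetical model interface (`init_state`, `action_logprob`, `step`) used only for illustration:

```python
def tree_logprob(actions, model):
    """Score a derivation (a sequence of SHIFT / REDUCE / NT(X) actions)
    under a generative RNNG-like model. The `model` interface here is
    hypothetical, used only to illustrate the dependency structure."""
    state = model.init_state()
    logp = 0.0
    for action in actions:
        # Each action's probability conditions on the full history via
        # `state`; there is no local rule score to memoize, so identical
        # subtrees in different contexts receive different scores and
        # chart-based dynamic programming does not apply.
        logp += model.action_logprob(state, action)
        state = model.step(state, action)
    return logp
```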