Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2015)
DOI: 10.3115/v1/p15-1168
Gated Recursive Neural Network for Chinese Word Segmentation

Abstract: Recently, neural network models for natural language processing tasks have attracted increasing attention for their ability to alleviate the burden of manual feature engineering. However, previous neural models cannot capture the complicated feature compositions that traditional methods with discrete features can. In this paper, we propose a gated recursive neural network (GRNN) for Chinese word segmentation, which contains reset and update gates to incorporate the complicated combinations of the context cha…
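The gating scheme the abstract describes can be sketched as a single recursive unit that merges two child vectors into a parent vector: reset gates filter what each child contributes to a candidate activation, and update gates softly choose among the candidate and the two children. The weight shapes and the softmax-normalised update gates below are assumptions for illustration, not the paper's exact parameterisation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_recursive_unit(h_left, h_right, params):
    """Merge two child vectors into a parent vector with reset and
    update gates, in the spirit of the GRNN sketched in the abstract.
    Weight shapes and softmax-normalised update gates are assumptions."""
    d = h_left.shape[0]
    children = np.concatenate([h_left, h_right])            # (2d,)
    # Reset gates: how much of each child feeds the candidate.
    r = sigmoid(params["Wr"] @ children)                    # (2d,)
    # Candidate activation built from the reset-gated children.
    h_cand = np.tanh(params["W"] @ (r * children))          # (d,)
    # Update gates: a soft choice among candidate, left child, right child.
    scores = (params["Wz"] @ children).reshape(3, d)        # (3, d)
    z = np.exp(scores - scores.max(axis=0))
    z /= z.sum(axis=0)                                      # softmax per dim
    return z[0] * h_cand + z[1] * h_left + z[2] * h_right   # (d,)

# Tiny smoke run with random weights (dimensions are illustrative).
rng = np.random.default_rng(0)
d = 4
params = {
    "Wr": rng.standard_normal((2 * d, 2 * d)),
    "W":  rng.standard_normal((d, 2 * d)),
    "Wz": rng.standard_normal((3 * d, 2 * d)),
}
h = gated_recursive_unit(rng.standard_normal(d), rng.standard_normal(d), params)
```

Applied bottom-up over adjacent character vectors, such units can compose the context-character combinations the abstract refers to.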

Cited by 97 publications (95 citation statements); references 11 publications.
“…As shown in Table 4, pre-training with conventional skip-gram embeddings gives only small improvements, which is consistent with findings of previous work (Chen et al., 2015a; Ma and Hinrichs, 2015; Cai and Zhao, 2016). Segmentation with self-training even shows accuracy drops on PKU and MSR.…”
Section: In-domain Results (supporting)
confidence: 90%
“…Pei et al. (2014) improved upon Zheng et al. (2013) by explicitly modeling the interactions between the local context and the previous tag. Chen et al. (2015a) proposed a gated recursive neural network to model the feature combinations of context characters. Chen et al. (2015b) used an LSTM architecture to capture potential long-distance dependencies, which alleviated the limitation of the context window size but introduced another window over the hidden states.…”
Section: Introduction (mentioning)
confidence: 99%
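All of the models surveyed above treat segmentation as character tagging. A minimal decoder from a character-level tag sequence to words clarifies the setup; the common BMES scheme (B=begin, M=middle, E=end, S=single) is an assumption here, since the excerpt does not name the tag set:

```python
def tags_to_words(chars, tags):
    """Decode a character-level BMES tag sequence into words.
    The BMES scheme is assumed; the excerpt only refers to 'tags'."""
    words, buf = [], []
    for ch, tag in zip(chars, tags):
        buf.append(ch)
        if tag in ("E", "S"):          # a word ends here
            words.append("".join(buf))
            buf = []
    if buf:                            # flush a trailing unfinished word
        words.append("".join(buf))
    return words

# "中文分词" with tags B E B E segments into 中文 / 分词.
print(tags_to_words("中文分词", ["B", "E", "B", "E"]))
```

Under this framing, the neural models differ only in how they score each character's tag from its context.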
“…Pei et al. [4] proposed a Max-Margin Tensor Neural Network (MMTNN) for CWS tasks, which can model complicated interactions between tags and context characters while speeding up the model and avoiding over-fitting. Chen et al. [19] proposed a Gated Recursive Neural Network (GRNN) segmentation model that incorporates the complicated combinations of context characters through reset and update gates. To capture long-distance information, various long short-term memory (LSTM) neural networks were proposed to obtain local and long-distance dependency information for the observed tokens, and the experimental results showed that the LSTM networks outperform other DNNs [20]–[22].…”
Section: Neural CWS Tasks (mentioning)
confidence: 99%