2016
DOI: 10.1007/978-3-319-46681-1_42

Bi-directional LSTM Recurrent Neural Network for Chinese Word Segmentation

Abstract: Recurrent neural networks (RNNs) have been broadly applied to natural language processing (NLP) problems. This kind of neural network is designed for modeling sequential data and has been shown to be quite effective in sequential tagging tasks. In this paper, we propose to use a bi-directional RNN with long short-term memory (LSTM) units for Chinese word segmentation, which is a crucial preprocessing task for modeling Chinese sentences and articles. Classical methods focus on designing and combining hand-crafted featur…
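The abstract frames Chinese word segmentation as a sequential tagging task. As an illustrative sketch (the BMES tag scheme is a common convention for this task; the `bmes_decode` helper below is an assumption for illustration, not code from the paper), a tagger such as the proposed bi-directional LSTM emits one tag per character, and word boundaries are then recovered from the tags:

```python
def bmes_decode(chars, tags):
    """Group characters into words according to BMES tags.

    B = word begins, M = word continues (middle),
    E = word ends, S = single-character word.
    """
    words, current = [], []
    for ch, tag in zip(chars, tags):
        if tag == "S":       # single-character word
            words.append(ch)
        elif tag == "B":     # start a new multi-character word
            current = [ch]
        elif tag == "M":     # extend the current word
            current.append(ch)
        elif tag == "E":     # close the current word
            current.append(ch)
            words.append("".join(current))
            current = []
    return words

# Example: "我爱北京" tagged S S B E segments into 我 / 爱 / 北京
print(bmes_decode("我爱北京", ["S", "S", "B", "E"]))  # → ['我', '爱', '北京']
```

The tagger's job is then purely to predict the tag sequence; segmentation quality reduces to per-character tagging accuracy.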

Cited by 106 publications (53 citation statements)
References 12 publications
“…BiLSTM has two parallel layers propagating in both directions . The final output of such a network would be the concatenation of the output of these two layers.…”
Section: Deep Learning Models For Sentiment Classification
confidence: 99%
“…BiLSTM has two parallel layers propagating in both directions. The final output of such a network would be the concatenation of the output of these two layers. With h_F(t) as the hidden state of the forward-propagating layer and h_B(t) as the hidden state of the backward-propagating layer, the hidden state h(t) of the BiLSTM can be calculated as follows:…”
Section: Bi-directional Long Short Time Memory (BiLSTM)
confidence: 99%
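The statement above describes the BiLSTM output as the concatenation h(t) = [h_F(t); h_B(t)] of the forward and backward hidden states. A minimal NumPy sketch of that mechanism (a toy LSTM cell with shared weights across directions for brevity — real BiLSTMs use separate parameters per direction, and none of this code is from the cited papers):

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_h = 4, 3  # toy dimensions; real models are far larger

# One stacked weight matrix for the four gates (input, forget, cell, output).
W = rng.standard_normal((4 * d_h, d_in + d_h)) * 0.1
b = np.zeros(4 * d_h)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_layer(xs):
    """Run a single LSTM pass over xs of shape (seq_len, d_in)."""
    h, c, hs = np.zeros(d_h), np.zeros(d_h), []
    for x in xs:
        z = W @ np.concatenate([x, h]) + b
        i, f, g, o = z[:d_h], z[d_h:2*d_h], z[2*d_h:3*d_h], z[3*d_h:]
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)  # cell state update
        h = sigmoid(o) * np.tanh(c)                   # hidden state
        hs.append(h)
    return np.stack(hs)

xs = rng.standard_normal((5, d_in))            # a length-5 input sequence
h_fwd = lstm_layer(xs)                         # forward layer: h_F(t)
h_bwd = lstm_layer(xs[::-1])[::-1]             # backward layer, re-aligned: h_B(t)
h_bi = np.concatenate([h_fwd, h_bwd], axis=1)  # h(t) = [h_F(t); h_B(t)]
print(h_bi.shape)  # → (5, 6), i.e. seq_len x 2*d_h
```

Note the backward pass runs over the reversed sequence and is flipped back before concatenation, so that position t in h_bi pairs h_F(t) with h_B(t) for the same input position.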
“…For segmentation, Yao and Huang (2016) successfully used a bi-directional LSTM model for segmenting Chinese text. In this paper, we build on their work and extend it in two ways, namely combining bi-LSTM with CRF and applying it to Arabic, which is an alphabetic language.…”
Section: Related Work
confidence: 99%
“…In the past few years, neural networks have been widely used to solve NLP problems. Specifically, several RNN-based neural networks have been proposed to handle sequence labeling tasks including Chinese word segmentation (Yao and Huang, 2016), POS tagging (Huang et al., 2015), and NER (Chiu and Nichols, 2015; Lample et al., 2016), which achieved outstanding performance against traditional methods.…”
Section: Introduction
confidence: 99%