2021
DOI: 10.1016/j.csl.2020.101134
A Korean named entity recognition method using Bi-LSTM-CRF and masked self-attention

Cited by 20 publications (7 citation statements). References 5 publications.
“…LSTM is essentially derived from the recurrent neural network (RNN) [19]. RNN is an extraordinary network for processing serial data.…”
Section: Long Short-Term Memory (LSTM) Network (mentioning, confidence: 99%)
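The excerpt above notes that the LSTM is derived from the RNN, a network for processing serial data. A minimal sketch of the vanilla RNN recurrence (which the LSTM extends with gates and a cell state) might look like the following; all dimensions and weights are toy values, not taken from the cited paper:

```python
import numpy as np

rng = np.random.default_rng(0)
IN_DIM, HID_DIM = 3, 4                      # hypothetical toy sizes
W = rng.normal(size=(HID_DIM, IN_DIM))      # input-to-hidden weights
U = rng.normal(size=(HID_DIM, HID_DIM))     # hidden-to-hidden (recurrent) weights
b = np.zeros(HID_DIM)

def rnn_step(h_prev, x_t):
    # Vanilla RNN recurrence: the new hidden state depends on the previous
    # state and the current input. An LSTM replaces this single update with
    # gated updates to a separate cell state, easing long-range dependencies.
    return np.tanh(W @ x_t + U @ h_prev + b)

h = np.zeros(HID_DIM)
for x_t in rng.normal(size=(5, IN_DIM)):    # a length-5 "serial" input
    h = rnn_step(h, x_t)
```

The recurrent weight matrix `U` is what makes the network sequential: each step's output feeds the next step's computation.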
“…Zhang et al [42] merged pre-trained Glove and word2vec vectors. Although most of the research studies in NER used the pre-trained word embeddings as features, fixing the values of the vectors during model training, there exist some studies that fine-tuned these pre-trained embeddings during the training process [43]- [44]. To reduce the impact of Out of Vocabulary (OOV) words, word level representations are concatenated with character level representations.…”
Section: Related Work (mentioning, confidence: 99%)
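The excerpt above describes concatenating word-level representations with character-level representations to reduce the impact of out-of-vocabulary (OOV) words. A minimal sketch of that idea, with a toy character encoder standing in for the char-CNN/LSTM such systems typically use (all names and sizes are hypothetical):

```python
import numpy as np

WORD_DIM, CHAR_DIM = 4, 3                    # hypothetical toy dimensions

word_emb = {"seoul": np.ones(WORD_DIM)}      # tiny stand-in for a pre-trained table
unk = np.zeros(WORD_DIM)                     # OOV words fall back to a zero vector

def char_level(word):
    # Toy character-level encoder: mean of per-character codes. A real system
    # would use a character CNN or LSTM here, but the point is the same:
    # it produces a vector for ANY string, seen in training or not.
    codes = np.array([[ord(c) % 7, ord(c) % 5, ord(c) % 3] for c in word],
                     dtype=float)
    return codes.mean(axis=0)

def represent(word):
    # Concatenate word-level and character-level vectors, as the excerpt
    # describes; for OOV words the character part still carries signal.
    return np.concatenate([word_emb.get(word, unk), char_level(word)])

in_vocab = represent("seoul")    # word part from the table, char part computed
oov = represent("busan")         # word part is zeros, char part still informative
```

For an OOV word the word-level half is the UNK vector, but the character-level half remains non-trivial, which is precisely why the concatenation mitigates OOV effects.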
“…The first is the softmax function for direct tag prediction, which is straightforward to compute. The second is the CRF, a well-known statistical graphical model which has demonstrated state-of-the-art accuracy on many sequence labeling tasks (Jin and Yu, 2021). The third is a variant of RNN (Goller and Kuchler, 1996) (e.g.…”
Section: Sequence Labeling Models (mentioning, confidence: 99%)
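The excerpt above contrasts per-token softmax prediction with CRF decoding. The difference can be sketched with toy scores (all emission and transition values below are hypothetical, not from the cited paper): softmax picks the best tag at each position independently, while CRF-style Viterbi decoding maximizes the joint emission-plus-transition score over the whole sequence.

```python
import numpy as np

# Toy emission scores for 4 tokens over 3 tags (hypothetical values).
emissions = np.array([[2.0, 0.0, 0.0],
                      [0.0, 1.9, 2.0],
                      [0.0, 2.0, 0.0],
                      [1.0, 0.0, 1.5]])
# Toy transition scores trans[prev, cur], e.g. encoding BIO-style preferences.
trans = np.array([[ 0.0,  1.0, -2.0],
                  [-2.0,  0.0,  2.0],
                  [ 1.0, -2.0,  0.0]])

# (1) Softmax-style direct prediction: per-token argmax, ignores transitions.
softmax_tags = emissions.argmax(axis=1)

# (2) CRF-style Viterbi decoding: best path under emission + transition scores.
def viterbi(em, tr):
    n, k = em.shape
    score = em[0].copy()                 # best score ending in each tag
    back = np.zeros((n, k), dtype=int)   # backpointers to the best previous tag
    for t in range(1, n):
        cand = score[:, None] + tr + em[t][None, :]   # shape (prev_tag, cur_tag)
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    path = [int(score.argmax())]
    for t in range(n - 1, 0, -1):        # follow backpointers to recover the path
        path.append(int(back[t][path[-1]]))
    return path[::-1]

crf_tags = viterbi(emissions, trans)
```

With these scores the two decoders disagree at the second token: the softmax takes the locally best tag, while Viterbi trades a slightly lower emission score for a much better transition, which is the behavior that makes CRF layers effective on sequence labeling tasks.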