2020
DOI: 10.3390/e22020252

Semi-Supervised Bidirectional Long Short-Term Memory and Conditional Random Fields Model for Named-Entity Recognition Using Embeddings from Language Models Representations

Abstract: Increasingly popular online museums have significantly changed the way people acquire cultural knowledge. These online museums have been generating abundant amounts of cultural relics data. In recent years, researchers have used deep learning models that can automatically extract complex features and have rich representation capabilities to implement named-entity recognition (NER). However, the lack of labeled data in the field of cultural relics makes it difficult for deep learning models that rely on labele…
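The architecture named in the title (a bidirectional LSTM encoder with a CRF decoding layer, fed by ELMo-style contextual embeddings) can be outlined in a few lines of PyTorch. The sketch below is not the authors' implementation: it substitutes a plain trainable embedding table for the ELMo representations and a per-token softmax loss for the CRF layer, and every name in it (ToyBiLSTMTagger, the dimensions, the toy batch) is an assumption made for illustration.

import torch
import torch.nn as nn

class ToyBiLSTMTagger(nn.Module):
    """Minimal BiLSTM token tagger; the CRF layer is replaced by per-token scores."""
    def __init__(self, vocab_size, num_tags, embed_dim=100, hidden_dim=128):
        super().__init__()
        # Stand-in for ELMo: a plain trainable embedding table.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional LSTM encoder over the token sequence.
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        # Projects each token's forward+backward state to tag (emission) scores.
        self.emissions = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids):            # token_ids: (batch, seq_len)
        states, _ = self.encoder(self.embed(token_ids))
        return self.emissions(states)        # (batch, seq_len, num_tags)

# Toy usage: score a batch of two 5-token sentences over 9 BIO tags.
model = ToyBiLSTMTagger(vocab_size=5000, num_tags=9)
scores = model(torch.randint(0, 5000, (2, 5)))
loss = nn.CrossEntropyLoss()(scores.reshape(-1, 9), torch.randint(0, 9, (10,)))
loss.backward()

In the full BiLSTM-CRF model described by the paper, a CRF layer would replace the softmax so that whole tag sequences are scored jointly and invalid label transitions are penalized.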

Cited by 25 publications (16 citation statements)
References 44 publications
“…A BiGRU [38] is similar to a BiLSTM [39] model in which backpropagation learning is introduced on the basis of GRU, as shown in Fig. 6(b).…”
Section: Methods (mentioning)
confidence: 99%
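The comparison drawn in the statement above (BiGRU versus BiLSTM) amounts to swapping the recurrent cell while keeping the bidirectional wiring. A minimal PyTorch illustration of that difference, not taken from either cited paper, with all sizes chosen arbitrarily:

import torch
import torch.nn as nn

# Both layers read the sequence forwards and backwards and concatenate the states;
# only the gating inside the recurrent cell differs (a GRU has no separate cell state).
bigru  = nn.GRU(input_size=100, hidden_size=128, batch_first=True, bidirectional=True)
bilstm = nn.LSTM(input_size=100, hidden_size=128, batch_first=True, bidirectional=True)

x = torch.randn(4, 20, 100)                 # (batch, seq_len, features)
gru_out, _  = bigru(x)                      # (4, 20, 256): forward + backward states
lstm_out, _ = bilstm(x)                     # (4, 20, 256)
print(gru_out.shape, lstm_out.shape)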
“…One way is to extract the entities from the structured data of the Shaanxi History Museum ( ) and the List of National Cultural Relics Collection (LNCRC, ), including three types of cultural relics: pottery, porcelain and bronzeware. Another way is to automatically extract entities from semi-structured and unstructured data, provided by Wikipedia and online museums ( ), using the entity extraction method proposed by Zhang et al [ 33 ]. The relation types and entity types are determined by the guidance of cultural relic experts.…”
Section: Methods (mentioning)
confidence: 99%
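The two-track construction described above (entities read directly from structured museum records, plus entities extracted from semi-structured and unstructured text) could be sketched roughly as follows. Everything here is a hypothetical placeholder: the records, the sentence, and the tag_text stub are invented for illustration and are not the pipeline of Zhang et al.

# Hypothetical merge of entities from a structured table and from NER over free text.
structured_records = [
    {"name": "painted pottery jar", "type": "pottery"},
    {"name": "celadon vase", "type": "porcelain"},
]

def tag_text(sentence):
    """Placeholder for a trained NER tagger; returns (entity, type) pairs."""
    return [("bronze tripod", "bronzeware")]   # dummy prediction for illustration

entities = {(r["name"], r["type"]) for r in structured_records}
for sentence in ["The museum exhibits a bronze tripod from the Zhou dynasty."]:
    entities.update(tag_text(sentence))

print(sorted(entities))   # merged entity set covering both sources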
“…Recurrent LSTM networks can address the limitations of traditional time series forecasting techniques by adapting to the nonlinearities of a given COVID-19 dataset and can produce state-of-the-art results on temporal data (Chimmula & Zhang, 2020). The Long Short-Term Memory (LSTM) model (M. Zhang, Geng, & Chen, 2020; Q. Zhang, Gao, Liu, & Zheng, 2020) is an advancement of the recurrent neural network.…”
Section: Methods and Models (mentioning)
confidence: 99%
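The point made in this last statement (the LSTM as an extension of the plain recurrent network for temporal data) can be illustrated with a small one-step-ahead forecaster. This is a generic sketch under arbitrary assumptions, not the COVID-19 model of Chimmula & Zhang; the class name and dimensions are invented for the example.

import torch
import torch.nn as nn

class OneStepForecaster(nn.Module):
    """Predicts the next value of a univariate series from a window of past values."""
    def __init__(self, hidden_dim=32):
        super().__init__()
        # nn.RNN would work here too; the LSTM's gates help with longer dependencies.
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, window):                 # window: (batch, steps, 1)
        states, _ = self.lstm(window)
        return self.head(states[:, -1])        # use the last hidden state

model = OneStepForecaster()
window = torch.randn(8, 30, 1)                 # 8 series, 30 past steps each
next_value = model(window)                     # (8, 1) one-step-ahead forecasts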