2018
DOI: 10.3390/fi10100095

Chinese Event Extraction Based on Attention and Semantic Features: A Bidirectional Circular Neural Network

Abstract: Chinese event extraction uses word embeddings to capture similarity, but suffers when handling previously unseen or rare words. Our experiments show that characters can provide information that words cannot, so we propose a novel architecture for combining word representations: character–word embedding based on attention and semantic features. By using an attention mechanism, our method dynamically decides how much information to use from the word- or character-level embedding. With the…
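The abstract's core idea — letting attention decide, per token, how much to trust the word embedding versus the character embedding — can be sketched as a learned gate. This is a minimal illustration, not the paper's exact architecture: the gate parameterization (a sigmoid over a linear map of both embeddings) and all dimensions are assumptions.

```python
import numpy as np

def gated_combination(word_emb, char_emb, w_gate):
    """Combine word- and character-level embeddings of one token.

    A learned gate decides, per dimension, how much information to
    take from each level; rare or unseen words can then fall back on
    their character-level representation.
    """
    # Gate in (0, 1): element-wise sigmoid of a linear function of both embeddings.
    z = w_gate @ np.concatenate([word_emb, char_emb])
    gate = 1.0 / (1.0 + np.exp(-z))
    # Convex combination per dimension.
    return gate * word_emb + (1.0 - gate) * char_emb

rng = np.random.default_rng(0)
d = 8
word_emb = rng.normal(size=d)
char_emb = rng.normal(size=d)            # e.g. a pooled character representation
w_gate = rng.normal(size=(d, 2 * d)) * 0.1   # hypothetical gate parameters
combined = gated_combination(word_emb, char_emb, w_gate)
print(combined.shape)  # (8,)
```

Because the gate lies strictly in (0, 1), each output dimension is an interpolation between the word and character values for that dimension.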

Cited by 9 publications (6 citation statements)
References 9 publications
“…Zeng et al. [14] use a bidirectional recurrent neural network to extract sentence features and a CNN to extract lexical features, which reduces the impact of Chinese word-segmentation errors and improves the performance of Chinese event extraction. Wu et al. [15] propose a neural network model based on semantic features and an attention mechanism, which uses word-vector information and attention to generate word vectors. They combine external semantic features to improve the quality of the word vectors and achieve good results on event extraction tasks.…”
Section: Related Work
confidence: 99%
“…Many word-level attention mechanisms have been proposed to learn the importance of each word in a sentence, e.g., to distinguish different argument words, word types, and word relations [197]–[202]. They differ mainly in which elements should be given more attention and in how the attention vectors are trained.…”
Section: E. Attention Mechanism
confidence: 99%
“…The basic idea is that a syntactic dependency can connect two possibly nonconsecutive, distant words, while the dependency type can help to distinguish the syntactic importance between words. For Chinese event extraction, since Chinese has no explicit word segmentation as English does, Wu et al. [202] also proposed a character-level attention mechanism to weigh the importance of each character in a Chinese word.…”
Section: E. Attention Mechanism
confidence: 99%
“…They combined the distributed word vector, the part-of-speech (POS) feature, and the named-entity (NE) feature of the medical text as the input of the model. Wu and Zhang [23] used a binary representation model to construct word vectors based on part of speech, dependencies, and distance from core words. A BiLSTM-CRF model was used to implement Chinese event extraction and achieved good results.…”
Section: B. Word Representation
confidence: 99%
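The input construction described in the last statement — concatenating a distributed word vector with POS and named-entity features — can be sketched with one-hot encodings. The tag sets and the 50-dimensional embedding size below are placeholders, not the cited work's actual vocabularies.

```python
import numpy as np

POS_TAGS = ["n", "v", "a", "d"]          # illustrative POS tag set
NE_TAGS = ["O", "PER", "LOC", "ORG"]     # illustrative entity label set

def one_hot(label, vocab):
    """Encode a categorical label as a one-hot vector over `vocab`."""
    vec = np.zeros(len(vocab))
    vec[vocab.index(label)] = 1.0
    return vec

def token_features(word_vec, pos, ne):
    """Concatenate the distributed word vector with one-hot POS and
    named-entity features, giving the per-token input of the tagger."""
    return np.concatenate([word_vec, one_hot(pos, POS_TAGS), one_hot(ne, NE_TAGS)])

word_vec = np.zeros(50)                  # placeholder 50-dim word embedding
feat = token_features(word_vec, "v", "O")
print(feat.shape)  # (58,)
```

In a BiLSTM-CRF pipeline, one such vector per token would form the sequence fed to the BiLSTM encoder.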