Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1047

Enhancing Local Feature Extraction with Global Representation for Neural Text Classification

Abstract: For text classification, traditional local-feature-driven models learn long-range dependency by deep stacking or hybrid modeling. This paper proposes a novel Encoder1-Encoder2 architecture, in which global information is incorporated into the procedure of local feature extraction from scratch. In particular, Encoder1 serves as a global information provider, while Encoder2 performs as a local feature extractor whose output is fed directly into the classifier. Meanwhile, two modes are also designed for their interaction. Than…
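The Encoder1-Encoder2 idea in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the paper's exact design: mean-pooling as Encoder1, a windowed convolution as Encoder2, concatenating the global vector into each local window as one hypothetical interaction mode, and all dimensions are toy assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def encoder1_global(embeddings):
    """Hypothetical Encoder1: mean-pools token embeddings into one global vector."""
    return embeddings.mean(axis=0)

def encoder2_local(embeddings, global_vec, conv_w, conv_b):
    """Hypothetical Encoder2: windowed local feature extraction, with the
    global vector concatenated to every window (one possible interaction
    mode; the paper's actual modes may differ)."""
    k = 3  # convolution window size
    feats = []
    for i in range(len(embeddings) - k + 1):
        window = embeddings[i:i + k].reshape(-1)       # local context
        window = np.concatenate([window, global_vec])  # inject global info
        feats.append(np.tanh(window @ conv_w + conv_b))
    return np.max(feats, axis=0)                       # max-over-time pooling

# toy dimensions: 10 tokens, 8-dim embeddings, 16 filters, 4 classes
T, d, f, C = 10, 8, 16, 4
emb = rng.normal(size=(T, d))
conv_w = rng.normal(size=(3 * d + d, f)) * 0.1
conv_b = np.zeros(f)
cls_w = rng.normal(size=(f, C)) * 0.1

g = encoder1_global(emb)
local = encoder2_local(emb, g, conv_w, conv_b)
logits = local @ cls_w  # only Encoder2's output reaches the classifier
print(logits.shape)     # (4,)
```

Note how the classifier sees only Encoder2's features; Encoder1 influences the result solely through the injected global vector, matching the "global information provider" role described above.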


Cited by 14 publications (3 citation statements). References 25 publications.
“…Qin et al. [4] proposed a feature projection layer to eliminate redundant information from features and improve feature quality. Niu et al. [14] proposed a new Encoder1–Encoder2 structure, where Encoder1 is a global information extractor and Encoder2 is a local information extractor. The global information vectors are merged with the local information vectors for higher performance.…”
Section: Related Work
confidence: 99%
“…The syntactic dependency tree is highly structured, which inspires us to capture more structural information from the parser model. The Convolutional Neural Network (CNN) (LeCun et al., 1998) has been shown to retain the capacity to capture local structure (Niu et al., 2019). Therefore, we adopt a parallel CNN structure, the query-key CNN (QKCNN), as the structural information extractor (Yang et al., 2018), where the QKCNN is composed of a query CNN and a key CNN.…”
Section: Structural Extractor
confidence: 99%
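The parallel query/key CNN branches mentioned in this excerpt can be sketched as two 1D convolutions whose outputs interact through pairwise scores. Everything here is an assumption for illustration: the scaled dot-product combination, the softmax, and all dimensions are not taken from the cited QKCNN design.

```python
import numpy as np

rng = np.random.default_rng(1)

def conv1d(x, w, b):
    """Simple valid 1D convolution: x is (T, d_in), w is (k, d_in, d_out)."""
    k = w.shape[0]
    out = [np.tensordot(x[i:i + k], w, axes=([0, 1], [0, 1])) + b
           for i in range(x.shape[0] - k + 1)]
    return np.stack(out)  # (T - k + 1, d_out)

# toy setup: 12 positions, 8-dim input, 6-dim query/key space, window 3
T, d, h, k = 12, 8, 6, 3
x = rng.normal(size=(T, d))
w_q, b_q = rng.normal(size=(k, d, h)) * 0.1, np.zeros(h)
w_k, b_k = rng.normal(size=(k, d, h)) * 0.1, np.zeros(h)

q = conv1d(x, w_q, b_q)         # query CNN branch
kk = conv1d(x, w_k, b_k)        # key CNN branch (parallel, same input)
scores = q @ kk.T / np.sqrt(h)  # pairwise query-key scores (assumed combination)
attn = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)
out = attn @ kk                 # structure-aware local features
print(out.shape)                # (10, 6)
```

The point of the sketch is the parallelism: both branches convolve the same sequence, so each retains CNN-style local structure while their interaction yields position-pair information.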
“…Global Context Encoder. Considering that a complaint text contains several fine-grained clauses, we introduce a global context encoder [17] to capture the global textual context information c, which helps identify the element of each clause in the complaint text x. We concatenate all clauses in the complaint text x as c = concat([x_1, ..., x_n]), take their embeddings as the input of the global encoder, and use a bi-directional GRU [9] to encode the whole text sequence.…”
Section: Our Model
confidence: 99%
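The global context encoder in this last excerpt — concatenate the clauses, then run a bi-directional GRU over the whole sequence — can be sketched in NumPy. This is a minimal illustration, not the cited model: the gate equations follow one common GRU convention, the final forward and backward states are concatenated as the global context, and all dimensions are toy assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def gru_cell(x, h, W, U, b):
    """Minimal GRU step; gate parameters stacked as [update z, reset r, candidate n]."""
    z = 1 / (1 + np.exp(-(x @ W[0] + h @ U[0] + b[0])))  # update gate
    r = 1 / (1 + np.exp(-(x @ W[1] + h @ U[1] + b[1])))  # reset gate
    n = np.tanh(x @ W[2] + (r * h) @ U[2] + b[2])        # candidate state
    return (1 - z) * h + z * n

def run_gru(seq, params, reverse=False):
    W, U, b = params
    h = np.zeros(U.shape[-1])
    order = reversed(range(len(seq))) if reverse else range(len(seq))
    for t in order:
        h = gru_cell(seq[t], h, W, U, b)
    return h

# toy clauses: three clauses of 3, 4, and 2 token embeddings (d_in = 5)
d_in, d_h = 5, 7
clauses = [rng.normal(size=(n, d_in)) for n in (3, 4, 2)]
x = np.concatenate(clauses)  # c = concat([x_1, ..., x_n])

def make_params():
    return (rng.normal(size=(3, d_in, d_h)) * 0.1,
            rng.normal(size=(3, d_h, d_h)) * 0.1,
            np.zeros((3, d_h)))

fwd, bwd = make_params(), make_params()
# global context vector: final forward state || final backward state
c = np.concatenate([run_gru(x, fwd), run_gru(x, bwd, reverse=True)])
print(c.shape)  # (14,)
```

In practice one would use a library GRU; the sketch only shows how clause boundaries disappear after concatenation, so the global vector c summarizes the whole complaint text rather than any single clause.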