2020
DOI: 10.1109/access.2019.2954985

A Fusion Model-Based Label Embedding and Self-Interaction Attention for Text Classification

Abstract: Text classification is a pivotal task in NLP (Natural Language Processing), which has received widespread attention recently. Most of the existing methods leverage the power of deep learning to improve the performance of models. However, these models ignore the interaction information between all the sentences in a text when generating the current text representation, which results in a partial loss of semantics. Labels play a central role in text classification, and the attention learned from text-label in the j…
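The mechanism the abstract outlines — sentences attending to one another to capture interaction information, then a text representation weighted by compatibility with label embeddings — can be sketched compactly. Below is a minimal illustrative sketch, not the paper's actual architecture: the dot-product scoring, the max-over-labels pooling, and all function and variable names are assumptions introduced here.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_interaction_attention(S):
    """Let every sentence attend to every other sentence in the text.

    S: (n_sentences, d) matrix of sentence embeddings.
    Returns (n_sentences, d) context-enriched sentence representations.
    """
    scores = S @ S.T / np.sqrt(S.shape[1])   # pairwise interaction scores
    weights = softmax(scores, axis=-1)       # attention over all sentences
    return weights @ S                       # weighted sum carries interaction info

def label_attention(H, L):
    """Pool sentence representations into one label-aware text vector.

    H: (n_sentences, d) sentence representations; L: (n_labels, d) label embeddings.
    """
    scores = H @ L.T / np.sqrt(H.shape[1])   # sentence-vs-label compatibility
    weights = softmax(scores.max(axis=1))    # per-sentence relevance to any label
    return weights @ H                       # label-aware text representation

# Toy usage: 4 sentences, 3 labels, dimension 8.
rng = np.random.default_rng(0)
S, L = rng.normal(size=(4, 8)), rng.normal(size=(3, 8))
text_vec = label_attention(self_interaction_attention(S), L)
print(text_vec.shape)  # (8,)
```

The point of the sketch is the two-stage flow: interaction attention first, so each sentence representation already reflects the rest of the text, then label attention to decide which sentences matter for classification.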


Cited by 36 publications (20 citation statements)
References 30 publications (29 reference statements)
“…Recently, the self-attention mechanism has attracted widespread attention and achieved state-of-the-art results. Dong et al. [25] combined a self-interaction attention mechanism with label representation and used the BERT model for text feature extraction. In this method, joint word and label representations are proposed to improve the efficiency of the model.…”
Section: Complex Neural Network Method
confidence: 99%
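The statement above names BERT as the text feature extractor. A minimal sketch of such a feature-extraction step with the Hugging Face transformers library follows; the checkpoint and the mean-pooling step are illustrative assumptions, not the cited paper's exact configuration.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Load a standard pre-trained BERT encoder (assumption: any BERT variant would do).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

texts = ["A short example document.", "Another document to classify."]
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    hidden = encoder(**batch).last_hidden_state     # (batch, seq_len, 768)

# Mean-pool over non-padding tokens to get one feature vector per text
# (an illustrative pooling choice; the paper may pool differently).
mask = batch["attention_mask"].unsqueeze(-1)        # (batch, seq_len, 1)
features = (hidden * mask).sum(1) / mask.sum(1)     # (batch, 768)
print(features.shape)  # torch.Size([2, 768])
```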
“…Word2vec's high-dimensional vector model addresses the multi-dimensional semantic problem: given a corpus, its optimized training procedure can quickly and effectively represent words as high-dimensional vectors, providing a new tool for applied research in natural language processing [63]. Academic research [64,65] demonstrates that Word2vec has achieved excellent performance in text similarity calculation and text classification. In light of the above analysis, this study opted to construct Word2vec vectors for the pre-processed and semantically expanded comment text.…”
Section: Text Representation (1) Text Representation of Comments Based on Word2vec
confidence: 99%
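As a concrete illustration of the workflow this statement describes, here is a minimal sketch of training Word2vec on tokenised comment text and averaging word vectors into a comment vector, using the gensim 4.x API. The toy corpus and the averaging choice are assumptions for illustration.

```python
import numpy as np
from gensim.models import Word2Vec

# Pre-processed (tokenised) comment texts -- an illustrative toy corpus.
comments = [
    ["battery", "life", "is", "excellent"],
    ["screen", "quality", "is", "poor"],
    ["excellent", "screen", "and", "battery"],
]

# Train a small skip-gram Word2vec model on the corpus (gensim 4.x API).
model = Word2Vec(sentences=comments, vector_size=50, window=3,
                 min_count=1, sg=1, epochs=50, seed=1)

def comment_vector(tokens, model):
    # Average the word vectors of in-vocabulary tokens (a common, simple choice).
    vecs = [model.wv[t] for t in tokens if t in model.wv]
    return np.mean(vecs, axis=0) if vecs else np.zeros(model.vector_size)

print(comment_vector(comments[0], model).shape)  # (50,)
```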
“…The context vector c is calculated with these attention weights and input to the decoder, which allows the decoder to access the whole input sequence while focusing on the relevant positions in it. Given this strong performance, attention mechanisms have found numerous applications in Natural Language Processing [25], [26], Speech Recognition [27], [28], Text Processing [29], and Computer Vision [30], among others.…”
Section: Attention Mechanism
confidence: 99%
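The context-vector computation this statement describes is standard: score each encoder state against the current decoder state, normalise the scores with a softmax, and take the weighted sum. A minimal dot-product-attention sketch follows; the variable names and the choice of dot-product scoring are assumptions here (the cited works vary in their scoring functions).

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a vector.
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def attention_context(decoder_state, encoder_states):
    """Dot-product attention: score, normalise, weighted sum.

    decoder_state: (d,) current decoder hidden state s.
    encoder_states: (T, d) encoder hidden states h_1..h_T.
    Returns the context vector c and the attention weights alpha.
    """
    scores = encoder_states @ decoder_state   # e_i = s . h_i
    alpha = softmax(scores)                   # attention weights over input positions
    c = alpha @ encoder_states                # c = sum_i alpha_i * h_i
    return c, alpha

# Toy usage: 5 input positions, hidden size 6.
rng = np.random.default_rng(0)
c, alpha = attention_context(rng.normal(size=6), rng.normal(size=(5, 6)))
print(c.shape, round(alpha.sum(), 6))  # (6,) 1.0
```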