Proceedings of the 12th International Conference on Agents and Artificial Intelligence 2020
DOI: 10.5220/0008940304940505

MAGNET: Multi-Label Text Classification using Attention-based Graph Neural Network

Abstract: In Multi-Label Text Classification (MLTC), one sample can belong to more than one class. It is observed that in most MLTC tasks there are dependencies or correlations among labels, yet existing methods tend to ignore these relationships. In this paper, a graph attention network-based model is proposed to capture the attentive dependency structure among the labels. The graph attention network uses a feature matrix and a correlation matrix to capture and explore the crucial dependencies between the labels an…
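The following is a minimal sketch of the core idea the abstract describes: a graph attention layer that combines a label feature matrix with a label correlation matrix to produce updated label representations. All class and variable names are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LabelGraphAttention(nn.Module):
    """One graph-attention layer over the label graph.

    H: (L, in_dim) label feature matrix; A: (L, L) label correlation matrix
    (assumed to include self-loops so every row has at least one edge).
    """
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)  # shared linear map
        self.a = nn.Linear(2 * out_dim, 1, bias=False)   # attention scorer

    def forward(self, H: torch.Tensor, A: torch.Tensor) -> torch.Tensor:
        Wh = self.W(H)                                   # (L, out_dim)
        L = Wh.size(0)
        # Pairwise attention logits e_ij = a([Wh_i || Wh_j])
        pairs = torch.cat(
            [Wh.unsqueeze(1).expand(L, L, -1),
             Wh.unsqueeze(0).expand(L, L, -1)], dim=-1)  # (L, L, 2*out_dim)
        e = F.leaky_relu(self.a(pairs).squeeze(-1))      # (L, L)
        # Restrict attention to edges present in the correlation matrix
        e = e.masked_fill(A <= 0, float("-inf"))
        alpha = torch.softmax(e, dim=-1)                 # attention over neighbours
        return F.elu(alpha @ Wh)                         # updated label embeddings
```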

Cited by 68 publications (35 citation statements) · References 41 publications
“…Many researchers have noticed that fully supervised information for multilabel learning is difficult to acquire. Some works have focused directly on solving the problem of multilabel learning with missing labels [4, 6–18], which has also been called learning with missing label assignments or incomplete labels.…”
Section: Related Work
confidence: 99%
“…Ibrahim et al. [9] proposed a weighted loss function that accounts for the confidence in each label/sample pair and can easily be incorporated to fine-tune a pre-trained model on multilabel text datasets with missing or incomplete labels. Pal et al. [11] presented an attention-based graph neural network (MAGNET) model for capturing the attentive correlation structure between labels. A feature matrix and a correlation matrix are used by the graph attention network to capture and examine the fundamental dependencies between the labels and to build classifiers for the task.…”
Section: Related Work
confidence: 99%
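One plausible reading of the confidence-weighted loss described above, not its exact formulation, is a per-pair weighted binary cross-entropy in which a confidence of zero marks a missing label/sample pair; the function name and tensor shapes below are assumptions for illustration.

```python
import torch
import torch.nn.functional as F

def weighted_bce_with_missing(logits: torch.Tensor,
                              targets: torch.Tensor,
                              confidence: torch.Tensor) -> torch.Tensor:
    # logits, targets, confidence: (batch, num_labels); targets are floats
    # in {0, 1}; confidence in [0, 1], with 0 marking a missing label/sample
    # pair so it contributes nothing to the loss.
    per_pair = F.binary_cross_entropy_with_logits(
        logits, targets, reduction="none")
    # Normalise by total confidence, guarding against division by zero.
    return (confidence * per_pair).sum() / confidence.sum().clamp(min=1.0)
```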
“…Recent studies [17], [20] have exploited the advantages of combining BERT and graph networks. In VGCN-BERT [17], a GCN is used to capture the correlation between words at the vocabulary level (i.e., global information).…”
Section: B. Multi-Label Text Classification
confidence: 99%
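The vocabulary-level propagation mentioned here boils down to a standard GCN step over a word-word graph; the sketch below only illustrates that general mechanism under assumed names and is not the VGCN-BERT code itself.

```python
import torch
import torch.nn as nn

class VocabGCNLayer(nn.Module):
    """One GCN propagation step over a vocabulary graph: H' = ReLU(A_hat @ H @ W),
    where A_hat is a normalized word-word adjacency (e.g. built from PMI
    statistics over the corpus)."""
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, H: torch.Tensor, A_hat: torch.Tensor) -> torch.Tensor:
        # H: (V, in_dim) word features; A_hat: (V, V) normalized adjacency
        return torch.relu(A_hat @ self.W(H))
```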
“…In similar work, Ankit Pal et al. [20] leverage the combination of BERT embeddings and a GAT to learn feature representations for text in a multi-label classification task. Their proposed approach (dubbed MAGNET) employs two components: first, a BiLSTM network with BERT embeddings is used to capture the text representation in an embedding vector.…”
Section: B. Multi-Label Text Classification
confidence: 99%
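Putting the two components this citation names together, a rough outline of the pipeline might look as follows: a BiLSTM over BERT token embeddings produces a text vector, the graph-attention layer sketched earlier refines label embeddings, and a dot product scores each label. The dimensions, the mean pooling, and the final scoring are my assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn

class MagnetSketch(nn.Module):
    """Rough outline of the two described components; reuses the
    LabelGraphAttention sketch from earlier in this document."""
    def __init__(self, bert_dim: int = 768, hidden: int = 256,
                 num_labels: int = 20):
        super().__init__()
        self.bilstm = nn.LSTM(bert_dim, hidden, batch_first=True,
                              bidirectional=True)
        self.label_emb = nn.Parameter(torch.randn(num_labels, 2 * hidden))
        self.gat = LabelGraphAttention(2 * hidden, 2 * hidden)

    def forward(self, bert_embeddings: torch.Tensor,
                A: torch.Tensor) -> torch.Tensor:
        # bert_embeddings: (batch, seq_len, bert_dim); A: (L, L) correlations
        out, _ = self.bilstm(bert_embeddings)
        text_vec = out.mean(dim=1)                # (batch, 2*hidden) pooled text
        label_vecs = self.gat(self.label_emb, A)  # (L, 2*hidden) label classifiers
        return text_vec @ label_vecs.t()          # (batch, L) multi-label logits
```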