2019 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2019.8851686

Gated Neural Network with Regularized Loss for Multi-label Text Classification

Cited by 6 publications (2 citation statements)
References 15 publications
“…SGM (Yang et al., 2018b) applies the seq2seq model with an attention mechanism, which captures global contextual information. REGGNN (Xu et al., 2019) uses a combination of CNN and LSTM with a dynamic gate that controls the information from these two parts. NLP-CAP (Zhao et al., 2019) is a capsule-based approach for MLC, which reformulates the routing algorithm.…”
Section: Evaluation Metrics
Confidence: 99%
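The gating idea attributed to REGGNN in this statement lends itself to a short sketch. Below is a minimal PyTorch illustration, assuming a learned sigmoid gate that blends a CNN-derived local representation with an LSTM-derived global one; the layer sizes, names, and single-layer architecture are illustrative assumptions, not the authors' exact model.

    import torch
    import torch.nn as nn

    class GatedCNNLSTM(nn.Module):
        """Hypothetical sketch of a dynamic gate over CNN and LSTM features."""
        def __init__(self, embed_dim=300, hidden=128, num_labels=54):
            super().__init__()
            self.conv = nn.Conv1d(embed_dim, hidden, kernel_size=3, padding=1)
            self.lstm = nn.LSTM(embed_dim, hidden, batch_first=True)
            self.gate = nn.Linear(2 * hidden, hidden)
            self.out = nn.Linear(hidden, num_labels)

        def forward(self, x):  # x: (batch, seq_len, embed_dim)
            # Local n-gram features via the CNN, max-pooled over time.
            local = self.conv(x.transpose(1, 2)).max(dim=2).values
            # Global contextual features via the LSTM's final hidden state.
            _, (h, _) = self.lstm(x)
            global_feat = h[-1]
            # The gate decides, per dimension, how much comes from each part.
            g = torch.sigmoid(self.gate(torch.cat([local, global_feat], dim=1)))
            fused = g * local + (1 - g) * global_feat
            return self.out(fused)  # one logit per label

A gate value near 1 routes a dimension entirely from the CNN branch, near 0 from the LSTM branch, so the mix of local and global information is learned rather than fixed.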
“…The traditional MLC method SLEEC (Bhatia et al., 2015) makes use of label correlations by embedding the label co-occurrence graph. The seq2seq model SGM (Yang et al., 2018b) uses the attention mechanism to consider the label correlations, while REGGNN (Xu et al., 2019) applies a regularized loss specified for label co-occurrence. REGGNN additionally chooses to dynamically combine the local and global contextual information to construct document representations.…”
Section: Multi-label Classification
Confidence: 99%
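As a rough illustration of a co-occurrence-based regularized loss of the kind this statement describes (the paper's exact formulation may differ), one can add to the standard binary cross-entropy a penalty that pulls the predicted probabilities of frequently co-occurring labels toward agreement. The penalty form, the normalized co-occurrence matrix `cooc`, and the weight `lam` below are assumptions for the sketch.

    import torch
    import torch.nn.functional as F

    def regularized_loss(logits, targets, cooc, lam=0.1):
        # logits, targets: (batch, num_labels); cooc: (num_labels, num_labels),
        # a label co-occurrence matrix normalized to [0, 1] (assumed given).
        bce = F.binary_cross_entropy_with_logits(logits, targets)
        probs = torch.sigmoid(logits)
        # Pairwise differences between label probabilities: (batch, L, L).
        diff = probs.unsqueeze(2) - probs.unsqueeze(1)
        # Penalize disagreement between labels that co-occur often.
        reg = (cooc * diff.pow(2)).mean()
        return bce + lam * reg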