2023
DOI: 10.26599/tst.2021.9010079
Supervised Contrastive Learning with Term Weighting for Improving Chinese Text Classification

Abstract: With the rapid growth of information retrieval technology, Chinese text classification, which is the basis of information content security, has become a widely discussed topic. Compared with English, Chinese text is considerably more complex in its semantic information representations, making the task more difficult. However, most existing Chinese text classification approaches typically treat feature representation and feature selection as the key points, but fail to take into account the learning strategy that adapts to…

Cited by 4 publications (1 citation statement)
References 35 publications
“…However, in the supervised case, the scarcity of data does not aid in enhancing the semantic representation of sentences. Guo et al. [27] proposed a supervised word-weighted contrastive learning method that enhances source text data via the calculation of word weights and combines an adversarial approach with supervised contrastive learning. However, this method requires a large amount of labeled data, and the methodology used in constructing positive and negative samples cannot guarantee semantic consistency.…”
Section: Contrastive Learning
Confidence: 99%