2022
DOI: 10.2478/ijanmc-2022-0030

Research on Topic Fusion Graph Convolution Network News Text Classification Algorithm

Abstract: Faced with the huge volume of news text, how to classify news text reasonably is a topical question for modern scholars. To address the limitation that the Text Graph Convolutional Network (Text-GCN) considers only word co-occurrence when building its graph model, a news text classification algorithm based on a Graph Convolutional Network that fuses topic information is presented. First, the LDA topic model is applied to the corpus to obtain its topic distribution. …
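The fusion idea in the abstract can be sketched in plain Python (a hypothetical, minimal illustration: the toy corpus, window size, and topic values are made up, and a real system would obtain the topic distribution by running LDA, e.g. with gensim, rather than hard-coding it):

```python
from collections import Counter

# Toy corpus; the paper uses a real news corpus.
docs = [["stock", "market", "rises"],
        ["team", "wins", "match"],
        ["market", "crash", "fears"]]

# 1) Word co-occurrence edges within a sliding window (as in Text-GCN).
def cooccurrence_edges(tokens, window=2):
    edges = Counter()
    for i in range(len(tokens)):
        for j in range(i + 1, min(i + window + 1, len(tokens))):
            edges[tuple(sorted((tokens[i], tokens[j])))] += 1
    return edges

edges = Counter()
for d in docs:
    edges.update(cooccurrence_edges(d))

# 2) Placeholder per-document topic distributions (illustrative values only;
#    these would come from an LDA model fitted on the corpus).
topic_dist = {0: [0.9, 0.1], 1: [0.1, 0.9], 2: [0.8, 0.2]}

# 3) Fuse: append each document node's topic distribution to its features,
#    so the GCN sees both co-occurrence structure and topic information.
doc_features = {i: [1.0] + topic_dist[i] for i in range(len(docs))}
```

This only shows the shape of the data flow; how the paper weights topic edges against co-occurrence edges inside the graph is not specified in the excerpt above.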

Cited by 1 publication
(2 citation statements)
references
References 20 publications
(17 reference statements)
“…To do this, the label predictions are sorted in descending order by their probabilities. For simplicity, the sorted probabilities are denoted as shown in Eq. (21), where n is the number of labels and p_i is the predicted score for the i-th label.…”
Section: Label Correction Interval Selection Based On Binary Entropy
Mentioning confidence: 99%
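The sorting step this citation describes can be sketched as follows (a minimal illustration: the label names and scores are made up, and the binary-entropy interval selection itself is not shown):

```python
# Hypothetical label scores from a classifier (illustrative values only).
scores = {"sports": 0.15, "finance": 0.70, "politics": 0.15}

# Sort predictions in descending order of probability, giving the
# sequence p_1 >= p_2 >= ... >= p_n referred to by Eq. (21).
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

n = len(ranked)                   # n: number of labels
p = [prob for _, prob in ranked]  # p[i-1]: predicted score of the i-th label
```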
“…Depending on the model's embedding granularity, character-level [17], word-level [18], and sentence-level [19] approaches have been proposed. As research has progressed, researchers have begun exploring how to parse semantics using graph structures, including TextING [20] and [21]. With the development of pre-training, pre-trained models such as ERNIE [22,23] have gradually gained attention.…”
Section: Introduction
Mentioning confidence: 99%