2016 IEEE 28th International Conference on Tools With Artificial Intelligence (ICTAI)
DOI: 10.1109/ictai.2016.0132
Robust Word-Network Topic Model for Short Texts

Cited by 4 publications (11 citation statements). References 13 publications.
“…The time complexity of the WNTM algorithm is O(KNIc(c − 1)), where K is the number of pre-defined hidden topics, N is the number of documents in the dataset D, c is the size of the sliding window, and I is the average document length. Wang et al (2016) developed the Robust WNTM (R-WNTM) as an extension for short texts. Because the word-word space building procedure of WNTM produces a large amount of irrelevant data, R-WNTM filters the unrelated data during the sampling process.…”
Section: Global Word Co-occurrences Based ASTTM Models
confidence: 99%
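The complexity factor c(c − 1) in the statement above comes from counting ordered word pairs inside each sliding window. A minimal sketch of this style of word co-occurrence network construction, assuming simple whitespace-tokenized documents (this is an illustration of the general technique, not the authors' implementation):

```python
from collections import defaultdict
from itertools import combinations

def build_word_network(docs, window=3):
    """Slide a window of size `window` over each tokenized document and
    count co-occurring word pairs in both directions, so each window of
    c distinct words contributes up to c*(c-1) ordered pair counts."""
    cooc = defaultdict(int)
    for tokens in docs:
        # If a document is shorter than the window, use one window
        # covering the whole document.
        for start in range(max(1, len(tokens) - window + 1)):
            for w1, w2 in combinations(tokens[start:start + window], 2):
                if w1 != w2:
                    cooc[(w1, w2)] += 1
                    cooc[(w2, w1)] += 1
    return cooc

docs = [["short", "text", "topic", "model"],
        ["topic", "model", "short"]]
net = build_word_network(docs, window=3)
```

With N documents of average length I, each document yields about I windows, and each window costs O(c(c − 1)) pair updates, matching the O(KNIc(c − 1)) bound once the K Gibbs-sampling topics are factored in.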
“…They chose nine topic-related groups and sampled 182,671 tweets in total under those categories. This Tweets dataset has been utilized in several studies (Jiang et al 2018; Wang et al 2016; Zuo, Wu, et al 2016a; Lin et al 2020a; Indra and Pulungan 2019). Table 11 describes the details of this dataset.…”
Section: Tweets Dataset
confidence: 99%
“…In addition, the word-word space of WNTM contains a large amount of irrelevant data. As an extension to WNTM, Wang et al [57] introduced a novel model called Robust WNTM (R-WNTM), which filters the unrelated data during the sampling process, since the word-word space building procedure of WNTM produces much irrelevant data. Jiang et al [58] suggested WNTM with Word2Vector (WNTM-W2V) to discover deeper semantic relations among words, increasing the accuracy of the relationships among words as well as improving topic coherence. Wu et al [59] introduced a clustering method for short texts based on the (BG & SLF-Kmeans) method.…”
Section: Short Text Topic Modelling
confidence: 99%