2020
DOI: 10.1016/j.neucom.2019.11.077

Generation of topic evolution graphs from short text streams

Cited by 32 publications (9 citation statements). References 11 publications.

“…For modeling sequential time-series data at the right granularity, Wang et al. (2008) developed the continuous time dynamic topic model (cDTM). Topic evolution graphs are generated from short text streams by an online version of the conditional random field regularized correlated topic model (CCTM) to capture the evolution path of main topics and related subtopics (Gao et al., 2020). In fact, the evolutionary path can be explained by drift among topics.…”
Section: Related Work
confidence: 99%
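For context, the cDTM cited above lets each topic's natural parameters drift in continuous time via Brownian motion, with drift variance proportional to the elapsed time between observations; a minimal sketch of the evolution equations, following Wang et al. (2008):

```latex
% Topic k's natural parameters follow Brownian motion between
% observation times s < t; variance grows with the gap \Delta_{s,t}.
\beta_{k,t} \mid \beta_{k,s} \sim \mathcal{N}\!\bigl(\beta_{k,s},\; v\,\Delta_{s,t} I\bigr),
\qquad \Delta_{s,t} = t - s.
% Words at time t are drawn after mapping the parameters through a softmax:
w \sim \mathrm{Mult}\bigl(\pi(\beta_{k,t})\bigr),
\qquad \pi(\beta)_w = \frac{\exp(\beta_w)}{\sum_{w'} \exp(\beta_{w'})}.
```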
“…In recent years, knowledge representation learning has been a research hotspot due to its excellent performance in tasks such as knowledge acquisition, integration, reasoning, and topic evolution analysis [16,10,7,9]. Bordes et al. proposed an unstructured model, which embeds head and tail entities (h, t) into a vector space.…”
Section: Knowledge Representation Learning
confidence: 99%
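As a concrete illustration (a minimal sketch of the model described above, not the authors' implementation; the function name is hypothetical), the unstructured model scores a candidate fact purely by the distance between the head and tail entity embeddings, ignoring the relation entirely:

```python
import numpy as np

def um_score(h: np.ndarray, t: np.ndarray) -> float:
    """Unstructured-model score: squared L2 distance between the head
    and tail entity embeddings (lower = more plausible). The relation
    is not used, which is what makes the model 'unstructured'."""
    return float(np.sum((h - t) ** 2))

# Toy usage with 4-dimensional entity embeddings.
h = np.array([0.1, 0.5, -0.2, 0.3])
t = np.array([0.1, 0.4, -0.1, 0.3])
print(um_score(h, t))  # small distance -> plausible (h, t) pair
```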
“…This is due to the short length and high noise of tweets, and the high degree of text overlap between the two categories. Probabilistic topic models may provide additional topic information that captures semantic differences, and recent research on neural networks has shown that integrating topics can improve the performance of NLP tasks such as summarization and question answering [8, 11, 38]. However, there is no standard method for integrating topic information with pre-trained language models such as BERT.…”
Section: Introduction
confidence: 99%
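Since the passage notes there is no standard integration method, the following is only one plausible, hedged sketch (all names, shapes, and weights are hypothetical): concatenate a document's topic proportions from a probabilistic topic model with its pre-trained [CLS] embedding and feed the fused vector to a task-specific classifier head:

```python
import numpy as np

def fuse_topic_and_bert(cls_embedding: np.ndarray,
                        topic_proportions: np.ndarray) -> np.ndarray:
    """Concatenate a BERT [CLS] vector (e.g. 768-d) with an LDA-style
    topic distribution (K-d) into one fused feature vector."""
    return np.concatenate([cls_embedding, topic_proportions])

# Toy example: 8-d stand-in for a [CLS] vector, 3 topics.
cls_vec = np.random.randn(8)
topics = np.array([0.7, 0.2, 0.1])   # sums to 1, as topic proportions do
features = fuse_topic_and_bert(cls_vec, topics)

# A linear classifier head over the fused features (random weights here;
# in practice they would be trained on the downstream task).
W = np.random.randn(2, features.shape[0])
logits = W @ features
print(logits.argmax())  # predicted class index
```

Concatenation keeps the two representations independent and is only one of several fusion strategies discussed in the literature; attention- or gating-based fusion are alternatives beyond this sketch.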