Proceedings of the 2018 International Conference on Mathematics, Modelling, Simulation and Algorithms (MMSA 2018)
DOI: 10.2991/mmsa-18.2018.97

Vector Representation of Words for Detecting Topic Trends over Short Texts

Abstract: It is a critical task to infer discriminative and coherent topics from short texts. Furthermore, people not only want to know what kinds of topics can be extracted from these short texts, but also desire to obtain the temporal dynamic evolution of these topics. In this paper, we present a novel model for short texts, referred to as the topic trend detection (TTD) model. Based on an optimized topic model we propose, the TTD model derives more typical terms and itemsets to represent topics of short texts and improv…

Cited by 2 publications (2 citation statements)
References 21 publications
“…Once the number of topics has been determined, we proceed to find vector-embedding representations of tweets, as they have been previously shown to yield superior results in topic modelling with respect to LDA [9]. Here, we use the word2vec-based [17,7,22] FastText model [4], which essentially uses sub-word information to enrich embeddings generated by a neural network that predicts neighbouring words.…”
Section: Methods
confidence: 99%
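The citing authors' description of FastText (a skip-gram-style network enriched with character n-gram sub-word information) can be illustrated with a minimal sketch using the gensim library's FastText implementation. The toy corpus, hyperparameter values, and query words below are illustrative assumptions, not taken from the cited work.

    # Minimal sketch: sub-word-aware embeddings with gensim's FastText
    # (assumed setup; corpus and hyperparameters are illustrative only).
    from gensim.models import FastText

    # Toy tokenized corpus standing in for preprocessed tweets.
    corpus = [
        ["detecting", "topic", "trends", "over", "short", "texts"],
        ["vector", "representation", "of", "words", "for", "topic", "modelling"],
        ["fasttext", "enriches", "embeddings", "with", "character", "ngrams"],
    ]

    # sg=1 selects the skip-gram objective (predicting neighbouring words);
    # min_n/max_n set the character n-gram range used for sub-word enrichment.
    model = FastText(
        sentences=corpus,
        vector_size=100,
        window=5,
        min_count=1,
        sg=1,
        min_n=3,
        max_n=6,
        epochs=50,
    )

    # Because vectors are composed from character n-grams, even
    # out-of-vocabulary words (e.g. misspellings common in tweets)
    # receive an embedding via their shared n-grams.
    vec = model.wv["topicz"]                 # OOV lookup
    print(vec.shape)                          # -> (100,)
    print(model.wv.most_similar("topic", topn=3))

The sub-word composition is the point of the design choice mentioned in the citation: unlike plain word2vec, FastText can embed unseen or noisy tokens, which matters for short, informal texts such as tweets.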
“…Bengio et al. (2003) proposed a neural network-based language model (NNLM) and introduced the concept of "word embedding." Afterward, He et al. (2018) proposed word vectors that are trained on the surrounding context of a target word and therefore carry contextual information about the word (Jiang & He, 2020; Lan et al., 2021). Barnickel et al. (2009) proposed the SENNA model for the purpose of generating word vector representations.…”
Section: Figure 2 Relationship Between MI and Information Entropy Dis...
confidence: 99%