2019
DOI: 10.1109/access.2019.2919494
Multi-Label Classification of Microblogging Texts Using Convolution Neural Network

Abstract: Microblogging sites contain a huge amount of textual data and their classification is an imperative task in many applications, such as information filtering, user profiling, topical analysis, and content tagging. Traditional machine learning approaches mainly use a bag of words or n-gram techniques to generate feature vectors as text representation to train classifiers and perform considerably well for many text information processing tasks. Since short texts, such as tweets, contain a very limited number of w…
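The bag-of-words representation the abstract refers to can be sketched in a few lines: each text is mapped to a vector of word counts over a fixed vocabulary, discarding word order. This is a minimal illustration, not the paper's implementation; the vocabulary and texts are invented for the example.

```python
from collections import Counter

def bow_vector(text, vocabulary):
    """Map a text to a bag-of-words count vector over a fixed vocabulary.

    Words outside the vocabulary are ignored; word order is discarded.
    """
    counts = Counter(text.lower().split())
    return [counts[w] for w in vocabulary]

# Toy vocabulary and text, for illustration only.
vocab = ["good", "bad", "movie", "tweet"]
vec = bow_vector("Good movie good tweet", vocab)
# vec == [2, 0, 1, 1]
```

For very short texts such as tweets, these vectors are extremely sparse, which is the limitation the paper's CNN-based approach is motivated by.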

Cited by 44 publications (17 citation statements)
References 16 publications
“…They used Naïve Bayes for text classification, while we implemented a neural network. Neural networks have been shown to perform better than traditional machine learning models 26,27 for text classification, in addition to other tasks such as image classification. Rather than relying on the topics provided by Press Ganey, Doing-Harris et al 15 performed automatic topic modeling.…”
Section: Discussion
confidence: 99%
“…The softmax function is a suitable choice for multiclass classification (28). Softmax is considered useful because it converts raw scores into a normalized probability distribution.…”
Section: Artificial Neural Network
confidence: 99%
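The property the excerpt describes — turning arbitrary class scores into a probability distribution — can be shown with a minimal softmax sketch (the scores below are invented for illustration):

```python
import math

def softmax(scores):
    """Convert raw class scores (logits) into probabilities that sum to 1.

    Subtracting the maximum score before exponentiating is a standard
    numerical-stability trick and does not change the result.
    """
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Example logits for a 3-class problem.
probs = softmax([2.0, 1.0, 0.1])
# probs sums to 1.0, and the largest logit gets the largest probability
```

Note that softmax forces the classes to compete for probability mass, which is why multi-label setups (as in the paper's title) typically use a per-label sigmoid instead.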
“…In such cases, Continuous Bag of Words and Skip-gram models are applied. However, vectors obtained through such models do not capture word order, word context, or other syntactic information such as POS tags 16 . For effective feature extraction and representation for sentence-level sentiment classification, it is crucial to include other linguistic aspects such as POS tags, word semantic orientations, and word order.…”
Section: Introduction
confidence: 99%
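The order-invariance the excerpt criticizes can be demonstrated directly: a CBOW-style sentence representation built by averaging word vectors assigns identical vectors to sentences that differ only in word order. The toy 2-d embeddings below are invented for the example.

```python
def avg_embedding(tokens, emb):
    """Average word vectors into a sentence vector (CBOW-style pooling).

    Averaging is order-invariant, so sentences containing the same words
    in a different order collapse to the same representation.
    """
    dims = len(next(iter(emb.values())))
    total = [0.0] * dims
    for t in tokens:
        for i, v in enumerate(emb[t]):
            total[i] += v
    return [v / len(tokens) for v in total]

# Illustrative 2-d embeddings (made-up values).
emb = {"dog": [1.0, 0.0], "bites": [0.0, 1.0], "man": [1.0, 1.0]}
a = avg_embedding("dog bites man".split(), emb)
b = avg_embedding("man bites dog".split(), emb)
# a == b, even though the sentences mean different things
```

This is exactly the gap that convolutional filters over word sequences, as in the cited paper, are meant to address: an n-gram-width filter sees local word order that pooling alone discards.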