2020
DOI: 10.1016/j.procs.2020.03.416
Sentimental Short Sentences Classification by Using CNN Deep Learning Model with Fine Tuned Word2Vec

Cited by 76 publications
(48 citation statements)
References 7 publications
“…Recently, vector representations of documents or sentences have been used as input features for supervised machine learning classifiers. Techniques used to build the vector representations include TF‐IDF,11,14 N‐grams,11,19 and Word2Vec models,15,20 among others. Chug, Gupta and Ahuja11 analyzed the impact of two feature extraction techniques, TF‐IDF at word level and N‐grams, on the SS‐Tweet sentiment analysis dataset.…”
Section: Related Work
confidence: 99%
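The two feature-extraction techniques compared in the statement above, word-level TF-IDF and word n-grams, can be sketched in a few lines. This is a minimal illustration of the general techniques, not the cited authors' implementation, and the toy documents are invented for the example:

```python
import math

def tfidf(docs):
    """Compute TF-IDF weights for a list of tokenized documents.

    TF is the raw count of a term in a document; IDF is
    log(N / df), where df is the number of documents containing
    the term. Returns one {term: weight} dict per document.
    """
    n_docs = len(docs)
    df = {}
    for doc in docs:
        for term in set(doc):
            df[term] = df.get(term, 0) + 1
    weights = []
    for doc in docs:
        tf = {}
        for term in doc:
            tf[term] = tf.get(term, 0) + 1
        weights.append({t: c * math.log(n_docs / df[t])
                        for t, c in tf.items()})
    return weights

def ngrams(tokens, n):
    """Contiguous word n-grams, the other feature type compared."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

docs = [["good", "movie"], ["bad", "movie"], ["good", "good", "plot"]]
w = tfidf(docs)
```

Terms appearing in every document get IDF log(1) = 0, which is why rare terms like "bad" dominate the representation.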
“…In their work,20 word vectors were generated from a pre‐trained Word2Vec model and a CNN layer was used to extract better features for short‐sentence categorization. A multiple‐channel pattern is applied for text processing, in which one channel can be the sequence of words, another the sequence of corresponding POS tags, and a third the shape of each word.…”
Section: Related Work
confidence: 99%
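The multiple-channel pattern described above can be sketched as stacking parallel integer sequences, one per channel. The `word_shape` scheme and the tiny vocabularies here are hypothetical stand-ins for illustration, not the cited paper's actual encoding:

```python
import numpy as np

def word_shape(w):
    """Map a token to a coarse 'shape' string (hypothetical scheme):
    X for uppercase, x for lowercase, d for digit, with runs of the
    same symbol collapsed, e.g. 'Word2Vec' -> 'XxdXx'."""
    s = "".join("X" if c.isupper() else "x" if c.islower() else
                "d" if c.isdigit() else c for c in w)
    out = []
    for c in s:
        if not out or out[-1] != c:
            out.append(c)
    return "".join(out)

def encode_channels(tokens, pos_tags, vocab, pos_vocab, shape_vocab):
    """Stack three parallel channels (word, POS tag, word shape)
    into a (3, n_words) integer array, one row per channel."""
    words = [vocab[t.lower()] for t in tokens]
    pos = [pos_vocab[p] for p in pos_tags]
    shapes = [shape_vocab[word_shape(t)] for t in tokens]
    return np.array([words, pos, shapes])

tokens = ["The", "movie", "was", "great"]
pos = ["DT", "NN", "VBD", "JJ"]
vocab = {t.lower(): i for i, t in enumerate(tokens)}
pos_vocab = {p: i for i, p in enumerate(dict.fromkeys(pos))}
shape_vocab = {s: i for i, s in
               enumerate(dict.fromkeys(word_shape(t) for t in tokens))}
X = encode_channels(tokens, pos, vocab, pos_vocab, shape_vocab)
```

Each row of `X` would then index a separate embedding table before the channels are fed to the CNN.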
“…A kernel of size k is slid over the input words, one window of k consecutive word vectors at a time. Between the concatenated input embedding vectors in a given window and the weight vector u there is a dot product, usually followed by a non‐linear activation function g [31].…”
Section: CNN Model
confidence: 99%
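The sliding-window dot product with a non-linear activation g, as described above, can be sketched with NumPy. The random embeddings and the choice of tanh for g are assumptions made for the example:

```python
import numpy as np

def conv1d_over_embeddings(E, u, b=0.0, g=np.tanh):
    """Slide a kernel of size k over a sentence's word embeddings.

    E: (n_words, d) embedding matrix; u: (k*d,) weight vector.
    For each window of k consecutive words, the k embeddings are
    concatenated and a dot product with u is taken, followed by
    the activation g, yielding a feature map of length n - k + 1.
    """
    n, d = E.shape
    k = u.size // d
    feats = []
    for i in range(n - k + 1):
        window = E[i:i + k].reshape(-1)   # concatenate k embeddings
        feats.append(g(window @ u + b))
    return np.array(feats)

rng = np.random.default_rng(0)
E = rng.normal(size=(6, 4))   # 6 words, 4-dim embeddings
u = rng.normal(size=12)       # kernel size k = 3
fmap = conv1d_over_embeddings(E, u)
```

In a full model, many such kernels run in parallel and the feature maps are typically max-pooled before classification.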
“…This method generates a vocabulary from the input words and then learns the word vectors via backpropagation and stochastic gradient descent.14 To visualize the ethnic relationships recorded in the Hetu Dangse catalog, we input the first 300 words into the trained word2vec model and performed dimensionality reduction to obtain a planar graph. To understand the structure of the data intuitively, we used the t‐SNE algorithm to reduce the dimensions of the word vectors.…”
Section: Word Vector Conversion of Text Catalog Data
confidence: 99%
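The visualization pipeline described above, reducing trained word vectors to two dimensions for a planar graph, can be sketched as follows. For brevity this sketch substitutes PCA (via SVD) for the t-SNE step named in the text, and the random 300×100 vectors are placeholders for real word2vec output:

```python
import numpy as np

def project_2d(vectors):
    """Project high-dimensional word vectors to 2-D with PCA,
    a simple linear stand-in here for t-SNE.

    Centers the vectors, then keeps the two directions of largest
    variance via SVD, giving (x, y) coordinates for plotting.
    """
    X = vectors - vectors.mean(axis=0)
    # Rows of Vt are the principal directions, sorted by variance
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:2].T

rng = np.random.default_rng(1)
vecs = rng.normal(size=(300, 100))  # e.g. 300 words, 100-dim vectors
coords = project_2d(vecs)
```

Unlike PCA, t-SNE preserves local neighborhoods nonlinearly, which is why it is the usual choice for inspecting cluster structure in word embeddings.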