2016
DOI: 10.1007/978-3-319-47602-5_40
Polarity Classification for Target Phrases in Tweets: A Word2Vec Approach

Cited by 26 publications (5 citation statements)
References 10 publications
“…More specifically, word2vec employs neural networks to model the relation between a given word and its context of neighboring words in the given collection. This type of model is also referred to as a neural language model (Bengio et al, 2003). Recently, word2vec has been shown to be successful in many natural language processing tasks, ranging from sentiment analysis (Liang et al, 2015; Ren et al, 2016; Rexha et al, 2016), topic modeling (Bicalho et al, 2017), document classification (Lilleberg et al, 2015; Yoshikawa et al, 2014), and named entity recognition (Seok et al, 2015; Tang et al, 2014) to machine translation (Freitas et al, 2016; Zou et al, 2013).…”
Section: Continuous-space Models
confidence: 99%
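The context-window idea this excerpt describes can be illustrated with a toy sketch. The corpus, window size, and variable names below are illustrative assumptions, not taken from the cited paper; real word2vec then trains a neural network on exactly these (target, context) pairs.

```python
# Toy corpus; word2vec pairs each target word with its neighbors
# inside a fixed-size context window before training.
corpus = ["the cat sat on the mat".split()]
window = 2  # how many positions to each side count as context

pairs = []
for sentence in corpus:
    for i, target in enumerate(sentence):
        # Every word within `window` positions of the target (excluding
        # the target itself) is one of its context words.
        for j in range(max(0, i - window), min(len(sentence), i + window + 1)):
            if j != i:
                pairs.append((target, sentence[j]))

print(pairs[:4])  # → [('the', 'cat'), ('the', 'sat'), ('cat', 'the'), ('cat', 'sat')]
```

In the skip-gram variant the model predicts the context word from the target; in CBOW it predicts the target from its context.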
“…These vectors have a fixed size and concentrate information about semantic relationships, which makes it possible to associate a context with its meaning. Because text documents contain different numbers of words, we follow the strategy described by [24], [25] to obtain a single vector representing each text document. For that purpose, the "document vector" is obtained by averaging the word vectors, weighting each by the frequency of its word.…”
Section: B. Feature Extraction
confidence: 99%
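The frequency-weighted averaging strategy quoted above can be sketched as follows. The three-dimensional embeddings and the example document are made-up placeholders (real word2vec vectors are learned and typically have hundreds of dimensions), but the weighting scheme matches the one described: each word's vector contributes in proportion to how often the word occurs.

```python
import numpy as np
from collections import Counter

# Hypothetical 3-dimensional word embeddings (placeholders for learned
# word2vec vectors, which would normally have e.g. 100-300 dimensions).
embeddings = {
    "good":  np.array([0.9, 0.1, 0.0]),
    "movie": np.array([0.2, 0.8, 0.1]),
    "very":  np.array([0.1, 0.2, 0.7]),
}

def document_vector(tokens, embeddings):
    """Average the word vectors, weighting each by its word's frequency."""
    counts = Counter(t for t in tokens if t in embeddings)
    total = sum(counts.values())
    return sum(c * embeddings[w] for w, c in counts.items()) / total

doc = "very good good movie".split()
vec = document_vector(doc, embeddings)
print(vec)  # → [0.525 0.3   0.2  ]  ("good" counts twice)
```

Because every document collapses to one fixed-size vector, documents of any length become directly comparable inputs for a downstream classifier.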
“…Once trained, such a model can detect synonymous words or suggest different words for a partial sentence. As the name implies, word2vec represents each specific word with a particular list of numbers called a vector [11].…”
Section: Figure 21
confidence: 99%
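Detecting synonymous words, as this excerpt mentions, is usually done by comparing vectors with cosine similarity: words used in similar contexts end up with vectors pointing in similar directions. The vectors below are hand-picked placeholders chosen to mimic that behavior, not actual trained embeddings.

```python
import numpy as np

# Placeholder "learned" vectors; in word2vec each word maps to such
# a list of numbers, and similar words get similar vectors.
vectors = {
    "happy": np.array([0.8, 0.3, 0.1]),
    "glad":  np.array([0.7, 0.4, 0.1]),
    "table": np.array([0.1, 0.1, 0.9]),
}

def cosine(u, v):
    """Cosine similarity: close to 1.0 for vectors pointing the same way."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# A synonym-like pair scores higher than an unrelated pair.
print(cosine(vectors["happy"], vectors["glad"]))   # high (near 1.0)
print(cosine(vectors["happy"], vectors["table"]))  # much lower
```

Libraries such as gensim expose this directly (e.g. a `most_similar` query over all vocabulary vectors), which is how the synonym-detection behavior described above is typically used in practice.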