Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/d15-1168

Fine-grained Opinion Mining with Recurrent Neural Networks and Word Embeddings

Abstract: The tasks in fine-grained opinion mining can be regarded as either a token-level sequence labeling problem or as a semantic compositional task. We propose a general class of discriminative models based on recurrent neural networks (RNNs) and word embeddings that can be successfully applied to such tasks without any task-specific feature engineering effort. Our experimental results on the task of opinion target identification show that RNNs, without using any hand-crafted features, outperform feature-rich CRF-ba…
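The abstract frames opinion target identification as token-level sequence labeling with an RNN over word embeddings. The following is a minimal illustrative sketch of that setup, not the paper's actual model: a toy bidirectional Elman RNN in NumPy that maps token ids to per-token BIO tag scores. All dimensions, weights, and the tag set here are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical): vocab, embedding, hidden, BIO tag set.
V, D, H, T = 20, 8, 16, 3  # tags: O, B-TARGET, I-TARGET

E = rng.normal(0, 0.1, (V, D))        # word embedding table
Wx = rng.normal(0, 0.1, (D, H))       # input-to-hidden weights
Wh = rng.normal(0, 0.1, (H, H))       # hidden-to-hidden (recurrent) weights
Wy = rng.normal(0, 0.1, (2 * H, T))   # concat(fwd, bwd) -> tag scores

def rnn_pass(embs):
    """One directional Elman pass; returns the hidden state at each token."""
    h = np.zeros(H)
    states = []
    for x in embs:
        h = np.tanh(x @ Wx + h @ Wh)
        states.append(h)
    return np.array(states)

def tag_scores(token_ids):
    """Bidirectional pass: run left-to-right and right-to-left, concatenate."""
    embs = E[token_ids]
    fwd = rnn_pass(embs)              # left-to-right states
    bwd = rnn_pass(embs[::-1])[::-1]  # right-to-left states, realigned
    return np.concatenate([fwd, bwd], axis=1) @ Wy  # shape (len, T)

scores = tag_scores([3, 7, 1, 12])   # a 4-token "sentence" of arbitrary ids
pred = scores.argmax(axis=1)         # one BIO tag index per token
print(scores.shape, pred.shape)      # (4, 3) (4,)
```

In the paper's actual experiments the recurrent cells are trained and the tag distribution is typically normalized (e.g. softmax) with a learned loss; this sketch only shows the shape of the token-labeling computation.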

Cited by 361 publications (254 citation statements) | References 30 publications
“…An LSTM (Long Short-Term Memory) is an extension of an RNN with more stable gradients (Hochreiter and Schmidhuber, 1997). Bi-LSTMs have recently been used successfully for a variety of tasks (Collobert et al., 2011; Huang et al., 2015; Kiperwasser and Goldberg, 2016; Liu et al., 2015). For further details, cf.…”

Section: Model
confidence: 99%
“…Bi-LSTMs have already been used for fine-grained sentiment analysis (Liu et al., 2015), syntactic chunking (Huang et al., 2015), and semantic role labeling (Zhou and Xu, 2015). These and other recent applications of bi-LSTMs were constructed for solving a single task in isolation, however.…”

confidence: 99%
“…For sentiment detection, the saccade channel seems to be handling text having semantic incongruity (due to the presence of irony / sarcasm) better. The fixation channel does not help much, maybe because of higher variance in fixation duration.…”

Section: Effect of Fixation / Saccade Channels
confidence: 99%