Proceedings of the 9th Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis 2018
DOI: 10.18653/v1/w18-6220
Dual Memory Network Model for Biased Product Review Classification

Abstract: In sentiment analysis (SA) of product reviews, both user and product information are proven to be useful. Current tasks handle user profile and product information in a unified model, which may not be able to learn salient features of users and products effectively. In this work, we propose a dual user and product memory network (DUPMN) model to learn user profiles and product reviews using separate memory networks. Then, the two representations are used jointly for sentiment prediction. The use of separate mod…

Cited by 19 publications (17 citation statements) | References 27 publications

Citation statements (ordered by relevance):
“…CMA (Ma et al., 2017): a hierarchical LSTM encodes the document and injects user and product information hierarchically. DUPMN (Long et al., 2018): encodes the document with a hierarchical LSTM and adopts two memory networks, one for user information and another for product information. HCSC (Amplayo et al., 2018): a combination of a CNN and a Bi-LSTM as the document encoder; injects user/product information with bias-attention.…”
Section: Results
confidence: 99%
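The citation statement above describes DUPMN as a hierarchical-LSTM document encoder followed by two deep memory networks, one over user information and one over product information, whose outputs are combined for prediction. The paper's own code is not reproduced here; the following is only a minimal PyTorch sketch of that dual-memory idea, with all class names, layer sizes, the number of hops, and the concatenation-based fusion chosen for illustration rather than taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MemoryHop(nn.Module):
    """One attention hop over an external memory, conditioned on a query vector."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)

    def forward(self, query, memory):
        # query: (batch, dim); memory: (batch, slots, dim)
        scores = torch.bmm(memory, query.unsqueeze(2)).squeeze(2)  # (batch, slots)
        attn = F.softmax(scores, dim=1)
        read = torch.bmm(attn.unsqueeze(1), memory).squeeze(1)     # (batch, dim)
        return self.proj(query + read)

class DualMemorySketch(nn.Module):
    """Illustrative dual (user + product) memory model: the document vector queries
    two separate memory stacks and the two readings are concatenated for prediction."""
    def __init__(self, dim, num_classes, hops=3):
        super().__init__()
        self.user_hops = nn.ModuleList([MemoryHop(dim) for _ in range(hops)])
        self.prod_hops = nn.ModuleList([MemoryHop(dim) for _ in range(hops)])
        self.out = nn.Linear(2 * dim, num_classes)

    def forward(self, doc_vec, user_mem, prod_mem):
        # doc_vec: document representation from, e.g., a hierarchical LSTM encoder
        u, p = doc_vec, doc_vec
        for hop in self.user_hops:
            u = hop(u, user_mem)   # attend over the user's memory slots
        for hop in self.prod_hops:
            p = hop(p, prod_mem)   # attend over the product's memory slots
        return self.out(torch.cat([u, p], dim=1))
```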
“…Document-level sentiment analysis aims to predict the sentiment polarity of text that often takes the form of product or service reviews. Tang et al. (2015) demonstrated that modelling the individual who has written the review, as well as the product being reviewed, is worthwhile for polarity prediction, and this has led to exploratory work on how best to combine review text with user/product information in a neural architecture (Chen et al., 2016; Ma et al., 2017; Dou, 2017; Long et al., 2018; Amplayo, 2019; Amplayo et al., 2018). A feature common amongst past studies is that user and product IDs are modelled as embedding vectors whose parameters are learned during training.…”
confidence: 99%
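As the statement notes, a common thread in this line of work is representing user and product IDs as trainable embedding vectors. Below is a minimal sketch of that setup, assuming PyTorch; the concatenation with the document vector is just one possible injection point, not the method of any specific cited paper, and all names and dimensions are illustrative.

```python
import torch
import torch.nn as nn

class AttributeEmbeddings(nn.Module):
    """User and product IDs as trainable embeddings, looked up and fused with a
    document vector; the fusion step shown here is illustrative only."""
    def __init__(self, num_users, num_products, dim):
        super().__init__()
        self.user_emb = nn.Embedding(num_users, dim)
        self.prod_emb = nn.Embedding(num_products, dim)

    def forward(self, doc_vec, user_id, prod_id):
        # doc_vec: (batch, dim) output of any text encoder
        u = self.user_emb(user_id)                # (batch, dim)
        p = self.prod_emb(prod_id)                # (batch, dim)
        return torch.cat([doc_vec, u, p], dim=1)  # (batch, 3 * dim), fed to a classifier
```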
“…In addition to the competing models above, we also report results from previous state-of-the-art sentiment classification models that use user and product information: (a) UPNN (Tang et al., 2015) uses a CNN encoder and customizes bias vectors and word embeddings; (b) UPDMN (Dou, 2017) uses an LSTM encoder and customizes memory vectors; (c) NSC (Chen et al., 2016) uses a hierarchical LSTM encoder and customizes the attention mechanism; (d) HCSC (Amplayo et al., 2018a) uses a BiLSTM and a CNN as encoders and customizes a cold-start aware attention mechanism (CSAA); (e) PMA (Zhu and Yang, 2017) uses a hierarchical LSTM encoder and customizes PMA, an attention mechanism guided by external features; (f) DUPMN (Long et al., 2018) uses a hierarchical LSTM encoder and customizes memory vectors; and (g) CMA (Ma et al., 2017) uses a hierarchical attention-based encoder and customizes a user- and product-specific attention mechanism (CMA).

Models                         Architecture + customization             Acc    RMSE
UPNN (Tang et al., 2015)       CNN + word-cust + bias-cust              59.6   0.784
UPDMN (Dou, 2017)              LSTM + memory-cust                       63.9   0.662
NSC (Chen et al., 2016)        LSTM + attention-cust                    65.0   0.692
HCSC (Amplayo et al., 2018a)   BiLSTM + CNN + attention-cust (CSAA)     65.7   0.660
PMA (Zhu and Yang, 2017)       HierLSTM + attention-cust (PMA)          65.8   0.668
DUPMN (Long et al., 2018)      HierLSTM + memory-cust                   66.2   0.667
CMA (Ma et al., 2017)          HierAttention + attention-cust (CMA)     66.4   0.677
…”
Section: Review Sentiment Classification
confidence: 99%
“…The majority of this paper uses a base model that accepts a review x = x_1, ..., x_n as input and returns a sentiment y as output, which we extend to also accept the corresponding user u and product p attributes as additional inputs. Different from previous work, where models use complex architectures such as hierarchical LSTMs (Chen et al., 2016; Zhu and Yang, 2017) and external memory networks (Dou, 2017; Long et al., 2018), we aim to achieve improvements by only modifying how we represent and inject attributes. Thus, we use a simple classifier as our base model, which consists of four parts explained briefly as follows.…”
Section: The Base Model
confidence: 99%
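The excerpt does not spell out the four parts of the cited base model; one plausible minimal reading (word embedding, BiLSTM encoder, attention pooling, linear classifier) is sketched below in PyTorch. The names, sizes, and the exact pooling are assumptions for illustration, and the user/product attributes u and p would be injected into whichever of these parts the authors choose.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BaseClassifierSketch(nn.Module):
    """Sketch of a simple review classifier: embedding -> BiLSTM -> attention
    pooling -> linear output. Attribute injection points are omitted here."""
    def __init__(self, vocab_size, emb_dim, hid_dim, num_classes):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hid_dim, 1)
        self.out = nn.Linear(2 * hid_dim, num_classes)

    def forward(self, tokens):
        # tokens: (batch, seq_len) word indices of the review x_1 ... x_n
        h, _ = self.encoder(self.emb(tokens))                  # (batch, seq_len, 2*hid)
        weights = F.softmax(self.attn(h).squeeze(2), dim=1)    # (batch, seq_len)
        pooled = torch.bmm(weights.unsqueeze(1), h).squeeze(1)
        return self.out(pooled)                                # sentiment logits y
```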
“…4. DUPMN (Long et al., 2018) also uses a hierarchical LSTM as the base model and incorporates attributes as two separate deep memory networks, one for each attribute.…”
Section: Comparisons With Models In the Literature
confidence: 99%
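For concreteness, here is a toy forward pass through the DualMemorySketch shown earlier, under the assumption that the user memory holds encoded past reviews written by the same user and the product memory holds encoded past reviews of the same product; all shapes below are made up.

```python
# Requires the imports and the DualMemorySketch class from the earlier sketch.
batch, slots, dim = 4, 10, 200
model = DualMemorySketch(dim=dim, num_classes=5)

doc_vec = torch.randn(batch, dim)            # document vector from the text encoder
user_mem = torch.randn(batch, slots, dim)    # encoded reviews written by each user
prod_mem = torch.randn(batch, slots, dim)    # encoded reviews of each product

logits = model(doc_vec, user_mem, prod_mem)  # (batch, num_classes) sentiment scores
```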