2018
DOI: 10.1007/978-3-319-93034-3_54

Cross-Domain Sentiment Classification via a Bifurcated-LSTM

Cited by 15 publications (10 citation statements)
References 14 publications
“…Deep learning models constitute another important group of techniques for cross-domain sentiment classification. Various neural network architectures are exploited to compute the representation vectors for reviews, such as stacked auto-encoders (SAE) [33], [34], fully connected neural networks [21], convolutional neural networks (CNN) [25], and recurrent neural networks (RNN), typically with long short-term memory (LSTM) units [23]. To enhance the expressive power of neural networks, several works investigate the use of recently developed attention and memory mechanisms [22], [24].…”
Section: Related Work
confidence: 99%
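The LSTM-based review encoding mentioned in the excerpt can be sketched minimally: one LSTM cell is unrolled over a sequence of word embeddings and its final hidden state serves as the review representation. All dimensions, the weight layout, and the random inputs below are illustrative assumptions, not details from the cited papers:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_encode(x_seq, W, U, b):
    """Run one LSTM layer over a sequence of embeddings and return the
    final hidden state as the sequence representation.
    Gates are stacked in the weight matrices as [i, f, o, g]."""
    hidden = U.shape[0]                # U: (hidden, 4*hidden)
    h = np.zeros(hidden)
    c = np.zeros(hidden)
    for x in x_seq:
        z = x @ W + h @ U + b          # all four gate pre-activations
        i, f, o = (sigmoid(z[k * hidden:(k + 1) * hidden]) for k in range(3))
        g = np.tanh(z[3 * hidden:])    # candidate cell update
        c = f * c + i * g              # cell state: forget old, add new
        h = o * np.tanh(c)             # hidden state (the representation)
    return h

rng = np.random.default_rng(0)
emb, hid, T = 8, 4, 5                  # hypothetical sizes
W = rng.normal(scale=0.1, size=(emb, 4 * hid))
U = rng.normal(scale=0.1, size=(hid, 4 * hid))
b = np.zeros(4 * hid)
review = rng.normal(size=(T, emb))     # stand-in for word embeddings
rep = lstm_encode(review, W, U, b)     # review representation vector
```

In practice the representation `rep` would feed a sentiment classifier head; here it just demonstrates the recurrence.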
See 1 more Smart Citation
“…Deep learning models constitute another important group of techniques for cross-domain sentiment classification. Various neural network architectures are exploited to compute the representation vectors for reviews, such as, stacked auto-encoders (SAE) [33], [34], fully connected neural networks [21], convolutional neural networks (CNN) [25], and also recurrent neural networks (RNN) typically with long short-term memory (LSTM) units [23]. To enhance the expressive power of neural networks, there are various works investigating the use of recently developed attention and memory mechanisms [22], [24].…”
Section: Related Workmentioning
confidence: 99%
“…Comparing the domain adaptation strategies used by various state-of-the-art techniques, we can see that, in addition to the main task of sentiment classification, they usually enhance their learning through extra tasks: detecting whether a pivot co-occurs with a domain-specific word, whether the versions of the same pivot word in different domains possess sufficiently similar representation vectors, whether a review contains a pivot, or whether reviews from the source and target domains can be distinguished in the representation space; we refer to these as auxiliary learning tasks. The learning algorithms are mostly built on spectral approaches, which explore and preserve the inherent data structure through matrix decompositions [10], [11], [17], or on neural networks, which directly learn review representations through structured processing of the content words with different network architectures and exhaustive training [21], [23]-[25].…”
Section: Introduction
confidence: 99%
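The auxiliary-task idea described above is commonly realized as a weighted multi-task objective: a main sentiment loss plus a smaller auxiliary loss (for example, domain discrimination) computed on the same shared representation. The linear heads, the weight `lam`, and all names below are hypothetical, not the formulation of any cited paper:

```python
import numpy as np

def logistic_loss(score, label):
    """Binary cross-entropy on a raw linear score; label is 0 or 1."""
    p = 1.0 / (1.0 + np.exp(-score))
    return -(label * np.log(p) + (1 - label) * np.log(1 - p))

def joint_loss(rep, w_sent, y_sent, w_dom, y_dom, lam=0.1):
    """Main sentiment loss plus lam times an auxiliary
    domain-discrimination loss, both reading the shared
    representation `rep` through hypothetical linear heads."""
    main = logistic_loss(rep @ w_sent, y_sent)
    aux = logistic_loss(rep @ w_dom, y_dom)
    return main + lam * aux

rep = np.array([0.5, -0.2])            # toy shared representation
w_sent = np.array([1.0, 0.0])          # sentiment head
w_dom = np.array([0.0, 1.0])           # auxiliary domain head
loss = joint_loss(rep, w_sent, 1, w_dom, 0)
```

Because both heads read the same `rep`, minimizing the joint loss pressures the representation to serve the main task while also satisfying the auxiliary signal.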
“…Deep learning techniques such as Artificial Neural Networks (ANN), Recurrent Neural Networks (RNN), and LSTM have been widely employed in various applications due to their strong capability in mining data patterns and correlations [34], [35]. In TrafficChain, the road passing time cost and the number of vehicles on each road segment are clearly highly related to the history data of the same road segment and to the data of other road segments, which indicates a spatiotemporal correlation among the traffic status data.…”
Section: LSTM-Based Secure Report Aggregation
confidence: 99%
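The spatiotemporal dependence described above is typically exploited by feeding the model training windows that span both recent history and all road segments at once. The windowing sketch below illustrates the idea; the shapes and names are assumptions for illustration, not TrafficChain's actual pipeline:

```python
import numpy as np

def make_st_windows(series, width):
    """Build spatiotemporal training samples from traffic readings.
    series: (T, num_segments) array, one reading per segment per step.
    Each sample X[t] holds the last `width` steps for ALL segments, so a
    downstream model (e.g. an LSTM) can use both the segment's own
    history and cross-segment correlation; y[t] is the next step."""
    X = np.stack([series[t:t + width] for t in range(len(series) - width)])
    y = series[width:]                 # next-step target for every segment
    return X, y

# Toy data: 6 time steps, 3 road segments.
series = np.arange(18.0).reshape(6, 3)
X, y = make_st_windows(series, width=2)
```

Each `X[t]` has shape `(width, num_segments)`, so the temporal axis feeds the recurrence while the segment axis exposes the spatial correlation.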
“…proposed a model based on Word2vec word-vector composition, which builds document representations simply from word vectors. Peng H et al. [10] and Ji J et al. [11] proposed using BiLSTM and CNN models for text representation, building more complex network models for document modeling in order to extract key information from documents.…”
Section: Introduction
confidence: 99%
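The core of the BiLSTM text representation mentioned above is concatenating a forward pass and a backward pass per token. This can be illustrated with a simple tanh RNN standing in for each LSTM direction; all sizes and parameters are hypothetical:

```python
import numpy as np

def rnn_states(x_seq, W, U, b):
    """Per-token hidden states from a simple tanh RNN, a lightweight
    stand-in for one direction of a BiLSTM."""
    h = np.zeros(U.shape[0])
    states = []
    for x in x_seq:
        h = np.tanh(x @ W + h @ U + b)
        states.append(h)
    return np.array(states)           # (T, hidden)

def bidirectional_encode(x_seq, fwd_params, bwd_params):
    """Concatenate forward states with backward states (re-reversed
    into document order), yielding one 2*hidden vector per token."""
    fwd = rnn_states(x_seq, *fwd_params)
    bwd = rnn_states(x_seq[::-1], *bwd_params)[::-1]
    return np.concatenate([fwd, bwd], axis=1)

rng = np.random.default_rng(1)
emb, hid, T = 6, 3, 4                 # hypothetical sizes
mk = lambda: (rng.normal(scale=0.1, size=(emb, hid)),
              rng.normal(scale=0.1, size=(hid, hid)),
              np.zeros(hid))
tokens = rng.normal(size=(T, emb))    # stand-in word embeddings
H = bidirectional_encode(tokens, mk(), mk())   # (T, 2*hid)
```

Each row of `H` sees the tokens both before and after it, which is what gives bidirectional encoders their advantage for extracting key information from documents.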