The World Wide Web Conference 2019
DOI: 10.1145/3308558.3313750
Aspect-level Sentiment Analysis using AS-Capsules

Cited by 60 publications (33 citation statements) · References 24 publications
“…At the document level, Bollegala et al. [40] proposed a method for cross-domain sentiment analysis (SA) using a sentiment-sensitive thesaurus (SST). Blitzer et al. [41] extended structural correspondence learning (SCL).…”
Section: Sentiment Analysis (citation type: mentioning)
Confidence: 99%
“…In this study, we investigate only the second issue, which comprises two main technical lines: rule-based [59] and machine learning-based. Variants of recurrent neural networks (RNNs), such as long short-term memory (LSTM) [92], gated recurrent units (GRUs) [93], and capsule networks [94,95], have been widely used for aspect-level sentiment classification [96-100]. Tang et al. [96] employed a forward LSTM and a backward LSTM to model the left and right contexts of the aspect separately, then concatenated the context representations to predict the sentiment polarity.…”
Section: Recent Researches on Aspect-level Sentiment Classification (citation type: mentioning)
Confidence: 99%
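The left/right-context scheme attributed to Tang et al. [96] can be sketched as follows. This is an illustrative reconstruction only: a simple tanh recurrent cell stands in for the LSTMs of the original model, and all parameter names and sizes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

HIDDEN, EMBED = 8, 6  # hypothetical hidden and embedding sizes

def init_cell():
    # One parameter set for a simple tanh recurrent cell
    # (a stand-in for the LSTMs used by Tang et al.).
    return (rng.normal(scale=0.1, size=(HIDDEN, EMBED)),
            rng.normal(scale=0.1, size=(HIDDEN, HIDDEN)),
            np.zeros(HIDDEN))

def run_cell(params, tokens):
    Wx, Wh, b = params
    h = np.zeros(HIDDEN)
    for x in tokens:                       # consume tokens in the given order
        h = np.tanh(Wx @ x + Wh @ h + b)
    return h                               # final hidden state

def target_dependent_repr(sentence, aspect_idx, fwd, bwd):
    """Forward pass over the left context (up to and including the
    aspect token), backward pass over the right context, then
    concatenate the two final states for the sentiment classifier."""
    left = sentence[:aspect_idx + 1]
    right = sentence[aspect_idx:][::-1]    # backward cell reads right-to-left
    return np.concatenate([run_cell(fwd, left), run_cell(bwd, right)])

# Toy usage: 5 token embeddings, aspect at position 2.
sent = rng.normal(size=(5, EMBED))
rep = target_dependent_repr(sent, 2, init_cell(), init_cell())
print(rep.shape)  # (16,) — two HIDDEN-sized states concatenated
```

The concatenated vector would then feed a softmax classifier over sentiment polarities.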
“…Yang et al. [98] proposed a new approach guided by contextual, lexical, and syntactic cues, in which a target representation sub-network captures the semantic and contextual information of targets and a dependence attention mechanism models the syntactic dependency cues between targets and other words. Wang et al. [99] proposed the aspect-level sentiment capsules model (AS-Capsules), which exploits the correlation between aspect and sentiment to perform aspect detection and sentiment classification jointly. Chen and Qian [100] proposed the Transfer Capsule Network (TransCap), which uses an aspect routing approach together with dynamic routing to transfer document-level knowledge to aspect-level sentiment classification.…”
Section: Recent Researches on Aspect-level Sentiment Classification (citation type: mentioning)
Confidence: 99%
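The dynamic routing that capsule models such as TransCap build on (routing-by-agreement between capsule layers) can be illustrated with a minimal sketch. This is the generic routing mechanism, not the aspect-routing variant of any cited model, and the shapes and iteration count are hypothetical.

```python
import numpy as np

def squash(v, axis=-1, eps=1e-8):
    # Capsule non-linearity: shrink short vectors toward zero and
    # keep long vectors just under unit length.
    n2 = (v ** 2).sum(axis=axis, keepdims=True)
    return (n2 / (1.0 + n2)) * v / np.sqrt(n2 + eps)

def dynamic_routing(u_hat, iters=3):
    """Routing-by-agreement between lower capsules i and upper
    capsules j. u_hat holds prediction vectors, shape
    (num_in, num_out, dim)."""
    num_in, num_out, _ = u_hat.shape
    b = np.zeros((num_in, num_out))                # routing logits
    for _ in range(iters):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)  # softmax over j
        s = (c[..., None] * u_hat).sum(axis=0)                # weighted sum
        v = squash(s)                                         # (num_out, dim)
        b = b + (u_hat * v[None]).sum(axis=-1)                # agreement update
    return v

rng = np.random.default_rng(1)
u_hat = rng.normal(size=(10, 3, 4))  # 10 input capsules -> 3 output capsules
v = dynamic_routing(u_hat)
print(v.shape)  # (3, 4)
```

The agreement term increases the coupling toward output capsules whose vectors align with a lower capsule's prediction, which is what lets capsule models route aspect-relevant features selectively.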
“…Multi-task learning is widely used in natural language processing, for example in jointly learning Chinese word segmentation and named entity recognition (Peng and Dredze 2017), jointly performing aspect detection and sentiment classification (Wang et al. 2019), and jointly extracting named entities and relations (Zheng et al. 2017). The semantic role labeling model LISA (Strubell et al. 2018) simultaneously learned dependency parsing and used a syntactically informed self-attention mechanism to attend to each token's syntactic parse parent.…”
Section: Multi-task Learning (citation type: mentioning)
Confidence: 99%
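The joint aspect-detection/sentiment-classification setup described above can be sketched as a shared encoder with two task-specific heads trained on a summed loss. The linear layers, dimensions, and loss form here are illustrative assumptions, not the architecture of AS-Capsules or any other cited model.

```python
import numpy as np

rng = np.random.default_rng(2)
D_IN, D_SHARED, N_ASPECTS, N_POLARITIES = 6, 8, 4, 3  # hypothetical sizes

# A shared encoder plus one linear head per task; the cited models
# use much richer encoders (LSTMs, capsules, attention).
W_shared = rng.normal(scale=0.1, size=(D_SHARED, D_IN))
W_aspect = rng.normal(scale=0.1, size=(N_ASPECTS, D_SHARED))
W_sent = rng.normal(scale=0.1, size=(N_POLARITIES, D_SHARED))

def softmax(z):
    z = z - z.max()          # numerical stability
    e = np.exp(z)
    return e / e.sum()

def forward(x):
    h = np.tanh(W_shared @ x)  # representation shared by both tasks
    return softmax(W_aspect @ h), softmax(W_sent @ h)

def joint_loss(x, aspect_y, sent_y):
    # Multi-task objective: sum of the two cross-entropies, so
    # gradients from both tasks flow into the shared encoder.
    p_aspect, p_sent = forward(x)
    return -np.log(p_aspect[aspect_y]) - np.log(p_sent[sent_y])

x = rng.normal(size=D_IN)
loss = joint_loss(x, aspect_y=1, sent_y=2)
print(float(loss))
```

Sharing the encoder is what lets the correlation between aspect and sentiment (the motivation given for AS-Capsules) be exploited: both objectives shape the same representation.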