2021
DOI: 10.1109/taslp.2021.3058540
Deep Selective Memory Network With Selective Attention and Inter-Aspect Modeling for Aspect Level Sentiment Classification

Cited by 17 publications (4 citation statements)
References 29 publications
“…Finally, a CRF is used to model dependencies among output labels. Lin et al. proposed DSMN [41] to guide the multi-hop attention mechanism by computing the distance between an aspect and its context, capturing aspect-aware context information.…”
Section: Related Work
confidence: 99%
“…9. DSMN [41]: The multi-hop attention is guided by dynamically selected context memory, which integrates the aspect information with the memory network. 10.…”
Section: Interactive Attention Network (IAN): The IAN model [53] utilizes...
confidence: 99%
“…Peiqin Lin et al. [19] introduced the Deep Selective Memory Network (DSMN), a framework for aspect-level sentiment classification that dynamically selects context memory to better guide the multi-hop attention mechanism and integrates inter-aspect information with a deep memory network. DSMN attends to distinct regions of the context memory in different memory-network layers, building a selective attention mechanism based on the distance between an aspect and its context to capture comprehensive aspect-aware context information.…”
Section: Related Work
confidence: 99%
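The distance-guided, multi-hop attention these statements describe can be sketched roughly as follows. This is an illustrative NumPy toy, not the paper's actual formulation: the decay form `1/(1 + lam * dist)`, the `lam` parameter, and the residual query update are assumptions made here for clarity.

```python
import numpy as np

def selective_attention(context, positions, query, aspect_pos, lam=0.1):
    """One hop of distance-guided attention over the context memory.

    context: (n, d) word vectors; positions: word indices in the sentence;
    query: (d,) aspect query vector; aspect_pos: index of the aspect term.
    Returns (aspect-aware summary vector, attention weights).
    """
    # Distance of each context word from the aspect term
    dist = np.abs(np.asarray(positions, dtype=float) - aspect_pos)
    # Assumed decay: words nearer the aspect retain more of their score
    decay = 1.0 / (1.0 + lam * dist)
    scores = (context @ query) * decay        # distance-weighted relevance
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                  # softmax over context words
    return weights @ context, weights

def multi_hop(context, positions, aspect_vec, aspect_pos, hops=3):
    """Multi-hop attention: each hop refines the query with the attended summary."""
    query = aspect_vec
    for _ in range(hops):
        summary, _ = selective_attention(context, positions, query, aspect_pos)
        query = query + summary               # simple residual update (assumed)
    return query
```

The resulting query vector would then feed a sentiment classifier; the point of the sketch is only how position distance reweights attention scores across hops.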
“…Memory networks have been applied to aspect-level sentiment classification, where they can extract rich aspect-aware context information [18]. Combined with an attention mechanism, the performance of memory networks [19,20] can be further improved.…”
Section: Literature Review
confidence: 99%