Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2021
DOI: 10.18653/v1/2021.naacl-main.145
Target-specified Sequence Labeling with Multi-head Self-attention for Target-oriented Opinion Words Extraction

Abstract: Opinion target extraction and opinion term extraction are two fundamental tasks in Aspect-Based Sentiment Analysis (ABSA). Many recent works on ABSA focus on Target-oriented Opinion Words (or Terms) Extraction (TOWE), which aims at extracting the corresponding opinion words for a given opinion target. TOWE can be further applied to Aspect-Opinion Pair Extraction (AOPE), which aims at extracting aspects (i.e., opinion targets) and opinion terms in pairs. In this paper, we propose Target-Specified sequence labelin…

Cited by 10 publications (17 citation statements); references 26 publications.
“…It aims to extract from a sentence the corresponding opinion span describing an aspect. Most work in this area treats OE as a sequence tagging task, for which complex methods are developed to capture the interaction between the aspect and the context (Fan et al., 2019; Wu et al., 2020; Feng et al., 2021). More recent models, such as TSMSA-BERT (Feng et al., 2021) and ARGCN-BERT (Jiang et al., 2021), adopt PLMs.…”
Section: Aspect-based Opinion Extraction (OE)
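The statement above frames opinion extraction as a sequence tagging task: given a sentence and a marked opinion target, each token receives a B/I/O label identifying the opinion words for that target. As a minimal illustration (not code from the cited papers; the sentence, tags, and helper function are invented for this sketch), decoding a BIO tag sequence into opinion spans looks like:

```python
# Illustrative sketch: TOWE output as BIO sequence tags.
# A trained tagger would predict `tags`; here they are hand-written.

def extract_opinion_spans(tokens, tags):
    """Collect opinion-word spans from a BIO tag sequence."""
    spans, current = [], []
    for tok, tag in zip(tokens, tags):
        if tag == "B":                      # a new opinion span begins
            if current:
                spans.append(" ".join(current))
            current = [tok]
        elif tag == "I" and current:        # continue the current span
            current.append(tok)
        else:                               # "O", or stray "I" with no open span
            if current:
                spans.append(" ".join(current))
            current = []
    if current:
        spans.append(" ".join(current))
    return spans

# For the target "battery life", an ideal tagger marks its opinion words:
tokens = ["The", "battery", "life", "is", "remarkably", "long"]
tags   = ["O",   "O",       "O",   "O",  "B",          "I"]
print(extract_opinion_spans(tokens, tags))  # → ['remarkably long']
```

Note that the tag sequence is conditioned on the chosen target; the same sentence with a different target would yield different opinion spans, which is what distinguishes TOWE from plain opinion-term extraction.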
“…Most work in this area treats OE as a sequence tagging task, for which complex methods are developed to capture the interaction between the aspect and the context (Fan et al., 2019; Wu et al., 2020; Feng et al., 2021). More recent models, such as TSMSA-BERT (Feng et al., 2021) and ARGCN-BERT (Jiang et al., 2021), adopt PLMs. In TSMSA-BERT, multi-head self-attention is utilized to enhance the BERT PLM.…”
Section: Aspect-based Opinion Extraction (OE)
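The statement above notes that TSMSA-BERT adds multi-head self-attention on top of a BERT PLM. The paper's exact architecture is not reproduced here; the following is a generic NumPy sketch of one multi-head self-attention layer over token representations, with all dimensions and weight matrices assumed purely for illustration:

```python
import numpy as np

def multi_head_self_attention(X, Wq, Wk, Wv, Wo, n_heads):
    """Generic multi-head self-attention over token vectors X: (seq, d_model)."""
    seq, d_model = X.shape
    d_head = d_model // n_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv

    def split(M):
        # (seq, d_model) -> (n_heads, seq, d_head)
        return M.reshape(seq, n_heads, d_head).transpose(1, 0, 2)

    Qh, Kh, Vh = split(Q), split(K), split(V)
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_head)   # (n_heads, seq, seq)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)          # softmax per head
    heads = weights @ Vh                                    # (n_heads, seq, d_head)
    concat = heads.transpose(1, 0, 2).reshape(seq, d_model) # re-join the heads
    return concat @ Wo                                      # output projection

# Toy shapes, e.g. 5 token vectors of width 8 split across 2 heads:
rng = np.random.default_rng(0)
seq, d_model, n_heads = 5, 8, 2
X = rng.normal(size=(seq, d_model))
Wq, Wk, Wv, Wo = (rng.normal(size=(d_model, d_model)) for _ in range(4))
out = multi_head_self_attention(X, Wq, Wk, Wv, Wo, n_heads)
print(out.shape)  # → (5, 8)
```

In a sequence-labeling setup of the kind the citation describes, the per-token outputs of such a layer would feed a tag classifier; how the target is marked and how this layer is combined with BERT are details specific to the cited model and not shown here.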