Proceedings of the 5th Workshop on Noisy User-Generated Text (W-NUT 2019)
DOI: 10.18653/v1/d19-5505

Exploiting BERT for End-to-End Aspect-based Sentiment Analysis

Abstract: In this paper, we investigate the modeling power of contextualized embeddings from pre-trained language models, e.g. BERT, on the E2E-ABSA task. Specifically, we build a series of simple yet insightful neural baselines to deal with E2E-ABSA. The experimental results show that even with a simple linear classification layer, our BERT-based architecture can outperform state-of-the-art works. Besides, we also standardize the comparative study by consistently utilizing a hold-out development dataset for model selection.
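The abstract's simplest baseline pairs the BERT encoder with a single linear classification layer that tags every token. Below is a minimal sketch of that idea in PyTorch with Hugging Face transformers; the unified BIO-style tag set and the BertLinearABSA class name are illustrative assumptions, not the authors' released code (the paper's actual scheme also includes E/S boundary tags).

```python
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

# Illustrative unified tag set: boundary + sentiment per token (assumption).
TAGS = ["O", "B-POS", "I-POS", "B-NEG", "I-NEG", "B-NEU", "I-NEU"]

class BertLinearABSA(nn.Module):
    """BERT encoder followed by one linear layer that classifies each token."""
    def __init__(self, num_tags=len(TAGS)):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.classifier = nn.Linear(self.bert.config.hidden_size, num_tags)

    def forward(self, input_ids, attention_mask):
        hidden = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        return self.classifier(hidden)  # (batch, seq_len, num_tags) logits

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertLinearABSA()
enc = tokenizer("The pizza was great but the service was slow.", return_tensors="pt")
pred = model(enc["input_ids"], enc["attention_mask"]).argmax(-1)
print([TAGS[i] for i in pred[0].tolist()])  # one tag per wordpiece
```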


Cited by 236 publications (111 citation statements)
References: 53 publications
“…Biomedical Named Entity Recognition is a challenging task in the field of biomedical information processing. Li et al. [31] proposed the use of BERT embeddings with a Conditional Random Field (CRF) layer to find the best tag sequence for a given sentence. Yuan [51] proposed a BERT-based question-answering model that uses a BERT-CRF model to detect mentions of entities in sentences.…”
Section: Contextualized Embeddings Based Models (mentioning; confidence: 99%)
“…On the other hand, many recent state-of-the-art architectures use a CRF layer on top of a contextual language model [31], [45], [49], [51]. Both Li [31] and Yuan et al. [51] used a BERT-CRF model for sequence labelling tasks. In [45], Souza et al. employed a BERT model with a CRF layer, using both feature-based and fine-tuning strategies, for a Portuguese NER task.…”
Section: F. Conditional Random Field (mentioning; confidence: 99%)
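Several of the works quoted above replace the per-token softmax with a CRF layer so that tag transitions are scored jointly and the best sequence is decoded globally. The sketch below shows that BERT-CRF pattern, assuming the third-party pytorch-crf package; the class name and method layout are illustrative, not the code of [31] or [51].

```python
import torch.nn as nn
from torchcrf import CRF  # third-party package: pip install pytorch-crf
from transformers import BertModel

class BertCRFTagger(nn.Module):
    """BERT emission scores decoded jointly by a CRF layer, so predicted
    tag sequences respect learned transition constraints."""
    def __init__(self, num_tags):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.emission = nn.Linear(self.bert.config.hidden_size, num_tags)
        self.crf = CRF(num_tags, batch_first=True)

    def _emissions(self, input_ids, attention_mask):
        hidden = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        return self.emission(hidden)

    def loss(self, input_ids, attention_mask, tags):
        # The CRF forward pass returns the sequence log-likelihood; negate it
        # to obtain a loss suitable for minimization.
        return -self.crf(self._emissions(input_ids, attention_mask),
                         tags, mask=attention_mask.bool())

    def decode(self, input_ids, attention_mask):
        # Viterbi decoding of the globally best tag sequence per sentence.
        return self.crf.decode(self._emissions(input_ids, attention_mask),
                               mask=attention_mask.bool())
```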
“…The advantage of such an approach is that fewer parameters need to be learned from scratch. Several works have shown that BERT can be transferred to tasks such as text summarization and sentiment analysis (Li, Bing, Zhang, & Lam, 2019; Liu & Lapata, 2019). Biomedical NLP researchers have also demonstrated the importance of transfer learning from pre-trained BERT, where state-of-the-art performance is obtained by fine-tuning BERT with large task-/domain-specific data in NER, question answering (QA) and relation extraction (Lee et al., 2020).…”
Section: Word Representation Models (mentioning; confidence: 99%)
“…More recently, some scholars have proposed more complicated models built on advanced neural structures, e.g., transformation networks [12], parameterized convolutional neural networks (PCNN) [13], gated convolutional networks (GCN) [14], memory networks [15], graph networks [16], [17] and semantic cognition networks (SCN) [18]. However, as Li et al. pointed out in [19], the improvement of these models as measured by accuracy or F1 score has reached a bottleneck, because the commonly used embeddings are pre-trained via word2vec or GloVe, which provide only context-independent word-level features and are insufficient for capturing complex semantic dependencies within a sentence.…”
Section: Introduction (mentioning; confidence: 99%)