2021
DOI: 10.1016/j.ijin.2021.06.005
Improving the performance of aspect based sentiment analysis using fine-tuned Bert Base Uncased model

Cited by 55 publications (22 citation statements)
References 8 publications
“…We applied BERT Tokenizer based on WordPiece (Muller et al, 2019 ) for the title, and abstracts from the LitCovid corpus. We used a pre-trained model bert-base-uncased (Geetha & Renuka, 2021 ) and the pre-training was performed on a large corpus of English data (BookCorpus and English Wikipedia) in a self-supervised fashion (Geetha & Renuka, 2021 ; Turc et al, 2019 ).…”
Section: Methods (mentioning)
confidence: 99%
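The statement above describes tokenizing titles and abstracts with the WordPiece-based BERT tokenizer used by bert-base-uncased. As a minimal sketch of how WordPiece segments an out-of-vocabulary word by greedy longest-match, here is a pure-Python version; the toy vocabulary is hypothetical, while the real bert-base-uncased vocabulary has roughly 30,000 entries:

```python
# Greedy longest-match WordPiece tokenization, the scheme used by the
# BERT tokenizer. Non-initial sub-pieces carry the "##" continuation
# prefix; a word with no matching sub-piece maps to [UNK].
def wordpiece_tokenize(word, vocab, unk="[UNK]"):
    tokens, start = [], 0
    while start < len(word):
        end = len(word)
        piece = None
        # Try the longest remaining substring first, then shrink.
        while start < end:
            candidate = word[start:end]
            if start > 0:
                candidate = "##" + candidate
            if candidate in vocab:
                piece = candidate
                break
            end -= 1
        if piece is None:
            return [unk]  # no sub-piece matched at this position
        tokens.append(piece)
        start = end
    return tokens

# Toy vocabulary for illustration only.
vocab = {"sent", "##iment", "##s", "analysis"}
print(wordpiece_tokenize("sentiments", vocab))  # ['sent', '##iment', '##s']
```

In practice this segmentation is what lets a fixed ~30k-entry vocabulary cover arbitrary English text without resorting to [UNK] for most words.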
“…Meanwhile, the term Bayes derives from the principle of Bayes' theorem [20]. One advantage of using naïve Bayes for classification is that it can classify using only a small amount of training data [21].…”
Section: Pendahuluan (Introduction) (unclassified)
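The cited advantage, that naïve Bayes can classify from little training data, follows from the fact that it only estimates per-class word frequencies and a class prior. A minimal sketch of a multinomial naïve Bayes classifier with Laplace smoothing, using illustrative data not drawn from the paper:

```python
# Multinomial naive Bayes with Laplace (add-one) smoothing.
# Training only tallies word counts per class, which is why a small
# training set can already yield a usable classifier.
import math
from collections import Counter, defaultdict

def train(docs):
    """docs: list of (text, label) pairs."""
    class_words = defaultdict(list)
    for text, label in docs:
        class_words[label].extend(text.split())
    vocab = {w for words in class_words.values() for w in words}
    priors = Counter(label for _, label in docs)
    model = {}
    for label, words in class_words.items():
        counts = Counter(words)
        denom = len(words) + len(vocab)  # add-one smoothing denominator
        model[label] = (
            math.log(priors[label] / len(docs)),
            {w: math.log((counts[w] + 1) / denom) for w in vocab},
            math.log(1 / denom),  # log-probability for unseen words
        )
    return model

def predict(model, text):
    def score(entry):
        log_prior, log_likelihood, log_unk = entry
        return log_prior + sum(log_likelihood.get(w, log_unk)
                               for w in text.split())
    return max(model, key=lambda label: score(model[label]))

# Deliberately tiny training set: two labeled documents.
docs = [("great tasty food", "pos"), ("bad slow service", "neg")]
model = train(docs)
print(predict(model, "tasty food"))  # pos
```

With only two training documents the smoothed counts already separate the classes, which is the property the cited work relies on.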
“…They used a clinical BERT model that had already been trained for embeddings with an LSTM, and the results were subpar. Additionally, the study [20] offers a powerful deep-learning model, BERT Base Uncased, to clarify the problem of sentiment analysis. In the experimental evaluation, the BERT model outperformed the other machine learning techniques with good predictions and high accuracy.…”
Section: Related Work (mentioning)
confidence: 99%