2023
DOI: 10.28991/hij-2023-04-02-015

BERT: A Review of Applications in Sentiment Analysis

Md Shohel Sayeed,
Varsha Mohan,
Kalaiarasi Sonai Muthu

Abstract: E-commerce reviews are becoming more valued by both customers and companies. The high demand for sentiment analysis is driven by businesses relying on it as a crucial tool to improve product quality and make informed decisions in a fiercely competitive business environment. The purpose of this review paper is to explore and evaluate the applications of the BERT model, a Natural Language Processing (NLP) technique, in sentiment analysis across various fields. The model has been utilized in certain studies for v…
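As an illustration of the kind of application the review surveys, the sketch below scores e-commerce style review text with a pre-trained BERT-family sentiment classifier. This is a minimal example assuming the Hugging Face transformers library and a publicly available English checkpoint; the review itself does not prescribe a specific model or toolkit.

from transformers import pipeline

# Minimal sketch: classify a few review-like sentences with a pre-trained
# sentiment model. The checkpoint below is an illustrative choice, not one
# named in the paper.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

reviews = [
    "The delivery was fast and the product works exactly as described.",
    "Stopped working after two days, very disappointed.",
]

for review, result in zip(reviews, classifier(reviews)):
    print(f"{result['label']:>8}  {result['score']:.3f}  {review}")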

Cited by 3 publications (2 citation statements)
References 47 publications
“…GPT-4, in particular, has garnered attention for its versatility in various applications, including healthcare-related tasks [45]. BERT (Bidirectional Encoder Representations from Transformers) is another influential LLM that has left an indelible mark on NLP [46,47]. Renowned for its bidirectional training approach, BERT excels in grasping contextual nuances, making it particularly adept at understanding the intricacies of medical language and information.…”
Section: Evolution of Large Language Models (mentioning)
confidence: 99%
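The "bidirectional training approach" the citing authors refer to is BERT's masked-language-modelling objective: a masked token is predicted from context on both its left and right. A minimal sketch of this behaviour, assuming the Hugging Face transformers library and the public bert-base-uncased checkpoint (neither is specified by the cited review):

from transformers import pipeline

# Minimal sketch of BERT's masked-token prediction: the model fills [MASK]
# using context from both sides of the gap.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for candidate in fill_mask("The doctor prescribed [MASK] for the infection."):
    print(f"{candidate['token_str']:>15}  {candidate['score']:.3f}")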
“…RNNs, particularly LSTM networks, have become increasingly popular due to their effectiveness in processing sequential data [29]. In terms of transformer models, the transfer learning ability improved the accuracy of many NLP applications [30,31]. The study by Gopalakrishnan et al [32] investigated the performance of LSTM and Gated Recurrent Unit (GRU) models on a biomedical dataset.…”
Section: Literature Review (mentioning)
confidence: 99%