2023
DOI: 10.3390/s23115232

Transfer Learning for Sentiment Classification Using Bidirectional Encoder Representations from Transformers (BERT) Model

Abstract: Sentiment analysis is currently one of the fastest-growing areas of research, driven by the large volume of web content generated on social networking websites. Sentiment analysis is also a crucial component of most recommender systems. In general, its purpose is to determine an author's attitude toward a subject or the overall tone of a document. A large body of studies has attempted to predict how useful online reviews will be, with conflicting results on the effica…

Cited by 7 publications (1 citation statement)
References 24 publications
“…The attention weights are represented by feature similarity, which can determine the importance of different source domains. Zheng et al. [31, 32] demonstrated the powerful performance of Bidirectional Encoder Representations from Transformers (BERT), a pre-trained model with an attention mechanism at its core, in language representation. The BERT model is employed on a Chinese dataset to generate a text vector representation of intrinsic semantic information in our work.…”
Section: Related Work
confidence: 99%
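The citing passage describes attention weights derived from feature similarity to score the importance of different source domains. A minimal sketch of that idea, using scaled dot-product similarity followed by a softmax (the function names, shapes, and scaling are assumptions for illustration, not the cited papers' actual implementation):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(query, sources):
    """Weight source-domain feature vectors by similarity to a target query.

    query:   (d,)   feature vector for the target domain
    sources: (n, d) feature vectors, one per source domain
    Returns (weights, pooled): attention weights over the n source
    domains and their attention-weighted combination.
    """
    # Scaled dot-product similarity between each source and the query.
    scores = sources @ query / np.sqrt(query.shape[0])
    # Softmax turns similarities into importance weights that sum to 1.
    weights = softmax(scores)
    # Attention-weighted representation of the source domains.
    pooled = weights @ sources
    return weights, pooled

rng = np.random.default_rng(0)
q = rng.standard_normal(8)        # hypothetical target-domain features
S = rng.standard_normal((3, 8))   # hypothetical features for 3 source domains
w, pooled = attention_pool(q, S)
```

The softmax normalization is what lets the weights be read as relative importance: a source domain whose features are more similar to the target receives a larger share of the unit mass.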