2022
DOI: 10.1371/journal.pone.0273936

AFR-BERT: Attention-based mechanism feature relevance fusion multimodal sentiment analysis model

Abstract: Multimodal sentiment analysis is an essential task in natural language processing in which machines, after learning multimodal emotional features, analyze and recognize emotions through logical reasoning and mathematical operations. To address both the effective fusion of multimodal data and the relevance between modalities in multimodal sentiment analysis, we propose an attention-based mechanism feature relevance fusion multimodal sentiment analysis model (AFR-BERT). …
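The abstract is truncated, so the exact AFR-BERT architecture is not shown here. As a rough illustration of the general idea it names (attention used to fuse features across modalities), the following PyTorch sketch implements a generic cross-modal attention fusion layer. All names, dimensions, and the residual design are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of attention-based multimodal feature fusion.
# Assumes text, audio, and visual features are already projected to a
# shared dimension; this is NOT the authors' exact AFR-BERT module.
import torch
import torch.nn as nn

class AttentionFusion(nn.Module):
    def __init__(self, dim: int = 768, n_heads: int = 8):
        super().__init__()
        # Cross-modal attention: text queries attend over audio/visual keys.
        self.cross_attn = nn.MultiheadAttention(dim, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, text, audio, visual):
        # text/audio/visual: (batch, seq_len, dim)
        context = torch.cat([audio, visual], dim=1)   # non-text context
        fused, _ = self.cross_attn(text, context, context)
        return self.norm(text + fused)                # residual fusion

# Usage with dummy tensors:
t = torch.randn(2, 50, 768)   # BERT text features
a = torch.randn(2, 50, 768)   # audio features (pre-projected)
v = torch.randn(2, 50, 768)   # visual features (pre-projected)
print(AttentionFusion()(t, a, v).shape)  # torch.Size([2, 50, 768])
```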

Cited by 8 publications (4 citation statements)
References 42 publications
“…The PAJO model might benefit from the incorporation of more varied features, based on the findings of the ablation experiments. For instance, the performance of the PAJO model might be enhanced using pre-trained language models (BERT) in the BERT-CAM [31] and AFR-BERT by Ji et al. [32] models. Additionally, the Bidirectional Long Short-Term Memory (BiLSTM) used by the AFR-BERT model for pre-processing data might be considered for incorporation into the PAJO model.…”
Section: Discussion (mentioning)
confidence: 99%
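For readers unfamiliar with the BiLSTM pre-processing step this statement attributes to AFR-BERT, the sketch below shows the generic pattern: a bidirectional LSTM run over per-frame modality features before fusion. The input dimension and hidden size are illustrative assumptions, not values taken from the AFR-BERT paper.

```python
# Minimal sketch of BiLSTM pre-processing over sequential modality
# features. Dimensions are assumptions, not the paper's settings.
import torch
import torch.nn as nn

class BiLSTMEncoder(nn.Module):
    def __init__(self, in_dim: int = 74, hidden: int = 128):
        super().__init__()
        # Bidirectional: forward and backward passes are concatenated.
        self.lstm = nn.LSTM(in_dim, hidden, batch_first=True,
                            bidirectional=True)

    def forward(self, x):
        # x: (batch, frames, in_dim) -> (batch, frames, 2 * hidden)
        out, _ = self.lstm(x)
        return out

audio = torch.randn(4, 100, 74)       # e.g. per-frame audio descriptors
print(BiLSTMEncoder()(audio).shape)   # torch.Size([4, 100, 256])
```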
“…Its proficiency in capturing contextual information allows it to comprehend the intricacies of language exceptionally well, leading to superior performance across various NLP tasks. There are several variants of the BERT model, each tailored for specific purposes or devised to address particular limitations of the original model (43–47). The selection of the appropriate BERT-based model depends on various factors, including the nature of the task, dataset size, computational resources, and language requirements.…”
Section: Case Example in Tweets (unclassified)
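As a concrete illustration of the point about selecting among BERT variants by task, data size, and compute budget, the sketch below swaps checkpoints via the Hugging Face Transformers Auto classes. The checkpoint names are real hub identifiers; the selection profiles themselves are hypothetical.

```python
# Sketch: choosing a BERT variant by resource/robustness profile.
from transformers import AutoModel, AutoTokenizer

CHECKPOINTS = {
    "base": "bert-base-uncased",         # standard English BERT
    "small": "distilbert-base-uncased",  # lighter, faster variant
    "robust": "roberta-base",            # robustly optimized pretraining
}

def load_encoder(profile: str = "base"):
    name = CHECKPOINTS[profile]
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModel.from_pretrained(name)
    return tokenizer, model

tokenizer, model = load_encoder("small")
batch = tokenizer(["great movie!"], return_tensors="pt")
hidden = model(**batch).last_hidden_state  # (1, seq_len, hidden_dim)
```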
“…Consequently, we map the scores ∈ [−3, −1] to the negative class, and the scores ∈ [1, 3] to the positive class. Although related works in the literature based on this corpus cluster the sentences corresponding to the neutral sentiment into the negative class [19], we exclude these sentences to minimise biasing our models towards the negative class. Table 2 summarises the number of positive and negative sentences belonging to the resulting train, development, and test partitions.…”
Section: A CMU-MOSEI Dataset (mentioning)
confidence: 99%
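The label mapping described in this statement can be made concrete with a short sketch. The code below binarizes CMU-MOSEI-style sentiment scores and drops neutral sentences; treating the open interval (−1, 1) as neutral is an assumption, since the quoted text does not state the exact neutral boundary.

```python
# Sketch of the binarization described above: scores in [-3, -1] map to
# negative, scores in [1, 3] map to positive, and neutral scores (here
# assumed to be the open interval (-1, 1)) are excluded entirely.
def binarize(scores):
    labels = []
    for s in scores:
        if -3 <= s <= -1:
            labels.append((s, 0))   # negative class
        elif 1 <= s <= 3:
            labels.append((s, 1))   # positive class
        # scores in (-1, 1) are dropped as neutral
    return labels

print(binarize([-2.4, -1.0, 0.0, 0.6, 1.0, 2.7]))
# [(-2.4, 0), (-1.0, 0), (1.0, 1), (2.7, 1)]
```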