Findings of the Association for Computational Linguistics: EMNLP 2020
DOI: 10.18653/v1/2020.findings-emnlp.124

Modeling Intra and Inter-modality Incongruity for Multi-Modal Sarcasm Detection

Abstract: Sarcasm is a pervasive phenomenon on today's social media platforms such as Twitter and Reddit. These platforms allow users to create multi-modal messages, including texts, images, and videos. Existing multi-modal sarcasm detection methods either simply concatenate the features from multiple modalities or fuse the multi-modal information in a designed manner. However, they ignore the incongruity characteristic of sarcastic utterances, which often manifests between modalities or within a modality. Inspired by t…


Cited by 55 publications (38 citation statements)
References 17 publications
“…The multimodal work that most closely matches ours is [10]. In that work, the authors propose a BERT-based architecture for modeling intra- and inter-modality incongruity.…”
Section: Related Work (supporting)
confidence: 74%
“…Specifically, our model gives an improvement of 6.14% on F1-score and 5.15% on accuracy over the current SOTA, the Bridge-RoBERTa model, thus verifying the effectiveness of our model.

Model                 F1      Precision  Recall  Accuracy
(name lost)           0.8060  0.7797     0.8342  0.8402
Res-Bert [10]         0.8157  0.7887     0.8446  0.8480
IIMI-MMSD [10]        0.8292  0.8087     0.8508  0.8605
Bridge-RoBERTa [19]   0.8605  0.8295     0.8939  0.8851
Our Method            0.9219  0.9056     0.9387  0.9366
…”
Section: Results (mentioning)
confidence: 98%
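The quoted improvement figures can be cross-checked against the reported scores. The sketch below assumes the four score columns are F1, precision, recall, and accuracy — an inference from the excerpt, not something it states explicitly — and verifies that the F1 and accuracy gains over Bridge-RoBERTa match the claimed 6.14% and 5.15%.

```python
# Scores copied from the quoted citation statement. The column labels
# (f1, precision, recall, accuracy) are an assumed interpretation.
scores = {
    "Bridge-RoBERTa": {"f1": 0.8605, "precision": 0.8295, "recall": 0.8939, "accuracy": 0.8851},
    "Our Method":     {"f1": 0.9219, "precision": 0.9056, "recall": 0.9387, "accuracy": 0.9366},
}

def f1(precision: float, recall: float) -> float:
    """F1 is the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

f1_gain = scores["Our Method"]["f1"] - scores["Bridge-RoBERTa"]["f1"]
acc_gain = scores["Our Method"]["accuracy"] - scores["Bridge-RoBERTa"]["accuracy"]
print(f"F1 gain: {f1_gain:.4f}")        # 0.0614, i.e. the quoted 6.14%
print(f"Accuracy gain: {acc_gain:.4f}") # 0.0515, i.e. the quoted 5.15%

# The precision/recall columns are internally consistent with the F1 column:
for name, s in scores.items():
    assert abs(f1(s["precision"], s["recall"]) - s["f1"]) < 5e-4, name
```

That the recomputed F1 agrees with the reported F1 column for both rows supports the assumed column ordering.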