2020
DOI: 10.1016/j.asoc.2020.106198
Sarcasm detection in mash-up language using soft-attention based bi-directional LSTM and feature-rich CNN

Cited by 108 publications (41 citation statements)
References 20 publications
“…Then, a maximum-pooling layer, a dropout layer, and finally a fully connected layer have been employed to classify text documents as either sarcastic or non-sarcastic. In another study, Kumar et al. [16] presented a sarcasm identification framework based on a bidirectional long short-term memory network with a convolutional neural network.…”
Section: B. Related Work on Deep Learning Based Approaches (mentioning)
confidence: 99%
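The pipeline the quote describes (convolution, maximum-pooling, dropout, fully connected output) is a standard CNN text-classification head. Below is a minimal PyTorch sketch of that arrangement; the vocabulary size, embedding dimension, filter count, kernel size, and dropout rate are illustrative assumptions, not values taken from the cited studies.

```python
# Minimal sketch: conv -> global max-pooling -> dropout -> fully connected,
# classifying a document as sarcastic vs. non-sarcastic.
# All hyperparameters below are assumptions for illustration.
import torch
import torch.nn as nn

class CNNSarcasmClassifier(nn.Module):
    def __init__(self, vocab_size=20000, embed_dim=100,
                 num_filters=128, kernel_size=3, dropout=0.5):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.conv = nn.Conv1d(embed_dim, num_filters, kernel_size)
        self.dropout = nn.Dropout(dropout)
        self.fc = nn.Linear(num_filters, 2)  # sarcastic vs. non-sarcastic

    def forward(self, token_ids):                      # (batch, seq_len)
        x = self.embedding(token_ids).transpose(1, 2)  # (batch, embed, seq)
        x = torch.relu(self.conv(x))                   # (batch, filters, seq')
        x = x.max(dim=2).values                        # global max-pooling
        return self.fc(self.dropout(x))                # class logits

logits = CNNSarcasmClassifier()(torch.randint(0, 20000, (4, 50)))
```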
“…Ren et al. [28] presented a multi-level memory network for capturing the features of sarcasm expressions using sentiment semantics. Recently, Jain et al. [16] presented a deep learning based framework for sarcasm identification in the English and Hindi languages. To extract semantic feature vectors, a pre-trained GloVe model has been utilized.…”
Section: Volume 4, 2016 (mentioning)
confidence: 99%
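As a concrete illustration of the GloVe step the quote mentions, here is a hedged Python sketch that builds an embedding matrix from a pre-trained GloVe file; the file name glove.6B.100d.txt, the 100-dimensional vectors, and the word_index mapping are assumptions for illustration, not details from the cited framework.

```python
# Sketch: map each vocabulary word to its pre-trained GloVe vector;
# words absent from GloVe keep a zero vector. Row 0 is reserved for padding.
import numpy as np

def load_glove_matrix(path, word_index, embed_dim=100):
    matrix = np.zeros((len(word_index) + 1, embed_dim), dtype=np.float32)
    with open(path, encoding="utf-8") as f:
        for line in f:
            word, *values = line.rstrip().split(" ")
            if word in word_index:
                matrix[word_index[word]] = np.asarray(values, dtype=np.float32)
    return matrix

# Example (hypothetical vocabulary and file path):
# embeddings = load_glove_matrix("glove.6B.100d.txt", {"sarcasm": 1, "not": 2})
```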
“…Consequently, the attention mechanism has been widely used in natural language processing [65], statistical learning [66], and computer vision [67]. There are many variations of the attention mechanism, for example soft attention [68], multi-level attention [69], and multi-dimensional attention [70]. More details on the categories of attention mechanisms can be found in [71].…”
Section: Attention Mechanism in Neural Networks (mentioning)
confidence: 99%
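Soft attention, the variant named in the title of the paper under discussion, can be sketched as a learned softmax weighting over the hidden states of a bi-directional LSTM. The PyTorch sketch below shows the general pattern only; the layer sizes are assumptions, and this is not the authors' implementation.

```python
# Sketch of soft attention over BiLSTM states: score each time step,
# normalize with softmax, and return the attention-weighted context vector.
import torch
import torch.nn as nn

class SoftAttentionBiLSTM(nn.Module):
    def __init__(self, embed_dim=100, hidden_dim=64):
        super().__init__()
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        self.score = nn.Linear(2 * hidden_dim, 1)  # one score per time step

    def forward(self, embedded):               # (batch, seq, embed_dim)
        states, _ = self.bilstm(embedded)      # (batch, seq, 2*hidden)
        weights = torch.softmax(self.score(torch.tanh(states)), dim=1)
        return (weights * states).sum(dim=1)   # (batch, 2*hidden) context

context = SoftAttentionBiLSTM()(torch.randn(4, 50, 100))  # shape (4, 128)
```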
“…A lot of research has been done lately in the domain of natural language processing. In [9], Jain, Kumar, & Garg proposed an RNN approach with a feature-rich CNN to improve the accuracy of sarcasm detection, using a dataset of 3,000 sarcastic tweets and 3,000 non-sarcastic tweets in Hindi and English. This model consists of a preprocessing stage in which the data is normalized by removing punctuation marks, special characters, and regular expressions; the data is then stemmed to return each word to its basic form. After tokenization, the data is embedded using GloVe for English and Hindi-SentiWordNet for Hindi, and the result is an English vector.…”
Section: Related Work (mentioning)
confidence: 99%
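The preprocessing steps the quote lists (regex-based removal of punctuation and special characters, tokenization, stemming) can be sketched for the English branch as follows; the Porter stemmer and the whitespace tokenizer are assumptions, and the Hindi / Hindi-SentiWordNet branch is omitted.

```python
# Sketch of the English preprocessing path: lowercase, strip punctuation
# and special characters with a regex, tokenize, then stem each token.
import re
from nltk.stem import PorterStemmer

stemmer = PorterStemmer()

def preprocess(tweet):
    # Normalize: keep only letters and whitespace.
    cleaned = re.sub(r"[^a-zA-Z\s]", " ", tweet.lower())
    # Tokenize on whitespace, then stem each token to its base form.
    return [stemmer.stem(tok) for tok in cleaned.split()]

print(preprocess("Oh great, ANOTHER Monday... #sarcasm"))
# Yields stemmed tokens, e.g. ['oh', 'great', 'anoth', 'monday', 'sarcasm']
```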