Proceedings of the 28th International Conference on Computational Linguistics 2020
DOI: 10.18653/v1/2020.coling-main.270

Modality Enriched Neural Network for Metaphor Detection

Abstract: Metaphor, as a cognitive mechanism in the human conceptual system, manifests itself as an effective way of language communication. Although intuitively sensible for humans, metaphor detection is still a challenging task due to the subtle ontological differences between metaphorical and non-metaphorical expressions. This work proposes a modality-enriched deep learning model to tackle this unsolved issue. It provides a new perspective for understanding metaphor as a modality shift, as in 'sweet voice'. It also…

Cited by 7 publications (9 citation statements)
References 35 publications
“…We compare our model with existing approaches which do not use external knowledge. We do not compare with the works that divided the dataset into the train set and test set by themselves, such as Wan and Xing (2020). Since Gao et al. (2018) and Mao et al. (2019) used a different subset of VUA, we use the results reported by Neidlein et al. (2020) on VUA ALL POS and VUA Verbs for comparison.…”
Section: Results
mentioning confidence: 99%
“…Recent metaphor research (Gao et al., 2018; Mao et al., 2019) and the ACL 2020 Metaphor Shared Task (Leong et al., 2020) regard it as a sequence labeling task. Although many previous works have explored ways to enhance the contextualized representation within a sentence (Gao et al., 2018; Mao et al., 2019), or to introduce some external knowledge (Rohanian et al., 2020; Wan and Xing, 2020), most of them do not make full use of the information in the dataset, from which the metaphor identification process may benefit.…”
Section: Introduction
mentioning confidence: 99%
“…One approach attempts to predict these judgments about concreteness or salient sensorimotor features from LM representations, with varying degrees of success (Thompson and Lupyan, 2018; Turton et al., 2020; Chersoni et al., 2020; Utsumi, 2020). Another approach uses sensorimotor features to augment the ability of an LM on an applied task, such as the GLUE benchmark (Kennington, 2021) or metaphor detection (Wan et al., 2020b). As mentioned in Section 1, these experiments are limited in that the sensorimotor features themselves were obtained for words in isolation.…”
Section: Grounding LMs with Psycholinguistic Resources
mentioning confidence: 99%
“…The modality norm describes every word in terms of six main senses (auditory, gustatory, haptic, visual, olfactory, and interoceptive). Given the hypothesis that a metaphor shows a shift in modality from source to target concepts, Wan et al. (2020) concatenated the modality norms with word embeddings (GloVe; Pennington et al., 2014). The results outperformed several BERT-based baselines, showing that the modality norm is a helpful feature in identifying metaphors.…”
Section: Sentence-level Metaphor Identification
mentioning confidence: 99%
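The feature-concatenation idea described in the statement above can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the tiny embedding table, the modality-norm values, and the `enrich` helper are all made-up stand-ins (a real system would use 300-d GloVe vectors and published sensorimotor norm ratings).

```python
import numpy as np

# The six perceptual modalities named in the citation statement above.
MODALITIES = ["auditory", "gustatory", "haptic", "visual", "olfactory", "interoceptive"]

# Toy modality norms: made-up association strengths (0-1), one value per modality.
modality_norms = {
    "sweet": [0.1, 0.9, 0.2, 0.3, 0.4, 0.2],
    "voice": [0.9, 0.0, 0.1, 0.2, 0.0, 0.1],
}

# Toy 4-d vectors standing in for 300-d GloVe embeddings.
embeddings = {
    "sweet": [0.12, -0.40, 0.33, 0.05],
    "voice": [-0.21, 0.08, 0.51, -0.14],
}

def enrich(token: str) -> np.ndarray:
    """Concatenate a word's embedding with its 6-d modality-norm vector."""
    emb = np.asarray(embeddings[token])
    norms = np.asarray(modality_norms.get(token, [0.0] * len(MODALITIES)))  # zeros if unseen
    return np.concatenate([emb, norms])

vec = enrich("sweet")
print(vec.shape)  # (10,): 4-d embedding + 6-d modality vector
```

The enriched vectors would then feed a sequence labeler in place of the plain embeddings, letting the model compare modality profiles across a phrase (e.g. the gustatory "sweet" modifying the auditory "voice").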