2022
DOI: 10.1109/taffc.2020.2964549
Recognition of Advertisement Emotions With Application to Computational Advertising

Abstract: Advertisements (ads) often contain strong affective content to capture viewer attention and convey an effective message to the audience. However, most computational affect recognition (AR) approaches examine ads via the text modality, and only limited work has been devoted to decoding ad emotions from audiovisual or user cues. This work (1) compiles an affective ad dataset capable of evoking coherent emotions across users; (2) explores the efficacy of content-centric convolutional neural network (CNN) features…

Cited by 26 publications (23 citation statements)
References 48 publications
“…Deep neural networks are the state of the art in text, speech, image, video and EEG-based recognition [48], [65], [49], [47], [25], [21], and have outperformed traditional machine learning methods, obviating the need for handcrafted features [62]. We explored 1D, 2D and 3D-CNNs to learn EEG representations.…”
Section: E. Convolutional Neural Network Pipeline
confidence: 99%
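The quoted pipeline learns EEG representations with 1D (as well as 2D and 3D) convolutions. As a rough illustration of the 1D case only, here is a minimal NumPy sketch; the signal, the averaging kernel, and the stride are hypothetical stand-ins, not details of the cited model:

```python
import numpy as np

def conv1d(signal, kernel, stride=1):
    """Valid-mode 1D cross-correlation over a single EEG channel."""
    klen = len(kernel)
    out_len = (len(signal) - klen) // stride + 1
    return np.array([
        np.dot(signal[i * stride : i * stride + klen], kernel)
        for i in range(out_len)
    ])

# Toy 1-second EEG window sampled at 128 Hz (illustrative signal only).
eeg = np.sin(np.linspace(0, 8 * np.pi, 128))

# An 8-tap averaging filter stands in for a learned convolutional kernel.
kernel = np.ones(8) / 8.0
features = np.maximum(conv1d(eeg, kernel, stride=4), 0.0)  # ReLU non-linearity
print(features.shape)  # (31,)
```

A trained CNN would stack many such kernels and layers; the point here is only the shape of the computation: a sliding dot product followed by a non-linearity turns a raw EEG window into a shorter feature sequence.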
“…Implicit physiological or biosignals reflect characteristic activity of the central nervous system, and cannot be intentionally suppressed. Recent studies have extensively employed biosignals [19], [20], [21] for emotion perception in healthy subjects. EEG, functional Magnetic Resonance Imaging (fMRI), Magnetoencephalogram (MEG) and Positron Emission Tomography (PET) provide reliable information on emotional states compared to other modalities [22].…”
Section: Introduction
confidence: 99%
“…Audiovisual emotion recognition has also seen a significant amount of recent research effort. Automatic affect recognition has applications across many fields, from detecting depression [37] to more emotionally relevant advertising [38], [39]. Many contemporary affect analysis approaches are based on deep neural networks that study both the visual and audio modalities [40], [41].…”
Section: Audiovisual Speech and Emotion Recognition
confidence: 99%
“…Table 1 summarizes statistics of the FakeET dataset, in terms of the eye-gaze and EEG recordings available for research. While a substantial amount of EEG data was visually found to be noisy, and ignored for our analyses, we nevertheless note that sophisticated machine learning algorithms designed for noisy data (e.g., multiple instance learning, Siamese neural networks) can be explored to improve EEG-based FD as in [18]. In terms of eye-gaze recordings, each video corresponds to a minimum of 2, and a maximum of 16 eye-track files (some data was lost due to subjects closing their eyes, tracker errors, etc.).…”
Section: Gaze Pattern Analysis
confidence: 99%
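The Siamese-network idea suggested above, comparing pairs of EEG segments through a shared encoder so that noisy copies of the same signal land close together, can be sketched as follows; the linear projection, seed, and toy signals below are hypothetical, not taken from the cited work:

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared linear projection standing in for a trained encoder.
W = rng.standard_normal((16, 128)) * 0.1

def embed(x):
    """Both branches of the Siamese pair use the same weights W."""
    return np.tanh(W @ x)

def pair_distance(x1, x2):
    """Euclidean distance between the two embeddings."""
    return float(np.linalg.norm(embed(x1) - embed(x2)))

# Toy 128-sample 'EEG' segments: a clean sinusoid, a noisy copy, pure noise.
clean = np.sin(np.linspace(0, 4 * np.pi, 128))
noisy = clean + 0.05 * rng.standard_normal(128)
unrelated = rng.standard_normal(128)

print(pair_distance(clean, noisy), pair_distance(clean, unrelated))
```

In an actual Siamese setup the encoder weights are learned with a contrastive loss so that same-label pairs (e.g., clean and noisy recordings of the same stimulus) have small distances; here the shared-weight structure is the only part being illustrated.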