2014 IEEE International Conference on Multimedia and Expo (ICME)
DOI: 10.1109/icme.2014.6890161
Emotion recognition from users' EEG signals with the help of stimulus videos

Cited by 22 publications (13 citation statements)
References 9 publications
“…• This is the first work to obtain video features as privileged information for emotion recognition from EEG [43], and to use EEG features as privileged information for video emotion tagging. We regard the proposed method as implicit fusion, which uses multiple modalities only during training, not during testing.…”
Section: Video Emotional Tagging
confidence: 99%
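The implicit-fusion idea quoted above can be approximated with a distillation-style scheme in which the privileged modality is consumed only at training time. The following Python sketch is a hypothetical illustration of that pattern, not the cited authors' actual method: a teacher model is fit on video features, and an EEG-only student is trained against a blend of the hard labels and the teacher's soft predictions. All array names, dimensions, and the blending weight `lam` are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-ins: X_eeg (EEG features), X_vid (video features,
# privileged), y (binary emotion labels). All shapes are assumptions.
rng = np.random.default_rng(0)
n, d_eeg, d_vid = 200, 76, 64
X_eeg = rng.normal(size=(n, d_eeg))
X_vid = rng.normal(size=(n, d_vid))
y = rng.integers(0, 2, size=n)

# 1. A "teacher" sees the privileged (video) modality at training time only.
teacher = LogisticRegression(max_iter=1000).fit(X_vid, y)
soft = teacher.predict_proba(X_vid)[:, 1]          # teacher's soft targets

# 2. The EEG-only "student" is trained on a crude blend of the hard labels
#    and the teacher's soft predictions (a distillation-style shortcut).
lam = 0.5                                          # assumed blending weight
blended = (lam * y + (1 - lam) * soft > 0.5).astype(int)
student = LogisticRegression(max_iter=1000).fit(X_eeg, blended)

# 3. At test time only EEG is available -- the "implicit fusion" property.
X_test = rng.normal(size=(5, d_eeg))
print(student.predict(X_test))
```

At prediction time the student consumes EEG alone, matching the training-only use of the second modality that the quoted statement describes.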
“…Certainly, a smaller number of participants could lead to unbalanced results (see Table 5). Based on the survey, EEG-based emotion databases such as SEED [192] and MAHNOB-HCI [95] have been constructed using 15 and 30 subjects, respectively. However, the widely used database DEAP [37] was created using 32 subjects, showcasing the reliability of the data.…”
Section: Subject Information
confidence: 99%
“…By using a fast Fourier transformation, the frequency features (60 power features, 16 power difference features) were prepared. In each channel, the power features were computed on four frequency bands, i.e., theta (4-8 Hz), alpha (8-12 Hz), beta (12-30 Hz) and gamma (30-45 Hz). Power difference features were employed to detect the variation in cerebral activity between the left and right cortical areas.…”
Section: Feature Extraction and the Target Emotion Classes
confidence: 99%
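The band-power computation quoted above is straightforward to reproduce. The following NumPy sketch computes per-channel FFT power in the four stated bands, plus left-minus-right power-difference features; the sampling rate, channel count, and electrode index pairs are illustrative assumptions, not values from the cited paper.

```python
import numpy as np

def band_powers(eeg, fs=128.0):
    """Per-channel band power via FFT.

    eeg: array of shape (n_channels, n_samples).
    Bands follow the citing paper: theta 4-8 Hz, alpha 8-12 Hz,
    beta 12-30 Hz, gamma 30-45 Hz.
    """
    bands = {"theta": (4, 8), "alpha": (8, 12),
             "beta": (12, 30), "gamma": (30, 45)}
    n = eeg.shape[1]
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg, axis=1)) ** 2 / n  # periodogram-style power
    return {name: psd[:, (freqs >= lo) & (freqs < hi)].sum(axis=1)
            for name, (lo, hi) in bands.items()}

def power_differences(powers, left_idx, right_idx):
    """Left-minus-right band power for homologous electrode pairs
    (e.g. F3/F4), capturing the hemispheric asymmetry the quote mentions."""
    return {name: p[left_idx] - p[right_idx] for name, p in powers.items()}

# Illustrative usage: 15 channels, 10 s of 128 Hz data (assumed values).
rng = np.random.default_rng(0)
eeg = rng.normal(size=(15, 1280))
powers = band_powers(eeg)
diffs = power_differences(powers, left_idx=[0, 2], right_idx=[1, 3])
```

With 15 channels and four bands this yields 60 power features, consistent with the count in the quoted statement; the 16 difference features would come from a specific set of left/right electrode pairs that the excerpt does not enumerate.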