2013
DOI: 10.1109/t-affc.2013.6

EEG-Based Classification of Music Appraisal Responses Using Time-Frequency Analysis and Familiarity Ratings

Cited by 91 publications (53 citation statements)
References 30 publications
“…Soleymani et al [3] proposed a user-independent emotion recognition method using EEG, pupillary response and gaze distance, which achieved the best classification accuracies of 68.5% for three labels of valence and 76.4% for three labels of arousal using a modality fusion across 24 participants. Hadjidimitriou et al [43] employed three time-frequency distributions (spectrogram, Hilbert-Huang spectrum, and Zhao-Atlas-Marks transform) as features to classify ratings of liking and familiarity. They also investigated the time course of music-induced affect responses and the role of familiarity.…”
Section: Related Work
confidence: 99%
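For orientation, the following is a minimal sketch of the kind of spectrogram-based band-power feature such time-frequency approaches derive from an EEG channel; the names (eeg, fs, beta_spectrogram_feature), the band limits, and the use of scipy are illustrative assumptions, not the pipeline of the cited work.

# Minimal sketch (assumption: a single EEG channel `eeg` sampled at `fs` Hz);
# computes a spectrogram (one of the three TFDs mentioned above) and averages
# power inside a beta band to obtain one scalar descriptor per channel.
import numpy as np
from scipy.signal import spectrogram

def beta_spectrogram_feature(eeg, fs, band=(13.0, 30.0)):
    # Short-time power estimate over time and frequency.
    f, t, Sxx = spectrogram(eeg, fs=fs, nperseg=int(fs), noverlap=int(fs) // 2)
    mask = (f >= band[0]) & (f <= band[1])
    # Mean band power over the whole excerpt: one feature value.
    return Sxx[mask, :].mean()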
“…In these studies, brain signals were recorded using an EEG headset while the subjects listened to music [44,53,58,100,110,112,115,116,151,154,190,205,216,220,222,235,276,279]. The subjects' emotions were then recognized from the EEG signals.…”
Section: Domain Description References
confidence: 99%
“…To increase frequency resolution, we divided the β rhythm into β low (13-20 Hz) and β high (20-30 Hz) sub-bands and derived descriptors separately. Similarly, the γ rhythm was divided into γ low (30-49 Hz) and γ high (51-90 Hz).…”
Section: Discussion
confidence: 99%
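As a rough illustration of splitting rhythms into sub-bands and deriving one power descriptor per sub-band, here is a hedged sketch: the band edges follow the quoted statement, while the function and variable names, the use of Welch's method, and its parameters are assumptions for illustration, not the authors' code.

# Sketch: Welch PSD integrated over each sub-band gives one power descriptor
# per band and channel. Band edges follow the quoted statement; everything
# else is assumed for illustration.
import numpy as np
from scipy.signal import welch

SUB_BANDS = {
    "beta_low": (13.0, 20.0),
    "beta_high": (20.0, 30.0),
    "gamma_low": (30.0, 49.0),
    "gamma_high": (51.0, 90.0),  # the 49-51 Hz gap likely avoids 50 Hz line noise
}

def sub_band_powers(eeg, fs):
    # `eeg`: one EEG channel, `fs`: sampling rate in Hz (assumed inputs).
    f, pxx = welch(eeg, fs=fs, nperseg=2 * int(fs))
    powers = {}
    for name, (lo, hi) in SUB_BANDS.items():
        mask = (f >= lo) & (f <= hi)
        powers[name] = np.trapz(pxx[mask], f[mask])  # integrate PSD over the band
    return powers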