2016
DOI: 10.1088/1741-2560/13/4/046022
Affective brain–computer music interfacing

Abstract: Objective: We aim to develop and evaluate an affective brain–computer music interface (aBCMI) modulating the affective states of its users. Approach: An aBCMI is constructed to detect a user's current affective state and attempt to modulate it in order to achieve specific objectives (for example, making the user calmer or happier) by playing music which is generated according to a specific affective target by an algorithmic music composition system and a case-based reasoning system. The system is trained a…
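The abstract describes a closed loop: estimate the listener's current affective state from EEG, compare it with a target state, and generate music intended to move the listener toward that target. The sketch below is a hypothetical illustration of such a loop only; the feature extractor, classifier, and music-parameter mapping are placeholder assumptions and do not reproduce the paper's actual algorithmic composition or case-based reasoning components.

```python
# Minimal sketch of an affective BCMI control loop (illustrative assumptions only).
import numpy as np


def extract_features(eeg_window: np.ndarray) -> np.ndarray:
    """Toy feature extractor: log-variance per channel (placeholder for band power)."""
    return np.log(np.var(eeg_window, axis=1) + 1e-12)


def classify_affect(features: np.ndarray) -> tuple:
    """Placeholder valence/arousal estimate, each squashed into [-1, 1]."""
    valence = float(np.tanh(features.mean()))
    arousal = float(np.tanh(features.std()))
    return valence, arousal


def generate_music_parameters(current, target):
    """Step simple music parameters (tempo, mode) toward the affective target."""
    valence, arousal = current
    target_valence, target_arousal = target
    tempo = 90 + 60 * (arousal + 0.5 * (target_arousal - arousal))  # BPM
    mode = "major" if valence + 0.5 * (target_valence - valence) > 0 else "minor"
    return {"tempo_bpm": round(tempo, 1), "mode": mode}


def abcmi_step(eeg_window, target=(0.8, -0.2)):
    """One closed-loop iteration: sense -> classify -> compose toward the target state."""
    current = classify_affect(extract_features(eeg_window))
    return generate_music_parameters(current, target)


if __name__ == "__main__":
    fake_eeg = np.random.randn(8, 256)  # 8 channels x 1 s at a nominal 256 Hz
    print(abcmi_step(fake_eeg))
```

In a real system the classifier would be trained on labelled calibration data and the composition system would produce audio rather than a parameter dictionary; this sketch only shows where those pieces sit in the loop.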

Cited by 49 publications (54 citation statements)
References 48 publications
“…The number of electrodes that are used during experimentation in emotion detection situations imposes time constraints on the algorithms. For example, in [44], the authors built a system that detects current user affective states and obtained a classification accuracy of 65%. In [45,46], they tested their method online and offline.…”
Section: Design Innovation (Experimental) Paper
mentioning confidence: 99%
“…In these studies, brain signals were recorded using an EEG headset while the subject listens to music [44,53,58,100,110,112,115,116,151,154,190,205,216,220,222,235,276,279]. Moreover, the subjects' emotions were recognized as displayed by EEG signals.…”
Section: Domain Description References
mentioning confidence: 99%
“…The calibration session was designed to identify neural and physiological correlates of emotional responses to music, while subsequent runs were used to identify trajectories for moving between affective states. The details of the complete study are described elsewhere [36], along with a comprehensive discussion of details such as referencing schemes. In this present study we analyse the data obtained during the calibration session to determine whether personalised classification methods for each participant provide better results than generic classifiers for all participants.…”
Section: Methods
mentioning confidence: 99%
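As a rough illustration of the personalised-versus-generic comparison raised in the quotation above, the sketch below evaluates one classifier per participant and one classifier pooled over all participants. The synthetic data, the choice of logistic regression, and the cross-validation setup are assumptions for illustration only, not the cited study's actual pipeline.

```python
# Personalised vs. generic classifier comparison on synthetic per-participant data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_participants, n_trials, n_features = 5, 60, 10

scores_personalised, X_all, y_all = [], [], []
for p in range(n_participants):
    # Synthetic "EEG features" with a participant-specific offset and decision rule.
    X = rng.normal(size=(n_trials, n_features)) + 0.3 * p
    y = (X[:, 0] > 0.3 * p).astype(int)
    # Personalised: cross-validate a classifier on this participant's data only.
    scores_personalised.append(
        cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
    )
    X_all.append(X)
    y_all.append(y)

# Generic: one classifier cross-validated on all participants pooled together.
score_generic = cross_val_score(
    LogisticRegression(max_iter=1000), np.vstack(X_all), np.concatenate(y_all), cv=5
).mean()

print(f"mean personalised accuracy: {np.mean(scores_personalised):.2f}")
print(f"generic (pooled) accuracy:  {score_generic:.2f}")
```

Because each synthetic participant uses a slightly different decision threshold, the pooled model cannot match all of them at once, mirroring the motivation for participant-specific calibration.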
“…Two facets may drive its intensive interest. On one hand, it enables a wide spectrum of intriguing emotion-oriented applications such as, machine intelligence (Chen et al, 2017), receptionist robots (Pinheiro et al, 2017), content recommendation devices (Lee and Shin, 2013), tutoring systems (Muñoz et al, 2010), and music therapy (Ian et al, 2016). On the other hand, recent explosive innovations in wearable sensing technology considerably bring laboratory-demonstrated emotion-aware research closer to our daily life, necessitating a robust and accurate emotion-aware analytical framework.…”
Section: Introduction
mentioning confidence: 99%