This article reports a study in which listeners were asked to provide continuous ratings of the perceived emotional content of clinical music therapy improvisations. Participants were presented with 20 short excerpts of music therapy improvisations and rated perceived activity, pleasantness and strength using a computer-based slider interface. A total of nine musical features relating to various aspects of the music (timing, register, dynamics, tonality, pulse clarity and sensory dissonance) were extracted from the excerpts, and relationships between these features and participants' emotion ratings were investigated. The data were analysed in three stages. First, inter-dimension correlations revealed that ratings of activity and pleasantness were moderately negatively correlated, activity and strength were strongly positively correlated, and strength and pleasantness were moderately negatively correlated. Second, a series of cross-correlation analyses revealed that the temporal lag between musical features and listeners' dimension ratings differed across both variables and dimensions. Finally, a series of linear regression analyses produced significant feature prediction models for each of the three dimensions, accounting for 80 percent (activity), 57 percent (pleasantness) and 84 percent (strength) of the variance in participants' ratings. Activity was best predicted by high note density and high pulse clarity, pleasantness by low note density and high tonal clarity, and strength by high mean velocity and low note density. The results are discussed in terms of their fit with other work reported in the music psychology literature, and their relevance to clinical music therapy research and practice.
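The cross-correlation stage described above can be sketched as a search for the lag at which a musical-feature time series best predicts a continuous rating series. This is a minimal illustration, not the study's exact procedure: the function name, the sampling assumptions, and the restriction to non-negative lags (ratings trailing the music) are assumptions for the sketch.

```python
import numpy as np

def best_lag(feature, ratings, max_lag):
    """Return the non-negative lag (in samples) at which a musical
    feature series correlates most strongly with a continuous
    emotion-rating series, plus the correlation at that lag.

    Assumes both series are sampled at the same rate and that the
    ratings can only trail the music (lag >= 0).
    """
    best, best_r = 0, 0.0
    for lag in range(0, max_lag + 1):
        # Compare feature at time t with the rating at time t + lag.
        f = feature[: len(feature) - lag] if lag else feature
        r = ratings[lag:]
        corr = np.corrcoef(f, r)[0, 1]
        if abs(corr) > abs(best_r):
            best, best_r = lag, corr
    return best, best_r
```

On synthetic data in which the ratings are simply the feature delayed by a known number of samples, the function recovers that delay; with real continuous ratings the peak correlation would be lower and the lag would be expected to vary across features and dimensions, as the study reports.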
The present study sought to identify relationships between musical features of music therapy improvisations and clients' level of mental retardation, using a computational method of analysis. A total of 216 improvisations, contributed by 50 clients, were collected in MIDI format. Clients were divided into four groups according to their level of diagnosed mental retardation: 1 = none, 2 = mild, 3 = moderate, 4 = severe or profound. 43 client-related musical features were automatically extracted from the improvisations in the MATLAB computing environment and entered into a series of linear regression analyses as predictors of clients' level of mental retardation. The final model, which contained nine significant musical variables, accounted for 67% of the variation in clients' level of mental retardation. Specifically, level of mental retardation was best predicted by temporal elements of the music relating to note duration, note density, articulation, and amount of silence.
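The regression step in both studies can be sketched as an ordinary least-squares fit of extracted feature values to a target variable, reporting the proportion of variance explained (R²). This is a simplified illustration under assumed inputs; the studies used stepwise model selection and statistical significance testing not shown here, and the function name is invented for the sketch.

```python
import numpy as np

def fit_linear_model(X, y):
    """Ordinary least-squares fit of feature predictors X
    (n_observations x n_features) to a target y.

    Returns the coefficient vector (intercept first) and R^2,
    the proportion of variance in y explained by the model.
    """
    # Prepend a column of ones so the model includes an intercept.
    Xb = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    resid = y - Xb @ beta
    r2 = 1 - resid.var() / y.var()
    return beta, r2
```

With predictors such as note density or mean velocity as columns of `X` and an emotion rating (or diagnostic group) as `y`, the returned R² corresponds to the variance-explained figures quoted in the abstracts (e.g. 67% for the final nine-variable model).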