Research has shown inconsistent results concerning the ability of young children to identify musical emotion. This study explores the influence of the type of musical performance (vocal vs. instrumental) on children's affect identification. Using an independent-groups design, novel child-directed music was presented in three conditions — instrumental, vocal-only, and song (instrumental plus vocals) — to 3- to 6-year-olds previously screened for language development (n = 76). A forced-choice task was used in which children chose a face expressing the emotion matching each musical track. All performance conditions comprised 'happy' (major mode/fast tempo) and 'sad' (minor mode/slow tempo) tracks. Nonsense syllables rather than words were used in the vocals to avoid the influence of lyrics on children's decisions. The results showed that even the younger children were able to correctly identify the intended emotion in the music, although 'happy' music was more readily recognized, and recognition appeared to be facilitated in the instrumental condition. Performance condition also interacted with gender.