In this article, the rapidly growing body of recent research on crossmodal correspondences involving auditory and gustatory/flavor stimuli is critically reviewed. The evidence demonstrates that people reliably match different tastes/flavors to auditory stimuli varying in both their psychoacoustic (e.g., pitch) and musical (e.g., timbre) properties. In order to stimulate further progress in this relatively young field of research, the present article aims to consolidate prior findings concerning specific auditory-gustatory mappings, with special attention given to highlighting (1) any conflicts in the existing experimental evidence and (2) any caveats regarding the most appropriate interpretation of prior studies. Next, potential mechanisms underlying auditory-gustatory crossmodal correspondences are discussed. Finally, a number of potentially fruitful avenues for future research are outlined.
Though practitioners have relied on tempo as a criterion in designing in-store music, scant attention has been devoted to the mode of musical selections, and no consideration has been given to the potential interactive effects of low-level structural elements of music on actual retail sales. The current research reports a field experiment in which the positive main effect of slow tempo on actual sales reported by Milliman (J Marketing 46(3):86-91, 1982; J Cons Res 13(2):286-289, 1986) is qualified by musical mode. A significant interaction between tempo and mode was observed, such that music in a major mode did not vary in effectiveness by tempo, whereas music in a minor mode was significantly more effective when paired with a slow tempo. That is, the Milliman effect was eliminated for music in a major mode. Implications of these findings and directions for further research are discussed.

Keywords: Retail atmospherics; Musical tempo; Musical mode; Linear mixed models

1 Introduction

Recognizing the potential for music to influence individual affect, cognition, and behavior, which in turn shape consumer behavior and decision making, marketers invest substantial resources in an effort to incorporate music effectively into the design of retail environments (e.g., Morrison and Beverland 2003). Considerable research in the field of atmospherics has examined the effects of high-level, global properties of music, including music versus no-music comparisons (Park and Young 1986), background versus foreground conditions (
Searching for a particular product in a supermarket can be a challenging business. The question therefore arises as to whether cues from the shopper’s other senses can be used to facilitate, guide, or bias visual search toward a particular product or product type. Prior research suggests that characteristic sounds can facilitate visual object localization (Iordanescu et al., 2008, 2010). Extending these findings to an applied setting, we investigated whether product-related sounds would facilitate visual search for products from different categories (e.g., champagne, potato crisps, deodorant) when arranged on a virtual shelf. On each trial, participants were visually presented with the name of a target product and then located the target within a virtual shelf display containing pictures of four different products (randomly selected from a set of nine). The visual display was randomly accompanied by a target-congruent, a target-incongruent, an unrelated, or no sound. Congruent sounds were semantically related to the target (e.g., uncorking a champagne bottle), incongruent sounds were related to the product shown in the corner opposite to the target, and unrelated sounds did not correspond to any of the products shown in the display. Participants found the target product significantly faster when the sound was congruent rather than incongruent with the target. All other pairwise comparisons were non-significant. These results extend the facilitatory crossmodal effect of characteristic sounds on visual search performance described earlier to the more realistic context of a virtual shelf display, showing that characteristic sounds can crossmodally enhance the visual processing of actual products.