Special issue: Enaction and Music. Considering the process of artistic creation to be deeply linked with technology, we propose a conceptual framework that gives meaning to the concept of artistic creation tools in the context of the computer. Starting from an initial, technology-free situation, we introduce the notion of the musical instrument as the first appearance of technology in music. Then, introducing the major technological mutations step by step, we characterize the creative process supported by each stage and the transaction, or 'trade-off', that accompanies each change. In the light of this analysis, we discuss interactive multisensory simulation of physical objects, including gestural real-time interaction, as we have been developing it for years in our laboratory. An important aim of this article is to dispel a harmful confusion that arises from too systematically classifying certain situations as instrumental on the pretext that they appeal to gesture and real-time sound production or processing. Through the concepts of 'supra-instrumental gesture and interaction', we present several situations that, in non-real time (but also possibly in real time), and even in the total absence of actual gesture (but also achievable with actual gestures), are more gestural and more instrumental than certain gesticulations effected with sophisticated input devices and real-time digital sound processing.
This chapter presents recent work concerning physically modelled virtual musical instruments and force feedback. Firstly, we discuss fundamental differences in the gesture-sound relationship between acoustic instruments and digital musical instruments, the former being linked by dynamic physical coupling, the latter by transmission and processing of information and control signals. We then present an approach that allows experiencing physical coupling with virtual instruments, using the CORDIS-ANIMA physical modelling formalism, synchronous computation and force-feedback devices. To this end, we introduce a framework for the creation and manipulation of multisensory virtual instruments, called the MSCI platform. In particular, we elaborate on the cohabitation, within a single physical model, of sections simulated at different rates. Finally, we discuss the relevance of creating virtual musical instruments in this manner, and we consider their use in live performance.
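The mass-interaction principle behind formalisms such as CORDIS-ANIMA can be sketched as follows: point masses (material modules) are updated with an explicit second-order finite-difference scheme, and visco-elastic links (interaction modules) compute the forces between them at every sample. The sketch below is a minimal illustration of this scheme, not the chapter's actual model; all parameter values and the function name `simulate` are illustrative assumptions.

```python
def simulate(n_steps=2000, dt=1.0 / 44100.0):
    """Minimal mass-interaction sketch: one point mass connected to a
    fixed point by a visco-elastic link, stepped at audio rate.
    Parameter values are illustrative, not taken from the chapter."""
    m = 0.001          # mass (kg)
    k = 400.0          # link stiffness (N/m)
    z = 0.0005         # link damping (N.s/m)
    # positions of the mass at steps n and n-1 (displacement from rest)
    x_prev, x_curr = 0.01, 0.01    # initial offset, zero initial velocity
    out = []
    for _ in range(n_steps):
        # interaction module: visco-elastic force toward the fixed point,
        # with velocity approximated by a backward difference
        f = -k * x_curr - z * (x_curr - x_prev) / dt
        # material module: explicit central-difference position update
        x_next = 2.0 * x_curr - x_prev + (f / m) * dt * dt
        x_prev, x_curr = x_curr, x_next
        out.append(x_curr)
    return out
```

In this synchronous, sample-by-sample form, the position signal itself can serve as audio output, and a force-feedback device can be inserted as just another module that injects measured positions and reads back computed forces, which is what makes genuine physical coupling with the performer possible.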
A study of force-feedback interaction with a model of a neural oscillator provides insight into enhanced human-robot interactions for controlling musical sound. We provide differential equations and discrete-time computable equations for the core oscillator model developed by Edward Large for simulating rhythm perception. Using a mechanical-analog parameterization, we derive a force-feedback model structure that enables a human to share control of a virtual percussion instrument with a "robotic" neural oscillator. A formal human-subject test indicated that strong coupling (STRNG) between the force-feedback device and the neural oscillator gave subjects the best control. Overall, the subjects predominantly found the interaction to be "enjoyable" and "fun" or "entertaining." However, there were indications that some subjects preferred a medium-strength coupling (MED), presumably because they were unaccustomed to such strong force-feedback interaction with an external agent. With related models, test subjects performed better when they could synchronize their input in phase with a dominant sensory-feedback modality. In contrast, subjects tended to perform worse when the optimal strategy was to move the force-feedback device with a 90° phase lag. Our results suggest an extension of dynamic pattern theory to force-feedback tasks. In closing, we provide an overview of how a similar force-feedback scenario could be used in a more complex musical-robotics setting.
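The discrete-time computable equations mentioned above can be illustrated with a forward-Euler step of a Hopf-style limit-cycle oscillator, which is the kind of nonlinear oscillator on which Large's canonical model is built. This is a generic sketch under that assumption: the function name `step_oscillator`, the parameter values, and the simple Euler discretization are all illustrative and are not the study's exact formulation, which couples the oscillator to a mechanical-analog force-feedback model.

```python
def step_oscillator(z, x=0.0, dt=0.001,
                    alpha=1.0, beta=-1.0, omega=2.0 * 3.141592653589793):
    """One forward-Euler step of a Hopf-style limit-cycle oscillator.

    z     : complex oscillator state (amplitude = abs(z), phase = arg(z))
    x     : real external input, e.g. a force signal from the haptic device
    alpha : bifurcation parameter (> 0 gives a stable limit cycle)
    beta  : nonlinear saturation (< 0 bounds the amplitude)
    omega : natural angular frequency (rad/s)
    """
    dz = z * (alpha + 1j * omega + beta * abs(z) ** 2) + x
    return z + dt * dz
```

With alpha = 1 and beta = -1 the state spirals onto a limit cycle of radius 1 regardless of its (nonzero) starting amplitude; in a shared-control setting, the human's input `x` entrains the oscillator's phase while the limit cycle keeps the "robotic" rhythm going on its own.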