Figure 1: Interaction opportunities of our design space: (A) Reflections allow users to interact with artifacts inside a museum cabinet. (B) Reflections can reveal internal details of real objects. (C) A Digital Musical Instrument augmented with projection mapping and a volumetric display. (D) This augmentation is visible from any point of view and from both sides of the mirror.
As Virtual Reality headsets become accessible, more and more artistic applications are developed, including immersive musical instruments. 3D interaction techniques designed in the 3D User Interfaces research community, such as navigation, selection and manipulation techniques, open numerous opportunities for musical control. For example, navigation techniques such as teleportation, free walking/flying and path-planning enable different ways of accessing musical scores, scenes of spatialized sound sources or even parameter spaces. Manipulation techniques provide novel gestures and metaphors, e.g. for drawing or sculpting sound entities. Finally, 3D selection techniques facilitate interaction with complex visual structures, which can represent hierarchical temporal structures, audio graphs, scores or parameter spaces. However, existing devices and techniques were developed mainly with a focus on efficiency, i.e. minimising error rates and task completion times. They were therefore not designed with the specifics of musical interaction in mind. In this paper, we review existing 3D interaction techniques and examine how they can be used for musical control, including the possibilities they open for instrument designers. We then propose a number of research directions to adapt and extend 3DUIs for musical expression.
Keywords: 3D user interfaces, immersive virtual musical instruments, Virtual Reality, new interfaces for musical expression
Immersive Virtual Musical Instruments (IVMIs) can be considered as the meeting point between Music Technology and Virtual Reality. Being both musical instruments and elements of Virtual Environments, IVMIs require a transversal approach from their designers, in particular when the final aim is to play them in front of an audience, as part of a scenography. In this paper, we combine the main constraints of musical performances and Virtual Reality applications into a set of dimensions meant to extensively describe IVMI stage setups. A number of existing stage setups are then classified using these dimensions, explaining how they were used to showcase live virtual performances and discussing their scenographic level.
We define a generic model for finite audio or symbolic musical patterns that structurally encodes a rich and abstract synchronization mechanism. This is achieved by distinguishing, for each pattern, a realization window, describing what the pattern is, from a synchronization window, describing how the pattern can be used. The sequential composition of patterns is defined and studied, and an algebra of musical patterns is introduced in a mathematically well-founded approach. We propose several high-level operators that can be used either in audio processing or in musical analysis and composition. Practical uses and experiments conducted in both fields are described.
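Since the abstract only names the two windows and sequential composition, the following Python sketch shows one plausible reading of the model; the class names, the (start, duration) encoding and the composition rule (the second pattern is shifted so that its synchronization window starts where the first one ends) are assumptions for illustration, not the paper's actual definitions.

```python
# Minimal sketch (hypothetical names): a pattern carries two time windows.
# The realization window says what the pattern is (the span of its events);
# the synchronization window says which span is used when composing patterns.
from dataclasses import dataclass

@dataclass(frozen=True)
class Window:
    start: float      # seconds or beats
    duration: float

    @property
    def end(self) -> float:
        return self.start + self.duration

@dataclass(frozen=True)
class Pattern:
    events: tuple             # opaque payload: audio frames or symbolic notes
    realization: Window       # "what the pattern is"
    synchronization: Window   # "how the pattern can be used"

    def shift(self, dt: float) -> "Pattern":
        return Pattern(
            self.events,
            Window(self.realization.start + dt, self.realization.duration),
            Window(self.synchronization.start + dt, self.synchronization.duration),
        )

def seq(p: Pattern, q: Pattern) -> Pattern:
    """Sequential composition: align q so its synchronization window starts
    where p's ends; the realization windows are free to overlap."""
    q2 = q.shift(p.synchronization.end - q.synchronization.start)
    real_start = min(p.realization.start, q2.realization.start)
    real_end = max(p.realization.end, q2.realization.end)
    return Pattern(
        p.events + q2.events,
        Window(real_start, real_end - real_start),
        Window(p.synchronization.start,
               p.synchronization.duration + q.synchronization.duration),
    )
```

Under this reading, the synchronization windows chain end to end while the realizations may overlap, which is one way to obtain the abstract synchronization behaviour the abstract describes.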
This paper investigates the semantic perceptual space of synthetic tactile textures rendered on an ultrasonic-based haptic tablet, and the parameters influencing this space. In a closed card sorting task, 30 participants explored 32 tactile-only textures and described each texture using adjectives. A factorial analysis of mixed data was conducted. The results suggest a two-dimensional space in which tactile textures lie along a continuum from rough to smooth adjectives. Waveform and amplitude are shown to play an important role in whether a texture is perceived as smooth or rough, and spatial period is a possible modulator of different degrees of roughness or smoothness. Finally, we discuss how these findings can be used by designers of tactile feedback devices.
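To give an idea of how such an analysis could be run on card-sorting data, here is a minimal Python sketch assuming pandas and the open-source prince package for factor analysis of mixed data; the column names, texture parameters and adjective labels below are hypothetical placeholders, not the study's actual data or pipeline.

```python
# Hypothetical sketch: factor analysis of mixed data over card-sorting results.
# Each row is one texture; columns mix quantitative rendering parameters and
# the qualitative adjective group participants assigned to it.
import pandas as pd
import prince

textures = pd.DataFrame({
    "waveform":       ["square", "sine", "square", "sine"],    # qualitative
    "amplitude":      [0.2, 0.8, 0.8, 0.2],                    # quantitative
    "spatial_period": [1.0, 2.0, 4.0, 8.0],                    # quantitative (mm)
    "adjective":      ["rough", "smooth", "rough", "smooth"],  # card-sort label
})

famd = prince.FAMD(n_components=2, random_state=42)
famd = famd.fit(textures)

# 2D coordinates of each texture in the resulting perceptual space.
print(famd.row_coordinates(textures))
```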
3D graphical interaction offers a wide range of possibilities for musical applications. However, it also carries several limitations that prevent it from being used as an efficient musical instrument. For example, input devices for 3D interaction or new gaming devices are usually based on 3 or 6 degrees-of-freedom tracking combined with push-buttons or joysticks. While buttons and joysticks do not provide good resolution for musical gestures, graphical interaction using tracking may bring enough expressivity but suffers from accuracy and haptic feedback problems. Moreover, interaction based solely on tracking limits the possibilities brought by graphical interfaces in terms of musical gestures. We propose a new approach that separates the input modalities according to traditional musical gestures. This makes it possible to combine the possibilities of graphical interaction for selection and modulation gestures with the accuracy and expressivity of musical interaction for excitation gestures. We implement this approach with a new input device, Piivert, which combines 6DOF tracking and pressure detection. We describe the associated interaction techniques and show how this new device can be valuable for immersive musical applications.
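The split between gesture types can be pictured as a simple event router: pressure input drives excitation, while 6DOF tracking drives selection and modulation. The Python sketch below is only an illustration of that idea; the class, method names, threshold and synth/scene interfaces are hypothetical and do not describe the actual Piivert implementation.

```python
# Illustrative sketch of the gesture-splitting idea: pressure sensors handle
# excitation (fast, precise attacks), 6DOF tracking handles selection and
# modulation of graphical sound objects. All names here are placeholders.
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple[float, float, float]
    orientation: tuple[float, float, float, float]  # quaternion

class GestureRouter:
    def __init__(self, synth, scene, hit_threshold: float = 0.6):
        self.synth = synth              # hypothetical sound engine interface
        self.scene = scene              # hypothetical 3D scene of sound objects
        self.hit_threshold = hit_threshold
        self.selected = None

    def on_pressure(self, finger: int, value: float) -> None:
        """Excitation gestures: trigger the currently selected sound object."""
        if self.selected is not None and value >= self.hit_threshold:
            self.synth.excite(self.selected, velocity=value)

    def on_pose(self, pose: Pose) -> None:
        """Selection and modulation gestures from 6DOF tracking."""
        target = self.scene.pick(pose.position, pose.orientation)
        if target is not None:
            self.selected = target
        if self.selected is not None:
            # Map continuous motion to modulation parameters of the selection.
            self.synth.modulate(self.selected, position=pose.position)
```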