This paper presents initial steps towards the design of an embedded system for body-centric sonic performance. The proposed prototyping system allows performers to manipulate sounds through gestural interactions captured by wearable textile sensors. Data from the e-textile sensors control, in real time, audio synthesis algorithms working with content from Audio Commons, a novel web-based ecosystem for repurposing crowd-sourced audio. The system enables creative embodied music interactions by combining seamless physical e-textiles with web-based digital audio technologies.
There has been little research on how interactions with tabletop and Tangible User Interfaces (TUIs) by groups of users change over time. In this article, we investigate the challenges and opportunities of a tabletop tangible interface based on constructive building blocks. We describe a long-term lab study of groups of expert musicians improvising with the Reactable, a commercial tabletop TUI for music performance. We examine interaction, focusing on interface, tangible, musical, and social phenomena. Our findings reveal practice-based learning between peers in situated contexts, and new forms of participation, all of which are facilitated by the Reactable's tangible interface compared to traditional musical ensembles. We summarise our findings as a set of design considerations and conclude that construction processes on interactive tabletops support learning by doing and peer learning, which can inform constructivist approaches to learning with technology.
Music information retrieval (MIR) has great potential in musical live coding because it can help the musician–programmer to make musical decisions based on audio content analysis and to explore new sonorities by means of MIR techniques. The use of real-time MIR techniques can be computationally demanding, and thus they have rarely been used in live coding; when they have been used, it has been with a focus on low-level feature extraction. This article surveys and discusses the potential of MIR applied to live coding at a higher musical level. We propose a conceptual framework of three categories: (1) audio repurposing, (2) audio rewiring, and (3) audio remixing. We explored the three categories in live performance through MIRLC, an application programming interface library written in SuperCollider. We found that it is still a technical challenge to use high-level features in real time, yet using rhythmic and tonal properties (mid-level features) in combination with text-based information (e.g., tags) helps to achieve a closer perceptual level centered on pitch and rhythm when using MIR in live coding. We discuss challenges and future directions of utilizing MIR approaches in the computer music field.