This paper reflects on the dynamics and practices of building a maker community around a new hardware platform. We examine the factors promoting the successful uptake of a maker platform from two perspectives: first, we investigate the technical and user experience considerations that users identify as the most important. Second, we explore the specific activities that help attract a community and encourage sustained participation. We present an inductive approach based on the case study of Bela, an embedded platform for creating interactive audio systems. The technical design and community building processes are detailed, culminating in a successful crowdfunding campaign. To further understand the community dynamics, the paper also presents an intensive three-day workshop with eight digital musical instrument designers. From observations and interviews, we reflect on the relationship between the platform and the community and offer suggestions for HCI researchers and practitioners interested in establishing their own maker communities.
The Hammond organ is one of the earliest electronic instruments and is still widely used in contemporary popular music. One of its main sonic features is the "key-click," a transient that occurs at note onset, caused by the mechanical bouncing of the nine electric contacts actuated during each key press. A study of the dynamic mechanical behaviour of the contact bounces is presented, showing that the velocity, the type of touch and, more generally, the temporal evolution of the key position all affect different characteristics of the contact bounces. A second study focuses on the listener's perception of the generated sound and finds that listeners can classify sounds produced on the Hammond organ according to the type of touch and velocity used. It is concluded that the Hammond organ is a touch-responsive instrument and that the gesture used to produce a note affects the generated sound across multiple dimensions. The control available at the fingertips of the musician is therefore such that it cannot easily be reduced to a single scalar velocity parameter, as is common practice in modern digital emulations of the instrument.
On several keyboard instruments the produced sound does not always depend exclusively on a discrete key-velocity parameter, and minute gestural details can affect the final sonic result. By contrast, when a keyboard controller uses the MIDI standard, as the vast majority of digital keyboards do, variations in articulation beyond velocity normally have no effect on the produced sound. In this article, we introduce a novel keyboard-based digital musical instrument that uses continuous readings of key position to control a nonlinear waveguide flute synthesizer with a richer set of interaction gestures than would be possible with a velocity-based keyboard. We then report on the experience of six players interacting with our instrument and reflect on their experience, highlighting the opportunities and challenges that come with continuous key sensing.
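The contrast drawn above can be illustrated with a minimal sketch. This is not code from the paper: the function names, sample rate, switch thresholds, and velocity scaling are all hypothetical, chosen only to show how a MIDI-style keyboard collapses a key press into one velocity byte while continuous sensing retains the whole gesture.

```python
def midi_velocity(times, positions, top=0.1, bottom=0.9):
    """Reduce a key press to a single MIDI-style velocity value (1-127),
    as a conventional keyboard does: time the key's travel between two
    fixed switch points and discard everything else about the gesture.
    Thresholds and scaling here are illustrative, not from any standard.
    """
    t_top = next(t for t, p in zip(times, positions) if p >= top)
    t_bottom = next(t for t, p in zip(times, positions) if p >= bottom)
    travel_time = t_bottom - t_top
    # Faster travel -> higher velocity; 5 ms of travel maps to full scale.
    return max(1, min(127, int(0.005 / travel_time * 127)))

def continuous_stream(times, positions):
    """Continuous key sensing keeps every (time, position) sample, so a
    synthesizer can respond to the full temporal evolution of the press."""
    return list(zip(times, positions))

# A hypothetical key press sampled at 1 kHz (position 0.0 = rest, 1.0 = keybed):
times = [i / 1000 for i in range(11)]
positions = [0.0, 0.05, 0.1, 0.3, 0.5, 0.6, 0.7, 0.8, 0.9, 0.95, 1.0]

v = midi_velocity(times, positions)           # one number survives the press
stream = continuous_stream(times, positions)  # the whole gesture survives
```

The point of the sketch is that `midi_velocity` is lossy by construction: two presses with very different position trajectories but the same travel time between the switch points produce identical MIDI output, whereas `continuous_stream` preserves exactly the articulation details the instrument described above exploits.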