Musical Instruments in the 21st Century (2016)
DOI: 10.1007/978-981-10-2951-6_10
Machine Learning as Meta-Instrument: Human-Machine Partnerships Shaping Expressive Instrumental Creation

Cited by 12 publications (8 citation statements)
References 10 publications
“…Such efforts towards "reframing machine learning workflows based on situated human working practices, and exploring the co-adaptation of humans and intelligent systems" [19, p. 7.1] show that systems can operate in response to limited training samples presented in real-world contexts. Fiebrink, for example, has shown considerable success in building digital musical instruments that can be trained by, and respond to, real-time embodied interaction from users [18]. Indeed, we find possible starting points in object recognizers that allow vision-impaired users to build their own training sets [41], [49].…”
Section: Salience in the Moment (mentioning)
confidence: 96%
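To make the idea of instruments trained from limited samples concrete, here is a minimal Python sketch of a Wekinator-style few-shot mapping: the performer records a handful of gesture-to-parameter demonstrations, and a nearest-neighbour lookup generalises from them at play time. The class name, feature layout, and parameter values are illustrative assumptions, not the design of any cited system.

```python
import numpy as np

class FewShotMapper:
    """Map sensor feature vectors to synthesis parameters using
    a handful of performer-supplied demonstrations (illustrative)."""

    def __init__(self):
        self.examples = []  # list of (features, params) pairs

    def add_example(self, features, params):
        """Record one demonstration: a gesture and its desired sound."""
        self.examples.append((np.asarray(features, dtype=float),
                              np.asarray(params, dtype=float)))

    def map(self, features):
        """Return the parameters of the nearest stored demonstration."""
        features = np.asarray(features, dtype=float)
        dists = [np.linalg.norm(features - f) for f, _ in self.examples]
        return self.examples[int(np.argmin(dists))][1]

# Three demonstrations, then the mapper responds to an unseen gesture.
mapper = FewShotMapper()
mapper.add_example([0.1, 0.2], [440.0, 0.3])  # gesture -> (pitch Hz, amplitude)
mapper.add_example([0.8, 0.1], [660.0, 0.8])
mapper.add_example([0.4, 0.9], [220.0, 0.5])
print(mapper.map([0.75, 0.15]))  # closest to the second demo -> [660.  0.8]
```

Real systems typically replace the nearest-neighbour lookup with a regression model so that outputs interpolate smoothly between demonstrations, but the interaction loop (demonstrate, train, play) is the same.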
“…Interactive music systems are often divided into three stages: sensing, processing, and response, as shown in Figure 3 [67]. While this framework is simple, it provides a helpful division of concerns and has previously been used to frame DMI designs [22], including those using ML [27]. This framework highlights that electronic music systems, unlike most acoustic instruments, are modular.…”
Section: Prediction in Musical Interaction (mentioning)
confidence: 99%
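A minimal sketch of that sensing/processing/response division, with each stage stubbed out; the function names and signatures are illustrative assumptions, not taken from the cited framework. What it demonstrates is the modularity noted above: the processing stage can be swapped for something else, such as a trained ML mapping, without touching sensing or response.

```python
def sensing() -> list[float]:
    """Sensing: acquire raw control data (stubbed as a fixed gesture)."""
    return [0.42, 0.87]  # e.g. normalised accelerometer x/y

def processing(raw: list[float]) -> list[float]:
    """Processing: map sensed data to musical parameters (linear scaling here)."""
    return [200.0 + 600.0 * raw[0], raw[1]]  # (frequency in Hz, amplitude)

def response(params: list[float]) -> None:
    """Response: produce output (stubbed as a print instead of audio)."""
    print(f"play {params[0]:.1f} Hz at amplitude {params[1]:.2f}")

# The stages are independent modules: replacing `processing` with a trained
# ML model would leave `sensing` and `response` untouched.
response(processing(sensing()))
```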
“…The advent of electronic musical instruments, including powerful computers, has allowed experiments with instruments that are able to make intelligent use of the musical context in which they are used. This has been discussed since at least the early 1990s (Pressing, 1990), but has been extended in recent years by the development and popularity of accessible machine learning frameworks for understanding physical gestures in performance (Fiebrink, 2017). Artificial intelligence techniques can imbue a musical interface with a kind of self-awareness (Lewis et al., 2016; Nymoen et al., 2016), allowing it to act predictively rather than in reaction to a performer.…”
Section: Introduction (mentioning)
confidence: 99%
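As a rough illustration of predictive rather than reactive behaviour, the sketch below extrapolates a performer's next gesture value from recent history. Linear extrapolation stands in here purely for concreteness; the cited papers discuss learned predictive models, not this method.

```python
from collections import deque

class GesturePredictor:
    """Anticipate the next sensor value from recent history (illustrative)."""

    def __init__(self, history_len: int = 2):
        self.history = deque(maxlen=history_len)

    def observe(self, value: float) -> None:
        """Feed in the latest sensed value."""
        self.history.append(value)

    def predict_next(self) -> float:
        """Extrapolate one step ahead from the last two observations."""
        if len(self.history) < 2:
            return self.history[-1] if self.history else 0.0
        prev, last = self.history[-2], self.history[-1]
        return last + (last - prev)

predictor = GesturePredictor()
for value in [0.10, 0.15, 0.20]:  # a steadily rising gesture
    predictor.observe(value)
print(predictor.predict_next())  # ≈ 0.25, one step ahead of the performer
```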