CHI '13 Extended Abstracts on Human Factors in Computing Systems 2013
DOI: 10.1145/2468356.2479596
The throat III

Abstract: Practitioner-led artistic research, combined with interactive technologies, opens up new and unexplored design spaces. Here we focus on the creation of a tool for opera-singers to dynamically disform, change and accompany their voices. In an opera composed by one of the authors, the title-role singer needed to be able to alter his voice to express hawking, coughing, snuffling and other disturbing vocal qualities associated with the lead role -Joseph Merrick, aka "The Elephant Man". In our designerly exploratio…

Cited by 6 publications (3 citation statements, published in 2014 and 2019). References 2 publications.

“…In performing arts, interactive technologies and robots have already entered the stage alongside human performers or audience members (e.g. [4,20,26,38,40,51,57,58]). Drones are no exception, co-performing with dancers as swarms [31], partners [25], or pixel-formations in large-scale performances [19].…”
Section: HCI, the Arts and Movement-based Design (mentioning)
confidence: 99%
“…However, there appears to be a gap in the design of interfaces for controlling and processing vocal sound. Although there is notable ongoing work and research in voice and the application of biofeedback and biosensing technologies, a significant majority is geared towards discovering how the expressivity of a singer's gestures (typically their hands) can be harnessed to process their sound according to sonification mappings based on spatial orientation and acceleration of limbs [17] [18], or the recognition of a gestural vocabulary [19] [14] [8]. Recent research and applications of respiration as a control parameter within NIMEs have seen the development of novel interfaces and instruments, and investigated couplings between music control, biofeedback and breathing [20], or a dancer's respiratory influence upon music [21].…”
Section: Introduction (mentioning)
confidence: 99%
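
To make the kind of gesture-to-sound "sonification mapping" mentioned in the statement above more concrete, here is a minimal, hypothetical sketch (not taken from the cited systems): it maps a hand's orientation and acceleration readings to two common vocal-effect parameters, a filter cutoff and a wet/dry mix. All names, ranges, and scalings are assumptions for illustration only.

```python
# Hypothetical illustration of a sonification mapping from limb
# orientation and acceleration to vocal-effect parameters.
# Names, ranges, and scaling are assumptions, not from the cited work.

def clamp(value, lo, hi):
    """Keep a sensor-derived value inside a usable range."""
    return max(lo, min(hi, value))

def map_gesture_to_effect(pitch_deg, accel_ms2):
    """Map hand tilt and acceleration to (cutoff_hz, wet_mix).

    pitch_deg: hand tilt from horizontal, roughly -90..+90 degrees.
    accel_ms2: magnitude of hand acceleration in m/s^2.
    """
    # Tilt controls filter cutoff: flat hand ~ 200 Hz, fully raised ~ 4000 Hz.
    tilt = clamp((pitch_deg + 90.0) / 180.0, 0.0, 1.0)
    cutoff_hz = 200.0 + tilt * (4000.0 - 200.0)

    # Faster movement pushes more of the voice through the effect.
    wet_mix = clamp(accel_ms2 / 20.0, 0.0, 1.0)
    return cutoff_hz, wet_mix

if __name__ == "__main__":
    # A gentle tilt combined with a quick flick of the hand.
    print(map_gesture_to_effect(pitch_deg=30.0, accel_ms2=8.0))
```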
“…With regards to singing and vocal production within the context of NIME and HCI research, there has been notable work [17] [37] [38] [39] [18]. Two who have been quite influential to our research, are Pamela Z and Imogen Heap.…”
Section: Introduction (mentioning)
confidence: 99%