2023
DOI: 10.1609/aaai.v37i13.26907
MoMusic: A Motion-Driven Human-AI Collaborative Music Composition and Performing System

Abstract: The significant development of artificial neural network architectures has facilitated the increasing adoption of automated music composition models over the past few years. However, most existing systems feature algorithmic generative structures based on hard-coded, predefined rules, generally excluding interactive or improvised behaviors. We propose a motion-based music system, MoMusic, as a real-time AI music generation system. MoMusic features a partially randomized harmonic sequencing model based on a p…
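The abstract breaks off before the harmonic sequencing model is fully described, so no published details are reproduced here. The sketch below is only a minimal, hypothetical illustration of what a motion-driven, partially randomized harmony mapping can look like in general: a normalized 2-D body-landmark coordinate (as a camera-based pose tracker might report) selects a tone within the current chord, while the chord itself advances stochastically. The chord pool, probabilities, and function names are illustrative assumptions, not MoMusic's actual design.

```python
import random
from typing import List, Optional, Tuple

# Hypothetical illustration only -- not the published MoMusic model.
# A small pool of chords in C major, written as lists of MIDI pitches.
CHORD_POOL: List[List[int]] = [
    [60, 64, 67],      # C major
    [57, 60, 64],      # A minor
    [65, 69, 72],      # F major
    [55, 59, 62, 65],  # G dominant seventh
]

def next_chord(prev_idx: Optional[int]) -> int:
    """Partially randomized sequencing: usually step to a neighbouring
    chord in the pool, occasionally jump to a random one."""
    if prev_idx is None or random.random() < 0.2:
        return random.randrange(len(CHORD_POOL))
    return (prev_idx + random.choice([-1, 0, 1])) % len(CHORD_POOL)

def motion_to_note(x: float, y: float, chord_idx: int) -> Tuple[int, int]:
    """Map a normalized landmark position (x, y in [0, 1]) to a
    (MIDI pitch, velocity) pair: x selects a chord tone, y sets loudness."""
    chord = CHORD_POOL[chord_idx]
    pitch = chord[min(int(x * len(chord)), len(chord) - 1)]
    velocity = 40 + int(y * 80)  # quiet near the bottom, loud near the top
    return pitch, velocity

if __name__ == "__main__":
    idx: Optional[int] = None
    # Simulated stream of tracked hand positions (normalized coordinates).
    for x, y in [(0.10, 0.30), (0.45, 0.80), (0.90, 0.55), (0.60, 0.20)]:
        idx = next_chord(idx)
        print(motion_to_note(x, y, idx))
```

In a real interactive setting, the coordinates would arrive from a pose estimator and the resulting (pitch, velocity) events would be streamed to a synthesizer in real time.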

Cited by 5 publications (2 citation statements) · References 13 publications
“…Computer vision-based motion capture is currently less accurate, but has low environmental requirements, a large movable range, imaginative use of scenarios, and consumer-level applications. Bian et al. introduce a motion-driven human-AI collaborative music composition and performance system, with a focus on improving the naturalness and creativity of music composition and performance [58]. Wang et al. propose a method for motion-driven tracking via end-to-end coarse-to-fine verification, with a focus on improving the accuracy and stability of tracking [59].…”
Section: Motion-driven (mentioning)
Confidence: 99%
“…Both Louie et al. (2020) and Arriagada (2020) describe AI as a co-creation tool that helps artists and enhances human creativity, rather than a threat that substitutes for them. For instance, two recently presented collaborative AI-human tools for music composition are MoMusic (Bian et al., 2023) and Humming2Music (Qiu et al., 2023). Furthermore, Hitsuwari et al. (2023) present a case study on haiku poetry where AI-human collaboration enhances beauty ratings compared with human-only or AI-only works.…”
Section: Creativity (mentioning)
Confidence: 99%