2011
DOI: 10.1145/2037676.2037683

Exploiting online music tags for music emotion classification

Abstract: The online repository of music tags provides a rich source of semantic descriptions useful for training emotion-based music classifiers. However, the imbalance of the online tags affects the performance of emotion classification. In this paper, we present a novel data-sampling method that eliminates the imbalance but still takes the prior probability of each emotion class into account. In addition, a two-layer emotion classification structure is proposed to harness the genre information available in the online …
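
The data-sampling idea described in the abstract can be pictured with a minimal sketch, assuming a simple per-class resampling scheme whose target size interpolates between the observed class count and a uniform count, so the class priors are not discarded. The function name prior_aware_resample and the smoothing parameter are illustrative, not taken from the paper.

```python
import random
from collections import defaultdict

def prior_aware_resample(examples, labels, smoothing=0.5, seed=0):
    """Resample an imbalanced emotion-tag dataset (illustrative sketch).

    Instead of forcing every emotion class to the same size (which would
    discard the class priors), each class is resampled to a target that
    interpolates between its observed count and a uniform count.
    smoothing = 0 keeps the original distribution; 1 yields uniform classes.
    """
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for x, y in zip(examples, labels):
        by_class[y].append(x)

    n_total = len(examples)
    uniform = n_total / len(by_class)

    resampled = []
    for y, xs in by_class.items():
        target = round((1 - smoothing) * len(xs) + smoothing * uniform)
        if target <= len(xs):
            # Undersample without replacement down to the target size.
            picked = rng.sample(xs, target)
        else:
            # Oversample with replacement up to the target size.
            picked = xs + [rng.choice(xs) for _ in range(target - len(xs))]
        resampled.extend((x, y) for x in picked)

    rng.shuffle(resampled)
    return resampled
```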

Cited by 32 publications (17 citation statements: 2 supporting, 15 mentioning, 0 contrasting), published between 2013 and 2021.
References 39 publications (41 reference statements).
“…There is an average of 9.8 tags for each song. Such an uneven distribution of tags across tracks has previously been reported in [31] and [15]. The 30 most frequent tags are presented in Table 2.…”
Section: Data Processing and Statistics (supporting, confidence: 62%)
“…There was an average of 9.8 tags for each song. Such an uneven distribution of tags across tracks has previously been reported in [55] and [19]. The top 30 tags are shown in Table 2 together with their appearance frequencies.…”
Section: Data Processing and Statistics (supporting, confidence: 61%)
“…An experimentally defined linear combination of the results then outperformed classifiers using individual feature domains. In a more recent study, Lin et al. [41] demonstrated that genre-based grouping complements the use of tags in a two-stage multi-label emotion classification system, reporting an improvement of 55% when genre information is used. Finally, Schuller et al. [70] combined audio features with metadata and Web-mined lyrics.…”
Section: Multi-modal Approaches and Fusion Policies (mentioning, confidence: 99%)
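
The genre-then-emotion idea referenced in the statement above can be sketched as follows. This is a hypothetical illustration, not the cited authors' implementation: the class name TwoStageEmotionClassifier, the use of logistic regression, and the constant fall-back for emotion labels that do not vary within a genre are all assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

class TwoStageEmotionClassifier:
    """Sketch of a two-stage, genre-then-emotion pipeline.

    Stage 1 predicts a track's genre; stage 2 applies a bank of per-genre
    binary emotion classifiers (one per emotion label).
    """

    def fit(self, X, genres, emotion_matrix):
        X = np.asarray(X)
        genres = np.asarray(genres)
        Y = np.asarray(emotion_matrix)  # binary matrix: tracks x emotion labels
        self.n_labels = Y.shape[1]

        # Stage 1: single-label genre classifier.
        self.genre_clf = LogisticRegression(max_iter=1000).fit(X, genres)

        # Stage 2: one binary classifier per (genre, emotion label) pair.
        self.label_clfs = {}
        for g in np.unique(genres):
            idx = genres == g
            for j in range(self.n_labels):
                col = Y[idx, j]
                if col.min() == col.max():
                    # Label never varies within this genre: store the constant.
                    self.label_clfs[(g, j)] = int(col[0])
                else:
                    clf = LogisticRegression(max_iter=1000)
                    self.label_clfs[(g, j)] = clf.fit(X[idx], col)
        return self

    def predict(self, X):
        X = np.asarray(X)
        pred_genres = self.genre_clf.predict(X)
        out = np.zeros((len(X), self.n_labels), dtype=int)
        for i, (x, g) in enumerate(zip(X, pred_genres)):
            for j in range(self.n_labels):
                clf = self.label_clfs[(g, j)]
                out[i, j] = clf if isinstance(clf, int) else clf.predict(x.reshape(1, -1))[0]
        return out
```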