2019
DOI: 10.1007/s10044-019-00807-1

Hybrid hidden Markov models and artificial neural networks for handwritten music recognition in mensural notation

Abstract: In this paper we present a hybrid approach using Hidden Markov Models (HMM) and Artificial Neural Networks to deal with the task of Handwritten Music Recognition in Mensural notation. Previous works have shown that the task can be addressed with Gaussian density HMMs that can be trained and used in an end-to-end manner; that is, without prior segmentation of the symbols. However, the results achieved using that approach are not sufficiently accurate to be useful in practice. In this work we hybridize HMMs with…
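As a rough sketch of the hybrid idea summarized in the abstract (our own illustration, not the authors' code; the array shapes, the flat transition model, and the random posteriors are stand-in assumptions), the snippet below replaces an HMM's Gaussian emission densities with scaled likelihoods derived from a network's per-frame posteriors and runs standard Viterbi decoding over them:

# Minimal sketch of hybrid HMM/ANN decoding (hypothetical shapes and names).
# An ANN yields per-frame posteriors P(state | frame); dividing by the state
# priors gives scaled likelihoods, which play the role of the HMM's Gaussian
# emission densities inside standard Viterbi decoding.
import numpy as np

def viterbi(log_trans, log_emit, log_init):
    """log_trans: (S, S), log_emit: (T, S), log_init: (S,) -> best state path."""
    T, S = log_emit.shape
    delta = log_init + log_emit[0]          # best log-score ending in each state
    back = np.zeros((T, S), dtype=int)      # backpointers
    for t in range(1, T):
        scores = delta[:, None] + log_trans  # score of moving (from, to)
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + log_emit[t]
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# ANN posteriors for T frames over S symbol states (random stand-ins here).
rng = np.random.default_rng(0)
T, S = 50, 6
posteriors = rng.dirichlet(np.ones(S), size=T)   # P(state | frame)
priors = np.full(S, 1.0 / S)                     # relative state frequencies
log_emit = np.log(posteriors) - np.log(priors)   # scaled likelihoods

log_trans = np.log(np.full((S, S), 1.0 / S))     # flat transitions for the sketch
log_init = np.log(priors)
print(viterbi(log_trans, log_emit, log_init))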

Cited by 12 publications (8 citation statements)
References 24 publications

Citation statements (ordered by relevance):
“…In addition, the accuracy of the considered approach is compared with previous work over the same corpus. Specifically: HMMs both trained with the classic Maximum Likelihood (ML) estimation (Calvo-Zaragoza et al, 2016) and with Discriminative Training (DT) (Calvo-Zaragoza et al, 2017), as well as HMMs hybridized with Multi-Layer Perceptron (MLP) models (Calvo-Zaragoza et al, 2019). In these cases, we directly consider the results with the best N-gram estimation found in the aforementioned references.…”
Section: Recognition Results (mentioning, confidence: 99%)
“…Concerning this formulation, Pugin (2006) already proposed a holistic approach for printed Mensural notation using Hidden Markov Models (HMM). This approach was recently extended to handwritten sources by using a more appropriate set of features (Calvo-Zaragoza et al, 2016), and further improved by considering discriminative training techniques (Calvo-Zaragoza et al, 2017), as well as hybridization with neural networks (Calvo-Zaragoza et al, 2019).…”
Section: Related Work (mentioning, confidence: 99%)
“…fixed; Ulrych et al 2001); Pr(S) is straightforwardly estimated from the relative frequencies of lithofacies in the training data set; and Pr(S|Y) is the output of ANN, as calculated by the softmax equation. Note that any explicitly or implicitly computed Pr(S|Y) from statistically interpretable classifier can be easily adapted to model Pr(Y|S) (Calvo-Zaragoza et al 2019).…”
Section: ANN-HMM (mentioning, confidence: 99%)
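The adaptation mentioned in this citation follows from Bayes' rule. As a hedged restatement in our own notation (Y an observation/frame, S a state or class label; not an equation copied from either paper):

\[
P(Y \mid S) \;=\; \frac{P(S \mid Y)\,P(Y)}{P(S)} \;\propto\; \frac{P(S \mid Y)}{P(S)},
\]

since P(Y) does not depend on S. The classifier's posterior P(S | Y) (e.g., a softmax output) divided by the prior P(S) (relative class frequencies in the training set) can therefore stand in for the emission likelihood P(Y | S) inside the HMM, up to a constant factor per frame.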
“…For supervised learning, the training samples include inputs and outputs (i.e., features and class labels), which results in a better result than unsupervised learning in most cases [12]. The supervised algorithm commonly used includes decision tree (DT) [13], naïve Bayes (NB) [14], k-nearest neighbor (kNN) [15–17], neural networks (NNs) [18, 19], and support vector machine (SVM) [20–22]. Among them, SVM was first formally proposed by Cortes and Vapnik in 1995.…”
Section: Introduction (mentioning, confidence: 99%)
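As a minimal, generic illustration of the supervised setting this citation describes (a scikit-learn sketch on a toy dataset chosen by us; nothing here is taken from the cited papers), training uses both features and class labels, and the classifier shown is an SVM, one of the methods listed above:

# Supervised classification sketch: features X and labels y are both given
# at training time; here an SVM is the classifier.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)                 # toy feature/label dataset
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = SVC(kernel="rbf")                           # support vector machine
clf.fit(X_tr, y_tr)                               # learn from labelled samples
print("test accuracy:", clf.score(X_te, y_te))    # evaluate on held-out labels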