In this paper, a joint feature selection and parameter estimation algorithm is presented for hidden Markov models (HMMs) and hidden semi-Markov models (HSMMs). New parameters, called feature saliencies, are introduced to the model and used to select features that distinguish between states. Each feature saliency represents the probability that a feature is relevant, modeled by distinguishing between a state-dependent and a state-independent distribution for that feature. An expectation-maximization (EM) algorithm is used to calculate maximum a posteriori estimates for the model parameters. An exponential prior on the feature saliencies is compared with a beta prior; these priors can be used to incorporate feature cost into the model estimation and feature selection process. The algorithm is tested against maximum likelihood estimates and a variational Bayesian method. For the HMM, four formulations are compared on a synthetic data set generated by models with known parameters, a tool wear data set, and data collected during a painting process. For the HSMM, two formulations, maximum likelihood and maximum a posteriori, are tested on the latter two data sets, demonstrating that the feature saliency method of feature selection can be extended to semi-Markov processes. The literature on feature selection specifically for HMMs is sparse, and non-existent for HSMMs. This paper fills a gap in the literature concerning simultaneous feature selection and parameter estimation for HMMs using the EM algorithm, and introduces the notion of selecting features with respect to cost for HMMs.
INDEX TERMS Feature selection, hidden Markov models, hidden semi-Markov models, maximum a posteriori estimation.
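To make the feature saliency idea concrete, the sketch below evaluates the observation likelihood of one feature vector in a given state, mixing a state-dependent density (weighted by the saliency ρ_l) with a state-independent "background" density (weighted by 1 − ρ_l) for each feature. This is a minimal illustration only: it assumes univariate Gaussian densities for both components and illustrative names (`saliency_likelihood`, `rho`, the mean/variance arrays), none of which are taken from the paper itself.

```python
import math

def gauss_pdf(x, mean, var):
    # Univariate Gaussian density N(x; mean, var).
    return math.exp(-(x - mean) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def saliency_likelihood(x, rho, state_means, state_vars, bg_means, bg_vars):
    """Likelihood of feature vector x in one state under a feature saliency model.

    Feature l is relevant with probability rho[l]: its value is then drawn
    from the state-dependent density; otherwise (probability 1 - rho[l]) it
    comes from a state-independent background density shared by all states.
    Features are assumed conditionally independent given the state.
    """
    like = 1.0
    for l, xl in enumerate(x):
        relevant = rho[l] * gauss_pdf(xl, state_means[l], state_vars[l])
        irrelevant = (1.0 - rho[l]) * gauss_pdf(xl, bg_means[l], bg_vars[l])
        like *= relevant + irrelevant
    return like
```

With ρ_l = 1 this reduces to the ordinary state-conditional likelihood, and with ρ_l = 0 the feature contributes the same factor in every state, so it cannot help distinguish states; intermediate saliencies interpolate between these extremes, which is what the EM algorithm estimates alongside the other HMM/HSMM parameters.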