ICASSP '87. IEEE International Conference on Acoustics, Speech, and Signal Processing
DOI: 10.1109/icassp.1987.1169614
Explicit time correlation in hidden Markov models for speech recognition

Abstract: The hidden Markov models are generalized by defining a new emission probability which takes the correlation between successive feature vectors into account. Estimation formulas for the iterative learning, both along Viterbi and maximum likelihood criteria, are presented.
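To make the idea concrete, here is a minimal sketch of an emission density that conditions on the previous feature vector as well as the current state. This is an illustrative autoregressive Gaussian form, not the paper's exact formulation; the parameters `mu`, `A`, and `var` are hypothetical.

```python
import numpy as np

def ar_emission_logprob(o_t, o_prev, mu, A, var):
    """log p(o_t | state, o_{t-1}) modelled as N(o_t; mu + A @ o_prev, diag(var)).

    The linear term A @ o_prev introduces explicit correlation between
    successive feature vectors; with A = 0 this reduces to the usual
    state-conditional Gaussian emission of a standard HMM.
    """
    mean = mu + A @ o_prev
    d = o_t - mean
    return -0.5 * (np.sum(d * d / var) + np.sum(np.log(2.0 * np.pi * var)))
```

With `A` set to zero the value matches an ordinary diagonal Gaussian log-density, which makes the relaxation of the conditional independence assumption easy to see.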

Cited by 101 publications (58 citation statements)
References 9 publications (10 reference statements)
“…In this paper we propose using the autoregressive HMM [4]- [7] for speech synthesis. The autoregressive HMM relaxes the traditional HMM conditional independence assumption, allowing state output distributions which depend on past output as well as the current state.…”
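Because the emission depends on the previous observation, the sequence likelihood is still computed with the standard forward recursion; only the emission term changes. The sketch below assumes an `emission_logp(s, t)` callable returning log p(o_t | state s, o_{t-1}); all names and signatures here are illustrative, not from any cited paper.

```python
import numpy as np

def ar_hmm_loglik(log_pi, log_trans, emission_logp, T):
    """log p(o_1..o_T) via the forward algorithm in log space.

    log_pi:    (S,) log initial state probabilities
    log_trans: (S, S) log transition matrix, log_trans[i, j] = log p(j | i)
    emission_logp(s, t): log density of observation t under state s,
        which may condition on the previous observation (AR emission).
    """
    S = len(log_pi)
    alpha = log_pi + np.array([emission_logp(s, 0) for s in range(S)])
    for t in range(1, T):
        e = np.array([emission_logp(s, t) for s in range(S)])
        # alpha'[j] = logsumexp_i(alpha[i] + log_trans[i, j]) + e[j]
        scores = alpha[:, None] + log_trans
        m = scores.max(axis=0)
        alpha = m + np.log(np.exp(scores - m).sum(axis=0)) + e
    m = alpha.max()
    return m + np.log(np.exp(alpha - m).sum())
```

A quick sanity check: with uniform initial and transition probabilities and emission log-densities of zero, the state probabilities sum to one at every step, so the total log-likelihood is zero.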
“…Autoregressive HMMs have been used before for speech recognition [4]- [6], [8], but have not been extensively investigated for speech synthesis. 1 A basic formulation of the autoregressive HMM for statistical parametric speech synthesis showing how to do expectation maximization-based parameter estimation and parameter generation considering global variance was given in [11].…”
“…This DBN describes HMMs with explicit temporal correlation modelling [181], vector predictors [184], and buried Markov models [16]. Although an interesting direction for refining an HMM, this approach has not yet been adopted in mainstream state-of-the-art systems.…”
Section: Dynamic Bayesian Network
“…The trace of this moving point is called the trajectory of the symbol. Several techniques were applied to model these trajectories [22,23,24,25]. …”
Section: 1