Proceedings of the Human Language Technology Conference of the North American Chapter of the Association for Computational Linguistics (HLT-NAACL), Main Conference, 2006
DOI: 10.3115/1220835.1220871
Backoff model training using partially observed data

Abstract: Dialog act (DA) tags are useful for many applications in natural language processing and automatic speech recognition. In this work, we introduce hidden backoff models (HBMs), in which a large generalized backoff model is trained, using an embedded expectation-maximization (EM) procedure, on data that is partially observed. We use HBMs as word models conditioned on both DAs and (hidden) DA-segments. Experimental results on the ICSI meeting recorder dialog act corpus show that our procedure can strictly increase lik…
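The abstract's key idea is EM training of a model in which part of the conditioning context (the DA-segment) is hidden, with the data likelihood not decreasing across iterations. As a hedged illustration only, and not the paper's actual HBM implementation, the sketch below runs EM for a mixture of smoothed unigram word models with one hidden class per sentence, tracking the log-likelihood at each iteration; all names and the smoothing scheme are illustrative assumptions.

```python
import math
import random
from collections import defaultdict

def em_hidden_class(sentences, num_classes=2, iters=10, alpha=0.1, seed=0):
    """Illustrative EM for a mixture of unigram word models with a hidden
    per-sentence class. Returns per-iteration data log-likelihoods plus the
    learned class priors and word distributions."""
    rng = random.Random(seed)
    vocab = sorted({w for s in sentences for w in s})
    prior = [1.0 / num_classes] * num_classes
    # Random initial word distributions, normalized per class.
    word_p = []
    for _ in range(num_classes):
        raw = {w: rng.random() + 0.1 for w in vocab}
        z = sum(raw.values())
        word_p.append({w: v / z for w, v in raw.items()})
    lls = []
    n = len(sentences)
    for _ in range(iters):
        counts = [defaultdict(float) for _ in range(num_classes)]
        class_tot = [0.0] * num_classes
        ll = 0.0
        for s in sentences:
            # E-step: posterior over the hidden class for this sentence.
            logp = [math.log(prior[k]) +
                    sum(math.log(word_p[k][w]) for w in s)
                    for k in range(num_classes)]
            m = max(logp)
            post = [math.exp(l - m) for l in logp]
            z = sum(post)
            ll += m + math.log(z)          # log-sum-exp of the class scores
            post = [p / z for p in post]
            for k in range(num_classes):
                class_tot[k] += post[k]
                for w in s:
                    counts[k][w] += post[k]
        lls.append(ll)
        # M-step with add-alpha smoothing so no probability collapses to zero.
        prior = [(c + alpha) / (n + alpha * num_classes) for c in class_tot]
        for k in range(num_classes):
            tot = sum(counts[k].values()) + alpha * len(vocab)
            word_p[k] = {w: (counts[k][w] + alpha) / tot for w in vocab}
    return lls, prior, word_p
```

On a toy corpus the per-iteration log-likelihoods should not decrease, which is the property the abstract claims for the embedded EM procedure.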

Cited by 17 publications (18 citation statements); References 24 publications.
“…The convergence (5) then follows immediately from (14).…”
Section: Known Weights
confidence: 84%
“…, S_K} as in (12) and (13) and the empirical measures P̂_n^l(u_j, p_j) as in (14). We now summarize the above by giving a formal definition of the adjusted VT with the weight correction.…”
Section: Unknown Weights
confidence: 99%
“…Over recent years, dynamic Bayesian networks (DBNs) have been investigated for many sequential data modeling tasks (automatic speech recognition, POS and dialog-act tagging [9], DNA sequence analysis, …). DBNs have shown great flexibility for representing complex stochastic systems, and good performance is generally observed when compared to other standard techniques.…”
Section: DBN-based Understanding Models
confidence: 99%
“…Over recent years, dynamic Bayesian networks (DBNs) have been investigated for many sequential data modeling tasks (automatic speech recognition, POS and dialog-act tagging [10], DNA sequence analysis, …). DBNs have shown great flexibility for representing complex stochastic systems, and good performance is generally observed when compared to other standard techniques.…”
Section: DBN-based SLU Models
confidence: 99%