2013
DOI: 10.1007/978-3-642-40991-2_34

Variational Hidden Conditional Random Fields with Coupled Dirichlet Process Mixtures

Abstract: Hidden Conditional Random Fields (HCRFs) are discriminative latent variable models which have been shown to successfully learn the hidden structure of a given classification problem. An infinite HCRF is an HCRF with a countably infinite number of hidden states, which rids us not only of the necessity to specify a priori a fixed number of available hidden states but also of the problem of overfitting. Markov chain Monte Carlo (MCMC) sampling algorithms are often employed for inference in such models. …

Cited by 7 publications (5 citation statements). References 13 publications (22 reference statements).
“…An alternative approach to expanding the number of hidden states of the HCRF is the infinite HCRF (iHCRF), which employs a Dirichlet process to determine the number of hidden states. Inference in the iHCRF can be performed via collapsed Gibbs sampling [2] or variational inference [3]. Whilst theoretically facilitating infinitely many states, the modeling power of the iHCRF is, however, limited to the number of "represented" hidden states.…”
Section: Related Work
confidence: 99%
“…We do not enforce any such constraints on the features or parameters. Further, unlike the point estimates produced for the HCRF parameters via gradient descent in [20], the work here estimates a posterior distribution over the parameters.…”
Section: Related Work
confidence: 99%
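To illustrate the distinction this citation draws between a point estimate and a posterior distribution, here is a minimal conjugate-Gaussian example; it is deliberately not the paper's model (the HCRF posterior admits no such closed form), and all names and values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=2.0, scale=1.0, size=50)

# Point estimate (analogous to gradient-descent HCRF training):
# a single value for the parameter, no uncertainty attached.
point_estimate = data.mean()

# Bayesian alternative: a full posterior over the parameter.
# For a Gaussian likelihood with known unit variance and a
# N(0, tau0_sq) prior on the mean, the posterior is Gaussian
# with closed-form moments (standard conjugate update).
tau0_sq = 10.0
n = len(data)
post_var = 1.0 / (1.0 / tau0_sq + n)
post_mean = post_var * data.sum()
```

The posterior mean shrinks slightly toward the prior, and the posterior variance quantifies how much the data have constrained the parameter, which is exactly the information a point estimate discards.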
“…The works in [19,20] extend the HCRF to be non-parametric with a DP prior. The MCMC-based approach in [19] is not applicable to continuous observation features and excludes the HCRF normalization term.…”
Section: Related Work
confidence: 99%
“…CRFs can exploit contextual features to overcome the label bias problem, and can reach a globally optimal solution by normalizing over all features globally. Because of these properties, CRFs have been widely used in the vision community [7][8].…”
Section: Introduction
confidence: 99%