2010
DOI: 10.1007/978-3-642-14980-1_45

Information Theoretical Kernels for Generative Embeddings Based on Hidden Markov Models

Abstract: Many approaches to learning classifiers for structured objects (e.g., shapes) use generative models in a Bayesian framework. However, state-of-the-art classifiers for vectorial data (e.g., support vector machines) are learned discriminatively. A generative embedding is a mapping from the object space into a fixed-dimensional feature space, induced by a generative model which is usually learned from data. The fixed dimensionality of these feature spaces permits the use of state-of-the-art discriminati…
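The abstract's idea — map each structured object to a fixed-dimensional vector via learned generative models, then classify discriminatively — can be illustrated with a minimal sketch. This is not the paper's HMM-based method: as a stand-in for HMMs it fits one smoothed first-order Markov chain per class and embeds each sequence as its vector of per-class log-likelihoods; all names are hypothetical.

```python
import numpy as np

ALPHABET = 3  # toy discrete alphabet {0, 1, 2}

def fit_markov_chain(sequences, alphabet=ALPHABET, alpha=1.0):
    """Fit a Laplace-smoothed first-order Markov chain to a set of sequences."""
    counts = np.full((alphabet, alphabet), alpha)
    start = np.full(alphabet, alpha)
    for seq in sequences:
        start[seq[0]] += 1
        for a, b in zip(seq[:-1], seq[1:]):
            counts[a, b] += 1
    trans = counts / counts.sum(axis=1, keepdims=True)
    init = start / start.sum()
    return init, trans

def log_likelihood(seq, model):
    """Log-probability of one sequence under a fitted chain."""
    init, trans = model
    ll = np.log(init[seq[0]])
    for a, b in zip(seq[:-1], seq[1:]):
        ll += np.log(trans[a, b])
    return ll

def embed(seq, models):
    """Generative embedding: one log-likelihood coordinate per class model."""
    return np.array([log_likelihood(seq, m) for m in models])

# Two toy classes: sequences that dwell on a symbol vs. sequences that cycle.
class0 = [[0, 0, 0, 1, 1, 1, 2, 2] for _ in range(20)]
class1 = [[0, 1, 2, 0, 1, 2, 0, 1] for _ in range(20)]
models = [fit_markov_chain(class0), fit_markov_chain(class1)]

z = embed([0, 0, 0, 0, 1, 1, 2, 2], models)
print(z[0] > z[1])  # the dwelling sequence scores higher under the class-0 model
```

The resulting fixed-length vectors `z` can then be fed to any off-the-shelf discriminative classifier (e.g., an SVM), which is the point of the embedding.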

Cited by 5 publications
(6 citation statements)
References 20 publications
“…This problem of feature selection poses a key research theme in machine learning and requires appropriate techniques for dimensionality reduction. One attractive option is to embed the data in a feature space that is constructed using a generative model (Lasserre et al., 2006; Martins et al., 2010; Minka, 2005; Perina et al., 2010). This feature space, referred to as a generative score space, embodies a model-guided dimensionality reduction of the observed data.…”
Section: Methodsmentioning
confidence: 99%
“…Generative embedding represents a special case of using generative kernels for classification, such as the P-kernel [52] or the Fisher kernel [53]. Generative kernels have been fruitfully exploited in a range of applications [54]–[66] and define an active area of research [67]–[70]. In the special case of generative embedding, a generative kernel is used to construct a generative score space.…”
Section: Introductionmentioning
confidence: 99%
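The Fisher kernel mentioned in the statement above compares objects by the gradient of their log-likelihood with respect to the generative model's parameters. A minimal sketch for an independent-Bernoulli model (a deliberately simple stand-in for the HMMs used in the paper; the parameter values are hypothetical):

```python
import numpy as np

def fisher_score(x, theta):
    """Gradient of log p(x | theta) for an independent-Bernoulli model:
    d/d theta_i [x_i log theta_i + (1 - x_i) log(1 - theta_i)]."""
    return x / theta - (1 - x) / (1 - theta)

def fisher_kernel(x, y, theta):
    """Plain inner product of Fisher scores. The classical definition also
    whitens by the inverse Fisher information matrix; the identity metric
    is used here for brevity."""
    return fisher_score(x, theta) @ fisher_score(y, theta)

theta = np.array([0.7, 0.2, 0.5])  # hypothetical fitted parameters
x = np.array([1.0, 0.0, 1.0])
y = np.array([1.0, 0.0, 0.0])
print(fisher_kernel(x, y, theta))
```

Because the score lives in a fixed-dimensional parameter space regardless of the object's size, the same construction yields a generative score space for variable-length data such as sequences.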
“…Instead of relying on standard kernels, we investigate the use of the recently introduced information-theoretic (IT) kernels [15] as a similarity measure between objects in the generative embedding space. The main idea is that, with such kernels, we can exploit the probabilistic nature of the generative embeddings, further improving the classification results of the hybrid approaches; this has already been shown in other classification contexts [3,16].…”
Section: Discriminative Classificationmentioning
confidence: 91%
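One of the information-theoretic kernels studied by Martins et al. is built on the Jensen–Shannon divergence, which makes sense as a similarity measure precisely because the embedded points are probabilistic objects. A minimal sketch (natural-log version; normalization constants and the nonextensive generalizations from the paper are omitted):

```python
import numpy as np

def entropy(p):
    """Shannon entropy (nats) of a discrete distribution."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def js_divergence(p, q):
    """Jensen-Shannon divergence: entropy of the mixture minus mean entropy."""
    m = 0.5 * (p + q)
    return entropy(m) - 0.5 * entropy(p) - 0.5 * entropy(q)

def js_kernel(p, q):
    """JS kernel: ln 2 minus the JS divergence, a positive-definite
    similarity between probability distributions."""
    return np.log(2.0) - js_divergence(p, q)

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.1, 0.3, 0.6])
print(js_kernel(p, p) >= js_kernel(p, q))  # self-similarity is maximal
```

Since the JS divergence is bounded by ln 2, the kernel value always stays non-negative, and identical distributions attain the maximum similarity ln 2.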