2004 IEEE International Joint Conference on Neural Networks (IEEE Cat. No.04CH37541)
DOI: 10.1109/ijcnn.2004.1380905

Fisher kernel for tree structured data

Abstract: We introduce a kernel for structured data, which is an extension of the Fisher kernel used for sequences [11]. In our approach, we extract the Fisher score vectors from a Bayesian network, specifically a Hidden Tree Markov Model [6], which can be constructed starting from the training data. Experiments on a QSPR (quantitative structure-property relationship) analysis, where instances are naturally represented as trees, allow a first test of the approach.
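
For context, the Fisher kernel the abstract builds on turns a generative model into a feature map: each example is represented by the gradient of the model's log-likelihood with respect to the model parameters. A standard formulation (following the sequence version cited as [11]; the notation below is ours, not taken from the paper) is

    U_x = \nabla_\theta \log P(x \mid \theta)                % Fisher score of example x
    K(x, x') = U_x^\top I^{-1} U_{x'}                        % Fisher kernel
    I = E_{x \sim P(\cdot \mid \theta)} [ U_x U_x^\top ]     % Fisher information matrix

In practice I is frequently approximated by the identity matrix, so K reduces to a dot product of score vectors. In this paper the scores are taken with respect to the parameters of a Hidden Tree Markov Model rather than a sequence HMM, which is what makes the kernel applicable to tree structured data.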

Cited by 15 publications (5 citation statements) | References 17 publications

“…Deriving a probabilistic model that incorporates information about the data, however, is in general computationally very expensive. These models include [58][59][60][61][62][63]. A very important kernel in this class is the Fisher kernel [61], which is in fact a general way to generate a kernel from a probabilistic model describing the data, such as a Hidden Markov Model (used, for example, in bioinformatics to model protein families) or Stochastic Context-Free Grammars (which have been used to model RNA sequences).…”
Section: Kernel Methods (mentioning)
confidence: 99%
“…In this thesis the Fisher kernel [15] is explained, since it has been shown to perform well in many applications and can process data that are not of the vector type [16]. Generative models such as Gaussian Mixture Models (GMM) and Hidden Markov Models (HMM) [17] have been widely used to model the observed data.…”
Section: Fisher Kernel (mentioning)
confidence: 99%
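
The recipe described in these citing passages can be sketched in a few lines. The following is a minimal illustration, not the paper's Hidden Tree Markov Model construction: it assumes a diagonal-covariance Gaussian mixture fitted with scikit-learn, takes Fisher scores only with respect to the component means, and replaces the Fisher information matrix with the identity.

    # Minimal Fisher-kernel sketch on top of a Gaussian Mixture Model
    # (assumed setup, not the paper's Hidden Tree Markov Model).
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def fisher_scores(gmm, X):
        """Gradient of log p(x | theta) w.r.t. the component means of a diagonal GMM."""
        resp = gmm.predict_proba(X)                    # (n, K) posterior responsibilities
        diff = X[:, None, :] - gmm.means_[None, :, :]  # (n, K, d) deviations from each mean
        grad = resp[:, :, None] * diff / gmm.covariances_[None, :, :]  # d log p / d mu_k
        return grad.reshape(len(X), -1)                # one flattened score vector per sample

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))                      # toy data standing in for real descriptors
    gmm = GaussianMixture(n_components=4, covariance_type="diag", random_state=0).fit(X)

    U = fisher_scores(gmm, X)
    K = U @ U.T   # Gram matrix with the identity in place of the Fisher information
    # K can now be fed to any kernel machine, e.g. sklearn.svm.SVC(kernel="precomputed").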
“…- text (via NLP parse-tree kernels / LSA kernels [9,10])
- graphs and sequences (via string kernels, random walk kernels [11,12])
- shapes (via edit distance kernels [13])
- real vectors (via dot product kernels, polynomial kernels, etc. [14])
- sets of pixels/voxels (efficient match kernels, pyramid match kernels [15])
- stochastic data (Fisher kernels [8])…”
Section: Mercer Kernels Are Now For All Data Types (mentioning)
confidence: 99%