2004
DOI: 10.1023/b:nepl.0000016845.36307.d7

Minimizing the Cross Validation Error to Mix Kernel Matrices of Heterogeneous Biological Data

Citation types: 1 supporting, 5 mentioning, 0 contrasting

Year published: 2006–2022

Cited by 7 publications (6 citation statements); references 26 publications.

“…On comparison with (2), we have […] and […] on using (5) and the distributive property of the Hadamard product. This confirms that […] is of the form in (15), with […]. It is also easy to see that…”
Section: B. Kernel Learning Examples (supporting)
Confidence: 82%
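The inline expressions stripped from the excerpt above cannot be recovered, but the property it leans on is standard and worth stating explicitly. As a minimal illustration (the symbols A, B, C and K1, K2 below are generic placeholders, not the cited paper's notation): the Hadamard (element-wise) product distributes over matrix addition, and by the Schur product theorem the Hadamard product of two kernel matrices is again a valid kernel matrix.

```latex
(A + B) \circ C \;=\; A \circ C \,+\, B \circ C,
\qquad
K_1 \circ K_2 \succeq 0 \quad \text{whenever } K_1 \succeq 0 \text{ and } K_2 \succeq 0.
```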
“…can often be written in the form of (13) in many kernel methods, i.e., (15), where […] and […]. We also decompose […] in the same form as for […], i.e.,…”
Section: Section IV-B (mentioning)
Confidence: 99%
“…In fact, the kernel matrix implicitly represents the inner product between all pairs of instances in an embedded feature space induced by a feature mapping. Since the resulting feature space may be high-dimensional or even infinite-dimensional, the kernel matrix enables tractable and efficient computation in the original space without the explicit mapping [38]. They are integrated through the Log-Euclidean Mean (LogE), the Arithmetic Mean (AM) and a weighted version of LogE (W-LogE).…”
Section: Kernel-Based Data Fusion Methods for Gene Prioritization (mentioning)
Confidence: 99%
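The fusion rules named in this excerpt, the arithmetic mean and the (weighted) Log-Euclidean mean of kernel matrices, admit a compact sketch. The snippet below is only an illustration under stated assumptions, not the cited work's implementation: the function names, the small ridge `eps` used to make the matrix logarithm well defined, and the reliance on SciPy's `logm`/`expm` are choices made here.

```python
import numpy as np
from scipy.linalg import expm, logm


def arithmetic_mean(kernels):
    """Arithmetic mean (AM) of a list of kernel (Gram) matrices."""
    return sum(kernels) / len(kernels)


def log_euclidean_mean(kernels, weights=None, eps=1e-8):
    """(Weighted) Log-Euclidean mean of symmetric PSD kernel matrices.

    Each matrix is ridged with eps * I so the matrix logarithm is well
    defined, the (weighted) average is taken in log-space, and the result
    is mapped back with the matrix exponential.
    """
    m = len(kernels)
    w = np.full(m, 1.0 / m) if weights is None else np.asarray(weights, dtype=float)
    n = kernels[0].shape[0]
    acc = np.zeros((n, n))
    for wi, K in zip(w, kernels):
        acc += wi * np.real(logm(K + eps * np.eye(n)))
    return np.real(expm(acc))
```

With uniform weights the second function corresponds to LogE; supplying tuned nonuniform weights gives the weighted variant (W-LogE).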
“…The use of MKL for KFDA has received much interest in the literature. Tsuda et al. considered both linear and nonlinear mixtures of kernel matrices and optimized the kernel combination weights to minimize the cross-validation error for KFDA [14]. Many other works have also introduced multiple kernel FDA (MK-FDA) by constructing an optimized linear combination of several base kernels under a specific constraint on the mixing weights [15,16,17,18,19,20,21,22].…”
Section: Introduction (mentioning)
Confidence: 99%
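To make the idea in this excerpt concrete, the sketch below selects the weight of a two-kernel linear mixture by minimizing a cross-validated error estimate. It is a hedged approximation rather than the method of the indexed paper: Tsuda et al. tune the weights for kernel Fisher discriminant analysis with a gradient-based procedure, whereas this sketch substitutes a precomputed-kernel SVM from scikit-learn and a plain grid search, and the function names are invented here.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC


def cv_error(K, y, cv=5):
    """Cross-validated error of an SVM trained on the precomputed Gram matrix K."""
    clf = SVC(kernel="precomputed")
    return 1.0 - cross_val_score(clf, K, y, cv=cv).mean()


def mix_two_kernels_by_cv(K1, K2, y, grid=np.linspace(0.0, 1.0, 11)):
    """Pick the weight w that minimizes the CV error of K = w*K1 + (1-w)*K2."""
    errors = [cv_error(w * K1 + (1.0 - w) * K2, y) for w in grid]
    best = int(np.argmin(errors))
    return float(grid[best]), float(errors[best])
```

Restricting the weights to be nonnegative (here w and 1 − w) keeps the mixture positive semidefinite, so it remains a valid kernel matrix.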