2013
DOI: 10.1109/msp.2013.2249294
Analyzing Local Structure in Kernel-Based Learning: Explanation, Complexity, and Reliability Assessment

Cited by 39 publications (24 citation statements)
References 28 publications
“…Indeed, one only needs to replace the inner product operator of a linear technique with an appropriate kernel function k (i.e., a positive semi-definite symmetric function), which arises as a similarity measure that can be thought of as an inner product between pairs of data in the feature space. Here, the original nonlinear problem can be transformed into a linear formulation in a higher-dimensional space ℱ with an appropriate kernel k [65]:…”
Section: Transfer Learning
confidence: 99%
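The kernel trick described in the statement above can be made concrete with a minimal sketch. For the degree-2 polynomial kernel k(x, y) = (x · y + 1)², evaluating k directly in input space gives the same number as an explicit inner product ⟨φ(x), φ(y)⟩ in a higher-dimensional feature space ℱ. The feature map `phi` below is the standard explicit expansion for 2-D inputs; the variable names are illustrative, not from the cited work.

```python
import numpy as np

def poly_kernel(x, y):
    """Degree-2 polynomial kernel, evaluated directly in input space."""
    return (np.dot(x, y) + 1.0) ** 2

def phi(x):
    """Explicit feature map for the degree-2 polynomial kernel (2-D input):
    phi(x) = (x1^2, x2^2, sqrt(2) x1 x2, sqrt(2) x1, sqrt(2) x2, 1)."""
    x1, x2 = x
    s = np.sqrt(2.0)
    return np.array([x1 * x1, x2 * x2, s * x1 * x2, s * x1, s * x2, 1.0])

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])

# The kernel value in input space equals the inner product in feature space,
# so a linear method using <phi(x), phi(y)> never needs phi explicitly.
assert np.isclose(poly_kernel(x, y), np.dot(phi(x), phi(y)))
```

This is why only the kernel function, never the (possibly very high-dimensional) feature map, has to be computed when "kernelizing" a linear technique.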
“…Moreover, for a specific problem, the selection of the best kernel function is still an open issue, although ample experimental evidence in the literature supports that popular kernel functions such as Gaussian and polynomial kernels perform well in most cases. At the root of the success of kernel-based learning, the combination of high expressive power with the ability to perform many kinds of analysis has been exploited in numerous challenging applications [65], e.g., online classification [66], convexly constrained parameter/function estimation [67], beamforming problems [68], and adaptive multiregression [69]. One of the most popular surveys of kernel-based learning algorithms is [70], which gives an introduction to this exciting field of methods and applications.…”
Section: Transfer Learning
confidence: 99%
“…The accuracy of predictions depends strongly on how crystals are represented. We found that Coulomb matrices, while successful for predicting properties of molecules [25,26], are not suitable to describe crystal structures well enough. Instead, we have proposed a representation inspired by partial radial distribution functions which is invariant with respect to translation, rotation and the choice of the unit cell.…”
Section: arXiv:1307.1266v3 [cond-mat.mtrl-sci] 22 May 2014
confidence: 92%
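For context on the representation the statement above rejects for crystals: the Coulomb matrix of a molecule encodes nuclear charges Z and atomic positions R as M_ii = 0.5·Z_i^2.4 and M_ij = Z_i·Z_j / |R_i − R_j| (the form introduced by Rupp et al.). A minimal sketch, with an H₂ example whose bond length is the standard 0.74 Å:

```python
import numpy as np

def coulomb_matrix(Z, R):
    """Coulomb-matrix representation of a molecule:
    M_ii = 0.5 * Z_i**2.4 (self-interaction term),
    M_ij = Z_i * Z_j / |R_i - R_j| for i != j (pairwise Coulomb repulsion)."""
    Z = np.asarray(Z, dtype=float)
    R = np.asarray(R, dtype=float)
    n = len(Z)
    M = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i == j:
                M[i, j] = 0.5 * Z[i] ** 2.4
            else:
                M[i, j] = Z[i] * Z[j] / np.linalg.norm(R[i] - R[j])
    return M

# H2 molecule: two hydrogen atoms (Z = 1) separated by 0.74 Angstrom
M = coulomb_matrix([1, 1], [[0.0, 0.0, 0.0], [0.74, 0.0, 0.0]])
```

The matrix is symmetric and invariant to translation and rotation, but for a periodic crystal it depends on the arbitrary choice of unit cell, which motivates the partial-radial-distribution-function representation proposed in the cited work.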
“…The scatter was discontinuous because the FVC step was set to 0.05 when the training dataset was generated using the PROSAIL model. To analyze the models' robustness in terms of local consistency and the reliability of the estimated error [62,63], Figure 6 shows the evolution of the RMSE as a function of the number of predictions. The shapes of the two curves in Figure 6 are similar for the two methods, showing a consistently better performance of MARS over BPNNs.…”
Section: Accuracy Assessment Over the Simulated Dataset
confidence: 99%
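The RMSE-versus-number-of-predictions curve referenced in the statement above can be sketched as a cumulative statistic: after each of the first n predictions, recompute the RMSE over those n errors. A minimal illustration (the toy data below are hypothetical, not from the cited study):

```python
import numpy as np

def rmse_vs_n(y_true, y_pred):
    """Cumulative RMSE after the first n predictions, for n = 1..N.
    Returns an array whose n-th entry is sqrt(mean of the first n squared errors)."""
    err2 = (np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)) ** 2
    cum = np.cumsum(err2)                # running sum of squared errors
    n = np.arange(1, len(err2) + 1)      # number of predictions so far
    return np.sqrt(cum / n)

# Toy example: the third prediction is off by 2, so the curve jumps there
curve = rmse_vs_n([1.0, 2.0, 3.0], [1.0, 2.0, 5.0])
```

Plotting such a curve for each model makes the comparison in Figure 6 visible: a method whose curve stays lower across all n performs consistently better, not just on average.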