2008
DOI: 10.1016/j.neucom.2007.10.030

Learning radial basis neural networks in a lazy way: A comparative study

Abstract: Lazy learning methods have been used to deal with problems in which the learning examples are not evenly distributed in the input space. They are based on the selection of a subset of training patterns when a new query is received. Usually, that selection is based on the k closest neighbors and it is a static selection, because the number of patterns selected does not depend on the input space region in which the new query is placed. In this paper, a lazy strategy is applied to train radial basis…
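As an illustration of the static selection the abstract contrasts with its lazy strategy, here is a minimal Python sketch of k-nearest-neighbor pattern selection; the function name, the fixed k, and the Euclidean metric are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def select_k_nearest(X_train, y_train, query, k=10):
    """Static lazy selection: always pick the k training patterns
    closest to the query, regardless of how densely populated the
    query's region of the input space is (k is an assumed constant)."""
    dists = np.linalg.norm(X_train - query, axis=1)  # Euclidean distances
    idx = np.argsort(dists)[:k]                      # indices of the k nearest
    return X_train[idx], y_train[idx]
```

A dynamic selection, as studied in the paper, would instead let the number of selected patterns vary with the region of the input space in which the query falls.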

Cited by 3 publications (3 citation statements) | References 20 publications
“…k-NN learning is one of the lazy learning (i.e., IBL) techniques that defer the decision of a model's parameters until a new query point is provided. When a query instance is given, a set of similar patterns is determined from the already obtained training set and used to estimate the output of the new instance [17]. There are two well-known advantages of lazy learning: effective use of the local information available in the input space and short training time [1].…”
Section: Locally Weighted Regression with k-NN Approach (mentioning)
confidence: 99%
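The deferred, query-time fitting this statement describes can be sketched as locally weighted regression over the k nearest neighbors; the names, the Gaussian kernel, and the bandwidth choice below are assumptions for illustration, not the cited paper's method:

```python
import numpy as np

def lwr_predict(X_train, y_train, query, k=10):
    """Lazy prediction: nothing is fitted until the query arrives.
    A linear model is then fitted on the k nearest neighbours,
    weighted so that closer patterns dominate, and evaluated once."""
    d = np.linalg.norm(X_train - query, axis=1)
    nn = np.argsort(d)[:k]                        # k closest training patterns
    h = d[nn].max() + 1e-12                       # bandwidth: neighbourhood radius
    w = np.sqrt(np.exp(-(d[nn] / h) ** 2))        # sqrt of Gaussian weights (for WLS)
    A = np.hstack([X_train[nn], np.ones((k, 1))])  # add a bias column
    beta, *_ = np.linalg.lstsq(A * w[:, None], y_train[nn] * w, rcond=None)
    return np.append(query, 1.0) @ beta
```

Because the model is rebuilt per query, training time is near zero and the fit uses only the local information around the query, the two advantages the statement names.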
“…To perform well, such a model should cover the entire input space while focusing on greater accuracy in regions of higher probability. Typical examples of lazy and eager learning are, respectively, the kNN [6] and RBFN [9] algorithms.…”
Section: Introduction (mentioning)
confidence: 99%
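For contrast with the lazy kNN side of this statement, a minimal eager RBFN might be sketched as follows; the class name, the externally supplied centers, and the single shared width are assumptions (a real RBFN would typically place centers by clustering the training data):

```python
import numpy as np

class EagerRBFN:
    """Minimal eager RBFN: all parameters are fixed once, at training
    time, before any query is seen (the opposite of lazy learning)."""
    def __init__(self, centers, width=1.0):
        self.centers = np.asarray(centers)   # (m, d) assumed given
        self.width = width

    def _phi(self, X):
        # (n, m) matrix of Gaussian activations of every basis function.
        d = np.linalg.norm(X[:, None, :] - self.centers[None, :, :], axis=2)
        return np.exp(-(d / self.width) ** 2)

    def fit(self, X, y):
        # Output weights solved globally over the whole training set.
        self.w, *_ = np.linalg.lstsq(self._phi(X), y, rcond=None)
        return self

    def predict(self, X):
        return self._phi(X) @ self.w
```

Once `fit` has run, every query reuses the same global weights; the lazy strategies discussed here instead rebuild a local model per query.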
“…In [29], we can find a time series analysis using nonlinear dynamic systems theory and models based on multilayer feed-forward networks. These models are applied to measurements of the tide level in the Venice lagoon over the years 1980-1994. In recent works [8,26], learning methods are used to automatically select the most appropriate patterns for training, depending on the example to predict. This training method uses a lazy learning strategy that builds local approximations centered on the new patterns.…”
Section: Introduction (mentioning)
confidence: 99%
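A lazy one-step-ahead forecaster in the spirit of the local approximations mentioned here might look as follows; the lag-vector embedding, the plain averaging of successors, and all names are simplifying assumptions (the cited works build richer local models, e.g. radial basis networks):

```python
import numpy as np

def lazy_forecast(series, query_window, k=20, lag=4):
    """One-step-ahead forecast built lazily: embed the series into
    lag vectors, find the k historical windows most similar to the
    current one, and average their successors."""
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    y = np.array([series[i + lag] for i in range(len(series) - lag)])
    d = np.linalg.norm(X - query_window, axis=1)
    nn = np.argsort(d)[:k]
    return y[nn].mean()    # local constant model centred on the query
```

As in the other statements, the training patterns are selected only once the pattern to predict is known, which is the defining trait of the lazy strategy.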