2012
DOI: 10.1109/tnnls.2012.2200500

Kernel Recursive Least-Squares Tracker for Time-Varying Regression

Abstract: In this paper, we introduce a kernel recursive least-squares (KRLS) algorithm that is able to track nonlinear, time-varying relationships in data. To this purpose, we first derive the standard KRLS equations from a Bayesian perspective (including a sensible approach to pruning) and then take advantage of this framework to incorporate forgetting in a consistent way, thus enabling the algorithm to perform tracking in nonstationary scenarios. The resulting method is the first kernel adaptive filtering algorithm t…
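The abstract's idea — recursive kernel regression with forgetting, so that stale data stops dominating the fit — can be illustrated with a deliberately naive sketch. This is not the paper's KRLS-T recursion (which maintains a fixed-budget Gaussian posterior updated recursively); it simply refits a weighted kernel ridge regression whose exponential weights discount old samples. All names here are hypothetical.

```python
import numpy as np

def rbf(X, Y, ell=1.0):
    # Squared-exponential kernel matrix between row sets X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

class ForgettingKernelRegressor:
    """Illustrative tracker (hypothetical simplification of KRLS-T):
    kernel ridge regression refit at each prediction with exponential
    forgetting weights, so recent samples dominate the solution."""
    def __init__(self, lam=0.95, noise=1e-2, ell=1.0):
        self.lam, self.noise, self.ell = lam, noise, ell
        self.X, self.y = [], []

    def update(self, x, y):
        self.X.append(x)
        self.y.append(y)

    def predict(self, x):
        X = np.asarray(self.X)
        y = np.asarray(self.y)
        n = len(y)
        # weight_i = lam^(n-1-i): the most recent sample gets weight 1.
        w = self.lam ** np.arange(n - 1, -1, -1)
        K = rbf(X, X, self.ell)
        # Weighted regularized solve: (K + noise * diag(1/w)) alpha = y.
        # Down-weighted (old) samples get a large effective regularizer.
        alpha = np.linalg.solve(K + self.noise * np.diag(1.0 / w), y)
        return rbf(np.asarray([x]), X, self.ell) @ alpha
```

Feeding the same inputs first with one target regime and then with another shows the tracking behavior: the prediction follows the recent regime instead of averaging the two. The real KRLS-T achieves the same effect recursively at constant cost per step.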


Cited by 154 publications (119 citation statements); references 18 publications.
“…These experiments demonstrate that the algorithms produce high accuracies relative to standard kernels (e.g., isotropic and ARD squared exponential kernels) using sliding-windows as well as state-of-the-art online methods including sparse online Gaussian process (SOGP) [4], kernel recursive least squares (KRLS) [8], [9], locally weighted projection regression (LWPR) [10] and kernel least mean squares (KLMS) [11].…”
Section: Introduction
confidence: 88%
“…In this paper, we expand upon this body of work and contribute a novel online training method for ESNs through Bayesian online learning, specifically the SOGP [4]. In fact, the SOGP is intimately connected with the KRLS and its more recent variants [8], [9]. KRLS is a kernelised RLS filter with a sparse dictionary.…”
Section: B. Online Training for ESNs
confidence: 99%
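The "sparse dictionary" the quote refers to is typically maintained with the approximate linear dependence (ALD) test of Engel et al.'s KRLS: a new input is admitted only if its feature-space image cannot be approximated, up to a tolerance, by the current dictionary. A minimal sketch, with hypothetical names and an RBF kernel assumed:

```python
import numpy as np

def rbf(x, y, ell=1.0):
    return np.exp(-0.5 * np.sum((x - y) ** 2) / ell**2)

class ALDDictionary:
    """Sketch of the sparse-dictionary test used by KRLS-type filters.
    Maintains the inverse kernel matrix of the dictionary so the ALD
    residual can be evaluated without repeated matrix inversions."""
    def __init__(self, nu=1e-2, ell=1.0):
        self.nu, self.ell = nu, ell
        self.D = []          # dictionary inputs
        self.Kinv = None     # inverse kernel matrix of the dictionary

    def consider(self, x):
        if not self.D:
            self.D = [x]
            self.Kinv = np.array([[1.0 / rbf(x, x, self.ell)]])
            return True
        k = np.array([rbf(d, x, self.ell) for d in self.D])
        a = self.Kinv @ k                      # best linear coefficients
        delta = rbf(x, x, self.ell) - k @ a    # squared residual in RKHS
        if delta > self.nu:
            # Grow Kinv by block inversion and admit x.
            n = len(self.D)
            Kinv = np.zeros((n + 1, n + 1))
            Kinv[:n, :n] = self.Kinv + np.outer(a, a) / delta
            Kinv[:n, n] = -a / delta
            Kinv[n, :n] = -a / delta
            Kinv[n, n] = 1.0 / delta
            self.Kinv = Kinv
            self.D.append(x)
            return True
        return False              # x is (nearly) dependent: not admitted
```

A repeated input has zero residual and is rejected, while a distant input under an RBF kernel is nearly orthogonal in feature space and is admitted; this is what keeps the dictionary sparse.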
“…These issues come from the infinite memory of the system through the hyperparameters (5) and can be avoided using a "back-to-the-prior" strategy [33], [18] on the hyperparameter σ…”
Section: B. LLR
confidence: 99%
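A "back-to-the-prior" forgetting step can be sketched in one line: at each time step, blend the current belief about the hyperparameter back toward its prior, so the influence of old evidence decays geometrically and the effective memory stays bounded. The following is an illustrative simplification (scalar Gaussian belief, convex-combination rule, hypothetical names), not the cited papers' exact update:

```python
def back_to_prior(mean, var, prior_mean, prior_var, lam):
    """One forgetting step: shrink the posterior belief (mean, var)
    toward the prior with factor lam in (0, 1). Repeated application
    without new evidence converges geometrically to the prior."""
    new_mean = lam * mean + (1.0 - lam) * prior_mean
    new_var = lam * var + (1.0 - lam) * prior_var
    return new_mean, new_var
```

Interleaving this step with ordinary Bayesian updates gives the bounded-memory behavior the quote describes: evidence observed t steps ago contributes with weight lam**t.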
“…Many techniques that try to bound this set have been presented in the literature [18], [19], [20], [21]. For example, [22] proposes online kernel algorithms for classification, regression and novelty detection using a stochastic gradient descent algorithm in the Hilbert space defined by the kernel.…”
Section: A. Related Research, 1) SFA in Computer Vision
confidence: 99%
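The stochastic-gradient-in-RKHS approach the quote attributes to [22] can be illustrated with a kernel-LMS-style sketch (a hypothetical minimal variant, without the regularization and truncation those algorithms add): each sample contributes one kernel unit scaled by step size times prediction error, which is exactly why the expansion grows without bound unless the set is bounded.

```python
import numpy as np

def rbf(x, y, ell=0.5):
    return np.exp(-0.5 * np.sum((x - y) ** 2) / ell**2)

class KLMS:
    """Minimal kernel LMS sketch: stochastic gradient descent on the
    instantaneous squared error in the RKHS. f(x) = sum_i c_i k(z_i, x),
    and every step appends one new (center, coefficient) pair."""
    def __init__(self, eta=0.5, ell=0.5):
        self.eta, self.ell = eta, ell
        self.centers, self.coeffs = [], []

    def predict(self, x):
        return sum(c * rbf(z, x, self.ell)
                   for z, c in zip(self.centers, self.coeffs))

    def step(self, x, y):
        e = y - self.predict(x)          # instantaneous error
        self.centers.append(x)           # the set the quote wants bounded
        self.coeffs.append(self.eta * e) # gradient step in the RKHS
        return e
```

Repeated passes over a fixed set of samples drive the errors toward zero, but `centers` grows by one element per step — the unbounded set that the techniques in [18]–[21] try to control.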
“…Although alternatives exist [20], [21], [23], [24], we adopt the reduced set expansion of [10] to ensure constant running time. While we describe the main steps in the following, we refer to [10] for details.…”
Section: Introducing Constant Running Time
confidence: 99%
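For contrast with the reduced set expansion of [10], which re-projects a discarded term onto the remaining centers, the crudest way to cap running time is to drop the least-weighted center once a budget is exceeded. This stand-in (hypothetical names, not the method of [10]) is shown only to make the "constant running time" idea concrete:

```python
import numpy as np

def prune_to_budget(centers, coeffs, budget):
    """Keep a kernel expansion at fixed size: while the number of
    centers exceeds `budget`, discard the center whose coefficient has
    the smallest magnitude (i.e., the term contributing least to the
    expansion). Unlike a reduced set expansion, the discarded term's
    contribution is simply lost rather than re-projected."""
    while len(centers) > budget:
        i = int(np.argmin(np.abs(coeffs)))
        centers.pop(i)
        coeffs.pop(i)
    return centers, coeffs
```

With a fixed budget, every prediction and update touches at most `budget` kernel evaluations, which is what makes the per-step cost constant.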