2015
DOI: 10.1109/tnnls.2014.2316291
Spatio-Temporal Learning With the Online Finite and Infinite Echo-State Gaussian Processes

Abstract: Successful biological systems adapt to change. In this work, we are principally concerned with adaptive systems that operate in environments where data arrives sequentially and is multi-variate in nature, e.g., sensory streams in robotic systems. We contribute two reservoir-inspired methods: (1) the online echo-state Gaussian process (OESGP) and (2) its infinite variant, the online infinite echo-state Gaussian process (OIESGP). Both algorithms are iterative fixed-budget methods that learn from noisy time-series…
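A minimal sketch of the reservoir-plus-GP idea described in the abstract, assuming a leaky echo-state reservoir whose state feeds a Gaussian process readout that is refit as samples arrive. The class name, reservoir sizes, leak rate, kernel, and the crude "keep the last N states" budget are illustrative assumptions, not the paper's sparse fixed-budget update.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

class EchoStateGP:
    """Illustrative echo-state reservoir with a GP readout (not the paper's OESGP update)."""

    def __init__(self, n_inputs, n_reservoir=50, leak_rate=0.3, spectral_radius=0.9, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.uniform(-1, 1, (n_reservoir, n_inputs))
        W = rng.uniform(-1, 1, (n_reservoir, n_reservoir))
        # Rescale the recurrent weights so the reservoir has the echo-state property.
        W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
        self.W = W
        self.leak = leak_rate
        self.x = np.zeros(n_reservoir)
        self.states, self.targets = [], []
        self.gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-3)

    def _update_state(self, u):
        # Leaky-integrator reservoir update driven by the new input u.
        pre = self.W_in @ u + self.W @ self.x
        self.x = (1 - self.leak) * self.x + self.leak * np.tanh(pre)

    def predict(self, u):
        # Advance the reservoir, then query the GP readout for mean and std.
        self._update_state(u)
        if not self.states:
            return 0.0, 1.0  # prior prediction before any data has been seen
        mu, std = self.gp.predict(self.x[None, :], return_std=True)
        return float(mu[0]), float(std[0])

    def update(self, y, budget=100):
        # Keep only the most recent reservoir states: a crude stand-in for a fixed budget.
        self.states.append(self.x.copy())
        self.targets.append(y)
        self.states, self.targets = self.states[-budget:], self.targets[-budget:]
        self.gp.fit(np.array(self.states), np.array(self.targets))
```

A typical online loop would call predict(u_t) on the next input vector and then update(y_t) once the corresponding target is observed.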

Cited by 53 publications (22 citation statements)
References 38 publications
“…As such, we develop models that capture both the dynamic property of trust and its variation across tasks. We leverage upon recurrent neural networks that have been applied to a variety of sequential learning tasks (e.g., [34]) and online Gaussian processes that have been previously used in robotics [33,31,32].…”
Section: Background and Related Work
confidence: 99%
“…Existing work. The idea of using neural network intensities for general point processes is not new; several authors [Kim et al., 2011; Du et al., 2015; Choi et al., 2016; Mei and Eisner, 2016; Du et al., 2016; Xiao et al., 2017] have recently proposed the use of recurrent neural networks in conjunction with point processes with random intensities. These more complex models require Bayesian inference and likelihood methods, which prevents scalability.…”
Section: Temporal Point Processes
confidence: 99%
“…Among the set of online learning methods, we consider four algorithms that have been shown effective in a number of diverse applications: the Echo State Networks (ESN) [72], which are a class of recurrent neural networks; the Online Echo State Gaussian Processes (OESGPs) [73], which combine ESN with sparse Gaussian Processes; the Locally Weighted Projection Regression (LWPR) [74], which exploits piecewise linear models to realise an incremental learning algorithm; and recursive ARX models (RARX) identified using the recursive least square method [75,76] (see Fig. 2).…”
Section: The Proposed Online Heterogeneous Ensemble
confidence: 99%
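The quoted passage above describes an ensemble of heterogeneous online learners (ESN, OESGP, LWPR, RARX). A minimal sketch of that idea follows, assuming each member exposes predict(u) returning a point prediction and update(u, y); the class name and the decayed inverse-error weighting are assumptions for illustration, not the cited paper's combination rule.

```python
import numpy as np

class OnlineHeterogeneousEnsemble:
    """Illustrative combination of several online learners via decayed inverse-error weights."""

    def __init__(self, learners, decay=0.9):
        self.learners = learners                 # members must expose predict(u) and update(u, y)
        self.errors = np.ones(len(learners))     # running (decayed) absolute errors per member
        self.decay = decay

    def predict(self, u):
        # Weight each member's prediction by the inverse of its recent error.
        preds = np.array([m.predict(u) for m in self.learners])
        weights = 1.0 / (self.errors + 1e-8)
        weights /= weights.sum()
        return float(weights @ preds), preds

    def update(self, u, y, preds):
        # Track each member's recent accuracy, then pass the new sample on for training.
        self.errors = self.decay * self.errors + (1 - self.decay) * np.abs(preds - y)
        for m in self.learners:
            m.update(u, y)
```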
“…In the ESN model, only the output weights (w_out) of the recurrent neural network are updated; the prediction is then obtained by tanh(w_out x(t)) [72]. In the OESGP model, the prediction is made through the Gaussian predictive distribution N(μ, σ²), where the mean μ and the variance σ² are estimated incrementally during training [73]. The RARX model updates the parameter estimates θ̂ at each iteration, while the prediction is calculated as ψᵀ(t)θ̂(t), where ψ represents the gradient of the predicted model output [76].…”
Section: The Proposed Online Heterogeneous Ensemble
confidence: 99%
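Read literally, the three predictors in the quoted passage reduce to the one-liners below; the argument names are placeholders for the reservoir state, the GP posterior moments, and the regressor/parameter vectors, assumed to be NumPy arrays of matching shape.

```python
import numpy as np

def esn_prediction(w_out, x_t):
    # ESN readout: only the output weights are trained; prediction is tanh(w_out · x(t)).
    return np.tanh(w_out @ x_t)

def oesgp_prediction(mu, sigma2):
    # OESGP returns a Gaussian predictive distribution N(mu, sigma^2):
    # the mean is the point prediction, the variance quantifies uncertainty.
    return mu, sigma2

def rarx_prediction(psi_t, theta_hat_t):
    # RARX prediction: psi(t)^T theta_hat(t), with theta_hat updated recursively
    # (e.g., by recursive least squares) at each step.
    return psi_t @ theta_hat_t
```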