Recurrent Attentive Neural Process for Sequential Data
2019 · Preprint · DOI: 10.48550/arxiv.1910.09323

Abstract: Neural processes (NPs) learn stochastic processes and predict the distribution of target outputs adaptively, conditioned on a context set of observed input-output pairs. Furthermore, the Attentive Neural Process (ANP) improved the prediction accuracy of NPs by incorporating an attention mechanism among contexts and targets. In a number of real-world applications such as robotics, finance, speech, and biology, it is critical to learn the temporal order and recurrent structure from sequential data. However, the capabilit…
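The abstract sketches the architecture at a high level: in an ANP, target inputs cross-attend over encodings of the observed context pairs to produce a predictive distribution. As a rough, non-authoritative illustration of that idea (not the authors' implementation), the following minimal PyTorch sketch builds only the deterministic attention path; the module names, layer sizes, and the mean/scale parameterisation are all assumptions.

import torch
import torch.nn as nn

class ANPDeterministicPath(nn.Module):
    """Illustrative ANP-style deterministic path: targets attend over contexts."""
    def __init__(self, x_dim=1, y_dim=1, hidden=64, n_heads=4):
        super().__init__()
        # Encode each (x, y) context pair into a representation vector.
        self.pair_encoder = nn.Sequential(
            nn.Linear(x_dim + y_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden),
        )
        self.query_proj = nn.Linear(x_dim, hidden)  # target inputs -> queries
        self.key_proj = nn.Linear(x_dim, hidden)    # context inputs -> keys
        self.cross_attn = nn.MultiheadAttention(hidden, n_heads, batch_first=True)
        self.decoder = nn.Sequential(
            nn.Linear(hidden + x_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * y_dim),  # predictive mean and raw scale
        )

    def forward(self, x_ctx, y_ctx, x_tgt):
        r = self.pair_encoder(torch.cat([x_ctx, y_ctx], dim=-1))  # (B, Nc, H)
        q, k = self.query_proj(x_tgt), self.key_proj(x_ctx)
        r_tgt, _ = self.cross_attn(q, k, r)  # target-specific context summary
        out = self.decoder(torch.cat([r_tgt, x_tgt], dim=-1))
        mean, raw_scale = out.chunk(2, dim=-1)
        std = 0.1 + 0.9 * torch.nn.functional.softplus(raw_scale)  # bounded-below scale
        return mean, std

# Usage: predict 5 target points from 10 observed context points.
model = ANPDeterministicPath()
x_c, y_c, x_t = torch.randn(2, 10, 1), torch.randn(2, 10, 1), torch.randn(2, 5, 1)
mean, std = model(x_c, y_c, x_t)  # each of shape (2, 5, 1)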

Cited by 7 publications (8 citation statements) · References 15 publications

Citation statements (ordered by relevance):
“…Future work will explore improving the calibration of the downscaling models using mixture distributions and normalising flows (Rezende and Mohamed, 2015). A further possibility for extending the convCNP model would be to explicitly incorporate time by building recurrence into the model (Qin et al, 2019; Singh et al, 2019).…”
Section: Discussion (mentioning) · Confidence: 99%
“…To define stochastic processes, we indeed need to ensure invariance to input permutations, i.e., the exchangeability condition [19]. However, it is reported to be practically beneficial to relax this assumption when the observations contain time sequences [20]. Specifically, the recurrent attentive neural process (RANP) [20] incorporates a recurrent neural network structure to process the observations, and shows improved performance on vehicle trajectory prediction over ANP. We point out that in RANP, exchangeability is only relaxed on the observations, while the temporal structure of the test-time inputs and outputs is not considered.…”
Section: B. Functional Distribution With Efficient Inference (mentioning) · Confidence: 99%
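The statement above turns on a concrete design point: an NP encoder that mean-pools context representations is permutation invariant, while running a recurrent network over time-ordered contexts (as the quoted description of RANP suggests) deliberately breaks that invariance. A hedged sketch of that contrast, with all names and sizes assumed rather than taken from the paper:

import torch
import torch.nn as nn

class RecurrentContextEncoder(nn.Module):
    """GRU over time-ordered (x, y) context pairs; output depends on order."""
    def __init__(self, x_dim=1, y_dim=1, hidden=64):
        super().__init__()
        self.gru = nn.GRU(x_dim + y_dim, hidden, batch_first=True)

    def forward(self, x_ctx, y_ctx):
        # Contexts are assumed sorted by time; permuting them changes the
        # output, which is the exchangeability relaxation described above.
        # (A mean-pooled encoder would instead be permutation invariant.)
        states, _ = self.gru(torch.cat([x_ctx, y_ctx], dim=-1))
        return states  # (B, Nc, H): usable as attention keys/values

enc = RecurrentContextEncoder()
x_c, y_c = torch.randn(2, 10, 1), torch.randn(2, 10, 1)
perm = torch.randperm(10)
same = torch.allclose(enc(x_c, y_c), enc(x_c[:, perm], y_c[:, perm]))
print(same)  # almost surely False: the encoder is order-sensitive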
“…Many methods have been proposed, such as transfer learning, domain adaptation, multitask learning [52,33,20,18,25], and meta-learning [48,5,43,2,54,49,4,9,42,21,55,11], to improve performance using data from related but different domains for regression, classification, and time-series prediction [16,44,29,41,51,1,39,40,17]. However, these methods are inapplicable to Koopman spectral analysis.…”
Section: Related Work (mentioning) · Confidence: 99%