2017
DOI: 10.48550/arxiv.1703.04706
Preprint
Tree Memory Networks for Modelling Long-term Temporal Dependencies

Cited by 7 publications (13 citation statements)
References 0 publications
“…The main disadvantage of these models is the need to hand-craft rules and features, limiting their ability to learn efficiently beyond the abstraction level of the domain experts. Modern socially-aware trajectory prediction works usually use recurrent neural networks [1,14,6,5,4,11]. Hug et al [10] present an experiment-based study of the effectiveness of some RNN models in the context of socially aware trajectory prediction.…”
Section: Related Work
confidence: 99%
“…In parallel to path prediction using scene context information, several approaches have recently been proposed to model interactions between all agents in the scene in order to predict the future trajectory of each targeted agent [5,6]. Although these methods have shown promising progress on this challenging problem, they still ignore scene context as crucial information.…”
Section: Introduction
confidence: 99%
“…As we are dealing with longer sequences, up to a duration of 5 minutes, the modelling ability of the LSTM is limited: LSTMs struggle to capture long-term dependencies when sequences are very long [3,10]. To address this limitation, we use memory networks [3,10] together with LSTM models to improve the ability to capture long-term relationships.…”
Section: A1
confidence: 99%
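The pairing described in that statement, an external memory consulted alongside an LSTM, can be sketched as a content-based read over a fixed memory bank, with the LSTM hidden state acting as the query. This is a minimal illustrative sketch only: the memory layout, dimensions, and the `memory_read` helper are assumptions for exposition, not the implementation from the cited papers.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def memory_read(h, memory):
    """Content-based read: score each memory slot against the query
    state h, then return the attention-weighted sum of slots."""
    scores = memory @ h          # one similarity score per slot
    weights = softmax(scores)    # normalise scores to attention weights
    return weights @ memory      # weighted combination of memory slots

rng = np.random.default_rng(0)
memory = rng.standard_normal((8, 16))  # 8 memory slots, 16-dim each (assumed sizes)
h = rng.standard_normal(16)            # stand-in for an LSTM hidden state

r = memory_read(h, memory)             # read vector retrieved from memory
augmented = np.concatenate([h, r])     # [h; r] would feed the output layer
```

The design point the statement makes is visible here: the read vector `r` can surface information stored many steps earlier, independent of how far the LSTM state `h` has drifted within the current sequence.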
“…However, none of the existing works in either action anticipation or future action sequence prediction has investigated how best to capture long-term dependencies. We speculate that LSTMs fail to capture such dependencies as they model only the relationships within a given sequence [3]. To address this, we demonstrate a method to effectively capture these relationships using neural memory networks.…”
Section: Related Work
confidence: 99%