Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence 2020
DOI: 10.24963/ijcai.2020/302
Location Prediction over Sparse User Mobility Traces Using RNNs: Flashback in Hidden States!

Abstract: Location prediction is a key problem in human mobility modeling, which predicts a user's next location based on historical user mobility traces. As a sequential prediction problem by nature, it has been recently studied using Recurrent Neural Networks (RNNs). Due to the sparsity of user mobility traces, existing techniques strive to improve RNNs by considering spatiotemporal contexts. The most adopted scheme is to incorporate spatiotemporal factors into the recurrent hidden state passing process of RNN…
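The abstract only sketches the mechanism at a high level: weight past hidden states by how close their check-ins are to the current one in time and space, and aggregate them into a context for prediction. Below is a minimal, illustrative sketch of that idea, assuming simple exponential decay in both time and distance; the names flashback_context and haversine_km, the parameters alpha and beta, and the exact weight form are assumptions for illustration, not the paper's actual formulation.

import numpy as np

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * np.arcsin(np.sqrt(a))

def flashback_context(hidden_states, times, coords, t_now, coord_now,
                      alpha=0.1, beta=0.05):
    """Aggregate historical RNN hidden states, weighting each one by how close
    its check-in is to the current one in time (hours) and space (kilometres).
    alpha and beta are illustrative decay rates, not values from the paper."""
    hidden_states = np.asarray(hidden_states)              # (n, hidden_dim)
    dt = np.abs(t_now - np.asarray(times))                  # hours since each past visit
    dd = np.array([haversine_km(lat, lon, *coord_now) for lat, lon in coords])
    w = np.exp(-alpha * dt) * np.exp(-beta * dd)             # decay in time and space
    w = w / (w.sum() + 1e-12)                                # normalise to a distribution
    return (w[:, None] * hidden_states).sum(axis=0)          # spatiotemporal context vector

# Example: three past visits with 8-dimensional hidden states (synthetic data).
rng = np.random.default_rng(0)
ctx = flashback_context(
    hidden_states=rng.normal(size=(3, 8)),
    times=[0.0, 20.0, 44.0], t_now=48.0,
    coords=[(52.52, 13.40), (52.50, 13.42), (48.85, 2.35)],
    coord_now=(52.51, 13.41))

In a full model, such a context vector would typically be combined with a user representation and fed to a classifier over candidate locations; the details of that step are not described in the excerpt above.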

Cited by 98 publications (56 citation statements). References 16 publications (1 reference statement).
“…ST-RNN proposed by Liu et al [21] and DeepMove [8] are two popular methods that apply attention mechanisms upon RNN to forecast human mobility. Following this trend, various methods [3,9,22,33] incorporating different context information have been proposed to predict human mobility. Since the introduction of the self-attention-based Transformer architecture [27], it has been applied and achieved great success in many fields such as computer vision [1,6], audio processing [31], and natural language processing [5,15].…”
Section: Related Work
Mentioning, confidence: 99%
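For readers unfamiliar with the pattern this excerpt refers to, the following is a minimal, self-contained sketch of attention applied over RNN hidden states for next-location prediction. It is not the ST-RNN or DeepMove architecture; the class name AttentiveNextLocation, the GRU and embedding dimensions, and the plain scaled dot-product attention are assumptions chosen only to illustrate the idea.

import torch
import torch.nn as nn

class AttentiveNextLocation(nn.Module):
    """Toy model: GRU over the visit sequence, dot-product attention over its
    hidden states, then a linear classifier over candidate locations."""
    def __init__(self, num_locations, emb_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(num_locations, emb_dim)    # location id -> vector
        self.rnn = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(2 * hidden_dim, num_locations)  # [context; current] -> logits

    def forward(self, visits):                     # visits: (batch, seq_len) location ids
        h, _ = self.rnn(self.embed(visits))        # (batch, seq_len, hidden_dim)
        query = h[:, -1:, :]                       # current hidden state queries the history
        attn = torch.softmax(query @ h.transpose(1, 2) / h.size(-1) ** 0.5, dim=-1)
        context = attn @ h                         # weighted sum of historical states
        return self.out(torch.cat([context, query], dim=-1).squeeze(1))

# Example: score the next location for a batch of two short check-in sequences.
model = AttentiveNextLocation(num_locations=1000)
logits = model(torch.randint(0, 1000, (2, 5)))     # -> shape (2, 1000)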
“…Tips are shorter than reviews and tend to convey quick suggestions. [Table fragment: photo id, business id, caption, label; contains photo data, including the caption and classification.] In another recent work, Yang et al [38] proposed a model called Flashback, in which they use a basic RNN. The model handles sparse user mobility data by focusing on rich spatio-temporal contexts and performing flashbacks on hidden states in RNNs.…”
Section: Business Id, Date
Mentioning, confidence: 99%
“…In the last few years, we have seen an unprecedented rise in the number of works leveraging deep learning for POI recommendation in all major venues (e.g., AAAI, IJCAI, SIGIR, CIKM, WWW). The use of different deep learning paradigms such as CNN [35,36], RNN [37,38,39,40], Long Short-Term Memory (LSTM) [41,42,43,44], Gated Recurrent Unit (GRU) [45,46,47], and self-attention [48,49] has greatly boosted the performance of POI recommendation models. On top of that, state-of-the-art techniques from Natural Language Processing (NLP) have also been employed for complex modeling of human mobility in POI recommendation.…”
Section: Introduction
Mentioning, confidence: 99%
“…Modelling the joint interactions between user mobility and social relationships can greatly enhance the performance of these downstream tasks [7]. Traditional approaches tailor hand-crafted features extracted from either user mobility data (e.g.…”
Section: Introduction
Mentioning, confidence: 99%