2020
DOI: 10.1109/lra.2020.3004785

Long-Short Term Spatiotemporal Tensor Prediction for Passenger Flow Profile

Cited by 26 publications (7 citation statements)
References 20 publications
“…Smart transport has been an essential chapter, yet many works focus on demand prediction [20,22], trajectory [23,26,50], etc. Congestion root-cause analysis should instead receive more attention, since it is safety-related.…”
Section: Congestion Causes Analysis (mentioning)
confidence: 99%
“…Thus, water use data for the study sites followed non-stationary stochastic patterns with non-uniform variance. Accordingly, for ARMA model development for water consumption [21,27], non-stationarity was removed via second-order differencing [28].…”
Section: Study Area and Water Use Data Compilation and Preprocessing (mentioning)
confidence: 99%
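The preprocessing step quoted above (removing non-stationarity via second-order differencing) can be illustrated with a short sketch. This is not the cited study's code: `water_use` and the helper name are hypothetical, and statsmodels' `adfuller` is used only as one common stationarity check.

```python
# Minimal sketch (not the cited study's code): remove non-stationarity from a
# water-consumption series via second-order differencing and verify with an
# augmented Dickey-Fuller test. `water_use` is a hypothetical pandas Series.
import pandas as pd
from statsmodels.tsa.stattools import adfuller

def difference_to_stationarity(water_use: pd.Series) -> pd.Series:
    """Apply second-order differencing and report an ADF stationarity check."""
    diffed = water_use.diff().diff().dropna()   # second-order differencing
    adf_stat, p_value = adfuller(diffed)[:2]    # small p-value suggests stationarity
    print(f"ADF statistic={adf_stat:.3f}, p-value={p_value:.4f}")
    return diffed
```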
“…The auto-regressive (AR) polynomial constitutes the autoregressive model at a predefined order p, describing the dependence of the variable (e.g., water consumption over a specified time period) on its values at previous time points. The moving average (MA) polynomial describes the linear dependence on the forecast errors resulting from the autoregressive model, up to a second predefined order q (Section S1, Supplementary Materials) [21,27]. The AR and MA polynomials were combined, accounting for both the linear relationship of the variable and the linear dependence of the forecast errors [29].…”
Section: Study Area and Water Use Data Compilation and Preprocessing (mentioning)
confidence: 99%
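Combining the AR(p) and MA(q) polynomials on the twice-differenced series corresponds to an ARIMA(p, 2, q) model. A minimal statsmodels sketch follows; the orders `p`, `q` and the forecast horizon are illustrative placeholders, not values taken from the cited work.

```python
# Minimal sketch, not the authors' implementation: an ARIMA(p, 2, q) model,
# i.e. AR(p) and MA(q) polynomials combined with second-order differencing
# handled internally by the `order` argument.
from statsmodels.tsa.arima.model import ARIMA

def fit_arma_on_differenced(water_use, p: int = 2, q: int = 1, horizon: int = 7):
    """Fit ARIMA(p, 2, q) to a consumption series and return point forecasts."""
    model = ARIMA(water_use, order=(p, 2, q))   # (AR order, differencing, MA order)
    result = model.fit()
    return result.forecast(steps=horizon)       # forecasts for the next `horizon` steps
```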
“…Learning an efficient representation of this large-scale data for further downstream tasks is necessary but challenging. In recent years, inspired by the great success of self-supervised representation learning in computer vision and natural language processing, many studies [13]-[15] have proposed various methods to learn latent representations of time series, and contrastive methods are currently the state-of-the-art among discriminative self-supervised learning approaches. However, the current approaches still have several significant shortcomings: firstly, most recent studies [13], [14] only learn instance-level representations, which are unsuitable for point-wise tasks, e.g., forecasting and anomaly detection.…”
Section: Introduction (mentioning)
confidence: 99%
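The instance-level contrastive objective the statement refers to can be sketched as an InfoNCE-style loss: each series is encoded into a single vector, which is why such representations are awkward for point-wise tasks like forecasting or anomaly detection. The function below is an illustrative PyTorch sketch, not the method of [13]-[15]; the embeddings `z1` and `z2` are assumed to come from two augmented views of the same batch of series.

```python
# Illustrative InfoNCE-style contrastive loss over per-series (instance-level)
# embeddings; not the implementation of the cited works [13]-[15].
import torch
import torch.nn.functional as F

def info_nce(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """z1, z2: (batch, dim) embeddings of two augmented views of each series."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature                        # pairwise cosine similarities
    targets = torch.arange(z1.size(0), device=logits.device)  # positives on the diagonal
    return F.cross_entropy(logits, targets)                   # one vector per series, no point-wise signal
```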