2020
DOI: 10.1007/978-3-030-63823-8_5
An Attention-Based Interaction-Aware Spatio-Temporal Graph Neural Network for Trajectory Prediction

Cited by 3 publications (3 citation statements). References 17 publications.
“…3.5.2 User Interest Representation Discrimination. Our developed contrastive paradigm not only captures the commonality among multiple behaviors but also enhances the diversity between individual nodes to mitigate oversmoothing [23,54]. During backpropagation, the gradient of the loss function L_cl with respect to the parameters of RCL is computed and used to update the parameters, causing negative samples to move away from the anchor node e_u in the parameter space.…”

Section: In-depth Discussion on RCL (mentioning)
confidence: 99%
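The passage above describes how backpropagating a contrastive loss pushes negative samples away from the anchor embedding e_u. A minimal numerical sketch of that effect, using an InfoNCE-style loss and randomly generated embeddings (an illustrative assumption, not the cited paper's actual RCL formulation):

```python
import numpy as np

def cosine(a, b):
    # cosine similarity between two embedding vectors
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def contrastive_loss(anchor, positive, negatives, tau=0.5):
    # InfoNCE-style loss: pull the positive toward the anchor,
    # push negatives away (a common choice; the paper's exact L_cl may differ)
    pos = np.exp(cosine(anchor, positive) / tau)
    neg = sum(np.exp(cosine(anchor, n) / tau) for n in negatives)
    return float(-np.log(pos / (pos + neg)))

def numerical_grad(f, x, eps=1e-6):
    # central-difference gradient of scalar function f at x
    g = np.zeros_like(x)
    for i in range(x.size):
        d = np.zeros_like(x)
        d[i] = eps
        g[i] = (f(x + d) - f(x - d)) / (2 * eps)
    return g

rng = np.random.default_rng(0)
anchor = rng.normal(size=8)                  # anchor node embedding e_u
positive = anchor + 0.1 * rng.normal(size=8)  # a correlated positive view
negative = rng.normal(size=8)                 # an unrelated negative sample

loss_of_neg = lambda n: contrastive_loss(anchor, positive, [n])
sim_before = cosine(anchor, negative)

# one gradient-descent step on the negative's embedding
negative_updated = negative - 0.5 * numerical_grad(loss_of_neg, negative)
sim_after = cosine(anchor, negative_updated)
# sim_after < sim_before: the step moves the negative away from the anchor
```

The update direction follows directly from the loss: the negative's term only increases the loss through its similarity to the anchor, so descending the gradient necessarily reduces that similarity, which is the "moving away in the parameter space" behavior the excerpt describes.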
“…Chen, Tian & Wu (2020) combined information gain with a long short-term memory network, and the improved model's prediction accuracy reached 0.967. Today, LSTM is also widely used for air quality prediction; such studies can be found in literature surveys (Liu, Zhang & Qi, 2022; Bai & Shen, 2019; Zhou et al., 2020), with accuracy of up to 0.95. Yu (2020) uses one-dimensional convolutional kernels to extract features and combines an LSTM model with a genetic algorithm to construct the model; the MAE is as low as 0.961.…”

Section: Introduction (mentioning)
confidence: 99%
“…Overall, our benchmarking framework and experiments provide a rigorous setup for GNN research. Aspects of the benchmark have led to facilitating several interesting studies for GNNs such as on (i) the aggregation functions and filters [106,120,121], (ii) improving expressive power of GNNs [122][123][124], (iii) pooling mechanisms [125], (iv) graph-specific normalization and regularization [126][127][128], and (v) GNNs' robustness and efficiency [129,130] among other ideas contributed in the literature.…”

Section: Discussion (mentioning)
confidence: 99%