2023
DOI: 10.3390/app13053296
Non-Autoregressive Sparse Transformer Networks for Pedestrian Trajectory Prediction

Abstract: Pedestrian trajectory prediction is an important task in practical applications such as automated driving and surveillance systems. Effectively modeling social interactions among pedestrians and capturing temporal dependencies is challenging. Previous methods typically emphasized social interactions among pedestrians but ignored the temporal consistency of predictions, and suffered from superfluous interactions introduced by dense undirected graphs, resulting in considerable deviation from reality. In addition, autoregr…

Cited by 6 publications (5 citation statements) · References 38 publications
“…Apart from methods that use the original Transformer architecture, this chapter will also introduce methods that use an attention mechanism to predict trajectories. The Transformer architecture can be used without any changes to predict trajectories based on observed positions [17,58,59,[74][75][76].…”
Section: Transformer- and Attention-based Methods
confidence: 99%
“…Liu et al [59] used the embeddings of the last observed position as a query for the Transformer decoder part of the method, and Chen et al [74] supported the training of their Transformer structure with a multitask learning scheme that trained the model on intention recognition and trajectory prediction.…”
Section: Transformer- and Attention-based Methods
confidence: 99%
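The decoder-query idea attributed to Liu et al. above can be illustrated with a minimal numpy sketch. This is an assumed toy setup (the linear embedding `W_embed` and dimensions are hypothetical, not the paper's actual architecture): the last observed position is embedded and used as the single query in standard scaled dot-product attention over the encoded observation sequence.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Standard attention: softmax(q k^T / sqrt(d)) v."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Hypothetical setup: 8 observed 2-D positions, embedding dim 16.
rng = np.random.default_rng(0)
obs = rng.normal(size=(8, 2))        # observed trajectory (T_obs, 2)
W_embed = rng.normal(size=(2, 16))   # toy linear embedding (assumed)

memory = obs @ W_embed               # stands in for the encoder output
query = obs[-1:] @ W_embed           # last observed position as the decoder query

context = scaled_dot_product_attention(query, memory, memory)
print(context.shape)                 # (1, 16)
```

A real implementation would use learned multi-head attention and stacked decoder layers; the sketch only shows where the last-position embedding enters the computation.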
“…A notable exemplar is EfficientDet, introduced in 2019, reflecting a trend towards optimizing the efficiency of object detection models [17]. Sparse attention mechanisms have emerged as a solution to handle large-scale images effectively, enabling models to selectively attend to pertinent image regions, thereby reducing computation and improving overall efficiency [18]. The exploration of hybrid models, combining diverse architectures from two-stage and one-stage detectors, along with the integration of ensemble methods, has shown promise in enhancing the overall performance of object detection systems [19].…”
Section: Literature Review
confidence: 99%
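One common way to realize the sparse attention mentioned above is top-k attention, where each query keeps only its k highest-scoring keys and masks out the rest. The following numpy sketch is a generic illustration of that idea (not the specific mechanism of the cited works):

```python
import numpy as np

def topk_sparse_attention(q, k, v, topk=2):
    """Each query attends only to its top-k highest-scoring keys;
    all other attention logits are masked to -inf before softmax."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                     # (n_q, n_k)
    kth = np.sort(scores, axis=-1)[:, -topk][:, None]  # k-th largest per row
    masked = np.where(scores >= kth, scores, -np.inf)
    w = np.exp(masked - masked.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v, w

rng = np.random.default_rng(1)
q = rng.normal(size=(4, 8))
k = rng.normal(size=(6, 8))
v = rng.normal(size=(6, 8))
out, w = topk_sparse_attention(q, k, v, topk=2)
print((w > 0).sum(axis=-1))  # each query keeps exactly 2 nonzero weights
```

Because most weights are exactly zero, the weighted sum touches only a fraction of the values, which is the source of the computational savings the quote refers to.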
“…Many other works have also adopted this method [14,17,18,20,33], including the improved circular occupancy map [16] and angular grid [33]. Another non-grid method [24,26] embeds the relative position of other pedestrians with respect to the target pedestrian, similar to transformer position encoding [34,35], and then concatenates the relative position embeddings with the LSTM hidden states. The Multi-Layer Perceptron (MLP) and pooling layers then process the resulting vector to obtain the social vector.…”
Section: Social Interactions
confidence: 99%
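The social-vector pipeline described above (embed relative positions, concatenate with LSTM hidden states, then apply an MLP and pooling) can be sketched as follows. All weights and dimensions here are toy assumptions for illustration, not the cited models' actual parameters, and the LSTM hidden states are stand-in random vectors:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0)

rng = np.random.default_rng(2)
n_neighbors, h_dim, e_dim, s_dim = 5, 16, 8, 32

target_pos = rng.normal(size=(2,))
neighbor_pos = rng.normal(size=(n_neighbors, 2))
hidden = rng.normal(size=(n_neighbors, h_dim))  # stand-in LSTM hidden states

W_e = rng.normal(size=(2, e_dim))               # relative-position embedding (assumed linear)
W_m = rng.normal(size=(e_dim + h_dim, s_dim))   # one-layer MLP (assumed)

rel = neighbor_pos - target_pos                 # positions relative to the target pedestrian
emb = relu(rel @ W_e)                           # relative-position embeddings
feats = relu(np.concatenate([emb, hidden], axis=-1) @ W_m)  # per-neighbor features
social_vec = feats.max(axis=0)                  # max-pool over neighbors -> social vector
print(social_vec.shape)                         # (32,)
```

Max-pooling over the neighbor axis makes the social vector invariant to neighbor ordering and to the number of neighbors, which is why pooling is a standard choice in this family of methods.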