2020 International Conference on Computer Engineering and Intelligent Control (ICCEIC)
DOI: 10.1109/icceic51584.2020.00029
Object Tracking in Video Sequence based on Kalman filter

Cited by 5 publications (2 citation statements). References 18 publications.
“…In the literature, outside the field of transport, there are studies focusing on people detection and counting that employ similar approaches based on overhead video cameras. Some authors have used Kalman filtering for tracking purposes [45] and to improve a YOLO-based object detection and tracking algorithm [46]. Others have used only object detection algorithms for counting [47, 48].…”
Section: Introduction (mentioning, confidence: 99%)
“…Object tracking has always been a focus of research in the field of computer vision. As researchers have studied visual tracking algorithms in depth, the scientific and theoretical basis of these algorithms has continued to improve, which has greatly promoted the development of surveillance systems, perceptual user interfaces, intelligent robotics, vehicle navigation, and intelligent transportation systems [1][2][3]. Two important factors to consider are the performance of embedded hardware, algorithm selection, and algorithm-based improvement.…”
Section: Introduction (mentioning, confidence: 99%)
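
The cited paper, and the statements above, concern Kalman filtering applied to object tracking in video, typically by predicting an object's position each frame and correcting the prediction with a new detection. The sketch below illustrates that idea with a minimal constant-velocity Kalman filter over 2-D detection centroids; the state layout, class name, noise values, and fake detections are assumptions for illustration, not the implementation described in the paper.

# Minimal constant-velocity Kalman filter for tracking a 2-D image position.
# Illustrative sketch only; state layout and noise parameters are assumed.
import numpy as np

class KalmanTracker2D:
    def __init__(self, dt=1.0, process_var=1e-2, meas_var=1.0):
        # State: [x, y, vx, vy]; measurement: [x, y] (e.g. a detection centroid).
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)   # state transition
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)    # measurement model
        self.Q = process_var * np.eye(4)                   # process noise
        self.R = meas_var * np.eye(2)                      # measurement noise
        self.x = np.zeros(4)                               # state estimate
        self.P = np.eye(4)                                 # state covariance

    def predict(self):
        # Propagate state and covariance one frame ahead.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]                                  # predicted position

    def update(self, z):
        # Correct the prediction with a new detection z = [x, y].
        y = np.asarray(z, dtype=float) - self.H @ self.x   # innovation
        S = self.H @ self.P @ self.H.T + self.R            # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                                  # corrected position

# Usage: predict once per frame, update whenever the detector returns a centroid.
tracker = KalmanTracker2D(dt=1.0)
for centroid in [(100, 120), (103, 124), (107, 129)]:      # hypothetical detections
    tracker.predict()
    print(tracker.update(centroid))

In a detector-plus-tracker pipeline such as the YOLO-based one mentioned above [46], the predict step fills in frames where the detector misses the object, and the update step pulls the track back toward each new detection.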