The 26th Chinese Control and Decision Conference (2014 CCDC)
DOI: 10.1109/ccdc.2014.6853120
Object localization and tracking based on multiple sensor fusion in intelligent home

Abstract: A novel scheme for object localization and tracking in a home environment is presented, based on the fusion of multiple sensors: two laser sensors and two cameras. The laser sensors and the cameras locate the object separately, and a multi-sensor probabilistic data association fusion algorithm is used to track the objects. First, object detection is performed by the laser sensors and the vision sensors separately. Second, the laser data is fused by an Extended Kalman Filter. To obtain th…
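The abstract states that the laser data is fused by an Extended Kalman Filter. As a minimal sketch of that idea (not the authors' implementation), one predict/update cycle for a planar constant-velocity target observed through position-only laser measurements could look like the following; the state layout, noise magnitudes, and function name are assumptions for illustration, and since the motion and measurement models here are linear the EKF reduces to a standard Kalman filter step:

```python
import numpy as np

def kf_fuse_step(x, P, z, dt=0.1, q=0.01, r=0.05):
    """One predict/update cycle of a Kalman filter tracking planar
    position and velocity; state x = [px, py, vx, vy], measurement
    z = [px, py] from a laser range sensor (assumed model)."""
    # Constant-velocity motion model
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    # Laser observes position only
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    Q = q * np.eye(4)   # process noise (assumed)
    R = r * np.eye(2)   # measurement noise (assumed)

    # Predict
    x = F @ x
    P = F @ P @ F.T + Q

    # Update with the laser measurement
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P
```

Fusing two laser sensors, as in the paper, would amount to running the update step once per sensor (or stacking both measurements into a single z), each with its own measurement noise R.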

Cited by 3 publications (3 citation statements)
References 9 publications
“…Several techniques, such as range-based, range-free, and AI-based, can be used for object identification and localization to perform tasks in a certain environment efficiently. If multiple objects exist in a particular environment, such sensors are essential for identifying and localizing multiple objects simultaneously [1], [2]. These sensors include radiofrequency identification (RFID), laser range finder (LRF), and ultrawideband (UWB) sensors.…”
Section: Introduction
confidence: 99%
“…22 There are some reported works on motion estimation with binocular divergent systems, 23 trinocular divergence for visual odometry for robots, 24 and divergent visual simultaneous localization and mapping (SLAM) 25 in mobile robots. In contrast to active sensing modalities for localization, 26,27 object detection, 28 and SLAM 29 with parallel multiple views, 30 this work estimates the posture of a 4WD WMR by exploiting sensor fusion feedback using a radial trinocular sensor. Numerous visual odometry algorithms have been reported, 31 using stereo cameras, 32 matching multi-frame features, 33 local structural invariants, 34 three-dimensional (3D) point clouds, 35 and outdoor approaches such as urban 36 and train 37 environments.…”
Section: Introduction
confidence: 99%
“…With the fusion of multiple sensors, i.e. two laser sensors and camera sensors, a novel scheme for object localization and tracking was presented by researchers [2]. A hybrid RFID and computer vision system for fine-grained localization and tracking of tagged objects, called TagVision, along with a fusion algorithm, was suggested to organically combine the position information given by the CV subsystem and the phase data output by the RFID subsystem [3]. The problem of tracking dynamic objects with UHF RFID tags using a Bayesian framework with a mobile robot was addressed.…”
Section: I
confidence: 99%