2020
DOI: 10.3390/s20030688
Laser Ranging-Assisted Binocular Visual Sensor Tracking System

Abstract: To improve the low measurement accuracy of a binocular vision sensor along the optical axis during target tracking, this paper proposes an auxiliary correction method using a laser-ranging sensor. During system measurement, the laser-ranging measurement lags because of the limited mechanical performance of the two-dimensional turntable. This paper resolves the time delay by updating the lagged information directly. Moreover, in order to give full play to…
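The abstract describes correcting stereo depth with a laser range that arrives late because of the turntable's mechanics. A minimal sketch of one way to use lagged information directly, assuming a buffer of recent stereo estimates and a running offset (the class name, buffer scheme, and update rule are illustrative assumptions, not the paper's actual algorithm):

```python
from collections import deque


class LagCompensatedDepth:
    """Fuse lagged laser ranges with stereo depth estimates.

    Hypothetical sketch: the lagged laser reading is matched against the
    buffered stereo estimate nearest its own timestamp, and the resulting
    offset corrects subsequent depth estimates along the optical axis.
    """

    def __init__(self, history=30):
        self.buffer = deque(maxlen=history)  # (timestamp, stereo_depth)
        self.offset = 0.0                    # running laser-vs-stereo correction

    def push_stereo(self, t, depth):
        """Record a stereo depth sample and return the corrected value."""
        self.buffer.append((t, depth))
        return depth + self.offset

    def push_laser(self, t_meas, laser_range):
        """Apply a delayed laser range stamped with its measurement time."""
        if not self.buffer:
            return
        # Find the stereo sample closest to the lagged laser timestamp.
        _, d_old = min(self.buffer, key=lambda s: abs(s[0] - t_meas))
        self.offset = laser_range - d_old
```

Because the correction is anchored at the laser's own timestamp rather than arrival time, the turntable-induced delay does not bias the current estimate.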

Cited by 9 publications (7 citation statements); References 21 publications.
“…In the process of selecting color features, a discriminant color descriptor (DD) is used to obtain stronger model tracking performance. After fusing HOG and DD features, a discriminant color feature HOG correlation filter (DHCF) is proposed [16]. DD can freely select the required dimensions.…”
Section: Kernel Correlation Filter Tracking Algorithm
confidence: 99%
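The excerpt above fuses HOG and discriminant-color (DD) feature maps before correlation filtering. A minimal sketch, assuming fusion by channel-wise concatenation (the shapes and the concatenation choice are assumptions; the cited DHCF work may weight or select channels differently):

```python
import numpy as np


def fuse_features(hog_map, dd_map):
    """Concatenate HOG and DD feature maps along the channel axis.

    Both maps must share the same spatial size (H, W); the fused map has
    the combined channel count and is fed to the correlation filter.
    """
    assert hog_map.shape[:2] == dd_map.shape[:2], "spatial sizes must match"
    return np.concatenate([hog_map, dd_map], axis=2)
```

Since DD "can freely select the required dimensions", the number of DD channels is a tunable parameter of the fused representation.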
“…As shown in the principle of the four-point calibration method, the reading of the fixed laser-ranging sensor [13] is L. The TCP coordinate value of the point is calibrated as ( )…”
Section: 3D Online Measurement Algorithm for Workpiece
confidence: 99%
“…In engineering applications, sensor information fusion refers to the integration and processing of sensor information from different sources and modes according to certain algorithms and strategies, in order to describe the sensed objects accurately and reasonably [72]. Information fusion can enhance the availability of data and reduce ambiguity; additionally, it can increase the coverage of time and space [73]. Therefore, based on a binocular camera, CMRRs can be equipped with auxiliary sensors such as lidar sensors, infrared sensors or inertial measurement units [74], [75], [76], [77], [78]. The multidimensional information can then be integrated, which can effectively improve the robustness, adaptability, and fault tolerance of the robot's perception system, more accurately obtain the key information of the accident site, and provide a good foundation for the path planning and autonomous decision-making of the robot.…”
Section: Multisensor Information Fusion
confidence: 99%
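The fusion idea in the excerpt above can be sketched with the textbook inverse-variance rule for combining two range measurements of the same target, such as a stereo-camera depth and a lidar range (an illustrative example, not the cited robots' actual fusion strategy; the variances are assumed known):

```python
def fuse_ranges(z_cam, var_cam, z_lidar, var_lidar):
    """Inverse-variance fusion of two independent range measurements.

    The noisier sensor gets the smaller weight, and the fused variance is
    always below either input variance, reflecting the gain from fusion.
    """
    w_cam = 1.0 / var_cam
    w_lidar = 1.0 / var_lidar
    z_fused = (w_cam * z_cam + w_lidar * z_lidar) / (w_cam + w_lidar)
    var_fused = 1.0 / (w_cam + w_lidar)
    return z_fused, var_fused


# Example: a stereo depth of 10.0 m (variance 4.0) fused with a lidar
# range of 10.8 m (variance 1.0) lands closer to the more precise lidar.
z, v = fuse_ranges(10.0, 4.0, 10.8, 1.0)
```

This reduced-uncertainty estimate is what makes fused perception more robust than the binocular camera alone.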