2017 IEEE International Conference on Robotics and Automation (ICRA)
DOI: 10.1109/icra.2017.7989460

Tracking objects with point clouds from vision and touch

Abstract: We present an object-tracking framework that fuses point cloud information from an RGB-D camera with tactile information from a GelSight contact sensor. GelSight can be treated as a source of dense local geometric information, which we incorporate directly into a conventional point-cloud-based articulated object tracker based on signed-distance functions. Our implementation runs at 12 Hz using an online depth reconstruction algorithm for GelSight and a modified second-order update for the tracking algo…
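The abstract describes fitting a pose so that observed points fall on the zero level set of the object's signed-distance function. As a rough illustration only (not the paper's implementation, which handles articulated bodies and full 6-DOF poses), the sketch below estimates a pure translation against an analytic sphere SDF with Gauss-Newton; all function names are hypothetical.

```python
import numpy as np

def sphere_sdf(points, radius=1.0):
    """Signed distance from each 3D point to a sphere of given radius at the origin."""
    return np.linalg.norm(points, axis=1) - radius

def sdf_gradient(points, eps=1e-5):
    """Numerical gradient of the sphere SDF at each point (central differences)."""
    grads = np.zeros_like(points)
    for i in range(3):
        d = np.zeros(3)
        d[i] = eps
        grads[:, i] = (sphere_sdf(points + d) - sphere_sdf(points - d)) / (2 * eps)
    return grads

def track_translation(points, iters=20):
    """Estimate a translation t minimizing sum_i sdf(p_i - t)^2 via Gauss-Newton.

    Residual r_i(t) = sdf(p_i - t); its Jacobian w.r.t. t is -grad sdf(p_i - t).
    """
    t = np.zeros(3)
    for _ in range(iters):
        shifted = points - t
        r = sphere_sdf(shifted)      # per-point signed distances (residuals)
        J = -sdf_gradient(shifted)   # Jacobian of residuals w.r.t. t
        # Gauss-Newton step with a small damping term for numerical safety
        t -= np.linalg.solve(J.T @ J + 1e-9 * np.eye(3), J.T @ r)
    return t
```

For points sampled on a unit sphere displaced by some offset, the recovered translation converges to that offset; the paper's second-order update plays an analogous role over the full articulated state.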

Cited by 81 publications (53 citation statements)
References 18 publications
“…Li et al [3] used GelSight's localization capabilities to insert a USB connector, where the sensor used the texture of the characteristic USB logo to guide the insertion. Izatt et al [17] explored the use of the 3D point cloud measured by a GelSight sensor in a state estimation filter to find the pose of a grasped object in a peg-in-hole task. Dong et al [4] used the GelSight sensor to detect slip from variations in the 2D texture of the contact surface in a robot picking task.…”
Section: B. GelSight Sensors
confidence: 99%
“…In [22], tactile contacts are localized in a visual map by matching the tactile features with visual features. Vision and touch data are combined to reconstruct a point cloud representation and there is no learning of the key features of the two modalities in [23]. Deep neural networks have also been used to extract adjectives/features from both vision and tactile data [24], [25].…”
Section: B. Visual and Tactile Sensing
confidence: 99%
“…As reviewed in [18], there is a wealth of literature concerning the use of tactile sensors in robotic manipulation. Tactile sensors have already proved effective at detecting contact slip between the gripper and grasped objects [19,20,21], estimating contact forces [22], and localizing objects [23,24]. In [25], a grasp quality predictor is constructed using self-supervised learning to predict the probability of success given tactile information.…”
Section: Related Work
confidence: 99%