2009 IEEE/RSJ International Conference on Intelligent Robots and Systems
DOI: 10.1109/iros.2009.5354648
Object identification with tactile sensors using bag-of-features

Abstract: In this paper, we present a novel approach to identifying objects using touch sensors installed in the fingertips of a manipulation robot. Our approach operates on low-resolution intensity images obtained when the robot grasps an object. We apply a bag-of-features approach to object identification. By means of unsupervised clustering on training data, our approach learns a vocabulary from tactile observations, which is used to generate a histogram codebook. The histogram codebook models distri…
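The pipeline outlined in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes flattened low-resolution tactile patches as feature vectors and uses plain k-means for the unsupervised clustering; all function names and data here are hypothetical.

```python
import numpy as np

def kmeans(features, k, iters=20, seed=0):
    """Plain k-means: cluster tactile feature vectors into a vocabulary."""
    rng = np.random.default_rng(seed)
    centers = features[rng.choice(len(features), k, replace=False)]
    for _ in range(iters):
        # Assign each feature vector to its nearest vocabulary word.
        d = np.linalg.norm(features[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = features[labels == j].mean(axis=0)
    return centers

def grasp_histogram(features, centers):
    """Histogram-codebook descriptor: normalized word frequencies of one grasp."""
    d = np.linalg.norm(features[:, None] - centers[None], axis=2)
    labels = d.argmin(axis=1)
    hist = np.bincount(labels, minlength=len(centers)).astype(float)
    return hist / hist.sum()

# Toy data standing in for tactile patches collected over training grasps.
rng = np.random.default_rng(1)
train = rng.random((200, 16))                     # 200 patches, 4x4 each, flattened
vocab = kmeans(train, k=8)                        # learned tactile vocabulary
h = grasp_histogram(rng.random((30, 16)), vocab)  # descriptor of a new grasp
```

An object would then be identified by comparing its grasp histogram against the stored histograms of known objects, e.g. with a nearest-neighbor classifier over a histogram distance.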

Cited by 187 publications (163 citation statements); references 19 publications (19 reference statements).
“…The recent work by Schneider et al [37] is most closely related to ours, since it also applies bag-of-features to data from tactile force sensors. The work presented in this paper goes farther than Schneider et al in several important ways.…”
Section: Related Work
confidence: 86%
“…It could be a possible solution to implement planned grasps on actual robotic platforms. For holding two objects, the closest point of contact used to compute the velocity of a desired point is valid only if its normal is opposite to the direction from the desired point to this point: n_i · (p_i^c − p_j) < 0 as a condition to Eq. (18). Video of the experiments:…”
Section: Discussion
confidence: 99%
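The contact condition quoted above can be checked numerically. This is a minimal sketch with hypothetical 3-D vectors, where n_i is the contact normal, p_i^c the closest contact point, and p_j the desired point; the function name is illustrative, not from the cited work.

```python
import numpy as np

def contact_is_valid(n_i, p_ic, p_j):
    """Condition n_i · (p_i^c - p_j) < 0: the contact normal must oppose
    the direction from the desired point p_j to the contact point p_i^c."""
    return float(np.dot(n_i, p_ic - p_j)) < 0.0

# Hypothetical example: normal pointing up, contact below the desired point.
n_i = np.array([0.0, 0.0, 1.0])
p_ic = np.array([0.0, 0.0, 0.0])
p_j = np.array([0.0, 0.0, 0.5])
print(contact_is_valid(n_i, p_ic, p_j))  # True: dot product is -0.5 < 0
```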
“…Another area of research focuses on using robotic hands and fingers or grippers, together with tactile or force sensors, to model an object's shape. One approach is to grasp objects sequentially at different locations and reconstruct a "bag-of-features" representation [9]; this makes it possible to maintain a sense of continuity and to represent the object globally from local features, without requiring precise localization during exploration. A systematic approach is also used for reconstructing 3-D point-cloud models of objects with a 3-fingered hand and tactile sensors [3]; however, this method is very slow because of the systematic probing (the fingers are opened and closed a hundred times for an object smaller than 9 cm), and it is restricted to small objects that fit between the robot's fingertips.…”
Section: Tactile Exploration
confidence: 99%
“…This transformation can enable distinct robots and humans to share haptic data and data-driven object-centric models. Schneider et al (2009) have presented methods for object identification with bag-of-features models using haptic data in the form of readings from tactile sensor arrays on the robot's parallel-jaw gripper, together with the width and height at which the robot grasps the object. Although Schneider et al (2009) presented results from data collected by a single robot, these models are data-driven and object-centric, and different robots with similar sensing capabilities may be able to share these haptic data.…”
Section: Data-driven and Robot-centric Models
confidence: 99%