2015 IEEE International Conference on Computational Intelligence and Virtual Environments for Measurement Systems and Applications (CIVEMSA)
DOI: 10.1109/civemsa.2015.7158615
Data-driven analysis of kinaesthetic and tactile information for shape classification

Cited by 8 publications (5 citation statements) · References 8 publications
“…These interactions can be static, usually generating tactile images [3, 4], or dynamic, generating signals that vary over time [5], [6], [7], [8]. Our dataset focuses on dynamic tactile data for texture characterization but uses a set of commonly used tactile textures instead of synthetic gratings [9] or macroscopic profiles [10]. This dataset is also valuable because it aims to prevent a halt in research on the tactile perception of textures, given that previous work, such as [11, 12], has not maintained the availability of its data.…”
Section: Value of the Data
confidence: 99%
“…CCD camera solutions generally involve high cost and weight, while the loss of light due to microbending or chirping causes distortion in the measured data [23]. To alleviate the drawbacks associated with any single technology, a possible solution is to capitalize on multiple sensor technologies for recovering tactile information [8, 24, 25, 26]. A biomimetic fingertip containing three accelerometers and seven force sensors in two layers of polyurethane [27] is employed in [28] to discriminate six fabrics and an aluminum plate by comparing the differences in their surface texture.…”
Section: Literature Review
confidence: 99%
“…The robot is able to correctly identify 10 different objects in 99 out of 100 presentations. A data-driven analysis of the sensor-selection problem in the contour-following shape-discrimination task is presented in [26]. Data collected from the motors, an inertial measurement unit, and a magnetometer attached to a 4-DOF robotic finger during the exploration of seven synthetic shapes are analyzed using principal component analysis, and a multilayer perceptron neural network is trained to classify the shapes.…”
Section: Literature Review
confidence: 99%
“…Lederman and Klatzky (1987) identified six different exploratory movements used to determine the properties of an object. de Oliveira et al. (2015b) proposed a data-driven analysis for shape-discrimination tasks using a robotic finger that performs a sliding movement. Lima et al. (2020) developed an experiment with a sliding tactile-enabled robotic fingertip to explore textures dynamically.…”
Section: Introduction
confidence: 99%