“…Providing robots with the capability of manipulating and recognising fabric is still a challenging task. A considerable effort has been made in the literature to classify various kinds of materials (polycotton, nylon, silicone, brass, wood, plastic, foam, and PVC, to name but a few) using tactile sensors based on different transduction principles [1], [2], [3]. However, whereas the materials in these examples have clearly different geometric and mechanical characteristics, classifying fabrics is more challenging because of the high variability of existing types, which in many cases can have very similar characteristics (consider, for example, a jumper that can be made of either wool or acrylic). A multisensorial approach can improve fabric classification, as in [4], where data from RGB-D, tactile, and photometric stereo sensors are combined. When only one sensor modality is available, however, the challenge is to find a fabric exploration technique that can detect all of the fabric's discriminative characteristics, and to determine which sensor data features are most effective for discriminating the different types of fabric.…”
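As a purely illustrative sketch (not taken from the cited works), the single-modality setting described above might be approached by extracting a few hand-crafted statistics from a tactile pressure trace and classifying by distance to per-class feature centroids. All signals, class names, and noise levels below are hypothetical stand-ins for real tactile data.

```python
import math
import random

random.seed(0)

def tactile_features(signal):
    """Three simple features from a 1-D tactile pressure trace:
    mean level, standard deviation, and mean absolute first
    difference (a crude proxy for surface roughness)."""
    n = len(signal)
    mean = sum(signal) / n
    var = sum((x - mean) ** 2 for x in signal) / n
    rough = sum(abs(signal[i + 1] - signal[i]) for i in range(n - 1)) / (n - 1)
    return [mean, math.sqrt(var), rough]

def make_trace(base, noise, n=200):
    """Hypothetical tactile trace: smoother fabrics yield lower
    high-frequency variation (smaller `noise`)."""
    return [base + random.gauss(0.0, noise) for _ in range(n)]

# Two hypothetical fabric classes with similar mean pressure but
# different texture, mimicking the wool-vs-acrylic ambiguity.
wool = [tactile_features(make_trace(1.0, 0.30)) for _ in range(20)]
acrylic = [tactile_features(make_trace(1.0, 0.10)) for _ in range(20)]

def centroid(samples):
    return [sum(col) / len(col) for col in zip(*samples)]

c_wool, c_acrylic = centroid(wool), centroid(acrylic)

def classify(feat):
    """Nearest-centroid classification in feature space."""
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(feat, c))
    return "wool" if dist(c_wool) < dist(c_acrylic) else "acrylic"
```

In such a sketch, inspecting which feature differs most between the class centroids (here, the roughness proxy rather than the mean pressure) is a minimal example of identifying the sensor data features that best discriminate fabric types.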