When we touch the surface of an object, its spatial structure is translated into a vibration pattern on the skin. The perceptual system has evolved to transform this pattern into a representation that allows us to distinguish between different materials. Here we show that the perceptual haptic representation of materials emerges from efficient encoding of the vibratory patterns elicited by interaction with materials. We trained a deep neural network with unsupervised learning (an autoencoder) to reconstruct vibratory patterns elicited by human haptic exploration of different materials. The learned compressed representation (i.e., the latent space) allows for classification of material categories (plastic, stone, wood, fabric, leather/wool, paper, and metal). More importantly, distances between these categories in the latent space resemble perceptual distances, suggesting a similar coding. We further show that the temporal tuning of the emergent latent dimensions is similar to the properties of human tactile receptors.
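As a minimal sketch of the kind of pipeline described here (the architecture, layer sizes, latent dimensionality, and signal length below are illustrative assumptions, not the configuration used in the study), an autoencoder can be trained to reconstruct vibratory signals, and its bottleneck activations then serve as the compressed latent representation:

```python
# Sketch of an unsupervised autoencoder for vibratory signals (PyTorch).
# All sizes are illustrative assumptions, not the reported configuration.
import torch
import torch.nn as nn

SIGNAL_LEN = 1024   # assumed number of samples per vibration snippet
LATENT_DIM = 16     # assumed size of the compressed (latent) representation

class VibrationAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(SIGNAL_LEN, 256), nn.ReLU(),
            nn.Linear(256, LATENT_DIM),
        )
        self.decoder = nn.Sequential(
            nn.Linear(LATENT_DIM, 256), nn.ReLU(),
            nn.Linear(256, SIGNAL_LEN),
        )

    def forward(self, x):
        z = self.encoder(x)          # compressed latent code
        return self.decoder(z), z    # reconstruction and latent code

model = VibrationAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Placeholder batch standing in for recorded vibratory patterns.
batch = torch.randn(32, SIGNAL_LEN)

for _ in range(100):                 # toy reconstruction-training loop
    reconstruction, latent = model(batch)
    loss = loss_fn(reconstruction, batch)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# After training, the latent codes can be fed to a classifier of material
# category, and pairwise distances between category centroids in latent
# space can be compared against perceptual dissimilarities.
```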