In many aural (acoustic) signal processing tasks, humans are known to outperform automated classification systems. For such applications, it may be beneficial to identify how humans relate different sounds to one another and to incorporate that information into an automatic classifier. This paper presents a method for using psychoacoustic information from a human listening experiment to generate a novel kernel function that can improve automated signal classification. We conducted a similarity-based listening experiment on a series of impulsive-source sonar echoes, in which listeners rated the perceived similarity between pairings of target and clutter echoes. These ratings combine to form a similarity matrix that reflects the underlying distance measure humans use when judging these echoes. This perceptual similarity matrix is the analogue of the similarity (Gram) matrix employed by modern kernel methods in automatic classification systems (e.g., the Support Vector Machine). By fitting an appropriate distance metric to the results of the perceptual experiment, we can identify novel, perceptually inspired kernel functions. We present a series of new approaches for identifying such a perceptual kernel function, and we compare the classification performance of these perceptual kernels against more standard kernel functions.
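The fitting step described above can be sketched in miniature: given hypothetical pairwise human similarity ratings and corresponding feature-space distances, one can tune the bandwidth of a Gaussian kernel so that the kernel values best match the perceptual ratings. This is only an illustrative sketch under assumed data, not the paper's actual fitting procedure; the ratings, distances, and the `fit_bandwidth` helper are all invented for illustration.

```python
import math

# Hypothetical human similarity ratings (1 = perceived identical, 0 = dissimilar)
# for five echo pairs, with illustrative feature-space distances for the same
# pairs. None of these numbers come from the paper's experiment.
ratings = [0.9, 0.7, 0.4, 0.2, 0.1]
distances = [0.5, 1.0, 2.0, 3.5, 5.0]

def fit_bandwidth(dists, sims):
    """Grid-search the bandwidth sigma of a Gaussian kernel
    k(d) = exp(-d^2 / (2 sigma^2)) so that kernel values best
    match the perceptual similarity ratings (least squares)."""
    best_sigma, best_err = None, float("inf")
    for step in range(1, 101):            # sigma in (0, 10]
        sigma = 0.1 * step
        err = sum((math.exp(-d * d / (2.0 * sigma * sigma)) - s) ** 2
                  for d, s in zip(dists, sims))
        if err < best_err:
            best_sigma, best_err = sigma, err
    return best_sigma

sigma = fit_bandwidth(distances, ratings)

# The fitted, perceptually tuned kernel can then score any new echo pair
# from its feature-space distance:
def kernel(d):
    return math.exp(-d * d / (2.0 * sigma * sigma))
```

In practice, the resulting kernel matrix would be supplied to a kernel classifier (for instance, as a precomputed kernel in an SVM), in place of a default kernel whose bandwidth is chosen without reference to human judgments.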