Motivated by evidence that image source statistics predict response properties in several aspects of visual perception, we present an empirical analysis of the relation between chroma statistics and human judgments of tonality. To this end, we propose a statistical analysis method based on the covariance of chroma features, which uses a large collection of Western music to build a tonal profile. The resulting profile is compared to alternative tonal profiles proposed in the literature, whether cognitively, perceptually, or theoretically inspired. The high correlation we find between the covariance-based tonal profile proposed here and several profiles from the literature (with values above 0.9) is interpreted as evidence that human-derived profiles faithfully reflect the statistics of the musical input listeners have been exposed to. Furthermore, we show that very short time scales suffice to correctly predict these profiles, which leads us to discuss the role that local-scale implicit learning plays in building mental representations of tonality.