Recent advances in fluorescence microscopy have yielded an abundance of high-dimensional, spectrally rich datasets that cannot always be adequately explored through conventional three-color visualization methods. While computational image processing allows researchers to derive spectral characteristics of their datasets that cannot be visualized directly, how best to display the resulting rich spectral data remains an open question. Data sonification has the potential to provide a novel way for researchers to intuitively perceive these characteristics auditorily through direct interaction with the raw multi-channel data: the human ear is well tuned to detect subtle differences in sound that can represent discrete changes in fluorescence spectra. We present a proof-of-concept implementation of a functional data sonification workflow for the analysis of fluorescence microscopy data as a Fiji (ImageJ) plugin and evaluate its utility on a range of hyperspectral microscopy datasets. Additionally, we provide a framework for prototyping and testing new sonification methods, along with a mathematical model that identifies scenarios in which vision-based spectral analysis fails but sonification-based approaches do not. With this first reported practical application of sonification to biological fluorescence microscopy, and with supporting computational tools for further exploration, we discuss the current advantages and disadvantages of sonification over conventional spectral visualization approaches. We also discuss where further efforts in spectral sonification should be directed to maximize its practical biological applications.
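
The core idea of spectral sonification, mapping a pixel's multi-channel intensity vector to an audible signal, can be illustrated with a minimal sketch. The mapping below (one sine tone per spectral channel, pitched a semitone apart, with channel intensity controlling amplitude) is a hypothetical scheme chosen for illustration only; it is not the mapping used by the plugin described here.

```python
import numpy as np

def sonify_spectrum(intensities, base_freq=220.0, sr=22050, dur=0.5):
    """Map a pixel's multi-channel intensities to a chord of sine tones.

    Each spectral channel is assigned a pitch one semitone above the
    previous one, starting at base_freq; the channel's (normalized)
    intensity sets that tone's amplitude. Illustrative mapping only.
    """
    intensities = np.asarray(intensities, dtype=float)
    if intensities.max() > 0:
        intensities = intensities / intensities.max()  # normalize to [0, 1]
    t = np.linspace(0.0, dur, int(sr * dur), endpoint=False)
    # Equal-tempered semitone spacing: f_k = base_freq * 2^(k/12)
    freqs = base_freq * 2.0 ** (np.arange(len(intensities)) / 12.0)
    signal = sum(a * np.sin(2 * np.pi * f * t)
                 for a, f in zip(intensities, freqs))
    peak = np.abs(signal).max()
    return signal / peak if peak > 0 else signal  # avoid clipping

# Example: a 4-channel "spectrum" dominated by channel 2
wave = sonify_spectrum([0.1, 0.2, 0.9, 0.3])
```

Two spectra with similar color projections but different channel-wise shapes produce audibly distinct chords under such a mapping, which is the intuition behind using the ear to discriminate spectra that RGB visualization conflates.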