Information functionals allow one to quantify the degree of randomness of a given probability distribution, either absolutely (through min/max entropy principles) or relative to a prescribed reference distribution. Our primary aim is to analyze the "minimum information" assumption, a classic concept (R. Balian, 1968) in random matrix theory. We put special emphasis on generic level (eigenvalue) spacing distributions and the degree of their randomness or, alternatively, their information/organization deficit.
Motivation

The statistical theory of random-matrix spectra (RMT) [1,2] provides an ideal playground to test the workings of the Shannon and Kullback-Leibler (K-L) entropies in diverse contexts. That pertains to a direct analysis of spectral data for complex quantum systems (the semiclassically chaotic case included), but also to the statistics of Gaussian matrix ensembles and random-matrix diffusion processes. Dyson's interacting Brownian motion model can be interpreted as a non-equilibrium dynamical process whose asymptotic distribution is related to the thermodynamic equilibrium state of a Coulomb gas (RMT as equilibrium statistical mechanics). Ultimately one may pass to probability densities inferred from the ground state(s) of singular Calogero-type quantum systems: Shannon and K-L entropies prove to be proper tools in the quantum case as well.

Before embarking on these issues, let us indicate that there are ambiguities involved in the very concepts of information and (un)certainty. To stay on solid ground [3-6], we must accept a specific lore of semantic games, where baffling synonyms quite often appear and their specific meaning is under scrutiny. Examples are: information vs. entropy notions, (un)certainty and randomness vs. information deficit, entropic measures of surprise vs. information functionals,
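To make the two functionals in play concrete, the following minimal numerical sketch (an illustration added here, not taken from the original text) computes the discrete Shannon entropy and the Kullback-Leibler divergence, and checks the textbook facts underlying the min/max entropy principles: the uniform distribution on n points maximizes the Shannon entropy at log n, and the K-L divergence of a distribution relative to a reference is non-negative, vanishing only when the two coincide.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural logarithm).

    Zero-probability entries are skipped, using the convention 0 log 0 = 0.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def kl_divergence(p, q):
    """Kullback-Leibler divergence K(p|q) = sum_i p_i log(p_i / q_i).

    Measures the information deficit of p relative to the reference q.
    Assumes q_i > 0 wherever p_i > 0 (absolute continuity).
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Uniform reference on n = 4 points vs. a more "organized" (peaked) one.
n = 4
uniform = np.full(n, 1.0 / n)
peaked = np.array([0.7, 0.1, 0.1, 0.1])

h_uniform = shannon_entropy(uniform)   # equals log 4, the maximum on 4 points
h_peaked = shannon_entropy(peaked)     # strictly smaller than log 4
deficit = kl_divergence(peaked, uniform)  # non-negative information deficit
```

Note that `H(uniform) - H(peaked)` and `kl_divergence(peaked, uniform)` coincide here only because the reference is uniform; for a general reference distribution the K-L divergence is the appropriate relative measure.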