An ensemble of random unistochastic (orthostochastic) matrices is defined by taking the squared moduli of the elements of random unitary (orthogonal) matrices distributed according to the Haar measure on U(N) (or O(N), respectively). An ensemble of symmetric unistochastic matrices is obtained using unitary symmetric matrices from the circular orthogonal ensemble. We study the distribution of complex eigenvalues of bistochastic, unistochastic and orthostochastic matrices in the complex plane. We compute averages (entropy, traces) over the ensembles of unistochastic matrices and present inequalities concerning the entropies of products of bistochastic matrices.
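The construction described above can be sketched numerically. The following is a minimal NumPy illustration (the function names are ours, not from the paper): a Haar-random unitary is drawn via the QR decomposition of a complex Ginibre matrix with a phase correction, and the entrywise squared moduli then form a unistochastic, hence bistochastic, matrix.

```python
import numpy as np

def random_unitary(n, rng):
    """Draw a Haar-random unitary via QR decomposition of a complex Ginibre matrix."""
    z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    # Rescale columns by the phases of diag(r) so the distribution is exactly Haar
    d = np.diag(r)
    return q * (d / np.abs(d))

def random_unistochastic(n, rng):
    """Squared moduli of a Haar-random unitary give a random unistochastic matrix."""
    u = random_unitary(n, rng)
    return np.abs(u) ** 2

rng = np.random.default_rng(0)
b = random_unistochastic(4, rng)
# Unitarity of U forces all rows and columns of |U_ij|^2 to sum to 1,
# so the result is bistochastic
assert np.allclose(b.sum(axis=0), 1.0) and np.allclose(b.sum(axis=1), 1.0)
```

Replacing the complex Ginibre matrix with a real one (and the unitary with an orthogonal matrix) yields the orthostochastic ensemble in the same way.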
Topological and geometrical properties of the set of mixed quantum states in the N-dimensional Hilbert space are analysed. Assuming that the corresponding classical dynamics takes place on the sphere, we use the vector SU(2) coherent states and the generalised Husimi distributions to define the Monge distance between two arbitrary density matrices. The Monge metric has a simple semiclassical interpretation and induces a non-trivial geometry. Among all pure states, the distance from the maximally mixed state ρ*, proportional to the identity matrix, attains its largest value for the coherent states, while the delocalized 'chaotic' states are close to ρ*. This contrasts with the geometry induced by the standard (trace, Hilbert-Schmidt or Bures) metrics, for which the distance from ρ* is the same for all pure states. We discuss possible physical consequences including unitary time evolution and the process of decoherence. We also introduce a simplified Monge metric, defined in the space of pure quantum states and more suitable for numerical computation.
A new definition of the entropy of a given dynamical system and of an instrument describing the measurement process is proposed within the operational approach to quantum mechanics. It generalizes other definitions of entropy in both the classical and quantum cases. The Kolmogorov-Sinai (KS) entropy is obtained for a classical system and a sharp measurement instrument. For a quantum system and a coherent-states instrument, a new quantity, the coherent states entropy, is defined. It may be used to measure chaos in quantum mechanics. The following correspondence principle is proved: the upper limit of the coherent states entropy of a quantum map as ℏ → 0 is less than or equal to the KS entropy of the corresponding classical map.

"Chaos umpire sits, / And by decision more imbroils the fray / By which he reigns: next him high arbiter / Chance governs all." John Milton, Paradise Lost, Book II
We discuss the dependence of the Shannon entropy of normalized finite rank-1 POVMs on the choice of the input state, looking for the states that minimize this quantity. To distinguish the class of measurements where the problem can be solved analytically, we introduce the notion of highly symmetric POVMs and classify them in dimension 2 (for qubits). In this case, we prove that the entropy is minimal, and hence, the relative entropy (informational power) is maximal, if and only if the input state is orthogonal to one of the states constituting a POVM. The method used in the proof, employing the Michel theory of critical points for group action, the Hermite interpolation, and the structure of invariant polynomials for unitary-antiunitary groups, can also be applied in higher dimensions and for other entropy-like functions. The links between entropy minimization and entropic uncertainty relations, the Wehrl entropy, and the quantum dynamical entropy are described.
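The qubit case can be illustrated with a concrete highly symmetric POVM. The sketch below uses the trine measurement (three rank-1 effects with Bloch vectors spaced 120° apart) as an illustrative example of our choosing, not one taken from the paper, and checks numerically that the Shannon entropy of the outcome statistics is lower for an input state orthogonal to one of the POVM states than for a state aligned with it.

```python
import numpy as np

def bloch_state(theta, phi=0.0):
    """Pure qubit state with Bloch vector (sin θ cos φ, sin θ sin φ, cos θ)."""
    return np.array([np.cos(theta / 2), np.exp(1j * phi) * np.sin(theta / 2)])

# Trine POVM: three rank-1 effects E_k = (2/3)|ψ_k><ψ_k| with Bloch
# vectors spaced 120° apart in a great circle of the Bloch sphere
angles = [0.0, 2 * np.pi / 3, 4 * np.pi / 3]
povm = [(2 / 3) * np.outer(v, v.conj()) for v in (bloch_state(a) for a in angles)]
assert np.allclose(sum(povm), np.eye(2))  # completeness: effects sum to identity

def measurement_entropy(psi, povm):
    """Shannon entropy (in bits) of the outcome probabilities p_k = <ψ|E_k|ψ>."""
    p = np.array([np.real(psi.conj() @ e @ psi) for e in povm])
    p = p[p > 1e-12]  # drop zero-probability outcomes (0 log 0 = 0)
    return float(-(p * np.log2(p)).sum())

# Input state orthogonal to ψ_0 (Bloch vector antipodal to the first POVM state)
h_orth = measurement_entropy(bloch_state(np.pi), povm)
# Input state aligned with ψ_0
h_aligned = measurement_entropy(bloch_state(0.0), povm)
assert h_orth < h_aligned  # entropy is lower at the orthogonal state
```

For the orthogonal input the first outcome never occurs and the remaining two are equiprobable, giving exactly 1 bit, which is below the value for the aligned state; this is consistent with the minimization result stated above.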
We show that the mean dynamical entropy of a quantum map on the sphere tends logarithmically to infinity in the semiclassical limit. Consequences of this fact for classical dynamical systems are discussed.
The dynamics of deterministic systems perturbed by random additive noise is characterized quantitatively. Since for such systems the Kolmogorov-Sinai (KS) entropy diverges as the diameter of the partition tends to zero, we analyze the difference between the total entropy of a noisy system and the entropy of the noise itself. We show that this quantity is finite and non-negative, and we call it the dynamical entropy of the noisy system. In the weak-noise limit this quantity is conjectured to tend to the KS entropy of the deterministic system. In particular, we consider one-dimensional systems with noise described by a finite-dimensional kernel, for which the Frobenius-Perron operator can be represented by a finite matrix.
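The finite-matrix representation of the Frobenius-Perron operator for a noisy one-dimensional map can be sketched with an Ulam-type discretization. This is a generic illustration under assumptions of our choosing (the doubling map with Gaussian noise on the unit interval, Monte Carlo estimation of transition probabilities), not the specific kernel construction of the paper.

```python
import numpy as np

def noisy_transfer_matrix(f, n_cells, noise_std, rng, samples=200):
    """Ulam-type estimate of the Frobenius-Perron matrix of a map f on [0,1)
    perturbed by additive Gaussian noise (taken mod 1).
    Entry P[i, j] approximates the probability that a point drawn
    uniformly from cell i lands in cell j after one noisy step."""
    p = np.zeros((n_cells, n_cells))
    for i in range(n_cells):
        # Sample starting points uniformly in cell i
        x = (i + rng.random(samples)) / n_cells
        y = (f(x) + noise_std * rng.standard_normal(samples)) % 1.0
        j = np.minimum((y * n_cells).astype(int), n_cells - 1)
        np.add.at(p[i], j, 1.0 / samples)
    return p

rng = np.random.default_rng(1)
doubling = lambda x: (2.0 * x) % 1.0
P = noisy_transfer_matrix(doubling, 50, 0.05, rng)
# The discretized Frobenius-Perron operator is a stochastic matrix
assert np.allclose(P.sum(axis=1), 1.0)
```

The entropy analysis described above would then be carried out on such finite stochastic matrices as the partition is refined.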