The capacity of a classical-quantum channel (in other words, the classical capacity of a quantum channel) is considered in the most general setting, where no structural assumptions such as the stationary memoryless property are made on the channel. A capacity formula, as well as a characterization of the strong converse property, is given in parallel with the corresponding classical results of Verdú and Han, which are based on the so-called information-spectrum method. The general results are applied to the stationary memoryless case with or without a cost constraint on inputs, whereby a deep relation between channel coding theory and hypothesis testing for two quantum states is elucidated.
The hypothesis testing problem for two quantum states is treated. We show a new inequality between the errors of the first and second kind, which complements the result of Hiai and Petz in establishing the quantum version of Stein's lemma. The inequality is also used to bound the first-kind error when the power exponent of the second-kind error exceeds the quantum relative entropy, and this bound yields the strong converse in quantum hypothesis testing. Finally, we discuss the relation between the bound and the power exponent derived by Han and Kobayashi in classical hypothesis testing.
Keywords: Quantum hypothesis testing, Stein's lemma, strong converse, quantum relative entropy.
Introduction

Let H be a Hilbert space representing a physical system of interest. We suppose dim H < ∞ for mathematical simplicity. Let B(H) be the set of linear operators on H and put

S(H) = { ρ ∈ B(H) : ρ ≥ 0, Tr ρ = 1 },

which is the set of density operators on H. We treat the problem of testing a null hypothesis ρ ∈ S(H) versus an alternative hypothesis σ ∈ S(H). Here, we assume Im ρ ⊂ Im σ. To consider an asymptotic situation, suppose that either ρ⊗n ∈ S(H⊗n) or σ⊗n ∈ S(H⊗n) is given. The problem is to decide which hypothesis is true; the decision is given by a two-valued quantum measurement {A_n, 1 − A_n} (A_n ∈ B(H⊗n), 0 ≤ A_n ≤ 1), where A_n corresponds to acceptance of ρ⊗n and 1 − A_n corresponds to acceptance of σ⊗n. In the sequel we call any A_n ∈ B(H⊗n) with 0 ≤ A_n ≤ 1 a test.
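The setup above can be illustrated numerically. The following is a minimal sketch, not taken from the paper: the qubit states `rho` and `sigma` and the projective test `A` are hypothetical choices, used only to show how the first-kind error α = Tr ρ(1 − A), the second-kind error β = Tr σA, and the quantum relative entropy D(ρ‖σ) (the Stein exponent governing the optimal decay of β) are computed.

```python
import numpy as np

def mat_log(m):
    """Matrix logarithm of a positive-definite Hermitian matrix via eigendecomposition."""
    w, v = np.linalg.eigh(m)
    return (v * np.log(w)) @ v.conj().T

def rel_entropy(rho, sigma):
    """Quantum relative entropy D(rho||sigma) = Tr rho (log rho - log sigma), in nats."""
    return np.trace(rho @ (mat_log(rho) - mat_log(sigma))).real

# Hypothetical qubit hypotheses (both full rank, so Im rho ⊂ Im sigma holds).
rho   = np.array([[0.9, 0.0], [0.0, 0.1]])
sigma = np.array([[0.5, 0.0], [0.0, 0.5]])

# A test 0 <= A <= 1: here the projector onto the dominant eigenvector of rho.
A = np.array([[1.0, 0.0], [0.0, 0.0]])

alpha = np.trace(rho @ (np.eye(2) - A)).real   # first-kind error: reject rho when rho is true
beta  = np.trace(sigma @ A).real               # second-kind error: accept rho when sigma is true

print(alpha, beta)              # 0.1 and 0.5 for this choice of test
print(rel_entropy(rho, sigma))  # Stein's lemma: optimal beta_n ≈ exp(-n D(rho||sigma))
```

For n copies the same formulas apply on H⊗n with a test A_n, and Stein's lemma says the optimal second-kind error decays exponentially at rate D(ρ‖σ) when the first-kind error is kept small.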
A lower bound on the probability of decoding error for a quantum communication channel is presented. The strong converse to the quantum channel coding theorem follows immediately from the lower bound. The argument parallels Arimoto's method, apart from the difficulties caused by noncommutativity.
A Boltzmann machine is a network of stochastic neurons. The set of all the Boltzmann machines with a fixed topology forms a geometric manifold of high dimension, where modifiable synaptic weights of connections play the role of a coordinate system to specify networks. A learning trajectory, for example, is a curve in this manifold. It is important to study the geometry of the neural manifold, rather than the behavior of a single network, in order to know the capabilities and limitations of neural networks of a fixed topology. Using the new theory of information geometry, a natural invariant Riemannian metric and a dual pair of affine connections on the Boltzmann neural network manifold are established. The meaning of geometrical structures is elucidated from the stochastic and the statistical point of view. This leads to a natural modification of the Boltzmann machine learning rule.
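Since the weights serve as coordinates on the manifold of Boltzmann machines, the Riemannian metric mentioned above is the Fisher information computed from the network's Gibbs distribution. The following is a small illustrative sketch, not the paper's construction: for a hypothetical 3-neuron machine it enumerates all binary states, forms the Gibbs distribution p(x) ∝ exp(x·Wx/2 + b·x), and computes the Fisher metric over the bias coordinates b, which for this exponential family reduces to the covariance of the neuron activities.

```python
import itertools
import numpy as np

def gibbs_distribution(W, b):
    """All states x in {0,1}^n and their Gibbs probabilities p(x) ∝ exp(x·Wx/2 + b·x)."""
    n = len(b)
    states = np.array(list(itertools.product([0, 1], repeat=n)), dtype=float)
    scores = 0.5 * np.einsum('si,ij,sj->s', states, W, states) + states @ b
    p = np.exp(scores)
    return states, p / p.sum()

def fisher_metric(W, b):
    """Fisher metric G_ij over the bias coordinates b_i.
    Since d(log p)/d b_i = x_i - E[x_i], we get G_ij = Cov_p(x_i, x_j)."""
    states, p = gibbs_distribution(W, b)
    mean = p @ states
    centered = states - mean
    return (centered * p[:, None]).T @ centered

# Hypothetical machine: symmetric weights, zero self-connections.
W = np.array([[ 0.0, 0.5, -0.3],
              [ 0.5, 0.0,  0.2],
              [-0.3, 0.2,  0.0]])
b = np.array([0.1, -0.2, 0.0])

G = fisher_metric(W, b)   # symmetric positive-semidefinite 3x3 metric tensor
```

The "natural modification" of the learning rule that information geometry suggests is to premultiply the ordinary gradient by the inverse of such a metric (the natural gradient), so that learning steps are invariant under reparametrization of the weights.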