Communication over a noisy quantum channel introduces errors in the transmission that must be corrected. A fundamental bound on quantum error correction is the quantum capacity, which quantifies the amount of quantum data that can be protected. We show theoretically that two quantum channels, each with a transmission capacity of zero, can have a nonzero capacity when used together. This unveils a rich structure in the theory of quantum communications, implying that the quantum capacity does not uniquely specify a channel's ability to transmit quantum information. [arXiv:0807.4935v2 [quant-ph]]
How correlated are two quantum systems from the perspective of a third? We answer this by providing an optimal "quantum state redistribution" protocol for multipartite product sources. Specifically, given an arbitrary quantum state of three systems, where Alice holds two and Bob holds one, we identify the cost, in terms of quantum communication and entanglement, for Alice to give one of her parts to Bob. The communication cost gives the first known operational interpretation to quantum conditional mutual information. The optimal procedure is self-dual under time reversal and is perfectly composable. This generalizes known protocols such as the state merging and fully quantum Slepian-Wolf protocols, from which almost every known protocol in quantum Shannon theory can be derived.

The statistical approach to information, and the asymptotic analysis of the protocols which process it, was pioneered by Claude Shannon [1]. He showed that the information content of a random variable X with distribution p_x could be intuitively quantified by the Shannon entropy H(X) = −Σ_x p_x log₂ p_x. More importantly, he operationally justified this by proving that H(X) is the minimum average number of bits required to faithfully represent independent instances of X by any data compression protocol, also proving that such protocols indeed exist. Shannon further defined the conditional entropy H(X|Y) = H(XY) − H(Y), which is also equal to the average entropy of X given Y. Conditional entropy measures the information someone knowing only Y would have to learn in order to know X as well. Its operational relevance was shown by Slepian and Wolf [2] to be the minimum number of bits needed to describe X to someone who knows Y. Shannon also introduced the mutual information I(X;Y) = H(Y) − H(Y|X) and the conditional mutual information I(X;Y|Z) = H(Y|Z) − H(Y|XZ), each of which is interpreted as the information shared by X and Y; the latter is measured from the perspective of someone knowing Z.
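As a sketch of the classical quantities defined above, the following computes the Shannon entropy, conditional entropy, and mutual information from a joint distribution using numpy; the particular distribution (two independent fair bits) is an illustrative choice, not taken from the text:

```python
import numpy as np

def shannon_entropy(p):
    """H(X) = -sum_x p(x) log2 p(x); zero-probability terms contribute 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

# Joint distribution p(x, y) as a matrix: rows index x, columns index y.
# Two independent fair bits, chosen purely for illustration.
p_xy = np.array([[0.25, 0.25],
                 [0.25, 0.25]])

H_XY = shannon_entropy(p_xy.ravel())       # H(XY)
H_Y = shannon_entropy(p_xy.sum(axis=0))    # marginal entropy H(Y)
H_X = shannon_entropy(p_xy.sum(axis=1))    # marginal entropy H(X)

H_X_given_Y = H_XY - H_Y                   # H(X|Y) = H(XY) - H(Y)
I_XY = H_X - H_X_given_Y                   # I(X;Y) = H(X) - H(X|Y)

print(H_X, H_X_given_Y, I_XY)              # 1.0 1.0 0.0
```

For independent variables, knowing Y teaches nothing about X, so H(X|Y) equals H(X) and the mutual information vanishes, matching the interpretation in the text.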
Mutual information plays a fundamental role in characterizing the capacity of a noisy channel to transmit information [1]. Its conditional counterpart arises in the answers to many problems, such as in rate distortion with side information at the decoder [3] and communication with side information at the encoder [4]. It also appears in the analysis of degraded broadcast channels [5]. All four of these quantities can easily be shown to be non-negative.

In recent years, a quantum mechanical generalization [6] of Shannon's theory has been developing, in which a random variable is replaced with a quantum system C with density matrix ρ_C. The quantum analog of the Shannon entropy is the von Neumann entropy H(C) = −Tr ρ_C log₂ ρ_C, which is the Shannon entropy of the eigenvalues of ρ_C. While von Neumann's entropy preceded Shannon's by almost 20 years, its operational interpretation was only found relatively recently by Schumacher [7], who showed that a large number n of quantum systems, identically prepared in the state ρ_C, could be compressed into a space of roughly nH(C) qubits, or two-level quantum systems. Her...
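Since the von Neumann entropy is just the Shannon entropy of the density matrix's eigenvalues, it is straightforward to compute numerically. A minimal sketch using numpy (the two example states, a maximally mixed qubit and a pure state, are illustrative choices, not from the text):

```python
import numpy as np

def von_neumann_entropy(rho):
    """H(C) = -Tr(rho log2 rho), computed via the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)      # rho is Hermitian
    nz = eigvals[eigvals > 1e-12]          # convention: 0 log 0 = 0
    return -np.sum(nz * np.log2(nz))

# Maximally mixed qubit: one full bit of entropy, so Schumacher
# compression needs one qubit per copy.
rho_mixed = np.eye(2) / 2
# Pure state |0><0|: zero entropy, compressible to nothing.
rho_pure = np.array([[1.0, 0.0],
                     [0.0, 0.0]])

print(von_neumann_entropy(rho_mixed))  # 1.0
print(von_neumann_entropy(rho_pure))   # 0.0
```

This mirrors Schumacher's result quoted above: n copies of a state with entropy H(C) compress into roughly nH(C) qubits, so the mixed qubit is incompressible while the pure state carries no quantum information to store.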