2018
DOI: 10.1109/tit.2018.2842180

Bounds on Information Combining With Quantum Side Information

Abstract: "Bounds on information combining" are entropic inequalities that determine how the information (entropy) of a set of random variables can change when these are combined in certain prescribed ways. Such bounds play an important role in classical information theory, particularly in coding and Shannon theory; entropy power inequalities are special instances of them. The arguably most elementary kind of information combining is the addition of two binary random variables (a CNOT gate), and the resulting quantities …
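
For orientation (this is not part of the abstract), the most elementary classical instance referred to here is usually stated via Mrs. Gerber's Lemma: for independent pairs (X_1, Y_1) and (X_2, Y_2) with binary X_i,

H(X_1 \oplus X_2 \mid Y_1 Y_2) \ge h\big( h^{-1}(H(X_1 \mid Y_1)) \star h^{-1}(H(X_2 \mid Y_2)) \big),

where h(p) = -p \log_2 p - (1-p) \log_2 (1-p) is the binary entropy, h^{-1} : [0,1] \to [0, 1/2] its inverse, and a \star b = a(1-b) + (1-a)b the binary convolution. The paper's question is how such bounds change when the side information Y_1, Y_2 is quantum.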


Cited by 8 publications (6 citation statements)
References 82 publications

Citation statements, ordered by relevance:
“…A similar result has been proven for the sum of binary random variables [15]. Entropic inequalities with classical conditioning easily follow from the corresponding unconditioned inequalities and from the definition (3) of conditional entropy via Jensen's inequality (see e.g.…”
Section: Our Contribution (supporting)
confidence: 59%
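
As a rough sketch of the argument this statement points to (the equation number "(3)" refers to the citing paper's own numbering): for classical side information Z, conditional entropy is the average

H(X_1 \oplus X_2 \mid Z) = \sum_z p(z)\, H(X_1 \oplus X_2 \mid Z = z),

so an unconditioned bound of the form H(X_1 \oplus X_2) \ge f(H(X_1), H(X_2)) can be applied separately for every value z (assuming the relevant conditional independence), and, provided the bounding function f is convex, Jensen's inequality turns the average of the pointwise bounds into a bound in terms of the conditional entropies H(X_1 \mid Z) and H(X_2 \mid Z).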
“…Furthermore, considering the variety of applications of the classical IB function it would be interesting to see which of them translate to the quantum setting. Finally, the classical IB function is closely related to entropic bounds on information combining, and our results might help to better understand their quantum generalization [23].…”
Section: Discussion (mentioning)
confidence: 79%
“…(3), h(x) is the binary entropy and ⋆ denotes the binary convolution. This is an important example as it plays a crucial role in the theory of classical and quantum information combining [22], [23]. Now, using the reasoning after Eq.…”
Section: Numerics and Examples (mentioning)
confidence: 99%
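
A minimal numerical sketch (not taken from the cited papers; the function names are illustrative) of the binary entropy h, the binary convolution ⋆, and the classical combining bound they enter:

import numpy as np

def binary_entropy(p):
    # Binary entropy h(p) in bits; endpoints are clipped so h(0) = h(1) = 0.
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def binary_entropy_inv(h):
    # Inverse of h restricted to [0, 1/2], computed by bisection.
    lo, hi = 0.0, 0.5
    for _ in range(60):
        mid = (lo + hi) / 2
        if binary_entropy(mid) < h:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def binary_convolution(a, b):
    # a ⋆ b = a(1-b) + (1-a)b, the effective crossover of two BSCs in series.
    return a * (1 - b) + (1 - a) * b

# Example: conditional entropies H(X1|Y1), H(X2|Y2) for two binary symmetric
# channels, and the resulting classical lower bound on H(X1 ⊕ X2 | Y1 Y2).
p1, p2 = 0.11, 0.2
h1, h2 = binary_entropy(p1), binary_entropy(p2)
bound = binary_entropy(binary_convolution(binary_entropy_inv(h1), binary_entropy_inv(h2)))
print(h1, h2, bound)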
“…As quantum conditioning is significantly more complex, the same argument does not hold for quantum reference systems [5,39].…”
Section: Channel Preorders in Quantum Information Theory and Related ... (mentioning)
confidence: 99%