2009
DOI: 10.1140/epjd/e2009-00049-1
Complementarity in atomic (finite-level quantum) systems: an information-theoretic approach

Abstract: We develop an information-theoretic interpretation of number-phase complementarity in atomic systems, where phase is treated as a continuous positive operator valued measure (POVM). The relevant uncertainty principle is obtained as an upper bound on a sum of knowledge of these two observables for the case of two-level systems. A tighter bound characterizing the uncertainty relation is obtained numerically in terms of a weighted knowledge sum involving these variables. We point out that complementarity in th…
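To make the abstract's "knowledge sum" concrete, here is a minimal numerical sketch for a two-level system, assuming the framework the abstract summarizes: the knowledge of an observable is taken as the relative entropy of its outcome distribution with respect to the uniform distribution, i.e. K_phi = ln(2π) − H[P(φ)] for the canonical phase POVM and K_n = ln(2) − H[{p0, p1}] for the level populations. The function names, the particular pure-state family scanned, and the grid resolution are illustrative choices, not taken from the paper.

```python
import numpy as np

def phase_distribution(theta, phis):
    """Canonical phase distribution P(phi) for the pure qubit state
    cos(theta/2)|0> + sin(theta/2)|1>, obtained from the phase POVM
    dF(phi) = (dphi / 2pi) |phi><phi| with |phi> = |0> + exp(i phi)|1>."""
    return (1.0 + np.sin(theta) * np.cos(phis)) / (2.0 * np.pi)

def phase_knowledge(theta, n=20000):
    """Entropic knowledge of phase: K_phi = ln(2pi) - H[P], which equals
    the relative entropy D(P || uniform); computed on a phi grid."""
    phis = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    p = phase_distribution(theta, phis)
    dphi = 2.0 * np.pi / n
    with np.errstate(divide="ignore", invalid="ignore"):
        integrand = np.where(p > 0.0, p * np.log(p), 0.0)
    h = -np.sum(integrand) * dphi          # differential entropy H[P]
    return np.log(2.0 * np.pi) - h

def number_knowledge(theta):
    """Entropic knowledge of the number (level) observable:
    K_n = ln(2) - H[{p0, p1}] for the same state."""
    p0 = np.cos(theta / 2.0) ** 2
    probs = np.array([p0, 1.0 - p0])
    probs = probs[probs > 0.0]
    return np.log(2.0) + np.sum(probs * np.log(probs))

# Scan the pure-state family and record the largest knowledge sum,
# a numerical stand-in for the upper bound discussed in the abstract.
thetas = np.linspace(0.0, np.pi, 181)
sums = [number_knowledge(t) + phase_knowledge(t) for t in thetas]
print(f"max knowledge sum over scanned states: {max(sums):.4f} nats")
```

The scan exhibits the complementarity the bound captures: at θ = 0 the state is a number eigenstate (maximal number knowledge, uniform phase distribution, zero phase knowledge), while at θ = π/2 the number distribution is uniform and the phase knowledge is maximal within this family.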

Cited by 7 publications (20 citation statements: 0 supporting, 20 mentioning, 0 contrasting), published between 2010 and 2019. References 51 publications.
“…where ρ is a pure state [18]. Additionally, the problems of finding the informational power of group- [27]. Indeed, the greater H is, the more we know about the measurement outcomes.…”
Section: Introduction (mentioning)
Confidence: 99%
“…In Ref. [13], we numerically found this to be the case for the continuous-valued phase observable in a two-level system. However, we know of no (published) proof that this is true in general, which was the motivation behind adopting the concept of entropic knowledge in the uncertainty relation.…”
(mentioning)
Confidence: 66%
“…[13], where the problem that the Shannon entropy of a continuous random variable may be negative is circumvented by instead using relative entropy (also called Kullback-Leibler divergence, which is always non-negative) [14,15] with respect to a uniform distribution. This quantity is a measure of knowledge [13]. Note that recourse to entropic knowledge may not always be necessary, and other ways might exist to circumvent the problem.…”
(mentioning)
Confidence: 99%
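A minimal numerical illustration of the point quoted above (our own toy example, not taken from Refs. [13-15]): for a distribution concentrated on a sub-interval of width w inside [0, 2π), the differential entropy ln(w) can be negative, while the relative entropy to the uniform distribution, ln(2π/w), is always non-negative.

```python
import numpy as np

# Box distribution of width w inside [0, 2pi): H[P] = ln(w) can go
# negative, but D(P || uniform) = ln(2pi) - H[P] = ln(2pi / w) >= 0.
for w in (0.1, 1.0, 2.0 * np.pi):
    print(f"w = {w:6.3f}: H = {np.log(w):+.3f} nats, "
          f"D(P||U) = {np.log(2.0 * np.pi / w):.3f} nats")
```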