We investigate the decay of entanglement of generalized N-particle Greenberger-Horne-Zeilinger (GHZ) states interacting with independent reservoirs. Scaling laws for the decay of entanglement and for its finite-time extinction (sudden death) are derived for different types of reservoirs. The sudden-death time is found to increase with N. However, entanglement becomes arbitrarily small, and therefore useless as a resource, well before it completely disappears, at a time that is inversely proportional to the number of particles. We also show that the decay of multiparticle GHZ states can generate bound entangled states.
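The 1/N scaling of the "useful-entanglement" time can be illustrated with a minimal sketch. We assume independent pure dephasing at rate γ per qubit (one of several reservoir types; the paper treats the general case), under which the N-qubit GHZ coherence |0…0⟩⟨1…1| decays as exp(−Nγt), so it falls below any fixed threshold at a time proportional to 1/N:

```python
import math

# Sketch under an assumed noise model: independent dephasing at rate
# gamma per qubit. The GHZ coherence then decays as exp(-N*gamma*t).
def ghz_coherence(N, gamma, t):
    return math.exp(-N * gamma * t)

# Time at which the coherence reaches a fixed threshold eps:
# solve exp(-N*gamma*t) = eps for t, giving t = ln(1/eps)/(N*gamma).
def threshold_time(N, gamma, eps):
    return math.log(1.0 / eps) / (N * gamma)

gamma, eps = 1.0, 0.01
for N in (2, 4, 8):
    # Doubling N halves the time at which coherence becomes negligible.
    print(N, threshold_time(N, gamma, eps))
```

Doubling the particle number halves the time at which the coherence, and with it the usable entanglement, becomes negligible, consistent with the 1/N dependence stated above.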
Parameter estimation is of fundamental importance in areas from atomic spectroscopy and atomic clocks to gravitational-wave detection. Entangled probes provide a significant precision gain over classical strategies in the absence of noise. However, recent results seem to indicate that any small amount of realistic noise restricts the advantage of quantum strategies to an improvement by at most a multiplicative constant. Here we identify a relevant scenario in which one can overcome this restriction and attain super-classical precision scaling even in the presence of uncorrelated noise. We show that precision can be significantly enhanced when the noise is concentrated along some spatial direction, while the Hamiltonian governing the evolution, which depends on the parameter to be estimated, can be engineered to point along a different direction. In the case of perpendicular orientation, we find super-classical scaling and identify a state which achieves the optimum.

Estimation of an unknown parameter is essential across disciplines, from atomic spectroscopy and clocks [1][2][3] to gravitational-wave detection [4]. It is typically achieved by letting a probe, e.g. light, interact with the system under investigation, picking up information about the desired parameter. As seen in Fig. 1, a metrology protocol can be understood in four main steps [5,6]: (i) preparation of the probe, (ii) interaction with the system, (iii) readout of the probe, and (iv) construction of an estimate of the unknown parameter from the results. Steps (i)-(iii) may be repeated many times before the final construction of the estimate.

FIG. 1. General metrology protocol: a known probe state evolves according to a physical evolution depending on an unknown parameter ω. After a sufficient amount of data is collected, an estimate for the parameter is constructed.

The estimation uncertainty will depend on the available resources, here the probe size N and the total time T available for the experiment (other choices are possible [7]). By the central limit theorem, for N uncorrelated particles the best uncertainty scales as 1/√(νN), where ν = T/t is the number of evolve-and-measure rounds. This bound is known as the shot-noise or standard quantum limit (SQL). By making use of quantum phenomena, a metrology protocol may surpass the SQL, reaching instead the limits imposed by the quantum uncertainty relations. For probes of non-interacting particles, the best possible scaling compatible with these relations is 1/(N√ν), known as the Heisenberg limit. Without noise, the Heisenberg limit can be attained using entangled input states, e.g. Greenberger-Horne-Zeilinger (GHZ) states for atomic spectroscopy [8]. In the presence of noise, however, the picture is much less clear, as the optimal strategy depends strongly on the model of decoherence considered. Nevertheless, the SQL has been significantly surpassed in experiments of optical magnetometry [9,10], which proved that some sources of noise can be effectively counterbalanced [11,12]. However, unless one can k...
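The gap between the two limits quoted above can be made concrete with a short sketch. It simply evaluates the SQL uncertainty 1/√(νN) and the Heisenberg uncertainty 1/(N√ν) and compares them; the ratio is √N, which is the maximal quantum enhancement for non-interacting probes in the noiseless case:

```python
import math

# Standard quantum limit (shot noise): Delta_omega ~ 1/sqrt(nu*N)
def sql_uncertainty(N, nu):
    return 1.0 / math.sqrt(nu * N)

# Heisenberg limit for non-interacting particles: Delta_omega ~ 1/(N*sqrt(nu))
def heisenberg_uncertainty(N, nu):
    return 1.0 / (N * math.sqrt(nu))

N, nu = 100, 1000
# The quantum enhancement factor is sqrt(N):
print(sql_uncertainty(N, nu) / heisenberg_uncertainty(N, nu))  # ~ sqrt(N) = 10
```

The √N advantage is exactly what uncorrelated noise threatens to reduce to a constant factor, and what the perpendicular-noise scenario above partially recovers.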
In recent years, the use of information principles to understand quantum correlations has been very successful. However, all principles considered so far have a bipartite formulation, whereas intrinsically multipartite principles, yet to be discovered, are necessary for reproducing quantum correlations. Here we introduce local orthogonality, an intrinsically multipartite principle stating that events involving different outcomes of the same local measurement must be exclusive, or orthogonal. We prove that it is equivalent to no-signalling in the bipartite scenario but more restrictive for more than two parties. Exploiting this nonequivalence, we then demonstrate that some bipartite supra-quantum correlations violate local orthogonality when distributed among several parties. Finally, we show how its multipartite character makes it possible to reveal the non-quantumness of correlations for which any bipartite principle fails. We believe that local orthogonality is a crucial ingredient for understanding no-signalling and quantum correlations.
It is a relatively recent insight of classical statistics that empirical data can contain information about causation rather than mere correlation. The first algorithms have been proposed that are capable of testing whether a presumed causal relationship is compatible with an observed distribution. However, no systematic method is known for treating such problems in a way that generalizes to quantum systems. Here, we describe a general algorithm for computing information-theoretic constraints on the correlations that can arise from a given causal structure, where we allow for quantum systems as well as classical random variables. The general technique is applied to two relevant cases: first, we show that the principle of information causality appears naturally in our framework, and we go on to generalize and strengthen it. Second, we derive bounds on the correlations that can occur in a networked architecture in which a set of few-body quantum systems is distributed among several parties.
For any Bell locality scenario (or Kochen-Specker noncontextuality scenario), the joint Shannon entropies of local (or noncontextual) models form a convex cone whose non-trivial facets are tight entropic Bell (or contextuality) inequalities. In this paper we explore this entropic approach and derive tight entropic inequalities for various scenarios. One advantage of entropic inequalities is that they easily adapt to situations like bilocality scenarios, which carry additional independence requirements that are non-linear at the level of probabilities but linear at the level of entropies. Another advantage is that, despite this nonlinearity, taking detection inefficiencies into account turns out to be very simple. When joint measurements are conducted by a single detector only, the detector efficiency required for witnessing quantum contextuality can be arbitrarily low.
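The entropic approach can be made concrete with a small sketch. A local model for the CHSH-type (4-cycle) scenario assigns a joint distribution to all four variables A0, A1, B0, B1, and a standard chain-rule argument then yields the entropic inequality H(A0|B1) ≤ H(A0|B0) + H(B0|A1) + H(A1|B1) (a Braunstein-Caves-type inequality; the variable names are illustrative). The sketch verifies it on a randomly drawn joint distribution, for which it must hold:

```python
import itertools, math, random

random.seed(0)

def shannon(probs):
    # Shannon entropy in bits
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Random joint distribution over four binary variables (A0, A1, B0, B1).
outcomes = list(itertools.product([0, 1], repeat=4))
weights = [random.random() for _ in outcomes]
total = sum(weights)
joint = {o: w / total for o, w in zip(outcomes, weights)}

def marginal(indices):
    m = {}
    for o, p in joint.items():
        key = tuple(o[i] for i in indices)
        m[key] = m.get(key, 0.0) + p
    return m

def H(indices):
    return shannon(marginal(indices).values())

# Conditional entropy H(X|Y) = H(X,Y) - H(Y); index order: A0=0, A1=1, B0=2, B1=3.
def Hc(x, y):
    return H([x, y]) - H([y])

# Chain-type entropic inequality for the 4-cycle:
# H(A0|B1) <= H(A0|B0) + H(B0|A1) + H(A1|B1)
lhs = Hc(0, 3)
rhs = Hc(0, 2) + Hc(2, 1) + Hc(1, 3)
print(lhs <= rhs + 1e-12)  # True for every joint distribution
```

Since the inequality only involves the jointly measurable pairs, its violation by measured marginals certifies that no joint distribution, and hence no local model, exists.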
Bell's theorem shows that quantum mechanical correlations can violate the constraints that the causal structure of certain experiments imposes on any classical explanation. It is thus natural to ask to what degree the causal assumptions (e.g. "locality" or "measurement independence") have to be relaxed in order to allow for a classical description of such experiments. Here, we develop a conceptual and computational framework for treating this problem. We employ the language of Bayesian networks to systematically construct alternative causal structures and bound the degree of relaxation using quantitative measures that originate from the mathematical theory of causality. The main technical insight is that the resulting problems can often be expressed as computationally tractable linear programs. We demonstrate the versatility of the framework by applying it to a variety of scenarios, ranging from relaxations of the measurement-independence, locality and bilocality assumptions to a novel causal interpretation of CHSH inequality violations.

The paradigmatic Bell experiment [1] involves two distant observers, each with the capability to perform one of two possible measurements on their share of a joint system. Bell observed that, even absent any detailed information about the physical processes involved, the causal structure of the setup alone implies strong constraints on the correlations that can arise from any classical description [2]. The physically well-motivated causal assumptions are: (i) measurement independence: experimenters can choose which property of a system to measure, independently of how the system has been prepared; (ii) locality: the results obtained by one observer cannot be influenced by any action of the other (ideally space-like separated) experimenter. The resulting constraints are Bell's inequalities [1].
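The constraint implied by assumptions (i) and (ii) can be recovered by brute force: under both assumptions, every classical strategy is a mixture of deterministic ones, so it suffices to enumerate the deterministic local strategies. The following standard illustration (not the paper's linear-programming machinery) does this for the CHSH expression:

```python
import itertools

# Brute-force the classical (local deterministic) bound of the CHSH
# expression E(0,0) + E(0,1) + E(1,0) - E(1,1): each deterministic
# strategy fixes +-1 outcomes a_x for Alice and b_y for Bob, and the
# correlator E(x,y) is just the product a_x * b_y.
best = max(
    a0 * b0 + a0 * b1 + a1 * b0 - a1 * b1
    for a0, a1, b0, b1 in itertools.product([-1, 1], repeat=4)
)
print(best)  # 2 -- Bell's local bound; quantum correlations reach 2*sqrt(2)
```

Relaxing assumption (i) or (ii) enlarges the set of admissible strategies, and quantifying how much relaxation is needed to reach a given CHSH value is exactly the kind of question the linear programs above address.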
Quantum mechanical processes subject to the same causal structure can violate these constraints, a prediction that has been abundantly verified experimentally [3][4][5][6][7]. This effect is commonly referred to as quantum nonlocality.

It is now natural to ask how stable the effect of quantum nonlocality is with respect to relaxations of the causal assumptions. Which "degree of measurement dependence", for example, is required to reconcile empirically observed correlations with a classical and local model? Such questions are not only of great foundational relevance; they are also of interest for practical applications of nonlocality, e.g. in cryptographic protocols. Indeed, eavesdroppers can (and do [8]) exploit the failure of a given cryptographic device to be constrained by the presumed causal structure in order to compromise its security. At the same time, it will often be difficult to ascertain that the causal assumptions hold exactly, which makes it important to develop a systematic quantitative theory. Several variants of this question have recently attracted considerable attention [9][10][11][12][13][14][15][16][17][18][19][20]. For example, measurement dependence has been found ...
A marginal problem asks whether a given family of marginal distributions for some set of random variables arises from some joint distribution of these variables. Here we point out that the existence of such a joint distribution imposes non-trivial conditions already on the level of Shannon entropies of the given marginals. These entropic inequalities are necessary (but not sufficient) criteria for the existence of a joint distribution. For every marginal problem, a list of such Shannon-type entropic inequalities can be calculated by Fourier-Motzkin elimination, and we offer a software interface to a Fourier-Motzkin solver for doing so. For the case that the hypergraph of given marginals is a cycle graph, we provide a complete analytic solution to the problem of classifying all relevant entropic inequalities, and we use this result to bound the decay of correlations in stochastic processes. Furthermore, we show that Shannon-type inequalities for differential entropies are not relevant for continuous-variable marginal problems; non-Shannon-type inequalities are, both in the discrete and in the continuous case. In contrast to other approaches, our general framework easily adapts to situations where one has additional (conditional) independence requirements on the joint distribution, as in the case of graphical models. We end with a list of open problems. A complementary article discusses applications to quantum nonlocality and contextuality.
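The core step behind the abstract, Fourier-Motzkin elimination, projects a system of linear inequalities onto a subset of its variables: pairing every inequality with a positive coefficient on the eliminated variable against every one with a negative coefficient cancels it. A minimal toy implementation (not the solver interface mentioned above):

```python
# Fourier-Motzkin elimination of one variable from a system A x <= b.
# Each inequality is a (coeffs, bound) pair; variable index k is
# eliminated by combining positive- and negative-coefficient rows.
def eliminate(ineqs, k):
    pos = [(c, b) for c, b in ineqs if c[k] > 0]
    neg = [(c, b) for c, b in ineqs if c[k] < 0]
    out = [(c, b) for c, b in ineqs if c[k] == 0]
    for cp, bp in pos:
        for cn, bn in neg:
            # Scale the two rows so the coefficients on x_k cancel:
            # (-cn[k]) * (cp . x) + cp[k] * (cn . x) has zero x_k term.
            sp, sn = -cn[k], cp[k]
            c = [sp * a + sn * d for a, d in zip(cp, cn)]
            out.append((c, sp * bp + sn * bn))
    return out

# Example: from  x <= 1  and  -x + y <= 0  (i.e. y <= x), eliminating x
# yields the projected constraint  y <= 1.
system = [([1, 0], 1), ([-1, 1], 0)]
print(eliminate(system, 0))  # [([0, 1], 1)]
```

Applied to the elemental Shannon inequalities of a joint distribution, eliminating the entropies of non-observed variable sets leaves exactly the Shannon-type constraints on the given marginals, which is the computation the abstract describes.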