Heat spontaneously flows from hot to cold in standard thermodynamics. However, the latter theory presupposes the absence of initial correlations between interacting systems. We here experimentally demonstrate the reversal of heat flow for two quantum correlated spins-1/2, initially prepared in local thermal states at different effective temperatures, employing a Nuclear Magnetic Resonance setup. We observe a spontaneous energy flow from the cold to the hot system. This process is enabled by a trade-off between correlations and entropy that we quantify with information-theoretical quantities. These results highlight the subtle interplay of quantum mechanics, thermodynamics, and information theory. They further provide a mechanism to control heat on the microscale.
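The reversal described above can be illustrated with a small numerical sketch (a toy model under stated assumptions, not the experimental NMR protocol): two qubits in local thermal states at different temperatures, coupled by an energy-conserving exchange interaction, with an initial correlation term χ = α(|01⟩⟨10| + |10⟩⟨01|) added to the product state. The parameter values below are illustrative choices.

```python
import numpy as np

def thermal_qubit(beta):
    # local Hamiltonian h = |1><1| (ground-state energy 0, excited energy 1)
    p1 = np.exp(-beta) / (1 + np.exp(-beta))
    return np.diag([1 - p1, p1])

def evolve(rho, H, t):
    # U = exp(-i H t) via eigendecomposition (H is Hermitian)
    w, V = np.linalg.eigh(H)
    U = V @ np.diag(np.exp(-1j * w * t)) @ V.conj().T
    return U @ rho @ U.conj().T

beta_hot, beta_cold = 0.5, 2.0                    # qubit A hot, qubit B cold
rho0 = np.kron(thermal_qubit(beta_hot), thermal_qubit(beta_cold))

# initial quantum correlation term chi = alpha(|01><10| + |10><01|);
# alpha = 0.1 keeps rho0 + chi a valid (positive) state
alpha = 0.1
chi = np.zeros((4, 4))
chi[1, 2] = chi[2, 1] = alpha

# energy-conserving exchange interaction H_int = i g (|10><01| - |01><10|)
g = 1.0
H_int = np.zeros((4, 4), dtype=complex)
H_int[2, 1] = 1j * g
H_int[1, 2] = -1j * g

H_B = np.diag([0, 1, 0, 1])                       # local Hamiltonian of B (cold)

def heat_into_cold(rho, t):
    rho_t = evolve(rho, H_int, t)
    return np.real(np.trace(H_B @ (rho_t - rho)))

t = 0.3
dE_uncorr = heat_into_cold(rho0, t)               # > 0: heat flows hot -> cold
dE_corr = heat_into_cold(rho0 + chi, t)           # < 0: flow is reversed
print(dE_uncorr, dE_corr)
```

Without correlations the cold qubit absorbs energy, as usual; the correlation term reverses the sign of the energy flow over the same interaction time.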
We derive detailed and integral quantum fluctuation theorems for heat exchange in a quantum correlated bipartite thermal system using the framework of dynamic Bayesian networks. Contrary to the usual two-projective-measurement scheme, which is known to destroy quantum features, these fluctuation relations fully capture quantum correlations and quantum coherence at arbitrary times.

Fluctuation theorems are fundamental generalizations of the second law of thermodynamics for small systems. While the entropy production Σ is a nonnegative deterministic quantity for macroscopic systems, it becomes random at the microscopic scale owing to the presence of nonnegligible thermal [1,2] or quantum [3,4] fluctuations. Detailed fluctuation theorems quantify the probability of occurrence of negative entropy production events via the general relation P(Σ)/P(−Σ) = exp(Σ) [5]. Integral fluctuation theorems take the form ⟨exp(−Σ)⟩ = 1 after integration over Σ. The convexity of the exponential function, via Jensen's inequality, then implies that the entropy production is only positive on average, ⟨Σ⟩ ≥ 0. The generic validity of fluctuation theorems arbitrarily far from equilibrium makes them particularly useful in nonequilibrium physics. They have been extensively investigated for this reason, both theoretically and experimentally, for classical systems [6,7]. These studies have provided unique insight into the thermodynamics of microscopic systems, from colloidal particles to enzymes and molecular motors [1,2].

The situation is more involved in the quantum regime. Quantum fluctuation theorems are commonly studied within the two-point-measurement (TPM) scheme [3,4]. In this approach, the energy change, and in turn the entropy production, of a quantum system are determined for individual realizations by projectively measuring the energy at the beginning and at the end of a nonequilibrium protocol [8].
Equivalent formulations based on Ramsey-like interferometry [10,11] and generalized measurements [12] have also been proposed. These methods have been used to perform experimental tests of quantum fluctuation theorems, both for mechanically driven [13][14][15] and thermally driven [16] systems, using NMR, trapped-ion and cold-atom setups. The TPM procedure successfully captures the discrete quantum energy spectrum of the system, as well as its nonequilibrium quantum dynamics between the two measurements [9]. However, due to its projective nature, it completely fails to account for quantum correlations and quantum coherence, two central features of quantum theory, that may be present in the initial and final states of the system. In that sense, the TPM scheme may thus be viewed as not fully quantum.

In this paper, we present detailed and integral quantum fluctuation theorems for heat exchange between quantum correlated bipartite thermal systems using a dynamic Bayesian network approach [17,18]. Global and local descriptions of a composite system usually differ because of quantum correlations. The dynamic Bayesian network offers a powerful framework to specify the local dynam...
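The detailed and integral fluctuation theorems quoted above can be checked numerically. A Gaussian entropy-production distribution with variance equal to twice its mean satisfies P(Σ)/P(−Σ) = exp(Σ) exactly, so it makes a convenient test distribution (an illustrative choice, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Gaussian entropy production with variance = 2 * mean satisfies the
# detailed fluctuation theorem P(S)/P(-S) = exp(S) exactly
mu = 1.0
S = rng.normal(mu, np.sqrt(2 * mu), size=1_000_000)

# integral fluctuation theorem: <exp(-S)> = 1
ift = np.mean(np.exp(-S))

# detailed theorem on a histogram: ln[P(S)/P(-S)] = S.  Using raw counts
# with identical bins keeps the two histograms on the same normalization.
bins = np.linspace(0.2, 2.0, 10)
counts_pos, _ = np.histogram(S, bins=bins)
counts_neg, _ = np.histogram(-S, bins=bins)
centers = 0.5 * (bins[:-1] + bins[1:])
log_ratio = np.log(counts_pos / counts_neg)

print(ift)                                   # ~ 1.0
print(S.mean())                              # ~ 1.0 (> 0, by Jensen)
print(np.max(np.abs(log_ratio - centers)))   # ~ 0, up to sampling noise
```

Negative entropy-production events do occur (here with probability ≈ 0.24), but they are exponentially suppressed exactly as the detailed theorem dictates.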
Maxwell's demon explores the role of information in physical processes. Employing information about microscopic degrees of freedom, this "intelligent observer" is capable of compensating entropy production (or extracting work), apparently challenging the second law of thermodynamics. From a modern standpoint, the demon is regarded as a feedback control mechanism, and the limits of thermodynamics are recast to incorporate information-to-energy conversion. We derive a trade-off relation between information-theoretic quantities empowering the design of an efficient Maxwell's demon in a quantum system. The demon is experimentally implemented as a spin-1/2 quantum memory that acquires information and employs it to control the dynamics of another spin-1/2 system through a natural interaction. Noise and imperfections in this protocol are investigated by assessing its effectiveness. This realization provides experimental evidence that the irreversibility of a nonequilibrium dynamics can be mitigated by assessing microscopic information and applying a feed-forward strategy at the quantum scale.

Connections between thermodynamics and information theory have produced important insights and useful applications in the past few years, turning this into a dynamic field [1][2][3][4]. Its genesis traces back to the famous Maxwell's demon gedanken experiment [5][6][7][8][9]. In 1867, Maxwell conceived a "neat fingered being" with the ability to gather information about the microscopic state of a gas and use this information to transfer fast particles to a hot medium and slow particles to a cold one, engendering an apparent conflict with the second law of thermodynamics.
Several approaches to this conundrum have been put forward [5][6][7][8][9], but only after more than a century, in 1982, did Bennett [10] realize that the apparent contradiction with the second law could be resolved by invoking Landauer's erasure principle [11][12][13][14].

Theoretical endeavors to incorporate information into thermodynamics have acquired practical relevance with recent technological progress, now that information can be manipulated at the micro- and nanoscale. A modern framework for these endeavors explicitly takes into account the change introduced in the statistical description of the system by the assessment of its microscopic information [15]. This outlines an illuminating paradigm for Maxwell's demon, in which information-to-energy conversion is governed by fluctuation theorems, which hold for small systems arbitrarily far from equilibrium [16][17][18][19][20]. Generalizations of the second law in the presence of feedback control can be obtained from this framework, establishing bounds for information-based work extraction [21]. Notwithstanding their fundamental relevance, these relations do not provide a clear recipe for building a demon in a laboratory setting. Owing to the challenges associated with high-precision microscopic control, there ar...
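The feedback fluctuation theorems mentioned above can be made concrete with the textbook Szilard engine subject to measurement error (a standard illustration, not the specific spin-1/2 demon implemented in the paper). With error rate eps and feedback volume fraction v, the generalized integral fluctuation theorem ⟨exp(βW_ext − I)⟩ = 1 holds for any v, and the average extracted work is bounded by kT times the average stochastic mutual information:

```python
import numpy as np

# Szilard engine with a noisy measurement (error rate eps); units of kT = 1.
# The demon measures which side the particle is on, errs with probability
# eps, then expands the piston so the measured side gets volume fraction v.
eps = 0.1
p = np.array([1 - eps, eps])             # P(measurement correct), P(wrong)
I = np.array([np.log(2 * (1 - eps)),     # stochastic mutual information
              np.log(2 * eps)])          # for each outcome

def W_ext(v):
    # work extracted (quasistatic, isothermal) when the particle is / is
    # not on the side assigned volume fraction v
    return np.array([np.log(2 * v), np.log(2 * (1 - v))])

def ift(v):
    # Sagawa-Ueda generalized integral fluctuation theorem with feedback:
    # <exp(W_ext - I)> = 1 for ANY feedback choice v
    return np.sum(p * np.exp(W_ext(v) - I))

print(ift(0.5), ift(0.7), ift(1 - eps))  # all ~ 1.0

# second law with information: <W_ext> <= <I>, saturated at v = 1 - eps
v_opt = 1 - eps
print(p @ W_ext(v_opt), p @ I)           # both ~ 0.368 = ln 2 - H(eps)
```

An error-free measurement (eps → 0) recovers the familiar kT ln 2 of the ideal Szilard engine; measurement noise reduces the extractable work exactly by the lost mutual information.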
It is well known that a quantum correlated probe can yield better precision in estimating an unknown parameter than is classically possible. However, how such a quantum probe should be measured remains somewhat elusive. We examine the role of measurements in quantum metrology by considering two types of readout strategies: coherent, where all probes are measured simultaneously in an entangled basis; and adaptive, where probes are measured sequentially, with each measurement conditioned on the prior outcomes. We first show that for classically correlated probes the two readout strategies yield the same precision. Second, we construct an example of a noisy multipartite quantum system where coherent readout yields considerably better precision than adaptive readout. This highlights a fundamental difference between classical and quantum parameter estimation. From the practical point of view, our findings are relevant for the optimal design of precision-measurement quantum devices.
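As background for the precision comparison above, a minimal sketch of how a readout's precision is quantified: the classical Fisher information of the outcome statistics, which enters the Cramér–Rao bound Var(φ) ≥ 1/(N·F). The single-qubit Ramsey example below is an illustrative assumption; the paper's noisy multipartite scenario is richer.

```python
import numpy as np

# Phase estimation with a single-qubit probe: |+> evolves to
# (|0> + e^{-i phi}|1>)/sqrt(2); an X-basis readout yields outcome
# probabilities p(+|phi) = cos^2(phi/2), p(-|phi) = sin^2(phi/2).
def probs(phi):
    return np.array([np.cos(phi / 2) ** 2, np.sin(phi / 2) ** 2])

def fisher(phi, dphi=1e-6):
    # classical Fisher information F = sum_k (dp_k/dphi)^2 / p_k,
    # with the derivative taken by central finite differences
    p = probs(phi)
    dp = (probs(phi + dphi) - probs(phi - dphi)) / (2 * dphi)
    return np.sum(dp ** 2 / p)

phi = 0.7
F = fisher(phi)
print(F)   # ~ 1.0: this readout saturates the quantum Fisher information
# Cramer-Rao bound: with N independent repetitions, Var(phi) >= 1/(N*F)
```

For this noiseless single-probe case a fixed local readout is already optimal; the point of the paper is that with noisy, quantum correlated probes the choice between coherent and adaptive readout starts to matter.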
Connections between information theory and thermodynamics have proven very useful for establishing bounding limits on physical processes. Ideas such as Landauer's erasure principle and information-assisted work extraction have greatly contributed not only to enlarging our understanding of the fundamental limits imposed by nature, but also to lighting the path toward practical implementations of information processing devices. The intricate information-thermodynamics relation also entails a fundamental limit on parameter estimation, establishing a thermodynamic cost for information acquisition. We show that the amount of information that can be encoded in a physical system by means of a unitary process is limited by the work dissipated during the implementation of the process. This implies a thermodynamic trade-off for information acquisition: the acquisition process is ultimately limited by the second law of thermodynamics. This trade-off may find applications in several areas of knowledge.

PACS numbers: 03.65.Ta

Information theory first met thermodynamics when Maxwell introduced his famous Demon [1]. This relation became clear with Brillouin's treatment of the information entropy (due to Shannon) and the thermodynamic entropy (due to Boltzmann) on the same footing [2]. Many advances linking these two apparently distinct areas have been achieved since then, one of the most remarkable being Landauer's erasure principle [3]. This principle, introduced as an effective way to exorcize Maxwell's Demon, states that erasure of information is a logically irreversible process that must dissipate energy.
More recently, developments in this direction include theoretical and experimental investigations of Landauer's principle and its consequences [4,5], work extraction by feedback control of microscopic systems [6][7][8][9][10], and links between the second law of thermodynamics and two fundamental quantum mechanical principles, namely the wave-function collapse [11] and the uncertainty relation [12]. Here, we introduce a thermodynamic trade-off for information acquisition, which relates the uncertainty of the information acquired in a parameter estimation process to the work dissipated by the encoding process. This trade-off relation is obtained through a formal connection between a central quantity of estimation theory, the Fisher information [2,5,13,14], and the Jarzynski equality [17].

I. RESULTS

Natural sciences are based on experimental and phenomenological facts. Parameter estimation protocols play a central role in the observation of new phenomena and in the validation of theoretical predictions. Suppose we want to determine the value of some parameter, say ϕ. This task can generally be accomplished by employing a probe, ρ_T. We will assume that the probe state is initially in thermal equilibrium at absolute temperature T, so ρ_T is the canonical equilibrium (Gibbs)...
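The Jarzynski equality invoked above, ⟨exp(−βW)⟩ = exp(−βΔF), can be verified numerically for a Gaussian work distribution, for which self-consistency fixes the mean at ⟨W⟩ = ΔF + β·Var(W)/2 (an illustrative near-equilibrium case, not the paper's encoding protocol; all parameter values are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Gaussian work distribution satisfying the Jarzynski equality:
# the mean must sit at dF + beta*var/2 for <exp(-beta W)> = exp(-beta dF)
beta, dF, var = 1.0, 2.0, 0.5
W = rng.normal(dF + beta * var / 2, np.sqrt(var), size=1_000_000)

lhs = np.mean(np.exp(-beta * W))
print(lhs, np.exp(-beta * dF))   # both ~ 0.135
print(W.mean() - dF)             # dissipated work ~ 0.25 >= 0
```

The dissipated work ⟨W⟩ − ΔF is strictly positive here, consistent with the second law, even though individual trajectories with W < ΔF occur in the sample.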
Despite their theoretical importance, dynamic Bayesian networks associated with quantum processes are currently not accessible experimentally. We here describe a general scheme to determine the multi-time path probability of a Bayesian network based on local measurements on independent copies of a composite quantum system combined with postselection. We further show that this protocol corresponds to a non-projective measurement. It thus allows the investigation of the multi-time properties of a given local observable while fully preserving all its quantum features.

Dynamic Bayesian networks offer a powerful framework to analyze conditional dependencies in a set of time-dependent random quantities. In this approach, relationships between dynamical variables are specified through conditional probabilities evaluated via Bayes' rule [1][2][3][4]. They have found widespread application in statistics, engineering and computer science to model time series in probabilistic models. Hidden Markov models and Kalman filters are special cases of such networks [1-4]. In the past decade, Bayesian networks have been successfully employed to investigate the nonequilibrium thermodynamics of small, composite systems, both in the classical [5][6][7][8][9][10][11][12] and quantum [13][14][15][16] regimes. They have, in particular, been used to obtain fluctuation theorems, fundamental generalizations of the second law that characterize fluctuations of the entropy production arbitrarily far from equilibrium [17], for multipartite systems [5][6][7][8][9][10][11][12][13][14][15][16].

An interesting property of dynamic Bayesian networks is that they allow the local dynamics of a composite quantum system to be specified conditioned on its global state. The Bayesian network formalism thus preserves all the quantum features of the system, especially quantum correlations [18] and quantum coherence [19].
As a result, it permits going beyond the standard two-projective-measurement scheme [20][21][22], which, owing to its projective nature, destroys off-diagonal density matrix elements. This characteristic has recently been exploited to derive fully quantum fluctuation theorems that not only account for the quantum nonequilibrium dynamics of a driven system [23], as in the two-projective-measurement approach, but also fully capture both quantum correlations and quantum coherence at arbitrary times [13][14][15]. However, while a number of methods to implement the two-projective-measurement approach (and its variants) have been both theoretically developed [24][25][26][27] and experimentally demonstrated [28][29][30][31][32][33][34], to date, no such protocol exists for dynamic Bayesian networks.
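The multi-time path probabilities at the heart of a dynamic Bayesian network factorize by the chain rule of conditional probability. A minimal classical two-step binary chain sketches the structure (a toy example for intuition; the quantum protocol is the subject of the paper):

```python
import numpy as np

# Two-step dynamic Bayesian network for a binary variable X_t:
# P(x0, x1, x2) = p0(x0) * T[x1, x0] * T[x2, x1]   (chain / Bayes' rule)
p0 = np.array([0.7, 0.3])
T = np.array([[0.9, 0.2],
              [0.1, 0.8]])   # T[j, i] = P(X_{t+1} = j | X_t = i)

def path_prob(path):
    p = p0[path[0]]
    for a, b in zip(path, path[1:]):
        p *= T[b, a]
    return p

# the 8 path probabilities must sum to 1
total = sum(path_prob((x0, x1, x2))
            for x0 in (0, 1) for x1 in (0, 1) for x2 in (0, 1))

# conditional dependence across time via Bayes' rule: P(X0 = 0 | X2 = 0)
joint = sum(path_prob((0, x1, 0)) for x1 in (0, 1))
marg = sum(path_prob((x0, x1, 0)) for x0 in (0, 1) for x1 in (0, 1))
posterior = joint / marg
print(total)       # 1.0
print(posterior)   # > prior 0.7: observing X2 updates belief about X0
```

The measurement-and-postselection scheme of the paper is designed to reconstruct exactly such multi-time path probabilities for a local quantum observable, without the disturbance a projective scheme would introduce.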