Extrinsic interference is routinely faced in systems engineering, and a common solution is to rely on a broad class of filtering techniques to afford stability to intrinsically unstable systems or to isolate particular signals from a noisy background. Experimentalists leading the development of a new generation of quantum-enabled technologies similarly encounter time-varying noise in realistic laboratory settings. They face substantial challenges in either suppressing such noise for high-fidelity quantum operations [1] or controllably exploiting it in quantum-enhanced sensing [2-4] or system-identification tasks [5,6], owing to a lack of efficient, validated approaches to understanding and predicting quantum dynamics in the presence of realistic time-varying noise. In this work we use the theory of quantum control engineering, demonstrating the utility of its constructs for directly predicting the evolution of a quantum state in a realistic noisy environment as well as for developing novel robust control and sensing protocols. These experiments provide a significant advance in our understanding of the physics underlying controlled quantum dynamics, and unlock new capabilities for the emerging field of quantum systems engineering.

Time-varying noise coupled to quantum systems (typically qubits) generically results in decoherence, a loss of 'quantumness' of the system. Broadly, one may think of the state of the quantum system becoming randomized through uncontrolled (and often uncontrollable) interactions with the environment during both idle periods and active control operations (Fig. 1a). Despite the ubiquity of this phenomenon, it is a challenging problem to predict the average evolution of a qubit state undergoing a specific but arbitrary operation in the presence of realistic time-dependent noise: how much randomization does one expect, and how well can one perform the target operation?
Making such predictions accurately is precisely the capability that experimentalists require in realistic laboratory settings. Moreover, this capability is fundamental to the development of novel control techniques designed to modify or suppress decoherence as researchers attempt to build quantum-enabled technologies for applications such as quantum information and quantum sensing. These considerations motivate the development of novel engineering-inspired analytic tools enabling a user to accurately predict the behaviour of a controlled quantum system in realistic laboratory environments. Recent work has demonstrated that the ensemble-average dynamics of a controlled qubit state may be captured using filter-transfer functions (FFs) characterizing the control. The fidelity of an arbitrary operation of duration τ is degraded owing to frequency-domain spectral overlap between the environmental noise, given by a power spectrum S(ω), and the filter-transfer functions, denoted F(ω) (Methods) [11-14]. The FF description of ensemble-average quantum dynamics tremendously simplifies the task of analysing the expected performance…
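As a sketch of how such a spectral-overlap calculation proceeds in practice, the snippet below numerically evaluates an FF overlap integral. The 1/f noise spectrum, the free-evolution (Ramsey) filter function, the integration limits, and the Gaussian-dephasing fidelity formula are all illustrative assumptions, not values or definitions taken from this work:

```python
import numpy as np

# Illustrative 1/f dephasing-noise power spectrum (arbitrary units).
def S(omega):
    return 1.0 / omega

# First-order filter-transfer function for free (Ramsey) evolution of
# duration tau: F(omega) = 4 * sin^2(omega * tau / 2).
def F_free(omega, tau):
    return 4.0 * np.sin(omega * tau / 2.0) ** 2

def chi(tau, w_min=1e-2, w_max=1e3, n=200_000):
    """Overlap integral chi(tau) = (1/2pi) * Int S(w) F(w) / w^2 dw,
    evaluated by the trapezoidal rule on a logarithmic grid."""
    w = np.logspace(np.log10(w_min), np.log10(w_max), n)
    y = S(w) * F_free(w, tau) / w**2
    return np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(w)) / (2.0 * np.pi)

def avg_fidelity(tau):
    # Ensemble-average fidelity under Gaussian dephasing noise:
    # longer evolution -> larger spectral overlap -> lower fidelity.
    return 0.5 * (1.0 + np.exp(-2.0 * chi(tau)))
```

The key point the sketch makes concrete is that fidelity loss is set entirely by where F(ω) admits spectral weight of S(ω): shaping the control reshapes F(ω), and hence the decoherence, without any change to the noise itself.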
Among the most popular and well-studied quantum characterization, verification and validation techniques is randomized benchmarking (RB), an important statistical tool used to characterize the performance of physical logic operations useful in quantum information processing. In this work we provide a detailed mathematical treatment of the effect of temporal noise correlations on the outcomes of RB protocols. We provide a fully analytic framework capturing the accumulation of error in RB, expressed in terms of a three-dimensional random walk in "Pauli space." Using this framework we derive the probability density function describing RB outcomes (averaged over noise) for both Markovian and correlated errors, which we show is generally described by a gamma distribution with shape and scale parameters depending on the correlation structure. Long temporal correlations impart large nonvanishing variance and skew the distribution towards high-fidelity outcomes, consistent with existing experimental data, highlighting potential finite-sampling pitfalls and the divergence of the mean RB outcome from worst-case errors in the presence of noise correlations. We use the filter-transfer function formalism to reveal the underlying reason for these differences in terms of effective coherent averaging of correlated errors in certain random sequences. We conclude by commenting on the impact of these calculations on the utility of single-metric approaches to quantum characterization, verification, and validation.
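The random-walk picture can be illustrated with a minimal Monte Carlo sketch (the error model, gate count, and noise strength below are illustrative assumptions, not parameters from this work): each gate contributes a small error vector in Pauli space whose direction is scrambled by the random gate sequence. Markovian noise draws an independent error magnitude for every gate, while quasi-static (strongly correlated) noise fixes a single magnitude for the whole sequence:

```python
import numpy as np

rng = np.random.default_rng(0)

def rb_infidelities(n_seq=2000, n_gates=100, eps=0.01, correlated=False):
    """Monte Carlo sketch of RB error accumulation as a 3D random walk.
    Each gate adds a step of magnitude eps_j in a direction randomized
    by the sequence; the sequence infidelity is taken proportional to
    the squared length of the resulting walk."""
    out = np.empty(n_seq)
    for k in range(n_seq):
        if correlated:
            # Quasi-static noise: one magnitude per sequence.
            mags = np.full(n_gates, eps * rng.standard_normal())
        else:
            # Markovian noise: independent magnitude per gate.
            mags = eps * rng.standard_normal(n_gates)
        dirs = rng.standard_normal((n_gates, 3))
        dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
        walk = (mags[:, None] * dirs).sum(axis=0)
        out[k] = np.dot(walk, walk)
    return out

markovian = rb_infidelities(correlated=False)
quasi_static = rb_infidelities(correlated=True)
```

With matched mean infidelity, the correlated case produces a far broader and more skewed distribution over sequences, mirroring the gamma-distributed outcomes and finite-sampling pitfalls described above.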