A boson sampling device is a specialized quantum computer that solves a problem which is strongly believed to be computationally hard for classical computers. Recently, a number of small-scale implementations have been reported, all based on multiphoton interference in multimode interferometers. As with several quantum simulation and computation tasks, an open problem in the hard-to-simulate regime is to what extent the correctness of the boson sampling outcomes can be certified. Here, we report new boson sampling experiments on larger photonic chips and analyse the data using a recently proposed scalable statistical test. We show that the test successfully validates small experimental data samples against the hypothesis that they are uniformly distributed. In addition, we show how to discriminate data arising from either indistinguishable or distinguishable photons. Our results pave the way towards larger boson sampling experiments whose functioning, despite being non-trivial to simulate, can be certified against alternative hypotheses.
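The idea of validating samples against the uniform hypothesis can be illustrated with a toy computation. The sketch below is a hypothetical illustration, not the specific test used in the experiment: it evaluates a row-norm estimator (cheap to compute, in the spirit of the Aaronson-Arkhipov discriminator) under the exact boson sampling distribution and under the uniform one, for a small interferometer where permanents are still tractable; the matrix sizes and the restriction to collision-free outputs are simplifying assumptions.

```python
import numpy as np
from itertools import combinations, permutations

# Illustrative sketch only: compares the mean of a row-norm estimator
# under the boson sampling distribution vs. the uniform distribution,
# for a toy-scale interferometer (8 modes, 3 photons).

rng = np.random.default_rng(1)
m, n = 8, 3                      # modes, photons (toy scale)

# Random unitary via QR of a complex Gaussian matrix (Haar-like).
G = rng.normal(size=(m, m)) + 1j * rng.normal(size=(m, m))
U, _ = np.linalg.qr(G)

inputs = list(range(n))
outputs = list(combinations(range(m), n))   # collision-free outputs

def permanent(M):
    # Brute-force permanent; fine for 3 x 3, exponential in general.
    k = M.shape[0]
    return sum(np.prod([M[i, p[i]] for i in range(k)])
               for p in permutations(range(k)))

def row_norm_stat(out):
    # Product of squared row norms of the selected submatrix:
    # cheap to compute and correlated with |permanent|^2.
    sub = U[np.ix_(out, inputs)]
    return float(np.prod(np.sum(np.abs(sub) ** 2, axis=1)))

R = np.array([row_norm_stat(out) for out in outputs])
P = np.array([abs(permanent(U[np.ix_(out, inputs)])) ** 2
              for out in outputs])
P /= P.sum()   # boson sampling probabilities, renormalized over
               # the collision-free sector for this toy example

# The estimator's mean is typically larger under the boson sampling
# distribution than under the uniform one, which is what allows small
# data samples to rule out the uniform hypothesis.
print(f"mean under boson sampling: {R @ P:.4f}")
print(f"mean under uniform:        {R.mean():.4f}")
```

Because the statistic avoids permanent evaluation on the data side, this style of test remains computable even in regimes where simulating the device itself is not.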
Phase estimation protocols provide a fundamental benchmark for the field of quantum metrology. The latter represents one of the most relevant applications of quantum theory, potentially enabling the measurement of unknown physical parameters with improved precision over classical strategies. Within this context, most theoretical and experimental studies have focused on determining the fundamental bounds and on how to achieve them in the asymptotic regime, where a large number of resources is employed. However, in most applications it is necessary to achieve optimal precision by performing only a limited number of measurements. To this end, machine learning techniques can be applied as a powerful optimization tool. Here, we experimentally implement single-photon adaptive phase estimation protocols enhanced by machine learning, showing the capability of reaching optimal precision after a small number of trials. In particular, we introduce a new approach to Bayesian estimation that exhibits the best performance for a very low number of photons N. Furthermore, we study the resilience to noise of the tested methods, showing that the optimized Bayesian approach is very robust in the presence of imperfections. Application of this methodology can be envisaged in the more general multiparameter case, which represents a paradigmatic scenario for several tasks including imaging and Hamiltonian learning.

Introduction. - Quantum metrology is one of the most promising applications of quantum theory [1][2][3][4][5], where the aim is to obtain enhanced performance in the estimation of unknown physical parameters by employing quantum resources. A notable benchmark for quantum metrology is provided by phase estimation, a task where the parameter to be measured is an optical phase embedded within an interferometric setup. In this scenario, an input probe field is prepared in a suitable state and sent through the system.
The value of the phase is retrieved by measuring the field after its evolution in the interferometer, and by repeating the procedure N times to perform a statistical analysis. While the ultimate precision achievable with classical resources is known to be bounded by the standard quantum limit (SQL), which states that the achievable error on the unknown phase φ scales as N^(-1/2) (with N the number of photons), the adoption of quantum inputs can in principle improve the performance up to the Heisenberg limit (HL) [1,2], scaling as N^(-1). Several theoretical and experimental studies [6][7][8][9][10][11][12][13][14] have focused on devising experimental schemes able to reach quantum-enhanced performance. Furthermore, recent advances in integrated photonics have opened new possibilities for the implementation and development of phase estimation protocols [15][16][17][18][19][20][21][22]. In parallel, a thorough investigation has been dedicated to identifying the effects of experimental noise and losses [23][24][25][26]. In the scenario where the parameter to be estimated is a single phase, it is always possible to identify the op...
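The Bayesian update loop underlying adaptive single-photon phase estimation can be sketched in a few lines. The code below is a minimal illustration under simplifying assumptions (an ideal lossless interferometer with outcome probability p(0 | φ, θ) = cos²((φ − θ)/2), a grid posterior, and a common heuristic feedback rule); it is not the authors' machine-learning-optimized protocol, and all names are illustrative.

```python
import numpy as np

# Minimal sketch of adaptive Bayesian phase estimation with single
# photons (ideal lossless model; heuristic adaptive rule, not the
# machine-learning-optimized protocol described in the text).

rng = np.random.default_rng(0)

true_phi = 1.2                                 # unknown phase (simulation only)
grid = np.linspace(0, 2 * np.pi, 1000, endpoint=False)
prior = np.full_like(grid, 1.0 / grid.size)    # flat prior over [0, 2pi)

def outcome_prob(phi, theta, d):
    # Probability of detector outcome d in {0, 1} given the phase phi
    # and the controllable feedback phase theta.
    p0 = np.cos((phi - theta) / 2.0) ** 2
    return p0 if d == 0 else 1.0 - p0

for trial in range(50):
    # Heuristic adaptive rule: bias theta by pi/2 relative to the
    # current circular-mean estimate, near the steepest fringe slope.
    mean_est = np.angle(np.sum(prior * np.exp(1j * grid)))
    theta = mean_est + np.pi / 2

    # Simulate one single-photon detection event.
    d = int(rng.random() > outcome_prob(true_phi, theta, 0))

    # Bayes update of the posterior on the phase grid.
    prior = prior * outcome_prob(grid, theta, d)
    prior /= prior.sum()

estimate = np.angle(np.sum(prior * np.exp(1j * grid))) % (2 * np.pi)
print(f"true phase: {true_phi:.3f}, estimate: {estimate:.3f}")
```

Each detection event multiplies the posterior by the outcome likelihood, so the error shrinks as measurements accumulate; optimizing the feedback rule (for instance with machine learning, as in the abstract above) is what determines how quickly the optimal precision is approached at small N.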
The launch of a satellite capable of distributing entanglement over long distances and the first loophole-free violation of Bell inequalities are milestones indicating a clear path towards the establishment of quantum networks. However, nonlocality in networks with independent entanglement sources has only been experimentally verified in simple tripartite networks, via the violation of bilocality inequalities. Here, by using a scalable photonic platform, we implement star-shaped quantum networks consisting of up to five distant nodes and four independent entanglement sources. We exploit this platform to violate the chained n-locality inequality and thus witness, in a device-independent way, the emergence of nonlocal correlations among the nodes of the implemented networks. These results open new perspectives for quantum information processing applications in the relevant regime where the observed correlations are compatible with standard local hidden variable models but are nonclassical if the independence of the sources is taken into account.
We report the experimental realization of a recently discovered quantum information protocol by Asher Peres implying an apparent non-local quantum mechanical retrodiction effect. The demonstration is carried out by apply-
Wave-particle duality has long been considered a fundamental signature of the non-classical behavior of quantum phenomena, especially in a delayed-choice experiment (DCE), where the experimental setup revealing either the particle or wave nature of the system is decided after the system has entered the apparatus. However, as counter-intuitive as it might seem, usual DCEs do have a simple causal explanation. Here, we take a different route: under a natural assumption about the dimensionality of the system under test, we present an experimental proof of the non-classicality of a DCE based on the violation of a dimension-witness inequality. Our conclusion is reached in a device-independent and loophole-free manner, that is, based solely on the observed data and without the need for any assumptions about the measurement apparatus.