2021
DOI: 10.48550/arxiv.2103.01240
Preprint

Scalable Hamiltonian learning for large-scale out-of-equilibrium quantum dynamics

Agnes Valenti,
Guliuxin Jin,
Julian Léonard
et al.

Abstract: Large-scale quantum devices provide insights beyond the reach of classical simulations. However, for a reliable and verifiable quantum simulation, the building blocks of the quantum device require exquisite benchmarking. This benchmarking of large-scale dynamical quantum systems represents a major challenge due to the lack of efficient tools for their simulation. Here, we present a scalable algorithm based on neural networks for Hamiltonian tomography in out-of-equilibrium quantum systems. We illustrate our approa…
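For orientation, the sketch below illustrates the underlying task of Hamiltonian tomography from out-of-equilibrium dynamics at toy scale: recover the couplings and fields of a small transverse-field Ising chain by matching simulated quench dynamics to noisy "measured" observables. This is a minimal least-squares illustration, not the paper's neural-network algorithm; the system size, observables, and parameter values are assumptions chosen for brevity.

```python
"""
Minimal illustration (not the paper's neural-network method) of Hamiltonian
tomography from quench dynamics: fit transverse-field Ising parameters so that
simulated out-of-equilibrium observables match noisy "measured" data.
"""
import numpy as np
from scipy.linalg import expm
from scipy.optimize import least_squares

# Single-site operators for a small spin chain
I2 = np.eye(2)
SX = np.array([[0, 1], [1, 0]], dtype=complex)
SZ = np.array([[1, 0], [0, -1]], dtype=complex)

def kron_chain(ops):
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def ising_hamiltonian(J, h, n):
    """H = sum_i J_i sz_i sz_{i+1} + sum_i h_i sx_i on an open chain."""
    H = np.zeros((2**n, 2**n), dtype=complex)
    for i in range(n - 1):
        ops = [I2] * n
        ops[i], ops[i + 1] = SZ, SZ
        H += J[i] * kron_chain(ops)
    for i in range(n):
        ops = [I2] * n
        ops[i] = SX
        H += h[i] * kron_chain(ops)
    return H

def quench_signal(params, n, times, psi0):
    """Stack <sz_i(t)> for all sites and times after a quench from psi0."""
    J, h = params[: n - 1], params[n - 1 :]
    H = ising_hamiltonian(J, h, n)
    obs = []
    for t in times:
        psi_t = expm(-1j * H * t) @ psi0
        for i in range(n):
            ops = [I2] * n
            ops[i] = SZ
            obs.append(np.real(psi_t.conj() @ kron_chain(ops) @ psi_t))
    return np.array(obs)

n = 3
times = np.linspace(0.1, 2.0, 15)
psi0 = np.zeros(2**n, dtype=complex)
psi0[0] = 1.0                                       # |000> product state as the quench start

true_params = np.array([1.0, 0.8, 0.5, 0.6, 0.7])   # [J_1, J_2, h_1, h_2, h_3], illustrative values
rng = np.random.default_rng(0)
data = quench_signal(true_params, n, times, psi0)
data += rng.normal(scale=0.01, size=data.shape)     # mimic measurement noise

# Hamiltonian learning as least-squares fitting of the observed dynamics
fit = least_squares(
    lambda p: quench_signal(p, n, times, psi0) - data,
    x0=np.ones_like(true_params),
)
print("true :", true_params)
print("fit  :", np.round(fit.x, 3))
```

The neural-network approach described in the abstract targets the regime where such exact simulation of the full dynamics becomes intractable, which is exactly what this brute-force sketch cannot scale to.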

Cited by 4 publications (5 citation statements)
References 42 publications
“…Here we show how the discriminator network of the generative model allows one to directly extract the physical parameters of a given dynamical correlator. The estimation of physical parameters from data is commonly referred to as Hamiltonian learning and has been explored with a variety of machine learning techniques [81-86]. While these methodologies are usually developed specifically for this purpose, conditional generative algorithms provide this functionality as a direct consequence of their training.…”
Section: A. Hamiltonian Learning With the Generative Model (mentioning, confidence: 99%)
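To make the quoted idea concrete, here is a hedged sketch of discriminator-based parameter extraction: a conditional scorer D(correlator, θ) rates how compatible an observed dynamical correlator is with a conditioning parameter θ, and the physical parameter is read off as the θ that maximizes the score. The toy correlator and the template-matching "discriminator" below are stand-ins, not the trained networks of the cited work.

```python
"""
Hedged sketch of parameter extraction via a conditional discriminator:
scan a conditioning parameter theta and keep the value the discriminator
scores as most compatible with the observed dynamical correlator.
"""
import numpy as np

def toy_correlator(theta, omega):
    # Placeholder "dynamical correlator": a Lorentzian peak whose position
    # is set by the physical parameter theta (purely illustrative).
    return 1.0 / ((omega - theta) ** 2 + 0.05)

def toy_discriminator(correlator, theta, omega):
    # Stand-in for a trained conditional discriminator: high score when the
    # observed correlator matches the one expected at conditioning value theta.
    expected = toy_correlator(theta, omega)
    return -np.mean((correlator - expected) ** 2)

omega = np.linspace(0.0, 4.0, 200)
observed = toy_correlator(1.7, omega) + np.random.default_rng(1).normal(0.0, 0.2, omega.size)

# Hamiltonian-learning step: maximize the discriminator score over a parameter grid
theta_grid = np.linspace(0.5, 3.0, 251)
scores = [toy_discriminator(observed, th, omega) for th in theta_grid]
theta_hat = theta_grid[int(np.argmax(scores))]
print(f"extracted parameter: {theta_hat:.3f} (true value 1.7)")
```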
“…For the three models considered, the accuracy of the estimation shows small variations across the different parameter realizations and noise levels, yet overall yields estimates in the vicinity of the exact conditional parameters for each of the studied systems. While we performed parameter extraction here solely with the dynamical correlators, it is worth noting that an analogous procedure can be implemented by training a generative model with combined time-dependent [86,88] or ground-state observables [89,90]. Finally, while we focused here on simulated dynamical correlators, this procedure can be readily applied to experimentally measured spin excitations [64,65], providing a route to experimental Hamiltonian extraction with conditional generative adversarial networks.…”
Section: A. Hamiltonian Learning With the Generative Model (mentioning, confidence: 99%)
“…its Hamiltonian Ĥ0, consisting of a sum of independent terms, each of which corresponds to a unique physical interaction contributing to Q's dynamics. A growing set of quantum parameter estimation algorithms, such as quantum Hamiltonian learning (QHL) [7-11] among others [12-21], characterise quantum systems whose model is known in advance by inferring an optimal parameterisation. Leveraging parameter learning as a subroutine, we introduce the quantum model learning agent (QMLA), which aims to compose an approximate model Ĥ by testing and comparing a series of candidate models against data drawn from Q. QMLA differs from quantum parameter estimation algorithms by removing the assumption of the precise form of Ĥ0; instead we use an optimisation routine to determine which terms ought to be included in Ĥ, thereby determining which interactions Q is subject to.…”
Section: Introduction (mentioning, confidence: 99%)
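As a companion to the quoted description, the sketch below shows the kind of parameter-learning subroutine that such model-learning schemes build on: grid-based Bayesian inference of a single Hamiltonian parameter, here a precession frequency in H = (ω/2) σ_z, from simulated projective measurements. It is a minimal illustration under assumed experiment times and a toy likelihood, not the QHL or QMLA algorithms of the cited work.

```python
"""
Minimal sketch (not QHL or QMLA themselves) of a quantum parameter estimation
subroutine: Bayesian inference of a single Hamiltonian parameter, the
precession frequency omega in H = (omega/2) sigma_z, from simulated shots.
"""
import numpy as np

rng = np.random.default_rng(42)
omega_true = 1.3                          # hidden parameter to be learned (illustrative)

def prob_plus(omega, t):
    # Likelihood of the |+> outcome after preparing |+>, evolving under
    # H = (omega/2) sigma_z for time t, and measuring in the X basis:
    # P(+) = cos^2(omega t / 2).
    return np.cos(omega * t / 2.0) ** 2

# Discretized uniform prior over the unknown frequency
omega_grid = np.linspace(0.0, 3.0, 600)
posterior = np.ones_like(omega_grid) / omega_grid.size

for t in np.linspace(0.5, 10.0, 40):      # sequence of experiment times
    outcome_plus = rng.random() < prob_plus(omega_true, t)          # simulated single shot
    likelihood = prob_plus(omega_grid, t) if outcome_plus else 1.0 - prob_plus(omega_grid, t)
    posterior *= likelihood               # Bayes update on the grid
    posterior /= posterior.sum()

estimate = omega_grid[int(np.argmax(posterior))]
print(f"true omega = {omega_true}, estimated omega = {estimate:.3f}")
```

In a QMLA-style workflow, each candidate model would be fitted with a subroutine of this kind and the candidates then compared against the data to decide which interaction terms to retain.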