2022
DOI: 10.48550/arxiv.2202.08253
Preprint

Protocols for Trainable and Differentiable Quantum Generative Modelling

Abstract: We propose an approach for learning probability distributions as differentiable quantum circuits (DQC) that enable efficient quantum generative modelling (QGM) and synthetic data generation. Contrary to existing QGM approaches, we perform training of a DQC-based model, where data is encoded in a latent space with a phase feature map, followed by a variational quantum circuit. We then map the trained model to the bit basis using a fixed unitary transformation, coinciding with a quantum Fourier transform circuit …
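The latent-to-bit-basis step in the abstract (phase feature map followed by a fixed quantum Fourier transform) can be illustrated with a small dense-matrix sketch. This is a minimal illustration, not the paper's construction: it omits the variational circuit entirely, and the function names and the specific phase convention are assumptions. It shows how a phase-encoded latent coordinate aligned to the 2^n grid maps, under the inverse QFT, to a single bit-basis outcome.

```python
import numpy as np

def phase_feature_map(x, n_qubits):
    """Product state whose basis amplitudes carry the phases exp(2*pi*i*m*x).

    Illustrative encoding, not the paper's exact feature map.
    """
    state = np.array([1.0 + 0j])
    # Most-significant qubit first, so basis index m accumulates phase 2*pi*m*x.
    for j in reversed(range(n_qubits)):
        qubit = np.array([1.0, np.exp(2j * np.pi * (2**j) * x)]) / np.sqrt(2)
        state = np.kron(state, qubit)
    return state

def qft(n_qubits):
    """Dense matrix of the quantum Fourier transform on n_qubits (a fixed unitary)."""
    N = 2**n_qubits
    j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    return np.exp(2j * np.pi * j * k / N) / np.sqrt(N)

n = 3
x = 5 / 2**n                        # latent coordinate aligned to the 2^n-point grid
latent = phase_feature_map(x, n)
psi = qft(n).conj().T @ latent      # inverse QFT maps the latent state to the bit basis
probs = np.abs(psi)**2              # bit-basis sampling probabilities
```

For a grid-aligned coordinate x = k/2^n the probability concentrates entirely on bitstring k; a trained variational circuit between the feature map and the QFT would shape these probabilities into the target distribution.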

Cited by 4 publications (10 citation statements)
References 72 publications
“…We can obtain such α_p and U_p by solving the Fokker-Planck equation, which describes the time evolution of the probability density function, using VQS [22,41]. Alternatively, they can also be obtained by quantum generative models [33–37], since the probability density function of the underlying asset price at any t ∈ [0, T] can be obtained analytically under the BS model (see Eq. (53) in Sec.…”
Section: Proposed Methods
confidence: 99%
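The analytic density referenced in this citation is the standard lognormal risk-neutral density of the Black-Scholes (BS) model, a fact that can be checked numerically. A minimal sketch, with illustrative parameter values not taken from the cited work:

```python
import numpy as np

def bs_density(s, s0, r, sigma, t):
    """Risk-neutral density of the asset price S_t under the Black-Scholes model.

    ln S_t ~ Normal(ln(s0) + (r - sigma**2/2)*t, sigma**2*t), i.e. S_t is lognormal.
    """
    mu = np.log(s0) + (r - 0.5 * sigma**2) * t
    var = sigma**2 * t
    return np.exp(-(np.log(s) - mu)**2 / (2 * var)) / (s * np.sqrt(2 * np.pi * var))

# Sanity check: the density integrates to ~1 over the positive axis.
grid = np.linspace(1e-3, 500.0, 200_000)
vals = bs_density(grid, s0=100.0, r=0.05, sigma=0.2, t=1.0)
total = np.sum(0.5 * (vals[1:] + vals[:-1]) * np.diff(grid))  # trapezoid rule
```

Because this density is available in closed form for any t, a quantum generative model can be trained against it directly rather than by evolving the Fokker-Planck equation.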
“…To find such α_V and U_V, we can use quantum generative models [33–37]. Fourth, we solve the BSPDE from τ = 0 to τ_ter using VQS and obtain an unnormalized state…”
Section: Proposed Methods
confidence: 99%
“…In the context of QGM, QCBM is an excellent example of implicit training, where typically an MMD-like loss function is used. On the other hand, recent work showcases how explicit quantum models such as DQGM [38] and quantum quantile mechanics [11] benefit from functional access to the model probability distributions, allowing input-differentiable quantum models [39,40]. To enable the classical training, we propose a strategy described in a schematic shown in Fig. 2.…”
Section: A Preliminaries: QCBM and DQGM as Implicit vs Explicit Gener…
confidence: 99%
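The MMD-like loss mentioned for implicit QCBM training compares model samples to data samples through kernel averages, without needing the model's density. A minimal sketch of the (biased) squared-MMD estimator with a Gaussian kernel; the function names and bandwidth choice are illustrative, not from the cited works:

```python
import numpy as np

def rbf_kernel(x, y, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix between two 1-D sample arrays."""
    return np.exp(-(x[:, None] - y[None, :])**2 / (2 * bandwidth**2))

def mmd2(model_samples, data_samples, bandwidth=1.0):
    """Biased estimator of the squared maximum mean discrepancy (MMD)."""
    kxx = rbf_kernel(model_samples, model_samples, bandwidth).mean()
    kyy = rbf_kernel(data_samples, data_samples, bandwidth).mean()
    kxy = rbf_kernel(model_samples, data_samples, bandwidth).mean()
    return kxx + kyy - 2.0 * kxy

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, 500)
loss_same = mmd2(data, data)        # identical sample sets give zero loss
loss_far = mmd2(data, data + 5.0)   # shifted samples give a clearly positive loss
```

Since only samples enter the loss, this is implicit training in the sense used above; explicit models like DQGM instead expose the probability function itself, which is what makes them input-differentiable.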
“…We explore different families of quantum circuits in detail and perform numerical studies of their sampling complexity and expressivity. For expressivity studies in particular, we look at training a differentiable quantum generative model (DQGM) [38–40] architecture, which allows training in the latent (or "frequency") space and sampling in the bit basis. This presents a good testing ground for applying the proposed method to explicit quantum generative models.…”
Section: Introduction
confidence: 99%