2021
DOI: 10.22331/q-2021-10-05-558

Effect of barren plateaus on gradient-free optimization

Abstract: Barren plateau landscapes correspond to gradients that vanish exponentially in the number of qubits. Such landscapes have been demonstrated for variational quantum algorithms and quantum neural networks with either deep circuits or global cost functions. For obvious reasons, it is expected that gradient-based optimizers will be significantly affected by barren plateaus. However, whether or not gradient-free optimizers are impacted is a topic of debate, with some arguing that gradient-free approaches are unaffected. […]


Cited by 141 publications (99 citation statements)
References 44 publications
“…[51], it was shown that Newton-type algorithms make no difference because exponentially many measurements are still required to evaluate the Hessian matrix obtained by applying the parameter-shift rule twice. In the presence of BPs, the landscape value will exhibit an exponential concentration about the mean [49,52], i.e., the variance Var_θ[J(θ)] decays exponentially with the qubit number n. Other gradient-free optimizers (e.g., Nelder-Mead, Powell, and COBYLA) cannot improve the efficiency of optimization, because decisions made in these algorithms are based on the comparison of the objective function values between different points [52].…”
Section: The Effects of Barren Plateaus (mentioning)
confidence: 99%
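The double application of the parameter-shift rule described in the excerpt above can be sketched as follows. This is a minimal illustration, not the setup of Ref. [51]: the cost function is a hypothetical stand-in, and real evaluations would be estimated from a finite number of measurement shots.

```python
import numpy as np

# Hypothetical stand-in for a variational cost J(theta); in practice each
# call would be an expectation value estimated from finitely many shots.
def cost(theta):
    return np.prod(np.cos(theta))  # toy landscape, not the cited model

def hessian_parameter_shift(cost, theta, shift=np.pi / 2):
    """Hessian entries from the parameter-shift rule applied twice."""
    d = len(theta)
    H = np.zeros((d, d))
    for i in range(d):
        for j in range(d):
            e_i = np.eye(d)[i] * shift
            e_j = np.eye(d)[j] * shift
            # Four cost evaluations per Hessian entry.
            H[i, j] = (cost(theta + e_i + e_j)
                       - cost(theta + e_i - e_j)
                       - cost(theta - e_i + e_j)
                       + cost(theta - e_i - e_j)) / 4.0
    return H

theta = np.random.uniform(0, 2 * np.pi, size=4)
print(hessian_parameter_shift(cost, theta))
```

The point made in [51] is that the four evaluations per entry are not the bottleneck: in a barren plateau the entries themselves are exponentially small, so resolving them above shot noise requires exponentially many measurements.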
“…This means that when a BP is present, the probability of finding parameters whose cost function is lower than the average value by a constant c is also exponentially suppressed. This exponential suppression of the cost difference makes parameter optimization difficult with gradient-free methods as well [44], where the parameter update is based on the sampled cost difference. It has been shown that a BP is equivalent to cost concentration.…”
Section: Barren Plateaus and Cost Concentration (mentioning)
confidence: 99%
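A minimal numerical sketch of this cost concentration, assuming (as a toy model, not data from either paper) that the landscape variance decays as Var_θ[J] ≈ 2^(-n), follows from Chebyshev's inequality:

```python
import numpy as np

c = 0.1  # fixed cost improvement below the mean that we hope to find

for n in [4, 8, 12, 16, 20]:
    var = 2.0 ** (-n)  # assumed barren-plateau scaling of Var_theta[J]
    # Chebyshev: P(|J - E[J]| >= c) <= Var / c**2, so the probability of
    # sampling parameters whose cost beats the mean by c is exponentially small.
    bound = var / c ** 2
    print(f"n = {n:2d}  Var = {var:.2e}  P(|J - mean| >= {c}) <= {bound:.2e}")
```

Any fixed improvement c therefore becomes exponentially unlikely to observe as the qubit number grows, which is what makes comparison-based updates uninformative.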
“…In a barren plateau, ∇_θ F → 0 [22,24] and the value of the cost function is exponentially flattened in the number of qubits. In our case, we reformulate the problem in such a way that the space of solutions is no longer a set of possible configurations, but rather a space of (Gaussian) distributions.…”
Section: Fig. 2, Expressivity of the Model as a Function of the Chosen ... (mentioning)
confidence: 99%
“…This phenomenon, which exponentially increases the resources required to train large-scale quantum neural networks, has been demonstrated in a number of proposed architectures and classes of cost functions [22,23]. Even gradient-free optimizers do not solve the barren plateau problem, as cost function differences, which are the basis for making decisions in a gradient-free optimization, are exponentially suppressed in a barren plateau [24].…”
Section: Introduction (mentioning)
confidence: 99%
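A toy illustration of why decisions based on cost function differences break down: assume the true difference between the two points a gradient-free optimizer compares shrinks like 2^(-n/2), while shot noise at a fixed measurement budget stays roughly constant. The numbers below are illustrative assumptions, not results from the cited works.

```python
import numpy as np

rng = np.random.default_rng(0)
shots = 1000                      # fixed measurement budget per cost evaluation
noise = 1.0 / np.sqrt(shots)      # rough shot-noise level on each estimate
trials = 100_000

for n in [4, 8, 12, 16, 20]:
    true_diff = 2.0 ** (-n / 2)   # assumed barren-plateau scaling of the cost gap
    est_a = rng.normal(0.0, noise, trials)        # noisy estimate of J(theta_a)
    est_b = rng.normal(true_diff, noise, trials)  # noisy estimate of J(theta_b)
    p_correct = np.mean(est_a < est_b)            # chance of ranking the points correctly
    print(f"n = {n:2d}  true diff = {true_diff:.1e}  P(correct ranking) = {p_correct:.3f}")
```

Once the true difference falls below the noise level, the comparison degrades to a coin flip, so the optimizer needs exponentially many shots per evaluation to keep making informed moves.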