2008
DOI: 10.3788/aos20082802.0205
Experimental Demonstration of Stochastic Parallel Gradient Descent Control Algorithm for Adaptive Optics System

Cited by 8 publications (3 citation statements); references 0 publications.
“…To evaluate the performance of the SPGD algorithm, we present the evolution curves of the performance metric. When the perturbation amplitude is fixed, the convergence rate is determined largely by the gain coefficient [9][13]. To improve SPGD convergence, in the simulation we adopt an adaptive gain regime in which the control parameter is adjusted according to the current value of the metric, γ = J/C, where C is a negative constant.…”
Section: Simulation Results and Analysis
Confidence: 99%
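The adaptive-gain SPGD scheme quoted above can be sketched as follows. This is a minimal illustration, not the cited implementation: the quadratic metric, the target vector, the perturbation amplitude `sigma`, and the constant `C = -2.0` are all hypothetical choices; only the update rule (two-sided perturbation, gain γ = J/C with C a negative constant) follows the description in the citation statement.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical metric to minimize: J(u) = 1 + ||u - u_opt||^2.
# The +1 offset keeps J positive, so gamma = J/C stays negative and nonzero.
u_opt = np.array([1.0, -0.5, 0.8])

def metric(u):
    return 1.0 + np.sum((u - u_opt) ** 2)

def spgd(u, sigma=0.1, C=-2.0, iters=500):
    """Two-sided SPGD with adaptive gain gamma = J/C (C < 0 for minimization)."""
    history = [metric(u)]
    for _ in range(iters):
        # Bernoulli +/- sigma perturbation applied to all channels in parallel
        du = sigma * rng.choice([-1.0, 1.0], size=u.shape)
        dJ = metric(u + du) - metric(u - du)
        gamma = history[-1] / C          # gain magnitude shrinks as J improves
        u = u + gamma * dJ * du
        history.append(metric(u))
    return u, history

u_final, history = spgd(np.zeros(3))
print(f"J: {history[0]:.3f} -> {history[-1]:.3f}")
```

Because γ is proportional to the current metric value, the step size is large far from the optimum and automatically shrinks as the metric converges, which is the convergence-rate benefit the citing authors attribute to this regime.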
“…This wavefront-sensorless adaptive optics correction system involves simple components and is low-cost. So far, a large body of research results shows that the stochastic parallel gradient descent algorithm is the most efficient stochastic optimization algorithm [6][7][8][9].…”
Section: Introduction
Confidence: 99%
“…Entropy-based and statistical evaluation functions are extremely sensitive to additive constants such as background levels, so their applications are limited. Simulated Annealing (SA) [2], Genetic Algorithm (GA) [3], the Simplex Algorithm [4], and Stochastic Parallel Gradient Descent (SPGD) [5][6] are the common stochastic parallel search algorithms. Compared with SPGD, the convergence accuracy of SA is relatively low, GA requires tuning a large number of control parameters, and the simplex algorithm tends to fall into local extrema.…”
Section: Introduction
Confidence: 99%