2021
DOI: 10.1109/access.2021.3108418

Stochastic Bigger Subspace Algorithms for Nonconvex Stochastic Optimization

Abstract: It is well known that the stochastic optimization problem can be regarded as one of the hardest problems since, in most cases, the values of f and its gradient are not easy to evaluate, F(·, ξ) is often not given explicitly, and/or the distribution function P is ambiguous. Designing an effective optimization algorithm to solve this problem is therefore an interesting piece of work. This paper designs stochastic bigger subspace algorithms for solving nonconvex stochasti…
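For context, the f, F(·, ξ), and P mentioned in the abstract refer to the standard stochastic optimization problem, which can be written as

$$\min_{x \in \mathbb{R}^{n}} \; f(x) \;=\; \mathbb{E}_{\xi \sim P}\bigl[\, F(x,\xi) \,\bigr],$$

where the expectation is taken over a random variable ξ with distribution P. Evaluating f or its gradient exactly requires this expectation, which is precisely what makes the problem hard when P is unknown or F is only accessible through samples.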

Cited by 6 publications (7 citation statements); references 61 publications.
“…According to the importance of each indicator to the value gain, four of the weighting parameters are set to 5 and the other four are set to 10. The proposed algorithm is compared with the no-value-gain subcarrier packet aggregation optimization algorithm (NVG-SPA) [11] and the random subcarrier packet aggregation optimization algorithm (R-SPA) [12]. NVG-SPA does not consider the gain value and network topology changes when optimizing subcarrier packet aggregation.…”
Section: Simulation Results (mentioning)
confidence: 99%
“…In the defined model, the following constraints should be satisfied, given as Eq. (10), where $P^{\text{load}}_{n,t}$ is the load power. The direct current power flow algorithm, discussed in Reference 35, describes the system's power balance.…”
Section: Constraints (mentioning)
confidence: 99%
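The constraint itself is not reproduced in the quote. As a purely illustrative sketch (all symbols other than $P^{\text{load}}_{n,t}$ are assumptions, not the cited model), a nodal power-balance constraint under the DC power-flow approximation typically has the form

$$\sum_{g \in \mathcal{G}_{n}} P^{\text{gen}}_{g,t} \;-\; P^{\text{load}}_{n,t} \;=\; \sum_{m:\,(n,m)\in\mathcal{L}} B_{nm}\,\bigl(\theta_{n,t}-\theta_{m,t}\bigr), \qquad \forall\, n, t,$$

where $\mathcal{G}_{n}$ is the set of generators at bus $n$, $B_{nm}$ the line susceptance, and $\theta_{n,t}$ the voltage angle.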
“…It allows constraints containing uncertain variables of the wind turbine to hold at a certain confidence level during the optimization process, thus transforming uncertain optimization into deterministic optimization with a better economy [9]. However, it is hard to obtain the complete probability distribution information of uncertain parameters in practice [10]. Reference 11 constructed SO models based on advanced scene selection algorithms.…”
Section: Introduction (mentioning)
confidence: 99%
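As a hedged illustration of the chance-constrained idea described in this quote (the Gaussian assumption and notation are illustrative, not necessarily those of the cited references), a constraint involving an uncertain wind-power term $w \sim \mathcal{N}(\mu_w, \sigma_w^{2})$ can be required to hold with confidence $1-\varepsilon$ and then rewritten deterministically:

$$\Pr\bigl\{\, g(x) + w \le b \,\bigr\} \;\ge\; 1-\varepsilon \quad\Longleftrightarrow\quad g(x) + \mu_w + \Phi^{-1}(1-\varepsilon)\,\sigma_w \;\le\; b,$$

where $\Phi^{-1}$ is the standard normal quantile function. This is the transformation from uncertain to deterministic optimization that the quote mentions.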
“…In order to overcome the shortcomings of the full gradient descent algorithm, many improved methods have appeared [12,13,31,32]. At the same time, to increase the convergence rate of SGD algorithms, Le Roux [35] proposed an SGD method with variance reduction, and building on this work more variance-reduced gradient methods followed, such as CGVR [8], SCGN [9], and SCGA [1]; their common feature is that they all use the stochastic conjugate gradient method. Among them, CGVR and SCGA are both hybrid conjugate gradient methods, and both achieve linear convergence rates under strongly convex conditions.…”
Section: Introduction (mentioning)
confidence: 99%
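To make the variance-reduction idea referenced in this quote concrete, here is a minimal SVRG-style sketch in Python. It illustrates the general technique only; it is not the cited CGVR/SCGN/SCGA conjugate-gradient variants, and the toy least-squares problem at the bottom is an assumption for demonstration.

```python
import numpy as np

def svrg(grad_i, x0, n, step=0.05, epochs=20, inner=None, rng=None):
    """Minimal SVRG-style sketch of variance-reduced SGD.

    grad_i(x, i) returns the gradient of the i-th component function at x;
    the objective is the average of the n component functions.
    Illustrative only -- not the cited CGVR/SCGA/SCGN methods.
    """
    rng = rng or np.random.default_rng(0)
    inner = inner or n
    x_ref = np.array(x0, dtype=float)
    for _ in range(epochs):
        # Full gradient at the reference (snapshot) point.
        mu = np.mean([grad_i(x_ref, i) for i in range(n)], axis=0)
        x = x_ref.copy()
        for _ in range(inner):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient estimate.
            g = grad_i(x, i) - grad_i(x_ref, i) + mu
            x -= step * g
        x_ref = x
    return x_ref

if __name__ == "__main__":
    # Toy least-squares problem (hypothetical data) as a usage example.
    rng = np.random.default_rng(1)
    A, b = rng.normal(size=(100, 5)), rng.normal(size=100)
    grad_i = lambda x, i: (A[i] @ x - b[i]) * A[i]
    x_star = svrg(grad_i, np.zeros(5), n=100)
    print(np.linalg.norm(A.T @ (A @ x_star - b)) / 100)  # gradient norm near zero
```

The key design point is that the correction term grad_i(x, i) - grad_i(x_ref, i) + mu keeps the stochastic gradient unbiased while shrinking its variance as x approaches x_ref, which is what enables the faster (linear, under strong convexity) convergence rates the quote attributes to this family of methods.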