We consider stochastic convex optimization problems with affine constraints and develop several methods using either the primal or the dual approach to solve them. In the primal case, we use a special penalization technique to make the initial problem more amenable to optimization methods. We propose algorithms based on the Similar Triangles Method (Gasnikov and Nesterov, 2018; Nesterov, 2018) with Inexact Proximal Step for convex smooth and strongly convex smooth objective functions, and methods based on the Gradient Sliding algorithm (Lan, 2012) to solve the same problems in the non-smooth case. We prove convergence guarantees in the smooth convex case with a deterministic first-order oracle. We also propose and analyze three novel methods to handle stochastic convex optimization problems with affine constraints: SPDSTM, SSTM_sc, and R-RRMA-AC-SA². We develop a convergence analysis for these methods under unbiased (for R-RRMA-AC-SA²) and biased (for SPDSTM and SSTM_sc) stochastic oracles. Finally, we apply all of the aforementioned results and approaches to the decentralized distributed optimization problem and discuss the optimality of the obtained results in terms of communication rounds and the number of oracle calls per node.
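As a rough illustration of the penalization idea mentioned above (a sketch under our own notation; the paper's exact penalty and parameter choices may differ), an affinely constrained problem over a convex set Q can be replaced by an unconstrained penalized one:
\[
\min_{x \in Q} f(x) \quad \text{s.t.} \quad Ax = b
\qquad \longrightarrow \qquad
\min_{x \in Q} \Big\{ f(x) + R \, \|Ax - b\|_2^2 \Big\},
\]
where R > 0 is a sufficiently large penalty coefficient; the penalized objective is then solved with accelerated (stochastic) first-order methods.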