2019 IEEE International Symposium on Information Theory (ISIT)
DOI: 10.1109/isit.2019.8849396
A Random Walk Approach to First-Order Stochastic Convex Optimization

Cited by 10 publications (12 citation statements)
References 12 publications
“…where the inequality holds by the assumption that the ε_i are R-sub-Gaussian. As a result of the Chernoff-Hoeffding inequality for sub-Gaussian distributions (Vakili and Zhao, 2019),…”
Section: Discussion and Future Work
confidence: 99%
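The inequality cited in this excerpt can be checked numerically. The sketch below (illustrative, not from the paper) compares the empirical tail of the sample mean of Gaussian noise, which is R-sub-Gaussian, against the Chernoff-Hoeffding bound 2·exp(−n·t²/(2R²)); the function names and parameter values are assumptions chosen for the demonstration.

```python
import math
import random

def hoeffding_bound(n, t, R):
    """Chernoff-Hoeffding tail bound for the mean of n i.i.d.
    R-sub-Gaussian variables: P(|mean| >= t) <= 2*exp(-n*t^2 / (2*R^2))."""
    return 2.0 * math.exp(-n * t * t / (2.0 * R * R))

def empirical_tail(n, t, R, trials=20000, seed=0):
    """Empirical P(|mean of n N(0, R^2) samples| >= t).
    N(0, R^2) is R-sub-Gaussian, so the bound above must dominate."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        m = sum(rng.gauss(0.0, R) for _ in range(n)) / n
        if abs(m) >= t:
            hits += 1
    return hits / trials

n, t, R = 50, 0.5, 1.0
emp = empirical_tail(n, t, R)
bound = hoeffding_bound(n, t, R)
assert emp <= bound  # the theoretical bound dominates the empirical frequency
```

For Gaussian noise the bound is loose (the true tail here is roughly an order of magnitude smaller), but it holds uniformly over all R-sub-Gaussian distributions, which is what the cited argument needs.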
“…Roughly speaking, this upper bound holds since the fixed-size internal tests and the leaf tests have a greater probability of moving towards the anomaly than away from it. Summing the upper bounds on the last passage times yields the first term in (11).…”
Section: B. Performance Analysis
confidence: 99%
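The drift argument in this excerpt can be illustrated with a toy model: a walk on levels 0..D of a tree that steps toward the anomaly with probability p > 1/2 and away otherwise. This is only a sketch of why biased steps give a hitting time linear in the depth, not a reproduction of the paper's last-passage-time analysis; the function, depth, and bias value are hypothetical.

```python
import random

def hitting_time(depth, p, rng):
    """Steps for a walk on levels 0..depth that moves one level toward the
    target with probability p (> 1/2) and one level away otherwise,
    reflecting at level 0, until it first reaches `depth`."""
    pos, steps = 0, 0
    while pos < depth:
        if rng.random() < p:
            pos += 1
        elif pos > 0:
            pos -= 1
        steps += 1
    return steps

rng = random.Random(1)
depth, p, trials = 10, 0.7, 5000
avg = sum(hitting_time(depth, p, rng) for _ in range(trials)) / trials
# Drift argument: expected time is roughly depth / (2p - 1) = 25 steps here,
# i.e. linear in depth, rather than the exponential time of an unbiased walk.
assert avg < 2 * depth / (2 * p - 1)
```

The same drift-toward-the-target intuition is what lets the cited analysis bound the last passage time at each level and sum the bounds over the tree depth.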
“…The recent studies [10]- [12] considered hierarchical search under unknown observation models. The key difference is that the search strategies in [10], [11] are based on a sample mean statistic, which fails to detect a general anomalous distribution with a mean close to the mean of the normal distribution. The work in [12] does not assume a structure on the abnormal distribution, and uses the Kolmogorov-Smirnov statistic, which fails to utilize the parametric information considered in our setting.…”
Section: Introduction
confidence: 99%
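The contrast this excerpt draws between a sample-mean statistic and the Kolmogorov-Smirnov statistic of [12] can be illustrated directly: two distributions with equal means but different spreads are invisible to a mean test yet cleanly separated by the KS statistic. A minimal pure-Python sketch, with illustrative sample sizes and distributions:

```python
import random

def ks_statistic(xs, ys):
    """Two-sample Kolmogorov-Smirnov statistic:
    max |F_xs(t) - F_ys(t)| over the two empirical CDFs."""
    xs, ys = sorted(xs), sorted(ys)
    i = j = 0
    d = 0.0
    while i < len(xs) and j < len(ys):
        if xs[i] < ys[j]:
            i += 1
        elif ys[j] < xs[i]:
            j += 1
        else:          # tie: advance both CDFs together
            i += 1
            j += 1
        d = max(d, abs(i / len(xs) - j / len(ys)))
    return d

rng = random.Random(0)
normal = [rng.gauss(0.0, 1.0) for _ in range(2000)]
anomalous = [rng.gauss(0.0, 3.0) for _ in range(2000)]  # same mean, larger spread
control = [rng.gauss(0.0, 1.0) for _ in range(2000)]

# The sample means are nearly equal, so a mean statistic cannot separate
# normal from anomalous here, but the KS statistic clearly can.
assert ks_statistic(normal, anomalous) > 0.1
assert ks_statistic(normal, control) < 0.1
```

As the excerpt notes, the trade-off is that a distribution-free statistic like KS discards any parametric information about the anomalous distribution when such information is available.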
“…Referred to as Random Walk on a Tree (RWT), this policy was proposed by two of the authors of this paper in a prior work [19] that analyzed its regret performance for convex functions under sub-Gaussian noise distributions. In this paper, we demonstrate the adaptivity of RWT to different function characteristics and robustness to heavy-tailed noise with infinite variance.…”
Section: RWT: An Adaptive and Robust Approach
confidence: 99%
“…where the second step follows from k^{2/b} log log k^β being increasing in k. For the RHS, we can use the bound obtained in (19). Let λ(m, θ) denote the value of the bound in (19).…”
Section: Appendix C
confidence: 99%