We prove almost sure convergence rates of Stochastic Zeroth-order Gradient Descent (SZGD) algorithms for Łojasiewicz functions. The SZGD algorithm iterates as
$$ x_{t+1} = x_t - \eta_t \widehat{\nabla} f(x_t), $$
where $f$ is the objective function that satisfies the Łojasiewicz inequality with Łojasiewicz exponent $\theta$, $\eta_t$ is the step size (learning rate), and $\widehat{\nabla} f(x_t)$ is the approximate gradient estimated using zeroth-order information. We show that, for smooth Łojasiewicz functions, the sequence $\{x_t\}_{t\in\mathbb{N}}$ governed by SZGD converges to a bounded point $x_\infty$ almost surely, and $x_\infty$ is a critical point of $f$. If $\theta \in (0, \frac{1}{2}]$, then $f(x_t) - f(x_\infty)$, $\sum_{s=t}^{\infty} \|x_s - x_\infty\|^2$, and $\|x_t - x_\infty\|$ (where $\|\cdot\|$ is the Euclidean norm) converge to zero linearly almost surely. If $\theta \in (\frac{1}{2}, 1)$, then $f(x_t) - f(x_\infty)$ (and $\sum_{s=t}^{\infty} \|x_{s+1} - x_s\|^2$) converges to zero at rate $o\big(t^{\frac{1}{1-2\theta}} \log t\big)$ almost surely, and $\|x_t - x_\infty\|$ converges to zero at rate $o\big(t^{\frac{1-\theta}{1-2\theta}} \log t\big)$ almost surely. To the best of our knowledge, this paper provides the first almost sure convergence rate guarantee for stochastic zeroth-order algorithms for Łojasiewicz functions.
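To make the iteration concrete, here is a minimal Python sketch of SZGD. The abstract does not fix a particular zeroth-order gradient estimator; the two-point Gaussian-smoothing estimator, the step-size schedule `eta`, and the smoothing radius `mu` below are illustrative assumptions, not the paper's prescribed settings.

```python
import numpy as np

def szgd(f, x0, eta, num_steps, mu=1e-4, rng=None):
    """Sketch of SZGD: gradient descent driven by a zeroth-order
    gradient estimate built from function evaluations only.
    The two-point Gaussian-smoothing estimator used here is one
    common choice of zeroth-order estimator (an assumption)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for t in range(num_steps):
        # Approximate gradient from two function evaluations along
        # a random Gaussian direction u.
        u = rng.standard_normal(x.shape)
        grad_est = (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu) * u
        # SZGD update: x_{t+1} = x_t - eta_t * (approximate gradient)
        x = x - eta(t) * grad_est
    return x

# Usage example: f(x) = ||x||^2 satisfies the Łojasiewicz
# inequality with exponent theta = 1/2, so the iterates should
# converge linearly per the result above.
x_inf = szgd(lambda x: float(np.dot(x, x)),
             x0=np.ones(5), eta=lambda t: 0.1, num_steps=2000)
```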