Abstract—We develop deterministic necessary and sufficient conditions on individual noise sequences of a stochastic approximation algorithm for the error of the iterates to converge at a given rate. Specifically, suppose $\{\delta_n\}$ is a given positive sequence converging monotonically to zero. Consider a stochastic approximation algorithm $x_{n+1} = x_n - a_n(A_n x_n - b_n) + a_n e_n$, where $\{x_n\}$ is the iterate sequence, $\{a_n\}$ is the step-size sequence, $\{e_n\}$ is the noise sequence, and $x^*$ is the desired zero of the function $f(x) = Ax - b$. Then, under appropriate assumptions, we show that $x_n - x^* = o(\delta_n)$ if and only if the sequence $\{e_n\}$ satisfies one of five equivalent conditions. These conditions are based on well-known conditions on noise sequences: Kushner and Clark's condition, Chen's condition, Kulkarni and Horn's condition, a decomposition condition, and a weighted averaging condition. Our necessary and sufficient condition on $\{e_n\}$ to achieve a convergence rate of $\{\delta_n\}$ is, in essence, that the sequence $\{e_n/\delta_n\}$ satisfy any one of the five well-known conditions above. We provide examples to illustrate our result. In particular, we easily recover the familiar result that if $a_n = a/n$ and $\{e_n\}$ is a martingale difference process with bounded variance, then $x_n - x^* = o(n^{-1/2}(\log n)^{\beta})$ for any $\beta > 1/2$.

Index Terms—Convergence rate, Kiefer–Wolfowitz, necessary and sufficient conditions, noise sequences, Robbins–Monro, stochastic approximation.
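As a numerical illustration of the recursion in the abstract, the following sketch simulates the scalar case of $x_{n+1} = x_n - a_n(A_n x_n - b_n) + a_n e_n$ with $a_n = a/n$ and i.i.d. zero-mean Gaussian noise (a martingale difference sequence with bounded variance). All concrete choices here ($A_n \equiv 1$, $b_n \equiv 0$ so that $x^* = 0$, the function name `run_sa`, and the specific constants) are illustrative assumptions, not taken from the paper.

```python
import math
import random

def run_sa(num_steps=200_000, a=1.0, x0=5.0, seed=0):
    # Scalar stochastic approximation x_{n+1} = x_n - a_n (A x_n - b) + a_n e_n
    # with A = 1, b = 0 (so x* = 0), step sizes a_n = a/n, and {e_n} i.i.d.
    # N(0, 1) noise -- a martingale difference process with bounded variance.
    # These concrete choices are assumptions made for illustration only.
    rng = random.Random(seed)
    x = x0
    for n in range(1, num_steps + 1):
        a_n = a / n
        e_n = rng.gauss(0.0, 1.0)
        x = x - a_n * (1.0 * x - 0.0) + a_n * e_n
    return x

if __name__ == "__main__":
    n = 200_000
    x_final = run_sa(num_steps=n)
    # The theorem (for suitable a) gives x_n - x* = o(n^{-1/2} (log n)^beta)
    # for any beta > 1/2; compare the final error against that rate.
    rate = n ** -0.5 * math.log(n) ** 1.0
    print(abs(x_final), rate)
```

Running the sketch, the final error $|x_n - x^*|$ is typically far below the envelope $n^{-1/2}(\log n)^{\beta}$ for large $n$, consistent with the $o(\cdot)$ rate stated in the abstract.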