The gradient descent method minimizes unconstrained nonlinear optimization problems at a rate of $O(1/\sqrt{K})$, where $K$ is the number of iterations performed by the gradient method. Traditionally, this analysis assumes smooth objective functions with Lipschitz continuous gradients. This paper considers a more general class of nonlinear programming problems in which the objective functions have Hölder continuous gradients. More precisely, for any function $f$ in this class, denoted by $C^{1,\nu}_L$, there exist $\nu \in (0,1]$ and $L > 0$ such that for all $x, y \in \mathbb{R}^n$ the relation $\|\nabla f(x) - \nabla f(y)\| \le L \|x - y\|^{\nu}$ holds. We prove that the gradient descent method converges globally to a stationary point and exhibits a convergence rate of $O(1/K^{\frac{\nu}{\nu+1}})$ when the step-size is chosen properly, i.e., less than $\left[\frac{\nu+1}{L}\right]^{\frac{1}{\nu}} \|\nabla f(x_k)\|^{\frac{1}{\nu}-1}$. Moreover, the algorithm requires $O(1/\epsilon^{\frac{1}{\nu}+1})$ calls to an oracle to find $\bar{x}$ such that $\|\nabla f(\bar{x})\| < \epsilon$.
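The step-size rule described above can be sketched in code as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the function and parameter names (`holder_gradient_descent`, `safety`) are hypothetical, and the `safety < 1` factor simply enforces that the step-size stays strictly below the stated bound $[\frac{\nu+1}{L}]^{1/\nu} \|\nabla f(x_k)\|^{\frac{1}{\nu}-1}$.

```python
import numpy as np

def holder_gradient_descent(grad, x0, nu, L, eps=1e-6, max_iter=10000, safety=0.5):
    """Gradient descent for f with a nu-Hölder continuous gradient.

    The step-size h_k is taken as a fraction (safety < 1) of the bound
    [(nu+1)/L]^(1/nu) * ||grad f(x_k)||^(1/nu - 1), so that it is
    strictly less than the bound, as the analysis requires.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < eps:  # stopping rule: ||grad f(x)|| < eps (approximate stationarity)
            return x, k
        h = safety * ((nu + 1) / L) ** (1.0 / nu) * gnorm ** (1.0 / nu - 1.0)
        x = x - h * g
    return x, max_iter

# Example: f(x) = ||x||^2 has gradient 2x, which is Lipschitz
# continuous, i.e. Hölder continuous with nu = 1 and L = 2.
x_star, iters = holder_gradient_descent(lambda x: 2 * x, [3.0, -4.0], nu=1.0, L=2.0)
```

For $\nu = 1$ the rule reduces to a constant step-size below $2/L$, recovering the classical smooth setting; for $\nu < 1$ the step-size adapts to the current gradient norm.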