In this paper, we present an analytical local solution to optimal control problems for a wide class of general nonlinear systems. We first introduce the optimal control problem for a general nonlinear system and formulate the associated Hamilton-Jacobi-Bellman equation. Starting from the necessary conditions for optimality, represented by the Hamiltonian system of the optimal control problem, we solve the Hamilton-Jacobi equation for the associated generating functions. The generating functions are obtained as series expansions by using canonical transformation theory and the semi-tensor product of matrices. The coefficients of the generating functions satisfy an algebraic Riccati equation for the second-degree homogeneous component and linear algebraic equations for the higher-degree components, from which an algorithm and data structure can be derived. This enables us to obtain analytical optimal feedback controllers and the corresponding Lyapunov function. Through theoretical analysis, we establish that the proposed methodology guarantees stability of the closed-loop system. The approach is not affected by the complexity of the performance index or the state equations, and, most importantly, it does not require an initial admissible control law. A simulation example illustrates the effectiveness and applicability of the proposed approach.
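The lowest-order step described above can be sketched numerically: for the linearization x' = Ax + Bu with a quadratic cost, the second-degree coefficient of the value function is the solution P of the algebraic Riccati equation, which yields the linear feedback u = -R⁻¹BᵀPx and a Hurwitz closed-loop matrix. The matrices A, B, Q, R below are illustrative placeholders, not taken from the paper; the higher-degree coefficients (obtained in the paper from linear algebraic equations) are not computed here.

```python
# Minimal sketch of the degree-2 step only; A, B, Q, R are hypothetical
# example data, not the paper's simulation example.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0], [1.0, 0.0]])   # unstable linearization (example)
B = np.array([[0.0], [1.0]])             # input matrix
Q = np.eye(2)                            # state weight in the quadratic cost
R = np.array([[1.0]])                    # control weight

# P is the degree-2 homogeneous coefficient of the value function:
# it solves A'P + P A - P B R^{-1} B' P + Q = 0.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)          # optimal linear feedback gain

# Consistent with the stability claim, A - B K should be Hurwitz
# (all eigenvalues in the open left half-plane).
closed_loop_eigs = np.linalg.eigvals(A - B @ K)
print(np.all(closed_loop_eigs.real < 0))
```

The quadratic term also furnishes the local Lyapunov function candidate V(x) = xᵀPx for the closed-loop system, which the full method refines with the higher-degree series terms.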