In this paper we propose a decomposition algorithm for convex differentiable minimization. At each iteration, the algorithm solves a variational inequality problem obtained by replacing the gradient of the cost function with a strongly monotone approximation; a line search (with respect to the original cost) is then performed in the direction of the solution to this variational inequality. If the constraint set is a Cartesian product of m sets, the variational inequality decomposes into m coupled variational inequalities, which can be solved in either a Jacobi or a Gauss-Seidel manner. The algorithm also applies to the minimization of strongly convex (possibly nondifferentiable) costs subject to linear constraints. As special cases we obtain the GP-SOR algorithm of Mangasarian and De Leone, a diagonalization algorithm of Feijoo and Meyer, the coordinate descent method, and the dual gradient method. The algorithm is also closely related to a splitting algorithm of Gabay and a gradient projection algorithm of Goldstein and Levitin-Poljak, and has interesting applications to separable convex programming and to solving traffic assignment problems.
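To illustrate the iteration structure described above, the following is a minimal sketch of one special case only: the gradient projection method (the Goldstein and Levitin-Poljak special case mentioned in the abstract), where the strongly monotone approximation of the gradient at the current iterate x_k is taken to be h_k(y) = ∇f(x_k) + c(y − x_k), with c > 0. For a box constraint set (a Cartesian product of intervals), the resulting variational inequality decomposes coordinate-wise and its solution is a componentwise projection. All function names, the quadratic test problem, and the Armijo parameters are illustrative assumptions, not details from the paper.

```python
import numpy as np

def solve_vi(x, grad, c, lo, hi):
    """Solve the approximating variational inequality: find y in X with
    h_k(y)^T (z - y) >= 0 for all z in X, where h_k(y) = grad + c*(y - x).
    For a box X = prod_i [lo_i, hi_i] this reduces to a projection,
    y = proj_X(x - grad / c), which decomposes across coordinates."""
    return np.clip(x - grad / c, lo, hi)

def decomposition_method(f, grad_f, x0, lo, hi, c=1.0, tol=1e-8, max_iter=500):
    """Illustrative sketch of the iteration pattern: solve the VI obtained
    from a strongly monotone approximation of the gradient, then perform a
    line search on the original cost f along the resulting direction."""
    x = np.clip(np.asarray(x0, dtype=float), lo, hi)
    for _ in range(max_iter):
        g = grad_f(x)
        y = solve_vi(x, g, c, lo, hi)   # solution of the approximating VI
        d = y - x                       # search direction
        if np.linalg.norm(d) < tol:
            break                       # x is (numerically) stationary
        # Armijo backtracking line search on the original cost along d
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope:
            t *= 0.5
        x = x + t * d
    return x

# Hypothetical test problem: min 0.5*||x - b||^2 over the box [0, 1]^3,
# with b chosen outside the box so the constraints are active.
b = np.array([2.0, -1.0, 0.5])
f = lambda x: 0.5 * np.sum((x - b) ** 2)
grad_f = lambda x: x - b
x_star = decomposition_method(f, grad_f, np.zeros(3), 0.0, 1.0)
# x_star is the projection of b onto the box: [1.0, 0.0, 0.5]
```

Here a Cartesian-product constraint set is what makes the approximating variational inequality separate into independent (in this diagonal case, uncoupled) per-block problems; with a non-diagonal strongly monotone approximation the blocks would remain coupled and could be cycled through in Jacobi or Gauss-Seidel fashion, as the abstract indicates.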