We study the existence and approximation of optimal controls for systems governed by McKean-Vlasov stochastic differential equations (MVSDEs). It is well known, even in simple examples, that in the absence of convexity conditions the strict control problem may have no optimal solution. Compactifying the set of strict admissible controls leads to measure-valued controls, called relaxed controls, whose space enjoys nice topological properties. We prove that, under pathwise uniqueness of solutions of the state equation, the relaxed state process is continuous with respect to the control variable; as a consequence, the relaxed and strict control problems have the same value function. Moreover, we show, under mere continuity of the coefficients, that an optimal control exists in the space of relaxed controls. Under an additional convexity hypothesis, we show that the optimal relaxed control is in fact a strict control. These two results extend known results to general nonlinear MVSDEs under minimal assumptions on the coefficients.
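For concreteness, a controlled MVSDE of the kind described above can be sketched in the following generic form; the notation is illustrative and not taken from the source:

```latex
\begin{equation*}
  dX_t = b\bigl(t, X_t, \mathbb{P}_{X_t}, u_t\bigr)\,dt
       + \sigma\bigl(t, X_t, \mathbb{P}_{X_t}, u_t\bigr)\,dW_t,
  \qquad X_0 = x,
\end{equation*}
```

where $\mathbb{P}_{X_t}$ denotes the law of the state $X_t$ (the McKean-Vlasov, or mean-field, dependence), $u_t$ is the control process, and $W$ is a Brownian motion. In a strict control problem one minimizes a cost functional over controls $u$ taking values in an action space $U$; a relaxed control replaces $u_t$ by a probability measure on $U$ at each time, which restores compactness of the control set.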