In this paper, we use the functional Itô calculus framework to derive a path-dependent version of the Hamilton-Jacobi-Bellman equation for stochastic control problems whose dynamics and running cost depend on the path of the control. We also prove a Dynamic Programming Principle for such problems. We apply our results to path dependence of delay type, and we further study Stochastic Differential Games in this context.