In this paper, we deal with the randomized generalized diffusion equation with delay:
\[
u_t(t,x) = a^2 u_{xx}(t,x) + b^2 u_{xx}(t-\tau,x), \quad t > \tau, \ 0 \le x \le l;
\]
\[
u(t,0) = u(t,l) = 0, \quad t \ge 0;
\]
\[
u(t,x) = \varphi(t,x), \quad 0 \le t \le \tau, \ 0 \le x \le l.
\]
Here, the delay $\tau > 0$ and the interval length $l > 0$ are constants. The coefficients $a^2$ and $b^2$ are nonnegative random variables, and the initial condition $\varphi(t,x)$ and the solution $u(t,x)$ are random fields. The method of separation of variables produces a formal series solution. We rigorously prove that this series satisfies the delay diffusion problem in the random Lebesgue sense. By truncating the series, the expectation and the variance of the random-field solution can be approximated.
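For orientation, here is a minimal sketch of the separation-of-variables step under the homogeneous Dirichlet boundary conditions above; the notation $\lambda_n$, $T_n$, $\varphi_n$ is ours, and the computation is the standard one rather than a quotation from the paper:
\[
u(t,x) = \sum_{n=1}^{\infty} T_n(t)\,\sin\!\Big(\frac{n\pi x}{l}\Big),
\qquad \lambda_n = \Big(\frac{n\pi}{l}\Big)^{2},
\]
so that formally substituting into the equation decouples the modes into random delay differential equations
\[
T_n'(t) = -a^2 \lambda_n\, T_n(t) - b^2 \lambda_n\, T_n(t-\tau), \quad t > \tau,
\qquad
T_n(t) = \varphi_n(t) := \frac{2}{l}\int_0^l \varphi(t,x)\,\sin\!\Big(\frac{n\pi x}{l}\Big)\,dx, \quad 0 \le t \le \tau.
\]
Each $T_n$ can then be obtained by the method of steps, and the resulting series is the candidate solution whose rigorous validity is established.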
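The final claim, approximating the expectation and the variance from a truncation of the series, can be illustrated numerically. The sketch below is an assumption-laden illustration, not the paper's scheme: the values of $l$ and $\tau$, the uniform laws chosen for $a^2$ and $b^2$, the deterministic history $\varphi(t,x)=\sin(\pi x/l)$, and the method-of-steps Euler integrator are all ours.

```python
import numpy as np

# Sketch: approximate E[u(t,x)] and Var[u(t,x)] by (i) truncating the
# sine series at N modes and (ii) Monte Carlo sampling of a^2, b^2.
# All concrete values below are assumptions for illustration.

l, tau = 1.0, 0.5            # interval length and delay (assumed)
N = 5                        # series truncation level
dt = tau / 200               # time step; tau/dt must be an integer,
                             # and dt is small enough that explicit
                             # Euler is stable for every retained mode
steps_hist = int(round(tau / dt))

def phi_coeff(n, t):
    # Sine coefficients of the assumed history phi(t,x) = sin(pi x / l):
    # only mode n = 1 is nonzero, and it is independent of t.
    return 1.0 if n == 1 else 0.0

def solve_mode(n, a2, b2, t_final):
    # Method of steps with explicit Euler for
    # T'(t) = -a2*lam*T(t) - b2*lam*T(t - tau), history on [0, tau].
    lam = (n * np.pi / l) ** 2
    total = steps_hist + int(round((t_final - tau) / dt))
    T = np.empty(total + 1)
    T[: steps_hist + 1] = [phi_coeff(n, k * dt) for k in range(steps_hist + 1)]
    for k in range(steps_hist, total):
        T[k + 1] = T[k] + dt * (-a2 * lam * T[k] - b2 * lam * T[k - steps_hist])
    return T[-1]

def u_truncated(x, t_final, a2, b2):
    # Truncated series evaluated at (t_final, x) for one coefficient draw.
    return sum(
        solve_mode(n, a2, b2, t_final) * np.sin(n * np.pi * x / l)
        for n in range(1, N + 1)
    )

rng = np.random.default_rng(0)
samples = [
    u_truncated(x=0.5, t_final=1.0,
                a2=rng.uniform(0.5, 1.5),   # assumed law for a^2
                b2=rng.uniform(0.0, 0.5))   # assumed law for b^2
    for _ in range(500)
]
print("E[u] ~", np.mean(samples), " Var[u] ~", np.var(samples))
```

Raising the truncation level N and the Monte Carlo sample count tightens the approximation, at the cost of requiring a smaller dt to keep the stiffest retained mode stable under explicit Euler.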