Abstract. This paper is the second in a series in which a general theory of a priori error estimates for scalar conservation laws is constructed. In this paper, we focus on how the lack of consistency introduced by the nonuniformity of the grid influences the convergence of flux-splitting monotone schemes to the entropy solution. We obtain the optimal rate of convergence of $(\Delta x)^{1/2}$ in $L^\infty(L^1)$ for consistent schemes on arbitrary grids without the use of any regularity property of the approximate solution. We then extend this result to less consistent schemes, called $p$-consistent schemes, and prove that they converge to the entropy solution at the rate $(\Delta x)^{\min\{1/2,\,p\}}$ in $L^\infty(L^1)$; again, no regularity property of the approximate solution is used. Finally, we propose a new explanation of the fact that even inconsistent schemes converge at the rate $(\Delta x)^{1/2}$ in $L^\infty(L^1)$. We show that this well-known supraconvergence phenomenon takes place because the consistency of the numerical flux and the fact that the scheme is written in conservation form allow the regularity properties of its approximate solution (total variation boundedness) to compensate for its lack of consistency; the nonlinear nature of the problem does not play any role in this mechanism. All the above results hold in the multidimensional case, provided the grids are Cartesian products of one-dimensional nonuniform grids.
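In symbols, the estimates summarized above take the following schematic form; this is a sketch only, in which the notation $u$ for the entropy solution, $u_{\Delta x}$ for the approximate solution, the final time $T$, and a constant $C$ independent of $\Delta x$ are assumptions based on standard usage rather than statements taken from the abstract:
$$
\sup_{0 \le t \le T} \big\| u(\cdot,t) - u_{\Delta x}(\cdot,t) \big\|_{L^1} \;\le\; C\,(\Delta x)^{\min\{1/2,\,p\}},
$$
where the exponent reduces to $1/2$ for consistent schemes and, per the supraconvergence result, is also $1/2$ for inconsistent conservative schemes whose approximate solutions have bounded total variation.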