Concentration inequalities are fundamental tools in probabilistic combinatorics and theoretical computer science for proving that functions of random variables are typically near their means. Of particular importance is the case where $f(X)$ is a function of independent random variables $X = (X_1, \ldots, X_n)$. Here the well-known bounded differences inequality (also called McDiarmid's inequality or the Hoeffding–Azuma inequality) establishes sharp concentration if the function $f$ does not depend too much on any of the variables. One attractive feature is that it relies on a very simple Lipschitz condition (L): it suffices to show that $|f(X) - f(X')| \le c_k$ whenever $X, X'$ differ only in the $k$-th coordinate. While this is easy to check, the main disadvantage is that it considers worst-case changes $c_k$, which often makes the resulting bounds too weak to be useful.

In this paper we prove a variant of the bounded differences inequality which can be used to establish concentration of functions $f(X)$ where (i) the typical changes are small, although (ii) the worst-case changes might be very large. One key aspect of this inequality is that it relies on a simple condition that (a) is easy to check and (b) coincides with heuristic considerations as to why concentration should hold. Indeed, given an event $\Gamma$ that holds with very high probability, we essentially relax the Lipschitz condition (L) to situations where $\Gamma$ occurs. The point is that the resulting typical changes $c_k$ are often much smaller than the worst-case ones.

To illustrate its application we consider the reverse $H$-free process, where $H$ is 2-balanced. We prove that the final number of edges in this process is concentrated, and we also determine its likely value up to constant factors. This answers a question of Bollobás and Erdős.
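For background, the bounded differences inequality invoked above admits the following standard formulation (due to McDiarmid; this statement is standard folklore rather than a result of the present paper, and the constant in the exponent varies across references). If $X = (X_1, \ldots, X_n)$ has independent coordinates and condition (L) holds with constants $c_1, \ldots, c_n$, then for all $t > 0$,
\[
  \Pr\bigl( |f(X) - \mathbb{E} f(X)| \ge t \bigr)
  \;\le\; 2 \exp\!\left( - \frac{2 t^2}{\sum_{k=1}^{n} c_k^2} \right).
\]
The variant developed in this paper is designed to yield bounds of a similar flavour in which the worst-case constants $c_k$ are effectively replaced by the typically much smaller changes valid on the high-probability event $\Gamma$; we refer to the main text for the precise statement and error terms.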