Resistivity saturation is observed in many metallic systems with large resistivities: once the resistivity reaches a critical value, its further increase with temperature is substantially reduced. This typically happens when the apparent mean free path becomes comparable to the interatomic separation, the Ioffe-Regel condition. Recently, several exceptions to this rule have been found. Here, we review experimental results and early theories of resistivity saturation. We then describe more recent theoretical work, addressing cases both where the Ioffe-Regel condition is satisfied and where it is violated. In particular, we show how the (semiclassical) Ioffe-Regel condition can be derived quantum-mechanically under certain assumptions about the system and why these assumptions are violated for high-Tc cuprates and alkali-doped fullerides.
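For orientation, a rough free-electron sketch of the scale involved (the symbols $\rho_{\mathrm{sat}}$, $k_F$, $\ell$, and $a$ are introduced here only for illustration; the detailed arguments are given in the main text): the semiclassical Boltzmann resistivity of a free-electron-like metal can be written in terms of the mean free path $\ell = v_F\tau$ as
\begin{equation*}
  \rho = \frac{3\pi^2\hbar}{e^2 k_F^2 \ell},
\end{equation*}
where $k_F$ is the Fermi wave vector. The Ioffe-Regel condition, $\ell \gtrsim a$ with $a$ the interatomic separation (equivalently $k_F\ell \gtrsim 1$), then suggests an upper bound
\begin{equation*}
  \rho \lesssim \rho_{\mathrm{sat}} \approx \frac{3\pi^2\hbar}{e^2 k_F^2 a},
\end{equation*}
which for typical metallic parameters ($k_F \sim 10^{10}\,\mathrm{m}^{-1}$, $a \sim 2\text{--}3$ \AA) is of the order of $10^2\,\mu\Omega\,\mathrm{cm}$, an order-of-magnitude estimate only.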