The paper outlines the optimization of wet cylinder liner designs for high thermal loads, with special reference to medium speed diesel engine applications in the bore range 180–500 mm (∼ 7–20 in). It is shown that a flake graphite nickel–copper–molybdenum alloyed cast iron with a well developed pearlitic matrix combines the best properties for liner manufacture. For quality control reasons, centrifugal castings are often preferred, although the resultant graphite form is unfavourable for thermal stress resistance. Consequently, recourse to plain steels is sometimes required, leading naturally to the adoption of chromed finishes for wear and corrosion resistance. For a given thermal rating, the thermal stresses can be reduced by a judicious choice of air–fuel ratio, but a limiting value exists beyond which the thermal stresses remain unaffected. Reductions in stress level are also possible by optimizing the wall thickness and coolant flow; this, however, leads inevitably to thin sections susceptible to distortion under firing and impact loads, impairing both the strength of the component and its wear properties. A planned degree of restraint can be adopted to control both distortion and stress level, but such measures are not always conducive to economic design and serviceability, and may actually degrade reliability when applied to steel liners.

The surrounding structure can have a first order effect on the stress levels in the liner; these effects manifest themselves as loads applied at the liner supports or flange. The design details are thus important, and the influence of parameters such as bolting load, location of the sealing spigot and block deflection is discussed in relation to the evolution of successful designs. Any structurally sound design must be capable of operating for periods of 20 000 h or more without excessive wear.
It is shown that the wear process is essentially exponential; the factors affecting its absolute value are discussed, and attempts are made to quantify their contribution.
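To make the exponential description concrete, the sketch below assumes a hypothetical running-in wear law of the form W(t) = W∞(1 − exp(−t/τ)), in which the wear rate decays exponentially towards an asymptotic wear depth. The parameter values (asymptotic depth, time constant) and the function name are illustrative assumptions, not values taken from the paper.

```python
import math

def liner_wear(t_hours, w_inf_mm=0.5, tau_hours=4000.0):
    """Hypothetical exponential wear law: wear depth approaches the
    asymptote w_inf_mm as the wear rate decays with time constant
    tau_hours. Parameters are illustrative only."""
    return w_inf_mm * (1.0 - math.exp(-t_hours / tau_hours))

# Illustrative check against the 20 000 h service target quoted above:
service_wear = liner_wear(20_000)
print(f"wear after 20 000 h: {service_wear:.3f} mm")  # ≈ 0.497 mm
```

Under this assumed form, most of the wear accrues during running-in and the depth stays bounded below the asymptote, which is consistent with a liner remaining serviceable over a 20 000 h period.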