We investigate the effect of adaptive time-stepping and other algorithmic strategies on the computational stability of ODE codes. We show that carefully designed adaptive algorithms have a decisive impact on computational stability and reliability. A series of computational experiments with the standard implementation of Dassl and a modified version, featuring stepsize control based on digital filters, demonstrates that relatively small algorithmic changes can yield vastly better computational stability at no extra expense. The inherent performance and stability of Dassl are therefore much greater than the standard implementation suggests.

The objective of this paper is to investigate the effect of adaptive time-stepping and other algorithmic strategies on what we refer to as the computational stability of an ODE solver. All stability notions are concerned with continuous dependence on data. For computational stability, we require that the complete algorithm, as well as its implementation (including all internal decision making of the code), be a "continuous" map from data to computed results, provided that the ODE being solved is sufficiently smooth. In other words, a small change of parameters should have only a small effect on the computed output. Alternatively, one could say that the computational process is well-conditioned.

Ideally, this should hold for any parameter, whether it is a parameter of the ODE itself, a parameter in the algorithmic specification of the code, or a parameter in the calling sequence of the software. Because it concerns software, the notion of computational stability goes beyond the classical well-conditioning requirement of an algorithm.
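To make the idea of filter-based stepsize control concrete, the following is a minimal sketch of one step of such a controller, assuming the H211b digital filter from Söderlind's controller family; the function name and interface are illustrative and are not taken from Dassl itself.

```python
def h211b_step(h, h_prev, err, err_prev, tol, b=4.0, k=2):
    """One step of an H211b digital-filter stepsize controller (sketch).

    h, h_prev     : current and previous stepsizes
    err, err_prev : current and previous error estimates (assumed > 0)
    tol           : error tolerance
    b             : filter design parameter (b = 4 is a common choice)
    k             : order of the error estimator

    The new stepsize depends smoothly on the last two error estimates
    and the last stepsize ratio, producing a smoother stepsize sequence
    than the classical controller h_new = h * (tol / err) ** (1.0 / k),
    which reacts to the latest error estimate alone.
    """
    beta = 1.0 / (b * k)   # gain on each of the two error ratios
    alpha = 1.0 / b        # gain on the stepsize ratio
    ratio = ((tol / err) ** beta
             * (tol / err_prev) ** beta
             * (h / h_prev) ** (-alpha))
    return h * ratio
```

Because the filter averages information over several steps, small perturbations of the problem or the tolerance perturb the stepsize sequence only slightly; this smoothness is precisely the kind of algorithmic property the computational stability experiments probe.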