The evaluation of the bit error rate (BER) of a digital communication system is usually carried out by simulation using the Monte Carlo (MC) method. For low BERs, this method requires a very large sample size to produce the rare error events. To overcome this limitation, various importance sampling techniques have been used to reduce simulation runtime. This reduction is achieved by modifying the noise distribution and then weighting the observed errors so that the estimator remains unbiased. In this paper, we consider the applicability of the conventional importance sampling (CIS) technique to the simulation of orthogonal space-time block coded (OSTBC) systems over frequency-nonselective Rayleigh fading channels. For additive white Gaussian noise (AWGN) channels, it is shown that CIS, by increasing the noise variance, dramatically reduces the required sample size, especially at low BERs. However, our results show that high efficiency cannot be achieved by biasing the receiver noise processes in OSTBC systems. We show numerically that it is more efficient to bias the Rayleigh fading process by decreasing the Rayleigh variance. Furthermore, it is shown that the variance improvement is limited by the dimensionality of the system.
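The CIS idea described above can be illustrated with a minimal sketch for the simplest case the abstract mentions, BPSK over an AWGN channel: the noise is drawn from a biased density with an inflated standard deviation `sigma_star`, and each error event is weighted by the likelihood ratio of the true to the biased density so the estimator stays unbiased. The function and parameter names here are hypothetical, not from the paper.

```python
import math
import random


def q_func(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))


def cis_ber_bpsk(sigma, sigma_star, n_samples, seed=1):
    """Estimate the BPSK/AWGN BER by conventional importance sampling.

    Noise is sampled from the biased density N(0, sigma_star^2) instead of
    the true N(0, sigma^2); each decision error is weighted by the
    likelihood ratio f(n)/g(n), which keeps the estimator unbiased while
    making the rare error events far more frequent.
    """
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n_samples):
        n = rng.gauss(0.0, sigma_star)       # biased (inflated-variance) noise
        if 1.0 + n < 0.0:                    # transmitted +1 decided as -1
            # Likelihood ratio of true vs. biased Gaussian densities at n
            w = (sigma_star / sigma) * math.exp(
                -n * n / (2.0 * sigma * sigma)
                + n * n / (2.0 * sigma_star * sigma_star)
            )
            acc += w
    return acc / n_samples


sigma = 0.25                                  # true noise std: BER = Q(4) ~ 3.2e-5
ber_is = cis_ber_bpsk(sigma, sigma_star=1.0, n_samples=100_000)
ber_exact = q_func(1.0 / sigma)
```

At this BER (about 3 x 10^-5), a plain MC run would need millions of samples for a stable estimate, whereas the biased run above concentrates its samples near the decision boundary and converges with 10^5.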