The Kullback Information Criterion (KIC) [1] and its bias-corrected version (KICc) [2] are two methods for statistical model selection of regression variables and autoregressive models. Both criteria may be viewed as estimators of the Kullback symmetric divergence between the true model and the fitted approximating model. The bias of KIC and KICc is studied in the underfitting case, where none of the candidate models includes the true model. Here, only normal linear regression models are considered, and exact expressions of the bias are obtained for KIC and KICc. The bias of KICc is often smaller, in most cases drastically smaller, than that of KIC. A simulation study in which the true model is an infinite-order polynomial expansion shows that for small and moderate sample sizes KICc provides better model selection than KIC. Furthermore, KICc outperforms the two well-known criteria AIC and AICc.
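To illustrate the kind of comparison the abstract describes, the following is a minimal sketch of criterion-based order selection for polynomial regression. It is not the paper's exact procedure: the penalty forms used here (AIC-style penalty 2k versus a KIC-style penalty 3k, where k is the number of estimated regression coefficients) are assumptions based on the commonly cited large-sample definitions; consult [1] and [2] for the exact small-sample corrections. The smooth exponential truth stands in for an "infinite-order polynomial expansion".

```python
import numpy as np

def fit_poly_rss(x, y, k):
    """Least-squares fit of a degree-(k-1) polynomial (k coefficients); return RSS."""
    X = np.vander(x, k, increasing=True)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

def select_order(x, y, max_k, criterion):
    """Return the k minimizing the criterion, plus all scores (ties -> smaller k)."""
    n = len(y)
    scores = {}
    for k in range(1, max_k + 1):
        rss = fit_poly_rss(x, y, k)
        # -2 log-likelihood of the Gaussian fit, up to an additive constant
        neg2loglik = n * np.log(rss / n)
        scores[k] = criterion(neg2loglik, n, k)
    return min(scores, key=scores.get), scores

# Assumed large-sample penalty forms (see [1], [2] for exact definitions):
aic = lambda m2l, n, k: m2l + 2 * k  # Akaike-style penalty
kic = lambda m2l, n, k: m2l + 3 * k  # heavier, Kullback-symmetric-style penalty

rng = np.random.default_rng(0)
n = 40
x = np.linspace(0.0, 1.0, n)
y = np.exp(x) + 0.05 * rng.standard_normal(n)  # smooth, non-polynomial truth

k_aic, _ = select_order(x, y, 8, aic)
k_kic, _ = select_order(x, y, 8, kic)
print(k_aic, k_kic)
```

Because the KIC-style penalty charges more per parameter, the order it selects can never exceed the AIC-style choice on the same data, which is one mechanism behind the more parsimonious selections reported for small samples.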