2003 IEEE International Conference on Acoustics, Speech, and Signal Processing, 2003. Proceedings. (ICASSP '03).
DOI: 10.1109/icassp.2003.1201639

A small sample model selection criterion based on Kullback's symmetric divergence

Abstract: The Kullback information criterion KIC is a recently developed tool for statistical model selection [1]. KIC serves as an asymptotically unbiased estimator of a variant of the Kullback symmetric divergence, known also as J-divergence. In this paper a bias correction of the Kullback symmetric information criterion is derived for linear models. The correction is of particular use when the sample size is small or when the number of fitted parameters is a moderate-to-large fraction of the sample size. For linear r…
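For context, a brief sketch of the quantities involved (the notation below is ours, not quoted from the abstract): the J-divergence symmetrizes the Kullback-Leibler divergence by summing the two directed divergences, and Cavanaugh's KIC pairs the maximized log-likelihood with a penalty on the order of 3k in place of AIC's 2k, where k is the number of fitted parameters (the exact constant varies with convention).

```latex
J(f, g) = \mathrm{KL}(f \,\|\, g) + \mathrm{KL}(g \,\|\, f),
\qquad
\mathrm{KL}(f \,\|\, g) = \int f(x) \, \log \frac{f(x)}{g(x)} \, dx
```

```latex
\mathrm{KIC}(k) = -2 \log f\!\left(y \mid \hat{\theta}_k\right) + 3k
\quad \text{(large-sample form; the paper derives a small-sample correction for linear models)}
```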

Cited by 26 publications (32 citation statements) · References 25 publications

“…We find that KICc significantly outperforms the other methods in terms of the quality of selected models as measured by the mean square error of approximation. These results strengthen the case for using KICc instead of KIC as originally recommended in [2].…”
Section: Introduction (supporting)
confidence: 82%
“…In addition, one may hope that by improving the bias property, one will also improve the quality of the selected models. This indeed was the motivation behind the development of the corrected KIC criterion proposed in [2]. KICc is an exactly unbiased estimator of the Kullback symmetric divergence and not only produces drastic bias reduction, but also greatly improves model selection in small samples.…”
Section: Introduction (mentioning)
confidence: 95%
“…Here the approximate Kullback information criterion (AKIC) was implemented, which attempts to balance the complexity of the model (number of poles) against how well the model fits the original data. The AKIC has the least bias and best resolution of the available model-selection criteria; see [39] for more details. The number of poles was limited to 25% of the total length of the data sequence in order to avoid overfitting in the case of a short data sequence.…”
Section: Data Analysis of EEG Signals (mentioning)
confidence: 99%
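The excerpt describes order selection for autoregressive (AR) modelling of EEG, trading the criterion's complexity penalty against goodness of fit, with the order capped at 25% of the sequence length. The exact form of AKIC is not given in the excerpt; the sketch below is a hypothetical illustration that fits AR(k) models by least squares and scores them with a generic KIC-style penalty (3k, versus AIC's 2k). The function name, the scoring rule, and the least-squares fitting are our assumptions, not the cited authors' implementation.

```python
import numpy as np

def ar_order_by_kic(x, max_frac=0.25):
    """Select an AR model order with a KIC-style criterion (illustrative sketch).

    Fits AR(k) models by least squares and scores each with
    (n - k) * log(sigma2_hat) + 3 * k, a KIC-like penalty. The candidate
    order is capped at max_frac * len(x), mirroring the 25% limit used in
    the cited EEG study to avoid overfitting short sequences.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    k_max = max(1, int(max_frac * n))
    best_k, best_score = 1, np.inf
    for k in range(1, k_max + 1):
        # Regression design: predict x[t] from x[t-1], ..., x[t-k].
        X = np.column_stack([x[k - j - 1 : n - j - 1] for j in range(k)])
        y = x[k:]
        coef, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ coef
        sigma2 = np.mean(resid ** 2)  # residual variance estimate
        score = (n - k) * np.log(sigma2) + 3 * k  # KIC-style penalty (3k vs AIC's 2k)
        if score < best_score:
            best_k, best_score = k, score
    return best_k
```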
“…The model selection criterion used in this paper is the Kullback Information Criterion (KIC), which has been shown to perform well for sEMG sensor data fusion [7][8][9][10]. The sum of the two directed divergences, which measures the models' dissimilarity, is known as Kullback's symmetric or J-divergence [17], as given by Equation (5).…”
Section: Data Fusion (mentioning)
confidence: 99%
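Equation (5) of the cited paper is not reproduced in the excerpt. As a generic illustration of "the sum of two directed divergences", the snippet below computes the J-divergence between two univariate Gaussians using the standard closed-form KL divergence; the Gaussian assumption and the function names are ours, not the cited authors'.

```python
import numpy as np

def kl_gauss(mu0, var0, mu1, var1):
    """Directed (Kullback-Leibler) divergence KL(N(mu0, var0) || N(mu1, var1))."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def j_divergence(mu0, var0, mu1, var1):
    """Kullback's symmetric (J) divergence: the sum of the two directed divergences."""
    return kl_gauss(mu0, var0, mu1, var1) + kl_gauss(mu1, var1, mu0, var0)

# Identical distributions give J = 0; the divergence grows as they separate.
print(j_divergence(0.0, 1.0, 0.0, 1.0))  # 0.0
print(j_divergence(0.0, 1.0, 2.0, 1.0))  # 4.0 (= 2.0 + 2.0)
```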