2006
DOI: 10.1016/j.physa.2006.03.024
Entropic criterion for model selection

Abstract: Model or variable selection is usually achieved by ranking models in increasing order of preference. One common method is to apply the Kullback-Leibler distance, or relative entropy, as the selection criterion. This raises two questions: why use this criterion, and are there other criteria? Moreover, conventional approaches require a reference prior, which is usually difficult to obtain. Following the logic of inductive inference proposed by Caticha [1], we show relative entropy to be a unique …
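The abstract describes ranking candidate models by relative entropy. As a rough illustration of that kind of ranking (not the paper's own derivation), the sketch below scores hypothetical candidate distributions against an empirical histogram by their Kullback-Leibler distance and orders them by preference; all names and numbers are made up.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler distance D(p || q) between two discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()
    mask = p > 0  # bins with p = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / np.clip(q[mask], eps, None))))

def rank_models(empirical, candidates):
    """Rank candidate model distributions by increasing KL distance
    from the empirical distribution (smaller distance = more preferred)."""
    scores = {name: kl_divergence(empirical, q) for name, q in candidates.items()}
    return sorted(scores.items(), key=lambda kv: kv[1])

# Hypothetical empirical histogram and two candidate model distributions.
data_hist = [0.1, 0.4, 0.3, 0.2]
models = {"model_A": [0.25, 0.25, 0.25, 0.25],
          "model_B": [0.10, 0.35, 0.35, 0.20]}
print(rank_models(data_hist, models))
```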

Cited by 16 publications (17 citation statements)
References 27 publications (50 reference statements)
“…The basic principle is based on the degree of variation among the characteristic values of the evaluation indexes; the greater the degree of variation and the smaller the information entropy, the greater is the weight of the corresponding index, and vice versa. This method avoids human subjectivity and thus yields accurate results [36,37]. The specific steps of the method are as follows.…”
Section: Reciprocals (mentioning)
confidence: 99%
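The quoted passage describes the standard entropy weight method: indexes with greater variation (lower information entropy) receive larger weights. A minimal sketch of that method is below, assuming a non-negative decision matrix whose rows are alternatives and whose columns are evaluation indexes; the matrix values are hypothetical.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: indexes (columns) with lower information
    entropy, i.e. greater variation across alternatives, get larger weights."""
    X = np.asarray(X, dtype=float)
    # Normalize each index column to a probability distribution.
    P = X / X.sum(axis=0, keepdims=True)
    n = X.shape[0]
    # Information entropy of each index, with the convention 0*log(0) = 0.
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)
    E = -(P * logP).sum(axis=0) / np.log(n)
    # Degree of divergence and normalized weights.
    d = 1.0 - E
    return d / d.sum()

# Hypothetical 4-alternative, 3-index evaluation matrix.
X = [[2.0, 7.0, 5.0],
     [3.0, 7.5, 9.0],
     [2.5, 7.2, 1.0],
     [3.1, 7.1, 4.0]]
print(entropy_weights(X))
```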
“…First, we needed to determine the probability distributions of amplitudes of IMFs from real signal and noise. Second, we evaluated the relative entropy or Kullback-Leibler distance of these two distributions to determine their difference (Tseng, 2006; Chen et al., 2007). For the probability distributions from real signals, we simply utilized normalized histograms of amplitudes p real i (t).…”
Section: Entropic Analysis (mentioning)
confidence: 99%
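The citing work compares amplitude histograms of IMFs from a real signal and from noise via relative entropy. A self-contained sketch of that comparison, using synthetic stand-ins for the IMF amplitudes and an arbitrary common bin grid, might look like this:

```python
import numpy as np

def amplitude_histogram(x, bins):
    """Normalized histogram of amplitudes -> discrete probability distribution."""
    counts, _ = np.histogram(x, bins=bins)
    return counts / counts.sum()

def kl_distance(p, q, eps=1e-12):
    """Relative entropy D(p || q) between two discrete distributions."""
    p = np.clip(np.asarray(p, dtype=float), eps, None)
    q = np.clip(np.asarray(q, dtype=float), eps, None)
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

# Hypothetical IMF amplitudes from a signal and a pure-noise reference,
# compared over a shared set of histogram bins.
rng = np.random.default_rng(0)
imf_signal = np.sin(np.linspace(0, 20, 2000)) + 0.2 * rng.standard_normal(2000)
noise_ref = rng.standard_normal(2000)
bins = np.linspace(-3, 3, 41)
p_real = amplitude_histogram(imf_signal, bins)
p_noise = amplitude_histogram(noise_ref, bins)
print(kl_distance(p_real, p_noise))
```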
“…By repeating this same procedure, one can obtain the joint probability distribution of an L-mer nucleotide sequence; (3) apply the entropic criterion [44,45] to determine the preferred sequence length L (the reader is referred to [16] for further details).…”
Section: Theory (mentioning)
confidence: 99%
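The citing work builds the joint probability distribution of L-mer nucleotide sequences and then applies the entropic criterion to select a preferred length L. The toy sketch below only constructs empirical L-mer distributions and reports their block entropies for a made-up sequence; the actual selection rule is given in the cited papers.

```python
import numpy as np
from collections import Counter

def lmer_distribution(seq, L):
    """Empirical joint probability distribution of L-mers in a nucleotide sequence."""
    counts = Counter(seq[i:i + L] for i in range(len(seq) - L + 1))
    total = sum(counts.values())
    return {kmer: c / total for kmer, c in counts.items()}

def block_entropy(dist):
    """Shannon entropy of an L-mer distribution (in nats)."""
    p = np.array(list(dist.values()))
    return float(-(p * np.log(p)).sum())

# Hypothetical use: scan L and inspect how the block entropy grows; the
# entropic criterion of the cited work would pick a preferred L from such
# statistics (details in the original references).
seq = "ACGTACGTTGCAACGTACGGTTAACCGGTACGT"
for L in range(1, 6):
    print(L, round(block_entropy(lmer_distribution(seq, L)), 3))
```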