2013
DOI: 10.11648/j.ajtas.20130202.12

Information Theoretic Models for Dependence Analysis and Missing Data Estimation

Abstract: In the present communication, an information theoretic dependence measure has been defined using the maximum entropy principle; it measures the amount of dependence among the attributes in a contingency table. A relation between the information theoretic measure of dependence and the chi-square statistic has been discussed. A generalization of this information theoretic dependence measure has also been studied. In the end, Yates' method and maximum entropy estimation of missing data in design of experiments have been described…
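For illustration, a minimal sketch of the relation the abstract mentions: the mutual information I of a contingency table satisfies 2N·I = G², the likelihood-ratio statistic, which is asymptotically equivalent to Pearson's chi-square with (m − 1)(n − 1) degrees of freedom. This is not the authors' model; the table, variable names, and use of scipy are assumptions made for this example.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Illustrative 2x3 contingency table of observed counts (hypothetical data).
table = np.array([[30, 20, 10],
                  [15, 25, 40]])

N = table.sum()
p_ij = table / N                       # joint cell proportions
p_i = p_ij.sum(axis=1, keepdims=True)  # row marginals
p_j = p_ij.sum(axis=0, keepdims=True)  # column marginals

# Mutual information I = sum_ij p_ij * log(p_ij / (p_i * p_j)), in nats.
mask = p_ij > 0
mi = np.sum(p_ij[mask] * np.log(p_ij[mask] / (p_i @ p_j)[mask]))

# 2*N*I equals the likelihood-ratio statistic G^2, which, like Pearson's
# chi-square, is asymptotically chi-square with (m-1)(n-1) degrees of freedom.
g2 = 2 * N * mi
chi2_stat, p_value, dof, expected = chi2_contingency(table, correction=False)
print(f"I = {mi:.4f} nats, G^2 = 2NI = {g2:.3f}, "
      f"Pearson chi2 = {chi2_stat:.3f}, dof = {dof}")
```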

Cited by 1 publication (2019; 1 citation statement). References 4 publications.
“…A decision about H₀ is made by comparing the calculated value of chi-square from (1.1) with the tabulated value of chi-square for (m − 1)(n − 1) degrees of freedom at the α% level of significance. Hooda and Kumar [1] studied an information theoretic model to measure the dependence among the attributes in a contingency table by applying the maximum entropy principle.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
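The decision rule quoted above can be sketched as follows: reject H₀ when the computed chi-square exceeds the tabulated critical value for (m − 1)(n − 1) degrees of freedom at level α. The contingency table, α = 0.05, and the scipy-based implementation below are illustrative assumptions, not taken from the cited papers.

```python
import numpy as np
from scipy.stats import chi2, chi2_contingency

alpha = 0.05                           # illustrative significance level
table = np.array([[30, 20, 10],        # hypothetical m x n table of counts
                  [15, 25, 40]])

chi2_stat, p_value, dof, expected = chi2_contingency(table, correction=False)
critical = chi2.ppf(1 - alpha, dof)    # tabulated value, (m-1)(n-1) dof

# Reject H0 (independence of the two attributes) if the statistic exceeds
# the critical value, or equivalently if p_value < alpha.
reject = chi2_stat > critical
print(f"chi2 = {chi2_stat:.3f}, critical = {critical:.3f}, "
      f"dof = {dof}, reject H0: {reject}")
```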