The problem of transmitting information in a specified direction over a communication channel with three terminals is considered. Examples are given of the various ways of sending information. Basic inequalities for average mutual information rates are obtained. A coding theorem and weak converse are proved and a necessary and sufficient condition for a positive capacity is derived. Upper and lower bounds on the capacity are obtained, which coincide for channels with symmetric structure.
The problem of the nonparametric estimation of a probability distribution is considered from three viewpoints: consistency in total variation, consistency in information divergence, and consistency in reversed-order information divergence. These types of consistency are relatively strong criteria of convergence, and a probability distribution cannot be consistently estimated in any of these senses without restrictions on the class of probability distributions allowed. Histogram-based estimators of the distribution are presented which, under certain conditions, converge in total variation, in information divergence, and in reversed-order information divergence to the unknown probability distribution. Some a priori information about the true probability distribution is assumed in each case. As consistency in information divergence is a stronger notion than convergence in total variation, additional assumptions are imposed in the cases of informational divergences.
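As a concrete illustration of the first two criteria, the sketch below builds an equal-width histogram estimate on [0, 1] and measures its total-variation and information-divergence error against the exact cell probabilities of a known (uniform) distribution. The function names, bin count, and sample size are illustrative choices, not taken from the paper.

```python
import numpy as np

def histogram_probs(sample, k):
    """Empirical cell probabilities of an equal-width k-bin histogram on [0, 1]."""
    counts, _ = np.histogram(sample, bins=k, range=(0.0, 1.0))
    return counts / counts.sum()

def total_variation(p, q):
    """Total-variation distance between two distributions on the same partition."""
    return 0.5 * np.sum(np.abs(p - q))

def information_divergence(p, q):
    """Kullback-Leibler divergence D(p || q), with 0 * log(0/q) taken as 0."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

rng = np.random.default_rng(0)
k = 10
p_hat = histogram_probs(rng.uniform(size=5000), k)
p_true = np.full(k, 1.0 / k)           # exact cell probabilities of U[0, 1]
tv = total_variation(p_hat, p_true)    # shrinks as the sample size grows
kl = information_divergence(p_hat, p_true)
```

Both error measures are computed on the finite partition induced by the bins; as the abstract indicates, consistency in information divergence is the more demanding criterion and in general requires extra assumptions on the true distribution.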
Published in the Journal of the American Statistical Association. The power properties of an entropy-based test are investigated when it is used for testing uniformity on [0, 1]. Percentage points and power against seven alternatives are reported. Compared with other tests of uniformity, the entropy-based test possesses good power properties for many alternatives. Some asymptotic null and alternative distributions are derived. For sample sizes up to 100, the table of percentage points provides a practical guide for using this test. A theory of entropy-based tests of distributional hypotheses other than uniformity is outlined.

KEY WORDS: Entropy-based statistical inference; Tests of fit; Goodness of fit.
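The entropy statistic behind tests of this kind is commonly Vasicek's m-spacing estimator of differential entropy; since the uniform distribution maximizes entropy on [0, 1], a small estimated entropy is evidence against uniformity. The sketch below is one plausible implementation under that assumption; the function name and the window parameter m are illustrative.

```python
import numpy as np

def vasicek_entropy(x, m):
    """Vasicek's m-spacing estimate of differential entropy:

    H = (1/n) * sum_i log( n/(2m) * (X_(i+m) - X_(i-m)) ),

    with order statistics clamped to the sample range at the edges.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    idx = np.arange(n)
    lo = np.clip(idx - m, 0, n - 1)   # X_(i-m), clamped to X_(1)
    hi = np.clip(idx + m, 0, n - 1)   # X_(i+m), clamped to X_(n)
    return float(np.mean(np.log(n / (2 * m) * (x[hi] - x[lo]))))

rng = np.random.default_rng(1)
h_unif = vasicek_entropy(rng.uniform(size=2000), m=10)           # near 0, the maximum on [0, 1]
h_peaked = vasicek_entropy(rng.beta(5.0, 5.0, size=2000), m=10)  # well below 0
```

A test of uniformity would reject when the statistic falls below a percentage point tabulated under the null, which is exactly the kind of table the abstract describes for sample sizes up to 100.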