The concept of weighted entropy takes into account the values of different outcomes, i.e., it makes entropy context-dependent, via a weight function. In this paper we establish a number of simple inequalities for weighted entropies (general as well as specific), mirroring similar bounds on standard (Shannon) entropies and related quantities. The required assumptions are written in terms of various expectations of the weight functions. Examples include weighted Ky Fan and weighted Hadamard inequalities involving determinants of positive-definite matrices, and weighted Cramér-Rao inequalities involving the weighted Fisher information matrix.
The weighted Gibbs inequality and its consequences

The definition and initial results on weighted entropy were introduced in [1,11]. The purpose was to introduce disparity between outcomes of the same probability: in the case of a standard entropy, such outcomes contribute the same amount of information/uncertainty, which is appropriate in context-free situations. However, imagine two equally rare medical conditions, occurring with probability p << 1, one of which carries a major health risk while the other is just a peculiarity. Formally, they provide the same amount of information, − log p, but the value of this information can be very different. The weight, or weight function, was intended to fulfill this task, at least to a certain extent. The initial results have been further extended and deepened in [24,7,14,23,25,31,15] and, more recently, in [6,26,2,22,27]. Certain applications have emerged, see [8,13], along with a number of theoretical suggestions.

The purpose of this note is to extend a number of inequalities, established previously for the standard (Shannon) entropy, to the case of the weighted entropy. We particularly mention the Ky Fan- and Hadamard-type inequalities from [3,9,20], which are related to (standard) Gaussian entropies. Extended inequalities for weighted entropies have already found applications and further developments in [28,29,30]. Another kind of bound, the weighted Cramér-Rao inequalities, may be useful in statistics.

An additional motivation for studying weighted entropy (WE) is provided by the following questions. (I) What is the rate at which the WE is produced by a sample of a random process (and what could be an analog of the Shannon-McMillan-Breiman theorem)? (II) What would be an analog of Shannon's second coding theorem when an incorrect channel output causes a penalty but does not invalidate the transmission session?
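For concreteness, the definition going back to [1,11] can be stated as follows (a sketch; notation varies across the literature, and φ below denotes the weight function):

```latex
% Weighted entropy of a probability mass function p with weight function \varphi \ge 0:
S^{\mathrm{w}}_\varphi(p) \;=\; -\sum_{i} \varphi(x_i)\, p(x_i) \log p(x_i),
% and, for a probability density f, its differential counterpart:
h^{\mathrm{w}}_\varphi(f) \;=\; -\int \varphi(x)\, f(x) \log f(x)\, \mathrm{d}x.
% Taking \varphi \equiv 1 recovers the standard Shannon (differential) entropy.
```

In the medical example above, choosing φ large on the high-risk condition and small on the harmless one makes the two equally improbable outcomes contribute differently to the entropy.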
Properties of the WE established in the current paper could be helpful in this line of research. One of the naturally emerging questions concerns the form/structure of the weight function (WF). In this paper we focus on some simple inequalities (as suggested by the title). Our results hold for fairly general weight functions.

2010 Mathematics Subject Classification: 60A10, 60B05, 60C05
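As an illustration of the title inequality of this section, a weighted analog of the Gibbs inequality follows from the elementary bound log t ≤ t − 1 (a sketch, under an explicit expectation condition on the weight function; the precise assumptions in the text may differ):

```latex
% For densities f, g and a weight function \varphi \ge 0, applying
% \log t \le t - 1 with t = g(x)/f(x) and integrating against \varphi f gives
\int \varphi(x)\, f(x) \log \frac{g(x)}{f(x)}\, \mathrm{d}x
  \;\le\; \int \varphi(x)\,\bigl(g(x) - f(x)\bigr)\, \mathrm{d}x .
% Hence, whenever \int \varphi\,(f - g)\,\mathrm{d}x \ge 0, one obtains
h^{\mathrm{w}}_\varphi(f) \;=\; -\int \varphi f \log f\, \mathrm{d}x
  \;\le\; -\int \varphi f \log g\, \mathrm{d}x ,
% which reduces to the standard Gibbs inequality when \varphi \equiv 1.
```

Note how the condition on the expectations of φ under f and g replaces the unconditional validity of the standard Gibbs inequality; conditions of this type recur throughout the weighted bounds below.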