Motivated by the method of interpolating inequalities that makes use of improved Jensen-type inequalities, in this paper we combine this approach with the well-known Zipf–Mandelbrot law applied to various types of f-divergences and distances, such as the Kullback–Leibler divergence, the Hellinger distance, the Bhattacharyya distance (via its coefficient), the χ²-divergence, the total variation distance, and the triangular discrimination. Addressing these applications, we first deduce general results of this type for the Csiszár divergence functional, from which the listed divergences originate. When presenting the analyzed inequalities for the Zipf–Mandelbrot law, we accentuate its special form, the Zipf law, with its specific role in linguistics. We introduce this aspect through the Zipfian word distributions associated with the English and Russian languages, using the obtained bounds for the Kullback–Leibler divergence.
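As a concrete illustration of the Csiszár divergence functional D_f(p‖q) = Σ_i q_i f(p_i/q_i) from which the listed divergences originate, the following sketch (with illustrative distributions, not the paper's data) shows how particular convex generators recover the Kullback–Leibler divergence and the total variation distance, and how Jensen's inequality yields the nonnegativity bound D_f(p‖q) ≥ f(1) = 0:

```python
import math

def csiszar_divergence(f, p, q):
    """Csiszar f-divergence D_f(p || q) = sum_i q_i * f(p_i / q_i)."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

# Convex generators for two of the divergences named in the abstract.
f_kl = lambda t: t * math.log(t)   # recovers the Kullback-Leibler divergence
f_tv = lambda t: 0.5 * abs(t - 1)  # recovers the total variation distance

# Illustrative probability distributions (hypothetical, not from the paper).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

# Jensen's inequality for convex f: D_f(p||q) >= f(sum_i q_i * p_i/q_i) = f(1) = 0.
assert csiszar_divergence(f_kl, p, q) >= 0.0
assert csiszar_divergence(f_tv, p, q) >= 0.0
```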
In this paper, Levinson's inequality for 3-convex functions is generalized by using two Green functions. New Čebyšev-, Grüss-, and Ostrowski-type bounds are found for the functionals involving data points of two types. Moreover, the main results are applied to information theory via the f-divergence, the Rényi divergence, the Rényi entropy, the Shannon entropy, and the Zipf–Mandelbrot law.
In this paper, Levinson-type inequalities are studied for the class of higher-order convex functions by using Abel–Gontscharoff interpolation. New Čebyšev-, Grüss-, and Ostrowski-type bounds are also found for the functionals involving data points of two types.
In this paper, we consider the definitions of the “useful” Csiszár divergence and the “useful” Zipf–Mandelbrot law associated with a real utility distribution to give results for majorization inequalities by using monotonic sequences. We obtain equivalent statements between continuous convex functions and Green functions via majorization inequalities, the “useful” Csiszár functional, and the “useful” Zipf–Mandelbrot law. By considering the “useful” Csiszár divergence in the integral case, we give results for the integral majorization inequality. Towards the end, some applications are given.
Jensen’s inequality is important for obtaining inequalities for the divergence between probability distributions. By applying a refinement of Jensen’s inequality (Horváth et al. in Math. Inequal. Appl. 14:777–791, 2011) and introducing a new functional based on an f-divergence functional, we obtain some estimates for the new functionals, the f-divergence, and the Rényi divergence. Some inequalities for Rényi and Shannon estimates are constructed. The Zipf–Mandelbrot law is used to illustrate the results. In addition, we generalize the refinement of Jensen’s inequality and obtain new inequalities for the Rényi and Shannon entropies of an m-convex function using the Montgomery identity. It is also shown that the maximization of the Shannon entropy yields a transition from the Zipf–Mandelbrot law to a hybrid Zipf–Mandelbrot law.
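To make the quantities involved concrete, the sketch below builds a Zipf–Mandelbrot distribution, f(k; N, q, s) ∝ (k + q)^(−s) over ranks k = 1, …, N, and evaluates the Kullback–Leibler divergence between two such laws. The parameters here are hypothetical placeholders, not fitted values from the paper:

```python
import math

def zipf_mandelbrot(N, q, s):
    """Zipf-Mandelbrot pmf over ranks 1..N: f(k) proportional to 1/(k+q)^s."""
    weights = [1.0 / (k + q) ** s for k in range(1, N + 1)]
    total = sum(weights)  # the generalized harmonic normalizer H_{N,q,s}
    return [w / total for w in weights]

def kl_divergence(p, r):
    """Kullback-Leibler divergence D(p || r) = sum_k p_k * log(p_k / r_k)."""
    return sum(pk * math.log(pk / rk) for pk, rk in zip(p, r))

# Hypothetical parameter choices standing in for two fitted rank distributions.
p = zipf_mandelbrot(N=1000, q=2.7, s=1.0)
r = zipf_mandelbrot(N=1000, q=1.5, s=1.1)
print(kl_divergence(p, r))  # nonnegative, and zero exactly when p == r
```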
In this paper we show how the Shannon entropy is connected to the theory of majorization. Both are linked to the measure of disorder in a system; however, the theory of majorization usually gives stronger criteria than the entropic inequalities. We give some generalized results for the majorization inequality using the Csiszár f-divergence. Applied to certain special convex functions, this divergence reduces the majorization results to the form of the Shannon entropy and the Kullback–Leibler divergence. We give several applications using the Zipf–Mandelbrot law.
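The majorization–entropy link described above can be checked numerically: the Shannon entropy is Schur-concave, so if p majorizes q, then H(p) ≤ H(q). A minimal sketch with illustrative distributions (not from the paper):

```python
import math

def majorizes(p, q):
    """True if p majorizes q: equal total mass, and every prefix sum of the
    descending-sorted p dominates the corresponding prefix sum of q."""
    ps, qs = sorted(p, reverse=True), sorted(q, reverse=True)
    cp = cq = 0.0
    for a, b in zip(ps, qs):
        cp += a
        cq += b
        if cp < cq - 1e-12:
            return False
    return abs(cp - cq) < 1e-9

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_k p_k * log(p_k), in nats."""
    return -sum(x * math.log(x) for x in p if x > 0)

p = [0.7, 0.2, 0.1]    # more concentrated distribution
q = [0.4, 0.35, 0.25]  # more spread-out distribution
assert majorizes(p, q)
assert shannon_entropy(p) <= shannon_entropy(q)  # Schur-concavity of entropy
```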