Abstract. We give tight bounds for the Tsallis relative operator entropy by using the Hermite–Hadamard inequality. Some reverse inequalities related to Young's inequality are also given. In addition, operator inequalities for normalized positive linear maps with the Tsallis relative operator entropy are given.
Mathematics subject classification (2010): 47A63, 46L05, 47A60.
We give bounds on the difference between the weighted arithmetic mean and the weighted geometric mean. These imply refined Young inequalities and the reverses of the Young inequality. We also study some properties on the difference between the weighted arithmetic mean and the weighted geometric mean. Applying the newly obtained inequalities, we show some results on the Tsallis divergence, the Rényi divergence, the Jeffreys-Tsallis divergence and the Jensen-Shannon-Tsallis divergence.
“…We compare Theorem 2.6 and Theorem 2.2 in [16]. The inequalities k_p(t) ≤ c_p(t) ≤ l_p(t) + (t−1)²/4 given in (5) are equivalent to the following inequalities…”
Section: Remark 27
“…In [16], we obtained the estimates on Tsallis relative operator entropy by the use of Hermite-Hadamard inequality:…”
Section: Alternative Estimate Of Tsallis Relative Operator Entropy
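The snippet above appeals to the scalar Hermite–Hadamard inequality: for a convex function f on [a, b], f((a+b)/2) ≤ (1/(b−a))∫ₐᵇ f(x)dx ≤ (f(a)+f(b))/2. A small illustrative sketch (this checks only the scalar inequality, not the paper's operator estimates):

```python
import numpy as np

def hermite_hadamard_bounds(f, a, b, n=200_001):
    """Return (midpoint bound, average value of f on [a, b], endpoint bound)."""
    xs = np.linspace(a, b, n)
    avg = f(xs).mean()  # Riemann approximation of (1/(b-a)) * integral of f
    return f((a + b) / 2), avg, (f(a) + f(b)) / 2

# exp is convex, so the midpoint bound lies below the average value,
# which lies below the endpoint (trapezoid) bound.
lo, mid, hi = hermite_hadamard_bounds(np.exp, 0.0, 1.0)
assert lo <= mid <= hi
```

Here the average value approximates e − 1 ≈ 1.718, sandwiched between e^{1/2} ≈ 1.649 and (1 + e)/2 ≈ 1.859.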
“…Theorem 2.2 ([16]) For any invertible positive operators A and B such that A ≤ B, and −1 ≤ p ≤ 1 with p ≠ 0, we have…”
Section: Alternative Estimate Of Tsallis Relative Operator Entropy
The main purpose of this article is to study estimates for the Tsallis relative operator entropy by the use of the Hermite–Hadamard inequality. Thus, we obtain alternative bounds for the Tsallis relative operator entropy. In the process of deriving these bounds, we establish a significant relation between the Tsallis relative operator entropy and the generalized relative operator entropy. In addition, we study monotonicity properties in the weight of operator means and in the parameter of relative operator entropies.
Keywords: Operator inequality, positive operator, Hermite–Hadamard inequality, operator mean, generalized relative operator entropy, Tsallis relative operator entropy.
Mathematics Subject Classification: 47A63, 47A64 and 94A17
where we respectively denote the p-weighted harmonic, geometric and arithmetic operator means by A !_p B ≡ ((1−p)A^{−1} + pB^{−1})^{−1}, A #_p B ≡ A^{1/2}(A^{−1/2}BA^{−1/2})^p A^{1/2} and A ∇_p B ≡ (1−p)A + pB for A, B > 0 and p ∈ [0, 1]. On the other hand, Tsallis defined the one-parameter extended entropy for the analysis of a physical model in statistical physics in [19]. The properties of the Tsallis relative entropy were studied in [6, 7] by Furuichi, Yanagi and Kuriyama. The relative operator entropy is defined by S(A|B) := A^{1/2} log(A^{−1/2}BA^{−1/2}) A^{1/2}.
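The three operator means above, together with the Tsallis relative operator entropy T_p(A|B) = (A #_p B − A)/p built from the geometric mean, can be sketched numerically for positive definite matrices. This is an illustrative NumPy sketch under the stated definitions (matrix powers via eigendecomposition), not code from the paper:

```python
import numpy as np

def mat_pow(A, p):
    """A^p for a symmetric positive definite matrix, via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * w**p) @ V.T

def harmonic(A, B, p):
    """A !_p B = ((1-p) A^{-1} + p B^{-1})^{-1}"""
    return np.linalg.inv((1 - p) * np.linalg.inv(A) + p * np.linalg.inv(B))

def geometric(A, B, p):
    """A #_p B = A^{1/2} (A^{-1/2} B A^{-1/2})^p A^{1/2}"""
    Ah, Aih = mat_pow(A, 0.5), mat_pow(A, -0.5)
    C = Aih @ B @ Aih
    C = (C + C.T) / 2  # symmetrize against round-off
    return Ah @ mat_pow(C, p) @ Ah

def arithmetic(A, B, p):
    """A nabla_p B = (1-p) A + p B"""
    return (1 - p) * A + p * B

def tsallis_entropy(A, B, p):
    """T_p(A|B) = (A #_p B - A) / p for p != 0."""
    return (geometric(A, B, p) - A) / p

# Spot-check the harmonic-geometric-arithmetic mean inequality in the
# Loewner order: A !_p B <= A #_p B <= A nabla_p B for positive definite A, B.
rng = np.random.default_rng(1)
M, N = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
A, B = M @ M.T + np.eye(3), N @ N.T + np.eye(3)
for p in (0.25, 0.5, 0.75):
    H, G, Ar = harmonic(A, B, p), geometric(A, B, p), arithmetic(A, B, p)
    assert np.linalg.eigvalsh(G - H).min() > -1e-9
    assert np.linalg.eigvalsh(Ar - G).min() > -1e-9
```

For scalars these reduce to the weighted harmonic–geometric–arithmetic mean inequality, and T_p(A|B) → S(A|B) as p → 0.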
“…The Jeffreys divergence (see [22, 23]) is defined as the symmetrized relative entropy D(p‖q) + D(q‖p), and the Jensen–Shannon divergence [15, 16] is defined as the average relative entropy of p and q to their midpoint (p+q)/2. In [24], the Jeffreys and the Jensen–Shannon divergences are extended to biparametric forms. In [23], Furuichi and Mitroi generalize these divergences to the Jeffreys–Tsallis divergence and the Jensen–Shannon–Tsallis divergence. Several properties of divergences can be extended to operator theory [25].…”
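For probability vectors, the classical versions of the divergences named above can be computed directly. This is an illustrative sketch using the standard Jeffreys and Jensen–Shannon definitions and one common convention for the Tsallis relative entropy; the Tsallis-deformed divergences of [23] may use a different normalization:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p||q) for strictly positive vectors."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

def tsallis_div(p, q, t):
    """Tsallis relative entropy (one common convention); -> KL as t -> 1."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float((1.0 - np.sum(p**t * q**(1.0 - t))) / (1.0 - t))

def jeffreys(p, q):
    """Jeffreys divergence: symmetrized KL."""
    return kl(p, q) + kl(q, p)

def jensen_shannon(p, q):
    """Jensen-Shannon divergence: average KL to the midpoint distribution."""
    m = (np.asarray(p, float) + np.asarray(q, float)) / 2
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p, q = [0.7, 0.3], [0.4, 0.6]
assert abs(tsallis_div(p, q, 0.999) - kl(p, q)) < 1e-2   # t -> 1 recovers KL
assert jensen_shannon(p, q) <= np.log(2) + 1e-12          # JS is bounded by log 2
assert abs(jeffreys(p, q) - jeffreys(q, p)) < 1e-12       # Jeffreys is symmetric
```

The Tsallis forms replace the logarithm by its q-deformation, which is why they recover the classical divergences in the limit t → 1.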
Section: Applications To Some Divergences