2019
DOI: 10.1186/s13662-019-2309-8

The logarithmic concavity of modified Bessel functions of the first kind and its related functions

Abstract: This research demonstrates the log-convexity and log-concavity of the modified Bessel function of the first kind and related functions. The method of coefficients is used to verify these properties. One of our results contradicts the conjecture proposed by Neumann in 2007, which states that the modified Bessel function of the first kind I_ν is log-concave on (0, ∞) for ν > 0. The log-concavity does hold on some bounded domain. The application of the other results to Kibble's bivariate gamma distribution is also …
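As a hedged numerical illustration of the abstract's claim (this is not the paper's coefficient method), the short SciPy sketch below probes the sign of the second derivative of log I_ν(x) by finite differences; the choice ν = 1 and the sample points are arbitrary.

```python
import numpy as np
from scipy.special import ive

def log_iv(nu, x):
    # log I_nu(x) = log(ive(nu, x)) + x for x > 0, since ive is the exponentially
    # scaled Bessel function ive(nu, x) = iv(nu, x) * exp(-|x|); this avoids overflow.
    return np.log(ive(nu, x)) + x

def d2_log_iv(nu, x, h=1e-3):
    # Central finite-difference approximation to (log I_nu)''(x).
    return (log_iv(nu, x + h) - 2.0 * log_iv(nu, x) + log_iv(nu, x - h)) / h**2

nu = 1.0  # illustrative choice; the abstract's statement concerns general nu > 0
for x in (0.5, 2.0, 10.0, 50.0):
    print(f"nu={nu}, x={x:5.1f}: (log I_nu)''(x) ~ {d2_log_iv(nu, x):+.5f}")
# A sign change from negative (locally log-concave) at small x to positive
# (locally log-convex) at large x is consistent with log-concavity of I_nu
# holding only on a bounded interval rather than on all of (0, inf).
```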

Cited by 5 publications (2 citation statements). References 15 publications.
“…Now substituting the NCX distribution (A2) and its corresponding Gaussian distribution (A3) into (A1), the KL divergence can be expressed as where denotes the differential entropy of the NCX distribution (A2) given parameter x, and denotes the expectation over the NCX distribution (A2). The first term can be expressed as while the second term can be written as Since function where x is a given non-negative constant and function are concave functions [39], Jensen’s inequality is applied to obtain an upper bound on the KL divergence as Next, we find the limit of the upper bound of the KL divergence using the limits of the mean and variance already calculated in (A4) and (A5), that is Finally, using the non-negativity of the KL divergence, we have , i.e., . Therefore, we can conclude that the KL divergence between (A2) and (A3) goes to zero when x is sufficiently large, and this concludes the proof.…”
Section: Proof of Proposition
confidence: 99%
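The excerpt's formulas are not reproduced here, so the following is only a hedged numerical sketch of the limiting behaviour it describes, under the assumption (not stated in the excerpt) that the NCX distribution reduces to the Rician case, i.e. two degrees of freedom: the KL divergence between the noncentral chi density and a moment-matched Gaussian is evaluated by quadrature and shrinks as the noncentrality parameter grows.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

def kl_rice_vs_gaussian(b):
    # Rician distribution = noncentral chi with 2 degrees of freedom (assumed special case).
    rice = stats.rice(b)
    mean, var = rice.stats(moments="mv")          # moment-matched Gaussian parameters
    gauss = stats.norm(loc=mean, scale=np.sqrt(var))

    def integrand(y):
        p = rice.pdf(y)
        # Guard against log(0) where the Rician density vanishes.
        return 0.0 if p == 0.0 else p * (rice.logpdf(y) - gauss.logpdf(y))

    # Integrate over the effective support; the density decays quickly beyond b + 40.
    kl, _ = quad(integrand, 0.0, b + 40.0, points=[b])
    return kl

for b in (1.0, 3.0, 10.0, 30.0):
    print(f"b = {b:5.1f}  ->  KL(NCX || Gaussian) ~ {kl_rice_vs_gaussian(b):.6f}")
# The KL divergence decreases toward zero as the noncentrality b grows, consistent
# with the excerpt's conclusion that the NCX distribution is well approximated by
# its moment-matched Gaussian for sufficiently large parameter.
```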