Reversing Jensen’s Inequality for Information-Theoretic Analyses
Abstract
1. Introduction
- Improving the tightness of the lower bound on the expectation of a concave function.
- Relaxing some of the assumptions on the concave function.
- Proposing a more convenient (and perhaps more natural) passage from lower bounds on expectations of concave functions to upper bounds on expectations of convex functions.
- Extending the results to bivariate (and multivariate) functions that are concave (or convex) in each variable.
- Providing examples of usefulness in information theory (other than the mutual information estimation of [9]).
2. The Basic Reverse Inequality
3. Alternative Upper Bounds
1. The Chernoff approach. The first approach is to upper bound the indicator function, 1{x ≥ a}, by the exponential function e^{s(x−a)} (s ≥ 0), exactly as in the Chernoff bound.
2. The Chebychev–Cantelli approach. According to this approach, the indicator function is upper bounded by a quadratic function of x, in the spirit of the Chebychev–Cantelli inequality.
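As a sanity check on item 1, the Chernoff device can be verified numerically: the exponential dominates the indicator pointwise, so its empirical mean dominates the empirical tail frequency as well. A minimal sketch; the threshold, the parameter s, and the Gaussian test law are illustrative choices, not taken from the paper.

```python
import math
import random

random.seed(0)

a = 2.0  # threshold (illustrative choice)
s = 0.5  # Chernoff parameter; any s >= 0 gives a valid bound

# Pointwise domination: 1{x >= a} <= exp(s * (x - a)) for every x when s >= 0.
for x in (a - 3.0, a - 0.1, a, a + 0.1, a + 3.0):
    indicator = 1.0 if x >= a else 0.0
    assert indicator <= math.exp(s * (x - a)) + 1e-12

# Consequence in expectation (the Chernoff bound), checked by Monte Carlo
# with an arbitrary test law X ~ N(0, 1):
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]
tail = sum(x >= a for x in xs) / len(xs)                     # empirical Pr{X >= a}
chernoff = sum(math.exp(s * (x - a)) for x in xs) / len(xs)  # empirical E[e^{s(X-a)}]
print(tail <= chernoff)
```

Because the domination holds pointwise, the bound holds for the empirical averages exactly, not just in expectation.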
4. Examples
4.1. Example 1—Capacity of the Gaussian Channel with Random SNR
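For this example, the direction being reversed is Jensen's upper bound: since C(s) = (1/2)·log2(1 + s) is concave in the SNR, E[C(SNR)] ≤ C(E[SNR]), and a reverse Jensen inequality supplies a matching lower bound. A minimal Monte Carlo sketch of the Jensen direction; the lognormal SNR law is an arbitrary illustrative choice, not one used in the paper.

```python
import math
import random

random.seed(1)

def c(snr):
    """Capacity of the Gaussian channel at a given SNR, in bits per channel use."""
    return 0.5 * math.log2(1.0 + snr)

# Random SNR; a lognormal law is used purely for illustration.
snr = [math.exp(random.gauss(0.0, 1.0)) for _ in range(100_000)]

ergodic = sum(c(s) for s in snr) / len(snr)  # E[C(SNR)]
jensen = c(sum(snr) / len(snr))              # C(E[SNR])
print(ergodic <= jensen)
```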
4.2. Example 2—Moments of the Number of Guesses in Randomized Guessing
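Here the relevant function is x ↦ x^ρ, which is convex on [0, ∞) for ρ ≥ 1, so Jensen only lower-bounds the ρ-th guessing moment by (E[G])^ρ; reverse-type inequalities are what give upper bounds. A quick numerical sketch of the Jensen direction, with a shifted-geometric guessing law as an illustrative stand-in (not the distribution analyzed in the paper):

```python
import random

random.seed(4)
rho = 2.0  # moment order; x ** rho is convex on [0, inf) for rho >= 1

# G: number of guesses; a shifted-geometric-like law is an illustrative
# stand-in for the randomized guessing distributions studied in the paper.
g = [1 + int(random.expovariate(0.05)) for _ in range(100_000)]

moment_rho = sum(x ** rho for x in g) / len(g)  # E[G^rho]
mean = sum(g) / len(g)                          # E[G]
print(mean ** rho <= moment_rho)  # Jensen: (E[G])^rho <= E[G^rho]
```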
4.3. Example 3—Moments of the Error in Parameter Estimation
4.4. Logarithms of Sums of Independent Random Variables
4.4.1. Example 4—Universal Source Coding
4.4.2. Example 5—Ergodic Capacity of the Rayleigh SIMO Channel
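In the Rayleigh-fading SIMO setting, the total channel gain is a sum of i.i.d. exponential squared gains, i.e., a Gamma(n, 1) variable S, for which E[ln S] = ψ(n) = −γ + Σ_{k=1}^{n−1} 1/k in closed form (a standard fact, not specific to this paper). The sketch below checks this by Monte Carlo and exhibits the Jensen gap that a reverse inequality must control; the antenna count is an illustrative choice.

```python
import math
import random

random.seed(2)

n = 4                        # number of receive antennas (illustrative)
GAMMA = 0.5772156649015329   # Euler-Mascheroni constant

# With i.i.d. Exp(1) channel gains (unit-power Rayleigh fading), the total gain
# S = |h_1|^2 + ... + |h_n|^2 is Gamma(n, 1), for which E[ln S] = psi(n),
# where psi(n) = -gamma + sum_{k=1}^{n-1} 1/k for integer n.
psi_n = -GAMMA + sum(1.0 / k for k in range(1, n))

samples = [sum(random.expovariate(1.0) for _ in range(n)) for _ in range(100_000)]
mc = sum(math.log(s) for s in samples) / len(samples)

print(abs(mc - psi_n) < 0.02)  # Monte Carlo agrees with the closed form
print(psi_n < math.log(n))     # Jensen gap: E[ln S] < ln E[S] = ln n
```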
4.4.3. Example 6—Differential Entropy of the Generalized Multivariate Cauchy Distribution
5. Discussion
1. The maximization over a is not necessary. Our first comment is quite trivial but nevertheless worth stating as a reminder. The explicit maximization over the parameter a may be difficult to carry out in most examples, but for certain purposes it is not necessary: one can select an arbitrary value of a and still obtain a legitimate lower bound. In some cases, moreover, it is not hard to guess a good choice of this value, as we saw in some of the examples of Section 4.
2. Softening the assumption f ≥ 0. One may partially relax the assumption f(x) ≥ 0 for all x and replace it with the softer assumption that there exists a constant c ≥ 0 such that f(x) ≥ −c for all x (more precisely, for all x within the support of the PDF of X). By applying Lemma 1 and Theorem 1 to f + c and compensating for the additive term c, we can use exactly the same technique and obtain a correspondingly modified lower bound.
3. Convex functions. So far, we have dealt with RJIs for concave functions. RJIs associated with expectations of convex functions (upper bounds) can be obtained in exactly the same manner, except that the signs are flipped. Specifically, by replacing f with −f, we obtain analogous statements for convex functions: if f is a convex function with f(x) ≤ 0 for every x, the corresponding upper bound on E{f(X)} follows.
4. Functions that are neither convex nor concave. Using the same line of thought as in item 2 above, one can obtain upper and lower bounds on expectations of general functions that are not necessarily convex or concave. Indeed, let f be a real, continuous function that satisfies a suitable monotonicity condition and is bounded from below by a constant. The associated auxiliary function is then monotonically non-increasing and positive, and the same bounding technique applies to it.
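The sign flip in item 3 can be written out explicitly. Writing L_f(a) generically for the lower bound that Theorem 1 assigns to a non-negative concave f (its exact form is not reproduced in this excerpt), the passage to convex functions is:

```latex
g \ \text{convex with } g \le 0
\;\Longrightarrow\;
f \triangleq -g \ \text{concave with } f \ge 0
\;\Longrightarrow\;
\mathbb{E}\{g(X)\} = -\mathbb{E}\{f(X)\} \le -L_{-g}(a).
```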
6. Extension to Bivariate (and Multivariate) Concave Functions
- Example 7—minimum between two sums of independent random variables. Let U and V be two sums of non-negative, independent random variables. Obviously, U ≥ 0 and V ≥ 0. Further, let f(u, v) = min{u, v}, which is concave in u for fixed v and vice versa. Then, the bivariate version of the bound applies to E{min{U, V}}.
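For this example, the direction being reversed is again Jensen's: since min is concave, E[min{U, V}] ≤ min{E[U], E[V]}, and the bivariate reverse inequality bounds the same quantity from below. A minimal sketch; the summand laws and the numbers of summands are illustrative choices.

```python
import random

random.seed(3)

# U and V: sums of non-negative, independent random variables; the particular
# laws and the numbers of summands below are purely illustrative.
def draw_pair():
    u = sum(random.expovariate(1.0) for _ in range(5))
    v = sum(random.uniform(0.0, 2.0) for _ in range(3))
    return u, v

pairs = [draw_pair() for _ in range(100_000)]
e_min = sum(min(u, v) for u, v in pairs) / len(pairs)  # E[min(U, V)]
e_u = sum(u for u, _ in pairs) / len(pairs)            # E[U]
e_v = sum(v for _, v in pairs) / len(pairs)            # E[V]

# min(u, v) <= u and min(u, v) <= v pointwise, so the Jensen direction
# E[min(U, V)] <= min(E[U], E[V]) holds for the sample averages exactly.
print(e_min <= min(e_u, e_v))
```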
- Example 8—channel capacity revisited. Here, we combine Example 1 (capacity of the Gaussian channel with random SNR) and Example 5 (ergodic capacity of the Rayleigh SIMO channel), and consider the resulting capacity expression, which involves both sources of randomness.
References
1. Cover, T.M.; Thomas, J.A. Elements of Information Theory, 2nd ed.; John Wiley & Sons: Hoboken, NJ, USA, 2006.
2. Jebara, T.; Pentland, A. On reversing Jensen's inequality. In Proceedings of the 13th International Conference on Neural Information Processing Systems (NIPS 2000), Denver, CO, USA, 2000; pp. 213–219.
3. Budimir, I.; Dragomir, S.S.; Pečarić, J. Further reverse results for Jensen's discrete inequality and applications in information theory. J. Inequal. Pure Appl. Math. 2001, 2, 5.
4. Simić, S. On a new converse of Jensen's inequality. Publ. Inst. Math. (Beograd) 2009, 85, 107–110.
5. Dragomir, S.S. Some reverses of the Jensen inequality for functions of selfadjoint operators in Hilbert spaces. J. Inequal. Appl. 2010, 2010, 496821.
6. Dragomir, S.S. Some reverses of the Jensen inequality with applications. Bull. Aust. Math. Soc. 2013, 87, 177–194.
7. Khan, S.; Khan, M.A.; Chu, Y.-M. New converses of Jensen inequality via Green functions with applications. Rev. Real Acad. Cienc. Exactas Fis. Nat. Ser. A Mat. 2020, 114, 114.
8. Khan, S.; Khan, M.A.; Chu, Y.-M. Converses of Jensen inequality derived from the Green functions with applications in information theory. Math. Methods Appl. Sci. 2020, 43, 2577–2587.
9. Wunder, G.; Groß, B.; Fritschek, R.; Schaefer, R.F. A reverse Jensen inequality result with application to mutual information estimation. In Proceedings of the 2021 IEEE Information Theory Workshop (ITW 2021), Kanazawa, Japan, 17–21 October 2021. Available online: https://arxiv.org/pdf/2111.06676.pdf (accessed on 12 November 2021).
10. Ali, M.A.; Budak, H.; Zhang, Z. A new extension of quantum Simpson's and quantum Newton's inequalities for quantum differentiable convex functions. Math. Methods Appl. Sci. 2021, 1–19.
11. Budak, H.; Ali, M.A.; Tarhanaci, M. Some new quantum Hermite–Hadamard like inequalities for co-ordinated convex functions. J. Optim. Theory Appl. 2020, 186, 899–910.
12. Merhav, N.; Cohen, A. Universal randomized guessing with application to asynchronous decentralized brute-force attacks. IEEE Trans. Inform. Theory 2020, 66, 114–129.
13. Krichevsky, R.E.; Trofimov, V.K. The performance of universal encoding. IEEE Trans. Inform. Theory 1981, 27, 199–207.
14. Dong, A.; Zhang, H.; Wu, D.; Yuan, D. Logarithmic expectation of the sum of exponential random variables for wireless communication performance evaluation. In Proceedings of the 2015 IEEE 82nd Vehicular Technology Conference (VTC2015-Fall), Boston, MA, USA, 6–9 September 2015.
15. Tse, D.; Viswanath, P. Fundamentals of Wireless Communication; Cambridge University Press: Cambridge, UK, 2005.
© 2022 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Merhav, N. Reversing Jensen’s Inequality for Information-Theoretic Analyses. Information 2022, 13, 39. https://doi.org/10.3390/info13010039