
Information Measures with Applications

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (15 November 2019) | Viewed by 33199

Special Issue Editor


Prof. Dr. Amos Lapidoth
Guest Editor
Signal and Information Processing Laboratory, ETH Zurich, 8092 Zurich, Switzerland
Interests: multi-terminal information theory; the role of feedback in communication; digital communication; fading channels; optical communication

Special Issue Information

Dear Colleagues,

Classical information measures such as entropy, relative entropy (Kullback–Leibler divergence), and mutual information have found numerous applications in storage, compression, transmission, cryptography, statistics, large deviations, gambling, and physics. However, over the years, other information measures were introduced and studied, arguably starting with the pioneering work of Alfréd Rényi (1921–1970). These include Rényi entropy, Rényi divergence, f-divergence, Arimoto's mutual information, Sibson's information radius, and others. These measures typically generalize the classical ones and, in some applications, yield finer results. In recent years they have also found new applications in guessing, hypothesis testing, error exponents, task encoding, large deviations, and more.
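To make the limiting behavior concrete, the sketch below (a minimal illustration of our own, not part of the Special Issue materials) computes Rényi entropy and Rényi divergence for finite distributions and shows numerically that they approach Shannon entropy and Kullback–Leibler divergence as the order α tends to one.

    import numpy as np

    def renyi_entropy(p, alpha):
        """Rényi entropy (in nats) of a probability vector p; alpha = 1 gives Shannon entropy."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        if np.isclose(alpha, 1.0):
            return -np.sum(p * np.log(p))
        return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

    def renyi_divergence(p, q, alpha):
        """Rényi divergence D_alpha(p || q) in nats; assumes supp(p) is contained in supp(q)."""
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        mask = p > 0
        if np.isclose(alpha, 1.0):
            return np.sum(p[mask] * np.log(p[mask] / q[mask]))
        return np.log(np.sum(p[mask] ** alpha * q[mask] ** (1.0 - alpha))) / (alpha - 1.0)

    p, q = [0.5, 0.3, 0.2], [0.4, 0.4, 0.2]
    for a in (0.5, 0.999999, 2.0):
        print(a, renyi_entropy(p, a), renyi_divergence(p, q, a))

As α approaches one, the printed values converge to the Shannon entropy of p and to the Kullback–Leibler divergence D(p‖q), respectively.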

For this Special Issue we solicit original papers presenting new applications of known information measures and new measures with interesting applications.

Prof. Dr. Amos Lapidoth
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, you can proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • f-divergence
  • Rényi divergence
  • Rényi entropy
  • Arimoto's mutual information
  • information measures
  • information radius

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found on the MDPI website.

Published Papers (7 papers)


Research

17 pages, 324 KiB  
Article
Conditional Rényi Entropy and the Relationships between Rényi Capacities
by Gautam Aishwarya and Mokshay Madiman
Entropy 2020, 22(5), 526; https://doi.org/10.3390/e22050526 - 6 May 2020
Cited by 4 | Viewed by 3011
Abstract
The analogues of Arimoto’s definition of conditional Rényi entropy and Rényi mutual information are explored for abstract alphabets. These quantities, although dependent on the reference measure, have some useful properties similar to those known in the discrete setting. In addition to laying out some such basic properties and the relations to Rényi divergences, the relationships between the families of mutual informations defined by Sibson, Augustin–Csiszár, and Lapidoth–Pfister, as well as the corresponding capacities, are explored.
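For readers unfamiliar with the discrete starting point, Arimoto's conditional Rényi entropy of order α in the finite-alphabet case (our notation; the paper extends the definition to abstract alphabets with a reference measure) is

    \[
    H_\alpha^{\mathrm{A}}(X \mid Y) \;=\; \frac{\alpha}{1-\alpha}\,
    \log \sum_{y} \Big( \sum_{x} P_{XY}(x,y)^{\alpha} \Big)^{1/\alpha},
    \qquad \alpha \in (0,1) \cup (1,\infty),
    \]

and the associated Arimoto mutual information is I_α(X;Y) = H_α(X) − H_α^A(X|Y); both reduce to their Shannon counterparts as α → 1.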

59 pages, 750 KiB  
Article
Generalizations of Fano’s Inequality for Conditional Information Measures via Majorization Theory
by Yuta Sakai
Entropy 2020, 22(3), 288; https://doi.org/10.3390/e22030288 - 1 Mar 2020
Cited by 4 | Viewed by 3963
Abstract
Fano’s inequality is one of the most elementary, ubiquitous, and important tools in information theory. Using majorization theory, Fano’s inequality is generalized to a broad class of information measures, which contains those of Shannon and Rényi. When specialized to these measures, it recovers and generalizes the classical inequalities. Key to the derivation is the construction of an appropriate conditional distribution inducing a desired marginal distribution on a countably infinite alphabet. The construction is based on the infinite-dimensional version of Birkhoff’s theorem proven by Révész [Acta Math. Hungar. 1962, 3, 188–198], and the constraint of maintaining a desired marginal distribution is similar to coupling in probability theory. Using our Fano-type inequalities for Shannon’s and Rényi’s information measures, we also investigate the asymptotic behavior of the sequence of Shannon’s and Rényi’s equivocations when the error probabilities vanish. This asymptotic behavior provides a novel characterization of the asymptotic equipartition property (AEP) via Fano’s inequality.
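For reference, the classical inequality being generalized states that if X takes values in a finite alphabet \mathcal{X} and is estimated from Y with error probability P_e, then

    \[
    H(X \mid Y) \;\le\; h_b(P_e) + P_e \log\big(|\mathcal{X}| - 1\big),
    \]

where h_b denotes the binary entropy function. The paper's majorization-based bounds extend this statement to Rényi-type conditional measures and to countably infinite alphabets.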

23 pages, 518 KiB  
Article
Rényi and Tsallis Entropies of the Aharonov–Bohm Ring in Uniform Magnetic Fields
by Oleg Olendski
Entropy 2019, 21(11), 1060; https://doi.org/10.3390/e21111060 - 29 Oct 2019
Cited by 20 | Viewed by 3687
Abstract
One-parameter functionals of the Rényi R_{ρ,γ}(α) and Tsallis T_{ρ,γ}(α) types are calculated both in the position (subscript ρ) and momentum (γ) spaces for the azimuthally symmetric 2D nanoring that is placed into the combination of the transverse uniform magnetic field B and the Aharonov–Bohm (AB) flux ϕ_AB and whose potential profile is modeled by the superposition of the quadratic and inverse quadratic dependencies on the radius r. The position (momentum) Rényi entropy depends on the field B as a negative (positive) logarithm of ω_eff ≡ (ω_0² + ω_c²/4)^{1/2}, where ω_0 determines the quadratic steepness of the confining potential and ω_c is the cyclotron frequency. This makes the sum R_{ρ,nm}(α) + R_{γ,nm}(α/(2α−1)) a field-independent quantity that increases with the principal n and azimuthal m quantum numbers and satisfies the corresponding uncertainty relation. In the limit α → 1, both entropies in either space tend to their Shannon counterparts along, however, different paths. An analytic expression for the lower boundary of the semi-infinite range of the dimensionless coefficient α where the momentum entropies exist reveals that it depends on the ring geometry, the AB intensity, and the quantum number m. It is proved that there is only one orbital for which both Rényi and Tsallis uncertainty relations turn into the identity at α = 1/2, and it is not necessarily the lowest-energy level. At any coefficient α, the dependence of the position Rényi entropy on the AB flux mimics the energy variation with ϕ_AB, which, under appropriate scaling, can be used for the unique determination of the associated persistent current. Similarities and differences between the two entropies and their uncertainty relations are discussed as well.
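In their standard form (our notation, following the paper's subscripts ρ and γ for the position and momentum densities), the one-parameter functionals in question are

    \[
    R_\rho(\alpha) = \frac{1}{1-\alpha}\,\ln\!\int \rho(\mathbf{r})^{\alpha}\, d\mathbf{r},
    \qquad
    T_\rho(\alpha) = \frac{1}{\alpha-1}\Big(1 - \int \rho(\mathbf{r})^{\alpha}\, d\mathbf{r}\Big),
    \]

with the momentum-space quantities R_γ(α) and T_γ(α) defined analogously from the momentum density; both reduce to the Shannon (differential) entropy −∫ ρ ln ρ as α → 1.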

80 pages, 1639 KiB  
Article
On Data-Processing and Majorization Inequalities for f-Divergences with Applications
by Igal Sason
Entropy 2019, 21(10), 1022; https://doi.org/10.3390/e21101022 - 21 Oct 2019
Cited by 18 | Viewed by 4688
Abstract
This paper is focused on the derivation of data-processing and majorization inequalities for f-divergences, and their applications in information theory and statistics. For the accessibility of the material, the main results are first introduced without proofs, followed by exemplifications of the theorems with further related analytical results, interpretations, and information-theoretic applications. One application refers to the performance analysis of list decoding with either fixed or variable list sizes; some earlier bounds on the list decoding error probability are reproduced in a unified way, and new bounds are obtained and exemplified numerically. Another application is related to a study of the quality of approximating a probability mass function, induced by the leaves of a Tunstall tree, by an equiprobable distribution. The compression rates of finite-length Tunstall codes are further analyzed for asserting their closeness to the Shannon entropy of a memoryless and stationary discrete source. Almost all the analysis is relegated to the appendices, which form the major part of this manuscript.
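A small numerical illustration (ours, not taken from the paper) of the central data-processing fact: for any convex f with f(1) = 0, pushing both distributions through the same channel W can only decrease the f-divergence.

    import numpy as np

    def f_divergence(p, q, f):
        """D_f(p || q) = sum_x q(x) f(p(x)/q(x)), for strictly positive q."""
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        return np.sum(q * f(p / q))

    kl   = lambda t: t * np.log(t)           # Kullback–Leibler divergence
    tv   = lambda t: 0.5 * np.abs(t - 1.0)   # total variation distance
    chi2 = lambda t: (t - 1.0) ** 2          # chi-squared divergence

    p = np.array([0.5, 0.3, 0.2])
    q = np.array([0.2, 0.5, 0.3])
    W = np.array([[0.9, 0.1],                # row-stochastic channel: 3 inputs, 2 outputs
                  [0.2, 0.8],
                  [0.5, 0.5]])

    for name, f in [("KL", kl), ("TV", tv), ("chi^2", chi2)]:
        print(name, f_divergence(p, q, f), ">=", f_divergence(p @ W, q @ W, f))

Each printed pair satisfies the data-processing inequality D_f(PW ‖ QW) ≤ D_f(P ‖ Q).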

14 pages, 324 KiB  
Article
Entropic Matroids and Their Representation
by Emmanuel Abbe and Sophie Spirkl
Entropy 2019, 21(10), 948; https://doi.org/10.3390/e21100948 - 27 Sep 2019
Cited by 1 | Viewed by 3070
Abstract
This paper investigates entropic matroids, that is, matroids whose rank function is given as the Shannon entropy of random variables. In particular, we consider p-entropic matroids, for which the random variables each have support of cardinality p. We draw connections between such entropic matroids and secret-sharing matroids and show that entropic matroids are linear matroids when p = 2, 3 but not when p = 9. Our results leave open the possibility for p-entropic matroids to be linear whenever p is prime, with particular cases proved here. Applications of entropic matroids to coding theory and cryptography are also discussed.
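Roughly, in our paraphrase of the setup: given random variables X_1, …, X_n, each with support of cardinality p, the candidate rank function assigns to a subset S of the ground set the joint Shannon entropy measured in base p,

    \[
    r(S) \;=\; H_p\big( (X_i)_{i \in S} \big), \qquad S \subseteq \{1, \dots, n\}.
    \]

This function is always monotone and submodular, and its increments are at most one; it is a matroid rank function whenever it is integer-valued, which is the situation studied in the paper.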

40 pages, 955 KiB  
Article
Two Measures of Dependence
by Amos Lapidoth and Christoph Pfister
Entropy 2019, 21(8), 778; https://doi.org/10.3390/e21080778 - 8 Aug 2019
Cited by 13 | Viewed by 3677
Abstract
Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both dependence measures reduce to Shannon’s mutual information when their order α is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding.
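As a pointer for readers, the first of the two families is commonly written (in our notation; the paper's exact formulation may differ in details) as a minimization of a Rényi divergence over product approximations,

    \[
    J_\alpha(X;Y) \;=\; \min_{Q_Y}\, D_\alpha\big( P_{XY} \,\big\|\, P_X \times Q_Y \big),
    \]

where the minimum is over probability measures Q_Y on the alphabet of Y; at α = 1 the minimizer is Q_Y = P_Y and the expression collapses to Shannon's mutual information I(X;Y).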

16 pages, 308 KiB  
Article
Empirical Estimation of Information Measures: A Literature Guide
by Sergio Verdú
Entropy 2019, 21(8), 720; https://doi.org/10.3390/e21080720 - 24 Jul 2019
Cited by 44 | Viewed by 10396
Abstract
We give a brief survey of the literature on the empirical estimation of entropy, differential entropy, relative entropy, mutual information and related information measures. While those quantities are of central importance in information theory, universal algorithms for their estimation are increasingly important in data science, machine learning, biology, neuroscience, economics, language, and other experimental sciences.
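The simplest estimator covered by such surveys is the plug-in (maximum-likelihood) estimator, optionally with the Miller–Madow bias correction; a minimal sketch (our own illustration, not code from the paper):

    import numpy as np
    from collections import Counter

    def plugin_entropy(samples):
        """Maximum-likelihood (plug-in) entropy estimate in nats."""
        counts = np.array(list(Counter(samples).values()), dtype=float)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p))

    def miller_madow_entropy(samples):
        """Plug-in estimate plus the (K - 1) / (2n) Miller–Madow bias correction."""
        n, k = len(samples), len(set(samples))
        return plugin_entropy(samples) + (k - 1) / (2.0 * n)

    rng = np.random.default_rng(0)
    true_p = np.array([0.4, 0.3, 0.2, 0.1])
    x = rng.choice(4, size=200, p=true_p)
    print("true        ", -np.sum(true_p * np.log(true_p)))
    print("plug-in     ", plugin_entropy(x))
    print("Miller-Madow", miller_madow_entropy(x))

The plug-in estimate tends to underestimate the true entropy on small samples, which the correction partially offsets.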
