
Mathematics in Information Theory and Modern Applications

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (30 September 2024) | Viewed by 6803

Special Issue Editors

Dr. Qian Yu
Department of Electrical and Computer Engineering, University of California, Santa Barbara, Goleta, CA 93117, USA
Interests: information theory; distributed computing; machine learning theory; probability theory; reversible logic gates
Dr. Yanjun Han
Courant Institute of Mathematical Sciences and Center for Data Science, New York University, New York, NY 10003, USA
Interests: high-dimensional and nonparametric statistics; information theory; online learning and bandits; statistical machine learning; probability theory

Special Issue Information

Dear Colleagues,

Founded by Claude E. Shannon in 1948, information theory began as the mathematical theory of communication. After several decades of flourishing development, it has come to contribute to the mathematical foundations of many other disciplines (statistics, machine learning, coding theory, probability, combinatorics, computational biology, and genomics, to name a few), and its own development has in turn been fostered by progress in mathematics and related fields. Modern mathematics is not only an important component of modern information theory but also a key driving force behind its continued development.

Modern information theory is mainly concerned with quantifying the information in probability distributions and their interactions with the large-scale nonlinear systems built for modern applications. It typically involves novel mathematical applications of information measures, high-dimensional geometry, algebra, and combinatorics. Further progress on this front calls for new mathematical techniques that refine our understanding of information, and for novel uses of information-theoretic tools in real-world problems.
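
To make the notion of an "information measure" mentioned above concrete, the following minimal Python sketch (not part of the Special Issue; all names are illustrative) computes two canonical examples for discrete distributions: Shannon entropy and the Kullback-Leibler divergence.

    import numpy as np

    def shannon_entropy(p):
        # Shannon entropy H(p) = -sum_i p_i log p_i, in nats.
        p = np.asarray(p, dtype=float)
        p = p[p > 0]  # convention: 0 log 0 = 0
        return float(-np.sum(p * np.log(p)))

    def kl_divergence(p, q):
        # Kullback-Leibler divergence D(p||q) = sum_i p_i log(p_i / q_i),
        # assuming q_i > 0 wherever p_i > 0.
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    uniform = np.ones(4) / 4
    skewed = np.array([0.7, 0.1, 0.1, 0.1])
    print(shannon_entropy(uniform))        # log 4, about 1.386: maximal on 4 symbols
    print(shannon_entropy(skewed))         # about 0.940: less uncertainty
    print(kl_divergence(skewed, uniform))  # about 0.446 nats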

This Special Issue aims to be a forum for recent mathematical advances in information theory and for work showing how information-theoretic tools lead to new theoretical understanding of modern applications. In particular, the understanding and analysis of real-world problems in data science with the help of mathematical tools rooted in information theory fall within its scope.

Dr. Qian Yu
Dr. Yanjun Han
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • coding theory
  • statistics
  • machine learning
  • probability and entropy
  • Shannon theory and information inequalities
  • learning theory
  • distributed storage and computation
  • privacy and security
  • applications

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (5 papers)


Research

16 pages, 971 KiB  
Article
Derangetropy in Probability Distributions and Information Dynamics
by Masoud Ataei and Xiaogang Wang
Entropy 2024, 26(11), 996; https://doi.org/10.3390/e26110996 - 18 Nov 2024
Viewed by 231
Abstract
We introduce derangetropy, a novel functional measure designed to characterize the dynamics of information within probability distributions. Unlike scalar measures such as Shannon entropy, derangetropy offers a functional representation that captures the dispersion of information across the entire support of a distribution. By incorporating self-referential and periodic properties, it provides insights into information dynamics governed by differential equations and equilibrium states. Through combinatorial justifications and empirical analysis, we demonstrate the utility of derangetropy in depicting distribution behavior and evolution, providing a new tool for analyzing complex and hierarchical systems in information theory. Full article
(This article belongs to the Special Issue Mathematics in Information Theory and Modern Applications)
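
Derangetropy itself is defined in the paper; purely as a generic illustration of the contrast the abstract draws between a scalar summary and a functional representation over the whole support, the hedged Python sketch below computes the pointwise entropy density -f(x) log f(x) of a standard normal, a curve whose integral is the (scalar) differential entropy.

    import numpy as np
    from scipy.stats import norm

    x = np.linspace(-5.0, 5.0, 2001)
    f = norm.pdf(x)  # standard normal density
    # Pointwise entropy density: a function over the support, not one number.
    density = np.where(f > 0, -f * np.log(f), 0.0)

    dx = x[1] - x[0]
    scalar_entropy = float(np.sum(density) * dx)  # about 0.5 * log(2*pi*e) = 1.419
    print(scalar_entropy)

The curve `density` retains where on the support the information sits, which the single scalar discards; that is the kind of distinction between scalar and functional views that the abstract emphasizes.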

22 pages, 5071 KiB  
Article
A Fisher Information Theory of Aesthetic Preference for Complexity
by Sébastien Berquet, Hassan Aleem and Norberto M. Grzywacz
Entropy 2024, 26(11), 901; https://doi.org/10.3390/e26110901 - 24 Oct 2024
Viewed by 461
Abstract
When evaluating sensory stimuli, people tend to prefer those with neither too little nor too much complexity. A recent theoretical proposal for this phenomenon is that preference has a direct link to the Observed Fisher Information that a stimulus carries about the environment. To make this theory complete, one must specify the model that the brain has about complexities in the world. Here, we develop this model by first obtaining the distributions of three indices of complexity, measured as normalized Shannon entropy, in real-world images from seven environments. We then search for a parametric model that accounts for these distributions. Finally, we measure the Observed Fisher Information that each image has about the parameters of this model. The results show that, with few exceptions, the distributions of image complexities are unimodal, have negative skewness, and are leptokurtic. Moreover, the sign and magnitude of the skewness vary systematically with the location of the mode. After investigating tens of models for these distributions, we show that the Logit-Losev function, a generalization of the hyperbolic-secant distribution, fits them well. The Observed Fisher Information for this model shows the inverted-U-shaped behavior of complexity preference. Finally, we discuss ways to test our Fisher Information theory. Full article
(This article belongs to the Special Issue Mathematics in Information Theory and Modern Applications)
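
As a hedged sketch of one ingredient named in the abstract (complexity measured as the normalized Shannon entropy of an image), the following Python snippet normalizes the entropy of a grayscale intensity histogram by its maximum, log(number of bins), so the index lies in [0, 1]. The paper's three complexity indices and image pipeline are its own; this only shows the normalization idea.

    import numpy as np

    def normalized_entropy(image, bins=256):
        # Histogram pixel intensities in [0, 1], then divide the Shannon
        # entropy of the bin frequencies by its maximum value, log(bins).
        counts, _ = np.histogram(image, bins=bins, range=(0.0, 1.0))
        p = counts / counts.sum()
        p = p[p > 0]
        return float(-np.sum(p * np.log(p)) / np.log(bins))

    rng = np.random.default_rng(0)
    flat = np.full((64, 64), 0.5)    # constant image: complexity 0
    noisy = rng.random((64, 64))     # i.i.d. uniform pixels: complexity near 1
    print(normalized_entropy(flat), normalized_entropy(noisy))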

16 pages, 477 KiB  
Article
Quantum Distance Measures Based upon Classical Symmetric Csiszár Divergences
by Diego G. Bussandri and Tristán M. Osán
Entropy 2023, 25(6), 912; https://doi.org/10.3390/e25060912 - 8 Jun 2023
Cited by 1 | Viewed by 1236
Abstract
We introduce a new family of quantum distances based on symmetric Csiszár divergences, a class of distinguishability measures that encompasses the main dissimilarity measures between probability distributions. We prove that these quantum distances can be obtained by optimizing over a set of quantum measurements followed by a purification process. Specifically, we first address the case of distinguishing pure quantum states by solving an optimization of the symmetric Csiszár divergences over von Neumann measurements. Second, by making use of the concept of purification of quantum states, we arrive at a new set of distinguishability measures, which we call extended quantum Csiszár distances. In addition, since a purification process can be physically implemented, the proposed distinguishability measures for quantum states can be endowed with an operational interpretation. Finally, by taking advantage of a well-known result for classical Csiszár divergences, we show how to build quantum Csiszár true distances. Thus, our main contribution is the development and analysis of a method for obtaining quantum distances satisfying the triangle inequality in the space of quantum states for Hilbert spaces of arbitrary dimension. Full article
(This article belongs to the Special Issue Mathematics in Information Theory and Modern Applications)
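
The classical building block of this construction is the Csiszár divergence D_f(p||q) = sum_i q_i f(p_i/q_i); choosing f(t) = (sqrt(t) - 1)^2 gives the symmetric squared Hellinger distance. The hedged Python sketch below evaluates it on the outcome distributions of two pure qubit states under one fixed von Neumann measurement; the paper's quantum distances additionally optimize over measurements and apply a purification step.

    import numpy as np

    def csiszar(p, q, f):
        # Csiszar divergence D_f(p||q) = sum_i q_i * f(p_i / q_i); we keep
        # full-support distributions here to sidestep 0/0 conventions.
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        return float(np.sum(q * f(p / q)))

    hellinger_sq = lambda t: (np.sqrt(t) - 1.0) ** 2  # a symmetric choice of f

    # Two pure qubit states, measured in the computational basis {|0>, |1>}.
    theta = np.pi / 8
    psi = np.array([np.cos(theta), np.sin(theta)])
    phi = np.array([1.0, 1.0]) / np.sqrt(2.0)  # the |+> state
    p = np.abs(psi) ** 2  # Born-rule outcome distribution for psi
    q = np.abs(phi) ** 2  # Born-rule outcome distribution for phi

    print(csiszar(p, q, hellinger_sq))  # about 0.152
    print(csiszar(q, p, hellinger_sq))  # same value: this divergence is symmetric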

16 pages, 443 KiB  
Article
Asymptotic Distribution of Certain Types of Entropy under the Multinomial Law
by Andrea A. Rey, Alejandro C. Frery, Magdalena Lucini, Juliana Gambini, Eduarda T. C. Chagas and Heitor S. Ramos
Entropy 2023, 25(5), 734; https://doi.org/10.3390/e25050734 - 28 Apr 2023
Cited by 5 | Viewed by 1677
Abstract
We obtain expressions for the asymptotic distributions of the Rényi and Tsallis entropies of order q, and of the Fisher information, when computed on the maximum likelihood estimator of probabilities from multinomial random samples. We verify that these asymptotic models, two of which (Tsallis and Fisher) are normal, describe a variety of simulated data well. In addition, we obtain test statistics for comparing (possibly different types of) entropies from two samples without requiring the same number of categories. Finally, we apply these tests to social survey data and verify that the results are consistent with, but more general than, those obtained with a χ² test. Full article
(This article belongs to the Special Issue Mathematics in Information Theory and Modern Applications)
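
To make the objects of study concrete, here is a hedged Python simulation sketch of the plug-in estimators: the Rényi entropy H_q = log(sum_i p_i^q) / (1 - q) and the Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1), evaluated at the multinomial MLE p-hat = counts / n. The asymptotic laws themselves are derived in the paper; this only summarizes the simulated spread.

    import numpy as np

    def renyi(p, q):
        return float(np.log(np.sum(p ** q)) / (1.0 - q))

    def tsallis(p, q):
        return float((1.0 - np.sum(p ** q)) / (q - 1.0))

    rng = np.random.default_rng(1)
    p_true = np.array([0.5, 0.3, 0.2])
    n, q = 1000, 2.0

    counts = rng.multinomial(n, p_true, size=5000)  # 5000 multinomial samples
    p_hat = counts / n                              # MLE of the probabilities
    estimates = np.array([tsallis(row, q) for row in p_hat])

    print(renyi(p_true, q), tsallis(p_true, q))  # true values: about 0.968 and 0.62
    print(estimates.mean(), estimates.std())     # approximately normal spread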

9 pages, 269 KiB  
Article
About the Entropy of a Natural Number and a Type of the Entropy of an Ideal
by Nicuşor Minculete and Diana Savin
Entropy 2023, 25(4), 554; https://doi.org/10.3390/e25040554 - 24 Mar 2023
Cited by 1 | Viewed by 1965
Abstract
In this article, we establish some properties of certain types of entropies of a natural number, studying a way of measuring the "disorder" of the divisors of a natural number. We compare two entropies, H and H̄, defined for a natural number. A useful property of the Shannon entropy is its additivity, H_S(p ⊗ q) = H_S(p) + H_S(q), where p ⊗ q denotes the tensor product, so we focus on its study in the case of numbers and ideals. Only one of the two entropy functions discussed in this paper satisfies additivity, whereas the other does not. In addition, regarding the entropy H of a natural number, we generalize this notion to ideals and establish some of its properties. Full article
(This article belongs to the Special Issue Mathematics in Information Theory and Modern Applications)
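
The additivity property the abstract quotes, H_S(p ⊗ q) = H_S(p) + H_S(q), is a classical fact about Shannon entropy that is easy to check numerically; the hedged Python sketch below does so. The paper's number-theoretic entropies H and H̄ are defined there, not here.

    import numpy as np

    def shannon(p):
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return float(-np.sum(p * np.log(p)))

    p = np.array([0.5, 0.25, 0.25])
    q = np.array([0.9, 0.1])
    joint = np.outer(p, q).ravel()  # the tensor-product distribution p ⊗ q

    print(shannon(joint))           # about 1.3648
    print(shannon(p) + shannon(q))  # identical: Shannon entropy is additive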