Useful Dual Functional of Entropic Information Measures
Abstract
1. Introduction
The quantum entropic functional $S$ has the following properties:
- $S$ is a true ignorance function, in the sense of Brillouin. For a normalized, discrete probability distribution $\{p_i\}$, for instance, Shannon's measure $S = -\sum_i p_i \ln p_i$ represents the missing information that one would need to possess so as to be in a "complete information" situation (CIS). In a CIS, just one $p_i = 1$, while the remaining ones vanish [4,5] (a numerical sketch follows this list).
- There is a unique global minimum for $S$ subject to appropriate MaxEnt constraints.
- $S$ obeys an H-theorem.
- Ground-state wave functions that maximize $S$ satisfy the virial theorem and the hypervirial ones [17].
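The CIS property can be made concrete with a minimal numerical sketch (our illustration, using only Shannon's standard definition above): ignorance vanishes for a CIS distribution and is maximal for the uniform one.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon's ignorance measure S = -sum_i p_i ln p_i (in nats).

    Terms with p_i = 0 contribute nothing, by the usual 0 ln 0 = 0 convention.
    """
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return float(-np.sum(nz * np.log(nz)))

# Complete-information situation (CIS): one p_i = 1, the rest vanish -> S = 0.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))  # 0.0

# Maximal ignorance: the uniform distribution over N outcomes gives S = ln N.
N = 4
print(shannon_entropy(np.full(N, 1.0 / N)))   # ln 4 = 1.3863...
```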
2. A Recently Developed Analytic, Compact Expression for Coherent States
3. Some Different Monoparametric Ignorance Measures
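For concreteness, here is a minimal sketch, assuming the standard definitions of two such monoparametric families: Tsallis' $S_q$ [20] and Kaniadakis' $S_\kappa$ [22,23]. Both recover Shannon's $S$ in the limits $q \to 1$ and $\kappa \to 0$, respectively.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis' monoparametric measure S_q = (1 - sum_i p_i**q) / (q - 1), q != 1."""
    p = np.asarray(p, dtype=float)
    return float((1.0 - np.sum(p**q)) / (q - 1.0))

def kaniadakis_entropy(p, kappa):
    """Kaniadakis' measure S_kappa = -sum_i p_i ln_kappa(p_i), with
    ln_kappa(x) = (x**kappa - x**(-kappa)) / (2 kappa), 0 < |kappa| < 1."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    ln_kappa = (nz**kappa - nz**(-kappa)) / (2.0 * kappa)
    return float(-np.sum(nz * ln_kappa))

p = [0.5, 0.25, 0.125, 0.125]
print(tsallis_entropy(p, q=1.0001))       # ~1.2130: approaches Shannon as q -> 1
print(kaniadakis_entropy(p, kappa=0.01))  # ~1.2130: approaches Shannon as kappa -> 0
```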
4. The Main Mathematical Tool of This Paper
4.1. Important Comment on the Meaning of Equation (18)
- in no way makes an EF "better" or "worse" than another EF,
- but it serves the purpose of classifying EFs, and
- classification is the starting step of any scientific discipline [25].
4.2. Ignorance-Amount (IA) for Generalized Entropies
4.3. Generalizing the $\alpha$-Independence to Arbitrary Entropic Measures
5. Results: Four Numerical Quantities Associated with Each of Our Monoparametric Ignorance Measures
6. Sharma–Mittal Biparametric Ignorance Measure
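As an illustrative sketch (ours, assuming the standard Sharma–Mittal definition of [26,27]), the biparametric measure interpolates between the monoparametric families above: $r \to 1$ gives Rényi's entropy, $r = q$ gives Tsallis', and $q, r \to 1$ gives Shannon's $S$.

```python
import numpy as np

def sharma_mittal_entropy(p, q, r):
    """Sharma-Mittal biparametric measure (standard form, cf. [26,27]):
    S_{q,r} = ((sum_i p_i**q)**((1 - r) / (1 - q)) - 1) / (1 - r), q, r != 1.
    """
    p = np.asarray(p, dtype=float)
    x = np.sum(p**q)
    return float((x**((1.0 - r) / (1.0 - q)) - 1.0) / (1.0 - r))

p = [0.5, 0.25, 0.25]
print(sharma_mittal_entropy(p, q=2.0, r=2.0))        # 0.625 = Tsallis' S_2
print(sharma_mittal_entropy(p, q=1.0001, r=1.0002))  # ~1.0397 ~ Shannon's S
```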
7. Value of Our Dual Functional When the $\alpha$-Argument Is Not a Coherent State
8. Application to a Statistical Complexity (SC) Measure
8.1. Shiner–Davison–Landsberg Complexity Measure for Distinct IMs
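A minimal sketch of the SDL measure of [36], assuming Shannon's $S$ as the underlying IM (for a distinct IM one substitutes its entropy and maximal value): $C_{\mathrm{SDL}} = \Delta(1 - \Delta)$, with disorder $\Delta = S/S_{\max}$, so that the complexity vanishes at both perfect order and perfect disorder.

```python
import numpy as np

def sdl_complexity(p):
    """Shiner-Davison-Landsberg measure C = d * (1 - d), with the disorder
    d = S / S_max, where S is Shannon's entropy and S_max = ln N for N outcomes."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    s = -np.sum(nz * np.log(nz))
    d = s / np.log(p.size)
    return float(d * (1.0 - d))

print(sdl_complexity([1.0, 0.0, 0.0, 0.0]))      # 0.0: perfect order (CIS)
print(sdl_complexity([0.25, 0.25, 0.25, 0.25]))  # 0.0: perfect disorder
print(sdl_complexity([0.7, 0.1, 0.1, 0.1]))      # ~0.218: intermediate regime
```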
8.2. López-Ruiz–Mancini–Calbet (LMC) Measure
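A companion sketch of the LMC measure [37], assuming its normalized-entropy form $C_{\mathrm{LMC}} = H \cdot D$, with $H = S/\ln N$ and the disequilibrium $D = \sum_i (p_i - 1/N)^2$ measuring the distance to the uniform distribution:

```python
import numpy as np

def lmc_complexity(p):
    """Lopez-Ruiz-Mancini-Calbet measure C = H * D: normalized Shannon entropy
    H = S / ln N times the disequilibrium D = sum_i (p_i - 1/N)**2."""
    p = np.asarray(p, dtype=float)
    n = p.size
    nz = p[p > 0]
    h = -np.sum(nz * np.log(nz)) / np.log(n)
    d = np.sum((p - 1.0 / n)**2)
    return float(h * d)

print(lmc_complexity([0.25, 0.25, 0.25, 0.25]))  # 0.0: D vanishes at equilibrium
print(lmc_complexity([0.7, 0.1, 0.1, 0.1]))      # ~0.183: finite complexity
```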
9. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Shannon, C. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423.
- Jaynes, E.T. Information Theory and Statistical Mechanics. Phys. Rev. 1957, 106, 620–630.
- Jaynes, E.T. Information Theory and Statistical Mechanics. II. Phys. Rev. 1957, 108, 171–190.
- Brillouin, L. Science and Information Theory; Academic Press: New York, NY, USA, 1956.
- Katz, A. Principles of Statistical Mechanics; Freeman: San Francisco, CA, USA, 1967.
- Balian, R. From Microphysics to Macrophysics; Springer: Berlin, Germany, 1991.
- Balian, R.; Alhassid, Y.; Reinhardt, H. Dissipation in many-body systems: A geometric approach based on information theory. Phys. Rep. 1986, 131, 1–146.
- Reinhardt, H. On the description of dissipative collective motion. Nucl. Phys. A 1984, 413, 475–488.
- Canosa, N.; Plastino, A.; Rossignoli, R. Ground-state wave functions and maximum entropy. Phys. Rev. A 1989, 40, 519.
- Canosa, N.; Rossignoli, R.; Plastino, A. Maximum entropy principle for many-body ground states. Nucl. Phys. A 1990, 512, 492–508.
- Canosa, N.; Rossignoli, R.; Plastino, A.; Miller, H.G. Quantal entropy, fluctuations, and the description of many-body ground states. Phys. Rev. C 1992, 45, 1162.
- Arrachea, L.; Canosa, N.; Plastino, A.; Portesi, M.; Rossignoli, R. Maximum-entropy approach to critical phenomena in ground states of finite systems. Phys. Rev. A 1992, 45, 7104.
- Canosa, N.; Plastino, A.; Rossignoli, R. Maximum-entropy-correlated ground state and the description of collective excitations. Nucl. Phys. A 1992, 550, 453–472.
- Arrachea, L.; Canosa, N.; Plastino, A.; Rossignoli, R. Ground state of the Hubbard model: A variational approach based on the maximum entropy principle. Phys. Lett. A 1993, 176, 353–359.
- Casas, M.; Plastino, A.; Puente, A.; Canosa, N.; Rossignoli, R. WKB wave functions without matching. Phys. Rev. A 1993, 47, 3530.
- Plastino, A.; Plastino, A.R. Maximum entropy and approximate descriptions of pure states. Phys. Lett. A 1993, 181, 446–449.
- Fernandez, F.M.; Castro, E.A. Hypervirial Theorems; Springer: Berlin, Germany, 1987.
- Ferri, G.L.; Pennini, F.; Plastino, A.; Rocca, M.C. New mathematics for the nonadditive Tsallis' scenario. Int. J. Mod. Phys. B 2017, 31, 1750151.
- Plastino, A.; Rocca, M.C. Teaching strategy for introducing beginners to Coherent States. Rev. Mex. Fis. E 2019, 65, 191–194.
- Tsallis, C. Introduction to Nonextensive Statistical Mechanics; Springer: Berlin, Germany, 2009.
- Landsberg, P.T. Entropies galore! Braz. J. Phys. 1999, 29, 46–49.
- Kaniadakis, G. Statistical mechanics in the context of special relativity. Phys. Rev. E 2002, 66, 056125.
- Kaniadakis, G. Theoretical foundations and mathematical formalism of the power-law tailed statistical distributions. Entropy 2013, 15, 3983–4010.
- Sparavigna, A.C. On the generalized additivity of Kaniadakis entropy. Int. J. Sci. 2015, 4, 44–48.
- Fara, P. Science, a Four Thousand Year History; Oxford University Press: Oxford, UK, 2009.
- Sharma, B.D.; Mittal, D.P. New non-additive measures of entropy for discrete probability distributions. J. Math. Sci. 1975, 10, 28–40.
- Nielsen, F.; Nock, R. A closed-form expression for the Sharma–Mittal entropy of exponential families. J. Phys. A 2011, 45, 032003.
- Pennini, F.; Plastino, A. Disequilibrium, thermodynamic relations, and Rényi's entropy. Phys. Lett. A 2017, 381, 212–215.
- Pennini, F.; Plastino, A. Complexity and disequilibrium as telltales of superconductivity. Physica A 2018, 506, 828–834.
- Pennini, F.; Plastino, A. Disequilibrium, complexity, the Schottky effect, and q-entropies, in paramagnetism. Physica A 2017, 488, 85–95.
- Pennini, F.; Plastino, A. Statistical Complexity of the Coriolis Antipairing Effect. Entropy 2019, 21, 558.
- Branada, R.; Pennini, F.; Plastino, A. Statistical complexity and classical–quantum frontier. Physica A 2018, 511, 18–26.
- Pennini, F.; Plastino, A. Statistical quantifiers for few-fermion systems. Physica A 2018, 491, 305–312.
- Pennini, F.; Plastino, A. Statistical manifestation of quantum correlations via disequilibrium. Phys. Lett. A 2017, 381, 3849–3854.
- Anteneodo, C.; Plastino, A.R. Some features of the López-Ruiz-Mancini-Calbet (LMC) statistical measure of complexity. Phys. Lett. A 1996, 223, 348–354.
- Shiner, J.S.; Davison, M.; Landsberg, P.T. Simple measure for complexity. Phys. Rev. E 1999, 59, 1459.
- López-Ruiz, R.; Mancini, H.L.; Calbet, X. A statistical measure of complexity. Phys. Lett. A 1995, 209, 321–326.
- Crutchfield, J.P. The calculi of emergence: Computation, dynamics and induction. Phys. D 1994, 75, 11–54.
- Feldman, D.P.; Crutchfield, J.P. Measures of statistical complexity: Why? Phys. Lett. A 1998, 238, 244–252.
- Martin, M.T.; Plastino, A.; Rosso, O.A. Statistical Complexity and Disequilibrium. Phys. Lett. A 2003, 311, 126–132.
- Kowalski, A.; Martin, M.T.; Plastino, A.; Rosso, O.; Proto, A.N. Wavelet statistical complexity analysis of the classical limit. Phys. Lett. A 2003, 311, 180–191.
- Rudnicki, L.; Toranzo, I.V.; Sanchez-Moreno, P.; Dehesa, J.S. Monotone measures of statistical complexity. Phys. Lett. A 2016, 380, 377–380.
- López-Ruiz, R. A statistical measure of complexity. In Concepts and Recent Advances in Generalized Information Measures and Statistics; Kowalski, A., Rossignoli, R., Curado, E.M.C., Eds.; Bentham Science Books: New York, NY, USA, 2013; pp. 147–168.
- Sen, K.D. Statistical Complexity. Applications in Electronic Structure; Springer: Berlin, Germany, 2011.
- Mitchell, M. Complexity: A Guided Tour; Oxford University Press: Oxford, UK, 2009.
- Martin, M.T.; Plastino, A.; Rosso, O.A. Generalized statistical complexity measures: Geometrical and analytical properties. Physica A 2006, 369, 439–462.
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).