Entropy Evaluation Based on Value Validity
Abstract
1. Introduction
2. Entropy Properties
2.1. Properties of S
- (P1)
- S is a continuous function of all its arguments p1,..., pn (so that small changes in some of the pi ’s result in only a small change in the value of S).
- (P2)
- S is (permutation) symmetric in the pi (i = 1,..., n).
- (P3)
- S is zero-indifferent (expansible), i.e., the addition of some state(s) or event(s) with zero probability does not change the value of S, or formally: S(p1,..., pn, 0) = S(p1,..., pn).
- (P4)
- S attains its extremal values for the two probability distributions Pn0 = (1, 0,..., 0) and Pnu = (1/n,..., 1/n), so that, for any distribution Pn = (p1,..., pn): S(Pn0) ≤ S(Pn) ≤ S(Pnu).
- (P5)
- S(Pnu) = S(1/n,..., 1/n) is strictly increasing in n, for the uniform distribution Pnu in Equation (4).
- (P6)
- S is strictly Schur-concave and hence, if Pn is majorized by Qn (denoted by Pn ≺ Qn), then S(Pn) ≥ S(Qn), with strict inequality unless Qn is simply a permutation of Pn.
- (P7)
- S is additive in the following sense. If {pij} is the joint probability distribution for the quantum states for two parts of a system or for the events of two statistical experiments, with marginal probability distributions {pi+} and {p+j}, where pi+ = Σj pij and p+j = Σi pij for i = 1,..., n and j = 1,..., m, then, under independence (pij = pi+ p+j): S({pij}) = S({pi+}) + S({p+j}).
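The properties (P3), (P4), (P6), and (P7) can be checked numerically for the Shannon entropy S = −Σ pi ln pi. A minimal sketch (the helper name `shannon` is illustrative, not from the paper):

```python
import math

def shannon(p):
    """Shannon entropy S(p) = -sum p_i ln p_i (natural log), skipping zero terms."""
    return -sum(x * math.log(x) for x in p if x > 0)

# (P3) zero-indifference: adding a zero-probability event leaves S unchanged
assert math.isclose(shannon([0.6, 0.4]), shannon([0.6, 0.4, 0.0]))

# (P4) extremal values: S = 0 at (1, 0, ..., 0) and S = ln(n) at the uniform distribution
n = 4
assert shannon([1.0, 0.0, 0.0, 0.0]) == 0.0
assert math.isclose(shannon([1 / n] * n), math.log(n))

# (P6) Schur-concavity: (0.5, 0.3, 0.2) is majorized by (0.7, 0.2, 0.1),
# so the former (more "spread out") distribution has the larger entropy
assert shannon([0.5, 0.3, 0.2]) > shannon([0.7, 0.2, 0.1])

# (P7) additivity: for an independent joint distribution, S(joint) = S(row) + S(col)
row, col = [0.6, 0.4], [0.3, 0.7]
joint = [r * c for r in row for c in col]
assert math.isclose(shannon(joint), shannon(row) + shannon(col))
```

The majorization check in (P6) uses a single hand-picked pair; a full test would compare sorted partial sums of the two distributions.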
2.2. Valid Comparison Conditions
3. Value-Valid Functions of S and S*
3.1. The Case of S
3.2. The Case of S*
4. Assessment of Entropy Families
5. The Euclidean Entropy
6. Statistical Inferences
7. Concluding Comments
Conflicts of Interest
References
- Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423.
- Aczél, J.; Daróczy, Z. On Measures of Information and Their Characterizations; Academic Press: New York, NY, USA, 1975.
- Landsberg, P.T. Thermodynamics and Statistical Mechanics; Dover: New York, NY, USA, 1990.
- Reza, F.M. An Introduction to Information Theory; McGraw-Hill: New York, NY, USA, 1961.
- Boltzmann, L. Weitere Studien über das Wärmegleichgewicht unter Gasmolekülen. In Sitzungsberichte der Kaiserlichen Akademie der Wissenschaften in Wien; Tempsky: Wien, Austria, 1872; pp. 275–370. (In German)
- Tribus, M. Thirty years of information theory. In The Study of Information; Machlup, F., Mansfield, U., Eds.; Wiley: New York, NY, USA, 1983; pp. 475–484.
- Magurran, A.E. Measuring Biological Diversity; Blackwell: Oxford, UK, 2004.
- Norwich, K.H. Information, Sensation, and Perception; Academic Press: San Diego, CA, USA, 1993.
- Cho, A. A fresh take on disorder, or disorderly science. Science 2002, 297, 1268–1269.
- Rényi, A. Probability Theory; North-Holland: Amsterdam, The Netherlands, 1970.
- Rényi, A. On measures of entropy and information. In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Berkeley, CA, USA, 20 June–30 July 1961; University of California Press: Berkeley-Los Angeles, CA, USA, 1961; Volume I, pp. 547–561.
- Havrda, J.; Charvát, F. Quantification method of classification processes. Concept of structural α-entropy. Kybernetika 1967, 3, 30–35.
- Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487.
- Kapur, J.N. Generalized entropy of order α and type β. Math. Semin. 1967, 4, 78–94.
- Aczél, J.; Daróczy, Z. Über verallgemeinerte quasilineare Mittelwerte, die mit Gewichtsfunktionen gebildet sind. Publ. Math. Debr. 1963, 10, 171–190. (In German)
- Arimoto, S. Information theoretical considerations on estimation problems. Inform. Control 1971, 19, 181–194.
- Sharma, B.D.; Mittal, D.P. New non-additive measures of entropy for discrete probability distributions. J. Math. Sci. 1975, 10, 28–40.
- Rathie, P.N. Generalization of the non-additive measures of uncertainty and information and their axiomatic characterizations. Kybernetika 1971, 7, 125–131.
- Kvålseth, T.O. On generalized information measures of human performance. Percept. Mot. Skills 1991, 72, 1059–1063.
- Kvålseth, T.O. Correction of a generalized information measure. Percept. Mot. Skills 1994, 79, 348–350.
- Kvålseth, T.O. Entropy. In International Encyclopedia of Statistical Science; Lovric, M., Ed.; Springer-Verlag: Heidelberg, Germany, 2011; Part 5, pp. 436–439.
- Morales, D.; Pardo, L.; Vajda, I. Uncertainty of discrete stochastic systems: General theory and statistical inference. IEEE Trans. Syst. Man Cybern. 1996, 26, 681–697.
- Good, I.J. The population frequencies of species and the estimation of population parameters. Biometrika 1953, 40, 237–264.
- Klir, G.J. Uncertainty and Information; Wiley: Hoboken, NJ, USA, 2006.
- Aczél, J. Measuring information beyond communication theory. Inf. Proc. Manag. 1984, 20, 383–395.
- Marshall, A.W.; Olkin, I.; Arnold, B.C. Inequalities: Theory of Majorization and Its Applications, 2nd ed.; Springer: New York, NY, USA, 2011.
- Hand, D.J. Measurement Theory and Practice; Wiley: Chichester, UK, 2004.
- Kvålseth, T.O. The lambda distribution and its applications to categorical summary measures. Adv. Appl. Stat. 2011, 24, 83–106.
- Aczél, J. Lectures on Functional Equations and Their Applications; Academic Press: New York, NY, USA, 1966.
- Kvålseth, T.O. Cautionary note about R2. Am. Stat. 1985, 39, 279–285.
- Hartley, R.V. Transmission of information. Bell Syst. Tech. J. 1928, 7, 535–563.
- Baczkowski, S.J.; Joanes, D.N.; Shamia, G.M. Range of validity of α and β for a generalized diversity index H(α, β) due to Good. Math. Biosci. 1998, 148, 115–128.
- Kapur, J.N.; Kesavan, H.K. Entropy Optimization Principles with Applications; Academic Press: Boston, MA, USA, 1992.
- Peitgen, H.-O.; Jürgens, H.; Saupe, D. Chaos and Fractals: New Frontiers of Science, 2nd ed.; Springer-Verlag: New York, NY, USA, 2004.
- Schroeder, M. Fractals, Chaos, Power Laws: Minutes from an Infinite Paradise; W.H. Freeman: New York, NY, USA, 1991.
- Vajda, I. Bounds on the minimal error probability on checking a finite or countable number of hypotheses. Probl. Pereda. Inf. 1968, 4, 9–19.
- Kvålseth, T.O. Coefficients of variation for nominal and ordinal categorical data. Percept. Mot. Skills 1995, 80, 843–847.
- Beckenbach, E.F.; Bellman, R. Inequalities; Springer-Verlag: Berlin, Germany, 1971.
- Bishop, Y.M.M.; Fienberg, S.E.; Holland, P.W. Discrete Multivariate Analysis; MIT Press: Cambridge, MA, USA, 1975.
- Pardo, L. Statistical Inference Based on Divergence Measures; Chapman & Hall: Boca Raton, FL, USA, 2006.
Formulation | Parameter Restrictions | Source
---|---|---
- | α > 0 | Rényi [10,11]
- | α > 0 | Havrda and Charvát [12]
- | −∞ < α < ∞, k constant | Tsallis [13]
- | α, δ > 0 | Kapur [14], Aczél and Daróczy [15]
- | α > 0 | Arimoto [16]
- | α, β > 0 | Sharma and Mittal [17]
- | α > 0, α + δ − 1 > 0 | Rathie [18]
- | 0 < α < 1 ≤ δ, βλ > 0; or, 0 ≤ δ ≤ 1 < α, βλ < 0 | Kvålseth [19–21]
- | α, β > 0 | Morales et al. [22]
- | α, β positive integers | Good [23]
- | - | Aczél and Daróczy [15]
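The formulation column of the table did not survive conversion. For orientation, three of the listed families have the following widely used standard forms, here written with natural logarithms; the normalization constants (e.g., the k in Tsallis and the 2^(1−α) − 1 denominator in Havrda–Charvát) follow common usage and may differ from the table's original notation:

```python
import math

def shannon(p):
    """Shannon entropy S = -sum p_i ln p_i, the alpha -> 1 limit of the families below."""
    return -sum(x * math.log(x) for x in p if x > 0)

def renyi(p, alpha):
    """Rényi entropy: ln(sum p_i^alpha) / (1 - alpha), for alpha > 0, alpha != 1."""
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

def havrda_charvat(p, alpha):
    """Havrda-Charvát structural alpha-entropy: (sum p_i^alpha - 1) / (2^(1-alpha) - 1)."""
    return (sum(x ** alpha for x in p) - 1) / (2 ** (1 - alpha) - 1)

def tsallis(p, alpha, k=1.0):
    """Tsallis entropy: k (1 - sum p_i^alpha) / (alpha - 1)."""
    return k * (1 - sum(x ** alpha for x in p)) / (alpha - 1)

p = [0.5, 0.3, 0.2]
# Rényi and Tsallis both tend to the Shannon entropy as alpha -> 1;
# Havrda-Charvát tends to S / ln 2 (i.e., base-2 Shannon entropy)
for alpha in (0.999, 1.001):
    assert abs(renyi(p, alpha) - shannon(p)) < 1e-2
    assert abs(tsallis(p, alpha) - shannon(p)) < 1e-2
    assert abs(havrda_charvat(p, alpha) - shannon(p) / math.log(2)) < 1e-2
```

Note that Rényi's entropy is additive (P7) while the Havrda–Charvát and Tsallis forms are not, which is one of the contrasts the assessment in Section 4 turns on.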
n | λ | λ ln n | S | S/ln n | S* | ST |
---|---|---|---|---|---|---|
2 | 0.1 | 0.07 | 0.20 | 0.29 | 0.08 | 0.10 |
2 | 0.3 | 0.21 | 0.42 | 0.61 | 0.20 | 0.31 |
2 | 0.5 | 0.35 | 0.56 | 0.81 | - | 0.51 |
2 | 0.7 | 0.49 | 0.65 | 0.94 | - | 0.72 |
2 | 0.9 | 0.62 | 0.69 | 0.99 | - | 0.88 |
5 | 0.1 | 0.16 | 0.39 | 0.24 | 0.19 | 0.09 |
5 | 0.3 | 0.48 | 0.88 | 0.55 | 0.51 | 0.29 |
5 | 0.5 | 0.80 | 1.23 | 0.76 | 0.78 | 0.50 |
5 | 0.7 | 1.13 | 1.46 | 0.91 | - | 0.71 |
5 | 0.9 | 1.45 | 1.59 | 0.99 | - | 0.92 |
10 | 0.1 | 0.23 | 0.50 | 0.22 | 0.25 | 0.09 |
10 | 0.3 | 0.69 | 1.18 | 0.51 | 0.74 | 0.28 |
10 | 0.5 | 1.15 | 1.68 | 0.73 | 1.15 | 0.50 |
10 | 0.7 | 1.61 | 2.04 | 0.89 | - | 0.71 |
10 | 0.9 | 2.07 | 2.27 | 0.98 | - | 0.90 |
20 | 0.1 | 0.30 | 0.59 | 0.20 | 0.31 | 0.08 |
20 | 0.3 | 0.90 | 1.44 | 0.48 | 0.95 | 0.28 |
20 | 0.5 | 1.50 | 2.09 | 0.70 | 1.51 | 0.49 |
20 | 0.7 | 2.10 | 2.60 | 0.87 | - | 0.71 |
20 | 0.9 | 2.70 | 2.93 | 0.98 | - | 0.92 |
50 | 0.1 | 0.39 | 0.70 | 0.18 | 0.39 | 0.08 |
50 | 0.3 | 1.17 | 1.75 | 0.45 | 1.21 | 0.28 |
50 | 0.5 | 1.96 | 2.60 | 0.66 | 1.99 | 0.48 |
50 | 0.7 | 2.74 | 3.29 | 0.84 | - | 0.70 |
50 | 0.9 | 3.52 | 3.80 | 0.97 | - | 0.92 |
100 | 0.1 | 0.46 | 0.78 | 0.17 | 0.44 | 0.08 |
100 | 0.3 | 1.38 | 1.97 | 0.43 | 1.41 | 0.28 |
100 | 0.5 | 2.30 | 2.97 | 0.64 | 2.35 | 0.49 |
100 | 0.7 | 3.22 | 3.80 | 0.83 | - | 0.72 |
100 | 0.9 | 4.14 | 4.44 | 0.96 | - | 0.91 |
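The tabulated S values can be reproduced from the lambda distribution of Kvålseth [28]; the parametric form used below, Pnλ = (1 − λ + λ/n, λ/n, ..., λ/n), is inferred from the tabulated values rather than quoted from the paper:

```python
import math

def shannon(p):
    """Shannon entropy S = -sum p_i ln p_i (natural log)."""
    return -sum(x * math.log(x) for x in p if x > 0)

def lambda_dist(n, lam):
    """Lambda distribution: one dominant probability 1 - lam + lam/n, the rest lam/n each."""
    return [1 - lam + lam / n] + [lam / n] * (n - 1)

# Reproduce a few (n, lambda) rows of the table's S column (values rounded to 2 d.p.)
for n, lam, s_expected in [(2, 0.5, 0.56), (5, 0.3, 0.88), (10, 0.7, 2.04), (100, 0.9, 4.44)]:
    s = shannon(lambda_dist(n, lam))
    assert abs(s - s_expected) < 0.005, (n, lam, s)
    # the third column is simply lam * ln(n), and the normalized column is s / ln(n)
    assert abs(s / math.log(n) - s / math.log(n)) == 0.0
```

Comparing the S column against λ ln n row by row is exactly the value-validity question of the paper: a value-valid measure should return λ times its maximum ln n for this distribution.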
© 2014 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/).
Share and Cite
Kvålseth, T.O. Entropy Evaluation Based on Value Validity. Entropy 2014, 16, 4855-4873. https://doi.org/10.3390/e16094855