Information and Divergence Measures
Acknowledgments
Conflicts of Interest