Jeffreys Divergence and Generalized Fisher Information Measures on Fokker–Planck Space–Time Random Field
Abstract
1. Introduction
1.1. Space–Time Random Field
1.2. Kramers–Moyal Expansion and Fokker–Planck Equation
1.3. Differential Entropy and De Bruijn Identity
1.4. Entropy Divergence
2. Notations, Definitions, and Propositions
2.1. Notations and Assumptions
- Given a probability space $(\Omega, \mathcal{F}, \mathbb{P})$, two real-valued space–time random fields are denoted $X(t, \boldsymbol{x})$ and $Y(s, \boldsymbol{y})$, where $(t, \boldsymbol{x})$ and $(s, \boldsymbol{y})$, with $t, s \geq 0$ and $\boldsymbol{x}, \boldsymbol{y} \in \mathbb{R}^{n}$, are space–time variables.
- The probability density functions of $P$ and $Q$ are denoted $p$ and $q$. With $u \in \mathbb{R}$, $p(u; t, \boldsymbol{x})$ is the density value at $u$ of $X(t, \boldsymbol{x})$ and $q(u; s, \boldsymbol{y})$ is the density value at $u$ of $Y(s, \boldsymbol{y})$.
- Unless there are specific restrictions on the ranges of variables, suppose that our density functions $p(u; t, \boldsymbol{x})$ and $q(u; s, \boldsymbol{y})$ belong to $\mathcal{C}^{2,1}$. This means that $p$ and $q$ are partially differentiable twice with respect to $u$ and once with respect to $(t, \boldsymbol{x})$ or $(s, \boldsymbol{y})$, respectively (the required derivatives are written out after this list).
- Vectors that differ only in the $k$-th coordinate of $\boldsymbol{x} = (x_{1}, \ldots, x_{n})$ are denoted $\boldsymbol{x}^{(1)}$ and $\boldsymbol{x}^{(2)}$, where the $k$-th coordinates are $x_{k}^{(1)}$ and $x_{k}^{(2)}$, $k = 1, \ldots, n$.
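To make the smoothness assumption concrete, here is what $\mathcal{C}^{2,1}$ membership requires, written under the notation above (the symbols were reconstructed from context, so the paper's exact conventions may differ): the partial derivatives

$$
\frac{\partial^{2}}{\partial u^{2}} p(u; t, \boldsymbol{x}), \qquad \frac{\partial}{\partial t} p(u; t, \boldsymbol{x}), \qquad \frac{\partial}{\partial x_{k}} p(u; t, \boldsymbol{x}), \quad k = 1, \ldots, n,
$$

must exist and be continuous, and analogously for $q(u; s, \boldsymbol{y})$ with respect to $u$, $s$, and the coordinates $y_{k}$.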
2.2. Definitions
- (1) The Kramers–Moyal expansion is truncated at $m = 1$, meaning that the process is deterministic;
- (2) The Kramers–Moyal expansion stops at $m = 2$, with the resulting equation being the Fokker–Planck equation, which describes diffusion processes;
- (3) The Kramers–Moyal expansion contains all the terms up to $m = \infty$ (the expansion and its $m = 2$ truncation are written out below).
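For context, here is a minimal sketch of the expansion behind these three cases, written in its one-dimensional form following Risken's monograph (the space–time field setting of this paper carries additional variables): the Kramers–Moyal expansion of a transition density $p(u, t)$ reads

$$
\frac{\partial p(u,t)}{\partial t} = \sum_{m=1}^{\infty} \left( -\frac{\partial}{\partial u} \right)^{m} \left[ D^{(m)}(u,t)\, p(u,t) \right],
$$

with Kramers–Moyal coefficients $D^{(m)}$. Stopping at $m = 2$ yields the Fokker–Planck equation

$$
\frac{\partial p(u,t)}{\partial t} = -\frac{\partial}{\partial u} \left[ D^{(1)}(u,t)\, p(u,t) \right] + \frac{\partial^{2}}{\partial u^{2}} \left[ D^{(2)}(u,t)\, p(u,t) \right],
$$

where $D^{(1)}$ is the drift and $D^{(2)}$ the diffusion coefficient. Pawula's theorem (see the Pawula references below) explains the trichotomy: a truncation at any finite order greater than two cannot produce a nonnegative transition density, so only the three cases above can occur.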
3. Main Results and Proofs
4. Three Fokker–Planck Random Fields and Their Corresponding Information Measures
5. Conclusions
Funding
Institutional Review Board Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| Abbreviation | Meaning |
| --- | --- |
| KL | Kullback–Leibler divergence |
| FI | Fisher information |
| CFI | Cross-Fisher information |
| FD | Fisher divergence |
| sFD | Symmetric Fisher divergence |
| JD | Jeffreys divergence |
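For reference, a sketch of the standard one-variable forms of the abbreviated quantities; these are common conventions rather than necessarily the exact definitions used in the paper (in particular, the CFI form shown is one of several in the literature), and the space–time versions generalize the $u$-derivative accordingly:

$$
D_{\mathrm{KL}}(p \,\|\, q) = \int p \, \log \frac{p}{q} \, \mathrm{d}u, \qquad \mathrm{JD}(p, q) = D_{\mathrm{KL}}(p \,\|\, q) + D_{\mathrm{KL}}(q \,\|\, p),
$$

$$
I(p) = \int p \left( \frac{\partial}{\partial u} \log p \right)^{2} \mathrm{d}u, \qquad \mathrm{CFI}(p, q) = \int q \left( \frac{\partial}{\partial u} \log p \right)^{2} \mathrm{d}u,
$$

$$
\mathrm{FD}(p \,\|\, q) = \int p \left( \frac{\partial}{\partial u} \log \frac{p}{q} \right)^{2} \mathrm{d}u, \qquad \mathrm{sFD}(p, q) = \mathrm{FD}(p \,\|\, q) + \mathrm{FD}(q \,\|\, p).
$$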
References
- Risken, H. The Fokker–Planck Equation: Methods of Solution and Applications; Springer: Berlin/Heidelberg, Germany, 1984.
- Stam, A.J. Some inequalities satisfied by the quantities of information of Fisher and Shannon. Inf. Control 1959, 2, 101–112.
- Barron, A.R. Entropy and the central limit theorem. Ann. Probab. 1986, 14, 336–342.
- Johnson, O. Information Theory and the Central Limit Theorem; Imperial College Press: London, UK, 2004.
- Guo, D. Relative entropy and score function: New information–estimation relationships through arbitrary additive perturbation. In Proceedings of the 2009 IEEE International Symposium on Information Theory, Seoul, Republic of Korea, 28 June–3 July 2009; pp. 814–818.
- Toranzo, I.V.; Zozor, S.; Brossier, J.-M. Generalization of the De Bruijn Identity to General ϕ-Entropies and ϕ-Fisher Informations. IEEE Trans. Inform. Theory 2018, 64, 6743–6758.
- Kharazmi, O.; Balakrishnan, N. Cumulative residual and relative cumulative residual Fisher information and their properties. IEEE Trans. Inform. Theory 2021, 67, 6306–6312.
- Kolmogorov, A.N. The local structure of turbulence in incompressible viscous fluid for very large Reynolds numbers. Dokl. Akad. Nauk SSSR 1941, 30, 299–303.
- Kolmogorov, A.N. On the degeneration of isotropic turbulence in an incompressible viscous fluid. Dokl. Akad. Nauk SSSR 1941, 31, 538–542.
- Kolmogorov, A.N. Dissipation of energy in isotropic turbulence. Dokl. Akad. Nauk SSSR 1941, 32, 19–21.
- Yaglom, A.M. Some classes of random fields in n-dimensional space, related to stationary random processes. Theory Probab. Its Appl. 1957, 2, 273–320.
- Yaglom, A.M. Correlation Theory of Stationary and Related Random Functions. Volume I: Basic Results; Springer: New York, NY, USA, 1987.
- Yaglom, A.M. Correlation Theory of Stationary and Related Random Functions. Volume II: Supplementary Notes and References; Springer: Berlin, Germany, 1987.
- Bowditch, A.; Sun, R. The two-dimensional continuum random field Ising model. Ann. Probab. 2022, 50, 419–454.
- Bailleul, I.; Catellier, R.; Delarue, F. Propagation of chaos for mean field rough differential equations. Ann. Probab. 2021, 49, 944–996.
- Wu, L.; Samorodnitsky, G. Regularly varying random fields. Stoch. Process. Their Appl. 2020, 130, 4470–4492.
- Koch, E.; Dombry, C.; Robert, C.Y. A central limit theorem for functions of stationary max-stable random fields on $\mathbb{R}^d$. Stoch. Process. Their Appl. 2019, 129, 3406–3430.
- Ye, Z. On Entropy and ε-Entropy of Random Fields. Ph.D. Dissertation, Cornell University, Ithaca, NY, USA, 1989.
- Ye, Z.; Berger, T. A new method to estimate the critical distortion of random fields. IEEE Trans. Inform. Theory 1992, 38, 152–157.
- Ye, Z.; Berger, T. Information Measures for Discrete Random Fields; Science Press: Beijing, China; New York, NY, USA, 1998.
- Ye, Z.; Yang, W. Random Field: Network Information Theory and Game Theory; Science Press: Beijing, China, 2023. (In Chinese)
- Ma, C. Stationary random fields in space and time with rational spectral densities. IEEE Trans. Inform. Theory 2007, 53, 1019–1029.
- Hairer, M. A theory of regularity structures. Invent. Math. 2014, 198, 269–504.
- Hairer, M. Solving the KPZ equation. Ann. Math. 2013, 178, 559–664.
- Kremp, H.; Perkowski, N. Multidimensional SDE with distributional drift and Lévy noise. Bernoulli 2022, 28, 1757–1783.
- Beeson, R.; Namachchivaya, N.S.; Perkowski, N. Approximation of the filter equation for multiple timescale, correlated, nonlinear systems. SIAM J. Math. Anal. 2022, 54, 3054–3090.
- Song, Z.; Zhang, J. A note for estimation about average differential entropy of continuous bounded space–time random field. Chin. J. Electron. 2022, 31, 793–803.
- Kramers, H.A. Brownian motion in a field of force and the diffusion model of chemical reactions. Physica 1940, 7, 284–304.
- Moyal, J.E. Stochastic processes and statistical physics. J. R. Stat. Soc. Ser. B Stat. Methodol. 1949, 11, 150–210.
- Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423, 623–656.
- Neeser, F.D.; Massey, J.L. Proper complex random processes with applications to information theory. IEEE Trans. Inform. Theory 1993, 39, 1293–1302.
- Ihara, S. Information Theory for Continuous Systems; World Scientific: Singapore, 1993.
- Gray, R.M. Entropy and Information Theory; Springer: Boston, MA, USA, 2011.
- Bach, F. Information Theory With Kernel Methods. IEEE Trans. Inform. Theory 2023, 69, 752–775.
- Kullback, S.; Leibler, R.A. On information and sufficiency. Ann. Math. Stat. 1951, 22, 79–86.
- Jeffreys, H. An invariant form for the prior probability in estimation problems. Proc. R. Soc. Lond. A 1946, 186, 453–461.
- Fuglede, B.; Topsøe, F. Jensen-Shannon divergence and Hilbert space embedding. In Proceedings of the IEEE International Symposium on Information Theory (ISIT), Chicago, IL, USA, 27 June–2 July 2004; p. 31.
- Rényi, A. On measures of entropy and information. In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Contributions to the Theory of Statistics; University of California Press: Berkeley, CA, USA, 1961; pp. 547–561.
- She, R.; Fan, P.; Liu, X.-Y.; Wang, X. Interpretable Generative Adversarial Networks With Exponential Function. IEEE Trans. Signal Process. 2021, 69, 3854–3867.
- Liu, S.; She, R.; Zhu, Z.; Fan, P. Storage Space Allocation Strategy for Digital Data with Message Importance. Entropy 2020, 22, 591.
- She, R.; Liu, S.; Fan, P. Attention to the Variation of Probabilistic Events: Information Processing with Message Importance Measure. Entropy 2019, 21, 439.
- Wan, S.; Lu, J.; Fan, P.; Letaief, K.B. Information Theory in Formation Control: An Error Analysis to Multi-Robot Formation. Entropy 2018, 20, 618.
- She, R.; Liu, S.; Fan, P. Recognizing Information Feature Variation: Message Importance Transfer Measure and Its Applications in Big Data. Entropy 2018, 20, 401.
- Nielsen, F. An Elementary Introduction to Information Geometry. Entropy 2020, 22, 1100.
- Nielsen, F. On the Jensen–Shannon Symmetrization of Distances Relying on Abstract Means. Entropy 2019, 21, 485.
- Nielsen, F.; Nock, R. Generalizing skew Jensen divergences and Bregman divergences with comparative convexity. IEEE Signal Process. Lett. 2017, 24, 1123–1127.
- Furuichi, S.; Minculete, N. Refined Young Inequality and Its Application to Divergences. Entropy 2021, 23, 514.
- Pinele, J.; Strapasson, J.E.; Costa, S.I. The Fisher–Rao Distance between Multivariate Normal Distributions: Special Cases, Bounds and Applications. Entropy 2020, 22, 404.
- Reverter, F.; Oller, J.M. Computing the Rao distance for Gamma distributions. J. Comput. Appl. Math. 2003, 157, 155–167.
- Pawula, R.F. Generalizations and extensions of the Fokker–Planck–Kolmogorov equations. IEEE Trans. Inform. Theory 1967, 13, 33–41.
- Pawula, R.F. Approximation of the linear Boltzmann equation by the Fokker–Planck equation. Phys. Rev. 1967, 162, 186–188.
- Khoshnevisan, D.; Shi, Z. Brownian Sheet and Capacity. Ann. Probab. 1999, 27, 1135–1159.
- Revuz, D.; Yor, M. Continuous Martingales and Brownian Motion, 2nd ed.; Springer: New York, NY, USA, 1999.
© 2023 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Zhang, J. Jeffreys Divergence and Generalized Fisher Information Measures on Fokker–Planck Space–Time Random Field. Entropy 2023, 25, 1445. https://doi.org/10.3390/e25101445