Principle of Information Increase: An Operational Perspective on Information Gain in the Foundations of Quantum Theory
Abstract
1. Introduction
2. Continuous Entropy and Bayesian Information Gain
2.1. Entropy of Continuous Distribution
1. Entropy of a continuous distribution as a difference: For example, when the distribution of a variable X is updated due to certain actions, the decrease in entropy can be expressed as the difference between the entropies of the distributions before and after the update, so that the divergent constant cancels.
2. Straightforward solution: Jaynes directly discards the infinite term in Equation (2). For the sake of convenience, the minus sign is also dropped. This leads to the definition of the Shannon–Jaynes information, which quantifies the amount of information we possess regarding the outcome of X rather than the degree of uncertainty about X; it is equivalent to the KL divergence between the updated distribution and the reference distribution. (A reconstruction in generic notation follows this list.)
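Written out in generic notation, with $p(x)$ the updated density, $p'(x)$ a further-updated density, and $m(x)$ a reference (prior) density, the quantities in the two items above take roughly the following form; the symbols are illustrative choices, not necessarily the notation used in the rest of the paper.

```latex
% Minimal reconstruction in generic notation (p = updated density, m = reference density)
\begin{align*}
  % Differential entropy of a continuous density p(x); in the continuum limit
  % of a discretized variable it carries a divergent additive constant.
  H[p] &= -\int p(x)\,\log p(x)\,\mathrm{d}x \\
  % Item 1: expressing the entropy decrease as a difference, so that the
  % divergent constant cancels when p(x) is updated to p'(x).
  \Delta H &= H[p] - H[p'] \\
  % Item 2: Shannon--Jaynes information, obtained by discarding the infinite
  % term (and the minus sign); equal to the KL divergence of p from m.
  I[p\,\|\,m] &= \int p(x)\,\log\frac{p(x)}{m(x)}\,\mathrm{d}x
             = D_{\mathrm{KL}}\!\left(p\,\|\,m\right)
\end{align*}
```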
2.2. Bayesian Information Gain
3. Differential Information Gain
3.1. Finite Number of Tosses
3.1.1. Positivity of the Differential Information Gain
3.1.2. Fraction of Negatives
3.1.3. Robustness of the Differential Information Gain
3.2. Large N Approximation
4. Relative Information Gain
5. Expected Information Gain
6. Comparison of Three Information Gain Measures, and the Information Increase Principle
Principle of Information Increase: In a series of interrogations of an n-outcome probabilistic source, the information gain from additional data should tend towards positivity in the asymptotic limit. However, in the extreme case where the first N data points are identical and the datum of the (N+1)th trial is contrary to the previous data, the information gain in this exceptional case should be negative.
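The sketch below illustrates both clauses of the principle for a two-outcome (coin-toss) source. It assumes that the information held after N tosses is the Shannon–Jaynes information I_N = D_KL(posterior ‖ prior), that the prior over the bias f is a symmetric beta density, and that the differential information gain is the change I_{N+1} − I_N produced by one more toss; the function names, the closed-form beta KL divergence, and the default choice of the Jeffreys prior Beta(1/2, 1/2) are illustrative assumptions rather than the paper's implementation.

```python
# Hedged numerical sketch of the Principle of Information Increase for a
# two-outcome source. Assumptions (mine, not taken from the text above):
# information after N tosses = KL(posterior || prior) for a Beta(a, a) prior,
# and differential information gain = I_{N+1} - I_N.
from scipy.special import betaln, digamma

def kl_beta(a1, b1, a2, b2):
    """KL divergence D( Beta(a1,b1) || Beta(a2,b2) ) in nats (closed form)."""
    return (betaln(a2, b2) - betaln(a1, b1)
            + (a1 - a2) * digamma(a1)
            + (b1 - b2) * digamma(b1)
            + (a2 - a1 + b2 - b1) * digamma(a1 + b1))

def shannon_jaynes_info(n_heads, n_tails, a=0.5):
    """I_N = KL(posterior || prior) for a Beta(a, a) prior (a = 0.5: Jeffreys)."""
    return kl_beta(a + n_heads, a + n_tails, a, a)

def differential_gain(n_heads, n_tails, next_is_head, a=0.5):
    """Change in Shannon-Jaynes information caused by one additional toss."""
    before = shannon_jaynes_info(n_heads, n_tails, a)
    after = shannon_jaynes_info(n_heads + next_is_head,
                                n_tails + (1 - next_is_head), a)
    return after - before

if __name__ == "__main__":
    a = 0.5  # Jeffreys binomial prior Beta(1/2, 1/2)
    # Typical case: a long, roughly balanced record -> small positive gain.
    print("balanced data :", differential_gain(50, 50, next_is_head=1, a=a))
    # Exceptional case named in the principle: N identical outcomes followed
    # by a contrary outcome -> the gain is negative.
    print("contrary datum:", differential_gain(100, 0, next_is_head=0, a=a))
```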
7. Related Work
7.1. Information Increase Principle and the Jeffreys Binomial Prior
7.2. Other Information-Theoretical Motivations of the Jeffreys Binomial Prior
8. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A. Derivation of Differential Information Gain
Appendix B. Derivation of Relative Information Gain
Appendix C. Equivalence of Expected Differential Information Gain and Expected Relative Information Gain
References
- Patra, M.K. Quantum state determination: Estimates for information gain and some exact calculations. J. Phys. A Math. Theor. 2007, 40, 10887–10902.
- Madhok, V.; Riofrío, C.A.; Ghose, S.; Deutsch, I.H. Information Gain in Tomography–A Quantum Signature of Chaos. Phys. Rev. Lett. 2014, 112, 014102.
- Quek, Y.; Fort, S.; Ng, H.K. Adaptive quantum state tomography with neural networks. npj Quantum Inf. 2021, 7, 105.
- Gupta, R.; Xia, R.; Levine, R.D.; Kais, S. Maximal Entropy Approach for Quantum State Tomography. PRX Quantum 2021, 2, 010318.
- McMichael, R.D.; Dushenko, S.; Blakley, S.M. Sequential Bayesian experiment design for adaptive Ramsey sequence measurements. J. Appl. Phys. 2021, 130, 144401.
- Placek, B.; Angerhausen, D.; Knuth, K.H. Analyzing Exoplanet Phase Curve Information Content: Toward Optimized Observing Strategies. Astron. J. 2017, 154, 154.
- Ma, C.W.; Ma, Y.G. Shannon information entropy in heavy-ion collisions. Prog. Part. Nucl. Phys. 2018, 99, 120–158.
- Grinbaum, A. Elements of information-theoretic derivation of the formalism of quantum theory. Int. J. Quantum Inf. 2003, 1, 289–300.
- Brukner, Č.; Zeilinger, A. Information Invariance and Quantum Probabilities. Found. Phys. 2009, 39, 677–689.
- Goyal, P.; Knuth, K.H.; Skilling, J. Origin of Complex Quantum Amplitudes and Feynman’s Rules. Phys. Rev. A 2010, 81, 022109.
- Caticha, A. Entropic dynamics, time and quantum theory. J. Phys. A Math. Theor. 2011, 44, 225303.
- Masanes, L.; Müller, M.P.; Augusiak, R.; Pérez-García, D. Existence of an information unit as a postulate of quantum theory. Proc. Natl. Acad. Sci. USA 2013, 110, 16373–16377.
- De Raedt, H.; Katsnelson, M.I.; Michielsen, K. Quantum theory as plausible reasoning applied to data obtained by robust experiments. Philos. Trans. R. Soc. A Math. Phys. Eng. Sci. 2016, 374, 20150233.
- Höhn, P.A. Quantum Theory from Rules on Information Acquisition. Entropy 2017, 19, 98.
- Aravinda, S.; Srikanth, R.; Pathak, A. On the origin of nonclassicality in single systems. J. Phys. A Math. Theor. 2017, 50, 465303.
- Czekaj, L.; Horodecki, M.; Horodecki, P.; Horodecki, R. Information content of systems as a physical principle. Phys. Rev. A 2017, 95, 022119.
- Chiribella, G. Agents, Subsystems, and the Conservation of Information. Entropy 2018, 20, 358.
- Summhammer, J. Maximum predictive power and the superposition principle. Int. J. Theor. Phys. 1994, 33, 171–178.
- Summhammer, J. Maximum predictive power and the superposition principle. arXiv 1999, arXiv:quant-ph/9910039.
- Wootters, W.K. Communicating through Probabilities: Does Quantum Theory Optimize the Transfer of Information? Entropy 2013, 15, 3130–3147.
- Cover, T.M.; Thomas, J.A. Differential Entropy. In Elements of Information Theory; John Wiley & Sons, Ltd.: Hoboken, NJ, USA, 2005; Chapter 8, pp. 243–259.
- Jaynes, E.T. Information Theory and Statistical Mechanics. In Statistical Physics; Ford, K.W., Ed.; W. A. Benjamin, Inc.: Tokyo, Japan, 1963; pp. 181–218.
- Goyal, P. Prior Probabilities: An Information-Theoretic Approach. AIP Conf. Proc. 2005, 803, 366–373.
- Berger, J.O.; Bernardo, J.M. Ordered Group Reference Priors with Application to the Multinomial Problem. Biometrika 1992, 79, 25–37.
| | FoN (Numerical Result) | FoN (Asymptotic Result) | Discrepancy between the Two Results |
|---|---|---|---|
| −0.7 | 0 | 0 | 0 |
| −0.6 | 0.001 | 0 | |
| −0.5 | 0.013 | 0 | |
| −0.4 | 0.144 | 0.143 | |
| 0 | 0.334 | 0.333 | |
| 1 | 0.429 | 0.429 | 0 |
| 3 | 0.467 | 0.467 | 0 |
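The sketch below gives a rough sense of how the numerical column of such a table could be produced. It enumerates, for a prior proportional to f^a (1−f)^a (i.e., Beta(a+1, a+1)) and a fixed N, every record of n heads in N tosses together with both possible outcomes of the (N+1)th toss, and reports the fraction of cases whose differential information gain is negative. The equal weighting of cases, the choice of N, and the reading of the first table column as the prior exponent are assumptions made for illustration and may not match the definition used to generate the table.

```python
# Hedged sketch of a "fraction of negatives" (FoN) estimate.  Assumptions
# (mine, not stated in the table above): prior ~ f^a (1-f)^a, information
# = KL(posterior || prior), and every (n heads, next outcome) case counted
# with equal weight.
from scipy.special import betaln, digamma

def kl_beta(a1, b1, a2, b2):
    """KL divergence D( Beta(a1,b1) || Beta(a2,b2) ) in nats (closed form)."""
    return (betaln(a2, b2) - betaln(a1, b1)
            + (a1 - a2) * digamma(a1)
            + (b1 - b2) * digamma(b1)
            + (a2 - a1 + b2 - b1) * digamma(a1 + b1))

def fraction_of_negatives(a_exp, N):
    """Fraction of (n, next-outcome) cases with negative differential gain."""
    alpha = a_exp + 1.0          # Beta parameters implied by the prior f^a (1-f)^a
    negatives = total = 0
    for n in range(N + 1):                 # n heads among the first N tosses
        for head in (0, 1):                # outcome of the (N+1)-th toss
            before = kl_beta(alpha + n, alpha + N - n, alpha, alpha)
            after = kl_beta(alpha + n + head, alpha + N - n + (1 - head),
                            alpha, alpha)
            negatives += (after - before) < 0
            total += 1
    return negatives / total

if __name__ == "__main__":
    for a_exp in (-0.7, -0.5, 0.0, 1.0, 3.0):
        print(a_exp, round(fraction_of_negatives(a_exp, N=1000), 3))
```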
| Information Gain Measure | Asymptotic Form (N → ∞) | Asymptotic Sensitivity to Prior |
|---|---|---|
| Differential Information Gain | … | Heavily dependent upon the prior; independent of … for certain priors. |
| Relative Information Gain | … | Insensitive to the prior; for large N, affected only by …. |
| Type of Information Gain | Positivity | Robustness about … |
|---|---|---|
| Differential | Strictly positive when …, where …. Asymptotically positive when …. | Robustness exists only when … of the beta distribution prior. |
| Relative | Strictly positive for all priors. | No significant differences in robustness among beta distribution priors. |
| Expected | Strictly positive for all priors. | No significant differences in robustness among beta distribution priors. |