Information Theory Applications in Signal Processing
Author Contributions
Funding
Conflicts of Interest
References
- Shannon, C.E. A Mathematical Theory of Communication. Bell Syst. Tech. J. 1948, 27, 379–423.
- Cover, T.M.; Thomas, J.A. Elements of Information Theory (Wiley Series in Telecommunications and Signal Processing); Wiley-Interscience: New York, NY, USA, 2006.
- Verdú, S.; McLaughlin, S.W. (Eds.) Information Theory: 50 Years of Discovery; IEEE Press: Piscataway, NJ, USA, 2000.
- Scharf, L.L. Statistical Signal Processing: Detection, Estimation, and Time Series Analysis; Addison-Wesley: Boston, MA, USA, 1991.
- Verdú, S. The interplay between estimation theory and information theory. In Proceedings of the IEEE 6th Workshop on Signal Processing Advances in Wireless Communications, New York, NY, USA, 5–8 June 2005; pp. xxiv–xxv.
- Kay, S.M. Fundamentals of Statistical Signal Processing: Estimation Theory; Prentice-Hall Inc.: Upper Saddle River, NJ, USA, 1993.
- Kay, S.M. Fundamentals of Statistical Signal Processing: Detection Theory; Prentice-Hall Inc.: Upper Saddle River, NJ, USA, 1998.
- MacKay, D.J.C. Information Theory, Inference & Learning Algorithms; Cambridge University Press: New York, NY, USA, 2002.
- Rissanen, J. Information and Complexity in Statistical Modeling, 1st ed.; Springer: Berlin, Germany, 2007.
- Olshausen, B.; Field, D. Emergence of simple-cell receptive field properties by learning a sparse code for natural images. Nature 1996, 381, 607–609.
- Smith, E.; Lewicki, M. Efficient auditory coding. Nature 2006, 439, 978–982.
- Stilp, C.E.; Kluender, K.R. Cochlea-scaled entropy, not consonants, vowels, or time, best predicts speech intelligibility. Proc. Natl. Acad. Sci. USA 2010, 107, 12387–12392.
- Amari, S.I. Information Geometry and Its Applications, 1st ed.; Springer: Berlin, Germany, 2016.
- Tishby, N.; Zaslavsky, N. Deep learning and the information bottleneck principle. In Proceedings of the 2015 IEEE Information Theory Workshop (ITW), Jerusalem, Israel, 26 April–1 May 2015; pp. 1–5.
- Barron, A.R.; Klusowski, J.M. Complexity, Statistical Risk, and Metric Entropy of Deep Nets Using Total Path Variation. arXiv 2019, arXiv:1902.00800.
- Belda, J.; Vergara, L.; Safont, G.; Salazar, A. Computing the Partial Correlation of ICA Models for Non-Gaussian Graph Signal Processing. Entropy 2019, 21, 22.
- Sarmiento, A.; Fondón, I.; Durán-Díaz, I.; Cruces, S. Centroid-Based Clustering with αβ-Divergences. Entropy 2019, 21, 196.
- Delmaire, G.; Omidvar, M.; Puigt, M.; Ledoux, F.; Limem, A.; Roussel, G.; Courcot, D. Informed Weighted Non-Negative Matrix Factorization Using αβ-Divergence Applied to Source Apportionment. Entropy 2019, 21, 253.
- Pinchas, M. A New Efficient Expression for the Conditional Expectation of the Blind Adaptive Deconvolution Problem Valid for the Entire Range of Signal-to-Noise Ratio. Entropy 2019, 21, 72.
- Wu, B.; Gao, Y.; Feng, S.; Chanwimalueang, T. Sparse Optimistic Based on Lasso-LSQR and Minimum Entropy De-Convolution with FARIMA for the Remaining Useful Life Prediction of Machinery. Entropy 2018, 20, 747.
- Cichocki, A.; Cruces, S.; Amari, S.I. Generalized Alpha-Beta Divergences and Their Application to Robust Nonnegative Matrix Factorization. Entropy 2011, 13, 134–170.
- Vigneron, V.; Maaref, H. M-ary Rank Classifier Combination: A Binary Linear Programming Problem. Entropy 2019, 21, 440.
- Szczęsna, A. Quaternion Entropy for Analysis of Gait Data. Entropy 2019, 21, 79.
- Zhou, F.; Li, X.; Zhou, M.; Chen, Y.; Tan, H. A New Dictionary Construction Based Multimodal Medical Image Fusion Framework. Entropy 2019, 21, 267.
- Ballesteros, D.M.; Peña, J.; Renza, D. A Novel Image Encryption Scheme Based on Collatz Conjecture. Entropy 2018, 20, 901.
- Shen, S.; Yang, H.; Li, J.; Xu, G.; Sheng, M. Auditory Inspired Convolutional Neural Networks for Ship Type Classification with Raw Hydrophone Data. Entropy 2018, 20, 990.
- Feng, G.; Guo, W.; Liu, B. Achievable Rate Region under Linear Beamforming for Dual-Hop Multiple-Access Relay Network. Entropy 2018, 20, 547.
- Wang, M.; Wang, D. Sum-Rate of Multi-User MIMO Systems with Multi-Cell Pilot Contamination in Correlated Rayleigh Fading Channel. Entropy 2019, 21, 573.
- Zhang, A.; Ji, Z. New Construction of Maximum Distance Separable (MDS) Self-Dual Codes over Finite Fields. Entropy 2019, 21, 101.
- Wang, X.; Chang, H.; Li, J.; Cao, W.; Shan, L. Analysis of TDMP Algorithm of LDPC Codes Based on Density Evolution and Gaussian Approximation. Entropy 2019, 21, 457.
- Wang, B.; Chen, X.; Xin, F.; Song, X. SINR- and MI-Based Maximin Robust Waveform Design. Entropy 2019, 21, 33.
- Hao, T.; Cui, C.; Gong, Y. Efficient Low-PAR Waveform Design Method for Extended Target Estimation Based on Information Theory in Cognitive Radar. Entropy 2019, 21, 261.
- Wang, J.; Ding, Q. Dynamic Rounds Chaotic Block Cipher Based on Keyword Abstract Extraction. Entropy 2018, 20, 693.
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).