Exponential Strong Converse for Successive Refinement with Causal Decoder Side Information
Abstract
1. Introduction
1.1. Related Works
1.2. Main Contribution and Challenges
2. Problem Formulation and Existing Results
2.1. Notation
2.2. Problem Formulation
- k encoding functions, and
- decoding functions: one for each stage, applied causally at each time index (a generic sketch follows this list).
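For concreteness, here is a minimal sketch of such a code in the standard two-stage formulation with causal decoder side information; the symbols f_1, f_2, ψ_{1,i}, ψ_{2,i}, M_1, M_2 and the restriction to two stages are illustrative assumptions rather than the paper's own notation.

\[ f_1 : \mathcal{X}^n \to \mathcal{M}_1, \qquad f_2 : \mathcal{X}^n \to \mathcal{M}_2, \]
\[ \psi_{1,i} : \mathcal{M}_1 \times \mathcal{Y}^i \to \hat{\mathcal{X}}_1, \qquad \psi_{2,i} : \mathcal{M}_1 \times \mathcal{M}_2 \times \mathcal{Y}^i \to \hat{\mathcal{X}}_2, \qquad i \in \{1, \dots, n\}, \]

so that at each time i the decoders may use only the causal side information Y^i = (Y_1, …, Y_i) together with the received messages, and the rates are R_j = (1/n) log |\mathcal{M}_j| for j = 1, 2.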
2.3. Existing Results
3. Main Results
3.1. Preliminaries
- (i) For any rate-distortion tuple outside the rate-distortion region, i.e., …, we have …
- (ii) For any rate-distortion tuple inside the rate-distortion region, i.e., …, we have … (a hedged reading of both claims is sketched after this list).
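A hedged reading of the two claims, writing G(R_1, R_2, D_1, D_2 | P_{XY}) for the strong converse exponent and \mathcal{R}(P_{XY}) for the rate-distortion region (both symbols are assumed here for illustration), is the usual dichotomy:

\[ (R_1, R_2, D_1, D_2) \notin \mathcal{R}(P_{XY}) \;\Longrightarrow\; G(R_1, R_2, D_1, D_2 \mid P_{XY}) > 0, \]
\[ (R_1, R_2, D_1, D_2) \in \mathcal{R}(P_{XY}) \;\Longrightarrow\; G(R_1, R_2, D_1, D_2 \mid P_{XY}) = 0. \]

Under this reading, the positivity in claim (i) is exactly what makes the converse exponentially strong: outside the region, the probability that all distortion constraints are simultaneously satisfied decays exponentially fast in the blocklength.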
3.2. Main Result
4. Proof of the Non-Asymptotic Converse Bound (Theorem 2)
4.1. Preliminaries
4.2. Proof Steps of Theorem 2
5. Proof of Properties of Strong Converse Exponent: Proof of Lemma 1
5.1. Alternative Expressions for the Rate-Distortion Region
5.2. Proof of Claim (i)
- (i) For any rate-distortion tuple …,
- (ii) For any rate-distortion tuple outside the rate-distortion region, i.e., …, there exists … such that … (one plausible reading of claim (ii) is sketched after this list).
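One plausible reading of claim (ii), consistent with how closedness arguments of this kind usually proceed (the constant δ and the coordinate-wise perturbation below are assumptions made for illustration): since the rate-distortion region is a closed set, its complement is open, so a tuple lying strictly outside the region stays outside under a small enough perturbation,

\[ (R_1, R_2, D_1, D_2) \notin \mathcal{R}(P_{XY}) \;\Longrightarrow\; \exists\, \delta > 0 : \; (R_1 + \delta, R_2 + \delta, D_1 + \delta, D_2 + \delta) \notin \mathcal{R}(P_{XY}). \]

This slack is what typically allows one to conclude a strictly positive exponent rather than merely a non-vanishing bound.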
5.3. Proof of Claim (ii)
6. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Appendix A. Proof of Theorem 1
Appendix B. Proof of Lemma 2
Appendix C. Proof of Lemma A1
Appendix D. Proof of Lemma 3
Appendix E. Proof of Lemma 4
Appendix F. Proof of Lemma 6
Appendix F.1. Proof of Claim (i)
Appendix F.2. Proof of Claim (ii)
References