Statistical Communication and Information Theory

A special issue of Information (ISSN 2078-2489). This special issue belongs to the section "Information Theory and Methodology".

Deadline for manuscript submissions: closed (28 February 2021) | Viewed by 13137

Special Issue Editor


Guest Editor
Faculty of Engineering, Bar-Ilan University, Ramat-Gan 52900, Israel
Interests: Statistical Communication; Information Theory

Special Issue Information

Dear Colleagues,

The MDPI journal Information is inviting submissions to a Special Issue on “Statistical Communication and Information Theory”.

Information Theory has become a prominent analysis tool in various research fields.

This trend may be attributed to the growing interest in the rigorous justification of experimental results obtained for specific models, and to the desire to establish performance bounds for related system models.

This raises several challenges, such as the refinement of current analysis tools, the extension of existing results to new or more complicated models and, most importantly, the search for new methods that are viable for the treatment of as-yet unsolved or new problems.

This Special Issue is concerned with previously unpublished contributions in areas related to information theory, with a particular emphasis on the following aspects:

Topics of call

  • Communication and storage coding
  • Coding theory
  • Combinatorics and information theory
  • Communication theory
  • Cryptography and security
  • Information theory and statistics
  • Network information theory
  • Shannon theory
  • Source coding and data compression
  • Wireless communication

Dr. Shraga I. Bross
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Information is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies is available on MDPI's website.

Published Papers (5 papers)


Research

18 pages, 1034 KiB  
Article
Distributed Hypothesis Testing over Noisy Broadcast Channels
by Sadaf Salehkalaibar and Michèle Wigger
Information 2021, 12(7), 268; https://doi.org/10.3390/info12070268 - 29 Jun 2021
Cited by 3 | Viewed by 2074
Abstract
This paper studies binary hypothesis testing with a single sensor that communicates with two decision centers over a memoryless broadcast channel. The main focus lies on the tradeoff between the two type-II error exponents achievable at the two decision centers. In our proposed scheme, we can partially mitigate this tradeoff when the transmitter can distinguish, with probability larger than 1/2, the alternative hypotheses at the decision centers, i.e., the hypotheses under which the decision centers wish to maximize their error exponents. In the cases where these hypotheses cannot be distinguished at the transmitter (because both decision centers have the same alternative hypothesis or because the transmitter's observations have the same marginal distribution under both hypotheses), our scheme shows an important tradeoff between the two exponents. The results in this paper thus reinforce the previous conclusions drawn for a setup where communication is over a common noiseless link. Compared to such a noiseless scenario, here, however, we observe that even when the transmitter can distinguish the two hypotheses, a small exponent tradeoff can persist, simply because the noise in the channel prevents the transmitter from perfectly describing its guess of the hypothesis to the two decision centers.
(This article belongs to the Special Issue Statistical Communication and Information Theory)
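For context on the exponents discussed in the abstract above: in the centralized version of binary hypothesis testing, Stein's lemma identifies the best achievable type-II error exponent with the relative entropy D(P0 || P1). A minimal Python sketch of this benchmark quantity (the two distributions are illustrative assumptions, not taken from the paper):

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(P || Q) in nats, for distributions
    given as lists of probabilities over the same alphabet."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical binary source: P0 under the null hypothesis, P1 under
# the alternative.
P0 = [0.5, 0.5]
P1 = [0.8, 0.2]

# By Stein's lemma, D(P0 || P1) is the best type-II error exponent with
# centralized observations; the distributed, noisy-channel setting of
# the paper generally achieves less, with a tradeoff between centers.
exponent = kl_divergence(P0, P1)
print(round(exponent, 4))  # 0.2231 nats
```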

20 pages, 380 KiB  
Article
On Two-Stage Guessing
by Robert Graczyk and Igal Sason
Information 2021, 12(4), 159; https://doi.org/10.3390/info12040159 - 9 Apr 2021
Cited by 2 | Viewed by 3719
Abstract
Stationary memoryless sources produce two correlated random sequences X^n and Y^n. A guesser seeks to recover X^n in two stages, by first guessing Y^n and then X^n. The contributions of this work are twofold: (1) We characterize the least achievable exponential growth rate (in n) of any positive ρ-th moment of the total number of guesses when Y^n is obtained by applying a deterministic function f component-wise to X^n. We prove that, depending on f, the least exponential growth rate in the two-stage setup is lower than when guessing X^n directly. We further propose a simple Huffman code-based construction of a function f that is a viable candidate for the minimization of the least exponential growth rate in the two-stage guessing setup. (2) We characterize the least achievable exponential growth rate of the ρ-th moment of the total number of guesses required to recover X^n when Stage 1 need not end with a correct guess of Y^n and without assumptions on the stationary memoryless sources producing X^n and Y^n.
(This article belongs to the Special Issue Statistical Communication and Information Theory)
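The two-stage setup can be made concrete with a toy computation. The sketch below is a hedged illustration, not the paper's construction: it compares the expected number of guesses (the ρ = 1 moment) for direct guessing of X against a two-stage guesser that first resolves Y = f(X); the 4-symbol source and the map f are invented for illustration.

```python
def guessing_moment(probs, rho=1.0):
    """rho-th moment of the guess count for the optimal guesser,
    which queries values in decreasing order of probability."""
    ordered = sorted(probs, reverse=True)
    return sum(p * (i + 1) ** rho for i, p in enumerate(ordered))

# Hypothetical 4-symbol source X and a deterministic map f: Y = f(X).
pX = {0: 0.4, 1: 0.3, 2: 0.2, 3: 0.1}
f = {0: 'a', 1: 'a', 2: 'b', 3: 'b'}

# Stage 1: guess Y = f(X); Stage 2: guess X within the preimage of Y.
pY = {}
for x, p in pX.items():
    pY[f[x]] = pY.get(f[x], 0.0) + p
rank_Y = {y: r + 1 for r, y in enumerate(sorted(pY, key=pY.get, reverse=True))}

total = 0.0  # expected total guesses over both stages (rho = 1)
for y in pY:
    preimage = {x: pX[x] for x in pX if f[x] == y}
    rank_X = {x: r + 1 for r, x in enumerate(
        sorted(preimage, key=preimage.get, reverse=True))}
    for x, p in preimage.items():
        total += p * (rank_Y[y] + rank_X[x])

print(guessing_moment(pX.values()))  # direct guessing of X
print(total)                         # two-stage guessing via f
```

For this particular f, the two-stage guesser needs 2.7 guesses on average versus 2.0 for direct guessing; the paper's results concern the exponential growth rate of such moments as the block length n grows, where the comparison can go the other way.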

42 pages, 632 KiB  
Article
Information Bottleneck for a Rayleigh Fading MIMO Channel with an Oblivious Relay
by Hao Xu, Tianyu Yang, Giuseppe Caire and Shlomo Shamai (Shitz)
Information 2021, 12(4), 155; https://doi.org/10.3390/info12040155 - 8 Apr 2021
Cited by 7 | Viewed by 2472
Abstract
This paper considers the information bottleneck (IB) problem of a Rayleigh fading multiple-input multiple-output (MIMO) channel with an oblivious relay. The relay is constrained to operate without knowledge of the codebooks, i.e., it performs oblivious processing. Moreover, due to the bottleneck constraint, it is impossible for the relay to inform the destination node of the perfect channel state information (CSI) in each channel realization. To evaluate the bottleneck rate, we first provide an upper bound by assuming that the destination node can obtain perfect CSI at no cost. Then, we provide four achievable schemes, where each scheme satisfies the bottleneck constraint and gives a lower bound to the bottleneck rate. In the first and second schemes, the relay splits the capacity of the relay–destination link into two parts and conveys both the CSI and its observation to the destination node. Due to CSI transmission, the performance of these two schemes is sensitive to the MIMO channel dimension, especially the channel input dimension. To ensure that they still perform well when the channel dimension grows large, in the third and fourth achievable schemes, the relay only transmits compressed observations to the destination node. Numerical results show that, with simple symbol-by-symbol oblivious relay processing and compression, the proposed achievable schemes work well and demonstrate lower bounds that come quite close to the upper bound over a wide range of relevant system parameters.
(This article belongs to the Special Issue Statistical Communication and Information Theory)
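As a rough companion to the upper bound mentioned in the abstract (destination with perfect CSI and no bottleneck), here is a hedged Monte Carlo sketch of the perfect-CSI ergodic rate of an i.i.d. Rayleigh MIMO channel. The 2x2 dimension, SNR value, and sample count are illustrative assumptions, not values from the paper.

```python
import math
import random

def rayleigh_mimo_rate(snr, n_samples=20000, seed=0):
    """Monte Carlo estimate of E[log2 det(I + (snr/2) H H^H)]
    for a 2x2 i.i.d. Rayleigh fading channel, i.e., the ergodic
    rate with perfect CSI at the receiver and equal power split."""
    rng = random.Random(seed)

    def cn():  # one CN(0, 1) entry
        s = math.sqrt(0.5)
        return complex(rng.gauss(0, s), rng.gauss(0, s))

    total = 0.0
    for _ in range(n_samples):
        h = [[cn(), cn()], [cn(), cn()]]
        # G = (snr/2) * H H^H, a 2x2 Hermitian positive semidefinite matrix
        g = [[sum((snr / 2) * h[i][k] * h[j][k].conjugate() for k in range(2))
              for j in range(2)] for i in range(2)]
        # det(I + G) for a 2x2 matrix, real and positive here
        det = (1 + g[0][0]) * (1 + g[1][1]) - g[0][1] * g[1][0]
        total += math.log2(abs(det))
    return total / n_samples

print(round(rayleigh_mimo_rate(10.0), 2))  # roughly 5-6 bits per channel use
```

Any bottleneck-constrained scheme of the kind studied in the paper is bounded above by this quantity (further capped by the bottleneck capacity itself).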

16 pages, 478 KiB  
Article
Computational Techniques for Investigating Information Theoretic Limits of Information Systems
by Chao Tian, James S. Plank, Brent Hurst and Ruida Zhou
Information 2021, 12(2), 82; https://doi.org/10.3390/info12020082 - 16 Feb 2021
Cited by 3 | Viewed by 2580
Abstract
Computer-aided methods, based on the entropic linear program framework, have been shown to be effective in assisting the study of information theoretic fundamental limits of information systems. One key element that significantly impacts their computation efficiency and applicability is the reduction of variables, based on problem-specific symmetry and dependence relations. In this work, we propose using the disjoint-set data structure to algorithmically identify the reduction mapping, instead of relying on exhaustive enumeration in the equivalence classification. Based on this reduced linear program, we consider four techniques to investigate the fundamental limits of information systems: (1) computing an outer bound for a given linear combination of information measures and providing the values of information measures at the optimal solution; (2) efficiently computing a polytope tradeoff outer bound between two information quantities; (3) producing a proof (as a weighted sum of known information inequalities) for a computed outer bound; and (4) providing the range for information quantities between which the optimal value does not change, i.e., sensitivity analysis. A toolbox, with an efficient JSON format input frontend, and either Gurobi or CPLEX as the linear program solving engine, is implemented and open-sourced.
(This article belongs to the Special Issue Statistical Communication and Information Theory)
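The reduction step described in the abstract relies on the disjoint-set (union-find) data structure to merge LP variables that symmetry or dependence relations prove equal. A minimal sketch, with an invented six-variable toy instance rather than an actual entropic LP:

```python
class DisjointSet:
    """Union-find with path compression, used here to merge LP
    variables shown equal by symmetry/dependence relations."""
    def __init__(self, n):
        self.parent = list(range(n))

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path compression
            x = self.parent[x]
        return x

    def union(self, x, y):
        self.parent[self.find(x)] = self.find(y)

# Hypothetical toy reduction: six LP variables, where relations imply
# variables 0 ~ 1 ~ 2 and 3 ~ 4 coincide; variable 5 stays on its own.
ds = DisjointSet(6)
for a, b in [(0, 1), (1, 2), (3, 4)]:
    ds.union(a, b)

# Reduction mapping: each variable -> index of its canonical representative.
roots = sorted({ds.find(i) for i in range(6)})
mapping = {i: roots.index(ds.find(i)) for i in range(6)}
print(mapping)     # {0: 0, 1: 0, 2: 0, 3: 1, 4: 1, 5: 2}
print(len(roots))  # 3 reduced variables instead of 6
```

Each union/find is effectively constant time, which is what lets this replace exhaustive enumeration in the equivalence classification.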

18 pages, 315 KiB  
Article
Source Coding with a Causal Helper
by Shraga I. Bross
Information 2020, 11(12), 553; https://doi.org/10.3390/info11120553 - 27 Nov 2020
Cited by 3 | Viewed by 1523
Abstract
A multi-terminal network in which an encoder, assisted by a side-information-aided helper, describes a memoryless identically distributed source to a receiver is considered. The encoder provides a non-causal one-shot description of the source to both the helper and the receiver. The helper, which has access to causal side-information, describes the source to the receiver sequentially by sending a sequence of causal descriptions depending on the message conveyed by the encoder and the side-information subsequence it has observed so far. The receiver reconstructs the source causally by producing at each time unit an estimate of the current source symbol based on what it has received so far. Given a reconstruction fidelity measure and a maximal allowed distortion, we derive the rates-distortion region for this setting and express it in terms of an auxiliary random variable. When the source and side-information are drawn from an independent identically distributed Gaussian law and the fidelity measure is the squared-error distortion, we show that for the evaluation of the rates-distortion region it suffices to choose the auxiliary random variable to be jointly Gaussian with the source and side-information pair.
(This article belongs to the Special Issue Statistical Communication and Information Theory)
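The Gaussian result quoted above (a jointly Gaussian auxiliary suffices) builds on the classical quadratic-Gaussian baseline, where the rate-distortion function has a closed form. A minimal sketch of that baseline, with illustrative variance and distortion values not taken from the paper:

```python
import math

def gaussian_rd(variance, distortion):
    """Rate-distortion function of a memoryless Gaussian source under
    squared-error distortion: R(D) = max(0, 0.5 * log2(var / D))."""
    if distortion >= variance:
        return 0.0  # zero rate suffices: output the mean
    return 0.5 * math.log2(variance / distortion)

# Hypothetical unit-variance source: halving the distortion from 0.25
# to 0.125 costs exactly half a bit per source symbol.
print(gaussian_rd(1.0, 0.25))   # 1.0 bit/symbol
print(gaussian_rd(1.0, 0.125))  # 1.5 bits/symbol
```

In the helper setting of the paper, the region is multi-dimensional (one rate per link), but the same logarithmic variance-ratio structure appears once the auxiliary variable is taken jointly Gaussian.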
