Combinatorial Aspects of Shannon Theory

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (31 August 2021) | Viewed by 4651

Special Issue Editors


Prof. Dr. Amos Lapidoth
Guest Editor
Signal and Information Processing Laboratory, ETH Zurich, 8092 Zurich, Switzerland
Interests: multi-terminal information theory; the role of feedback in communication; digital communication; fading channels; optical communication

Dr. Or Ordentlich
Guest Editor
School of Computer Science and Engineering, Hebrew University of Jerusalem, Edmond J. Safra Campus, Jerusalem 91904, Israel
Interests: information theory; digital communication; compression; data science

Prof. Dr. Ofer Shayevitz
Guest Editor
Department of Electrical Engineering Systems, Tel Aviv University, P.O. Box 39040, Tel Aviv 6997801, Israel
Interests: information theory; communication; data compression; statistical inference; data science

Special Issue Information

Dear Colleagues,

Combinatorial tools have played a key role in information theory since as early as Shannon's 1948 paper, which used counting techniques to study constrained coding and random coding (an early instance of the probabilistic method) to prove the channel coding theorem. Information theory, in turn, has inspired work in combinatorics, inter alia, through Shannon's work on the zero-error capacity and subsequent work on error-free source coding and communication. This cross-fertilization has continued unabated over the years and has led to numerous results in both fields.

It is the purpose of this Special Issue to explore recent developments at the interface between the two fields. While appreciating the impact that combinatorics has had on code construction, our focus is more on Shannon theory than on coding theory. On the combinatorics side, we seek results where information theory plays a key role either in the formulation or in the solution. Topics of interest include, but are not limited to:

  • Zero-error capacity (constrained or unconstrained);
  • Zero-error list capacity;
  • Erasures-only/zero undetected errors capacity;
  • The capacity of deletion/insertion/substitution channels;
  • Graph-theoretic aspects of information theory;
  • Information theoretic proofs in extremal combinatorics;
  • Noisy Boolean functions in information theory.

Prof. Dr. Amos Lapidoth
Dr. Or Ordentlich
Prof. Dr. Ofer Shayevitz
Guest Editors

Manuscript Submission Information 

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • zero-error and erasures-only capacity
  • noisy Boolean functions
  • extremal combinatorics
  • the capacity of deletion/insertion/substitution channels

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (2 papers)


Research

25 pages, 402 KiB  
Article
On the Non-Adaptive Zero-Error Capacity of the Discrete Memoryless Two-Way Channel
by Yujie Gu and Ofer Shayevitz
Entropy 2021, 23(11), 1518; https://doi.org/10.3390/e23111518 - 15 Nov 2021
Cited by 2 | Viewed by 1612
Abstract
We study the problem of communicating over a discrete memoryless two-way channel using non-adaptive schemes, under a zero probability of error criterion. We derive single-letter inner and outer bounds for the zero-error capacity region, based on random coding, linear programming, linear codes, and the asymptotic spectrum of graphs. Among others, we provide a single-letter outer bound based on a combination of Shannon’s vanishing-error capacity region and a two-way analogue of the linear programming bound for point-to-point channels, which, in contrast to the one-way case, is generally better than both. Moreover, we establish an outer bound for the zero-error capacity region of a two-way channel via the asymptotic spectrum of graphs, and show that this bound can be achieved in certain cases.
(This article belongs to the Special Issue Combinatorial Aspects of Shannon Theory)
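As background for the graph-theoretic bounds mentioned in the abstract above, the point-to-point zero-error capacity admits the following standard characterization; this is textbook material and not a formula taken from the paper itself. Writing $G$ for the confusability graph of the channel, whose vertices are the input symbols and in which two inputs are adjacent whenever some output has positive probability under both,

\[
  C_0 \;=\; \lim_{n\to\infty} \frac{1}{n}\,\log \alpha\!\left(G^{\boxtimes n}\right)
      \;=\; \sup_{n\ge 1} \frac{1}{n}\,\log \alpha\!\left(G^{\boxtimes n}\right),
\]

where $\alpha(\cdot)$ denotes the independence number and $G^{\boxtimes n}$ the $n$-fold strong product of $G$ with itself. The limit equals the supremum by Fekete's lemma, since $\alpha$ is supermultiplicative under the strong product.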
14 pages, 337 KiB  
Article
A Generalized Information-Theoretic Approach for Bounding the Number of Independent Sets in Bipartite Graphs
by Igal Sason
Entropy 2021, 23(3), 270; https://doi.org/10.3390/e23030270 - 25 Feb 2021
Cited by 4 | Viewed by 1952
Abstract
This paper studies the problem of upper bounding the number of independent sets in a graph, expressed in terms of its degree distribution. For bipartite regular graphs, Kahn (2001) established a tight upper bound using an information-theoretic approach, and he also conjectured an upper bound for general graphs. His conjectured bound was recently proved by Sah et al. (2019), using different techniques not involving information theory. The main contribution of this work is the extension of Kahn’s information-theoretic proof technique to handle irregular bipartite graphs. In particular, when the bipartite graph is regular on one side, but may be irregular on the other, the extended entropy-based proof technique yields the same bound as was conjectured by Kahn (2001) and proved by Sah et al. (2019).
(This article belongs to the Special Issue Combinatorial Aspects of Shannon Theory)
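For orientation, the bounds discussed in this abstract can be stated as follows; this is a standard formulation of Kahn's theorem and of the conjecture proved by Sah et al., not a quotation from the paper. For a $d$-regular bipartite graph $G$ on $N$ vertices, the number of independent sets $i(G)$ satisfies

\[
  i(G) \;\le\; \left(2^{d+1}-1\right)^{N/(2d)},
\]

with equality when $G$ is a disjoint union of copies of $K_{d,d}$. The extension to an arbitrary graph $G$, conjectured by Kahn and proved by Sah et al. (2019), reads

\[
  i(G) \;\le\; \prod_{uv\in E(G)} i\!\left(K_{d(u),d(v)}\right)^{\frac{1}{d(u)\,d(v)}},
  \qquad i\!\left(K_{a,b}\right) = 2^{a} + 2^{b} - 1,
\]

where $d(v)$ denotes the degree of vertex $v$.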