

Information Theory for Channel Coding

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (31 January 2021) | Viewed by 16196

Special Issue Editors


Dr. Balázs Matuz
Guest Editor
German Aerospace Center (DLR), Oberpfaffenhofen, 82234 Weßling, Germany
Interests: coding theory; communication systems; information theory

Dr. Alexey Frolov
Guest Editor
Skolkovo Institute of Science and Technology, 121205 Moscow, Russia
Interests: information theory; LDPC codes; polar codes; uncoordinated/random multiple access; coding for distributed and cloud storage systems; machine learning in communications

Special Issue Information

Dear Colleagues,

Information theory and channel coding are closely connected. While information theory is concerned, among other things, with the fundamental limits of communicating or storing information, channel coding aims at providing practical schemes that approach these limits. The vital interplay of the two domains has shaped today's communication and storage systems.

In recent years, information theorists have tackled a multitude of interesting new problems for which novel channel coding schemes have been devised. The purpose of this Special Issue is to shed light on these developments. Researchers are highly encouraged to submit their recent findings in the field of information and coding theory. Topics of submission include, but are not limited to, the following:

  • Code-based cryptosystems;
  • Compressed sensing and group testing;
  • Distributed storage and computing;
  • High-throughput communications;
  • Machine learning;
  • Multiuser and MIMO communications;
  • Random access;
  • Ultrareliable low-latency communications;
  • Small data communications.

Dr. Balázs Matuz
Dr. Alexey Frolov
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (6 papers)

Research

14 pages, 837 KiB  
Article
Improving Log-Likelihood Ratio Estimation with Bi-Gaussian Approximation under Multiuser Interference Scenarios
by Yu Fu and Hongwen Yang
Entropy 2021, 23(6), 784; https://doi.org/10.3390/e23060784 - 20 Jun 2021
Viewed by 3140
Abstract
Accurate estimation of the channel log-likelihood ratio (LLR) is crucial to the decoding of modern channel codes such as turbo, low-density parity-check (LDPC), and polar codes. Under an additive white Gaussian noise (AWGN) channel, the calculation of the LLR is relatively straightforward, since the closed-form expression for the channel likelihood function is perfectly known to the receiver. However, it is much more complicated for heterogeneous networks, where the global noise (i.e., noise plus interference) may be dominated by non-Gaussian interference with an unknown distribution. Although the LLR can still be calculated by approximating the distribution of the global noise as Gaussian, this causes a performance loss due to the non-Gaussian nature of the global noise. To address this problem, we propose to use a bi-Gaussian (BG) distribution to approximate the unknown distribution of the global noise; the two parameters of the BG distribution can easily be estimated from the second and fourth moments of the received signal, without any knowledge of the interfering channel state information (CSI) or signaling format. Simulation results indicate that the proposed BG approximation effectively improves the word error rate (WER) performance. The gain of the BG approximation over the Gaussian approximation depends heavily on the interference structure; for the scenario of a single BPSK interferer with a 5 dB interference-to-noise ratio (INR), we observed a gain of about 0.6 dB. The improved LLR estimation also accelerates the convergence of iterative decoding, thus lowering the overall decoding complexity, in general by 25-50%.
(This article belongs to the Special Issue Information Theory for Channel Coding)
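The abstract does not spell out the BG parameterization. As a rough sketch under our own illustrative assumption (not necessarily the authors' exact scheme), one can model the global noise as an equal-weight mixture of two zero-mean Gaussians and recover the two variances from the empirical second and fourth moments; the function names `fit_bi_gaussian` and `bg_llr` are hypothetical:

```python
import numpy as np

def fit_bi_gaussian(samples):
    """Moment-match an equal-weight, zero-mean two-component Gaussian mixture:
    E[n^2] = (v1+v2)/2 and E[n^4] = 3(v1^2+v2^2)/2 determine the variances."""
    m2 = np.mean(samples ** 2)
    m4 = np.mean(samples ** 4)
    s = 2.0 * m2            # v1 + v2
    q = 2.0 * m4 / 3.0      # v1^2 + v2^2
    disc = max(2.0 * q - s * s, 0.0)   # nonnegative iff kurtosis >= 3
    r = np.sqrt(disc)
    return sorted(((s - r) / 2.0, (s + r) / 2.0))

def bg_llr(y, v1, v2):
    """Channel LLR for BPSK (x = +/-1) under the fitted mixture density."""
    def f(n):
        return 0.5 * (np.exp(-n ** 2 / (2 * v1)) / np.sqrt(2 * np.pi * v1)
                      + np.exp(-n ** 2 / (2 * v2)) / np.sqrt(2 * np.pi * v2))
    return np.log(f(y - 1.0) / f(y + 1.0))
```

By symmetry of the mixture density, `bg_llr(0, v1, v2)` is zero, and the LLR saturates more slowly in the tails than the Gaussian-approximation LLR, which is where the claimed WER gain would come from.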

16 pages, 338 KiB  
Article
Error Exponents of LDPC Codes under Low-Complexity Decoding
by Pavel Rybin, Kirill Andreev and Victor Zyablov
Entropy 2021, 23(2), 253; https://doi.org/10.3390/e23020253 - 22 Feb 2021
Cited by 3 | Viewed by 2295
Abstract
This paper deals with a specific construction of binary low-density parity-check (LDPC) codes. We derive lower bounds on the error exponents of these codes transmitted over the memoryless binary symmetric channel (BSC), both for the well-known maximum-likelihood (ML) decoding and for the proposed low-complexity decoding algorithms. We prove the existence of LDPC codes from this construction for which the probability of erroneous decoding decreases exponentially with the code length at coding rates below the corresponding channel capacity. We also show that the obtained lower bound on the error exponent under ML decoding almost coincides with the error exponents of good linear codes.
(This article belongs to the Special Issue Information Theory for Channel Coding)
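As context for the comparison with "good linear codes": the classical Gallager random-coding exponent for the BSC (a standard benchmark, not the paper's bound) can be evaluated numerically as E_r(R) = max over rho in [0,1] of E_0(rho) - rho*R, with E_0(rho) = rho - (1+rho) log2(p^(1/(1+rho)) + (1-p)^(1/(1+rho))) for uniform inputs. A minimal sketch:

```python
import numpy as np

def random_coding_exponent(p, R, rhos=np.linspace(0.0, 1.0, 1001)):
    """Gallager random-coding exponent E_r(R) for the BSC(p) with uniform
    inputs, maximized over rho on a grid; rates in bits per channel use."""
    a = p ** (1.0 / (1.0 + rhos)) + (1.0 - p) ** (1.0 / (1.0 + rhos))
    e0 = rhos - (1.0 + rhos) * np.log2(a)   # Gallager E_0 for the BSC
    return float(np.max(e0 - rhos * R))
```

The exponent is positive for all rates below capacity and hits zero at capacity, which is the qualitative behavior the paper's lower bounds establish for the LDPC construction.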

17 pages, 366 KiB  
Article
Threshold Computation for Spatially Coupled Turbo-Like Codes on the AWGN Channel
by Muhammad Umar Farooq, Alexandre Graell i Amat and Michael Lentmaier
Entropy 2021, 23(2), 240; https://doi.org/10.3390/e23020240 - 19 Feb 2021
Cited by 4 | Viewed by 2178
Abstract
In this paper, we perform a belief propagation (BP) decoding threshold analysis of spatially coupled turbo-like codes (SC-TCs) on the additive white Gaussian noise (AWGN) channel. We review Monte Carlo density evolution (MC-DE) and efficient prediction methods, which determine the BP thresholds of SC-TCs over the AWGN channel. We demonstrate that, instead of performing time-consuming MC-DE computations, the BP threshold of SC-TCs over the AWGN channel can be predicted very efficiently from their binary erasure channel (BEC) thresholds. From the threshold results, we conjecture that the agreement between the MC-DE and predicted thresholds is related to the threshold saturation capability, as well as to the capacity-approaching maximum a posteriori (MAP) performance, of an SC-TC ensemble.
(This article belongs to the Special Issue Information Theory for Channel Coding)
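The prediction method in the paper is specific to SC-TC ensembles. Purely to illustrate what a BEC BP threshold is, here is a sketch of scalar density evolution for the regular (3,6) LDPC ensemble, whose threshold is known to be approximately 0.4294 (illustrative code, unrelated to the paper's MC-DE implementation):

```python
def bec_de_converges(eps, dv=3, dc=6, iters=5000, tol=1e-10):
    """BEC density evolution x <- eps * (1 - (1 - x)^(dc-1))^(dv-1);
    returns True if the erasure fraction converges to (near) zero."""
    x = eps
    for _ in range(iters):
        x = eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
        if x < tol:
            return True
    return False

def bp_threshold(dv=3, dc=6, lo=0.0, hi=1.0, steps=30):
    """Bisection on the channel erasure probability eps."""
    for _ in range(steps):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if bec_de_converges(mid, dv, dc) else (lo, mid)
    return lo
```

The threshold is the largest erasure probability at which this recursion still drives the erasure fraction to zero; the paper's point is that such BEC thresholds can stand in for far more expensive AWGN threshold computations for SC-TCs.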

17 pages, 334 KiB  
Article
Skew Convolutional Codes
by Vladimir Sidorenko, Wenhui Li, Onur Günlü and Gerhard Kramer
Entropy 2020, 22(12), 1364; https://doi.org/10.3390/e22121364 - 2 Dec 2020
Cited by 3 | Viewed by 2259
Abstract
We propose skew convolutional codes, a new class of convolutional codes that extends the class of classical fixed convolutional codes. Skew convolutional codes can be represented as periodic time-varying convolutional codes, yet they have a description as compact as that of fixed convolutional codes. We show designs of generator and parity-check matrices, encoders, and code trellises for skew convolutional codes and their duals. For memoryless channels, skew convolutional codes can be decoded with the Viterbi or BCJR algorithm, or with a dualized BCJR algorithm.
(This article belongs to the Special Issue Information Theory for Channel Coding)
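The trellis constructions in the paper are specific to skew codes. As a generic illustration of Viterbi decoding over a periodically time-varying trellis (a fixed convolutional code being the period-1 special case), one might sketch the following; the tap-mask encoding convention is our own:

```python
def conv_encode(bits, taps_per_time, mem=2):
    """Rate-1/n convolutional encoder whose generator taps may change
    periodically with time t (period = len(taps_per_time)); one entry
    reduces to a classical fixed code.  Each generator is a bit-mask
    over the register (current bit, then `mem` past bits)."""
    state, out, period = 0, [], len(taps_per_time)
    for t, b in enumerate(bits):
        reg = (b << mem) | state
        out.extend(bin(reg & g).count("1") & 1 for g in taps_per_time[t % period])
        state = reg >> 1            # keep the newest `mem` bits
    return out

def viterbi_decode(rx, taps_per_time, nbits, mem=2):
    """Hard-decision Viterbi decoding over the (possibly periodically
    time-varying) trellis defined by taps_per_time."""
    period, n_states, big = len(taps_per_time), 1 << mem, float("inf")
    n_out = len(taps_per_time[0])
    metric = [0.0] + [big] * (n_states - 1)       # start in the zero state
    paths = [[] for _ in range(n_states)]
    for t in range(nbits):
        r = rx[t * n_out:(t + 1) * n_out]
        nm = [big] * n_states
        npaths = [[] for _ in range(n_states)]
        for s in range(n_states):
            if metric[s] == big:
                continue
            for b in (0, 1):
                reg = (b << mem) | s
                exp = [bin(reg & g).count("1") & 1 for g in taps_per_time[t % period]]
                cand = metric[s] + sum(x != y for x, y in zip(exp, r))
                ns = reg >> 1
                if cand < nm[ns]:
                    nm[ns], npaths[ns] = cand, paths[s] + [b]
        metric, paths = nm, npaths
    return paths[min(range(n_states), key=metric.__getitem__)]
```

With `taps_per_time = [(0b111, 0b101)]` this is the classical memory-2 rate-1/2 code with free distance 5, so a single channel bit flip is always corrected; passing a longer tap list exercises the time-varying trellis that skew codes would use.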

7 pages, 323 KiB  
Article
Systematic Encoding and Shortening of PAC Codes
by Erdal Arıkan
Entropy 2020, 22(11), 1301; https://doi.org/10.3390/e22111301 - 15 Nov 2020
Cited by 11 | Viewed by 2760
Abstract
Polarization adjusted convolutional (PAC) codes are a class of codes that combine channel polarization with convolutional coding. PAC codes are of interest for their high performance. This paper presents a systematic encoding and shortening method for PAC codes. Systematic encoding is important for lowering the bit-error rate (BER) of PAC codes. Shortening is important for adjusting the block length of PAC codes. It is shown that systematic encoding and shortening of PAC codes can be carried out in a unified framework.
(This article belongs to the Special Issue Information Theory for Channel Coding)
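The abstract does not detail the encoder. For background, PAC codes build on the polar transform x = u F^{⊗n} with F = [[1,0],[1,1]] over GF(2), which is its own inverse modulo 2 — a property that systematic polar-style encoders exploit. A minimal sketch of the transform (illustrative, not the paper's algorithm):

```python
def polar_transform(u):
    """Apply x = u * F^{(x)n} over GF(2) with F = [[1,0],[1,1]],
    via in-place butterflies; len(u) must be a power of two."""
    x = list(u)
    n = len(x)
    assert n and n & (n - 1) == 0
    step = 1
    while step < n:
        for i in range(0, n, 2 * step):
            for j in range(i, i + step):
                x[j] ^= x[j + step]   # (a, b) -> (a XOR b, b)
        step *= 2
    return x
```

Because each butterfly stage is an involution and the stages act on distinct Kronecker factors, applying the transform twice recovers the input exactly, in O(N log N) XORs.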

19 pages, 314 KiB  
Article
Performance Analysis of Identification Codes
by Sencer Derebeyoğlu, Christian Deppe and Roberto Ferrara
Entropy 2020, 22(10), 1067; https://doi.org/10.3390/e22101067 - 23 Sep 2020
Cited by 23 | Viewed by 2356
Abstract
In this paper, we analyze the construction of identification codes. Identification codes are based on the question "Is the message I have just received the one I am interested in?", as opposed to Shannon's transmission task, where the receiver must be able to recover whichever message was sent, not only a particular one. The advantage of identification is that it allows rates growing doubly exponentially in the blocklength, at the cost of not being able to decode every message, which may be beneficial in certain applications. We focus on a special identification code construction based on two concatenated Reed-Solomon codes and take a closer look at its implementation, analyzing the trade-offs of identification with respect to transmission and the trade-offs introduced by the computational cost of identification codes.
(This article belongs to the Special Issue Information Theory for Channel Coding)
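The concatenated construction analyzed in the paper has specific parameters. As a stripped-down illustration of the underlying idea (a single Reed-Solomon-style tag rather than two concatenated codes; the function names and field size are our own), the sender transmits a random challenge together with the evaluation of its identity's polynomial at that challenge, and a receiver interested in a particular identity simply checks the tag:

```python
Q = 257  # illustrative prime field size

def tag(identity_coeffs, r):
    """Evaluate the identity's polynomial at challenge r over GF(Q),
    using Horner's rule; coeffs are [c0, c1, ...] for c0 + c1*x + ..."""
    acc = 0
    for c in reversed(identity_coeffs):
        acc = (acc * r + c) % Q
    return acc

def send(identity_coeffs, rng):
    """Sender: pick a random challenge and attach the identity's tag."""
    r = rng.randrange(Q)
    return r, tag(identity_coeffs, r)

def check(my_identity_coeffs, r, t):
    """Receiver accepts iff the tag matches its own identity.  Two distinct
    degree-<k polynomials agree on at most k-1 of the Q challenges, so a
    false positive occurs with probability at most (k-1)/Q."""
    return tag(my_identity_coeffs, r) == t
```

Only (challenge, tag) pairs of a few field symbols are transmitted regardless of how many identities exist, which is the source of the doubly exponential identification rate; the price is the small false-positive probability noted in the comment.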
