Information Theoretic Measures and Their Applications II
A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".
Deadline for manuscript submissions: closed (15 July 2022) | Viewed by 14760
Special Issue Editors
Dr. Osvaldo Anibal Rosso
Interests: time-series analysis; information theory; time–frequency transform; wavelet transform; entropy and complexity; non-linear dynamics and chaos; complex networks; medical and biological applications
Dr. Fernando Montani
Interests: time-series analysis; information theory; brain and neuronal dynamics; neural coding; entropy and complexity; nonlinear dynamics and chaos; complex networks; medical and biological applications
Special Issue Information
Dear Colleagues,
The evaluation of information-theoretic quantifiers presupposes some prior knowledge about the system; specifically, a probability distribution function (PDF) associated with the time series under analysis must be provided beforehand. Determining the most adequate PDF is a fundamental problem, because the PDF P and the sample space Ω are inextricably linked. Many methods have been proposed for a proper selection of the probability space (Ω, P).
Among others, we can mention frequency counting; procedures based on amplitude statistics; binary symbolic dynamics; Fourier analysis; the Gabor and wavelet transforms; and permutation patterns. The suitability of each methodology depends on the peculiarities of the data, such as stationarity, series length, parameter variation, and the level of noise contamination. In all these cases, global aspects of the dynamics can somehow be captured, but the different approaches are not equivalent in their ability to discern all relevant physical details.
Usual methodologies assign a symbol from a finite alphabet A to each time point of the series X(t), thus creating a symbolic sequence that can be regarded as a non-causal coarse-grained description of the time series under consideration. As a consequence, order relations and the time scales of the dynamics are lost. The usual histogram technique corresponds to this kind of assignment. Time-causal information can be duly incorporated if information about the past dynamics of the system is included in the symbolic sequence, i.e., if symbols of alphabet A are assigned to a portion of the phase space or trajectory.
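To make the histogram assignment above concrete, the following minimal Python sketch (NumPy assumed; the function name and the default of 16 amplitude bins are illustrative choices, not prescriptions) estimates a PDF by amplitude binning and evaluates its Shannon entropy:

```python
import numpy as np

def histogram_entropy(x, n_bins=16):
    """Shannon entropy (in bits) of a series via an amplitude histogram.

    This is the non-causal assignment described above: each time point is
    mapped to the symbol of its amplitude bin, so all ordering and
    time-scale information is discarded.
    """
    counts, _ = np.histogram(x, bins=n_bins)
    p = counts / counts.sum()        # empirical PDF over the symbol alphabet
    p = p[p > 0]                     # drop empty bins (0 * log 0 := 0)
    return -np.sum(p * np.log2(p))

# Uniform noise fills the bins evenly, so the entropy approaches log2(n_bins):
rng = np.random.default_rng(0)
print(histogram_entropy(rng.uniform(size=10_000)))   # close to 4 bits
```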
In particular, Bandt and Pompe (BP), in their seminal work “Permutation Entropy: A Natural Complexity Measure for Time Series” (Phys. Rev. Lett. 2002, 88, 174102), introduced a simple and robust symbolic methodology that takes into account the time causality of the time series (a causal coarse-grained methodology) by comparing neighboring values. The symbolic data are (i) created by ranking the values of the series and (ii) defined by reordering the embedded data in ascending order, which is tantamount to a phase-space reconstruction with embedding dimension (pattern length) D ≥ 2, D ∊ N, and time lag τ ∊ N. In this way, it is possible to quantify the diversity of the ordering symbols (patterns) derived from a scalar time series. Note that the appropriate symbol sequence arises naturally from the time series, and no model-based assumptions are needed. In fact, the necessary “partitions” are devised by comparing the order of neighboring relative values rather than by apportioning amplitudes according to different levels. This technique, as opposed to most in current practice, takes into account the temporal structure of the time series generated by the physical process under study. As such, it allows us to uncover important details concerning the ordinal structure of the time series and can also yield information about temporal correlations.
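The BP recipe is short enough to state in code. The following Python sketch (illustrative only; the function name and the defaults D = 3, τ = 1 are our choices) counts the ordinal patterns and returns the permutation entropy normalized by its maximum, log2(D!):

```python
import math
import numpy as np

def permutation_entropy(x, D=3, tau=1):
    """Normalized Bandt-Pompe permutation entropy of a 1-D series."""
    n = len(x) - (D - 1) * tau                    # number of embedding windows
    counts = {}
    for i in range(n):
        window = x[i:i + (D - 1) * tau + 1:tau]   # D values spaced tau apart
        pattern = tuple(np.argsort(window))       # rank order = ordinal pattern
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float) / n
    return -np.sum(p * np.log2(p)) / math.log2(math.factorial(D))

rng = np.random.default_rng(0)
print(permutation_entropy(rng.normal(size=10_000)))   # white noise: close to 1
```

For fully random data, all D! patterns are (asymptotically) equally likely and the normalized entropy approaches 1; for a monotonic ramp, a single pattern occurs and it is 0.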
Furthermore, the ordinal patterns associated with the Bandt–Pompe PDF are invariant with respect to nonlinear monotonic transformations. Accordingly, nonlinear drifts or scalings artificially introduced by a measurement device will not modify the estimation of the quantifiers, a desirable property when dealing with experimental data.
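This invariance is easy to verify numerically. The short check below (with D = 3 and τ = 1 chosen arbitrarily) confirms that a strictly increasing transform such as exp leaves the pattern sequence, and hence the BP PDF, unchanged:

```python
import numpy as np

D, tau = 3, 1
rng = np.random.default_rng(1)
x = rng.normal(size=1_000)

def patterns(s):
    """Sequence of ordinal patterns of a series, as in the BP recipe."""
    return [tuple(np.argsort(s[i:i + (D - 1) * tau + 1:tau]))
            for i in range(len(s) - (D - 1) * tau)]

# exp is strictly increasing, so every window keeps its rank order:
assert patterns(x) == patterns(np.exp(x))
```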
Recent approaches to obtaining knowledge about time series dynamics consider the creation of graphs based on the observed ordinal patterns of a given time series. These graphs are constructed by transforming the time series into its sequence of ordinal patterns and accounting for the transitions between consecutive patterns. Each of the D! possible ordinal patterns is a vertex in the graph, and a directed edge connects two ordinal patterns if they appear consecutively in the time series. Each edge represents a transition between patterns, hence the name “ordinal patterns transition graphs”, denoted by Gπ. The study of time series via their transformation into graphs is a very successful strategy. Some notable examples are the visibility graph (VG) and the horizontal visibility graph (HVG). In these approaches, each point in the data series is a vertex in the graph, and two vertices are connected by an edge if they satisfy the visibility criterion, i.e., if a straight line can be traced between the two data points without intersecting intermediate points or, more strictly for the HVG, a horizontal line. However, visibility graphs grow with the length of the series (one vertex per data point), whereas an ordinal patterns transition graph has at most D! vertices regardless of the series length, which makes it far more tractable for long series. Analysis of these graphs is usually performed on their structure, accounting for both graph measures and information quantifiers (such as the probability of self-transition in the case of Gπ, and the node degree distribution in the case of the VG and HVG) that can be used for properly characterizing time series as well as precisely distinguishing among different dynamics.
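As a sketch of the Gπ construction (again illustrative Python, not a reference implementation), the following function returns the empirical transition probabilities between consecutive ordinal patterns, from which quantifiers such as the probability of self-transition follow directly:

```python
import numpy as np
from collections import Counter

def ordinal_transition_graph(x, D=3, tau=1):
    """Weighted directed edges of the ordinal patterns transition graph."""
    n = len(x) - (D - 1) * tau
    seq = [tuple(np.argsort(x[i:i + (D - 1) * tau + 1:tau]))
           for i in range(n)]
    edges = Counter(zip(seq[:-1], seq[1:]))        # consecutive-pattern pairs
    total = sum(edges.values())
    return {edge: w / total for edge, w in edges.items()}

# Probability of self-transition, one of the graph quantifiers mentioned above:
g = ordinal_transition_graph(np.random.default_rng(0).normal(size=10_000))
print(sum(w for (u, v), w in g.items() if u == v))
```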
In relation to other quantifiers, we can mention those based on mutual information, which rigorously quantifies, in units known as “bits”, how much information the value of one variable reveals about the value of another. It is a dimensionless quantity that can be thought of as the reduction in uncertainty about one random variable given knowledge of another. Recent advances in machine learning have shown the relevance of deep learning methodologies, as these approaches have the capability to reshape the way we understand neuronal systems: the relevant computations are learned from experience rather than specified by the researcher. In deep learning, the most common use of information theory is to characterize probability distributions and to quantify the similarity between two of them, using concepts such as the Kullback–Leibler (KL) divergence and the Jensen–Shannon divergence. Mutual information is the KL divergence between the joint distribution P(X,Y) and the factorized distribution P(X)P(Y). The information bottleneck method is a theoretical principle for extracting the relevant information that an input variable X contains about an output variable Y. Given their joint distribution P(X,Y), the relevant information is defined as the mutual information I(X;Y). It is assumed that there is a statistical relationship between X and Y, and that this relationship implicitly determines which characteristics of X are relevant. An optimal representation X* would capture all the information useful for predicting Y and would discard all other, irrelevant information. Importantly, this kind of measure can be used to study deep neural networks, and it can help to infer the complexity of an information flow and of learning, understood as the acquisition and processing of information.
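The identity between mutual information and the KL divergence can likewise be illustrated with a simple binned estimator (a sketch under simplifying assumptions; histogram binning is a crude estimator, and the bin count is arbitrary):

```python
import numpy as np

def mutual_information(x, y, n_bins=8):
    """I(X;Y) in bits: the KL divergence between P(x, y) and P(x)P(y)."""
    joint, _, _ = np.histogram2d(x, y, bins=n_bins)
    pxy = joint / joint.sum()                      # joint distribution P(x, y)
    px = pxy.sum(axis=1, keepdims=True)            # marginal P(x)
    py = pxy.sum(axis=0, keepdims=True)            # marginal P(y)
    mask = pxy > 0                                 # avoid log(0) terms
    return np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask]))

rng = np.random.default_rng(0)
x = rng.normal(size=20_000)
print(mutual_information(x, rng.normal(size=20_000)))  # independent: ~0 bits
print(mutual_information(x, x))                        # y = x: entropy of binned X
```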
Among the most recent entropy proposals, we can mention approximate entropy; sample entropy; delayed permutation entropy; the conditional entropy bottleneck method; and permutation min-entropy. Beyond these, other methodologies have been used to understand the mechanisms behind information processing, among them methods of frequency analysis such as the wavelet transform (WT), which stands out for its high efficiency in feature extraction. Wavelet analysis is an appropriate mathematical tool for analyzing signals in the time and frequency domains simultaneously. All these measures have important applications not only in physics but also in quite distinct areas, such as biology, medicine, economics, cognitive science, numerical and computational sciences, big data analysis, complex networks, deep learning, and neuroscience.
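For concreteness, a minimal Python sketch of one of these proposals, sample entropy SampEn(m, r), follows. The tolerance default r = 0.2 times the standard deviation is a common convention, and the O(n²) implementation is illustrative rather than optimized:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) of a 1-D series."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)                # common tolerance choice

    def count_matches(length):
        n_templates = len(x) - m           # same count for both lengths
        templates = np.array([x[i:i + length] for i in range(n_templates)])
        total = 0
        for i in range(n_templates - 1):
            # Chebyshev distance from template i to all later templates
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            total += np.sum(d <= r)        # matches within tolerance r
        return total

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B)                  # undefined if no matches (A == 0)

print(sample_entropy(np.random.default_rng(0).normal(size=2_000)))
```

Lower values indicate more regular, self-similar series; uncorrelated noise yields comparatively high values.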
Dr. Osvaldo Anibal Rosso
Dr. Fernando Montani
Guest Editors
Manuscript Submission Information
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.
Benefits of Publishing in a Special Issue
- Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
- Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
- Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
- External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
- e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.
Further information on MDPI's Special Issue policies can be found here.