

Information Theoretic Measures and Their Applications II

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (15 July 2022) | Viewed by 14760

Special Issue Editors


Dr. Osvaldo Anibal Rosso
Guest Editor
Instituto de Física, Universidade Federal de Alagoas, Maceió 57072-970, Alagoas, Brazil
Interests: time-series analysis; information theory; time–frequency transform; wavelet transform; entropy and complexity; non-linear dynamics and chaos; complex networks; medical and biological applications

Dr. Fernando Montani
Guest Editor
Instituto de Física La Plata, CONICET-Universidad Nacional de la Plata, La Plata, Diagonal 113 entre 63 y 64, La Plata 1900, Argentina
Interests: time-series analysis; information theory; brain and neuronal dynamics; neural coding; entropy and complexity; nonlinear dynamics and chaos; complex networks; medical and biological applications

Special Issue Information

Dear Colleagues,

The evaluation of information theory quantifiers presupposes some prior knowledge about the system; specifically, a probability distribution function (PDF) associated with the time series under analysis should be provided beforehand. The determination of the most adequate PDF is a fundamental problem, because the PDF P and the sample space Ω are inextricably linked. Many methods have been proposed for a proper selection of the probability space (Ω, P).

Among others, we can mention frequency counting; procedures based on amplitude statistics; binary symbolic dynamics; Fourier analysis; the Gabor and wavelet transforms; and permutation patterns. The suitability of each of these methodologies depends on the peculiarities of the data, such as stationarity, the length of the series, the variation of the parameters, the level of noise contamination, etc. In all these cases, global aspects of the dynamics can somehow be captured, but the different approaches are not equivalent in their ability to discern all relevant physical details.

Usual methodologies assign a symbol from a finite alphabet A to each time point of the series X(t), thus creating a symbolic sequence that can be regarded as a non-causal coarse-grained description of the time series under consideration. As a consequence, order relations and the time scales of the dynamics are lost. The usual histogram technique corresponds to this kind of assignment. Time-causal information may be duly incorporated if information about the past dynamics of the system is included in the symbolic sequence, i.e., if symbols of the alphabet A are assigned to a portion of the phase space or trajectory.
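As a concrete illustration of this kind of non-causal assignment, the following minimal Python sketch (our own illustration; the function names are assumptions, not taken from any referenced work) symbolizes a series by equal-width amplitude bins and evaluates the Shannon entropy of the resulting histogram PDF:

```python
import numpy as np

def symbolize_by_amplitude(x, n_symbols=4):
    """Assign each time point a symbol from a finite alphabet by
    partitioning the amplitude range into equal-width bins."""
    edges = np.linspace(np.min(x), np.max(x), n_symbols + 1)
    return np.digitize(x, edges[1:-1])  # symbols 0 .. n_symbols - 1

def shannon_entropy_from_symbols(symbols, n_symbols):
    """Histogram PDF of the symbol sequence and its Shannon entropy."""
    p = np.bincount(symbols, minlength=n_symbols) / len(symbols)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
s = symbolize_by_amplitude(x)
# below log(4): equal-width bins on Gaussian data are not equiprobable
print(shannon_entropy_from_symbols(s, 4))
```

Note that shuffling x leaves the result unchanged: the histogram assignment is blind to temporal order, which is exactly the limitation discussed above.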


In particular, Bandt and Pompe (BP) in their seminal work “Permutation Entropy: A Natural Complexity Measure for Time Series” (Phys. Rev. Lett. 2002, 88, 174102) introduced a simple and robust symbolic methodology that takes into account the time causality of the time series (a causal coarse-grained methodology) by comparing neighboring values. The symbolic data are (i) created by ranking the values of the series and (ii) defined by reordering the embedded data in ascending order, which is tantamount to a phase-space reconstruction with embedding dimension (pattern length) D ≥ 2, D ∊ N, and time lag τ ∊ N. In this way, it is possible to quantify the diversity of the ordering symbols (patterns) derived from a scalar time series. Note that the appropriate symbol sequence arises naturally from the time series, and no model-based assumptions are needed. In fact, the necessary “partitions” are devised by comparing the order of neighboring relative values rather than by apportioning amplitudes according to different levels. This technique, as opposed to most in current practice, takes into account the temporal structure of the time series generated by the physical process under study. As such, it allows us to uncover important details concerning the ordinal structure of the time series and can also yield information about temporal correlation.
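A minimal sketch of the Bandt–Pompe symbolization and the resulting normalized permutation entropy, assuming the standard argsort-based pattern extraction (our own illustrative code, not the authors' implementation):

```python
import numpy as np
from math import factorial

def ordinal_patterns(x, D=3, tau=1):
    """Bandt-Pompe symbolization: each length-D window (lag tau) is
    mapped to the permutation that sorts its values in ascending order."""
    x = np.asarray(x)
    n = len(x) - (D - 1) * tau
    return np.array([np.argsort(x[i:i + (D - 1) * tau + 1:tau]) for i in range(n)])

def permutation_entropy(x, D=3, tau=1):
    """Normalized Shannon entropy of the ordinal-pattern PDF (in [0, 1])."""
    _, counts = np.unique(ordinal_patterns(x, D, tau), axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p)) / np.log(factorial(D))

rng = np.random.default_rng(1)
x = rng.standard_normal(5_000)
print(permutation_entropy(x, D=4))          # close to 1 for white noise
print(permutation_entropy(np.exp(x), D=4))  # identical: exp() is monotonic
```

The last two lines anticipate the invariance property discussed next: exp() is monotonically increasing, so it leaves every ordinal pattern, and hence the entropy, unchanged.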

Furthermore, the ordinal patterns associated with the Bandt–Pompe PDF are invariant under nonlinear monotonic transformations. Accordingly, nonlinear drifts or scalings artificially introduced by a measurement device will not modify the estimation of the quantifiers, a convenient property when dealing with experimental data.

Recent approaches to gaining knowledge about time series dynamics consider the creation of graphs based on the observed ordinal patterns of a given time series. These graphs are constructed by transforming the time series into its sequence of ordinal patterns and accounting for the transitions between consecutive patterns. Each of the D! possible ordinal patterns is a vertex in the graph, and a directed edge connects two ordinal patterns if they appear consecutively in the time series. Each edge represents a transition between patterns, hence the name “ordinal patterns transition graphs”, denoted by Gπ. The study of time series via their transformation into graphs is a very successful strategy. Notable examples are the visibility graph (VG) and the horizontal visibility graph (HVG). In these approaches, each point in the data series is a vertex in the graph, and two vertices are connected by an edge if they satisfy the visibility criterion, i.e., if it is possible to trace a line between the two data points without intersecting intermediate points or, more strictly for the HVG, a horizontal line. However, visibility graphs contain one vertex per data point and therefore grow with the series length, whereas an ordinal patterns transition graph has at most D! vertices regardless of length, which makes it attractive for long series. Analysis of these graphs is often performed on their structure, by accounting for both graph measures and information quantifiers (such as the probability of self-transition in the case of Gπ, and the node degree probability in the case of the VG and HVG) that can be used to properly characterize time series and to precisely distinguish among different dynamics.
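A minimal sketch of the construction of Gπ and of one of the quantifiers just mentioned, the probability of self-transition (again our own illustrative code, under the same argsort convention as above):

```python
import numpy as np
from collections import Counter

def ordinal_patterns(x, D=3, tau=1):
    n = len(x) - (D - 1) * tau
    return [tuple(np.argsort(x[i:i + (D - 1) * tau + 1:tau])) for i in range(n)]

def transition_graph(patterns):
    """G_pi: vertices are the observed ordinal patterns; each directed edge
    carries the relative frequency of that consecutive-pattern transition."""
    edges = Counter(zip(patterns, patterns[1:]))
    total = sum(edges.values())
    return {edge: c / total for edge, c in edges.items()}

def self_transition_probability(graph):
    """Probability that the next pattern repeats the current one."""
    return sum(p for (a, b), p in graph.items() if a == b)

rng = np.random.default_rng(2)
g = transition_graph(ordinal_patterns(rng.standard_normal(10_000), D=3))
print(f"{self_transition_probability(g):.3f}")
```

Whatever the series length, this graph has at most D! = 6 vertices here, which is the scaling advantage noted above.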

In relation to other quantifiers, we can mention those based on mutual information, which rigorously quantifies, in units known as “bits”, how much information the value of one variable reveals about the value of another. This is a dimensionless quantity that can be thought of as the reduction in uncertainty about one random variable given knowledge of another. Recent advances in machine learning have shown the relevance of deep learning methodologies, approaches with the capability to reshape the way we understand neuronal systems; the relevant calculations are learned from experience within the deep learning pipeline instead of being performed by the researcher. In deep learning, the most common use of information theory is to characterize probability distributions and to quantify the similarity between two of them, using concepts such as the Kullback–Leibler (KL) divergence and the Jensen–Shannon divergence. The mutual information is the KL divergence between the joint distribution P(X, Y) and the product of the marginals P(X)P(Y). The information bottleneck method is a theoretical principle for extracting the relevant information that an input variable X contains about an output variable Y. Given their joint distribution P(X, Y), the relevant information is defined as the mutual information I(X; Y). It is assumed that there is a statistical relationship between X and Y, and that Y implicitly determines which characteristics of X are relevant. An optimal representation X* would capture all the information useful for predicting Y and would eliminate all the irrelevant information. Importantly, this kind of measure can be used to study deep neural networks and can help to infer the complexity of information flow and of learning, understood as the acquisition and processing of information.
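The following self-contained sketch (our own illustrative code; the toy joint distribution is an assumption for the example) computes the KL divergence, the Jensen–Shannon divergence, and the mutual information as the KL divergence between P(X, Y) and P(X)P(Y), in bits:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in bits (q must dominate p)."""
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def js(p, q):
    """Jensen-Shannon divergence: symmetrized KL, always finite."""
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def mutual_information(joint):
    """I(X; Y) = D( P(X, Y) || P(X) P(Y) ), in bits."""
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    return kl(joint.ravel(), (px * py).ravel())

joint = np.array([[0.4, 0.1],     # toy joint distribution of two
                  [0.1, 0.4]])    # correlated binary variables
print(mutual_information(joint))  # about 0.278 bits
```

For independent variables the joint factorizes, the KL divergence vanishes, and the mutual information is exactly zero.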

Among the most recent entropy proposals, we can mention approximate entropy; sample entropy; delayed permutation entropy; the conditional entropy bottleneck method; and permutation min-entropy. In short, different methodologies have been used to understand the mechanisms behind information processing. Among them are also methods of frequency analysis, such as the wavelet transform (WT), which distinguishes itself by its high efficiency in feature extraction. Wavelet analysis is an appropriate mathematical tool for analyzing signals in the time and frequency domains. All these measures have important applications not only in physics but also in quite distinct areas, such as biology, medicine, economics, the cognitive sciences, numerical and computational sciences, big data analysis, complex networks, deep learning, and neuroscience.
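Of the proposals just listed, sample entropy is the simplest to state; a minimal sketch following the standard Richman–Moorman definition (embedding length m, tolerance r; our own illustrative code):

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy: -ln of the conditional probability that sequences
    similar for m points (Chebyshev distance <= r) remain similar for
    m + 1 points; self-matches are excluded."""
    x = np.asarray(x, dtype=float)
    r = 0.2 * np.std(x) if r is None else r
    n_templates = len(x) - m  # same template set for lengths m and m + 1

    def count_matches(mm):
        t = np.array([x[i:i + mm] for i in range(n_templates)])
        return sum(np.sum(np.max(np.abs(t[i + 1:] - t[i]), axis=1) <= r)
                   for i in range(n_templates - 1))

    return -np.log(count_matches(m + 1) / count_matches(m))

rng = np.random.default_rng(3)
print(sample_entropy(rng.standard_normal(2_000)))         # irregular: high
print(sample_entropy(np.sin(np.linspace(0, 60, 2_000))))  # regular: low
```

Unlike approximate entropy, the exclusion of self-matches removes the bias toward regularity, which is why sample entropy is usually preferred for short series.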

Dr. Osvaldo Anibal Rosso
Dr. Fernando Montani
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (6 papers)


Research

22 pages, 5467 KiB  
Article
Spike Timing-Dependent Plasticity with Enhanced Long-Term Depression Leads to an Increase of Statistical Complexity
by Monserrat Pallares Di Nunzio and Fernando Montani
Entropy 2022, 24(10), 1384; https://doi.org/10.3390/e24101384 - 28 Sep 2022
Viewed by 1682
Abstract
Synaptic plasticity is characterized by the remodeling of existing synapses caused by the strengthening and/or weakening of connections, represented by long-term potentiation (LTP) and long-term depression (LTD). The occurrence of a presynaptic spike (or action potential) followed by a temporally nearby postsynaptic spike induces LTP; conversely, if the postsynaptic spike precedes the presynaptic spike, it induces LTD. This form of synaptic plasticity induction depends on the order and timing of the pre- and postsynaptic action potentials, and has been termed spike timing-dependent plasticity (STDP). After an epileptic seizure, LTD plays an important role as a depressor of synapses, which may lead to their complete disappearance, together with that of their neighboring connections, in the days after the event. Since, after an epileptic seizure, the network seeks to regulate the excess activity through two key mechanisms, depressed connections and neuronal death (the elimination of excitatory neurons from the network), LTD is of particular interest in our study. To investigate this phenomenon, we develop a biologically plausible model that privileges LTD at the triplet level while maintaining the pairwise structure of the STDP, and we study how the network dynamics are affected as neuronal damage increases. We find that the statistical complexity is significantly higher for the network in which LTD presents both types of interactions, while in the case where the STDP is defined with purely pairwise interactions, both the Shannon entropy and the Fisher information increase as the damage becomes higher.
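For readers unfamiliar with STDP, the classic pairwise exponential window below conveys the mechanism described in the abstract; it is a generic textbook rule with illustrative parameter values, not the triplet model developed in the paper:

```python
import numpy as np

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Classic pairwise STDP window, dt = t_post - t_pre (ms).
    Pre-before-post (dt > 0) potentiates (LTP); post-before-pre
    (dt < 0) depresses (LTD). Choosing a_minus > a_plus biases the
    rule toward depression, in the spirit of LTD-enhanced plasticity."""
    dt = np.asarray(dt, dtype=float)
    return np.where(dt >= 0,
                    a_plus * np.exp(-dt / tau_plus),
                    -a_minus * np.exp(dt / tau_minus))

for dt in (+5.0, -5.0):
    print(f"dt = {dt:+.0f} ms -> dw = {stdp_dw(dt):+.5f}")
```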

19 pages, 5045 KiB  
Article
The Consensus Problem in Polities of Agents with Dissimilar Cognitive Architectures
by Damian Radosław Sowinski, Jonathan Carroll-Nellenback, Jeremy DeSilva, Adam Frank, Gourab Ghoshal and Marcelo Gleiser
Entropy 2022, 24(10), 1378; https://doi.org/10.3390/e24101378 - 27 Sep 2022
Cited by 2 | Viewed by 1912
Abstract
Agents interacting with their environments, machine or otherwise, arrive at decisions based on their incomplete access to data and on their particular cognitive architecture, including data sampling frequency and memory storage limitations. In particular, the same data streams, sampled and stored differently, may cause agents to arrive at different conclusions and to take different actions. This phenomenon has a drastic impact on polities—populations of agents predicated on the sharing of information. We show that, even under ideal conditions, polities consisting of epistemic agents with heterogeneous cognitive architectures might not achieve consensus concerning what conclusions to draw from data streams. Transfer entropy applied to a toy model of a polity is analyzed to showcase this effect when the dynamics of the environment are known. As an illustration where the dynamics are not known, we examine empirical data streams relevant to climate and show that the consensus problem manifests itself.
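As a hedged illustration of the transfer entropy used in this paper, the following plug-in histogram estimator (our own sketch for coarse-grained series, not the authors' implementation) recovers directed coupling in a toy example where X drives Y with one step of lag:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=4):
    """Plug-in estimator of TE(X -> Y), in bits:
    sum over (y_next, y, x) of p * log2[ p(y_next|y,x) / p(y_next|y) ]."""
    sx = np.digitize(x, np.linspace(np.min(x), np.max(x), bins + 1)[1:-1])
    sy = np.digitize(y, np.linspace(np.min(y), np.max(y), bins + 1)[1:-1])
    yn, yc, xc = sy[1:], sy[:-1], sx[:-1]
    n = len(yn)
    p_yyx = Counter(zip(yn, yc, xc))   # counts of (y_next, y, x)
    p_yx = Counter(zip(yc, xc))        # counts of (y, x)
    p_yy = Counter(zip(yn, yc))        # counts of (y_next, y)
    p_y = Counter(yc)                  # counts of y
    te = 0.0
    for (a, b, c), k in p_yyx.items():
        te += (k / n) * np.log2(k * p_y[b] / (p_yx[(b, c)] * p_yy[(a, b)]))
    return te

rng = np.random.default_rng(4)
x = rng.standard_normal(20_000)
y = np.zeros_like(x)
y[1:] = 0.8 * x[:-1] + 0.2 * rng.standard_normal(len(x) - 1)
print(transfer_entropy(x, y))  # noticeably positive: X drives Y
print(transfer_entropy(y, x))  # near zero: no feedback
```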

17 pages, 3712 KiB  
Article
Estimation of Radial Basis Function Network Centers via Information Forces
by Edilson Sousa Júnior, Antônio Freitas, Ricardo Rabelo and Welflen Santos
Entropy 2022, 24(10), 1347; https://doi.org/10.3390/e24101347 - 23 Sep 2022
Cited by 2 | Viewed by 1553
Abstract
The determination of Radial Basis Function Network centers is an open problem. This work determines the cluster centers with a proposed gradient algorithm, using the information forces acting on each data point. These centers are applied to a Radial Basis Function Network for data classification. A threshold is established based on the Information Potential to classify the outliers. The proposed algorithms are analyzed on databases considering the number of clusters, the overlap of clusters, noise, and the unbalance of cluster sizes. Combined, the threshold and the centers determined by information forces show good results in comparison to a similar network with a k-means clustering algorithm.
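A sketch of the information-theoretic-learning quantities involved, assuming the usual Gaussian-kernel definitions of the information potential and its gradient, the information forces; the gradient-ascent loop is a toy stand-in of ours, not the algorithm proposed in the paper:

```python
import numpy as np

def gaussian(d2, sigma):
    return np.exp(-d2 / (2.0 * sigma**2))

def information_potential(X, sigma=1.0):
    """Renyi quadratic-entropy information potential:
    V = (1/N^2) * sum_ij G_sigma(x_i - x_j)."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return gaussian(d2, sigma).mean()

def information_forces(points, data, sigma=1.0):
    """Net force the data exert on each point: the gradient of the cross
    information potential, pulling points toward dense regions."""
    diff = data[None, :, :] - points[:, None, :]
    w = gaussian(np.sum(diff**2, axis=-1), sigma)[:, :, None]
    return (w * diff).sum(axis=1) / (len(data) * sigma**2)

# toy use: gradient ascent moves candidate centers into the two modes
rng = np.random.default_rng(5)
data = np.vstack([rng.normal(-3, 0.5, (100, 2)), rng.normal(3, 0.5, (100, 2))])
centers = data[rng.choice(len(data), 4, replace=False)].copy()
for _ in range(100):
    centers += information_forces(centers, data, sigma=1.0)
print(round(information_potential(data), 4))
print(centers.round(2))  # centers settle near (-3, -3) and (3, 3)
```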

16 pages, 4074 KiB  
Article
Two-Dimensional EspEn: A New Approach to Analyze Image Texture by Irregularity
by Ricardo Espinosa, Raquel Bailón and Pablo Laguna
Entropy 2021, 23(10), 1261; https://doi.org/10.3390/e23101261 - 28 Sep 2021
Cited by 4 | Viewed by 2585
Abstract
Image processing has played a relevant role in various industries, where the main challenge is to extract specific features from images. Texture, specifically, characterizes the occurrence of a pattern along the spatial distribution of pixel intensities, and it has been applied in classification and segmentation tasks. Several feature extraction methods have therefore been proposed in recent decades, but few of them rely on entropy, which is a measure of uncertainty, and entropy algorithms have been little explored for two-dimensional data. Nevertheless, there is growing interest in developing algorithms that overcome current limits, since Shannon entropy does not consider spatial information and SampEn2D generates unreliable values for small sizes. We introduce EspEn (Espinosa Entropy), an algorithm to measure the irregularity present in two-dimensional data, whose calculation requires setting three parameters: m (the length of the square window), r (the tolerance threshold), and ρ (the percentage of similarity). Three experiments were performed: the first two on simulated images contaminated with different noise levels, and the last on grayscale images from the Normalized Brodatz Texture database (NBT). First, we compared the performance of EspEn against Shannon entropy and SampEn2D. Second, we evaluated the dependence of EspEn on variations of the parameters m, r, and ρ. Third, we evaluated the EspEn algorithm on NBT images. The results revealed that EspEn can discriminate images with different sizes and degrees of noise. EspEn thus provides an alternative algorithm to quantify the irregularity in 2D data; the recommended parameters for better performance are m = 3, r = 20, and ρ = 0.7.
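The exact role of ρ in EspEn is specified in the paper; as a self-contained illustration of the window-similarity idea it builds on, here is a SampEn2D-style sketch (our own simplified code, without the ρ parameter, and not the published EspEn):

```python
import numpy as np

def sampen2d_like(img, m=3, r=20):
    """SampEn2D-style irregularity: -ln of the probability that m x m
    windows similar within tolerance r (Chebyshev distance) remain
    similar when grown to (m + 1) x (m + 1)."""
    h, w = img.shape

    def count(mm):
        wins = np.array([img[i:i + mm, j:j + mm].ravel()
                         for i in range(h - m) for j in range(w - m)])
        return sum(np.sum(np.max(np.abs(wins[k + 1:] - wins[k]), axis=1) <= r)
                   for k in range(len(wins) - 1))

    return -np.log(count(m + 1) / count(m))

rng = np.random.default_rng(6)
noise = rng.integers(0, 64, (48, 48)).astype(float)      # irregular texture
smooth = np.add.outer(np.arange(48.0), np.arange(48.0))  # regular gradient
print(sampen2d_like(noise), sampen2d_like(smooth))       # high vs. 0.0
```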

16 pages, 5373 KiB  
Article
Estimating Phase Amplitude Coupling between Neural Oscillations Based on Permutation and Entropy
by Liyong Yin, Fan Tian, Rui Hu, Zhaohui Li and Fuzai Yin
Entropy 2021, 23(8), 1070; https://doi.org/10.3390/e23081070 - 18 Aug 2021
Viewed by 2727
Abstract
Cross-frequency phase–amplitude coupling (PAC) plays an important role in neuronal oscillation networks, reflecting the interaction between the phase of a low-frequency oscillation (LFO) and the amplitude of high-frequency oscillations (HFO). We applied four methods based on permutation analysis to measure PAC: multiscale permutation mutual information (MPMI), permutation conditional mutual information (PCMI), symbolic joint entropy (SJE), and weighted-permutation mutual information (WPMI). To verify the ability of these four algorithms, their performance was evaluated on simulated data with respect to coupling strength, signal-to-noise ratio (SNR), and data length. The performance of SJE was similar to that of the other approaches when measuring PAC strength, but its computational efficiency was the highest of the four methods. Moreover, SJE can accurately identify the PAC frequency range under the interference of spike noise. All in all, the results demonstrate that SJE is better suited for evaluating PAC between neural oscillations.
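As a baseline for the PAC concept, the sketch below computes the widely used Tort et al. modulation index; it is not one of the four permutation-based estimators compared in the paper (our own illustrative code; requires SciPy):

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def modulation_index(sig, fs, f_phase=(4, 8), f_amp=(30, 60), n_bins=18):
    """Tort et al. modulation index: normalized KL distance between the
    mean HFO amplitude per LFO phase bin and the uniform distribution."""
    def bandpass(s, band):
        sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
        return sosfiltfilt(sos, s)
    phase = np.angle(hilbert(bandpass(sig, f_phase)))
    amp = np.abs(hilbert(bandpass(sig, f_amp)))
    edges = np.linspace(-np.pi, np.pi, n_bins + 1)
    idx = np.digitize(phase, edges[1:-1])
    mean_amp = np.array([amp[idx == k].mean() for k in range(n_bins)])
    p = mean_amp / mean_amp.sum()
    return (np.log(n_bins) + np.sum(p * np.log(p))) / np.log(n_bins)

# toy signal: the phase of a 6 Hz rhythm modulates a 40 Hz amplitude
fs = 500.0
t = np.arange(0, 20, 1 / fs)
lfo = np.sin(2 * np.pi * 6 * t)
rng = np.random.default_rng(7)
sig = lfo + (1 + lfo) * 0.4 * np.sin(2 * np.pi * 40 * t) \
      + 0.1 * rng.standard_normal(len(t))
print(modulation_index(sig, fs))  # well above the near-zero value of pure noise
```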

17 pages, 1297 KiB  
Article
From Continuous-Time Chaotic Systems to Pseudo Random Number Generators: Analysis and Generalized Methodology
by Luciana De Micco, Maximiliano Antonelli and Osvaldo Anibal Rosso
Entropy 2021, 23(6), 671; https://doi.org/10.3390/e23060671 - 26 May 2021
Cited by 5 | Viewed by 2783
Abstract
The use of chaotic systems in electronics, such as in pseudo-random number generators (PRNGs), is very appealing. Among them, continuous-time systems are used less often because, in addition to having strong temporal correlations, they require further computations to obtain discrete solutions. Here, the selection of the time step and of the discretization method is first studied by conducting a detailed analysis of their effect on the statistical and chaotic behavior of the systems. We employ an approach based on interpreting the time step as a parameter of the new “maps”. From our analysis, it follows that to use them as PRNGs, two goals must be achieved: (i) keeping the chaotic oscillation and (ii) destroying the internal and temporal correlations. We then propose a simple methodology to obtain chaos-based PRNGs with good statistical characteristics and high throughput, which can be applied to any continuous-time chaotic system. We analyze the generated sequences by means of quantifiers based on information theory (permutation entropy, permutation complexity, and the causal entropy × complexity plane). We show that the proposed PRNG generates sequences that successfully pass the Marsaglia Diehard and NIST (National Institute of Standards and Technology) tests. Finally, we show that its hardware implementation requires very few resources.
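The general recipe described in the abstract can be sketched as follows, assuming an Euler-discretized Lorenz system, decimation to weaken temporal correlations, and low-order mantissa bits as output (our own illustration, not the authors' methodology; a real design would be validated with the Diehard and NIST batteries mentioned above):

```python
import struct
import numpy as np

def lorenz_step(s, h, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One explicit-Euler step; the time step h acts as a parameter
    of the resulting 'map'."""
    x, y, z = s
    return np.array([x + h * sigma * (y - x),
                     y + h * (x * (rho - z) - y),
                     z + h * (x * y - beta * z)])

def chaotic_bytes(n_bytes, h=0.005, skip=20, seed=(1.0, 1.0, 1.0)):
    """Iterate the discretized system, decimate by `skip` steps to weaken
    temporal correlations, and keep the lowest mantissa byte of x."""
    s = np.array(seed)
    out = bytearray()
    while len(out) < n_bytes:
        for _ in range(skip):
            s = lorenz_step(s, h)
        out.append(struct.pack("<d", s[0])[0])  # low-order mantissa byte
    return bytes(out)

stream = np.frombuffer(chaotic_bytes(8_192), dtype=np.uint8)
counts = np.bincount(stream, minlength=256)
print(counts.min(), counts.max())  # roughly uniform if decorrelation succeeds
```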
