
Entropy Measures for Data Analysis II: Theory, Algorithms and Applications

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Signal and Data Analysis".

Deadline for manuscript submissions: closed (13 June 2021) | Viewed by 15628

Special Issue Editor


Prof. Dr. Karsten Keller
Guest Editor
Institut für Mathematik, Universität zu Lübeck, D-23562 Lübeck, Germany
Interests: data analysis; time series analysis; computational statistics; information theory; ergodic theory; automatic learning

Special Issue Information

Dear Colleagues,

Entropies and entropy-like quantities are playing an increasing role in modern nonlinear data analysis. Their applications range from diagnostics in physiology, for instance electroencephalography (EEG), magnetoencephalography (MEG), and electrocardiography (ECG), to econophysics and engineering. During the last few years, classical concepts such as approximate entropy and sample entropy have been supplemented by new entropy measures, like permutation entropy and its various variants. Recent developments focus on multidimensional generalizations of these concepts, with special emphasis on quantifying the coupling between time series and between the system components behind them.

Many of the considered entropy-based concepts and approaches are not fully understood, and there is a need to explore systematically what and how much information about a data set, and about the systems behind it, the entropy measures contain. In this context, the use of entropy measures as features in learning theory and as instruments of data science is of special interest. Not surprisingly, special emphasis has to be placed on the development of fast and efficient algorithms for determining and dealing with entropy measures in the case of large and complex data sets.
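As a minimal illustration of the kind of measure mentioned above, the sketch below estimates a normalized permutation entropy from the empirical distribution of ordinal patterns in a scalar time series. The function name, the normalization by log d!, and the toy data are illustrative assumptions, not taken from any particular contribution to this Special Issue.

    from collections import Counter
    from math import log, factorial
    import random

    def permutation_entropy(x, d=3):
        """Normalized permutation entropy of order d (delay 1) for a sequence x."""
        counts = Counter()
        for i in range(len(x) - d + 1):
            window = x[i:i + d]
            # Ordinal pattern: the permutation that sorts the window (ties broken by position).
            counts[tuple(sorted(range(d), key=lambda k: window[k]))] += 1
        n = sum(counts.values())
        h = -sum((c / n) * log(c / n) for c in counts.values())
        return h / log(factorial(d))  # normalize by log(d!) so the value lies in [0, 1]

    # Toy usage: white noise should give a value close to 1.
    noise = [random.random() for _ in range(10000)]
    print(permutation_entropy(noise, d=3))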

Prof. Dr. Karsten Keller
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.


Published Papers (6 papers)


Editorial


2 pages, 154 KiB  
Editorial
Entropy Measures for Data Analysis II: Theory, Algorithms and Applications
by Karsten Keller
Entropy 2021, 23(11), 1496; https://doi.org/10.3390/e23111496 - 12 Nov 2021
Cited by 2 | Viewed by 1530
Abstract
Entropies and entropy-like quantities are playing an increasing role in modern non-linear data analysis [...] Full article

Research


21 pages, 361 KiB  
Article
Generalized Ordinal Patterns and the KS-Entropy
by Tim Gutjahr and Karsten Keller
Entropy 2021, 23(8), 1097; https://doi.org/10.3390/e23081097 - 23 Aug 2021
Cited by 3 | Viewed by 2666
Abstract
Ordinal patterns classifying real vectors according to the order relations between their components are an interesting basic concept for determining the complexity of a measure-preserving dynamical system. In particular, as shown by C. Bandt, G. Keller and B. Pompe, the permutation entropy based on the probability distributions of such patterns is equal to the Kolmogorov–Sinai entropy in simple one-dimensional systems. The general reason for this is that, roughly speaking, the system of ordinal patterns obtained for a real-valued “measuring arrangement” has a high potential for separating orbits. Starting from a slightly different approach of A. Antoniouk, K. Keller and S. Maksymenko, we discuss generalizations of ordinal patterns that provide enough separation to determine the Kolmogorov–Sinai entropy. To define these generalized ordinal patterns, the idea is to replace the basic binary relation ≤ on the real numbers by another binary relation. Generalizing earlier results of I. Stolz and K. Keller, we establish conditions that the binary relation and the dynamical system have to fulfill so that the obtained generalized ordinal patterns can be used for estimating the Kolmogorov–Sinai entropy. Full article
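To make the idea of replacing the relation ≤ concrete, here is a small Python sketch that symbolizes windows by the outcomes of an arbitrary binary relation on all pairs of entries and computes the entropy of the resulting empirical pattern distribution. The encoding and the example relations are assumptions chosen for illustration; they are not the construction or the conditions established in the article.

    from collections import Counter
    from math import log
    import random

    def generalized_pattern(window, relation):
        """Encode a window by the relation outcomes for all pairs of positions i < j."""
        d = len(window)
        return tuple(relation(window[i], window[j])
                     for i in range(d) for j in range(i + 1, d))

    def pattern_entropy(x, d, relation):
        """Shannon entropy (nats) of the empirical generalized-pattern distribution."""
        counts = Counter(generalized_pattern(x[i:i + d], relation)
                         for i in range(len(x) - d + 1))
        n = sum(counts.values())
        return -sum((c / n) * log(c / n) for c in counts.values())

    data = [random.random() for _ in range(5000)]
    # The relation <= recovers classical ordinal patterns ...
    h_order = pattern_entropy(data, d=4, relation=lambda a, b: a <= b)
    # ... while, for example, a proximity relation yields a coarser symbolization.
    h_close = pattern_entropy(data, d=4, relation=lambda a, b: abs(a - b) < 0.1)
    print(h_order, h_close)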

27 pages, 4321 KiB  
Article
Intelligent Fault Identification for Rolling Bearings Fusing Average Refined Composite Multiscale Dispersion Entropy-Assisted Feature Extraction and SVM with Multi-Strategy Enhanced Swarm Optimization
by Huibin Shi, Wenlong Fu, Bailin Li, Kaixuan Shao and Duanhao Yang
Entropy 2021, 23(5), 527; https://doi.org/10.3390/e23050527 - 25 Apr 2021
Cited by 6 | Viewed by 2284
Abstract
Rolling bearings act as key parts in many items of mechanical equipment, and any abnormality will affect the normal operation of the entire apparatus. To diagnose the faults of rolling bearings effectively, this paper proposes a novel fault identification method that merges variational mode decomposition (VMD), average refined composite multiscale dispersion entropy (ARCMDE) and a support vector machine (SVM) optimized by multi-strategy enhanced swarm optimization. Firstly, the vibration signals are decomposed into different series of intrinsic mode functions (IMFs) based on VMD with the center frequency observation method. Subsequently, the proposed ARCMDE, which combines the advantages of dispersion entropy (DE) and the average refined composite multiscale procedure, is employed to enhance the multiscale fault-feature extraction from the IMFs. Afterwards, grey wolf optimization (GWO), enhanced by multiple strategies, namely Lévy flight, a cosine factor and polynomial mutation (LCPGWO), is proposed to optimize the penalty factor C and kernel parameter g of the SVM. Then, the optimized SVM model is trained to identify the fault type of samples based on the features extracted by ARCMDE. Finally, an application experiment and contrastive analysis verify the effectiveness of the proposed VMD-ARCMDE-LCPGWO-SVM method. Full article
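As background for the feature-extraction step, the following is a simplified single-scale dispersion entropy (DE) sketch, i.e., only the basic quantity that the refined composite multiscale procedure averages over coarse-grained scales. The parameter names and the normal-CDF mapping follow the common DE recipe; the sketch is not claimed to reproduce the authors' ARCMDE implementation.

    import math
    from collections import Counter

    def dispersion_entropy(x, m=3, c=6):
        """Normalized single-scale dispersion entropy: embedding dimension m, c classes."""
        mu = sum(x) / len(x)
        sigma = math.sqrt(sum((v - mu) ** 2 for v in x) / len(x)) or 1.0
        # Map each sample to a class in 1..c via the normal CDF of the standardized signal.
        ncdf = lambda v: 0.5 * (1.0 + math.erf((v - mu) / (sigma * math.sqrt(2.0))))
        classes = [min(c, max(1, round(c * ncdf(v) + 0.5))) for v in x]
        # Count dispersion patterns of length m (delay 1) and take their Shannon entropy.
        patterns = Counter(tuple(classes[i:i + m]) for i in range(len(classes) - m + 1))
        n = sum(patterns.values())
        h = -sum((k / n) * math.log(k / n) for k in patterns.values())
        return h / math.log(c ** m)  # normalize so the value lies in [0, 1]

    # Hypothetical usage: compute DE for each VMD mode and feed the values to an SVM as features.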

26 pages, 3564 KiB  
Article
Image Statistics Preserving Encrypt-then-Compress Scheme Dedicated for JPEG Compression Standard
by Dariusz Puchala, Kamil Stokfiszewski and Mykhaylo Yatsymirskyy
Entropy 2021, 23(4), 421; https://doi.org/10.3390/e23040421 - 31 Mar 2021
Cited by 4 | Viewed by 2328
Abstract
In this paper, the authors analyze in more detail an image encryption scheme, proposed in their earlier work, which preserves input image statistics and can be used in connection with the JPEG compression standard. The image encryption process takes advantage of fast linear transforms parametrized with private keys and is carried out prior to the compression stage in a way that does not alter those statistical characteristics of the input image that are crucial from the point of view of the subsequent compression. This feature makes the encryption process transparent to the compression stage and enables the JPEG algorithm to maintain its full compression capabilities even though it operates on encrypted image data. The main advantage of the considered approach is that the JPEG algorithm can be used without any modifications as part of the encrypt-then-compress image processing framework. The paper includes a detailed mathematical model of the examined scheme, allowing for theoretical analysis of the impact of the image encryption step on the effectiveness of the compression process. A combinatorial and statistical analysis of the encryption process is also included, which allows its cryptographic strength to be evaluated. In addition, the paper considers several practical use-case scenarios with different characteristics of the compression and encryption stages. The final part of the paper contains additional results of experimental studies on the general effectiveness of the presented scheme. The results show that, for a wide range of compression ratios, the considered scheme performs comparably to the JPEG algorithm alone, that is, without the encryption stage, in terms of the quality measures of reconstructed images. Moreover, the results of the statistical analysis, as well as those obtained with generally approved quality measures of image cryptographic systems, demonstrate the high strength and efficiency of the scheme’s encryption stage. Full article
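For orientation only, the sketch below illustrates the encrypt-then-compress ordering itself: a key-seeded orthogonal block transform is applied before an unmodified JPEG encoder. The transform is a generic stand-in, not the fast statistics-preserving transforms analyzed in the paper, and the file names and key value are placeholders.

    import numpy as np
    from PIL import Image  # Pillow provides a standard, unmodified JPEG encoder

    def keyed_orthogonal(key, size=8):
        """Derive an orthogonal matrix from a secret key via QR of a seeded Gaussian matrix."""
        rng = np.random.default_rng(key)
        q, _ = np.linalg.qr(rng.standard_normal((size, size)))
        return q

    def encrypt_blocks(img, key, size=8):
        """Apply the keyed transform to each size x size block of a grayscale image."""
        q = keyed_orthogonal(key, size)
        out = img.astype(np.float64).copy()
        h, w = out.shape
        for r in range(0, h - h % size, size):
            for s in range(0, w - w % size, size):
                out[r:r + size, s:s + size] = q @ out[r:r + size, s:s + size] @ q.T
        return np.clip(out, 0, 255).astype(np.uint8)

    # Encrypt first, then compress with an unmodified JPEG encoder.
    gray = np.array(Image.open("input.png").convert("L"))   # placeholder file name
    Image.fromarray(encrypt_blocks(gray, key=2024)).save("encrypted.jpg", quality=75)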

29 pages, 1229 KiB  
Article
Storage Space Allocation Strategy for Digital Data with Message Importance
by Shanyun Liu, Rui She, Zheqi Zhu and Pingyi Fan
Entropy 2020, 22(5), 591; https://doi.org/10.3390/e22050591 - 25 May 2020
Cited by 3 | Viewed by 2996
Abstract
This paper focuses on the problem of lossy compression storage based on data value, which represents the subjective assessment of users, when the storage size is still not sufficient after conventional lossless data compression. To this end, we transform this problem into an optimization that pursues the least importance-weighted reconstruction error within a limited total storage size, where importance is adopted to characterize data value from the viewpoint of users. On this basis, the paper puts forward an optimal allocation strategy for the storage of digital data under an exponential distortion measure, which makes rational use of all the storage space. In fact, the theoretical results show that it is a kind of restricted water-filling. It also characterizes the trade-off between the relative weighted reconstruction error and the available storage size. Consequently, if a relatively small part of the total data value is allowed to be lost, this strategy will improve the performance of data compression. Furthermore, this paper also shows that both users’ preferences and special characteristics of the data distribution can trigger small-probability event scenarios in which only a fraction of the data covers the vast majority of users’ interests. In either case, data with highly clustered message importance is beneficial for compression storage. In contrast, from the perspective of optimal storage space allocation based on data value, data with a uniform information distribution is incompressible, which is consistent with information theory. Full article
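To convey the water-filling flavor mentioned in the abstract, here is a toy allocation sketch: storage amounts s_i >= 0 are assigned to items with importance weights w_i so that their sum meets a budget, minimizing an illustrative exponential distortion sum_i w_i * exp(-a * s_i). The distortion model, the parameter a, and the function names are assumptions; the paper's exact measure and constraints are not reproduced here.

    from math import log

    def waterfill(weights, budget, a=1.0, iters=100):
        """Bisection on the water level; returns per-item storage allocations."""
        def alloc(level):
            # KKT condition for the toy objective: s_i = max(0, ln(a * w_i / level) / a).
            return [max(0.0, log(a * w / level) / a) for w in weights]

        lo, hi = 1e-12, a * max(weights)   # at level hi every allocation is zero
        for _ in range(iters):
            mid = (lo + hi) / 2
            if sum(alloc(mid)) > budget:
                lo = mid                   # allocations too large -> raise the water level
            else:
                hi = mid
        return alloc((lo + hi) / 2)

    # Highly clustered importance -> most of the budget goes to a few items.
    print(waterfill([10.0, 5.0, 1.0, 0.1], budget=6.0))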

15 pages, 718 KiB  
Article
Regime-Switching Discrete ARMA Models for Categorical Time Series
by Christian H. Weiß
Entropy 2020, 22(4), 458; https://doi.org/10.3390/e22040458 - 17 Apr 2020
Cited by 14 | Viewed by 2853
Abstract
For the modeling of categorical time series, both nominal and ordinal, an extension of the basic discrete autoregressive moving-average (ARMA) models is proposed. It uses an observation-driven regime-switching mechanism, leading to the family of RS-DARMA models. After discussing the stochastic properties of RS-DARMA models in general, we focus on the particular case of the first-order RS-DAR model. This RS-DAR(1) model constitutes a parsimoniously parameterized type of Markov chain, which has an easy-to-interpret data-generating mechanism and can also handle negative forms of serial dependence. Approaches for model fitting are elaborated on and illustrated by two real-data examples: the modeling of a nominal sequence from biology, and of an ordinal time series regarding cloudiness. For future research, one might use the RS-DAR(1) model for constructing parsimonious advanced models, and one might adapt techniques for smoother regime transitions. Full article
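To give a feel for the data-generating mechanism, the sketch below simulates a DAR(1)-type categorical series whose dependence parameter switches between two values according to the previous observation. The switching rule, the parameter values, and the function name are illustrative assumptions and may differ from the RS-DAR(1) specification in the article.

    import random

    def simulate_rs_dar1(n, states, marginal, phi_by_regime, regime_states, seed=0):
        """Simulate n steps: X_t equals X_{t-1} with probability phi(regime), else a fresh draw."""
        rng = random.Random(seed)
        x = rng.choices(states, weights=marginal)[0]
        path = [x]
        for _ in range(n - 1):
            regime = 1 if x in regime_states else 0            # observation-driven regime
            if rng.random() >= phi_by_regime[regime]:          # with prob. 1 - phi: new draw
                x = rng.choices(states, weights=marginal)[0]   # independent innovation
            # otherwise the previous category is copied (x is kept)
            path.append(x)
        return path

    # Example: three categories; dependence is stronger whenever the last value is "a".
    print(simulate_rs_dar1(20, states=["a", "b", "c"], marginal=[0.5, 0.3, 0.2],
                           phi_by_regime=(0.2, 0.8), regime_states={"a"}))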
