

Entropy Measures for Data Analysis: Theory, Algorithms and Applications

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (30 April 2019) | Viewed by 66045

Printed Edition Available!
A printed edition of this Special Issue is available here.

Special Issue Editor


Prof. Dr. Karsten Keller
Guest Editor
Institut für Mathematik, Universität zu Lübeck, D-23562 Lübeck, Germany
Interests: data analysis; time series analysis; computational statistics; information theory; ergodic theory; automatic learning

Special Issue Information

Dear Colleagues,

Entropies and entropy-like quantities play an increasing role in modern non-linear data analysis. Their fields of application range from diagnostics in physiology, for instance electroencephalography (EEG), magnetoencephalography (MEG) and electrocardiography (ECG), to econophysics and engineering. During the last few years, classical concepts such as approximate entropy and sample entropy have been supplemented by new entropy measures, like permutation entropy and its many variants. Recent developments focus on multidimensional generalizations of these concepts, with special emphasis on quantifying the coupling between time series and between the system components behind them. Among the main future challenges in the field is a better understanding of the nature of the various entropy measures and of their relationships, with the aim of applying them adequately, including good parameter choices. The use of entropy measures as features in automatic learning, and their application to large and complex data for tasks such as classification, discrimination and the detection of structural changes, requires fast and well-founded algorithms.

Prof. Dr. Karsten Keller
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • data analysis
  • complexity measures
  • entropy
  • approximate entropy
  • sample entropy
  • permutation entropy
  • classification
  • discrimination
  • automatic learning

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.


Published Papers (15 papers)


Editorial


3 pages, 164 KiB  
Editorial
Entropy Measures for Data Analysis: Theory, Algorithms and Applications
by Karsten Keller
Entropy 2019, 21(10), 935; https://doi.org/10.3390/e21100935 - 25 Sep 2019
Cited by 4 | Viewed by 2547
Abstract
Entropies and entropy-like quantities are playing an increasing role in modern non-linear data analysis and beyond [...] Full article

Research


10 pages, 965 KiB  
Article
On the Information Content of Coarse Data with Respect to the Particle Size Distribution of Complex Granular Media: Rationale Approach and Testing
by Carlos García-Gutiérrez, Miguel Ángel Martín and Yakov Pachepsky
Entropy 2019, 21(6), 601; https://doi.org/10.3390/e21060601 - 17 Jun 2019
Cited by 4 | Viewed by 2935
Abstract
The particle size distribution (PSD) of complex granular media is seen as a mathematical measure supported in the interval of grain sizes. A physical property characterizing granular products, used in the Andreasen and Andersen model of 1930, is re-interpreted in information entropy terms, leading to a differential information equation as a conceptual approach for the PSD. Under this approach, measured data giving a coarse description of the distribution may be seen as initial conditions for the proposed equation. A solution of the equation agrees with a self-similar measure directly postulated as a PSD model by Martín and Taguas almost 80 years later; thus, both models appear to be linked. A variant of this latter model, together with detailed PSD data for 70 soils, is used to study the information content of limited experimental data formed by triplets and their ability to reconstruct the PSD. Results indicate that the information contained in certain soil triplets is sufficient to rebuild the whole PSD: for each soil sample tested, there is always at least one triplet that contains enough information to simulate the whole distribution. Full article
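
To make the information-entropy reading concrete, here is a minimal Python sketch (not from the paper) that computes the Shannon entropy of a coarse PSD description given as mass fractions over a few size classes; the fractions are hypothetical stand-ins for measured triplet-style data.

```python
import numpy as np

# Hypothetical coarse PSD description: mass fractions over four grain-size
# classes (e.g., clay/silt/sand/gravel). Illustrative values only, not from
# the paper's 70-soil dataset.
mass_fractions = np.array([0.18, 0.32, 0.38, 0.12])

p = mass_fractions / mass_fractions.sum()
H = -np.sum(p * np.log(p))   # Shannon information entropy of the coarse data (nats)
H_max = np.log(len(p))       # entropy of a uniform spread over the classes
print(f"H = {H:.3f} of {H_max:.3f} possible nats ({H / H_max:.1%})")
```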

24 pages, 1071 KiB  
Article
Algorithmics, Possibilities and Limits of Ordinal Pattern Based Entropies
by Albert B. Piek, Inga Stolz and Karsten Keller
Entropy 2019, 21(6), 547; https://doi.org/10.3390/e21060547 - 29 May 2019
Cited by 15 | Viewed by 4983
Abstract
The study of nonlinear and possibly chaotic time-dependent systems involves long-term data acquisition or high sample rates. The resulting big data is valuable in order to provide useful insights into long-term dynamics. However, efficient and robust algorithms are required that can analyze long time series without decomposing the data into smaller series. Here symbolic-based analysis techniques that regard the dependence of data points are of some special interest. Such techniques are often prone to capacity or, on the contrary, to undersampling problems if the chosen parameters are too large. In this paper we present and apply algorithms of the relatively new ordinal symbolic approach. These algorithms use overlapping information and binary number representation, whilst being fast in the sense of algorithmic complexity, and allow, to the best of our knowledge, larger parameters than comparable methods currently used. We exploit the achieved large parameter range to investigate the limits of entropy measures based on ordinal symbolics. Moreover, we discuss data simulations from this viewpoint. Full article
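
As a rough illustration of ordinal-pattern encoding, here is a sketch in which a Lehmer-code integer encoding stands in for the paper's faster overlapping binary-representation scheme; function names and parameter choices are mine.

```python
import numpy as np
from math import factorial

def ordinal_pattern_codes(x, m, tau=1):
    """Encode each overlapping window of length m into an integer pattern code
    via a Lehmer-style mixed-radix scheme (an assumption; the paper's encoding
    reuses overlapping information and is considerably faster)."""
    x = np.asarray(x)
    n = len(x) - (m - 1) * tau
    codes = np.empty(n, dtype=np.int64)
    for i in range(n):
        w = x[i:i + m * tau:tau]
        code = 0
        for j in range(m):
            # digit: number of later window entries smaller than w[j]
            code = code * (m - j) + int(np.sum(w[j + 1:] < w[j]))
        codes[i] = code
    return codes

def permutation_entropy(x, m, tau=1):
    """Normalized permutation entropy from the empirical pattern frequencies."""
    codes = ordinal_pattern_codes(x, m, tau)
    _, counts = np.unique(codes, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p)) / np.log2(factorial(m))

rng = np.random.default_rng(0)
print(permutation_entropy(rng.standard_normal(10_000), m=5))  # near 1 for noise
```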

20 pages, 2893 KiB  
Article
A Novel Hybrid Meta-Heuristic Algorithm Based on the Cross-Entropy Method and Firefly Algorithm for Global Optimization
by Guocheng Li, Pei Liu, Chengyi Le and Benda Zhou
Entropy 2019, 21(5), 494; https://doi.org/10.3390/e21050494 - 14 May 2019
Cited by 25 | Viewed by 4611
Abstract
Global optimization, especially on a large scale, is challenging to solve due to its nonlinearity and multimodality. In this paper, in order to enhance the global searching ability of the bionics-inspired firefly algorithm (FA), a novel hybrid meta-heuristic algorithm is proposed by embedding the cross-entropy (CE) method into the firefly algorithm. With adaptive smoothing and co-evolution, the proposed method fully absorbs the ergodicity, adaptability and robustness of the cross-entropy method. The new hybrid algorithm achieves an effective balance between exploration and exploitation to avoid falling into a local optimum, enhance its global searching ability, and improve its convergence rate. The results of numerical experiments show that the new hybrid algorithm possesses more powerful global search capacity, higher optimization precision, and stronger robustness. Full article
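
For orientation, here is a minimal sketch of the cross-entropy ingredient alone, with smoothed parameter updates; the firefly coupling and co-evolution of the proposed hybrid are omitted, and all parameter values are illustrative.

```python
import numpy as np

def cross_entropy_minimize(f, dim, iters=100, pop=200, elite_frac=0.1,
                           sigma0=3.0, smooth=0.7, seed=0):
    """Minimal cross-entropy (CE) method: sample from a Gaussian, keep the
    elite samples, refit the Gaussian with smoothing to keep exploring."""
    rng = np.random.default_rng(seed)
    mu, sigma = np.zeros(dim), np.full(dim, sigma0)
    n_elite = max(1, int(pop * elite_frac))
    for _ in range(iters):
        samples = rng.normal(mu, sigma, size=(pop, dim))
        scores = np.apply_along_axis(f, 1, samples)
        elite = samples[np.argsort(scores)[:n_elite]]
        # smoothed update of the sampling distribution
        mu = smooth * elite.mean(axis=0) + (1 - smooth) * mu
        sigma = smooth * elite.std(axis=0) + (1 - smooth) * sigma
    return mu, f(mu)

# Rastrigin function: a standard multimodal global-optimization benchmark.
rastrigin = lambda x: 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
x_best, f_best = cross_entropy_minimize(rastrigin, dim=10)
print(x_best.round(3), f_best)
```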

25 pages, 1207 KiB  
Article
Embedded Dimension and Time Series Length. Practical Influence on Permutation Entropy and Its Applications
by David Cuesta-Frau, Juan Pablo Murillo-Escobar, Diana Alexandra Orrego and Edilson Delgado-Trejos
Entropy 2019, 21(4), 385; https://doi.org/10.3390/e21040385 - 10 Apr 2019
Cited by 35 | Viewed by 4657
Abstract
Permutation Entropy (PE) is a time series complexity measure commonly used in a variety of contexts, with medicine being the prime example. In its general form, it requires three input parameters for its calculation: time series length N, embedded dimension m, and embedded delay τ. Inappropriate choices of these parameters may potentially lead to incorrect interpretations. However, there are no specific guidelines for an optimal selection of N, m, or τ, only general recommendations such as N ≫ m!, τ = 1, or m = 3, …, 7. This paper deals specifically with the practical implications of N ≫ m!, since long time series are often not available, or non-stationary, and other preliminary results suggest that low N values do not necessarily invalidate the usefulness of PE. Our study analyses the variation of PE as a function of the series length N and embedded dimension m on a diverse experimental set, both synthetic (random, spike, or logistic-model time series) and real-world (climatological, seismic, financial, or biomedical time series), as well as the classification performance achieved with varying N and m. The results indicate that lengths shorter than those suggested by N ≫ m! are sufficient for a stable PE calculation, and that even very short time series can be robustly classified based on PE measurements before the stability point is reached. This may be due to the fact that there are forbidden patterns in chaotic time series, that not all patterns are equally informative, and that differences among classes are already apparent at very short lengths. Full article
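
A compact sketch of the stability question in code: compute normalized PE for growing prefixes of a series and watch how quickly the estimate settles relative to N ≫ m!. The implementation details are mine, not the paper's.

```python
import numpy as np
from math import factorial

def perm_entropy(x, m=5, tau=1):
    """Plain normalized permutation entropy via argsort over windows."""
    x = np.asarray(x)
    n = len(x) - (m - 1) * tau
    # each overlapping window's ordinal pattern as a tuple of ranks
    pats = np.array([tuple(np.argsort(x[i:i + m * tau:tau])) for i in range(n)])
    _, counts = np.unique(pats, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum() / np.log2(factorial(m))

# How quickly does PE stabilize with series length N, for m = 5 (m! = 120)?
rng = np.random.default_rng(1)
series = rng.standard_normal(50_000)
for N in (120, 500, 2_000, 10_000, 50_000):   # N >> m! holds only at the end
    print(N, round(perm_entropy(series[:N]), 4))
```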

19 pages, 903 KiB  
Article
A Novel Belief Entropy for Measuring Uncertainty in Dempster-Shafer Evidence Theory Framework Based on Plausibility Transformation and Weighted Hartley Entropy
by Qian Pan, Deyun Zhou, Yongchuan Tang, Xiaoyang Li and Jichuan Huang
Entropy 2019, 21(2), 163; https://doi.org/10.3390/e21020163 - 10 Feb 2019
Cited by 30 | Viewed by 3790
Abstract
Dempster-Shafer evidence theory (DST) has shown great advantages in tackling uncertainty in a wide variety of applications. However, how to quantify the information-based uncertainty of a basic probability assignment (BPA) with belief entropy in the DST framework is still an open issue. The main work of this study is to define a new belief entropy for measuring the uncertainty of a BPA. The proposed belief entropy has two components. The first component is based on the summation of the probability mass function (PMF) of the single events contained in each BPA, obtained using the plausibility transformation. The second component is the same as the weighted Hartley entropy. The two components effectively measure the discord uncertainty and the non-specificity uncertainty found in the DST framework, respectively. The proposed belief entropy is proved to satisfy the majority of the desired properties for an uncertainty measure in the DST framework. In addition, when the BPA is a probability distribution, the proposed method degrades to Shannon entropy. The feasibility and superiority of the new belief entropy are verified by the results of numerical experiments. Full article
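
One possible reading of the two-component construction, sketched in Python; this is an assumption based on the abstract, not necessarily the paper's exact formula: a discord term computed from the plausibility-transformed PMF plus the weighted Hartley non-specificity term.

```python
import numpy as np

def plausibility_transform(bpa, frame):
    """Plausibility transformation: Pl_P(x) ∝ total mass of focal sets containing x."""
    pl = np.array([sum(m for A, m in bpa.items() if x in A) for x in frame])
    return pl / pl.sum()

def belief_entropy(bpa, frame):
    """Hedged sketch of a two-component belief entropy: Shannon-type discord of
    the plausibility-transformed PMF + weighted Hartley non-specificity."""
    p = plausibility_transform(bpa, frame)
    p = p[p > 0]
    discord = -np.sum(p * np.log2(p))
    nonspecificity = sum(m * np.log2(len(A)) for A, m in bpa.items())
    return discord + nonspecificity

frame = ("a", "b", "c")
bpa = {frozenset("a"): 0.4, frozenset("ab"): 0.3, frozenset("abc"): 0.3}
print(belief_entropy(bpa, frame))
# With singleton focal elements only, the BPA is an ordinary probability
# distribution, the Hartley term vanishes, and the value is Shannon entropy,
# matching the degradation property stated in the abstract.
```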

15 pages, 1866 KiB  
Article
Improving Entropy Estimates of Complex Network Topology for the Characterization of Coupling in Dynamical Systems
by Teddy Craciunescu, Andrea Murari and Michela Gelfusa
Entropy 2018, 20(11), 891; https://doi.org/10.3390/e20110891 - 20 Nov 2018
Cited by 8 | Viewed by 3942
Abstract
A new measure for characterizing the coupling of interconnected dynamical systems is proposed. The method is based on the representation of time series as weighted cross-visibility networks, with the weights introduced as the metric distance between connected nodes. The structure of the networks, which depends on the coupling strength, is quantified via the entropy of the weighted adjacency matrix. The method has been tested on several coupled model systems with different individual properties. The results show that the proposed measure is able to distinguish the degree of coupling of the studied dynamical systems. The original use of the geodesic distance on Gaussian manifolds as a metric distance, which takes into account the noise inherently superimposed on experimental data, provides significantly better results in the calculation of the entropy, improving the reliability of the coupling estimates. The application to the interaction between the El Niño Southern Oscillation (ENSO) and the Indian Ocean Dipole, and to the influence of ENSO on influenza pandemic occurrence, illustrates the potential of the method for real-life problems. Full article
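
A toy single-series analogue of the approach: the paper builds cross-visibility networks between two series and uses a geodesic distance on Gaussian manifolds, so the plain visibility graph and Euclidean edge weight below are simplified stand-ins.

```python
import numpy as np

def visibility_edges(x):
    """Natural visibility graph (Lacasa et al.): nodes i and j are linked if
    the straight line between (i, x[i]) and (j, x[j]) passes above every
    sample in between. Brute force for clarity."""
    n, edges = len(x), []
    for i in range(n):
        for j in range(i + 1, n):
            k = np.arange(i + 1, j)   # empty for adjacent nodes -> always linked
            if np.all(x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i)):
                edges.append((i, j))
    return edges

def weighted_graph_entropy(x):
    """Shannon entropy of normalized edge weights, the weight here being the
    Euclidean distance between linked nodes (a stand-in for the paper's
    noise-aware geodesic metric)."""
    w = np.array([np.hypot(j - i, x[j] - x[i]) for i, j in visibility_edges(x)])
    p = w / w.sum()
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(2)
print(weighted_graph_entropy(rng.standard_normal(300)))
```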

14 pages, 2484 KiB  
Article
Cross-Sectoral Information Transfer in the Chinese Stock Market around Its Crash in 2015
by Xudong Wang and Xiaofeng Hui
Entropy 2018, 20(9), 663; https://doi.org/10.3390/e20090663 - 3 Sep 2018
Cited by 16 | Viewed by 4237
Abstract
This paper applies effective transfer entropy to study the information transfer in the Chinese stock market around its crash in 2015. According to the market states, the entire period is divided into four sub-phases: the tranquil, bull, crash, and post-crash periods. Kernel density estimation is used to calculate the effective transfer entropy, and the information transfer network is then constructed. The nodes' centralities and the directed maximum spanning trees of the networks are analyzed. The results show that, in the tranquil period, information transfer in the market is weak. In the bull period, the strength and scope of the information transfer increase. The utility sector outputs a great deal of information and is the hub node for the information flow. In the crash period, the information transfer grows further. Market efficiency in this period is worse than in the other three sub-periods. The information technology sector is the biggest information source, while the consumer staples sector receives the most information. The interactions of the sectors become more direct. In the post-crash period, information transfer declines but remains stronger than in the tranquil period. The financial sector receives the largest amount of information and is the pivot node. Full article
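
A minimal sketch of effective transfer entropy, with histogram binning in place of the paper's kernel density estimation: the raw estimate minus the mean estimate over source-shuffled surrogates, which removes the finite-sample bias.

```python
import numpy as np

def transfer_entropy(x, y, bins=8):
    """Binned transfer entropy T_{X->Y} with one lag, via the identity
    TE = H(y_t, y_{t-1}) + H(y_{t-1}, x_{t-1}) - H(y_t, y_{t-1}, x_{t-1}) - H(y_{t-1})."""
    xd = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    yd = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    yt, yp, xp = yd[1:], yd[:-1], xd[:-1]

    def H(*cols):  # joint Shannon entropy of the given discrete columns
        _, c = np.unique(np.column_stack(cols), axis=0, return_counts=True)
        p = c / c.sum()
        return -np.sum(p * np.log(p))

    return H(yt, yp) + H(yp, xp) - H(yt, yp, xp) - H(yp)

def effective_transfer_entropy(x, y, n_shuffles=20, seed=0):
    """Effective TE: subtract the average TE obtained after shuffling the source."""
    rng = np.random.default_rng(seed)
    bias = np.mean([transfer_entropy(rng.permutation(x), y) for _ in range(n_shuffles)])
    return transfer_entropy(x, y) - bias

rng = np.random.default_rng(3)
x = rng.standard_normal(5_000)
y = np.roll(x, 1) + 0.5 * rng.standard_normal(5_000)  # y is driven by lagged x
print(effective_transfer_entropy(x, y), effective_transfer_entropy(y, x))
```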

20 pages, 890 KiB  
Article
Analog Circuit Fault Diagnosis via Joint Cross-Wavelet Singular Entropy and Parametric t-SNE
by Wei He, Yigang He, Bing Li and Chaolong Zhang
Entropy 2018, 20(8), 604; https://doi.org/10.3390/e20080604 - 14 Aug 2018
Cited by 26 | Viewed by 4299
Abstract
In this paper, a novel method combining a cross-wavelet singular entropy (XWSE)-based feature extractor with a support vector machine (SVM) is proposed for analog circuit fault diagnosis. First, the cross-wavelet transform (XWT), which has a good capability to suppress environmental noise, is applied to transform the fault signal into a time-frequency spectrum (TFS). A simple segmentation method is then used to decompose the TFS into several blocks. We employ singular value decomposition (SVD) to analyze the blocks, and the Tsallis entropy of each block is computed to construct the original features. Subsequently, the features are fed into parametric t-distributed stochastic neighbor embedding (t-SNE) for dimension reduction, yielding discriminative and concise fault characteristics. Finally, the fault characteristics are entered into the SVM classifier to locate circuit defects, with the free parameters of the SVM determined by quantum-behaved particle swarm optimization (QPSO). Simulation results show that the proposed approach achieves superior diagnostic performance compared with other existing methods. Full article
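
A sketch of only the middle of the pipeline, under the assumption that the entropy step is a Tsallis entropy of the normalized singular values of each block; the XWT, t-SNE, and SVM/QPSO stages are omitted, and the spectrum is random stand-in data.

```python
import numpy as np

def tsallis_entropy(p, q=2.0):
    """Tsallis entropy of a probability vector (q != 1)."""
    return (1.0 - np.sum(p**q)) / (q - 1.0)

def block_singular_entropy(block, q=2.0):
    """SVD the time-frequency block, normalize the singular values,
    then apply Tsallis entropy to get one feature per block."""
    s = np.linalg.svd(block, compute_uv=False)
    return tsallis_entropy(s / s.sum(), q)

# Hypothetical 64x64 time-frequency spectrum split into a 4x4 grid of blocks.
rng = np.random.default_rng(4)
tfs = np.abs(rng.standard_normal((64, 64)))
features = [block_singular_entropy(b)
            for row in np.split(tfs, 4, axis=0)
            for b in np.split(row, 4, axis=1)]
print(np.round(features, 3))
```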

17 pages, 2271 KiB  
Article
Sample Entropy of Human Gait Center of Pressure Displacement: A Systematic Methodological Analysis
by Samira Ahmadi, Nariman Sepehri, Christine Wu and Tony Szturm
Entropy 2018, 20(8), 579; https://doi.org/10.3390/e20080579 - 6 Aug 2018
Cited by 37 | Viewed by 6134
Abstract
Sample entropy (SampEn) has been used to quantify the regularity or predictability of human gait signals. There are studies on the appropriate use of this measure for inter-stride spatio-temporal gait variables. However, the sensitivity of this measure to the preprocessing of the signal and to varying values of the template size (m), the tolerance size (r), and the sampling rate has not been studied when it is applied to "whole" gait signals, i.e., the entire time series obtained from force or inertial sensors. This study systematically investigates the sensitivity of the SampEn of the center of pressure displacement in the mediolateral direction (ML COP-D) to varying parameter values and two preprocessing methods: filtering out the high-frequency components and resampling the signals to have the same average number of data points per stride. The discriminatory ability of SampEn is studied by comparing treadmill walk only (WO) to a dual-task (DT) condition. The results suggest that SampEn maintains the directional difference between the two walking conditions across varying parameter values, showing a significant increase from the WO to the DT condition, especially when the signals are low-pass filtered. Moreover, when gait speed differs between test conditions, signals should be low-pass filtered and resampled to have the same average number of data points per stride. Full article
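
For reference, a brute-force SampEn sketch that makes the roles of the template size m and the tolerance r explicit (tolerance taken relative to the signal's standard deviation, a common convention):

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r): negative log of the conditional probability that sequences
    matching for m points (within r * std, Chebyshev distance) also match for
    m + 1 points. Self-matches are excluded; O(n^2) for clarity."""
    x = np.asarray(x, float)
    tol = r * x.std()

    def count_matches(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=-1)
        n = len(templ)
        return (np.sum(d <= tol) - n) / 2  # drop self-matches, count pairs once

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B)

rng = np.random.default_rng(5)
noise = rng.standard_normal(1_000)
sine = np.sin(np.linspace(0, 40 * np.pi, 1_000))
print(sample_entropy(noise), sample_entropy(sine))  # regular signal -> lower SampEn
```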

20 pages, 2361 KiB  
Article
Assessing Information Transmission in Data Transformations with the Channel Multivariate Entropy Triangle
by Francisco J. Valverde-Albacete and Carmen Peláez-Moreno
Entropy 2018, 20(7), 498; https://doi.org/10.3390/e20070498 - 27 Jun 2018
Cited by 3 | Viewed by 4179
Abstract
Data transformation, e.g., feature transformation and selection, is an integral part of any machine learning procedure. In this paper, we introduce an information-theoretic model and tools to assess the quality of data transformations in machine learning tasks. In an unsupervised fashion, we analyze the transformation of a discrete, multivariate source of information X̄ into a discrete, multivariate sink of information Ȳ related by a distribution P_X̄Ȳ. The first contribution is a decomposition of the maximal potential entropy of (X̄, Ȳ), which we call a balance equation, into its (a) non-transferable, (b) transferable but not transferred, and (c) transferred parts. Such balance equations can be represented in (de Finetti) entropy diagrams, our second set of contributions. The most important of these, the aggregate channel multivariate entropy triangle, is a visual exploratory tool to assess the effectiveness of multivariate data transformations in transferring information from input to output variables. We also show how these decompositions and balance equations apply to the entropies of X̄ and Ȳ, respectively, and generate entropy triangles for them. As an example, we present the application of these tools to assessing the information transfer efficiency of Principal Component Analysis and Independent Component Analysis as unsupervised feature transformation and selection procedures in supervised classification tasks. Full article
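
A bivariate sketch of the balance-equation idea (the paper works with multivariate X̄ and Ȳ): the maximal potential entropy under uniform marginals splits into a non-transferable part (divergence from uniformity), a transferable-but-not-transferred part (variation of information), and the transferred part (twice the mutual information).

```python
import numpy as np

def entropy_balance(pxy):
    """Decompose H_Ux + H_Uy into dH + VI + 2*MI for a joint distribution pxy
    (rows: X, columns: Y). A bivariate reading of the balance equation."""
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(1), pxy.sum(0)
    H = lambda p: -np.sum(p[p > 0] * np.log2(p[p > 0]))
    Hx, Hy, Hxy = H(px), H(py), H(pxy.ravel())
    mi = Hx + Hy - Hxy                 # transferred information
    vi = Hxy - mi                      # H(X|Y) + H(Y|X): transferable, not transferred
    H_max = np.log2(pxy.shape[0]) + np.log2(pxy.shape[1])
    dH = H_max - Hx - Hy               # non-transferable: divergence from uniformity
    return dH, vi, 2 * mi, H_max

pxy = np.array([[0.30, 0.05, 0.05],
                [0.05, 0.30, 0.05],
                [0.05, 0.05, 0.10]])
dH, vi, mi2, H_max = entropy_balance(pxy)
print(dH + vi + mi2, H_max)  # the three parts sum to the maximal potential entropy
```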

19 pages, 5725 KiB  
Article
Vague Entropy Measure for Complex Vague Soft Sets
by Ganeshsree Selvachandran, Harish Garg and Shio Gai Quek
Entropy 2018, 20(6), 403; https://doi.org/10.3390/e20060403 - 24 May 2018
Cited by 36 | Viewed by 3739
Abstract
The complex vague soft set (CVSS) model is a hybrid of complex fuzzy sets and soft sets that is able to accurately represent and model two-dimensional information for real-life phenomena that are periodic in nature. In existing studies of fuzzy sets and their extensions, the uncertainties present in the data are handled with the help of a membership degree that is a subset of the real numbers. In the present work, this condition is relaxed to degrees whose ranges lie in the unit disc of the complex plane, which can handle such information in a better way. Under this environment, we develop some entropy measures for the CVSS model induced by the axiomatic definition of distance measure. Some desirable relations between them are also investigated. A numerical example related to the detection of an image by a robot is given to illustrate the proposed entropy measures. Full article

16 pages, 1309 KiB  
Article
Identification of Auditory Object-Specific Attention from Single-Trial Electroencephalogram Signals via Entropy Measures and Machine Learning
by Yun Lu, Mingjiang Wang, Qiquan Zhang and Yufei Han
Entropy 2018, 20(5), 386; https://doi.org/10.3390/e20050386 - 21 May 2018
Cited by 12 | Viewed by 4384
Abstract
Existing research has revealed that auditory attention can be tracked from ongoing electroencephalography (EEG) signals. The aim of this study was to investigate the identification of people's attention to a specific auditory object from single-trial EEG signals via entropy measures and machine learning. Approximate entropy (ApEn), sample entropy (SampEn), composite multiscale entropy (CmpMSE) and fuzzy entropy (FuzzyEn) were used to extract informative features from EEG signals under three kinds of auditory object-specific attention (Rest, Auditory Object 1 Attention (AOA1) and Auditory Object 2 Attention (AOA2)). Linear discriminant analysis and a support vector machine (SVM) were used to construct two auditory attention classifiers. The statistical results of the entropy measures indicated significant differences in the values of ApEn, SampEn, CmpMSE and FuzzyEn between Rest, AOA1 and AOA2. For the SVM-based auditory attention classifier, the auditory object-specific attention states Rest, AOA1 and AOA2 could be identified from EEG signals using ApEn, SampEn, CmpMSE and FuzzyEn as features, and the identification rates were significantly different from chance level. The best identification was achieved by the SVM-based classifier using CmpMSE with the scale factor τ = 10. This study demonstrates a novel solution for identifying auditory object-specific attention from single-trial EEG signals without the need to access the auditory stimulus. Full article
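
A sketch of the CmpMSE feature at a single scale, following the usual composite coarse-graining recipe of averaging SampEn over all starting offsets; the signal is synthetic stand-in data, the tolerance convention is one common choice, and τ = 10 echoes the paper's optimal scale.

```python
import numpy as np

def sampen(x, m=2, r=0.2):
    """Plain sample entropy (brute force; tolerance relative to this series' std)."""
    x = np.asarray(x, float)
    tol = r * x.std()
    def matches(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(t[:, None] - t[None, :]), axis=-1)
        return (np.sum(d <= tol) - len(t)) / 2
    return -np.log(matches(m + 1) / matches(m))

def composite_mse(x, scale, m=2, r=0.2):
    """Composite multiscale entropy at one scale: average SampEn over all
    `scale` coarse-grained series, one per starting offset."""
    x = np.asarray(x, float)
    vals = []
    for k in range(scale):
        n = (len(x) - k) // scale
        cg = x[k:k + n * scale].reshape(n, scale).mean(axis=1)
        vals.append(sampen(cg, m, r))
    return np.mean(vals)

rng = np.random.default_rng(6)
eeg_like = rng.standard_normal(3_000)      # stand-in for a single-trial EEG channel
print(composite_mse(eeg_like, scale=10))   # scale factor 10, as in the paper
```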

14 pages, 2781 KiB  
Article
The Relationship between Postural Stability and Lower-Limb Muscle Activity Using an Entropy-Based Similarity Index
by Chien-Chih Wang, Bernard C. Jiang and Pei-Min Huang
Entropy 2018, 20(5), 320; https://doi.org/10.3390/e20050320 - 26 Apr 2018
Cited by 6 | Viewed by 4647
Abstract
The aim of this study is to determine whether centre of pressure (COP) measurements of postural stability can be used to represent electromyography (EMG) measurements of lower-limb muscle activity. If so, the cost-effective COP measurements could be used to indicate both the level of postural stability and lower-limb muscle activity. The Hilbert–Huang Transform method was used to analyse the data from an experiment designed to examine the correlation between lower-limb muscles and postural stability. We randomly selected 24 university students to participate in eight scenarios and simultaneously measured their COP and EMG signals during the experiments. Empirical Mode Decomposition was used to identify the intrinsic mode functions (IMFs) that can distinguish between the COP and EMG at different states. Subsequently, similarity indices and synchronization analyses were used to calculate the correlation between lower-limb muscle strength and postural stability. The IMF5 of the COP signals and the IMF6 of the EMG signals were not significantly different, and the average frequency was 0.8 Hz, within a range of 0–2 Hz. When postural stability was poor, the COP and EMG showed high synchronization, with index values in the range of 0.010–0.015. With good postural stability, the synchronization indices were between 0.006 and 0.080, and both exhibited low synchronization. The COP signals and the low-frequency EMG signals were highly correlated. In conclusion, we demonstrated that the COP may provide enough information on postural stability without the EMG data. Full article
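
A minimal EMD sketch on a synthetic COP-like signal, assuming the third-party PyEMD package (PyPI package EMD-signal); the zero-crossing frequency estimate is a crude stand-in for the paper's Hilbert-based analysis, and the signal components are illustrative.

```python
import numpy as np
from PyEMD import EMD  # assumes: pip install EMD-signal

# Synthetic stand-in for a mediolateral COP record: a slow postural-sway
# component near 0.8 Hz (cf. the IMF band reported above), a faster
# component, and noise. Illustrative only.
fs = 100.0
t = np.arange(0, 30, 1 / fs)
cop = (0.8 * np.sin(2 * np.pi * 0.8 * t)
       + 0.3 * np.sin(2 * np.pi * 4.0 * t)
       + 0.1 * np.random.default_rng(7).standard_normal(len(t)))

imfs = EMD().emd(cop, t)  # Empirical Mode Decomposition into IMFs

# Estimate each IMF's dominant frequency from its zero-crossing rate,
# to pick out the low-frequency (0-2 Hz) modes discussed in the abstract.
for i, imf in enumerate(imfs, start=1):
    zc = np.sum(np.diff(np.sign(imf)) != 0)
    print(f"IMF{i}: ~{zc / 2 / t[-1]:.2f} Hz")
```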

Other


14 pages, 1717 KiB  
Concept Paper
Learning Entropy as a Learning-Based Information Concept
by Ivo Bukovsky, Witold Kinsner and Noriyasu Homma
Entropy 2019, 21(2), 166; https://doi.org/10.3390/e21020166 - 11 Feb 2019
Cited by 11 | Viewed by 4469
Abstract
Recently, a novel concept for a non-probabilistic novelty detection measure, based on a multi-scale quantification of unusually large learning efforts of machine learning systems, was introduced as learning entropy (LE). The key finding with LE is that the learning effort of a learning system is quantifiable as a novelty measure for each individually observed data point of an otherwise complex dynamic system, while model accuracy is not a necessary requirement for novelty detection. This brief paper extends the explanation of LE from an informatics approach towards a cognitive (learning-based) information measure, emphasizing the distinction from Shannon's concept of probabilistic information. Fundamental derivations of learning entropy and of its practical estimations are recalled and further extended. The potentials, limitations and, thus, the current challenges of LE are discussed. Full article
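
A sketch of the LE idea under my reading of the definition (details vary across the LE literature): run an online LMS predictor, record its weight increments, and flag steps where unusually many weights receive unusually large updates across several detection sensitivities α.

```python
import numpy as np

def learning_entropy(dw, alphas=(2, 3, 4, 5), window=100):
    """At each step, the fraction of weights whose update magnitude exceeds
    alpha times its recent average, averaged over the sensitivities alpha."""
    n_steps, _ = dw.shape
    le = np.zeros(n_steps)
    for k in range(window, n_steps):
        recent = np.abs(dw[k - window:k]).mean(axis=0) + 1e-12
        ratio = np.abs(dw[k]) / recent
        le[k] = np.mean([(ratio > a).mean() for a in alphas])
    return le

# Online LMS prediction of a signal with a regime switch at t = 1500; the
# switch forces unusually large weight increments, which LE should flag.
rng = np.random.default_rng(8)
x = np.sin(0.3 * np.arange(3_000)) + 0.05 * rng.standard_normal(3_000)
x[1_500:] += np.sin(1.1 * np.arange(1_500))          # novelty: new dynamics
w, m, mu = np.zeros(8), 8, 0.01
dws = []
for k in range(m, len(x) - 1):
    u = x[k - m:k][::-1]                              # recent samples as input
    e = x[k + 1] - w @ u                              # one-step prediction error
    dw = mu * e * u                                   # LMS weight increment
    w += dw
    dws.append(dw)
le = learning_entropy(np.array(dws))
print(int(np.argmax(le)))  # expected to peak near the switch at step ~1500
```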
