Information Geometry for Data Analysis

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: 28 February 2025 | Viewed by 9414

Special Issue Editors


Dr. Stéphane Puechmorel
Guest Editor
Laboratoire de Mathématiques Appliquées, Ecole Nationale de l'Aviation Civile, 7 Av. Edouard Belin, 31400 Toulouse, France
Interests: information geometry; dual connections; gauge structures; foliations; differential geometry applied to machine learning

Dr. Florence Nicol
Guest Editor
ENAC Laboratory, Ecole Nationale de l'Aviation Civile (ENAC), 31055 Toulouse, France
Interests: information geometry; shape analysis; functional data statistics; neuroimaging; aircraft safety applications

Special Issue Information

Dear Colleagues,

Data encountered in real-world applications are often embedded in a high-dimensional space while lying intrinsically on a low-dimensional object. This is known as the “manifold hypothesis” in machine learning and is one of the keys to understanding why some algorithms seem to overcome the bias-variance dilemma. Furthermore, some kind of group invariance, e.g., translation or rotation invariance, is encountered most of the time, allowing for an effective description of the data. Taking these observations into account justifies the use of techniques from differential geometry and topology to better understand the data. Within this framework, the Fisher metric plays a special role and underlies the concept of a statistical model, viewed as a smooth Riemannian manifold endowed with this metric. Some recent works on neural network robustness have demonstrated the power of this representation.
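For readers less familiar with the terminology, the Fisher metric on a parametric statistical model {p_θ : θ ∈ Θ} is the Riemannian metric with components (a textbook definition, recalled here only for context):

g_{ij}(\theta) = \mathbb{E}_{p_\theta}\!\left[ \frac{\partial \log p_\theta(X)}{\partial \theta^i} \, \frac{\partial \log p_\theta(X)}{\partial \theta^j} \right], \qquad i, j = 1, \dots, \dim \Theta .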

This Special Issue aims to present recent results relating data analysis and machine learning to geometry and topology. Theoretical papers as well as application-oriented contributions are welcome.

Dr. Stéphane Puechmorel
Dr. Florence Nicol
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Fisher metric
  • dualistic structure
  • manifold learning
  • data manifold
  • statistical manifold
  • statistical model
  • persistent homology

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (6 papers)


Research

21 pages, 600 KiB  
Article
Polynomial Regression on Lie Groups and Application to SE(3)
by Johan Aubray and Florence Nicol
Entropy 2024, 26(10), 825; https://doi.org/10.3390/e26100825 - 27 Sep 2024
Viewed by 514
Abstract
In this paper, we address the problem of estimating the position of a moving body, such as a drone, from noisy position measurements using the framework of Lie groups. To model the motion of a rigid body, the relevant Lie group is the Special Euclidean group SE(n), with n = 2 or 3. Our work builds on a previously proposed parametric framework that derives equations for geodesic and polynomial regression on Riemannian manifolds; our goal was to implement this technique in the SE(3) setting. Given a set of noisy points in SE(3) representing measurements along the trajectory of a moving body, one seeks the geodesic that best fits those points in a Riemannian least-squares sense. Applications to simulated data illustrate the approach, and the limitations of the method and future perspectives are discussed.
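As a rough illustration of the kind of computation the abstract describes, here is a minimal Python sketch (not the authors' implementation) that fits a geodesic t -> G0 exp(t xi) on SE(3) to measured poses by least squares on the matrix logarithm of the residual. The function names, the unweighted Frobenius cost, and the generic optimizer are simplifying assumptions made for this example.

import numpy as np
from scipy.linalg import expm, logm
from scipy.optimize import minimize


def hat(x):
    """Map a 6-vector (omega, v) to the 4x4 matrix form of an element of se(3)."""
    wx, wy, wz, vx, vy, vz = x
    return np.array([[0.0, -wz,  wy, vx],
                     [ wz, 0.0, -wx, vy],
                     [-wy,  wx, 0.0, vz],
                     [0.0, 0.0, 0.0, 0.0]])


def geodesic(params, t):
    """Pose at time t on the candidate curve t -> G0 @ expm(t * hat(xi))."""
    u, xi = params[:6], params[6:]
    return expm(hat(u)) @ expm(t * hat(xi))


def cost(params, times, poses):
    """Sum of squared Frobenius norms of log(gamma(t_i)^{-1} P_i) over the data."""
    total = 0.0
    for t, P in zip(times, poses):
        residual = logm(np.linalg.inv(geodesic(params, t)) @ P)
        total += np.linalg.norm(residual, "fro") ** 2
    return total


def fit_geodesic(times, poses):
    """Least-squares geodesic fit; a generic optimizer stands in for the paper's scheme."""
    result = minimize(cost, x0=np.zeros(12), args=(times, poses), method="BFGS")
    return result.x


# Toy check: noiseless samples of a known geodesic should be recovered (approximately).
true_params = np.concatenate([np.zeros(6), np.array([0.0, 0.0, 0.3, 1.0, 0.0, 0.0])])
times = np.linspace(0.0, 1.0, 8)
poses = [geodesic(true_params, t) for t in times]
estimate = fit_geodesic(times, poses)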

18 pages, 372 KiB  
Article
Adversarial Robustness with Partial Isometry
by Loïc Shi-Garrier, Nidhal Carla Bouaynaya and Daniel Delahaye
Entropy 2024, 26(2), 103; https://doi.org/10.3390/e26020103 - 24 Jan 2024
Cited by 1 | Viewed by 1240
Abstract
Despite their remarkable performance, deep learning models still lack robustness guarantees, particularly in the presence of adversarial examples. This significant vulnerability raises concerns about their trustworthiness and hinders their deployment in critical domains that require certified levels of robustness. In this paper, we introduce an information-geometric framework to establish precise robustness criteria for l2 white-box attacks in a multi-class classification setting. We endow the output space with the Fisher information metric and derive criteria on the input–output Jacobian to ensure robustness. We show that model robustness can be achieved by constraining the model to be partially isometric around the training points. We evaluate our approach on the MNIST and CIFAR-10 datasets against adversarial attacks, showing substantial improvements over defensive distillation and Jacobian regularization for medium-sized perturbations and superior robustness compared to adversarial training for large perturbations, all while maintaining the desired accuracy.
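To make the geometric objects in the abstract concrete, the following numpy sketch (illustrative only, not the paper's code or its exact criterion) computes the pullback of the Fisher information metric of the categorical output distribution through the input–output Jacobian of a toy linear-softmax classifier. The model, the variable names, and the spectral summary at the end are assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)
n_in, n_cls = 4, 3
W = rng.normal(size=(n_cls, n_in))   # weights of the toy linear layer (placeholder model)
b = rng.normal(size=n_cls)


def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()


def pullback_fisher(x):
    """J^T diag(1/p) J, with J = dp/dx, for the linear-softmax model p = softmax(Wx + b)."""
    p = softmax(W @ x + b)
    dp_dz = np.diag(p) - np.outer(p, p)   # Jacobian of softmax with respect to the logits
    J = dp_dz @ W                         # chain rule: dp/dx
    return J.T @ np.diag(1.0 / p) @ J


x = rng.normal(size=n_in)
M = pullback_fisher(x)
# The largest eigenvalue bounds, to first order, how fast the output distribution can
# move in Fisher-Rao terms under a unit l2 perturbation of the input.
print(np.linalg.eigvalsh(M).max())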

16 pages, 429 KiB  
Article
Pullback Bundles and the Geometry of Learning
by Stéphane Puechmorel
Entropy 2023, 25(10), 1450; https://doi.org/10.3390/e25101450 - 15 Oct 2023
Cited by 1 | Viewed by 1415
Abstract
Explainable Artificial Intelligence (XAI) and acceptable artificial intelligence are active topics of research in machine learning. For critical applications, being able to prove, or at least to ensure with high probability, the correctness of algorithms is of the utmost importance. In practice, however, few theoretical tools are known that can be used for this purpose. Using the Fisher Information Metric (FIM) on the output space yields interesting indicators in both the input and parameter spaces, but the underlying geometry is not yet fully understood. In this work, an approach based on the pullback bundle, a well-known construction for describing bundle morphisms, is introduced and applied to the encoder–decoder block. Under a constant-rank hypothesis on the derivative of the network with respect to its inputs, a description of its behavior is obtained. Further generality is gained through the introduction of the pullback generalized bundle, which takes into account the sensitivity with respect to the weights.
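For context, the standard constructions behind the abstract are as follows (textbook definitions, not a summary of the paper's results): given a smooth map f : M → N and a vector bundle E → N, the pullback bundle f*E → M has fiber (f*E)_x = E_{f(x)}, and the pullback of a Riemannian metric h on N is

(f^* h)_x(u, v) = h_{f(x)}\big(\mathrm{d}f_x\, u, \; \mathrm{d}f_x\, v\big), \qquad u, v \in T_x M,

which is positive definite exactly where df_x is injective; when df has constant rank, the kernel of df is an integrable distribution, which is what allows a global description of the degenerate directions.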

16 pages, 316 KiB  
Article
Information-Geometric Approach for a One-Sided Truncated Exponential Family
by Masaki Yoshioka and Fuyuhiko Tanaka
Entropy 2023, 25(5), 769; https://doi.org/10.3390/e25050769 - 8 May 2023
Cited by 1 | Viewed by 1286
Abstract
In information geometry, there has been extensive research on the deep connections between differential geometric structures, such as the Fisher metric and the α-connection, and the statistical theory of statistical models satisfying regularity conditions. However, the information geometry of non-regular statistical models remains insufficiently studied, and the one-sided truncated exponential family (oTEF) is one example of such models. In this paper, based on the asymptotic properties of maximum likelihood estimators, we provide a Riemannian metric for the oTEF. Furthermore, we demonstrate that the oTEF has an α = 1 parallel prior distribution and that the scalar curvature of a certain submodel, including the Pareto family, is a negative constant.
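For context, a one-sided truncated exponential family can be written in the form (a sketch of the general shape of such models; the paper's precise definitions and the metric it derives are in the article)

p(x; \theta, \gamma) = a(x) \exp\!\Big( \sum_{i=1}^{k} \theta^i F_i(x) - \psi(\theta, \gamma) \Big), \qquad x \ge \gamma,

where the truncation parameter γ enters through the support, so the usual regularity condition that the support does not depend on the parameter fails. The Pareto family p(x; θ, γ) = θ γ^θ x^{-θ-1}, x ≥ γ, is the standard example.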
20 pages, 946 KiB  
Article
Modeling Categorical Variables by Mutual Information Decomposition
by Jiun-Wei Liou, Michelle Liou and Philip E. Cheng
Entropy 2023, 25(5), 750; https://doi.org/10.3390/e25050750 - 4 May 2023
Viewed by 1596
Abstract
This paper proposes mutual information (MI) decomposition as a novel approach to identifying indispensable variables and their interactions in contingency table analysis. The MI analysis identifies subsets of associated variables based on multinomial distributions and validates parsimonious log-linear and logistic models. The approach is assessed using two real-world datasets dealing with ischemic stroke (with 6 risk factors) and banking credit (with 21 discrete attributes in a sparse table). The paper also provides an empirical comparison of MI analysis with two state-of-the-art methods in terms of variable and model selection. The proposed MI analysis scheme can be used to construct parsimonious log-linear and logistic models with a concise interpretation of discrete multivariate data.
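The elementary identity underlying MI decomposition can be checked numerically; the following numpy sketch (a toy illustration, not the paper's selection scheme) verifies the chain rule I(X; Y, Z) = I(X; Z) + I(X; Y | Z) on a synthetic 2 x 3 x 2 contingency table.

import numpy as np


def mutual_information(joint):
    """I(A; B) in nats from a 2-D joint probability table."""
    pa = joint.sum(axis=1, keepdims=True)
    pb = joint.sum(axis=0, keepdims=True)
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / (pa @ pb)[mask])))


def conditional_mi(joint_xyz):
    """I(X; Y | Z) from a 3-D joint probability table with axes (X, Y, Z)."""
    pz = joint_xyz.sum(axis=(0, 1))
    cmi = 0.0
    for k in range(joint_xyz.shape[2]):
        if pz[k] > 0:
            cmi += pz[k] * mutual_information(joint_xyz[:, :, k] / pz[k])
    return cmi


# Synthetic 2 x 3 x 2 table of counts, normalized to a joint distribution.
counts = np.random.default_rng(1).integers(1, 30, size=(2, 3, 2)).astype(float)
pxyz = counts / counts.sum()

i_x_yz = mutual_information(pxyz.reshape(2, -1))   # I(X; (Y, Z))
i_x_z = mutual_information(pxyz.sum(axis=1))       # I(X; Z)
i_x_y_given_z = conditional_mi(pxyz)               # I(X; Y | Z)
assert np.isclose(i_x_yz, i_x_z + i_x_y_given_z)   # chain-rule decomposition holds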

24 pages, 5697 KiB  
Article
Systematic Classification of Curvature and Feature Descriptor of 3D Shape and Its Application to “Complexity” Quantification Methods
by Kazuma Matsuyama, Takahiro Shimizu and Takeo Kato
Entropy 2023, 25(4), 624; https://doi.org/10.3390/e25040624 - 6 Apr 2023
Cited by 5 | Viewed by 2085
Abstract
Generative design is a system that automates part of the design process, but it cannot evaluate psychological issues related to shapes, such as “beauty” and “liking”. Designers therefore evaluate and choose the generated shapes based on their experience. Among design features, “complexity” is considered to influence “aesthetic preference”. Although feature descriptors calculated from curvature can be used to quantify “complexity”, selection guidelines for the curvature and the feature descriptor have not been adequately discussed. Therefore, this study aimed to systematically classify curvatures and feature descriptors of 3D shapes and to apply the results to the quantification of “complexity”. First, we surveyed the literature on curvature and feature descriptors and conducted a systematic classification. To quantify “complexity”, we used five curvatures (Gaussian curvature, mean curvature, Casorati curvature, shape index, and curvature index) and a feature descriptor (entropy of occurrence probability) obtained from the classification and compared them with sensory evaluation values of “complexity”. The results showed that the coefficient of determination between the quantified and sensory evaluation values of “complexity” was highest when the mean curvature was used. In addition, the Casorati curvature tended to show the highest signal-to-noise ratio (i.e., a high coefficient of determination irrespective of the parameters set in the entropy calculation). These results will foster the development of generative design of 3D shapes using psychological evaluation.
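As a concrete illustration of the feature descriptor named in the abstract, the following numpy sketch computes the entropy of the occurrence probability of binned curvature values. The synthetic curvature samples, the bin count, and the fixed binning range are placeholder assumptions; the paper compares several curvatures and parameter settings.

import numpy as np


def curvature_entropy(curvatures, n_bins=32, value_range=(-3.0, 3.0)):
    """Shannon entropy (bits) of the occurrence probability of binned curvature values."""
    hist, _ = np.histogram(curvatures, bins=n_bins, range=value_range)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))


# Toy comparison: curvature concentrated around a single value (a sphere-like surface)
# should score lower "complexity" than curvature spread over many bins.
rng = np.random.default_rng(0)
nearly_constant = rng.normal(1.0, 0.01, size=5000)
widely_spread = rng.normal(0.0, 1.0, size=5000)
print(curvature_entropy(nearly_constant), curvature_entropy(widely_spread))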
