Robust Distance Metric Learning in the Framework of Statistical Information Theory
A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".
Deadline for manuscript submissions: closed (30 November 2023) | Viewed by 11,572
Special Issue Editors
Prof. Dr. Leandro Pardo
Interests: minimum divergence estimators: robustness and efficiency; robust test procedures based on minimum divergence estimators; robust test procedures in composite likelihood, empirical likelihood, change point, and time series
Prof. Dr. Pedro Miranda
Interests: decision making; fuzzy measures; convex polytopes; mathematical aspects of subfamilies of fuzzy measures; divergence measures in statistical inference
Special Issue Information
Dear Colleagues,
In the last 30 years, the use of suitable tools from statistical information theory (divergence measures and entropies) in inferential procedures has become very popular, not only in statistics but also in other areas, such as machine learning (ML), pattern recognition, and so on. Distance metric learning (DML) plays an important role in ML because it helps to understand and analyze the structure of the data: distances show whether groups of data points are close together or well separated. DML therefore provides a strong foundation for several machine learning algorithms, such as k-nearest neighbors for supervised learning and k-means clustering for unsupervised learning, and it has received great attention from many researchers in recent years.

Different distance metrics are chosen depending on the type of data and the problem considered. Although simple distances, e.g., the Euclidean distance, can be used to measure the similarity within a group of data, they usually cannot capture the statistical irregularities present in the data (such as contamination or outliers). Hence, the performance of these measures is rather poor even when the specified model assumptions are only slightly violated (due to noise in the data). This has motivated the use of entropies and divergence measures as an alternative way to measure distances between data sets, since these measures take the underlying data distribution into account. Consequently, the Kullback–Leibler divergence, the family of phi-divergence measures (including the Cressie–Read family), the Bregman divergences (including the density power divergence family) and other divergence measures, as well as different entropies, have led to alternative procedures for the classical problems considered in ML.
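For reference, and only as a sketch of the standard textbook definitions (not a prescription for the contributions to this Issue), the main families mentioned above can be written, for densities $f$ and $g$, as

\[
D_{\mathrm{KL}}(f \,\|\, g) = \int f(x)\,\log\frac{f(x)}{g(x)}\,dx,
\qquad
D_{\phi}(f, g) = \int g(x)\,\phi\!\left(\frac{f(x)}{g(x)}\right)dx,
\]

with $\phi$ convex and $\phi(1)=0$ (the Cressie–Read family corresponds to $\phi_{\lambda}(t) = \frac{t^{\lambda+1} - t - \lambda(t-1)}{\lambda(\lambda+1)}$), and the density power divergence

\[
d_{\alpha}(f, g) = \int \left\{ g^{1+\alpha}(x) - \left(1 + \tfrac{1}{\alpha}\right) g^{\alpha}(x)\,f(x) + \tfrac{1}{\alpha}\, f^{1+\alpha}(x) \right\} dx, \qquad \alpha > 0,
\]

which recovers the Kullback–Leibler divergence as $\alpha \to 0$ and trades statistical efficiency for robustness to outliers as $\alpha$ increases.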
The aim of this Special Issue is to present new and original research papers based on different families of entropy and divergence measures for solving, from either a theoretical or an applied point of view, different problems considered in ML, paying special attention to robustness. Problems to be considered include (but are not limited to):
- Dimensionality reduction: Informational Correlation Analysis (ICA), Canonical Correlation Analysis, Principal Component Analysis, ICA procedures for blind source separation, and so on.
- Clustering.
- Classification.
- Density-Ratio Estimation.
- Non-negative Matrix Factorization.
- Singular value decomposition: robust SVD, Active Learning, and so on.
Papers treating the case of high-dimensional data in ML problems in the framework of divergence measures are also welcome.
Finally, review articles emphasizing the most recent state of the art in the solution of ML problems based on divergence measures are also accepted.
Prof. Dr. Leandro Pardo
Prof. Dr. Pedro Miranda
Guest Editors
Manuscript Submission Information
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.
Benefits of Publishing in a Special Issue
- Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
- Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
- Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
- External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
- e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.
Further information on MDPI's Special Issue policies can be found here.