Robust Procedures for Estimating and Testing in the Framework of Divergence Measures
A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".
Deadline for manuscript submissions: closed (30 November 2020) | Viewed by 32439
Special Issue Editors
Interests: minimum divergence estimators: robustness and efficiency; robust test procedures based on minimum divergence estimators; robust test procedures in composite likelihood, empirical likelihood, change point, and time series
Interests: categorical data analysis; change points; empirical likelihood; generalized linear models; order restricted statistical inference; overdispersion models; regression diagnostics; robust statistics; small area estimation; statistical methods in public health for cancer surveillance
Special Issue Information
Dear Colleagues,
The approach to estimation and testing based on suitable divergence measures has become, over the last 30 years, a very popular technique, not only in statistics but also in other areas such as machine learning and pattern recognition. For the estimation problem, one minimizes a suitable divergence measure between the data and the model under consideration. Interesting examples of such estimators are the minimum phi-divergence estimators (MPHIE), in particular the minimum Hellinger distance estimators (MHDE), and the minimum density power divergence estimators (MDPDE). The MPHIE are characterized by asymptotic efficiency (BAN estimators); the MHDE by asymptotic efficiency and robustness within the family of MPHIE; and the MDPDE by their robustness without a significant loss of efficiency, as well as by the simplicity of computing them, since no nonparametric estimator of the true density function is required.
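As a minimal illustration of the minimum density power divergence idea in the one-parameter normal location model (a sketch only; the data, the tuning value a = 0.5, and the function names are ours, not from this call), the MDPDE minimizes the empirical objective H_n(theta) = integral of f_theta^{1+a} minus (1 + 1/a) times the sample mean of f_theta(X_i)^a, and for N(mu, 1) the integral term is a known constant in mu:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def dpd_objective(mu, x, a=0.5):
    """Empirical density power divergence objective for a N(mu, 1) model.
    For this model, integral f_mu^{1+a} dx = (2*pi)^(-a/2) / sqrt(1 + a),
    which does not depend on mu."""
    integral_term = (2 * np.pi) ** (-a / 2) / np.sqrt(1 + a)
    data_term = (1 + 1 / a) * np.mean(norm.pdf(x, loc=mu, scale=1.0) ** a)
    return integral_term - data_term

# Clean observations around 0 plus one gross outlier (illustrative data).
data = np.array([-0.4, 0.1, -0.2, 0.3, 0.0, 0.2, -0.1, 0.4, 10.0])

mdpde = minimize_scalar(dpd_objective, args=(data,),
                        bounds=(-5, 5), method="bounded").x
print(f"sample mean (MLE): {data.mean():.3f}")  # dragged toward the outlier
print(f"MDPDE, a = 0.5:    {mdpde:.3f}")        # stays near the bulk of the data
```

The sample mean (the maximum likelihood estimator, recovered from the divergence as a -> 0) is pulled toward the outlier, while the MDPDE downweights it, illustrating the robustness-with-little-efficiency-loss trade-off discussed above.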
Based on these minimum divergence (or distance) estimators, many authors have studied how to construct statistics for testing hypotheses. Two main approaches have been followed: (i) plugging the estimators into a divergence measure in order to obtain the estimated distance (divergence) between the model whose parameters have been estimated under the null hypothesis and the model evaluated over the whole parameter space; and (ii) extending the Wald test by using MDPDE instead of maximum likelihood estimators. These test statistics have been applied to many different statistical problems: censoring, equality of means in normal and lognormal models, the logistic regression model, multinomial regression in particular and GLM models in general, etc.
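A common formulation of approach (ii), sketched here under standard regularity conditions (the symbols below are generic and not taken from this call): for a composite null hypothesis $H_0\colon m(\theta) = 0$, with $m\colon \Theta \to \mathbb{R}^r$, the Wald-type test statistic based on the MDPDE $\widehat{\theta}_\beta$ takes the form

```latex
W_n = n\, m(\widehat{\theta}_\beta)^{\top}
      \left[ M(\widehat{\theta}_\beta)^{\top}\,
             \Sigma_\beta(\widehat{\theta}_\beta)\,
             M(\widehat{\theta}_\beta) \right]^{-1}
      m(\widehat{\theta}_\beta),
```

where $M(\theta) = \partial m(\theta)^{\top}/\partial \theta$ and $\Sigma_\beta(\theta)$ is the asymptotic covariance matrix of the MDPDE. Under $H_0$, $W_n$ is asymptotically chi-square with $r$ degrees of freedom; taking $\beta \to 0$ recovers the classical Wald test based on the maximum likelihood estimator.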
The scope of this Special Issue is to present new and original research papers based on MPHIE, MHDE, and MDPDE, as well as test statistics based on these estimators, from both theoretical and applied points of view in different statistical problems, with special emphasis on robustness. Manuscripts giving solutions to different statistical problems, such as model selection criteria based on divergence measures or statistics for high-dimensional data with divergence measures as loss function, are welcome. Reviews emphasizing the most recent state of the art in the solution of statistical problems based on divergence measures are also welcome.
Prof. Dr. Leandro Pardo
Prof. Dr. Nirian Martin
Guest Editors
Manuscript Submission Information
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.
Keywords
- divergence measures
- minimum divergence estimators
- test statistics based on minimum divergence estimators
- Wald-type tests based on minimum divergence estimators
- efficiency
- robustness
- model selection based on minimum distance estimators
Benefits of Publishing in a Special Issue
- Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
- Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
- Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
- External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
- e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.
Further information on MDPI's Special Issue policies can be found here.