Robust Negative Binomial Regression via the Kibria–Lukman Strategy: Methodology and Application
Abstract
1. Introduction
2. Methodology
2.1. Negative Binomial Regression Model
2.2. Shrinkage Estimators
2.3. Shrinkage-Robust Estimators
2.4. Theoretical Comparisons between Estimators
- (I) Ψ is finite.
- (II) ψjj is skew-symmetric and nondecreasing.
- (III) The errors have a zero mean and finite variance.
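Under assumptions (I)–(III), the Kibria–Lukman strategy applies the shrinkage transform β̂_KL = (S + kI)⁻¹(S − kI)β̂ to an initial estimate β̂, where S is the (weighted) information matrix and k > 0 is the shrinkage parameter. The following is a minimal numerical sketch of that transform only; the function name, the toy working weights, and the synthetic near-collinear design are illustrative and not taken from the paper:

```python
import numpy as np

def kibria_lukman(S, beta_hat, k):
    """Kibria-Lukman shrinkage: (S + kI)^{-1} (S - kI) beta_hat."""
    p = S.shape[0]
    I = np.eye(p)
    return np.linalg.solve(S + k * I, (S - k * I) @ beta_hat)

# Toy weighted information matrix and initial estimate (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
X[:, 2] = X[:, 1] + 0.01 * rng.normal(size=50)  # near-collinear predictors
W = np.diag(rng.uniform(0.5, 2.0, size=50))     # stand-in for NB working weights
S = X.T @ W @ X
beta_hat = np.array([1.0, 0.5, -0.5])
beta_kl = kibria_lukman(S, beta_hat, k=1.0)
```

Setting k = 0 recovers the initial estimate; in the robust variants studied here, β̂ would come from an M-type fit of the negative binomial model rather than maximum likelihood.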
3. Simulation Study
4. Application
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Estimated MSE values by sample size n under 1% and 8% outlier contamination (row labels were not recoverable from the source):

| 1% Outliers: n = 30 | n = 50 | n = 100 | n = 200 | 8% Outliers: n = 30 | n = 50 | n = 100 | n = 200 |
|---|---|---|---|---|---|---|---|
| 0.2433 | 0.1759 | 0.1407 | 0.1371 | 1.7302 | 1.7117 | 1.3297 | 1.2726 |
| 0.2226 | 0.1489 | 0.1322 | 0.1312 | 1.5284 | 1.5106 | 1.2843 | 1.2561 |
| 0.1543 | 0.1480 | 0.1320 | 0.0274 | 0.4017 | 0.3218 | 0.1868 | 0.0971 |
| 0.2054 | 0.1338 | 0.1301 | 0.1299 | 1.3438 | 1.2856 | 1.2401 | 1.2397 |
| 0.1323 | 0.1258 | 0.1227 | 0.0255 | 0.3364 | 0.3178 | 0.1623 | 0.0933 |
| 2.2999 | 0.5132 | 0.3606 | 0.1671 | 4.3435 | 2.4649 | 1.4263 | 1.3409 |
| 2.1946 | 0.4154 | 0.2872 | 0.1494 | 3.6460 | 2.2617 | 1.3372 | 1.3218 |
| 1.3213 | 0.3346 | 0.2128 | 0.0439 | 2.0874 | 1.4598 | 0.4295 | 0.2087 |
| 2.1408 | 0.3481 | 0.2285 | 0.1341 | 3.0200 | 2.0920 | 1.2539 | 1.2630 |
| 1.2524 | 0.2508 | 0.1895 | 0.0404 | 1.5043 | 1.3688 | 0.3516 | 0.1169 |
| 4.2888 | 2.0952 | 2.0325 | 1.2622 | 13.6250 | 9.1668 | 2.8536 | 2.0822 |
| 3.4781 | 1.2561 | 1.3010 | 0.8275 | 9.6663 | 5.4477 | 1.9129 | 1.7719 |
| 2.7175 | 1.1227 | 0.5659 | 0.3005 | 5.9970 | 5.0101 | 1.4041 | 0.3160 |
| 3.4430 | 0.6316 | 0.5857 | 0.5342 | 6.9971 | 3.1266 | 1.6077 | 1.5263 |
| 2.3795 | 0.5089 | 0.4302 | 0.1149 | 3.7144 | 3.0711 | 0.8117 | 0.1274 |
| 13.6544 | 14.3600 | 7.7466 | 6.7521 | 149.2937 | 70.4487 | 33.4742 | 11.6880 |
| 5.6521 | 7.4897 | 2.4514 | 1.5667 | 104.4286 | 46.4188 | 23.5842 | 7.8009 |
| 3.8032 | 5.1630 | 1.7471 | 1.3126 | 67.5508 | 24.1480 | 2.1384 | 1.6055 |
| 5.0185 | 3.9674 | 1.2086 | 1.1580 | 71.7484 | 32.0596 | 17.3464 | 5.9552 |
| 3.1395 | 2.6155 | 1.1361 | 0.4985 | 32.8514 | 7.9232 | 1.4756 | 0.5311 |
Estimated MSE values by sample size n under 1% and 8% outlier contamination (row labels were not recoverable from the source):

| 1% Outliers: n = 30 | n = 50 | n = 100 | n = 200 | 8% Outliers: n = 30 | n = 50 | n = 100 | n = 200 |
|---|---|---|---|---|---|---|---|
| 0.3745 | 0.3413 | 0.3336 | 0.3259 | 2.3125 | 2.0120 | 1.6164 | 1.5833 |
| 0.2878 | 0.2794 | 0.2485 | 0.1849 | 2.2610 | 1.7329 | 1.3588 | 1.2773 |
| 0.2791 | 0.2633 | 0.2046 | 0.0458 | 1.0064 | 0.8254 | 0.3955 | 0.1100 |
| 0.2506 | 0.2460 | 0.2387 | 0.1468 | 2.2164 | 1.6370 | 1.2932 | 1.2714 |
| 0.2088 | 0.2001 | 0.1958 | 0.0400 | 1.0028 | 0.7087 | 0.3395 | 0.1082 |
| 3.7542 | 0.6788 | 0.6353 | 0.5962 | 5.5100 | 2.8125 | 1.6546 | 3.3937 |
| 3.5119 | 0.4856 | 0.4647 | 0.4254 | 4.3696 | 2.4532 | 1.5253 | 3.3180 |
| 2.5476 | 0.4323 | 0.4217 | 0.0565 | 3.0110 | 2.0634 | 0.5418 | 0.2955 |
| 3.3406 | 0.5366 | 0.5246 | 0.4618 | 4.2623 | 2.3850 | 1.4056 | 3.2442 |
| 2.3775 | 0.5032 | 0.2481 | 0.0462 | 3.0106 | 2.0042 | 0.4404 | 0.2774 |
| 6.8841 | 5.1915 | 4.6733 | 3.8550 | 14.3885 | 11.5578 | 7.9738 | 9.2243 |
| 4.1880 | 3.2344 | 3.1333 | 2.7374 | 12.8431 | 7.7036 | 6.2574 | 7.5123 |
| 3.8932 | 3.4318 | 1.6121 | 1.4313 | 6.0214 | 5.5507 | 2.1496 | 0.3618 |
| 3.5452 | 2.0011 | 1.8565 | 1.8070 | 10.6653 | 7.0699 | 4.8587 | 4.1347 |
| 3.1188 | 2.1364 | 1.4639 | 0.2391 | 5.0450 | 4.4454 | 2.0490 | 0.3477 |
| 71.1404 | 45.7179 | 38.5937 | 32.2868 | 245.5499 | 100.2213 | 87.3481 | 26.0989 |
| 43.6115 | 29.3162 | 12.7618 | 10.2636 | 128.8131 | 62.3445 | 50.5375 | 15.0965 |
| 39.3370 | 28.9090 | 2.8173 | 2.6831 | 71.0531 | 45.0481 | 4.4200 | 3.1227 |
| 26.5437 | 19.3226 | 10.4217 | 7.7887 | 86.0936 | 59.1129 | 47.7824 | 12.0998 |
| 23.4875 | 18.5239 | 2.3113 | 1.0858 | 70.7459 | 43.2366 | 3.6519 | 2.7898 |
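Tables of simulated MSE values like those above are typically produced by a Monte Carlo loop: for each replication r, estimate β̂ᵣ and accumulate (β̂ᵣ − β)ᵀ(β̂ᵣ − β), then average over replications. A hedged sketch of that criterion, comparing an unshrunk fit against the Kibria–Lukman transform on a deliberately collinear Gaussian design for simplicity (the paper's study uses negative binomial responses, and k = 0.5 here is an arbitrary illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, R, k = 50, 3, 200, 0.5
beta = np.ones(p)  # true coefficient vector

def simulate_once():
    """One Monte Carlo replication: draw data, return OLS and KL estimates."""
    X = rng.normal(size=(n, p))
    X[:, 2] = X[:, 1] + 0.05 * rng.normal(size=n)   # induce collinearity
    y = X @ beta + rng.normal(size=n)
    S = X.T @ X
    b_ols = np.linalg.solve(S, X.T @ y)
    I = np.eye(p)
    b_kl = np.linalg.solve(S + k * I, (S - k * I) @ b_ols)  # KL shrinkage
    return b_ols, b_kl

mse = np.zeros(2)  # accumulated squared error: [unshrunk, KL]
for _ in range(R):
    b_ols, b_kl = simulate_once()
    mse += [np.sum((b_ols - beta) ** 2), np.sum((b_kl - beta) ** 2)]
mse /= R
```

The same loop structure applies to the negative binomial setting once `b_ols` is replaced by the (robust) maximum likelihood fit of the count model.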
| Metric | | |
|---|---|---|
| Residual variance | 661.2 | 661.2 |
| AIC | 873.1 | 397.9 |
| Coef. | Estimators | | | | |
|---|---|---|---|---|---|
x1 | 0.0267 (0.0128) | 0.0235 (0.0120) | −2.7247 (0.0122) | 0.0203 (0.0115) | −2.7247 (0.0119) |
x2 | 1.9640 (1.8877) | 1.0210 (1.3625) | 0.0043 (1.5516) | 0.0780 (0.9834) | 0.0065 (1.2754) |
x3 | 0.0228 (0.0456) | 0.0339 (0.0429) | 0.0267 (0.0438) | 0.0451 (0.0413) | 0.0267 (0.0425) |
x4 | 0.0139 (0.0101) | 0.0152 (0.0100) | −0.0350 (0.0100) | 0.0165 (0.0100) | −0.0350 (0.0100) |
x5 | 0.4493 (0.1659) | 0.4350 (0.1646) | 0.0490 (0.1651) | 0.4207 (0.1637) | 0.0488 (0.1645) |
SMSE | 3.5934 | 1.8855 | 1.6560 | 3.5927 | 0.4692 |
Share and Cite
Lukman, A.F.; Albalawi, O.; Arashi, M.; Allohibi, J.; Alharbi, A.A.; Farghali, R.A. Robust Negative Binomial Regression via the Kibria–Lukman Strategy: Methodology and Application. Mathematics 2024, 12, 2929. https://doi.org/10.3390/math12182929