Dealing with Missing Responses in Cognitive Diagnostic Modeling
Abstract
1. Introduction
2. Background and Literature
2.1. Cognitive Diagnostic Models (CDMs)
The DINA Model
2.2. Missing Responses in Psychometrics and CDMs
2.2.1. Missing Data Mechanisms
2.2.2. Missing Responses in Assessment Settings
2.2.3. Treatment of Missing Responses in Assessment Data
2.2.4. Treatment of Missing Responses in CDMs
3. Methods
3.1. Simulation Design
3.1.1. Factors Related to Missing Data
3.1.2. Factors Related to CDMs
3.2. Data Generation
3.2.1. Generate Complete Data
3.2.2. Generate Missing Responses of MAR
3.2.3. Generate Missing Responses of MNAR
3.2.4. Generate Missing Responses of the MIXED Mechanism
3.3. Analyses and Outcome
3.3.1. Analyses
3.3.2. Outcome Variables
4. Results
4.1. Model Convergence
4.2. Item Parameter Recovery
4.2.1. Item Parameter Recovery under MAR
4.2.2. Item Parameter Recovery under MNAR
4.2.3. Item Parameter Recovery under the MIXED Mechanism
4.3. Attribute Classification Accuracy
4.3.1. Classification Accuracy under MAR
4.3.2. Classification Accuracy under MNAR
4.3.3. Classification Accuracy under the MIXED Mechanism
5. Discussion
5.1. Findings
5.2. Recommendations
5.3. Limitations and Future Directions
6. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
| Missing Mechanism | Missing Rate | No. of Attributes in Q-Matrix | Missing Response Treatment | Model Convergence Rate |
|---|---|---|---|---|
| MAR | 20% | 8 | IM | 0.998 |
| MAR | 30% | 8 | RI | 0.998 |
| MNAR | 20% | 5 | IM | 0.998 |
| MNAR | 20% | 8 | IM | 0.998 |
| MNAR | 30% | 8 | IM | 0.996 |
| MNAR | 30% | 8 | IN | 0.998 |
| MNAR | 30% | 8 | RI | 0.992 |
| MNAR | 30% | 8 | PMM | 0.998 |
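The conditions in the table correspond to the data-generation steps outlined in Sections 3.2.1 and 3.2.2 (generate complete DINA data, then impose item-level missingness). The sketch below is a minimal base-R illustration of those steps, not the authors' simulation code; the slip and guessing values of 0.20, the uniform attribute base rates, the test dimensions, and the logistic missing-propensity model keyed to the 20% condition are all assumptions made only for illustration.

```r
# Minimal illustrative sketch (base R only; not the authors' simulation code).
# Generates DINA item responses, then imposes item-level MAR missingness.
set.seed(123)

N <- 1000   # examinees
J <- 10     # items
K <- 5      # attributes

# Random Q-matrix: each item requires one or two attributes (assumed structure)
Q <- t(replicate(J, {
  q <- rep(0, K)
  q[sample(K, sample(1:2, 1))] <- 1
  q
}))

# Attribute profiles and DINA response probabilities:
# P(X_ij = 1 | alpha_i) = (1 - s_j)^eta_ij * g_j^(1 - eta_ij),
# where eta_ij = 1 if examinee i masters every attribute required by item j
alpha <- matrix(rbinom(N * K, 1, 0.5), N, K)
slip  <- rep(0.20, J)
guess <- rep(0.20, J)
eta <- 1 * (alpha %*% t(Q) == matrix(rowSums(Q), N, J, byrow = TRUE))
p   <- t((1 - slip)^t(eta) * guess^t(1 - eta))
X   <- matrix(rbinom(N * J, 1, p), N, J)

# MAR-type missingness: the probability of omitting item j depends on an
# observed quantity (here, the total score on the remaining items), not on
# the response that would have been observed for item j itself
miss_rate <- 0.20
X_mar <- X
for (j in 1:J) {
  z  <- as.numeric(scale(rowSums(X[, -j])))   # observed predictor of omission
  pr <- plogis(qlogis(miss_rate) - z)         # lower scorers omit more often
  X_mar[runif(N) < pr, j] <- NA
}
mean(is.na(X_mar))  # realized missing rate, roughly the nominal 20%
```

Making the omission propensity depend on the (possibly unobserved) response to item j itself, rather than on the other items, would produce MNAR rather than MAR missingness, in line with Section 3.2.3.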