Critical thinking in math: 10th-grade analysis using cognitive diagnostic modeling
DOI: https://doi.org/10.21831/reid.v11i1.88074

Keywords: mathematics education, senior high school, attribute mastery, critical thinking, G-DINA model in rjags, MCMC convergence diagnostics, posterior predictive checks, educational assessment in Indonesian high schools, data-driven feedback for critical thinking, attribute-level mastery reporting, diagnostic measurement in mathematics education, Bayesian cognitive diagnostic modeling

Abstract
Critical thinking is widely recognized as an essential competency in mathematics education, yet assessments often fail to capture its multidimensional nature. This study applied a Bayesian cognitive diagnostic modeling (CDM) approach, using the G-DINA model, to identify the mastery profiles of tenth-grade students in Indonesia across four attributes: interpretation, analysis, evaluation, and inference. Data from 60 students revealed that most learners demonstrated partial rather than full mastery, with consistent challenges in evaluative reasoning and inference. These diagnostic profiles provide actionable insights for teachers, enabling more targeted instructional strategies that go beyond total test scores. The findings highlight the potential of Bayesian CDMs to enhance classroom assessment by offering fine-grained evidence of students’ reasoning patterns. A novel contribution of this study is that it is among the first to implement Bayesian cognitive diagnosis in mathematics education within the Indonesian context, bridging methodological innovation with practical implications for teaching and assessment.
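To illustrate the kind of inference a cognitive diagnostic model performs, the sketch below computes a posterior over latent mastery profiles for the four attributes named in the abstract. It is a minimal, illustrative simplification (a DINA-style likelihood rather than the full Bayesian G-DINA used in the study), and the Q-matrix, guess/slip parameters, and response vector are all hypothetical values, not data from the paper.

```python
import itertools
import numpy as np

# The four attributes from the study; all numbers below are hypothetical.
ATTRS = ["interpretation", "analysis", "evaluation", "inference"]
K = len(ATTRS)

# Hypothetical Q-matrix: which attributes each of 6 items requires.
Q = np.array([
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [1, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
])
guess = np.full(Q.shape[0], 0.2)  # hypothetical guessing parameters
slip = np.full(Q.shape[0], 0.1)   # hypothetical slip parameters

# All 2^K latent mastery profiles (16 classes for K = 4).
profiles = np.array(list(itertools.product([0, 1], repeat=K)))

def posterior_profile(responses, prior=None):
    """Posterior over mastery profiles given a 0/1 response vector,
    assuming a DINA likelihood and a uniform prior by default."""
    prior = np.full(len(profiles), 1.0 / len(profiles)) if prior is None else prior
    like = np.ones(len(profiles))
    for j, x in enumerate(responses):
        # eta = 1 iff the profile masters every attribute item j requires
        eta = np.all(profiles[:, Q[j] == 1] == 1, axis=1)
        p_correct = np.where(eta, 1 - slip[j], guess[j])
        like *= p_correct if x == 1 else (1 - p_correct)
    post = like * prior
    return post / post.sum()

# Example: a student who answers items 1-3 correctly but misses items 4-6,
# which require evaluation and inference.
post = posterior_profile([1, 1, 1, 0, 0, 0])
best = profiles[np.argmax(post)].tolist()
print(dict(zip(ATTRS, best)))
```

Under these assumed parameters, the most probable profile for this response pattern is mastery of interpretation and analysis but not evaluation or inference, mirroring the attribute-level reporting the abstract describes. The full study would additionally estimate the item parameters themselves via MCMC rather than fixing them.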
License
Copyright (c) 2025 REID (Research and Evaluation in Education)

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Authors submitting a manuscript to this journal agree that, if the manuscript is accepted for publication, its copyright shall be assigned to REID (Research and Evaluation in Education). Although the journal requires this copyright transfer, the authors retain (or are granted back) significant scholarly rights.
The copyright transfer agreement form can be downloaded here: [REID Copyright Transfer Agreement Form]
The copyright form should be signed in the original and sent to the Editorial Office by email to reid.ppsuny@uny.ac.id.
REID (Research and Evaluation in Education) by http://journal.uny.ac.id/index.php/reid is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.