Score conversion methods with modern test theory approach: Ability, difficulty, and guessing justice methods
DOI: https://doi.org/10.21831/reid.v11i2.67484

Keywords: item response theory, 1-PL, R program, Rasch model

Abstract
The one-parameter logistic (1-PL) model is widely used in Item Response Theory (IRT) to estimate student ability; however, ability-based scoring disregards item difficulty and guessing behavior, which can bias proficiency interpretations. This study evaluates three scoring alternatives derived from IRT: an ability-based conversion, a difficulty-weighted conversion, and a proposed guessing-justice method. Dichotomous responses from 400 students were analyzed using the Rasch (1-PL) model in the R environment with the ltm package. The 1-PL specification was retained to support a parsimonious and interpretable calibration framework consistent with the study's comparative scoring purpose. Rasch estimation produced item difficulty values ranging from −1.03 to 0.18 and identified 268 unique response patterns. Ability-based scoring yielded only eight distinct score values, demonstrating limited discriminatory capacity. In contrast, the guessing-justice method produced a substantially more differentiated distribution, with approximately 70 percent of patterns consistent with knowledge-based responding and 30 percent indicative of guessing. The findings indicate that scoring models incorporating item difficulty and guessing behavior represent student proficiency more equitably and accurately than traditional ability-based conversions. The proposed approach offers a practical alternative for classroom assessment and can be implemented in widely accessible spreadsheet software such as Microsoft Excel.
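The calibration and scoring workflow described in the abstract can be illustrated with a short R sketch built on the ltm package named above. This is a minimal sketch under stated assumptions: the response object resp (400 students by k items), all variable names, and the difficulty-weighting scheme at the end are illustrative choices, not the authors' published code or exact formula.

library(ltm)

# resp: a 400 x k matrix or data frame of dichotomous (0/1) responses
# (name and contents assumed for this sketch)
fit <- rasch(resp, constraint = cbind(ncol(resp) + 1, 1))  # fix discrimination at 1

# Item difficulty estimates under the 1-PL model
# (the study reports values from -1.03 to 0.18)
b <- coef(fit)[, "Dffclt"]

# Ability estimates, one per observed response pattern
# (the study identified 268 unique patterns)
fs <- factor.scores(fit)
head(fs$score.dat)  # response pattern, Obs, Exp, z1 (ability), se.z1

# Illustrative difficulty-weighted conversion (an assumed scheme): rescale
# difficulties to positive weights so that harder items earn more credit
w <- exp(b) / sum(exp(b))
weighted_score <- as.matrix(resp) %*% w  # one weighted score per student

Because the constrained Rasch model makes the ability estimate a monotone function of the raw sum score, z1 takes only as many distinct values as there are distinct total scores, which is consistent with the eight score distinctions reported in the abstract.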
References
Baker, F. B. (2001). The basics of item response theory (2nd edition). ERIC Clearinghouse on Assessment and Evaluation.
Cappelleri, J. C., Lundy, J. J., & Hays, R. D. (2014). Overview of classical test theory and item response theory for the quantitative assessment of items in developing patient-reported outcomes measures. Clinical Therapeutics, 36(5), 648–662. https://doi.org/10.1016/j.clinthera.2014.04.006
Creswell, J. W. (2012). Educational research: Planning, conducting, and evaluating quantitative and qualitative research (4th edition). Pearson.
Edelen, M. O., & Reeve, B. B. (2007). Applying item response theory (IRT) modeling to questionnaire development, evaluation, and refinement. Quality of Life Research, 16, 5–18. https://doi.org/10.1007/s11136-007-9198-0
Fan, X. (1998). Item response theory and classical test theory: An empirical comparison of their item/person statistics. Educational and Psychological Measurement, 58(3), 357–381. https://doi.org/10.1177/0013164498058003001
Fraley, R. C., Waller, N. G., & Brennan, K. A. (2000). An item response theory analysis of self-report measures of adult attachment. Journal of Personality and Social Psychology, 78(2), 350–365. https://doi.org/10.1037/0022-3514.78.2.350
Hambleton, R. K., Swaminathan, H., & Rogers, H. J. (1991). Fundamentals of item response theory. Sage Publications.
Mahmud, M. N. (2021). Diagnostik kesulitan belajar Matematika siswa SMP kelas VIII di Kota Baubau menggunakan soal-soal model TIMSS [Diagnosing the mathematics learning difficulties of eighth-grade junior high school students in Baubau City using TIMSS-model items]. Yogyakarta State University.
Mellenbergh, G. J. (1989). Item bias and item response theory. International Journal of Educational Research, 13(2), 127–143. https://doi.org/10.1016/0883-0355(89)90002-5
Rizopoulos, D. (2006). ltm: An R package for latent variable modeling and item response theory analyses. Journal of Statistical Software, 17(5), 1–25. https://doi.org/10.18637/jss.v017.i05
License
Copyright (c) 2025 REID (Research and Evaluation in Education)

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Authors submitting a manuscript to this journal agree that, if the manuscript is accepted for publication, copyright of the submission shall be assigned to REID (Research and Evaluation in Education). Although the journal requires a copyright transfer, the authors retain (or are granted back) significant scholarly rights.
The copyright transfer agreement form can be downloaded here: [REID Copyright Transfer Agreement Form]
The signed original of the copyright form should be sent to the Editorial Office by email to reid.ppsuny@uny.ac.id.

REID (Research and Evaluation in Education) by http://journal.uny.ac.id/index.php/reid is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.


