Applying an Item Response Theory model to evaluate item and test properties of an academic potential test for students with disabilities
Dian Putri Permatasari, Universitas Brawijaya, Indonesia
Unita Werdi Rahajeng, Universitas Brawijaya, Indonesia
Abstract
Universitas Brawijaya (UB) is one of the pioneers of inclusive education in Indonesian higher education. Among its inclusive-education policy innovations is an affirmative admissions track reserved for students with disabilities, Seleksi Mandiri Penyandang Disabilitas (Independent Selection for Persons with Disabilities), which accommodates applicants with disabilities seeking to enroll in bachelor's or vocational programs. Part of this selection process is the Computer-Based Academic Potential Test. This study evaluates the psychometric properties of the academic potential test using the item response theory (IRT) framework, which is widely used to assess psychometric quality at both the item and test levels. The IRT model applied here is the two-parameter logistic (2PL) model, which estimates a difficulty parameter and a discrimination parameter for each item. The results indicated that, in general, the three subtests of the Computer-Based Academic Potential Test yielded satisfactory estimates under the 2PL model, and that most item difficulties ranged from medium to very difficult.
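As a brief orientation for readers (a standard formulation of the model, not reproduced from the original abstract): under the 2PL model, the probability that an examinee with latent ability $\theta$ answers item $i$ correctly is

$$
P_i(\theta) = \frac{1}{1 + \exp\left[-a_i(\theta - b_i)\right]}
$$

where $b_i$ is the item's difficulty (the ability level at which the probability of a correct response is 0.5) and $a_i$ is its discrimination (proportional to the slope of the item characteristic curve at $\theta = b_i$). Some presentations add a scaling constant $D \approx 1.7$ in the exponent so that the logistic curve approximates the normal ogive.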
DOI: https://doi.org/10.21831/pep.v25i1.38808
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.