Differential Item Functioning of the region-based national examination instrument

Adi Setiawan, PT Batamindo Green Farm, Indonesia
Gulzhaina Kuralbaevna Kassymova, Abai Kazakh National Pedagogical University, Kazakhstan
Vianney Mbazumutima, African Institute for Mathematical Sciences, Cameroon
Anggit Reviana Dewi Agustyani, Umeå Mathematics Education Research Centre (UMERC), Sweden

Abstract


This study aims to detect Differential Item Functioning (DIF) in the 2014/2015 mathematics National Examination for junior high schools and equivalent-level schools, with the Yogyakarta region as the reference group and the South Kalimantan region as the focal group, using three methods: the Likelihood Ratio Test (LRT), Raju's Area Measure, and Lord's chi-square test. A sensitivity analysis was carried out to determine which method was the most sensitive. The data consisted of 5,465 National Examination answer sheets from students in the two regions who worked on type A questions. To avoid sample-size effects, a sample of 1,000 answer sheets per region was drawn using simple random sampling (SRS). The results showed that the LRT method flagged 36 items with significant DIF, Raju's Area Measure flagged 32 items, and Lord's method flagged all items. Lord's method is therefore the most sensitive, as it detected the largest number of DIF items.
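The area-based approach mentioned above compares the item characteristic curves (ICCs) of the reference and focal groups: the larger the area between the two curves, the stronger the DIF. The following is a minimal numeric sketch of that idea, not the authors' implementation; the 2PL model, the D = 1.7 scaling constant, the integration range, and the item parameter values are illustrative assumptions only.

```python
import math

def icc(theta, a, b):
    # 2PL item characteristic curve with the conventional D = 1.7 scaling
    return 1.0 / (1.0 + math.exp(-1.7 * a * (theta - b)))

def area_between(a_ref, b_ref, a_foc, b_foc, lo=-6.0, hi=6.0, n=2001):
    # Numerically approximate Raju's signed and unsigned areas between the
    # reference-group and focal-group ICCs on a grid over the ability scale.
    step = (hi - lo) / (n - 1)
    signed = unsigned = 0.0
    for i in range(n):
        theta = lo + i * step
        d = icc(theta, a_ref, b_ref) - icc(theta, a_foc, b_foc)
        signed += d * step
        unsigned += abs(d) * step
    return signed, unsigned

# Illustrative item: equal discriminations, focal group 0.5 logits harder.
# For a 2PL item with equal a-parameters, Raju's exact signed area is
# b_foc - b_ref = 0.5, so the numeric estimate should land close to that.
s, u = area_between(a_ref=1.2, b_ref=0.0, a_foc=1.2, b_foc=0.5)
```

When the discrimination parameters are equal, the two ICCs never cross, so the signed and unsigned areas coincide; unequal a-parameters make the curves cross and the unsigned area exceeds the absolute signed area, which is why Raju's method reports both.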

Keywords


comparison of DIF detection methods; differential items functioning; unidimensional IRT





DOI: https://doi.org/10.21831/reid.v10i1.73270





This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.





ISSN 2460-6995 (Online)
