Developing and analyzing items of a physics conceptual understanding test on wave topics for high school students using the Rasch Model
DOI: https://doi.org/10.21831/reid.v11i1.75575
Keywords: conceptual understanding, instrument development, item analysis, Rasch Model
Abstract
This study aims to develop, validate, and analyze test items for assessing the understanding of mechanical wave concepts among high school students. The test development process followed the Mardapi instrument development model, which includes: (1) constructing test specifications, (2) writing test items, (3) reviewing test items, (4) piloting the test, and (5) analyzing the items. The developed instrument consists of 12 multiple-choice items, covering three aspects of conceptual understanding: translation, interpretation, and interpolation. Content validity was assessed by three validators, and the results were analyzed using the Aiken V method. The instrument was then administered to 257 high school students in South Sulawesi Province. The results were analyzed using Item Response Theory (IRT) with the Rasch model through the Quest program. Item analysis included item fit estimation, reliability, and item difficulty. The content validity test results indicate that the instrument is valid. All items fit the Rasch model, with a reliability coefficient of 0.95, categorized as high reliability. Item difficulty analysis revealed that 8.3% of items were categorized as easy, 8.3% as difficult, and 83.3% as moderate. Overall, the results indicate that the test instrument is of good quality and can be used to assess high school students’ understanding of mechanical wave concepts.
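The two core quantities the abstract names — Aiken's V for content validity and the Rasch model for item analysis — follow well-known formulas: V = Σ(r − lo) / (n(c − 1)) over n raters (Aiken, 1980), and the Rasch probability of a correct response P = e^(θ−b) / (1 + e^(θ−b)) for ability θ and item difficulty b. The sketch below illustrates both; the rating scale (1–5) and all numeric values are illustrative assumptions, not data from the study.

```python
import math

def aiken_v(ratings, lo=1, hi=5):
    """Aiken's V for one item: sum of (rating - lo) divided by n * (hi - lo).
    Values near 1 indicate strong rater agreement on the item's validity.
    The 1-5 scale here is an assumption for illustration."""
    n = len(ratings)
    s = sum(r - lo for r in ratings)
    return s / (n * (hi - lo))

def rasch_p(theta, b):
    """Rasch (1PL) model: probability that a person with ability theta
    answers an item of difficulty b correctly (both in logits)."""
    return math.exp(theta - b) / (1 + math.exp(theta - b))

# Three hypothetical validators rating one item on a 1-5 relevance scale
print(round(aiken_v([5, 4, 5]), 2))   # 0.92
# A person of average ability (theta = 0) facing an easy item (b = -1)
print(round(rasch_p(0.0, -1.0), 2))   # 0.73
```

Under the Rasch model, an item's difficulty b is the ability level at which a test taker has exactly a 50% chance of answering correctly, which is the basis for the easy/moderate/difficult categorization reported above.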
References
Aiken, L. R. (1980). Content validity and reliability of single items or questionnaires. Educational and Psychological Measurement, 40(4), 955–959. https://doi.org/10.1177/001316448004000419
Anaya, L., Iriberri, N., Rey-Biel, P., & Zamarro, G. (2022). Understanding performance in test taking: The role of question difficulty order. Economics of Education Review, 90, 102293. https://doi.org/10.1016/j.econedurev.2022.102293
Andrich, D., & Marais, I. (2019). Reliability and validity in classical test theory. In D. Andrich & I. Marais (Eds.), A course in Rasch measurement theory: Measuring in the educational, social and health sciences (pp. 41–53). Springer Nature Singapore. https://doi.org/10.1007/978-981-13-7496-8_4
Asriadi, M., & Hadi, S. (2021). Analysis of the quality of the formative test items for physics learning using the Rasch model in the 21st century learning. JIPF (Jurnal Ilmu Pendidikan Fisika), 6(2), 158–166. https://doi.org/10.26737/jipf.v6i2.2030
Azizah, I., & Supahar, S. (2023). Analisis kualitas butir soal penilaian harian bersama I fisika kelas X SMA Negeri 1 Patikraja [Item quality analysis of the first joint daily physics assessment for grade X at SMA Negeri 1 Patikraja]. Jurnal Pendidikan Fisika, 10(2), 90–104. https://doi.org/10.21831/jpf.v10i2.18230
Azizah, N., Suseno, M., & Hayat, B. (2022). Item analysis of the Rasch model items in the final semester exam Indonesian language lesson. World Journal of English Language, 12(1), 15–26. https://doi.org/10.5430/wjel.v12n1p15
Baker, F. B. (2001). The basics of Item Response Theory (2nd ed.). ERIC Clearinghouse on Assessment and Evaluation.
Balta, N., Japashov, N., Salibašić Glamočić, D., & Mešić, V. (2022). Development of the high school wave optics test. Journal of Turkish Science Education, 19(1), 306–331. https://doi.org/10.36681/tused.2022.123
Bond, T. G., Yan, Z., & Heene, M. (2020). Applying the Rasch Model (4th ed.). Routledge. https://doi.org/10.4324/9780429030499
Boone, W. J. (2016). Rasch analysis for instrument development: Why, when, and how? CBE Life Sciences Education, 15(4). https://doi.org/10.1187/cbe.16-04-0148
Boone, W. J., Staver, J. R., & Yale, M. S. (2014). Rasch analysis in the human sciences. Springer Dordrecht. https://doi.org/10.1007/978-94-007-6857-4
Chen, L., Uemura, H., Hao, H., Goda, Y., Okubo, F., Taniguchi, Y., Oi, M., Konomi, S., Ogata, H., & Yamada, M. (2018). Relationships between collaborative problem solving, learning performance, and learning behavior in science education. Proceedings of 2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering, TALE 2018, 17–24. https://doi.org/10.1109/TALE.2018.8615254
Darling-Hammond, L., Flook, L., Cook-Harvey, C., Barron, B., & Osher, D. (2020). Implications for educational practice of the science of learning and development. Applied Developmental Science, 24(2), 97–140. https://doi.org/10.1080/10888691.2018.1537791
Dewi, H. H., Damio, S. M., & Sukarno, S. (2023). Item analysis of reading comprehension questions for English proficiency test using Rasch model. REID (Research and Evaluation in Education), 9(1), 24–36. https://doi.org/10.21831/reid.v9i1.53514
Docktor, J. L., & Mestre, J. P. (2014). Synthesis of discipline-based education research in physics. Physical Review Special Topics - Physics Education Research, 10(2), 020119. https://doi.org/10.1103/PhysRevSTPER.10.020119
Faradillah, A., & Febriani, L. (2021). Mathematical trauma students’ junior high school based on grade and gender. Infinity Journal, 10(1), 53–68. https://doi.org/10.22460/infinity.v10i1.p53-68
Goodhew, L. M., Robertson, A. D., Heron, P. R. L., & Scherr, R. E. (2019). Student conceptual resources for understanding mechanical wave propagation. Physical Review Physics Education Research, 15(2), 020127. https://doi.org/10.1103/PhysRevPhysEducRes.15.020127
Halim, A., Suriana, S., & Mursal, M. (2017). Dampak problem based learning terhadap pemahaman konsep ditinjau dari gaya berpikir siswa pada mata pelajaran fisika [The impact of problem-based learning on conceptual understanding viewed from students' thinking styles in physics]. Jurnal Penelitian & Pengembangan Pendidikan Fisika, 3(1), 1–10. https://doi.org/10.21009/1.03101
Habibi, H., Jumadi, J., & Mundilarto, M. (2019). The Rasch-rating scale model to identify learning difficulties of physics students based on self-regulation skills. International Journal of Evaluation and Research in Education, 8(4), 659–665. https://doi.org/10.11591/ijere.v8i4.20292
Hanna, W. F., & Retnawati, H. (2022). Analisis kualitas butir soal matematika menggunakan model Rasch dengan bantuan software Quest [Quality analysis of mathematics test items using the Rasch model with Quest software]. AKSIOMA: Jurnal Program Studi Pendidikan Matematika, 11(4), 3695. https://doi.org/10.24127/ajpm.v11i4.5908
Hope, D., Kluth, D., Homer, M., Dewar, A., Goddard-Fuller, R., Jaap, A., & Cameron, H. (2024). Exploring the use of Rasch modelling in “common content” items for multi-site and multi-year assessment. Advances in Health Sciences Education, 30, 427–438. https://doi.org/10.1007/s10459-024-10354-y
Istiyono, E. (2020). Pengembangan instrumen penilaian dan analisis hasil belajar fisika dengan teori tes klasik dan modern [Development of assessment instruments and analysis of physics learning outcomes with classical and modern test theory]. UNY Press.
Kanyesigye, S. T., Uwamahoro, J., & Kemeza, I. (2022). Difficulties in understanding mechanical waves: Remediated by problem-based instruction. Physical Review Physics Education Research, 18(1), 010140. https://doi.org/10.1103/PhysRevPhysEducRes.18.010140
Kola, A. J. (2017). Investigating the conceptual understanding of physics through an interactive lecture-engagement. Cumhuriyet International Journal of Education, 6(1), 82–96. http://cije.cumhuriyet.edu.tr/en/pub/issue/29856/321440
Kurniawan, A., Istiyono, E., & Daeng Naba, S. (2024). Item quality analysis of physics concept understanding test with Rasch model. JIPF (Jurnal Ilmu Pendidikan Fisika), 9(3), 474–486. https://doi.org/10.26737/jipf.v9i3.5692
Kurniawan, F., Samsudin, A., Chandra, D. T., Sriwati, E., Zahran, M., Gani, A. W., Ramadhan, B. P., Aminudin, A. H., & Ramadani, F. (2023). Assessing conceptual understanding of high school students on transverse and stationary waves through Rasch analysis in Malang. Journal of Physics: Conference Series, 2596(1). https://doi.org/10.1088/1742-6596/2596/1/012060
Kurpius, S. E. R., & Stafford, M. E. (2005). Testing and measurement: A user-friendly guide. Sage Publications.
Lafifa, F., & Dadan, R. (2024). Innovations in assessing students’ digital literacy skills in learning science: Effective multiple choice closed-ended tests using Rasch model. Turkish Online Journal of Distance Education, 25(3), 44–56. https://dergipark.org.tr/tr/download/article-file/3425765
Larasati, P. E., Supahar, & Yunanta, D. R. A. (2020). Validity and reliability estimation of assessment ability instrument for data literacy on high school physics material. Journal of Physics: Conference Series, 1440(1), 012020. https://doi.org/10.1088/1742-6596/1440/1/012020
Mardapi, D. (2008). Teknik penyusunan instrumen tes dan nontes [Techniques for constructing test and non-test instruments]. Mitra Cendekia.
Mešić, V., Neumann, K., Aviani, I., Hasović, E., Boone, W. J., Erceg, N., Grubelnik, V., Sušac, A., Glamočić, D. S., Karuza, M., Vidak, A., AlihodŽić, A., & Repnik, R. (2019). Measuring students’ conceptual understanding of wave optics: A Rasch modeling approach. Physical Review Physics Education Research, 15(1), 010115. https://doi.org/10.1103/PhysRevPhysEducRes.15.010115
Pals, F. F. B., Tolboom, J. L. J., & Suhre, C. J. M. (2023). Development of a formative assessment instrument to determine students’ need for corrective actions in physics: Identifying students’ functional level of understanding. Thinking Skills and Creativity, 50, 101387. https://doi.org/10.1016/j.tsc.2023.101387
Planinic, M., Jelicic, K., Matejak Cvenic, K., Susac, A., & Ivanjek, L. (2024). Effect of an inquiry-based teaching sequence on secondary school students’ understanding of wave optics. Physical Review Physics Education Research, 20(1), 010156. https://doi.org/10.1103/PhysRevPhysEducRes.20.010156
Putranta, H., & Supahar. (2019). Development of physics-tier tests (PysTT) to measure students’ conceptual understanding and creative thinking skills: A qualitative synthesis. Journal for the Education of Gifted Young Scientists, 7(3), 747–775. https://doi.org/10.17478/jegys.587203
Putri, A. H., Sutrisno, S., & Chandra, D. T. (2020). Efektivitas pendekatan multirepresentasi dalam pembelajaran berbasis masalah untuk meningkatkan pemahaman konsep siswa SMA pada materi gaya dan gerak [The effectiveness of a multi-representation approach in problem-based learning to improve high school students' conceptual understanding of force and motion]. Journal of Natural Science and Integration, 3(2), 205–214. http://dx.doi.org/10.24014/jnsi.v3i2.9400
Rahim, A., Hadi, S., Susilowati, D., Marlina, & Muti’ah. (2023). Developing of Computerized Adaptive Test (CAT) based on a learning management system in mathematics final exam for junior high school. International Journal of Educational Reform. https://doi.org/10.1177/10567879231211297
Retnawati, H., & Wulandari, N. F. (2019). The development of students’ mathematical literacy proficiency. Problems of Education in the 21st Century, 77(4), 502–514. https://doi.org/10.33225/pec/19.77.502
Şahin, A., & Anıl, D. (2017). The effects of test length and sample size on item parameters in item response theory. Kuram ve Uygulamada Egitim Bilimleri, 17(1), 321–335. https://doi.org/10.12738/estp.2017.1.0270
Salman, A., & Abd. Aziz, A. (2015). Evaluating user readiness towards digital society: A Rasch measurement model analysis. Procedia Computer Science, 65, 1154–1159. https://doi.org/10.1016/j.procs.2015.09.028
Sartika, R. P. (2018). The implementation of problem-based learning to improve students’ understanding in management of laboratorium subject. EDUSAINS, 10(2), 197–205. https://doi.org/10.15408/es.v10i2.7376
Setyawarno, D. (2017). Makalah PPM: Panduan Quest [PPM paper: Quest guide]. FMIPA UNY.
Shanti, M. R. S., Istiyono, E., Munadi, S., Permadi, C., Pattiserlihun, A., & Sudjipto, D. N. (2020). Analisa penilaian soal fisika menggunakan model Rasch dengan Program R [Analysis of physics test items using the Rasch model with the R program]. Jurnal Sains dan Edukasi Sains, 3(2), 46–52. https://doi.org/10.24246/juses.v3i2p46-52
Sharma, V., Gupta, N. L., & Agarwal, A. K. (2023). Impact of ICT-enabled teaching–learning processes in physical sciences in Indian higher education in light of COVID-19: A comprehensive overview. National Academy Science Letters, 46(5), 465–469. https://doi.org/10.1007/s40009-023-01225-y
Sumaryanta. (2021). Teori tes klasik & teori respon butir: Konsep & contoh penerapannya [Classical test theory & item response theory: Concepts and examples of their application]. CV. Confident.
Szabó, G. (2008). Applying Item Response Theory in language test item bank building. Peter Lang. https://books.google.co.id/books?id=I0V9AAAAMAAJ
Wright, B. D. (1977). Solving measurement problems with the Rasch model. Journal of Educational Measurement, 14(2), 97–116. https://doi.org/10.1111/j.1745-3984.1977.tb00031.x
Wright, B. D., & Stone, M. H. (1979). Best test design. Mesa Press.
Xie, L., Liu, Q., Lu, H., Wang, Q., Han, J., Feng, X. M., & Bao, L. (2021). Student knowledge integration in learning mechanical wave propagation. Physical Review Physics Education Research, 17(2), 020122. https://doi.org/10.1103/PhysRevPhysEducRes.17.020122
Yim, L. W. K., Lye, C. Y., & Koh, P. W. (2024). A psychometric evaluation of an item bank for an English reading comprehension tool using Rasch analysis. REID (Research and Evaluation in Education), 10(1), 18–34. https://doi.org/10.21831/reid.v10i1.65284
License
Copyright (c) 2025 REID (Research and Evaluation in Education)

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.