The use of analytical scoring assessment to examine Sundanese students’ performance in writing descriptive texts

Dedi Koswara, Universitas Pendidikan Indonesia, Indonesia
Ruswan Dallyono, Universitas Pendidikan Indonesia, Indonesia
Agus Suherman, Universitas Pendidikan Indonesia, Indonesia
Pandu Hyangsewu, Universitas Pendidikan Indonesia, Indonesia

Abstract


Many scholars view writing as a highly laborious task because many subcomponents must be moulded into a single discourse. This issue is also faced by Indonesian higher education students, particularly sophomores in the department of Sundanese language education. The present study therefore aims to uncover factual evidence of whether the use of a rubric can enhance valid judgment, employing a scoring procedure checked for validity and reliability. It also examines how a rubric can promote sophomores’ learning and help lecturers in the department of Sundanese language education develop their instruction. The study objectively assesses students’ descriptive writing in the Sundanese language in light of the functions of analytical scoring assessment. The outcomes motivated the lecturers to continually simplify the criteria to fit sophomores’ descriptive texts, identify significant features, and incorporate these qualities into their own insights about the texts. Furthermore, depending on text complexity (intermediate, pre-advanced, and advanced writing) as well as students’ writing competence and maturity, lecturers may modify the analytical rubric so that it recommends only the essential qualities of descriptive texts or covers a wider range of rubric components. This finding implies that the analytical scoring rubric brings an influential scoring method into everyday classroom practice.
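To make the scoring procedure concrete, the sketch below illustrates in Python how an analytical rubric might combine weighted component scores into a single total, and how inter-rater reliability between two lecturers could be checked with a Pearson correlation. The component names, weights, scores, and the choice of reliability statistic are illustrative assumptions for this sketch, not the rubric or data reported in the study.

# Illustrative sketch (assumed rubric and invented data, not the study's own):
# weighted analytical scoring of descriptive texts plus a simple inter-rater
# reliability check, using only the Python standard library.
from math import sqrt

# Hypothetical rubric components and weights (assumptions for illustration).
WEIGHTS = {
    "content": 0.30,
    "organization": 0.25,
    "vocabulary": 0.20,
    "grammar": 0.15,
    "mechanics": 0.10,
}

def analytical_score(component_scores: dict[str, float]) -> float:
    """Combine 0-100 component scores into one weighted total."""
    return sum(WEIGHTS[c] * component_scores[c] for c in WEIGHTS)

def pearson_r(x: list[float], y: list[float]) -> float:
    """Pearson correlation between two raters' totals (inter-rater reliability)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented example: two lecturers score the same five descriptive texts.
rater_1 = [analytical_score(s) for s in (
    {"content": 80, "organization": 75, "vocabulary": 70, "grammar": 65, "mechanics": 60},
    {"content": 60, "organization": 55, "vocabulary": 65, "grammar": 50, "mechanics": 70},
    {"content": 90, "organization": 85, "vocabulary": 80, "grammar": 75, "mechanics": 85},
    {"content": 70, "organization": 65, "vocabulary": 60, "grammar": 70, "mechanics": 55},
    {"content": 85, "organization": 80, "vocabulary": 75, "grammar": 80, "mechanics": 75},
)]
rater_2 = [72.0, 59.5, 84.0, 66.5, 80.0]  # second rater's totals (invented)

print("Rater 1 totals:", [round(t, 1) for t in rater_1])
print(f"Inter-rater reliability (Pearson r): {pearson_r(rater_1, rater_2):.2f}")

A high correlation between the two sets of totals would suggest the rubric supports consistent judgments across raters; in practice, other statistics such as Cohen's kappa or intraclass correlation may be preferred depending on the scoring design.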


Keywords


Analytical scoring; descriptive text; validity and reliability; Sundanese education





DOI: https://doi.org/10.21831/cp.v40i3.40948
