Item Analysis of a Multiple-Choice Exam

Sibel Toksöz, Ayşe Ertunç

Abstract


Although foreign language testing has changed in line with different perspectives on language learning and teaching, multiple-choice items have remained considerably popular regardless of these perspectives and trends. Several studies have examined the efficiency of multiple-choice items in different contexts. In the Turkish context, multiple-choice items are commonly used in standardized high-stakes tests required for admission to undergraduate departments such as English Language Teaching, Western Languages and Literatures, and Translation Studies, and for tracking students' academic progress within these departments. Moreover, multiple-choice items are used extensively at all levels of language instruction. However, there has not been enough item analysis of multiple-choice tests in terms of item discrimination, item facility, and distractor efficiency. The present study analyzes multiple-choice items testing grammar, vocabulary, and reading comprehension, administered to preparatory class students at a state university. The responses of 453 students were analyzed for item facility, item discrimination, and distractor efficiency, using the frequency distribution of the preparatory students' responses. The results reveal that most of the items are at a moderate level of item facility, while 28% of the items have a low item discrimination value. Finally, the frequency results were analyzed for distractor efficiency, and some distractors in the exam were found to be significantly ineffective and in need of revision.
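The three statistics named in the abstract are standard classical-test-theory indices, and they can be sketched in a few lines of code. The snippet below is an illustrative sketch only, not the authors' procedure: it uses hypothetical response data, and it computes item facility as the proportion of correct answers, item discrimination as the upper-minus-lower facility difference with the conventional 27% split, and distractor efficiency via per-option response frequencies.

```python
def item_facility(scores):
    """Proportion of examinees answering the item correctly (0..1)."""
    return sum(scores) / len(scores)

def item_discrimination(item_scores, total_scores, fraction=0.27):
    """Upper-lower index: item facility in the top-scoring group minus
    the bottom-scoring group, using the conventional 27% split."""
    n = max(1, round(len(total_scores) * fraction))
    ranked = sorted(range(len(total_scores)),
                    key=lambda i: total_scores[i], reverse=True)
    upper, lower = ranked[:n], ranked[-n:]
    def group_facility(indices):
        return sum(item_scores[i] for i in indices) / len(indices)
    return group_facility(upper) - group_facility(lower)

def distractor_frequencies(responses, options="ABCD"):
    """Fraction of examinees choosing each option; a distractor picked
    by almost no one is a candidate for revision."""
    return {opt: responses.count(opt) / len(responses) for opt in options}

# Hypothetical data: 10 examinees, one item keyed 'B'.
responses = ["B", "B", "A", "B", "C", "B", "D", "B", "A", "B"]
key = "B"
item_scores = [1 if r == key else 0 for r in responses]
totals = [38, 35, 20, 33, 18, 30, 15, 29, 22, 31]  # whole-exam totals

print(item_facility(item_scores))             # 0.6 (moderate facility)
print(item_discrimination(item_scores, totals))
print(distractor_frequencies(responses))
```

In practice the same three functions would be run over each of the exam's items against the full response matrix; options chosen by a negligible fraction of examinees are the "ineffective distractors" the abstract refers to.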


Keywords


assessment; item analysis; multiple choice; item facility; item discrimination; distractor efficiency






DOI: http://dx.doi.org/10.7575/aiac.alls.v.8n.6p.141





This work is licensed under a Creative Commons Attribution 4.0 International License.

2010-2020 (CC-BY) Australian International Academic Centre PTY. LTD.

Advances in Language and Literary Studies
