The Role of Order and Sequence of Options in Multiple-choice Questions for High-stakes Tests of English Language Proficiency

Talip Karanfil, Steve Neufeld


High-stakes and high-volume English language proficiency tests typically rely on multiple-choice questions (MCQs) to assess reading and listening skills. Due to the Covid-19 pandemic, more institutions are using MCQs via online assessment platforms, which facilitate shuffling the order of options within test items to minimize cheating. There is scant research on the role that the order and sequence of options play in MCQs, so this study examined the results of a paper-based, high-stakes English proficiency test administered in two versions. Each version had identical three-option MCQs but with different ordering of options. The test-takers were chosen to ensure a very similar profile of language ability and level for the groups who took the two versions. The findings indicate that one in four questions exhibited significantly different levels of difficulty and discrimination between the two versions. The study identifies order dominance and sequence priming as two factors that influence the outcomes of MCQs, both of which can accentuate or diminish the power of attraction of the correct and incorrect options. These factors should be carefully considered when designing MCQs for high-stakes language proficiency tests and when shuffling options in either paper-based or computer-based testing.
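The difficulty and discrimination comparisons described above rest on standard classical item-analysis statistics. As a minimal sketch of how such indices are typically computed (the function name and the conventional 27% upper/lower group split are illustrative assumptions, not details taken from the study):

```python
# Classical item analysis for one MCQ item: facility (difficulty) index and
# upper-lower discrimination index. Illustrative sketch only.

def item_statistics(item_scores, total_scores, group_fraction=0.27):
    """item_scores: 1/0 per test-taker for this item (1 = correct).
    total_scores: each test-taker's whole-test score, same order."""
    n = len(item_scores)
    # Facility index: proportion of test-takers answering the item correctly.
    facility = sum(item_scores) / n

    # Rank test-takers by total score, then compare the item's success rate
    # in the top and bottom scoring groups (conventionally the top/bottom 27%).
    ranked = sorted(range(n), key=lambda i: total_scores[i], reverse=True)
    k = max(1, round(n * group_fraction))
    upper_rate = sum(item_scores[i] for i in ranked[:k]) / k
    lower_rate = sum(item_scores[i] for i in ranked[-k:]) / k
    discrimination = upper_rate - lower_rate
    return facility, discrimination
```

Running the same analysis on each test version and comparing the per-item indices is one way such between-version differences in difficulty and discrimination can be detected.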


Keywords: Multiple-choice Questions (MCQs), Option Order in MCQs, Sequence Priming in MCQs, Order Dominance in MCQs




This work is licensed under a Creative Commons Attribution 4.0 International License.

2012-2021 (CC-BY) Australian International Academic Centre PTY.LTD

International Journal of Applied Linguistics and English Literature
