
Does Rearranging Multiple‐Choice Item Response Options Affect Item and Test Performance?

Author(s):
Wang, Lin
Publication Year:
2019
Report Number:
RR-19-02
Source:
ETS Research Report
Document Type:
Report
Page Count:
14
Subject/Key Words:
Response Option Rearrangement, Item Response Theory (IRT), Multiple Choice Items, Distractors (Tests), Item Calibration, Item Characteristic Curves, Item Performance, Scaling, Test Performance, Test Security, Cheating, Score Comparability, Test Design, Test Delivery

Abstract

Rearranging the response options of multiple‐choice items across different versions of a test can be an effective strategy against cheating. This study investigated whether rearranging response options would affect item performance and test score comparability. A study test was assembled as the base version, from which 3 variant versions were created by rearranging the response options of its items. The 4 versions were administered to randomly equivalent samples of approximately 1,200 test takers in an operational administration. The weighted root mean squared difference (WRMSD) and the test characteristic curves were computed from the data to assess the differences between the base version and its variants. Both the item‐level and test‐level results show very small differences between the base and the 3 variant versions.
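The abstract's item-level comparison rests on the weighted root mean squared difference between a base item's characteristic curve and its variant's. As a rough illustration of that idea (not the report's actual procedure), the sketch below compares two 3PL item characteristic curves over quadrature points, weighting squared differences by an assumed standard-normal ability distribution; the item parameters and quadrature grid are hypothetical.

```python
import numpy as np

def icc_3pl(theta, a, b, c):
    # 3PL item characteristic curve: probability of a correct
    # response at ability theta, with discrimination a,
    # difficulty b, and lower asymptote (guessing) c.
    return c + (1.0 - c) / (1.0 + np.exp(-1.7 * a * (theta - b)))

def wrmsd(theta, weights, params_base, params_variant):
    # Weighted root mean squared difference between two ICCs.
    # Weights (here, an assumed ability distribution over the
    # quadrature points) are normalized to sum to 1.
    p_base = icc_3pl(theta, *params_base)
    p_var = icc_3pl(theta, *params_variant)
    w = weights / weights.sum()
    return float(np.sqrt(np.sum(w * (p_base - p_var) ** 2)))

# Illustrative quadrature grid with standard-normal weights.
theta = np.linspace(-4.0, 4.0, 41)
weights = np.exp(-0.5 * theta ** 2)

# Hypothetical (a, b, c) parameters for a base item and the same
# item recalibrated after its response options were rearranged.
d = wrmsd(theta, weights, (1.0, 0.0, 0.2), (1.05, 0.1, 0.2))
```

A small WRMSD value (near zero) would indicate that the rearranged item functions essentially the same as the base item, which is the pattern the report describes.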
