
An Empirical Investigation of the Potential Impact of Item Misfit on Test Scores

Author(s):
Kim, Sooyeon; Robin, Frederic
Publication Year:
2017
Report Number:
RR-17-60
Source:
ETS Research Report
Document Type:
Report
Page Count:
13
Subject/Key Words:
Empirical Investigations, Test Scores, Multistage Testing (MST), Differential Item Functioning (DIF), Item Calibration, Linking Error, Subpopulation Invariance

Abstract

In this study, we examined the potential impact of item misfit on the reported scores of an admission test from the subpopulation invariance perspective. The target population of the test consisted of 3 major subgroups from different geographic regions. We used a logistic regression function to estimate item parameters for the operational items, based on empirical data accumulated over 3 years. A new set of item parameter estimates, derived separately from each subgroup's data, was compared with the original (i.e., operational) item parameter estimates to assess the degree of item misfit due to subgroup membership. Using the new set of item parameter estimates for each subgroup, we also updated the conversion tables, which were derived from the original item parameter estimates, and compared them with the original conversions to determine whether score invariance was achieved at the scaled score level. Score invariance was not fully achieved. Even so, the magnitude of the reported score differences (systematic error, or bias) caused by subgroup dependence was smaller than the standard error of measurement (random error) of the test. The study suggests a practical remedy for enhancing the level of score invariance of the test.
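To make the subgroup-dependence check concrete, below is a minimal sketch (not the authors' code or data) of a logistic regression DIF test of the kind the abstract's keywords point to: for one dichotomous item, a model predicting the response from a matching (ability) score is compared with a model that also includes subgroup membership; a significant improvement suggests the item's parameters depend on subgroup, i.e., item misfit across subpopulations. The simulated data, the three-group setup, and the use of a standardized total score as the ability proxy are illustrative assumptions.

```python
# Hedged sketch of a logistic-regression DIF check (Swaminathan-Rogers style),
# not the report's actual calibration procedure. All data are simulated.
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(0)

# Simulated examinees: 3 geographic subgroups, a standardized matching score,
# and one item with mild group-dependent difficulty (uniform DIF) built in.
n = 3000
group = rng.integers(0, 3, size=n)            # subgroup membership: 0, 1, 2
theta = rng.normal(size=n)                    # standardized ability proxy
logit = 1.2 * theta - 0.3 * (group == 2)      # group 2 finds the item harder
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

# Compact model: item response explained by the matching score alone.
X0 = sm.add_constant(theta)
m0 = sm.Logit(y, X0).fit(disp=0)

# Augmented model: add subgroup indicator variables (uniform DIF terms).
dummies = np.column_stack([(group == g).astype(float) for g in (1, 2)])
X1 = np.column_stack([X0, dummies])
m1 = sm.Logit(y, X1).fit(disp=0)

# Likelihood-ratio test: a significant improvement when subgroup is added
# indicates the item functions differently across subgroups.
lr = 2.0 * (m1.llf - m0.llf)
p = chi2.sf(lr, df=2)
print(f"LR = {lr:.2f}, p = {p:.4f}")
```

This sketch only illustrates the flagging step for a single item; the report's actual analysis goes further, recalibrating all operational items within each subgroup, rebuilding the raw-to-scale conversion tables, and judging the resulting scaled-score differences against the test's standard error of measurement.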
