
Assessing Differential Item Functioning in Performance Tests

Author(s):
Zwick, Rebecca J.; Donoghue, John R.; Grima, Angela
Publication Year:
1993
Report Number:
RR-93-14
Source:
ETS Research Report
Document Type:
Report
Page Count:
42
Subject/Key Words:
Construct Validity, Differential Item Functioning (DIF), Mantel-Haenszel Technique, Performance Assessment, Test Fairness, Test Items

Abstract

Although the belief has been expressed that performance assessments are intrinsically more fair than multiple-choice measures, some forms of performance assessment may in fact be more likely than conventional tests to tap construct-irrelevant factors. As performance assessment grows in popularity, it will be increasingly important to monitor the validity and fairness of alternative item types. The assessment of differential item functioning (DIF), as one component of this evaluation, can be helpful in investigating the effect on subpopulations of the introduction of performance tasks. Developing a DIF analysis strategy for performance measures requires decisions as to how the matching variable should be defined and how the analysis procedure should accommodate polytomous responses. In this study, two inferential procedures and two types of descriptive summaries that may be useful in assessing DIF in performance measures were explored and applied to simulated data. All the investigated statistics appear to be worthy of further study.
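The Mantel-Haenszel procedures examined in the report build on the standard dichotomous MH DIF statistic. As a reference point only, the common odds ratio and the ETS delta-scale index (MH D-DIF = -2.35 × ln α) for dichotomous items can be sketched as follows; the function names and table layout are illustrative choices, not code from the report:

```python
import math

def mh_odds_ratio(strata):
    """Mantel-Haenszel common odds ratio over score-matched strata.

    Each stratum is a 2x2 table given as a tuple:
    (ref_right, ref_wrong, focal_right, focal_wrong),
    where 'ref' is the reference group and 'focal' the focal group.
    """
    num = den = 0.0
    for a, b, c, d in strata:
        n = a + b + c + d
        if n == 0:
            continue  # empty stratum contributes nothing
        num += a * d / n
        den += b * c / n
    return num / den

def mh_d_dif(alpha):
    """ETS delta-scale DIF index: negative values indicate the item
    is relatively harder for the focal group."""
    return -2.35 * math.log(alpha)

# When the groups perform identically within every stratum,
# alpha is 1 and MH D-DIF is 0 (no DIF).
no_dif = [(50, 50, 50, 50), (80, 20, 80, 20)]
print(mh_d_dif(mh_odds_ratio(no_dif)))
```

Extending this idea to polytomous performance tasks (e.g., via the Mantel test or a generalized Mantel-Haenszel statistic over 2 × T × K tables) is precisely the kind of design decision the report investigates.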
