
Technology-Enhanced Items and Model–Data Misfit

Author(s):
Eckerly, Carol; Jia, Yue; Jewsbury, Paul A.
Publication Year:
2022
Report Number:
RR-22-11
Source:
ETS Research Report
Document Type:
Report
Page Count:
18
Subject/Key Words:
Technology-Enhanced Item, Item Response Theory, Misfit, Adaptive Testing, Psychometric Methods, Speededness (Tests), Motivation, Scaffolding (Teaching Technique), Generalized Partial-Credit Model (GPCM), Testlet, Branching Item, Polytomous Items, Scoring Rubric, Scoring Method, Nominal Response Model (NRM), Simulation Studies

Abstract

Testing programs have explored the use of technology-enhanced items alongside traditional item types (e.g., multiple-choice and constructed-response items) as measurement evidence of latent constructs modeled with item response theory (IRT). In this report, we discuss considerations in applying IRT models to a particular type of adaptive testlet referred to as a branching item. Under the branching format, all test takers are administered a common first question, and the next question is assigned deterministically on the basis of the response to the first; the items at both stages are then scored together as a single polytomous item. Real and simulated examples illustrate the challenges in applying IRT models to branching items. We find that model–data misfit is likely to occur when branching items are scored as polytomous items and modeled with the generalized partial credit model (GPCM), and that the relationship between the discrimination of the routing component and the discriminations of the subsequent components appears to drive the misfit. We conclude with lessons learned and suggested guidelines and considerations for operationalizing the use of branching items in future assessments.
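To make the branching format concrete, the sketch below simulates a two-stage branching item and compares the resulting polytomous score distribution with GPCM-implied category probabilities. It is a minimal illustration under stated assumptions, not the report's actual design: the 2PL component models, all item parameters, the additive 0/1/2 scoring rule, and the GPCM parameters used for comparison are hypothetical values chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def p_2pl(theta, a, b):
    """Probability of a correct response under the 2PL model (assumed component model)."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Hypothetical branching testlet: every test taker answers a routing item;
# the routing response deterministically assigns an easier or harder
# second-stage item, and the two 0/1 responses are summed into one
# polytomous score (0, 1, or 2). All parameter values are illustrative.
a_route, b_route = 1.2, 0.0   # routing component
a_easy,  b_easy  = 0.8, -1.0  # second stage after an incorrect routing response
a_hard,  b_hard  = 0.8,  1.0  # second stage after a correct routing response

def simulate_branching_score(theta):
    """Simulate one test taker's polytomous score for the branching item."""
    x1 = rng.random() < p_2pl(theta, a_route, b_route)
    if x1:
        x2 = rng.random() < p_2pl(theta, a_hard, b_hard)
    else:
        x2 = rng.random() < p_2pl(theta, a_easy, b_easy)
    return int(x1) + int(x2)

def gpcm_probs(theta, a, b_steps):
    """Category probabilities under the GPCM:
    P(X = k) proportional to exp(sum_{v<=k} a * (theta - b_v)), with the
    k = 0 term fixed at exp(0)."""
    cum = np.cumsum([0.0] + [a * (theta - b) for b in b_steps])
    num = np.exp(cum)
    return num / num.sum()

thetas = rng.normal(size=5000)
scores = np.array([simulate_branching_score(t) for t in thetas])

# Compare empirical category proportions in a narrow ability stratum with
# GPCM probabilities at the stratum midpoint (GPCM parameters assumed, not
# estimated); a systematic gap here is the kind of model-data misfit the
# report examines when routing and second-stage discriminations differ.
mid = (thetas > -0.25) & (thetas < 0.25)
emp = np.bincount(scores[mid], minlength=3) / mid.sum()
print("empirical:", np.round(emp, 3))
print("GPCM     :", np.round(gpcm_probs(0.0, a=1.0, b_steps=[-0.5, 0.5]), 3))
```

In this setup, varying the routing discrimination relative to the second-stage discriminations changes how well any single GPCM discrimination can reproduce the combined score distribution, which is one way to explore the misfit pattern the abstract describes.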
