
An Alternative Method for Scoring Adaptive Tests

Author(s):
Stocking, Martha L.
Publication Year:
1994
Report Number:
RR-94-48
Source:
ETS Research Report
Document Type:
Report
Page Count:
38
Subject/Key Words:
Adaptive Testing, Computer Assisted Testing, Equated Scores, Item Response Theory (IRT), Scoring

Abstract

Modern applications of computerized adaptive testing (CAT) are typically grounded in Item Response Theory (IRT; Lord, 1980). While the IRT foundations of adaptive testing provide a number of approaches to adaptive test scoring that may seem natural and efficient to psychometricians, these approaches can be harder for test takers, test score users, interested regulatory institutions, and other audiences to understand. An alternative method, based on more familiar equated number-correct scores and identical to that used to score and equate many conventional tests, is explored and compared with one that relies more directly on IRT. The conclusion is reached that scoring adaptive tests using the familiar number-correct score, accompanied by the equating necessary to adjust for the intentional differences in adaptive test difficulty, is a statistically viable, although slightly less efficient, method of adaptive test scoring. To enhance the prospects for enlightened public debate about adaptive testing, it may be preferable to use this more familiar approach. Public attention would then likely be focused on the issue most central to adaptive testing, namely the adaptive nature of the test itself.
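To make the contrast in the abstract concrete, below is a minimal sketch of the two scoring routes it compares: direct IRT scoring (a maximum-likelihood estimate of ability from the response pattern) versus the more familiar number-correct score, equated to a reference form through IRT true-score equating via the test characteristic curve (Lord, 1980). The item parameters, the two forms, the response pattern, and all function names are hypothetical illustrations, not the report's actual data or its exact procedure.

```python
import math

# Hypothetical 3PL item parameters (a, b, c) for a short adaptive test
# and a conventional reference form; values are purely illustrative.
ADAPTIVE_ITEMS = [(1.2, -0.5, 0.20), (0.9, 0.0, 0.15), (1.5, 0.4, 0.20),
                  (1.1, 0.8, 0.25), (1.3, 1.2, 0.20)]
REFERENCE_ITEMS = [(1.0, -1.0, 0.20), (1.0, -0.5, 0.20), (1.0, 0.0, 0.20),
                   (1.0, 0.5, 0.20), (1.0, 1.0, 0.20)]

def p_correct(theta, item):
    """3PL probability of a correct response (Lord, 1980)."""
    a, b, c = item
    return c + (1.0 - c) / (1.0 + math.exp(-1.7 * a * (theta - b)))

def tcc(theta, items):
    """Test characteristic curve: expected number-correct at theta."""
    return sum(p_correct(theta, it) for it in items)

def mle_theta(responses, items, lo=-4.0, hi=4.0):
    """IRT-based scoring: maximum-likelihood theta, here by grid search."""
    def loglik(theta):
        return sum(math.log(p_correct(theta, it)) if u
                   else math.log(1.0 - p_correct(theta, it))
                   for u, it in zip(responses, items))
    grid = [lo + i * (hi - lo) / 400 for i in range(401)]
    return max(grid, key=loglik)

def theta_for_true_score(score, items, lo=-4.0, hi=4.0):
    """Invert the (monotone) TCC by bisection: expected score == score."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if tcc(mid, items) < score:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def equated_number_correct(responses, adaptive_items, reference_items):
    """Score by familiar number-correct, then equate to the reference
    form: invert the adaptive test's TCC at the observed number-correct,
    and evaluate the reference form's TCC at the resulting theta."""
    nc = sum(responses)
    # keep the score strictly between the TCC asymptotes so it is invertible
    floor = sum(c for _, _, c in adaptive_items)
    nc_adj = min(max(nc, floor + 0.01), len(adaptive_items) - 0.01)
    theta = theta_for_true_score(nc_adj, adaptive_items)
    return tcc(theta, reference_items)

responses = [1, 1, 1, 0, 0]  # illustrative right/wrong pattern
print("IRT (MLE) theta:", round(mle_theta(responses, ADAPTIVE_ITEMS), 2))
print("Equated number-correct on reference form:",
      round(equated_number_correct(responses, ADAPTIVE_ITEMS, REFERENCE_ITEMS), 2))
```

The equating step is what absorbs the intentional difficulty differences the abstract mentions: two examinees with the same raw count on adaptive tests of different difficulty map through different TCCs and so receive different equated scores, while the reported metric stays the familiar number-correct scale.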
