
Qualifying Readers for the Online Scoring Network: Scoring Argument Essays

Author(s):
Powers, Donald E.; Kubota, Melvin Y.; Bentley, Jill; Farnum, Marisa; Swartz, Richard; Willard, Ann E.
Publication Year:
1998
Report Number:
RR-98-28
Source:
ETS Research Report
Document Type:
Report
Page Count:
17
Subject/Key Words:
Essay Tests, Scoring, Qualifications, Online Systems, Computer Assisted Testing, Constructed Response

Abstract

The major objective of the study was to extend, on a smaller scale, the results of a previous study to a second kind of writing prompt, "analysis of an argument," which is used in the GMAT writing assessment and is being considered for the GRE writing test. To accomplish this objective, experienced and inexperienced readers evaluated sets of essays both before and after they received standard training for scoring essays (of the kind requiring examinees to "discuss an issue"). The results showed that training did improve the accuracy with which readers scored essays. Moreover, after training, a significant proportion of inexperienced readers exhibited a level of accuracy commensurate with that shown by their more experienced counterparts. As in the earlier study, the results suggest that inexperienced readers without the currently required credentials can be trained to score "argument" essays with a high degree of accuracy.
