
E-rater Performance on GRE Essay Variants

Author(s):
Attali, Yigal; Bridgeman, Brent; Trapani, Catherine S.
Publication Year:
2014
Source:
Wendler, Cathy; Bridgeman, Brent (eds.) with assistance from Chelsea Ezzo. The Research Foundation for the GRE revised General Test: A Compendium of Studies. Princeton, NJ: Educational Testing Service, 2014, pp. 4.6.1-4.6.5
Document Type:
Chapter
Page Count:
5
Subject/Key Words:
Graduate Record Examination (GRE), Revised GRE, Test Design, Test Revision, Automated Essay Scoring (AES), Human Scoring, e-rater, Scoring Models, Essay Prompts

Abstract

Reports on a study that investigated the use of e-rater with essay variants. These variants, which are created from a parent prompt, ask a focused question that addresses a specific aspect of the prompt and requires the test taker to respond to that aspect. Variants were developed to address the problem of memorized responses to essay prompts that did not make such specific demands and to help test developers enlarge the pool of essay topics. This chapter extends the findings of the study described in Chapter 1.10, which examined the comparability of essay variants as scored by human raters, by comparing human scores with e-rater scores on essay variants. Findings indicated that e-rater could score all variant types and that its performance did not differ significantly across variant types.
