
Analytic Scoring of TOEFL CBT Essays: Scores From Humans and E-rater

Author(s):
Lee, Yong-Won; Gentile, Claudia A.; Kantor, Robert
Publication Year:
2008
Report Number:
RR-08-01, TOEFL-RR-81
Source:
ETS Research Report
Document Type:
Report
Page Count:
71
Subject/Key Words:
Test of English as a Foreign Language (TOEFL), Computer-Based Testing (CBT), Automated Essay Scoring (AES), Electronic Essay Rater (E-rater), Analytic Scoring, Holistic Scoring, English as a Second Language (ESL), Writing Assessment, Feedback

Abstract

The main purpose of this study was to investigate the distinctness and reliability of analytic (or multitrait) rating dimensions and their relationships to holistic scores and e-rater essay feature variables in the context of the TOEFL computer-based test (CBT) writing assessment. The data analyzed in the study were analytic and holistic essay scores provided by human raters and essay feature variable scores computed by e-rater (version 2.0) for two TOEFL CBT writing prompts. It was found that (a) all six analytic scores were not only correlated among themselves but also correlated with the holistic scores, (b) the high correlations among the holistic and analytic scores were largely attributable to the impact of essay length on both types of scoring, (c) there may be some potential for profile scoring based on analytic scores, and (d) strong associations were confirmed between several e-rater variables and analytic ratings. Implications are discussed for improving the analytic scoring of essays, validating automated scores, and refining e-rater essay feature variables.
