
Exploration of an Automated Editing Task as a GRE Writing Measure

Author: Breland, Hunter M.
Publication Year:
Report Number:
Document Type: ETS Research Report
Page Count:
Subject/Key Words: Graduate Record Examinations Board, Interrater Reliability, Writing Skills, Self Evaluation (Individuals), Grade Point Average (GPA), Graduate Students, Editing, Multivariate Analysis, Automation, Computer Assisted Testing


Two editing tasks were developed and programmed for the computer to explore whether such tasks might be useful as measures of writing skill. An informal data collection was then conducted with 52 prospective graduate students, who completed the editing tasks with no time limit, along with a writing experience questionnaire. Scores on the two editing tasks were correlated with variables derived from the questionnaire. The total score for the two editing tasks correlated .52 with student self-assessments of writing ability, .46 with grade-point average (GPA) based on courses requiring at least some writing, and .30 with writing accomplishments. The correlation with overall GPA, however, was only .14. The reliability of the total editing score was estimated at .84.
