The Assessment of Writing Ability: A Review of Research

Author(s):
Cooper, Peter L.
Publication Year:
1984
Report Number:
RR-84-12
Source:
ETS Research Report
Document Type:
Report
Page Count:
50
Subject/Key Words:
Graduate Record Examinations Board, Essay Tests, Literature Reviews, Multiple Choice Tests, Test Reliability, Test Validity, Writing (Composition), Writing Evaluation

Abstract

Recent information from established testing programs was used to investigate the nature and limitations of essay and multiple-choice tests of writing ability, the statistical relationship between the two types of tests, the performance of population subgroups on each, the possible need for different tests of composition skill in different disciplines, and the cost and usefulness of various writing evaluation strategies. The literature indicates that essay tests are often considered more valid than multiple-choice tests. Although essay tests may sample a wider range of composition skills, the variance in essay test scores can reflect such irrelevant factors as speed and fluency under time pressure, or even penmanship. Essay test scores are also typically far less reliable than multiple-choice test scores. Multiple-choice measures tend to overpredict the performance of minority candidates on essay tests. It is not certain whether multiple-choice tests have the same predictive validity for candidates in different academic disciplines, where writing requirements may vary. Still, performance on multiple-choice and essay tests of writing ability appears to be closely related. The best measures of writing ability include both essay and multiple-choice sections, but this design can be prohibitively expensive.