
A Comparison of Free-Response and Multiple-Choice Questions in the Assessment of Reading Comprehension

Ward, William C.; Dupree, David; Carlson, Sybil B.
Publication Year:
Report Number: ETS Research Report
Document Type:
Page Count:
Subject/Key Words: Performance Factors, Reading Comprehension, Response Style (Tests), Test Construction, Test Format, Test Items


Undergraduate students completed a reading comprehension test in which each of a number of passages served as the basis for four to eight questions, half framed in multiple-choice format and half requiring the examinee to produce an appropriate answer. Questions within a passage were balanced according to the kind of information processing they were thought to require: Explicit questions dealt with information given explicitly in the passage; Inference questions required inference from material presented; and Application and Evaluation questions required the examinee to go beyond what was presented, considering applications of the ideas or evaluating the logic or style of the passage. It was hypothesized that questions of the first type would show no systematic differences associated with question format, that those of the second type might, and that those of the third type would draw on somewhat different cognitive abilities when examinees were required to produce, rather than merely recognize, an appropriate answer.

Contrary to this hypothesis, performance showed no systematic differences associated with format, and relations of test scores with other cognitive measures likewise failed to show format-related differences. Despite this evidence of similarity across formats, however, there was also evidence of systematic differences for individuals: the subjects-by-format interaction in the analysis of variance, while not strong, was significant at the five percent level. This effect indicates that some students showed larger performance differences between the two response formats than did others. It was speculated that two changes in test items might be required to elicit the format-related performance differences observed in some other testing situations: items may need to be structured so that they draw less on information given in or derivable from the test materials and more on information whose relevance is not specified for the examinee, and types of questions with which examinees are less well practiced may need to be employed. (30 pp.)
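The subjects-by-format interaction reported above can be sketched numerically: a format effect that is constant across subjects produces no interaction, while per-subject differences that vary from subject to subject do. The following minimal Python sketch uses invented scores (the report's actual data are not reproduced here) to show the quantity such an interaction reflects.

```python
# Hypothetical illustration of a subjects-by-format interaction.
# All scores are invented; each subject has a (multiple_choice, free_response)
# total on the balanced item sets described in the abstract.
scores = {
    "s1": (14, 13),
    "s2": (10, 15),
    "s3": (12, 12),
    "s4": (16, 9),
}

# Per-subject difference between formats. A pure main effect of format would
# make these differences roughly constant; an interaction means they vary.
diffs = {s: mc - fr for s, (mc, fr) in scores.items()}
mean_diff = sum(diffs.values()) / len(diffs)

# Sample variance of the differences around their mean: the heterogeneity
# that the subjects-by-format interaction term in the ANOVA captures.
var_diff = sum((d - mean_diff) ** 2 for d in diffs.values()) / (len(diffs) - 1)

print(diffs)       # per-subject format differences
print(mean_diff)   # average format effect across subjects
print(var_diff)    # spread of the effect: the interaction signal
```

With these invented numbers the average format effect is small (0.75 points) while individual differences range from -5 to +7, which is the pattern the abstract describes: little overall format difference, but clearly unequal format effects across students.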
