
A Complexity Analysis of Items From a Survey of Academic Achievement in the Life Sciences (NAEP)

Author(s):
Allen, Nancy L.; Enright, Mary K.; Kim, Myung-In
Publication Year:
1993
Report Number:
RR-93-18
Source:
ETS Research Report
Document Type:
Report
Page Count:
40
Subject/Key Words:
Achievement Tests, Construct Validity, Difficulty Level, Item Analysis, National Assessment of Educational Progress (NAEP), Performance Factors, Science

Abstract

The difficulty of 44 items from the life sciences subscale of the NAEP 1985-86 science assessment was analyzed in terms of item attributes and science educators' judgments of difficulty. The attributes included ratings of various characteristics of the items' text and option set, the items' cognitive demand, and the level of knowledge required by the items. Science educators' mean judgment of item difficulty, which accounted for 52% of the variance, was the best single predictor of item difficulty. Combining item attribute information with educators' judgments of item difficulty improved the prediction of item difficulty on the order of 7% to 15% of the variance. When item difficulty was modeled in terms of discrete item attributes (with global judgments of item difficulty excluded from the model), the level of knowledge required was an important determinant of difficulty, while cognitive demand was not. The implications of these results for construct validation and for test design are discussed.
