The purposes of this study were threefold: 1) to examine and compare the stability of IRT item difficulty parameter estimates and conventional item difficulty estimates for a set of items from an Admissions Testing Program Biology achievement test administered to two groups: students who had recently completed a biology course, and students who, for the most part, had received no formal instruction in the content area during the 6 to 18 months prior to taking the test; 2) to assess the impact of any instability in the item difficulty estimates on score equating under both IRT and conventional equating methods; and 3) to use confirmatory factor analytic techniques to assess differences in the factor structures of the set of common items across the two groups, in an attempt to identify the specific curriculum effects underlying the instability of the item difficulty estimates. (63pp.)