This paper reports on a set of think-aloud studies examining the comparability of item presentation formats between paper- and computer-based forms of the revised GRE® test, and consequently the validity of inferences based on scores regardless of delivery method. Verbal Text Completion and Quantitative Numeric Entry items were administered in three response and presentation formats to 25 adults whose native language was not English, in order to identify construct-irrelevant challenges that test takers might encounter when items developed for computer delivery are administered in a paper-based format. The three formats studied were (a) computer-based delivery with responses entered on the computer, (b) paper-based delivery with responses written directly in the test booklet, and (c) paper-based delivery with responses transferred to a separate answer sheet. Of interest were obstacles associated with each response and presentation format, as well as issues specific to the item types and subject matter. Overall, the results provide evidence supporting the comparability of the test formats. Possible areas for future research are discussed.