Moving from paper-based to online administration of an assessment offers many advantages, including cost savings, speed, accuracy, and environmental conservation. However, a question arises as to whether changing the modality of administration affects reliability, and thus validity: that is, how scores or ratings should be interpreted. We investigated whether interrater reliability (within-class variance) for the SIR II™ Student Instructional Report differed between the paper-based and online versions. Our results indicated that it did not. These findings provide additional evidence that moving to an online version of the instrument does not change how SIR II ratings should be interpreted.