
Testing the Invariance of Interrater Reliability Between Paper-Based and Online Modalities of the SIR II Student Instructional Report

Author(s):
Klieger, David M.; Centra, John A.; Young, John W.; Holtzman, Steven; Kotloff, Lauren J.
Publication Year:
2014
Source:
SIR II Report
Document Type:
Report
Page Count:
9
Subject/Key Words:
Course Evaluation, Teaching Evaluation, Teacher Evaluation, College Instruction, Classroom Instruction, Interrater Reliability, Student Instructional Report (SIR II), Delivery System

Abstract

Moving from paper-based to online administration of an assessment presents many advantages, including cost savings, speed, accuracy, and environmental conservation. However, a question arises as to whether changing the modality of administration affects reliability and thus validity (that is, how scores or ratings should be interpreted). We investigated whether the interrater reliability (within-class variance) for the SIR II™ Student Instructional Report differed between the paper-based and online versions. Our results indicated that it did not. The findings provide additional evidence that moving to an online version of the instrument does not change how one should interpret SIR II ratings.
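To give a concrete sense of the kind of comparison the abstract describes, the sketch below illustrates one way to compute within-class rating variance separately for paper-based and online administrations and test whether the two distributions of variances differ. This is a minimal, hypothetical illustration, not the study's actual analysis: the column names, the simulated data, and the use of a Welch t-test are all assumptions introduced here for clarity.

```python
# Hypothetical sketch: comparing within-class (within-course-section) rating
# variance between paper-based and online administrations. Column names and
# data are illustrative only, not drawn from the SIR II study.
import numpy as np
import pandas as pd
from scipy import stats

def within_class_variances(df, rating_col="rating", class_col="class_id"):
    """Return the sample variance of ratings within each class section."""
    return df.groupby(class_col)[rating_col].var(ddof=1).dropna()

# Simulated example data: each row is one student's rating of one class.
rng = np.random.default_rng(0)

def simulate(modality, n_classes=50, students_per_class=20, noise_sd=0.8):
    rows = []
    for c in range(n_classes):
        class_mean = rng.normal(4.0, 0.4)  # class-level mean rating
        ratings = rng.normal(class_mean, noise_sd, students_per_class)
        rows += [{"modality": modality, "class_id": f"{modality}_{c}",
                  "rating": r} for r in np.clip(ratings, 1, 5)]
    return pd.DataFrame(rows)

paper = simulate("paper")
online = simulate("online")

var_paper = within_class_variances(paper)
var_online = within_class_variances(online)

# Compare the distributions of within-class variances across modalities.
t_stat, p_value = stats.ttest_ind(var_paper, var_online, equal_var=False)
print(f"Mean within-class variance (paper):  {var_paper.mean():.3f}")
print(f"Mean within-class variance (online): {var_online.mean():.3f}")
print(f"Welch t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```

A nonsignificant difference in within-class variances under a comparison like this would be consistent with the report's conclusion that the administration modality does not change how SIR II ratings should be interpreted.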
