Test Reliability—Basic Concepts
- Author(s): Livingston, Samuel A.
- Publication Year: 2018
- Report Number: RM-18-01
- Source: ETS Research Memorandum
- Document Type: Report
- Page Count: 46
- Subject/Key Words: Test Reliability, True Score, Error of Measurement, Alternate Forms, Interrater Reliability, Internal Consistency, Reliability Coefficient, Standard Error of Measurement, Classification Consistency, Classification Accuracy, Human Raters
Abstract
The reliability of test scores is the extent to which they are consistent across different occasions of testing, different editions of the test, or different raters scoring the test taker’s responses. This guide explains the meaning of several terms associated with the concept of test reliability: “true score,” “error of measurement,” “alternate-forms reliability,” “interrater reliability,” “internal consistency,” “reliability coefficient,” “standard error of measurement,” “classification consistency,” and “classification accuracy.” It also explains the relationship between the number of questions, problems, or tasks in the test and the reliability of the scores.
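The abstract's final point, that reliability depends on the number of questions, problems, or tasks, is conventionally quantified by the Spearman-Brown prophecy formula. The sketch below is an illustration of that standard formula, not code taken from the memorandum itself:

```python
def spearman_brown(reliability: float, length_factor: float) -> float:
    """Predict the reliability of a test lengthened by `length_factor`.

    Spearman-Brown prophecy formula: r' = k*r / (1 + (k - 1)*r),
    where r is the current reliability coefficient and k is the
    factor by which the number of items is multiplied.
    """
    return (length_factor * reliability) / (1 + (length_factor - 1) * reliability)

# Doubling a test whose scores have reliability 0.60:
print(round(spearman_brown(0.60, 2), 3))  # 0.75
```

Note that the gain diminishes as reliability approaches 1.0: lengthening helps unreliable tests far more than already-reliable ones.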
- https://www.ets.org/Media/Research/pdf/RM-18-01.pdf