How to Equate Tests With Little or No Data
- Author(s):
- Mislevy, Robert J.; Sheehan, Kathleen M.; Wingersky, Marilyn S.
- Publication Year:
- 1992
- Report Number:
- RR-92-20-ONR
- Source:
- ETS Research Report
- Document Type:
- Report
- Page Count:
- 49
- Subject/Key Words:
- Bayesian Statistics, Cognitive Processes, Equated Scores, Item Response Theory (IRT), Pre-Professional Skills Tests (PPST), Statistical Analysis
Abstract
Standard procedures for equating tests, including those based on item response theory (IRT), require item responses from large numbers of examinees. Such data may not be forthcoming for reasons theoretical, political, or practical. Information about items' operating characteristics may be available from other sources, however, such as content and format specifications, expert opinion, or psychological theories about the skills and strategies required to solve them. This paper shows how, in the IRT framework, collateral information about items can be exploited to augment or even replace examinee responses when linking or equating new tests to established scales. The procedures are illustrated with data from the Pre-Professional Skills Test (PPST). (48pp.)
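The abstract describes, at a high level, treating collateral information about items (content and format specifications, expert opinion, cognitive theory) as Bayesian prior information on item parameters, so that new items can be placed on an established IRT scale with few or no examinee responses. The sketch below is a minimal illustration of that general idea, not the report's actual procedure: it assumes a Rasch model, a feature-based regression that supplies a prior mean for a new item's difficulty, and a handful of simulated responses; all coefficients, features, and data are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import expit

rng = np.random.default_rng(0)

# Hypothetical collateral-information regression fitted on the established
# item pool: predicts item difficulty from coded item features.
beta = np.array([0.2, 0.8, -0.5])   # illustrative coefficients (intercept, 2 features)
sigma_prior = 0.4                   # assumed residual SD of the prediction

# A new item with known features but only a few examinee responses.
x_new = np.array([1.0, 1.0, 0.0])   # design vector for the new item
b_prior_mean = x_new @ beta         # prior mean difficulty from collateral info

thetas = rng.normal(size=5)                        # abilities of the few respondents
responses = rng.binomial(1, expit(thetas - 1.0))   # simulated 0/1 responses

def neg_log_posterior(b):
    """Rasch likelihood of the sparse responses plus the collateral-info prior."""
    p = expit(thetas - b)
    loglik = np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))
    logprior = -0.5 * ((b - b_prior_mean) / sigma_prior) ** 2
    return -(loglik + logprior)

# MAP estimate of the new item's difficulty; with no responses at all,
# the estimate simply falls back to the prior mean.
b_map = minimize_scalar(neg_log_posterior, bounds=(-4, 4), method="bounded").x
print(f"prior mean = {b_prior_mean:.2f}, MAP difficulty = {b_map:.2f}")
```

With abundant response data the likelihood dominates and the prior matters little; with sparse or absent data the collateral-information prior carries the linking, which is the situation the report addresses.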
- http://dx.doi.org/10.1002/j.2333-8504.1992.tb01451.x