
Orthogonal Regression, the Cleary Criterion, and Lord’s Paradox: Asking the Right Questions

Author(s):
Kane, Michael; Mroch, Andrew A.
Publication Year:
2020
Report Number:
RR-20-14
Source:
ETS Research Report
Document Type:
Report
Page Count:
24
Subject/Key Words:
Lord's Paradox, Linear Regression, Regression Models, Cleary Model, Test Bias, Predictive Bias

Abstract

Ordinary least squares (OLS) regression and orthogonal regression (OR) address different questions and make different assumptions about errors. The OLS regression of Y on X yields predictions of a dependent variable (Y) conditional on an independent variable (X) and minimizes the sum of squared errors of prediction. It assumes that the independent variable (X) is an observed score known without error, so all of the error is assigned to the dependent variable (Y). OLS is not designed to estimate underlying functional relationships, and if both variables contain error, it tends to yield biased estimates of such functional (or true-score) relationships. OR models, including the errors-in-variables (EIV) and geometric-mean (GM) models, assume that both variables contain error and seek the line that minimizes squared deviations of the data points from the line in both the X and Y directions. Because the two methods serve different purposes, the choice between them depends on the question being asked: if one wants to predict one variable from another, OLS regression is optimal and OR is less efficient; if one wants to examine the functional relationship between two variables, OR provides a more plausible model. OR models are hard to apply in many contexts because they depend on strong assumptions about the sources of error. As examples of cases where OR can shed light, we examine its use in analyzing test bias, as distinct from predictive bias, and in making sense of Lord’s paradox.
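
The following simulation is a minimal sketch, not taken from the report, illustrating the attenuation the abstract describes: when both X and Y contain error, the OLS slope of Y on X is biased toward zero, while OR-style estimators can recover the true-score slope. The EIV slope here is the standard Deming formula, which assumes the ratio of error variances is known; the GM slope uses sign(r) * sd(y) / sd(x). All variable names and parameter values are illustrative assumptions, and the particular formulations may differ from those in the report.

```python
# Sketch (illustrative only): OLS vs. two OR-style slopes when both
# variables are observed with error, as discussed in the abstract.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
true_slope = 1.0                          # assumed functional relationship

t_x = rng.normal(0.0, 1.0, n)             # latent true scores on X
t_y = true_slope * t_x                    # true-score relationship: T_y = 1.0 * T_x
x = t_x + rng.normal(0.0, 0.5, n)         # observed X = true score + error
y = t_y + rng.normal(0.0, 0.5, n)         # observed Y = true score + error

s_xx = np.var(x, ddof=1)
s_yy = np.var(y, ddof=1)
s_xy = np.cov(x, y, ddof=1)[0, 1]

# OLS slope of Y on X: cov(x, y) / var(x); attenuated, because the
# error in X inflates var(x).
b_ols = s_xy / s_xx

# EIV (Deming) slope, assuming a known error-variance ratio
# lam = var(e_y) / var(e_x); lam = 1 by construction in this simulation.
lam = 1.0
b_eiv = (s_yy - lam * s_xx
         + np.sqrt((s_yy - lam * s_xx) ** 2 + 4.0 * lam * s_xy ** 2)) / (2.0 * s_xy)

# GM slope: sign(r) * sd(y) / sd(x); treats X and Y symmetrically.
b_gm = np.sign(s_xy) * np.sqrt(s_yy / s_xx)

print(f"OLS slope: {b_ols:.3f}")   # ~0.80 (attenuated by error in X)
print(f"EIV slope: {b_eiv:.3f}")   # ~1.00 (recovers the true-score slope)
print(f"GM  slope: {b_gm:.3f}")    # ~1.00 here, since sd(x) = sd(y)
```

Note that both OR estimators succeed here only because of the assumptions built into the simulation: the EIV slope requires the error-variance ratio lam to be known, and the GM slope matches the true-score slope only under particular variance conditions (equal error variances and a unit slope in this setup). This echoes the abstract's caveat that OR models depend on strong assumptions about the sources of error.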
