
Bridging Validity and Evaluation to Match International Large‐Scale Assessment Claims and Country Aims

Oliveri, Maria Elena; Rutkowski, David; Rutkowski, Leslie
Publication Year:
Report Number:
Document Type: ETS Research Report
Page Count:
Subject/Key Words: International Large-Scale Assessments (ILSA), Consequences, Test Validity, Test Fairness, Evaluation, Claims


Fifty years after the first international large‐scale assessment (ILSA), participation in these studies continues to grow, with more than 50% of the world's countries taking part. Concomitant with this growth is an expansion in the diversity of participating countries with respect to languages, cultures, and educational perspectives and goals. Because educational aims may differ for new participants and goals among longstanding participants can be expected to shift over time, it is useful to understand the degree to which countries' expectations of ILSAs, as a means for understanding their educational systems, align with the explicitly and implicitly stated purposes of these studies. In this presentation, we shift the discussion away from countries' reports of ILSA shock and dissatisfaction with participation and toward a productive conversation about the value and utility of taking part. We propose a framework that combines notions from meta‐evaluation, used to systematically examine the evaluation tools themselves (the ILSAs), with validity theory, concerning test use and alignment with stakeholder needs, to help countries understand why they participate in ILSAs and what value participation may hold. We develop this conceptual framework with the aim that countries can (a) systematically consider their educational goals and the degree to which ILSA participation can reasonably help them monitor progress toward those goals; (b) use an argument model to analyze the claims made by ILSA programs against the background of a country's specific context; and (c) more clearly understand the intended and unintended consequences of ILSA participation. The framework offers a tool for systematically thinking through the complex web of implicit and explicit purposes, goals, and actors related to ILSAs and educational systems.
To demonstrate the proposed framework, we review national education agendas in several countries with differing educational traditions (e.g., the United States, Mexico, and Norway) against published ILSA frameworks. Applying the proposed method yields a set of general guidelines that national funders can use to chart a path forward regarding future ILSA participation. It can also equip participating countries with the knowledge to engage in reasoned conversations with testing organizations about needs that current testing programs leave unmet.
