
A Review of Evidence Presented in Support of Three Key Claims in the Validity Argument for the TextEvaluator Text Analysis Tool

Author(s):
Sheehan, Kathleen M.
Publication Year:
2016
Report Number:
RR-16-12
Source:
ETS Research Report
Document Type:
Report
Page Count:
17
Subject/Key Words:
Bias; Common Core State Standards (CCSS); Literary Genres; Readability; Text Complexity; TextEvaluator; Validity

Abstract

The TextEvaluator text analysis tool is a fully automated text complexity evaluation system designed to help teachers and other educators select texts that are consistent with the text complexity guidelines specified in the Common Core State Standards (CCSS). This paper provides an overview of the TextEvaluator measurement approach and summarizes evidence related to three key claims in the TextEvaluator validity argument: (a) TextEvaluator has succeeded in expanding construct coverage beyond the two dimensions of text variation traditionally assessed by readability metrics; (b) the TextEvaluator strategy of estimating distinct prediction models for informational, literary, and mixed texts has succeeded in generating text complexity predictions that exhibit little, if any, genre bias; and (c) TextEvaluator scores are highly correlated with text complexity judgments provided by human experts, including judgments generated via the inheritance method and judgments generated via the exemplar method. Implications for the goal of helping teachers and other educators select texts that are closely aligned with the accelerated text complexity exposure trajectory outlined in the CCSS are discussed.