
AutoESD: An Automated System for Detecting Nonauthentic Texts for High-Stakes Writing Tests

Author(s):
Choi, Ikkyu; Hao, Jiangang; Li, Chen; Fauss, Michael; Novak, Jakub
Publication Year:
2024
Report Number:
RR-24-08
Source:
ETS Research Report
Document Type:
Report
Page Count:
18
Subject/Key Words:
Writing Tasks, Essays, High Stakes Tests, Test Security, TOEFL iBT, Automated Essay Evaluation Systems, Human-in-the-Loop, Similarity Measures, Pairwise Comparisons

Abstract

A frequently encountered security issue in writing tests is nonauthentic text submission: Test takers submit texts that are not their own but rather are copies of texts prepared by someone else. In this report, we propose AutoESD, an automated, human-in-the-loop system for detecting nonauthentic texts in large-scale writing tests, and report its performance on an operational data set. The AutoESD system uses multiple automated text similarity measures to identify suspect texts and provides an analytics-enhanced web application that helps human experts review the identified texts. To evaluate the performance of AutoESD, we computed its similarity measures on TOEFL iBT writing responses collected from multiple remote administrations and examined their distributions. The results were highly encouraging: the distributional characteristics of the AutoESD similarity measures were effective in identifying suspect texts, and the measures could be computed quickly enough not to affect the operational score turnaround timeline.
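The report's core idea is to compare responses pairwise with automated similarity measures and surface high-similarity pairs for expert review. The specific measures used by AutoESD are not described in this abstract; the sketch below is only an illustrative Python example of one generic measure (Jaccard similarity over character n-grams) applied to all response pairs, with the function names, threshold, and sample responses being hypothetical.

from itertools import combinations

def char_ngrams(text: str, n: int = 5) -> set[str]:
    """Return the set of character n-grams in a whitespace-normalized, lowercased text."""
    normalized = " ".join(text.lower().split())
    return {normalized[i:i + n] for i in range(len(normalized) - n + 1)}

def jaccard_similarity(a: set[str], b: set[str]) -> float:
    """Jaccard similarity between two n-gram sets (0 = disjoint, 1 = identical)."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def flag_suspect_pairs(responses: dict[str, str], threshold: float = 0.6):
    """Compare every pair of responses and flag pairs whose similarity
    exceeds the threshold, sorted for human review (hypothetical threshold)."""
    ngram_sets = {rid: char_ngrams(text) for rid, text in responses.items()}
    flagged = []
    for id_a, id_b in combinations(ngram_sets, 2):
        score = jaccard_similarity(ngram_sets[id_a], ngram_sets[id_b])
        if score >= threshold:
            flagged.append((id_a, id_b, score))
    return sorted(flagged, key=lambda t: t[2], reverse=True)

if __name__ == "__main__":
    # Toy data: two near-identical responses and one unrelated response.
    sample = {
        "r1": "Technology has changed the way students learn and communicate.",
        "r2": "Technology has changed the way that students learn and communicate.",
        "r3": "I prefer to study in the morning because my mind is fresh.",
    }
    for pair in flag_suspect_pairs(sample, threshold=0.5):
        print(pair)

In this toy run, only the (r1, r2) pair exceeds the threshold and would be queued for expert review; an operational system such as AutoESD would combine several measures and scale the pairwise comparison across much larger response pools.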
