Listed below are publications that report results of the CBAL® Initiative's work or that use CBAL data or instruments.
Statistical Results From the 2013 CBAL® English Language Arts Multistate Study: Parallel Forms for Argumentative Writing
P. W. van Rijn & Y. Yan-Koo (2016)
ETS Research Memorandum No. RM-16-15
In this paper, statistical results of part of the 2013 CBAL English-language arts multistate study are reported. The focus is on the evaluation of parallelism of three argumentative writing assessment forms. View the abstract or request a copy of the report.
Opt Out: An Examination of Issues
R. E. Bennett (2016)
ETS Research Report, No. RR-16-13
Drawing on a synthesis of news accounts, research studies, survey results, and state and federal education department documents, this paper examines the opt-out movement and some of the dynamics that appear to underlie it. Suggestions are given for how the assessment community might respond to the concerns raised by the movement and by the general public. View the full article.
From Cognitive-Domain Theory to Assessment Practice
R. E. Bennett, P. Deane & P. van Rijn (2016)
Educational Psychologist, Vol. 51, Issue 1, pp. 1–26
This article exemplifies how assessment design might be grounded in theory, thereby helping to strengthen validity claims. Spanning work across multiple related projects, the article summarizes an assessment-system model for the elementary and secondary levels; describes how cognitive-domain theory and principles are used in the design of a scenario-based summative assessment for argumentation in the English-language arts; and gives results relating to propositions suggested by the domain theory, including results for the use of topical scenarios and learning progressions in assessment design. View the abstract or download the full paper.
Statistical Results From the 2013 CBAL English Language Arts Multistate Study: Parallel Forms for Policy Recommendation Writing
P. van Rijn, J. Chen, & Y. Yan-Koo (2016)
ETS Research Memorandum, No. RM-16-01
In this paper, statistical results of part of the 2013 CBAL English-language arts multistate study are reported. The focus is on the evaluation of parallelism of three policy recommendation writing assessment forms. View the abstract or request a copy of the full paper.
Learning Progressions as a Guide for Design: Recommendations Based on Observations from a Mathematics Assessment
E. A. Graf & P. van Rijn (2016)
Chapter in S. Lane, M. R. Raymond, & T. M. Haladyna (Editors)
Handbook of Test Development, Second Edition. New York: Routledge, pp. 165–189
This chapter discusses some of the literature on learning progressions and provides a theoretical framework for the validation of learning progressions. The chapter presents an IRT-based empirical recovery study for a learning progression in mathematics. View the abstract or get access to the full paper.
Promoting the Cognitive and Social Aspects of Inquiry through Classroom Discourse
H. Jin, X. Wei, P. Duan, Y. Guo & W. Wang (2016)
International Journal of Science Education, Vol. 38, Issue 2, pp. 319–343
In this paper, an inquiry framework was developed based on the literature review carried out in the 2015 CBAL Science Inquiry Project. The inquiry framework was then used to guide the analyses of lesson videos. View abstract or download the full paper.
Validity and Automated Scoring
R. E. Bennett & M. Zhang (2016)
Chapter in F. Drasgow (Editor)
Technology and Testing: Improving Educational and Psychological Measurement. New York: Routledge, pp. 142–173
This chapter views automated scoring through the lens of validity. First, it defines automated scoring, describes the types of tasks that have been scored in the English-language arts and mathematics, and outlines the process involved in carrying out one type of scoring: that of essays. Second, it offers several assertions about validity and automated essay scoring. Finally, it provides suggestions for implementation in operational settings. View the abstract or download the full paper.
Classification of Writing Patterns Using Keystroke Logs
M. Zhang, J. Hao, C. Li & P. Deane (2016)
Chapter in L. A. van der Ark, D. M. Bolt, W.-C. Wang, J. A. Douglas, & M. Wiberg (Editors)
Quantitative Psychology Research. New York: Springer, pp. 299–314
This chapter discusses keystroke logs as a valuable tool for writing research. The researchers used large samples of student responses to two prompts targeting different writing purposes and analyzed the longest 25 inter-word intervals in each keystroke log. The logs were extracted using the ETS keystroke logging engine. The analysis found two distinct patterns of student writing processes associated with stronger and weaker writers. The researchers also observed an overall moderate association between the inter-word interval information and the quality of the final product. The results suggest promise for the use of keystroke log analysis as a tool for describing patterns or styles of student writing processes. View the abstract or download the full paper.
Gallery of Top Submissions for the 2016 Cover Graphic/Data Visualization Competition
K. Furgol-Castellano (2016)
Educational Measurement: Issues and Practice, Vol. 35, Issue 2, pp. 29–35
This publication contains the graphs created by Peter van Rijn and Usama Ali that received one of the NCME graphic/data visualization competition awards. View the full article.
A Comparison of Newly Trained and Experienced Raters on a Standardized Writing Assessment
Y. Attali (2015)
Language Testing, Vol. 33, No. 1, pp. 99–115
This study evaluated the effectiveness of a very short training program for novice raters who were recruited from Amazon Mechanical Turk™. View the abstract or download the full paper.
Gamification in Assessment: Do Points Affect Test Performance?
Y. Attali & M. Arieli-Attali (2015)
Computers & Education, Vol. 83, pp. 57–63
This paper presents the results of two studies that examined whether awarding points to make an assessment more game-like affected test-takers' performance, test-takers' speed, and the likeability of the test. View the abstract or download the full paper.
The Changing Nature of Educational Assessment
R. E. Bennett (2015)
Review of Research in Education, Vol. 39, No. 1, pp. 370–407
This article concerns the evolution of educational assessment from a paper-based technology to an electronic one, with particular attention to the substantive changes that this evolution might allow. Focusing on the K–12 level, the article reviews the place of emerging approaches to summative and formative assessment in that evolution. View the abstract or download the full paper.
Cognitively Based Assessment of Research and Inquiry Skills: Defining a Key Practice in the English Language Arts
J. R. Sparks & P. Deane (2015)
ETS Research Report No. RR-15-35
This report provides a theoretical background for the key practice of research and inquiry. The authors identify a set of activities and skills that are critical for participating in research. Each skill is accompanied by a set of provisional learning progressions that outline tentative predictions about the qualitative changes in the skill that develop over time with appropriate instruction. View the abstract or download the full paper.
An Exploratory Study Using Social Network Analysis to Model Eye Movements in Mathematics Problem Solving
M. Zhu & G. Feng (2015)
Proceedings of the ACM Learning Analytics and Knowledge Conference (LAK '15)
This paper, presented at the 2015 Learning Analytics and Knowledge Conference, explores a new method for analyzing students' eye-gaze patterns during a CBAL mathematics task. View the abstract or download the full paper.
On the Explaining-away Phenomenon in Multivariate Latent Variable Models
P. van Rijn & F. Rijmen (2015)
British Journal of Mathematical and Statistical Psychology, Vol. 68, Issue 1, pp. 1–22
This paper discusses what is referred to as the "explaining-away" phenomenon in the context of latent variable models for psychological and educational measurement. This phenomenon can occur when multiple latent variables are related to the same observed variable, and can elicit seemingly counterintuitive conditional dependencies between latent variables given observed variables. View the abstract.
Designing and Developing Assessments of Complex Thinking in Mathematics for the Middle Grades
E. A. Graf & M. Arieli-Attali (2015)
Theory Into Practice, Vol. 54, Issue 3, pp. 195–202
Designing an assessment system for complex thinking in mathematics involves decisions at every stage, from how to represent the target competencies to how to interpret evidence from student performances. A challenge in characterizing mathematical competency is to capture not only the variety of skills and concepts, but also their connections. Designing assessments based on learning progressions may be one way to respond to this challenge. The authors discuss their experience developing a learning progression and an associated task model for mathematical functions. View the abstract or download the full paper.
Tablet-Based Math Assessment: What Can We Learn from Math Apps?
G. A. Cayton-Hodges, G. Feng, & X. Pan (2015)
Educational Technology and Society, Vol. 18, No. 2, pp. 3–20
This paper discusses a survey of mathematics education apps in the Apple® App Store, conducted as part of a research project to develop a tablet-based assessment prototype for elementary mathematics. Download the full paper.
The Impact of Training Data on Automated Short Answer Scoring Performance
M. Heilman & N. Madnani (2015)
Proceedings of the Tenth Workshop on Innovative Use of NLP for Building Educational Applications, 2015, pp. 81–85
The authors conducted experiments using scored responses to 44 prompts from five diverse datasets in order to better understand how training set size and other factors relate to system performance. Download the full paper.
Measuring Argumentation Skills with Game-Based Assessments: Evidence for Incremental Validity and Learning
M. Bertling, G. T. Jackson, A. Oranje, & V. E. Owen (2015)
Artificial Intelligence in Education
17th International Conference, AIED 2015, Madrid, Spain, June 22–26, 2015. Proceedings, pp. 545–549
This paper describes the development and evaluation of a game-based assessment on argumentation skills called Mars Generation One. Results show that the in-game process data can substantially improve the measurement of argumentation compared to non-interactive multiple choice tests. View the abstract or download the full paper.
The Key Practice, Discuss and Debate Ideas: Conceptual Framework, Literature Review, and Provisional Learning Progressions for Argumentation
P. Deane & Y. Song (2015)
ETS Research Report No. RR-15-33
In this report, the authors provide a comprehensive literature review on the development of key argumentation skills to lay a foundation for a framework of the key practice, discuss and debate ideas. The framework comprises five phases of core activities and related sets of argumentation skills. For each set of skills, a provisional learning progression, informed by the developmental literature, is designed to identify qualitative shifts in the development of critical argumentation skills. These learning progressions may have the potential to support teachers' instructional decisions that effectively scaffold their students to the next level. View the abstract or download the full paper.
Process Features in Writing: Internal Structure and Incremental Value Over Product Features
M. Zhang & P. Deane (2015)
ETS Research Report No. RR-15-27
In this study, the authors used a large sample of essays collected from middle school students in the United States to investigate the factor structure of the writing process features gathered from keystroke logs and the association of that latent structure with the quality of the final product (i.e., the essay text). The extent to which those process factors had incremental value over product features was also examined. View the abstract or download the full paper.
Exploring the Feasibility of Using Writing Process Features to Assess Text Production Skills
P. Deane & M. Zhang (2015)
ETS Research Report No. RR-15-26
This report examines the feasibility of characterizing writing performance using process features derived from a keystroke log. View the abstract or download the full paper.
Building and Sharing Knowledge Key Practice: What Do You Know, What Don't You Know, What Did You Learn?
T. O'Reilly, P. Deane, & J. Sabatini (2015)
ETS Research Report No. RR-15-24
In this paper, the authors provide the rationale and foundation for the building and sharing knowledge key practice for the CBAL™ English language arts competency model. Building and sharing knowledge is a foundational literacy activity that enables students to learn and communicate what they read in texts. It is a strategic process that involves the integration of five key components or phases. The authors outline the major features of the key practice as well as address potential advantages and challenges of the approach. View the abstract or download the full paper.
Key Practices in the English Language Arts (ELA): Linking Learning Theory, Assessment, and Instruction
P. Deane, J. Sabatini, G. Feng, J. Sparks, Y. Song, M. Fowles, T. O'Reilly, K. Jueds, R. Krovetz, & C. Foley (2015)
ETS Research Report No. RR-15-17
This report presents a systematic framework intended to link evidence-centered design (ECD), scenario-based assessment (SBA), and assessment of, for, and as learning. Central to the framework is the concept of a key practice, drawn from constructivist learning theory, which emphasizes the purposeful social context within which skills are recruited and organized to carry out complex literacy tasks. View the abstract or download the full paper.
A Concept-Based Learning Progression for Rational Numbers
G. A. Cayton-Hodges & M. Arieli-Attali (2014)
International Journal for Research in Mathematics Education (RIPEM), Vol. 4, No. 3, pp. 104–117
This report describes a provisional learning progression for rational numbers, specifically as embodied in fractions and decimals, that was designed to be useful for the development of formative assessment. Sign in to view the abstract or download the full paper.
Broadening the Scope of Reading Comprehension Using Scenario-based Assessments: Preliminary Findings and Challenges
J. P. Sabatini, T. O'Reilly, L. Halderman, & K. Bruce (2014)
L'Année psychologique, Vol. 114, Issue 4, pp. 693–723
This paper presents the results of a study that evaluated a scenario-based assessment of reading comprehension of sixth-grade students. The assessment was designed to measure students' ability to integrate and evaluate a set of thematically related sources. View the abstract or download the full paper.
A Case Study in Principled Assessment Design: Designing Assessments to Measure and Support the Development of Argumentative Reading and Writing Skills
P. Deane & Y. Song (2014)
Psicología Educativa, Vol. 20, No. 2, pp. 99–108
This paper focuses on the literacy practice of argumentation as an approach to assessment design. It also serves to define and explain the argumentation learning progressions as part of the CBAL ELA competency model. View the abstract or download the full paper.
Connecting Lines of Research on Task Model Variables, Automatic Item Generation, and Learning Progressions in Game-Based Assessment (commentary paper)
E. A. Graf (2014)
Measurement: Interdisciplinary Research and Perspectives, Vol. 12, pp. 42–46
This commentary paper discusses different approaches to automatic item generation, the underlying assumptions about cognitive models of difficulty, and how that relates to the use of automatic item generation in game-based assessment. View the abstract or download the full paper.
Integrating Scenario-Based and Component Reading Skill Measures to Understand the Reading Behavior of Struggling Readers
J. Sabatini, T. O'Reilly, L. Halderman, & K. Bruce (2014)
Learning Disabilities Research and Practice, Vol. 29, No. 1, pp. 36–43
This paper presents the results of a study that administered two assessments designed to work together to provide a more complete picture of reading comprehension ability in students. The value of using this assessment approach for struggling readers is also presented. View the abstract or download the full paper.
Empirical Recovery of Argumentation Learning Progressions in Scenario-Based Assessments of English Language Arts
P. W. van Rijn, E. A. Graf, & P. Deane (2014)
Psicología Educativa, Vol. 20, No. 2, pp. 109–115
This paper explains how data collected on three parallel scenario-based assessment forms was used to explore methods for studying the argumentation learning progressions as part of the larger CBAL ELA framework. View the abstract or download the full paper.
Fixing What's Wrong with Testing in K–12 Education
R. E. Bennett (2014, July 31)
This article details three key goals of CBAL and their implications for new types of K–12 assessment. Read the article.
Keeping Your Audience in Mind: Applying Audience Analysis to the Design of Interactive Score Reports
D. Zapata-Rivera & R. I. Katz (2014)
Assessment in Education: Principles, Policy & Practice, Vol. 21, No. 4, pp. 442–463
This paper introduces an approach to identifying important audience characteristics for designing computer-based, interactive score reports. Through three examples, the paper demonstrates how an audience analysis suggests a design pattern, which guides the overall design of a report, as well as design details, such as data representations and scaffolding. View the abstract or download the full paper.
Expanding the CBAL™ Mathematics Assessment to Elementary Grades: The Development of a Competency Model and a Rational Number Learning Progression
M. Arieli-Attali & G. Cayton-Hodges (2014)
ETS Research Report No. RR-14-08
This report describes the development of the CBAL mathematics competency model for grades 3–5 and a learning progression for rational numbers. Future implications for task development are also discussed. View the abstract or download the full report.
Formative Assessment with Cognition in Mind: The Cognitively Based Assessment of, for and as Learning (CBAL™) Research Initiative at Educational Testing Service
M. Arieli-Attali (2013)
Paper in the proceedings of the 39th annual International Association for Educational Assessment (IAEA) Conference
This paper gives a brief overview of the CBAL assessment system and focuses on the incorporation into assessment of theory and research findings from the cognitive and learning sciences. Download the paper.
A Ranking Method for Evaluating Constructed Responses
Y. Attali (2014)
Educational and Psychological Measurement, advance online publication, April 2, 2014. doi:10.1177/0013164414527450
This article presents results of a study that used a rank-order approach to score constructed-response items from CBAL writing assessments. View the abstract.
Comparing Graphical and Verbal Representations of Measurement Error in Test Score Reports
R. Zwick, D. Zapata-Rivera, & M. Hegarty (2014)
Educational Assessment, Vol. 19, No. 2, pp. 116–138
This report presents the results of a study designed to evaluate teachers' understanding of measurement error. The research was supported by the CBAL research initiative, and it was intended to guide the development of test score reports directed primarily at teachers. View the abstract or download the full report.
Using Writing Process and Product Features to Assess Writing Quality and Explore How Those Features Relate to Other Literacy Tasks
P. Deane (2014)
ETS Research Report No. RR-14-03
This paper explores the use of the e-rater® automated essay scoring engine to measure product features, and of features extracted from keystroke logs to measure process features, in student essays written during large-scale pilot administrations of writing assessments developed for ETS's CBAL research initiative. View the abstract or download this report.
A Bayesian Hierarchical Mixture Approach to Model Timing Data with Application to Writing Assessment
T. Li (2013)
The Florida State University, DigiNole Commons
This paper explores methods for modeling the statistical properties of the distribution of pause durations derived from keystroke logs captured during student writing. It argues for an analysis using mixture modeling, and seeks to relate the components of the mixture models to external measures of writing skill. Download this report.
Preliminary Reading Literacy Assessment Framework: Foundation and Rationale for Assessment and System Design
J. Sabatini, T. O'Reilly, & P. Deane (2013)
ETS Research Report No. RR-13-30
This report describes the foundation for an assessment framework designed to measure reading literacy based on models of cognitive development. The framework described in this paper was informed by the CBAL English Language Arts Competency Model and Provisional Learning Progressions. View the abstract or download this report.
A CBAL™ Science Model of Cognition: Developing a Competency Model and Learning Progressions to Support Assessment Development
L. Liu, A. Rogat, & M. Bertling (2013)
ETS Research Report No. RR-13-29
This report describes the CBAL science competency model and three of its related learning progressions for middle school science. Some examples from a formative assessment prototype are presented to illustrate how the competency model can be used to inform assessment development. View the abstract or download this report.
Automated Scoring of Mathematics Tasks in the Common Core Era: Enhancements to M-rater in Support of CBAL Mathematics and the Common Core Assessments
J. Fife (2013)
ETS Research Report No. RR-13-26
This report describes recent enhancements to the ETS m-rater scoring engine to support automated scoring for innovative CBAL mathematics tasks that presented new challenges. View the abstract or download this report.
Exploring Teachers' Understanding of Graphical Representations of Group Performance
D. Zapata-Rivera, M. Vezzu, & W. VanWinkle (2013)
ETS Research Memorandum No. RM-13-04
This paper reports the results of a study that explored the effectiveness of three graphical representations of score distributions aimed at helping teachers understand and effectively use assessment results. View the abstract or download this report.
Automated Essay Scoring in Innovative Assessments of Writing from Sources
P. Deane, F. Williams, V. Weng, & C. S. Trapani (2013)
Journal of Writing Assessment, Vol. 6, No. 1
In this study, data from two large administrations of CBAL scenario-based writing assessments were evaluated using automated scoring. The results show that automated scoring can contribute accurate information about writing proficiency when used in combination with other sources of evidence. View the full article on the publisher's website.
Dimensionality Analysis of CBAL Writing Tests
J. Fu, S. Chung, & M. Wise (2013)
ETS Research Report No. RR-13-10
This report presents the results of a study to investigate the dimensionality of four CBAL writing tests using exploratory and confirmatory factor analyses. The tests were administered to grade 8 students in fall 2009, and the results show that the tests were multidimensional and support subscore structures and bifactor models. View the full abstract or download this report.
Using Argumentation Learning Progressions to Support Teaching and Assessments of English Language Arts
Y. Song, P. Deane, E. A. Graf, & P. van Rijn (2013)
ETS R&D Connections, No. 22
This report explains the CBAL argumentation learning progressions and shows how assessment tasks developed around the learning progressions can be used to support classroom instruction. View the full abstract or download this report.
On the Relation Between Automated Essay Scoring and Modern Views of the Writing Construct
P. Deane (2013)
Assessing Writing, Vol. 18, No. 1, pp. 7–24
This paper explores which aspects of the writing construct are currently measured directly by the kinds of features used in automated essay scoring models. It also examines the implications for validity and for different use cases for automated essay scoring, using the CBAL competency model as a framework for evaluation. View the full abstract or order this article from the publisher.
Covering the Construct: An Approach to Automated Essay Scoring Motivated by a Socio-Cognitive Framework for Defining Literacy Skills
P. Deane (2013)
Chapter in M. D. Shermis & J. Burstein (Editors), The Handbook of Automated Essay Evaluation: Current Applications and New Directions. Routledge.
This chapter presents the CBAL understanding of the writing construct and explores which aspects of writing are currently measured directly by the kinds of features used in automated essay scoring models. The author concludes that AES can usefully be combined with other measures of writing, provided that one understands that AES primarily characterizes the ability to produce grammatical, coherent, well-developed multiple-paragraph texts and does not address deeper aspects such as quality of argumentation. Learn more about the book on the publisher's website.
Statistical Report of Fall 2009 CBAL™ Writing Tests
J. Fu, S. Chung, & M. Wise (2013)
ETS Research Report No. RR-13-01
This report presents the statistical results from the administration of four CBAL writing assessments to grade 8 students in 12 states in the fall of 2009. View the full abstract or download this report.
CBAL™ Results From Piloting Innovative K–12 Assessments
J. Johnson (Editor); R. E. Bennett (2012)
ETS Research Spotlight, No. 6
The sixth issue of ETS Research Spotlight focuses on a report highlighting our long-term Research & Development initiative — Cognitively Based Assessment of, for, and as Learning (CBAL). View the article.
Designing Accessible Technology-Enabled Reading Assessments: Recommendations from Teachers of Students with Visual Impairments
E. G. Hansen, C. C. Laitusis, L. Frankel, & T. King (2012)
Journal of Blindness Innovation and Research, Vol. 2, No. 2
This report provides a summary of a focus group with teachers of the visually impaired on both the challenges and potential solutions for making technology-enhanced test items accessible for students with visual disabilities. View the report.
Statistical Report of 2011 CBAL™ Multistate Administration of Reading and Writing Tests
J. Fu & M. Wise (2012)
ETS Research Report No. RR-12-24
This report presents the statistical results from the administration of CBAL reading assessments to grade 7 students and writing assessments to grade 8 students in 20 states in spring 2011. View the full abstract or download this report.
A Preliminary Analysis of Keystroke Log Data from a Timed Writing Task
R. Almond, P. Deane, T. Quinlan, & M. Wagner (2012)
ETS Research Report No. RR-12-23.
This report documents the approaches used to capture the keystroke logs and the algorithms used to process the outputs from CBAL Writing assessment pilot tests. In the pilot data, many of the features extracted from the keystroke logs were correlated with human scores. View the full abstract or download this report.
Applying Score Design Principles in the Design of Score Reports for CBAL™ Teachers
D. Zapata-Rivera, W. VanWinkle, & R. Zwick (2012)
ETS Research Memorandum No. RM-12-20.
This paper describes a framework for developing and evaluating score reports, presents several web-based score report prototypes, and summarizes the results of two studies designed to evaluate the score reports with teachers. View the full abstract or download this report.
The Case for Scenario-Based Assessments of Reading Competency
K. M. Sheehan & T. O'Reilly (2012)
Chapter in J. Sabatini, E. Albro, & T. O'Reilly (Editors), Reaching an Understanding: Innovation in How We View Reading Assessment. R&L Education.
This book represents initial attempts to apply theory to guide the development of new assessments and measurement models. Learn more about the book on the publisher's website.
Rethinking K–12 Writing Assessment to Support Best Instructional Practices
P. Deane, J. Sabatini, & M. Fowles (2012)
Chapter in C. Bazerman, C. Dean, J. Early, K. Lunsford, S. Null, P. Rogers, & A. Stansell (Editors), International Advances in Writing Research: Cultures, Places, Measures. Parlor Press.
This chapter presents some preliminary CBAL research results specific to writing. Learn more about this book at the publisher's website.
Exploring the Role of Games in Educational Assessment
D. Zapata-Rivera & M. Bauer (2012)
Chapter in M. C. Mayrath, J. Clarke-Midura, & D. Robinson (Editors), Technology-based assessments for 21st century skills: Theoretical and practical implications from modern research. Information Age.
This chapter describes how some aspects of gaming technology may have a positive effect on the development of innovative assessment systems. Several examples of educational games and game-like assessment tasks, including a game-based score report for students developed as part of the CBAL initiative, are used to illustrate these concepts. Learn more about the book on the publisher's website.
Technology Enhanced Assessments in Mathematics and Beyond: Strengths, Challenges, and Future Directions
G. Cayton-Hodges, L. Marquez, P. van Rijn, M. Keehner, C. Laitusis, D. Zapata-Rivera, M. Bauer, & M. T. Hakkinen (2012)
Proceedings of the Invitational Research Symposium on Technology Enhanced Assessments
Center for K–12 Assessment & Performance Management at ETS
This paper presents an overview of the development of a technology enhanced mathematics prototype assessment for the CBAL initiative. Download this paper.
Statistical Report of Fall 2009 CBAL™ Reading Tests
J. Fu, M. Wise, & S. Chung (2012)
ETS Research Memorandum No. RM-12-12
This report presents the statistical results from the administration of two CBAL reading assessments to grade 7 students in nine states in fall 2009. View the full abstract or download this report.
Rethinking K–12 Writing Assessment
P. Deane (2012)
Chapter in N. Elliot & L. Perelman (Editors), Writing Assessment in the 21st Century. Hampton Press.
This book chapter provides a description of the CBAL competency model, outlining its basis in the research literature on writing. Learn more about the book on the publisher's website.
Designing and Evaluating an Interactive Score Report for Students
M. Vezzu, W. VanWinkle, & D. Zapata-Rivera (2012)
ETS Research Memorandum No. RM-12-01
This paper describes an interactive student score report developed for the Cognitively Based Assessment of, for, and as Learning (CBAL) initiative and reports on results from an initial usability study. View the full abstract or download this report.
Designing and Evaluating Score Reports for Particular Audiences
D. Zapata-Rivera (2011)
Chapter in Zapata-Rivera, D. & Zwick, R. (Editors), Test Score Reporting: Perspectives From the ETS Score Reporting Conference, ETS Research Report, No. RR-11-45, pp. 32–51
This chapter presents a framework for creating and assessing score reports for three audiences: teachers, administrators and students. It is part of a volume of three papers based on workshop presentations at ETS's Score Reporting Conference held on November 4, 2010. View the chapter as part of the full report.
Constructed-Response Mathematics Tasks Study
J. H. Fife, E. A. Graf, & S. Ohls (2011)
ETS Research Report No. RR-11-35
This report describes potential ways in which the selected CBAL constructed-response mathematics tasks might be revised to reduce construct-irrelevant variance. View full abstract or download this report.
Automated Scoring of CBAL Mathematics Tasks with m-Rater
J. H. Fife (2011)
ETS Research Memorandum No. RM-11-12
This paper presents the automated scoring work done in CBAL Mathematics in 2009 using m-rater, a technology developed at ETS for automated scoring of mathematics items. View the full abstract or download the report.
CBAL: Results From Piloting Innovative K–12 Assessments
R. E. Bennett (2011)
ETS Research Report No. RR-11-23
This report summarizes empirical results from almost 10,000 online administrations of the CBAL summative assessments conducted from 2007 to 2010. View the full abstract or download this report.
The CBAL Reading Assessment: An Approach for Balancing Measurement and Learning Goals
K. M. Sheehan & T. O'Reilly (2011)
ETS Research Report No. RR-11-21
This paper presents a framework for developing new types of reading comprehension assessments that provide evidence about what students know and can do and that help to move learning forward. View the full abstract or download this report.
Automated Scoring Within a Developmental, Cognitive Model of Writing Proficiency
P. Deane, T. Quinlan, & I. Kostin (2011)
ETS Research Report No. RR-11-16
This paper focuses on the potential for using automated scoring techniques to support learning effectively within the CBAL assessments. View the full abstract or download this report.
Four Years of Cognitively Based Assessment of, for, and as Learning (CBAL): Learning About Through-Course Assessment (TCA)
J. P. Sabatini, R. E. Bennett, & P. Deane (2011)
Proceedings of the Invitational Research Symposium on Through-Course Summative Assessments
Center for K–12 Assessment & Performance Management at ETS
This paper describes the lessons learned about through-course summative assessment and the reasoning behind some of the design decisions that underlie such assessment in the CBAL initiative. Download this report.
Writing Assessment and Cognition
P. Deane (2011)
ETS Research Report No. RR-11-14
This paper reviews a model that places a strong emphasis on writing as an integrated, socially situated skill. View the full abstract or download this report.
The CBAL Summative Writing Assessment: A Draft Eighth-Grade Design
P. Deane, M. Fowles, D. Baldwin, & H. Persky (2011)
ETS Research Memorandum No. RM-11-01
This paper describes the process and results of developing draft summative writing assessments within the CBAL Initiative. It outlines and reviews four designs, and it briefly discusses initial results from preliminary pilots. View the full abstract or download this report.
Formative Assessment: A Critical Review
R. E. Bennett (2011)
Assessment in Education: Principles, Policy and Practice, Vol. 18, No. 1, pp. 5–25
This paper covers six interrelated issues in formative assessment, several of which motivated the approach taken in the CBAL initiative. View the full abstract or order this report from the publisher.
Cognitively Based Assessment of, for, and as Learning (CBAL): A Preliminary Theory of Action for Summative and Formative Assessment
R. E. Bennett (2010)
Measurement: Interdisciplinary Research & Perspectives, Vol. 8, No. 2–3, pp. 70–91
This paper describes the notion of a theory of action, offers a preliminary version of such a theory for the CBAL initiative, and outlines research necessary to evaluate that theory. View the full abstract or download this report from the publisher.
An Evidence-Centered Approach to Using Assessment Data for Policymakers
J. S. Underwood, D. Zapata-Rivera, & W. VanWinkle (2010)
ETS Research Report No. RR-10-03
District-level policymakers receive reports of student achievement data that are complex, difficult to read and even harder to interpret. In this report, the authors propose an evidence-centered reporting framework to design reports that will help policymakers make sense of data. View the full abstract or download this report.
Highlights from the Cognitively Based Assessment of, for, and as Learning (CBAL) Project in Mathematics
E. A. Graf, K. Harris, E. Marquez, J. H. Fife, & M. Redman (2010)
ETS Research Spotlight, No. 3, pp. 19–30
This article describes the early design and development stages of the mathematics strand of the Cognitively Based Assessment of, for, and as Learning (CBAL) project. Download the issue of ETS Research Spotlight that contains this article.
Cognitively Based Assessment of, for, and as Learning (CBAL) in Mathematics: A Design and First Steps Toward Implementation
E. A. Graf, K. Harris, E. Marquez, J. Fife, & M. Redman (2009)
ETS Research Memorandum No. RM-09-07
In this report, the first stages of design and development for the mathematics strand of the Cognitively Based Assessment of, for, and as Learning (CBAL) project are described. A general rationale for the design as well as a catalog of materials is presented. View the abstract or request a copy of the report.
c-rater: Automatic Content Scoring for Short Constructed Responses
J. Z. Sukkarieh & J. Blackmore (2009)
Paper in the proceedings of the 22nd Florida Artificial Intelligence Research Society (FLAIRS) Conference
The c-rater™ automated scoring engine, developed at ETS, is a technology for automatic content scoring for short, free-text responses. This paper describes recent developments in this technology. View the full report.
Defining Mathematics Competency in the Service of Cognitively Based Assessment for Grades 6 Through 8
E. A. Graf (2009)
ETS Research Report No. RR-09-42
This report makes recommendations for the development of middle-school mathematics assessment. It discusses how to model mathematical competency at the middle school level, the kinds of evidence that reflect student competency and support future learning, and how to design tasks that elicit evidence. View the full abstract or download this report.
Cognitively Based Assessment of, for, and as Learning: A Framework for Assessing Reading Competency
T. O'Reilly & K. M. Sheehan (2009)
ETS Research Report No. RR-09-26
This paper presents the rationale and research base for a reading competency model designed to guide the development of cognitively based assessment of reading comprehension. View the full abstract or download this report.
Cognitively Based Assessment of, for, and as Learning: A 21st-Century Approach for Assessing Reading Competency
T. O'Reilly & K. M. Sheehan (2009)
ETS Research Memorandum No. RM-09-04
This paper describes the CBAL system's approach for assessing reading comprehension in an accountability setting. The approach uses evidence-centered design to develop a competency model that drives the development of summative, formative and professional support aspects of the assessment. To request a copy of this report, send an email to RDWeb@ets.org. Include the title and report number in your request.
Horizontal and Vertical Linking in a Longitudinal Design
F. Rijmen (2009)
ETS Research Memorandum No. RM-09-03
This paper describes two longitudinal data collection designs that may result in substantial reduction of costs. View the full abstract or order this report.
Transforming K–12 Assessment: Integrating Accountability Testing, Formative Assessment and Professional Support
R. E. Bennett & D. H. Gitomer (2008)
ETS Research Memorandum No. RM-08-13
This report presents a brief overview of the status of K–12 accountability testing in the United States and describes the CBAL assessment-system model, which is designed to overcome the problems associated with current approaches to accountability testing. Download the report.
Cognitive Models of Writing: Writing Proficiency as a Complex Integrated Skill
P. Deane, N. Odendahl, T. Quinlan, M. Fowles, C. Welsh, & J. Bivens-Tatum (2008)
ETS Research Report No. RR-08-55
This paper undertakes a review of the literature on writing cognition, writing instruction and writing assessment with the goal of developing a framework and competency model for a new approach to writing assessment. View full abstract or download this report.