
Measuring What Students Learn in College

Focus on R&D

Issue 10

May 2018

By: Hans Sandberg

Higher education institutions in the United States face growing expectations to prove that their students are learning what's needed to participate in society and the global workforce. Today, over 80 percent of these institutions are measuring student learning outcomes (SLOs) in one way or another. Focus on ETS R&D turned to Ou Lydia Liu, Senior Research Director in ETS's Academic to Career Research Center, to learn more about measuring higher education SLOs and ETS's own research in this area. In 2015 ETS launched the HEIghten® Outcomes Assessment Suite.

Why measure SLOs? Aren't they already reflected in the students' final grades?

Ou Lydia Liu: The grades students earn mainly reflect what they learned in subjects like mathematics, psychology and other disciplinary domains, but there are other types of learning that are equally important. Take generic and transferable skills and competencies, such as critical thinking and analytical reasoning, for example. They are important, but may not be reflected in course grades. The idea behind SLO assessments is to capture these skills and competencies, which can give us a broader picture of students' strengths and weaknesses. They can also help identify areas where students have deficiencies that need to be addressed. For example, if an institution sees a deficiency in students' written communication, it could be motivated to promote a writing curriculum that could help address such a deficiency. The assessments can be used for both formative and summative purposes.

How did ETS decide what its SLO assessments should measure?

Ou Lydia Liu: Economic and technological developments are changing the range of skills and competencies society expects from today's and tomorrow's college graduates. We wanted to understand which specific learning outcomes higher education institutions and employers considered most important, so we asked administrators at over 200 colleges and universities, as well as many professional organizations, to share their views on learning outcomes. Our researchers also conducted a comprehensive review of the existing literature, covering conceptual and assessment frameworks concerning core knowledge, skills and competencies for college students.

This outreach and literature review by ETS R&D produced a list of five important competencies suitable for assessment: critical thinking, written communication, quantitative literacy, civic competency and engagement, and intercultural competency and diversity. Based on this work, we developed the HEIghten assessment, which targets these five competencies.

What if a school doesn't teach critical thinking, for example?

Ou Lydia Liu: When we talk about learning outcomes, like those measured by the HEIghten assessment, we are referring to general competencies that students are expected to have acquired or improved while in college. We are not attributing any measured proficiency on these general competencies to a specific course or series of courses. Students can learn critical thinking even if a school doesn't offer classes in the subject. They can gain critical thinking skills from exposure to a broad range of courses, programs and institutional activities during their college career. They can acquire these skills when taking a philosophy course that discusses inductive and deductive reasoning, or in a statistics course that covers the difference between correlation and causation. They can also do it by participating in an undergraduate research program, or by being part of a debate club. All these activities can contribute to a student's skills without the student needing to be in a specific critical thinking class. In this sense, the skills reflect the overall learning experience.

Who uses SLO assessments and why?

Ou Lydia Liu: Institutions of higher education often use SLO assessments to benchmark student performance and find areas where learning has improved. They can use the results to satisfy accreditation requirements and facilitate internal improvement. These assessments can also be used by regional or national educational systems to help identify positive outcomes and promote effective practices.

Can the validity of an assessment be affected by how it is used and the level of the stakes attached to the results?

Ou Lydia Liu: Absolutely! Every assessment is created for a certain purpose and is expected to be used in a specific way. The validity of an assessment depends on whether it is used in a way that matches the intended use. If an assessment is designed to serve one purpose, but it is interpreted in a way that has little to do with that purpose, then those interpretations could be invalid.

This is also true for the stakes attached to an assessment. Take the HEIghten assessment, which was created to evaluate learning outcomes and is typically used in low-stakes contexts. The validity of this test could be undermined if it were used for decisions that have high stakes; for example, admissions to graduate school. That said, it doesn't mean that we should refrain from exploring uses of an assessment beyond its original design. There are successful examples of new uses and repurposed uses, but to warrant such uses we first need sufficient empirical validity evidence.

ETS R&D is currently conducting a longitudinal study to gather evidence of how performance on different outcome measures (e.g., domain-specific and generic) may change over the course of students' college experience and how those performances could interact. To do this, researchers plan to track students at multiple institutions over four years. They also plan to examine whether the assessment results are associated with other success indicators, such as grades, retention and graduation.

Motivation to perform well on a test is another important issue that can affect score validity. Most SLO assessments take place in a low-stakes setting, meaning that they have few practical consequences for the test takers. As a result, students may not give their best effort, which can make it hard to draw meaningful conclusions from the SLO scores. ETS has done quite a lot of research in the last decade to clarify the relationship between test-taking motivation and student performance. Using experimental studies, we have identified practical strategies that institutions can use to help improve students' motivation in taking SLO assessments.

Is there a global interest in SLO assessments?

Ou Lydia Liu: On the international scene, we've seen an increasing number of institutions becoming more interested in assessing learning outcomes and taking action. Our research team has forged collaborations with key partners in China, Russia, India, Korea, Canada and Germany. There are big differences in these countries' educational systems and priorities, but they share the realization that direct evidence of students' learning is required for stakeholders to identify gaps and facilitate change.

Promoting critical skills and competencies is a complex task, and using assessments to measure performance levels is just the first step. Assessments can spot problems, but they are not necessarily cures. It takes the coordinated efforts of assessment organizations and institutions to develop plans of action based on the test results.

Future efforts may seek to identify effective practices associated with better learning outcomes. Developing learning and training materials can also strengthen important skills and competencies in a sustainable, organic way, rather than merely preparing students for the test. In the context of assessing learning outcomes, the effort would be more beneficial if it were transformed from a one-shot activity into a continual process tied to learning and training opportunities.

Ou Lydia Liu is Senior Research Director at ETS's Academic to Career Research Center in the Research & Development division.

Learn more:

Griffith, R. L., Wolfeld, L., Armon, B. K., Rios, J., & Liu, O. L. (2016). Assessing Intercultural Competence in Higher Education: Existing Research and Future Directions. (ETS Research Report No. RR-16-25). Princeton, NJ: Educational Testing Service. https://doi.org/10.1002/ets2.12112

Liu, O. L. (2017). Ten Years After the Spellings Commission: From Accountability to Internal Improvement. Educational Measurement: Issues and Practice, 36: 34–41. https://doi.org/10.1111/emip.12139

Liu, O. L., Frankel, L., & Roohr, K. C. (2014). Assessing Critical Thinking in Higher Education: Current State and Directions for Next-Generation Assessment. (ETS Research Report No. RR-14-10). Princeton, NJ: Educational Testing Service. https://doi.org/10.1002/ets2.12009

Liu, O. L., Liu, H., Roohr, K. C., & McCaffrey, D. F. (2016). Investigating College Learning Gain: Exploring a Propensity Score Weighting Approach. Journal of Educational Measurement, 53: 352–367. https://doi.org/10.1111/jedm.12112

Liu, O. L., Rios, J. A., & Borden, V. (2015). The Effects of Motivational Instruction on College Students' Performance on Low-Stakes Assessment. Educational Assessment, 20, 79–94. https://doi.org/10.1080/10627197.2015.1028618

Roohr, K. C., Graf, E. A., & Liu, O. L. (2014). Assessing Quantitative Literacy in Higher Education: An Overview of Existing Research and Assessments With Recommendations for Next-Generation Assessment. (ETS Research Report No. RR-14-22). Princeton, NJ: Educational Testing Service. https://doi.org/10.1002/ets2.12024

Sparks, J. R., Song, Y., Brantley, W., & Liu, O. L. (2014). Assessing Written Communication in Higher Education: Review and Recommendations for Next-Generation Assessment. (ETS Research Report No. RR-14-37). Princeton, NJ: Educational Testing Service. https://doi.org/10.1002/ets2.12035

Torney-Purta, J., Cabrera, J. C., Roohr, K. C., Liu, O. L., & Rios, J. A. (2015). Assessing Civic Competency and Engagement in Higher Education: Research Background, Frameworks, and Directions for Next-Generation Assessment. (ETS Research Report No. RR-15-34). Princeton, NJ: Educational Testing Service. https://doi.org/10.1002/ets2.12081