Adaptive computer systems show great potential in education. Successful adoption of these systems, however, requires researchers to identify and address situations that negatively influence students’ learning and assessment experiences. We have designed conversation-based assessments (CBAs) to measure constructs such as science inquiry, English language, and collaborative problem-solving skills. These computer-based conversations have provided useful evidence of students’ skills. In some cases, however, students provide unexpected responses that are indicative of a state such as disengagement or a negative experience. In this research memorandum, we explore an approach for identifying and addressing these unexpected cases. The proposed approach involves identifying cases of unexpected responses, analyzing the cases and defining categories, designing possible solutions, and validating the case categories and proposed solutions with expert teachers. Results of applying this approach in the context of CBAs suggest that the envisioned solutions could successfully address the identified cases. This work will inform the creation of detectors to be used across CBAs to identify unexpected responses and address them appropriately.