StudyCAT: A Computerized Adaptive Testing (CAT) platform to enhance preparation for formative assessments

This posting closed on the date above on the department’s list of instruction-initiated projects for CSC494H1. Jason Wang, Paarth Arya, and Jayden Chiola-Nakai were selected for the project.

This page archives the details from the posting; it is not a call for applications.

Project description #

Practice testing has been shown to improve learner outcomes in formative assessments [1]; however, course-specific tools to support effective practice testing are limited in availability and difficult to create. While generative AI (gAI) has the potential to provide practice questions for learners, these questions are currently of low quality, primarily assess recall or comprehension, and often do not align with course-specific learning objectives. Computerized Adaptive Testing (CAT) provides a potential resource to empower learners with a personalized practice test that adjusts to their level, providing important feedback on course concepts that require additional review [2-5].

CATs utilize a machine learning algorithm to select an appropriate question from a question bank based on the user's previous responses [6]. These tools have been shown to efficiently determine a test taker's performance on standardized examinations such as the Nursing Licensure Exam (NCLEX) [7]. When combined with a validated, learner-generated, course-specific question bank, this tool has the potential to enhance metacognition, academic performance, and learner engagement [8-10].
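To give a concrete sense of that select-question-then-update loop, here is a minimal sketch of one common approach: the two-parameter logistic (2PL) item response theory model, with a grid-based maximum-likelihood ability estimate and maximum-information item selection. This is an illustrative assumption, not necessarily the model StudyCAT will use, and all function names here are hypothetical.

```python
import math

def p_correct(theta, a, b):
    """2PL IRT: probability of a correct response given ability theta,
    item discrimination a, and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of an item at ability theta (2PL model)."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def estimate_theta(responses, items, grid=None):
    """Maximum-likelihood ability estimate over a coarse grid of theta values.
    `responses` maps item index -> 0/1; `items` is a list of (a, b) pairs."""
    if grid is None:
        grid = [g / 10.0 for g in range(-40, 41)]  # theta in [-4, 4]
    def log_lik(theta):
        ll = 0.0
        for idx, correct in responses.items():
            p = p_correct(theta, *items[idx])
            ll += math.log(p if correct else 1.0 - p)
        return ll
    return max(grid, key=log_lik)

def next_item(theta, items, answered):
    """Select the unanswered item with maximum information at theta."""
    candidates = [i for i in range(len(items)) if i not in answered]
    return max(candidates, key=lambda i: item_information(theta, *items[i]))
```

After each response, the test re-estimates theta and picks the unanswered item that is most informative at that ability level, which is what lets a CAT converge on a learner's level in fewer questions than a fixed-form quiz.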

In this project, you will contribute to the creation of a CAT tool, from ideation and design to development and testing. StudyCAT will incorporate a validated question bank of practice MCQs, adapting to user performance. Upon completion of a practice test, users will see a visualization of their current performance as well as content areas that would benefit from additional review. Users can take practice tests repeatedly as they prepare for their formative assessments. While StudyCAT will initially be used in select courses at UofT, it may extend to other courses or even beyond the university.

This project is ideal for students who are interested in applications of machine learning, software design and development, and improving educational outcomes.

[1] Adesope, O. O., Trevisan, D. A. & Sundararajan, N. Rethinking the Use of Tests: A Meta-Analysis of Practice Testing. Rev Educ Res 87, 659–701 (2017).

[2] Malkemes, S. & Phelan, J. C. Impact of Adaptive Quizzing as a Practice and Remediation Strategy to Prepare for the NCLEX-RN. Open J Nurs 07, 1289–1306 (2017).

[3] Ross, B., Chase, A.-M., Robbie, D., Oates, G. & Absalom, Y. Adaptive quizzes to increase motivation, engagement and learning outcomes in a first year accounting unit. International Journal of Educational Technology in Higher Education 15, 30 (2018).

[4] Heitmann, S. et al. Adaptive Practice Quizzing in a University Lecture: A Pre-Registered Field Experiment. J Appl Res Mem Cogn 10, 603–620 (2021).

[5] Kisielewska, J. et al. Medical students' perceptions of a novel international adaptive progress test. Educ Inf Technol (Dordr) 29, 11323–11338 (2024).

[6] Thompson, N. A. & Weiss, D. A. A Framework for the Development of Computerized Adaptive Tests. Practical Assessment, Research, and Evaluation 16, (2011).

[7] Beeman, P. B. & Waterhouse, J. K. NCLEX-RN performance: Predicting success on the computerized examination. Journal of Professional Nursing 17, 158–165 (2001).

[8] Bottomley, S. & Denny, P. A participatory learning approach to biochemistry using student authored and evaluated multiple-choice questions. Biochemistry and Molecular Biology Education 39, 352–361 (2011).

[9] Riggs, C. D., Kang, S. & Rennie, O. Positive impact of multiple-choice question authoring and regular quiz participation on student learning. CBE Life Sci Educ 19, (2020).

[10] Katz, L. et al. Student-generated Multiple-Choice Questions: A Java and Web-Based Tool for Students to Create Multiple Choice Tests. The Canadian Journal for the Scholarship of Teaching and Learning 15, (2024).

Skills #

Required: Experience with full-stack web development (e.g., from CSC309) and in particular React; experience working with machine learning algorithms (e.g., from CSC311). Strong teamwork, project management, and communication skills.

Assets: Experience building software for external clients; interest in education and the science of learning.

Notes #

This project is an interdisciplinary collaboration between Professors Jason De Melo (Department of Biochemistry), Sian Patterson (Department of Biochemistry), Naomi Levy-Strumpf (Human Biology Program), David Liu (Department of Computer Science), and Mario Badr (Department of Computer Science).