Date of Award

Spring 2025

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Computer Science

First Advisor

Dennis Brylow

Second Advisor

Michael Zimmer

Third Advisor

Tahmidul Islam

Abstract

The rapid expansion of computer science education has placed significant strain on educators and students alike, particularly in large introductory programming courses. Traditional assessment methods often fail to balance timely feedback, effective student engagement, and scalable instructor support. In response, this dissertation presents TA-Bot, a novel automated assessment tool designed to incentivize early engagement, improve code quality, and encourage office-hour participation through an adaptive, non-punitive reward system. TA-Bot integrates a Time Between Submissions mechanism, dynamically modulating feedback frequency to discourage trial-and-error programming while fostering thoughtful code development. The system also implements gamification principles to motivate students to start assignments earlier and interact with course support structures. By shifting away from punitive restrictions and invasive data tracking, this research explores how positive reinforcement strategies can enhance student learning behaviors without discouraging participation. A longitudinal study was conducted across multiple semesters to evaluate the effectiveness of TA-Bot. The findings indicate that students using the system demonstrated higher engagement levels, improved code quality, and greater office-hour attendance, leading to better overall retention and performance. This work contributes to the broader discourse on CS education by demonstrating the efficacy of behavioral nudges and incentive-driven assessment tools in fostering productive learning habits.
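To make the Time Between Submissions idea concrete, the sketch below illustrates one way such an adaptive, non-punitive cooldown could work: every submission is accepted, but rapid resubmission stretches the interval before the next round of full automated feedback, while waiting lets it relax back toward a base value. This is a hypothetical illustration only; the class and parameter names (SubmissionThrottle, base_cooldown, growth, etc.) are assumptions, not TA-Bot's actual implementation.

```python
import time
from dataclasses import dataclass, field

@dataclass
class SubmissionThrottle:
    """Hypothetical sketch of a Time Between Submissions policy.

    Submissions are always accepted (non-punitive); only the frequency
    of full automated feedback is modulated. Rapid resubmission stretches
    the cooldown; waiting lets it relax back toward the base value.
    """
    base_cooldown: float = 60.0   # seconds between full feedback, at rest
    max_cooldown: float = 600.0   # ceiling on the adaptive cooldown
    growth: float = 1.5           # factor applied on rapid resubmission
    _cooldown: dict = field(default_factory=dict)       # per-student cooldown
    _last_feedback: dict = field(default_factory=dict)  # per-student timestamp

    def handle_submission(self, student_id: str, now: float = None) -> str:
        now = time.time() if now is None else now
        cooldown = self._cooldown.get(student_id, self.base_cooldown)
        last = self._last_feedback.get(student_id)

        if last is None or now - last >= cooldown:
            # Enough time has passed: return full feedback and relax
            # the cooldown back toward its base value.
            self._cooldown[student_id] = max(self.base_cooldown,
                                             cooldown / self.growth)
            self._last_feedback[student_id] = now
            return "full feedback"
        # Rapid resubmission: still record the work, but defer detailed
        # feedback and stretch the cooldown to discourage trial-and-error.
        self._cooldown[student_id] = min(self.max_cooldown,
                                         cooldown * self.growth)
        remaining = cooldown - (now - last)
        return f"submission recorded; next full feedback in {remaining:.0f}s"

# Example usage (timestamps passed explicitly for reproducibility):
throttle = SubmissionThrottle()
print(throttle.handle_submission("student42", now=0))    # full feedback
print(throttle.handle_submission("student42", now=10))   # deferred
print(throttle.handle_submission("student42", now=200))  # full feedback again
```

Under these assumptions, the policy never rejects a submission; it only delays detailed feedback, which matches the abstract's emphasis on positive reinforcement rather than punitive restriction.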
