TrackIt: An Interactive Rule-Based Tool for Detecting Struggling Programming Students
Identifying students who struggle during programming tasks remains a persistent challenge in computer science education, often resulting in delayed or poorly timed interventions. Keystroke data analysis has emerged as an approach for capturing students’ coding behaviors and identifying those who may be having difficulty, a possibility afforded by the granular nature of the data. By examining keystroke dynamics such as typing speed, deletion frequency, pauses, and code reconstruction, it becomes possible to infer the confidence, frustration, and hesitation students experience during programming tasks. Nevertheless, while keystroke data can reveal insightful patterns, interpreting the results is complex and requires complementary data sources to validate inferences about students’ struggles or confidence. This study introduces TrackIt, a rule-based system designed to detect struggling programming students by analyzing keystroke data in conjunction with self-reported survey responses. By applying a series of predefined thresholds and scoring rules, TrackIt computes a struggling score that categorizes students into five struggle levels. The system was validated in an introductory Python course involving 40 students, with data collected from 547 lab and 130 homework keystroke logs, alongside structured post-assignment surveys. Results revealed that TrackIt effectively identified key struggle behaviors such as long pauses, frequent deletions, low insert-delete ratios, and copy-paste events. These indicators closely aligned with students’ self-reports, particularly for challenging assignments.
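The threshold-and-scoring approach described above can be sketched as follows. This is a minimal illustrative sketch only: the feature names, point values, and cutoffs are hypothetical assumptions, not the published rules of the system.

```python
# Hypothetical sketch of a rule-based struggle scorer.
# All thresholds and weights below are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class KeystrokeFeatures:
    longest_pause_s: float      # longest pause between keystrokes, in seconds
    deletion_rate: float        # deletions per 100 keystrokes
    insert_delete_ratio: float  # inserted characters / deleted characters
    paste_events: int           # number of copy-paste events


def struggle_score(f: KeystrokeFeatures) -> int:
    """Sum points for each predefined rule the session triggers (0-8)."""
    score = 0
    if f.longest_pause_s > 120:      # long pause (assumed cutoff)
        score += 2
    if f.deletion_rate > 25:         # frequent deletions (assumed cutoff)
        score += 2
    if f.insert_delete_ratio < 1.5:  # low insert-delete ratio (assumed cutoff)
        score += 2
    if f.paste_events > 3:           # heavy copy-paste use (assumed cutoff)
        score += 2
    return score


def struggle_level(score: int) -> str:
    """Map a numeric score onto five ordered struggle levels."""
    levels = ["none", "low", "moderate", "high", "severe"]
    return levels[min(score // 2, len(levels) - 1)]
```

For example, a session with a 200-second pause, 30 deletions per 100 keystrokes, a 1.0 insert-delete ratio, and 5 paste events triggers all four rules and maps to the highest level, while a fluent session triggers none.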