Most undergraduate computer science programs still treat core developer tools, including the command-line interface (CLI), symbolic debuggers, and version control systems, as knowledge students will acquire on their own outside of class. We believe that these skills deserve the same explicit instruction, practice, and assessment that computer science programs dedicate to languages and algorithms.

This experience report presents a practical, scalable method for assessing students' proficiency through checkoffs: one-on-one sessions in which students demonstrate skills and verbalize concepts to a teaching assistant (TA). This live, interactive format allows TAs to probe student understanding and provide immediate feedback, offering a more effective and authentic assessment experience than traditional paper-based exams. In a semester-long Systems Fundamentals course enrolling 237 students, we designed and administered three checkoffs, one each for the CLI, debugging, and Git.

Final scores on each checkoff demonstrated that the vast majority of students achieved baseline proficiency with each tool. Additionally, survey results indicated that the checkoff format boosted students' confidence with the tools, encouraged earlier help-seeking on other assignments, and fostered a stronger sense of community. Given these positive outcomes, we plan to continue using checkoffs to evaluate tool-based skills. To enable other educators to replicate this system, we detail the key design decisions for scaling checkoffs effectively, including logistics, a breakdown of the skills assessed, and sample questions.