Evaluating Feedback Tools in Introductory Programming Classes

Frontiers in Education Conference (FIE), IEEE

This Research Full Paper presents a study on the evaluation of feedback tools in introductory programming classes. Several tools have recently been proposed to provide guidance and help students overcome conceptual difficulties in programming education. Some tools leverage clustering algorithms and program repair techniques to automatically generate personalized hints for students' incorrect programs. Other teachers instead present students with program visualization tools to help them understand the dynamic execution of source code. These tools are intended to help students reach correct solutions to programming assignments. However, because of limitations in how they have been assessed, it is still unclear how effective the feedback they provide is. In this study, we analyzed the effectiveness of a tool that generates personalized hints and a tool that visualizes program execution. To do so, we conducted a user study in which students, assisted by these tools, implemented solutions to three programming problems. Our results show that personalized hints can significantly reduce the effort students need to reach correct solutions. In addition, personalized hints can give students an understanding of problem solving similar to that gained from using test cases. However, students who used the program visualization tool showed lower post-test performance than students who used the other tools.
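As a minimal illustration (not the tool evaluated in this paper), a hint generator in this spirit can compare a student's incorrect program against the nearest known-correct solution and point at a line that would have to change. The Python sketch below uses difflib as a simple stand-in for the clustering and program repair machinery; CORRECT_SOLUTIONS, closest_solution, and generate_hint are hypothetical names introduced only for this example.

# Illustrative sketch only: a nearest-neighbor, diff-based hint generator.
# The data and helper names below are hypothetical and do not come from the
# tool evaluated in the paper.
import difflib

# Assumed sample data: known-correct solutions, e.g. collected from past students.
CORRECT_SOLUTIONS = [
    "def absolute(x):\n    if x < 0:\n        return -x\n    return x\n",
    "def absolute(x):\n    return x if x >= 0 else -x\n",
]

def closest_solution(student_code, references):
    # Pick the reference program most textually similar to the student's code
    # (a simple stand-in for clustering submissions into solution strategies).
    return max(references,
               key=lambda ref: difflib.SequenceMatcher(None, student_code, ref).ratio())

def generate_hint(student_code, references):
    # Diff the student's code against the nearest correct program and point at
    # the first line that would have to change (a crude form of program repair).
    reference = closest_solution(student_code, references)
    diff = difflib.unified_diff(student_code.splitlines(),
                                reference.splitlines(), lineterm="", n=0)
    for line in diff:
        if line.startswith("-") and not line.startswith("---"):
            return "Check this line, it may need to change: " + line[1:].strip()
    return "No hint: your program already matches a correct solution."

if __name__ == "__main__":
    buggy = "def absolute(x):\n    if x < 0:\n        return x\n    return x\n"
    print(generate_hint(buggy, CORRECT_SOLUTIONS))
    # Prints: Check this line, it may need to change: return x

Real hint generators are considerably more sophisticated, typically working over abstract syntax trees and test outcomes rather than raw text, but the sketch conveys the basic compare-and-point idea on which personalized hints build.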