Analyzing the Quality of Submissions in Online Programming Courses
Format: Journal Article
Language: English
Published: 26-01-2023
Summary: Programming education should aim to provide students with a broad range of skills that they will later use while developing software. An important aspect of this is their ability to write code that is not only correct but also of high quality. Unfortunately, this is difficult to control in the setting of a massive open online course. In this paper, we analyze the code quality of submissions from JetBrains Academy, a platform for studying programming in an industry-like, project-based setting with an embedded code quality assessment tool called Hyperstyle. We analyzed more than a million Java submissions and more than 1.3 million Python submissions, studying the most prevalent types of code quality issues and the dynamics of how students fix them. We provide several case studies of different issues, as well as an analysis of why certain issues remain unfixed even after several attempts. We also studied abnormally long sequences of submissions in which students attempted to fix code quality issues after passing the task. Our results point the way towards improving online courses, for example, by ensuring that the task itself does not incentivize students to write code poorly.
DOI: 10.48550/arxiv.2301.11158