Student Participation in Computing Studies to Understand Engagement and Grade Outcome
This paper focuses on understanding the learning behaviour of undergraduate computing students by reviewing their online activity in a university learning management system (LMS), together with their grade outcomes, across three subjects. A specific focus is the activity of students who failed the computing subjects.
Between 2008 and 2020 there was rapid growth in the adoption of Learning Analytics (LA) by education institutions across many countries. Insights gained through LA can lead to actionable changes at higher education institutions for the benefit of students, including refinement of curriculum and assessment regimes, teacher reflection, and more targeted course offerings.
To understand student activity, this study used a quantitative approach to analyse LMS activity and grade outcome data drawn from three undergraduate computing subjects. Data analysis focused on counts and averages to characterise student activity.
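The counts-and-averages analysis described above can be sketched in a few lines of standard Python. The log records and student identifiers below are hypothetical, assuming the LMS export provides one row per student click:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical LMS click log: one (student_id, resource) pair per access.
# These values are illustrative, not data from the study.
clicks = [
    ("s1", "lecture_1"), ("s1", "forum"), ("s1", "quiz_1"),
    ("s2", "lecture_1"),
    ("s3", "forum"), ("s3", "forum"),
]

# Count activity events per student.
activity = defaultdict(int)
for student, _resource in clicks:
    activity[student] += 1

# Average activity across the cohort.
avg_activity = mean(activity.values())
print(dict(activity))   # {'s1': 3, 's2': 1, 's3': 2}
print(avg_activity)     # 2
```

In practice the click log would be read from the LMS's activity export rather than hard-coded, but the aggregation step is the same.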
This paper contributes a practical approach to LA use in higher education, demonstrating how a review of student activity can inform the learning design of computing subjects. In addition, the study focused on poorly performing students so that future offerings of the computing subjects can support students who are at risk of failure.
The study found that:
• Collecting and analysing data on student activity is an important indicator of engagement; cross-referencing the data with grade outcomes provides information to support modifications to the learning design of the computing subjects.
• The computing subjects in this study all awarded the majority of assessment marks in the later part of the study period.
• Students who fail subjects remain active within the LMS for the duration of the subject, even when they submit no assessments.
• Assessment weighting and the timing of delivery could influence outcomes.
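The cross-referencing of activity with grade outcomes mentioned in the findings can be sketched as a simple group-and-average. The per-student click counts and pass/fail labels below are illustrative assumptions, not figures from the study:

```python
from collections import defaultdict
from statistics import mean

# Illustrative per-student LMS activity counts and grade outcomes
# (hypothetical values, not the study's data).
students = [
    {"id": "s1", "clicks": 120, "grade": "pass"},
    {"id": "s2", "clicks": 45,  "grade": "fail"},
    {"id": "s3", "clicks": 90,  "grade": "pass"},
    {"id": "s4", "clicks": 60,  "grade": "fail"},
]

# Group activity counts by grade outcome.
by_grade = defaultdict(list)
for s in students:
    by_grade[s["grade"]].append(s["clicks"])

# Compare average engagement across outcomes.
averages = {grade: mean(counts) for grade, counts in by_grade.items()}
print(averages)  # {'pass': 105, 'fail': 52.5}
```

A comparison of this kind is what lets a learning designer see, for example, that failing students still show non-trivial activity despite submitting no assessments.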
The collection and analysis of student activity in the LMS can enable learning designers and practitioners to reflect more effectively on subject design and delivery, leading to more informed ways of delivering the learning material.
Collecting LA requires a well-thought-out process, designed well in advance of the teaching period. This study provides insights that can inform other researchers in the collection of assessment-related analytics.
Education is costly for those who undertake it. Failure, although expected to some degree, can potentially be reduced by examining how education is designed, delivered, and assessed. This study has shown that information on how students engage has the potential to improve their outcomes.
Further work is needed to investigate whether intervention may help poorly performing students improve their grade outcomes relative to their activity levels, subsequently improving their retention.