Automated Essay Grading System Applied to a First Year University Subject - How Can We Do It Better?

John Palmer, Robert Williams, Heinz Dreher
InSITE 2002  •  Volume 2  •  2002
Automated marking of assignments consisting of written text would doubtless be of advantage to teachers and education administrators alike. When large numbers of assignments are submitted at once, teachers find themselves bogged down in their attempt to provide consistent evaluations and high-quality feedback to students within as short a timeframe as is reasonable, usually a matter of days rather than weeks. Educational administrators are also concerned with quality and timely feedback, but in addition must manage the cost of doing this work. Clearly an automated system would be a highly desirable addition to the educational tool-kit, particularly if it can provide less costly and more effective outcomes. In this paper we present a description and evaluation of four automated essay grading systems. We then report on our trial of one of these systems, which was undertaken at Curtin University of Technology in the first half of 2001. The purpose of the trial was to assess whether automated essay grading was feasible, economically viable, and as accurate as manually grading the essays. Within the Curtin Business School we have not previously used automated grading systems, but the benefit could be enormous given the very large numbers of students in some first-year subjects. As we evaluate the results of our trial, a research and development direction is indicated which we believe will result in improvement over existing systems.
Keywords: assessment, assignment, automatic, essay, grading, marking, plagiarism