Do-it-yourself e-Exams

Authors

  • Mathew Hillier
  • Scott Grant

DOI:

https://doi.org/10.14742/apubs.2018.1920

Keywords:

computerised assessment system, e-exam, spreadsheets

Abstract

This paper focuses on a small case study in which we developed and tested a set of spreadsheets as a 'do-it-yourself' e-examination delivery and marking environment. A trial was conducted in a first-year university-level class during 2017 at Monash University, Australia. The approach enabled automatic marking for selected-response questions and semi-automatic marking for short text responses. The system did not require a network or servers to operate, thereby minimising reliance on complex infrastructure. We paid particular attention to the integrity of the assessment process by ensuring separation of the answer key from the response composition environment. Students undertook a practice session followed by an invigilated exam. Students' perceptions of the process were collected using pre-post surveys (n = 16) comprising qualitative comments and Likert items. The data revealed that students were satisfied with the process (4 or above on 5-point scales). Comments revealed that their experience was influenced in part by their level of computer literacy with respect to enabling skills in the subject domain. Overall, the approach was found to be successful: all students completed the e-exam, and administrative efficiencies were realised in terms of marking time saved.
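To illustrate the marking idea described in the abstract (this is a conceptual sketch only, not the authors' spreadsheet implementation), the following Python snippet shows how selected-response items could be marked automatically against an answer key that is kept in a separate file from the students' responses. The file names and CSV layout ("item", "answer", "student_id" columns) are illustrative assumptions.

import csv

def load_key(path):
    # Answer key stored separately from responses: one row per item,
    # with columns 'item' and 'answer' (assumed layout).
    with open(path, newline="") as f:
        return {row["item"]: row["answer"].strip().upper()
                for row in csv.DictReader(f)}

def mark_responses(responses_path, key):
    # Score each student's selected responses against the key.
    results = {}
    with open(responses_path, newline="") as f:
        for row in csv.DictReader(f):
            score = sum(
                1 for item, answer in key.items()
                if row.get(item, "").strip().upper() == answer
            )
            results[row["student_id"]] = score
    return results

if __name__ == "__main__":
    key = load_key("answer_key.csv")          # kept apart from responses
    scores = mark_responses("responses.csv", key)
    for student, score in scores.items():
        print(f"{student}: {score}/{len(key)}")

In a spreadsheet setting the same separation can be achieved by holding the key in a protected workbook and applying lookup/comparison formulas to the imported responses; the snippet above simply makes the scoring logic explicit.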



Published

2018-11-20

Issue

Section

ASCILITE Conference - Full Papers