Scaling Oral Assessment Authentically

Live Recorded Video Exams (LRVE)

Authors

  • Tahmina Akhter The University of New South Wales
  • Margaret Connor The University of New South Wales
  • Evelyn Lai The University of New South Wales
  • Natalie Yoon-na Oh The University of New South Wales
  • Flemming Rhode The University of New South Wales

DOI:

https://doi.org/10.14742/apubs.2024.1235

Keywords:

Oral assessments, Academic integrity, Authentic assessment

Abstract

The rise of AI and related threats to academic integrity in a post-pandemic academic environment, where most assessments are not invigilated, forces universities to revisit the question of how to conduct safe and fair assessments (Dawson, 2022). As invigilated exams and online proctoring have fallen out of favour with university administrators due to cost, space or privacy concerns, oral assessments have found renewed interest (Newell, 2023). These are effective in measuring students' knowledge whilst verifying that the responses truly belong to the student (Joughin, 1998). Though there are many established variations of oral assessment, from the viva voce thesis defence to interactive orals (as widely conducted in Law and Medicine), they usually emphasise authentic live interaction between student and assessor. However, the sheer scale of some undergraduate programs (common in Business Schools), along with the integrated nature of the business curriculum, which requires marking expertise from different graders, makes truly interactive oral exams unfeasible. In response, a large first-year undergraduate Business course (with enrolments of between 500 and 1,200 students per term) at a Go8 institution implemented an authentic recorded oral assessment that mimics job interview practice.

Students were recorded for a set time period while they responded to questions verbally, with the resulting video and transcripts being marked later. We utilised an asynchronous, remotely invigilated oral assessment (Baird et al., 2022) for the final exam to improve academic integrity and to further embed industry-relevant transferable skills whilst increasing student employability. The new assessment format was named the ‘Live Recorded Video Exam (LRVE)’. Invigilation was conducted on a one-on-one basis, with one invigilator checking in each student in their assigned grouping. The invigilators were provided with a procedural checklist to follow to ensure consistency across all groups. The process involved the invigilator calling the student into a Microsoft Teams channel, verifying the student’s identity, checking their immediate surroundings, having the student share the screen of the device they were using, providing a password verbally for the student to access the exam questions, and then starting the recording for invigilation purposes. Once the recording commenced, the invigilator would leave the meeting room and allow the student to complete the exam independently by articulating their responses. The graders had access to both the video recording and the captioned transcript to mark the exam and check for any student misconduct.

Students were informed about the exam format through multiple channels, including communications from the teaching team and innovative student-led initiatives. They were given various opportunities to prepare throughout the term within established class structures and through mock practice sessions.

Of the 542 exams, none were reported for violations of academic integrity, and the rate of special consideration applications was lower. This was due, in part, to potential problems being rectified during the student check-in process. Surveys conducted after the exam showed that the LRVE format was well received by markers and student representatives, in part because of its flexibility, offering a complement to traditional viva voce models and demonstrating its scalability.

Published

2024-11-11

Section

ASCILITE Conference - Pecha Kuchas