Supporting hybrid engagement through digital learning design

With the shift to post-pandemic flexible delivery there is an increased focus on the quality of online learning materials. In response, a project was undertaken to transform online law unit sites from PDFs to interactive pages, and from 2-hour lectures to multiple short pre-recordings each dedicated to a single concept. This study explored student engagement, preferences, and performance through a pragmatic and meaningful analysis of learning analytics in relation to learning design, for the purpose of informing our practice. This paper draws on data from the 89 students who participated in the study, from a cohort of approximately 200. The study confirms that the majority of students preferred the new digital learning design approach; however, the diversity of student preferences has implications for learning design that need to be taken into consideration.


Introduction
University online learning options have been available to students for decades (Fan et al., 2021; Nguyen et al., 2021; Singh et al., 2021). The COVID-19 pandemic catalysed a widespread need for online delivery in higher education globally (Fan et al., 2021). Post-pandemic approaches are embracing more hybrid options, allowing students to flexibly choose their modes of engagement (Bevacqua et al., 2019; Singh et al., 2021). This shift to offering flexible modes has led to a focus on the quality of learning materials students can engage with synchronously or asynchronously (Bevacqua et al., 2019). Alongside the emphasis on quality online learning and teaching experiences is a parallel focus on how this can be evaluated and improved. In this study we focus on student engagement and preferences in the context of the digital learning design of law units.

Purpose of this research and the research questions
This study explores the student experience of the redesign of unit materials. The planning for and implementation of the learning design in the project work that provided the context for this study aligned with the research-informed CloudFirst approach, an application of the principles of learning and teaching developed and promoted by Deakin University (2022). We collaboratively built the unit site, documenting the high-level mapping of the content (shifting from four modules to 10 topics), our progress, and style choices to ensure consistency, accessibility, and ease of navigation throughout the site. Under the old model, students were offered a 2-hour lecture and a 1-hour seminar, with recordings of both made available to all students. Under the new model, students were provided short, pre-recorded videos (90 minutes in total per week) embedded within interactive site pages (replacing a weekly PDF study guide). Students had the choice of attending a 90-minute in-person seminar on campus, a synchronous online seminar, or watching a recording. The scaffolded content (mini-recordings, linked readings, activities, and formative feedback quizzes) was gradually released as the weekly content and design was finalised. The purpose of this study was to investigate the aspects of the new approach that worked for students, based on their learning analytics, i.e., their feedback from self-report surveys, system-generated learner usage data and grades, to inform our future practice. Hence our research question is: "Which learning design practices support students in studying law?" In answering this question, this study contributes to scholarly research about learning design practices within and across disciplines.

Informing our scholarship of practice with student learning analytics
The value of practitioner-led scholarly investigations of practice is becoming more widely recognised and encouraged across disciplines. The types of evidence available to teaching practitioners, beyond the problematised student teacher evaluations (Hornstein, 2016), have increased exponentially with the widespread collection of learning analytics by universities (Corrin et al., 2016; Du et al., 2021; Griffiths, 2020). Learning analytics are one of many technological innovations offering both promise and serious concern as they forge forward faster than the ethical debates, policy and practical application for educators and students alike (Archer & Prinsloo, 2020; Clow, 2013; Corrin et al., 2019; Griffiths, 2020; Wilson et al., 2017). We consider learning analytics in this project according to the definition developed by the Society for Learning Analytics Research (SoLAR) as "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs" (Siemens et al., 2011, p. 4). By this definition, all the data collected in this study (system-generated data, student survey responses and grades) are learning analytics.
Studies that have used learning analytics in the form of system access data as a proxy for student behaviour have sought to inform teaching and learning practice by testing whether there is a correlation between this behavioural measure and students' performance as measured by grades (Atherton et al., 2017; Crampton et al., 2012). Understanding student behaviour, however, does not capture the difference between learning itself and activity (Wilson et al., 2017), nor does it assist with understanding students' motivations and preferences (Atherton et al., 2017). Suggestions for supplementing behavioural analytics to understand student learning include observation, self-report data (Ellis et al., 2017) and interviewing (Atherton et al., 2017). Others, such as Saqr et al. (2022), have stressed the consideration of the context, learning design and subject areas in the interpretation of learning analytics. Corrin et al. (2016) highlighted the need to understand learning design "or pedagogical intent" (p. 5) to support meaningful use of learning analytics. The broad pedagogical intent (Corrin et al., 2016) of our digital learning redesign was that students choosing online options receive an opportunity for learning equivalent to that of on-campus students through the program and pedagogy (Bevacqua et al., 2019; Chen et al., 2010).
In this study of our practice, we build on understandings beyond what can be discerned from the behavioural indicators in system data, to critically engage with other learning analytics such as student self-report data (Archer & Prinsloo, 2020). To understand student engagement in the hybrid space, with a particular focus on the learning design, we utilise Redmond et al.'s (2018) Online Engagement Framework for Higher Education to make meaningful interpretations to inform our practice. Redmond et al.'s (2018) social constructionist framework considers the interrelated elements of emotional, behavioural, social, collaborative, and cognitive engagement, and can be used to audit learning content (Cain & Fanshawe, 2021) or as a tool for evaluating the learning environment design (Redmond et al., 2018). This study is topical given the increased focus on the learning design of unit materials, the increased interest and engagement in online learning quality during the COVID-19 pandemic, and the benefits and implications of educators themselves evaluating this using learning analytics.

Study design
Our research project was designed to explore student engagement, preferences, and performance in relation to our redesign of units. The study uses the three-dimensional model of Pragmatic, Evidence-Based Education, designed to consider useful evidence, educator judgement and the local context (Newton et al., 2020). This model, originally designed for educators to evaluate existing research to inform their practice in their context, informs our approach to analysis because consideration of the three aspects of the model serves as a useful starting point for considering what we can understand about the student experience. This study also draws on Redmond et al.'s (2018) Online Engagement Framework to guide meaningful discussion and conclusions about engagement beyond the behavioural, drawing on the multiple data points and perspectives explored in this study.

Ethics, participants and consent
The project received low-risk ethics approval for the study Teaching and Learning in Law: Engagement, Preferences and Performance from the Faculty of Business and Law, Deakin University Human Ethics Advisory Group on 17 December 2021 (modified 21 June 2022), project number BL-EC 67-21. The ethics protocol protected students' anonymity by specifying that the data would be collated and de-identified by the non-teaching researcher after the results were released, to assure students that responses would not impact their grades and to increase the likelihood of candid responses. Approval was given for an incentive of five randomly drawn $50 vouchers for participation in the survey. The student cohort was invited to participate at the beginning of the trimester by the Unit Chair, who informed students that, should they consent via the information page or survey, their data would be used to find out about their study habits and preferences to improve teaching and learning.
The total participants of this study were 89 students of the approximately 200 enrolled in the law units (one undergraduate and one postgraduate, the latter with a stronger emphasis on policy). The age spread shown in Table 1 indicates that while the number of mature-age students is significant, most participants were in the 18-24 age bracket and female. The 35-44 and 45-54 groups were merged and renamed 35+ to increase anonymity.

Data collection methods, description and approach to analysis
The learning analytics collected included learning management system generated data, student grades, and self-report data collected using online surveys. There were 89 students consenting to the use of system-generated learner usage reports (which included the pages visited and the time spent on each page) and their grades. There were 58 students who self-reported mode of seminar attendance (on-campus, online or recording) and 67 students who responded to the self-report survey. The survey began with basic demographics (student number, gender and age bracket) and then a series of Likert-scale questions on: (1) engagement indicators (lecture videos, access to slides, seminars, consulting with the lecturer, fireside chats, required readings, optional readings, online materials, weekly feedback quizzes, checklists, activities, forums, online readings links, email reminders, and announcements); (2) study habits (attention paid watching videos, time on assignments and preparing for the exam, what helped with assignments and exams); and (3) preferences (on learning design model and video length). The survey concluded with three text response questions (Table 2), with 52 unique respondents. We took a pragmatic and critical approach to analysing the large volume of data, in two phases, with a view to triangulating and reporting data that could inform our practice. In phase one, aspects of the self-report and learner usage data were explored by two of the researchers in preparing the guidance and data for a statistician. This included considering the volume and types of data and how aspects might be triangulated and collated to test for correlations between behavioural usage data, self-report preferences and grades. The sample contained 59 students with fully completed self-report surveys. Inferential statistical analysis was conducted on predictors of overall success (effects of using learning resources and impact of watching videos), predictors of assignment and exam success (by time spent and use of specific resources), student preferences (opinion of learning model and video length), and an analysis of linear associations for predictors of assignment grade, exam grade and total grade (also included on a percentage scale, and grades versus hours spent). The unpublished report (Davey & Finch, 2023) on the statistical analysis we commissioned from the Statistical Consulting Centre is referenced in this paper.
In phase two, the two non-teacher researchers collaboratively applied an iterative approach to the analysis and triangulation of themed responses with corresponding parts of the self-report and the system-generated learner usage data, to better understand the student experience beyond the behavioural indicators. The number of students responding to each question and total words are recorded in Table 2. Responses were coded with a pseudonym, gender, age bracket, and question number. The data collected from 49 students consisted of 84 comments in Q1 and Q3 that were coded by snippets (partial comments) associated with the learning design elements in a first pass; in a second pass, connections were made with other data and the Online Engagement Framework (Redmond et al., 2018) elements (see themes in Table 3). Snippets drawn from Q1 and Q3 to the same effect were merged. Question two, on assessment, was analysed separately. This paper reports on our research resulting from the two phases of analysis, using the Online Engagement Framework (Redmond et al., 2018) to structure our initial discussion and findings.

Discussion and findings
The discussion and findings presented in this paper are explored using the Online Engagement Framework's interrelated factors: emotional and behavioural engagement (where we discuss emotional engagement preferences in relation to the learning redesign, including the unit site instead of PDF documents and short videos instead of lectures, as well as behavioural engagement and performance); the social and collaborative engagement factors (where we discuss student experience and preferences in seminars, fireside chats and online discussion forums); and cognitive engagement (in relation to assessment and seminars).

Emotional and behavioural engagement
CloudFirst model approach to learning
The majority of students responded favourably (42 of 60, or 70%) towards the prompt "The new CloudFirst model of the unit i.e., short, pre-recorded lectures and 90-minute seminars is a good learning model for me" (Figure 1).

Figure 1: Student preference for the new CloudFirst model
The majority agreement that the CloudFirst approach was a good learning model for most students was a useful starting point for understanding that our learning design efforts were helpful.
To explore this further we draw on 13 snippets of the open-text responses coded as liked the model, exemplified in the affective descriptors "loved", "enjoyable" (3), "liked", "excellent", "very good", "good", "fantastic", "easier" (2), "achievable" and "fun" for their reaction to the new model, indicative of positive emotional engagement (Redmond et al., 2018). The positive emotional response to the new format positions students for a positive learning experience, illustrated in this example from Annie: "I liked how the weekly topics were broken up into multiple parts with their own videos and PowerPoints", articulating that "overall, the unit is very well designed from an online learning perspective and don't think you could really improve it!" (Female, 18-24, Q1 and Q3). For some students the model alleviated anticipated negative feelings toward studying the content in law, as in the case of Nora: "For a unit that I was nervous about (seeming content heavy), it was set up so well that you could follow simple steps to keep on top of it all" (Female, 18-24, Q1), and similarly for Ren: "The new model was really easy to follow and less intimidating than a long lecture and huge readings" (Female, 25-34, Q1). For George it was less stressful because there was "no pressure to have to be online at certain times" (Male, 25-34, Q1). We see in these examples how positive emotional engagement potentially supports behavioural engagement with the content. There were eleven students who made comments expressing a negative emotional reaction to the new approach, using terms such as "dislike", "didn't work for me" or "overwhelming". Sarita commented that she "…really struggled with the structure of this unit. The content was interesting, but the delivery was a very time-consuming process" (Female, 18-24, Q1). This was further supported by Linh, who "…found the combination of the notes on the page and the lecture slides/recording to be very tedious and took a lot longer than most other units" (Female, 18-24, Q1). Two found they "struggled" because of the amount of content, with Theo representative of the experience: "the content level was very high for the allocated time which made info absorption difficult while juggling life and work" (Male, 35+, Q1). Four others found the presentation of some dense areas of law "disjointed" (2) or that "topics were not broken down very well" in the format, and preferred the textbook, simply wanted 2-hour lectures, or wanted no page content at all. The negative emotional reactions seen in these examples could be a barrier to a positive learning experience.
Here we explored positive emotional engagement as an important precursor to positive behavioural and cognitive engagement. We now explore the interrelated student experience of the short video lectures embedded in the content pages, as they were an integral part of the unit content redesign.

Short Interspersed Video Lectures
A clear majority (92%) of students thought that "watching video lectures helped" their learning in the unit, with 25 Strongly Agreeing, 9 Somewhat Agreeing, and one student in each of the other three categories. When it came to the behavioural engagement indicator of how much attention they paid (Fredricks, Blumenfeld & Paris, 2004), most students (77%) reported this as Always (22) or Usually (26). Figure 2 demonstrates a trend between mean grades and how much attention students paid watching videos.

Figure 2: Mean grades and attention while watching videos
Furthermore, Table 4 shows the number of students in each 'attention' category and their mean grade. The emotional reaction to the shorter videos was exemplified by eleven students in descriptors such as "really like", "easy", "perfect", "fantastic" (2), "extremely helpful and digestible", "enticing" and "positive". Ari thought, "The video lengths were appropriate and not intimidating or laborious" (Male, 25-34, Q1). If a video were intimidating, it is understandable that negative emotional engagement would inhibit behavioural engagement. Five comments reveal how emotional engagement supports behavioural engagement, illustrated by Nadya: "short is good because if you need to refresh your knowledge on a particular area, it is easier to find (more efficient use of time)" (Female, 35+, Q1). Surprisingly, there were four students who wanted even shorter videos, with two preferring that no video exceed 20 minutes. There were six students who would prefer longer videos akin to the traditional 2-hour lecture (4), because for them the short videos elicited a negative emotional response, as in the case of Sindu, who "was irritated by having to load so many videos and then clicking through them as I listen to them on the train - sometimes without internet" (Female, 35+, Q3). These barriers to positive emotional and behavioural motivation could impact their cognitive engagement.

Time spent on unit site as a measure of behavioural engagement and relationship with performance
The relationship between the total grade and the total number of hours spent on the unit site (Figure 3) can be explained as "the positive slope of its regression line implies that the more time a student spends on the unit site, the higher their total grade is expected to be" (Davey & Finch, 2023, p. 31).

Figure 3: Scatterplot of the linear relationship between total grade and time spent on the unit cloud site
The statistics of this regression line are provided in Table 5 and "demonstrate that the results represent statistically significant evidence against the assumption that total grade does not depend on the amount of time spent on the unit's cloud site" (Davey & Finch, 2023, p. 31). This representation, using the hours spent on the site as a proxy for behavioural engagement, shows a statistically significant relationship between the time students spend on the unit site and performance.
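The kind of simple linear regression described above can be sketched as follows. This is an illustrative example only: the values below are hypothetical and are not the study data, which remain confidential; the analysis technique (ordinary least squares regression of total grade on hours spent) is as described in the statisticians' report.

```python
# Illustrative sketch (hypothetical values, NOT the study data): a simple
# linear regression of total grade on hours spent on the unit site, the
# analysis described in the text above.
from scipy import stats

hours = [5, 10, 15, 20, 25, 30, 35, 40]    # hypothetical hours on the unit site
grades = [55, 58, 62, 60, 68, 70, 74, 78]  # hypothetical total grades (%)

result = stats.linregress(hours, grades)
print(f"slope = {result.slope:.2f} grade points per hour on site")
print(f"r = {result.rvalue:.2f}, p-value = {result.pvalue:.4f}")
```

A positive slope with a small p-value is the pattern reported in Table 5: evidence against the assumption that total grade does not depend on time spent on the site.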

Preference towards the model and correlation with grades
The majority of students preferred the CloudFirst model, as indicated in the self-report data. For the statistical analysis (Davey & Finch, 2023, p. 21) of student opinion towards the new CloudFirst model, a binary variable of yes and no was created by collapsing the agree and disagree responses. Table 6 provides the mean total grades of students who found the CloudFirst model helpful for learning law and of students who did not. The grades of the two groups covered similar ranges of values, with the standard deviations of the two groups being of a similar order of magnitude, between 10% and 13%. The mean grades of the two groups differed by less than one percent. Table 7 provides the results of a t-test of the difference between the mean total grade of students who found the model helpful for learning and that of students who did not, and "shows that there is insufficient evidence against the assumption that overall grade is independent of student opinion towards the CloudFirst model" (Davey & Finch, 2023, p. 21). In other words, students who preferred the model on average scored slightly higher (66.73 vs 65.78); however, the difference is too slight and the sample too small to affirm that the relationship between preference and performance was statistically significant.
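The two-group comparison reported in Table 7 can be sketched as a standard independent-samples t-test. The grade lists below are hypothetical stand-ins (not the study data) chosen to mimic the reported pattern of near-identical group means; the technique is as described above.

```python
# Illustrative sketch (hypothetical grades, NOT the study data): an
# independent-samples t-test comparing mean total grades of students who
# found the CloudFirst model helpful against those who did not.
from scipy import stats

helpful = [60, 65, 70, 72, 67, 63, 71]      # hypothetical grades, "yes" group
not_helpful = [58, 64, 70, 71, 66, 62, 69]  # hypothetical grades, "no" group

t_stat, p_value = stats.ttest_ind(helpful, not_helpful)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

With group means this close, the p-value is large, which mirrors the study's finding: insufficient evidence that overall grade depends on student opinion of the model.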

Social and collaborative engagement
The main opportunities for social engagement were seminars, discussion forums and fireside chats (an informal interpersonal session with the lecturer at the start of the trimester via Zoom).

Seminars and online discussion forums
We know from the 58 students' self-reported attendance that on average they attended 4.4 times (out of a possible 11), given the total of 260 attendances. When students did engage in the seminars, 65% of the time it was synchronously (79 on-campus and 90 live online attendances), which could have afforded an opportunity for social engagement, as opposed to the 91 instances of watching the recording. Low attendance was anecdotally confirmed by Bec, who "observed most did not attend the live seminars" (Female, 35+, Q3). Dara reported that she engaged with 7 seminars in total, with only the first and last being live online; otherwise she watched the recording. In the open-text response she said, "I should have attended the seminars live. That's my fault, not anyone else's. Oops" (Female, 18-24, Q3). We can understand from this that Dara thought she would have benefitted more from the live sessions.
A total of five students made a comment in the open-text responses about wanting further social engagement.
Ang noted that the seminars were less useful due to "a lack of student interaction" (Female, 25-34, Q1), supported equally by Theo, who thought "more personal engagement with the online study groups" (Male, 35+, Q3) would have improved his experience. Theo did not blame the teachers for this, citing that "the university and its teachers can only do so much", but expressed his disappointment that "the largest drawback is when you sit in a seminar and only one person engages". Interestingly, Theo's insight reflects the point made by Coats (2006, cited in Redmond et al., 2018) that "institutions are responsible for creating environments that make learning possible, that afford opportunities to learn," but "the final responsibility for learning … rests with students" (p. 29). It should be noted that in the Deakin Law School attendance is not compulsory. Students such as Bec would like the institution to do more about this, emphasising that the "online learning experience can only be enhanced with student engagement" and recommending "having student attendance and participation say 10% of the overall final grade", backing this up with the argument that "they do this at certain top American law schools. Class participation helps to consolidate learning; exposes gaps in our knowledge; learn from each other; and trains us to argue like a lawyer; etc. Sadly, this didn't happen with our cohort" (Female, 35+, Q3). Mona was another student who felt that learning in the unit would have been improved if more students engaged in the discussion forums: "More engagement from the class. I felt like I made more forum posts than anyone else which is not usual for me" (Female, 25-34, Q3).

Fireside chats
Fireside chats were offered to students as an opportunity to engage with the Unit Chair and each other in an informal Zoom setting. Of the 63 students who responded to the survey question, 22% agreed that they were useful, 20% disagreed, and 50% indicated that they did not attend or gave a neutral response. Although there were no direct mentions of the fireside chats with the Unit Chair in the open-text responses, there were 14 positive comments about the teaching of the unit, thirteen of which indicated warm connections with the Unit Chair, illustrated by Georgina: "In this unit I don't think anything could be improved, [Unit Chair] was great with her reminders and really kind responses to questions … and thank [Unit Chair] so much for her time and attention to detail" (Female, 25-34, Q3).
A small number of students expressed the desire for social engagement and felt this would support their cognitive engagement (Redmond et al., 2018) in the seminars; however, low attendance and low active participation in the sessions limited the experience and benefits of social engagement. Of the 58 who attended at least one seminar (an average of 4.4 seminars per respondent), about two-thirds of attendances were synchronous, but students still found the social engagement lacking. A number of students did appreciate the collaborative engagement experiences with the Unit Chair (and tutor), but relatively few took advantage of the fireside chat opportunity at the start of the trimester.

Assignment and seminars
There was evidence of students' cognitive engagement in relation to the assignment, and a wish for deeper cognitive engagement in seminars. In answer to open-text question two, about whether the assignment achieved the goal of providing students with authentic experiences, 51 students responded, with a minority of 7 indicating that it did not. Of the 44 who indicated that this goal was met or partially met, an example of cognitive engagement through the assessment was evident for Ayla, who found that "They were challenging but in a productive way", fulfilling her intention that she "wanted to learn and improve rather than just coast by for an easy grade" (Female, 18-24, Q2). In relation to the cognitive engagement experienced with the assessments, Ayla revealed the "commitment to learning" with which she approached the unit, an indicator of emotional engagement (Redmond et al., 2018) that provides evidence of the connection between these forms of engagement. Students' cognitive engagement could have been increased in the seminars if there had been more of a focus on answering problem questions (11 open-text comments to this effect) and/or if written answers had been provided (6 comments). Ari thought the seminars should be "more structured to specifically and substantively follow the legal steps required to complete the problem questions", while acknowledging that when seminar responses and practice exams were provided they were "helpful" (Male, 25-34, Q1). Students indicating that they wanted deeper engagement with the legal problem scenarios in seminars is an indicator of their desire for further cognitive engagement (Redmond et al., 2018).

Summary findings and limitations
In this study we have presented an initial analysis and discussion exploring which learning design elements help students studying law, in relation to a redesign using the CloudFirst approach, using learning analytics. We have been able to approach this in a more holistic way by taking into consideration what we consider useful evidence, using our educator judgement and knowledge of the local context (Newton et al., 2020).
The CloudFirst model was a good approach for 70% of students in supporting their learning, according to the self-report survey. From the self-report survey, 92% of students thought that watching the video lectures helped their learning; of these, 72% reported that they usually or always paid attention. There was a statistically significant relationship between the time spent on the unit site and student grades, and a positive trend between the amount of attention paid while watching videos and grades (although this was not statistically significant). This study may not be broadly generalisable due to the specific context of a small cohort of law students, a particular learning management system and the study design. The learning management system used, and the add-on that allows students to download the course content in an alternative format, mean that for students who used this we do not know how much of the content was viewed or for how long it was open. Although this limitation did not greatly impact the validity of the results presented in this paper, it was something we learned that impacts the integrity of our data. We could potentially access the downloaded alternative formats data to triangulate the impact of this possibility on our data. However, the self-report survey and the open-text responses on the overall approach and main features have certainly enriched the depth of our understanding of the student experience. This study could have been strengthened by the opportunity to interview students (Atherton et al., 2017) so we could further unpack the behavioural indicators, or lack of them. The inclusion of self-report data helped us to better understand the motivation and preferences of students; without it, our understanding would have been limited to a behavioural account of resources used and accessed (Wilson et al., 2017; Atherton et al., 2017; Ellis et al., 2017).
Through using Redmond et al.'s (2018) Online Engagement Framework we became aware of the interrelatedness of students' emotional, behavioural and cognitive engagement, which we had not considered before. From the open-ended text responses, we were able to understand that a positive emotional reaction to the redesign and short videos was a motivational factor for behavioural and cognitive engagement, and conversely for a negative reaction. These juxtaposed polar experiences extended to some students experiencing flexibility in the hybrid model, in being able to engage asynchronously at a time of their choosing, while this was diminished for others who would have preferred a two-hour lecture that was easier to download and listen to anywhere, any time. The open-text responses also gave us insight into the interpersonal benefits for students who had the opportunity to engage with their teachers. Five students expressed the desire for social and collaborative engagement with peers in seminars and in the discussion forums to support their cognitive engagement. However, this was impacted by lack of attendance or lack of participation in synchronous sessions even when they attended. Cognitive engagement could have been improved in the seminars if more time had been spent working through legal problem questions, as mentioned by eleven students in the open-text responses, with a further six wanting written answers to these to support their learning. The emotional engagement demonstrated by the intention and desire to learn, and the willingness to be challenged in assessment and seminars, reinforces the importance of emotional engagement as a precursor to productive cognitive engagement as a finding of our study.

Implications for practice and further research
In this initial analysis we began by looking at students' responses to the overall model and the modes available for them to engage, drawing on the learner usage data triangulated with self-report data to enrich our understanding. Further granular analysis could help us understand which aspects and particular resources contained in the unit sites were being accessed by students, and their corresponding self-report indications as to which aspects were most useful. There are over 180 site pages for which we have learner usage data, and after completing this initial analysis we could do further work exploring student engagement and preferences with particular resources such as the multiple-choice feedback quizzes, checklists, seminar preparation and exam revision materials. We did look at checklists, as one student commented that these were useful, and our initial exploration triangulating the data showed an interesting discrepancy: more students agreed checklists were useful in the self-report data than had checked anything off in the lists according to the learner usage data. We do not know if this is due to inaccuracy in their survey reporting or whether this feature was useful even as an at-a-glance summary of what should have been covered that week. This discrepancy is another example of the kind of question that could be unpacked in student interviews (Ellis et al., 2017).
In exploring student preferences in the text responses, we noticed the juxtaposition of strongly opposite experiences of some students in relation to the model and the short videos. Even though only a minority of students expressed a preference for the former style of delivery, we understand that the negative emotional response to the short videos impacts their learning, and in the interests of catering to diverse student needs and preferences our next steps include creating an optional 'binge model', which will give access to one longer video lecture for each week and a text-only study guide equivalent to the site content text, to be trialled in 2024. This is in line with the Universal Design for Learning principles (CAST, 2018) in offering multiple modes for engagement. Our next steps for further research will be auditing the learning design of the unit against the Universal Design for Learning principles to identify aspects already embedded, as well as aspects that could be developed further, to improve the student experience in the hybrid environment in our context.

Table 2 : Number of student responses and words to the three open-text response questions
Q2. A goal of the assessment task was to provide you with authentic (lawyerly) experiences and enhance your graduate employability skills. Do you believe the assessment achieved these goals? — 51 responses, 721 words
Q3. What do you think would improve your online learning experience [Last chance to make comments]?