‘I don’t know the hierarchy’: Using UX to position literacy development resources where students expect them

The provision of online resources for tertiary student literacy development outside of students' curricular contexts is problematic because engagement with centralised support provisions is low. As part of a Library team focused on student literacy development across our university, we have conducted multiple rounds of user experience (UX) testing with our students to design a set of resources in a course within our learning management system that all students can access. Our most recent round of UX testing employed usability testing, card sorting, and low-fi wire framing activities to identify how students experience our resources. Findings indicate that students are confused by enforced groupings of literacy development content and that they expect our resources to be accessible with the rest of their assessment information. Implications for design include balancing student expectations of immediate access to relevant literacy development resources with the constraints of having a small team who can design this content.


Introduction and literature review
In articulating a meaningful application of UX design methods, this research contributes practical tools for co-designing literacy development resources with students that are both assessment and discipline specific. We have partnered with students (Dalziel, 2016) on designing literacy development resources for five years, but we have not found a user-friendly hierarchy (organisation) for content that is fundamental to students' engagement with knowledge. Outside of literacy courses that may be required in their first year, students' usual engagement with literacy development occurs when working on their assessments in mainstream courses. Through such activity, students are assumed to learn their disciplinary discourses osmotically. Those who do not are often referred to centralised support services. Student engagement with such support is low on campus, via workshops and individual appointments (Harris, 2016), and online, via generic skills websites (Behrend, 2014).
One solution that we have taken is to create a course in our learning management system (LMS), Canvas, into which all students at our institution are enrolled: Your Library on Canvas (YLOC). This aligns with a finding from a survey of 1,568 students' experiences of online spaces at one university that "digital platforms such as the LMS were valued as the 'one place' to interact successfully with university requirements" (Henderson et al., 2017, p. 1572). Students access YLOC via their Canvas dashboards, with content divided into distinct areas, such as how to search for information and common types of written assessments. YLOC is informed by multimedia principles of e-learning design (Clark & Mayer, 2016), research and information literacy development models (e.g., Willison & O'Regan, 2007), and assessment genre theory (Nesi & Gardner, 2012). As well as generic content (e.g., the meaning and relevance of peer review), annotated examples of discipline-relevant writing (e.g., how to write a clinical sciences case study) are also provided.
The current iteration of YLOC has developed out of multiple rounds of user experience (UX) testing. The latest round, which we report on here, was partly focused on how to organise and position content in a way that is meaningful to students. We hypothesised that the ideal positioning of our content should be on, or linked from, Canvas course assignment pages, rather than located separately from students' courses waiting to be discovered.

Methods
UX methods are appropriate to our design process because they enable us to determine whether students get the outcomes that we expect and to interpret the quality of their experiences (Massis, 2018). Ethical approval was granted to recruit students for participation in a one-hour individual UX session during Semester 1, 2023, and we have approval to recruit again in Semester 2. This paper reports on the Semester 1 sessions, which each comprised usability testing, card sorting, and wire framing. We did not target any educational level or other classification of student. Six students participated, which provided us with rich detail about their experiences and expectations. When conducting multiple rounds of usability testing, this is a sufficient amount of data because three or four users per round will usually reveal the main problems, with further testing yielding diminishing returns (Krug, 2000). Each session was audio recorded, usability activity was screen recorded, and we took photos of students' card sorts and screenshots of their wireframes.
The usability testing focused on YLOC. Following a semi-structured contextual interview format (UX Design Institute, 2023), we engaged students in a mixture of "get it testing" (whether students understood the purpose of the content and how it was organised) and "key task testing" (how well students could perform tasks we asked them to do) (Krug, 2000, p. 153). For example, one get it item asked students to look at YLOC's home page and tell us their impressions of it, including what they thought YLOC was for. Key tasks were scenario based, such as asking students to imagine that their assignment instructions specified use of good quality information sources and then asking them to find content on YLOC that they thought would be helpful. As students engaged with the tasks, we encouraged them to verbalise their thoughts and feelings, and we asked probe questions.
Closed card sorting was used because we wanted to know how students grouped specific services and content (UX Design Institute, 2022a). We asked students to hand-sort 16 cards into five specified groups. Each card represented a service or content type at Auckland University of Technology (AUT): Assignment instructions & marking criteria, Careers advice, Class timetable, Counselling, Course readings, Course selection advice, Financial support, How to use software, IT support, Learning tasks, Lecturers' slides, Mentoring, Resources that show me how to do my assignments, Searching for information for my assignments, Social activities, and Time management strategies. The five groups corresponded to online spaces at AUT: on Canvas any time, on Canvas when I have an assignment to do, Library website, Student Hub Online, and Other. We asked students to explain their groupings.
Low-fidelity wire framing (UX Design Institute, 2022b) was used as the final activity to help us find out which elements students wanted on an assignment page in an imaginary Canvas course, as well as how they would group those elements. Students were presented with a mockup of a blank Canvas assignment page. To the left were several simple boxes with specific labels that corresponded to elements/functions usually found on an assignment page: Due date, Examples, Instructions, Links, Marking criteria, Originality checker, Submit, Videos, and Word/Time limit. We asked students to design their own assignment page by dragging and dropping the elements they wanted on their page and organising them as they liked. There was also an option for students to create their own elements. We asked students to explain their page designs.
Our department only has authority to design YLOC. We do not have responsibility for some of the content and services included in the card sorting and wire framing activities. For the card sorting, we included a range of options to contextualise the choices students made about content and services for which we do have responsibility. For the wire framing, while we do not have authority to alter the design of LMS assignment pages, we have shared our findings with the team who do.
We transcribed audio recordings of all six sessions and coded the transcripts to identify categories and themes (Charmaz, 2006). Analysis of the visual data drew on visual content analysis to allow us to systematically assess the presence, meanings, and relationships of visual elements (Rose, 2016). The visual data were also matched with corresponding quotes from the transcripts to increase the robustness of the analysis.

Findings and discussion
Students were generally able to complete the usability tasks successfully. Analysis of their interactions with the interviewers also showed students positively appraising the content and how it was organised within pages. However, navigating across different pages was more problematic, and students expected all their course-related content, including our literacy development content, to be in their Canvas courses.
In the usability testing activity, students appreciated the use of tabs and accordions for grouping related content in page elements and making the groupings clear to them. For example, S3 stated how accordions simplified the process of finding specific content: 'I like the drop-down menus. Kind of simplifies categories for you so you know what you're looking for.' This and other comments indicate that the individual page elements themselves are working well. However, organising a large number of related, but separate, literacy development resources into enforced groupings is problematic because students are usually looking for specific content. They are not aware of, or interested in, how such content relates to other content. This was evident when we asked S6 to locate content that might help them with preparing for a presentation: 'I want to look for the presentation. So that's the only thing I look for. I don't know the hierarchy.' This unawareness of how different content items related to each other was also noticeable in S3's confusion over the distinction between two pages when looking at one of YLOC's second-level menu pages, where different information literacy tasks are shown as cards that each link to a content page. Without having yet seen the content on either page, S3 did not know what the difference might be: 'Choose good quality information, but then you're also searching for information. What's the difference between having two different buckets for information?' In a synchronous teaching context, a librarian would be able to explain this distinction between searching for and then quality appraising information, but it was not evident to this student as they looked at a static online resource. Students' confusion over how content related to other content motivated them to suggest added functionality or segmentation of content into different pages. For example, S5 wanted the content on one page to be segmented into separate pages: 'Oh okay oh you just got to scroll down more. Okay maybe it would be useful to make these [faculty accordions] into another page… because I didn't notice that you just keep scrolling down so much.'

In the card sorting activity, Canvas was the preferred location for course-related topics. Students indicated that Canvas is an important online location that they use every day. For example, 'I find I'm usually on Canvas most of the time looking for information. And every time we get announcements, and stuff usually comes through Canvas' (S3). Table 1 shows all topics students deemed course related. Some topics were placed depending on how students conceived of them (e.g., placing 'How to use software' on Canvas if it related to an assignment task). Views about whether some content should be on the Library website or on Canvas differed, yet even those used to using the Library website could see Canvas as a preferred location: 'I know that these services [resources that show me how to do my assignments & searching for information for my assignments] are available in Library, but when I come here for the first time, as a first year student, I would probably go and check on Canvas rather than Library. And, I feel these assignments are mainly related to the course, and I tend to organise it that way' (S2).
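Aggregating closed card sorts of this kind can be done with a simple tally across participants. The Python sketch below is a minimal illustration, not our actual analysis procedure; the two placement records are hypothetical stand-ins, not our students' real sorts, though the card and group names come from the study.

```python
from collections import Counter, defaultdict

# Hypothetical placements: each participant assigns each card to one of
# the five AUT online spaces. These two records are illustrative only.
placements = {
    "S1": {"Class timetable": "on Canvas any time",
           "Careers advice": "Student Hub Online"},
    "S2": {"Class timetable": "on Canvas any time",
           "Careers advice": "Other"},
}

# Tally how often each card landed in each group across participants.
tally = defaultdict(Counter)
for student, cards in placements.items():
    for card, group in cards.items():
        tally[card][group] += 1

# The modal group for each card suggests where students expect to find it.
for card, counts in sorted(tally.items()):
    group, n = counts.most_common(1)[0]
    print(f"{card}: {group} ({n}/{len(placements)})")
```

With all 16 cards and six participants, the same tally reveals both consensus placements and the split views reported above.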
The wire framing activity revealed that students expect a comprehensive set of elements on an assignment page. As in the usability and card sorting activities, there was an indication that students want information integrated into their learning contexts, not elsewhere. As S3 put it, 'if everything was already explained in the assignment tab, I wouldn't need to constantly click and click and download word documents to have those instructions available.' All students selected Instructions first and placed it at the top left corner of the page. Other features were oriented around and in relation to the Instructions. Students unanimously supported the inclusion of Examples, including those provided on YLOC. As S2 explained, 'Examples, videos and links are something that would help me to prepare for it [an assignment].' Others expanded by suggesting that an example on the assignments page 'tells you what you should be aiming for' (S4) or gives 'an idea of how your work will look' (S3). Beyond frequency, patterns emerged regarding the placement and proximity of certain elements to each other. To better understand these relationships, the six screenshots of the students' designs were split into a grid, and the elements in each cell were identified and recorded. This data was then compiled and visualised in a heat map (Figure 1), showing participants' preferred positioning for the Instructions, Marking Criteria, and Examples.

In line with Behrend's (2014) finding that students do not access institutionally provided learning resources out of the context of their own courses, only one of our six students had engaged with YLOC content prior to the UX session, which is consistent with our UX testing in previous years. Moreover, when we asked these students to actually engage with the content, the findings above indicate that they were unaware of how it is organised, and that they were confused or even overwhelmed by the amount of it, which is similar to Selwyn's (2016) finding that navigating LMS courses can be a burden for students. Rather than presupposing that students will naturally want to browse a repository of literacy development resources that we have organised in ways that make sense to us, our findings suggest that we should only aim for a minimally viable organisation of content. Instead, we can focus on connecting students with relevant content as they work on their current assessments, making it accessible to them at time of need (Henderson et al., 2017), rather than leaving it up to them to go looking for it. These observations align with a well-established body of literature that advocates for in-context e-learning resources that combine services from across the university if they meet similar needs (e.g., Russell, 2016). Likewise, our students' preferences align with research that shows embedding literacy development at point of need makes a positive impact on academic performance (Macnaught et al., 2022).
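The grid-based compilation behind the heat map can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the study's actual pipeline: the grid dimensions and the cell coordinates for the Examples element below are hypothetical placeholders, not the recorded data from the six designs.

```python
import numpy as np

# Hypothetical grid coordinates (row, col) recorded for the "Examples"
# element in each of six page designs; cell 2.1 means row 2, column 1.
GRID_ROWS, GRID_COLS = 4, 3
examples_cells = [(2, 1), (2, 1), (2, 1), (2, 1), (1, 2), (1, 3)]

# Compile per-cell frequencies into a matrix that a plotting library
# (e.g., matplotlib's imshow) could render as a heat map.
heat = np.zeros((GRID_ROWS, GRID_COLS), dtype=int)
for row, col in examples_cells:
    heat[row - 1, col - 1] += 1  # 1-indexed cells -> 0-indexed array

print(heat)
```

Repeating this for Instructions and Marking Criteria yields one frequency matrix per element, which can be overlaid or plotted side by side to show placement and proximity patterns.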

Concluding remarks and future research
The findings from our first round of 2023 UX data collection point to the perhaps insurmountable challenge of creating a navigable set of literacy development resources that, while integral to student success, are not of sufficiently clear relevance to students who are often focused on their current assessments. It appears that students want examples and resources that show them how to do their assessments to be easily accessible alongside all their assessment information. However, centralised departments with specialist knowledge like ours work with all staff and students at their universities, so designing tailored resources for all courses is unfeasible. In the design of our content, we are faced with a tension between providing literacy resources to students at point of need and having to organise them into some kind of structure that can be managed sustainably. One option may be to link to a centralised LMS resource repository from specific LMS courses. We, and/or lecturers, could provide hyperlinks from assessment pages in individual courses to subject-specific information and examples of whatever literacies are required in those assessments. For example, if students are expected to synthesise literature, these exact words can be hyperlinked to a discipline-relevant resource that shows students how they can synthesise. Such an approach would have implications for the extent to which content in the centralised repository would have to be segmented in order to minimise the navigating students would need to do after opening a link (i.e., students should be able to access something of specific relevance to their current assessment and discipline without having to sift through our organisation of that content). Our second round of UX in 2023 will focus on testing the reorganisation of some of our literacy development content in YLOC into finer segments and on presenting students with a proposed assignment page design that reflects the elements that our first round of students selected, as well as how they grouped those elements. A subsequent research project could then involve a collaboration with lecturers, learning designers and learning technologists to investigate student use of assignment pages that contain hyperlinks to relevant literacy development resources and how that use impacts academic performance.

Figure 1: Frequency and location of Instructions, Marking Criteria, and Examples

The heat map shows the ideal relationship and proximity of the three elements to each other in the page designs. Four designs placed Marking Criteria and Examples in the same location, at cell 2.1, directly under the Instructions. Marking Criteria and Examples were viewed as equally essential to be close to the Instructions and to each other. As S4 says, 'I would put the marking criteria next to it [the instructions], no actually, I would put the examples, and then the marking criteria because it tells you what you should be aiming for'. The horizontal variation across the top of the map suggests an even stronger connection between these elements, with Marking Criteria and Examples being placed at the same level as the Instructions to indicate a flat hierarchy. What is clear is that when students open an assignment page, they want to see the Instructions first, followed closely by the Marking Criteria and Examples to help them understand what is expected in terms of performance. This understanding of where users expect to find certain elements can inform design decisions to improve the user experience going forward.

Table 1: Student card sorting - content placed in AUT online spaces
Because these are the resources we developed, 'Resources that show me how to do my assignments' and 'Searching for information for my assignments' were of great interest to us. 'Resources that show me how to do my assignments' fitted naturally on Canvas according to several students, such as S1: 'And it [assignment page] might give me some hints as to resources on how to do my assignments. For instance, I have a couple of assignments that are very specific in what they want me to do. It's Do X Y Z using this other thing, but you might have to figure out how to use the something, but it's very specific what I am intended to do, so I feel like that's [Canvas] a good place to have those resources.'