The missing link in learning analytics: A discussion on using assessment data for student-facing dashboards

Learning analytics and staff-facing dashboards enable educators to gain insights into student learning, enhancing teaching practices and learning outcomes. Student-facing dashboards (SFDs) can also benefit students, empowering them to better interpret their results and promoting self-reflection aimed at improving their academic performance. There is currently no preferred software solution for implementing dashboards; however, the authors' reflections on creating SFDs from assessment and ePortfolio asset data show that this can be achieved with free or readily available tools. This concise paper reflects on two initiatives through which the first author created SFDs, and presents opportunities to refine the dashboards further.


Introduction
This paper presents learning analytics (LA) data obtained from assessments and ePortfolios and its use in improving student learning outcomes via a student-facing dashboard (SFD). It includes personal reflections on the trial of these initiatives by the first author.
Existing research focuses on the success and engagement of SFDs (Pokhrel et al., 2021), the frameworks SFDs can be based on (Teasley, 2021) and which components might best be displayed (De Barba et al., 2022). Google Scholar searches yielded a limited number of papers detailing the implementation and software of a dashboard, including Quincey et al. (2019). Of those, most refer to custom or specific software solutions (Clarke et al., 2021) which are inaccessible beyond their own context. This paper addresses this research gap by presenting two initiatives using readily available data and software. These initiatives are expected to guide practitioners to implement SFDs with LA at a fraction of the resources and time of a university-wide implementation. Some aspects still require manual processing, as appropriate software has not yet been identified, and future research could develop the process further. In the following sections, we provide background information on LA and SFDs, present two initiatives related to implementing SFDs, and conclude with a discussion based on our reflections.

Background
Dashboards are built upon LA data, and we therefore begin by defining LA. The Society for Research in Learning Analytics (SoLAR) defines LA as "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs" (SoLAR, n.d.). This could include improving the understanding of student performance and engagement, identifying improvement areas for teaching and learning, and opportunities to provide feedback to students. It can in turn lead to improved learning outcomes by making well-informed, data-driven decisions (Mangaroska & Giannakos, 2019). Consequently, most LA systems are designed from the perspectives of educators and institutions.
However, most LMS LA tools are limited and there is no preferred external or enterprise solution (Dawson et al., 2018), while some institutions build proprietary systems (Clarke & Tuffley, 2021). In addition, 75% of information systems/software implementations fail because they are never completed or the systems are not utilised (Beynon-Davies, 2013). Moreover, if LA systems are too hard to use and the data are difficult to understand, the implementation is likely to fail (Wei et al., 2019). These are important lessons that we considered in relation to our initiatives. Whilst LA dashboards have been used in higher education for more than a decade (Muntean, 2010), most research has been carried out on staff-facing dashboards. The focus of this paper is therefore on SFDs.
Dashboards present a visual representation of data that assists understanding of complex information and can be student- and/or staff-facing. Schwendimann et al. (2017) have suggested using the term "learning dashboard" for SFDs and proposed the definition: "A learning dashboard is a single display that aggregates different indicators about learner(s), learning process(es) and/or learning context(s) into one or multiple visualizations" (p. 37).
SFDs have the potential to assist students in monitoring and managing their learning outcomes by providing objective data on their achievement and progress (Molenaar et al., 2019). As with digital literacy skills, students may need to be taught how to use dashboards, building their dashboard literacy skills to maximise the dashboards' utility. These dashboards can then serve as intervention tools aiming to instruct, support or motivate students (Pokhrel et al., 2021).
However, research on SFDs has had mixed results. Some aspects of dashboards have had a positive impact on student motivation while others have had a negative impact (Corrin & de Barba, 2015). As mentioned in Teasley (2017), a one-size-fits-all approach may not work, and it may be that "producing personalised dashboards [that] may mediate the possible negative effects of feedback" (p. 377).
The literature is not clear on which components make up a successful SFD. De Barba et al. (2022) explored the usability of an SFD that focuses on "bring[ing] their attention back to the course and provide insights into their current efforts" (p. 2), with pie charts evaluated as the best approach for "performance goal for quizzes, assignments and exams", noting that "students prioritise looking for information that will help them evaluate how they are progressing towards their goals" (De Barba et al., 2022, p. 2).

Initiatives to employ LA data in student-facing dashboards
In the following section, two initiatives describe the process used to transition from LA data to SFDs. One initiative uses assessment data (readily available for most subjects) and the other is based on ePortfolio asset data (relevant to only some subjects).

Initiative 1: Using assessment results to create student-facing dashboards
In the first initiative, our main goal was to create an SFD based on assessment results, including written feedback. As Tsai et al. (2021) note, timeliness is one of the most important elements of effective feedback. Our first step was therefore to reduce the time it took to provide feedback to students following their assessment. To this end, an SFD was created to present the assessment results, with the aim of providing a visual display of student performance against the assessment criteria alongside written feedback from educators.
The assessment results were recorded using digital survey software, including each student's scores for every criterion and feedback from educators on the student's performance. If critical errors occurred, mandatory feedback was provided. While the results were being finalised, a dashboard was created with visualisations to represent them (Figure 1).
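To illustrate the shape of such data, a survey export typically arrives in a wide format (one column per criterion) and needs reshaping into a long format before charting. The following Python sketch is purely illustrative; all column names and scores are hypothetical, and the authors performed this step in Excel rather than code.

```python
import pandas as pd

# Hypothetical wide-format export from the survey software:
# one row per student, one column per assessment criterion.
raw = pd.DataFrame({
    "student_id": ["S001", "S002"],
    "criterion_communication": [4, 3],
    "criterion_safety": [5, 4],
    "feedback": ["Clear handover.", "Review dosage calculations."],
})

# Melt the criterion columns into (student, criterion, score) rows:
# the long shape that pivot tables and charts are built from.
long = raw.melt(
    id_vars=["student_id", "feedback"],
    var_name="criterion",
    value_name="score",
)
long["criterion"] = long["criterion"].str.replace("criterion_", "", regex=False)
```

Each row of `long` now pairs one student with one criterion score, which is the granularity a per-student chart filter needs.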
The data architecture began with collecting the assessment results, which were then provided to the dashboard designer for review. The educator and designer discussed the dashboard format, and several iterations were created before deciding on the final version. The data were then loaded into Microsoft Excel and transformed to match the dashboard design. Images of the dashboard were generated with individual student data and distributed to the students for review.
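As one illustration of this pipeline, the per-student image generation could in principle be scripted. The sketch below is a hedged Python alternative to the Excel-based process described here, not what the authors actually ran; the data, function name and file names are all hypothetical.

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display required
import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical long-format assessment results.
results = pd.DataFrame({
    "student_id": ["S001", "S001", "S002", "S002"],
    "criterion": ["communication", "safety"] * 2,
    "score": [4, 5, 3, 4],
})

def export_student_dashboards(df, out_dir="."):
    """Save one dashboard image per student and return the file paths."""
    paths = []
    for student, rows in df.groupby("student_id"):
        fig, ax = plt.subplots()
        ax.bar(rows["criterion"], rows["score"])
        ax.set_ylim(0, 5)
        ax.set_ylabel("Score")
        ax.set_title(f"Assessment results: {student}")
        path = f"{out_dir}/{student}.png"
        fig.savefig(path)
        plt.close(fig)  # free the figure so a long class list does not exhaust memory
        paths.append(path)
    return paths
```

The resulting image files could then be attached to individual emails or LMS records, replacing the manual screenshot-and-send steps.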
The dashboard was developed using Microsoft Excel. Pivot tables were created to isolate the most important sections of the data, and the charts function was used to create visualisations based on them. Additional pivot tables were used to include the educators' feedback for the students. The pivot tables were then linked using the slicer function, which allows the educator to select an individual student so that all connected charts display only that student's results. The remaining question was how to get the dashboard back to the students. Due to privacy concerns and the lack of a system to manage the dashboard correctly, an image of the results was generated for each student and emailed to them individually. This required the open-source software ShareX to capture the images and a script in AutoHotkey to automate the capture for the list of students.

Initiative 2: Using ePortfolio asset data to create student-facing dashboards
In the second initiative, our main goal was to summarise ePortfolio asset data for students via an SFD. Students uploaded evidence of their learning with self-assessment rubrics via ePortfolio software. Educators graded these asset submissions against rubrics and, at the end of the year, a student grade was generated across all the assets. However, the ePortfolio software lacked the functionality to aggregate the data, and students were only able to view their individual submissions during the subject. The aim was therefore to utilise the ePortfolio data and create a dashboard with simple visualisations. The data architecture is almost identical to the previous initiative; however, the dashboard format used a series of pie charts instead.
Leveraging prior expertise in developing dashboards, we initiated a pilot project to create an SFD that displayed summaries of students' ePortfolio asset data. With a bit of trial and error, we collated these data into a dashboard in Excel using a similar process to Initiative 1. Representing the data with a series of pie chart visualisations seemed the best approach (Figure 2). Each row represents a specific domain being assessed. The first column represents the student's self-assessment, the second column represents the educator's feedback, and the third column is the class aggregate, which comprises every assessment made by the educators. By comparing the first and second columns, students can gauge how realistic their self-assessment is, and across the rows they can compare their self-assessments, an educator's feedback and the class aggregate over a series of domains. We specifically chose the class aggregate, rather than the highest-performing students, to avoid demotivating students. The results were broken into multiple "Rounds" (approximately two-month intervals) so students could see how they progressed over time. The remaining issue was how to get these data to the students while maintaining privacy; we are still working on this for a future iteration. There are alternative ways of sending the information to students; for now, we settled on screenshots of the dashboard, which were uploaded to the students' assessment results on the LMS. Although there has been no formal research on these initiatives, anecdotally, students were positive about receiving such information in an SFD format. In the future, we aim to research the effectiveness of SFDs.
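The three dashboard columns described above (self-assessment, educator feedback, class aggregate) could be computed along these lines. This is an illustrative Python sketch with invented domain names and scores, standing in for the spreadsheet formulas actually used.

```python
import pandas as pd

# Hypothetical ePortfolio ratings: for each (student, domain), a
# self-assessment score and an educator score on the same rubric scale.
ratings = pd.DataFrame({
    "student_id": ["S001", "S001", "S002", "S002"],
    "domain": ["professionalism", "teamwork"] * 2,
    "self_score": [3, 4, 2, 3],
    "educator_score": [3, 3, 3, 4],
})

# Class aggregate: mean educator score across all students, per domain.
class_agg = (
    ratings.groupby("domain", as_index=False)["educator_score"]
    .mean()
    .rename(columns={"educator_score": "class_aggregate"})
)

# One row per (student, domain) holding all three dashboard values:
# self-assessment, educator feedback, and the class aggregate.
dashboard = ratings.merge(class_agg, on="domain")
```

Each resulting row can feed the three charts for one domain of one student's dashboard, with the class aggregate identical across students within a domain.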

Discussion
The most important component of our dashboard design was presenting aggregate data to students, encouraging them to reflect on their results for future cases. This information was previously not accessible in a summary format and was thus a key outcome. It assists students to "visualise their key strengths and areas for development, and record and monitor actions on the basis of feedback information" (Winstone, 2019, p. 225). The second most important component was ensuring the dashboard was easily interpreted by students, which influenced our design decision to prioritise pie charts. This aligns with the research conducted by De Barba et al. (2022).

Our implementation also demonstrated the importance of planning, as Initiative 1 was designed in parallel with the assessment running due to time pressures. This caused inaccuracies, resulting in scope creep and a need to recreate documents to be fit for purpose. Subsequent initiatives had well-defined outcomes, plans and timelines, which reduced the time required by 75%. Initiative 2's subsequent "Rounds" experienced even greater time savings.

For Initiative 2, due to time constraints, students' needs assessments were not taken into consideration. However, the simplicity of the dashboard made it easily understood. In future projects, we aim to review student feedback to track how useful the dashboard was for students and conduct focus groups to identify areas for improvement. In addition, because of the multiple "Rounds", students have several opportunities to be supported and to manage their learning, in line with the observations of Molenaar et al. (2019) and Pokhrel et al. (2021). Since Initiative 2 launched, it has been broadly adopted by internal educators.
The base LMS functionality and external software did not meet our needs for hosting the dashboards, consistent with the discussion in Dawson et al. (2018). To provide students access to the dashboards, we developed a process of manually adding image files to student assessment grades via the LMS. With appropriate software, this process could be automated. We are currently investigating a variety of options, including a plug-in to the LMS, separate dashboard software, and an all-in-one assessment solution.

Conclusion
The initiatives were implemented using readily available learning analytics data (assessment and ePortfolio) and software (Microsoft Excel, ShareX, AutoHotkey), with some manual processes such as sharing images of the dashboard. These steps are all achievable, and SFDs could be implemented in most institutions without a large investment of time or resources. With each implementation, we have observed a reduction in the time and resources required to create the dashboards, making the process more efficient.

Current dashboard iterations are largely based on educators' or designers' perspectives of what a student needs, which may be why dashboards are not used more regularly by students. Further research in this area could delve into what students want and how they use dashboards. Individual students may seek different information from their dashboard, and dashboards may require functionality that allows students to modify the layout to suit their needs.

Further investigations will be geared towards obtaining student perceptions of the dashboard, establishing best practices and seeking an all-in-one software solution. In the short term, the focus will be on enabling educators by providing guides. In the medium term, the focus will be on obtaining student feedback on the dashboard and establishing best practice and processes; this research will need to evaluate what students' needs are, how students use the dashboard and their preferences for how the information is presented. In the long term, the focus will be on using a single software solution to manage the entire process, from obtaining data to sharing dashboards.

Figure 1: The dashboard breaking down assessment results

Figure 2. Dashboard displaying ePortfolio asset data for a student.