In my professional life, one of the things I often do before demoing a platform product is create a dashboard that highlights the value the software delivers – for example, time saved through automation, the number of cases created and closed, or an increase in clicks or actions on something. As a sales tool, the dashboard shows the ‘art of the possible’. To the paying stakeholder, it shows what they’re paying for. To the user, it should demonstrate that the platform can both create visually appealing reports and provide opportunities for further investigation, such as digging into a case gone wrong. The types of reports I can create are limited by the available data points. The last sales trick I’ll mention is that you always want to show a wide range of reports to capture the audience’s eye and leave them with something visually pleasing – the software should look good.
When creating the above dashboard, I had a few initial reactions:
- I found it much harder to create dashboards in Excel than in the platforms I’ve used.
- I immediately started looking at individual data points. I didn’t pause to reflect on the names of the students, i.e. the fact that they were students became surprisingly irrelevant to me very quickly.
- If I had more time on my hands, I would have preferred to do this in another tool because the above is not visually appealing. The sales engineer in me would not be proud to show this to a client.
Working in technology, I have found several of the readings negative and deeply distrustful of my day job. During this exercise, I became intimately aware that I took on the task with my ‘work hat’ on: I went into design mode with the goal of using the dashboard to tell a story about the class and demonstrate the value of the platform providing the data. Going back through my notes from the readings, this really made me pause and consider the teacher’s perspective. Teachers should be frustrated by this disconnect – by not being involved in the process. Engineers and data scientists should be continuously reminded of the students, and of what would actually help the teacher, throughout the design and build process.
One thing I had failed to do by putting my ‘work hat’ on was to consider what the students were being tracked against – in other words, what learning, if any, had been achieved. In taking a step back and considering the story I would tell, I realized that, for example, the completion data point in isolation doesn’t actually mean much. As a teacher, I would want to understand more about the following:
- ‘What’ did the student complete?
- Does completion mean learning, or just ‘I completed it’?
- How quickly did the student complete ‘it’? How does that compare to the average completion time and rate?
Just as the reports that I can create are limited by the data available, so would be the value of the dashboards for the teachers. Those questions cannot be answered through a dashboard without the right data points, collected at the right time.
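Comparisons like these are straightforward to compute once the right data exists. A minimal sketch of the kind of derived metrics the questions above call for – the records, field names (`completed`, `minutes`), and values here are hypothetical examples, not data from the actual dashboard:

```python
# Hypothetical per-student completion records (illustrative only).
from statistics import mean

records = [
    {"student": "A", "completed": True,  "minutes": 34},
    {"student": "B", "completed": True,  "minutes": 58},
    {"student": "C", "completed": False, "minutes": 12},
]

# Class-level completion rate: share of students who finished.
completion_rate = sum(r["completed"] for r in records) / len(records)

# Average time to complete, among students who actually finished.
avg_minutes = mean(r["minutes"] for r in records if r["completed"])

# Per-student comparison against the class average.
for r in records:
    if r["completed"]:
        delta = r["minutes"] - avg_minutes
        print(f"{r['student']}: {delta:+.1f} min vs class average")

print(f"completion rate: {completion_rate:.0%}")
```

Even this tiny example shows the dependency the paragraph above describes: without the time-spent data point being collected at completion, the ‘how quickly, compared to peers’ question simply cannot be answered in a dashboard.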
On a higher level, I also considered whether these data points – this datafication of the students – actually provide a foundation for personalisation. I think both the technologist and the teacher would argue ‘no, they don’t’ – much more data and context are needed. Left as is, the teacher would likely have little use for the dashboard beyond reporting final grades and providing data to administrators.
The idea that a teacher would be a “dashboard controller” (van Dijck et al., 2018, p. 123) is interesting to me – is the teacher not already a kind of “classroom (or even student) controller”? Many of the data points included in the dashboard are ones the teacher would have, or would need to collect anyway, such as the grade on each test.
Thinking about value, it also stood out to me that this dashboard is significantly simpler than the one a teacher would have at the AltSchool described by van Dijck, Poell and de Waal (2018). This dashboard contains no detailed, minute-by-minute data on student activity, engagement, performance, or behavior. Many would, however, question the point of tracking all that data in the first place.
At the end, that’s what it comes down to – the data.
From a teaching perspective: What data is actually needed? Why is it useful? How do we map it to learning (both for the students, and for teacher method improvement)?
From a technology perspective: How do we track difficult data points like emotional intelligence, or empathy? What are the data privacy and security concerns for each data point? Does the student have the right to be forgotten?
van Dijck, J., Poell, T., & de Waal, M. (2018). Chapter 6: Education. In The Platform Society. Oxford University Press.
One reply on “Student Performance Dashboards”
I wanted to do the female/male analysis of this data but found myself unsure how to label either! How did you classify the students as M/F? By name assumption?