Teaching with data: Wrap-up

Photo credit: The author of this blog, Impressions and Ideas, Newport-on-Tay, 2021.

My data visualisations on teaching with data represented (1) reading ‘off-piste’, (2) conversations, and (3) self-censorship. In the comments, Jeremy Knox usefully crystallised reading off-piste as productive deviation. Deviating from the set reading list can be productive and enjoyable – but it’s also risky, inefficient, and stressful. Risky: you need to make a judgement call about what to read and another about when to stop – sometimes you get this right; sometimes you don’t. Inefficient: when you don’t get the judgement call right, time is wasted; reading off-piste while also doing the set readings means you fall behind schedule with writing, and although you learn a lot, there’s inevitably some material you read that you don’t use in your writing. Stressful: catching up with the visualisations and writing is stressful.

Still, I’m inclined to think it’s worth it, overall. Though inefficient and stressful to read on top of everything else in the short term (from week to week), parts of Wu’s and Veliz’s books have been helpful over the longer term. Data dashboards and software purporting to track student engagement don’t show teachers this kind of reading-off-piste ‘behaviour’. So they show a teacher neither a student’s risk-taking, short-term inefficiency and stress, nor the longer-term benefits of reading this way; a lot is happening, but dashboards don’t register it. In contrast, a short conversation between a teacher and student might easily elicit this kind of information – and do so without requiring the gathering of data on, for example, stress levels, or mood, or ‘grit’, using the socio-emotional learning technologies discussed by Knox et al [2020: 40-41]. Overly focusing on what the dashboard suggests not only risks teachers’ seeing students in limited ways [Brown, 2020; Williamson et al, 2020: 358]; it risks teachers’ failing to see students in relevant ways too (as productive deviants, as risk-takers, and so on).

The ‘conversations’ visualisation plays with an observation that Harrison et al [2019] make about conversation as a methodology. My discussion pushed beyond that paper, suggesting that while a dashboard, for example, may help a teacher know that a conversation is happening, without further detail or context it’s going to be hard for a teacher to interpret, especially since conversations are often subtle and ambiguous. Jeremy plausibly suggested that this is why analytics systems seem to focus on the easier task of analysing behaviour rather than language (and, perhaps, I think, communication more generally, including visual communication such as the data visualisations themselves). I’m inclined to think this starts to get closer to understanding the limitations of analytics systems in education where, traditionally at least, both learning and teaching have been conversation-driven.

Furthermore, we might observe that conversations often have dynamics that are not immune to wider socio-political power dynamics [Austin, 1962; Langton, 2009]. In the context of education, this might surface in situations where, for example, students sometimes dismiss or harass each other (and/or their teachers) verbally, or where racial and gendered slurs are the norm. In addition, as anyone who has grown up in a highly authoritarian socio-political world will tell you, what one might wish to communicate in conversation is not only sometimes deliberately self-censored; even when not self-censored, what is communicated often outstrips what is (literally) said [Grice, 1975]. In some contexts, what we choose not to say gives others more of a clue to what we think and feel than what we literally say – you just need to be inculcated into the relevant conversational norms to pick up on it [Grice, 1975; Lewis, 1979]. (Poets and playwrights often use this last method of communicating what they think and feel too.)

The upshot of all this is that teachers cannot let their professional judgement simply dance to the tune of analytics systems that glibly infer (as many seem to) that verbal silence (silent ‘behaviour’?), for example, straightforwardly indicates a lack of communication and, therefore, a lack of engagement. Emphasising this point reveals a further way to reinforce Gourlay’s [2020: 5] concern that ‘interaction’ and ‘participation’ have come to stand as a proxy for learning itself. Learning may well require communication, but not all communication is explicit and verbal (or typed on a forum, for example); such communication won’t be picked up by analytics systems. That doesn’t mean it’s not relevant, and even significant, to learning, and to teachers trying to interpret what’s going on in their classrooms.


I’m still deeply puzzled by the definition of critical big data literacy given by Sander [2020], not only for the workload-related reasons mentioned here but also because the focus on data literacy seems to sideline the significance of the knowledge and professional judgement teachers build up through qualification and experience. Do I really need extensive training to be data literate to understand students when I’ve already got training and qualifications to teach, a pile of experience, a sprinkling of intuition and emotional intelligence, and good old-fashioned professional judgement?

Anyway, I’ve run out of time, and that makes it seem fitting to end with this. Cheers.
