As the authors of ‘Dear Data’ noted, we are ‘living in the age of “Big Data”, where algorithms and computation are seen as the new keys to universal questions, and where a myriad of applications can detect, aggregate, and visualize our data for us to help us become these efficient super-humans’ (Lupi and Posavec, 2016, p.11). The central role of datafication in different spheres of life, including education, is indeed difficult to deny. However, the ‘dataist trust’ (Williamson, 2017, p.40) in numbers and their potential to revolutionize education should not be taken for granted. My involvement in every stage of data collection, visualization and interpretation over the past nine weeks enabled me to look ‘under the hood’ of the massive data-producing machine and uncover some tensions between the promises of big data and their practical realities.
The idea of hand-drawn data presentation was brilliant in that it allowed me to take on the roles of the different actors usually involved in generating digital visualizations. As Williamson argues, ‘the visualization of the world … is a complex sociotechnical act involving a variety of actors and technologies with the persuasive power to shape people’s engagement and interaction with the world itself’ (Williamson, 2017, p.37). By drawing my data, I minimised the influence of code, algorithms, the people who write them, and digital media on my data. As a result, my visualizations became the materialization of my own data-related choices, my understanding of education, my creativity and my drawing skills. This suggests that even in this exercise, data visualization was still ‘no neutral accomplishment’, as Kitchin et al. put it (cited in Williamson, 2017, p.36). What is essential, though, is that it was I who decided which aspects of behaviour were important to capture when conceiving of teaching and learning and which could stay unknown, which data about myself I felt comfortable sharing with the public and what I considered private. In the online world, those choices are often taken away from us.
When thinking of how I could present my numbers, I found it quite challenging to move beyond conventional forms of data visualization, such as charts and graphs. I had taken it for granted that PowerPoint-style visualizations are strongly associated with objectivity, clarity and universality. However, as I further discovered, ‘visualization acts as a way of simplifying and reducing the complexity of the interaction of variables to graphical and diagrammatic form’ (Williamson, 2017, p.35), which means that digital tools impose constraints and may distort data to make them fit a pattern. Trying to be creative, I also experimented with Lego to visualize my data. The limitations of this medium, however, quickly became obvious: much like software tools, the baseboard and the blocks of standardized size and shape forced me to compromise on a number of variables and meanings. Interestingly, my last data visualization looks more traditional than the rest. Perhaps, being more relaxed by the end of the semester, I yielded to the mere exposure effect and subconsciously reproduced something familiar and trustworthy.
One more thing that influenced my visualization choices was collecting data through self-reporting. It took me some effort to free myself from social desirability bias when logging my data. This made me think of teachers who, in their new role of ‘data collectors’ (Williamson, 2017, p.82), are sometimes forced to use gaming strategies to avoid the negative consequences of governing with data (Fontaine, 2016). Moreover, within my data collection practices, I discovered that there were aspects I was unable to capture, such as quickly changing meanings in a conversation, my feelings, or daydreaming. Perhaps technology, through semantic and sentiment analysis, would have done better here. What I did with those tricky parameters was simply to leave them out, demonstrating the dangerous ‘count what can be counted’ tendency often found in learning analytics.
Working on my weekly tasks, I tried to think metaphorically. According to Lakoff and Johnson (2003), ‘the essence of metaphor is understanding and experiencing one kind of thing in terms of another’ (p.5). Visualizing distractions as a cardiogram, or looking at teaching through my parental duties and a game of darts, helped me better grasp the complex nature of educational processes and contrast it with ‘the enduringly partial nature of whole datasets’ (Tsai et al., 2020, p.556) intended to reflect them.
The most important takeaway from my data visualization work, from the perspective of a learner, was that education-related data have very little to say about learning per se. The analytics we can afford today serve only as ‘indicators’ of students’ behaviour, limited in explanatory power, whilst the cognitive, social and emotional processes that constitute learning are too challenging to capture and are thus left uncovered. Describing datafication in education, Tsai et al. (2020) mention the ‘phenomenon of information asymmetry’ (p.562) between data collectors and students caused by power imbalance, which turns the latter into ‘prized products, from which valuable behaviours can be extracted and consumed by ever-improving algorithmic systems’ (Knox, Williamson and Bayne, 2020, p.35). This raises numerous questions about students’ privacy, agency and equity, the very values that education should promote.
The teaching block revealed that the usefulness of learning analytics (LA) for educators is often overstated. As Brown’s (2020) article demonstrates, the use of LA tools in the classroom is often imposed on teachers and can undermine pedagogical strategies while bringing little value in return. Most importantly, students’ performance data are becoming a source of teacher appraisal whose negative consequences educators find difficult to push back against. As a solution, Sander (2020) suggests developing a critical understanding of digital data practices. While this competency will certainly help teachers become more informed citizens, whether it can help them resist the ‘side effects’ of governing with data remains an open question.
In block three, I tracked performance indicators of the kind that are becoming the main source for digital governance. Through the choices I had to make, it became obvious that quantitative, decontextualized, ‘thin descriptions’ (Ozga, 2016, p.70) infuse the current policies that facilitate decision-making and impact educators. This transforms how we understand good teaching and which school practices we prioritize. Perhaps my greatest discovery was realizing how new ‘fast policies’ (Williamson, 2017, p.67) exacerbate the very problems they were designed to eradicate (Anagnostopoulos et al., 2013).
Overall, my nine weeks of data work and the recommended literature proved particularly useful for deepening my understanding of the complex processes of data collection, visualization and their further use. As a result, I have developed a much more critical approach to datafication in education, which will guide my learning, teaching and managerial practices.
Anagnostopoulos, D., Rutledge, S.A. and Jacobsen, R. (2013) Conclusion: The Infrastructure of Accountability: Tensions, Implications and Concluding Thoughts. In Anagnostopoulos, D., Rutledge, S.A. and Jacobsen, R. (eds) The Infrastructure of Accountability: Data Use and the Transformation of American Education. Cambridge, MA: Harvard Education Press, pp. 213-228.
Brown, M. (2020) Seeing students at scale: how faculty in large lecture courses act upon learning analytics dashboard data. Teaching in Higher Education, 25(4), pp. 384-400. DOI: 10.1080/13562517.2019.1698540
Fontaine, C. (2016) The Myth of Accountability: How Data (Mis)Use is Reinforcing the Problems of Public Education, Data and Society Working Paper 08.08.2016
Knox, J., Williamson, B. and Bayne, S. (2020) Machine behaviourism: future visions of ‘learnification’ and ‘datafication’ across humans and digital technologies. Learning, Media and Technology, 45(1), pp. 31-45. DOI: 10.1080/17439884.2019.1623251
Lakoff, G. and Johnson, M. (2003) Metaphors We Live By, with a new afterword. Chicago, IL: University of Chicago Press.
Lupi, G. and Posavec, S. (2016) Dear Data. London: Particular Books.
Ozga, J. (2016) Trust in numbers? Digital Education Governance and the inspection process. European Educational Research Journal, 15(1), pp. 69-81.
Sander, I. (2020) What is critical big data literacy and how can it be implemented? Internet Policy Review, 9(2). DOI: 10.14763/2020.2.1479
Tsai, Y-S., Perrotta, C. and Gašević, D. (2020) Empowering learners with personalised learning approaches? Agency, equity and transparency in the context of learning analytics. Assessment and Evaluation in Higher Education, 45(4), pp. 554-567. DOI: 10.1080/02602938.2019.1676396
Williamson, B. (2017) Conceptualizing digital data: Data mining, analytics and imaginaries. Chapter 2 in Williamson, B. Big Data in Education: The Digital Future of Learning, Policy and Practice. London: SAGE Publications, pp. 25-48.
Williamson, B. (2017) Digital Education Governance: political analytics, performativity and accountability. Chapter 4 in Williamson, B. Big Data in Education: The Digital Future of Learning, Policy and Practice. London: SAGE Publications, pp. 65-96.