My aim in this block was to experiment with my own learning, improve my understanding of how and why I behave in relation to my learning practices, and from these insights modify my behaviour accordingly. Reflecting on the last three weeks, my initial thinking was influenced by the data-driven recommender and responsive systems that already drive some of my day-to-day activities, such as Netflix, Amazon, and Withings. I was looking for actionable recommendations and indicators that stemmed from my learning practice rather than my behaviour within that practice.
The enduring myth and pursuit of personalised education and the dreams of dialogue (Friesen, 2020) are heavily entwined in the rhetoric of technology and education. New technology promises personalised education at mass scale: equal access, democratic student-centred instruction, enhanced student agency, and the adaptation of learning to a student’s unique goals, interests and competencies. What stood out to me is that these systems promise adaptive processes when they are in fact merely responsive (Bulger, 2016): the content itself does not change. By collecting my own data and creating my own data visualisation and analysis processes, I could move beyond the limited decision-tree structures that typify these responsive recommender systems and towards adaptive learning processes that measure behaviour to inspire questions, interaction, and choice.
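A minimal sketch of this distinction, with invented module names and thresholds (not drawn from Bulger or any real platform): a responsive system is a fixed decision tree, so the same input always branches to the same pre-authored content, while an adaptive process updates its model of the learner from observed behaviour.

```python
def responsive_recommend(score: float) -> str:
    """'Responsive' recommender: a fixed decision tree.
    The branching logic and the content never change; only which
    pre-authored branch a learner sees."""
    if score < 0.5:
        return "remedial_module"
    elif score < 0.8:
        return "core_module"
    return "extension_module"


class AdaptiveRecommender:
    """'Adaptive' process: the model of the learner shifts with each
    observed behaviour, so the pathway itself can change over time.
    Threshold and learning rate here are illustrative values only."""

    def __init__(self, threshold: float = 0.5, rate: float = 0.1):
        self.threshold = threshold
        self.rate = rate

    def observe(self, score: float) -> None:
        # Move the threshold toward the learner's running performance.
        self.threshold += self.rate * (score - self.threshold)

    def recommend(self, score: float) -> str:
        return "core_module" if score >= self.threshold else "remedial_module"
```

The contrast is the point: the first function encodes the limited decision-tree structure described above, while the second measures behaviour and changes its own decision boundary in response.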
Equally, I became increasingly aware of the inherent tensions within the promise of adaptive processes. Digital innovation is painted as a complex paradox: it promises emancipation and empowerment while furthering a pervasive governance culture and exerting ‘algorithmic control’ over education (Tsai et al., 2020). Knox et al. (2020) depict data-driven technologies as reintroducing behaviourist theories of control that reduce student agency: by tracking and predicting behaviours, emotions and actions, they design choice architectures that nudge a learner in a particular direction. The resulting homogenisation of thought and action stifles innovation and individuality, and it is this neo-liberal, market-based individualism that underpins the notion that problematic behaviours can be modified individually, rather than addressing the socio-economic and political structures that shape them. With the advancement of wearable biometric devices and the potential capabilities of emotional AI, there is an unprecedented opportunity for psychological surveillance and behaviour modification.
Tracking some of my more personal data over the last few weeks, and using brainwaves as part of my last visualisation, I realise I am increasingly uncomfortable with this pervasive governance culture and with the prospect of being ‘nudged’ in a particular direction. Institutional trauma and distrust are barriers often raised in my work in adult literacy, and with this kind of technological advancement and behaviourist design theory, these barriers risk becoming insurmountable. This is compounded by the idea that the ‘rhetoric of transparency may privilege seeing over understanding’ (Tsai et al., 2020, p.558). After I collected my first data set, I was overwhelmed by the amount of data in front of me: how could or should I use it, and was it even relevant or necessary? So how can we empower individuals to make informed decisions and give informed consent when our own understanding of the system design and the algorithms embedded in these systems is so limited?
Bulger, M. (2016). Personalized Learning: The Conversations We’re Not Having. Data & Society working paper. Available: https://datasociety.net/pubs/ecl/PersonalizedLearning_primer_2016.pdf
Friesen, N. (2020). The Technological Imaginary in Education: Myth and Enlightenment in ‘Personalized Learning’. In Stocchetti, M. (Ed.), The Digital Age and Its Discontents: Critical Reflections in Education (pp. 141-160). Helsinki University Press. doi:10.2307/j.ctv16c9hdw.12
Knox, J., Williamson, B. & Bayne, S. (2020). Machine behaviourism: future visions of ‘learnification’ and ‘datafication’ across humans and digital technologies. Learning, Media and Technology, 45:1, 31-45. DOI: 10.1080/17439884.2019.1623251
Tsai, Y-S., Perrotta, C. & Gašević, D. (2020). Empowering learners with personalised learning approaches? Agency, equity and transparency in the context of learning analytics. Assessment & Evaluation in Higher Education, 45:4, 554-567. DOI: 10.1080/02602938.2019.1676396