The Red Queen: shifting governance and the restructuring of truth

Always speak the truth, think before you speak, and write it down afterwards.

The Red Queen in Through the Looking-Glass (Carroll/Tenniel 2009)

Like the Red Queen, education is constantly running without getting anywhere.  As the Red Queen herself said, you have to run as fast as possible just to stay still.  Since the emergence of education-as-science around 200 years ago, it has undergone constant reform (Smith 1998) in search of a more scientific way of doing things.  Arguably this started with Ebbinghaus, who in 1885 experimented with reducing learning to small de-contextualised chunks: perfect reductionism (ibid.).  Today learnification and datafication (Biesta 2012, Knox et al. 2020) bear the torch for this ongoing improvement of learning.

Datafication affects not just teaching and learning but the governance of teaching and learning (Williamson 2017: 74).  Data are used to support policy and standards, whether in monitoring performance or in researching the justification for those standards (Williamson 2017: 74, Ozga 2016, Fontaine 2016).  In my first data visualisation for this blog, I noted the number of standards, compared with other material, in my School’s response to the pivot online.  A huge repository of useful information for academics pivoting online was based largely on standards and policies.  It is interesting to note that those standards are themselves based on data collected on best practice in online learning.

As we were to discover, this best practice was shaped by a pervasive narrative in online learning: the social constructivist discourse on Communities of Inquiry (Garrison et al. 2010).  Narratives, once taken to heart, can be quickly reinforced with data.  Our characterisation of data as neutral allows the steady restructuring of new truths from old ones (Anagnostopoulos et al. 2013: 217, Ozga 2016, Fontaine 2016).  This includes dataveillance (Williamson 2017) or the data gaze (Prinsloo 2020): collecting data for ostensibly neutral purposes.  However, data collection priorities often reveal political purposes, subconscious or deliberate (Prinsloo 2020).  A review of recruitment data collected for diversity purposes, for my second visualisation, revealed a Christian bias.  While these data are unlikely to be used to discriminate, they show that only certain data are collected; governance may therefore occur through what is not collected as well as what is.

This restructuring through data is pervasive, undermining old elitist structures and replacing authority through social position with authority through knowledge (Fontaine 2016, Ozga 2016).  But since the privileged tend to have both position and knowledge, this may not reverse society’s hierarchy but could drive an even larger wedge through it.  Colonialism is not a metaphor when privileged groups are able to dispossess others (Prinsloo 2020).  In my final visualisation I was able to see from my browsing history which teaching resources I use the most.  If these data were gathered routinely and used to assess performance against a standard, I could quickly find my practice being radically altered to conform to a decontextualised and simplistic narrative: the Ebbinghaus effect.  As approaches to education and technology change, we are all engaged in a Red Queen race to stand still, but data-driven standards and policies could be a step backwards.

Bibliography

Anagnostopoulos, D., Rutledge, S.A. & Jacobsen, R. 2013a. Introduction: Mapping the Information Infrastructure of Accountability. In, Anagnostopoulos, D., Rutledge, S.A. & Jacobsen, R. (Eds.) The Infrastructure of Accountability: Data use and the transformation of American education.

Anagnostopoulos, D., Rutledge, S.A. & Jacobsen, R. 2013b. Conclusion: The Infrastructure of Accountability: Tensions, Implications and Concluding Thoughts. In, Anagnostopoulos, D., Rutledge, S.A. & Jacobsen, R. (Eds) The Infrastructure of Accountability: Data use and the transformation of American education.

Biesta, G. J. J. (2012). Giving Teaching back to education: responding to the disappearance of the teacher. Phenomenology and Practice, 6(2), 35–49.

Carroll, L., Tenniel, J. (2009). Alice’s Adventures in Wonderland and Through the Looking-Glass. United Kingdom: OUP Oxford.

Fontaine, C. 2016. The Myth of Accountability: How Data (Mis)Use is Reinforcing the Problems of Public Education, Data and Society Working Paper 08.08.2016.

Garrison, D.R., Anderson, T. and Archer, W., 2010. The first decade of the community of inquiry framework: A retrospective. The Internet and Higher Education, 13(1-2), pp. 5-9.

Knox, J., Williamson, B., & Bayne, S. (2020). Machine behaviourism: future visions of ‘learnification’ and ‘datafication’ across humans and digital technologies. Learning, Media and Technology, 45(1), 31–45.

Ozga, J. 2016. Trust in numbers? Digital Education Governance and the inspection process. European Educational Research Journal, 15(1) pp.69-81

Prinsloo, P. 2020. Data frontiers and frontiers of power in (higher) education: a view of/from the Global South. Teaching in Higher Education, 25(4) pp.366-383

Smith, F., 1998. The book of learning and forgetting. Teachers College Press.

Williamson, B. 2017. Digital Education Governance: political analytics, performativity and accountability. Chapter 4 in Big Data in Education: The digital future of learning, policy and practice. Sage.

Data Escape

Browsing data for work computer 18 – 25th March 2021

This week I collected data on my browsing history for the week of 18 – 25th March 2021. Ozga (2016), Fontaine (2016) and Anagnostopoulos et al. (2013: 12) highlight the focus on outcomes, such as performance in tests, rather than inputs. This output-focused data governance is bound to be a little one-sided. You can only act on what you measure, so how do you deal with institutions that fall behind? Often they are punished, but if machine behaviourism is like human behaviourism then punishment will not work (Knox et al. 2020, Skinner 2002). Indeed, the schools which are punished become less likely to help those who need it most (Anagnostopoulos et al. 2013: 217). Informatic power is misused (ibid.).

The purpose of my data collection is to determine whether collecting data on my inputs into teaching and learning would prove more useful in governing educational outcomes. My browsing history could easily be made available to my university, as all the browsing was done on a university machine. I split my analysis between teaching, MSc (Digital Education) study and personal browsing. I counted the number of ‘hits’ on each platform, noting this does not necessarily represent the time spent on each platform. As this was a teaching-heavy week, the load can be seen on LMS platforms such as Moodle and FutureLearn, as well as on Zoom and Mentimeter, which aid my teaching along with business case and simulation platforms. My study time is split between the WordPress sites of each of my courses and Miro, where some of my culture course also happens. I also use Google Photos to capture my data visualisations for this course and artwork for the culture course. And finally, a very small amount of browsing to check emails and LinkedIn.
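The counting step is simple enough to sketch. The snippet below is a minimal illustration, not the method I actually used: the URLs and the `platform_hits` helper are hypothetical, and, as noted above, hit counts are not time spent.

```python
from collections import Counter
from urllib.parse import urlparse

def platform_hits(urls):
    """Count 'hits' per platform (domain) from a list of visited URLs.

    A platform opened fifty times briefly will outrank one used for
    hours in a single session: hits measure visits, not attention.
    """
    counts = Counter()
    for url in urls:
        domain = urlparse(url).netloc.lower()
        # Fold 'www.' variants into one platform name.
        if domain.startswith("www."):
            domain = domain[4:]
        counts[domain] += 1
    return counts

# Hypothetical history entries, standing in for a browser export:
history = [
    "https://moodle.example.ac.uk/course/view.php?id=42",
    "https://www.futurelearn.com/courses/management",
    "https://moodle.example.ac.uk/mod/forum/",
    "https://zoom.us/j/123456789",
]
print(platform_hits(history).most_common())
```

A real analysis would start from the browser's exported history file rather than a hand-typed list, but the grouping logic is the same.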

Should the university collect data like these, they could be misleading and highly variable from week to week. If I choose to read a paper article or a textbook, there are no digital traces. The data are simplistic at best. However, as governing data they do describe inputs which might be compared with outputs, though this would be very likely to lead to feelings of mistrust and dataveillance (Williams et al. 2020, Edwards 2020).

Anagnostopoulos, D., Rutledge, S.A. & Jacobsen, R. 2013a. Introduction: Mapping the Information Infrastructure of Accountability. In, Anagnostopoulos, D., Rutledge, S.A. & Jacobsen, R. (Eds.) The Infrastructure of Accountability: Data use and the transformation of American education.

Anagnostopoulos, D., Rutledge, S.A. & Jacobsen, R. 2013b. Conclusion: The Infrastructure of Accountability: Tensions, Implications and Concluding Thoughts. In, Anagnostopoulos, D., Rutledge, S.A. & Jacobsen, R. (Eds) The Infrastructure of Accountability: Data use and the transformation of American education.

Fontaine, C. 2016. The Myth of Accountability: How Data (Mis)Use is Reinforcing the Problems of Public Education, Data and Society Working Paper 08.08.2016.

Knox, J., Williamson, B. & Bayne, S. 2020. Machine behaviourism: future visions of ‘learnification’ and ‘datafication’ across humans and digital technologies. Learning, Media and Technology, 45(1), pp. 31-45.

Ozga, J. 2016. Trust in numbers? Digital Education Governance and the inspection process. European Educational Research Journal, 15(1) pp.69-81

Skinner, B.F., 2002. Beyond freedom and dignity. Hackett Publishing.

Williamson, B. 2017. Digital Education Governance: political analytics, performativity and accountability. Chapter 4 in Big Data in Education: The digital future of learning, policy and practice. Sage.

Data gaze in Scottish universities

Religious focus in diversity data from Scottish academic recruitment

Prinsloo (2020) demonstrates where the data frontiers exist in South African education and how closely the data gaze follows the old colonial ways. Moved and repelled by the power of the data gaze, I wondered if I could collect data in Scotland that also followed institutionalised prejudice. I gathered data from jobs.ac.uk, a repository of academic job vacancies in the UK. Narrowing the search to Scotland, I could find only five jobs currently advertised. I started the application process to access the diversity data section and recorded the religious and ethnic data they sought. Once the data were gathered, I cancelled the application.

Diversity data are used to record the religious beliefs, ethnicity, gender, nationality, age and sexual orientation of applicants. Some other categories may also be collected. The aim of collecting data on these protected characteristics (which are not shared with decision-makers in the recruitment process) is to assess the diversity within these institutions. Employers are required by law to collect these data, ostensibly for the purposes of supporting diversity and eradicating discrimination. Although these data may appear beneficial, or at least neutral, the Scottish Government does not mandate the specific categories of data collected; these are left to each institution and can reflect the unconscious (or conscious) bias of those who draw them up. We should begin by asking why these specific data are deemed important to these institutions, and why biases are not picked up even though the applications are vetted by numerous individuals before posting.

The visualisation above shows the five universities observed, with circles for different religions. The radius is scaled by the number of denominations that can be selected, so more diverse choices are represented by larger circles. In all cases, only Christians could choose more than one denomination. This suggests an unconscious bias at play, with an acknowledgement of specific sects within Christianity whilst others are simply Muslim or Hindu etc. Every other religion had only one choice. Most of the universities also viewed Christian diversity as simply Protestant or Catholic; these had at most three choices: Christian, Protestant or Roman Catholic. Stirling and Napier simply wanted to know whether Christian applicants were Protestant or Catholic, with no option for simply Christian. This is quite problematic for a country with a history of, and ongoing, sectarian issues. The data here are anything but neutral; they are formed from ongoing (hopefully) unconscious bias. Why would these universities care whether you are Catholic or Protestant but not whether you are Sunni or Shia, or a Shaktist Hindu? I also noticed that two universities had a Scottish ethnic group but no English or Welsh ethnic group; these were presumably simply ‘British’. Again this reflects an uncritical and quite stereotypical view of Scotland’s neighbours.
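The radius scaling can be sketched as below. This is a hypothetical reconstruction rather than the code behind the figure; `circle_radius` and the scale factor are invented for illustration. One caveat of scaling the radius by the count is that perceived size tracks area, so differences look larger than they are; an area-proportional option is included for comparison.

```python
import math

def circle_radius(n_choices, scale=10.0, area_true=False):
    """Radius for a religion's circle, given the number of
    denominational choices an institution offers.

    area_true=False: radius proportional to the count, as in the
    visualisation above (visually exaggerates differences).
    area_true=True:  area proportional to the count instead.
    """
    if area_true:
        return scale * math.sqrt(n_choices)
    return scale * n_choices

# Three Christian denominations vs a single choice for Islam:
print(circle_radius(3), circle_radius(1))   # radius-proportional
print(circle_radius(3, area_true=True))     # area-proportional
```

With radius-proportional scaling, three choices draw a circle nine times the area of a single choice, which is part of why the Christian circles dominate the figure.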

Data gazing

OECD countries: proportion of 25 – 34 year olds with a bachelor’s degree

The OECD Education GPS data set is a trove of data, a sea of data. Really, this is beyond the ‘monstrous’ amount of data used in school inspections noted by Ozga (2016), and if the data speak for themselves, I must be deaf, because I stared for a long time and set up all sorts of visualisations of all sorts of indicators. And it meant very little to me.

This shuffling, sorting and categorising of data reminds me of the data gaze summarised by Prinsloo (2020), where data support a political, social and colonial agenda. Looking at the data from a different angle, though, always obscures another side of the overall structure. So whatever truth you choose, you occlude another. Reading Prinsloo (2020), I feel I will look at religious affiliation in Scotland for my week’s data visualisation. This has fascinated me (though it is macabre really) since I first came to Scotland around 30 years ago. When I left the Navy three years ago and applied for jobs, many applications had an area for stating religious affiliation. It may seem odd in the 21st century, but some, quite a few in fact, had the choice of Protestant or Catholic and nothing else. Sadly, tragically, this division is still important in Scotland, and organisations deem it important enough to collect data on. If the form had been on paper, and not electronic, I would have been very tempted to scribble in ‘Anglican’ just to throw the algorithm out completely. Unpacking this slightly, I noted that this sectarian take on religion falls far short of describing the vast number of divisions in the Christian Church, but not as short as for Islam, which is usually just ‘Islam’. Or perhaps one might be an ‘other’. An ‘other’, then, is not of interest to those who, for some reason (and I can’t see how it can be a good one), need to know how many Muslims, Protestants or Catholics there are in their organisation. Others are OK. But if we don’t need to know how many Coptic Christians there are, then presumably we don’t need to know anything about the religious practices of our workforce. This struck me as a somewhat milder but still disconcerting example of the data gaze.

Even in the OECD dataset there is a purpose concealed within the endless tables of data. Whilst the data may be constructed around progressive and laudable objectives, such as the number of women in tertiary education, the data always seem to form some kind of league table. While we may view this as a kind of transparency, ‘outing’ the countries which are behind the curve on social mobility, for example, the tables are still a means of competing with other countries. The OECD probably has no interest in this competition, but it is as if we can’t help but compile the data this way. As Ozga (2016) points out, school inspection data and governance data such as PISA are constructed this way. Does it really matter if the UK has more women in tertiary education than Mexico? Ozga (2016) points out that education is transitioning from a hierarchical, elitist structure to a more networked and data-driven one. Williamson (2017) also points out the incredible rise of data in supporting vested interests in government and corporate organisations. When I first heard the term ‘data is the new oil’ I thought: when will the first war over data start? But perhaps it is already happening?

Williamson, B. 2017. Digital Education Governance: political analytics, performativity and accountability. Chapter 4 in Big Data in Education: The digital future of learning, policy and practice. Sage.

Ozga, J. 2016. Trust in numbers? Digital Education Governance and the inspection process. European Educational Research Journal, 15(1) pp.69-81

Prinsloo, P. 2020. Data frontiers and frontiers of power in (higher) education: a view of/from the Global South. Teaching in Higher Education, 25(4) pp.366-383

Covidian Governance

Matt Offord

The most intense period of awareness of the need for performativity and accountability I can think of is the recent pivot online in response to the shutting down of traditional face-to-face learning in Higher Education. As an academic in a Business School, there was intense pressure to achieve what would previously have been thought impossible: the complete conversion of all courses to an online format. Williamson (2017: 74) discusses how policy instruments are applied and linked to performativity and accountability, and I recognised immediately the norm of control through managerialism and performance data. The School responded to the crisis by creating an enormous heap of resources, generated and curated by staff (myself included), as an online academy for becoming an online teacher. A year later I went back to gather data from the Moodle page where the framework is hosted. I wanted to see how much operational governance was being applied through data.

In all, 151 individual items form the framework, in the form of documents, infographics, videos, podcasts or links to external resources. This was a hugely impressive piece of work, compiled in just a few weeks by a handful of staff. Inevitably, University, College and School policy formed a part of it. I wanted to discover how many of these items were for collecting data on staff progress in developing these courses. Only four items were for this purpose. However, those items are significant: they are reporting forms to collect data on course build progress. The forms were introduced ostensibly to reduce workload since, technically, academics should have gone through a months-long process to adapt their courses to online delivery. Yet many academics who would not previously have needed to inform the School of their progress preparing for teaching found that they now had to. The other finding is the huge proportion of ‘standards’ that were produced. Given that standards direct effort whereas guides provide optional advice, the framework looks overwhelmingly directive. This was not the intention, but increasingly the School reverted to its customary managerial mode of operation. Standards, although not embedded in the data architecture, are algorithmic in nature (like pre-data algorithms). The response, to me, looks like a panicked attempt to regain control through policies and standards, even while the School is yet to be fully datafied…

End of Teaching Block: Domestication of Data

Artwork by Matt Offord

The so-called Fourth Industrial Revolution has unleashed a tidal wave of data. It did not take long for human society to see this as an opportunity. Data is a thing, and humans have always sought to own and colonise things. However, the act of domestication draws humans into entangled networks of humans and things, and into care for those things (Hodder 2012: 62). Data, like all things, has agency of its own, for instance in the ‘surveillance and structuring of human behaviour and action’ (Williamson 2017: 64). While we seek to domesticate the rising volume of data, it (through algorithms) domesticates us.

Williamson et al. (2020) point out that culture is defined by data and that students are increasingly defined as data sets. In my first data visualisation, I created a tanglegram of technologies I use to teach during the covid crisis. I hoped to catch the behavioural chains and networks which form what Heidegger called an equipmental totality (Hodder 2012: 28). By sketching the network of devices I use to teach and recording how much I used them, I hoped to surface, not raw usage data, but a sense of my dependence on these things. The visualisation also helps to show which aspects of teaching data is not spotlighting (Williamson et al. 2020, Brown 2020, Harrison et al. 2020, Sander 2020). The dependency on things for education could not be starker now, and nor could the dependency of things on us. For example, there is a fevered drive to collect data on students through LMS systems as we try to gauge the engagement of students in online environments. How much data and how many new algorithms will be created, and depend on us, to support this drive?

In my second visualisation, I was inspired by Brown (2020) and learning analytics dashboards to find out how much teachers depended on these dashboards. I happened to be doing a mini-ethnography of a Twitter community, #hybridlearning, for the culture and digital education course. For a week I counted the tweets in this community about dashboards; there were so few that I widened my search to edtech tweets generally. I found the community was all but drowned out by a deluge of edtech promotions, and even a large number of retweets by teaching professionals. This community was in the midst of an apparently very stressful period of hybrid learning in the United States primary education sector. The community has widely embraced edtech as a kind of silver-bullet solution to many issues. There was very little sense that critical thought was being applied to the adoption of new technologies and platforms (Raffaghelli & Stewart 2020, Sander 2020, van Dijck et al. 2018), but this may be understandable in the light of the crisis.

Finally, I looked at how the way I teach distorts time for students (and myself) through teaching in different time zones, and the extent to which I teach in real time or to future students (yet to log on and consume my materials). I found this interesting as it demonstrates how dependencies on things, including data, map a network not just through real and virtual space but also through time. In fact, this time distortion is partly what marked human society’s transition into an entangled socio-material network (Hodder 2012: 83). The transition to agricultural living forced humans to nurture and develop things over time and also to develop more complex social relationships (loc. cit.). Similarly, my dependence on technology and data to teach now reaches into the future, ensuring my extended entanglement. Course evaluation data, student performance data and engagement statistics will all be used to develop future courses, and the turn in performance management based on shallow quantifications will have a significant role to play. As van Dijck et al. (2018) point out, data is transforming the curriculum (see also Williamson et al. 2020). There is also the risk here that pedagogy becomes driven by what we can measure (Brown 2020). This is the domestication of data, and domestication by data, in action.

Bibliography

Brown, M. 2020. Seeing students at scale: how faculty in large lecture courses act upon learning analytics dashboard data. Teaching in Higher Education, 25(4), pp. 384-400.

Hodder, I., 2012. Entangled: An archaeology of the relationships between humans and things.

Harrison, M.J., Davies, C., Bell, H., Goodley, C., Fox, S. & Downing, B. 2020. (Un)teaching the ‘datafied student subject’: perspectives from an education-based masters in an English university. Teaching in Higher Education, 25(4), pp. 401-417.

Raffaghelli, J.E. & Stewart, B. 2020. Centering complexity in ‘educators’ data literacy’ to support future practices in faculty development: a systematic review of the literature. Teaching in Higher Education, 25(4), pp. 435-455.

Sander, I. 2020. What is critical big data literacy and how can it be implemented? Internet Policy Review. 9(2) 

van Dijck, J., Poell, T., & de Waal, M. 2018. Chapter 6: Education, In The Platform Society, Oxford University Press

Williamson, B., 2017. Big data in education: The digital future of learning, policy and practice. Sage.

Williamson, B., Bayne, S. & Shay, S. 2020. The datafication of teaching in Higher Education: critical issues and perspectives. Teaching in Higher Education, 25(4), pp. 351-365.

Time Shifting Teaching

Matt Offord

Collecting data from last week’s teaching for the last blog in this block, I decided to simplify things and create a simple breakdown of my teaching in terms of time slippage between teaching and learning in covidian teaching. I was also partly inspired by my investigation of #hybridlearning (see ‘A dandelion of data’) and its infiltration by edtech, and by reading van Dijck et al. (2018) on the subject of educational platforms. I remembered that I teach one of my courses on FutureLearn, one such platform. I thought the current situation of remote teaching, along with the teaching I do on this platform, skews the times of teaching and learning. I often teach at different times to those at which students learn, a rather odd situation for traditional F2F teaching but not for digital education.

I recorded the daily episodes of asynchronous activity, like discussion forums and recorded lectures, and synchronous activity, like real-time (Zoom) lectures and tutorials plus my daily ‘drop-in’ sessions on Zoom. I also recorded the times I used MS Teams to talk to students, which straddles the synchronicity barrier as real-time chat often happens there. Finally, I recorded the amount of monitoring of students, done by watching their activity on the platform they are using (marked in purple with an ‘M’).

The monitoring is, I suppose, a type of dataveillance (I am not sure). The course I am teaching currently involves a complex business game. Students play the game on a simulation platform, so most days I monitor their decisions and, yes, their performance (Williamson et al. 2020, Harrison et al. 2020). However, these data concern the game only, and are required to prepare for tutorials. Still, it does show how education is increasingly located on platforms and how data-soaked these platforms actually are (van Dijck et al. 2018).

The data demonstrate the extent to which teaching is currently asynchronous. But this is skewed by the recorded lectures, which are entirely the activity of building a FutureLearn course. Campus courses were originally intended to be asynchronous, but student dialogue in Semester One led to a greater tolerance of synchronous sessions. These courses have very little video material, as video tends to be a passive form of learning. In summary, the teaching is mostly balanced between real-time and flexible learning, but purely digital teaching tends to push the asynchronous approach ahead.

van Dijck, J., Poell, T., & de Waal, M. 2018. Chapter 6: Education, In The Platform Society, Oxford University Press

Harrison, M.J., Davies, C., Bell, H., Goodley, C., Fox, S. & Downing, B. 2020. (Un)teaching the ‘datafied student subject’: perspectives from an education-based masters in an English university. Teaching in Higher Education, 25(4), pp. 401-417.

Williamson, B., Bayne, S. & Shay, S. 2020. The datafication of teaching in Higher Education: critical issues and perspectives. Teaching in Higher Education, 25(4), pp. 351-365.

A dandelion of data

Matt Offord

I was conducting a mini-ethnography of a Twitter teaching community for Education and Digital Culture and decided to add to my study the data aspect of the teaching discussed by this community. The group is called #hybridlearning and consists largely of US-based K12 teachers engaged in hybrid learning (in this interpretation, teaching online and F2F simultaneously). I noticed a very significant uptake and profiling of edtech solutions to the very difficult task of teaching in two modes at once. Very few tweets discussed operational solutions such as the use of learning stations to segregate activity. There was, instead, an earnest reaching for simple solutions based on technology.

I divided the tweets into posts about learning analytics dashboards (described by Brown 2020), edtech evangelised by individuals, and sponsored edtech promotions. I also spotted a lone AI post purporting to assist in the remote learning environment. I chose to hand-draw (albeit on software) a dandelion where my colour key was reflected in the seeds of the flower, emulating a natural pie chart. I also drew another image in soft pastels.
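The tally-to-seeds step behind a ‘natural pie chart’ can be sketched as follows. The category counts here are hypothetical placeholders (the real tallies are in the figure), and `seed_allocation` is an invented helper that assumes a fixed number of seeds on the flower head.

```python
def seed_allocation(category_counts, total_seeds=40):
    """Allocate dandelion seeds to tweet categories in proportion to
    their counts, so the seed head works as a natural pie chart.
    (Rounding means the allocated total can drift by a seed or two.)"""
    total = sum(category_counts.values())
    return {cat: round(total_seeds * n / total)
            for cat, n in category_counts.items()}

# Hypothetical tallies for the week, not the real data:
tweets = {
    "dashboards": 2,
    "edtech (individuals)": 8,
    "edtech (sponsored)": 25,
    "AI": 1,
    "community discussion": 4,
}
print(seed_allocation(tweets))
```

Hand-drawing the seeds from counts like these keeps the proportions honest while losing the sterile look of a software-generated pie chart.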

Dandelion data visualisation

The data show the #hybridlearning hashtag has clearly triggered a number of edtech sales algorithms. Many solutions proudly promote datafication (Williamson et al. 2020), although dashboards (Brown 2020) are fairly low profile, whereas one-stop shops or even online schools are quite prominent (van Dijck et al. 2018). The level of broadcasting by edtech promoters really drowned out the community voice, as can be seen by the amount of red on the dandelion pie chart.

Student Performance Dashboard

Matt Offord

I took sample data set 2 and imagined these were data I had extracted from Moodle on the Principles of Management course that I teach. I imported the data set into Google Data Studio, which I found quite counter-intuitive to use. Like Excel, it decides what your graph will look like and then you have to change it. I find this quite frustrating, as I use R for all my data analysis, which requires the user to imagine the graph and program R to produce it, so you only get the data and parameters you choose. Ultimately, however, I decided R does not produce the sort of graphics which look like a ‘dashboard’, so I used Excel (I thought it was simpler than Data Studio) to create these graphics. I think I could get used to Data Studio eventually.

I based my data dashboard on the sorts of questions I am usually asked in terms of student and course performance (I realise this cuts both ways). So initially I compiled attendance data as a pseudo ‘engagement’ measurement (it measures presence, which is, rightly or wrongly, conflated with engagement). I also thought it would be natural to assess performance via the grades, which I averaged across the four tests.

I also created a pie chart to identify the readers: those who had accessed the most PDFs (which doesn’t mean they read them). This graphic is not very user-friendly; in R (and probably Data Studio) you can add the labels to the slices. But I noted the variation here was quite small. I then compared things like attendance, posting and VLE activity to see if there was any correlation with performance. In this data set, only attendance is correlated with higher performance.
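The attendance-performance check amounts to a Pearson correlation. A minimal sketch, with invented numbers rather than the actual sample data set:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists, e.g.
    sessions attended vs mean grade across the four tests."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    varx = sum((x - mx) ** 2 for x in xs)
    vary = sum((y - my) ** 2 for y in ys)
    return cov / (varx * vary) ** 0.5

# Hypothetical extract for five students (not the real data):
attendance = [10, 8, 9, 4, 6]
grades     = [72, 65, 70, 48, 55]
print(round(pearson(attendance, grades), 2))
```

Even where r is high, of course, correlation says nothing about direction: students may perform well because they attend, or attend because they are doing well.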

A Tanglegram of Teaching

Matt Offord

Humans, things and information are dependent on one another, connected through chains of behaviour (Hodder 2012: 54). Data is therefore just one part of a larger assemblage (Brown 2020, Williamson et al. 2020). Treating data as separate and valuable without context renders it flat and lifeless, just as surely as a fish dies when removed from the ocean.

This week I wished to capture my data architecture (Williamson et al. 2020) and bring it to life as Heidegger’s equipmental totality (Hodder 2012: 28). The items on my desk are the architecture of my teaching rather than data specifically. In my teaching, I am not exposed to Learning Analytics Dashboards as described by Brown (2020), and although data on my students slops around the Moodle data lake, I rarely go to the shore, preferring to depend on my interactions. During Covid-19, these items are how I teach (of course, they are connected to a wider network of internet pages and servers). I listed the items: phone, webcam, laptop, headphones and tablet. I collected two forms of data: the hours spent on each piece of equipment and how dependent I am on it for teaching.

Dataveillance by Matt Offord 2021

The thingometer is the gauge to the side of each object, while the coloured marks show how many hours each is used per day. The laptop, it seems, is crucial for teaching, especially when linked with the webcam and headphones for live teaching. Neither the phone nor the tablet is indispensable, but they provide redundancy for the others. The laptop (with its entangled person, me) can wrangle all the data, qualitative and quantitative, and deliver all the teaching necessary. Or to put it another way, it conducts more dataveillance than the other things (Williamson et al. 2020, Brown 2020).

Brown, M. 2020. Seeing students at scale: how faculty in large lecture courses act upon learning analytics dashboard data. Teaching in Higher Education, 25(4), pp. 384-400.

Hodder, I., 2012. Entangled: An archaeology of the relationships between humans and things.

Williamson, B., Bayne, S. & Shay, S. 2020. The datafication of teaching in Higher Education: critical issues and perspectives. Teaching in Higher Education, 25(4), pp. 351-365.