Block 3 Summary

As I was wrapping up block 2, I briefly touched on the issue of performance management by university management. In weeks 9 and 10, I therefore decided to track my own digital footprints in my capacity as instructional designer / pedagogical assistant: I tracked how long I took to reply to each of my work emails, and how often I accessed Moodle. As Williamson (2017, p. 75) notes, there has been a move towards increasing measurement of the performance and productivity of educational institutions; in turn, increased tracking also causes individuals or institutions to “change their practice to ensure the best possible measures of performance”.
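To make this self-tracking concrete, below is a minimal Python sketch of the kind of calculation I did. It assumes a hypothetical, hand-filled CSV log (email_log.csv with received_at and replied_at columns); this is not any mail client's actual export format.

```python
import csv
from datetime import datetime

FMT = "%Y-%m-%d %H:%M"

# Hypothetical self-kept log, one row per email I replied to.
with open("email_log.csv", newline="") as f:
    rows = list(csv.DictReader(f))  # expected columns: received_at, replied_at

# Reply latency in hours for each email.
latencies = [
    (datetime.strptime(r["replied_at"], FMT)
     - datetime.strptime(r["received_at"], FMT)).total_seconds() / 3600
    for r in rows
]

print(f"Emails tracked: {len(latencies)}")
print(f"Mean reply time: {sum(latencies) / len(latencies):.1f} hours")
```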

As I mentioned in week 10, with LMS access being tracked constantly by default, institutions could very well track LMS access for all their teaching staff and use such metrics as key performance indicators for teachers’ engagement with online teaching. Increasingly, the annual performance review process for university staff involves presenting summarised data as evidence of performance (e.g. how many courses I converted to blended delivery this year; how many articles I published in peer-reviewed journals this year), which has a tangible impact on continued employment or promotion.
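Purely as an illustration of how crude such a KPI would be, here is a sketch that reduces an imagined Moodle log export (moodle_log.csv with a user column; not Moodle's actual log schema) to a single access count per teacher:

```python
import csv
from collections import Counter

# Hypothetical export: one row per logged event, with a "user" column.
accesses = Counter()
with open("moodle_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        accesses[row["user"]] += 1

# The "KPI": one number per teacher, stripped of all pedagogical context.
for user, count in accesses.most_common():
    print(f"{user}: {count} accesses")
```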

Anagnostopoulos et al. (2013, p. 14) described an information infrastructure consisting of quantification, standardisation and classification processes. Within a test-based accountability regime, these processes transform raw information into numbers and data, and then into performance metrics, ratings, rankings and so on. In these processes we see standardised measures being devised, and people and phenomena being fitted into categories within a classification system. Such processes feed into the establishment of national standards that can in turn be used to hold schools accountable for their effectiveness and productivity. Anagnostopoulos et al. (2013, pp. 15-16) also problematised how these processes occur without public scrutiny, generating metrics that are mere simplifications of the complexities of teaching, learning and the institutions themselves, and often failing to address deeper questions.
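A toy sketch may help to show how mechanical these three processes can be. The scores, cut-offs and labels below are all invented for illustration, not drawn from any real accountability system:

```python
from statistics import mean, stdev

raw_scores = [62, 74, 81, 55, 90, 68]  # quantification: phenomena become numbers

mu, sigma = mean(raw_scores), stdev(raw_scores)

def classify(score):
    """Standardise a raw score, then fit it into a performance category."""
    z = (score - mu) / sigma           # standardisation: a common scale
    if z >= 1.0:
        return "exceeds standard"      # classification: invented cut-offs and labels
    if z >= -1.0:
        return "meets standard"
    return "below standard"

for s in raw_scores:
    print(s, "->", classify(s))
```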

In week 11, I tracked my own behaviour while watching an instructional video. From this exercise, I reflected on how video hosting platforms (e.g. Panopto) often come with analytics functions “out of the box”, allowing users’ consumption of videos to be tracked. I can see learning analytics being problematised in a similar manner to test-based accountability above. Learning analytics, or more broadly the datafication of education, microdissects students’ experiences and behaviours into data points (Williamson, Bayne & Shay, 2020), which Raffaghelli & Stewart (2020) argued draws a boundary around what can be measured (“knowable unknowns”) while throwing out what cannot be measured (“unknown unknowns”).
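As an illustration of the kind of “out of the box” metric I mean, here is a sketch that reduces invented viewing events (not Panopto's actual data model or API) to a single “percentage watched” figure per viewer:

```python
VIDEO_LENGTH = 600  # seconds; an invented ten-minute video

# viewer -> (start, end) segments watched, in seconds (invented data)
events = {
    "viewer_a": [(0, 300), (250, 450)],  # rewatched an overlapping stretch
    "viewer_b": [(0, 600)],
    "viewer_c": [(580, 600)],            # skipped straight to the end
}

for viewer, segments in events.items():
    watched = set()
    for start, end in segments:
        watched.update(range(start, end))  # overlapping seconds counted once
    pct = 100 * len(watched) / VIDEO_LENGTH
    print(f"{viewer}: {pct:.0f}% watched")
```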

Given the uneven adoption of learning analytics, we might still be far away from reaching an institutional (or even departmental) framework for learning analytics. However, we already see learning analytics being used to inform institutional decision-making and performance tracking. Just as test-based accountability can prompt school practices like “teaching to the test” (Anagnostopoulos et al., 2013, p. 14), I would argue that a learning analytics framework, or any potential learning analytics-based accountability, could prompt “teaching to the analytics” or even “learning to the analytics” practices among teachers and students.

On a side note, learning analytics can also be used to justify educational development initiatives in institutions. At a university I previously worked at, learning analytics were taken as measures of the attractiveness and/or effectiveness of digital learning resources. Such parameters were used as key performance indicators for the educational development initiatives themselves.

(Word count: 567)

References

Anagnostopoulos, D., Rutledge, S.A. & Jacobsen, R. (2013). Introduction: Mapping the Information Infrastructure of Accountability. In D. Anagnostopoulos, S.A. Rutledge & R. Jacobsen (Eds.), The Infrastructure of Accountability: Data use and the transformation of American education. Harvard Education Press.

Raffaghelli, J.E. & Stewart, B. (2020). Centering complexity in ‘educators’ data literacy’ to support future practices in faculty development: a systematic review of the literature. Teaching in Higher Education, 25(4), 435-455.

Williamson, B. (2017). Digital education governance: political analytics, performativity and accountability. In Big Data in Education: The digital future of learning, policy and practice (Chapter 4). Sage.

Williamson, B., Bayne, S. & Shay, S. (2020). The datafication of teaching in higher education: critical issues and perspectives. Teaching in Higher Education, 25(4), 351-365.
