Week 10 Drawing

Moodle access tracking in week 10

This week I chose to visualise how often I accessed my university’s Moodle for work. This data is obtained through manually counting my access to Moodle from Google Chrome history. Each stroke in the drawing represents one instance of access to Moodle recorded by Google Chrome. The count is exhaustive, as I use Chrome as my sole web browser.
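As an aside, this kind of count could also be done programmatically rather than by hand: Chrome keeps its history in a SQLite file named “History” inside the browser profile folder. Below is a minimal sketch, assuming the standard `urls`/`visits` schema and a placeholder “moodle” URL fragment (not the actual method used for the drawing); you would work on a copy of the file, since Chrome locks the live one.

```python
import sqlite3
from datetime import datetime, timedelta

# Chrome's "History" SQLite file stores timestamps in the `visits` table
# as microseconds since 1601-01-01 (the Windows/WebKit epoch).
CHROME_EPOCH = datetime(1601, 1, 1)

def count_visits(db_path, url_fragment, start, end):
    """Count recorded visits whose URL contains `url_fragment`
    between the datetimes `start` and `end` (exclusive of Chrome-epoch quirks)."""
    lo = int((start - CHROME_EPOCH) / timedelta(microseconds=1))
    hi = int((end - CHROME_EPOCH) / timedelta(microseconds=1))
    con = sqlite3.connect(db_path)
    try:
        (n,) = con.execute(
            """SELECT COUNT(*)
                 FROM visits JOIN urls ON visits.url = urls.id
                WHERE urls.url LIKE ?
                  AND visits.visit_time BETWEEN ? AND ?""",
            (f"%{url_fragment}%", lo, hi),
        ).fetchone()
    finally:
        con.close()
    return n

# Hypothetical usage (path and dates are placeholders):
# n = count_visits("History-copy", "moodle",
#                  datetime(2020, 3, 16), datetime(2020, 3, 23))
```

Each row in `visits` corresponds to one recorded access, which maps directly onto one stroke of the drawing.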

In an institute with any form of blended learning initiative or strategy, a learning management system (LMS) tends to become the official space for disseminating learning resources.

For teachers, as more learning takes place in the LMS, there may be an expectation that they invest a certain level of effort there, such as putting together activities, moderating forums, and answering questions. Such data visualisation can be utilised to monitor teachers’ engagement in the LMS. With a population-level analysis, teachers who spend little to no time in the LMS could be flagged by their institute as being resistant to teaching innovation or even disengaged from teaching duties. This could have very real implications for performance management, or be used as evidence when considering promotions or renewals.

Likewise, students can be monitored in a similar manner. Students who go to the LMS and watch lectures with their study group (rather than on their own accounts) are likely not picked up by such data analytics, and may get wrongly labelled as “disengaged students”. Such labelling could have longer-term impact on the students’ welfare (e.g. special considerations for assessments, moderation of assessment marks etc.).

2 thoughts on “Week 10 Drawing”

  1. ‘This data is obtained through manually counting my access to Moodle from Google Chrome history.’

    Might there have been a difference if you had manually logged this every time you opened up Moodle? I suppose it might have been more prone to error, but you might also have been less concerned with precise times?

    ‘Such data visualisation can be utilised to monitor teachers’ engagement in the LMS. With a population-level analysis, teachers who spend little to no time in the LMS could be flagged by their institute as being resistant to teaching innovation or even disengaged from teaching duties.’

    Interesting, and given the many pressures to ‘adopt’ more technologies, I could imagine a lot of support for this kind of approach. I do wonder how easily it might be gamed though – but then again, if you are able and bothered to ‘fake’ the data, why wouldn’t you just be using Moodle more anyway?

    ‘This could have very real implications for performance management, or be used as evidence when considering promotions or renewals.’

    And that sounds like a worrying prospect, particularly if such a narrow measure of engagement were to be used. One problem here might be that, because such a measure would see high engagement as ‘ok’, teachers who spend lots of time in Moodle, perhaps struggling to understand how to use it, wouldn’t be flagged as needing support.

    • Thank you for your feedback!

      “Might there have been a difference if you had manually logged this every time you opened up Moodle? I suppose it might have been more prone to error, but you might also have been less concerned with precise times?”

      EC: I think if I had manually logged it every time, I would have been more aware – over the past few weeks I have definitely become aware of my bias in choosing data collection techniques: I tend to favour methods that don’t involve logging things myself, thinking that the data would be more “accurate” when it is collected in a less invasive manner (or at least without me knowing). And yes, the visualisation is definitely not to scale – it would have been closer to scale if I had chosen a sharper pen and a bigger piece of paper!

      “Interesting, and given the many pressures to ‘adopt’ more technologies, I could imagine a lot of support for this kind of approach. I do wonder how easily it might be gamed though – but then again, if you are able and bothered to ‘fake’ the data, why wouldn’t you just be using Moodle more anyway?”

      EC: I definitely saw that sort of discourse take place continually among educational developers at my last university (myself included) – labelling some academics as more “innovative” or “good with technology” and others as “resistant”, “old-school”, “traditional”… it leads to assumptions and stereotyping about different types of teachers and what sort of support they need.

      Another observation comes through the “teaching development grant” (TDG) application process, which favours e-learning development rather heavily. Often the application also requires applicants to describe mechanisms by which they can measure the impact and effectiveness of the teaching development project to be funded… that often points back to data and analytics.
