Block 2: Week 6 Visualisation

What can you see?

Since I collect data on myself, and can also access data on those who use my online resources, I was interested in contrasting student activity that might be made apparent to a teacher, possibly through a dashboard, with activity that could not.

It would be easy to contrast the superficial ‘click data’ recorded by an online environment with the deeper, emotional activity of the student, so I limited my recording to simple behavioural activity in order to compare like with like.

Methodology

For five days I recorded all my activity (Eynon’s (2015) ‘what’) on this course, and where it took place.

I simplified the level of detail to keep recording manageable alongside other commitments, making it more likely that all such activity was actually recorded.

Results and Analysis

I grouped all activity by day, then by environment, then by whether or not it could likely be detected and used (for instance, by a dashboard).
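
Purely as an illustration of this grouping (not the tool I used), here is a minimal Python sketch, where the example records and field names (day, environment, dashboard_visible, activity) are hypothetical and do not come from my actual log:

```python
from collections import defaultdict

# Hypothetical activity log: each record notes the day, the environment it
# happened in, and whether a dashboard could plausibly detect it.
activity_log = [
    {"day": "Mon", "environment": "VLE",   "dashboard_visible": True,  "activity": "read discussion post"},
    {"day": "Mon", "environment": "paper", "dashboard_visible": False, "activity": "sketch visualisation ideas"},
    {"day": "Tue", "environment": "blog",  "dashboard_visible": True,  "activity": "draft weekly post"},
    {"day": "Tue", "environment": "walk",  "dashboard_visible": False, "activity": "think through analogy"},
]

# Group by day, then environment, then detectability.
grouped = defaultdict(lambda: defaultdict(lambda: defaultdict(list)))
for record in activity_log:
    grouped[record["day"]][record["environment"]][record["dashboard_visible"]].append(record["activity"])

# Show how little of each day's activity the 'visible' branch actually holds.
for day, environments in grouped.items():
    for environment, visibility in environments.items():
        for visible, activities in visibility.items():
            label = "visible to a dashboard" if visible else "hidden from a dashboard"
            print(f"{day} / {environment} ({label}): {activities}")
```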

My visualisation, though simplified (as much to aid its creation as anything), does not attempt to make reading it too simple.

What can you see?

More of a plant or fungus is underground, and that is also where the connections between what we think of as individual plants are usually most apparent. Likewise, a dashboard (overground) emphasises the individual and suggests that it makes the activity of the student visible; but even much basic activity remains hidden from it, including the connections between activities, between students, and between a student’s activity and the wider environment.

“Trees, roots and light” by Joshua Rappeneker is licensed under CC BY-SA 2.0

How does this relate to teaching?

  • This does not relate to any specific ‘dashboard’ but rather focuses on what data might commonly be recorded by an online environment (a VLE, blog platform, etc.) and could be used to profile a student or (along with other students’ data) predict their outcomes, whether for the student or their teacher.
  • That dashboards potentially make representations of, and judgements on, students from little data, which may be only proxies for learning, could lead to calls for ever greater data collection, or a push towards more activity being done in the zone where data can be collected (Brown, 2020).
  • Alternatively, if the means by which the dashboard arrives at its pronouncements is known, this could help staff and students understand how to judge its usefulness. However, if the data is used to regulate the system automatically, such knowledge will have no effect (Williamson et al., 2020).
  • Again, I am aware of and in control of this data gathering; students may not be in the same situation. Teachers may be in a position where they (to some degree) control the gathering and whether or not students see their own data. They then have to consider the effect that data gathering itself has on their students, what effect revealing the data to students will have on them, but also, if students learn that their teacher withheld access, what impact that might have (Williamson et al., 2020).
  • The visualisation lacks context, but the student data known to an HEI is also partial and potentially erroneous (for instance, students are not obliged to reveal all information about themselves, and may take the opportunity to retain their privacy, especially if they think it would prejudice the way they are perceived or treated). This is a problem if staff act as if the data (or a dashboard) is complete and accurate.
  • Although the data has been collected as discrete items, this does not make the items unrelated (see the bio-system analogy). As with previous visualisations, displaying data as separate points can be used to bring ‘clarity’ to a visualisation, making it easier to read, but in fact introduces confusion, as it transforms and hence changes the data.

References

Brown, M., 2020. Seeing students at scale: how faculty in large lecture courses act upon learning analytics dashboard data. Teaching in Higher Education, 25(4), pp. 384-400.

Eynon, R., 2015. The quantified self for learning: critical questions for education. Learning, Media and Technology, 40(4), pp. 407-411. DOI: 10.1080/17439884.2015.1100797

Williamson, B., Bayne, S. and Shay, S., 2020. The datafication of teaching in Higher Education: critical issues and perspectives. Teaching in Higher Education, 25(4), pp. 351-365.

4 Replies to “Block 2: Week 6 Visualisation”

  1. I liked the visualisation very much. I admit I had trouble reading the above- vs under-ground activities; however, the analogy you have used is very true, especially for our topics in this block. How much data is being displayed in a dashboard without the proper details to support or improve any decisions taken using only the “above ground” data?

    1. Thanks, Dima. I’m getting to like this analogy more as it also speaks to the ‘privilege of the visual’ we spoke about in my last course too.

  2. I really admire the way you have deliberately attempted to capture complexity, and to reveal those aspects of your own studying that would likely not be made visible by a dashboard. This speaks powerfully to how data and its representations make certain things visible, intelligible, ‘valuable’ and actionable, while leaving other things invisible and seemingly un-valued. Thinking with your analogy further, there is something powerful about your idea of entangled roots too. Increasingly, ‘big data’ are generated from sourcing and connecting diverse or heterogeneous datasets in data lakes, repositories, databanks etc. (the roots, usually invisible to users), which might then be selected for analysis in the trees and branches of analytics software and made visible to users as the buds and leaves that eventually emerge from it. I wonder what thinking with this analogy makes you reflect on in relation to teachers’ agency over data? Are ‘data-driven teachers’ really ‘empowered’ to find insights into their own practices, or given partial access to data that have already been collected, analyzed and visualized on their behalf (the leaves), with little way to interrogate or apprehend the structure required to produce them? Does that matter if so? Does it create narrow apertures of professional perception?

    1. Thank you for the encouragement.
      I was struck by how (in Brown, 2020) even if teachers might question what is behind a dashboard (where is the data from? what is the algorithm?), if the results accord with their beliefs, they use it anyway!
      This (albeit a small study) is a worrying example of how persuasive such a system might be, because we don’t know whether teachers might become increasingly reliant on dashboards, as drivers have with satnavs (https://www.nature.com/news/technology-use-or-lose-our-navigation-skills-1.19632), or at least use them where they have less first-hand knowledge of students (e.g. large classes, part-time students), which could affect a considerable number.
      You would hope that a teacher does not expect all students, or different cohorts, to perform in the same way; that might lead them to expect that the dashboard has been built on data from similar contexts, which may not be the case, leading to inappropriate judgements.
      Without knowledge of exactly how the dashboard arrives at its answers, it’s impossible for a teacher to truly work with it.
