9 Weeks of Data Visualisations

The course requirement of selecting, recording, visualising and reflecting on discrete data points every week for nine weeks was a steep learning curve, full of differentiated and interactive learning experiences. Thinking about what data to capture and how to represent it was a “learning with data” approach in itself. Lupi and Posavec’s ‘Dear Data’ project was an eye-opener for hand-recorded data visualisations, but it also set a high bar in terms of what data is available and how to generate interesting and creative visualisations while in a pandemic lockdown. The exercises made me appreciate data more and recognise the considerations that go into data collection and visualisation, beyond what I was familiar with. In the first half of this blog, I reflect on the data capturing and visualisation learning experience; in the second half, I focus on how data visualisation helped me comprehend the course objectives.


Data Visualization Exercise, Findings and Reflections

Each week I adopted a process that focused first on planning the data set to be collected and its likely linkage to the theme of that week/block. The presentation took a few iterations, but drawing the visualisation and reflecting on it became the most creative and interesting part of all. Alongside my process of plan, define, collect, represent and reflect, here are some of the findings from the weekly visualisations.

  • Scope definition: I started each week with a question in mind for the data collection, and at certain times the data took me in other directions. It is important to keep the objectives in mind, but it is equally important to look at the data with a fresh eye and to adjust the scope as needed.

“First, the purpose of learning analytics is not tracking. Second, learning analytics does not stop at data collection and analysis but rather aims at ‘understanding and optimizing learning and learning environments’. Instead, there is a clear focus on improving learning.” (Selwyn & Gašević 2020)

An example was Week 4, “My Teaching Roles”. I started the week with general data about what I do on a daily basis, then shifted towards the “teaching” category of my roles, and was able to reflect on the data visualisations not only from a role perspective but also in terms of what it means to be monitored as a teacher.

“Teachers, too, are increasingly known, evaluated and judged through data, and come to know themselves as datafied teacher subjects.” (Williamson et al. 2020)

  • Iterative process: in many cases I added or changed data attributes during the collection process, either because the data lacked the depth to allow a better visualisation or to improve the messaging on educational themes. Going back to the drawing board kept things interesting, but it was only feasible because the visualisations were hand-drawn; from a data systems point of view, an iterative process would not be as easy or flexible.

“Data and metrics do not just reflect what they are designed to measure, but actively loop back into action that can change the very thing that was measured in the first place.” (Williamson et al. 2020)

  • Data reliability: being the sole producer/owner of the data made me believe that the transparency and openness conditions for producing authentic learning data were addressed (Tsai et al. 2020). However, I noticed that it was extremely hard to be fully inclusive of all data while capturing and tracking data accurately and without bias. How reliable is the data being presented each week? That is a tough question to answer, and I reflected on it in more detail in the Learning with Data blog.

“There is likely to be a kind of trade-off between the reliability of the data that can be collected and the richness of what can be measured.” (Eynon 2015)

  • Learning from others: the most fascinating part was looking into others’ data visualisations and reflections. In many cases we collected the same data points, e.g. drinking coffee, distractions, spaces, study material, etc.; however, the vast differences in approach, depth and artwork were insightful and demonstrated how similar data can be visualised from many different perspectives. It is a real testament that data is not just data, but holds personal preferences/biases, environments, locations and many other external factors that shape a data point such as the number of coffee cups a day.

“Data do not exist independently of the ideas, instruments, practices, contexts and knowledges used to generate, process and analyse them.” (Kitchin 2014)


Learning, Teaching and Governing with Data 

Although every week/block had a specific theme and readings, in many instances I found that the same data sets could be used to interpret and tackle multiple themes. This became more apparent during the Teaching and Governing with Data blocks. At the end of the nine weeks, I can easily say that the three themes are interlinked and interdependent, and focusing on one without understanding the implications for the other two would affect how we approach data in the educational sector.

Looking at Week 7’s data visualisation, A Week of Communication, it could be replicated for all three themes. From a learning with data perspective, the data can be used to explore how a student understands his/her learning communications and to determine effective methods to prioritise and manage learning objectives. From a teaching with data perspective, the same data can generate an understanding of which means of communication are effective and how students respond to each method, and can inform the choice of the right communication method for each student. Finally, from a governing with data angle, the data can be used to govern learning and teaching communication platforms and to set policies on learning environments and communication methods.

Each block presented a set of questions about how data is defined, produced and analysed from an educational perspective, in an attempt to understand how current data-driven technologies and systems/platforms are impacting overall educational governance, including teaching and learning. The analysis and interpretation of data can serve different objectives and motivations, not necessarily pedagogical ones, especially when considering predictive and AI-based modelling of educational data.

“Machines are tasked with learning, attention needs to be given to the ways learning itself is theorised, modelled, encoded, and exchanged between students and progressively more ‘intelligent’, ‘affective’, and interventionist educational technologies.” (Knox et al. 2020)

There are benefits to be gained from the “datafication of Higher Education” when analysing educational data and deriving insightful knowledge/information. However, some questions persist: What instruments are being used? What are the design principles? What educational expertise and knowledge were used to design and build these technologies? What are the underpinning infrastructures? And who are the actors?

These questions are too comprehensive to analyse further in this blog, so I will conclude with the following from Williamson et al. (2020):

“Datafication brings the risk of pedagogic reductionism as only that learning that can be datafied is considered valuable […] There is a clear risk here that pedagogy may be reshaped to ensure it ‘fits’ on the digital platforms that are required to generate the data demanded to assess students’ ongoing learning.”

Here is a video of all my data visualisations.

References

  • Eynon, R. (2015) The quantified self for learning: critical questions for education, Learning, Media and Technology, 40:4, 407-411. DOI: 10.1080/17439884.2015.1100797
  • Kitchin, R. (2014) The Data Revolution: Big Data, Open Data, Data Infrastructures & Their Consequences. Sage.
  • Knox, J., Williamson, B. & Bayne, S. (2020) Machine behaviourism: future visions of ‘learnification’ and ‘datafication’ across humans and digital technologies, Learning, Media and Technology, 45:1, 31-45. DOI: 10.1080/17439884.2019.1623251
  • Lupi, G. & Posavec, S. (2018) Dear Data. Flow Press Media.
  • Selwyn, N. & Gašević, D. (2020) The datafication of higher education: discussing the promises and problems, Teaching in Higher Education, 25:4, 527-540. DOI: 10.1080/13562517.2019.1689388
  • Tsai, Y-S., Perrotta, C. & Gašević, D. (2020) Empowering learners with personalised learning approaches? Agency, equity and transparency in the context of learning analytics, Assessment & Evaluation in Higher Education, 45:4, 554-567. DOI: 10.1080/02602938.2019.1676396
  • Williamson, B., Bayne, S. & Shay, S. (2020) The datafication of teaching in Higher Education: critical issues and perspectives, Teaching in Higher Education, 25:4, 351-365. DOI: 10.1080/13562517.2020.1748811

Teaching with Data – Block Blog

For this block’s data visualisation exercises, it was harder to collect data from a “teaching with data” perspective. My aim was to approach the block from three angles: the role of the teacher, student performance measurement through dashboards, and learning platforms.

For the first visualisation, my focus was to reflect on one of the questions from the block’s overview: “What are the implications of increased data tracking on the role of teachers in educational institutions?” It was difficult to answer this question from the “My Teaching Roles” visualisation alone; however, what I learnt from the data collected is that measuring performance through specific data points can be misleading and does not provide an accurate assessment without contextual information. For example, measuring time spent on activities other than teaching during working or “class” hours might lead to the conclusion that the teacher is not performing well or is wasting students’ teaching time. As Williamson et al. (2020) outline, technologies that rely on data-based measurement dictate what is visible to others, shape how decisions are made through automation, and affect the ways people feel, act and behave.

I noticed this impact on people’s (teachers’ and students’) behaviour during the second data visualisation exercise, “A Week of Performance Tracking”, when my performance was measured against predefined benchmarks and targets based on the data captured. In the middle of that week I noticed that some activities were underperforming, so I acted on them and changed my behaviour to align with the expected targets. Although it was my own data and my own targets, visualising the data allowed me to change course. Without an in-depth understanding of where the data comes from and how the targets are defined, such behavioural changes could have a positive or negative connotation from a teaching with data standpoint, depending on what is being displayed and measured.

This leads me to teaching dashboards and to the question “How do ideas such as ‘data-driven decision-making’ shape teaching practices and professional responsibilities?”, and to how data-driven teaching dashboards might improve the learning process. In Brown’s (2020) case study, teachers had little knowledge of how dashboards could assist in teaching and/or academic planning. The answer could lie in a lack of understanding of the information being displayed, or of how the data are captured and organised.

To make proper decisions or effectively use teaching dashboards, teachers should be able to configure, define and manage targets and build their own practices into the learning dashboards (Brown 2020). This assumes that teachers have the required ‘data literacy’ to make sound decisions and interventions (Raffaghelli & Stewart 2020). Data literacy here is not only about technical skills and data science capabilities, but about “broader epistemological frameworks than a technical, instrumentalist focus on performance management, efficiencies, or evidence can offer.”

Learning platforms generate large amounts of data (van Dijck et al. 2018), leading to economic and commercial decision-making that takes little account of the role of the teacher and of specific learners’ needs. My third visualisation assesses the various platforms I use for reading resources; the question is how adaptable these platforms are in allowing a teacher’s intervention to revise and personalise individual learning resources and to inject new or revised material.

In conclusion, teaching dashboards and data-driven educational technologies are impacting teaching and the role of the teacher, as he/she becomes an increasingly “datafied subject” (Williamson et al. 2020). I will close with the following from van Dijck et al. (2018):

The changing role of teachers from classroom directors to dashboard controllers, mediated by numbers and analytical instruments, is a major issue; professionals may feel that the core of educational activities—assessment and personalized attention—gets outsourced to algorithms and engineers.

References 

A Week of Reading Platforms

This week I decided to record my reading times across different media and platforms for both my courses: Critical Data and Education (the Data course) and Introduction to Social Research Methods (the Research Methods course). The idea is to measure the effectiveness and level of engagement of each platform with respect to the device or medium used, e.g. a physical book, a computer or an iPad (a tablet).

The data captured were the reading type (articles, books, blogs, Moodle, etc.) for each course and the platform used. Each data entry represents around 15 minutes of reading engagement; for example, if I read for 30 minutes, I would add two data entries. The readings I recorded this week were only related to my University studies.
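
For illustration, here is a minimal sketch of how this kind of reading log could be recorded and aggregated in code. The field names and example entries are hypothetical; my actual log was kept by hand, one entry per roughly 15 minutes of reading.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class ReadingEntry:
    course: str        # e.g. "Data course" or "Research Methods course"
    material: str      # article, book, blog, Moodle, email, ...
    platform: str      # computer, iPad, phone, physical book
    minutes: int = 15  # each hand-recorded entry is roughly 15 minutes

log = [
    ReadingEntry("Data course", "article", "computer"),
    ReadingEntry("Data course", "article", "computer"),  # a 30-minute read = two entries
    ReadingEntry("Research Methods course", "book", "physical book"),
]

# Total reading minutes per platform: the basis for the "shelves" visualisation below.
minutes_per_platform = Counter()
for entry in log:
    minutes_per_platform[entry.platform] += entry.minutes
print(minutes_per_platform)
```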

The Legend

I used a library-and-shelves style design for the representation, inspired by the reading element of this data collection. The shelves represent the level of engagement and comprehension for each reading platform, with the highest shelf representing high engagement.

Observations from the data visualisation:

  • Articles and blogs are the platforms with the highest usage and the highest effectiveness in terms of engaging with and comprehending the reading material. I primarily used the computer as the medium of choice, along with the iPad, which was less effective at times.
  • The least effective and least used platform is email; I used it once to read a response from a teacher. I saw the email on the phone but preferred to read it properly on the computer.
  • I added MS Teams because my Research Methods group uses it heavily to discuss assignments, so there is a good level of reading and engagement in learning activities, not only chatting. The platform would be better leveraged if blogs and discussion forums were used in the same space.
  • Moodle was used mainly to read the weekly overview and activity text, mostly for the Research Methods course, since Moodle is not used much for the Data course. Its effectiveness really depends on the topic and how the reading flows; it is more effective when there are videos and links that allow the reader to stay within the Moodle platform rather than jumping between websites and external links.
  • WordPress is used only for reading the weekly overviews and engaging with blogs and forums. It is definitely a more engaging platform, primarily on the computer.
  • Books are very effective and engaging for me. I can sense a bit of bias here for physical books.
  • Although I consider myself a heavy phone user, I rarely use it for educational or reading purposes. Even for email or WordPress, the engagement is primarily viewing and scan-reading rather than engaged reading.

Looking at this visualisation from a teaching with data perspective, if a teacher were monitoring these platforms and receiving analytical information about how each is used, its frequency, comprehension levels and, overall, how students engage with the course reading resources and platforms, he/she could make interventions or be more critical about how to use the data to adapt learning environments to specific educational needs or learner requirements. As mentioned by van Dijck et al. (2018):

Personalized data allegedly provide unprecedented insights into how individual students learn and what kind of tutoring they need. 

van Dijck, J., Poell, T., & de Waal, M. 2018. Chapter 6: Education, In The Platform Society, Oxford University Press

But this flexibility and adaptability offered to teachers needs to be coupled with the assumption that teachers have the required ‘data literacy’ and the “skills and knowledge to engage ethically and pedagogically with learning analytics” (Raffaghelli & Stewart 2020). In educational technology, the role of the teacher would then change from being seen as a dashboard controller or “datafied” subject (Williamson et al. 2020) to a decision-making and educational authority.

References

  • Raffaghelli, J.E. & Stewart, B. 2020. Centering complexity in ‘educators’ data literacy’ to support future practices in faculty development: a systematic review of the literature, Teaching in Higher Education, 25:4, 435-455, DOI: 10.1080/13562517.2019.1696301
  • van Dijck, J., Poell, T., & de Waal, M. 2018. Chapter 6: Education, In The Platform Society, Oxford University Press

A Week of Performance Tracking

For this week’s data tracking, I decided to record the total hours I spend per day on various activities and to visualise them against benchmarks, standards, statistical data and self-set targets, measuring my daily performance against these targets for each selected activity. I captured most of the data using Apple’s Screen Time, which aggregates application usage data from my phone, Mac and iPad, as I use all three simultaneously. The idea here is to simulate how students’ data are captured through the various learning systems they use.

The data captured are total hours for: work, study, social media, entertainment, playing games on the phone, online shopping and exercise. All activities were done at home with online access, except exercise of course, which was still tracked on my phone. The benchmarks and targets are either self-imposed (as a teacher would specify a learning target) or based on statistical data. The following list summarises these targets and their respective references.

The hours were summed per day per activity, and each day/activity was then assessed against its target: “Exceed Target” if performance was better than the target, “Met Objective” if within the target, and “Under Performance” if below expectations. For example, for social media the higher the hours the poorer the performance, while for working hours it is the opposite. (A small code sketch of this scoring rule follows the benchmarks below.)

My work week is Sunday to Thursday. For this visualisation, I included the weekend to capture most of my studying time.

Benchmarks and Targets:

  • Work: the benchmark was a range of 6-8 hours/day. A typical working day is 8 hours; however, productive hours per day are fewer than that. According to inc.com, total productive time can be as low as 3 hours/day, while according to The Economist, “People are working longer hours during the pandemic”. So I decided to keep the benchmark at 6-8 hours/day of productive work.
  • Study: according to the MSc in Digital Education handbooks, the expected workload is 7-10 hours/week for most courses. As I am taking two courses, I set the target at 1-2 hours/day, since I allocate more time at weekends.
  • Social media: according to Statista.com, daily social media usage worldwide in 2020 was 145 minutes/day (roughly 2.5 hours/day). I set the benchmark at 2 hours/day, as I would like to reduce the time spent. I also use social media for work, especially Twitter and WhatsApp.
  • Entertainment: this activity covers watching online TV/shows. According to comparitech.com, Netflix users watched an average of 3.2 hours of video per day. I set myself a target range of 1-2 hours/day; it is my unwinding time before I sleep.
  • Online games: this is also unwinding time, playing Candy Crush and similar games on my phone. Usually this happens in parallel with watching online streaming shows or during conference calls where I am a passive listener. I set a target of 30 minutes/day, as I know I can spend more time on it.
  • Shopping: this is online shopping, as we are still in a pandemic and almost everything we buy is online. No target was set, as this is something I have to do in most cases, and it usually overlaps with other activities.
  • Exercise: a target of 30 minutes/day, self-set and measured on my Apple Watch.
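
To make the scoring rule concrete, here is a minimal Python sketch of how a day’s hours could be assessed against the targets above. The benchmark values come from the list; the data structures and function are hypothetical, as my actual tracking and scoring were done by hand from Screen Time readings.

```python
# Hypothetical sketch: score one day's hours for an activity against its benchmark range.
# (low, high) bounds in hours; None means no bound on that side.
TARGETS = {
    "Work":          (6.0, 8.0),   # 6-8 productive hours/day
    "Study":         (1.0, 2.0),   # 1-2 hours/day
    "Social Media":  (None, 2.0),  # at most 2 hours/day
    "Entertainment": (1.0, 2.0),   # 1-2 hours/day
    "Online Games":  (None, 0.5),  # at most 30 minutes/day
    "Exercise":      (0.5, None),  # at least 30 minutes/day
}

# Activities where more hours than the range counts as better performance;
# for the others (social media, entertainment, games) more hours is worse.
MORE_IS_BETTER = {"Work", "Study", "Exercise"}

def assess(activity: str, hours: float) -> str:
    low, high = TARGETS[activity]
    within = (low is None or hours >= low) and (high is None or hours <= high)
    if within:
        return "Met Objective"
    above = high is not None and hours > high
    if activity in MORE_IS_BETTER:
        return "Exceed Target" if above else "Under Performance"
    return "Under Performance" if above else "Exceed Target"

print(assess("Social Media", 3.0))  # Under Performance (over the 2-hour cap)
print(assess("Work", 9.0))          # Exceed Target
print(assess("Study", 0.5))         # Under Performance
```
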
The Data Visualisation

Some observations:

  • Many of the activities don’t have a specific block in the day. For example, I study for 20 minutes, then work for an hour, then do something else. Especially being at home, I don’t dedicate time blocks to each activity.
  • Work is the most scattered activity during the day. If I compare this to a full-time student, it is the equivalent of measuring study time. If a teacher looked only at the spread of time, the immediate judgement would be a lack of focus or motivation to study; however, tasks could still be completed on time and overall performance could be high.
  • There are parallel activities, especially during passive conference calls (listening mode, or large company calls), watching online TV and checking social media.
  • Social media interactions are also spread throughout the day. Going back to my week of distractions, the same pattern appears there too. Social media is used for work and study as well.
  • During the first few days I noticed that I was exceeding the targets for social media and games, so I became more aware of the time I spent and adjusted during the last few days.
  • Some of the applications I use serve multiple activities. For example, I used MS Excel for the DIY dashboard assignment while also using Excel a lot for my work, which makes it hard to split the time cleanly. The same goes for social media platforms I also use for work, such as WhatsApp, Twitter and LinkedIn.

Being monitored can have a positive impact, depending on the target value/objective, the audience of the performance measurements, and how they are measured. In the middle of the week I noticed that my entertainment and games activities were underperforming, so I tried to limit myself. Being self-aware of your learning objectives and how they are measured may reduce unfair judgement or discrimination against students.

student performance track records, depending on their use, may lead to better personalized attention by teachers but may also enhance discrimination or limit accessibility.

van Dijck, J., Poell, T., & de Waal, M. 2018. Chapter 6: Education, In The Platform Society, Oxford University Press

I knew I was being monitored, I understood where the targets came from, and I even set my own targets. In a learning environment, students can be monitored and measured against benchmarks or targets that they are either not aware of or that do not reflect their learning needs, environment and objectives. Measuring and monitoring performance is important for teachers to learn about their students, but the questions are how it is done, how and what the targets are, and how to read what lies behind the measure in relation to the learning objectives.

As mentioned by van Dijck et al. (2018):

in the context of user-data collection and predictive analytics, it means that continuous individual monitoring and tailored didactics become integral to the pedagogical model 

van Dijck, J., Poell, T., & de Waal, M. 2018. Chapter 6: Education, In The Platform Society, Oxford University Press

Exploring a Teacher’s Dashboard

To build a teacher’s dashboard for this week’s activity, I used the Sample 2 data, which included the following information:

  1. Student First and Last names.
  2. Attendance percentage.
  3. Interaction with the learning platform with data related to: VLE logs, forum views, forum posts, forum replies and pdf views per week for each student.
  4. Test scores for 4 tests for each student.

I decided to use Microsoft Excel after exploring Google Data Studio, as I am more familiar with Excel’s functions and pivot tables and have used it before at work for data analysis.

After looking at the data, I decided to add three new fields (a rough sketch of these calculations follows the list):

  1. Attendance indicator, which flags students with attendance above 80% (I used 80% based on my own college experience!).
  2. Active participation, which flags students whose total number of forum posts and replies is above the class average.
  3. Average test score for each student.
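
As a rough sketch of how these three derived fields could be computed programmatically (the column names and values below are hypothetical stand-ins for the Sample 2 headers; my actual implementation used Excel formulas and pivot tables):

```python
import pandas as pd

# Hypothetical columns standing in for the Sample 2 spreadsheet headers.
students = pd.DataFrame({
    "name": ["Student A", "Student B", "Student C"],
    "attendance_pct": [100, 85, 72],
    "forum_posts": [12, 1, 7],
    "forum_replies": [9, 0, 5],
    "test_1": [78, 81, 64],
    "test_2": [85, 90, 70],
    "test_3": [74, 88, 61],
    "test_4": [80, 84, 66],
})

# 1. Attendance indicator: attendance above the 80% threshold.
students["attendance_indicator"] = students["attendance_pct"] > 80

# 2. Active participation: forum posts + replies above the class average.
participation = students["forum_posts"] + students["forum_replies"]
students["active_participation"] = participation > participation.mean()

# 3. Average test score across the four tests.
students["avg_test_score"] = students[["test_1", "test_2", "test_3", "test_4"]].mean(axis=1)

print(students[["name", "attendance_indicator", "active_participation", "avg_test_score"]])
```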

The following is the dashboard I developed based on the provided information, divided into three sections:

Teacher class dashboard: the data visualisation and analysis section, which gives the teacher a visual snapshot and summaries with the following details:

  1. The top 5 students with an average test score above 70 (a table; a short code sketch of this summary follows this section).
  2. A learning-space access summary (the donut chart) showing the total number of VLE accesses, forum views and forum posts per week.
  3. A chart of the students’ test results, including the average for each test.
  4. A sparkline table visualising each student’s test results in columns, showing increasing or decreasing trends and highlighting the highest and lowest scores.
  5. The top active participants in the class, measured by a total number of forum posts and replies above the class average.

The pivot table and controls, which allow the teacher to manipulate the data and perform what-if analysis to focus on certain scenarios and data elements for the whole class or for a set of students.

The complete captured class data in table format.
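
Two of these dashboard summaries, the top-5 table and the weekly access totals behind the donut chart, might be derived as follows. Again, this is only a hedged sketch: the column names and figures are assumptions, not the actual Sample 2 data.

```python
import pandas as pd

# Assumed shapes: one row per student with the derived average score, and
# one row per student per week of platform activity.
scores = pd.DataFrame({
    "name": ["Student A", "Student B", "Student C"],
    "avg_test_score": [79.25, 85.75, 65.25],
})
activity = pd.DataFrame({
    "name": ["Student A", "Student A", "Student B", "Student B"],
    "week": [1, 2, 1, 2],
    "vle_logs": [15, 12, 9, 11],
    "forum_views": [20, 18, 5, 7],
    "forum_posts": [3, 4, 0, 1],
})

# Dashboard table: top 5 students with an average test score above 70.
top5 = (
    scores[scores["avg_test_score"] > 70]
    .sort_values("avg_test_score", ascending=False)
    .head(5)
)
print(top5)

# Donut-chart summary: total VLE logs, forum views and forum posts per week.
weekly_access = activity.groupby("week")[["vle_logs", "forum_views", "forum_posts"]].sum()
print(weekly_access)
```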


The following table reflects on the exercise questions:

Question: What kind of data might be useful for teaching?
Analysis: Two types of data could be useful here:
  • Factual data: test results, attendance, assignment submission times and dates.
  • Actionable data: performance progression, learning environment activity, performance in relation to others, and aggregated data from other classes or previous years.

Question: How might data be organised, combined and visualised for particular kinds of teaching practice?
Analysis: Data should be organised per student as a complete profile, so that all aspects of the student’s data are understood before relying on comparison charts, and with a clear understanding of the assumptions behind any calculations (in the dashboard above, for example, “high attendance” was set at 80%). The teacher should also have the flexibility to change these benchmarks or reference data in order to perform what-if analysis.

Question: How might such ‘dashboards’ impact the day-to-day practices of teaching?
Analysis: The impact is high, especially if the teacher takes the information as-is without drilling into the details and understanding what lies behind the data presented. Looking at the data, the highly active students, measured by participation on the forums (posting and replying), had relatively high-to-average grades, while the students with the highest VLE log-ins were not necessarily the most active on the forums. Teachers cannot make decisions by looking at these dashboards alone, but dashboards might help them prioritise where to look or support certain questions.

Question: What data should be included, what should be excluded, and for what purposes?
Analysis: In this example there is far less data than may be needed to understand students’ performance and level of understanding. A test score gives a number, but the analysis of question content versus understanding of the subject is missing. The test data can project overall performance as a class; for instance, test 2 had the highest scores overall. Another example: the attendance data and VLE logs have little bearing on participation. The two students with 100% attendance and the highest VLE log-ins (15 per week) both achieved above-average test results, yet one of them was not active at all in terms of posting or replying on forums. If a teacher looked at active participation alone, he/she would have concluded that the second student (Mitchel) is not a “performing” student.
Some Analysis

In conclusion, it was a good exercise to imagine what a teacher might be interested in and to design a dashboard with the data available. I believe the better approach would be to design the dashboard by linking it to the learning and teaching objectives first, and only then define what, where and how to measure or capture the relevant data. Quoting Brown’s (2020) conclusion to the study:

To effectively use a tool (and to make a tool effective for its users) LADs need to be sufficiently configurable that they can be enfolded into existing instructional practices. Before selecting a tool and throughout its use, instructors should build time into their practice to evaluate the affordances and limitations of a technology

Brown 2020

References:

My “Teaching” Roles

I started this week unclear about what data I would gather from a “teaching with data” point of view, especially as I do not work in the educational sector. After some thought, I decided to collect data about myself as a “teacher”, capturing my various everyday roles as a mother, friend, work leader and student. After gathering the data, I chose a dial-shaped visualisation to resemble teaching dashboards, inspired by this week’s themes. The following is the legend I developed for the visualisation.

Each dial represents a role in which I assumed a “teaching” responsibility. The low, medium and high markings refer to the difficulty of the teaching activity. The main data elements captured are the following:

  • The medium: face to face, voice call, or text/email.
  • The location of the taught audience, the “students”.
  • Repetition: whether the teaching required repeating (more than once).

The data were captured from Tuesday to Friday of this week. Some of the activities overlapped or were completed during different hours of the day.

The following is the outcome of the data visualisation.

Teaching Roles Visualisation

This week overlapped with a major assignment in the Introduction to Social Research Methods course, where I spent significant time chatting and on video calls with colleagues so that we could learn from each other; thus, it was a co-teaching role.


Reflecting on the data from this week, it is obvious that most data points came from my work as the leader of a regional team, where most activities involved teaching or revising work, explaining why we do certain requirements/tasks, and how to do them. I considered my work activities a learning exercise from the team’s perspective, since new tasks were discussed, taught, explained and trained on. Another default teaching role is being a mother: there were few interactions, and most of them were do’s and don’ts, as my kids are older and one is already studying in the USA, which explains the text-based teaching/mothering! The funny observation is that my hardest and most repetitive tasks this week related to my dog; teaching a pet certain tasks can be harder than corporate international business! Being a mentor on work and relationship matters is a role I cherish, and the entries for that role were mostly coaching and advising my friends, with one exception: a data point on teaching how to cook a certain dish.

To conclude, I would like to reflect on Williamson et al.’s (2020) core reading for this week regarding the concept of the “data double”:

The construction of data doubles in education is especially consequential since anything that is modelled inside the database then affects the potentially life-changing experience of teaching and learning.

Williamson, B. Bayne, S. Shay, S. 2020. The datafication of teaching in Higher Education: critical issues and perspectives. Teaching in Higher Education. 25(4), pp. 351-365

If my data were seen by my supervisors at work, would they assume that I had wasted time on other “tasks” not related to my main job, given how many data points overlap or fall during “working hours”? This is a basic reflection of how the datafication of education also measures and assesses teachers; the conclusions drawn about them might not be accurate, and they find themselves defined by data, just as technologies and teachers define students through data.