Data Visualisation #4
My week of virtual collaborations
Last week, I decided to track my virtual collaborations with my colleagues. I decided that I only wanted to track my face-to-face virtual interactions, rather than including phone, email, etc., after I began to look at face recognition technology and how it is applied in different applications. I came across Microsoft’s Azure Cognitive Services and tested out their Face API. It uses perceived emotion recognition to detect a range of facial expressions, such as happiness, contempt, neutrality, and fear.
I wanted to track my collaborations and in particular the emotions that I was expressing during these collaborations. So, not necessarily how I was feeling but more the impression I was giving. I tried to record my emotions in relation to the Face API above. This involved reflecting on my overall demeanour during the virtual session and recording my primary and secondary facial expressions. I also wanted to understand why I was collaborating and who instigated this collaboration. Overall, I wanted to get a sense of how I am working within my team and identify any blockers or potential room for improvement.
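In practice, my logging for each session boiled down to picking the two strongest expressions from the Face API's emotion categories. A minimal Python sketch of that step might look like this — the `top_expressions` helper and the session scores are hypothetical illustrations, not part of the Face API itself:

```python
def top_expressions(scores, n=2):
    """Return the n highest-scoring expressions, strongest first."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [emotion for emotion, _ in ranked[:n]]

# Illustrative (made-up) scores for one virtual session, using the
# same categories the Face API reports.
session = {"happiness": 0.55, "neutrality": 0.30, "contempt": 0.10, "fear": 0.05}

primary, secondary = top_expressions(session)
# primary -> "happiness", secondary -> "neutrality"
```

Recording only the top two categories keeps the diary simple, but — as I note below — it also flattens the subtler mixtures of expression a face can carry.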
When I created the visualisation itself, my inspiration came from the idea that collaboration sparks innovation and growth, both for the individual and the collective. However, once I had settled on creating something that would resemble a tree or plant to depict this growth, I struggled to put this idea onto paper. There were many failed attempts before I settled on the below, which hopefully represents this idea of growth and development through collaboration.
Reflecting on the week, I think it is clear that I am not really able to hide my emotions that well. In particular, I struggled at times to hide my frustration and disappointment with my colleagues. Equally, some colleagues are over-reliant on me for certain projects, which can impact my own work and ability to hit deadlines. Although this indicates that I am a team player, happy to support others where I can, it also shows that I have a tendency to get dragged into work that I ultimately won’t receive the credit for. I was also happy to see that I am seeking opportunities for collaboration and interaction, drawing on the experience of my colleagues and having the confidence to reach out to members of different teams.
Reflecting on this activity in a wider sense, tracking an individual’s collaboration and perceived emotional state could allow a teacher to understand how students are working together, whether they have the confidence to instigate and drive collaboration, and then delve a little deeper into how they build and develop relationships with others. Furthermore, this type of data analysis could allow teachers to analyse how students are engaging with course content and delivery.
However, this activity also highlighted the potential issues with relying on such data. Categorising emotions was a tricky process, even when loosely basing it on the emotions in the Face API above. The emotion categories are fairly polarised, which could lead to data errors or fail to capture the subtle nuances of human expression. Furthermore, I was attempting to record my perceived emotions, not my actual emotions; it would be interesting to see whether there was a correlation between the two. Any reliance on perceived emotions and assumptions to make decisions about improving or changing teaching practice would need to be supported and verified by other data sources.
My final reflection on the week: kittens are not conducive to creativity and hand drawings!
Hi Ailis,
I found this fascinating and at the same time terrifying – it’s bad enough being in camera-on meetings, but having an algorithm judging your expressions as well as your colleagues?!
I guess the humans have the context of a meeting to go with your expression. But they don’t always know the context beyond that (although you may be able to rely on their understanding of humans, and on the fact that we don’t always leave the rest of our lives outside the meeting room door).
“The reliance on perceived emotions and assumptions to make decisions to improve or change teaching practice would need to be supported and verified by other data sources.” There are, of course, software programs to capture emotions from faces, including in education, but it’s a very controversial area. See for example this recent piece: https://edition.cnn.com/2021/02/16/tech/emotion-recognition-ai-education-spc-intl-hnk/index.html
Besides some privacy issues, what other problems might come from introducing such technologies into educational institutions or professional workplaces? At least one significant critique is that such programs can’t really measure “emotion” at all – just facial movements, which may not correlate well with actual emotional state. So whether “emotional AI” would be better than self-reports of “perceived emotions” remains an open question, over which there is now quite a struggle.