To build a teacher’s dashboard for this week’s activity, I used the Sample 2 data, which included the following information:
- Students’ first and last names.
- Attendance percentage.
- Interaction with the learning platform: VLE logs, forum views, forum posts, forum replies, and PDF views per week for each student.
- Test scores for four tests for each student.
I decided to use Microsoft Excel after exploring Google Data Studio, as I am more familiar with Excel’s functions and pivot tables and have used it before at work for data analysis.
After looking at the data, I decided to add three new fields:
- Attendance indicator – identifies students with attendance above 80% (I chose 80% based on my own college experience!).
- Active participation – identifies students whose total number of forum posts and replies is above the class average.
- Average test score – the mean of the four test scores for each student.
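The logic behind these three derived fields can be sketched as follows. This is a minimal Python sketch, not the Excel formulas I actually used, and the student records (names, attendance, counts, scores) are invented for illustration:

```python
# Hypothetical sample records standing in for the Sample 2 data.
students = [
    {"name": "Alice",   "attendance": 0.95, "posts": 12, "replies": 8, "tests": [72, 80, 68, 75]},
    {"name": "Mitchel", "attendance": 1.00, "posts": 0,  "replies": 0, "tests": [78, 85, 74, 80]},
    {"name": "Sam",     "attendance": 0.60, "posts": 3,  "replies": 1, "tests": [55, 62, 50, 58]},
]

ATTENDANCE_THRESHOLD = 0.80  # the 80% benchmark chosen above

# Class-wide average of forum posts + replies, used for the participation flag.
avg_participation = sum(s["posts"] + s["replies"] for s in students) / len(students)

for s in students:
    # Field 1: attendance indicator (above the 80% benchmark).
    s["attendance_indicator"] = s["attendance"] > ATTENDANCE_THRESHOLD
    # Field 2: active participation (posts + replies above the class average).
    s["active_participation"] = (s["posts"] + s["replies"]) > avg_participation
    # Field 3: average test score across the four tests.
    s["average_test_score"] = sum(s["tests"]) / len(s["tests"])
```

In Excel these are a simple comparison, a comparison against `AVERAGE` over the class, and an `AVERAGE` over the four test columns; keeping the 80% threshold as a single named constant makes it easy to change for what-if analysis.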
The following is the dashboard I developed from the provided information, divided into three sections:
Teacher Class Dashboard: the data visualisation and analysis section, giving the teacher a visual snapshot and summaries with the following details:
- The top five students with an average test score above 70 (a table).
- A learning-space access summary (the donut chart) showing the total number of VLE accesses, forum views, and forum posts per week.
- A chart of the students’ test results, including the average for each student.
- A sparkline table visualising each student’s test results in columns, showing increasing or decreasing trends and highlighting the highest and lowest scores.
- The top active participants in the class, measured by a total number of forum posts and replies above the class average.
The pivot table and controls, which allow the teacher to manipulate the data and perform what-if analysis, focusing on particular scenarios and data elements for the whole class or a subset of students.
The complete captured class data in table format.
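The “top five above 70” summary in the first section is a filter-then-sort over the average-score field. A minimal Python sketch of that logic, with entirely invented names and averages (the real dashboard does this with Excel sorting and filtering):

```python
# Hypothetical (name, average_test_score) pairs standing in for the class data.
averages = [
    ("Ava", 88.5), ("Ben", 79.2), ("Cara", 73.8),
    ("Dan", 71.0), ("Eli", 70.5), ("Fay", 69.0), ("Gus", 56.2),
]

# Keep only students whose average exceeds 70, then take the five highest.
top5 = sorted(
    (s for s in averages if s[1] > 70),
    key=lambda s: s[1],
    reverse=True,
)[:5]
```

Applying the threshold before ranking matters: a student just under 70 never appears in the table, even if fewer than five students qualify.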
The following table reflects on the exercise questions:
| Exercise question | Reflection |
| --- | --- |
| What kind of data might be useful for teaching? | Two types of data can be useful here. Factual data: test results, attendance, assignment submission times and dates. Actionable data: performance progression, learning-environment activity, performance relative to other students, and aggregated data from other classes or previous years. |
| How might data be organised, combined, and visualised for particular kinds of teaching practice? | Data should be organised per student, as a complete profile, so the teacher understands all aspects of a student’s data before relying on comparison charts, and is clear about the assumptions behind any calculations (for example, above, “high attendance” was set at 80%). The teacher should also have the flexibility to change these benchmarks or reference data in order to perform what-if analysis. |
| How might such ‘dashboards’ impact the day-to-day practices of teaching? | The impact can be high, especially if the teacher takes the information as is, without drilling into the detail behind the data presented. In this dataset, the most active students (measured by forum posts and replies) had relatively high-to-average grades, while the students with the most VLE log-ons were not necessarily the most active on the forums. So teachers cannot make decisions by looking at these dashboards alone, but dashboards may help them prioritise where to look or support certain questions. |
| What data should be included, what should be excluded, and for what purposes? | This sample offers far less data than may be needed to understand students’ performance and level of understanding. A test score is only a number; the analysis of question content versus understanding of the subject is missing. The test data can still show overall class performance (for example, test 2 had the highest scores overall). Attendance and VLE log-ons also had little bearing on participation: the two students with 100% attendance and the highest VLE log-ons (15 per week) both achieved above-average test results, yet one of them (Mitchel) posted no forum posts or replies at all. A teacher looking at participation data alone would have concluded that Mitchel was not a “performing” student. |
In conclusion, it was a good exercise to imagine what a teacher could be interested in and to design a dashboard with the data available. I believe the best approach would be to design the dashboard by linking it to the learning and teaching objectives, and only then define what, where, and how to measure or capture the relevant data. As Brown (2020) concludes:
> To effectively use a tool (and to make a tool effective for its users), LADs need to be sufficiently configurable that they can be enfolded into existing instructional practices. Before selecting a tool and throughout its use, instructors should build time into their practice to evaluate the affordances and limitations of a technology. (Brown, 2020)
- Brown, M. 2020. Seeing students at scale: how faculty in large lecture courses act upon learning analytics dashboard data. Teaching in Higher Education, 25(4), pp. 384–400.
- Sample Data 2, from the Critical Data and Education course material for Week 7 (https://www.moodle.is.ed.ac.uk/mod/resource/view.php?id=64462)