It was a pleasure to spend Tuesday afternoon in Melbourne at the Assessment Research Centre (Director, Sandra Milligan) and Centre for the Study of Higher Education (Director, Gregor Kennedy). There is such a breadth and depth of work in these centres, and I met an extraordinary range of researchers.
They invited me to address the concerns that many educators have around "Learning Analytics" (the application of data science to educational data), so here goes…
Teaching, Assessment and Learning Analytics: Time to Question Assumptions
This will be a non-technical talk accessible to a broad range of educational practitioners and researchers, designed to provoke a conversation that provides time to question assumptions. The field of Learning Analytics sits at the convergence of two fields: Learning (including learning technology, educational research and learning/assessment sciences) and Analytics (statistics; visualisation; computer science; data science; AI). Many would add Human-Computer Interaction (e.g. participatory design; user experience; usability evaluation) as a differentiator from related fields such as Educational Data Mining, since the Learning Analytics community attracts many with a concern for the sociotechnical implications of designing and embedding analytics in educational organisations.
Learning Analytics is viewed by many educators with the same suspicion they reserve for AI or "learning management systems". While in some cases this suspicion is justified, I will question other assumptions using some learning analytics examples that can serve as objects for us to think with. I am curious to know what connections and questions arise when these are shared.